The FCC: After Four Frustrating Years, Tough Work Ahead

Work Ahead (© iQoncept - Fotolia.com)

Julius Genachowski was one of President Obama’s original tech warriors, so hopes were high when he became chairman of the Federal Communications Commission in 2009. He leaves the post with some modest accomplishments, some bigger disappointments, and a general sense of stasis that has replaced the excitement of 2008.

This situation is not Genachowski’s fault and there is not much chance that his successor, whoever it is, will be able to speed the process. Inertia is a powerful force in Washington, and few institutions are harder to get moving than the FCC. Why else would the commission still be arguing over rules prohibiting cross-ownership of newspapers and television stations–an issue likely to come to a boil again if Rupert Murdoch goes ahead with a bid for The Los Angeles Times–even as both sets of institutions fade into irrelevance?

The commission has two huge problems. First, the FCC’s actions are governed by a terrible and hopelessly obsolete law, the Telecommunications Act of 1996. Any time the commission seeks to stretch its authority, say, by trying to regulate network neutrality, it can count on being sued and probably slapped down by the courts.

Second, major industry constituencies—big telecommunications companies and wireless carriers, broadcasters, cable companies—see much to lose and little to gain from change, and the opposition of any one constituency can cause things to drag on interminably.

A good example is freeing unused or underused television broadcast spectrum for wireless data use. The fight stems from the transition to digital TV mandated in the mid 1990s and completed in 2009. TV stations ended up with more spectrum than they had good use for. The result was a plan for “incentive auctions,” in which stations would receive part of the proceeds from the sale of spectrum (which they didn’t pay for in the first place). The FCC plan was complex, and Congress, at the behest of broadcasters, made it even more baroque in the 2012 legislation authorizing the sales. Broadcasters continue to throw up roadblocks and it now appears that the auction process, originally expected to start next year, is unlikely to get going until 2015. The TV fight is also holding up a plan to make some of the unused TV spectrum, the so-called TV white spaces, available for unlicensed wireless data. Not surprisingly, the broadcasters oppose that plan too.

Unfortunately, there’s not a lot an FCC chairman can do to speed the agency’s glacial pace. Federal law creates endless possibilities for delay. Any time the commission tries to push its boundaries, it will be sued and objectors have generally found a friendly ear at the conservative D.C. Circuit Court of Appeals, which hears all challenges to FCC actions.[pullquote]Unfortunately, there’s not a lot an FCC chairman can do to speed the agency’s glacial pace. Federal law creates endless possibilities for delay.[/pullquote]

This is troubling, because the FCC has some major items on its agenda. The most urgent is finding more spectrum for wireless data. It has become clear that the traditional approach of transferring spectrum from incumbents to new users has limited potential to increase bandwidth, at least in any reasonable amount of time. What’s needed is sharing of spectrum—especially between government agencies and private users–and new technologies to use the spectrum we have more efficiently. Steps to do both, sometimes simultaneously–as in the sharing of the 3.5 gigahertz band between military radars and small-cell wireless data–are underway. But incumbent holders of spectrum don’t give it up easily, even for sharing, and established service providers will maneuver to prevent competitors from gaining any perceived advantage. Look for a long slog.

Another major issue is mundane, even boring, but very important. The nearly 140-year-old public switched telephone network is approaching the end of its useful life; internet technology is a far more efficient way to move voice traffic than traditional circuit switching. The prospect of this happening has been looming for some years, but AT&T has forced the issue with a formal petition to transition its land-line services to an IP network. A lot of money is at stake–there is a huge investment tied up in the existing network. The FCC has to make sure that the transition balances the interests of customers and shareholders of the carriers–this mostly affects AT&T and Verizon Communications–and guarantees a reliable and affordable landline network for the future. (Much as techies disparage it, the landline network is still tremendously important and, of course, the same IP network that will carry voice calls forms the backbone of the internet.)

That’s a big agenda for change coming up against a system strongly biased to inertia, complicated by a Congress whose passion for meddling is exceeded only by its lack of understanding of the issues.

Spectrum: Multiplication Beats Addition

Dark Side of the Moon album cover

Martin Cooper recalls the days of mobile radio-telephones before cellular service:

You’d have one station in a city and you could conduct in that city 12 phone calls at one time. During the busy hour, the probability of connecting, of getting a dial tone, was about 10%. Of course, the reason was a city with 12 channels could support perhaps 50 people with reasonable service. They put 1,000 people on it. So the service was abominable.

The solution had been developing for a long time before Cooper made the first cellular call in 1973. Back in 1947, engineers at Bell Labs came up with a scheme for using relatively low-powered transmitters to serve hexagonal cells. With some care and cleverness in assigning channels, the same spectrum could be reused, provided the cells were far enough apart. Over time, AT&T developed the technology that allowed a call to stay connected as a mobile phone moved from one cell to another and Motorola created the mobile handsets. An industry and a new way of life were born.
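
To make the arithmetic concrete, here is a minimal sketch in Python of the reuse idea. The 12-channel figure comes from Cooper’s example above; the 7-cell reuse cluster is the classic pattern for hexagonal layouts, used here purely for illustration:

    # Toy model of cellular frequency reuse. One city-wide transmitter
    # with 12 channels carries 12 calls. Divide the city into cells,
    # give each cell a share of the channels, and repeat the assignment
    # pattern in every "cluster" of cells far enough apart to avoid
    # interference.

    def total_calls(channels, cells, cluster_size):
        channels_per_cell = channels // cluster_size
        return channels_per_cell * cells

    # Pre-cellular: one big transmitter, no reuse.
    print(total_calls(channels=12, cells=1, cluster_size=1))    # 12 calls

    # Cellular: the same 12 channels, a 7-cell reuse pattern,
    # and 700 small cells covering the same city.
    print(total_calls(channels=12, cells=700, cluster_size=7))  # 700 calls

The same spectrum supports more calls simply because it is used over and over; that multiplication, not new spectrum, is what made mass-market mobile phones possible.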

The sort of subdivision that made the cell phone possible will also enable a vast expansion of the amount of data that wireless networks can carry without a commensurate increase in wireless spectrum. Get ready for heterogeneous networks, or hetnets, which will use a variety of techniques to chop spectrum and space into smaller chunks, allowing for far greater reuse.

Wi-Fi handoff will be a key part of the hetnet. It’s being used that way today, albeit in a somewhat random and uncoordinated way. Nearly all Wi-Fi-capable mobile devices are designed to switch to Wi-Fi for data whenever it is available. One big problem is that the device has only a vague idea of what “available” means. This works fine when I come home and my devices automatically connect to my network, whose password is in memory. My iPhone connects automatically to AT&T hotspots and my iPad does the same for Verizon.

Many other networks, however, require a login. Sometimes it’s a password that you can enter once and have remembered from then on. Sometimes it’s a popup page that just wants you to agree to terms and conditions. And sometimes it’s a page that requires a username, a password, and often a credit card number for payment. While these methods vary in the annoyance they cause, all are a serious impediment to a seamless handoff. Even worse, your device will try to use a Wi-Fi network to which you haven’t connected, either because you lack a password or don’t care to pay. Sometimes you have to manually turn Wi-Fi off to get your phone or tablet to work properly.

Change is coming, through a technology known as Passpoint, also called Hotspot 2.0. This will allow truly seamless handoffs between cellular and Wi-Fi (and perhaps, in the future, white space) networks, with the device itself providing authentication. The standards are nearing final ratification. Once that happens, says Doug Lodder, vice president for business development of hotspot provider Boingo, “the carriers will run it through their labs and will negotiate roaming agreements. It’s starting to roll out, but we won’t see widespread availability until 2014.”
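
To see why this matters, here is a hypothetical, much-simplified Python sketch of the device-side decision. Every name in it is invented; the point is only that a Passpoint access point advertises which operators it can authenticate (via the 802.11u protocol), so the device joins only when it actually holds a matching credential, instead of guessing:

    from dataclasses import dataclass

    @dataclass
    class AccessPoint:
        ssid: str
        passpoint: bool = False             # supports Passpoint/Hotspot 2.0?
        operators: frozenset = frozenset()  # operators it can authenticate
        captive_portal: bool = False        # legacy login page?

    # Credential provisioned by the carrier (e.g., a SIM-based identity).
    device_credentials = frozenset({"carrier.example.net"})

    def should_join(ap: AccessPoint) -> bool:
        if ap.passpoint:
            # Seamless case: join only if authentication will succeed.
            return bool(device_credentials & ap.operators)
        # Legacy case: the device cannot tell a usable hotspot from a
        # login page, so it often joins networks it cannot actually use.
        return not ap.captive_portal

    for ap in (
        AccessPoint("CoffeeShopWiFi", captive_portal=True),
        AccessPoint("PartnerHotspot", passpoint=True,
                    operators=frozenset({"carrier.example.net"})),
    ):
        print(ap.ssid, "->", "join" if should_join(ap) else "skip")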

Small cells. Traditional cell antennas, mounted on towers or other structures, typically serve a radius ranging from several kilometers down to several hundred meters, depending mostly on the height of the tower. Small cells, also known as microcells, picocells, and femtocells, serve ranges from a couple hundred meters down to a few tens of meters. Home femtocells are designed to provide connectivity to otherwise unserved places and connect to the network through a residential broadband connection. But other small cells are a fully managed part of a cellular network, intended to multiply the use of spectrum by chopping areas into very small cells.
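
The payoff is geometric: halving the cell radius roughly quadruples the number of cells, and hence the reuse, in a given area. A back-of-the-envelope Python sketch (circular cells, no interference-coordination overhead, illustrative numbers only):

    import math

    def cells_in_area(area_km2, radius_km):
        # Crude circular-cell approximation; real plans use hex grids.
        return area_km2 / (math.pi * radius_km ** 2)

    AREA = 100.0                      # km^2 of dense urban core
    macro = cells_in_area(AREA, 2.0)  # 2 km macrocells as the baseline

    for radius_m in (2000, 200, 50):  # macro, micro, pico/femto scales
        n = cells_in_area(AREA, radius_m / 1000)
        print(f"radius {radius_m:>4} m -> {n / macro:>6.0f}x the cells and reuse")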

You can’t just plop small cells down in areas already covered by standard cell service, at least not using the same frequencies. The Federal Communications Commission is proposing that the shared 3.5 gigahertz band be dedicated to small-cell use. Higher frequency signals have shorter range and less ability to penetrate obstructions than the 700 to 2100 megahertz signals typically used for wireless data, making them well suited to small cells.

Small cells, and a related technology known as distributed antenna systems (DAS), have the advantage of making it much easier to provide good coverage inside buildings. As Cooper says, “It’s kind of an anomaly that if you think about it, most of our cellular conversations are in buildings and in offices, because that’s where we spend most of our time. But all the stations that provide services, almost all of them are outside. It’s kind of backwards.” Whereas small cells use multiple miniature access points, not unlike a Wi-Fi network, DAS splits the signal of a single base station among multiple antennas, each serving a small region. “You have smaller pipes, but fewer people attached to each pipe,” says Boingo’s Lodder. A single DAS array can also carry signals for several cellular networks.

Smart antennas. Cellular communication is a broadcast service: a single cell antenna typically covers a 120° sector of its cell. But smart antenna technology makes it possible to focus that beam and steer the signal to a recipient, allowing closer reuse of spectrum. There has been a lot of research on smart antennas, but limited deployment in the field. One version, called multiple-input, multiple-output (MIMO), is used with Wi-Fi and LTE, but its purpose has been more to extend range than to increase spectrum reuse. Smart antennas are one more tool in the engineering toolbox that can allow us to move a lot more data on the spectrum we have.
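
As a rough, idealized illustration (sidelobes, scheduling, and channel estimation make real gains smaller), narrowing the beam multiplies reuse within a single sector:

    def beams_per_sector(sector_deg=120, beam_deg=15):
        # Idealized: non-overlapping beams in one sector can each carry
        # an independent transmission on the same channels.
        return sector_deg // beam_deg

    print(beams_per_sector())            # 8 beams -> ~8x reuse in the sector
    print(beams_per_sector(beam_deg=5))  # 24 beams with sharper focusing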

More wireless data spectrum is always welcome, and the growth of demand for bandwidth probably cannot be met entirely within existing spectrum allocations. But new spectrum is getting harder and harder to find, and the politics of prying it loose are exhausting and not terribly productive. Our best hope for meeting demand is to do more with what we have. And, fortunately, there is a great deal more that can be done.

Spectrum: The Shortage Is a Crisis, but Not Serious

Dark Side of the Moon album cover

The late economist Herb Stein used to say that “if something cannot go on forever, it will stop.”

A profound economic truth lies behind that seemingly flip statement. The world is forever on the verge of running out of vital commodities–oil, food, water, and many more–but somehow we never do. In the worst case, as a commodity grows scarce, its price rises and demand shrinks. In the real world, however, human ingenuity triumphs over shortages. We find alternatives to whatever we are running out of, or, better, we find ways to use what we have much more efficiently. So it is with the spectrum we need to move ever-growing volumes of wireless data to our proliferating mobile devices.

In the short run, available spectrum is more or less fixed, creating an atmosphere of shortage. The established carriers, especially Verizon Wireless and AT&T, warn of “exponential”* growth in demand and use claims of shortage both to lobby for new allocations of spectrum for wireless data use and to justify data caps and higher rates. Critics argue that while dedicating more spectrum to wireless data is desirable, much can be accomplished through greater efficiency in the use of what we have.

In this and subsequent articles in this series on spectrum, I will examine the claims and look at possible solutions. Perhaps the biggest issue is just what is happening with demand for spectrum. The truth appears to be that it is still growing very quickly, but at a decelerating rate. Cisco’s Visual Networking Index, which has often been criticized for exaggerating the growth rate, indicates this clearly. It shows the growth rate for mobile data slowing from 133% in 2011 to an estimated 78% in 2014. A growth rate of nearly 80% is still staggeringly fast, but the effect of this deceleration is enormous. At a 133% compound annual growth rate, consumption would increase roughly 4,700-fold over a decade; at 78%, about 320-fold. The difference: more than an order of magnitude in monthly traffic.[pullquote]Stein’s Law: “If something cannot go on forever, it will stop.”[/pullquote]
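
The compounding is easy to check for yourself; a couple of lines of Python:

    def multiple(annual_rate, years=10):
        """How many times bigger traffic gets at a constant growth rate."""
        return (1 + annual_rate) ** years

    print(f"{multiple(1.33):,.0f}x")  # 133%/year for a decade: ~4,716x
    print(f"{multiple(0.78):,.0f}x")  # 78%/year for a decade:  ~319x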

But even if we discount the more breathless and self-serving estimates of growth in wireless data use, it is clear that the amount of spectrum allocated to wireless data will, at some point in the not-too-distant future, be inadequate to meet demand based on today’s technologies. It is also clear that to meet this demand, we must both find additional spectrum and find ways to use it more efficiently. Fortunately, both are eminently doable.

The actions that can be taken to improve the availability of spectrum for data include:

  • Auctioning spectrum currently used for other purposes. This is the course favored by the incumbent carriers and, to a considerable extent, by Congress and the Federal Communications Commission. The big problem is that it is extremely difficult to get anyone–public or private–who currently holds spectrum to part with it. Legislation passed last year provides for the auction of 100 MHz of unused or under-used television spectrum for data, with the current broadcast licensees sharing in the proceeds. The rules for these “incentive auctions” are extremely complex. No spectrum will actually be sold until next year at the earliest, and it seems unlikely that the amount freed will ever reach 100 MHz. Prying spectrum from the vast hoard held by government agencies, particularly the Defense Dept., is even more difficult.
  • Speeding buildout of unused spectrum. Even while complaining of spectrum shortages, the incumbent carriers still have a lot of spectrum in the bank. Neither Verizon nor AT&T has completed the build-out of LTE networks on the 700 MHz-band spectrum they bought in 2007, and Verizon has just acquired considerable additional spectrum in a deal with Comcast and other cable companies. The biggest chunk of barely used spectrum is the nationwide 2.5 GHz holding of Clearwire, whose financial woes have allowed only a small portion of the network to be built out. Both Sprint and Dish Network are bidding for control of Clearwire, with the fate of this spectrum in the balance.
  • Spectrum sharing. A lot of spectrum is assigned to entities, usually government agencies, that use it only sparingly. For example, the Defense Dept. operates a scattering of military radars in the 3.5 GHz band. The FCC is currently implementing a plan that will allow commercial use of this spectrum by devices and base stations specially designed to operate only where and when they will not interfere with the radars.
  • White spaces. This is a Wi-Fi-like spectrum-sharing variant that operates on unused portions of the television band. Unfortunately, white space is most available in rural areas and scarce in crowded cities where it is really needed. It is most likely to have its main impact as an alternative to wired broadband service in rural areas.
  • Small cells. The basic principle of cellular communication is that limiting the range of base stations to fairly small areas allows spectrum to be reused, as long as the cells are far enough apart to avoid interference. Cell sizes, which depend on transmit power and the height of the antenna, range from a radius of 30 kilometers in the country to 1 km or less in dense cities. But reuse of spectrum can be increased greatly by using very small cells in the densest areas.
  • Wi-Fi offload. Unlike other wireless technologies, Wi-Fi operates on spectrum that is free for anyone to use, and Wi-Fi access points serve areas with a radius of 100 m or less. The load on crowded cellular data networks can be reduced greatly if as much traffic as possible is shifted to Wi-Fi, and new technologies are enhancing the ability of this offload to be handled automatically and seamlessly.
  • Smart antennas. While small cells reduce the radius of coverage, smart antennas can reduce the angle of the sector covered. Current cellular antennas typically cover a 120° sector. Smart antenna technology can allow base stations to beam their transmissions to the devices to which they are connected, again allowing for greater reuse of spectrum.

Most or all of these technologies are going to be needed in combination to deal with the growing demand for wireless data, but the fact is that the spectrum “crisis” is a challenge we can meet with a combination of sound policy and good technology. I’ll be looking at each of these options in more detail in coming articles in this series.
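
One way to see why combination matters: reuse techniques compound multiplicatively, while new allocations merely add. The factors below are placeholders made up for illustration, not engineering estimates:

    # Hypothetical stacking of reuse gains (illustrative numbers only).
    technique_gains = {
        "Wi-Fi offload": 2.0,
        "small cells": 4.0,
        "smart antennas": 2.0,
    }

    capacity = 1.0
    for technique, gain in technique_gains.items():
        capacity *= gain
        print(f"after {technique:<15} -> {capacity:4.1f}x capacity")

    # Compare: a generous 50% addition of new spectrum on its own.
    print(f"new spectrum alone -> {1.5:4.1f}x capacity")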

*–Truth in mathematics time. The essential characteristic of exponential growth is that it increases at an ever increasing rate. (For those of you who remember your calculus, all derivatives are positive.) This never happens in the real world, at least not for long, because growth is always constrained by something. As noted above, there is, in fact, evidence that the growth in demand for wireless data is already decelerating.

The Shape of 2013: Predictions for the Year Ahead

Crystal ball graphic
After 15 years of making predictions, with a track record that would have made you rich if you’d bet on them, I’ve been away from the practice for a couple of years. But as the regulars at Tech.pinions have agreed to end the year with a set of predictions each, I’m back at the game. My best guesses for 2013:

A Modest Rebound for BlackBerry. Like many others, I was prepared to write off BlackBerry during the last year as its market share cratered. And if Windows Phone 8 had really taken off or if Android had made a serious play for the enterprise, it would be very hard to see where there might be room in the market for Research In Motion, no matter how promising BlackBerry 10 looks. But I think there is room for at least three players in the business, and right now the competition for #3 is still wide open. BlackBerry still enjoys a lot of residual support in the enterprise IT community, and some key federal agencies that had been planning to move away from the platform, such as Homeland Security’s Immigration & Customs Enforcement, have indicated they are open to a second look. The challenge Research In Motion faces is that BlackBerry 10, which will be released on Jan. 30, needs to be appealing enough to users, not just IT managers, that it can at least slow the tide of bring-your-own devices into the enterprise.

A Windows Overhaul, Sooner Rather Than Later. Even before Windows 8 launched to distinctly mixed reviews, there were rumors that Microsoft was moving toward a more Apple-like scheme of more frequent, less sweeping OS revisions. Microsoft sometimes has a tendency to become doctrinaire in the defense of its products; for example, it took many months for officials to accept that User Account Control in Vista was an awful mess that drove users crazy. But Microsoft has had some lessons in humility lately and the company knows that it is in a fight that will determine its relevance to personal computing over the next few years. I expect that, at a minimum, Windows 8.1 (whatever it is really called) will give users of conventional PCs the ability to boot directly into Desktop mode, less need to ever use the Metro interface, and the return of some version of the Start button. On the new UI side, for both Windows 8 and RT, look for a considerable expansion of Metrofied control panels and administrative tools, lessening the need to work in Desktop. In other words, Microsoft will move closer to what it should have done in the first place: offer different UIs for different kinds of uses. The real prize, truly touch-ready versions of Office, though, is probably at least a year and a half away.

Success for touch notebooks. When Windows 8 was first unveiled, I was extremely dubious about the prospects for touch-enabled conventional laptops. The ergonomics seemed all wrong. And certainly the few touchscreen laptops that ran Windows 7 weren’t very good. Maybe it’s my own experience using an iPad with a keyboard, but the keyboard-and-touch combination no longer seems anywhere near as weird as it once did. And OEMs such as Lenovo, Dell, HP, and Acer are coming up with some very nice touch laptops, both conventional and hybrid. Even with a premium of $150 to $200 over similarly equipped non-touch models, I expect the touch products to pick up some significant market share.

Significant wireless service improvements. We’ll all grow old waiting for the government’s efforts to free more spectrum for wireless data to bear fruit. The incentive auctions of underused TV spectrum are not going to be held until 2014, and it will be some time after that before the spectrum actually becomes available. The same is true for a new FCC plan to allow sharing of government-held spectrum in the 3.5 GHz band. But the good news is we don’t have to wait. Technology will allow significant expansion of both the capacity and coverage of existing spectrum. Probably the two most important technologies are Wi-Fi offload, which will allow carrier traffic to move over hotspots set up in high-traffic areas, and femtocells and small cells, which can greatly increase the reuse of the spectrum we already have. Unlicensed white space–unused free space between TV channels–should begin to make a contribution, especially in rural areas where TV channels are sparser. And the huge block of mostly idle spectrum that Sprint is acquiring with its proposed purchase of Clearwire will also ease congestion, probably starting next year. (Stay tuned for a Tech.pinions series on spectrum issues in January.)

Intel Will Make a Major ARM Play. It’s hard to believe today, but Intel was once a major player in the ARM chip business. In 1997, it bought the StrongARM business from a foundering Digital Equipment. Renamed XScale, the Intel ARM chips enjoyed considerable success, with numerous design wins as early smartphone applications processors. But XScale was always tiny compared to Intel’s x86 business, and in 2006 Intel sold its XScale operations to Marvell. A year later, Apple introduced the ARM-based iPhone. Today, ARM-based tablets are in the ascendancy, x86-based PCs are in decline, and Intel is struggling to convince the world that a new generation of very low power Atom systems-on-chips is competitive. Maybe the Clover Trail SoCs and their successors will gain a significant share of the mobile market, but Intel can’t afford to wait very long to find out. With its deep engineering and manufacturing skills, Intel could become a major ARM player quickly, either through acquisition or internal development.

Marty Cooper’s Billion Dollar Spectrum Contest Idea

Martin Cooper photo (S. Wildstrom)
It has been almost 40 years since Martin Cooper made the mobile phone call that earned him the title of father of the cell phone. Today he is still active in the industry, looking for ways to make mobile better. Like many others, he thinks that finding enough spectrum to handle soaring wireless data usage is the great challenge. Unlike many, however, he has ideas that go beyond reallocating a limited pool of wireless spectrum.

One of his concerns is that what has been spectacular growth in the efficiency of spectrum use has slowed. “There’s not much motivation for the people who have the spectrum to get more efficient,” he says. “Why should they get more efficient when all they have to do is ask for more spectrum? Yes, they have to pay for it, but the cost of spectrum at auction is the bargain of the century. Just think about it. You may spend $1 billion to get a piece of spectrum but that spectrum is going to double in value every 2½ years.”

So Cooper, who has spent many years working on smart antenna technology that would allow more effective reuse of spectrum, has an idea to create an incentive. “One possible way, and a way that I suggest would be really valuable for the government to get people to operate more efficiently, is what I call the Presidential Prize. Suppose the government offers the industry the opportunity to get, say, 10 MHz of spectrum free of charge, no auction price or anything. All you’d have to do to get that 10 MHz of spectrum is demonstrate that you could operate at least 50 times more efficiently than existing people. Well, if somebody could do that, they’d have the equivalent spectrum of 50 times 10 MHz, or 500 MHz of spectrum today.

“So my suggestion is let’s have a contest to see who can get to 50 times improvement over the next 10 years or so. It’s going to cost a lot of money to do that, but we’re going to find that we’ll have some new carriers, people that have made substantial investments, and we’ll now be using the spectrum more efficiently. The spectrum belongs to us, to the public, not to the carriers. We only lease it to the carriers, and they are supposed to operate in the public interest. It is in the public interest to use that spectrum efficiently and make it available to more and more people. The only way to do that is to get the cost down.”

You can see much more of my interview with Cooper, including video, on Cisco’s The Network.

The Spectrum Shortage That Isn’t

Photo of Dan Mead
Verizon CEO Dan Mead

If you listen to wireless operators, their industry is on the brink of a catastrophe caused by success. “Innovation is at risk today due to the spectrum shortage that we face,” Verizon Wireless President Daniel S. Mead said in a keynote at the CTIA Wireless 2012 show. “There is no doubt there is a looming spectrum crunch.” CTIA President Steve Largent says we are “on the brink of a major wireless traffic jam.”

Demand for wireless data is definitely growing quickly, though just how fast is subject to dispute. (As in the glory days of wireline internet growth in the late 1990s, there is a tendency to overstate current growth rates and then project them into the indefinite future.) But despite claims that we will exhaust our wireless data capacity by 2014, or 2016, or 2020, the evidence that a shortage of capacity is crippling wireless now, or will anytime in the near or medium term, is simply lacking.

And that’s a good thing, because notwithstanding the wailing of the wireless carriers and their trade association, the CTIA, the prospects for any major new allocation of spectrum are grim. Congress has authorized a complex scheme known as incentive auctions, in which television broadcasters will receive part of the proceeds if they allow the government to auction off spectrum they are not using.

It’s a fine idea, but it’s complicated by the fact that creating usable blocks of bandwidth will require some TV stations to move to new frequencies. Broadcasters are not flocking to offer spectrum. Bottom line: it’s going to take a lot longer than hoped to free any bandwidth for wireless data, and in the end the amount of new spectrum is likely to be substantially less than the 120 MHz that the Federal Communications Commission was hoping for. Wresting unused or underused spectrum from federal agencies (especially the military) is likely to prove even harder.

Promoting the idea of a spectrum shortage serves carriers’ interests in several ways. AT&T used it as a major justification for its failed acquisition of T-Mobile, and Verizon makes the argument to support its proposed purchase of unused spectrum from a group of cable operators. Treating bandwidth as a scarce resource helps justify high prices and restrictive usage caps.

What carriers can do.

Speed LTE deployment. There is a lot the carriers can do–and in some cases are doing–to alleviate any crunch. The first is an accelerated move to LTE technology. The carriers have promoted LTE as being faster than existing technologies and, in general, it is, but its real importance is that it uses its bandwidth far more efficiently than the 3G EV-DO and HSPA technologies. Verizon, which had hit a speed wall with EV-DO, has been the most aggressive in deploying LTE, but AT&T is catching up. Sprint, which made a bad bet on the alternative WiMAX technology, and T-Mobile are starting to move.

More Wi-Fi offload. Especially in the locations where demand is greatest, carriers can ease the pressure on their wireless networks by moving data traffic to Wi-Fi. The new Hotspot 2.0 (IEEE 802.11u) standard should provide for seamless transfer of sessions between wireless broadband and Wi-Fi. But the carriers have to support the hotspots and provide adequate backhaul capacity.

Small cells. Cellular communications is based on the concept that bandwidth can be reused by having each base station provide coverage to a relatively small area whose size is governed by power levels and the height of the antenna. In rural areas, carriers use very tall towers to cover big, but lightly used, areas, while in dense city cores, antennas are mounted much lower. Carriers could provide for much greater reuse of spectrum by going to even smaller microcells, which would be more like Wi-Fi hotspots in coverage. The downside is that this requires building, paying for, and siting many more base stations, but it could greatly increase capacity. Ericsson, Alcatel-Lucent, and Cisco are all developing small-cell gear and AT&T plans to begin testing service later this year.

Agile radios. From the beginning of wireless communications, the basic approach has been to assign dedicated spectrum to each user, with hardware designed to operate at very specific frequencies. This guarantees an environment in which some assigned frequency bands are very crowded while others are underused. There may be plenty of spectrum in the aggregate, while specific slices of it are clogged. For years, the dream has been to move to the use of agile, or software-defined, radios that could operate on any available spectrum. The technology is finally reaching the point where this sort of agility is technologically possible. But the transition will be very complex: We have a nearly century-old regulatory regime based on discrete spectrum slices. Licensees have valuable assets in their assigned spectrum, which also serves as a powerful barrier to new entrants. And billions of existing devices would have to be replaced to take advantage of an agile system. Needless to say, a move to a new system is going to take a very long time.
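
A toy sketch of the idea in Python, with everything invented for illustration: the radio senses which slices of a band are busy and moves to an idle one, rather than being locked to a fixed assignment. (Real schemes, like the FCC’s 3.5 GHz plan, add geolocation databases and priority rules.)

    import random

    NUM_CHANNELS = 20

    def sense_band():
        """Pretend spectrum sensing: each channel is busy ~70% of the time."""
        return [random.random() < 0.7 for _ in range(NUM_CHANNELS)]

    def pick_idle_channel(occupancy):
        idle = [ch for ch, busy in enumerate(occupancy) if not busy]
        return random.choice(idle) if idle else None

    occupancy = sense_band()
    channel = pick_idle_channel(occupancy)
    busy_count = sum(occupancy)
    if channel is None:
        print(f"{busy_count}/{NUM_CHANNELS} channels busy; backing off")
    else:
        print(f"{busy_count}/{NUM_CHANNELS} channels busy; using channel {channel}")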

Wireless is clearly the future and a powerful driver of innovation and economic growth. More spectrum is always better. But there are good solutions to alleviate shortages in the short and medium term. The situation is nowhere near as dire as the carriers would have us believe.