Podcast: Oculus Quest 2 VR Headset, Sony PlayStation 5, Apple Event

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell discussing the news from the Facebook-owned Oculus press event on their new VR headset and plans for AR glasses, chatting about the release details of Sony’s forthcoming gaming console, and analyzing the news from Apple’s event, including new Apple Watches, iPads and its Apple One bundle of services.

Takeaways from CBRS Auction and Implications for the C-Band

The 3.5 GHz CBRS PAL auction concluded on August 26, and the winners were announced a week later. Overall, the auction raised $4.4 billion, which was slightly above what many analysts had expected. The big winners: Verizon, DISH, Comcast, and Charter. Interestingly, a number of enterprises also won licenses, which demonstrates the interest in deploying private wireless networks as a complement to, or replacement for, Wi-Fi and for certain specialized applications.

Here’s my take on the winners and losers in the auction, what it means for 5G and the competitive structure of the wireless industry, and the implications for the even more consequential C-band auction, which is scheduled to start on December 8.

As a quick backgrounder, the FCC has made 150 MHz of spectrum available in the 3550-3700 MHz band, the 3.5 GHz CBRS ‘mid-band’. This band, incidentally, is also being used for 5G in many other countries. The initial phase of CBRS, called the GAA (General Authorized Access) layer, consists of 80 MHz of shared spectrum, meaning that it can be used by anyone (i.e. it is not auctioned) as long as it’s available, through a regime managed by four Spectrum Access System (SAS) administrators. The GAA layer became commercially available in late 2019, and so far has been used mainly for corporate or venue-type deployments.

The second phase was the 70 MHz PAL (Priority Access License) auction (called Auction 105), for 10-year licenses, in 10 MHz channels, on a county-by-county basis. A provider could bid for up to 40 MHz in a particular geographic region. That auction concluded on August 26. Of the $4.4 billion spent in the auction, Verizon was the big winner, spending $1.9 billion. DISH spent $913 million, Comcast spent $458 million, and Charter spent $212 million. T-Mobile won only a handful of licenses, and AT&T did not win any at all.
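
As a back-of-the-envelope check, the band plan and bidding-cap arithmetic described above is simple enough to sketch in a few lines of Python (the figures come from the auction rules cited above; the variable and function names are just illustrative):

    # CBRS 3.5 GHz band plan, per the figures above (names are illustrative)
    TOTAL_CBRS_MHZ = 150     # 3550-3700 MHz
    GAA_MHZ = 80             # shared GAA layer, not auctioned
    PAL_MHZ = 70             # PAL spectrum auctioned in Auction 105
    CHANNEL_MHZ = 10         # size of one PAL license
    BIDDER_CAP_MHZ = 40      # max PAL spectrum one bidder can win per county

    assert GAA_MHZ + PAL_MHZ == TOTAL_CBRS_MHZ

    PAL_LICENSES_PER_COUNTY = PAL_MHZ // CHANNEL_MHZ  # 7 licenses per county

    def valid_pal_bid(requested_mhz: int) -> bool:
        """Check one bidder's request against the 40 MHz per-county cap."""
        return (requested_mhz % CHANNEL_MHZ == 0
                and 0 < requested_mhz <= BIDDER_CAP_MHZ)

    print(PAL_LICENSES_PER_COUNTY)               # 7
    print(valid_pal_bid(40), valid_pal_bid(50))  # True False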

The auction was especially critical for Verizon, which is in the weakest spectrum position of all the major operators, especially in the mid-band. I expect its 3.5 GHz efforts are a precursor to what will be a more significant spend in the C-band auction, where 280 MHz of spectrum in the 3.7-4.2 GHz band will be made available.

The magnitude of DISH’s spend was a bit of a surprise, given DISH’s already strong spectrum position and its rather precarious cash situation, as the company bleeds pay TV subscribers. But it’s an important signal of DISH’s commitment to being in the wireless business. In addition to its treasure trove of low-band spectrum, DISH reaped the spoils of the T-Mobile-Sprint merger, getting some of the excess spectrum, Boost Mobile from Sprint, and an MVNO relationship with T-Mobile. So DISH is now a bona fide retail player in the prepaid wireless space, with 9 million Boost subscribers (and 270,000 from the recently acquired Ting), and it plans to build a nationwide 5G network in what is likely to be a hybrid retail/wholesale strategy.

The cable companies were the other notable winners in the auction. Cable — mainly Comcast (Xfinity Mobile), Charter (Spectrum Mobile), and Altice — have been steadily growing their wireless businesses, counting some 4 million wireless subscribers between them. However, those mobile businesses rely on several million Wi-Fi hotspots and an MVNO relationship with Verizon (Altice’s is through T-Mobile). Their participation in the PAL auction is the first time the cable companies have acquired a meaningful amount of wireless spectrum, enabling them to add some physical cellular infrastructure to complement their Wi-Fi/MVNO strategy. It demonstrates their commitment to being in the mobile space, and their desire to be less dependent on the MVNO structure, given its rather unfavorable economics.

T-Mobile’s relative lack of participation in the 3.5 GHz auction was no big surprise, given its strong mid-band position by virtue of the 150 MHz of 2.5 GHz spectrum it got from Sprint. AT&T was conspicuously absent from the CBRS auction. We suspect they’re saving their powder for the C-band auction.

Outside of the major operators, this was also a successful auction in bringing in some new, innovative players. Among the winners were:

  • Several Wireless Internet Service Providers (WISPs), some of them already using GAA spectrum, which will use the licenses to provide rural broadband via fixed wireless access (FWA). They’re also counting on additional funds being made available for rural broadband initiatives, an idea that has gained steam in the wake of Covid and the need to narrow the digital divide.
  • Numerous private companies. A 10 MHz license provides sufficient bandwidth for a company to deploy a private LTE or 5G network. Among the auction winners were Deere & Company and Chevron, which could use their licenses to provide connectivity to manufacturing and other facilities that are harder to reach with Wi-Fi. Several real-estate companies also won licenses, the idea being that building- or campus-wide networks could be deployed. Power companies collectively spent more than $50 million on licenses, to support smart grid and IoT applications or to use wireless as a connectivity backup.

Implications for the C-Band Auction

Given that the CBRS auction is in the mid-band, it’s viewed as somewhat of a warm-up act for the upcoming C-Band auction. This will be the largest auction in U.S. wireless history, with 280 MHz being made available in the attractive 3.7-4.2 GHz band. We expect all the major players to be there. Wall Street analysts expect this auction could raise $50 billion or more.

The auction is probably most consequential for Verizon, which still needs additional mid-band spectrum to meet 5G coverage and capacity needs. I also expect AT&T to spend big, especially since it sat out CBRS. And even though DISH and T-Mobile are in a pretty good spot spectrum-wise, it’s almost certain they’ll be active in the auction, given the attractiveness of the C-band, the seemingly insatiable need for additional capacity…and a defensive strategy to ensure that major competitors don’t end up in an overly advantageous position.

Financing will be a big story over the next three months, as the C-band auction looms. I don’t think it’s a coincidence that there have been recent rumors about AT&T potentially shedding some of its media assets, particularly DirecTV and possibly Xandr. AT&T’s balance sheet isn’t pretty. Wall Street and Elliott Management aren’t going to let AT&T go wild at the C-band auction without the means to pay for it.

DISH also needs to come up with some source of funding if it’s going to both be active in the C-band auction and spend the $10 billion (or more) required to build its 5G network. I believe that DISH’s strategy is to offer a retail wireless operation (like Verizon, AT&T, etc.) but also to run an active wholesale business, given its favorable capacity position. In fact, DISH really needs a major anchor tenant to sign a long-term deal, which would provide DISH with the resources needed to execute its wireless strategy. My bet is that it will be one of the major Internet players, such as Amazon. The cable companies are also possible candidates, as they look for more favorable MVNO terms than they currently have with Verizon. Given that DISH is a major competitor in the pay TV space, this would be among the frenemiest relationships in telecom.

Between the mmWave auctions completed earlier this year, the recently completed CBRS auction, the upcoming C-band auction, and the FCC’s announcement a few weeks ago that it would auction off another 100 MHz of spectrum in the 3.45-3.55 GHz band in 2021, we’re seeing an unprecedented expansion of capacity being made available for commercial wireless use. This will alter the competitive landscape and will be a catalyst for the sorts of innovative new use cases envisioned for 5G. Ensuring adequate spectrum resources for 5G is also the ante needed for being competitive on the global 5G stage.

Podcast: Microsoft Surface Duo, Motorola Razr, Microsoft Xbox Series X, Apple Event Preview

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell discussing their experiences with Microsoft’s Surface Duo device, analyzing the launch of the second generation Motorola Razr foldable 5G phone, chatting about the details of the next generation Xbox gaming console, and previewing next week’s Apple event.

Podcast: Samsung Galaxy Z Fold 2, Nvidia Gaming GPUs, Intel CPUs and Branding, Qualcomm IFA Announcements

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell discussing their experiences with Samsung’s second generation foldable device, analyzing the GeForce RTX 3000 series GPU announcements from Nvidia, talking about Intel’s new 11th Generation Core CPUs and the company’s new Evo platform brand, and chatting about the many different announcements from Qualcomm’s IFA keynote speech.

Podcast: TikTok, Apple-Facebook, HP and Dell Earnings, Fall Product Preview

This week’s Techpinions podcast features Ben Bajarin and Bob O’Donnell discussing the latest developments and challenges around the potential sale of social media app TikTok, the controversies between Apple and Facebook on activity tracking, the latest quarterly earnings from PC industry leaders HP and Dell, and the potential impact of a range of tech products expected to be released this fall.

Podcast: 5G, Radio Frequency Spectrum and What it All Means

This week’s Techpinions podcast features Mark Lowenstein and Bob O’Donnell explaining many of the details of how 5G works, what radio frequency (RF) spectrum is, why it’s critically important and what the latest developments are, how all of this impacts telco carriers and device makers, and more.

Podcast: Microsoft Surface Duo, Qualcomm Court Decision, Fortnite Battle with Apple and Google

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell analyzing the news around Microsoft’s Surface Duo mobile device, discussing the positive legal outcome for Qualcomm’s IP licensing business, and debating the issues around Epic Games’ Fortnite-driven battle with Apple’s and Google’s app store policies.

Podcast: Samsung Unpacked, T-Mobile 5G, Apple App Store, Microsoft-TikTok

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell discussing the announcements from the Samsung Unpacked event, including their new Note 20 and Galaxy Z Fold 2 smartphones as well as their partnership with Microsoft on software and gaming services, chatting about T-Mobile’s launch of the world’s first 5G SA (Standalone) network, controversies around Apple’s App Store policy and cloud-based gaming services like Microsoft’s upcoming xCloud, and analyzing the potential purchase of TikTok by Microsoft.

Podcast Special: Marta Karczewicz of Qualcomm Discussing Video Compression Technology

This is a special Techpinions podcast with Carolina Milanesi and Bob O’Donnell along with special guest Marta Karczewicz, VP of Technology at Qualcomm, discussing the evolution of video compression technology and standards and how they impact our ability to watch streaming videos from services like Netflix on our smartphones and TVs. In addition, they discuss the role of women in engineering roles and the importance of diversity in technology research and development.

Samsung #Unpacked2020: Strong High-End Portfolio & Deeper Collaboration with Microsoft

The Samsung Galaxy Unpacked part one, back in February, was the last live event I attended. Hence, as we approached this week’s Unpacked, I was as curious about the products to be announced as I was about how Samsung would pull off its first digital launch.

Overall, I thought Samsung did a good job mixing content videos, technical and informational videos, and time on the virtual stage. Personally, I was not a fan of the virtual audience, but I think it did fit the feeling created at the live venues of the past couple of events, with their big floor-to-ceiling screens displaying both content and the audience in the room.

I appreciated having the opportunity to see new faces from the engineering and design teams. I suppose it is the silver lining of having a digital event and recording in Korea. This setup also brought more women on stage, which is always a good thing!

There was a lot to cover product-wise, but Samsung kept a fast pace and, for the first time, brought all the products together, demonstrating the value of having more than one Samsung product. Samsung has been trying to paint that “better together” picture for a while, but what was missing was the software that would actually bring the products together. This time, thanks both to Samsung’s own software and a renewed partnership with Microsoft, the dotted line between products was much more obvious and natural.

Products that Do More At a Time When We All Do More

With the August Unpacked being the Galaxy Note reveal show, we have grown accustomed to focusing on the latest and greatest tech, and thus on the very high end of the Samsung portfolio. Over the past year, there have been some questions about whether the Note line continued to fulfill its initial promise of being the best of what Samsung has to offer in mobile, especially as the market moves into foldables. I think the Galaxy Note20 Ultra took care of those concerns by embracing quite a few technology firsts, from the 5G Snapdragon 865+ chipset to Gorilla Glass Victus and UWB.

What was interesting this year is that Samsung announced a whole portfolio of high-end devices around the Galaxy Note line. Galaxy Z Fold 2 teaser aside, we saw the Galaxy Tab S7, the Galaxy Watch 3, and the Galaxy Buds Live. While it might seem strange to bring high-end products to market in the current economic environment, we need to consider that this is not Samsung’s only offering. Earlier in the summer, Samsung launched a whole range of mid-tier phones that added to its Galaxy S line to give smartphone buyers an ample choice of features, designs, and price points.

Together with a lot of economic uncertainty, the pandemic also brought a stronger need for technology and reliable devices, whether for working from home, distance learning, staying healthy, or just trying to stay sane. While being stuck at home might have increased the time we spend on larger screens, it has not taken away how much we rely on our phones. Phones also remain the tech device that is easiest to plan for financially, thanks to installment plans that limit the impact of a one-off purchase.

Samsung’s strong carrier channel and 5G integration might also make the Galaxy Tab S7 line as easy to purchase as a phone, at a time when many consumers are re-evaluating their computing needs as well as their broadband constraints!

The one product that I find harder to justify, although it fits into the portfolio, is the Galaxy Watch 3, where the price point reflects design choices more than technology ones. I would have liked Samsung to double down on its Galaxy Watch Active line, maybe with a new color variant to fit with the new Galaxy Buds Live. The good news is that many of the features and capabilities announced for the Watch 3 are software-driven, which might mean we will see them trickle down to the Watch Active line at some point.

The Galaxy Buds Live are possibly the product with the smallest footprint and the biggest opportunity of everything Samsung announced on stage. Having used them for a few days, I am convinced they will become the default for Android users and possibly win over some iOS users too, because of their price point and fit. They are by far the most comfortable earbuds I have ever used, with good sound and OK active noise canceling.

Samsung and Microsoft Better Together

Microsoft has been focusing on improving how users can move seamlessly from their Android phones to their PCs for quite some time. In the process, the relationship between Samsung and Microsoft has grown tighter, to the advantage of both companies. For Microsoft, Samsung offers a fleet of mobile phones for its apps and services, especially in the enterprise. For Samsung, Microsoft offers apps and services that help it lessen its dependence on Google and offer differentiation within the Android ecosystem.

This week the relationship between the two companies deepened on the productivity side and expanded into the entertainment side.

On productivity, Microsoft’s Your Phone app and Link to Windows will allow Galaxy Note20 users to access and interact with their Android apps from their PC. Samsung updated its Samsung Notes app, which will soon be able to sync automatically with the OneNote feed in Outlook on the web and in OneNote, where notes appear as images. Inking support on the Note20 was also extended to photos, and Outlook brings its Play My Emails feature to Android.

On entertainment, Xbox chief Phil Spencer announced that, starting September 15, Galaxy users will be able to download the Xbox Game Pass app from the Samsung Galaxy Store. This version will allow Xbox players to redeem tokens and make in-app purchases, like buying skins or DLC items, in the Xbox Store; the Xbox Game Pass app on the Google Play Store will not offer these types of in-app purchases. Customers pre-ordering the Galaxy Note20 can select the Gaming Bundle at purchase and get three months of Xbox Game Pass Ultimate plus PowerA’s MOGA XP5-X Plus, the controller used with xCloud.

For xCloud to be successful, Microsoft must reach beyond console and PC gamers and rely on the army of Android users out there. Working with Samsung offers a way to bypass the Google Play Store for some offers, as well as to leverage Samsung’s market share in TVs, the next logical step where xCloud gaming has a natural fit. For Samsung, which stated two years ago at its developer conference that it wanted its devices to deliver the best gaming experience in the Android ecosystem, xCloud offers an alternative to Stadia and, with it, another differentiator against competitors.

A Peek at the Galaxy Z Fold 2

Understandably, the Galaxy Z Fold 2 appeared only briefly on stage, most likely so as not to steal the moment that belonged to the Galaxy Note20. The time on stage, albeit limited, was very focused. True to its customer-focused nature, Samsung started the segment by acknowledging that the launch of the original Fold did not unfold as planned (sorry, I could not resist the pun!). From acknowledging the issue, Samsung moved on to show what has changed with the Galaxy Z Fold 2 from a design perspective to improve usability and increase confidence in durability. We saw the larger 6.2” external display and the front camera system, now reduced to a punch-hole from the previous design, which took up a large corner of the screen and created a lopsided forehead. We were also shown a pretty detailed video on the new “sweeper” technology that Samsung created to limit the amount of debris that can get into the now even thinner gap along the hinge, a technology apparently inspired by the bristles used on vacuum cleaners.

We will have to wait until September 1 to learn more about the Galaxy Z Fold 2, but from what we heard today, it is clear that there has been quite a bit of refinement since version one.

The Big Picture

For the first time at an Unpacked event, we had a Q&A, during which Samsung Mobile’s President and Head of Communications Business, TM Roh, shared his view on the business direction, saying 5G and foldables will be the cornerstones of Samsung’s future. He also shared that he understands the responsibility Samsung has to help make a better world by safeguarding privacy and security as well as the environment. While not quite the off-the-cuff conversation I would have liked, it seems that TM Roh wants to make more of an attempt at storytelling (see his blog Steering the Mobile Industry through the next normal), a skill that is certainly growing in importance among tech leaders.

Leading up to Unpacked, there were rumors that Samsung was in discussions with Google to embrace more of its services, but we heard nothing about that on stage. This is not surprising, as Unpacked was really centered on the relationship between Samsung and Microsoft. Something else we did not hear much about, however, was Bixby, which, to me, is where Samsung might lean more on Google and decide simply to embrace Google Assistant. The friction currently added by using Google Assistant on Samsung’s devices is a limitation that will become more and more noticeable as users’ reliance on digital assistants grows, and as Google Assistant is embedded in more services and applications and becomes a value-add in the experience that products like the Galaxy Buds Live can deliver. The relationship with Google might be more center stage on September 1, when Samsung will provide more details on the Galaxy Z Fold 2. Let’s see!

Podcast: AMD Earnings, Congressional Hearings, Amazon, Apple, Facebook and Google Earnings

This week’s Techpinions podcast features Ben Bajarin and Bob O’Donnell analyzing the quarterly financial results from AMD and what they say about the semiconductor industry overall, discussing the congressional antitrust hearings with major tech CEOs, and chatting about the earnings from those same companies as well.

The Shifting Semiconductor Sands

There was a time—and it wasn’t really that long ago—when, if you asked anyone who tracked the chip business about Intel, they probably would have said the company was invincible. After all, Intel owned 99% of the datacenter CPU market, close to 90% of the PC CPU market, and even made ambitious acquisitions in other “alternative” architectures, such as FPGAs (Field Programmable Gate Arrays) and dedicated AI processors. On top of that, it had a decades-long history of impeccable execution and industry-leading innovations in the process of semiconductor manufacturing.

And then, last Thursday hit.

During what was otherwise a stellar second quarter earnings report, with impressive revenues and growth numbers across the board, the company acknowledged that their already delayed transition to 7nm process technology for future generation CPUs was delayed for another six months. Now, arguably, that really shouldn’t be that big of a deal. After all, this is ridiculously complex technology. The company said they knew what the problem was and, therefore, had a clear path to fixing it. They also certainly wouldn’t be the first major tech company to face some technical challenges that caused delays in the release of eagerly awaited new products.

But the market didn’t see it that way, and subsequently, Intel stock has lost nearly 20% of its value in the last week. To be fair, this is also a stock market that over the last few months has shown absolutely no sense of rationality, so you have to take any dramatic stock price moves in the current environment with a very large grain of salt.

Fundamentally, however, there appears to be some loss of faith in Intel’s previously irreproachable reputation for delivering what they said, when they said they would do it. While some view the most recent news, as well as the forthcoming and likely related departure of chief engineering officer Murthy Renduchintala, as the primary catalyst for this perspective, you could make the argument that the problem started earlier. In the case of dedicated AI accelerators, for example, Intel made a large investment in Nervana and put Nervana’s main execs in charge of their dedicated AI investments back in 2016. Then, shortly after they released their first Nervana chips to customers, they essentially abandoned all that work to purchase Habana Labs for $2 billion late last year and moved in a different direction. Obviously, cutting edge technologies like AI accelerators can certainly shift quickly, and, in this case, Intel clearly recognized that they needed to make an aggressive move. However, it certainly raised some questions.

At the same time, there are also several other very interesting developments in the semiconductor market that appear to be driving some fundamental shifts in how people (and investors) are viewing it. One, of course, is a hugely reinvigorated AMD—a fact that’s been reflected in the company’s impressive growth and even more impressive stock price run over the last several years (as well as the nice boost it received last week as a result of Intel’s news).

To their enormous credit, AMD’s CEO Lisa Su, CTO Mark Papermaster and team have done a remarkable job in turning a company that some felt was headed for extinction just a few years back, into a formidable competitor and an important force in the chip industry overall. You could argue (and many have) that, from a market valuation perspective, the company has received more credit than its sales numbers reflect. However, there’s no question that AMD has been shaking up and enlivening the previously static CPU market and that it will continue to do so for many years to come.

In addition, there’s been a great deal of momentum recently toward Arm-based CPUs in both datacenters and PCs. Apple’s recent announcement that it will switch from Intel CPUs to its own Arm-based designs in future Macs, for example, highlights some of the high-level changes happening in the CPU market.

Despite all this bad news for Intel, it is important to keep everything in perspective. Intel is still by far the largest CPU manufacturer in the world and will be for some time to come. The company will certainly be facing a more competitive marketplace than it has had to worry about for a very long time, but it’s undoubtedly up to the task. Also, in the long run, good competition will inevitably be better for all of us.

As a long-time Intel follower who essentially learned most everything about the importance of process technology from Intel (they’ve done a fantastic job of educating analysts and press about these issues for a very long time), I have to admit that it’s somewhat shocking to see Intel in this state. At the same time, it’s also important to remember that not all numbers in the semiconductor process game are created equal. While it’s certainly up for debate, Intel has argued for years that its 7nm process is closer to what other vendors call 5nm.

Regardless of the numbers, however, it is clear that Intel has slipped from its mantle of invincibility and will need to re-prove itself to the industry and the market at large. The fact that the company has already discussed working with third-party foundries on advanced process nodes for some of its upcoming chips (including its widely anticipated new GPU) is a testament to that. In the Intel of old, that decision would have probably been unthinkable. But we are in a new era, and despite these short-term concerns, it is encouraging to see Intel’s CEO Bob Swan willing to admit the challenges they have and take some aggressive actions to address them.

The sands beneath the semiconductor market are clearly shifting, and it’s going to be very interesting to see how things look over time.

Podcast: Microsoft Inspire, Google G Suite Essentials, Netflix, Microsoft and Intel Earnings

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell analyzing the Azure and Microsoft M365 news and diversity and leadership sessions from Inspire, discussing the addition of Google’s G Suite Essentials offering, and chatting about the quarterly earnings from Netflix, Microsoft and Intel and what they say about the current state of the tech market.

Microsoft and Partners Bring More Hyperconverged Hybrid Cloud Options to Azure

When it comes to cloud computing, there’s little doubt that we’re in a hybrid world. In fact, that point comes through loud and clear in two different studies published this year by TECHnalysis Research. Both the Hybrid and Multi-Cloud Strategies in the Enterprise report and the recently published Pandemic-Based IT Priority Shifts report highlight the high degree of usage, strategic importance, and budget spent on hybrid computing models. Indeed, in many instances, hybrid cloud is considered more important than the older and more established public cloud computing methodologies.

The reason? While every company would certainly like to be running nothing but containerized, cloud-native applications, the reality is that almost none do. There’s simply too much legacy software (typically still close to 50% of most organizations’ applications) and too much legacy datacenter hardware that companies need to keep using for a variety of reasons, including regulatory, security, and cost concerns. In the meantime, private clouds and hybrid models that combine or connect private cloud workloads with public cloud workloads serve as a critical steppingstone for most organizations.

As a result, we’ve seen many different tech vendors create new hybrid cloud offerings recently to tap into the burgeoning demand. At the company’s partner-focused Inspire event, Microsoft unveiled several new hybrid cloud-focused additions to its Azure cloud computing platform. In particular, it announced additional capabilities for Azure Stack HCI—the local, on-premises-compatible version of Azure that runs on specialized, Microsoft-certified appliances from hardware partners like Dell EMC, HPE, and Lenovo.

These hardware appliances are built using an architecture called hyperconverged infrastructure, or HCI, which essentially combines all the elements of a datacenter, including compute, storage, and networking, into a single, software-defined box. The beauty of the HCI approach is that it virtualizes all these elements so that simple, off-the-shelf servers can be organized and optimized in a way that improves their performance, functionality, and reliability. For example, virtualizing the storage provides SAN (Storage Area Network)-like capabilities and dependability to an HCI environment without the costs and complexities of a SAN. Similarly, virtualizing the networking lets an HCI device offer the capabilities of a load balancer via software, again without the costs and complexities of purchasing and deploying one. Best of all, these software-defined datacenter capabilities can scale up to large datacenter environments or down to branch offices and other edge computing applications.
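
To make the pooling idea concrete, here’s a toy Python model of what the HCI software layer does conceptually: each commodity node contributes local compute and disks, and the software presents them as one logical pool (purely illustrative; the class and field names are mine, not any vendor’s API):

    from dataclasses import dataclass

    @dataclass
    class Node:
        cores: int
        storage_tb: float

    class HciCluster:
        """Toy model: pools per-node resources into one logical system."""
        def __init__(self, nodes):
            self.nodes = nodes

        def pooled_cores(self) -> int:
            return sum(n.cores for n in self.nodes)

        def pooled_storage_tb(self) -> float:
            # Virtualized storage presents every node's local disks as one
            # resilient pool: SAN-like capacity without a separate SAN.
            return sum(n.storage_tb for n in self.nodes)

    cluster = HciCluster([Node(32, 20.0), Node(32, 20.0), Node(32, 20.0)])
    print(cluster.pooled_cores(), cluster.pooled_storage_tb())  # 96 60.0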

While Microsoft has talked about Azure Stack HCI before, it announced several new capabilities at Inspire. Notably, Azure Stack HCI is now a fully native Azure service, which means you can use the Azure Portal as a combined management tool for public cloud Azure computing resources along with any local Azure Stack HCI resources, such as virtual machines, virtualized storage, and more. This gives IT administrators the classic “single pane of glass” UI for monitoring and managing all their different public, private, and hybrid cloud workloads. In addition, making Azure Stack HCI a native Azure service makes it significantly easier to use other Azure PaaS (Platform as a Service) capabilities, such as Azure Backup and Azure Security Center, with private cloud workloads. In other words, it essentially allows companies to pull these two “worlds” together in ways that weren’t possible before.

One particularly nice feature of these new Microsoft-certified systems is that they can be purchased with the Azure Stack HCI software already installed and configured, making them about as easy to set up as possible. You literally plug them in, turn them on, and they’re ready to go, making them suitable for smaller businesses, branch offices, or other locations where there may not be dedicated or specially trained IT staff. In addition, Microsoft offers the option of installing the new Azure Stack HCI on existing datacenter hardware, if it meets the necessary certification requirements.

Combining the software-defined datacenter (SDDC) capabilities inherent in HCI with the cloud-native opportunities of Azure Stack was initially a big step forward in getting companies to modernize their datacenters from both a hardware (HCI) and a software (Azure) perspective. While it may seem logical to pair them, those two modernization efforts don’t necessarily go hand-in-hand, so it was an important step for Microsoft to take. In doing so, the company made the process of migrating more apps to the cloud (and, hopefully, modernizing them along the way) much easier.

This is particularly important for companies that may have been a bit slower in moving their applications to the cloud and/or organizations that may have run into roadblocks with some of their legacy applications. Not all IT organizations have all the skillsets they need to do this kind of work, so the more that can be done to make the process easier, the better. With its latest additions to Azure Stack HCI, Microsoft is moving down the path of further simplification and helping draw the worlds of legacy applications and hardware and the cloud a little bit closer together. No matter how you look at it, that’s a step in the right direction.

Podcast: Google Cloud Next, G Suite, IT Priority Study, Twitter Hack

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell analyzing the announcements from Google’s Cloud Next event, including new offerings for GCP and G Suite, discussing a new study on IT prioritization changes from the pandemic, and chatting on the big Twitter hack.

New Study Highlights Pandemic-Driven Shifts in IT Priorities

At this point, everyone understands that the COVID-19 pandemic has had a profound impact on all aspects of society, including our personal and professional lives. But just as our understanding of how the virus spreads and its impact has shifted over time, so too has our perception of exactly how that impact is being felt in different areas.

In order to better understand specifically how the pandemic has affected IT environments in US-based medium businesses (100-999 employees) and large enterprises (1,000+ employees), TECHnalysis Research embarked on a study last month of over 600 US-based IT decision makers. Survey respondents were asked a number of questions about what their companies’ strategic and spending priorities were before the pandemic at the beginning of the year, and what they are now several months into the pandemic. In addition, respondents were asked how they expect their work environments to change, how they are acquiring and deploying PCs for their employees, how their cloud computing and app modernization efforts are evolving, and much more.

Needless to say, the results were fascinating. At a high level, one of the most interesting discoveries was that, despite many dire early warnings, IT spending plans for the year are generally still intact, with average annual IT budgets expected to increase 7% this year. From a change perspective, as Fig. 1 illustrates, that means overall levels are expected to be down just 1% versus what was expected at the beginning of the year. Breaking it down by company size shows that medium-sized businesses are now expecting their IT budgets to grow slightly, while large enterprises are expecting a larger 2.3% drop overall.

Fig. 1
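
A quick worked example shows how those two numbers fit together: budgets can still grow 7% year-over-year yet land about 1% below plan if the original plan called for roughly 8% growth. (Only the 7% and the -1% come from the survey; the 8% starting expectation below is an assumption for illustration.)

    # Hypothetical reconciliation of +7% budget growth with -1% vs. plan
    planned_growth = 1.08   # assumed start-of-year expectation (illustrative)
    current_growth = 1.07   # growth now expected, per the survey
    vs_plan = current_growth / planned_growth - 1
    print(f"{vs_plan:.1%}")  # -0.9%, i.e. down about 1% versus the original plan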

Priority-wise, what’s clear from the data is that companies shifted their focus from things that would be “nice-to-have” to things that they “need-to-have”. Specifically, this means that from both an overall strategic and a spending perspective, purchasing laptops for employees became the top priority, overtaking (at least temporarily) the attention and dollars given to private, hybrid, and public cloud computing efforts. Conversely, it also means that some of the biggest decreases in prioritization and spending hit highly touted technologies such as edge computing, IoT, and private enterprise cellular networks.

From a PC client perspective, there have also been some very interesting shifts in the acceptance of different deployment and acquisition strategies. Notably, VDI (virtual desktop infrastructure) usage—which many have downplayed in the past as a backward-looking technology—has grown by over 11 percentage points since the start of the year. In addition, after appearing to have fallen out of favor, BYOD (Bring Your Own Device) programs—where employees purchase and use their own PCs—are now in place in over half of the companies that responded to the survey. Obviously, many of these changes were driven by the massive work-from-home experiment that IT departments around the world have had to respond to immediately. However, given the widely touted productivity levels that many people have reported while working from home, many of those policies are likely to stay.

What’s also unlikely to change is the dramatic increase in the number of people who want to continue working from home. As Fig. 2 illustrates, on average, companies expect to have just over one-third of all employees still working from home into next year.


Fig. 2

Once people go back to the office, they’re also likely to see some dramatic differences when they get there. In fact, only 12% of respondents don’t expect changes to their work environments, meaning 88% do. Anticipated changes include increased sizes of work areas and cubicles, physical barriers between work areas and cubicles, and shifts from open office environments to traditional office/cube arrangements. In addition, about three-quarters of respondents expect their companies to adjust the amount of real estate they have. Interestingly, medium-sized businesses expect to increase their office space in order to provide more room per worker, while respondents from large enterprises felt their companies were more likely to close some offices and reduce their real estate.

Of course, as recent news has highlighted, the virus and its impact continue to evolve, so there’s no great way to know exactly how all these different factors will play out until time passes. Overall, however, it’s clear that, from an IT perspective, the reactions to and impact from the virus so far are less severe than many feared. One positive side of the pandemic is that companies are throwing out their old rule books and looking at all the various technological tools at their disposal with a fresh set of eyes. Moreover, many organizations plan to aggressively adopt more advanced technologies as a means not only to survive but to thrive in our new normal.

Technology, in its many forms, has proven to be a real saving grace for many organizations in these first few months of the pandemic. As a result, company leadership recognizes the importance of IT initiatives and will likely continue to allocate resources there into the foreseeable future. This isn’t to say we won’t see big challenges for some tech, particularly for IT shops and tech suppliers to hard-hit industries like travel, entertainment, etc. For the IT departments in many businesses, and most of the major tech vendors supplying them, however, the opportunities even in these challenging times continue to be strong.

(You can download a free copy of the highlights of the “Pandemic-Based IT Priority Shifts” report here. A copy of the complete 75-slide study is available for purchase.)

Podcast: Q2 2020 US CE and PC Sales Trends with NPD’s Steve Baker

This week’s Techpinions podcast features Ben Bajarin and Bob O’Donnell, along with special guest Steve Baker of NPD, talking about the surprisingly strong consumer electronics and PC sales data from the recently completed quarter, including discussions on overall trends, specific sub-category performance, and the retail brick-and-mortar vs. online sales splits.

Nvidia Virtual GPU Update Brings Remote Desktops, Workstations and VR to Life

The new work habits that we’ve all adjusted to because of the pandemic have led many companies to take a fresh look at how they can provide computing resources to people working from home. In some cases, this is giving new life to technologies, such as VDI (virtual desktop infrastructure), that provide server-based computing sessions to remote desktops.

In addition, companies have also had to figure out how to provide remote access to workers with very specific, and very demanding technical requirements, such as architects, product designers, data scientists, media creators, and other people who typically use workstations in an office environment.

One critical technology for these new challenges is server-based virtual GPUs, or vGPUs for short. Nvidia has built datacenter-optimized GPUs for many years, and several years back made them a shareable and manageable resource through the introduction of its Virtual GPU software. The company’s latest July 2020 vGPU software release (one of two it typically does per year) adds several enhancements designed to make these server-based graphics chips function in a wider variety of software environments, offer better compatibility across more applications, and be managed more easily.

As with many enterprise-focused technologies, the devil is in the details when it comes to exactly how and where virtual GPUs can function. Given the wide range of different server virtualization platforms and the graphics driver optimizations required for certain workstation applications, it can be challenging to get promising sounding technologies, like vGPUs, to work in all environments. To address these needs, the new release adds native support for virtualization on SUSE Linux Enterprise Server-based infrastructure, which is often used by data scientists, and offers additional management optimizations for VMware-based environments.

The new release also expands the capabilities of the different levels of GPU drivers that Nvidia provides, thereby increasing the range of applications it can support. Even details like different versions of drivers can make a difference in compatibility and performance. The latest release gives IT managers the flexibility to run different driver versions on the server and on a client device. This capability, called cross-branch support, is critically important for shared resources like vGPUs, because one application running on one device may need one version of a driver, while another application on another device may require a different one.
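
Conceptually, cross-branch support relaxes the old rule that the server and client driver versions must come from the same release branch. Here is a toy sketch of the idea (the branch labels and compatibility rule are hypothetical, not Nvidia’s actual support matrix):

    # Toy illustration of cross-branch driver support (hypothetical data)
    SUPPORTED_PAIRS = {
        ("server-11.x", "client-11.x"),  # same branch: always supported
        ("server-11.x", "client-10.x"),  # cross-branch: allowed in new release
    }

    def is_supported(server_branch: str, client_branch: str) -> bool:
        """Return True if this server/client driver pairing is allowed."""
        return (server_branch, client_branch) in SUPPORTED_PAIRS

    print(is_supported("server-11.x", "client-10.x"))  # True
    print(is_supported("server-10.x", "client-11.x"))  # False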

Real-time collaboration across multiple applications is also supported in this July 2020 release. For VR-based applications, the new software, in conjunction with Nvidia’s CloudXR platform, can provide support for untethered mixed reality headsets with 4K resolution at up to 120 Hz refresh rates over WiFi and 5G networks.

With the Quadro Virtual Workstation software—one of the several levels of drivers that Nvidia makes available through its vGPU software—multiple people can work on CAD, architecture, or other highly demanding applications with real-time rendering on regular PCs. For designers, engineers, and others working from home, this capability can allow them to function as they normally would in a workstation-equipped office.

Interest in the ability to get remote access to these graphically demanding applications has been extremely high during the pandemic, which should be surprising to no one. This also aligns with results from a newly completed survey by TECHnalysis Research of over 600 US-based IT managers about the impact that COVID-19 has had on their IT strategies, priorities, and computing programs.

According to the study, virtual desktop infrastructure (VDI) usage grew by 11 percentage points in just a few months, from 48% of companies saying they used server-based computing models at the beginning of the year to 59% saying they use them now. Not all of those VDI instances use virtual GPUs, of course, but they do represent a significant and critical portion of them.

Ongoing flexibility has become the mantra by which IT organizations and workers are adapting to new work realities. As a result, technologies, such as vGPUs, that can enable flexibility are going to be a critical part of IT managers’ toolkits for some time to come.

Power Efficient Computing Noteworthy During Pandemic

One of the few benefits many people have experienced as part of the great work-at-home and learn-at-home experiment we’ve all been through is improved air quality. In fact, because of the significant reduction in commuting and travel, both the visual and the measured quality of the air have gotten noticeably better in most places around the world.

As a result, the pandemic has inspired a refocus on environmental issues. At the same time, there’s been a huge focus on how digital technology—particularly computing devices, cloud infrastructure, and various types of networks—has allowed us to stay as productive (if not even more so!) as we were prior to the pandemic.

Interestingly, the stories of computing and conservation have also started to become entwined in several different ways. First, there’s been a strong transition to laptop PCs, which use significantly less power than desktops, as many people’s primary computing device. While many people think notebooks have been the default standard for a while, the truth is that desktop PCs still represented a fairly significant portion of the computers used in many businesses up through the start of the pandemic. However, with the requirement to work at home, companies have been scrambling to get laptops to their employees. As a result, the incredible reliance we have on these more power-efficient devices has never been more apparent. The real-world impact of their increased use is less demand on the electrical grid to power them, which, in turn, can offer benefits to the environment.

Second, there’s been a much bigger focus on cloud-delivered apps and services, which can also indirectly lead to an improved environment. In particular, there’s been a great deal more attention placed on modernizing and “cloudifying” applications for business. Because these modernized applications can run in power-efficient cloud-computing data centers, this too has the benefit of reducing the power demands necessary to complete specific tasks.

In a recently completed survey by TECHnalysis Research of over 600 US-based IT professionals, we found that when respondents were asked to rank the top two priorities for IT initiatives since the rise of the pandemic, modernizing applications came out as the most important, followed closely by purchasing laptops for employees. Not surprisingly, growing usage of hybrid, private, and public cloud rounded out the top five, as shown in Fig. 1. The app modernization effort, of course, entails converting legacy applications into newer app formats that can run efficiently in one of these hybrid, private, and/or public cloud environments.

Fig. 1

What’s interesting about these developments from a conservation perspective is that there have even been studies that show cloud-based computing resources are more energy efficient than many people realize. In fact, thanks to a combination of significantly more controlled usage of computing, storage, and networking resources in large cloud datacenters, new types of computing (and pricing) models that use those resources more efficiently, and the growing use of more power-efficient CPUs, there have been great improvements in computing power per watt. In other words, with cloud computing, it’s possible to get even more computing work done with the same (or even smaller) amounts of power than were used in the past.

On the notebook PC side, there have been similar trends in power efficiency as well. In fact, just last week AMD announced that they surpassed their 25×20 goals set back in 2014. Specifically, the company announced six years ago that they wanted to improve the power efficiency of their mobile CPUs by a factor of 25 before the end of this year. With the release of their recent Ryzen 7 4800H mobile processor, the company actually achieved an impressive 31.7X improvement in power efficiency—specifically a 5x increase in performance combined with a reduction to 1/6th of the required power—versus a 2014 vintage AMD FX-7600P chip.
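
The math behind that headline number is straightforward: power efficiency is performance per watt, so the two gains multiply. A quick check using the rounded figures in the paragraph above (which is why the result lands near, rather than exactly at, AMD’s 31.7x):

    # Perf-per-watt gain = performance multiple / power fraction
    perf_gain = 5.0         # ~5x the performance of the 2014 FX-7600P
    power_fraction = 1 / 6  # ~1/6th of the required power
    efficiency_gain = perf_gain / power_fraction
    print(efficiency_gain)  # 30.0 -- in line with the 31.7x AMD reports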

The improvements are due to a wide range of factors, including better core designs, new chiplet architectures within their CPUs, and the company’s move to 7nm production from the 28nm process used back in 2014. The company also made a number of enhancements to the chips’ thermal design and power management capabilities over the years. All told, it’s another impressive example of how much AMD has improved its technical capabilities and competitive strengths over the last few years.

As companies start to bring their employees back into the office and commuting and travel trends slowly start to tick up, we may begin to see some negative impact on the environment. In terms of computing resources, however, the ongoing developments in power and performance efficiency for both data centers and laptops can hopefully keep their influence to a minimum.

Apple Transition Provides Huge Boost for Arm

You have to imagine that yesterday was a pretty good one for the folks at Arm—the little-understood but highly influential chip design company. Not only were they able to report that their designs power the world’s fastest supercomputer, there’s also that little detail about Apple choosing to switch from Intel-based CPUs to Apple-designed custom silicon built on Arm’s core architecture for future generations of Macs.

A word on the supercomputer news first. Every year at the opening of the ISC high-performance computing conference, the organization running it releases the Top500 list of the world’s fastest supercomputers. As in most years, this year’s list was utterly dominated by Intel-based machines, but there was a surprise at the top. For the first time ever, Arm-based chips (in this instance, built by Fujitsu) are the CPU brains of the number 1 ranked machine—the Fugaku supercomputer, which is operated by the RIKEN Center for Computational Science in Japan. In addition to the prestige, it’s a huge psychological win for Arm, which has been working to make an impact on the enterprise computing world with its Neoverse CPU architecture for the last several years.

In the personal computing world, Arm notched an equally impressive victory with the official unveiling of the long-rumored Arm-powered chips for next-generation Macs. Apple doesn’t have the largest market share in the PC market—it’s around 7% or so overall—but its impact, of course, greatly outstrips those numbers. As a result, by making the official announcement of custom Apple Silicon for the Mac, designed under Apple’s architectural license of Arm’s chip IP (though Arm was never mentioned in the keynote or any of the press releases for the event), Arm scored a huge gain in credibility and awareness.

Of course, awareness doesn’t translate to success, and as exciting as the development may be, there are plenty of questions, as well as previous history, to suggest that challenges await. First, while Apple talked about switching to this new design both to improve performance and to reduce power consumption, it has yet to show any comparative benchmarks against existing Intel-based Macs for either of those metrics. Of course, that’s likely because the silicon isn’t done. Heck, Apple didn’t even announce the name of the new chips. (The A12Z Bionic chip in the developer system, and currently in the iPad Pro, is likely only an interim solution.) My guess is that we won’t get any of these details until the end of the year, when the first-generation Macs with these new chips are unveiled.

Apple’s primary stated reason for making the move away from Intel to custom silicon was to improve the experience, so these comparative details are going to be critically important. This is particularly true because of the generally disappointing performance of Arm-based Qualcomm and Microsoft chips in Windows on Arm PCs like the Surface Pro X. The key question will be if Apple is able to overcome some of the limitations and truly beat Intel-level performance, while simultaneously offering significantly better battery life. It’s an extremely challenging task but one that Apple clearly laid out as its goal.

There are also many unanswered questions about the ability to pair these new chips with external GPUs, such as the AMD Radeon parts Apple currently offers in certain Macs, or any other companion chips, such as 5G modems. While Apple currently uses Qualcomm modems for the iPhone and certain iPads, the company is known to be working on its own modems, and it’s not clear if those will be available in time for the launch of a 5G-equipped Macbook (should they choose to do so). As for graphics, Apple only uses its own GPU designs for its other custom parts for iPhones and iPads, but some computing applications require more graphics horsepower than those devices do, so it will be interesting to see if Apple offers the option to pair its new Mac-specific SOCs with external GPUs.

Finally, of course, there is the question of software. To get the best possible performance on any platform, you need software developers to write applications that are native to the instruction set being used. Because that can take a while, you also have to have a means to run existing software (that is, software designed for Intel-based Macs) on the new chips via emulation. Ironically, Apple has chosen to use the exact same playbook to transition away from Intel processors that it used to transition onto them. In fact, it’s even using the same names (with the addition of a version 2) for the core technologies: Universal 2 binaries are combined applications that run on both Intel CPUs and the new Apple custom silicon chips, and Rosetta 2 is the software used to emulate Intel instructions. This time around, Apple also added some virtualization capabilities and demoed the ability to run Linux in a virtualized container. However, interestingly, there was no discussion of Windows running on the new Macs. Presumably all the work that Microsoft and its partners have done to bring Windows to Arm-based CPUs should port over fairly easily to Apple’s designs as well, but the details on this are not clear just yet.
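
For the curious, a universal binary is, at the file level, a Mach-O “fat” file whose header lists one slice per CPU architecture. Here’s a minimal Python sketch that reads that header (it assumes the classic 32-bit, big-endian fat header layout from Apple’s mach-o/fat.h; purely illustrative, not Apple’s tooling):

    import struct

    FAT_MAGIC = 0xCAFEBABE          # big-endian fat header magic
    CPU_TYPE_X86_64 = 0x01000007
    CPU_TYPE_ARM64 = 0x0100000C
    ARCH_NAMES = {CPU_TYPE_X86_64: "x86_64", CPU_TYPE_ARM64: "arm64"}

    def list_architectures(path):
        """Return the CPU slices a Mach-O binary contains."""
        with open(path, "rb") as f:
            magic, nfat_arch = struct.unpack(">II", f.read(8))
            if magic != FAT_MAGIC:
                return []  # a thin, single-architecture binary
            archs = []
            for _ in range(nfat_arch):
                cputype, _sub, _off, _size, _align = struct.unpack(">5I", f.read(20))
                archs.append(ARCH_NAMES.get(cputype, hex(cputype)))
            return archs

    # A Universal 2 app binary would report both slices, e.g. (path hypothetical):
    # list_architectures("/Applications/Some.app/Contents/MacOS/Some")
    # -> ["x86_64", "arm64"]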

To the company’s credit, Apple did an impressive job when it created this playbook to move from PowerPC chips to Intel, so here’s hoping the same strategy works the other way around. While Apple made it seem like a fairly trivial task to shift from x86 instructions to Arm if you use its Xcode development environment, history strongly suggests that the transition can be a bit daunting for some developers. That said, Apple did show functioning demos of critical Microsoft Office, Adobe Creative Cloud, and Apple professional apps running natively in the new environment. One concern Apple didn’t address at all was hardware device drivers. That was a key challenge for early Windows on Arm devices, so it will be interesting to see how Apple handles it.

One nice advantage that Apple and its developers gain by moving over to the same Arm-based architectures that it uses for the iPhone and iPad is that iOS and iPadOS applications should easily run on these new Macs—a point Apple was eager to make. As exciting as that first sounds, however, there is that detail of a lack of a touch screen on any existing Mac. Imagine trying to use a mouse with your iPhone, and you can see how initial enthusiasm for this capability may dampen, unless Apple chooses to finally allow touchscreens on Macs. We shall see.

The last point to make regarding all of these developments is that Apple ultimately chose to move to Arm to gain complete control over the Mac experience. As good as Intel’s processors have been, Apple has shown with its other devices that it likes to own the complete vertical technology stack, and the only way to do that was to design the CPU as well. It’s the last critical piece of the puzzle for Apple’s strategy to control its own destiny.

Regardless of that reasoning, however, it’s clear that both Apple’s decision and the supercomputing win mentioned earlier lend a great deal of credence to Arm’s efforts. At the same time, it arguably puts even more pressure on Arm to continue its pace of innovation. For a company that so few people really appreciate and understand, it’s great to see how far and how wide Arm has pushed the boundaries of computing. Now let’s see how they continue to evolve.

Podcast: Cisco Live, Qualcomm Snapdragon 690, Apple App Store Controversy

This week's Techpinions podcast features Carolina Milanesi and Bob O'Donnell discussing the many announcements from the Cisco Live event, analyzing the potential impact of low-cost 5G phones enabled by the latest Qualcomm chip, and debating the controversies around Apple's app store payment model for developers.

Cisco Highlights Focus on Location as Companies Start to Reopen

As states in the US start to reopen and businesses around the country (and the world) start to plan for employees to return, there's been a lot of discussion around what the new "normal" in the workplace will be. Of course, we don't really know what it's going to be like, but most people are fairly certain it's going to be different. Whether it's staggered work schedules, spread-out workspaces, plexiglass shield-equipped lunch tables, or other workplace adjustments, many people who start to return to the office will likely encounter different environments.

Of course, many others won’t be returning for some time, if at all—upcoming research data from TECHnalysis Research suggests we could have as many as 35% of workers still working from home even into 2021. Regardless of where people do their work, however, it’s never been clearer that the need for flexible, secure access to work resources is extremely high. In addition, as some people do start to venture back into the office, it’s also clear that they’re going to want/need tools that can help them stay safe while they’re there.

At the Cisco Live Digital event, the networking giant highlighted a number of new and updated initiatives it has been working on to address some of these issues. On the security side, the company’s big news is around its SecureX cloud-native cybersecurity platform, which it is starting to integrate into all Cisco security products at the end of this month. Key enhancements include a single dashboard for viewing live threat data, increased automation of security tools, and enhanced security capabilities that can intelligently leverage analytics data from multiple sources simultaneously.

The company also unveiled a number of enhancements to its Webex collaboration platform, including the announcement that the platform can now handle an impressive three times as many meetings as before. For those returning to the office, Cisco also made some interesting additions via its Webex Control Hub application. Control Hub lets IT managers quickly install the Webex voice assistant onto conference room devices, which keeps people from having to touch the screens or touchpads in meeting rooms. In addition, Control Hub offers expanded analytics on meeting room usage, which can inform cleaning schedules for those rooms, and it can manage meeting room locations/configurations to keep people spread out. Cisco also enhanced support for meetings that incorporate both on-site and remote workers.

Another intriguing location-based set of capabilities comes via the updated DNA Spaces offering. Related to the company's larger Digital Network Architecture (DNA) initiative, which is essentially Cisco's enhanced version of software-defined networking (SDN), DNA Spaces is an indoor location-based services platform that can leverage data from WiFi access points, including those from its Meraki division, to determine how people are moving through or congregating within a location. The company made two additions to the platform: the descriptively named Cisco DNA Spaces for Return to Business, and Indoor IoT Services, which can use WiFi 6-enabled access points to work with Bluetooth LE devices, such as beacons, for asset tracking, environmental monitoring, room tracking, and more.

In a manner that’s conceptually similar to the Bluetooth-based contact tracing apps that have been in the news, DNA Spaces for Return to Business can track the WiFi (or GPS) signals from mobile devices, and then can use that to analyze people’s real-time movement patterns through the office. The resulting data can subsequently be used to do things like limit the number of people in a given building, or section of the office, that a company could define as being at maximum capacity. In conjunction with Indoor IoT Services, which Cisco claims is the first indoor IoT-as-a Service offering, the same data could be combined with other sensor data to do things like suggest alternative places to meet, encourage employees to social distance, and more.
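To make that capacity idea concrete, here is a minimal Python sketch of the underlying logic: count the distinct devices currently seen in each zone and flag any zone over a configured limit. The class, zone names, and limits are all hypothetical; Cisco's platform derives this server-side from WiFi/GPS telemetry, and this is in no way its actual API:

```python
from collections import defaultdict

# Hypothetical per-zone occupancy limits; the names and numbers are invented.
CAPACITY = {"lobby": 10, "second-floor": 40}

class OccupancyMonitor:
    """Toy sketch of return-to-work capacity logic: track the distinct
    device IDs currently seen in each zone and report zones over limit."""

    def __init__(self):
        self.zones = defaultdict(set)  # zone name -> set of device IDs

    def device_seen(self, zone: str, device_id: str) -> None:
        self.zones[zone].add(device_id)

    def device_left(self, zone: str, device_id: str) -> None:
        self.zones[zone].discard(device_id)

    def zones_over_capacity(self):
        return [zone for zone, devices in self.zones.items()
                if len(devices) > CAPACITY.get(zone, float("inf"))]

# Example: an eleventh distinct device in the lobby trips the 10-person limit.
monitor = OccupancyMonitor()
for i in range(11):
    monitor.device_seen("lobby", f"device-{i}")
print(monitor.zones_over_capacity())  # ['lobby']
```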

While there are certainly going to be questions about privacy for any location-based service, companies (and likely a decent percentage of employees) will probably feel that, within the limited office environment, the potential safety benefits outweigh those privacy concerns. Over time, those feelings may change, and it will certainly be an interesting trend to watch, but to get people to feel comfortable about returning to office environments, these types of technology-based solutions will likely play an important role. Companies that deploy them will have to make sure employees feel confident that they aren't being tracked once they leave the workplace, however; otherwise, they'll likely face significant pushback. As long as companies ensure privacy outside the workplace, employees are likely to accept these tracking solutions as just one of the many new aspects of the new normal inside it.

Podcast: Facial Recognition Technology, Sony PS5, Android 11, Adobe Photoshop Camera

This week’s Techpinions podcast features Carolina Milanesi and Bob O’Donnell discussing tech companies’ recent shifts in policy around facial recognition, and analyzing the debut of Sony’s PS5 gaming console, the beta of Google’s Android 11 and Adobe’s new Photoshop Camera app for smartphones.

WiFi 6E Opens New Possibilities for Fast Wireless Connectivity

One of the most obvious impacts of the COVID-19 pandemic is how reliant we have all become on connectivity, particularly wireless connectivity. For most of us, the combination of a fast broadband connection along with a solid WiFi wireless network inside our home has literally made the difference between being able to work, attend classes, and enjoy entertainment on a consistent, reliable basis or not being able to do so.

As a result, there’s significantly more attention being placed on connectivity overall these days, within all of our different devices. Of course, it doesn’t hurt that we’re also at the dawn of a new era of wireless connectivity, thanks to the recent launch of 5G networks and the growing availability of lower-cost 5G-capable devices. But, while 5G may currently be getting the lion’s share of attention, there have been some tremendously important developments happening in the world of WiFi as well.

In fact, just six weeks ago, the FCC gave official approval for WiFi to extend its reach to an enormous swath of new radio spectrum in the 6 GHz band here in the US. Specifically, the new WiFi 6E standard will have access to 1.2 GHz, or 1,200 MHz, of radio spectrum, ranging from roughly 5.9 GHz to 7.1 GHz (and incorporating all the 6 GHz frequencies in between, hence the 6 GHz references). Just to put that in perspective, even the widest connections for millimeter wave 5G, the fastest kind of 5G connection available, are limited to 800 MHz. In other words, the new WiFi connections have access to 1.5 times as much spectrum to transmit on as the fastest 5G connections.

Theoretically, that means WiFi 6E connection speeds could prove significantly faster than even the best that 5G has to offer. Plus, because of the basic laws of physics and signal propagation, WiFi 6E coverage can actually be wider than millimeter wave 5G coverage. To be fair, total coverage depends heavily on transmission power (cellular transmission levels are typically several times stronger than WiFi), but in environments like office buildings, conference centers, and our homes, it's not unreasonable to expect that WiFi 6E will be faster than 5G, just as current 5 GHz WiFi (802.11a and its variants) is typically faster than 4G LTE.

One important clarification is that all of these benefits extend only to WiFi 6E, not to WiFi 6, which is also relatively new. WiFi 6 brings a number of improvements in the way signals are encoded and transmitted, all of which should reduce congestion and lower the power requirements for using WiFi. However, all those improvements still use the traditional 2.4 and 5 GHz frequency bands that WiFi has used for the last 20 years. The critical new addition with WiFi 6E is the 6 GHz frequency band.

To make sense of all this, you have to understand at least a little bit about radio frequency spectrum (whether you want to or not!). The bottom line is that the higher the frequency, the shorter the distance a wireless signal can travel; the lower the frequency, the farther it can travel. The analogy I like to use is hearing a music concert from outside the venue. If you're driving right by while a band is playing, you can typically hear a wide range of frequencies and can better make out what's being played. The farther away you are, however, the harder the higher frequencies are to hear; all that's left is the low-frequency rumble of the bass, making it difficult to tell what song is being played. All radio frequency signals, including both cellular and WiFi, follow these basic rules of frequency and distance.
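For those who want the math behind that intuition, the standard textbook free-space path loss formula captures it: loss grows with both distance and frequency. A quick, purely illustrative Python check (the 100 m link distance and the two frequencies are arbitrary examples I chose, not figures from any vendor):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (textbook formula, distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Over the same 100 m link, moving from 2.4 GHz up to 6 GHz adds ~8 dB of loss,
# which is why higher-frequency signals don't reach as far at the same power.
print(fspl_db(0.1, 6000.0) - fspl_db(0.1, 2400.0))  # ~7.96 dB
```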

There is a critically important twist for data transmission, however, and that has to do with the availability and width of channels for transmitting (and receiving) signals. The basic rule of thumb is that the lower the frequency, the narrower the channels, and the higher the frequency, the wider the channels. Data throughput, and therefore overall wireless connection speed, is determined by the width of these channels. For 4G and what's called low-band 5G (such as T-Mobile's 600 MHz 5G network), those channels can be as narrow as 5 MHz or as wide as 20 MHz. The mmWave channels for 5G, on the other hand, are 100 MHz wide and, in theory, up to eight of them are available, for a total of 800 MHz of bandwidth.

The beauty of WiFi 6E is that it supports up to seven 160 MHz channels, or a total of 1,120 MHz of bandwidth. (As a point of comparison, 5 GHz WiFi supports a maximum of two 160 MHz channels and 500 MHz overall, while 2.4 GHz WiFi supports a maximum of three 20 MHz channels and 70 MHz overall.) In addition, WiFi 6E gets these wide channels at a significantly lower frequency than those used for millimeter wave (typically 24 GHz and up, although most US carriers are using 39 GHz), which explains why WiFi 6E can have broader coverage than mmWave. Finally, because the 6 GHz spectrum will be unoccupied by other devices, real-world speeds should be even better, and the lack of other traffic will enable much lower latency, or lag, for devices on WiFi 6E networks.
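Pulling those numbers together, the aggregate-bandwidth arithmetic is simple enough to sanity-check in a few lines of Python; the channel counts and widths are the figures cited above, not new measurements:

```python
# Aggregate channel bandwidth, using the figures cited above (all in MHz).
wifi_6e    = 7 * 160  # seven 160 MHz channels -> 1,120 MHz
mmwave_5g  = 8 * 100  # eight 100 MHz channels -> 800 MHz
wifi_5ghz  = 2 * 160  # two 160 MHz channels   -> 320 MHz
wifi_24ghz = 3 * 20   # three 20 MHz channels  -> 60 MHz

print(wifi_6e, mmwave_5g, wifi_5ghz, wifi_24ghz)  # 1120 800 320 60
print(wifi_6e / mmwave_5g)  # 1.4x: WiFi 6E's channel headroom over mmWave 5G
```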

Of course, to take advantage of WiFi 6E, you need both routers and client devices that support the standard, which in turn requires chips that support it (and you need to live in a country that allows the full frequency range; right now, the US is leading the way and is the only country to support the full 1.2 GHz of new spectrum). Broadcom and Intel have both announced support for WiFi 6E, but the only company currently shipping chips for both types of devices is Qualcomm. For client devices like smartphones and PCs, the company offers the FastConnect 6700 and 6900, while for routers it has a new line of tri-band (that is, supporting 2.4 GHz, 5 GHz, and 6 GHz) Networking Pro Series chips: the Networking Pro 610, 810, 1210, and 1610, which support 6, 8, 12, and 16 streams of WiFi 6E connectivity, respectively.

In addition, the new Networking Pro line supports what the company calls Qualcomm Max User Architecture and Multi-User Traffic Management, which enable up to 2,000 simultaneous client connections, thanks to advanced OFDMA (Orthogonal Frequency-Division Multiple Access) and 8-user MU-MIMO (Multi User—Multiple Input, Multiple Output) per channel. The new router-focused Networking Pro chips also support SON (Self-Organizing Networks), which makes them well suited for future versions of WiFi mesh routers.

In a way, the benefits of WiFi 6E present an interesting challenge for Qualcomm and other companies that make both 5G cellular and WiFi-focused chips and devices. For certain applications (notably public venues, certain office environments, and the like), the two technologies are likely to compete directly with one another, in which case the core component companies will essentially have to sell against themselves. Given the increasingly complex range of wireless network architectures, differing security requirements, business models, and more, however, the likely truth is that both technologies will co-exist for some time to come. As a result, it makes better business sense to offer both than to simply pick a side.

The good news for those of us in the US is that we're about to enjoy a significantly improved range of wireless networking options, thanks both to these recent WiFi 6E developments and to the forthcoming auctions of mid-band (3.5 GHz) 5G spectrum. Despite the many other challenges we face, it's looking to be a good year for wireless.

Podcast: Twitter Controversy, Arm IP Designs, Qualcomm XR Viewers

This week's Techpinions podcast features Carolina Milanesi and Bob O'Donnell discussing the controversy around Twitter's efforts to tag tweets and the implications for social media overall, analyzing Arm's new mobile IP chip designs, and chatting about the latest announcements around AR/VR/XR headsets attached to Qualcomm-powered 5G smartphones in conjunction with telco carriers.