Buzz Around Device as a Service Continues to Grow

on February 16, 2018
Reading Time: 4 minutes

This week Device as a Service (DaaS) pioneer HP announced it was expanding its hardware lineup. In addition to adding HP virtual reality products including workstations and headsets, the company also announced it would begin offering Apple iPhones, iPads, and Macs to its customers. It’s a bold move that reflects the intense and growing interest in this space, as well as Apple’s increasingly prominent role on the commercial side of the industry.

First Came PCaaS
IDC’s early research on PC as a Service (PCaaS) showed the immense potential of this model. It’s exciting because it is a win/win for all involved. For companies, shifting to the as-a-service model means no longer having to budget for giant capital outlays around hardware refreshes. As IT budgets have tightened, and companies have moved to address new challenges and opportunities around mobile, cloud, and security, device refreshes have often been stretched beyond what’s reasonable. Old PCs limit productivity and represent ongoing security threats, but that hasn’t stopped many companies from keeping them in service for five years or more.

PCaaS lets companies pay an ongoing monthly fee that builds in a more reasonable life cycle. That fee can also include a long list of deployment and management services. In other words, companies can offload the day-to-day management of the PC from their IT staff to a third party. And embedded within these services is the ability of the provider to capture analytics that help guide future hardware deployments and ensure security compliance.
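To make that budgeting shift concrete, here is a minimal back-of-the-envelope sketch. Every figure in it (fleet size, device price, in-house management cost, monthly fee) is a hypothetical assumption chosen purely for illustration, not a number from IDC or any vendor:

```python
# Rough comparison of a capital PC purchase vs. a PCaaS subscription.
# All inputs are hypothetical, chosen only to illustrate the trade-off.

FLEET_SIZE = 1000            # number of PCs in the fleet
DEVICE_PRICE = 900           # up-front cost per PC in USD (assumed)
MGMT_COST_PER_YEAR = 120     # in-house deployment/management per PC (assumed)
REFRESH_YEARS = 5            # a stretched refresh cycle, per the article
DAAS_FEE_PER_MONTH = 35      # per-PC monthly as-a-service fee (assumed)

capex_total = FLEET_SIZE * (DEVICE_PRICE + MGMT_COST_PER_YEAR * REFRESH_YEARS)
daas_total = FLEET_SIZE * DAAS_FEE_PER_MONTH * 12 * REFRESH_YEARS

print(f"Capital purchase over {REFRESH_YEARS} years: ${capex_total:,}")
print(f"PCaaS subscription over {REFRESH_YEARS} years: ${daas_total:,}")
# The point is less about which total is lower (the bundled services differ)
# and more that PCaaS converts one large capital outlay plus unpredictable
# support costs into a flat, predictable operating expense.
```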

PC vendors and other service providers offering PCaaS like it because it allows them to capture more services revenue, shorten product lifecycles, and smooth out the challenges associated with the historical ebb and flow of big hardware refreshes often linked to an operating system’s end of life. HP was the first major PC vendor to make a broad public push into the PCaaS space, leveraging what the company learned from its managed print services group. Lenovo has been dabbling in the space for some time but has recently become more public about its plans here. And Dell has moved aggressively into the space in the last year, announcing its intentions at the 2017 DellWorld conference. Each of the three major PC vendors brings its own set of strengths to the table in this competitive market.

Moving to DaaS
HP’s announcement about offering more than just PCs, as well as Apple devices, is important for several reasons. Chief among them is that in many markets, including the U.S. (where this is launching first), iOS has already established itself as the preferred platform in many companies. By acknowledging this, HP quickly makes its DaaS service much more interesting to companies who have shown an interest in this model but were reluctant to adopt it if it only included PCs. Second, while HP has a solid tablet business, it doesn’t have a viable phone offering today. For many companies, this would be an insurmountable blocker, but to HP’s credit, it owned this issue and went out and found a solution in Apple. It will be interesting to see if the other PC vendors eventually announce similar partnerships with age-old competitors. It’s worth noting that Dell also doesn’t have a phone offering, while Lenovo does have a phone business that includes the Moto brand.

It was also very heartening to see HP announce it would begin offering its virtual reality hardware as a service, too. Today that means the HP Z4 Workstation and the HP Windows Mixed Reality VR headset, but over time I would expect that selection to grow. As I’ve noted before, there is strong interest from companies in commercial VR. By offering the building blocks as a service, HP enables companies to embrace this new technology without a massive capital outlay up front. I would expect both Dell and Lenovo, which also have VR products, to do the same in time. And while VR represents a clear near-term opportunity, Augmented Reality represents a much larger commercial opportunity long term. There’s good reason to believe that many companies will turn to AR as a Service as the primary way to deploy this technology in the future. And beyond endpoint devices such as PCs, tablets, phones, and headsets, it is reasonable to expect that over time more companies will look to leverage the as-a-service model for items such as servers and storage, too.

Today just a small percentage of commercial PC shipments go out as part of an as-a-service agreement, but I expect that to ramp quickly in the next few years. The addition of phones, tablets, AR/VR headsets, and other hardware will help accelerate this shift as more companies warm to the idea. That said, this type of change doesn’t come easily within all companies, and there will likely continue to be substantial resistance inside many of them. Much of this resistance will come from IT departments who find this shift threatening. The best companies, however, will transition these IT workers away from the day-to-day grind of deployment and management of devices to higher-priority IT initiatives such as company-wide digital transformation.

At IDC we’re about to launch a new research initiative around Device as a Service, including multiple regional surveys and updated forecasts. We’ll be closely watching this shift, monitoring what works, and calling out the areas that need further refinement. Things are about to get very interesting in the DaaS space.

Is Qualcomm’s Connected PC a Threat to Intel?

on February 16, 2018
Reading Time: 3 minutes

I have been spending a lot of time lately talking with clients and people in the industry about Qualcomm and Microsoft’s push to create an ARM-based platform for Windows laptops. Although these two companies launched a Windows on ARM program four years ago, that initiative failed due to the underpowered ARM processors available at the time and a version of Windows that did not work well on those early ARM-based laptops.

New AMD chip takes on Intel with better graphics

on February 15, 2018
Reading Time: 3 minutes

Nearly a full year after the company started revamping its entire processor lineup to catch up with Intel, AMD has finally released a chip that can address one of the largest available markets. Processors with integrated graphics ship in the majority of PCs and notebooks around the world, but the company’s first Ryzen processors released in 2017 did not include graphics technology.

Information from Jon Peddie Research indicates that 267 million processors with integrated or embedded graphics technology were shipped in Q3 2017 alone. The new AMD part, which goes on sale today in systems and the retail channel, gives AMD a chance to cut into Intel’s significant market leadership in this segment, replacing a nearly two-year-old product.

Today AMD stands at just 5% market share in the desktop PC space with integrated graphics processors, a number that AMD CEO Lisa Su believes can grow with this newest Ryzen CPU.

Early reviews indicate that the AMD integrated graphics chips are vastly superior to their Intel counterparts when it comes to graphics and gaming workloads and are competitive in standard everyday computing tasks. Testing we ran, published over at PC Perspective, shows that when playing modern games at mainstream resolutions and settings (720p to 1080p depending on the specific title in question), the Ryzen 5 2400G is as much as 3x faster than the Core i5-8400 from Intel when using integrated processor graphics exclusively. This isn’t a minor performance delta; it is the difference between a system that is actually usable for gaming and one that isn’t.

The performance leadership in gaming means AMD processors are more likely to be used in mainstream and small form factor gaming PCs and should grab share in expanding markets.

China and India, both regions that are sensitive to cost, power consumption, and physical system size, will find the AMD Ryzen processor with the updated on-board graphics compelling. AMD offers much higher gaming performance using the same power and at a lower price. Intel systems that want to compete with the performance of AMD’s new chip will need to add a separate graphics card from AMD or NVIDIA, increasing both the cost and the complexity of the design.

Though Intel is the obvious target of this new product release, NVIDIA and AMD (ironically) could also see an impact, as low-cost discrete graphics chips won’t be necessary in systems that use the new AMD processor. This will only affect the very bottom of the consumer product stack, though, leaving alone the high end of the market, where NVIDIA enjoys much higher margins and market share.

The GT 1030 from NVIDIA and the Radeon RX 550 from AMD are both faster in gaming than the new Ryzen processor with integrated graphics, but the differences are small enough that consumers in this space are likely to see it as a wash. Adding to the story is the fact that the Ryzen processor is cheaper, draws less power, and puts fewer requirements on the rest of the system (a lower-cost power supply, a smaller chassis).

AMD’s biggest hurdle now might be overcoming the perception of integrated processor graphics and the stigma it carries in the market. DIY consumers continue to believe that all integrated graphics are bad, a position made prominent by the lack of upgrades and improvements from Intel over the years. Users are going to need proof (from reviews and other users) to buy into the work that AMD has put into this product. Even system integrators and OEMs, which often live off the additional profit margin of upgrades to base system builds (of which discrete graphics additions are a big part), will push back on the value that AMD provides.

AMD has built an excellent and unique processor for the mainstream consumer and enterprise markets, one that puts the company back in a fight it has been absent from for the last several generations. Success here will be measured not just by channel sales but also by how many inroads it can make in the larger consumer and SMB pre-built space. Messaging and marketing the value of vastly superior processor graphics is the hurdle leadership needs to tackle out of the gate.

Apple Watch’s Big Quarter and a Series of Firsts

on February 15, 2018
Reading Time: 3 minutes

It is no secret that I’ve been very bullish on Apple Watch since day one. I’ve held my ground against the naysayers and defended this product because I believed in it and the broader role it can and will play in the future of computing. After a rough second year, when many of the naysayers thought they were right, Apple Watch is truly gaining steam.

I love this headline even though it is wrong: “Apple and Android are destroying the Swiss Watch Industry.”

What’s at Stake in the Voice Assistant Race

on February 14, 2018
Reading Time: 3 minutes

It is interesting that Apple’s HomePod has ignited a broader philosophical debate, within the tech industry and among pundits and observers, about what is really at stake with voice assistants in the future. Everyone has an opinion on this, and there are real implications for the future worth thinking through as well.

Do We Really Want an Always Connected PC?

on February 14, 2018
Reading Time: 4 minutes

I suppose, first, we should ask if we want a PC at all! Our recent US study, run across 1,262 consumers, says we do. Less than one percent of the panelists said they have no intention of buying another PC or Mac. As a matter of fact, twenty-five percent of the panel is in the market to buy a new PC or Mac in the next twelve months.

What Do We Want When Buying a Notebook?

Well, it depends who you are!

No matter which brand of PC we own, or how savvy a user we are, when it comes to notebooks there is one thing we want out of the next computer we buy: longer battery life. Fifty-nine percent of our panelists picked battery life as a must-have feature in their next PC – one third more than for any other feature.

The other top two features are a little different depending on the camp you are in. While not strictly a feature, brand is the second most important consideration for Mac buyers, which I am sure is no surprise, as with brand you buy into an entire ecosystem of both software and devices. Outside of Apple users, current PC owners view brand as only the sixth most important feature, which creates some interesting challenges for the many brands in the Windows ecosystem trying to establish themselves in the high end. Going back to hardware, what comes after battery very much depends on the kind of user you are. For early adopters a higher-resolution display matters (34%), but for everybody else, including Mac owners, it is about more memory.

So where is connectivity in the list of features for our next notebook? Not much of a priority it seems.

Only 23% of our panel picked cellular connectivity as one of the top three features they want in their next notebook. Even more interesting, only 19% of early tech adopters did so. I believe there are a couple of things at play here: either early tech adopters are quite happy to use their phone as a hotspot when they need connectivity, or they are simply happy to use their phone for all of their on-the-go needs. It seems that, in this case, being tech-savvy works against a feature that is being marketed as cutting edge. Where cellular connectivity resonates is with mainstream users (28% of whom listed it in their top three features) and late adopters (20%). It seems to me that with these users, the marketing message around the PC being the same as your phone is working quite well.

The short-term opportunity, considering current owners in the market to upgrade their notebook within the next twelve months, is not much more encouraging, as only 25% of them picked cellular connectivity as a top-three must-have.

We also wanted to see if people who have a more flexible work setup, in both hours and location, might be better placed to appreciate such a feature, but it does not seem that this is the case. Cellular was, in fact, selected as a top-three feature by only 19% of panelists fitting that work style.

We Say We Want It, but Do We Want to Pay for It?

While the interest in cellular was not great, let’s dig a little deeper to understand what kind of premium potential buyers are willing to pay for the luxury of being connected anytime, anyplace.

We asked panelists to imagine the following scenario: “You are purchasing your next notebook, and you have settled on a model when you are told it comes in a variant that offers 22 hours of continuous battery life and always-on internet connectivity. What premium (above the cost of the model without those features) would you be prepared to pay for it?”

Maybe conditioned by the current iPad lineup, which still puts a $100 premium on cellular, or maybe because it is the sweet spot for this feature, 34% of the panelists would consider paying up to $100 more. Seventeen percent would choose the cheaper model, and another 12% would expect those features to come as standard. This picture does not change much even among people who picked cellular connectivity among their top three must-have features.

Where we find a considerable difference is in the willingness to pay a monthly fee for that cellular connectivity. Among consumers who are interested in cellular capability, only 19% said they were not interested in paying a monthly fee, while among the overall panel that number almost doubles, to 39%.

When companies talk about PC connectivity and point to user behavior with smartphones as a measure of demand and success potential, I think they are missing the mark. There are two major differences that play a big role in how consumers will interact with PCs compared to their phones:

  •  Smartphones are truly mobile, while PCs are nomadic. This is a big difference, as it implies I might use my phone while I walk or stand in a crowded train or bus, but I would never do that with a PC. When I use a PC I am sitting somewhere, and more often than not that place will have Wi-Fi. This is certainly true in the US, but less so in Europe and Asia, which is why those markets offer better opportunities for cellular-enabled PCs.
  • The other factor that I think is not considered enough is the much wider application pool we have to choose from on our smartphones compared to our PCs. On the smartphone it is not just about email and video; it is about social media, news, books, chat, gaming, and the list goes on. So in a way, there are more things I can do with my smartphone that I might want to do while on the go than I will ever be able to do on my PC. Sometimes having a larger screen is a disadvantage, not just for portability but for privacy too.

Does Always-Connected Simply Mean Always-On?

Maybe when we think of connectivity, we think more about power than cellular. From the craving for longer battery life evident in our data, it sure seems that way. That is the one feature we all agree we want in our next notebook. Our panelists would even consider buying a PC with a processor they were not familiar with, as long as it delivered on battery: 29% said they would do so for a notebook delivering between 14 and 16 hours, another 17% wanted 16 to 18 hours, and another 17% wanted 18 to 20 hours. Early adopters are even more demanding, with 35% wanting between 14 and 16 hours before they would consider a processor brand they are not familiar with.

This is where the short-term opportunity for Qualcomm and Microsoft and their always-connected PC really is. Among the panelists looking to upgrade in the next 12 months, a whopping 67% would consider a PC with an unfamiliar processor brand if it delivered between 14 and 20 hours of battery life.

Help Me 5G, You’re My Only Hope

on February 13, 2018
Reading Time: 3 minutes

I live with modern technology, even the bleeding edge of technology, in my home. But I don’t live with the modern Internet. What I mean is that I don’t live with modern Internet speeds. Brace yourself when I tell you this, but my average broadband speed at home is 3.5 Mbps. Yes, megabits per second. My home broadband speed is not that different from the average speeds of third world countries. In fact, several third world countries have better broadband than I do.
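To put 3.5 Mbps in perspective, here is a quick arithmetic sketch; the file sizes are my own illustrative assumptions:

```python
# Download times at 3.5 Mbps vs. a 250 Mbps connection.
# File sizes are illustrative assumptions; protocol overhead is ignored.

def download_minutes(size_gb: float, speed_mbps: float) -> float:
    """Minutes to move size_gb gigabytes over a speed_mbps link."""
    size_megabits = size_gb * 8 * 1000  # 1 GB (decimal) = 8,000 megabits
    return size_megabits / speed_mbps / 60

for label, size_gb in [("HD movie (~5 GB)", 5.0), ("OS update (~1.5 GB)", 1.5)]:
    slow = download_minutes(size_gb, 3.5)
    fast = download_minutes(size_gb, 250)
    print(f"{label}: {slow:,.0f} min at 3.5 Mbps vs. {fast:.1f} min at 250 Mbps")
```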

The Modern State of WiFi

on February 13, 2018
Reading Time: 4 minutes

So easy to take for granted, yet impossible to ignore. I’m speaking, of course, of WiFi, the modern lifeblood of virtually all our tech devices. First introduced in 1999 as a somewhat odd marketing term (commonly believed to be short for “Wireless Fidelity”), the wireless networking technology leverages the 802.11 technical standards, which first appeared in 1997.

Since then, WiFi has morphed and adapted through variations including 802.11b, a, g, n, ac, ad, and soon, ax and ay, among others, and has become as essential to all our connected devices as power. Along the way, we’ve become completely reliant on it, placing utility-like demands upon its presence and its performance.

Unfortunately, some of those demands have proven to be ill-placed as WiFi has yet to reach the ubiquity, and certainly not the consistency, of a true utility. As a result, WiFi has become the technology that some love to hate, despite the incredibly vital role it serves. To be fair, no one really hates WiFi—they just hate when it doesn’t work the way they want and expect it to.

Part of the challenge is that our expectations for WiFi continue to increase, not only in terms of availability, but speed, range, number of devices supported, and much more. Thankfully, a number of improvements, in both component technology and product definitions, have started to appear that help bring WiFi closer to the completely reliable, utterly dependable technology we all want it to be.

One of the most useful of these for most home users is a technology called WiFi mesh. First popularized by smaller companies like Eero nearly two years ago, then supported by Google in its home routers, WiFi mesh systems have become “the” hot technology for home WiFi networks. Products using the technology are now available from a wide variety of vendors including Netgear, Linksys, TP-Link, D-Link and more. These WiFi mesh systems consist of at least two (and often three) router-like boxes that all connect to one another, boosting the strength of the WiFi signal and creating more efficient data paths for all your devices to connect to the Internet. Plus, they do so in a manner that’s significantly simpler to set up than range extenders and other devices that attempt to improve in-home WiFi. In fact, most of the new systems configure themselves automatically.

From a performance perspective, the improvements can be dramatic, as I recently learned firsthand. I’ve been living with about a 30 Mbps connection from the upstairs home office where I work down to the Comcast Xfinity home gateway providing my home’s internet connection, even though I’m paying for Comcast’s top-of-the-line package that theoretically offers download speeds of 250 Mbps. After I purchased and installed a three-piece Netgear Orbi system from my local Costco, my connection speed over the new Orbi WiFi network jumped by over 5x to about 160 Mbps—a dramatic improvement, all without changing a single setting on the Comcast box. Plus, I’ve found the connection to be much more solid and not subject to the kinds of random dropouts I would occasionally suffer through with the Xfinity gateway’s built-in WiFi router.

In addition, there were a few surprise benefits to the Netgear system that—though they may not be relevant for everyone—really sealed the deal for me. In another upstairs home office, there’s a desktop PC and an Ethernet-equipped printer, both of which had separate WiFi hardware. The PC used a USB-based WiFi adapter and the printer had a WiFi-to-Ethernet adapter. Each of the “satellite” routers in the Orbi system has four Ethernet ports supporting up to Gigabit speeds, allowing me to ditch those flaky WiFi adapters and plug both the PC and the printer into a rock-solid, fast Ethernet connection on the Orbi. What a difference that made as well.

The technology used in the Netgear Orbi line is called a tri-band WiFi system because it leverages three simultaneously functioning 802.11 radios, one of which supports 802.11b/g/n at 2.4 GHz for dedicated connections with older WiFi devices, and two of which support 802.11a/n/ac at 5 GHz. One of the 802.11ac-capable radios handles connections with new devices, and the other is used to connect with the other satellite routers and create the mesh network. The system also uses critical technologies like MU-MIMO (Multi-User, Multiple Input, Multiple Output) for leveraging several antennas, and higher-order modulation schemes like 256 QAM (Quadrature Amplitude Modulation) to improve data throughput speeds.
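For the curious, those headline WiFi link rates fall out of a simple formula: data subcarriers × bits per symbol (set by the QAM order) × coding rate × spatial streams, divided by the symbol duration. A minimal sketch, assuming the standard 802.11ac 80 MHz channel parameters and a short guard interval; the article does not specify the Orbi’s exact configuration:

```python
from math import log2

def phy_rate_mbps(qam_order: int, coding_rate: float, spatial_streams: int,
                  data_subcarriers: int = 234,
                  symbol_duration_us: float = 3.6) -> float:
    """Theoretical 802.11ac PHY rate. Defaults assume an 80 MHz channel
    (234 data subcarriers) and a short guard interval (3.6 us symbols)."""
    bits_per_symbol = log2(qam_order)  # 256-QAM carries 8 bits per symbol
    bits_per_ofdm_symbol = data_subcarriers * bits_per_symbol * coding_rate
    return bits_per_ofdm_symbol * spatial_streams / symbol_duration_us

# 256-QAM at 5/6 coding, one spatial stream: the familiar ~433 Mbps figure
print(f"{phy_rate_mbps(256, 5/6, 1):.0f} Mbps per stream")
# Two streams, as on many 802.11ac client devices
print(f"{phy_rate_mbps(256, 5/6, 2):.0f} Mbps with two streams")
```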

Looking ahead in WiFi technology from a component perspective, we’ve started to see the introduction of pre-standard silicon for the forthcoming 802.11ax standard, which offers some nominal speed improvements over existing 802.11ac, but is more clearly targeted at improving WiFi reliability in dense environments, such as large events, tradeshows, meetings, etc. There’s also been some discussion about 802.11ay, which is expected to operate in the 60 GHz band for high speeds over short distances, similar to the current 802.11ad (formerly called WiGig) standard.

As with previous generations of WiFi, there will be chips from companies like Qualcomm that implement a pre-finalized version of 802.11ax for those who are eager to try the technology out, but compatibility could be limited, and it’s not entirely clear yet if devices that deploy them will be upgradable when the final spec does get released sometime in 2019.

The bottom line for all these technology and component improvements is that even at the dawn of the 5G age, WiFi is well positioned for a long, healthy future. Plus, even better, these advancements are helping the standard make strong progress toward the kind of true utility-like reliability and ubiquity for which we all long.

Podcast: Apple HomePod, Google-Nest Integration, Twitter and Nvidia Earnings

on February 10, 2018
Reading Time: 1 minute

This week’s Tech.pinions podcast features Ben Bajarin, Carolina Milanesi and Bob O’Donnell discussing Apple’s HomePod smart speaker, the re-integration into Google of the Nest smart home products business, and the quarterly earnings for Twitter and Nvidia.

If you happen to use a podcast aggregator or want to add it to iTunes manually, the feed for our podcast is: techpinions.com/feed/podcast

Learnings from Qualcomm’s ‘5G Day’

on February 9, 2018
Reading Time: 3 minutes

This past week, Qualcomm hosted analysts and trade press for a ‘5G Day’, where they charted their progress on 5G and announced 18 operator and 19 OEM commitments to their X50 5G chipset. But in addition to the major news, this presents a good opportunity to reflect on the state of 5G, particularly on the eve of Mobile World Congress, where I’m told you’re not admitted unless you pledge to say the words ‘5G’ at least 50 times a day.

So here are my top takeaways:

  1. 5G is on track. If only our government could pass a budget or we could get going on repairing our infrastructure with this level of urgency! Qualcomm’s announcements about modem availability and OEM/operator agreements (plus expected news from the network equipment folks at MWC) tell us that we will see large-scale trials and some initial mobile 5G services (i.e., not just fixed wireless) launch in late 2018, with more widespread launches and commercial device availability in 2019. This initial phase will be ‘non-standalone’, which means a simultaneous connection to both LTE and 5G. We’ll see a handful of cities turned up initially, and within those cities, Swiss-cheese-style coverage on a limited number of devices. This will be on a combination of sub-6 GHz and mmWave spectrum, depending on the operator (but tending toward sub-6 GHz initially).
  2. The Level of Technical Accomplishment is Impressive. A few of the highlights:
  • Smaller than expected form factor in the X50 chip, which will help from the standpoint of power efficiency.
  • Carrier aggregation of up to 8 channels in the X50 chip. That’s what will allow us to get to gigabit or better service, if the operators have the spectrum and open up the floodgates (a rough back-of-the-envelope sketch follows this list).
  • Real-world demos that showed download speeds of 4 Gbps or better, latency below 5 milliseconds (ms), and, in a pleasant surprise, upload speeds of up to 360 Mbps (a dramatic improvement over today’s LTE).
  • Important advances in antennas, beam-forming capabilities, and spatial multiplexing. This is reflected in the improvements in latency and performance at the cell edge.
  • The sheer complexity of RF systems and the number of bands and band combinations that have to be successfully supported.
  3. Millimeter Wave Remains a Bit of a Wildcard. My impression is that there is still a lot being figured out about how to design devices given the finickiness of mmWave signals. For example, the signal degrades much more quickly if your hand is covering part of the phone. This presents particular challenges in antenna placement. Another under-discussed wildcard, in my view, is what battery performance will look like in mmWave.
  4. LTE Will Play an Important Role for the Foreseeable Future. Irrespective of the standalone/non-standalone discussion, LTE is going to be a big part of 5G. For the next several years, it’s going to be ‘islands of 5G’ in a sea of LTE. We will need LTE in order to continue to provide reliable voice coverage (yes, some people still make calls on their phones), since a standalone 5G network would be all IP and would not have reliable enough coverage to support voice. Even with all the hype of 5G, the LTE roadmap is compelling. Don’t get me wrong – 5G is a big jump up from 4G in many respects and opens up some new market opportunities. But, if the operators enable some of the new capabilities in the LTE roadmap, your phone will be able to do pretty much anything you would want it to do, until someone comes along with that killer AR/VR app. Economics and data capacity will be big drivers of the move to 5G.
  5. 5G Will Be About A Lot More Than Smartphones. This is really one of the big stories in 5G. It’s being built to support a very large number of connected devices, with highly varying demands on the network. This has been the #2 or #3 item in 5G PowerPoints up till now, but it is now being treated as a genuine development priority. At the Qualcomm event, we saw some impressive real-world simulations of millions of IoT devices connected within a part of a city.
  6. Major Investment is Going into Building New ‘Ecosystems’ for 5G. I was impressed with the level of effort going into thinking about specific solutions for some very large verticals, among them the automotive and industrial IoT segments.
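On the carrier aggregation point above, here is a rough back-of-the-envelope sketch of how eight aggregated channels reach multi-gigabit speeds. The 100 MHz channel width and the ~5 bits/s/Hz spectral efficiency are my own illustrative assumptions, not figures from Qualcomm:

```python
# Back-of-the-envelope aggregate throughput from carrier aggregation.
# Channel width and spectral efficiency are illustrative assumptions.

N_CARRIERS = 8          # aggregated channels, per the X50's capability
CHANNEL_HZ = 100e6      # a typical mmWave component carrier (assumed)
SPECTRAL_EFF = 5.0      # bits/s/Hz under good radio conditions (assumed)

aggregate_gbps = N_CARRIERS * CHANNEL_HZ * SPECTRAL_EFF / 1e9
print(f"~{aggregate_gbps:.0f} Gbps aggregate")  # ~4 Gbps, in line with the demos
```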

There will be a lot more 5G related news in the coming weeks. But on some of the most challenging aspects of 5G development, things appear to be well on track and the accomplishments are impressive.

News You Might Have Missed: Week of February 9, 2018

on February 9, 2018
Reading Time: 5 minutes

Nest is rolled back into Google

On Wednesday, Alphabet Inc. announced that it had rolled Nest into its hardware group. Under the new org structure, Nest CEO Marwan Fawaz reports to Google’s hardware chief, Rick Osterloh, a former Motorola executive who took charge of all of Google’s consumer devices in 2016. That portfolio includes Google Home smart speakers, Pixel smartphones, and Chromecast streaming devices.

Attack of the Chromebooks

on February 8, 2018
Reading Time: 4 minutes

Google is about to make a hard push with Chromebooks, which have had back-to-back holiday quarters in the US as one of the bright spots for growth. Google seems to be orienting itself to initiate a strategy to grow Chromebooks outside of the only market where they have meaningful sales: education.

Ex-Intel President Leads Ampere into Arm Server Race

on February 8, 2018
Reading Time: 4 minutes

In a world where semiconductor consolidation is the norm, it’s not often that a new player enters the field. Even fabless semiconductor companies have been the target of mergers and acquisitions (Qualcomm being the most recent and largest example), making the recent emergence of Ampere all the more interesting. Ampere is building a new Arm-based processor and platform architecture to address the hyperscale cloud compute demands of today and the future.

Though the name will be new to most of you, the background and history are not. Owned by the Carlyle Group, which purchased the Applied Micro CPU division from MACOM last year, Ampere has a solid collection of CPU design engineers and has put together a powerful executive leadership team. At the helm as CEO is Renee James, a former President of Intel who left the company in 2015. She brings a massive amount of experience from the highest levels of the world’s largest semiconductor company. Ampere also touts an ex-AMD Fellow, a former head of all x86 architecture at Intel, an ex-Intel head of platform engineering, and even an ex-Apple semiconductor group lead.

Architecturally, the Ampere platforms are built with a custom core design based on the Arm architecture, utilizing the ARMv8 instruction set. Currently shipping is the 16nm processor codenamed Skylark, with 32 cores and a 3.0 GHz or higher clock speed. The platform includes eight DDR4 channels, 42 lanes of PCI Express 3.0, and a TDP of 125 watts. The focus of this design is on memory capacity and throughput, with competitive SPECint performance. In my conversation with James last week, she emphasized that memory and connectivity are crucial components of targeting lower costs for the cloud infrastructure that demands them.
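Those eight DDR4 channels are the heart of the memory story. Peak theoretical bandwidth is simply channels × transfer rate × bytes per transfer. A minimal sketch, assuming DDR4-2666 memory, since Ampere’s stated specs do not include the memory speed:

```python
# Peak theoretical DDR4 bandwidth for an eight-channel platform.
# The DDR4-2666 transfer rate is an assumption for illustration.

CHANNELS = 8                  # per Ampere's stated platform spec
TRANSFERS_PER_SEC = 2666e6    # DDR4-2666 (assumed speed grade)
BYTES_PER_TRANSFER = 8        # 64-bit wide channel

bandwidth_gbs = CHANNELS * TRANSFERS_PER_SEC * BYTES_PER_TRANSFER / 1e9
print(f"~{bandwidth_gbs:.0f} GB/s peak")  # ~171 GB/s across eight channels
```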

The second generation of Ampere’s product stack, called Quicksilver, is coming in mid-2019. It will move to the updated ARMv8.2 instruction set, increase core count, improve overall IPC, and add multi-socket capability. Memory speed will get a bump, and connectivity moves to PCI Express 4.0. It will include CCIX support as well, an industry-standard cache-coherent interface for connecting processors and accelerators from various vendors.

Interestingly, this part will be built on TSMC’s 7nm process technology, which Ampere CEO James says will have a “fighting chance” to compete with or beat the capabilities provided to Intel by its own in-house developed process technology. That isn’t a statement to make lightly, and it puts into context the potential impact that Intel’s continued 10nm delays might have on the company long-term.

For systems partnerships, Ampere is working with Lenovo. This is a strong move by both parties, as Lenovo has significant OEM and ODM resources, along with worldwide distribution and support. If the Ampere parts do indeed have an impact on the cloud server ecosystem, having a partner like Lenovo that is both capable and eager to grow in the space provides a lot of flexibility.

Hardware is one thing, but solving the software puzzle around Ampere’s move into the hyperscale cloud server market is equally important. James told me that the team she has put together knows the importance of a strong software support system for enterprise developers, and having seen that happen firsthand at Intel gives her a distinct advantage. Even though other players like Arm and Qualcomm are already involved in the open source community, Ampere believes that it will be able to make a more significant impact in a shorter period, advancing support for all Arm processors in the server space. Migrating the key applications and workloads, like Apache, memcache, Hadoop, and Swift, to native, and most importantly efficient, code paths is required for wide-scale adoption.

Followers of the space may be wondering why now is the right time for a company like Ampere to succeed. We have seen claims and dealt with false promises from numerous other Arm-based server platform providers, including AMD and the source of Ampere’s current team, Applied Micro. Are the processors that much different in 2018 from those that existed in 2013? At their core, no. But it’s the surrounding tentpoles that make it different this time.

“Five years ago, this couldn’t have happened,” said James in our interview. The Arm architecture and instruction set have changed, with much more emphasis on the 64-bit superset and on expanding its capability to address larger and faster pools of memory. Third-party foundries have caught up to Intel as well – remember that James believes TSMC’s 7nm node will rival Intel’s competitively for the first time. Finally, the workloads and demands of the datacenter have changed, moving even further away from the needs of “big cores” and towards the smaller, more power-efficient cores Ampere and other Arm options provide.

Obviously, that doesn’t apply to ALL server workloads, but the growth in the market is in that single-socket, memory- and connectivity-focused segment. AMD backs up Ampere’s belief here with its own focus on single-socket servers to combat the Intel-dominated enterprise space, though EPYC still runs at higher power and performance levels than anything from the Arm ecosystem.

James ended our interview with a comparison of the Arm server options today to x86 servers more than 25 years ago. At the time, the datacenter was dominated by Sun and Sparc hardware, with Sun Microsystems running advertising claiming that Intel’s entry into the space with “toy” processors wasn’t possible. Fast forward to today and Intel has 99% market share in the server market with that fundamental architecture. James believes that same trajectory lies before the Arm-based counterparts rising today, including Ampere.

There is still a tremendous mountain to climb for both Ampere and the rest of the Arm ecosystem, and to be blunt, there is nothing that proves to me that any one company is completely committed. Qualcomm announced its Centriq CPUs last year, and Ampere claims to have started sampling in 2017 as well. We don’t yet have a single confirmed customer that has deployed Arm-based systems in a datacenter. Until that happens, and we see momentum pick up, Ampere remains in the position where previous and current Arm-based servers are found: behind.

Can the PC Market Ever Grow Again?

on February 7, 2018
Reading Time: 3 minutes

One of the big questions we at Creative Strategies get asked by all of the big PC and semiconductor companies with skin in the PC game is whether the PC market could ever grow again. If you look at the Gartner chart below, you see that, starting in 2012, the PC industry has been in significant decline. Since 2011, the PC market has shrunk by 32%, and while 2017 numbers are not in yet, we believe it was down another 3-4% last year. That is a huge drop in PC sales, one that has had a major impact on just about everyone in the PC ecosystem today.
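For context, a 32% cumulative decline annualizes to roughly a 6% drop per year. A quick sketch of that arithmetic, with the six-year window (2011 to 2017) being my own reading of the framing above:

```python
# Annualize the cumulative PC market decline described above.
# The six-year window is an interpretive assumption.

cumulative_decline = 0.32
years = 6
annual_rate = (1 - cumulative_decline) ** (1 / years) - 1
print(f"Average annual change: {annual_rate:.1%}")  # about -6.2% per year
```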

Reflections on an interview with Uber’s Chief Brand Officer Bozoma “Boz” Saint John

on February 7, 2018
Reading Time: 4 minutes

Last Thursday I had the great opportunity to attend an event titled “Driving Change” at the Computer History Museum in Mountain View, where Verge editor Lauren Goode masterfully moderated an hour-plus conversation with Uber’s Chief Brand Officer Bozoma Saint John.

Like many, I was blown away at Apple’s developer conference in 2016 when the force of nature that is Bozoma Saint John walked onto the stage and instantly captivated a room full of geeks and press. The self-proclaimed “badass Boz” became the face of Apple Music and, to some extent, helped Apple shake off that “all white male” image.

When Saint John took her role at Uber as Chief Brand Officer, I wished her well, given the mess Uber was in and how much work turning the brand around would require. It is not often in tech that you see a Black leader, and even less often a Black woman as a leader. Saint John, while she has her work cut out for her, has the opportunity to be the mastermind of a very big brand turnaround.

When my husband flagged this event to me, I put it on the calendar straight away. As a tech analyst, I was eager to understand more about the Uber culture, her role and how she planned to make a difference. I also happen to have a biracial daughter. So as a mom, I look for opportunities for her to see and meet smart, driven, successful women and women of color in particular. This event was a great double whammy!

Needless to say, I got what I was hoping for and then some!

Saint John made several interesting points on marketing and branding, one of which was about measuring success. When asked how she would measure success, she said she would use all the available data, such as net promoter scores and brand affinity. However, she believes more in softer measures like “being proud of walking to the store wearing an Uber t-shirt.” While this might seem like a warm and fuzzy kind of answer, I think it is precisely the kind of measure Uber needs.

While I was interested in what Saint John had to say about marketing and branding, I was particularly looking forward to listening to her views on diversity in tech. Here are some of my takeaways.

On Diversity…

There were so many good points made in the conversation. As I was listening to what was being discussed, I kept thinking about the fact that the room I was in was the most diverse audience I had been part of since I moved to Silicon Valley six years ago. It was undoubtedly the most diverse audience I have seen at any event directly or indirectly related to tech. My daughter had never seen so many black women in one room, and it was empowering for her. Their participation in the room and on Twitter told me that, aside from marketing and tech, they were there for leadership and inspiration. This speaks to the need for diversity leaders we can identify with. Diversity leaders who can show us that there is room at the table for people like us. Of course, this does not stop with gender and race. Representation is crucial.

When it comes to tech companies, Saint John said it is shameful how few black employees there are. Changing this, she argued, is not the sole job of a diversity officer, more often than not a person of color. The responsibility of driving diversity rests with the CEO and the whole management of a company. As I was listening, I could not help but think she was referring to Apple, which might be unfair on my part. Tim Cook has been a very vocal advocate for diversity, but while the numbers within the company have been growing, they have been doing so at a very slow pace. Yet Apple is ahead of many other tech companies, which is what is most discouraging. Hiring practices must change to see a significant impact. If you are trying to diversify your workforce and your talent pool remains Silicon Valley, things will not change. Tech must broaden its reach when it comes to talent and must support schools and organizations across the US that work with kids from minority groups, so they have a chance to get into tech, grow their talent, and get ready for the job market. Such work must be done with a sense of urgency, and higher goals should be set for what is considered success. A one-percentage-point increase in lower-paying jobs is not what companies should be aiming to achieve over a year.

Another fascinating point Saint John made was about the great responsibility minority leaders carry on their shoulders. They get judged for the groups they represent: women, Black, gay, Muslim… But this is not how any white person is judged. I am sure that for any minority person reading this article I am stating the obvious, but this might be news to others. I am fortunate; I only have gender to contend with, but I often think about how something I do or say might reflect on other women. The reality, though, is that I am me and, for good or bad, there is nobody else like me, so why should I feel responsible, or be made to feel accountable, for a whole group of people?

On Being Yourself…

“If you say something, own it! Don’t add LOL at the end of your strong statement to soften the message,” said Saint John to a member of the audience who signed her question “impatient black woman LOL.”

I know I do that all the time, especially with male colleagues, clients, and peers: the verbal equivalent of “just kidding” that I add at the end of a criticism or an opinion just so that I do not come across as threatening. I think many women do this, independent of the color of their skin, and we really should stop worrying and start owning what we say.

Maybe the best message that could have been given to my daughter, and to any young person who looks or feels different from the status quo, was that it is ok to be different. Not only is it ok to be different, but if you are in an environment that does not allow you to be yourself, you should not be there. Nor should you fit a stereotype that others have created for you so they feel safe.

I was not sure how much of what was said on stage sank in with my daughter. Meeting Lauren and Boz at the end of the night was surely the highlight of her evening. On the way home, though, when I asked her what she had learned, she said: “I learned that is ok to be me and that I am not everybody else that looks like me I am just me.” I smiled and kept on driving, pride oozing from every pore.

Wearables to Benefit from Simplicity

on February 6, 2018
Reading Time: 3 minutes

Sometimes simplicity really is better—especially for tech products.

Yet, we’ve become so accustomed and conditioned to believe that tech products need to be sophisticated and full-featured, that our first reaction when we see or hear about products with limited functionality is that they’re doomed to failure.

That’s certainly been the case for a while now with wearables, an ever-evolving category of devices that has been challenged to match the hype surrounding it for 5-plus years. Whether head-worn, wrist-worn, or ear-worn, wearables were supposed to be the next big thing, in large part because they were going to do so many things. In fact, there were many who believed that wearables were going to become “the” personal computing platform/device of choice. Not surprisingly, expectations for sales and market impact have been very large for a long time.

Reality, of course, has been different. It’s not that wearables have failed as a category—far from it—but they’ve certainly had a slower ramp than many expected. Still, there are signs that things are changing. Shipments of the Apple Watch, which dominates many people’s definition of the wearable market, continue to grow at an impressive rate for the company. In fact, some research firms (notably IDC) believe the numbers surpassed a notable marker this past quarter, out-shipping the collective output of the entire Swiss watch industry in the same period. Now, whether that’s really the most relevant comparison is certainly up for discussion, but it’s an impressive data point nonetheless.

Initially, the Apple Watch was positioned as a general-purpose device, capable of doing a wide range of different things. While that’s certainly still true, over time, the product’s positioning has shifted towards a more narrowly focused set of capabilities—notably around health and fitness. While it’s hard to specifically quantify, I strongly believe that the narrower device focus, and the inherent notion of simplicity that goes along with it, have made significant contributions to its growing success. When it comes to wearables, people want simple, straightforward devices.

With that in mind, I’m intrigued by news of a new, very simple glasses-based wearable design from an organization at Intel called the New Devices Group. These new glasses, called Vaunt, look very much like traditional eyeglasses, but feature a low-power laser-based display that shoots an image directly onto your retina. Enabled by a VCSEL (vertical-cavity surface-emitting laser) and a set of mirrors embedded into the frame, the Vaunt creates a simple, single-color LED-like display that appears to show up in the lower corner of your field of view, near the edge of your peripheral vision, according to people who have tried it. (Here’s a great piece describing it in detail at The Verge.)

There are several aspects of the device that seem intriguing. First, of course, the fact that it’s a simple, lightweight design, not bulky or unusual, and therefore draws no attention to itself, is incredibly important. In an era when every device seems to want to make a statement with its presence, the notion of essentially “invisible” hardware is very refreshing. Second, the display’s very simple capabilities essentially prevent it from overwhelming you with content. Its current design is only meant to provide simple notifications or a small amount of location or context-based information.

In addition, while details are still missing, there doesn’t seem to be a major platform-type effort, but rather a simple set of information services that could theoretically be embedded into the device from the start, or slowly added to it in an app-like fashion, more akin to how skills get added to Amazon’s Alexa. So, for example, it could start out by providing the same kind of notifications you currently get on your phone, then start to add location-based services, such as directions or simple ratings for restaurants that are literally right in front of you, as well as intelligent contextual information about a person you might be having a conversation with.

Key to all of this, however, is that the design intentionally minimizes the impact of the display by putting it out of the way, allowing it to essentially disappear when you don’t actively look for it. That ability to minimize the impact of the technology—both functionally and visibly—as well as to intentionally limit its capabilities is a critically important and new way of thinking that I believe can drive wearables and other tech devices to new levels of success.

Ironically, as we look to the future evolution of tech devices, I think their ability to become more invisible will lead them to have a more pervasive, positive impact on our lives.

Diving Deeper on HomePod

on February 6, 2018
Reading Time: 4 minutes

I encourage you to read my public thoughts on HomePod from spending about a week with Apple’s newest addition to the product family. I think a caveat needs to be made with my take on HomePod: I’m not the normal consumer who will get their hands on this product and form an opinion. Due to the nature of my job, I use more technology, try a vast array of products, and integrate them all into my life in ways most consumers never will. So the comparisons I can make of products against each other are not things normal people will ever experience.

You Can’t Unhear Apple’s HomePod

on February 6, 2018
Reading Time: 7 minutes

Before receiving my Apple HomePod to review, I found myself in a house in Noe Valley in San Francisco. Apple invited me to see and experience HomePod in a unique home setup before taking one home to try for myself. I’ll spare you the details of the entire demo as there was one demonstration where HomePod’s value was truly made clear.

On an entertainment center that looked like a retro design out of the 70s, with silver and copper knobs, wood like old cedar, and metallic grates, sat a Sonos Play One (Alexa enabled), a Google Home Max, Apple’s HomePod, and a second-generation Amazon Echo.

This demo was not that different from the one I had in June at Apple’s Worldwide Developers Conference. At that event, the lineup was HomePod, an Amazon Echo (first gen), and a Sonos Play:3. Even in that demo, which was in a highly controlled room, HomePod sounded head and shoulders better.

This demo included the much-praised Google Home Max and a quality new speaker from Sonos, the Alexa-enabled Play One, of which I personally own three. When you listened to each speaker against the HomePod, you realized how the audio engineers at each company focused on different things. The Google Home Max focused on bass; it had a lot of bass, so much, in fact, that the product ships with a rubber mat it recommends you place the speaker on. Given how much bass the Google Max emphasizes, at the expense of clean vocals and certain instrumentation, I imagine this rubber mat is meant to help limit the vibrations the speaker may make on a hard surface. The Google Max was too heavy on bass for my preferences.

The Sonos Play One, again, of which I own three, was the second-best sounding after HomePod. I felt the Sonos sound engineers (in comparison to the others) did a good job balancing sound in the mid-range without emphasizing too much bass or too much treble. It was clean and balanced. The Echo sounded the worst by a large margin. But the HomePod was a different audio experience entirely.

Listening to these demos side by side, and even once I got HomePod home and played it in my living room against my Sonos Play One and my Amazon Echo, what hit me was that once you hear the HomePod, it is hard to unhear it. Once you listen to it and experience it for yourself, there is no going back. My Sonos, as great as it sounds, and my Echos just didn’t sound the same after listening to the same songs on the HomePod. You can’t unhear the quality of the HomePod, and it will change your opinion of many other speakers you may own. I can say, with absolute confidence, that the HomePod will be the best sounding speaker many people have ever owned.

Many of my friends have asked me how I would describe what HomePod sounds like. With the caveat that you just have to hear HomePod to truly experience it, let me attempt to articulate my experience.


The Sound Experience

HomePod has what can only be described as the most balanced audio, not just of any smart speaker but of any speaker I currently own, a group that includes a number of Sonos speakers and a Bose home theater system. By balanced, I mean evenly distributed, quality sound. With many speakers and sound systems, there is a zone of perfection: a specific place, or alignment of your body, where the system sounds the best. HomePod is unique in that it doesn’t have a singular place where it sounds best. Apple’s engineers designed HomePod to sound its best no matter where you are in the room. In our demo, Apple explained how this was done technically, which is beyond my limited understanding of audio engineering. But I did try to prove them wrong, and failed. HomePod truly did sound great from any place in the room.

The other thing that really impressed me about HomePod was how great it sounded at nearly every volume level. If you have any experience with speakers, you know there is also a sweet spot for volume. Too low and you lose almost all bass; too high and you blow out the high end/treble, and often your ears hurt as the high-end parts of the audio start to distort and lose clarity. With HomePod, even at low volumes you heard balanced bass and clarity across the spectrum of sound frequencies, as was the case when you pushed HomePod past 90% volume. What was impressive was how you could stand right next to HomePod even when the volume was quite high and not feel like your ears were being blown out; it maintained clarity in the sound.

Interestingly, thanks both to the balance of the audio and to the way the sound is distributed, I found HomePod even sounded terrific at a distance, meaning when I was in other rooms of my house. HomePod lived in my living room, and even as I moved as far away as the upstairs, I could still hear the distinct bass, vocals, and overall clarity. When I tried the same with my Sonos or Bose, HomePod won in all of these tests of overall audio quality.

I have no doubt HomePod will compete with the best speakers in your house, even if you have an expensive, high-end setup. Granted, that is not a massive portion of the market, which is why I’m confident in saying that for most consumers HomePod will be the best sounding speaker they have ever owned.

Siri on HomePod
Now, we have to talk about the experience with Siri. It is difficult for me to go as deep as I want here without fully articulating why I feel Siri on HomePod is an important strategic initiative for Apple. I have gone into depth for our subscribers in these articles, which I encourage reading. In short, Apple’s strategy with Siri has to be zero-friction access no matter where I am or what I’m doing. Before HomePod came into my home, I used Alexa dramatically more than Siri. The reason was simple: I didn’t have to do anything but speak. You may say, well, Siri is on your wrist. True, but I have to raise my wrist and tilt my Apple Watch toward me to initiate Siri. As easy as that may sound, I’m not always in a position to do it, especially when I’m cooking or doing things where my hands are occupied. Second, you may say, well, Siri is on your phone, always listening. True, but my phone is not always near me when I’m home. Sometimes I set it on a table in the other room. If it is near me, it is in my pocket, and while “Hey Siri” may work when my iPhone is in my pocket, that’s an awkward place to engage with Siri.

Having Siri on a truly always-listening, loud, ambient speaker creates genuinely zero-friction engagement with Siri in my home. HomePod became my preferred way to interact with Siri, and I found myself using Apple’s assistant dramatically more than I ever did before when at home, confirming for me why this is such an important strategic move for Apple. And luckily, Siri worked very well and delivered on many of my expectations, even answering some of my past criticisms.

Now, Siri on HomePod is not much different from Siri on your other Apple products. She is a little more limited on HomePod and can’t do everything you can do on your iPhone. The reason is that HomePod is designed to be used by everyone in your household, not just the person who set it up. This works as expected: Siri can play songs, set alarms, check the weather, and do a bunch of general tasks for everyone in the household, even when the person whose iPhone/iCloud account it is tied to is not there.

When it came to music, Siri knocked it out of the park. In fact, because Siri learns about its owner, when I asked to play music, say, “play Jack Johnson radio,” she would say, “sure, here is a personalized X for you.” What’s happening is that Siri is acting as a “mixologist,” as Apple likes to say; essentially she is playing DJ according to my music preferences. This worked fantastically, and I have about as wide a range of music interests as anyone. To me, good music is good music, and I enjoy the creativity of all artists and genres. Usually, this has given Siri trouble, but I found the playlists she generated to be very relevant. As I mentioned above, this feature is not unique to HomePod and functions the same way on other Siri-enabled devices, but I make the point because the music functionality is particularly useful with HomePod, given that music is what most consumers will primarily use it for. Which is why I was happy Siri delivered on the music experience.

One part where Siri on HomePod really stood out against other smart speakers was the communications element, particularly text messages. Now, this only works for the person who set up HomePod, which in this case was me, but being able to send messages and have messages read to me was extremely useful in the home environment. Given that I'm in Apple's ecosystem, I knew the communications/productivity angle would be the one that differentiated Siri from Amazon's Alexa and Google Assistant.

Siri on HomePod is not as full-featured as Siri on your Macs or iOS devices, and this was done by design. Siri on HomePod focuses on doing a few things well, both for the individual and for the communal family of the house, and in my experience it did those things well. Alexa and Google Assistant have more features, for now, and are definitely more advanced in their functionality.

Overall, what stood out in my experience was that the deeper you are in Apple's ecosystem, the more value you will find in HomePod if you are in the market for a smart speaker. Being able to see what song is playing on your Apple Watch, quickly hand off a phone call from your iPhone to the HomePod as a speakerphone, or change songs from your Apple Watch were all key differentiators for me. Most consumers don't have as many Apple products as I do, but for those who do, HomePod is a great addition to the ecosystem, and its functionality will be appreciated more than that of competing smart speakers.

To Buy or Not to Buy
Thinking about HomePod within the broader market for smart speakers, it is smart for Apple to emphasize not just music quality but also Siri as it relates to music. Both of those use cases deliver fully on their promised value. Music (specifically the experience with Apple Music) is where HomePod, and Siri, will shine. Consumers who lean toward these value propositions first and foremost will be delighted.

To the question of whether HomePod is worth the premium over a product like the $199 Sonos One, which has Amazon's Alexa built in, I'd say absolutely, if you truly care and are picky about sound quality and/or you are deeply embedded in Apple's ecosystem.

For the time being, if you want a great-sounding speaker with multi-room capability and a somewhat more full-featured assistant in Amazon's Alexa (with Google Assistant support coming), then the Sonos One is a great option and a great value for the money. In fact, the more I compared the Sonos One to the HomePod, the more impressed I was with the Sonos's sound quality for the price, even though HomePod did sound better.

This space is going to be interesting to watch. We all have our suspicions about how this market may play out, but we now have a legitimately competitive market, with options for consumers at many price points and feature levels. Ultimately, this is when a market gets exciting, because with all this competition, consumers win.

Why the Connected PC Initiative Misses the Mark

on February 4, 2018
Reading Time: 4 minutes

Last December, Qualcomm held a major media event in Maui, HI to launch what it calls its connected PC initiative. Qualcomm is best known for the cellular radios found in almost all smartphones, and its new Snapdragon 835 and 845 processors are now capable enough to also power a laptop. The key idea is to add a cellular connection to laptops using its Snapdragon processors, making them "connected PCs," since such a laptop would always have a connection via Wi-Fi or cellular, just as our smartphones do today.

Joining them in this announcement was Microsoft, which strongly supported Windows on a Qualcomm processor, also known as Windows on ARM. If this sounds familiar, Microsoft launched a similar program with various ARM processor companies in 2012, but it failed because the processors back then were not powerful enough to handle Windows, and Windows had to be run in an emulation mode that made those ARM-based laptops run sluggishly at best.

This time around, the processor Qualcomm is bringing to the table is fast enough to run Windows 10, even when, in some cases, it has to revert to emulation mode to do so.

As I sat through the major presentation by Qualcomm and Microsoft executives describing their new "Connected PC" program at the Maui event, my first thought was, "Is this just a new try at Windows on ARM?", remembering what a disaster that was the last time it was tried. But as I checked out the demos and did some one-on-ones with Qualcomm and Microsoft executives about what a more powerful Snapdragon processor and a tailored version of Windows 10 S created for this program could deliver, I saw that the idea had real merit and potential.

While in theory I like the idea of always being connected, anytime and anywhere, I knew from our research that cellular connectivity was not a high priority among the features people want in a laptop. Indeed, cellular modems have been available as laptop options for over ten years, and demand for the feature has remained very low.

Another good benchmark for measuring demand for cellular connectivity beyond a smartphone is the cellular activation rate of iPads. It turns out that around 50% of all iPads sold are bought with a cellular modem included. But our research shows that less than 20% of those cellular-equipped iPads ever activate the modem.
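A quick back-of-the-envelope calculation, sketched here in Python and treating the rough figures above as exact, shows how small the actively used base really is:

```python
# Back-of-the-envelope: what share of all iPads sold end up with an
# active cellular plan, using the rough figures cited above.
cellular_attach_rate = 0.50   # ~50% of iPads sold include a cellular modem
activation_rate = 0.20        # under 20% of those ever activate it

active_share = cellular_attach_rate * activation_rate
print(f"iPads with an active cellular plan: under {active_share:.0%} of all sold")
# -> under 10%, since the 20% activation figure is itself an upper bound
```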

The key reason for the lack of real demand for a cellular connection in a laptop or tablet is the additional cost it adds to a person's cell phone bill. When I asked one major cellular carrier how it would price the connection on a connected PC, it said it would be an additional $10 to $12 a month, and data used on the laptop would count against the monthly data allotment the person already pays for.

I could imagine that a younger user who watches a lot of YouTube videos and accesses a lot of content on a laptop could go through an allotted "all-you-can-eat" 22-25GB personal data plan in one or two weeks, after which data speeds on both the smartphone and the connected laptop drop to 128 kbps.
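To make that concrete, here is a rough sketch of the math; the streaming bitrate and viewing hours below are illustrative assumptions on my part, not survey data:

```python
# Rough sketch: how quickly laptop video streaming can burn through a
# 22-25GB high-speed allotment. The bitrate and viewing hours are
# illustrative assumptions, not measured figures.
gb_per_hour = 2.5        # assumed ~1080p streaming data rate
hours_per_day = 1.5      # assumed daily viewing on the laptop
soft_cap_gb = 22         # low end of the 22-25GB range cited above

days_to_cap = soft_cap_gb / (gb_per_hour * hours_per_day)
print(f"High-speed data gone in about {days_to_cap:.0f} days")
# -> ~6 days; even lighter viewing lands within the one-to-two-week
#    window described above, after which speeds drop to 128 kbps.
```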

Our research on the demand for cellular in a laptop was done some time back, so early this year we updated the survey by asking people, "What are the three most important features you want in the next notebook or laptop you will buy?" As the chart below shows, long battery life, more memory, and larger hard drive storage topped the list. Cellular connectivity came in farther down, at just over 20% interested, which pretty much maps to our iPad research mentioned above.

The good news for Qualcomm and Microsoft is that while both touted the "connected PC" initiative at the event, they also emphasized that these new Snapdragon processors can deliver as much as 22 hours of continuous battery life. In talks with their execs after the main announcement, they hinted that people could probably get even more hours of battery life, depending on how OEM partners configure the machines and which versions of the OS they use from Microsoft.

My fear for both Qualcomm and Microsoft is that leading with the connected PC story, and the subsequent marketing pushes around that focus, will not drive the kind of adoption they hope to get from this program, and we could have another Windows on ARM failure in the works. The research we did a year ago, and again in the last week, shows that the real interest is in longer battery life. That is what would drive significant demand for Windows on ARM with Qualcomm this time around, provided it delivers the kind of performance stated at the launch event in Maui in early December. They should brand this the "All Day PC" and make it the new battle cry for laptop upgrades going forward.

This is an important moment for the PC industry. While consumers like new designs that are thinner and lighter, as our survey points out, that is not what drives purchases of new laptops. Longer battery life, more memory, and more storage top their buying criteria. If Qualcomm and Microsoft, along with others, want to deliver a feature that may drive a new level of demand for laptops, they need to cater to these interests and position cellular connectivity as a nice-to-have feature for those who are willing to pay the connectivity tax their carriers will charge.

Tech.pinions Podcast: Apple Earnings and Industry Learnings

on February 3, 2018
Reading Time: 1 minute

On this week's podcast, Ben Bajarin and Carolina Milanesi chat about Apple's earnings and weave in key points from Apple's latest quarter that shed light on new industry trends and themes in a few key markets, like smartphones and wearables.

Apple’s Holiday Quarter iPhone Hat Trick

on February 2, 2018
Reading Time: 3 minutes

As is often the case, Apple was able to quiet rumors of the iPhone sky falling by announcing stellar results during its quarterly earnings call, results that put the company at the top of the market in terms of smartphone shipments, average selling price (ASP), and revenue. While total iPhone shipments were down slightly from the year-ago quarter, that doesn't tell the whole story. Based on IDC's preliminary numbers for the quarter, Apple shipped more smartphones than any other vendor at 77.3 million units (Samsung shipped 74.1 million). That's a 1.3% decline for Apple versus a broader market that slipped 6.3%. Importantly, Apple shipped these volumes while enjoying an ASP that increased by more than $100 from the year-ago quarter, driven by strong sales of its iPhone X, 8, and 8 Plus. That means the company drove smartphone revenue of $61.6 billion to lead the market. Not bad for a quarter when your marquee product started shipping in November.
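As a quick sanity check, the implied ASP falls straight out of the two figures above (a minimal sketch using only the numbers already cited):

```python
# Sanity check on the shipment and revenue figures above: the implied
# iPhone ASP for the holiday quarter.
iphone_revenue = 61.6e9   # reported quarterly iPhone revenue, USD
iphone_units = 77.3e6     # IDC preliminary shipment estimate

implied_asp = iphone_revenue / iphone_units
print(f"Implied iPhone ASP: ${implied_asp:.0f}")
# -> ~$797, consistent with the year-over-year increase of more
#    than $100 noted above.
```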

Apple’s ASP Increases
Apple’s ability to dramatically increase its ASP year over year in a slowing market is impressive. Over the last few years, Apple has repeatedly introduced new iPhones with higher selling prices. While this usually results in an ASP spike during the initial quarter, things tend to fall back in subsequent quarters. But there’s not always a clear pattern to this rise and fall. For example, in 2017 the third fiscal quarter (second calendar quarter) was Apple’s second-highest at $703. For full-year 2017, Apple’s ASP was $707, up from $647 for full-year 2016, which was itself down from the previous full-year ASP of $671.
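For a sense of magnitude, here is the year-over-year math on those full-year ASPs (a sketch using only the figures in the paragraph above):

```python
# Year-over-year change in Apple's full-year iPhone ASP, using the
# figures cited in the paragraph above.
full_year_asp = {2015: 671, 2016: 647, 2017: 707}

for year in (2016, 2017):
    change = full_year_asp[year] / full_year_asp[year - 1] - 1
    print(f"{year}: {change:+.1%}")
# -> 2016: -3.6%, 2017: +9.3%
```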

To put Apple’s recent ASP performance into perspective, let’s look at Samsung’s numbers. While Apple outshipped the company in the holiday quarter, Samsung still shipped more smartphones in total for 2017 (317.3 million units versus Apple’s 215.8 million). But Samsung’s ASP has headed in the opposite direction. Based on IDC’s data (Samsung doesn’t publicly announce units or ASPs), the company’s calendar third-quarter ASP was $327 (note: we don’t have the fourth-quarter ASP yet). For the first three quarters of 2017 combined, Samsung’s smartphone ASP was $313. That’s down from $319 in 2016 and $344 in 2015. (Samsung’s ASP tends to spike in calendar Q2, around the launch of its latest flagship Galaxy S phone.)

It’s worth noting that Samsung isn’t the only smartphone company with declining average selling prices, and Apple isn’t the only one with year-over-year increases. In fact, many of the top ten smartphone vendors managed to increase their ASPs through the first three quarters of 2017, but none so dramatically as Apple. And of course, none are operating at its scale.

Continued ASP Growth?
So the question becomes: can Apple maintain or grow its iPhone ASP in 2018, or has it reached the top of the mountain? There are a number of factors to consider, including some that are unique to this year’s market. One key question is whether everyone who wanted an iPhone X, and could afford it, already bought one in the fourth quarter. This seems unlikely. While Apple sorted out its supply constraints quickly after launch, there were undoubtedly some who looked at early wait times and opted to hold off until the dust settled.

Another new wrinkle this year was Apple’s launch of three new phones instead of two. While the iPhone X has the highest ASP, the iPhone 8 and 8 Plus also carry high prices and were a major driver of Apple’s quarterly increase. In past years Apple launched two new flagship phones, so we’re in uncharted waters with three, which means that even as the shipment mix shifts in subsequent quarters, ASPs may hold nearer to the top than in the past.

Another element is Apple’s ongoing iPhone battery public relations challenge. During the earnings call, one analyst asked Tim Cook whether Apple’s battery-replacement program might incentivize buyers to get a new battery for their existing phone and hold off on buying a new iPhone. This might impact total iPhone shipments to a slight degree, but as wise folks have noted, these customers probably weren’t on the verge of buying a new top-of-the-line iPhone anyway. (Cook said Apple was more concerned about taking care of customers than about the impact on future shipments.)

The bigger question for me is how Apple will price the new phones it launches later this year. Supply-side chatter suggests there will likely be at least one new X-class phone with a larger screen than today’s 5.8-inch product. Can Apple sell this phone at an even higher ASP than today’s iPhone X, or will it need to price the larger phone at today’s top end and lower the price of the iterative 5.8-inch product? Also, do the 8 and 8 Plus get refreshed, or do they stay the same and see a price drop? My gut tells me the company may have maxed out its ability to raise the top-end price, but it has surprised me before, so only time will tell.

The next several quarters should be instructive in this regard. If Apple’s ASPs drop significantly over the next six months, indicating a mix shift away from the top end, Apple will have a good sense of what the market will support. In the meantime, we have Samsung’s Galaxy S9 launch to watch later this month. How Samsung markets and prices this phone should be instructive, too.

News You Might Have Missed: Week of February 2, 2018

on February 2, 2018
Reading Time: 5 minutes

Earnings Thursday!

Apple

  • Revenue reached $88.3 billion, growing 13% over Q1 FY2017 and above the high end of Apple’s guidance.
  • This quarter was a week shorter than the year-ago quarter (13 weeks versus 14): average revenue per week was up 21% (see the sketch after this list).
  • Apple’s business is growing in all product categories and in all regions worldwide.
  • Apple’s installed base hit 1.3 billion devices in January, up 30% in just two years.
  • iPhone saw its highest revenue ever.
  • Apple returned $14.5 billion to investors during the quarter.
  • In the March quarter, Apple expects revenue to be between $60 billion and $62 billion.
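Here is the per-week math behind the second bullet, as a minimal sketch; the $78.4B over 14 weeks is Apple's reported revenue for the year-ago quarter (Q1 FY2017), which is not stated in the list above:

```python
# Reconstructing the "revenue per week up 21%" bullet. The $88.3B and
# 13-week figures are from this quarter; $78.4B over 14 weeks is
# Apple's reported Q1 FY2017 revenue (an added reference figure).
revenue = {"Q1 FY2018": (88.3e9, 13), "Q1 FY2017": (78.4e9, 14)}

weekly_now = revenue["Q1 FY2018"][0] / revenue["Q1 FY2018"][1]
weekly_then = revenue["Q1 FY2017"][0] / revenue["Q1 FY2017"][1]
print(f"Revenue per week growth: {weekly_now / weekly_then - 1:.0%}")
# -> 21%; the headline 13% total growth understates the underlying
#    rate because of the missing week.
```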

AMD CEO Refocused Company on Growth in Graphics, High-Performance Compute

on February 1, 2018
Reading Time: 4 minutes

CEO Lisa Su believes 2017 was “a year of inflection” for the company, one that bridges two different versions of AMD. Before that, AMD was a company in flux, with a seemingly unfocused direction that hurt its ability to compete with Intel and NVIDIA. By moving from an organization that wanted to address every facet of the market to one realigned around high-performance computing and graphics, Su has created a more efficient and more targeted company. From 2018 onward, Su has AMD on track to grow in well-established segments, including processors, graphics, and enterprise, but also in expanding markets like blockchain and cryptocurrency, machine learning, and cloud computing.

AMD’s just-announced Q4 financials paint a positive picture of growth for the once-struggling chip company. Revenue spiked 34% year over year to $1.5B for the quarter, and margins for 2017 were up three percentage points year over year, reaching 35%. Last year as a whole saw a 25% jump in revenue for AMD, roughly $1B of additional revenue, bringing the company to a full-year profit for the first time in recent memory.
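As a rough cross-check, the two full-year claims imply the following revenue totals (a sketch that treats both rounded figures as exact, so the outputs are approximate):

```python
# Cross-checking the two full-year claims above: a 25% revenue jump
# that equals roughly $1B of added revenue implies these totals.
growth_rate = 0.25       # "25% jump in revenue"
added_revenue = 1.0e9    # "roughly $1B of additional revenue"

implied_2016 = added_revenue / growth_rate
implied_2017 = implied_2016 + added_revenue
print(f"Implied 2016 revenue: ${implied_2016/1e9:.1f}B")  # -> ~$4.0B
print(f"Implied 2017 revenue: ${implied_2017/1e9:.1f}B")  # -> ~$5.0B
```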

The Now

By far the most head-turning result last quarter was AMD’s growth in the computing and graphics segment. Covering both its processor and graphics divisions, the segment saw a 60% jump in year-over-year revenue in Q4, attributed to the strength of its consumer Ryzen processors and Radeon graphics solutions. 2017 saw the release of the Ryzen family of parts for mainstream and enthusiast buyers, enterprise systems, workstations, and even notebook and 2-in-1 PCs.

This group saw a $140M quarter-over-quarter revenue increase, of which roughly one-third is attributed to sales of graphics chips for blockchain processing, or cryptocurrency mining as it is most often called. (Blockchain is the underlying technology behind cryptocurrencies.) That means around $46M of the fourth quarter’s revenue growth came from sales of graphics cards into the cryptocurrency market, with the remainder coming from graphics chip sales for gaming, professional cards, and Ryzen processors. In total, more than 50% of this segment’s revenue growth is coming from AMD’s graphics products.
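Here is the simple decomposition behind those numbers, sketched using only the figures cited above:

```python
# Decomposing the computing-and-graphics segment's quarter-over-quarter
# growth using the figures cited above.
segment_growth = 140e6   # total Q/Q revenue increase for the segment
crypto_share = 1 / 3     # portion attributed to blockchain/mining sales

crypto_growth = segment_growth * crypto_share
other_growth = segment_growth - crypto_growth
print(f"Crypto-driven growth: ~${crypto_growth/1e6:.0f}M")   # -> ~$47M
print(f"Gaming/pro/Ryzen growth: ~${other_growth/1e6:.0f}M") # -> ~$93M
# The ~$46M figure above is the same one-third, rounded slightly lower.
```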

Accurately measuring the impact of cryptocurrency-driven graphics chip sales is difficult, as they are sold through the same partners and channels as gaming hardware. There are pros and cons to being a significant player in the coin-mining market, from the surge in sales and revenue to the potential saturation of the graphics market should blockchain demand decline steeply.

The Ryzen processor family was up in both units and revenue for the quarter, but AMD didn’t break down what that looked like in any more detail. ASPs (average selling prices) were up on the year but flat for the quarter as a result of the introduction of the Ryzen 3 family targeting the budget OEM market. Selling more lower-priced hardware will generally bring down ASP and margin, but the addressable market is larger in this price segment than in any other consumer space.

AMD’s EESC group (enterprise, embedded, and semi-custom) was up only 3% year over year, a surprise considering the continued success of the game consoles AMD hardware powers (Microsoft’s Xbox One and Sony’s PlayStation 4) and the release of EPYC processors for server and cloud infrastructure. Though the media reception for the performance and capability of the EPYC processor family has been strong, ramping sales in the server market depends greatly on refresh cycles and availability from vendors like HP and Dell.

The Future

AMD CEO Lisa Su also stated in the earnings call that AMD would attempt to “ramp up” graphics chip production to meet demand for graphics cards for both gaming and blockchain/cryptocurrency. The problem for AMD (and NVIDIA as well) is that the production bottleneck doesn’t lie with the foundries that build the chips (groups like TSMC, Samsung, and GlobalFoundries, which manufacture chips for fabless semiconductor companies like AMD, NVIDIA, and Qualcomm) but apparently with the memory ecosystem. Memory prices have increased drastically over the last year as demand for all variants has grown without a corresponding increase in memory production capacity.

As a result, even if AMD could, or wanted to, significantly increase graphics chip production for its current family of parts, shortages in the memory space could largely hold back any inventory increase. There is a limit to how much any “ramp up” AMD undertakes can actually affect its growth in the graphics market.

In some ways, this might be a happy accident for AMD, as it means limited risk of inventory problems in the future. One of the biggest fears the cryptocurrency market has created is that a market crash would push significant inventory back into the resale market for consumers, stunting future product releases and sales of products currently in production.

While the cryptocurrency market will remain fluid, and speculation will continue to drive price swings for the foreseeable future, blockchain technology itself has proven to be a viable solution to many computing and security problems. Previous Bitcoin crashes hinged on a single virtual currency, but today we have seen expansion to dozens of currency options and additional use cases for the underlying technology. AMD believes demand for blockchain processing on graphics processors will remain strong through the first quarter of 2018 and will be a continued source of revenue and sales well into the future.

Looking at the processor market, AMD’s move into the mobile processor segment comes with the most competitive product it has fielded in over a decade. By combining a high-performance graphics chip and a traditional processor in a single package, AMD can offer a solution distinct from anything Intel currently sells.

We saw only moderate growth in the enterprise segment for AMD’s EPYC processor, but I expect that to increase through 2018 as more partners like Baidu and Microsoft integrate and upgrade systems in datacenters across the world. This space traditionally takes much longer than consumer segments to migrate to newer technology, and AMD still believes it has a strong position for growth.