The Shape of 2013: Predictions for the Year Ahead

After 15 years of making predictions, with a track record that would have made you rich if you’d bet on them, I’ve been away from the practice for a couple of years. But as the regulars at Tech.pinions have agreed to end the year with a set of predictions each, I’m back at the game. My best guesses for 2013:

A Modest Rebound for BlackBerry. Like many others, I was prepared to write off BlackBerry during the last year as its market share cratered. And if Windows Phone 8 had really taken off, or if Android had made a serious play for the enterprise, it would be very hard to see where there might be room in the market for Research In Motion, no matter how promising BlackBerry 10 looks. But I think there is room for at least three players in the business, and right now the competition for #3 is still wide open. BlackBerry still enjoys a lot of residual support in the enterprise IT community, and some key federal agencies that had been planning to move away from the platform, such as Homeland Security’s Immigration & Customs Enforcement, have indicated they are open to a second look. The challenge Research In Motion faces is that BlackBerry 10, which will be released on Jan. 30, needs to be appealing enough to users, not just IT managers, that it can at least slow the tide of bring-your-own devices into the enterprise.

A Windows Overhaul, Sooner Rather Than Later. Even before Windows 8 launched to distinctly mixed reviews, there were rumors that Microsoft was moving toward a more Apple-like scheme of more frequent, less sweeping OS revisions. Microsoft sometimes has a tendency to become doctrinaire in the defense of its products; for example, it took many months for officials to accept that User Account Control in Vista was an awful mess that drove users crazy. But Microsoft has had some lessons in humility lately, and the company knows that it is in a fight that will determine its relevance to personal computing over the next few years. I expect that, at a minimum, Windows 8.1 (whatever it is really called) will give users of conventional PCs the ability to boot directly into Desktop mode, less need to ever use the Metro interface, and the return of some version of the Start button. On the new UI side, for both Windows 8 and RT, look for a considerable expansion of Metrofied control panels and administrative tools, lessening the need to work in Desktop. In other words, Microsoft will move closer to what it should have done in the first place: offer different UIs for different kinds of uses. The real prize, truly touch-ready versions of Office, though, is probably at least a year and a half away.

Success for touch notebooks. When Windows 8 was first unveiled, I was extremely dubious about the prospects for touch-enabled conventional laptops. The ergonomics seemed all wrong. And certainly the few touchscreen laptops that ran Windows 7 weren’t very good. Maybe it’s my own experience using an iPad with a keyboard, but the keyboard-and-touch combination no longer seems anywhere near as weird as it once did. And OEMs such as Lenovo, Dell, HP, and Acer are coming up with some very nice touch laptops, both conventional and hybrid. Even with a premium of $150 to $200 over similarly equipped non-touch models, I expect the touch products to pick up some significant market share.

Significant wireless service improvements. We’ll all grow old waiting for the government’s efforts to free more spectrum for wireless data to bear fruit. The incentive auctions of underused TV spectrum are not going to be held until 2014, and it will be some time before that spectrum actually becomes available. The same is true for a new FCC plan to allow sharing of government-held spectrum in the 3.5 GHz band. But the good news is we don’t have to wait. Technology will allow significant expansion of both the capacity and coverage of existing spectrum. Probably the two most important technologies are Wi-Fi offload, which will allow carrier traffic to move over hotspots set up in high-traffic areas, and femtocells and small cells, which can greatly increase the reuse of the spectrum we already have. Unlicensed white space–unused free space between TV channels–should begin to make a contribution, especially in rural areas where TV channels are sparser. And the huge block of mostly idle spectrum that Sprint is acquiring with its proposed purchase of Clearwire will also ease the congestion, probably starting next year. (Stay tuned for a Tech.pinions series on spectrum issues in January.)

Intel Will Make a Major ARM Play. It’s hard to believe today, but Intel was once a major player in the ARM chip business. In 1997, it bought the StrongARM business from a foundering Digital Equipment. Renamed XScale, the Intel ARM chips enjoyed considerable success with numerous design wins as early smartphone applications processors. But XScale was always tiny compared to Intel’s x86 business, and in 2006, Intel sold its XScale operations to Marvell. A year later, Apple introduced the ARM-based iPhone. Today, ARM-based tablets are in the ascendancy, x86-based PCs are in decline, and Intel is struggling to convince the world that a new generation of very low power Atom systems-on-a-chip is competitive. Maybe the Clover Trail SOCs and their successors will gain a significant share of the mobile market, but Intel can’t afford to wait very long to find out. With its deep engineering and manufacturing skills, Intel could become a major ARM player quickly, either through acquisition or internal development.

Integration Gives iPhone an Unbeatable Advantage

There was only one real surprise when the new iPhone 5 got its by-now ritual teardown. The phone sports a 1440 milliamp-hour battery, just a hair bigger than the battery in the iPhone 4S. Yet despite going to a bigger display, boosting processor performance, and using faster but more power-hungry LTE wireless, the new phone seems to deliver about the same battery life as its predecessor. And instead of having to go to a bigger battery, Apple was able to use improvements in case and display design to reduce the iPhone’s thickness and weight markedly.

This is the result of obsessive engineering, not magic. Apple uses its control over every aspect of the iPhone’s design, from the silicon to the software, to fine-tune a device that squeezes maximum performance from minimal resources. This gives Apple an enormous advantage over all competitors save Research In Motion (whose severe problems are the result of its inability to read and respond to the changing market for BlackBerry, not its engineering.)

In his detailed examination of the iPhone 5, the redoubtable Anand Lal Shimpi found compelling evidence that the iPhone 5’s A6 system-on-chip uses custom, Apple-designed ARM processor cores. In previous A-series SOCs, Apple had customized Samsung ARM designs, mostly by pruning circuitry that the iPhone and iPad didn’t need: no USB ports or SD card slots, no need to have controllers for unused devices. (It’s the nature of chips that even unused circuits increase the power draw, not by much, but significantly in a design where every microwatt counts.) With the A6, Apple takes the customization a step further, achieving complete control over the heart of the system.

With a fully customized SOC, Apple could then fine-tune the software to wring out every microgram of performance while minimizing power consumption. Even the compiler used to generate  iOS code can be tweaked to optimize apps’ power consumption and performance. The tradeoffs between battery size and run time are still there–even Apple cannot escape the laws of physics–but the terms of trade are improved dramatically.

There’s no way Android can match this. Google has to write code that can support a wide variety of SOCs, including those from NVIDIA, Qualcomm, Texas Instruments, and Samsung. Android devices use several graphics systems and provide support for assorted peripherals. Code designed to run on heterogeneous systems will never be as efficient as Apple’s singleminded approach. Things are somewhat better in the Windows Phone 8 world, where the initial offerings all use a Qualcomm Snapdragon SOC. We’ll see how that affects battery life and performance when the phones ship.

HSA Foundation: for Show or for Real?

I recently spent a few days at AMD’s Fusion Developer Summit in Seattle, Washington. Among the many announcements was one introducing the HSA Foundation, an organization currently including AMD, ARM, Imagination, MediaTek, and Texas Instruments. The HSA Foundation was announced to “make it easy to program for parallel computing.” That sounds a bit like an oxymoron, as parallel programming has been the realm of “ninja programmers,” according to Adobe’s Chief Software Architect, Tom Malloy, at AMD’s event. Given today’s parallel programming challenge, lots of work needs to be done to make this happen, and in the case of the companies above, it comes in the form of a foundation. I spent over 20 years planning, developing, and marketing products, and when you first hear the word “foundation” or “consortium,” it conjures up visions of very long and bureaucratic meetings where little gets done and there is a lot of infighting. The fact is, some foundations are like that, but some are extremely effective, like the Linux Foundation. So which path will the HSA Foundation go down? Let’s drill in.

The Parallel/GPU Challenge

The first thing I must point out is that if CPUs and GPUs keep increasing compute performance at their current pace, the GPU will continue to maintain a raw compute performance advantage over the CPU, so it is very important that the theoretical performance is turned into a real advantage. The first thing we must do is distinguish between serial and parallel processing. Don’t take these as absolutes, as both CPUs and GPUs can run both serially and in parallel. Generally speaking, CPUs do a better job on serial, out-of-order code, and GPUs do a better job on parallel, in-order code. I know there are hundreds of dependencies, but work with me here. This is why GPUs do so much better on games and CPUs do so well on things like pattern matching. The reality is, few tasks use just the CPU and few use just the GPU; both are required to work together, and at the same level, to get the parallel processing gains. By working at the same level I mean getting the same access to memory, unlike today, where the CPU really dictates who gets what and when. A related problem today is that coding for the GPU is very difficult, given the state of the languages and tools. The other challenge is the number of programmers who can write GPU versus CPU code. According to IDC, over 10M CPU coders exist compared to 100K GPU coders. Adobe calls GPU coders “ninja” developers because it is just so difficult, even with tools like OpenCL and CUDA, given that they are such low-level languages. That’s OK for markets like HPC (high performance computing) and workstations, but not for making tablet, phone, and PC applications that could use development environments such as the Android SDK or even Apple’s XCode. Net-net, there are many challenges for a typical programmer to code a GPU-accelerated app for a phone, tablet, or PC.
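To make the serial-versus-parallel distinction concrete, here is a minimal sketch in Python. The thread pool stands in for GPU lanes, and the function names are invented for illustration; real GPU code would be written in something like OpenCL or CUDA:

```python
# Illustrative sketch only: the same data-parallel task run serially
# (one core, in order) and in parallel (fanned out across workers).
from concurrent.futures import ThreadPoolExecutor

def square(x):
    # An "embarrassingly parallel" task: each element is independent,
    # so the work maps naturally onto many GPU-style lanes.
    return x * x

def serial_squares(values):
    # Serial: one worker walks the list in order, like classic CPU code.
    return [square(v) for v in values]

def parallel_squares(values, workers=4):
    # Parallel: the same work spread across workers; map() preserves
    # result order even though execution is concurrent.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(square, values))
```

The results are identical either way; the difference is purely in how the work is scheduled, which is exactly why independent, in-order workloads are the GPU's sweet spot.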

End User Problem/Opportunity

Without the need to solve an end user or business problem, any foundation is dead in the water. Today NVIDIA is using CUDA (C, C++, C#), OpenCL, and OpenACC, and AMD supports OpenCL, to solve the most complex industrial workloads in existence. As an example, NVIDIA simulated at their GTC developer conference what the galaxy would look like 3.8B years in the future. Intel is using MIC, or Many Integrated Cores, to tackle these huge tasks. These technologies are for high-performance computing, not for phones, tablets, or PCs. The HSA Foundation is focused on solving the next generation of problems and uncovering opportunities in areas like the natural user interface with multi-modal voice, touch, and gesture inputs, biometric recognition for multi-modal security, augmented reality, and managing all of the visual content at work and at home. ARM also talked on-stage and in the Q&A about the power savings they believed they could attain from a shared-memory, parallel compute architecture, which surprised me. Considering ARM powers almost 100% of today’s smartphones and tablets around the world, I want to highlight what they said. Programming these kinds of apps at low power, and enabling hundreds of thousands of programmers to create them, ultimately requires very simple tools that don’t exist today.

The HSA Foundation Solution

The HSA Foundation goal, as stated above, is to “make it easy to program for parallel computing.” What does this mean? The HSA Foundation will agree on hardware and software standards. That’s unique in that most initiatives focus on just the hardware or just the software. The goal of the foundation is to literally bend the hardware to fit the software. On the hardware side this first means agreement on the architectural definition of the shared memory architecture between CPU and GPU. This is required for the CPU and GPU to be at the same level and not be restricted by today’s buses, like PCI Express. The second version of that memory specification has already been published. The software architecture spec and the programmer reference manual are still in the working group. Ultimately, simple development environments like the Google Android SDK, Apple’s XCode, and Microsoft’s Visual Studio would need to holistically support this to get the support of the more mainstream, non-ninja programmer. This will be a multi-year effort and will need to be measured on a quarterly basis to really see the progress the foundation is making.

Foundations are Tricky

The HSA Foundation will encounter the issues every other foundation encounters at some point in its life. First is the challenge of founding members changing their minds or becoming misaligned on goals. It happens a lot that someone who joins stops buying into the premise of the group or staunchly believes it isn’t valuable anymore. Typically that member stops contributing, but could even become a drag on the initiative and need to be voted off. The good news is that today, AMD, ARM, TI, MediaTek, and Imagination all share a need to accelerate parallel processing. The founding members need to make this work for their future businesses to be as successful as they would like. The second challenge is that the foundation is missing key players in GPUs. NVIDIA is the discrete GPU PC and GPU-compute market share leader, Intel is the PC integrated GPU market share leader, and Qualcomm is the smartphone GPU market share leader. How far can the HSA Foundation get without them? This will ultimately be up to players like Microsoft, Google, and Apple with their development environments. One wild card here is SOC companies with standard ARM licenses. To get agreement on a shared memory architecture, the CPU portion of an ARM SOC would need to be HSA-compliant too, which means that every product derived from a standard ARM license would be HSA-compliant. A company with an ARM architecture license, as Qualcomm has, wouldn’t need to be HSA-compliant. The third challenge is speed. Committees are guaranteed to be slower than a partnership between two companies, and obviously slower than one company. I will be looking for quarterly updates on specifications, standards, and tools.

For Show or for Real?

The HSA Foundation is definitely for real and formed to make a real difference.  The hardware is planned to be literally bent to fit the software, and that’s unique.  The founding members have a business and technical need, solving the problem means solving huge end user and business problems so there is demand, and the problem will be difficult to solve without many companies agreeing on an approach.  I believe over time, the foundation will need to get partial or full support from Intel, NVIDIA, and/or Qualcomm to make this initiative as successful as it will need to be to accelerate the benefits of parallel processing on the GPU.



Intel Could Use a Dose of Andy Grove

In a presentation to financial analysts on May 10, Intel CEO Paul Otellini said he was not particularly worried about the prospect of Microsoft issuing a version of Windows for ARM processors later this year. “We think [x86 is] a differentiator,” he said. “We have the advantage of the incumbency, the legacy support.”

Maybe he’s right. But it is disconcerting to hear this sort of complacency from the head of Intel, especially at a time when ARM-powered smartphones and tablets pose an unprecedented threat to Intel’s core laptop and desktop business.

I can only wonder what Andy Grove would say. Grove, who was Intel CEO from 1987 to 1998, famously wrote: “Business success contains the seeds of its own destruction. Success breeds complacency. Complacency breeds failure. Only the paranoid survive.” Grove also once ordered an advertising campaign attacking what was then the company’s most successful product, the 80286 processor, in an effort to get customers to move to the newer, much more capable, and ultimately wildly successful 80386.

Grove remains a senior adviser to Intel and has always avoided any public criticism of his successors. But I find it hard to believe he is happy watching the company he built acting so passively in the face of a threat.

Side note: Intel was actually a major player in the ARM business for some years. It bought Digital Equipment’s StrongARM business in 1997. The chips, renamed XScale, powered many handheld computers and early smartphones. Intel sold the division to Marvell in 2006.

Windows 8 on ARM: The Big Questions

Microsoft released a lengthy blog post yesterday specifically about Windows 8 on ARM. Although the post shed some light on a number of the looming questions we all have about Windows on ARM, there are still a few things I am concerned about.

Windows 8 on ARM has the potential to be wildly successful and disruptive, but it also has the potential to fail in the short term.

How Will Microsoft and Retail Position the X86 vs the ARM hardware Versions?
When I put myself in the consumer buying mindset for a new Windows-based PC, I see some potential confusion when it comes to product positioning. Microsoft has a challenge on its hands, and I am fascinated to see how the company figures it out.

What Microsoft, their hardware partners, and their retail partners cannot do is position ARM notebooks or other form factors as limited devices. So they can’t use terms like “full Windows experience” or “the Windows you know and love” for non-x86 devices. Taking this direction would lead consumers to ask of the ARM counterparts: “Don’t I get the full Windows experience I know and love on these products?” That would essentially doom Windows on ARM devices, because they would be positioned as truncated.

This is actually an area where I am intrigued to see whether the Intel Inside branding efforts of years past have any spillover. They could, if consumers are on the fence. Consumers may consider a product with Intel, or AMD for that matter, the “safe bet” if there is any confusion whatsoever.

Unfortunately, or fortunately depending on who you are, I don’t think any of the ARM companies benefit by touting their brand name on a Windows 8 on ARM device, for example “Runs Nvidia Tegra 3” or “Qualcomm Snapdragon.” In fact, that may add to the confusion rather than help clear it up.

It is of course possible that Microsoft and retail partners won’t try to position Windows on x86 and Windows 8 on ARM differently at all. However, unless the device experiences are truly identical, that would be a mistake.

Will All Drivers Be Supported?
To quote the blog post directly:

“Our device strategy uses standardized protocols and class drivers extensively”

“Of course Windows has many class drivers inside, which you experience when you plug-in a wide variety of USB devices, such as storage, mice, or keyboards.“

“The majority of printers selling today are supported using the class driver, which means you’ll be able to “plug and print” on WOA without additional drivers”

This must be true and must be delivered upon. I want to be optimistic about this and take Microsoft at their word that drivers won’t be an issue, as they appear to insinuate. However, I will feel better once I see Windows 8 on ARM working with a wide variety of peripherals.

Are Consumers Willing to Invest in New Software?
This may be perhaps the biggest point to wrestle with. As I have stated before, I believe Microsoft, with Windows 8 in general, has come as close to fundamentally starting over with Windows as they possibly could without completely starting over. Windows 8 is a step in the right direction to optimize Windows for the future of computing.

Consumers being willing to start fresh with software is the wild card for me. Unfortunately I have no hard data (yet) on this but I will offer some observational logic as to why this may be the case.

Firstly, consumers switching to the Mac platform at incredible rates is an indicator. Apple mentions on each quarterly call that 50% of Mac sales are to first-time Mac buyers. This means that many of those customers had made investments in Windows software and were willing to start over. Perhaps this same buying psychology could translate to Windows 8 on ARM, with the reality that legacy Windows software isn’t as important as many would think.

Secondly, reports came out in late December from Flurry that on Christmas day there was a 125% increase in app downloads mostly coming from the 353% increase in device activations on the same day. This leads us to believe that as consumers get a new device they go app shopping.

Lastly, the economics support this trend. The reality is that the new app economy has driven the cost of software down. This is not only true of mobile devices but of desktops and notebooks as well. The days of selling software and software bundles in the hundreds of dollars are over. If you look at the top-selling apps in the Mac OS X App Store, there isn’t a single one over $29.99, and most are well south of that figure. With lower overall app pricing becoming the norm, it becomes feasible for consumers to actually start over with software.

Could it be Netbooks all Over Again?
In all of these scenarios I am generally concerned that Windows 8 on ARM devices may be headed down the path of Netbooks in their early days if we are not careful. Netbook return rates were north of 30% in their early years mainly because consumers bought them expecting a “full PC” experience and early Netbooks didn’t deliver. This was primarily because early devices were Linux-based. However, even once the devices ran Windows, they were still positioned as “not full PCs” mainly because they were underpowered. It was a positioning mess in my opinion.

I am not as concerned about these devices being underpowered as I am about them fully delivering the full PC experience. This will have to include a robust list of software, which Microsoft and partners are working on. There are a number of form factors outside the clamshell PC design that I think will be more successful for Windows 8 on ARM vendors, with hybrids having the most interesting potential.

Even with all the questions still looming, ultimately the positioning of these products is what will make or break Windows on ARM devices.

Windows on ARM to Include Desktop Office. But What About Outlook?

While Microsoft has said a lot about Windows 8, it has revealed very little about its almost equally important software partner, Office 15. In a post on the Building Windows 8 blog today, Windows boss Steve Sinofsky disclosed a vital bit of information about Windows on ARM (WOA), the version that will run on ARM, rather than Intel x86, processors and is especially important for tablets:

“WOA includes desktop versions of the new Microsoft Word, Excel, PowerPoint, and OneNote. These new Office applications, codenamed “Office 15”, have been significantly architected for both touch and minimized power/resource consumption, while also being fully-featured for consumers and providing complete document compatibility. WOA supports the Windows desktop experience including File Explorer, Internet Explorer 10 for the desktop, and most other intrinsic Windows desktop features—which have been significantly architected for both touch and minimized power/resource consumption.”

I don’t know how much to read into this but there is one critical application missing from the list: Outlook. Sinofsky says the Windows 8 Metro mail app will support Exchange Active Sync (EAS) for mail, contacts, and calendaring. But supporting EAS does not necessarily mean the full Exchange policy support that enterprises want to see. Android phones, for example, can connect to Exchange servers for mail, but do not natively provide full Exchange support (some OEMs have tweaked Android to do this, and there are third-party solutions.)

I think enterprise adoption is going to be key to the success of Windows 8 tablets, so this is a big deal. On the other hand, porting Outlook as it currently exists to ARM is a non-starter. Outlook is a notorious resource hog, and ARM programs are going to have to be resource sippers because of the relatively limited processing and memory power available on tablets. And Outlook’s massive databases would swamp the storage available on a tablet.

A Microsoft spokesperson declined to elaborate on Sinofsky’s blog, so I guess we’ll have to wait a while longer to find out.



The ARM Wrestle Match

I have an unhealthy fascination with semiconductors. I am not an engineer, nor do I know much about quantum physics, but I still love semiconductors. Perhaps it is because I started my career drawing chip diagrams at Cypress Semiconductor.

I genuinely enjoy digging into architecture differences and exploring how different semiconductor companies look to innovate and tackle our computing problems of the future.

This is probably why I am so deeply interested in the coming processor architecture war between x86 and ARM. For the time being, however, there is a battle among several ARM vendors that I find interesting.

Qualcomm and Nvidia, at this point in time, have two of the leading solutions inside most cutting-edge non-Apple smartphones and tablets.

Both companies are keeping a healthy pace of innovation looking to bring next generation computing processors to the mass market.

What is interesting to me is how both these companies are looking to bring maximum performance to their designs without sacrificing low-power efficiency with two completely different approaches.

One problem in particular I want to explore is how each chipset tackles tasks that require computationally complex functions (like playing a game or transcoding a video) versus ones that require less complex functions (like using Twitter or Facebook). Performing computationally complex functions generally requires a great deal of processing power and results in draining battery life quickly.

Not all computing tasks are computationally complex, however. Therefore the chipset that will win is one that has a great deal of performance but can also deploy that performance with very low power draw. Both Nvidia and Qualcomm license the ARM architecture, which for the time being is the high-performance, low-power leader.

Nvidia’s Tegra 3
With Tegra 3, Nvidia is going to be the first to market with a quad-core chipset. Tegra 3 actually has five cores, but the primary four cores will be used for computationally complex functions while the fifth core will be used to handle tasks that do not require a tremendous amount of processing power.

Nvidia calls this solution Variable SMP (symmetric multiprocessing). What makes it interesting is that it provides a strategic, task-based approach to utilizing all four cores. For example, when playing a multimedia-rich game or other multimedia apps, all four cores can be utilized as needed. Yet when doing a task like loading a media-rich web page, two cores may be sufficient rather than all four. Tegra 3 can manage the cores’ usage, based on the task and the amount of compute power needed, to deliver the appropriate amount of performance for the task at hand.

Tegra 3’s four cores are clocked at 1.4GHz in single-core mode and 1.3GHz when more than one core is active. The fifth core runs at 0.5GHz and is used for things like background tasks, active standby, and playing video or music, all things that do not require much performance. Because it runs at only 0.5GHz, this fifth core requires very little power to function and will cover many of the “normal” usage tasks of many consumers.

The strategic managing of cores is what makes Tegra 3 interesting. This is important because the cores that run at 1.4GHz can all turn off completely when not needed. Tegra 3 will deliver performance when you need it but reserve the four main cores for computationally complex tasks, which will in essence save battery life. Nvidia’s approach is clever and basically gives you a low-power single-core computer and a quad-core performance computer at the same time.
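As a rough sketch of the idea (the load thresholds and policy below are invented for illustration, not Nvidia's actual governor logic), core selection under a Variable SMP-style scheme might look something like this:

```python
# Illustrative sketch only: choose which Tegra 3-style cores power on
# for a given demanded load. Threshold values are hypothetical.

COMPANION_FREQ_GHZ = 0.5   # low-power fifth ("companion") core
MULTI_FREQ_GHZ = 1.3       # per-core clock when several main cores run
SINGLE_FREQ_GHZ = 1.4      # a lone main core may clock slightly higher

def active_cores(load):
    """Map a demanded load in [0, 1] to (main_cores_on, companion_on)."""
    if load < 0.15:
        # Background tasks, standby, music playback: companion core only,
        # so the performance cores can be powered off entirely.
        return (0, True)
    # Main cores power on one at a time, only as the workload demands.
    main = min(4, max(1, round(load * 4)))
    return (main, False)
```

The point is not the exact numbers but the shape of the policy: unused cores are gated off completely, so light workloads never pay the power cost of the performance cores.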

Qualcomm’s S4 Chipset
Qualcomm, with its Snapdragon chipset, takes a different approach to the high-performance yet low-power goal. There are two parts of Qualcomm’s S4 Snapdragon chipset that interest me.

The first is that the S4 chipset from Qualcomm will be the first out the door in the latest ARM performance class, the Cortex-A15. There are many advantages to this new architecture, namely that it is built on the new 28nm process technology, which provides inherent advantages in frequency scaling, power consumption, and chipset size reduction.

The second is that Qualcomm uses a proprietary technique in its chipsets called asynchronous symmetric multiprocessing, or aSMP. The advantage of aSMP is that each core can support a range of frequencies rather than being static at just one. In the case of the S4, each core has a range of 1.5GHz to 2.5GHz and can scale up and down the frequency ladder based on the task at hand.

Qualcomm’s approach to frequency scaling, built into each core, allows each core to operate at a different frequency, giving a wide range of performance and power efficiency. For tasks that do not require much performance, like opening a document or playing a simple video, the core will run at the minimum performance level, and thus be power efficient. When running a task like playing a game, the core can run at a higher frequency, delivering maximum performance.

This approach of intelligently managing each core and scaling core frequency depending on tasks and independent of other processes is an innovative approach to simultaneously delivering performance while consuming less power.
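A minimal sketch of this per-core scaling idea (the policy and operating points here are invented for illustration, not Qualcomm's implementation) could look like this:

```python
# Illustrative sketch only: map each core's utilization to its own clock,
# independently of every other core, within an aSMP-style range.

MIN_GHZ, MAX_GHZ = 1.5, 2.5
STEP_GHZ = 0.1  # hypothetical discrete operating points

def core_frequency(utilization):
    """Map one core's utilization in [0, 1] to a clock within its range."""
    freq = MIN_GHZ + utilization * (MAX_GHZ - MIN_GHZ)
    # Snap to the nearest discrete operating point.
    return round(round(freq / STEP_GHZ) * STEP_GHZ, 1)

# Because each core scales on its own, a light task can idle one core
# near 1.5GHz while a game thread runs another core at 2.5GHz.
```

The contrast with the Variable SMP approach is that here the power savings come from each core's clock range rather than from turning extra cores off.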

I chose to highlight Nvidia and Qualcomm in this analysis not to suggest that other silicon vendors are not doing interesting things as well. Quite the contrary, actually, as TI, Apple, Marvell, Broadcom, Samsung, and others certainly are innovating too. I chose Qualcomm and Nvidia simply because I am hearing that they are getting the majority of vendor design wins.

The Role of Software in Battery Management
Although the processor plays a key role in managing the overall power and performance of a piece of hardware, the software also plays a critical role.

Software, like the processor, needs to be tuned and optimized for maximum efficiency. If software is not well optimized, it can cause significant power drains and result in less than stellar battery life.

This is the opportunity, and the challenge, staring everyone who makes mobile devices in the face. Choosing the right silicon and effectively optimizing the software, both the OS and the apps, will be central going forward.

I am hoping that, on the software side, both Google and Microsoft are diligently working on making their next-generation operating systems intelligent enough to take advantage of the ARM multi-core innovations from companies like Qualcomm and NVIDIA.

These new ARM chipset designs, combined with software that can intelligently take advantage of them, are a key element in solving our battery-life problem. For too long we consumers have had an unhealthy addiction to power cords. I hope this changes in the years to come.

Windows 8 Desktop on ARM Decision Driven by Phones and Consoles

There has been a lot written about the possibility of Microsoft not supporting the Windows 8 Desktop environment on the ARM architecture. If true, this could impact Microsoft, ARM, and ARM’s licensees; Texas Instruments, NVIDIA, and Qualcomm are in the best position to challenge the high end of the ARM stack and are publicly supported by Microsoft. One question that hasn’t been explored is why Microsoft would even consider something like this. It’s actually quite simple and makes a lot of sense given the position they’re in; it’s all about risk-return and the future of phones and living room consoles.

The Threat to Microsoft

The real short- and mid-term threat isn’t Macs stealing significant Windows share from Microsoft; it’s the Apple iPad and iOS. It could also be a little about Android, but so far Android has only seen tablet success on platforms that pose little risk to the PC, like the Amazon Kindle Fire. Market-wise, the short-term threat is about consumers, too, not business. Businesses work in terms of years, not months. The reality is that while the phone could disrupt the business PC long term, short term it won’t impact where Microsoft makes its profits today. Businesses, short term, won’t buy three devices for their employees, so tablets will most likely get squeezed there. Business employees first need a PC, then a smartphone, and maybe a few need a tablet. There could be exceptions, of course, primarily in verticals like healthcare, retail, and transportation.

What About Convertibles?

One wild card is business convertibles. Windows 8 has the best chance here given Microsoft’s hold on business, and assuming Intel or AMD can deliver custom SoCs with low enough power envelopes, thermal solutions, and proper packaging for thin designs. The thinking here is that if businesses want a convertible, they’ll also want Windows 8 Desktop and, more than likely, backward compatibility, something only x86 can provide. So net-net, Microsoft is covered here if Intel and AMD can deliver.

Focus is Consumer and Metro Apps

So the focus for Microsoft is clearly consumer tablets, and Microsoft needs a ton of developers writing high-quality Metro apps to compete in the space. Metro is clearly the primary Windows 8 tablet interface and Desktop is secondary, as it’s an app. Developers don’t have money or time to burn, so most likely they will have to choose between writing a Metro app or rewriting or recompiling their desktop app to work on both ARM and x86 (Intel and AMD). It’s not just about development; it’s just as expensive for devs to test and validate, too. In many cases it’s more expensive to test and validate an app than it is to actually develop it. Strategically, then, it could make sense for Microsoft to push development of Metro apps and, by possibly eliminating the Desktop-on-ARM option, make the developer’s decision easier.

Strategically, It’s About Phones and the Living Room in the End

Windows 8, Windows Phone 7, and XBOX development environments are currently related but not identical. I would expect that down the road we will see an environment where, for most apps that don’t need to closely touch the hardware, you write once and deploy onto a Microsoft phone, tablet, PC, and XBOX. The unifier here is Metro, so getting developers onto Metro is vitally important.

If Microsoft needed to improve the chances that developers swarm to Metro, then taking a risk to limit the variables, say by eliminating ARM desktop support, makes perfect sense.

Windows and ARM: A Fork in the Road

ZDNet reports that Microsoft has tentatively decided that Windows 8 running on ARM processors will only support new Metro-style applications, not programs written for older versions of Windows and Intel processors.

In one sense, this is not surprising. Existing applications would have to be recompiled to run at all on ARM systems and would probably need substantial tweaking to run well. The ARM systems would probably be mostly tablets, and the existing  Windows desktop interface does not work at all well on touch systems. On the whole, users of ARM-based Windows systems will be better off without these old applications.

The problem is that the result of this decision, if Microsoft goes ahead with it, is two operating systems, both called Windows 8, with radically different capabilities.  This is a situation that cannot help but create confusion for users, especially if there are both ARM and x86 tablets with very different software abilities.

I have long thought that Microsoft would have been much better off following Apple’s iPad approach and using an enhanced version of a phone operating system for tablets rather than a cut-down version of a desktop OS. What looks like a fundamental fork in Windows suggests that Microsoft made the wrong choice.

Quad Core Smartphones: What it Will Take to Become Relevant

There has been a lot of industry discussion of multi-core smartphones in the past year, and the dialog has increased with NVIDIA’s launch of Tegra 3, a quad-core SoC targeted at phones and tablets. The big question lingering over all of these implementations, particularly with phones, is: what will end users do with all those general-purpose compute units, and will they provide significant incremental benefit? In the end, it’s all about an improved experience that’s relevant, unique, demonstrable, and easily marketable.

Multi-Core Background

Before we talk usage models, we first have to get grounded in some technology basics. Whether it’s a multi-core server, PC, tablet, or phone, several things must exist to fully take advantage of more than one general-purpose computing core in any platform:

  • an operating system that efficiently supports multiple cores, multitasking across cores, and multi-threaded apps
  • applications that efficiently take advantage of multiple cores
  • intelligent energy efficiency tradeoffs
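The "multi-threaded apps" requirement above can be sketched minimally: the point is to split one job into independent chunks that the scheduler can spread across cores. This is my own illustration; Python threads show the structure, though a native app would see true parallelism across cores.

```python
# Minimal sketch of a multi-threaded workload: divide one job into
# independent chunks so the OS scheduler can spread them across cores.
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    # Each worker handles one independent slice of the data.
    return sum(chunk)

def parallel_sum(values, workers=4):
    size = max(1, len(values) // workers)
    chunks = [values[i:i + size] for i in range(0, len(values), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # The chunks have no dependencies on each other, which is
        # exactly what lets multiple cores be leveraged at all.
        return sum(pool.map(sum_chunk, chunks))

print(parallel_sum(list(range(1000))))  # same answer as sum(range(1000))
```

If the app can't be decomposed this way, extra cores sit idle no matter how good the OS scheduler is, which is why the app-side bullet matters as much as the OS-side one.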

Once those elements are in place, you have an environment where multiple cores can be leveraged. The next step is to optimize the platform for energy efficiency. All of the hardware and software platform elements, even down to the transistors, must be optimized for low power when you need it and high performance when you need it. The Tegra 3 uses a fifth core, which NVIDIA says engages when an extremely low power state is required.
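The fifth-core idea can be sketched as a simple switch. The threshold value and the decision logic here are my assumptions for illustration, not NVIDIA's actual algorithm; only the existence of a low-power companion core comes from the text.

```python
# Illustrative sketch (assumed logic, not NVIDIA's implementation):
# route very light workloads to a low-power companion core and
# everything else to the main performance cores.

COMPANION_THRESHOLD = 0.10  # assumption: below 10% total load, companion suffices

def choose_cores(total_load: float) -> str:
    if total_load < COMPANION_THRESHOLD:
        return "companion"   # single low-power core handles background work
    return "main"            # performance cores spin up for real work

print(choose_cores(0.03), choose_cores(0.75))
```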

Assuming all the criteria above are met, then it comes down to what an end user can actually do with a phone with four cores.

Modularity Could Be the Key

Quad-core phones could potentially add value in “modular” usage environments. While there have been a lot of attempts at driving widespread modularity, most haven’t been a big hit. I personally participated in the Device Bay Consortium, along with Intel and Microsoft, when I was at Compaq. It didn’t end up materializing into anything, but from an end-user perspective the concept at the time was solid.

Today and beyond, smartphone modularity is quite different from Device Bay’s “modules”. The smartphone concept is simple: use a high-powered smartphone that can then extend into different physical environments, spanning entertainment to productivity. Here are just a few examples of modularity in use today:

These are all forms of today’s modularity with different levels of interest, penetration, and adoption.

So what could quad core potentially add to the mix? Here are some potential improved usages:

  • Modular video and photo editing. These apps have historically always been multithreaded and could leverage a clamshell “dock” similar to the Lapdock or Multimedia Dock.
  • Modular multi-tab web browsing. Active browser tabs require a lot of performance and overhead; just open several tabs in Chrome on a PC and check your performance monitor. iOS 5 actually halts a tab when you move to another one, forcing the user to reload it.
  • Modular games that heavily utilize a general purpose processor. Caveat here is that most of the games leverage the GPU a lot more than a general purpose CPU. It all depends on how the game is written, extent of AI use, UI complexity, where physics are done, and how the resources are programmed.
  • Modular natural user interface. While plugged in and “docked” at the desk or living room, the smartphone could power interfaces like improved voice control and “air” gestures. This may sound like science fiction, but the XBOX 360 is doing it today with Kinect.
  • Multitasking. Given enough memory and memory bandwidth, more cores typically mean better multitasking.
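The first bullet in the list above is the easiest to make concrete: per-pixel photo edits are independent of one another, which is why editors have long been multithreaded. A minimal sketch, with a hypothetical brightness filter of my own devising:

```python
# Hedged sketch of multithreaded photo editing: each image row can be
# processed independently, so rows map naturally onto worker threads
# (and, in a native editor, onto separate cores).
from concurrent.futures import ThreadPoolExecutor

def brighten_row(row, amount=20):
    # Raise each pixel value, clamping at the 8-bit maximum of 255.
    return [min(255, px + amount) for px in row]

def brighten_image(image, workers=4):
    # Rows have no dependencies, so they can be edited in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(brighten_row, image))

img = [[0, 100, 250], [10, 200, 255]]
print(brighten_image(img))
```

Workloads with this shape are the ones where a fourth core actually earns its keep; games and UI code, as the caveats above note, are much less predictable.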

Will It Be Relevant?

Many things need to materialize before anyone can deem the quad-core smartphone a good idea rather than just a marketing play for advanced users. First, smartphones actually need to ship with quad cores and a modular-capable OS; the HTC Edge is rumored to be the first. Then the apps and usage models outlined above need to be tested by users and with benchmarks. Users will have to first “get” the modularity concept and notice an experiential difference. Moving from standard phone to modular experience must be seamless, something Android 4.0 has the potential to deliver. Finally, some segments of users, like enthusiasts, will need to see the benchmarks to be swayed to pay more than for a dual-core phone.

There is a lot of proving to do on quad-core smartphones before relevance can be established with any market segment beyond enthusiasts. Enthusiasts will always want the biggest and baddest spec phone on the block, but marketing to other segments, even if the phones provide an improved experience, will be a challenge.