I Need a PC and I Know It

One of the fundamental characteristics of a mature market is mature consumers. These consumers are mature in the sense that they know what they want and, more importantly, they know why they want it. This kind of maturity can only come with a defined sense of needs, wants, and desires.

That defined sense can only come with experience with a product. Owning multiple generations of a product or category is required to fully understand not just what you want but why you want it. Many consumers know by now whether they value a traditional PC like a desktop or notebook, and they know why. These consumers know they need a PC and have a sense of what they want. Interestingly, with smartphones and tablets, I don’t believe we have fully mature customers. ((I’ll dive into this in a future column, but some of the experimentation we are seeing in platform switching demonstrates this nuance of the consumer market.))

The Screens That Rule Our Lives

When the iPad joined our world, we knew it was more than a screen to entertain us. We knew it was a profound new kind of computer. At the same time, it is key to recognize that the tablet will not replace the PC. For many, the tablet can and will become a primary computing device, but I doubt a more powerful computer will cease to exist in most consumers’ homes in some way or another. As important as the tablet is, there are many hundreds of millions of consumers who depend on the traditional PC to make a living. What is interesting about this class of customer is that they need a PC and they know it.

We are fond of saying we are in the post-PC era. The term simply means that the PC is no longer the only computer on which we can perform computing tasks. But the metrics by which a PC is valued have changed. One can make a strong argument that there are many consumers who don’t value the PC and would rather have a tablet, and that may be true. But for those who need a PC, and know it, value has shifted from processing power to battery life.

Battery Life is the New MHz Race

The raging question throughout the PC industry has been “what is going to get consumers to upgrade their PCs?” The answer is iPad-like battery life.

At last week’s WWDC, Apple released new MacBook Airs running Intel’s 4th generation Core processors. There was a time when a company releasing a new PC proudly announced how much processing power it had, and the crowd would applaud. At WWDC last week, when Apple discussed the MacBook Air, the crowd did not cheer when the speed of the processor was announced. Instead, the crowd went wild at the new metrics for battery life. The new 11″ MacBook Air now has 9 hours of battery life, and the new 13″ MacBook Air now has 12 hours. Even now, after some benchmarking and reviews, we are learning those battery life claims may be conservative. No computer on the market comes close to these battery life claims, and I will be interested to see whether a battery life competitor to the MacBook Air comes to market this year.

Casually read some of the reviews of the new MacBook Airs and you will see reviewers raving about the experience of having more than all-day battery life in a notebook.

Without question there is a huge opportunity waiting for the PC industry with regard to notebook upgrades. Many consumers and corporate workers are using PCs that are outdated in nearly every major category. Yet it is not the high-definition screens, the touch screens (or lack thereof on Macs), the ultra-thin designs, or the overall look that will give their new owners a profound computing experience; it is the battery life.

Apple has set the bar high with these new battery benchmarks. All PC makers are making progress in this area and the new processors from Intel and AMD will help push this needle forward. ((If Windows RT can gain traction, ARM processors can be a solution for even longer battery life)) One thing I will be watching very closely with the fall lineup is the battery life claims from all the new notebooks. I am convinced this is the feature-of-all-features for the PC industry this year.

On the Impact of Paul Otellini’s CEO Years at Intel

Intel CEO Paul Otellini is retiring in May 2013. With his 40-year career at Intel now ending, it is a timely opportunity to look at his impact on the company.

Intel As Otellini Took Over

In September 2004 when it was announced that Paul Otellini would take over as CEO, Intel was #46 on the Fortune 100 list, and had ramped production to 1 million Pentium 4’s a week (today over a million processors a day). The year ended with revenues of $34.2 billion. Otellini, who joined Intel with a new MBA in 1974, had 30 years of experience at Intel.

The immediate challenges the company faced fell into four areas: technology, growth, competition, and finance:

Technology: Intel’s processor architecture had pushed more transistors clocked ever faster, generating more heat. The solution was to use the benefits of Moore’s Law to put more cores on each chip and run them at controllable, and eventually much reduced, voltages.

Growth: The PC market was 80% desktops and 20% notebooks in 2004 with the North America and Europe markets already mature. Intel had chip-making plants (aka fabs) coming online that were scaled to a continuing 20%-plus volume growth rate. Intel needed new markets.

Competition: AMD was ascendant, and a growing menace.  As Otellini was taking over, a market research firm reported AMD had over 52% market share at U.S. retail, and Intel had fallen to #2. Clearly, Intel needed to win with better products.

Finance: Revenue in 2004 recovered to beat 2000, the Internet bubble peak. Margins were in the low 50% range — good but inadequate to fund both robust growth and high returns to shareholders.

Where Intel Evolved Under Paul Otellini

Addressing these challenges, Otellini changed the Intel culture, set higher expectations, and moved in many new directions to take the company and the industry forward. Let’s look at the major changes at Intel over the past eight years in the same four areas: technology, growth, competition, and finance:

Technology

Design for Manufacturing: Intel’s process technology in 2004 was at 90nm. To reliably achieve a new process node and architecture every two years, Intel introduced the Tick-Tock model, where odd years deliver a new architecture and even years deliver a new, smaller process node. The engineering and manufacturing fab teams work together to design microprocessors that can be manufactured in high volume with few defects. Other key accomplishments include High-K Metal Gate transistors at 45nm, 32nm products, 3D tri-gate transistors at 22nm, and a 50% reduction in wafer production time.

Multi-core technology: The multi-core Intel PC was born in 2006 in the Core 2 Duo. Now, Intel uses Intel Architecture (IA) as a technology lever for computing across small and tiny (Atom), average (Core and Xeon), and massive (Phi) workloads. There is a deliberate continuum across computing needs, all supported by a common IA and an industry of IA-compatible software tools and applications.

Performance per Watt: Otellini led Intel’s transformational technology initiative to deliver 10X more power-efficient processors. Lower processor power requirements allow innovative form factors in tablets and notebooks and are a home run in the data center. The power-efficiency initiative comes to maturity with the launch of the fourth generation of Core processors, codenamed Haswell, later this quarter. Power efficiency is critical to growth in mobile, discussed below.
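The engineering logic behind the multi-core, lower-voltage direction described above follows from the first-order CMOS dynamic power relation, P ≈ C·V²·f. The rough back-of-the-envelope sketch below uses illustrative numbers of my own choosing, not Intel specifications, to show why more, slower, lower-voltage cores can beat one fast, high-voltage core on performance per watt.

```python
# Back-of-the-envelope illustration of why "more cores at lower voltage" wins on
# performance per watt. Uses the first-order CMOS dynamic power relation P ~ C * V^2 * f.
# All numbers below are illustrative assumptions, not Intel specifications.

def dynamic_power(cap_farads, volts, freq_hz):
    """First-order CMOS dynamic (switching) power."""
    return cap_farads * volts**2 * freq_hz

C = 1e-9  # effective switched capacitance (arbitrary illustrative value)

# One fast core: 3.8 GHz at 1.4 V
single = dynamic_power(C, 1.4, 3.8e9)

# Two slower cores: each 2.4 GHz at 1.1 V -- more aggregate cycles, lower voltage
dual = 2 * dynamic_power(C, 1.1, 2.4e9)

print(f"single fast core : {single:.2f} W, 3.8 GHz of aggregate clock")
print(f"two slower cores : {dual:.2f} W, 4.8 GHz of aggregate clock")
# The dual-core configuration delivers ~26% more aggregate cycles for ~22% less power
# (assuming the workload parallelizes), which is the perf/watt argument in a nutshell.
```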

Growth

When Otellini took over, the company focused on the chips it made, leaving the rest of the PC business to its ecosystem partners. Recent unit growth in these mature markets comes from a greater focus on a broader range of customers’ computing needs, and from bringing leading technology to market rapidly and consistently. In so doing, the company gained market share in all the PC and data center product categories.

The company shifted marketing emphasis from the mature North America and Europe to emerging geographies, notably the BRIC countries — Brazil, Russia, India, and China. That formula accounted for a significant fraction of revenue growth over the past five years.

Intel’s future growth requires developing new opportunities for microprocessors:

Mobile: The early Atom processors introduced in late 2008 were designed for low-cost netbooks and nettops, not phones and tablets. Mobile was a market where the company had to reorganize, dig in, and catch up. The energy-efficiency work that benefits Haswell, the communications silicon from the 2010 Infineon acquisition, and the forthcoming 14nm process in 2014 will finally allow the company to stand toe-to-toe with competitors Qualcomm, nVidia, and Samsung using the Atom brand. Mobile is a huge growth opportunity.

Software: The company acquired Wind River Systems, a specialist in real-time software, in 2009, and McAfee in 2010. These added to Intel’s own developer tools business. The software services business accelerates customers’ time to market with new, Intel-based products. The company stepped up efforts in consumer device software, optimizing operating systems from Google (Android), Microsoft (Windows), and Samsung (Tizen). Why? Consumer devices sell best when an integrated hardware/software/ecosystem like Apple’s iPhone exists.

Intelligent Systems: Specialized Atom systems on a chip (SoCs) with Wind River software and Infineon mobile communications radios are increasingly being designed into medical devices, factory machines, automobiles, and new product categories such as digital signage. While the global “embedded systems” market lacks the pizzazz of mobile, it is well north of $20 billion in size.

Competition

AMD today is a considerably reduced competitive threat, and Intel has regained #1 market share in PCs, notebooks, and the data center.

Growth into the mobile markets is opening a new set of competitors which all use the ARM chip architecture. Intel’s first hero products for mobile arrive later this year, and the battle will be on.

Financial

Intel has delivered solid, improved financial results to stakeholders under Otellini. With ever more efficient fabs, the company has improved gross margins. Free cash flow supports a dividend yielding above 4%, a $5B stock buyback program, and a multi-year capital expense program targeted at building industry-leading fabs.

The changes in financial results are summarized in the table below, showing the year before Otellini took over as CEO through the end of 2012.

GAAP                2004      2012      Change
Revenue             $34.2B    $53.3B    +55.8%
Operating Income    $10.1B    $14.6B    +44.6%
Net Income          $7.5B     $11.0B    +46.7%
EPS                 $1.16     $2.13     +83.6%


The Paul Otellini Legacy

There will be books written about Paul Otellini and his eight years at the helm of Intel. A leader should be measured by the institution he or she leaves behind. I conclude those books will describe Intel in 2013 as excelling in managed innovation, systematic growth, and shrewd risk-taking:

Managed Innovation: Intel and other tech companies are always innovative. But Intel manages innovation among the best, on a repeatable schedule and with very high quality. That’s uncommon and exceedingly difficult to do with consistency. For example, the Tick-Tock model is a business school case study: churning out ground-breaking transistor technology, processors, and high-quality leading-edge manufacturing at a predictable, steady pace from engineering to volume manufacturing. This repeatable process is Intel’s crown jewel, and a national asset.

Systematic Growth: Under Otellini, Intel made multi-billion dollar investments in each of the mobile, software, and intelligent systems markets. Most of the payback growth will come in the future, and will be worth tens of billions in ROI.

The company looks at the Total Addressable Market (TAM) for digital processors, decides what segments are most profitable now and in the near future, and develops capacity and go-to-market plans to capture top-three market share. TAM models are very common in the tech industry. But Intel is the only company constantly looking at the entire global TAM for processors and related silicon. With an IA computing continuum of products in place, plans to achieve more growth in all segments are realistic.

Shrewd Risk-Taking: The company is investing $35 billion in capital expenses for new chip-making plants and equipment, creating manufacturing flexibility, foundry opportunities, and demonstrating a commitment to keep at the forefront of chip-making technology. By winning the battle for cheaper and faster transistors, Intel ensures itself a large share of a growing pie while keeping competitors playing catch-up.

History and not analysts will grade the legacy of Paul Otellini as CEO at Intel. I am comfortable in predicting he will be well regarded.

HSA Foundation: for Show or for Real?

I recently spent a few days at AMD’s Fusion Developer Summit in Seattle, Washington. Among the many announcements was one introducing the HSA Foundation, an organization currently including AMD, ARM, Imagination, MediaTek, and Texas Instruments. The HSA Foundation was announced to “make it easy to program for parallel computing.” That sounds a bit like an oxymoron, as parallel programming has been the realm of “ninja programmers,” according to Adobe’s Chief Software Architect, Tom Malloy, at AMD’s event. Given today’s parallel programming challenge, lots of work needs to be done to make this happen, and in the case of the companies above, it comes in the form of a foundation. I spent over 20 years planning, developing, and marketing products, and when you first hear the word “foundation” or “consortium” it conjures up visions of very long, bureaucratic meetings where little gets done and there is a lot of infighting. The fact is, some foundations are like that, but some are extremely effective, like the Linux Foundation. So which path will the HSA Foundation go down? Let’s drill in.

The Parallel/GPU Challenge

The first thing I must point out is that if CPUs and GPUs keep increasing compute performance at their current pace, the GPU will continue to maintain a raw compute performance advantage over the CPU, so it is very important that this theoretical advantage is turned into a real one. Next, we must distinguish between serial and parallel processing. Don’t take these as absolutes, as both CPUs and GPUs can run code serially and in parallel. Generally speaking, CPUs do a better job on serial, out-of-order code, and GPUs do a better job on parallel, in-order code. I know there are hundreds of dependencies, but work with me here. This is why GPUs do so much better on games and CPUs do so well on things like pattern matching.

The reality is, few tasks use just the CPU and few use just the GPU; both are required to work together, and at the same level, to get the parallel processing gains. By working at the same level, I mean getting the same access to memory, unlike today where the CPU dictates who gets what and when. A related problem is that coding for the GPU is very difficult, given the state of the languages and tools. The other challenge is the number of programmers who can write GPU versus CPU code. According to IDC, over 10M CPU coders exist compared to 100K GPU coders. Adobe calls GPU coders “ninja” developers because it is just so difficult, even with tools like OpenCL and CUDA, given they are such low-level languages. That’s OK for markets like HPC (high-performance computing) and workstations, but not for making tablet, phone, and PC applications that could use development environments such as the Android SDK or even Apple’s XCode. Net-net, there are many challenges for a typical programmer to code a GPU-accelerated app for a phone, tablet, or PC.
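To make the serial-versus-parallel distinction concrete, here is a minimal sketch; it is my own illustration in Python, not anything from the HSA Foundation or its members. The first function has a loop-carried dependency, the kind of code CPUs handle well, while the second applies one independent operation across a whole array, the shape of work that maps naturally onto a GPU’s many in-order cores.

```python
# A minimal sketch (my illustration, not HSA Foundation code) of the serial vs.
# data-parallel distinction discussed above. The serial version has a loop-carried
# dependency, so each step needs the previous result -- the kind of code CPUs handle
# well. The data-parallel version applies the same independent operation to every
# element, which is the shape of work that maps naturally onto a GPU's many cores.

import numpy as np

data = np.random.rand(1_000_000)

# Serial: each iteration depends on the one before it (poor fit for a GPU).
def running_max_serial(values):
    result = []
    current = float("-inf")
    for v in values:          # dependency: 'current' carries state across iterations
        current = max(current, v)
        result.append(current)
    return result

# Data-parallel: every element is transformed independently (good fit for a GPU).
def brighten_parallel(values, gain=1.5):
    return np.clip(values * gain, 0.0, 1.0)   # one operation over the whole array

serial_out = running_max_serial(data[:1000])
parallel_out = brighten_parallel(data)
print(len(serial_out), parallel_out.mean())
```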

End User Problem/Opportunity

Without the need to solve an end user or business problem, any foundation is dead in the water. Today NVIDIA uses CUDA (C, C++, C#), OpenCL, and OpenACC, and AMD supports OpenCL, to solve the most complex industrial workloads in existence. As an example, NVIDIA simulated at its GTC developer conference what the galaxy will look like 3.8B years in the future. Intel is using MIC, or Many Integrated Core, to tackle these huge tasks. These technologies are for high-performance computing, not for phones, tablets, or PCs. The HSA Foundation is focused on solving the next generation of problems and uncovering opportunities in areas like the natural user interface with multi-modal voice, touch, and gesture inputs, biometric recognition for multi-modal security, augmented reality, and managing all of the visual content at work and at home. ARM also talked on stage and in the Q&A about the power savings it believed it could attain from a shared-memory, parallel compute architecture, which surprised me. Considering ARM powers almost 100% of today’s smartphones and tablets around the world, I want to highlight what they said. Programming these kinds of apps at low power, and enabling hundreds of thousands of programmers to do it, ultimately requires very simple tools that don’t exist today.

The HSA Foundation Solution

The HSA Foundation’s goal, as stated above, is to “make it easy to program for parallel computing.” What does this mean? The HSA Foundation will agree on hardware and software standards. That’s unique, in that most initiatives focus on just the hardware or just the software. The goal of the foundation is literally to bend the hardware to fit the software. On the hardware side, this first means agreement on the architectural definition of the shared memory architecture between CPU and GPU. This is required for the CPU and GPU to operate at the same level and not be restricted by today’s buses like PCI Express. The second version of that memory specification can be found here. The software architecture spec and the programmer reference manual are still in the working group. Ultimately, simple development environments like the Google Android SDK, Apple’s XCode, and Microsoft’s Visual Studio would need to holistically support this to win over the more mainstream, non-ninja programmer. This will be a multi-year effort and will need to be measured on a quarterly basis to really see the progress the foundation is making.

Foundations are Tricky

The HSA Foundation will encounter the issues every other foundation encounters at some point in its life. First is the challenge of founding members changing their minds or becoming misaligned on goals. It happens a lot: a member stops buying into the premise of the group or staunchly believes it isn’t valuable anymore. Typically that member stops contributing, but it could even become a drag on the initiative and need to be voted off. The good news is that today AMD, ARM, TI, MediaTek, and Imagination all share a need to accelerate parallel processing. The founding members need to make this work for their future businesses to be as successful as they would like. The second challenge is that the foundation is missing key players in GPUs. NVIDIA is the discrete PC GPU and GPU-compute market share leader, Intel is the PC integrated GPU market share leader, and Qualcomm is the smartphone GPU market share leader. How far can the HSA Foundation get without them? This will ultimately be up to the likes of Microsoft, Google, and Apple with their development environments. One wild card here is SOC companies with standard ARM licenses. To get agreement on a shared memory architecture, the CPU portion of the ARM SOC would need to be HSA-compliant too, which means every product derived from a standard ARM license would be HSA-compliant. A company with an ARM architecture license, like Qualcomm, wouldn’t need to be HSA-compliant. The third challenge is speed. Committees are guaranteed to be slower than a partnership between two companies and obviously slower than one company. I will be looking for quarterly updates on specifications, standards, and tools.

For Show or for Real?

The HSA Foundation is definitely for real and was formed to make a real difference. The hardware is planned to be literally bent to fit the software, and that’s unique. The founding members have a business and technical need; solving the problem means solving huge end user and business problems, so there is demand; and the problem will be difficult to solve without many companies agreeing on an approach. I believe over time the foundation will need to get partial or full support from Intel, NVIDIA, and/or Qualcomm to make this initiative as successful as it will need to be to accelerate the benefits of parallel processing on the GPU.


AMD’s “Trinity” APU Delivers Impressive Multimedia

AMD officially launched its “Trinity” line of second-generation AMD A-Series APUs for notebooks two weeks ago, and systems will hit store shelves in a few weeks; desktops are expected later this summer. Reviews show that AMD significantly increased performance per watt over its predecessor, “Llano,” and as a result Trinity is competitive with Intel on battery life as well. One set of special hardware and software features, which AMD collectively calls the AMD HD Media Accelerator, delivers a visibly enhanced and faster multimedia experience, and I think it deserves a closer look, as mainstream and techie users alike can benefit significantly from it. It is also a good indicator that chip makers are focusing even more on the end user experience and ways to improve it.

Smooth out shaky videos

All of us reading this column have taken a shaky video on our smartphone, palmcorder, or camcorder. Whether it’s soccer games, track meets, or the first time our babies run, we all take shaky video. And all of us have watched shaky video, too, and people react to it differently. Some have no issues, but many do and won’t even watch the video. My wife actually feels sick watching any video like this, and I’m sure she isn’t alone.

To smooth out these videos and remove the “shakes,” AMD developed AMD Steady Video technology. When run on a Trinity APU system, AMD Steady Video significantly reduces the amount of jitter the user experiences when watching these videos. It works automatically, without user intervention, when video is displayed in supported browsers and media players. AMD Steady Video works with the top web browsers and media players: web browsing is supported in Chrome, Internet Explorer, and Firefox, and the feature is also supported in Windows Media Player, VLC Player, and the ArcSoft and Cyberlink media players. This covers an incredibly wide swath of global users.

Improve video quality on any device

One of the technologies more sophisticated users can appreciate is the AMD Accelerated Video Converter. This technology significantly accelerates the recoding and transcoding of video. Recoding means changing the format and size of a video. This helps when users capture video in a very high quality, very dense format but want to put the video on their phones or tablets, or upload it to YouTube. Recoding makes the video smaller or places it into a different format where it can be better viewed, shared, or edited. Transcoding means recoding and playing back the output in real time rather than storing and sharing it. This is where the AMD Accelerated Video Converter significantly improves the experience, because it cleans up the video at the same time it recodes it.
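Purely to make “recoding” concrete, here is an illustrative software sketch using the open-source ffmpeg tool. This is my own example, not AMD’s converter, which performs this kind of work in hardware on the APU; the file names are hypothetical.

```python
# Illustrative only: what "recoding" (changing a video's format and size) looks like
# in software, using the open-source ffmpeg tool. AMD's Accelerated Video Converter
# accelerates this kind of work in hardware; this sketch is mine, not AMD's.

import subprocess

def recode_for_upload(source: str, destination: str) -> None:
    """Resize a dense camera clip to 720p H.264/AAC, suitable for sharing or uploading."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", source,            # original high-bitrate clip, e.g. from a camcorder
            "-vf", "scale=1280:-2",  # resize to 1280 pixels wide, keep aspect ratio
            "-c:v", "libx264",       # re-encode video as H.264
            "-crf", "23",            # quality-based rate control
            "-c:a", "aac",           # re-encode audio as AAC
            destination,
        ],
        check=True,
    )

# Hypothetical usage:
# recode_for_upload("vacation_master.mov", "vacation_720p.mp4")
```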

Transcoding comes in handy when you have devices spread out all over the house with video files on them and you want to watch those videos on a myriad of devices, from TVs to PCs to game consoles to tablets to smartphones. Transcoding optimizes the video specifically for the playback device, as every device prefers a different kind of video. A “Trinity”-based notebook using the Accelerated Video Converter with a program like ArcSoft Link+ acts as a “media server,” transcoding the video and sending it to the playback device. The source file doesn’t have to be on the “Trinity” notebook; it can be on any device in the home, provided it supports the latest DLNA protocol and is on the same LAN. DLNA isn’t niche anymore; it is supported on virtually every new major consumer electronics device and will even be the basis for future set-top boxes that stream protected content around the home.

As a final benefit, AMD Perfect Picture technology is a video post-processing capability that works in concert with AMD Steady Video so that all the video passing through the Trinity-based notebook is cleaned up to look sharper, with richer, more accurate colors. As a result, users can play back better-looking video on their companion devices regardless of where they are in the home. This usage model may be for more sophisticated users, but through features like Apple’s AirPlay and iTunes sharing, consumers are getting much more comfortable playing back content from remote devices.

Speed up file downloads

Today on a Windows-based PC, there isn’t a QoS (quality of service) arbiter to determine which application gets bandwidth. A user can be downloading a gigantic file like a movie, game, or app, and the rest of the system becomes useless for anything else, like web browsing or video conferencing. With data density increasing at a faster pace than bandwidth, this will become a larger issue in the future. This is where AMD Quick Stream assists.

AMD Quick Stream adds the QoS feature that Windows lacks. The concept is simple: it provides equal access to the connection for each download. If four apps are downloading content, each app gets 25% of the available bandwidth. With three apps, each gets 33%. I found this feature useful as I was downloading a game in the background, looking up stuff on the web, and doing a Skype call. The system just felt more responsive.
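As a back-of-the-envelope illustration of that equal-share idea, here is a minimal sketch of my own; AMD’s implementation operates on real network traffic, while this only shows the arithmetic of splitting a fixed pipe among active downloads.

```python
# Minimal sketch of an equal-share bandwidth arbiter, purely to illustrate the QoS
# concept described above. AMD Quick Stream shapes real network traffic; this just
# shows the arithmetic of dividing a fixed pipe among the currently active downloads.

def equal_share(total_mbps: float, active_downloads: list[str]) -> dict[str, float]:
    """Give every active download an identical slice of the available bandwidth."""
    if not active_downloads:
        return {}
    slice_mbps = total_mbps / len(active_downloads)
    return {name: slice_mbps for name in active_downloads}

print(equal_share(20.0, ["game patch", "Skype call", "web browser", "video stream"]))
# {'game patch': 5.0, 'Skype call': 5.0, 'web browser': 5.0, 'video stream': 5.0}
print(equal_share(20.0, ["game patch", "Skype call", "web browser"]))
# roughly 6.67 Mbps each, the "33%" case mentioned above
```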

Wrapping up

By adding the AMD HD Media Accelerator to all Trinity APUs, AMD is in many ways bucking the tech hardware industry’s rut of feeds and speeds that don’t demonstrably improve the end user experience. This is neither a minor investment nor a move to load systems with bloatware; these are expensive technologies designed to address users’ expressed areas of pain while fitting into very small resource footprints. The multimedia features are also comprehensively released and supported across multiple web browsers, media players, and home data protocols, which ensures widespread deployment and regular updates. What AMD has done with AMD Steady Video and AMD Quick Stream is undoubtedly positive for end users, and a positive message for the ecosystem, too.

Affordable UltraBooks are Coming But They aren’t UltraBooks

There are PC wars looming, and they aren’t between Macs and Windows-based notebooks. If you follow this industry, you know Intel is seeking to rejuvenate the notebook market. It is doing this by putting quite a bit of marketing weight behind the term UltraBook. To spur development in this category, Intel has put some very specific hardware specifications around the term that OEMs like Dell, HP, and Acer must conform to if they want their notebooks to be called UltraBooks and take advantage of Intel’s marketing dollars for UltraBooks. Obviously, every OEM is making UltraBooks.

The challenge as I see it for UltraBooks is that many of the first ones at launch, and perhaps those that follow, will be priced in the premium range rather than the value range. Many of the early UltraBooks will be $699 and above; a few may come in lower, and many will skew higher as well. What our consumer data from our own research and consumer interviews tells us is that Apple has about a $250 grace price point. Consumers know Apple’s MacBook Pro and MacBook Air lines are not the cheapest products on the market. For MacBook intenders, a comparable product must be at least $250 less than the comparable MacBook to fully sway the consumer when price comes into play. But as I have pointed out before, price is becoming less and less of an issue in mature markets.

Although we expect UltraBooks to continue to drop in price there is a sub-category of notebooks emerging which may be even more interesting.

If It Looks Like an UltraBook…

Intel wants to own the UltraBook category. It is investing a lot of money in the term. However, there is a strict set of requirements notebook OEMs must abide by if they want to use it. If there is one thing I have learned in my 12 years as an industry analyst, it is that OEMs don’t generally like being told what they can and can’t do with their hardware designs. Every OEM wants to take advantage of the thin and light designs driving UltraBooks, but they may want to vary the CPU capabilities. And what if they want to use a non-Intel chip in a design that looks exactly like an UltraBook? The answer is they can’t call it an UltraBook.

Earlier in the week AMD launched a very impressive 2nd-generation A-Series APU, codenamed “Trinity.” Many OEMs have strong relationships with AMD and will most likely use these chips in their notebook lineups. So how do OEMs cover their bases by making non-Intel “UltraBooks”? Well, HP recently launched a new term: SleekBooks. We call this category Ultrathins, and we expect many Ultrathins to enter the market well below the price of UltraBooks. And that is what makes this so interesting.

While Intel is out spending millions of dollars marketing the UltraBook term, that marketing will indirectly benefit a range of competing platforms. Ultrathins will look nearly identical to UltraBooks, with only minor differences in configuration or specification that many consumers may not even notice. The bottom line is that consumers will walk into retail and see UltraBooks, SleekBooks, and perhaps more terms on the way, and with all of these options they may very well go with price and walk out with something other than an UltraBook, perhaps without even knowing they didn’t purchase one.

Now, on the surface it may seem as though Intel would not like this scenario. But realistically, Intel simply wanted to rejuvenate the notebook category, and I believe its marketing of UltraBooks is going to do just that, even though it may very well help competitors’ chipsets and, to a degree, even help Apple.

I have a feeling there is a large chunk of consumers who are due for a notebook upgrade. The iPad has, for some, served as a sufficient supplement to their existing notebook, making it easier to delay the purchase of a new one. Whether it is UltraBooks or these new thin-and-lights that look and smell like UltraBooks but are priced quite a bit lower, we expect at least a short-term positive jump in the overall notebook category over the next few years.

Mac Momentum

This is one of the more interesting things to watch. Mac sales are growing at incredible rates; it seems each quarter Apple sells more Macs than ever before. I was recently in an Apple store with a newly renovated training center. When I walked into the store I assumed the training tables would be filled with people learning how to use their iPads. Instead, every consumer at every table was learning how to use the new Mac they had just purchased.

If Ultrathins that are very thin, light, and powerful hit the market below the $599 price point, as we think may happen, it could provide a serious jump start to the notebook category. And at $599 or lower, quality notebooks will cost significantly less than an entry-level MacBook Air, which may be key to slowing Apple’s momentum with Macs.

The notebook form factor is facing important times. Because of the iPad, consumers are asking new questions about computing and their own computing preferences, and they are looking more intently for specific solutions; this is especially true of those shopping for new notebooks.

This is exciting and challenging for many in the notebook ecosystem.

Gaming AMD’s 2012 Strategy

AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.

There’s an awful lot of misguided analysis wafting about regarding AMD’s new strategic direction, which the company says it will make public in February. This piece is to help you (and me) sort through the facts and the opportunities. I last took a look at AMD’s strategies earlier this year, available here.

Starting With the Facts

  • AMD has been a fabless semiconductor company since 2009. The company depends on GlobalFoundries and soon Taiwan Semiconductor to actually fabricate its chips;
  • In its latest quarter, AMD had net income of about $100 million on $1.7 billion in revenue. Subsequently, the company announced a restructuring that seeks to cut costs by $118 million in 2012, largely through a reduction in force of about ten percent;
  • AMD has about a 20% market share in the PC market, which Intel says is growing north of 20% this year, largely in emerging markets;
  • AMD’s products compete most successfully against rival Intel in the low- to mid-range PC categories, but 2011 PC processors have underwhelmed reviewers, especially in performance as compared to comparable Intel products;
  • AMD has less than a 10% market share in the server market of about 250,000 units, which grew 7.6% last quarter according to Gartner Group;
  • AMD’s graphics division competes with nVidia in the discrete graphics chip business, which is growing in profitable commercial applications like high-performance supercomputing and declining in the core PC business as Intel’s integrated graphics is now “good enough” for mainstream buyers;
  • AMD has no significant expertise in phone and tablet chip design, especially the multi-function “systems on a chip (SOCs)” that make up all of today’s hot sellers.

What Will AMD CEO Rory Read’s Strategy Be?

I have no insider information and no crystal ball. But my eyebrows were raised in perplexity this morning to see several headlines such as “AMD to give up competing with Intel on X86“, which led to “AMD struggling to reinvent itself” in the hometown Mercury News. I will stipulate that AMD is indeed struggling to reinvent itself, as the public process has taken most of 2011. The board of directors itself seems unclear on direction. That said, here is my score card on reinvention opportunities, in descending order of attractiveness:

  1. Servers —  For not much more work than a desktop high-end Bulldozer microprocessor, AMD makes Opteron 6100 server processors. Hundreds or thousands more revenue dollars per chip at correspondingly higher margins. AMD has a tiny market share, but keeps a foot in the door at the major server OEMs. The company has been late and underdelivered to its OEMs recently. But the problem is execution, not computer science.
  2. Desktop and Notebook PCs — AMD is in this market and the volumes are huge. AMD needs volume to amortize its R&D and fab preparation costs for each generation of products. Twenty percent of a 400 million chip 2011 market is 80 million units! While faster, more competitive chips would help gain market share from Intel, AMD has to execute profitably in the PC space to survive. I see no role for AMD that does not include PCs — unless we are talking about a much smaller, specialized AMD.
  3. Graphics Processors (GPUs) — ATI products are neck-and-neck with nVidia in the discrete graphics card space. But nVidia has done a great job of late creating a high-performance computing market that consumes tens of thousands of commercial-grade (e.g., high price) graphics cards. Intel is about to jump into the HPC space with Knight’s Corner, a many-X86-core chip. Meanwhile, AMD needs the graphics talent onboard to drive innovation in its Fusion processors that marry a processor and graphics on one chip. So, I don’t see an AMD without a graphics component, but neither do I see huge profit pools either.
  4. Getting Out of the X86 Business — If you’re reading along and thinking you might short AMD stock, this is the reason not to: the only legally sanctioned software-compatible competition to X86 inventor Intel. If AMD decides to get out of making X86 chips, it better have a sound strategy in mind and the ability to execute. But be assured that the investment bankers and hedge funds would be flailing elbows to buy the piece of AMD that allows them to mint, er, process X86 chips. So, I describe this option as “sell off the family jewels”, and am not enthralled with the prospects for success in using those funds to generate $6.8 billion in profitable revenue or better to replace today’s X86 business.
  5. Entering the ARM Smartphone and Tablet Market — A sure path to Chapter 11. Remember, AMD no longer makes the chips it designs, so it lacks any fab margin to use elsewhere in the business. It starts against well-experienced ARM processor designers including Apple, Qualcomm, Samsung, and TI … and even nVidia. Most ARM licensees take an off-the-shelf design from ARM that is tweaked and married to input-output to create an SOC design, which then competes for space at one of the handful of global fab companies. AMD has absolutely no special sauce to win in the ARM SOC kitchen. To win, AMD would have to execute flawlessly in its maiden start (see execution problems above), gain credibility, nail down 100+ design wins for its second generation, and outrace the largest and most experienced companies in the digital consumer products arena. Oh, and don’t forget volume, profitability, and especially cash flow. It can’t be done. Or if it can be done, the risks are at heart-attack levels.

“AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.” One way to read that ambiguous sentence by AMD is a strategy that includes:

  • Tablets and netbooks running X86 Windows 8;
  • Emerging geographic markets, chasing Intel for the next billion Internet users in places like Brazil, China, and even Africa. Here, AMD’s traditional value play resonates;
  • Internet-based businesses such as lots of profitable servers in the cloud. Tier 4 datacenters for Amazon, Apple, Facebook, Google, and Microsoft are a small but off-the-charts growing market.

So, let’s get together in February and see how the strategy chips fall. Or post a comment on your game plan for AMD.

BAPco SYSmark 2012: Dropping the Llano Shoe

No wonder AMD was upset enough over BAPco’s SYSmark 2012 benchmark to drop out of the non-profit benchmarking organization in June with much sturm und drang.

My testing of the AMD Fusion high-end “Llano” processor, the A8-3850 APU, shows an overall rating on SYSmark 2012 of 91. Except for the 3D component of the benchmark, the Intel “Sandy Bridge” Pentium 840 scores higher in individual components — and higher overall — with a score of 98, according to the official SYSmark 2012 web site.

The SYSmark 2012 reference platform scores 100. That puts the high-end Llano desktop performance at 90% of a 2010 Intel “Clarkdale” first-generation Core i3-540, a low-end mainstream processor.

Moreover, the Intel “Sandy Bridge” Core i3-2120 dual-core processor with integrated graphics costs within a dollar of the “Llano” A8-3850 but delivers a 36 point higher score – noticeably snappier performance, in my actual use experience (see chart below).

I also tested AMD’s Phenom II 1100T, a top-end AMD six-core processor with an ATI Radeon HD 4290 graphics card, against an Intel “Sandy Bridge” second generation Core i5-2500 with integrated graphics. The Core i5-2500 is the superior processor on this benchmark; the much-maligned Intel internal graphics barely loses to the ATI Radeon HD 4290 graphics card in the 3D component, while delivering a 44 point overall advantage. The results are shown below in Chart 1.

Chart 1: Selected BAPco SYSmark 2012 Scores

Processor              Overall  Office  Media  Analysis  3D   Web  Sys Mgt
Intel i5-2500            166     144     162     191     181  168    153
Intel i3-2120            127     123     125     146     125  121    122
AMD Phenom II 1100T      122     109     116     122     183  108    110
Intel Pentium 840         98     100     102     106      87   90    107
AMD A8-3850               91      91      84      96     121   73     88
Intel Pentium G620T       79      81      81      88      70   71     86

Source: Peter S. Kastner and Business Applications Performance Corporation

Is SYSmark 2012 Relevant?
SYSmark 2012 is relevant because it allows evaluators to test specific PC configurations against actual, commonly used business applications.

AMD says “AMD will only endorse benchmarks based on real-world computing models and software applications, and which provide useful and relevant information. AMD believes benchmarks should be constructed to provide unbiased results and be transparent to customers making decisions based on those results.” Let’s look at what SYSmark does and how it does it.

Serious readers will study the SYSmark 2012 Overview published at the BAPco web site. This benchmark version is built on 20 years of collaborative experience by BAPco in modeling business workloads into application scenarios and corresponding benchmarks, through a 26-phase process that takes years to complete. The previous version was SYSmark 2007, under Windows Vista. SYSmark is real-world in that it incorporates widely used applications such as Office, AutoCAD, Acrobat, Flash, Photoshop, and Internet Explorer under Windows 7 in component scenarios.

SYSmark is widely used around the globe in business and public tenders to select PCs without bias toward vendor or processor manufacturer. SYSmark is the only generally accepted benchmark for general business computers because it uses actual application code in the tests, not synthetic models.

The benchmark is intensive, reflecting workload snapshots of what power users actually do, rather than light-duty office workers. There are six scenario components to SYSmark 2012, each of which counts equally in the final rating (a short sketch following the six descriptions below shows how the component scores combine into the overall rating):

Office Productivity: The Office Productivity scenario models productivity usage including word processing, spreadsheet data manipulation, email creation/management and web browsing.

Media Creation: The Media Creation scenario models using digital photos and digital video to create, preview, and render a video advertisement for a fictional business.

Web Development: The Web Development scenario models the creation of a website for a fictional company.

Data/Financial Analysis: The Data/Financial Analysis scenario creates financial models to review, evaluate and forecast business expenses. In addition, the performance and viability of financial investments is analyzed using past and projected performance data.

3D Modeling: The 3D Modeling scenario focuses on creating, rendering, and previewing 3D objects and/or environments suitable for use in still imagery. The creation of 3D architectural models/landscapes and rendering of 2D images and video of models are also included.

System Management: The System Management scenario models the creation of data backup sets and the compression, and decompression of various file types. Updates to installed software are also performed.
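The published scores in Chart 1 are consistent with the overall rating being the equally weighted geometric mean of the six component scores. That is my own reading of the numbers rather than something taken from BAPco’s documentation, but the sketch below reproduces the chart’s overall figures from its components.

```python
# Reproducing SYSmark 2012 overall ratings from the six component scores in Chart 1.
# Assumption (mine, inferred from the published numbers rather than taken from BAPco's
# documentation): the overall rating is the equally weighted geometric mean of the six
# scenario scores, rounded to the nearest integer.

from math import prod

def overall_rating(component_scores):
    """Equally weighted geometric mean of the scenario scores."""
    n = len(component_scores)
    return round(prod(component_scores) ** (1.0 / n))

# Office, Media, Analysis, 3D, Web, Sys Mgt -- taken from Chart 1
print(overall_rating([144, 162, 191, 181, 168, 153]))  # Intel i5-2500 -> 166
print(overall_rating([91, 84, 96, 121, 73, 88]))       # AMD A8-3850  -> 91
```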

For each of the six components, BAPco develops a workflow scenario. Only then are applications chosen to do the work. BAPco licenses the actual application source code and assembles it into application fragments together with its workflow measurement framework. The data/financial analysis component, for example, runs a large Microsoft Excel spreadsheet model.

What I don’t like is the “2012” moniker. This SYSmark version is built on business application versions as of 2010. By naming it SYSmark 2012, BAPco implies the benchmark is forward-looking, when it actually looks back to 2010 application versions. The label should be 2010. In spite of the labeling, SYSmark 2012 is unique as a cross-platform benchmark for stressing business desktops using real-world applications in job-related scenarios.

Analysis and Conclusions
The SYSmark 2012 reference-point PC is a Core i3-540 and has a 100 point score. When I used this processor with Windows 7 last year as my “daily driver” PC for a month, I was underwhelmed by its overall feel. A subjective comment, yes, but my point is that the reference machine is no speed demon.

The new AMD “Llano” A8-3850, a quad-core processor with integrated graphics, is adequate for light-weight office duties as measured by BAPco SYSmark 2012. The top-of-the-line AMD Phenom II 1100T with a discrete graphics card is better suited for mainstream task-specific business computing than the “Llano” processors.

Intel’s low-end dual-core “Sandy Bridge” Pentium G620T and 840 bracket the “Llano” A8-3850 in processor performance, while lagging in the graphics-intensive 3D benchmark component.

Intel’s entry-level Core i3-2120 with integrated graphics handily beats the top-of-the-line Phenom II 1100T with a discrete graphics card in all but graphics-intensive 3D benchmarks, making it an attractive price-performer. The high-end Core i5-2500 tops the top-of-the line Phenom II 1100T with a 44 point overall advantage, despite using integrated graphics.

SYSmark’s results do not plow new performance ground. An Internet search will quickly turn up numerous reviews that conclude, using a different set of benchmarks, that the “Llano” line is weak as a processing engine and pretty good at graphics, especially 3D consumer games. Yet consumer games are typically not high on the business PC evaluation checklist.

Many of the SYSmark 2012 applications use graphics-processor acceleration, when available, including Adobe Photoshop, Flash, Premier Pro CS5, Autodesk 3ds Max and AutoCAD, and Microsoft Excel. SYSmark 2012 convinces me that today’s integrated graphics are plenty good enough for business PCs shy of dedicated workstations. But a strong processor is still necessary for good overall performance.

Business desktops ought to be replaced every three to four years. However, the reality is many businesses keep desktops for five or more years, and many have instituted a “replace it when it breaks” cycle. Productivity studies show that knowledge workers deserve the higher end of today’s performance curve in a new PC so as not to be completely obsolete — and less productive — before the machine is replaced.

No single benchmark should be the sole criterion for selecting a computer, and SYSmark 2012 is no exception. However, I disagree with AMD that SYSmark is no longer worthy of consideration, and with other analysts that SYSmark is dead because AMD walked away from BAPco.

The bottom line for PC evaluators is simple: if you believe that the extensive work by the BAPco consortium across two decades stands up to scientific and peer scrutiny, then the SYSmark results discussed above show AMD at a serious performance disadvantage. If you don’t think SYSmark is a relevant benchmark for business PCs, then neither AMD nor I have a viable substitute.

The next shoe to drop is AMD’s high-end “Bulldozer” processor, expected in the next 60 days.


Will UltraBooks Make PCs Interesting Again?

I ask this question specifically because it is the question those who make PCs are asking. This initiative to make the PC relevant again is being driven by Intel and, in part, by AMD. That sounds rather silly because of course the PC is still relevant; the fact of the matter is the PC has become boring.

PCs are mainstream and there isn’t much interesting about them these days. Consumers are familiar with them and understand what they are and what they are good for. Consumers are more interested in learning about things like smartphones and tablets, with which they are still in discovery mode.