Nvidia’s Shield Was Built for Folks Like Me

I’m right in the sweet spot of Nvidia’s target demographic for Shield. I’m a hardcore gamer who plays mostly console games rather than PC games. Life and career have left me less time to play video games than I once had, which is why the promise of a true mobile console experience has always interested me.

This is why I was very interested when Nvidia announced Shield. I’ll be honest: I was skeptical, and a bit surprised. But I remained optimistic because of what I know about Nvidia, and because I am hopeful that someone will actually deliver on the mobile console promise.

I’ve been playing with Shield for a while now and I have to say I am impressed.

Some Thoughts on the Hardware

First off, the hardware is excellent. The controller feels very much like an Xbox controller, which I would argue is the best controller around. ((This is subjective of course, but the overall feel in my hand and the “just right” stiffness of the joysticks is perfect for me.)) If you have spent many hours gaming with the Xbox controller, you will feel right at home with the Shield controls.

Second, the screen is fantastic. I’ve used all the latest and greatest Android devices, and although the Shield’s resolution and PPI aren’t as high as those of devices like the Galaxy S4, to the naked eye it looks extremely close. That means the games and the whole visual experience are top notch.

Android Gaming

The biggest question here is games on Android. Nvidia chose the Android operating system to run Shield because of Android’s open nature. There is no question in my mind that more immersive games will come to mobile devices, but I’ve felt for some time that a controller experience was necessary for this to fully happen. Now that Shield is out and, in my opinion, delivers a truly mobile console experience, the ingredients are there for console game developers to start taking mobile more seriously.

There are already a handful of Shield-optimized Android games and, as with all new console launches, I anticipate that number will grow. Because Shield is built on Android, I expect the library of Shield-optimized games to grow faster than that of any previous mobile gaming console.

Interestingly, although there are only about two dozen Shield-optimized games so far, many more titles in the Android market already work, given their support for third-party game controllers.

One last point. In using Shield, I have had my most positive experience with Android yet. Not only is it a pure implementation running stock Jelly Bean, but many of the entertainment use cases Shield is focused on bring out some of the best of Android. Android is great on smartphones and tablets, but in my opinion, it’s even better on Shield.

A Bit of Nostalgia

Although there is a fair amount of content already available to play on Shield, being built on Android has its advantages. Namely, given Android’s open nature, there are very good Nintendo and Super Nintendo emulators available. I downloaded my favorite, SuperGNES, and loaded up the games I have been playing on my Galaxy S4: Super Punch-Out, Street Fighter II, Mario Kart, and Super Mario World. Lo and behold, right out of the gate every one worked with the Shield controller with no modification or customization. So here I am now playing Street Fighter II and Super Punch-Out with the glory of a game controller.

Having access to all the Nintendo games I know and love, and grew up with, and being able to use a game controller with them, was perhaps the most eye-opening part of using Shield.

Powerful Accessories

Beyond the games, there are other benefits to being built on Android that showcase a device like Shield’s advantage over a more closed mobile console gaming experience. Being built on Android opens the door for other unique hardware accessory experiences to benefit Shield. One in particular I want to highlight: using Shield to fly my Parrot AR.Drone.

Yes, I have one of those drones, and it is one of my favorite gadgets / toys. You may not know this, but the AR.Drone has a number of augmented reality games available for it: games where you use the camera to shoot digital objects in the air or on the ground, or games where you race through a digital course in the physical world. All of these experiences are possible through a touch screen, but they are made all the better with a physical game controller. To say that Shield has profoundly improved my flying ability with my AR.Drone would be an understatement.

This brings up a broader point. We are seeing a number of electronics like this, whether RC cars, planes, etc., come with software for smartphones. Being able to use Shield as a game controller with hardware experiences like these may open doors that were not possible before.

Things to Consider

For us gaming enthusiasts, this is a difficult holiday season. It is the first in a while to feature the simultaneous availability of the two top gaming consoles in their launch year, and most of us can’t afford them all.

However, if a mobile console gaming experience is a priority for you, then strongly consider Shield. It is the best mobile gaming console experience I have encountered. And, as I stated, there is big potential upside in being built on Android. I strongly believe it is only a matter of time before console-first game developers shift to a mobile-first development focus. This does not mean they will develop only for mobile devices, only that they will embrace a mobile-first strategy. Android will clearly benefit from this move, and inevitably so will Shield.

Shield may be the most future-proof mobile gaming console to hit the market yet. And as I pointed out, playing Nintendo games, using it to fly drones, and so on are all icing on the cake.

How Windows RT could Thrive

Microsoft’s decision to create a version of Windows 8 for ARM processors, called Windows RT, has become a bit of an enigma in the industry. Windows RT-based tablets were launched with much fanfare, yet sales of RT-based devices have fallen way short of predictions.

In fact, Microsoft is now selling the Surface RT to schools for $100, which suggests the Surface RT experiment is pretty much dead. Microsoft has only itself to blame. The decision to include Office minus Outlook was a serious blow to these early models. And while newly created Windows 8 apps worked on RT, the lack of backward compatibility with existing Windows apps really added to its lack of allure for most customers.

Their TV ads didn’t help either. Instead of showing people the virtues of Surface, they showed hip young people dancing and jiving while holding Surface RT tablets, which made no sense to anyone who wanted to know what Surface really was and why they should consider buying it. These ads were a waste of money and a big mistake in my book.

Our research suggests that Windows RT in 10-inch tablets and laptops will probably never take off, mostly because of the lack of backward compatibility with current Windows apps, which for a lot of people is still a big issue. While it is true that Windows 8 apps work on RT devices, the scarcity of Windows 8 apps, especially long-tail apps, will continue to hurt RT in these form factors too.

However, there is one device category where RT could be quite welcome. One thing you may have noticed is that 7″ and 8″ tablets have come down in price. Over the weekend I saw a 9″ tablet for $99.00 at Fry’s. Sure, it was a no-name brand, but it ran Android Ice Cream Sandwich and was more than serviceable as a basic tablet. What we are seeing is a race to the bottom among smaller-screen tablets, and it is becoming harder and harder for any tablet player to compete when prices get this low and the products are all pretty much alike.

Gaming and Media

What is needed in the small tablet space is differentiation. Just using a mainstream processor will not cut it if the goal is to be heard above the crowd. It is true that being tied to a rich ecosystem, as Amazon and Apple are for their smaller tablets, helps those players differentiate. But others, especially those betting on Windows 8 for tablets, have no edge against this race-to-the-bottom onslaught in the low-end tablet space.

While CPUs in smaller tablets are important for delivering long battery life, the need for an upscale processor is somewhat minimal. However, one area of content that is important, even in small tablets, is games and video. For games, the GPU will become an important differentiator for these smaller tablets, especially since their use cases will lean toward media and entertainment.

This is where RT could be on somewhat equal footing. In smaller tablets, backward compatibility with existing Windows apps is not important. Rather, the device just needs to run Windows 8 apps and run them extremely well. Games and video built for Windows 8 could have an advantage when running on an ARM processor like Nvidia’s Tegra or Qualcomm’s Snapdragon, both of which, for the time being, are likely to have a graphics advantage over their lower-cost x86 counterparts. ((We can debate the degree to which “good enough” experiences exist, but graphics is still an area where we will continue to observe clearly better visual experiences.))

Nvidia has made the GPU a key part of its mobile processor, known as Tegra, and to date Nvidia has had some pretty big wins in tablets because of the robustness of Tegra’s CPU and GPU. Qualcomm, with Adreno, and Intel as well both realize that the GPU is becoming much more important in mobile, and they too have been working hard on more powerful graphics processors for their mobile SoCs.

Most of Nvidia’s tablet wins have been for Android, but vendors wanting to build Windows 8 ARM-based tablets need to look closely at the role a GPU will play in driving greater differentiation in these smaller tablets. From our research, we are finding that smaller tablets are mostly used for content consumption and games, not productivity. Making these smaller tablets exceed consumers’ expectations, especially with games, could allow Windows RT to be taken seriously. An SoC with an emphasis on graphics could help deliver a great gaming experience for this use case. And if the graphics and media experience is objectively better, consumers will pay a premium for a tablet used for HD games and video. ((Obviously there are many variables here, including rich applications and games being developed for Windows RT.))

It will be important to watch what happens at Microsoft’s Build conference in San Francisco next week and see how much emphasis is placed on creating games for Windows 8. If this is a major part of the strategy, then RT-based small notebooks and tablets could thrive in this space, even if they don’t carry bargain-basement prices.

NVIDIA GeForce Grid: Killing off Game Consoles?

Yesterday, NVIDIA launched VGX and the GeForce Grid, which, among many things, could render future game consoles obsolete. This may sound far-fetched right now, but as I dig into the details of what the GeForce Grid can do and map that against future consumer needs, it becomes clear that unless future consoles can demonstrably deliver something unique and different, they will just be an unnecessary expense and a hassle for the end consumer.

Problems with Cloud Gaming Today

Cloud gaming services like OnLive and Gaikai exist today. They have received a lot of press, but it’s uncertain whether their business models and experiences could survive years from now on their current approaches and implementations.

Scalability is one issue. These services must match each cloud game session to a dedicated graphics card, so if you have 1,000 gamers, you need 1,000 graphics cards. You can imagine the challenge of scaling that experience to millions of users: you would need millions of graphics cards, which in a data center environment makes little sense logistically or financially.

Latency is another issue. Latency is the lag between when a user does something and when they see a response, and cloud game services must keep servers within a few hundred miles of players to maintain acceptable latency in gameplay. Imagine a one-second delay between the time you pull the trigger in Battlefield 3 and the time something happens; it would render the game absolutely unplayable. Latency that is acceptable in social media apps like Facebook is not acceptable in games. Having to provide “edge servers” close to end users, as the industry does today, is also unproductive: you cannot leverage those servers during off-peak hours, and it’s difficult to even share servers across time zones. Servers sit around idle with nothing to do, placing another immense financial burden on the cloud game provider. NVIDIA and its partners are attempting to solve these problems.

Nvidia VGX and the GeForce Grid

NVIDIA, with VGX and the GeForce Grid, is attempting to solve the scalability and latency problems of today’s cloud gaming services like Gaikai and OnLive. NVIDIA VGX is the set of technologies addressing current virtual display issues, and the GeForce Grid is the specific implementation attacking cloud gaming. Together they address the problems with two distinct but related technologies: GPU virtualization and low-latency remote display.

Virtualization of the GPU lets more than one user share the resources of a graphics card, so the one-to-one ratio between gaming sessions and graphics cards goes away. With NVIDIA VGX, multiple users can share a single, monster-sized graphics card. This provides much better scalability for the cloud gaming data center and correspondingly reduces costs and increases flexibility.
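A quick back-of-envelope sketch shows why this matters for data center economics. The session density below is a hypothetical illustration, not an NVIDIA figure:

```python
# Compare 1:1 provisioning against virtualized GPU sharing.
# All numbers are illustrative assumptions, not NVIDIA specifications.

def cards_needed(concurrent_sessions: int, sessions_per_card: int) -> int:
    """Graphics cards required, rounding up to whole cards (ceiling division)."""
    return -(-concurrent_sessions // sessions_per_card)

peak_sessions = 1_000_000

dedicated = cards_needed(peak_sessions, sessions_per_card=1)    # today's 1:1 model
virtualized = cards_needed(peak_sessions, sessions_per_card=8)  # hypothetical VGX density

print(f"Dedicated cards:   {dedicated:,}")    # 1,000,000
print(f"Virtualized cards: {virtualized:,}")  # 125,000
```

Even a modest sharing ratio collapses the hardware requirement by nearly an order of magnitude, which is the whole economic argument for virtualizing the GPU.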

Low-latency remote display significantly speeds up delivery of the rendered image to the client device. In this cloud gaming scenario, the game frames are converted into an H.264 stream and sent to the user. NVIDIA has improved the pipeline by eliminating steps: the game frame no longer touches the CPU or main memory, but is encoded directly on the NVIDIA VGX card and sent over PCI Express straight to the network card. Bypassing these components speeds up the process immensely, which delivers a few benefits. First, all things being equal, it gives the gamer a much faster experience than they have had before; the experience simply feels more local. Combined with GPU virtualization, the reduced latency also lets cloud gaming data centers sit farther from users, which increases data center utilization and efficiency. It also allows entire geographies to be served that never could be before, as “edge servers” can be consolidated.
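To make the latency argument concrete, here is a rough round-trip budget for a single cloud-gamed frame. Every number is an assumption for the sake of the sketch, not a measured or vendor-published figure:

```python
# Illustrative round-trip latency budget for one cloud gaming frame.
# All figures are invented for illustration.

budget_ms = {
    "input to server (network)": 30.0,
    "game simulation + render": 16.7,    # one frame at 60 fps
    "H.264 encode on GPU": 5.0,          # direct on-card encode, no CPU/memory hop
    "server to client (network)": 30.0,
    "client decode + display": 10.0,
}

PLAYABLE_MS = 150.0  # assumed tolerance threshold for action games

total = sum(budget_ms.values())
print(f"Total round trip: {total:.1f} ms")
print("Playable" if total <= PLAYABLE_MS else "Too laggy")
```

The point of the exercise: shaving milliseconds out of the encode path buys geographic slack, since every millisecond saved on-card is a millisecond that can instead be spent on network distance to a farther, better-utilized data center.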

Wither Future Game Consoles?

If NVIDIA and its partners can execute on the technology and the experience, it would essentially enable any device that can currently play back YouTube video well to be a virtual game device. Gamers could play any game, any time, immediately. What kinds of devices do that today? They are all around us: smartphones, Smart TVs, and even tablets. There’s no loading games off a disc, no downloading 500MB onto a PC; it’s just pick the game and play. Once gamers are done playing on the TV, they can take their tablet and pick up in the bedroom where they left off.

This kind of usage model is quite common when you think about it. Many consumers already enjoy books, movies, and even music this way, so why not games? For many consumers, convenience trumps quality, and that’s one of the issues I see for future consoles. There is no doubt that consoles’ visual detail and user interfaces will be much more sophisticated than cloud gaming’s. But look at how well the iPod did with its “inferior” music quality: consumers chose convenience over quality. Look at Netflix on a phone or tablet: consumers can get much higher quality from their local cable service, but a growing number choose convenience.

Device makers and service providers who see no monetization from games today will adopt this approach very aggressively. TV makers, for instance, see no revenue from any game played on their devices. Gaikai, as an example, is cutting deals with TV manufacturers like LG to build this service into every future Smart TV. Telcos and cable companies are also very motivated to tap into the huge gaming revenue stream.

I believe consoles will adopt cloud gaming capabilities alongside physical media, or they will be viewed as lacking the features gamers want. I also believe cloud gaming will seriously cannibalize future game consoles: many who would have purchased a new console, had cloud gaming with NVIDIA VGX and GeForce Grid not existed, will not buy one. That premise raises the question of whether game consoles have a bright future. If console makers don’t do something aggressive, their future looks dim.

If you would like a deeper dive on NVIDIA VGX and the GeForce Grid, you can download my whitepaper here.

NVIDIA Solved the Ultrabook Discrete Graphics Problem with Kepler

When Intel released its first Ultrabook specification, one of the first component implications I thought of was the impact on discrete graphics. My thought process was simple: based on Intel’s specifications for battery life, weight, and thickness, designing in discrete graphics that were additive to Intel’s own graphics would be difficult, but not impossible. By additive, I mean making a demonstrable difference to the experience rather than just a spec bump. While I respect OEMs’ need to add discrete graphics for line logic and perception, sometimes it doesn’t make an experiential difference. This is why I was so surprised and pleased to see NVIDIA’s latest discrete graphics solutions inside Ultrabooks. NVIDIA’s new GPUs based on the “Kepler” architecture not only provide an OEM differentiator, but also deliver a demonstrable, experiential bump to games and video.

Today’s Ultrabooks share similar specs

Today’s field of Ultrabooks is impressive but lacks differentiated hardware specifications and usage models. I deeply respect differentiation in design, as I point out in my assessment of the Dell XPS 13, but on the whole I can do very similar things and run very similar apps with the current top crop of Ultrabooks.

As an example, let’s look at the offerings at Best Buy. Of the 13 Ultrabooks, all offer roughly the same or similar specifications: processor (Intel Core i-series), graphics (Intel HD), operating system (Windows 7 64-bit), display size (13-14″), display resolution (1,366×768), memory (4GB RAM), and storage (128GB).


Of these specifications, the level of the Intel Core CPU primarily determines what a user can actually do with an Ultrabook. As Ultrabooks mature through a full cycle, differentiating with graphics makes a lot of sense, particularly in the consumer space.

NVIDIA’s Kepler-based GeForce GT 640M Mobile Graphics

Today, NVIDIA launched the first of its latest and greatest GeForce 600M graphics family, the GeForce GT 640M. This GPU features a new architecture, code-named “Kepler,” destined for desktops, notebooks, Ultrabooks, and workstations. Designed to be incredibly powerful and efficient, and built on TSMC’s low-power 28nm HP process, test results I’ve seen show these new GPUs deliver twice the performance per watt of the prior generation. AnandTech has thoroughly reviewed the desktop variant, the NVIDIA GeForce GTX 680, and given NVIDIA the single-card graphics performance crown.

NVIDIA’s Kepler differentiates the Acer Timeline Ultra M3

With the NVIDIA GeForce GT 640M, users can now get the new graphics along with the Ultrabook benefits of thin, light, responsive designs and great battery life. Consumers can buy this capability today in Acer’s new Timeline Ultra M3. The M3 can play all the greatest game titles like Battlefield 3 at Ultra settings, is only 20mm thin, and gets 8 hours of battery life. NVIDIA suggests this combination of Ultrabook and Kepler-based graphics makes for the “World’s First True Ultrabook.” I need to test this for myself, but they have a point, given that it provides between a 2X and 10X bump over Intel HD graphics in the most demanding gaming titles.

How NVIDIA’s Kepler-based GPU fits in an Ultrabook

As I said earlier, when I saw the Ultrabook specification, I thought it would be very difficult to get decent discrete graphics into an Ultrabook. My concerns were about the power draw needed to meet minimum battery requirements, and about fitting a proper cooling solution into the chassis height of 13″ and 14″ form factors.

Between NVIDIA and their OEMs, many different factors played into enabling this capability:

  • NVIDIA’s Kepler architecture is twice as efficient as the prior architecture. Put another way, at half the power it can provide the same performance. For instance, the GT 640M reportedly provides the same performance as the previous enthusiast-class GTX 460M at around half the power consumption.
  • NVIDIA Optimus technology automatically shifts between the lower-power Intel HD graphics and the higher-performance NVIDIA discrete graphics. When the user is doing email, the Intel graphics are operating and the GeForce GPU consumes zero power. When the consumer is playing Battlefield 3, Optimus automatically turns on the GeForce GPU to provide the best possible performance.
  • New and better power management lets GeForce GPUs maximize performance by intelligently utilizing the full potential of the notebook’s power and thermal budget. For example, if the notebook’s heat-sink assembly has spare thermal headroom, the GeForce GPU can dynamically increase frequency to provide the best possible performance without adversely affecting operating temperature or stability.
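The Optimus behavior described above can be sketched as a simple routing decision. The real driver logic is proprietary; this toy model, with an invented application profile list, only illustrates the idea of sending profiled workloads to the discrete GPU while everything else stays on integrated graphics:

```python
# Simplified sketch of Optimus-style GPU selection.
# DISCRETE_APPS is a hypothetical profile list, not NVIDIA's actual database.

DISCRETE_APPS = {"battlefield3.exe", "videoeditor.exe"}

def select_gpu(process_name: str) -> str:
    """Route demanding, profiled apps to the discrete GPU; everything else
    runs on integrated graphics while the discrete GPU stays powered off."""
    if process_name in DISCRETE_APPS:
        return "discrete"    # GeForce GPU powered on for this workload
    return "integrated"      # Intel HD graphics; GeForce draws zero power

print(select_gpu("outlook.exe"))       # integrated
print(select_gpu("battlefield3.exe"))  # discrete
```

The key design point is that the choice is per-application and automatic, so the battery cost of the discrete GPU is only paid when a workload can actually benefit from it.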

I was correct that this was very challenging, but between NVIDIA and its OEMs, it’s clear they stepped up and made it happen.

Ecosystem and NVIDIA Implications

Having NVIDIA’s new high performance graphics inside Ultrabooks is good for the entire ecosystem of consumers, channel partners, OEMs, ODMs, game ISVs and of course, NVIDIA:

  • Consumers get between 2-10X the gaming performance plus all the other Ultrabook attributes.
  • Channel partners, OEMs, and ODMs can now offer a much more differentiated and profitable line of Ultrabooks.
  • Game ISVs and their distribution partners can now participate more fully in the Ultrabook ecosystem.

And, of course, NVIDIA has a big potential win here, too. According to GfK, over the past two quarters NVIDIA has picked up nearly 10 points of market share in Intel-based notebooks. Kepler should enable NVIDIA to further increase this share, particularly in Intel Ivy Bridge-based Ultrabooks. AMD hasn’t played its full mobile hand yet, but given AMD’s known GCN architecture and TSMC’s 28nm process, it has limited weapons left in its 2012 Ultrabook arsenal. The graphics world is a very dynamic market, so you never can be certain what each player is holding back. AMD held the discrete graphics leadership position for a while, but 2012 looks very good for NVIDIA.

HTC One X: A Big Win for Nvidia’s Tegra 3

At this year’s Mobile World Congress, HTC made an announcement I found interesting: their latest and greatest smartphone, the One X, will run Nvidia’s latest processor, Tegra 3. Granted, Tegra has been making news winning a number of handset and tablet OEMs, but the news that HTC chose Tegra 3 is of particular interest because HTC has largely been extremely loyal to Qualcomm, launching all its flagship top-tier devices with Qualcomm silicon. Taylor Wimberly at AndroidAndMe asks a similar question in his post, “Is Qualcomm losing their strongman grip on HTC?”

HTC choosing Nvidia’s latest Tegra chip is a testament to the quality of the Tegra 3 architecture. As I pointed out in my column, The Arm Wrestling Match, Qualcomm and Nvidia take different approaches to multi-core. Both companies have viable strategies, and both are gaining design wins all over the industry. However, for Nvidia and Tegra, winning an HTC design was the first of many key strategic steps toward getting its silicon into a wider portfolio of OEMs.

For Nvidia, and Tegra in particular, winning the HTC One X is a big win. It is a testament to the Tegra 3 multi-core architecture and, I believe, signals not only the breadth and depth of Nvidia chips in 2012 but also that quad-core is the new dual-core in smartphones and tablets this year.

Nvidia still has work to do, however. It is building LTE support into Tegra 3, which we expect to see finalized in devices in the second half of 2012. LTE support in Qualcomm’s S4 remains an advantage for Qualcomm, since modem technology is core to its heritage. This is why it will be very interesting to see how Nvidia integrates its Icera acquisition into the Tegra roadmap.

For Nvidia, Tegra has always had the performance advantage as a tablet solution, winning many tablet designs. I have been waiting to see how Tegra, and now Tegra 3 in particular, generates broader support in smartphones. The HTC One X win may signal an upward trend for Tegra 3 in smartphones.

However we slice the fascinating competition between Nvidia’s Tegra and Qualcomm’s Snapdragon chipsets, the main point remains clear: quad-core chipsets will invade devices of all shapes and sizes in 2012 and beyond.

The Verge’s Vlad Savov recently interviewed Nvidia’s Tegra GM Mike Rayfield about Tegra 3; I encourage you to read that interview here. FierceWireless also had a great interview with HTC’s lead product designer on the decision to use Tegra 3 in the One X; you can read that here.

NVIDIA’s Tegra 3 Leading the Way for Smartphone Modularity

I have been an advocate of modularity since before it became popular. The theory seems straightforward to me: if the capabilities of a smartphone are outpacing the usage-model demands of a rich-client PC, then consumers could someday use their own smartphone as a PC. Large displays, keyboards, and mice still exist in this usage model, but the primary intelligence is in the smartphone, combined with wireless peripherals. At this year’s Mobile World Congress, NVIDIA and its partners took us one step closer to this reality with the formal announcement of Tegra 3-based smartphones.

Tegra 3 for Smartphones

Tegra 3 is NVIDIA’s latest and greatest SoC for smartphones, “superphones,” and tablets. It has four high-performance ARM Cortex-A9 cores running at 1.5 GHz and one “battery saver” companion core that operates when the lowest power is required. The fifth core comes in handy when the system is idling or when the phone is checking for messages. Tegra 3 also includes a very high-performance graphics subsystem for games and HD video, much more powerful than Qualcomm’s current Adreno 2XX hardware and software implementation.
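The four-plus-one arrangement can be sketched as a simple load-based governor. The thresholds below are invented for illustration; the real scheduling lives in NVIDIA’s hardware and kernel code:

```python
# Toy model of Tegra 3's four performance cores plus one low-power
# companion core. COMPANION_MAX_LOAD is an assumed threshold, not an
# NVIDIA specification.

COMPANION_MAX_LOAD = 0.15  # assumed: idle, background sync, message checks

def active_cores(system_load: float) -> str:
    """Pick which cores run for a given normalized load (0.0 to 1.0)."""
    if system_load <= COMPANION_MAX_LOAD:
        return "companion core only"   # lowest-power operation
    # Scale the number of 1.5 GHz performance cores with demand.
    n = min(4, max(1, round(system_load * 4)))
    return f"{n} performance core(s)"

print(active_cores(0.05))  # companion core only
print(active_cores(0.30))  # 1 performance core(s)
print(active_cores(1.00))  # 4 performance core(s)
```

The design intent this models: light, always-on work never wakes the power-hungry cores, while bursty demand can still recruit all four.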

NVIDIA announced five major Tegra 3 designs at Mobile World Congress: the HTC One X, LG Optimus 4X HD, ZTE Era, Fujitsu’s “ultra high spec smartphone,” and the K-Touch Treasure V8. These wins were in what NVIDIA coins “superphones,” as they have the largest screens, the highest resolutions, the best audio, and so on. For example, the HTC One X sports a 4.7″ 720p HD display, the latest Android 4.0 OS, Beats audio, NFC (Near Field Communication), and its own image processor with a 28mm lens to take great pictures in extremely low light. You get the idea.

There is a lot of goodness in the package, but that doesn’t remove the challenge of communicating the benefits of four cores on a 5-inch-screen device.

Quad Core Phone Challenge

As I wrote previously, NVIDIA needs to leverage four cores beyond the spec sheet on the retail tear pad. It’s a two-part challenge: first, to make sure there is a real benefit, and then to articulate that benefit simply. PC manufacturers faced similar challenges, but they had 20 years of dual-socket machines to establish an ecosystem and a message. Quad-core tablets are an easier sell, and quad-core convertibles easier still, in that you can readily spot places where four cores matter, like web browsing and multitasking. Smartphones are a different situation: due to screen-size limitations, multi-tab browsing and multitasking rarely peg a phone to its limits. One major exception is the modular environment, where NVIDIA shines the most.

Tegra 3 Shines the Most in Modular Usage Models

Modularity, simply put, is extending the smartphone beyond its built-in limitations in display, audio, and input mechanisms. When the smartphone breaks its own barriers, this is where NVIDIA’s Tegra 3 shines the most. To be clear, Tegra 3 is a competitive and differentiated smartphone and tablet SoC even without modularity, but it is most differentiated when it breaks free from its limited environment.

NVIDIA has done a much better job showing the vision of modularity than its partners have done actually delivering it. On the positive side, partners are showing some levels of modularity. HTC just announced the HTC Link for the HTC One X, a software and hardware solution that plugs into an HDTV so you can wirelessly mirror what is on the phone’s display. It’s like Apple’s AirPlay but better in some ways, such as being able to project a video on the large display while doing something different on the phone, like surfing the web. Details are still a bit sketchy for the HTC One X and HTC Link, but I am hopeful useful modular features will roll out in the future. Apple’s wireless mirroring already supports games, so in that respect HTC Link is behind.

What NVIDIA Tegra 3 Should Do

What NVIDIA’s partners need to create, and then attack, is a game console and digital media adapter solution that eliminates the need to buy an Xbox, PlayStation, Wii, Roku, or Apple TV. All of the base software and hardware is already there; what HTC, ZTE, or LG needs to do now is package it to make gaming more convenient. This Tegra 3 “phone-console” should have a simple base near the TV providing power, wired LAN, HDMI, and USB. That way, someone could connect a wireless game controller and play titles like the recently announced Tegra 3-optimized games at high resolution with rich audio. The user could send phone calls to voicemail or to a Bluetooth headset, and notifications could be muted as desired. And of course, if you want to watch Netflix, Hulu, or Amazon movies, it’s all there, too. An alternative is a Wi-Fi Direct implementation that requires no base, where the user wields the phone as a multi-axis game controller with force feedback. The challenge there is battery life, but the user can pause the game or movie to pick up phone calls and messages. This usage model isn’t for everyone, but think for a moment about a teenager or college-bound guy who loves gaming, wants a cool phone, and doesn’t have the cash to buy everything. You know the type.

Other types of modularity that NVIDIA’s partners must develop are around productivity, where the phone drives a laptop shell, similar to Motorola’s Lapdock implementations as I analyzed here. Neither the software, the hardware, nor the price made the Lapdock a good solution, but many of the technologies now exist to change that.  NVIDIA’s Tegra 3 would be a great start in that it enables real multitasking when using the Lapdock in clamshell PC mode.  Android 4.0 provides a much more modular computing environment that can properly display applications on both a 5″ and an 11″ display, including scaling the fonts and reorienting windows.  The Motorola Lapdock used two environments, Android Gingerbread and a different one for PC mode.  Unsurprisingly, it was a good start but a very rough one, with room to improve.

NVIDIA, the Silicon Modularity Leader with Tegra 3

NVIDIA, with its Tegra 3 solution, is clearly the current silicon leader for future modular use cases.  They are ahead of the pack with their modularity vision, patiently waiting for their partners to catch up.  This was most evident at CES, where NVIDIA showed an ASUS Transformer Prime connected to an XBOX controller and an HDTV, playing high quality games. They also demoed the Prime playing high-end PC games through remote desktop. Now that is different.

The opportunity for HTC, ZTE, LG, and potentially new customers like Sony, RIM, and Nokia is there, and the only question that remains is whether they see the future well enough to capitalize on it.  With all the complaints from handset vendors about differentiation and profitability with Android, I continue to be puzzled by their lack of aggression.  An aggressive handset maker will jump on this opportunity in the next two years and make a lot of money in the process.

The Case for Intel’s Future Smartphone Success

In my many weekly conversations with industry insiders, we discuss Intel’s chances in mobility markets, specifically smartphones. Few people are betting against Qualcomm, and for very good reason: they are entrenched at handset vendors, and their 2012 roadmap, at least on paper, looks solid. What few are discussing is how Intel will pick up market share. My last column on Intel’s smartphone efforts outlined what Intel needs to demonstrate quickly to start gaining share and getting people to believe they can be a player. Now I want to take a look at why I believe Intel can and will pick up relevant market share over the next three years.

Intel Finally Broke the Code with Medfield

This isn’t Intel’s first time in mobility. Intel owned XScale, an ARM-based mobile processor that was in the most popular WinCE devices, like the Compaq iPaq, one of the more popular Pocket PCs. XScale products even powered Blackberrys for a time. Intel sold the entire XScale mobile application processor business to Marvell in 2006 for $600M, a move driven by Intel’s desire to focus on X86 designs. What followed were some failed mobile attempts with Menlow and Moorestown, two low-power, Atom-branded processors that made their way into MIDs (Mobile Internet Devices). It appeared that Intel would make grand announcements with big names like LG for smartphones, then nothing would happen afterward. Things are very different with Medfield. Lenovo handsets are in testing at China Unicom, and Motorola announced its handsets would be at carriers for the summer.

Medfield is a huge step forward in design and integration for Intel. First, it combines the application processor with I/O capabilities on a single chip, which saves handset makers integration time and board space. Second, it is paired with the Intel XMM 6260 radio, based on the Infineon Wireless Solutions (WLS) acquisition; this increases Intel’s revenue BOM (Bill of Materials) and also helps with handset integration. Finally, Intel has embraced the Android mobile OS in a huge way with a large developer investment and will provide optimized drivers for Medfield’s subsystems. This move is in contrast to their MeeGo OS efforts, which didn’t go anywhere. Intel has even gone to the effort of translating ARM instructions so that Medfield can run apps with native ARM code, typically games that need to be closer to the hardware. This is a very good start for Intel, but as I tell my clients, if there are 10 steps to mobile silicon success, Intel just successfully crossed step 3.
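To make the instruction-translation idea concrete, here is a toy sketch in Python. The two-instruction mapping table and the `translate` helper are invented purely for illustration; Intel’s actual translation layer is far more sophisticated and is not public.

```python
# Toy sketch of instruction translation, the general technique behind
# running ARM-native code on an X86 chip. This two-instruction "ISA"
# is invented for illustration only.

ARM_TO_X86 = {
    "ADD": "add",   # hypothetical one-to-one opcode mappings
    "MOV": "mov",
}

def translate(arm_program):
    """Translate a list of (op, operands) tuples into x86-flavored text."""
    out = []
    for op, operands in arm_program:
        if op not in ARM_TO_X86:
            raise NotImplementedError(f"unhandled op: {op}")
        out.append(f"{ARM_TO_X86[op]} {', '.join(operands)}")
    return out

print(translate([("MOV", ["r0", "#1"]), ("ADD", ["r0", "r0", "#2"])]))
```

Real translators must also cope with condition codes, memory ordering, and performance overhead, which is why hardware-adjacent apps like games are the hardest case.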

It’s a Tough Smartphone Market

Intel made some very serious headway with Medfield, but it is a very competitive market out there. According to IDC, in Q4 2011, Apple and Samsung combined garnered almost 50% of the smartphone market. As I pointed out in my previous column, Apple already designs their A-Series processors, and I don’t see that changing. I expect Samsung, with the exception of the very low end, to lean into their own Exynos silicon. Nokia, at 12% Q4 smartphone share, is tied to Windows Phone and Qualcomm, at least for the short term. Struggling RIM doesn’t need another variable to worry about with their muddled operating system strategy and is currently tied to Qualcomm. Finally, HTC is rumored to tie up with NVIDIA on its Tegra platform at the high end. Who does this leave for Intel?

For Intel in the short term, with Motorola and Lenovo on board, this leaves private label for carriers, LG, Sony, ZTE, Huawei, Kyocera, Sanyo, and a very long tail of small manufacturers. The long tail will be a challenge for Medfield until Intel waterfalls the product line to be cost-competitive with lower-end models. I expect Intel to start waterfalling products down at the end of 2012.

Why Intel Could Succeed

While I outlined the many challenges, Intel could very well succeed in the space longer term. First, the phone marketplace is rapidly changing. Not only have there been tremendous share shifts in the last two years, but feature phones are migrating to the smartphone market, resulting in exploding growth.

Operating systems are far from shaking out. Microsoft will not go gently into the night with Windows Phone and will invest what it takes to be successful, even if it takes another Nokia-like investment to own another platform. I also believe once Microsoft starts gaining share, they will devote resources to X86 on Windows Phone 8 or 9 platforms. They see Intel as successful with Medfield, and the WINTEL alliance could be brought back from the dead. Long term, I do not believe Samsung will be happy licensing someone else’s operating system, particularly given Apple’s success with integration and experience. I expect Samsung to do one of three things, possibly two: increase investment in Bada to a point where it can compete with Android in a closed environment, embrace webOS, or lean heavily into Tizen. Marketplaces in dynamic change are an opportunity for newcomers, even companies worth $140B like Intel.

One other important factor that hasn’t fully played out is “carrier versus handset-maker” dominance. Up until the Apple iPhone, the carriers dictated terms to the handset makers. Every carrier who has adopted the iPhone has taken a gross margin reduction. This doesn’t mean they made a bad decision; they had to carry the iPhone. That margin reduction money is going to Apple and not the carriers. Carriers are strategizing about how they can regain that dominance going forward, and I believe Intel will be part of those plans. Intel has the capability to partner with an extremely low-cost manufacturer or ODM an entire solution, white-label it to a carrier, and provide a competitive Android experience. I expect a few key announcements this month at this year’s Mobile World Congress.

Of course, we cannot forget about Intel’s technology. According to tests run at AnandTech, Intel’s Medfield is competitive in power at 32nm LP, so you must assume it only gets better with Intel’s 22nm 3D Tri-Gate technology. Intel will roll Atom to 22nm in 2013 and 14nm in 2014, all while in 2012 TSMC is at best at 28nm and GLOBALFOUNDRIES and Samsung are at 32nm.

I define success as the ability to reach a relevant level of profitable business that supports the desired brand goals. For Intel, this doesn’t need to be 80% like they have in the PC market, but needs to be a profitable 20%.

What this Means for Intel, Qualcomm, Texas Instruments, and NVIDIA

Over a period of three years, Intel will start to take market share from Qualcomm, Texas Instruments, and NVIDIA, albeit very little in 2012. As Intel integrates wireless, moves to 14nm, and waterfalls their offerings to lower price-point smartphones, they become much more competitive to handset makers and carriers. I expect Huawei, ZTE, or a major carrier to go big with Intel in 2013, which will make a huge difference. One thing to remember about Intel: unlike others in the marketplace, Intel also captures the manufacturing margin TSMC and GLOBALFOUNDRIES make and the design margin ARM earns. While Intel has a long way to go in proving themselves, they have a start they never had before, at a time when they can take advantage of the mammoth growth in smartphones. Never count Intel out of any market, no matter how many times they have tried and failed.

What Intel Must Demonstrate in Smartphones (and soon)

Intel made a big splash at CES 2012 with the announcement that Motorola and Lenovo committed to Intel’s Medfield smartphone solution. This came on the heels of a disappointing break-up between Intel and Nokia as well as a lack of previous traction with LG. While Intel has come farther than ever before with one of their X86 SOCs, they still have a long way to go to claim smartphone victory. Of course Intel knows this and is working diligently, sparing no expense. The biggest challenge Intel faces is attacking a market where the incumbent ARM ecosystem partners Qualcomm, NVIDIA, and Texas Instruments have almost 100% market share. To start gaining share in smartphones, Intel must demonstrate many things in the near future.

More Design Wins with Key Players

The Motorola announcement was impressive in that Moto has a respected name in smartphones, but they won’t carry Intel far alone. Lenovo is an even smaller player; while very successful in PCs, they haven’t been able to secure much smartphone market share even in their home country, China. Intel knows they need a few more partners to start chipping away at market share, and I expect them to announce at least one at this year’s Mobile World Congress.

One of the challenges is that many of the top players are already locked in one way or another, have negative history with Intel, or have rapidly declining share. Apple already has their own A-Series SOC, Samsung has the Exynos SOC, and Nokia rebuffed Intel last year and is clearly locked into ARM and Microsoft for the time being. RIM as a partner is a shaky proposition, and HTC is an aggressive player but has recently been dropping share. That leaves lower smartphone market share holders LG, Sony, Sharp, NEC, and ZTE in the short term.

Longer term, I don’t expect Apple or Samsung to get out of the SOC business because they have been successful with their own strategies. I cannot see Nokia or Microsoft motivated to drive a change or provide dual support for X86 until Windows 9. RIM is in a free-fall with no bottom in sight. Intel is forced to take the long-term approach, as they are with Lenovo, by developing smaller smartphone players into larger ones. ZTE certainly is a good long-term prospect, as is Huawei. If Intel can leverage their PC franchise with them, I could see them being successful.

Relevant, Differentiated, and Demonstrable Usage Models

In fighting any incumbent, the new entrant must provide something well above and beyond what the incumbent offers to incent a change in behavior. I am assuming that Intel won’t lead on low price or lowest development cost, so they must offer handset makers or carriers a way to make more money, or get consumers to demand an Intel-based smartphone. Regardless of which variable Intel wants to push, they must devise relevant, differentiated, and demonstrable usage models that ARM cannot match.

By relevant, I mean that it must fix a known pain point or create a real “wow” feature consumers never asked for but is so cool it cannot be passed up. One pain-point example is battery life. Battery life is simply not good enough on smartphones that are used many times daily. If this weren’t true, car chargers and battery packs wouldn’t be so popular. Wireless display is useful and cool but not differentiated, in that Apple can enable this via AirPlay. Demonstrable means that it must be demonstrable in the store, in an ad, or online. If something isn’t demonstrable, it may as well not exist.

I would like to see Intel invest heavily in modularity, the ability to turn the smartphone into a PC through wireless display and wireless input. Yes, this is dangerous short term in that if Intel does a great job at it, they could eat into their PC processor franchise. But this is the innovator’s dilemma, and a leader must sacrifice something today to get something tomorrow. I could envision an Intel-based emerging-region smartphone that enables PC functionality. ARM cannot offer this well today but will be able to in the future with A15 and beyond-based silicon. Intel should jump on the modularity opportunity while it lasts.

One other opportunity here is for Intel to leverage their end-to-end experience from the X86-based Intel smartphone to the X86-based data center. If Intel can demonstrate something incredible in the end-to-end experience with something like security or a super-fast virtualized desktop, this could be incredibly impactful. One thing that will be with us for at least another 5 years is bandwidth limitation.

Carrier Excitement

Outside of Apple, the carriers are the gatekeepers. Consumers must go through them to get the wireless plans, the phones, and most importantly, the wireless subsidy. Apple’s market entry strategy with AT&T on the iPhone was a strategic masterpiece in how to get into a market and change the rules over time. Apple drove so much consumer demand for iPhones that the carriers were begging Apple to carry the iPhone, the exact opposite of the previous decade.

Intel must get carriers excited about the new usage models, bring them a new stream of revenue they feel they are being cut out of, or lower their costs. Intel doesn’t bring them revenue from the content side, but I can imagine Intel enabling telcos to get a piece of the classic retailer’s PC action once “family plans” become a reality. While telco-distributed PCs weren’t a big success in the past, this was due primarily to the absence of family data plans. I can also imagine Intel helping telcos lower the costs of their massive data centers with Xeon-based servers. Finally, if Intel could relieve traffic on the already oversold “wire” by moving processing from the cloud onto their SOCs, this would be very good in a bandwidth-constrained environment.

Competitive Handset Power

At CES, Intel showed some very impressive battery life figures for Medfield handsets:

• 6 hour HD video playback

• 5 hours 3G browsing

• 45 hour audio playback

• 8 hour 3G talk time

• 14 day standby

This was measured on Intel’s own reference platform, which is only somewhat representative of how OEMs’ handsets will perform. What will be very telling is how Medfield performs in a Tier 1 handset maker’s device when Motorola launches in Q3 2012. There is no reason to think the Moto handset won’t get similarly impressive battery life figures, but Intel could gain even more credibility by releasing those figures as they become available.

When Will We Know When/If Intel’s Smartphone Effort is a Success?

Intel has slowly but surely made inroads into the smartphone market. Medfield is impressive, but competing with and taking share from an incumbent with 99%+ market share is a daunting task. The easy way to measure Intel’s progress is by market share alone, but that’s lazy. I believe Intel’s smartphone efforts should first be measured by carrier alliances, the number of handset wins, handset quality, and the new end usage models their SOCs and software can enable. Only as these efforts lead to potential share gains does it make sense to start measuring and scrutinizing share.

The ARM Wrestle Match

I have an unhealthy fascination with semiconductors. I am not an engineer, nor do I know much about quantum physics, but I still love semiconductors. Perhaps it’s because I started my career drawing chip diagrams at Cypress Semiconductor.

I genuinely enjoy digging into architecture differences and exploring how different semiconductor companies look to innovate and tackle our computing problems of the future.

This is probably why I am so deeply interested in the coming processor architecture war between X86 and ARM. For the time being, however, there is a battle among several ARM vendors that I find interesting.

Qualcomm and Nvidia, at this point in time, have two of the leading solutions inside most cutting-edge non-Apple smartphones and tablets.

Both companies are keeping a healthy pace of innovation looking to bring next generation computing processors to the mass market.

What is interesting to me is how both companies are looking to bring maximum performance to their designs without sacrificing low-power efficiency, using two completely different approaches.

One problem in particular I want to explore is how each chipset handles tasks that are computationally complex (like playing a game or transcoding a video) versus ones that are less complex (like using Twitter or Facebook). Performing computationally complex functions generally requires a great deal of processing power and drains battery life quickly.

Not all computing tasks are computationally complex, however. Therefore, the winning chipset will be the one that has a great deal of performance but can also deliver that performance with very low power draw. Both Nvidia and Qualcomm license the ARM architecture, which for the time being is the high-performance, low-power leader.

Nvidia’s Tegra 3
With their next chipset, Tegra 3, Nvidia will be first to market with a quad-core chipset. Tegra 3 actually has five cores: the primary four cores are used for computationally complex functions, while the fifth core handles tasks that do not require a tremendous amount of processing power.

Nvidia’s terminology for this solution is Variable SMP (symmetric multiprocessing). What makes it interesting is that it takes a strategic, task-based approach to utilizing the four cores. For example, when playing a multimedia-rich game, all four cores can be utilized as needed, yet when loading a media-rich web page, two cores may be sufficient rather than all four. Tegra 3 manages core usage, based on the task and the amount of compute power needed, to deliver the appropriate amount of performance for the task at hand.

Tegra 3’s four main cores run at up to 1.4GHz in single-core mode and 1.3GHz when more than one core is active. The fifth core runs at 0.5GHz and is used for things like background tasks, active standby, and playing video or music, all things that do not require much performance. Because it runs at only 0.5GHz, this fifth core requires very little power and will cover many of the “normal” usage tasks of most consumers.

This strategic managing of cores is what makes Tegra 3 interesting. The cores that run at up to 1.4GHz can turn off completely when not needed, so Tegra 3 delivers performance when you need it but reserves the four main cores for computationally complex tasks, which in essence saves battery life. Nvidia’s approach is clever and essentially gives you a low-power single-core computer and a quad-core performance computer at the same time.
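The scheduling idea can be sketched in Python. The thresholds and the mapping from load to core count below are illustrative assumptions of mine, not NVIDIA’s actual firmware logic, which is not public:

```python
# Illustrative sketch of Variable SMP-style core selection.
# The threshold and core counts are invented for illustration.

COMPANION_MAX_LOAD = 0.15   # the fifth "companion" core handles light work
MAIN_CORES = 4

def select_cores(load: float) -> dict:
    """Map a normalized load (0.0-1.0) to an active-core configuration."""
    if load <= COMPANION_MAX_LOAD:
        # Background tasks, standby, music playback: companion core only.
        return {"companion": True, "main_cores": 0}
    # Heavier work migrates to the main cores; enable only as many as needed,
    # leaving the rest fully powered off.
    needed = min(MAIN_CORES, max(1, round(load * MAIN_CORES)))
    return {"companion": False, "main_cores": needed}

print(select_cores(0.05))   # light load: companion core only
print(select_cores(0.45))   # web page load: a couple of main cores
print(select_cores(1.0))    # game: all four main cores
```

The point of the sketch is the shape of the policy: very light loads never wake the power-hungry cores at all, and the main cores come online only in proportion to demand.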

Qualcomm’s S4 Chipset
Qualcomm, with their Snapdragon chipset, takes a different approach to the high-performance yet low-power goal. There are two parts of Qualcomm’s S4 Snapdragon chipsets that interest me.

The first is that the S4 chipset from Qualcomm will be the first out the door on the latest ARM architecture, the Cortex A15. There are many advantages to this new architecture, namely that it is built on the new 28nm process technology, which provides inherent advantages in frequency scaling, power consumption, and chipset size.

The second is that Qualcomm uses a proprietary technique in their chipsets called asynchronous symmetric multiprocessing, or aSMP. The advantage of aSMP is that each core’s frequency can span a range of performance rather than being static at one frequency. In the case of the S4, each core has a range of 1.5GHz to 2.5GHz and can scale up and down the frequency ladder based on the task at hand.

Qualcomm’s approach to frequency scaling, built into each core, allows every core to operate at a different frequency, giving a wide range of performance and power efficiency. For tasks that do not require much performance, like opening a document or playing a simple video, a core will run at the minimum performance level and thus be power efficient, while for a task like playing a game, a core can run at a higher frequency, delivering maximum performance.

This approach of intelligently managing each core and scaling its frequency depending on the task, independent of the other cores, is an innovative way to simultaneously deliver performance and consume less power.
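A per-core frequency governor of this kind can be sketched as follows. The 1.5GHz–2.5GHz range matches the figures above, but the linear load-to-frequency mapping is my own invented illustration, not Qualcomm’s implementation:

```python
# Illustrative per-core DVFS (dynamic voltage/frequency scaling) sketch.
# Each core scales its own frequency independently of its siblings,
# which is the essence of asynchronous SMP (aSMP).

MIN_FREQ_GHZ = 1.5
MAX_FREQ_GHZ = 2.5

def core_frequency(load: float) -> float:
    """Pick a frequency for one core based on that core's own load (0.0-1.0)."""
    load = min(max(load, 0.0), 1.0)
    return round(MIN_FREQ_GHZ + load * (MAX_FREQ_GHZ - MIN_FREQ_GHZ), 2)

# Each core is governed independently: one can idle at the bottom of the
# ladder while another runs a game thread at the top.
core_loads = [0.0, 0.2, 1.0]
print([core_frequency(load) for load in core_loads])
```

Contrast this with synchronous designs, where all active cores share one frequency: in aSMP a lightly loaded core is never dragged up to a hot frequency just because a sibling is busy.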

I chose to highlight Nvidia and Qualcomm in this analysis not to suggest that other silicon vendors are not doing interesting things. Quite the contrary: TI, Apple, Marvell, Broadcom, Samsung, and others certainly are innovating as well. I chose Qualcomm and Nvidia simply because I am hearing that they are getting the majority of vendor design wins.

The Role of Software in Battery Management
Although the processor plays a key role in managing the overall power and performance of a piece of hardware, the software also plays a critical role.

Software, like the processor, needs to be tuned and optimized for maximum efficiency. If software is not well optimized, it can create significant power drains and result in less-than-stellar battery life.

This is the opportunity and the challenge staring everyone who makes mobile devices in the face. Making key decisions on using the right silicon along with effectively optimizing the software both in terms of the OS and the apps is central going forward.

I am hoping that when it comes to software both Google and Microsoft are diligently working on making their next generation operating systems intelligent enough to take advantage of the ARM multi-core innovations from companies like Qualcomm and Nvidia.

These new ARM chipset designs, combined with software that can intelligently take advantage of them, are a key element in solving our battery life problem. For too long, we consumers have had an unhealthy addiction to power cords. I hope this changes in the years to come.

Windows 8 Desktop on ARM Decision Driven by Phones and Consoles

There has been a lot written about the possibility of Microsoft not supporting the Windows 8 Desktop environment on the ARM architecture. If true, this could impact Microsoft, ARM, and ARM’s licensees; Texas Instruments, NVIDIA, and Qualcomm are in the best position to challenge the high end of the ARM stack and are publicly supported by Microsoft.  One question that hasn’t been explored is why Microsoft would even consider something like this. It’s actually quite simple and makes a lot of sense given the position they’re in; it’s all about risk-return and the future of phones and living room consoles.

The Threat to Microsoft

The real short and mid-term threat isn’t from Macs stealing significant Windows share from Microsoft; it’s all about the Apple iPad and iOS.  It could also be a little about Android, but so far, Android has only seen tablet success in platforms that pose little risk to the PC, like the Amazon Kindle Fire.  Market-wise, the short-term threat is about consumers, too, not business.  Businesses work in terms of years, not months. The reality is that while, long term, the phone could disrupt the business PC, short term it won’t impact where Microsoft makes their profits today. Businesses won’t buy three devices for their employees in the short term, and therefore tablets will most likely get squeezed there.  Business employees first need a PC, then a smartphone, and maybe a few need a tablet.  There could be exceptions, of course, primarily in verticals like healthcare, retail, and transportation.

What About Convertibles?

One wild card is business convertibles.  Windows 8 has the best chance here, given Microsoft’s ownership of business, if you assume Intel or AMD can deliver custom SOCs with low enough power envelopes, thermal solutions, and proper packaging for thin designs.  The thinking here is that if business wants a convertible, they’ll also want the Windows 8 Desktop and, more than likely, backward compatibility, something only X86 can provide.  So, net-net, Microsoft is covered here if Intel and AMD can deliver.

Focus is Consumer and Metro Apps

So the focus for Microsoft then is clearly consumer tablets, and Microsoft needs a ton of developers writing high quality Metro apps to compete in the space.  Metro is clearly the primary Windows 8 tablet interface and Desktop is secondary, as it’s an app.  Developers don’t have money or time to burn, so most likely they will have to choose between writing a Metro app or rewriting or recompiling their desktop app to work on both ARM and X86 (Intel and AMD). It’s not just about development; it’s expensive for devs to test and validate, too.  In many cases it’s more expensive to test and validate than it is to actually develop the app.  Strategically, it could then make sense for Microsoft to push development of Metro apps, and eliminating the Desktop-on-ARM option would make the dev’s decision easier.

Strategically, It’s About Phones and the Living Room in the End

Windows 8, Windows Phone 7, and XBOX development environments are currently related but not identical.  I expect that down the road we will see an environment where, for most apps that don’t need to closely touch the hardware, you write once and deploy onto a Microsoft phone, tablet, PC, and XBOX.  The unifier here is Metro, so getting developers on Metro is vitally important.

If Microsoft can improve the chances that developers swarm to Metro by limiting variables, say by taking the risk of eliminating ARM Desktop support, it makes perfect sense.

Gaming AMD’s 2012 Strategy

AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.

There’s an awful lot of misguided analysis wafting about regarding AMD’s new strategic direction, which the company says it will make public in February. This piece is to help you (and me) sort through the facts and the opportunities. I last took a look at AMD’s strategies earlier this year, available here.

Starting With the Facts

  • AMD has been a fabless semiconductor company since 2009. The company depends on GlobalFoundries and soon Taiwan Semiconductor to actually fabricate its chips;
  • In its latest quarter, AMD had net income of about $100 million on $1.7 billion in revenue. Subsequently, the company announced a restructuring that seeks to cut costs by $118 million in 2012, largely through a reduction in force of about ten percent;
  • AMD has about a 20% market share in the PC market, which Intel says is growing north of 20% this year, largely in emerging markets;
  • AMD’s products compete most successfully against rival Intel in the low- to mid-range PC categories, but its 2011 PC processors have underwhelmed reviewers, especially in performance compared to comparable Intel products;
  • AMD has less than a 10% market share in the server market of about 250,000 units, which grew 7.6% last quarter according to Gartner Group;
  • AMD’s graphics division competes with nVidia in the discrete graphics chip business, which is growing in profitable commercial applications like high-performance supercomputing and declining in the core PC business as Intel’s integrated graphics is now “good enough” for mainstream buyers;
  • AMD has no significant expertise in phone and tablet chip design, especially the multi-function “systems on a chip” (SOCs) that make up all of today’s hot sellers.

What Will AMD CEO Rory Read’s Strategy Be?

I have no insider information and no crystal ball. But my eyebrows were seriously raised this morning to see several headlines such as “AMD to give up competing with Intel on X86“, which led to “AMD struggling to reinvent itself” in the hometown Mercury News. I will stipulate that AMD is indeed struggling to reinvent itself, as the public process has taken most of 2011. The board of directors itself seems unclear on direction. That said, here is my scorecard on reinvention opportunities, in descending order of attractiveness:

  1. Servers —  For not much more work than a desktop high-end Bulldozer microprocessor, AMD makes Opteron 6100 server processors. Hundreds or thousands more revenue dollars per chip at correspondingly higher margins. AMD has a tiny market share, but keeps a foot in the door at the major server OEMs. The company has been late and underdelivered to its OEMs recently. But the problem is execution, not computer science.
  2. Desktop and Notebook PCs — AMD is in this market and the volumes are huge. AMD needs volume to amortize its R&D and fab preparation costs for each generation of products. Twenty percent of a 400 million chip 2011 market is 80 million units! While faster, more competitive chips would help gain market share from Intel, AMD has to execute profitably in the PC space to survive. I see no role for AMD that does not include PCs — unless we are talking about a much smaller, specialized AMD.
  3. Graphics Processors (GPUs) — ATI products are neck-and-neck with nVidia in the discrete graphics card space. But nVidia has done a great job of late creating a high-performance computing market that consumes tens of thousands of commercial-grade (i.e., high-price) graphics cards. Intel is about to jump into the HPC space with Knights Corner, a many-X86-core chip. Meanwhile, AMD needs the graphics talent onboard to drive innovation in its Fusion processors that marry a processor and graphics on one chip. So, I don’t see an AMD without a graphics component, but neither do I see huge profit pools.
  4. Getting Out of the X86 Business — If you’re reading along and thinking you might short AMD stock, this is the reason not to: the only legally sanctioned software-compatible competition to X86 inventor Intel. If AMD decides to get out of making X86 chips, it better have a sound strategy in mind and the ability to execute. But be assured that the investment bankers and hedge funds would be flailing elbows to buy the piece of AMD that allows them to mint, er, process X86 chips. So, I describe this option as “sell off the family jewels”, and am not enthralled with the prospects for success in using those funds to generate $6.8 billion in profitable revenue or better to replace today’s X86 business.
  5. Entering the ARM Smartphone and Tablet Market — A sure path to Chapter 11. Remember, AMD no longer makes the chips it designs, so it lacks any fab margin to use elsewhere in the business. It starts against well-experienced ARM processor designers including Apple, Qualcomm, Samsung, and TI … and even nVidia. Most ARM licensees take an off-the-shelf design from ARM that is tweaked and married to input-output to create an SOC design, which then competes for space at one of the handful of global fab companies. AMD has absolutely no special sauce to win in the ARM SOC kitchen. To win, AMD would have to execute flawlessly in its maiden start (see execution problems above), gain credibility, nail down 100+ design wins for its second generation, and outrace the largest and most experienced companies in the digital consumer products arena. Oh, and don’t forget volume, profitability, and especially cash flow. It can’t be done. Or if it can be done, the risks are at heart-attack levels.

AMD says it “intends to pursue ‘growth opportunities’ in low-powered devices, emerging markets and Internet-based businesses.” One way to read that ambiguous sentence is a strategy that includes:

  • Tablets and netbooks running X86 Windows 8;
  • Emerging geographic markets, chasing Intel for the next billion Internet users in places like Brazil, China, and even Africa. Here, AMD’s traditional value play resonates;
  • Internet-based businesses such as lots of profitable servers in the cloud. Tier 4 datacenters for Amazon, Apple, Facebook, Google, and Microsoft are a small but off-the-charts growing market.

So, let’s get together in February and see how the strategy chips fall. Or post a comment on your game plan for AMD.

Quad Core Smartphones: What it Will Take to Become Relevant

There has been a lot of industry discussion on multi-core smartphones in the past year, and the dialog has increased with NVIDIA’s launch of Tegra 3, a quad core SOC targeted at phones and tablets. The big question lingering over all of these implementations, particularly with phones, is: what will end users actually do with all those general purpose compute units, and will they provide significant incremental benefit? In the end, it’s all about an improved experience that’s relevant, unique, demonstrable, and easily marketable.

Multi-Core Background

Before we talk usage models, we first have to get grounded on some of the technology basics. Whether it’s a multi-core server, PC, tablet, or phone, several things must exist to fully take advantage of more than one general purpose computing core in any platform:

  • an operating system that efficiently supports multiple cores, multitasking across cores, and multi-threaded apps
  • applications that efficiently take advantage of multiple cores
  • intelligent energy efficiency tradeoffs

Once those elements are in place, you have an environment where multiple cores can be leveraged. The next step is to optimize the platform for energy efficiency. All of the hardware and software platform elements, even down to the transistors, must be optimized for low power when you need it and high performance when you need it. Tegra 3 utilizes a fifth core, which NVIDIA says engages when an extremely low-power state is required.

Assuming all the criteria above are met, then it comes down to what an end user can actually do with a phone with four cores.
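To make the “applications that efficiently take advantage of multiple cores” requirement concrete, here is a minimal, hypothetical Python sketch (not tied to any phone OS) showing the difference between running CPU-bound work serially and spreading the same work across a pool of cores. The `burn` function and the workload sizes are illustrative assumptions only.

```python
import multiprocessing as mp

def burn(n):
    # CPU-bound work: sum of squares from 0 to n-1
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work = [2_000_000] * 4

    # Serial: one core grinds through all four jobs in turn
    serial = [burn(n) for n in work]

    # Parallel: a process pool spreads the four jobs across cores;
    # this is the kind of structure an app needs to benefit from quad core
    with mp.Pool(processes=4) as pool:
        parallel = pool.map(burn, work)

    assert serial == parallel  # same results, potentially ~4x faster wall time
```

The point of the sketch is that the speedup only appears when the application is explicitly structured to divide its work; a single-threaded app sees no benefit from extra cores no matter how many ship in the SOC.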

Modularity Could Be the Key

Quad core phones could potentially add value in “modular” usage environments. While there have been a lot of attempts at driving widespread modularity, most haven’t been a big hit. I personally participated on the Device Bay Consortium when I was at Compaq, along with Intel and Microsoft. It didn’t end up materializing into anything, but the concept at the time from an end user perspective was solid.

Today and beyond, smartphone modularity is quite different from Device Bay’s “modules”. The smartphone concept is simple: use a high-powered smartphone that can then extend to different physical environments, spanning entertainment to productivity. Here are just a few examples of modularity in use today:

These are all forms of today’s modularity with different levels of interest, penetration, and adoption.

So what could quad core potentially add to the mix? Here are some potential improved usages:

  • Modular video and photo editing. These apps have historically been multithreaded and could leverage a clamshell “dock” similar to the Lapdock or Multimedia Dock.
  • Modular multi-tab web browsing. Active browser tabs require a lot of performance and overhead; just open a handful of tabs in desktop Chrome and check your performance monitor. iOS 5 actually halts a tab when you move to another one, forcing the user to reload it on return.
  • Modular games that heavily utilize a general purpose processor. The caveat here is that most games lean on the GPU far more than the general purpose CPU. It all depends on how the game is written, the extent of AI use, UI complexity, where physics is done, and how the resources are programmed.
  • Modular natural user interface. While plugged in and “docked” at the desk or living room, the smartphone could power interfaces like improved voice control and “air” gestures. This may sound like science fiction, but the XBOX 360 is doing it today with Kinect.
  • Multitasking: Given enough memory and memory bandwidth, more cores typically means better multitasking.
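The photo-editing bullet above is the clearest case of embarrassingly parallel work: each region of an image can be filtered independently. Here is a rough, hypothetical sketch of that idea (the `brighten_row` filter and values are invented for illustration, not taken from any shipping app):

```python
from concurrent.futures import ProcessPoolExecutor

def brighten_row(row):
    # Each row of pixel values can be processed independently,
    # which is why image filters parallelize well across cores.
    return [min(255, p + 40) for p in row]

def brighten(image, workers=4):
    # Map rows of the image across a pool of worker processes,
    # one per available core.
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(brighten_row, image))

if __name__ == "__main__":
    image = [[0, 100, 250], [10, 200, 255]]
    print(brighten(image))
```

Because rows don’t depend on one another, doubling the core count roughly doubles throughput on large images, which is exactly the kind of visible, benchmarkable gain a docked quad core phone would need to demonstrate.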

Will It Be Relevant?

Many things need to materialize before anyone can deem a quad core smartphone a good idea or just a marketing idea for advanced users. First, smartphones actually need to ship with quad cores and a modular-capable OS. The HTC Edge is rumored to be the first. Then apps and usage models outlined above need to be tested by users and with benchmarks. Users will have to first “get” the modularity concept and notice an experiential difference. Moving from standard phone to modular experience must be seamless, something that Android 4.0 has the potential to deliver. Finally, some segments of users like enthusiasts will need to see the benchmarks to be swayed to pay more over a dual core phone.

There is a lot of proving to do on quad core smartphones before relevance can be established with any market segment beyond enthusiasts. Enthusiasts will always want the biggest and baddest spec phone on the block, but marketing to other segments, even if the product provides an improved experience, will be a challenge.

The Asus PadFone is a Glimpse of the Future

As a part of my work as an industry analyst I do a great deal of thinking about the future. Many of the projects we get pulled into and asked to analyze relate to the distant, not the near, future. This happens to be one of the things I love most about my job: thinking about the future and imagining what the world of technology will be like five years out.

Pat Moorhead wrote an article yesterday highlighting Why Convertible PC’s Are About To Get Very Popular. I agree these product designs have a place in the market, and we will likely see a good deal of hardware experimentation through 2013. However, I think another product idea may have much longer staying power.

Without going into detail on things I can’t discuss, I want to use the Asus PadFone as an example of a future I think is highly possible. This is a future where the smart phone is the center of our personal connected ecosystem and, in essence, becomes the brains that power all the other screens in our lives.

We talk a great deal about the “smart screens” that will invade consumers’ lives and homes. Although it certainly looks like we are heading in this direction, I sometimes ask: if the smartest screen is in our pocket, why couldn’t that device power the others? That would eliminate the need to have a high performance CPU in every screen.

The Asus PadFone is an example of this concept. In Asus’ solution the smart phone is the most important device in the ecosystem because it is the device with the brains: the smart phone has the CPU, the OS, and the software. In the PadFone solution the smart phone slips into the tablet, giving you a two-in-one solution.

The Motorola Atrix 4G employs a similar idea where the Atrix can be docked with a laptop shell. The laptop shell simply has a battery and a screen and the Atrix provides the rest of the intelligence needed to have a full laptop.

Both of these designs highlight something that I think gives us a glimpse of how our future connected gadgetry may come together. The biggest indicator for this future reality is the trajectory every major semiconductor company is heading in. Namely very small multi-CPU cores performing at very low power consumption levels.

We can envision a future where we could have an eight core processor in our mobile phones. An eight core mobile chipset would be more than adequate to power every potential smart screen we can dream up. In this model you would simply dock your phone into every screen size possible in order to make every screen you own “smart.” Docking your phone to your TV would create a “smart TV,” for example. Docking your phone with your car would create a “smart car.” You could also purchase laptop docks, desktop docks, tablet docks, smart mirror docks, smart refrigerator docks, etc.

What’s also interesting about this model is that your phone could also power devices that don’t have screens. In this scenario you would be able to use your smart phone to interact with screenless appliances like the washer, dryer, and coffee pot. We call these specific interactions “micro-experiences”: using your phone to have experiences with non-screen appliances.

It is obviously way too early to say when, or if, the market could adopt a solution like this. Nonetheless, it is an interesting future to think about.