Drones are the next revolution, the next insanely great thing, the pirate, the multi-billion dollar business, the integration of the physical and the digital, the device that will fight our wars, provide web access to the poor, deliver our pizzas in way under 30 minutes, ensure the air is safe, expose dictators, and turn us all into Hollywood-style directors, even if just for some grand selfie.
I don’t make, I write. If I made, I would make drones.
If I were that guy in The Graduate, my one word would be: “Drones”.
If I were the next Steve Jobs, I would dream of drones. If I were the next Bill Gates, I would envision software empowering drones built on every kitchen table.
You know what’s going to power the DeLorean back to the future? Drones.
Not since the launch of the iPhone and possibly not since I first used Mosaic have I felt about a technology as I do about drones. The market for drones is expected to reach $91 billion by 2020. I think this radically understates their impact, even considering the current muddled legal environment.
Drones are the next ‘stack’ of the global internet, and will radically re-make our perception of location, privacy and commerce. They are as if the PC and the Internet launched together. In 1988.
Not surprisingly, everyone wants in on the action.
- Mark Zuckerberg is funding efforts so drones can “beam internet to people from the sky.”
- The Defense Advanced Research Projects Agency (DARPA) wants to re-tool aircraft to serve as a “flying fortress” filled with drones able to carry out all manner of missions in any region of the planet.
- Amazon is “doubling down” on drones for delivery.
- Skycatch is already building a sort of Uber for drones, linking drone “pilots” and makers with those who need drone-based services.
Despite all this, it is hobbyists who are advancing drone development even more than government or business.
There is a thriving community of drone builders and enthusiasts at OpenPilot.org, which has created an open source platform for drones. The nonprofit OpenPilot hopes to make drone technology more affordable, more accessible — and optimized for improving humanity’s lot.
DIY Drones claims to be the world’s largest community for drone hobbyists. DIY Drones was also instrumental in the development of the Dronecode Project, which aims to “bring together existing open source drone projects and assets under a nonprofit structure governed by The Linux Foundation”. Drones just had their Tim Berners-Lee moment.
Yes, the rules for drone use in the US are in flux and clearly lagging the technology.
“After years of waiting, a Federal Aviation Administration (FAA) official said the agency was close to releasing a ruling that would give commercial entities greater access to fly small unmanned aerial systems in the domestic airspace.”
It’s not just the FAA. The Office of Management and Budget is also involved. Then there’s the FCC and the Government Accountability Office. All are working to enact Congress’ 2012 “FAA Modernization and Reform Act,” which is meant to bring a clearer legal framework for the commercial operation of drones (unmanned vehicles weighing less than 55 pounds). In addition, several states and cities have enacted their own rules. Businesses don’t know what to do, other than do nothing or operate in secret.
For hobbyists, the rules are essentially these: drones must remain within line of sight, away from airports, and below 400 feet.
Don’t fear, I know a secret: This will all get taken care of — because, just as with PCs and the Internet, the spread of drones cannot be stopped.
It’s a drone world after all… and the best is yet to come.
The FAA expects more than 30,000 drones in commercial use by 2020. These will be used by law enforcement, military, logistics companies, businesses, and tech giants. The potential, however, is limitless. Witness: the nonprofit Drone Adventures sends drones to impoverished areas of the world, assessing air quality and agricultural impact and supporting conservation and archaeological efforts.
How is all this possible? Smartphones.
Smartphone-optimized technologies, including GPS, accelerometers, gyroscopes, mobile cameras, a litany of sensors, mobile battery power, lenses and more, have all become widely available, shockingly affordable — and are transferable to the drone industry.
Then there’s the rapid drop in price. The new Lumia 535 is available for $137 — inclusive. Only a few years ago, such a price for so much technology was unthinkable. A similar phenomenon is happening in the drone industry. Consider what you can get now for the price of an off-contract iPhone 6: a Phantom that can fly at 22 mph and reach an altitude of 1,000 feet. GoPro optional.
You were not part of the original Homebrew Computer Club. You’ve just been given a second chance. Nowhere to go but up.
This post originally appeared at MattRichman.net and was re-posted at Tech.pinions with Matt’s permission.
When Steve Jobs announced the transition from PowerPC to Intel processors in 2005, he revealed something that, in hindsight, seemed obvious to everyone who didn’t anticipate the switch:
There are two major challenges in this transition. The first one is making Mac OS X sing on Intel processors. Now, I have something to tell you today: Mac OS X has been leading a secret double life for the past five years.
We’ve had teams doing the “just in case” scenario. And our rules have been that our designs for OS X must be processor independent, and that every project must be built for both the PowerPC and Intel processors. And so today, for the first time, I can confirm the rumors that every release of Mac OS X has been compiled for both PowerPC and Intel. This has been going on for the last five years.
There’s not a doubt in my mind that if you substitute Intel for PowerPC and ARM for Intel, what Steve Jobs said then holds 100 percent true today, word for word. Mac OS X designs must be processor independent, every project must be built for both Intel and ARM processors, and each Mac OS X release in the last five years has been compiled for both Intel and ARM.
Somewhere on Apple’s campus, ARM-based Macs are already running OS X.
User Experience Would Improve
In his iPhone 5S review, Anand Shimpi compared the Apple-designed A7 processor with Intel’s fastest tablet chip at the time.
In September of 2013, the world’s preeminent independent processor expert compared Apple’s latest iPhone chip with Intel’s fastest tablet chip and concluded that the two perform similarly — even though the Intel chip draws more power, contains four cores versus the A7’s two, and is produced with a more advanced manufacturing technique. If Apple’s chip design team can create a phone processor that performs on par with Intel’s fastest tablet chip, the company’s “highest priority”, then there’s no reason to believe that the same team at Apple can’t design chips powerful enough for any Mac in the company’s lineup.
Apple has already released a line of A-series chips tailored specifically for iOS devices, and the company is most definitely working on a line of B-series chips tailored specifically for Macs. When that B-series chip — or set of B-series chips that runs in parallel — is ready, Apple will be able to switch to ARM-based Macs without sacrificing user experience. On the contrary, because the company is no doubt designing its line of B-series chips in tandem with Mac OS X, there would be iPhone-like hardware-software optimization, improving user experience.
Apple Would Make More Money Per Mac And Sell More Macs
Going from chip concept to manufactured product can be broken down into two separate and distinct steps. The first is chip design — figuring out what features the processor will have and how it will work. The second is manufacturing — turning a file that exists on a screen into a physical product you can hold in your hand.
Today, Intel designs the chips in Macs and manufactures them, profiting on both of those steps. But if Apple swapped out Intel’s chips for its own ARM-based designs, an external company would profit on only one step of the chip creation process, not both, leading to a decrease in the cost of building a Mac. By my conservative estimate, Apple would be able to drop the price of the base model 11- and 13″ MacBook Airs by $50 and still make more profit per unit on each than it currently does.
This cost savings would apply to the entire Mac lineup. Apple would be able to drop prices across the board and make more money per Mac than it does today — and with lower prices, the company would sell more of them, too.
Apple Would Be Able To Create Better Macs
When Apple announced the iPhone 5S, it explained that all of the fingerprint data associated with Touch ID “is encrypted and stored inside the secure enclave in our new A7 chip” where it’s “locked away from everything else”.
Apple wouldn’t have been able to create Touch ID if the iPhone were powered by an Intel chip instead of an Apple-designed one. There wouldn’t have been a “secure enclave” on the iPhone’s processor to store the fingerprint data, nor would there have been perfect hardware-software integration. Apple was able to implement Touch ID because it designed the A7 chip in tandem with the iPhone 5S’s software and the rest of its hardware.
I’d bet that there are features Apple envisions for the Mac that simply can’t be built while Intel designs the chips inside of them. To implement those ideas, Apple would need to switch the Mac to ARM-based processors, because only then would the company have the ability to design chips customized for specific features. If Apple moved the Mac to ARM-based chips, the company would be able to create better products than it can today.
This brings me to something else Steve Jobs said when he announced the transition from PowerPC to Intel. Ultimately, he explained, Apple switched for one simple reason: “We can envision some amazing products we want to build for you, and we don’t know how to build them with the future PowerPC roadmap.”
The same logic applies today. It’s not a stretch to imagine Tim Cook walking out on stage and saying, “We can envision some amazing products we want to build for you, and we don’t know how to build them with Intel’s chips.”
As I first said more than three years ago: ARM-based Macs are definitely coming.
I’ve spent the past week with a Surface Pro 3. I’ve used every previous version of the Surface, and the Surface Pro 3 comes closer than any of them to fulfilling the promise of the 2-in-1 PC. I was very harsh on the original Surface and, while my overall thesis on the 2-in-1 form factor (which we will get to) has not changed, my stance on the Surface itself has softened. Don’t consider this a review of the Surface Pro 3. There are many good reviews of the Surface Pro 3, and serious buyers should read those as well. I’d like to do a more analytical take on the form factor.
Comparing to a Tablet or a Laptop?
The first point we need to address is which computing form factor, laptop or tablet, is the right comparison for the Surface Pro 3. Which type of buyer is the Surface attempting to appeal to? The potential laptop buyer or the potential tablet buyer? Microsoft’s own marketing gives us a clue. They are clearly targeting a customer looking to buy a laptop in the near future.
On that basis, how does it compare to a laptop? Overall, it is a decent notebook. Many features are exceptional, like the extremely high screen resolution of 2160 x 1440. This should be the standard resolution on all medium to premium priced Windows PCs. While I appreciated the screen compared to other Windows PCs I tested, I am spoiled by the 15″ Retina MacBook Pro, my primary notebook, with a resolution of 2880 x 1800. As I used the Surface Pro 3, I had to leave my Mac experience behind and just think of the Surface as a competitor to other similarly priced premium Windows PCs. Two of my favorite premium Windows PCs are the Lenovo Carbon X1 and the Dell XPS 13. Despite what many may believe about the Surface Pro 3, I would consider it to be in that class of premium Windows PC products. Given that a traditional clamshell notebook is the Surface’s competition, we have to evaluate it as a viable notebook competitor.
Overall, I was pleased and impressed with the Surface Pro 3 as a notebook. The size and weight certainly put it into the class of ultra-portables. The Touch Cover keyboard case has been dramatically improved. But one area was a rub for me: what I call “time to on.” Time to on is the time it takes to open my laptop and start working. Every busy person who walks from meeting to meeting knows how valuable it is to sit down, open a notebook, and quickly be ready to start a meeting. Since Microsoft wants to compare the Surface Pro 3 to a MacBook Air, I compared the “time to on” of both machines. For this test, I only looked at the time it took for me to open the notebook and get to a point where the computer was on and usable. For both tests, the MacBook Air and Surface Pro 3 were placed in the exact same position on the desk in front of me. I simply tested how long it took for each machine to be usable from a “sleep state”.

The average “time to on” in five tests with a MacBook Air was 1.63 seconds. That’s the time it took to flip the lid up, let the machine wake, and start moving the mouse. The average “time to on” of the Surface Pro 3 in five tests was 6.68 seconds. That was the time it took to tilt it up off the desk, set it back down, flip the keyboard down, flip out the kickstand, swipe the log-in screen (no passcode set), and start moving the mouse. A 5.05-second difference may not seem like a lot, but compare the “experience” of a near-instant-on notebook to the somewhat clunky process of getting the Surface up and running, and the two seem worlds apart.
The Surface as a Tablet
I’m not going to spend a ton of time on this part, since the consensus seems to be that comparing the Surface to a tablet (iPad) is a losing battle. I actually think Windows 8 is becoming a better large-screen tablet platform than Android, because Microsoft is getting some decent dedicated larger-screen tablet apps where Android is not. However, very few are buying an Android tablet as a PC replacement, and even fewer are buying Android tablets with screen sizes above 9″. One other positive improvement was Microsoft’s addition of a better portrait mode experience in Windows 8.1. Previously, Windows 8 running on tablet form factors was terrible in portrait mode. Some discount this mode, but our observational research shows extremely high amounts of portrait-mode use in many tablet use cases. I’ve argued portrait mode is an important experience for any product attempting to be a tablet. Microsoft finally got portrait mode usable.
Another plus for the Surface was the stylus. Not a feature I see being attractive in pure consumer markets but in vertical enterprise environments like in medical, construction, legal, etc., where notes and pen/paper are still heavily used, I can see the appeal. The stylus was not perfect, but still worked better than any stylus solution I’ve used to date.
The Surface Pro 3 is a bit too large for me in pure slate mode. Most of the time I use my iPad, I am lying down in bed or reclining on the couch or a chair. Most often, I’m also holding the iPad up and not resting it on my body or chest. While the Surface Pro 3 is the thinnest and lightest Surface yet, it still caused discomfort when held for long periods of time. In all honesty, the Surface Pro 3 would be an outstanding tablet if the iPad and the iOS tablet ecosystem did not exist.
Who is the Surface Pro 3 for?
This is the main question. I have no doubt there is a market for the Surface. The Surface Pro 3, while competitive, will be bested experientially by the pure clamshell notebook form factor. However, in the Microsoft ecosystem, given the touch landscape for devices and Windows 8 in general, the Surface Pro 3 is a competitive product among premium Windows notebooks.
While I struggle to see the opportunity for the Surface in pure consumer markets, I do feel Microsoft has improved the hardware so that, for key vertical segments, the Surface Pro 3 is a viable solution: it will suffice as a laptop and adds the perks of tablet mode for those in fields where those features are useful.
In this week’s Tech.pinions Podcast we discussed the 2-in-1 PC at length. I still believe the demand or market pull for this product is limited. That being said, there are plenty of things Microsoft, Intel, and partners can do to extend this category. Given the trends we are seeing in tablets, where new quarterly data signals declining usage in key areas among tablet users, I fear the tablet was not the disruptive force Microsoft and Intel believed it to be. Which means it is reasonable to conclude the entire touch-based desktop/notebook/2-in-1 push was built as a reaction to a threat that didn’t really exist.
Each product has its role, its context, and its value. For some, a pure slate will be a form of entertainment. For some, a laptop/notebook replacement. Still, for others, it is a luxury. And in some enterprise markets it will be a necessity. Intel and Microsoft would love to believe the 2-in-1 form factor is the future of the notebook. This may be the case, and those two companies can certainly force their will on the ecosystem. Intel hopes that this form factor will make up over 70% of the notebook shipments in 2018. But my contention is if that happens, it won’t be because the market demands it.
One of the fundamental characteristics of a mature market is mature consumers. These consumers are mature in the sense that they know what they want and, more importantly, they know why they want it. This kind of maturity can only come with a defined sense of needs, wants, and desires.

That defined sense can only come with experience with a product. Owning multiple generations of a product or category is required to fully understand not just what you want but why you want it. Many consumers know by now whether they value a traditional PC like a desktop or notebook, and they know why. These consumers know they need a PC and have a sense of what they want. Interestingly, with smartphones and tablets, I don’t believe we have fully mature customers yet. ((I’ll dive into this in a future column, but some of the experimentation we are seeing in platform switching demonstrates this nuance of the consumer market.))
The Screens That Rule Our Lives
When the iPad joined our world, we knew it was more than a screen to entertain us. We knew it was a profound new kind of computer. At the same time, a key understanding is that the tablet will not replace the PC. For many, the tablet can and will become a primary computing device, but I doubt a more powerful computer will cease to exist in most consumers’ homes in some form or another. And as important as the tablet is, there are many hundreds of millions of consumers who depend on the traditional PC to make a living. What is interesting about this class of customer is that they need a PC and they know it.
We are fond of saying we are in the post-PC era. What this term simply means is that the PC is no longer the only computer on which we can perform computing tasks. But the metrics by which a PC is valued have changed. One can make a strong argument that there are many consumers who don’t value the PC and instead value the tablet, and that may be true. But for those who need a PC, and know it, value has shifted from processing power to battery life.
Battery Life is the New MHz Race
The raging question throughout the PC industry has been “what is going to get consumers to upgrade their PCs?” The answer is iPad like battery life.
At last week’s WWDC, Apple released new MacBook Airs running Intel’s 4th-generation Core processors. At one point in time, when a company released a new PC, it proudly announced how much processing power the machine had, and the crowd would applaud. At WWDC last week, when Apple discussed the MacBook Air, the crowd did not cheer or applaud at the speed of the processor. Instead, the crowd went wild when Apple announced the new metrics for battery life. The new 11″ MacBook Air now has 9 hours of battery life, and the new 13″ MacBook Air now has 12 hours. We are even learning, after some benchmarking and reviews, that those battery life claims may be conservative. No computer on the market comes close, and I will be interested to see if a battery life competitor to the MacBook Air comes to market this year.
Casually read some of the reviews of the new MacBook Airs and you will see reviewers raving about having more than all-day battery life in a notebook.
Without question, there is a huge opportunity waiting for the PC industry with regard to notebook upgrades. Many consumers and corporate workers are using PCs that are outdated in nearly every major category. Yet it is not the high-definition screens, the touch screens (or lack thereof on Macs), the ultra-thin designs, or the overall look that will give their new owners a profound computing experience — it is the battery life.
Apple has set the bar high with these new battery benchmarks. All PC makers are making progress in this area, and the new processors from Intel and AMD will help move the needle. ((If Windows RT can gain traction, ARM processors can be a solution for even longer battery life.)) One thing I will be watching very closely with the fall lineup is the battery life claims of all the new notebooks. I am convinced this is the feature-of-all-features for the PC industry this year.
Intel’s CEO Paul Otellini is retiring in May 2013. His 40-year career at Intel now ending, it’s a timely opportunity to look at his impact on Intel.
Intel As Otellini Took Over
In September 2004, when it was announced that Paul Otellini would take over as CEO, Intel was #46 on the Fortune 100 list and had ramped production to 1 million Pentium 4s a week (today, over a million processors a day). The year ended with revenues of $34.2 billion. Otellini, who joined Intel with a new MBA in 1974, had 30 years of experience at the company.
The immediate challenges the company faced fell into four areas: technology, growth, competition, and finance:
Technology: Intel processor architecture had pushed more transistors clocking faster, generating more heat. The solution was to use the benefits of Moore’s Law to put more cores on each chip and run them at controllable — and eventually much reduced — voltages.
Growth: The PC market was 80% desktops and 20% notebooks in 2004 with the North America and Europe markets already mature. Intel had chip-making plants (aka fabs) coming online that were scaled to a continuing 20%-plus volume growth rate. Intel needed new markets.
Competition: AMD was ascendant, and a growing menace. As Otellini was taking over, a market research firm reported AMD had over 52% market share at U.S. retail, and Intel had fallen to #2. Clearly, Intel needed to win with better products.
Finance: Revenue in 2004 recovered to beat 2000, the Internet bubble peak. Margins were in the low 50% range — good but inadequate to fund both robust growth and high returns to shareholders.
Where Intel Evolved Under Paul Otellini
Addressing these challenges, Otellini changed the Intel culture, setting higher expectations, and moving in many new directions to take the company and the industry forward. Let’s look at major changes at Intel in the past eight years in the four areas: technology, growth, competition, and finance:
Design for Manufacturing: Intel’s process technology in 2004 was at 90nm. To reliably achieve a new process node and architecture every two years, Intel introduced the Tick-Tock model, where odd years deliver a new architecture and even years deliver a new, smaller process node. The engineering and manufacturing fab teams work together to design microprocessors that can be manufactured in high volume with few defects. Other key accomplishments include High-K Metal Gate transistors at 45nm, 32nm products, 3D tri-gate transistors at 22nm, and a 50% reduction in wafer production time.
Multi-core technology: The multi-core Intel PC was born in 2006 in the Core 2 Duo. Now, Intel uses Intel Architecture (IA) as a technology lever for computing across small and tiny (Atom), average (Core and Xeon), and massive (Phi) workloads. There is a deliberate continuum across computing needs, all supported by a common IA and an industry of IA-compatible software tools and applications.
Performance per Watt: Otellini led Intel’s transformational technology initiative to deliver 10X more power-efficient processors. Lower processor power requirements allow innovative form factors in tablets and notebooks and are a home run in the data center. The power-efficiency initiative comes to maturity with the launch of the fourth generation of Core processors, codenamed Haswell, later this quarter. Power efficiency is critical to growth in mobile, discussed below.
When Otellini took over, the company focused on the chips it made, leaving the rest of the PC business to its ecosystem partners. Recent unit growth in these mature markets comes from a greater focus on a broader range of customers’ computing needs, and from bringing leading technology to market rapidly and consistently. In so doing, the company gained market share in all the PC and data center product categories.
The company shifted marketing emphasis from the mature North America and Europe to emerging geographies, notably the BRIC countries — Brazil, Russia, India, and China. That formula accounted for a significant fraction of revenue growth over the past five years.
Intel’s future growth requires developing new opportunities for microprocessors:
Mobile: The early Atom processors introduced in late 2008 were designed for low-cost netbooks and nettops, not phones and tablets. Mobile was a market where the company had to reorganize, dig in, and catch up. The energy-efficiency work that benefits Haswell, the communications silicon from the 2010 Infineon acquisition, and the forthcoming 14nm process in 2014 will finally allow the company to stand toe-to-toe with competitors Qualcomm, Nvidia, and Samsung using the Atom brand. Mobile is a huge growth opportunity.
Software: The company acquired Wind River Systems, a specialist in real-time software, in 2009, and McAfee in 2010. These added to Intel’s own developer tools business. The software services business accelerates customers’ time to market with new, Intel-based products. The company also stepped up efforts in consumer device software, optimizing operating systems from Google (Android), Microsoft (Windows), and Samsung (Tizen). Why? Consumer devices sell best when an integrated hardware/software/ecosystem like Apple’s iPhone exists.
Intelligent Systems: Specialized Atom systems on a chip (SoCs) with Wind River software and Infineon mobile communications radios are increasingly being designed into medical devices, factory machines, automobiles, and new product categories such as digital signage. While the global “embedded systems” market lacks the pizzazz of mobile, it is well north of $20 billion in size.
AMD today is a considerably reduced competitive threat, and Intel has regained #1 market share in PCs, notebooks, and the data center.
Growth into the mobile markets is opening a new set of competitors which all use the ARM chip architecture. Intel’s first hero products for mobile arrive later this year, and the battle will be on.
Intel has delivered solid, improved financial results to stakeholders under Otellini. With ever more efficient fabs, the company has improved gross margins. Free cash flow supports a dividend above 4%, a $5B stock buyback program, and a multi-year capital expense program targeted at building industry-leading fabs.
The changes in financial results are summarized in the table below, showing the year before Otellini took over as CEO through the end of 2012.
The Paul Otellini Legacy
There will be books written about Paul Otellini and his eight years at the helm of Intel. A leader should be measured by the institution he or she leaves behind. I conclude those books will describe Intel in 2013 as excelling in managed innovation, systematic growth, and shrewd risk-taking:
Managed Innovation: Intel and other tech companies are always innovative. But Intel manages innovation among the best, on a repeatable schedule and with very high quality. That’s uncommon and exceedingly difficult to do with consistency. For example, the Tick-Tock model is a business school case study: churning out ground-breaking transistor technology, processors, and high-quality leading-edge manufacturing at a predictable, steady pace from engineering to volume manufacturing. This repeatable process is Intel’s crown jewel, and a national asset.
Systematic Growth: Under Otellini, Intel made multi-billion dollar investments in each of the mobile, software, and intelligent systems markets. Most of the payback growth will come in the future, and will be worth tens of billions in ROI.
The company looks at the Total Addressable Market (TAM) for digital processors, decides what segments are most profitable now and in the near future, and develops capacity and go-to-market plans to capture top-three market share. TAM models are very common in the tech industry. But Intel is the only company constantly looking at the entire global TAM for processors and related silicon. With an IA computing continuum of products in place, plans to achieve more growth in all segments are realistic.
Shrewd Risk-Taking: The company is investing $35 billion in capital expenses for new chip-making plants and equipment, creating manufacturing flexibility, foundry opportunities, and demonstrating a commitment to keep at the forefront of chip-making technology. By winning the battle for cheaper and faster transistors, Intel ensures itself a large share of a growing pie while keeping competitors playing catch-up.
History and not analysts will grade the legacy of Paul Otellini as CEO at Intel. I am comfortable in predicting he will be well regarded.
After 15 years of making predictions, with a track record that would have made you rich if you’d bet on them, I’ve been away from the practice for a couple of years. But as the regulars at Tech.pinions have agreed to end the year with a set of predictions each, I’m back at the game. My best guesses for 2013:
A Modest Rebound for BlackBerry. Like many others, I was prepared to write off BlackBerry during the last year as its market share cratered. And if Windows Phone 8 had really taken off, or if Android had made a serious play for the enterprise, it would be very hard to see where there might be room in the market for Research In Motion, no matter how promising BlackBerry 10 looks. But I think there is room for at least three players in the business, and right now the competition for #3 is still wide open. BlackBerry still enjoys a lot of residual support in the enterprise IT community, and some key federal agencies that had been planning to move away from the platform, such as Homeland Security’s Immigration & Customs Enforcement, have indicated they are open to a second look. The challenge Research In Motion faces is that BlackBerry 10, which will be released on Jan. 30, needs to be appealing enough to users, not just IT managers, that it can at least slow the tide of bring-your-own devices into the enterprise.
A Windows Overhaul, Sooner Rather Than Later. Even before Windows 8 launched to distinctly mixed reviews, there were rumors that Microsoft was moving toward a more Apple-like scheme of more frequent, less sweeping OS revisions. Microsoft sometimes has a tendency to become doctrinaire in the defense of its products; for example, it took many months for officials to accept that User Account Control in Vista was an awful mess that drove users crazy. But Microsoft has had some lessons in humility lately, and the company knows that it is in a fight that will determine its relevance to personal computing over the next few years. I expect that, at a minimum, Windows 8.1 (whatever it is really called) will give users of conventional PCs the ability to boot directly into Desktop mode, less need to ever use the Metro interface, and the return of some version of the Start button. On the new UI side, for both Windows 8 and RT, look for a considerable expansion of Metrofied control panels and administrative tools, lessening the need to work in Desktop. In other words, Microsoft will move closer to what it should have done in the first place: offer different UIs for different kinds of uses. The real prize, truly touch-ready versions of Office, though, is probably at least a year and a half away.
Success for touch notebooks. When Windows 8 was first unveiled, I was extremely dubious about the prospects for touch-enabled conventional laptops. The ergonomics seemed all wrong. And certainly the few touchscreen laptops that ran Windows 7 weren’t very good. Maybe it’s my own experience using an iPad with a keyboard, but the keyboard-and-touch combination no longer seems anywhere near as weird as it once did. And OEMs such as Lenovo, Dell, HP, and Acer are coming up with some very nice touch laptops, both conventional and hybrid. Even with a premium of $150 to $200 over similarly equipped non-touch models, I expect the touch products to pick up some significant market share.
Significant wireless service improvements. We’ll all grow old waiting for the government’s efforts to free more spectrum for wireless data to bear fruit. The incentive auctions of underused TV spectrum are not going to be held until 2014, and it will be some time before that spectrum actually becomes available. The same is true for a new FCC plan to allow sharing of government-held spectrum in the 3.5 GHz band. But the good news is we don’t have to wait. Technology will allow significant expansion of both the capacity and coverage of existing spectrum. Probably the two most important technologies are Wi-Fi offload, which will allow carrier traffic to move over hotspots set up in high-traffic areas, and femtocells and small cells, which can greatly increase the reuse of the spectrum we already have. Unlicensed white space–unused spectrum between TV channels–should begin to make a contribution, especially in rural areas where TV channels are sparser. And the huge block of mostly idle spectrum that Sprint is acquiring with its proposed purchase of Clearwire will also ease congestion, probably starting next year. (Stay tuned for a Tech.pinions series on spectrum issues in January.)
Intel Will Make a Major ARM Play. It’s hard to believe today, but Intel was once a major player in the ARM chip business. In 1997, it bought the StrongARM business from a foundering Digital Equipment. Renamed XScale, the Intel ARM chips enjoyed considerable success with numerous design wins as early smartphone applications processors. But XScale was always tiny compared to Intel’s x86 business, and in 2006 Intel sold its XScale operations to Marvell. A year later, Apple introduced the ARM-based iPhone. Today, ARM-based tablets are in the ascendancy, x86-based PCs are in decline, and Intel is struggling to convince the world that a new generation of very low-power Atom systems-on-chip is competitive. Maybe the Clover Trail SOCs and their successors will gain a significant share of the mobile market, but Intel can’t afford to wait very long to find out. With its deep engineering and manufacturing skills, Intel could become a major ARM player quickly, either through acquisition or internal development.
One of Intel’s big programs for this year has been the development and marketing of a new line of slimline notebooks called Ultrabooks. This product line really follows in the footsteps of Apple’s MacBook Air, a product that has been very successful for Apple.
In reality, as technology has gotten smaller and more powerful, it was inevitable that laptops would follow Apple’s MacBook Air example and become thinner, lighter and still have serious computing power. To that end, Intel and almost all of their OEM partners have jumped on the Ultrabook bandwagon.
In the spring, Intel and their partners launched a major marketing push for Ultrabooks and have spent a huge amount of money trying to get the attention of business and consumer users and move them over to the Ultrabook platform of notebooks. Also, most OEMs have created some great versions of Ultrabooks that, at the very least, have caught the eye of those folks who really do like the idea of a lighter laptop.
However, the cheapest Ultrabook starts at $699 and is a relatively low-powered system. But most Ultrabooks have been priced in the $799-$899 range, a price that, although reasonable, we believe may not be an attractive price range for consumers. And although business users are OK with these upper-end prices for laptops, they continue to want laptops that have a lot of power and features that can’t be crammed into these thinner machines. This has been at the heart of the slow uptake in Ultrabook purchases so far.
But there seems to be another reason for the slow uptake in Ultrabooks with consumers and even many business users. We have been privy to some very interesting research that shows that the market for laptops appears to be bifurcating into one that is focused on low cost notebooks and the other on the higher end of the notebook market. The research suggests that the mid market for laptops is declining and that laptops priced at $699-$899 may be going away as users either opt for low cost laptops or if they want more powerful laptops, buy up to laptops in the $999-$1299 range instead.
Part of this lack of overwhelming interest in the $699-$799 price range is also due to the iPad. Interest in the iPad remains high, and our research suggests it is much higher among the mass market than interest in notebooks. Because of that, Ultrabooks priced in the same range as the iPad seem to be of less interest. It appears that, for the mass market, next-generation notebooks need to cost less than the iPad, or cost much more and include valuable innovations at the upper end to make them attractive.
One could also argue that Windows 8 could play a role, but many of the consumers we speak to are not that interested in Windows 8 yet.
If true, this is bad news for the current crop of Ultrabooks. Due to component costs and related marketing costs, almost all Ultrabooks are priced between $699-$899, with a few even at $999-$1200. To be fair, many upper-end models that are really high-powered laptops are being called Ultrabooks, but at these prices they are considered upper-end laptops.
This research reflects similar information we are getting from consumers. Over the last 3 weeks I have spoken to dozens of consumers about their back-to-school or fall laptop purchases, and all planned to spend no more than $599 for a laptop. All were aware of Ultrabooks, and while they would have liked to have one, they did not have the budget for anything more than $599. And if they were buying for their kids as part of the back-to-school requirements, the prices they planned to spend were closer to $399 to $499 for laptops this year.
If it is true that the mid market for laptops in the $699-$899 range is going to evaporate, it will put a lot of pressure on the OEMs next year to get prices down on Ultrabooks if they want any traction with consumers in 2013. The good news for them is that business users seem willing to buy up, and laptops in the $999-$1299 range have good margins, which means OEMs can actually make some money on these laptops.
Ultimately, Ultrabooks will be successful, since the technology is here to make them lighter and thinner while still delivering good computing power. But to sell in the volumes OEMs need to make money, Ultrabooks will have to adopt more consumer-friendly pricing to really take off.
Last week I pointed out the competitive dilemma for OEMs when it comes to Surface. A key point in my mind is how tablets are becoming the next generation computers for the mass market. What I pointed out in my column about notebooks becoming history is that the notebook will remain relevant but it will do so for only a segment of the market rather than the market as a whole, which has historically been the case.
When we started doing consumer research with late adopters (anyone not an early adopter), we realized that for a large majority of consumers a notebook was overkill with respect to what they did with the product on a daily basis. We discovered that many consumers purchased notebooks for the convenience of portability more than anything else. It is this fundamental point which leads me to be convinced of the tablet form factor. This is also why the tablet + desktop solution becomes even more interesting.
Further Reading: Notebooks are the Past, Tablets are the Future
With that context in mind, I am beginning to wonder if Microsoft launching their own line of tablets hurts the OEMs in a much more important area than just competing with them, namely with their notebook products. If this industry is headed in the direction I think, then more interest may be given to Surface-like products, by the masses, than notebooks in 2013 particularly. I am wondering if, by launching Surface, Microsoft has not just potentially hurt interest in their partners’ notebooks over the short term.
If what we write here on our site, as well as feedback I have received from many media outlets, is an indication of market interest, then what I am proposing would be on track. Our content on tablets, and recently on Surface, draws far more reads than what we write about notebooks, and UltraBooks in particular. I have heard similar things from other media: tablet content does better than notebook content in terms of interest.
Intel is trying to inject life into the notebook category with their UltraBook campaign, and Microsoft has just injected life into tablets built for Windows 8. Surface’s form factor is different enough from what most consumers are used to with a notebook that I believe it will get serious consideration from anyone who is in the market for a Windows notebook. Time will tell how many will buy Surface, but I believe it matches up with enough trends we are seeing to at least generate interest.
However, if there is enough interest, Surface may very well impact notebook sales for Microsoft’s partners, which will hurt OEMs more in the short term than Microsoft competing with them in a single segment. In this case Surface is more disruptive to OEMs’ notebook strategy than their tablet strategy.
Of course, another scenario could be that Surface plays the spoiler for both Win 8 tablets and Windows notebooks. It may be that the wide array of differences in the Windows 8 ecosystem will confuse customers, who then turn and consider the Apple ecosystem. In fact, 2013 will be a very interesting year, because the feedback we are getting suggests both tablet and notebook intenders will heavily evaluate both ecosystems before making a decision. Consumers will choose with their wallets, and perhaps more importantly with their loyalty, and it will make 2013 a fascinating year.
One year ago almost to the day, at Computex 2011, Intel introduced Ultrabooks to the world. The first generation of Ultrabooks was nice, but they were also homogeneous (the Dell XPS 13 excepted) and very expensive, limiting access for many demographics. So are things any different a year later? After seeing what Intel’s Ultrabook partners have launched so far at Computex, it’s only fair to characterize it as impressive. Design wins and choice don’t guarantee sales, but you cannot have sales without them. OEMs at Computex launched some serious innovation with new form factors and usage models while lowering price points, which I think deserves a deeper look. It also gives us a good indication of how far Intel has come and where this is all going.
First generation Ultrabooks came with 13” displays. Display size is one of the most important purchase criteria and the fact is, consumers like varying display sizes to match their primary use case. Generally speaking, those who want to travel prefer smaller displays and those who want to replace desktops or do more content creation prefer the larger screens. Here are a few examples:
- HP ENVY 6t– 15.6” display, .78” thick and up to 9 hours battery life
- Gigabyte U2440– 14″ display, 22mm thick
- Dell XPS 13– 13″ display in a 12″ chassis, 18mm, up to 9 hours battery life
- Sony Vaio T– 11.6” display, .71” thick, and up to 7.5 hours battery life
Most Gen 1 Ultrabooks came with a 13″ display at 1,366 x 768 resolution. That is still the case today on average, but if consumers want more, they can get HD resolutions. As a reference, the highest supported MacBook Air resolution is 1,440 x 900, about 37.5% fewer pixels than full HD.
- Acer Aspire S7- 1,920 x 1,080 with a 13.3” display, with capacitive multi-touch.
- ASUS Zenbook Prime UX31– 1,920×1,080 resolution with a 13.3” display
- Toshiba U840W– 1,792 x 768 resolution with a 14.4″ display
- Lenovo ThinkPad X1 Carbon– 1,600 x 900 resolution with a 14″ display
- Lenovo IdeaPad YOGA– 1,600 x 900 display resolution with a 13.1″ display
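For readers who want to check the resolution comparison above, the pixel arithmetic is straightforward. A quick illustrative sketch (plain arithmetic, not tied to any particular tool):

```python
# Pixel-count comparison: MacBook Air's top 1,440 x 900 resolution vs. full HD (1,920 x 1,080)
air_pixels = 1440 * 900        # 1,296,000 pixels
full_hd_pixels = 1920 * 1080   # 2,073,600 pixels

# Fraction fewer pixels on the Air relative to a full HD panel
fewer = 1 - air_pixels / full_hd_pixels
print(f"{fewer:.1%} fewer pixels than full HD")  # -> 37.5% fewer pixels than full HD
```

Put the other way around, a full HD panel offers 60% more pixels than the Air’s best resolution.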
Even though the first Ultrabooks were light, OEMs are now challenging each other on weight. This is a good thing, because when Windows 8 and convertibles arrive, these systems will need to be a lot lighter to operate as decent tablets. Here are the lightest, or at least the lightest claimed:
- Gigabyte X11– at 34.3 oz. with an 11.6″ display, claimed to be the lightest (10% lighter than 11.6″ MacBook Air)
- NEC LaVie Z- 35.2 oz., with a 13.3″ display (26% lighter than 13.3″ MacBook Air)
Machined aluminum is nice, but it is also exactly what Apple uses. That’s not necessarily a bad thing, because it looks upscale, but it is in some ways unoriginal, and it is also expensive and heavy. OEMs have gone out of their way to differentiate with chassis materials. Here are just a few that I think deserve highlighting:
- Dell XPS 13– carbon fiber bottom, machined aluminum chassis with rubberized palm rest
- HP ENVY– soft-touch, slip resistant base
- Gigabyte X11– carbon fiber
- Sony Vaio T– magnesium with aluminum lid
- Lenovo ThinkPad X1 Carbon: carbon fiber
- NEC LaVie Z- lithium magnesium alloy
Touch didn’t make a lot of sense with Windows 7, but Windows 8 changes all this. As I am an avid tablet + dock user, I am constantly trying to touch my Ultrabook display. As long as the display doesn’t push back as it is being touched, I think it will work well. Even better will be convertible designs.
- Acer Aspire S7– the 11.6″ or 13.1″ display folds back 180 degrees
- ASUS Transformer Book– the 11.6″, 13″, or 14″ display detaches from the keyboard to use as a tablet
- Lenovo IdeaPad YOGA– the 13.1″ 1,600 x 900 display bends back 360 degrees to use as a tablet
- ASUS TAICHI– dual sided HD display in 11.6″ and 13″ form factors. Open the clamshell and it’s a notebook. Close it and it’s a tablet.
Original Ultrabooks came with Intel HD 3000 graphics. Second-generation Ultrabooks come with HD 4000 graphics which, while improved, just won’t cut it for everyone. Discrete graphics attach rates in Western Europe, China, and Russia are all above 50%, so the need is there, and the new Ultrabooks deliver. With Nvidia launching the GeForce GTX 680M yesterday, things will be getting even more interesting. Discrete graphics are not available on the Apple MacBook Air. Here’s what consumers can order today:
- Asus Zenbook Prime UX32VD– NVIDIA GeForce 620M and is 18mm thick
- Lenovo IdeaPad U410– NVIDIA GeForce 610M, up to 9 hours battery life, 21mm thick
- Gigabyte U2440– NVIDIA GeForce GT 630M, 21.5mm thick
First-generation Ultrabooks were 100% consumer-focused. Now, one year into the Ultrabook initiative, commercial and even enterprise-grade solutions are available.
Premium price points aren’t necessarily a bad thing; just look at Apple. But if you don’t have Apple’s brand, then as an OEM you will need to live with lower volumes in the short term. Generally, the opening price point for first-gen Ultrabooks was $999; for the second generation, it will be much lower. Intel and their partners invested in alternative chassis designs, hybrid hard drives, display technology and the inclusion of Core i3 processors to lower opening price points. One important point to keep in mind is that these same, lower-priced Ultrabooks still provide the minimum acceptable bar of experience on vectors like battery life, responsiveness, fast resume, and built-in security features. Here are just a few:
- Lenovo U310 – starting at $749 with 13.3″ display, 500GB hybrid hard drive, 3rd Gen Intel Core i3 Processors, unknown case material (42% less than MacBook Air)
- Sony VAIO T– starting at $769 with 13.3″ display, 3rd Gen Intel Core i5 processor, 320GB hybrid hard drive, magnesium/aluminum case, (41% less than MacBook Air)
So have Intel and its partners come a long way in a year? Absolutely. In one year everything has changed: design diversity and differentiation, user interfaces, improved usage models and price points. These are all good future indicators for solid sales, because in the end, this is exactly what consumers really want. To be clear, design wins don’t guarantee sales, but you cannot have sales without them. The traditional notebook market will continue to grow, but I believe many consumers will choose to pay a bit more for an Ultrabook, as they are looking at a better experience for a $100-200 increased investment. In many regions and demographics that isn’t a lot to ask; in others, it is. For those who need a new notebook and want a tablet but cannot afford both, I believe many more will choose an Ultrabook than ever before. As touch and convertibles become more pervasive and more affordable, given Intel’s huge touch investments, even more consumers who want it all will choose a touch Ultrabook. This will be an interesting 12 months, that’s for sure.
PC wars are looming, and they aren’t between Macs and Windows-based notebooks. If you follow this industry you know that Intel is seeking to rejuvenate the notebook market. They are doing this by putting quite a bit of marketing weight behind the term UltraBook. To spur development in this category, Intel is putting some very specific hardware specifications around the term that OEMs like Dell, HP, Acer, etc., must conform to if they want their notebook to be called an UltraBook and take advantage of Intel’s marketing dollars for UltraBooks. Obviously, every OEM is making UltraBooks.
The challenge as I see it for UltraBooks is that many of the first ones at launch, and perhaps those that follow, will be priced in the premium range rather than the value range. Many of the early UltraBooks we will see will be $699 and above, although a few may get lower and many will skew higher as well. What our consumer data from our own research and consumer interviews is telling us is that Apple has about a $250 grace price point. Consumers know Apple’s MacBook Pro and MacBook Air lines are not the cheapest products on the market. For MacBook intenders, any comparable product must be at least $250 less than a comparable MacBook product to fully sway a consumer when price comes into play. But as I have pointed out before, price is becoming less and less of an issue in mature markets.
Although we expect UltraBooks to continue to drop in price there is a sub-category of notebooks emerging which may be even more interesting.
If It Looks Like an UltraBook…
Intel wants to own the UltraBook category. They are investing a lot of money around the term. However, there is a strict set of requirements notebook OEMs must abide by if they want to use it. If there is one thing I have learned in my 12 years as an industry analyst, it is that OEMs don’t generally like being told what they can and can’t do with their hardware designs. Every OEM wants to take advantage of the thin-and-light designs driving UltraBooks, but they may want to vary the CPU capabilities. And what if they want to use a non-Intel chip in a design that looks exactly like an UltraBook? The answer is they can’t call it an UltraBook.
Earlier in the week AMD launched a very impressive 2nd-Generation A-Series APU, codenamed “Trinity.” Many OEMs have strong relationships with AMD and will most likely use these chips in their lineup of notebooks. So how do OEMs cover their bases by making non-Intel UltraBooks? Well, HP recently launched a new term called SleekBooks. We call this category Ultrathins and we expect many Ultrathins to enter the market well below the price of UltraBooks. And that is what makes this so interesting.
While Intel is going out and spending millions of dollars marketing the UltraBook term, it will indirectly benefit a range of competing platforms. Ultrathins will look nearly identical to UltraBooks, with only minor differences in configuration or specification that many consumers may not even notice. The bottom line is that consumers will walk into retail and see UltraBooks, SleekBooks, and perhaps more terms on the way, and with all of these options consumers may very well go with price and walk out with something other than an UltraBook. Perhaps not even knowing they didn’t purchase an UltraBook.
Now, on the surface it may seem as though Intel would not like this scenario. But realistically, Intel simply wanted to rejuvenate the notebook category, and I believe their marketing of UltraBooks is going to do just that, even though it may very well help competitors’ chipsets and, to a degree, even help Apple.
I have a feeling there is a large chunk of consumers who are due for a notebook upgrade. The iPad has, for some, served as a sufficient supplement to their existing notebook, making it easier to delay the purchase of a new one. Whether it is UltraBooks or these new thin-and-lights that will look and smell like UltraBooks but be priced quite a bit lower, we expect at least a short-term positive jump in the overall notebook category over the next few years.
This is one of the more interesting things to watch. Mac sales are growing at incredible rates. It seems each quarter Apple is selling more Macs than ever before. I was recently in an Apple store with a newly renovated training center. When I walked into the store I assumed the training tables would be filled with people learning how to use their iPads. Instead every table and every consumer at that table was learning how to use the new Mac they just purchased.
If Ultrathins that are very thin, light, and powerful hit the market below the $599 price point, as we think may happen, it could provide a serious jump start to the notebook category. And at $599 or lower, quality notebooks will be priced significantly below an entry-level MacBook Air, which may be key in slowing down Apple’s momentum with Macs.
The notebook form factor is facing pivotal times: the iPad has consumers asking new questions about their own computing needs and preferences, and looking more intently for specific solutions, especially those shopping for new notebooks.
This is exciting and challenging for many in the notebook ecosystem.
In a presentation to financial analysts on May 10, Intel CEO Paul Otellini said he was not particularly worried about the prospect of Microsoft issuing a version of Windows for ARM processors later this year. “We think [x86 is] a differentiator,” he said. “We have the advantage of the incumbency, the legacy support.”
Maybe he’s right. But it is disconcerting to hear this sort of complacency from the head of Intel, especially at a time when ARM-powered smartphones and tablets pose an unprecedented threat to Intel’s core laptop and desktop business.
I can only wonder what Andy Grove would say. Grove, who was Intel CEO from 1987 to 1998, famously wrote: “Business success contains the seeds of its own destruction. Success breeds complacency. Complacency breeds failure. Only the paranoid survive.” Grove also once ordered an advertising campaign attacking what was then the company’s most successful product, the 80286 processor, in an effort to get customers to move to the newer, much more capable, and ultimately wildly successful 80386.
Grove remains a senior adviser to Intel and has always avoided any public criticism of his successors. But I find it hard to believe he is happy watching the company he built acting so passively in the face of a threat.
Side note: Intel was actually a major player in the ARM business for some years. It bought Digital Equipment’s StrongARM business in 1998. The chips, renamed XScale, powered many handheld computers and early smartphones. Intel sold the division to Marvell in 2007.
I take articles like this, claiming the iPad will drop below 50% market share by as early as next year, with a grain of salt. I don’t want this article to be about all the reasons why we believe the iPad will maintain significant market share; we have written quite extensively about those reasons. I’d rather examine a few flaws in competitors’ thinking about how to compete with the iPad, and to do that I’d like to start off by making a point. I genuinely believe that it is possible to compete with the iPad. I don’t think it’s easy. I don’t think many companies can; but I don’t think it is impossible.
There is always room to innovate. The problem is simply that the companies attempting to create competing touch computers don’t understand touch computing or the market dynamics for tablets. It seems as though many vendors and software platform providers believe that by simply slapping a touch screen on a piece of hardware, regardless of what that hardware looks like, it will hit the market and instantly be competitive. This is fallacy number one.
Touch computing requires a touch-based ecosystem. This means everything from carefully designed hardware to software and, to a degree, services, all built around touch (not mouse and keyboard) as a computing paradigm. This is no trivial task. Android is a weak touch computing ecosystem, in my opinion, mostly because Android is an advertising strategy for Google, not a software strategy. Time will tell what kind of touch computing platform Windows 8 truly becomes. Windows 8’s success rests largely on hardware manufacturers’ and software developers’ ability to understand touch computing and develop a truly competitive ecosystem.
Fallacy number two is that the number of designs in the market on a particular platform is a competitive advantage. When I ask why a particular platform release may be competitive, often number of designs is the answer. “There will be over xx designs in the market,” is a phrase I hear often. I don’t believe that number of designs alone makes a particular platform competitive. In fact, it is perhaps quite the opposite. There is a book I like to reference called The Paradox of Choice by Barry Schwartz. The overall premise of this great book is that too much choice or too much variation in choice can overwhelm the purchaser to the point of frustration and lead to the inability to make a decision. My concern with too many products on a particular platform is that consumers may find the decision making process painful and confusing. This is why I believe there is a lot of merit to the argument for very limited product offerings per vendor and per platform to a degree.
Fallacy number three is that low cost always wins. I don’t believe that today’s consumers in mature markets want things that are cheap. I believe they want things that are valuable to them at a personal level. A key point to understand is that in mature markets, what is valuable varies quite a bit. This is because in mature markets consumers make specific purchases for specific reasons. Often, consumers in mature markets know roughly what they want and why they want it, and they shop with a preset list of conditions. What one segment finds valuable may not be the same as another group. This is why product segmentation is important. The key is to create products in a segment (hopefully a large one) that consumers in that segment find valuable. In the automotive industry, for example, minivans target a segment, trucks target a segment, motorcycles target a segment, economy cars target a segment, and so on and so forth. In this case, the automotive manufacturer understands the segment a product is being created for and then innovates and delivers solutions to meet that segment’s needs on an annual basis. This understanding of the market dynamics for tablets is what I think is largely being missed by those desiring to create competitive tablets.
The question anyone who desires to create a tablet to compete with the iPad needs to answer is: “What will my tablet do better than the iPad?” And what can customers do with it that they can’t do with an iPad?
If there is no well-reasoned answer to that question, then it’s back to the drawing board to innovate. The answer may not be obvious or easy to figure out, but simply being a me-too product is a recipe for disaster. Perhaps if these new Windows tablet vendors can create a product that is unique, does specific things the iPad doesn’t, and meets additional needs that the iPad can’t (or that Apple isn’t interested in), then they might have a chance to truly deliver a competitive product that gains market traction.
I’ve been using Windows-based tablet computers for almost a decade. I was hooked the moment Bill Gates trotted out Microsoft’s first prototype tablets at a developer event in mid-2001. I got my first tablet, a Fujitsu Stylistic, in 2003 and I’ve carried it or its successors to meetings ever since, migrating along the way from Windows XP Tablet Edition to Vista to Windows 7. Nothing beats a Tablet PC for capturing notes during meetings and presentations, especially if the material contains diagrams, graphs or mathematical equations. When I’m not using my tablet to take notes, I use it to get my mail via Outlook, or to work on documents and spreadsheets with Word and Excel. It’s usually the only mobile system, other than my phone, that accompanies me when I travel.
Some suggest that the structure of the tablet market has already been settled. Apple rules, Android-based suppliers challenge; no other platforms need apply. The failures of HP’s TouchPad and RIM’s PlayBook prove there’s no room for another software platform. I beg to differ. Android and iOS tablets do a yeoman’s job when it comes to consuming content, but lack the software tools and hardware features needed to create content. Windows-based tablets, which have been around since 2002, have always included the features needed for content creation, but lacked the easy-to-use interfaces needed for content consumption. The Metro user interface in Windows 8 supplies these missing elements, and thus positions Win 8-based tablets as the only ones suitable for those who want to both create and consume content on a single device.
“Content Creation” as I use the term applies to a broad range of activities that includes tasks as varied as a student taking notes, a worker recording and distributing meeting notes, a club secretary assembling and distributing newsletters, a teenager spiffing up the audio from a band performance, a webmaster updating a website, and a mother preparing her annual Christmas letter. Contemporary PCs and MacBooks handle such work effortlessly. But, have you tried to accomplish tasks like these on an iPad or Android tablet? The process is at best arcane, and often impossible. Printing from a tablet? Most of the people I know e-mail the files they want to print to their PCs, and print from there. Manage a mail list? Forget about it. iPads and Android tablets work best as “companion devices,” and assume you have access to a PC or MacBook to handle everyday computing tasks. In fact, when I took my new iPad 2 out of its box, it insisted that I connect it to iTunes running on a PC or Mac before it would let me do anything. Fortunately, there’s no shortage of those systems around my office, but what if I purchased it in the airport store, and tried to use it for the first time on a flight to China?
Windows 8 provides a more complete environment. Unless you've spent the last six months on the International Space Station, you've probably seen its vaunted Start screen, which replaces the Start menu used in earlier versions of Windows. The various colored blocks, referred to as "tiles," contain live content updated by applications running in the background. Touch a tile and its associated program fills the screen. Switch from one app to the next by dragging your finger from left to right. Drag your finger up from the bottom of the screen to call up menus for the app. Drag your finger from the right edge of the screen to call up system menus, or to get back to the Start screen. Multi-finger gestures for pinching and zooming work intuitively, just as you'd expect. All in all, it's a well-architected, contemporary user interface, great for leaning back and reading web content, watching videos, or whatever. But Windows 8 also supports more serious endeavors. Tap on the Desktop tile, and you are instantly transported to the familiar Windows 7 desktop. The applications you invested years learning to use are there in all their glory; not stripped-down versions that some guy in a marketing department thought were "good enough" for tablet users. Although the touchscreen interface works with these packages, odds are you will want to use a traditional keyboard and pointing device (mouse or track pad) arrangement, whether built into a dock or case, or freestanding. They may be old fashioned, but after 30 years of development, the industry has refined these input devices to the point where they're hard to beat for content creation.
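The edge-gesture conventions above boil down to a small event-to-action table. Here is a minimal sketch in Python; the event and action names are invented for illustration and are not Windows 8's actual input API:

```python
# Illustrative sketch of Windows 8's edge-gesture conventions as an
# event-to-action table. The event and action names are hypothetical;
# this is not the real Windows input pipeline.

EDGE_GESTURES = {
    "swipe_from_left":   "switch_to_previous_app",
    "swipe_from_bottom": "show_app_menus",
    "swipe_from_right":  "show_system_menus",  # or back to the Start screen
}

def dispatch(gesture: str) -> str:
    """Map a recognized edge gesture to the UI action it triggers."""
    return EDGE_GESTURES.get(gesture, "ignore")
```

The point of the table form is that every edge of the screen has one consistent meaning, which is what makes the interface learnable by feel.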
Digital Ink: Microsoft’s Unsung Advantage
Microsoft's Tablet PC software includes a feature it calls "digital ink" that allows users to write on the surface of the display the same way one writes on a sheet of paper. The system makes no attempt to convert pen strokes entered this way into machine-readable text in real time, a la Apple's failed Newton (although the option remains to convert information entered this way into a more conventional format if needed). Digital ink documents can be filed and searched in the same manner as conventional text documents. My tablet contains inked notes I've entered over the last eight years; I back them up, transfer them from one machine to another, and read them on my desktop when needed. Almost nobody knows this feature exists. Often, when I'm scribbling notes on my tablet at a conference, people sitting nearby will ask me what magical device I'm using. They're amazed when I tell them it's a five-year-old tablet PC that runs Windows 7 and Office. I view Microsoft's failure to capitalize on this feature as one of its biggest marketing disasters ever, almost as bad as Vista or Bob.
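Conceptually, this filing-and-searching behavior can be sketched as a tiny index that keeps the raw strokes as the document of record while searching over background-recognized text. All class and field names below are hypothetical; this is not Microsoft's ink API:

```python
# Hypothetical sketch of how inked notes can be filed and searched like
# text documents: keep the raw strokes (the "ink" itself) untouched, and
# index recognized text alongside them. Names are invented for
# illustration; this is not Microsoft's actual Tablet PC ink API.

from dataclasses import dataclass

@dataclass
class InkNote:
    title: str
    strokes: bytes            # serialized pen strokes, never converted
    recognized_text: str = "" # output of background handwriting recognition

class InkNotebook:
    def __init__(self) -> None:
        self.notes: list[InkNote] = []

    def add(self, note: InkNote) -> None:
        self.notes.append(note)

    def search(self, term: str) -> list[str]:
        """Search the recognized text; the ink itself stays as written."""
        term = term.lower()
        return [n.title for n in self.notes
                if term in n.recognized_text.lower()]
```

The design choice the sketch illustrates is the one described above: recognition runs in the background for search, but the handwritten page remains the document you read, back up, and transfer between machines.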
I don’t doubt the claims of a few of my colleagues that they can type faster than they can write. But can they capture graphic information as well? Here’s a snippet from the notes I took at a recent event where Intel’s Mark Bohr discussed the company’s new 22 nanometer technology. I captured the charts Bohr flashed on the screen on my tablet as he touted the advantages of Intel’s approach. I doubt any of my colleagues could key this in on their PCs.
Digital ink has always struck me as one of the most natural ways (other than pen on paper) for students to take notes in class or attendees to take notes in meetings. Yet Windows Tablets with this feature never gained much market share. Some of this resistance can be attributed to the premium (typically $300 or more) that suppliers charged for Windows Tablets, compared with conventional laptops. Some of this premium stems from the specialized hardware needed to implement digital ink (see below), which adds to the cost of Windows-based tablets. Suppliers like MSI omit such hardware in the interest of lowering the system’s cost. I’m confident the cost premium will shrink over time. I’m less confident that Microsoft will figure out how to market this capability successfully.
Since there will likely be a range of Windows 8 Tablets on the market, some with and some without the hardware needed to handle digital ink correctly, buyers who care about this feature should check which digitizer technology the devices they are considering actually use.
All told, Windows 8 melds a modern multi-touch user interface that’s great for consuming content with Microsoft’s successful Windows 7 environment that excels at creating content. No other tablet OS can deliver this one-two punch.
Intel and its partners are about to launch the biggest promotion in a decade for a new product category called Ultrabooks. Microsoft is also about to launch a major update to Windows, called Windows 8, that introduces a new touch-based user interface, Metro. Both products are critical to each company's future.
Form Factor Evolution
In the case of Ultrabooks, I actually see them as the natural evolution of laptops, not the revolution Intel would like us to think. They take advantage of the industry's constant push to make things smaller, lighter, and thinner, with better battery life. Mainstream consumers who have had to lug around rather bulky laptops for the last 5 years would be justified in asking Intel and other Wintel vendors "what took you so long?", given that Apple has had the MacBook Air on the market for 5 years and it has defined what an Ultrabook should be.
With Windows 8 and Metro, Microsoft is also following an evolutionary path toward touch interfaces with its Metro-based smartphones and soon-to-be Metro-based tablets and PCs. Again, consumers could ask Microsoft "what took you so long?", since Apple has had its touch UI on the iPhone for 5 years and on the iPad for 2 years.
But both products face some interesting challenges when they launch later this year. Ultrabooks will most likely have starting prices of at least $799-$899, although I hear there could be at least one fairly stripped-down model coming out at around $699. At these prices, they completely miss the mainstream laptop market, where the bulk of laptops are sold at $299 to $599.
In the case of Windows 8 and Metro, while Metro is great on Microsoft's phones and works very well on the tablets I have tested it on, it does not translate well to the laptop or PC, since virtually no existing PCs have touch screens. And most PC vendors are not putting touch screens on the majority of their new laptops, because doing so adds at least another $100-$150 in cost to the customer. If you have tested the Consumer Preview of Windows 8 and Metro on an existing laptop, you know how frustrating it is to use on existing trackpads. I consider this an Achilles' heel for Windows 8 and one that could really hurt its short-term prospects.
To be fair, Microsoft recently (three weeks ago) released recommended guidelines for next-generation trackpads, and a new design I have seen from Synaptics could make laptops work well with Metro once it gets into new machines. But this is something Microsoft should have focused on a year ago, so that all new laptops were "Metro-enabled" at launch. My sense is that Microsoft should have launched Metro only on tablets this year and gradually moved Windows 8 Metro to the consumer PC market once laptops were optimized for it.
Instead, I see a lot of consumer confusion on the horizon when people try to use Metro on existing trackpads and other non-touch input devices, as the experience will be confusing at first and frustrating afterwards. Note that Apple has not put touch screens on its laptops and desktops; instead, it worked extra hard to create trackpads and external trackpads that map to the touch experience on the iPhone and iPad.
I consider the initial pricing of Ultrabooks, and putting Metro on laptops and desktops, issues that could slow early adoption of these products this year and stretch adoption out over a longer period. But the two companies have a secret weapon in the works that could earn them a lot of kudos in the marketplace and be a key component in getting users really interested in Intel and Microsoft again.
A New Category
The secret weapon comes in the form of a new form factor often referred to as "hybrids." These are either tablets that can be docked into a keyboard, turning them into a laptop, or laptops with a detachable keyboard. You might think they are one and the same, but they are very different in terms of design goals. In the first case, the design centers on the tablet, and the keyboard dock is modular. We already have plenty of examples of this with the iPad, where the tablet is the central device and the attachable Bluetooth keyboards are more of an afterthought; the keyboard just supports the input functions of the tablet. The same is true of the Asus Transformer line of devices.
But in the latter case, the design centers on a slim laptop chassis, and the screen (tablet) can be taken off and used as a tablet. I believe this latter design is the secret weapon Microsoft and Intel can use against Apple; at least on paper, it gives Apple a run for its money, especially in business and the enterprise. It could also be hot in consumer segments where the keyboard is critical to what people do with a tablet and they want a laptop-centered experience as well.
This is where Apple's current strategy can be challenged, as it offers this market two distinct products. There is the iPad, which stands by itself, and then the MacBook Air, its Ultrabook equivalent, which also stands by itself as a separate product. The key reason is that each has its own operating system; although Mountain Lion, Apple's new version of OS X, brings a lot of iPad-like iOS features to OS X, they are still separate and distinct operating systems.
But with the introduction of Windows 8, especially on a laptop-centered hybrid in which the screen (tablet) can be detached and used as a true tablet that takes full advantage of Metro, Microsoft and Intel can give their customers the best of both worlds in a single device. When in "Ultrabook" laptop mode, users can work in the familiar Windows 7-style desktop they are used to and have available the hundreds of thousands of existing Windows apps as is. But when the screen detaches, it automatically defaults to the Metro UI, and the touch experience becomes central to the device. Apps designed for Metro can then give users a rich tablet experience out of the box. Sure, they could fall back to old Windows programs if needed, but running those on a tablet is clunky at best.
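The detach-and-default-to-Metro behavior described above is essentially a two-state machine keyed off a dock event. A minimal, hypothetical sketch (the class and event names are invented; this is not how Windows 8 is actually implemented):

```python
# Hypothetical sketch of a hybrid's dock-aware UI switching: keyboard
# attached -> classic desktop mode; screen detached -> touch-first
# Metro mode. Names are invented for illustration.

class HybridUI:
    def __init__(self, docked: bool = True) -> None:
        self.mode = "desktop" if docked else "metro"

    def on_dock_event(self, docked: bool) -> str:
        # Detaching the screen defaults to the touch-centric Metro UI;
        # re-docking restores the keyboard-and-mouse desktop.
        self.mode = "desktop" if docked else "metro"
        return self.mode
```

The value of the design is that the user never chooses a mode; the hardware event does it for them, so the right interface is always in front of the right input devices.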
If done right, the user ends up with a Windows 8 Ultrabook with a detachable screen (tablet) and only has to buy one device instead of two. Our research shows that IT and even some consumers would have no trouble paying $999 and above for this combo product. At that price it would be a bargain. Most IT-purchased laptops are in the $699-$999 range now, and the iPads bought to augment users' work experience cost at least $599 each, so a combo device at even $1,299-$1,399 is more than reasonable for them. Intel knows this and believes that as much as 50% of all Windows tablets will be hybrids. And Microsoft will push these types of designs, especially if the uptake of Windows 8 on laptops doesn't take off as planned.
Could anything derail Intel and Microsoft's "hybrid" strategy? Well, if Apple applied its great design innovation to creating a hybrid that blends the iPad and the MacBook Air into a single device, it could hamper their ability to dominate this market. On the other hand, it would also validate Intel and Microsoft's strategy. If they beat Apple to market with their version, which is highly likely since at least four hybrids are set to come out by October, it could be the "hero" product of the launch that shows users the value of the x86 ecosystem and highlights to Windows users the case for Ultrabooks, tablets, and Windows 8.
In my many weekly conversations with industry insiders, we discuss Intel's chances in mobility markets, specifically smartphones. Few people are betting against Qualcomm, and for very good reason: it is entrenched at handset vendors, and its 2012 roadmap, at least on paper, looks solid. What few are discussing is how Intel will pick up market share. My last column on Intel's smartphone efforts outlined what Intel needs to demonstrate quickly to start gaining share and getting people to believe it can be a player. Now I want to take a look at why I believe Intel can and will pick up relevant market share over the next three years.
Intel Finally Broke the Code with Medfield
This isn't Intel's first time in mobility. Intel owned XScale, an ARM-based mobile processor line that powered the most popular WinCE devices, like the Compaq iPaq, one of the more popular Pocket PCs. XScale products even powered BlackBerrys for a time. Intel sold the entire XScale mobile application processor business to Marvell in 2006 for $600M, a move driven by Intel's desire to focus on X86 designs. What followed were some failed mobile attempts with Menlow and Moorestown, two low-power, Atom-branded processors that made their way into MIDs (Mobile Internet Devices). It appeared that Intel would make grand announcements with big names like LG for smartphones, then nothing would happen afterward. Things are very different with Medfield. Lenovo handsets are in testing at China Unicom, and Motorola has announced its handsets will be at carriers for the summer.
Medfield is a huge step forward in design and integration for Intel. First, it combines the application processor with I/O capabilities on a single chip, which saves handset makers integration time and board space. Second, it is paired with the Intel XMM 6260 radio based on the Infineon Wireless Solutions (WLS) acquisition, which increases Intel's revenue BOM (Bill of Materials) and also helps with handset integration. Finally, Intel has embraced the Android mobile OS in a huge way, with a large developer investment and optimized drivers for Medfield's subsystems. This stands in contrast to its MeeGo OS efforts, which didn't go anywhere. Intel has even gone to the effort of translating ARM instructions so Medfield can run native apps compiled for ARM, typically games that need to be closer to the hardware. This is a very good start for Intel, but as I tell my clients, if there are 10 steps to mobile silicon success, Intel just successfully crossed step 3.
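The ARM-compatibility layer mentioned above can be pictured as on-the-fly instruction translation from a guest instruction set to the host's. The toy sketch below uses invented mnemonic mappings purely to illustrate the idea; Intel's real translation layer handles registers, flags, and memory semantics far more carefully:

```python
# Toy sketch of binary translation: rewrite instructions from a guest
# ISA ("ARM-like") into a host ISA ("x86-like") before execution.
# The mnemonics and one-to-one mapping are invented for illustration;
# a real translator is vastly more sophisticated.

GUEST_TO_HOST = {
    "MOV": "mov",
    "ADD": "add",
    "LDR": "mov",   # an ARM load roughly maps to an x86 memory-operand mov
    "B":   "jmp",
}

def translate(guest_block: list[str]) -> list[str]:
    """Translate a straight-line block of guest instructions."""
    host_block = []
    for insn in guest_block:
        op, _, operands = insn.partition(" ")
        host_op = GUEST_TO_HOST.get(op)
        if host_op is None:
            raise ValueError(f"unhandled guest instruction: {op}")
        host_block.append(f"{host_op} {operands}".strip())
    return host_block
```

Even this toy shows why translation costs performance: every guest instruction pays a lookup-and-rewrite tax before it can run, which is why native x86 Android apps will always be the preferred path on Medfield.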
It’s a Tough Smartphone Market
Intel made some very serious headway with Medfield, but it is a very competitive market out there. According to IDC, in Q4 2011 Apple and Samsung combined to garner almost 50% of the smartphone market. As I pointed out in my previous column, Apple already designs its A-Series processors, and I don't see that changing. I expect Samsung, with the exception of the very low end, to lean into its own Exynos silicon. Nokia, at 12% Q4 smartphone share, is tied to Windows Phone and Qualcomm, at least for the short term. Struggling RIM doesn't need another variable to worry about given its muddled operating system strategy, and is currently tied to Qualcomm. Finally, HTC is rumored to be tying up with NVIDIA's Tegra platform on the high end. Who does this leave for Intel?
For Intel in the short term, with Motorola and Lenovo on board, this leaves private-label for carriers, LG, Sony, ZTE, Huawei, Kyocera, Sanyo and a very long tail of small manufacturers. The long tail will be a challenge for Medfield until Intel waterfalls the product line to be cost-competitive with lower-end models. I expect Intel to start waterfalling products down at the end of 2012.
Why Intel Could Succeed
While I have outlined the many challenges, Intel could very well succeed in the space longer term. First, the phone marketplace is changing rapidly. Not only have there been tremendous share shifts in the last two years, but feature phone users are migrating to smartphones, resulting in exploding growth.
Operating systems are far from shaken out. Microsoft will not go gently into the night with Windows Phone and will invest whatever it takes to be successful, even if that means another Nokia-like investment to own another platform. I also believe that once Microsoft starts gaining share, it will devote resources to X86 on the Windows Phone 8 or 9 platforms. Microsoft sees Intel succeeding with Medfield, and the WINTEL alliance could be brought back from the dead. Long term, I do not believe Samsung will be happy licensing someone else's operating system, particularly given Apple's success with integration and experience. I expect Samsung to do one of three things, possibly two of them: increase investment in Bada to the point that it can compete with Android in a closed environment, embrace webOS, or lean heavily into Tizen. Marketplaces in dynamic change are an opportunity for newcomers, even companies worth $140B like Intel.
One other important factor that hasn't fully played out is "carrier versus handset-maker" dominance. Up until the Apple iPhone, the carriers dictated terms to the handset makers. Every carrier that has adopted the iPhone has taken a gross margin reduction. This doesn't mean they made a bad decision; they had to carry the iPhone. But that margin reduction money is going to Apple, not the carriers. Carriers are strategizing how to regain that dominance going forward, and I believe Intel will be part of those plans. Intel has the capability to partner with an extremely low-cost manufacturer or ODM on an entire solution, white-label it to a carrier, and provide a competitive Android experience. I expect a few key announcements this month at this year's Mobile World Congress.
Of course, we cannot forget about Intel's technology. According to tests run at AnandTech, Intel's Medfield is competitive in power at 32nm LP, so you must assume it only gets better on Intel's 22nm 3D Tri-Gate technology. Intel will roll Atom to 22nm in 2013 and 14nm in 2014. Meanwhile, in 2012 TSMC is at best at 28nm, and GLOBALFOUNDRIES and Samsung are at 32nm.
I define success as the ability to reach a relevant level of profitable business that supports the desired brand goals. For Intel, this doesn’t need to be 80% like they have in the PC market, but needs to be a profitable 20%.
What this Means for Intel, Qualcomm, Texas Instruments, and NVIDIA
Over a period of three years, Intel will start to take market share from Qualcomm, Texas Instruments and NVIDIA, albeit very little in 2012. As Intel integrates wireless, moves to 14nm, and waterfalls its offerings to lower-price-point smartphones, it becomes much more competitive with handset makers and carriers. I expect Huawei, ZTE, or a major carrier to go big with Intel in 2013, which will make a huge difference. One thing to remember about Intel: unlike others in the marketplace, Intel also captures the manufacturing margin TSMC and GLOBALFOUNDRIES make and the design margin ARM earns. While Intel has a long way to go in proving itself, it has a start it never had before, at a time when it can take advantage of the mammoth growth in smartphones. Never count Intel out of any market, no matter how many times it has tried and failed.
Intel made a big splash at CES 2012 with the announcement that Motorola and Lenovo had committed to Intel's Medfield smartphone solution. This came on the heels of a disappointing break-up between Intel and Nokia, as well as a lack of previous traction with LG. While Intel has come farther than ever before with one of its X86 SOCs, it still has a long way to go to claim smartphone victory. Of course Intel knows this and is working diligently, sparing no expense. The biggest challenge Intel faces is attacking a market where the incumbent ARM ecosystem partners Qualcomm, NVIDIA, and Texas Instruments have almost 100% market share. To start gaining share in smartphones, Intel must demonstrate many things in the near future.
More Design Wins with Key Players
The Motorola announcement was impressive in that Moto has a respected name in smartphones, but they won’t carry Intel that far alone. Lenovo is an even smaller player and while very successful in PCs, hasn’t been able to secure a lot of smartphone market share even in their home country, China. Intel knows they need a few more partners to start chipping away at market share and I expect them to announce at least one at this year’s Mobile World Congress.
One challenge is that many of the top players are already locked in one way or another, have negative history with Intel, or are rapidly losing share. Apple already has its own A-Series SOC, Samsung has its Exynos SOC, and Nokia rebuffed Intel last year and is clearly locked into ARM and Microsoft for the time being. RIM as a partner is a shaky proposition, and HTC is an aggressive player but has recently been losing share. That leaves lower-share players LG, Sony, Sharp, NEC and ZTE in the short term.
Longer term, I don’t expect Apple or Samsung to get out of the SOC business because they have been successful with their own strategies. I cannot see Nokia or Microsoft motivated to drive a change or provide dual support for X86 until Windows 9. RIM is in a free-fall with no bottom in sight. Intel is forced to take the long-term approach as they are with Lenovo by developing smaller smartphone players to become larger ones. ZTE certainly is a good long term prospect as is Huawei. If Intel can leverage their PC franchise with them I could see them being successful.
Relevant, Differentiated, and Demonstrable Usage Models
In fighting any incumbent, the new entrant must provide something well above and beyond what the incumbent offers to incent a change in behavior. I am assuming that Intel won’t lead in low price or lowest development cost, so they must offer handset makers or the carriers a way to make more money or get consumers to demand an Intel-based smartphone. Regardless of which variable Intel wants to push, they must devise relevant, differentiated and demonstrable usage models that ARM cannot.
By relevant, I mean it must fix a known pain point or create a real "wow" feature consumers never asked for but that is so cool it cannot be passed up. One pain point example is battery life. Battery life is simply not good enough on smartphones under heavy daily use. If this weren't true, car chargers and battery packs wouldn't be so popular. Wireless display is useful and cool, but not differentiated, in that Apple can enable it via AirPlay. Demonstrable means that it must be demonstrable in the store, in an ad, or online. If something isn't demonstrable, it may as well not exist.
I would like to see Intel invest heavily in modularity, or the ability to best turn the smartphone into a PC through wireless display and wireless input. Yes, this is dangerous short-term in that if Intel does a great job at it then they could eat into their PC processor franchise. But, this is the innovator’s dilemma, and a leader must sacrifice something today to get something tomorrow. I could envision an Intel-based emerging region smartphone that enables PC functionality. ARM cannot offer this well today but will be able to in the future with their A15 and beyond-based silicon. Intel should jump on the modularity opportunity while it lasts.
One other opportunity here is for Intel to leverage their end-to-end experience from the X86-based Intel smartphone to the X86-based data center. If Intel can demonstrate something incredible in the end-to-end experience with something like security or a super-fast virtualized desktop, this could be incredibly impactful. One thing that will be with us for at least another 5 years is bandwidth limitation.
Outside of Apple, the carriers are the gatekeepers. Consumers must go through them to get the wireless plans, the phones, and most importantly, the wireless subsidy. Apple’s market entry strategy with AT&T on the iPhone was a strategic masterpiece in how to get into a market and change the rules over time. Apple drove so much consumer demand for iPhones that the carriers were begging Apple to carry the iPhone, the exact opposite of the previous decade.
Intel must get carriers excited about the new usage models, bring them a new stream of revenue they feel they are being cut out of, or lower their costs. Intel doesn't bring them revenue from the content side, but I can imagine Intel enabling telcos to get a piece of the classic retailer's PC action once "family plans" become a reality. While telco-distributed PCs weren't a big success in the past, this was due primarily to the absence of family data plans. I can also imagine Intel helping telcos lower the costs of their massive data centers with Xeon-based servers. Finally, if Intel could reduce traffic on the already oversold "wire" by shifting processing from the cloud onto its SOCs, that would be very good in a bandwidth-constrained environment.
Competitive Handset Power
At CES, Intel showed some very impressive battery life figures for Medfield handsets:
• 6 hours of HD video playback
• 5 hours of 3G browsing
• 45 hours of audio playback
• 8 hours of 3G talk time
• 14 days of standby
These were measured on Intel's own reference platform, which is only somewhat representative of how OEM handsets will perform. What will be very telling is how Medfield performs in a Tier 1 handset maker's device when Motorola launches in Q3 2012. There is no reason to think the Moto handset won't post similarly impressive battery life figures, but Intel could gain even more credibility by releasing those figures as they become available.
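As a back-of-the-envelope check, those rated figures can be combined into a mixed-use estimate by treating each one as a drain rate. The usage mix below is invented for illustration; only the rated hours come from Intel's list:

```python
# Back-of-the-envelope check on the Medfield battery figures above:
# treat each rated figure as a drain rate (fraction of a full charge
# per hour) and sum over a hypothetical day of mixed use. The usage
# mix is invented; only the rated hours come from Intel's numbers.

RATED_HOURS = {
    "hd_video": 6,
    "3g_browsing": 5,
    "audio": 45,
    "3g_talk": 8,
}

def battery_used(usage_hours: dict) -> float:
    """Fraction of a full charge consumed by the given mix of activities."""
    return sum(hours / RATED_HOURS[activity]
               for activity, hours in usage_hours.items())

# Hypothetical day: 1h video + 1h browsing + 2h music + 0.5h calls
day = {"hd_video": 1, "3g_browsing": 1, "audio": 2, "3g_talk": 0.5}
# battery_used(day) = 1/6 + 1/5 + 2/45 + 0.5/8 ≈ 0.47, about half a charge
```

If the figures hold up in shipping hardware, a fairly heavy mixed-use day still leaves roughly half the battery, which is exactly the pain point I argue Intel should attack.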
When Will We Know When/If Intel’s Smartphone Effort is a Success?
Intel has slowly but surely made inroads into the smartphone market. Medfield is impressive, but competing with and taking share from an incumbent with 99%+ market share is a daunting task. The easy answer is to measure Intel's progress by market share alone, but that's lazy. I believe Intel's smartphone efforts should first be measured by handset and carrier alliances, the number of handset wins, handset quality, and the new end usage models its SOCs and software can enable. Only as these efforts lead to share gains does it make sense to start measuring and scrutinizing share.
If you are in the high-tech industry and haven't heard the term "Ultrabook", you've probably been on sabbatical or living under a rock. Intel introduced an industry-wide initiative to re-think the Windows notebook PC, which it has dubbed and trademarked the "Ultrabook". Launched at Computex 2011, Ultrabooks are designed to be very thin and light, have good battery life, resume instantly from sleep, be more secure, and have good performance. If you want the details on what constitutes an Ultrabook, let me direct you to an article I wrote in Forbes yesterday. Does this sound a bit like a MacBook Air? That is what I thought about the entire category until Dell lent me its Ultrabook, the Dell XPS 13, for a few days. I have to say, I am very impressed and believe Dell has a winner here that could take some business from Apple. I don't make that statement lightly, as my family owns three MacBooks and I like them a lot.
Dell plays hard to get
When Ultrabooks were first introduced in July, Dell was somewhat silent on their intentions. Typically Dell is locked arm in arm with Intel many steps of the way. When they didn’t introduce an Ultrabook by the back to school selling season, “industry people” started to ask questions. When Dell didn’t release one by the holiday selling season, people were asking, “what’s wrong with the Ultrabook category”, or “what is Dell cooking up”?
I thought they were waiting for Intel's Ivy Bridge solution that was scheduled for earlier in the year. Whatever Dell was waiting for doesn't matter, because the company did nothing but impress at CES. During the keynote with Intel's Paul Otellini, Dell's vice chairman Jeff Clarke stormed on-stage with some serious Texas swagger. The video cameras at the CES event didn't do the Dell XPS 13 justice, as it's hard to "get" the ethos of any device on camera, but with Jeff Clarke and Paul Otellini on stage, you knew it was important to both companies. In my 20+ years as a PC OEM and technology provider to OEMs, I have found the only way to really "get" a product is to live with it as your primary device for a few days. And that's just what I did.
It’s apparent to me that Dell took their combined commercial and consumer experience and put it to good use. Rather than just follow Apple, HP or Lenovo, they put together what I would call the best of both worlds. The machined aluminum frame adds the brawn and high-brow feel, while the rubberized carbon-fiber composite base serves to keep the user’s lap cool and reduce weight. The rubberized palm rest provides a slip-proof environment that adds serious precision to keystrokes and trackpad gestures. It also provides a slip-proof mechanism for carrying the unit across the house, the office, or into a coffee shop. In a nutshell, Dell solved my complaints about my MacBook Air and made it look, feel and operate premium.
I give Dell and Intel credit for working together to make Windows 7 PCs almost "instant on". The XPS 13 turns on and off very quickly thanks to Intel Rapid Start and Dell's integration. I wasn't able to use Smart Connect, but when I get the XPS 13 for a few weeks I want to try it out. This feature intermittently pulls the XPS out of its sleep state to pull in emails and calendar updates. While this is as close as a PC will get to "always on, always connected" today, it is a decent proxy.
Ingredient Branding and Certifications
Historically, the typical Windows-based PC, with all its stickers, looks like a cross between a NASCAR race car and the back of a microwave oven. That doesn't exactly motivate anyone to shell out more than $599 for a Windows notebook. There are no visible stickers on the XPS 13; the only external evidence of Intel and Microsoft is a laser-etched silver plate on the bottom of the unit. Underneath the plate are all the things users usually ignore, like certifications.
Keyboard and Trackpad
I never quite understood how little evaluation time users spend on what ends up being one of the most important aspects of a notebook: the keyboard and trackpad. I already talked about the rubberized palm rest that gives the XPS 13 a stable palm base for the keyboard and trackpad; my palms slip all over the place on my MacBook Air. The XPS 13's keyboard is auto-backlit, and the keys have good travel and a firm touch. The trackpad feels like coated glass and supports all of the Windows 7 gestures. Clicking works either by physically clicking the trackpad down or by gently tapping it. It's the user's choice.
The display is 13.3″, very bright at 300 nits, with 1,366×768 resolution. It's an edge-to-edge display (or nearly), which allowed Dell to fit a 13.3″ display into roughly a 12″ chassis. I compared it to a MacBook Air, and it is in fact narrower with the same-size display. That is very impressive. I would have preferred a higher-resolution display, but I don't know that many users will make a huge deal out of this. The display is coated with Gorilla Glass, which gives some extra comfort that it will withstand my kids accidentally scratching it up.
Compared to some of the other Ultrabooks, I applaud Dell for removing some of the ports that I am certain primary research said were "must-haves": a VGA port, 5 USB ports, an Ethernet port. (Yawn.) Users get a DisplayPort, one USB 3.0 port, one powered USB 2.0 port, and a headphone jack. The only port I would have preferred is a mini or micro HDMI port; DisplayPort guarantees I will need to buy a cable or adapter I don't have. I can live without an SD card reader, but it sure would have been nice if they could have fit one inside.
I am still very skeptical of most battery life figures for any battery-powered product. One exception is the Apple iPhone and iPad, where Apple goes out of its way to provide as much detail as possible for different use cases. With that caveat, I do believe the Dell XPS 13 will post very respectable battery life figures versus other Ultrabooks and the Apple MacBook Air. Dell says the XPS 13 will achieve nearly 9 hours of battery life, well above Intel's target of between 5 and 8 hours.
One of the sexier features harkens back to the days of Dell batteries that had buttons to gauge how much power was left. Like the Dell batteries of yesteryear, press a small button on the side (not back) of the XPS 13 and it will light up circles to show how much battery you have left. That shows a dedication to useful innovation, not penny-pinching bad decisions made in dark meeting rooms. This is the kind of small thing that demonstrates the attention to detail at which, quite frankly, Apple has dominated so far.
Consumer and Commercial Applicability
Whenever I hear that one product serves two different markets I usually cringe and jump to the conclusion that it will be mediocre at both. I also take a very realistic approach to the "consumerization of IT," in that I believe we are a long way off from 50% of the world's enterprises giving their employees money to choose their own laptop. In the case of the Dell XPS 13, I believe it will provide a good value proposition to both target sets. Consumers are driven by style, aesthetics and perceived performance at a certain price point, while businesses are more interested in TCO, services, security, and custom configurability. The Dell XPS 13 provides all that. Dell may run into challenges with IT departments over the sealed battery and the lack of VGA and Ethernet ports, but then again a few IT departments would require serial ports if you let them spec out the machine completely.
Pricing and Specs
The Dell XPS 13 starts at $999 and includes an Intel Core i5 processor, Intel HD 3000 graphics, a 128GB SSD, 4GB of memory, USB 3.0, and Windows 7 Home Premium. For a similarly configured Apple MacBook Air, buyers would pay $1,299. With the Mac, you get OS X Lion, a somewhat higher-resolution display, Thunderbolt I/O, and an SD card slot. And yes, for the record, I know PCs don't primarily sell on specs, but specs are still a factor in the decision. If they weren't, Apple wouldn't provide any specs anywhere, right?
Possibly Taking Bites from the Apple
From everything I experienced with the Dell XPS 13 evaluation unit, I can safely say that Dell has a potential winner. Why do I say "potential"? First, I'm using an evaluation unit, not a factory unit with a factory image. If, as a user or sales associate, I start Windows and immediately get warning messages for virus protection, the firewall and third-party software, the coolness factor will be for naught and the first consumer impression will be bad. I hope this doesn't happen with the factory software load.
Many factors go into successfully selling a system and creating a lasting consumer bond. Great products must align with great marketing, distribution and support. Controlling the message at retail is key. If, and I mean "if," Dell can effectively pull its messages through retail and somewhat control merchandising, this will be a solid step in connecting the value proposition with the consumer. This is very hard, especially in the U.S., where Best Buy rules brick and mortar. What will the Best Buy yellow shirt say when someone asks, "What's the difference between the MacBook Air and the Dell XPS?" If the answer is "$300," that is a fail. Retail will be important for Dell, more important than direct, because industrial design doesn't translate well to the web. Seeing an image of the XPS 13 doesn't impress as much as holding it does, so retail cannot be minimized.
I see the XPS 13 doing well in business and enterprise, again, given aligned messaging, channel, sales training and support. IT departments now have a design that is every bit as cool as the MacBook Air and arguably more productive plus the added benefits of TPM and Dell’s customization and support.
Net-net I see potential consumer and business buyers of thin and very light notebooks looking at Apple’s MacBook Air and many choosing the Dell XPS 13 Ultrabook instead. This won’t just be based on price, but all other benefits I’ve outlined above. I also believe Apple’s MacBook Air sales will increase during 2012 but they would have sold more had it not been for Ultrabooks, especially the Dell XPS 13, the best Ultrabook I’ve used so far.
You can get more information on the Dell XPS 13 Ultrabook here on Dell’s website.
There has been a lot of industry skepticism since Intel predicted at Computex Taipei 2011 that Ultrabooks would account for 40% of consumer portable sales by the end of 2012. I was among the skeptics, and I remain one. Rather than dive into that discussion, though, I think it's more important and productive to examine how Intel could conceivably achieve that goal.
What Intel is Actually Predicting
It's important to understand what Intel meant when it made the prediction. First, the prediction is for the consumer market, not the slower-moving SMB, government, or enterprise markets. Also, it is not for the entire year; it is for the end of December 2012. That is, 40% of consumer notebooks sold at the end of December 2012 would need to be Ultrabooks. This makes a huge difference when evaluating the probability of it actually occurring.
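The arithmetic behind that distinction is worth making concrete. The sketch below uses purely hypothetical monthly share numbers (none of them come from Intel) to show how a 40% December exit rate can coexist with a much lower full-year share:

```python
# Hypothetical illustration: Ultrabook share of consumer notebook
# sales ramping linearly from 2% in January to 40% in December.
# The December "exit rate" hits Intel's target, yet the full-year
# average share is roughly half that.

months = 12
start_share, end_share = 0.02, 0.40

# Linearly interpolated share for each month of 2012
shares = [start_share + (end_share - start_share) * m / (months - 1)
          for m in range(months)]

december_share = shares[-1]
full_year_share = sum(shares) / months

print(f"December exit share:     {december_share:.0%}")
print(f"Full-year average share: {full_year_share:.0%}")
```

Under this assumed ramp, Ultrabooks exit December at 40% but average about 21% for the year, which is why the exit-rate reading of the prediction is far less aggressive than a full-year 40% would be.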
So what would it take for 40% of all consumer notebook sales to be Ultrabooks by the end of 2012?
Make Ultrabooks Look New, Relevant, and Sexy
Intel and its ecosystem need to make Ultrabooks be perceived as new, relevant and sexy. By relevant I mean making the direct connection between what the Ultrabook delivers and what the consumer thinks they need. Sexy is, well, sexy, like the MacBook Air. The ecosystem must make a connection on:
- Thin and light – this is the easier one, because Apple has blazed the trail and it is evident on the retail shelf.
- Fast startup – this is somewhat straightforward, and a well-communicated consumer pain point with Windows today.
- Secure – this is the most difficult, in that it is always hard to market a negative. It's like life insurance: it sounds good, people say it's important, then they don't buy it. I think Intel would be much more successful taking the same base technology and enabling exclusive consumer content, or speeding up the online checkout or login process.
- Performance – this is difficult to market in that performance no longer has a comparable metric, and chip makers appear to have stopped marketing why it is even important.
- Convertibles – I am a big fan of future convertibles, given the right design and OS. If OEMs can put together a classy, ~18mm design, it could very well motivate consumers to delay a tablet purchase. This will not work prior to Windows 8's arrival, though, because you really need Metro for good touch.
Probably the biggest impediment here is the "sexy" piece. Sexy is the "X" factor. It's cool to have an Apple MacBook Air. It isn't cool yet to have an Ultrabook. A lot of that $300M Ultrabook investment fund will have to pay for positioning Ultrabooks and re-positioning anything Windows. This is a tough task, to say the least.
Steal Some Apple MacBook Air Market Share
Intel and its ecosystem, to hit the 40% target, will need to steal some of Apple's market share. There is no way around this unless they want to pull the dreaded "price lever." Apple "owns" 90+% of the premium notebook market today, and because Windows OEMs, and Intel for that matter, aren't motivated to trash pricing now, they will need to steal some of Apple's share. This will be a tough one, a real tough one, particularly in that Intel shoots itself in the foot short-term by going aggressively after Apple, given its chips are inside every MacBook Air. So OEMs will need to take this one on their own, using Intel marketing funds as a weapon. It will be especially difficult given that Apple's positioning isn't going to be erased by anything short-term, and Windows OEMs haven't been able to penetrate this segment for years. Remember the Dell Adamo? Sexy Windows 8 convertible designs could very well be the magic pill that helps steal share from Apple.
Lower Price Points
This is the last lever anyone wants to pull, as it destroys positioning. Depending on which data service you look at, the average consumer notebook ASP (average selling price) is between $600 and $700. This seems high, I know, when you look at what is being sold at local retailers, but remember that this includes online sales and Apple, which has a higher ASP. Ultrabooks range from around $799 to $1,299, excluding Apple. This is well above the prices they would need to hit to achieve the 40% goal. There are two ways to lower price: lower the cost or lower margins. I believe you will see a little bit of both.
As volumes increase, there will be immediate cost savings in expensive mechanicals like aluminum, plastic, and composites. The custom cooling solutions required to cool thin chassis between 16 and 21mm thick are very expensive. Tooling and design costs can be amortized over greater volumes to decrease the cost per unit. Intel's Ivy Bridge, available in April 2012, will provide a shrink from 32nm to 22nm, which would theoretically allow a lower price point at the same performance point, although I am sure Intel isn't leading with that promise. Intel would much rather provide large marketing subsidies and pay NRE (non-recurring engineering) costs to retailers and OEMs to design and promote the Ultrabook category. SSD pricing is a tricky one to predict given spinning hard drive supply issues: spinning hard drive price increases allow SSD makers to raise prices, which doesn't bode well for Ultrabook BOM costs in the short term.
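To make the cost-versus-margin tradeoff concrete, here is a back-of-the-envelope sketch. Every number in it (BOM cost, OEM margin, channel margin) is an assumption I've made up for illustration, not real supply-chain data:

```python
# Hypothetical price-stack sketch: street price as a function of
# BOM cost and the margins stacked on top of it. All figures are
# illustrative assumptions, not actual OEM or retail numbers.

def street_price(bom_cost, oem_margin, channel_margin):
    """Price after the OEM and the retail channel each take a margin."""
    return bom_cost / ((1 - oem_margin) * (1 - channel_margin))

# Assumed starting point: $700 BOM, 15% OEM margin, 18% channel margin
base = street_price(700, 0.15, 0.18)         # ~ $1,004

# Lever 1: cut the BOM (cheaper mechanicals, amortized tooling, NRE help)
cheaper_bom = street_price(560, 0.15, 0.18)  # ~ $803

# Lever 2: squeeze margins as well
both_levers = street_price(560, 0.10, 0.15)  # ~ $732

for label, price in [("Base", base), ("Lower BOM", cheaper_bom),
                     ("Lower BOM + margins", both_levers)]:
    print(f"{label}: ${price:,.0f}")
```

The point of the sketch: under these assumptions, a 20% BOM cut alone doesn't get a $999-class machine to the $699 neighborhood; margins have to give as well, which is exactly why this is the lever nobody wants to pull.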
Leverage Windows 8 Effect
The expected Windows 8 launch for the 2012 holiday season could help the Ultrabook cause on many fronts. First, it may give consumers a reason to consider buying a new notebook. I fully expect consumers to delay purchases and wait for Windows 8 to arrive. This could create a bubble in Q4 that, again, helps achieve the 40% goal.
Finally, Ultrabooks need to get off to a solid start in 2012. Consumer influencers and the rest of the ecosystem need to perceive Ultrabooks as a success in 1H 2012 in order to "double down" for 2H 2012. CES will be one tactic here, where I expect to see hundreds of designs on display to demonstrate OEM acceptance to the press, analysts, and retail partners. Intel's Ivy Bridge will give another boost in April, followed by the Windows 8 launch. Retailers cannot be stuck with excess inventory and cannot make drastic price cuts that would only de-position the category. Currently there is skepticism about the entire Ultrabook value proposition and the price points Ultrabooks can command, so there is a lot of work to be done.
Will Ultrabooks Achieve the 40% Target by End of 2012?
While this analysis is about what it would take to achieve the goal, I must weigh in on what I think will happen. I like to bucket these kinds of things into "possible" and "probable." I believe that if the Ultrabook ecosystem could accomplish everything outlined above, Ultrabooks could hit 40% of consumer notebook sales by the end of 2012. So it is possible, BUT I don't see it as probable, primarily because of the low price points that would need to be hit. There just isn't enough time to reposition the Windows notebook as premium and either raise the category's price points or steal Apple's market share.
AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.
There's an awful lot of misguided analysis wafting about regarding AMD's new strategic direction, which the company says it will make public in February. This piece is to help you (and me) sort through the facts and the opportunities. I last took a look at AMD's strategies earlier this year, available here.
Starting With the Facts
- AMD has been a fabless semiconductor company since 2009. The company depends on GlobalFoundries, and soon Taiwan Semiconductor, to actually fabricate its chips;
- In its latest quarter, AMD had net income of about $100 million on $1.7 billion in revenue. Subsequently, the company announced a restructuring that seeks to cut costs by $118 million in 2012, largely through a reduction in force of about ten percent;
- AMD has about a 20% market share in the PC market, which Intel says is growing north of 20% this year, largely in emerging markets;
- AMD's products compete most successfully against rival Intel in the low- to mid-range PC categories, but its 2011 PC processors have underwhelmed reviewers, especially in performance against comparable Intel products;
- AMD has less than a 10% market share in the server market of about 250,000 units, which grew 7.6% last quarter according to Gartner Group;
- AMD’s graphics division competes with nVidia in the discrete graphics chip business, which is growing in profitable commercial applications like high-performance supercomputing and declining in the core PC business as Intel’s integrated graphics is now “good enough” for mainstream buyers;
- AMD has no significant expertise in phone and tablet chip design, especially the multi-function systems-on-a-chip (SoCs) that make up all of today's hot sellers.
What Will AMD CEO Rory Read’s Strategy Be?
I have no insider information and no crystal ball. But my eyebrows were raised in perplexity this morning by headlines such as "AMD to give up competing with Intel on X86," which led to "AMD struggling to reinvent itself" in the hometown Mercury News. I will stipulate that AMD is indeed struggling to reinvent itself, as the public process has taken most of 2011. The board of directors itself seems unclear on direction. That said, here is my scorecard on reinvention opportunities, in descending order of attractiveness:
- Servers — For not much more work than a desktop high-end Bulldozer microprocessor, AMD makes Opteron 6100 server processors: hundreds or thousands of dollars more revenue per chip at correspondingly higher margins. AMD has a tiny market share, but keeps a foot in the door at the major server OEMs. The company has been late and has underdelivered to its OEMs recently. But the problem is execution, not computer science.
- Desktop and Notebook PCs — AMD is in this market and the volumes are huge. AMD needs volume to amortize its R&D and fab preparation costs for each generation of products. Twenty percent of a 400 million chip 2011 market is 80 million units! While faster, more competitive chips would help gain market share from Intel, AMD has to execute profitably in the PC space to survive. I see no role for AMD that does not include PCs — unless we are talking about a much smaller, specialized AMD.
- Graphics Processors (GPUs) — ATI products are neck-and-neck with nVidia in the discrete graphics card space. But nVidia has done a great job of late creating a high-performance computing market that consumes tens of thousands of commercial-grade (i.e., high-priced) graphics cards. Intel is about to jump into the HPC space with Knights Corner, a many-X86-core chip. Meanwhile, AMD needs the graphics talent onboard to drive innovation in its Fusion processors, which marry a processor and graphics on one chip. So I don't see an AMD without a graphics component, but I don't see huge profit pools either.
- Getting Out of the X86 Business — If you’re reading along and thinking you might short AMD stock, this is the reason not to: the only legally sanctioned software-compatible competition to X86 inventor Intel. If AMD decides to get out of making X86 chips, it better have a sound strategy in mind and the ability to execute. But be assured that the investment bankers and hedge funds would be flailing elbows to buy the piece of AMD that allows them to mint, er, process X86 chips. So, I describe this option as “sell off the family jewels”, and am not enthralled with the prospects for success in using those funds to generate $6.8 billion in profitable revenue or better to replace today’s X86 business.
- Entering the ARM Smartphone and Tablet Market — A sure path to Chapter 11. Remember, AMD no longer makes the chips it designs, so it lacks any fab margin to use elsewhere in the business. It starts against well-experienced ARM processor designers including Apple, Qualcomm, Samsung, and TI … and even nVidia. Most ARM licensees take an off-the-shelf design from ARM that is tweaked and married to input-output to create an SOC design, which then competes for space at one of the handful of global fab companies. AMD has absolutely no special sauce to win in the ARM SOC kitchen. To win, AMD would have to execute flawlessly in its maiden start (see execution problems above), gain credibility, nail down 100+ design wins for its second generation, and outrace the largest and most experienced companies in the digital consumer products arena. Oh, and don't forget volume, profitability, and especially cash flow. It can't be done. Or if it can be done, the risks are at heart-attack levels.
“AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.” One way to read that ambiguous sentence by AMD is a strategy that includes:
- Tablets and netbooks running X86 Windows 8;
- Emerging geographic markets, chasing Intel for the next billion Internet users in places like Brazil, China, and even Africa. Here, AMD’s traditional value play resonates;
- Internet-based businesses such as lots of profitable servers in the cloud. Tier 4 datacenters for Amazon, Apple, Facebook, Google, and Microsoft are a small but off-the-charts growing market.
So, let’s get together in February and see how the strategy chips fall. Or post a comment on your game plan for AMD.
Intel has long followed a two-year product cadence it calls tick-tock. In a "tick" year, Intel shrinks its existing chip designs to a new process technology; in the following "tock" year, it introduces a new microarchitecture on that now-mature process, as with this year's release of the Sandy Bridge processors.
This pattern is driven both by the pace of technology innovation and the realities of manufacturing. Semiconductor technology evolves fast, but not so fast that major disruptive change is required every year. And a two-year cycle gives Intel the time it needs to perfect fabrication and reap the benefits of its investment in process change.
It looks like Apple is falling into a similar alternating pattern with the iPhone. The iPhone 4S, announced Oct. 4, is a refinement of last year's all-new iPhone 4, just as the iPhone 3GS of 2009 refined the redesigned iPhone 3G of 2008.
There are still major changes in the 4S hardware, most notably the move to the A5 processor, the new camera system, and the use of a dual-mode GSM/CDMA radio. But the basic design is unchanged, allowing the new models to be slipstreamed smoothly into Apple's (or Foxconn's) production process.
A change in the industrial design of a handset may not be as disruptive as a new semiconductor process technology, but it never happens without difficulty. Apple had problems ramping up production of the iPhone 4, manufacturing difficulties delayed the white version by many months, and then there were the notorious problems with the antenna.
Keeping the basic design the same gives Apple more time to perfect both the design and the manufacturing processes for what will almost certainly be next year's all-new model, the iPhone 5, while maintaining smooth, high-volume production of the 4S.
Android is very popular: it has made great inroads in smart phones (with more than 50% market share) and is beginning to pick up traction in tablets as well, with a plethora of new devices due out shortly. But Android itself has not always been that good a performer, and some of the software choices Google has made while developing the various versions have been troublesome.
It is clear Android could use some assistance in optimizing the code and user experience (one of the primary reasons Google is buying Motorola is for its engineering talent, which has had a major positive impact on the design and tuning of Android). Google needs help improving future versions of Android, and it has a broader vision for Android than today's phones and tablets.
Although not well understood, Intel is one of the largest software companies in the world (it employs many thousands of software engineers). It has a unique ability to make software, and particularly OSes, run extremely well, and it has been doing so for many years, and not just with Windows. It is a leading provider of development and compiler technology. While Intel won't necessarily help Android run better on ARM, it can certainly make Android run great on the Intel architecture. It is already well down this path with the Android code porting and optimization work it has been engaged in for some time.
But Google has greater ambitions for Android than powering current mobile devices. Google ultimately wants to be a leading OS provider across the board and on many form factors, including on the x86 platform powering PC and PC-like devices, and competing with Microsoft and Apple. This is an extension of Google’s “service in the cloud” strategy with clients powered by Android and Chrome and productivity apps being “optimized” for its own environment.
So the relationship between Google and Intel is key to both their long-term strategies. It's a win-win relationship if done right. It's quite conceivable that by the time Intel is through optimizing Android code, it will run substantially better on Intel's chips than on ARM. But any help Intel provides Google with Android reliability and performance optimization on x86 will most likely also help it run on ARM, since the efforts will be repurposed.
The bottom line is that both companies benefit a great deal from a close relationship. Intel gets to show off its upcoming mobile form-factor devices running a version of Android highly optimized for its chips. Google gets a path to higher-end systems and optimized code to access its services. And users get choice and a more compelling experience. So there really are no "junior partners" in this relationship; both have much to gain.
This week two industry heavyweights will hold conferences around their greatest assets. Microsoft will hold its Build conference, where it will highlight and showcase Windows 8, its next major OS release. Intel will hold its annual Developer Forum, which is designed to promote and encourage new innovation in hardware and software for Intel's X86 CPU architecture.
I, along with many in the industry, will be closely tuned in to these events this week as we look for Microsoft and Intel to show us their vision of the future of the PC and post-PC landscape.
Both Microsoft and Intel are key players and heavy influencers in the technology industry. These events are important for them to demonstrate to the world, and more importantly to the key players in their ecosystem, their value.
With Microsoft, I anticipate much of Build to be about Windows 8 on platforms other than PCs. I expect, and hope, they show how Windows 8 will add value for the hardware manufacturers who have set their eyes on smart phones and tablets. We already know that Windows 8 will inevitably ship on new PCs going forward. If you make PCs for a living, and are not Apple, you have no choice but to use whatever Microsoft builds for you. I am more interested in what Microsoft has to offer in the areas where it is not the only OS in town.
I also expect the Windows on ARM initiative to be highlighted and emphasized. I believe the Windows on ARM campaign is one of Microsoft's more important efforts if it wants to see Windows reach more devices like tablets and smart phones. Those devices do not run Intel's X86 architecture, but rather an ARM-based architecture. If Microsoft can gain momentum getting software developers to use its tools to develop for Windows on ARM, then it has a clear path to bring Windows software to new devices running the ARM architecture.
Intel, on the other hand, sees the trend toward post-PC devices and has to be worried, because that future right now does not include Intel. Intel is aggressively working on bringing its Atom processors to smaller devices to compete with ARM; the only problem is that right now there is no competition. Manufacturers looking to bring tablets and smart phones to the mass market are not even considering Intel at this point in time.
At IDF this week, I am betting Intel will focus heavily on mobility and Smart TV. Within mobility I expect them to push their Ultrabook initiative hard, showing off a range of new devices and PC prototypes to showcase the kind of devices they want to see hit the market.
(Related: Will Ultrabooks Make PCs Interesting Again)
I would not be surprised if Intel also shows off a tablet or two and perhaps even some early smart phone hardware running Intel silicon.
The bottom line is that strong showings at Build and IDF are key for both Microsoft and Intel. They are both central players in helping drive innovation in the technology industry.
We will see what kind of vision both of them provide of the future.
No wonder AMD was upset enough over BAPco's SYSmark 2012 benchmark to drop out of the non-profit benchmarking organization in June with much Sturm und Drang.
My testing of the AMD Fusion high-end “Llano” processor, the A8-3850 APU, shows an overall rating on SYSmark 2012 of 91. Except for the 3D component of the benchmark, the Intel “Sandy Bridge” Pentium 840 scores higher in individual components — and higher overall — with a score of 98, according to the official SYSmark 2012 web site.
The SYSmark 2012 reference platform scores 100. That puts high-end Llano desktop performance at 91% of a 2010 Intel "Clarkdale" first-generation Core i3-540, a low-end mainstream processor.
Moreover, the Intel "Sandy Bridge" Core i3-2120 dual-core processor with integrated graphics costs within a dollar of the "Llano" A8-3850 but delivers a 36-point higher score, noticeably snappier performance in my actual use experience (see chart below).
I also tested AMD's Phenom II 1100T, a top-end AMD six-core processor with an ATI Radeon HD 4290 graphics card, against an Intel "Sandy Bridge" second-generation Core i5-2500 with integrated graphics. The Core i5-2500 is the superior processor on this benchmark; the much-maligned Intel internal graphics barely loses to the ATI 4290 discrete graphics card in the 3D component, while delivering a 44-point overall advantage. The results are shown below in Chart 1.
Chart 1: SYSmark 2012 scores (overall rating, followed by the six component scenario scores)
| AMD Phenom II 1100T | 122 | 109 | 116 | 122 | 183 | 108 | 110 |
| Intel Pentium 840 | 98 | 100 | 102 | 106 | 87 | 90 | 107 |
| Intel Pentium G620T | 79 | 81 | 81 | 88 | 70 | 71 | 86 |
Source: Peter S. Kastner and Business Applications Performance Corporation
Is SYSmark 2012 Relevant?
SYSmark 2012 is relevant because it allows evaluators to test specific PC configurations against actual, commonly used business applications.
AMD says “AMD will only endorse benchmarks based on real-world computing models and software applications, and which provide useful and relevant information. AMD believes benchmarks should be constructed to provide unbiased results and be transparent to customers making decisions based on those results.” Let’s look at what SYSmark does and how it does it.
Serious readers will study the SYSmark 2012 Overview published at the BAPco web site. This benchmark version is built on 20 years of collaborative experience by BAPco in modeling business workloads into application scenarios and corresponding benchmarks, through a 26-phase process that takes years to complete. The last version was SYSmark 2007, under Windows Vista. SYSmark is real-world in that it incorporates widely used applications such as Office, AutoCAD, Acrobat, Flash, Photoshop and Internet Explorer under Windows 7 in component scenarios.
SYSmark is widely used around the globe in business and public tenders to select PCs without bias towards vendor and processor manufacturer. SYSmark is the only generally accepted benchmark for general business computers since it uses actual application code in the tests, not synthetic models.
The benchmark is intensive, reflecting workload snapshots of what power users actually do, rather than light-duty office workers. There are six scenario components to SYSmark 2012, each of which counts equally in the final rating:
Office Productivity: The Office Productivity scenario models productivity usage including word processing, spreadsheet data manipulation, email creation/management and web browsing.
Media Creation: The Media Creation scenario models using digital photos and digital video to create, preview, and render a video advertisement for a fictional business.
Web Development: The Web Development scenario models the creation of a website for a fictional company.
Data/Financial Analysis: The Data/Financial Analysis scenario creates financial models to review, evaluate and forecast business expenses. In addition, the performance and viability of financial investments is analyzed using past and projected performance data.
3D Modeling: The 3D Modeling scenario focuses on creating, rendering, and previewing 3D objects and/or environments suitable for use in still imagery. The creation of 3D architectural models/landscapes and rendering of 2D images and video of models are also included.
System Management: The System Management scenario models the creation of data backup sets and the compression and decompression of various file types. Updates to installed software are also performed.
For each of the six components, BAPco develops a workflow scenario. Only then are applications chosen to do the work. BAPco licenses the actual application source code and assembles it into application fragments together with its workflow measurement framework. The data/financial analysis component, for example, runs a large Microsoft Excel spreadsheet model.
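A note on how the six equally weighted components appear to roll up: the published numbers are consistent with the overall rating being the geometric mean of the six scenario scores. I'm inferring that from the scores themselves, so treat the sketch below as a sanity check rather than BAPco's official formula:

```python
# Sanity check: the overall SYSmark 2012 rating appears to be the
# geometric mean of the six scenario scores, rounded to an integer.
from math import prod

def overall_rating(scenario_scores):
    """Geometric mean of the scenario ratings, rounded."""
    n = len(scenario_scores)
    return round(prod(scenario_scores) ** (1.0 / n))

# Scenario scores as published in Chart 1
print(overall_rating([109, 116, 122, 183, 108, 110]))  # Phenom II 1100T -> 122
print(overall_rating([100, 102, 106, 87, 90, 107]))    # Pentium 840     -> 98
print(overall_rating([81, 81, 88, 70, 71, 86]))        # Pentium G620T   -> 79
```

A geometric mean penalizes a weak scenario more than a simple average would, which is why the Phenom's 183 in a single component does not pull its overall rating above 122 (a straight average of its components would be about 125).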
What I don’t like is the “2012” moniker. This SYSmark version is built on business application components as of 2010. By naming it SYSmark 2012, BAPco implies the benchmark is forward looking, when it actually looks back to 2010 application versions. The labeling should be 2010. In spite of the labeling, SYSmark 2012 is unique as a cross-platform benchmark for stressing business desktops using real-world applications in job-related scenarios.
Analysis and Conclusions
The SYSmark 2012 reference-point PC is a Core i3-540 and has a 100 point score. When I used this processor with Windows 7 last year as my “daily-driver PC” for a month, I was underwhelmed by its overall feel. Subjective comment, yes, but my point is that the reference machine is no speed demon.
The new AMD “Llano” A8-3850, a quad-core processor with integrated graphics, is adequate for light-weight office duties as measured by BAPco SYSmark 2012. The top-of-the-line AMD Phenom II 1100T with a discrete graphics card is better suited for mainstream task-specific business computing than the “Llano” processors.
Intel's low-end dual-core "Sandy Bridge" Pentium G620T and 840 bracket the "Llano" A8-3850 in processor performance, while lagging in the graphics-intensive 3D benchmark component.
Intel’s entry-level Core i3-2120 with integrated graphics handily beats the top-of-the-line Phenom II 1100T with a discrete graphics card in all but graphics-intensive 3D benchmarks, making it an attractive price-performer. The high-end Core i5-2500 tops the top-of-the line Phenom II 1100T with a 44 point overall advantage, despite using integrated graphics.
SYSmark’s results do not plow new performance ground. An Internet search will quickly turn up numerous reviews that conclude, using different sets of benchmarks, that the “Llano” line is weak as a processing engine and pretty good at graphics, especially 3D consumer games. Yet consumer games are typically not high on the business PC evaluation checklist.
Many of the SYSmark 2012 applications use graphics-processor acceleration, when available, including Adobe Photoshop, Flash, Premiere Pro CS5, Autodesk 3ds Max and AutoCAD, and Microsoft Excel. SYSmark 2012 convinces me that today’s integrated graphics are plenty good enough for business PCs shy of dedicated workstations. But a strong processor is still necessary for good overall performance.
Business desktops ought to be replaced every three to four years. However, the reality is that many businesses keep desktops for five or more years, and many have instituted a “replace it when it breaks” cycle. Productivity studies show that knowledge workers deserve the higher end of today’s performance curve in a new PC, so that the machine is not thoroughly obsolete, and its user less productive, before it is replaced.
No single benchmark should be the sole criterion for selecting a computer, and SYSmark 2012 is no exception. However, I disagree with AMD that SYSmark is no longer worthy of consideration, and with other analysts who say SYSmark is dead because AMD walked away from BAPco.
The bottom line for PC evaluators is simple: if you believe that the extensive work by the BAPco consortium across two decades stands up to scientific and peer scrutiny, then the SYSmark results discussed above show AMD at a serious performance disadvantage. If you don’t think SYSmark is a relevant benchmark for business PCs, then neither AMD nor I have a viable substitute.
The next shoe to drop is AMD’s high-end “Bulldozer” processor, expected in the next 60 days.
You might think that this is a trick question. On the surface, the answer should be the iPad and its ecosystem. But the iPad is a new category, and while it is true that the PC vendors fear Apple’s potential to own that market and make it hard for them to create competitive products, this is not the product they fear the most.
The product they fear the most is Apple’s MacBook Air. When Apple first introduced the MacBook Air, a lot of the PC vendors thought it was a gimmick. While it was very thin and light, it was very underpowered, and well over $1000. The PC vendors’ thin-and-lights (their definition, not mine) had broken $1000, and PCs under $700 were dominating the overall market for laptops. This first-generation MacBook Air had no impact on their laptop market at all.
The only company to take this Apple move at all seriously was Dell, which created the Adamo XPS, supposedly its version of the MacBook Air. But while it was relatively thin compared to all of the other “thin and light” laptops on the market, it was also priced so high that people stayed away in droves. At least for the short term, Apple’s MacBook Air was considered the thinnest and lightest laptop, albeit slightly underpowered and carrying Apple’s upper-end pricing.
In the meantime, the demand for cheap PCs started to take off. In fact, a new category of thin and lights called netbooks was all the rage for about two years. And while Steve Jobs considered netbooks toys, he watched their growth with interest. While he publicly said Apple would never make a netbook, it was pretty clear that Jobs and company had decided to make the next MacBook Air lighter and thinner than a netbook yet as powerful as most mid- to high-end laptops. And while the starting model is $999, the proprietary unibody casing and integrated graphics chips still make these the most powerful ultralights on the market today.
But when Apple also decided to kill the MacBook, its entry-level laptop, and bring to market only MacBook Airs at prices close to those older entry-level models, the PC vendors quickly sat up and took note. To them it signaled that Apple is getting ready to start a full-out assault on what has been sacred territory for them. Sure, they can still create laptops under $500 and sell them all day long. But they also realized that Apple is now setting the bar for laptops at a new level, using the MacBook Air to help define the next generation of laptops, and they know that with Apple’s buying power and international reach, Apple could price these machines even more aggressively in the very near future.
The PC industry itself had somewhat anticipated this and is working on creating what it calls Ultrabooks, Windows-based systems that are much like the MacBook Air. But the one on the market today that comes closest to the MacBook Air is the Samsung 900 3X, which is priced at about €1,600 in Europe and well over $1800 in the US. Apple’s comparative model is $1599. Although the Samsung 900 3X is a solid product, Apple’s lead in these types of “ultrabooks,” along with its stores, will help it sell even more of them in the future. In fact, on the last earnings call, Apple said it sold about 4 million computers in the last quarter and that 73% were laptops. And we believe that 75% of those were MacBook Airs.
The MacBook Air’s pricing, and Apple’s apparent commitment to be even more competitive with the mainstream PC vendors with this model, signal to me that Apple really wants more of the hallowed ground that traditional PC vendors tread today. And it looks like Apple is about to crank up its laptop supply chain prowess, industrial design skills, and marketing and retail emphasis, and go right at the heart of these PC vendors’ most profitable laptop segment.
Oh yeah, and Apple will soon have its iCloud offering, which will bring its ecosystem in sync with its laptops and desktops as well, another value-added piece of technology that I am sure will strike a chord with users. And given the possible halo effect of the new iPhone 5 when it comes out, as well as the iPad and iCloud, I am certain that Apple will drive even more people into its stores and put an even greater effort into selling MacBook Airs and MacBook Pros in the future.
Yes, the iPad is a real concern for the PC vendors, as Apple has a huge lead in tablets and strong demand. But if Apple starts eating into their laptop market share, that will have the greatest impact on these PC vendors in the future and make it even harder for them to earn strong profits on their laptop business.