The Death of the PC Business


Hewlett-Packard’s decision to split into two companies, one selling PCs and printers, the other offering services and high-end business network equipment, marks the end of its 35-year run in the computer industry. In a sense, it’s an overdue handoff of the business to companies designed to function on low margins, most of them Chinese. But the history is worth considering. (Apple, of course, is another business that flourishes on computers, for its own reasons.)

In many ways, IBM shapes the story. The personal computer moved from being a consumer system, such as the Apple ][, into a business-dominated market. The key was the introduction of the Model 5150, better known as the IBM Personal Computer, in 1981. But IBM broke with its own tradition of controlling hardware. The PC was based on freely available components, most importantly the Intel 8088 processor, and IBM allowed Microsoft to make its operating system available to all comers. The market swelled with entrants ranging from big tech companies, such as Digital Equipment and Texas Instruments, to arguably the most important startup, Compaq, founded in 1982.

IBM tried to strike back at the surging market share of “IBM PC clones.” In 1987, it came out with a redesign called the Personal System/2, which required the use of IBM-designed components, such as graphics controllers, in place of industry-standard designs. The only successful component of the PS/2 design was an improved round plug for connecting keyboards and mice. At about the same time, IBM issued the OS/2 operating system to compete with MS-DOS and Windows. It was another dismal flop, although IBM wouldn’t give up, issuing a version called Warp aimed at businesses and consumers in 1994.

In the end, IBM’s last real hit was the first truly successful laptop, the ThinkPad, introduced in 1992. But even it could not hold off the competition from Compaq, HP, and a plethora of other laptop makers.

By the end of the 1990s, IBM was in serious trouble in just about all its businesses (more trouble, in fact, than HP has ever been in). As the last stage of a long effort to restructure the company, it sold its PC business to Lenovo, which went on to become a very successful computer company as a global, low-margin leader. Earlier this year, IBM finished off the last bit of its low-end hardware business by selling its System x servers to Lenovo.

Needless to say, HP didn’t follow IBM’s example. In 1999, it spun off its original instrument business (now Agilent Technologies) to focus on the printer and computer businesses, and in 2002 it bought Compaq, a dominant if not very profitable player in the computer industry. The deal also generated internal management and shareholder struggles that have never completely gone away, though CEO Meg Whitman seems to have calmed the chaos.

In a sense, the decision to get out of the PC and printer business has long looked obvious. The company’s last dramatic computer move was the catastrophic 2010 purchase of Palm. HP’s big acquisitions since have been EDS, a consulting firm; 3Com, with network equipment and services; and Autonomy, a controversial big-data software maker.

In the new world, Hewlett-Packard Enterprise, with a focus on corporate technology needs and competition with companies such as IBM and Accenture, will be run by Whitman and many of her top executives. Dion J. Weisler, executive vice president in charge of the PC and printer division, will be CEO of HP, Inc. The split should close sometime late next year.

The future of HP, Inc. is unclear. It could be like Dell, a U.S. seller of Chinese computers pursuing a low end market. Or it could end up being bought by a Chinese manufacturer — speculation having even included Lenovo. Either way, the HP we know is gone.

Published by

Steve Wildstrom

Steve Wildstrom is a veteran technology reporter, writer, and analyst based in the Washington, D.C., area. He created and wrote BusinessWeek’s Technology & You column for 15 years. Since leaving BusinessWeek in the fall of 2009, he has written his own blog, Wildstrom on Tech, and has contributed to corporate blogs, including those of Cisco and AMD. He also consults for major technology companies.

35 thoughts on “The Death of the PC Business”

  1. So glad you’re back, Steve! This is the kind of thing that made me avidly read everything you wrote for 10 years, going back to when you wrote for BusinessWeek.

    The only item I would change is your last sentence. I would say the HP built by Bill Hewlett and Dave Packard has vanished without a trace. However, they might like Agilent.

  2. Excellent stroll down memory lane. Yes, I remember the outrage against the PS/2 (Micro Channel architecture) and the original OS/2. The PS/2 hardware was fine, but proprietary. OS/2 was a joke (helped along by a little MS backstabbing). Tandy, Compaq, and others also tried their proprietary lock-downs. They too were slapped for it.

    The most important thing, however, were the times. Open systems gave users (okay buyers, owners) the control. They were not going to put up with technologies that pigeonholed them into anything. You may consider DOS/Windows as pigeonholing, but together with the hardware architecture it was a vast foundation under which to innovate.

    1. I agree. I think this was the case because a large portion of the market/users (not necessarily just the owners, such as in the corporate IT environment) were coders (or at least in some way expected to manage code), too. Even then, especially with the PC, we aren’t just talking users, also the ones who decided what to buy, even if it was someone else’s money.

      Not many even cared for a mouse. It wasn’t the intrinsic nature of open systems that allowed innovation. It was a common system for the people who were at the helm of the largest base of decision makers, professional and enthusiast. It offered the least amount of friction in their workflow (including with support vendors).

      GUI and the mouse were an affront to their workflow and a work-around of their sphere of influence. That’s why they constantly disparaged a mouse-and-GUI PC as a “toy”. “Real work” was done with a keyboard and command line. If you needed to navigate the screen, well, that’s what arrow keys were for. In this regard, the systems were culturally closed. The system was not meant for everyone. It was open only to the “enlightened”. It was very esoteric. That is job security.

      But that is not today’s environment. That is what is causing the retreat of longtime PC vendors. In a way, it makes me hopeful for the future of the PC. What is dying off is likely what needs to die off. Old solutions are no answer for new problems. That is what kind of makes it ironic, right? Technology is upending decades-old, even century-old industries—news, music, entertainment, retail, etc. Now technology is even upending the tech industry.


      1. There is a lot of merit to your points. But it’s precisely “open”, non-curated systems that “allowed” the innovation.

        It’s well known from lore that the war cry at MS was “DOS isn’t done ’till Lotus won’t run”. A despicable position. Had they not been forced (by expectation) to be open, and curated instead, they could have just said “not allowed”.

        There were several windowing systems before Windows. Besides MacOS, there was GEM, Desqview, Topview, and others. No one prevented them from developing and marketing their products. You might also remember a Quarterdeck product QEMM386, that flat out replaced DOS’s memory manager (talk about “reproducing functionality”). I would venture to say that none of this would be allowed on “some” systems today.

        The move to Mice/GUI was user choice. Up until Windows 95 (and even then), Windows itself was a shell that ran on top of DOS that the user chose to run.

        1. “No one prevented them from developing and marketing their products”

          That is the power of a System (I don’t mean an OS, but systems in the larger sense) as much as herd mentality, small as the herd was back then compared to today. It was a cultural reasoning, not an intrinsic value of open. It was a definer, a definition of what “real” computer users used. It imbued a lot of social power and acceptance from the elite to a person who fell in and toed the line. It really was an elitist system. You weren’t taken seriously unless you followed accepted form.


        2. “The move to Mice/GUI was user choice. Up until Windows 95 (and even then), Windows itself was a shell that ran on top of DOS that the user chose to run.”

          This was the acceptance that the market was shifting from the tech pro/enthusiast to less technologically proficient users, aka, the end user. Up to then the PC market was enterprise (IT) and what I now call technophiles. That was a key turning point in the shift of the market. It also shows how little even MS regarded the “enduser”. Only a “shell” was needed to placate the end user, while the serious work was still done in DOS, without a mouse.


        3. “the war cry at MS was “DOS isn’t done ’till Lotus won’t run”. ”

          Which is likely why Excel became such an important cornerstone for MS, so they weren’t at the mercy of another company. Sound familiar?


          1. Right, but they had to compete (in many ways) for it. Lotus wasn’t ruled out by fiat.

          2. Well, as they did later, MS competed in ways that were worse than by fiat. At least Jobs was quite open and clearly told Adobe why they were “ruled out”.

            And that’s what made the early days of the PC so much more exclusionary and elitist than today. “Open” was a euphemism for its own form of mystery religion. I would say “Open” today, as exemplified by Linux, is less a mystery religion than it once was. This is in opposition to what Google means by “open”.


          3. Respectfully, nothing is worse than fiat, no matter how honestly it gets presented. I will grant you that there may be technical elitism on the open side, but if you took the trouble to learn, you could do anything you wanted. And that should not be impeded. Learn if you want, use as directed otherwise.

            One thing we will never know, is how much innovation could have been done if iOS were open. Yes, there would have been crap too, perhaps more of it, but that’s how things progress.

          4. Well, considering that by definition “fiat” is not Apple’s procedure, I guess your differentiation is moot. It may have seemed, and even been, “by fiat” in the early days of the App Store. Apple has clearly moved to being more open with their reasoning today. Either way, I will disagree about which is worse, since I will never side with covert subversion.

            Heck, I would even argue even with Apple’s process there is still crap in iOS apps. I’ve deleted a number of them from my iPhone and iPad. So far, iOS has shown how innovative their system is compared to “open” Android. It’s not as if we don’t have an open system to compare it to, if one considers Android or even Windows open.


          5. And I will grant you that they are more palatable to me these days under Cook, than under Jobs. Still, the mechanism to be subversive exists. You must code with allowed languages and the result must be sold through them.

          6. Sort of. Certainly within Apple’s ecosystem. But unlike the early days of the PC (or heck, even today), there are viable alternatives to developing for iOS.


          7. Respectfully, once again, you show yourself to be very hung up on the using of the tool, and how you cannot abide the maker of the tool having anything to say about the method by which it is best used. I think this makes you miss the forest for the trees.

            Great, so MS or Google or Samsung “lets” you hold their hammer any way you wish. Doesn’t matter that you are trying to hammer a nail or not, do what you like. Turn it around, hold it like a gun and play at war with your buddy, no problem. But when Apple presumes to “tell” you how to hold the hammer they make, you get upset.

            But I can still use that Apple “hammer” to just “hang a picture frame” or to build an Eiffel Tower. I too am only limited by my imagination, because the Apple hammer offers me the least resistance to actually accomplishing my goal. I find that a lot easier and more pleasing to accomplish with my Apple hammer than some other, because it is a great hammer. I find other hammers are out of balance, their heads aren’t square with their handles, they fall apart, etc. They might be great to play around with, but they just aren’t a great tool for getting the job done to my specifications and satisfaction at the end of the day.

            So, while some find freedom and openness in nobody telling them not to hold their hammer backwards, and being able to express themselves by pretending the hammer is an Uzi or an air guitar, others find freedom in just being able to accomplish what is in their head with their hammer, knowing that even the greatest artists, architects and engineers abide by certain “rules” in order to produce enduring and unique work.

            You are giving Apple far too much credit and power. You are somehow seeing them as an arbiter of your imagination. They are not. They are the tool maker, and as such, they are an empowering agent. But it is precisely the elevation of the humble tool to a position of “elitist system”, that Joe is talking about — that is when it interferes with accomplishing some goal or producing something I have imagined. Endless talk about the “flexibility” of the tool itself, becomes an end in itself. Just get on with using the tool as intended so you can see how good it is at actually helping you produce what you want to produce without turning out some mediocre half-baked result that is the same as everyone else’s because the tool is sooo open it’s good for everything except what you need it to do.

            You seem obsessed with how Apple “draws circles around its tools” as stepping stones. Some of us are concerned with how MS or Google or whomever may be drawing circles wider, just beyond our focus, as fences, perhaps; maybe around “computing” or “internet” or something. And you claim all the innovation is within the wider circle, the “more open” one. Yet time and again, new use cases and new types of devices, new tools — usable tools — come from Apple; and people do amazing new things with them. The “control” you constantly decry is simply not of the sort that is a hindrance to progress in the way that the “open” systems have proven to be.

          8. My comment is in a historical context, and my thesis is about what the PC revolution was about, its benefits, and its growth.

            But you’re right; in many ways I find iOS’s way of doing things the antithesis of that.

          9. The antithesis of what? “What the PC revolution was about”? “Its benefits, and its growth”?

            Speaking of historical context, one context you are not taking into account is the whole face of technology. You are taking Apple as one closed context, and everyone else lumped together as another closed context. You are lumping them together, because they represent a coalition of “open”, due to the modular nature of their business models, etc.

            Thing is, in the historical context, there were lots of Apples. There was Amiga, Acorn, Commodore, Atari, Sinclair, etc., and there were other OSs too, like BeOS, NeXT, OS/2, etc.

            These were extinguished when MS flexed its monopolistic muscles and made astute business deals. This has nothing to do with innovation and “openness”. On the contrary, it is the stifling of innovation and openness, because now only Apple has remained as a viable alternative to the MS hegemony, the system that encircled most of personal computing for decades. Yes, there was “growth” …of corporate IT, the elite priesthood that Joe was speaking of.

            This “openness” of MS or Google is like the Communist Bloc. The OEMs are all vassal states behind an iron curtain.

            Read Ben Bajarin’s article of today, about Samsung. There is no particular innovation taking place under the OEM relationships with Google or MS. It’s a race to the bottom.

            You may feel free to use your “open” tools without restraints, but such “freedom” is just part of a larger set of restraints to do with a system, or to do with having your data used as a product (in the case of Google).

          10. “Thing is, in the historical context, there were lots of Apples. There was Amiga, Acorn, Commodore, Atari, Sinclair, etc., and there were other OSs too, like BeOS, NeXT, OS/2, etc.”
            All closed systems in one form or another, except for OS/2, which came too late. The only one to survive was Apple (barely).
            You see the “restraints” you speak of are “standards”. Whether the IBM PC was worthy of the crown or not is irrelevant. It provided open standards that permitted open innovation. You could buy a PC at home, and it ran the same stuff you had at work. And it was far less expensive.
            The PC (including the Mac) did not require permission from MS, Intel, and OEM, or anyone to make a hardware or software product. The fruits of that labor also operated in a marketplace where multiple channels were allowed. So yes, it’s an antithesis to iOS.

          11. Ahh, standards. Interesting. Yes, standards, in the sense of agreed rules governing compliance to an agreed set of specifications in order to achieve interoperability and consistent user experience. Of course: standards are a requirement of any system, market or society… Or chaos ensues.

            You are right that the “restraints I speak of are ‘standards'” in at least one way. Likewise, the standards you speak of are restraints. And they are in danger of being declared “standards” by fiat, to serve the best interests of the companies promoting them (MS and Google in this discussion).

            By “sharing” these “standards” in an “open” way with OEMs, they simply create a smaller, poorer world (my analogy is to the Soviet Union). They have their own club, that does not represent the world, let alone the “cutting edge”.

            There are other “standards” that even a “closed” company like Apple subscribes to: open file formats, codecs, html, JavaScript, network protocols, you name it. In fact, Apple was instrumental in helping develop and promote a lot of these. Apple plays very well with openly declared and mutually agreed standards of all sorts. Why would it not?

            The so-called “standards” that MS and Google tend to promote are their own corruptions (corruptions of JavaScript, their own extensions for browsers that make webpages more and more proprietary, weird DRMs or codecs, etc.)

            Apple has standards too, and adheres to international, open standards. Often much better than MS or Google. Apple simply chooses not to license its own OS. I know that sticks in your craw. But believing that the standards of the world and the future innovation of the world should be in the hands of MS and Google when they have created their own iron curtains around a certain, limited set of “standards”, is incredibly naive and short-sighted.

            The Soviet Union crumbled from within. This is why we are discussing articles about the relevance of MS. As far as the future of innovation, Apple just has to apply itself to coming up with another solution to another problem it sees, and everyone else scrambles and bankrupts itself, like the Soviet Union trying to react to rumors of a “Star Wars” defense system.

            The future is not to resurrect the Soviet Union, nor to promote corrupt oligarchs in its place (Samsung?). This is not the source of innovation or the future of computing.

            1) Everyone has standards and adheres to some more than others. But some standards are better than others.
            2) Having a “closed” business model, like Apple, is neither here nor there in terms of the importance of “Standards”. Standards are still relevant, if not more so. It just means you are “an independent state”. You can still be part of the EU, or the UN, or whatever, in which you adhere to international law.
            3) it is in fact, the so-called “open” systems/modular business models that are trampling on the very idea of Open Standards. So, I find your post very revealing.

          12. Here’s where your defense of iOS falls apart.

            “Yes, standards, in the sense of agreed rules governing compliance to an agreed set of specifications ” -Kizedek

            Who agreed to iOS’s standards when they were set? It would still be okay if they were submitted, as is, without prejudice, for use on iOS products. Thus it’s a closed, not open standard.

          13. I’m not “defending iOS”, so much as countering your mistaken, narrow, revisionist notions. I am discussing the issues around your notions on the following theme:

            “…it was a vast foundation under which to innovate.”

            “…what the PC revolution was about”

            “But it’s precisely “open”, non-curated systems that “allowed” the innovation.”

            “It provided open standards that permitted open innovation.”

            I am pointing out that your context is painfully narrow. Regarding Personal Computing, whether desktop-based or mobile, there are (mostly) three main “spheres” of development or influence, if you like: Apple, the “club of Google”, and the “club of MS”.

            You seem to have the notion that just by virtue of being a club (“open” and licensable to OEMs, etc), there is some inherent characteristic / power / tendency / worthiness… or something about the club, that produces “innovation” and greater development of the sort that we should all value and appreciate.

            Not so. I am simply saying this is largely an illusion. The wider historical context, of both computing and every other industry you could think of, is that there is the opportunity or need for lots of independent, integrated players like Apple.

            You are framing it as “open” vs “closed” and making a value judgement. The problem is that a high degree of modularity (one that separates software and hardware for example) in such complex products as PCs or Mobile Devices actually dampens innovation and creates “a sea of sameness”. It’s patently obvious. Yes, in theory, you would think that “two heads” and all that would be better than one. But it hasn’t happened. Rather, it’s a race to the bottom and a struggle for OEMs to survive.

            What’s truly “open” and desirable, would be a broader world made up of more independent players like Apple, to balance out the ineptness of all the club members that have been enjoined by MS and Google — the ones that can’t really solve the complex puzzles represented by Mobile Devices or PCs in any meaningful way because MS and Google can’t produce hardware to save their lives, and their OEMs can’t do software.

            If there were more “Apples”, then you might actually get your wish and get some nice alternatives to iOS.

          14. First off, there was nothing revisionist in my statements. Secondly, if there are “clubs” (your term, not mine), we both know who adheres to that notion the most. But even in spirit, yes, the broadest most mutually compatible “club” is the PC (including Mac) and Android worlds.

            Thirdly, the only value judgment I made is not on technical merit, rather on access to the “ability to innovate”. You did not need anyone’s permission to innovate. You didn’t answer to IBM, MS, Intel, Apple (in PCs) or any specific retailer. It turns out that IBM won, happened to have an open architecture (perhaps stupidly for them, but a gift to the market), and an ecosystem flourished that gave tremendous power to everyone. The size of the market drove down costs, and thus it became more affordable as well. IBM lost control of the platform, but birthed a market. GOOD! This put control into the hands of the owners of the machines. Nothing was verboten.

            “Race to the bottom”-Kizedek

            Some bottom! My desktop blows away 90% of all PCs (not bragging, not what I do). It cost me less than $2000. Hexacore i7, dual GPUs, RAID 0 SSDs, 32 GB RAM, etc. Spec showdown? Perhaps. But mostly price/performance. Win8 desktop, to me, is as good as OS X. All made possible through the use of spare parts and OPEN architecture.

            “Many Apples”-Kizedek

            If many Apples are interoperable with each other, it’s less of an issue. Still standards DO have tremendous benefits, especially when they are equitable.
            But here’s what happened:

            “Therefore is the name of it called Babel; because the LORD did there confound the language of all the earth: and from thence did the LORD scatter them abroad upon the face of all the earth.” Genesis 11:9

    2. Somehow, your high praise of ‘open systems’ rings hollow when the supposedly open system called DOS/Windows was the instrument for establishing one of the most innovation-stifling monopolistic reigns in industrial history. How could that have completely gone over your head? Sure, the order brought forth by DOS when it concatenated the process of standards-forming allowed the PC industry to get ahead with innovating much sooner, but later on it became quite clear that the only innovation that was allowed to survive was innovation that Microsoft approved of, innovation that reinforced, not threatened, the Windows-Office conjugal monopoly.

      And now Google is trying to replicate MS’s feat with Android. Thank god Apple, that reprehensible, fiat-spewing, closed-system pusher, is big enough today to thwart the would-be monopolist, eh?

      1. In fact, I believe that MS should have been broken up because they were too dominant and too influential. They leveraged OS to lock out competing Office Suites, Programming Languages, etc. Today they call the same thing “vertical”. Go figure.

        1. I believe that Microsoft would be better off had it not appealed the breakup ruling back in 2000. I don’t think Nadella will do it now though.

  3. “The personal computer moved from being a consumer system, such as the
    Apple ][, into a business dominated market. The key was the introduction
    of the Model 5150, better known as the IBM Personal Computer, in 1981.”

    Minor nit: The hardware half of the transition was the 5150. The software half was VisiCalc and its IBM PC successor, Lotus 1-2-3, which caused corporations to realize that these consumer toys had a vital role to play in their businesses.

    1. Great comment. If I’m not mistaken, VisiCalc actually launched on the Apple II first.
      We should also not forget the CP/M systems that preceded the PC.

  4. Great article. Your perspective and insight have been missed. IMHO, you are one of the people who give Techpinions a unique edge.


  5. “The future of HP, Inc. is unclear. It could be like Dell, a U.S. seller of Chinese computers pursuing a low end market. Or it could end up being bought by a Chinese manufacturer — speculation having even included Lenovo. Either way, the HP we know is gone.”

    To a great extent Apple is the HP of the early 21st century, building its new headquarters on formerly HP land.

  6. HP is just the next Dell, a company in decline. Dell used to be big in 2006; now they are much smaller.

    Dell is not going to be big again, and if you pay attention to HP’s market share, they are the number 2 PC company now.
