Antiquated Thinking About The Tech Industry

A few weeks ago, Ben Thompson hit the nail on the head with this tweet:

[Screenshot of Ben Thompson’s tweet, February 22, 2015]

Ben’s tweet highlights what may be the best way to understand the issue with so much industry commentary: many commentators are hindered by antiquated thinking about the technology industry. What they learned, observed, studied, and lived in the industry of the past is largely outdated.

As many modern critiques of frameworks like Disruption Theory and Adoption Theory have shown, the industry has changed, and changed fast. Many business theories, once the foundation for thinking about the technology industry, were created in an era very different from the one we live in today. This outdated thinking is also the root of most bad analysis of Apple. While some fundamentals of the old theories remain helpful, their takeaways and modern applications are not as black and white as most believe.

Uncharted Waters

Last year I articulated in this article how tech industry history is being made, not repeated. This fundamental understanding is why we have to take theories and observations from the tech industry’s 30-year past with a grain of salt before applying them universally today. Sometimes they apply and sometimes they don’t. But too many believe these theories apply universally to today’s market, and that is simply not true. In that post I stated:

Billions and billions of people have leapfrogged the PC and jumped straight to a computer in their pocket. They have no concept of what it was like to grow up with a PC or even in a PC saturated region. They didn’t have PCs in their school to learn computer literacy, they haven’t had to deal with the Microsoft Windows monopoly, they never dealt with corporate IT bureaucracies. In the very near future, the number of people in the market who had nearly zero contribution to tech industry history will dwarf the number that did. So how can we, with any degree of intellectual honesty, claim so boldly we know how this will play out?

In a similar line of thinking, Ben Thompson elaborates on this tweet in a recent episode of his podcast Exponent.fm. In that discussion, he outlines how some of the lessons we learned from Microsoft’s dominance may not represent a universal trend or truth. There is a great deal of modular vs. integrated business model philosophy to unpack, but understanding these dynamics is central. Understanding what has changed is key to our framework moving forward.

In the podcast, Ben makes the excellent point that what has changed is the premium the market places on user experience. When IBM/Microsoft “won” in the last era, it was largely because IT buyers were the customers for PCs. Those customers placed a premium on cost, not experience. So once the customer base shifted from corporate IT buyers, who favored cost, to true consumers, who favor experience, the basis of competition changed. And once the basis of competition changes, so do the dynamics of the market. This is what many observers miss today. While we can explain this in terms of user experience, the deeper observation is the emergence of a pure consumer market.

This is the root of my point that tech industry history is being made. We have 30 years of history selling to a business customer and only 10 years or so of tech industry history selling to pure consumers. We have even fewer years of history selling a PC in the shape of a smartphone to almost every person on the planet. As we add as many brand-new internet users over the next five years as we did in the past 30, we may well find that none of the lessons learned in the past 30 years apply any longer.

Published by

Ben Bajarin

Ben Bajarin is a Principal Analyst and the head of primary research at Creative Strategies, Inc - An industry analysis, market intelligence and research firm located in Silicon Valley. His primary focus is consumer technology and market trend research and he is responsible for studying over 30 countries. Full Bio

148 thoughts on “Antiquated Thinking About The Tech Industry”

  1. Well said Ben, though this is more an admission of the mainstream analyst community finally catching up to the leaders who have been saying this for years. Horace Dediu springs to mind along with Thompson. Asymco is where JKirk cut his teeth commenting on Horace’s analysis years ago, which has enabled him to rehash that POV in his own inimitable style here.
    Now we just need the laggards at Gartner et al to get with the reality and we can move forward faster and reduce the number of TL;DR articles restating why the future will not be like the past.

    1. It is more than the analyst community. I’d been having this same discussion behind the scenes with many industry executives well before any of this public writing. We had many of the same discussions about Apple vs. Samsung, and many didn’t believe me when I said Samsung was in worse shape in the long run.

      I’ve always felt the role of a good analyst is to be a teacher: to communicate clearly the fundamentals clients need to understand for their success. I hadn’t quite been able to nail the root of the issue as a simple outdated philosophy until Ben’s tweet. Admittedly, recognizing that everything you learned may be wrong is a tough pill for people to swallow.

      1. Exactly. It is incredibly difficult to get humans to first recognize that they are wrong, and second, change their behaviour/thinking. Cognitive dissonance is powerful.

  2. The antiquated thinking you’re describing is at the heart of how the tech media underestimates the Apple Watch – just like they did for every new Apple product since the iPod.

    There’s antiquated thinking and there’s also lack of imagination. Those predicting Apple’s doom can’t seem to explain its success. Most of the tech media can’t imagine the future until someone shows it to them. And even when that happens they frame what they see in the context of the past.

    Last week, I wrote about how the Apple Watch is spectacularly misunderstood in a most predictable way: http://torusoft.com/blog/the-ipod-all-over-again

  3. I agree with your sentiment. Management theories are a dime a dozen, see http://www.valuebasedmanagement.net/ And, obviously, to a man with a hammer, everything looks like a nail.
    Having said that, the real challenge is in figuring out how much of our past experience is of value in the future. Writers such as Ben Thompson, Horace Dediu and the author team on Techpinions provide a lot of valuable input that helps me improve my understanding of the industry.

  4. “Billions and billions of people have leapfrogged the PC and jumped straight to a computer in their pocket.”

    “Leapfrogged” implies that the computer in their pocket has obsoleted the PC. Yet, with the EXCEPTION of mobility, mobile does nothing better than PCs. In fact, it does everything worse. What mobile proves is that it’s better to have a computer in your pocket than no computer at all. For common uses this suffices, and that’s a good thing.

    When you say “Ok Google” the interpretation of the command is not executed on the device. The search is not executed on the device. The computing chores are distributed among computers in the network, including the device.

    Bill Joy comes to mind again. “The network is the computer”. Then, with mobile, we have “access” to “the computer” in our pocket. Perhaps that’s a better way to look at it.

    1. On the other hand, I must have been 13 when I got my first computer (a ZX 81) and started copying, then writing programs, then hacking its hardware. Nowadays, teens around me usually don’t even use a computer except as a glorified gaming console + TV, let alone have a clue about its inner workings, programming, hacking. I think it’s similar to how early car users had to know how to change a transmission belt, check brake fluid, and fix a blown tire.
      What’s being leapfrogged is both that “computer as an end in itself” phase, and indeed the desktop hardware + desktop OS use case. I’m fairly sure you need to get to university for a desktop to become mandatory, and even then, not for all majors.
      But more than that, I think what’s being leapfrogged is the “not connected” stage of computing.

      1. The Internet has been the most significant achievement in human knowledge management and access since the printing press. It surpasses it by orders of magnitude.

        What you (and I, and many many others) did was to educate ourselves about our tools. This is important, no matter what the tool. Apparently that’s antiquated. And this is a good thing?

        1. But you are ignoring the job the tool is for. The job today is far different than the job was in even the nineties. As obarthelemy rightly points out, the computer as an end in itself is not how people are computing anymore.

          Joe

          1. The computer as an end to itself is for computer scientists, of which I am not. Not sure what it is I’m ignoring. What we didn’t have in the early nineties was the internet. We’ve not stopped doing the other things we did then. We do more things including those.

          2. “The computer as an end to itself is for computer scientists”

            And computer hobbyists like you.

            Joe

          3. I use my computers not only as a hobby, but as tools to earn my living as well. No tablet can replace any of my PC’s for that purpose.

          4. I think the reason iPad sales growth has stalled is that Apple oversold the productivity angle and people quickly realized that the things they planned to do on a tablet aren’t really practical. For word processing, photo and video editing, spreadsheets, and other large-screen computing tasks, there is really no substitute for a PC except a PC with an even larger screen. You can do revisions on the fly with a tablet, but only a masochist would do heavy-duty work on a 10-inch non-windowing screen.

          5. Certainly for the way most westerners think about how to do things. This is the process they know and are most comfortable with. But for those, as Ben is writing about, who never knew that way of doing things, they’ll be looking back at us thinking they are supposed to talk into the mouse.

            http://www.cnn.com/2015/02/19/africa/africa-mobile-internet/index.html

            “More people in Africa have a mobile phone than access to electricity,”

            Most of those people will likely never use a PC, much less own one, and find ways of using mobile devices instead.

            It also isn’t hard to find article after article talking about how internet cafes in Asia are on the decline because of mobile. If we think how we use computers now is all there is to it, we will be lost in the future.

            Joe

          6. If it influences how tech is used and understood, the grounds don’t matter. Although I think either assessment would be incorrect. It is cultural grounds.

            Joe

          7. Are you saying that if you demonstrate to a person who has only ever used a tablet for writing complex lab reports how it’s done on a PC, that person will say “No, having a small screen and only a single window in view is better than a large screen and multiple windows”?

            And I never said that PCs are categorically better than tablets. I said there are certain tasks that, excluding affordability, PCs are better at than tablets.

            The wheel is pretty antiquated, helluva lot more antiquated than PCs, shall we discard that too on the basis of its antiquity?

          8. Not what I am or have said or implied. I would think, however, if someone has been writing complex lab reports on a tablet, their workflow will be significantly different than one on a PC. And from the perspective of the article I referenced they might wonder where that PC would have to reside in order to have power and would that ultimately be useful because of that.

            We keep thinking of a mobile device in relation to a PC. Our workflows, systems, and processes have evolved from that perspective. For someone who is beginning from a mobile device, their workflows, systems, and processes will be significantly different. They will be looking at the PC from the perspective of mobile computing.

            Joe

          9. Are people that rigid with their workflows? Are you saying that a normal person cannot independently arrive at the observation that this newfangled (for them) PC might actually work better than the tablet for a particular task if they just modify their workflow a certain way?

            Let us not put workflow up on an infallible pedestal. It’s one thing to say that it might not be worth the effort to change one’s workflow if that’s what it takes to use a PC and do a particular task better overall. It’s another thing to say, people will never adopt a PC even if it will ultimately do a task better because workflow is sacrosanct.

          10. I think the point you’re missing is that the PC isn’t a practical option for many people, and isn’t even the preferred option. Would a PC do some tasks better? Of course. But if those tasks can be done on an iPad, and provide the advantages of mobility, it may not matter that the PC can do the tasks better. Mobility is very powerful. Add to that the personal nature of a mobile device, it’s much more likely that each person will have a mobile device while a PC is shared, so we’re back to the problem of time sharing. There are many good reasons why a user would choose an iPad over a PC, even if they know some tasks are better on the PC.

          11. Kind of colonialist to think you will show the light to the savages, don’t you think? Couldn’t possibly be the other way around.

            Joe

          12. Great example. I think you’re both right. That’s why I favor versatility in a device.

          13. Nah, iPad sales have levelled off because it took off like a freaking rocket with nuclear sharks attached, basically double what the iPhone did initially and the iPad hit every milestone quicker as well. The growth rate had to slow dramatically just to make sense.

            The iPad is already at annual sales that make it alone the top PC maker in the world. Add to that the long replacement cycle (my iPad 2 from 2011 is still great) and pressure from the bottom (very capable iPhones) and top (very portable MacBooks) and it’s not that surprising. I expect more growth from iPad though, but it will be a slow burn from here on out.

            As for what tasks you can do on an iPad, ask my four teenagers how much trouble they have with any of the complex tasks you mention. The answer is none, it doesn’t occur to them to do this stuff on anything but their iPads, that’s the only PC they’ve ever had.

            Even for an old gorilla like me, I do all my writing on my iPad (with a ZAGGFolio case), I prefer it over my MacBook.

          14. If your four teenagers have no trouble using the iPad to write lab reports where they have 1) one window for writing the report itself, 2) another window for the website that has background material, 3) still another window for the website that contains the data they recorded in real time during the experiment, and 4) another window for live texting with their lab partners, then your kids have amazing memory and cognitive skills, which is why they don’t mind (or don’t need) to switch back and forth between windows rather than having all four in view at once.

            My kids don’t have outlier mental abilities so they prefer using my iMac over an iPad, or even their laptops, when writing lab reports. Similarly for research papers.

          15. They use the fast switching; it seems to work fine for them. I’m sure a very large screen with big windows side by side would be an improvement, but they flit between windows and apps very quickly. It’s not enough of a limitation to give them any trouble completing various tasks. Kids adapt quickly.

            The large screen option of course has the downside of less or no mobility, and one user at a time. With iPads all my kids have PCs they can take anywhere, and work anywhere. Viewed through that lens which is better, four kids all doing their work one at a time on the family iMac when they happen to be at home, or four kids working simultaneously on iPads anywhere they like?

          16. So what? There are already regions where the mobile device is all they know. That’s the point. Their tool is not your tool and how they use that tool is not how you use your tool. Just because you can’t figure out how to use a tablet instead of a PC because you’ve never had to, doesn’t mean the rest of the world is in the same position.

            Joe

          17. I come back to my original comment. Other than mobility, there is nothing these devices can do that a PC can’t do better. Internet performance is better, compute performance is better, there’s more storage, there’s more RAM, they can even sport multiple user interfaces (some ostensibly user-friendly; my mother-in-law even gets confused by her iPad), they are more adaptable, and they even come with touchscreens. It’s just no comparison. What they won’t do is fit in your pocket.
            But….the best computer is the computer you have with you. For many every day tasks mobile suffices.

          18. “there is nothing these devices can do that a PC can’t do better”

            Based on the things you need to do, in the way and workflow you need to do them. Which is not universal.

            “Other than mobility”

            Which is the point; that fact alone will start to reshape workflows and processes.

            Joe

          19. It’s the PC that’s universal, not the tablet. Unless you accept “The network is the computer”, then there’s a PC, or something more powerful, somewhere on the network.

          20. Goes the other way too. Just because you can’t figure out how to use a PC instead of a tablet because you’ve never had to, doesn’t mean the rest of the world is in the same position.

          21. True, but mobile devices, pocket computers, iPads, are far more accessible and will spread far more quickly and gain much more use. The next wave of computing is happening with different tools.

        2. So of course you do all the maintenance and repair work on your new BMW, because that’s *important*, right? Or, more likely, you don’t, because that isn’t the job you hired the BMW for. You did not buy a BMW so you could work on it, you hired the BMW for the job of transportation.

      2. Believe you me, in high school, a full-feature PC (desk- or laptop) is pretty much mandatory. Well, okay, assuming you go to a high school that makes you write essays, term papers, class presentations and lab reports regularly.

        1. From what I see around me, most kids are in countries that still emphasize handwriting at that stage, and/or that don’t want to assume every kid has access to a PC. Also, aren’t tasks at that level doable on a tablet? My niece mostly uses hers.

          1. I know this is anecdotal, but my kids cannot do the stuff they need to do on a cramped keyboard and a 10 inch screen. A laptop is the minimum and when I’m not using my iMac, they’re on it right away if they have a term paper or lab report due. Those tasks typically need more than one window open and in view. And typing on a tablet, virtual keyboard or cramped physical accessory gets old fast.

    2. “Yet, with the EXCEPTION of mobility, mobile does nothing better than PC’s. In fact, it does everything worse.”

      It does things differently than you are accustomed to, and that makes it seem worse to you. For instance, it seems right and natural to you (and me) that computers should have file systems and that one should be able to get one’s hands dirty managing one’s data in the form of files.

      But almost all those aspects we are used to are, in fact, incredibly user-hostile to non-technical people who haven’t spent time and effort learning how PCs work. They require you to first learn how to use the machine (which nobody but nerds like us cares about) and only then learn the software that does the stuff you actually care about.

      Mobile OSes abstract a lot of that geeky cruft away, resulting in a much friendlier, superior device for getting done the stuff you want to do, without having to learn the underlying machine first.

      From the point of view of a non-technical person, mobile devices are inferior to a PC only in that they lack a keyboard and have tiny screens, which makes them hard to use for typing or for seeing fine details, large images, or data. From the point of view of a technical user like me, who can get over the fact that they work differently than one is used to, they also suffer from a lack of on-device storage and can’t do CPU-intensive tasks well. In every other respect, they are clearly better than a PC.

      1. From a hardware point of view, as you agree, except for mobility the PC wins hands down. Further, there’s nothing stopping the PC from running any interface for usability. PCs have hidden the “geeky cruft” away since the days when batch files were used to launch programs and present them as a menu. Windows-based PCs even have touch.

        Even on the same router, the PC accesses the internet faster than a mobile device. Let’s not even get into multitasking.

        Then there are the subjective matters. Typing on glass sucks IMO. We lose the sense of touch (paradoxical as it may seem). You don’t really need to know the filesystem to use a PC either. And, as you also note, there’s that persistent small screen.

  5. Something old and something new.

    There’s no doubt the consumerization of IT (and its increased visibility since it’s now outside of the den/home office) is changing the rules for success, emphasizing ease of use, social desirability, and apps-as-media.

    On the other hand, some lessons we should have learned last time around seem to be falling by the wayside: the importance of interoperability and the perversity of lock-in. MS were once a liberating force for IT, allowing cheap and standard PCs, easier apps… then they realized the easiest way to happy shareholders was to milk their de facto monopoly and leverage it into neighbouring areas. Looking at what Palm and Psion were doing in the mid-90s (user-friendly pocket computers with an appstore and peripherals, also a smartwatch and netbooks), I feel MS’s hegemony and defensiveness set us back at least a decade. I can’t help but think the same is already starting to happen, with both Google and Apple holding back on formats and devices, and Apple fleeing the low end.

    1. “I can’t help but think the same is already starting to happen…”

      Apple HAS learned the lessons from last time around, and they are unwilling to make the same mistakes.

      Differentiating features are not “lock-in.” They are often the sole reasons to choose one product over another. To consider that “perverse” is to desire a world where there is no exceptionalism. Do you really believe that is a lesson that should have been learned?

      What you call “lock-in” is what I call “the essential baseline requirement for innovation and survival.” Of course, I call it that when no one is listening. It sounds pretentious.

      1. You’re assuming a lot about what I call lock-in. To me, it’s the deliberate use of proprietary interfaces, formats, protocols and HW interfaces, and the emphatic deprecation of standard ones, as well as obfuscation or legal protection against reverse engineering. Apple are certainly doing that.
        And clearly prioritizing it, well away from the public’s ears. If Apple learned a lesson, it’s that lock-in brings oodles of money and they must go for it, as they themselves unequivocally state in this S. Jobs meeting agenda (**sudden eye-opening warning: this is not the PR you’re so used to!**): http://genius.com/Steve-jobs-itinerary-for-apples-2011-top-100-meeting-annotated.

        1. “Google and Microsoft are further along on the technology, but haven’t quite figured it out yet – tie all of our products together, so we further lock customers into our ecosystem”

          Is this the nefarious bullet point? ’Cuz if it is, it’s not exactly being done “well away from the public’s ears.” It’s an obvious Apple selling point, and everyone else and their mothers are desperately trying to do the same thing.

          The lesson Apple learned was that its good to be king. Apple’s mistake (and it only looks like a mistake in hindsight) was making (arguably) superior technology choices that ultimately lacked price economies of scale when they weren’t adopted by others: serial ports vs parallel, Motorola/Power PC vs Intel, SCSI vs ATA/SAS, NuBus vs PCI, etc.

          Apple learned that uninformed consumers mistook the Mac’s higher initial cost-of-ownership for being an OVERALL higher cost-of-ownership. Lesson learned.

          So, if you’re complaining that Apple uses proprietary Lightning connectors on their phones, well, then you’re simply anti-innovation. Apple chose this proprietary route because USB sucks.

          1. First, I’m not complaining, I’m looking at what Apple are discussing internally.
            Second, I’ve never seen “You’ll be further locked into our ecosystem” in any Apple PR, I’m curious as to what examples of that being a selling point you can enlighten me with. You also seem to be overlooking the “**further** lock in” part of that sentence.
            I have no clue where your two “Apple learned” points are coming from. They certainly don’t seem to be trying to get their tech or ecosystem adopted by others to achieve more economies of scale, nor to lower entry price or discuss TCO.
            To take your Lightning example, one could also say that Apple chose to build a slight derivative of the Thunderbolt industry standard as a way to lock in purchases of peripherals and cables, on top of those of content. I’m also curious as to how USB/MHL are inferior.

          2. Apple has been designing away the need for most ports. Would they like to eliminate USB? Certainly. But US Government contracts require that they offer wired keyboards and mice. So, rather than offering a non-standard keyboard port (which will cause you to complain) Apple still has USB ports on their Macs.

            But, they aren’t on their phones. And that is a good thing.

            As for your two links, one is an extremely iffy accusation, and the other clearly states “Before we go any further, let’s cut to the chase: No, you aren’t at risk of infecting your Mac or iOS device with this current [WireLurker] malware—unless you do some really risky things…”

            No computer manufacturer can fully secure a device against a reckless user. But USB is inherently insecure.

            As for my critical thinking skills, they’re sharp enough to know this conversation is a waste of time.

          3. So, in your own words, Apple’d rather risk USB than lose out on gov contracts or fall back to, say, PS/2 or ADB or Thunderbolt connectors (no need to invent a new one), or keep USB off their consumer hardware (simply not mounting the USB connector for non-gov PCs would achieve that at negative cost). Not sure what point you’re making about USB or Apple. USB indeed has a glaring security issue, and Apple don’t seem any more fazed about it than anybody else. Hint: they went Lightning (and 30-pin before that) *before* the USB issue was unveiled, so that was clearly not a factor either. Also, they were a big USB promoter.

            The “very risky thing” is: install an app on your Mac, then connect your phone to it via the Lightning-to-USB cable. Doesn’t sound out of this world.

          4. “Not sure what point you’re making about USB or Apple.” Therein lies the problem. So I’ll spell it out for you.

            The iPhone was designed to be secure from the ground up. At first, Apple/Jobs didn’t even want third party apps for it. No USB, no memory card, no Flash. These were all avoided for a reason.

            The “very risky thing” is side-stepping Apple’s providence and installing apps from untrustworthy sources.

            There. Is that clear enough for you? If I had a mic in my hand, I’d drop it.

          5. You really should read the links above. Your “secure from the ground up” phone has a built-in, voluntary security hole, on top of the accidental ones, and Apple-pushed USB was thought to be secure at the time.
            I’ll leave you to your “Apple providence”.

          6. So what was your point about “no apps, because security, but yes apps, because… money???” again?

  6. One thing though. If the debate is cost vs experience, I’m not sure the two are opposed, or even related. If you take away something’s price from its experience rating (a big if admittedly, a lot of things are enjoyed mostly or partly because they are expensive/exclusive) examples abound of cheaper things being more enjoyable or efficient than expensive things. Staying in the IT zone, my DOS PC was a lot more pleasant to use than my uni’s PDP-11, and a lot less expensive.

    I think it’s not so much about experience as about fashion, and Apple’s recent hires seem to bear that out. The IT we use has become part of our public persona, same as our clothes, handbags and cars. Even our stay-at-home IT now ranks alongside our furniture and knick-knacks, not our tool shed.

    1. “I think it’s not so much about experience as about fashion, and Apple’s recent hires seem to bear that out”

      That’s what you would like to think the hires bear out. Of course, all it bears out is that people have always treated watches differently than calculators. Apple is the only tech company to recognize this. You can add as much digital functionality to a watch as you like — but that in itself doesn’t make it a nice watch to wear. Look at glasses – functionally they help you see, but they are real fashion items too. Then look at Google Glass — if you can still find a pair. If you wear glasses all the time, I would love to see what you are willing to wear or not wear.

      The simple truth is that the more personal a functional “appliance” becomes, the more fashion conscious we all as humans are about it. “Fashion” doesn’t preclude its functionality or usefulness — unless, according to you, it’s Apple. For the rest of us, Apple is packaging more usefulness with something we’d actually want to wear. And no, we don’t want to wear it because it says ‘Apple’ on it, but because Apple actually has taste and design sense.

  7. I have long promoted a simple fact to many of the startups and entrepreneurs I talk to:

    It’s OK to think outside the box but you have to know where the walls of the box are set up. Sometimes the sides of the box are there for a reason and need to be respected.

      1. Well, I also do point out that there’s a reason why there’s a solid double yellow line down the middle of certain sections of the highway….

        1. Robert,
          You are confusing lines that mark the end of the road with lines that delineate its path.

          1. In America, double solid lines on the highway mean that you do NOT pass other cars, because of the danger of oncoming traffic. Violating that law comes with a hefty fine, for good reason. There is a huge difference between breaking some rules to launch a startup and actually breaking the law.

            No confusion here. In fact, if you’re one of those who believes that ALL laws are meant to be broken, good luck with crashing through that barrier at the end of the road.

          2. Also, we sometimes believe we are thinking outside the box when we are really just getting better at thinking inside the box.

      2. In and by itself, there is no virtue in crossing a line, unless there is a purpose to do so.
        The real winners simply ignore that lines may exist and forge ahead to reach their goals.

        1. Can’t say I agree. The act of exploration is just that, crossing lines. Sometimes we are pleasantly surprised in entirely new and profound ways.

  8. Ben, I think you have mentioned before that the trajectory of Samsung was exactly as expected by Disruption Theory. In that sense, Disruption Theory still seems to be quite valid in consumer markets as well as B-to-B markets.

    Apple is apparently an anomaly, and since it is such a large company and a large influencer, we tend to think that we can discredit theories that have been successfully applied to many markets and many companies just because of this single instance. I think that this is not a sensible approach. Although we need to find a way to understand Apple, we have only a single example and this is hardly sufficient evidence to discredit Disruption Theory. Instead, we should be looking carefully for other companies that are like Apple and do not fit. Only by finding more instances can we discredit the theory. Unfortunately, as far as I know, even Ben Thompson has not detailed any such cases outside of Apple.

    Microsoft’s dominance taught us that modular systems win. However, the caveat to that discussion is, as Christensen has noted from the onset, integrated systems win during the early stages of the market when the product is constantly improving in a way that customers appreciate the advances. It is only after the product is “good enough” that modular wins. Hence Apple’s apparent escape from Disruption Theory can easily be explained if we can prove that the iPhone is not yet “good enough”.

    We can choose to discredit Disruption Theory for future analyses based on a single anomaly, or we can choose to be more careful about when a product reaches “good enough”. I think that it is more sensible at this point to choose the latter.

    1. Seems to me that Apple’s product is actually the user experience, and I’m not sure the user experience can ever be good enough. Perhaps in business/IT, but with consumers, I doubt it.

      1. I am aware that some people like to make the general statement that user experience can never be good enough. However, to make a general statement, you have to, at the very least, provide two examples. One may be Apple. What is the other?

        1. Most of everything we use isn’t nearly good enough from a user experience perspective. Cars, housing, clothing, appliances, and so much more. When we talk of user experience, you pretty much have to get to the point where a thing works like magic 100 percent of the time, and then maybe that’s good enough. And at that point you can still branch out re: fashion. Of course we’re talking about the segment of the market that values the user experience. Much of any market does not; lots of people are okay with good enough or even mediocre.

          I don’t see that many companies paying attention to user experience, thoughtful design, quality, in the way that Apple does, but I do have a couple of candidates off the top of my head. Tesla could perhaps be said to be the Apple of cars. And Breville could be said to be the Apple of small home appliances. I don’t own a Tesla, I don’t actually consider their cars good enough yet (but they’re working on it). I do own a couple of Breville appliances, and I am very impressed with the attention to detail and how they seem to have thought about how to do something in the best way, and then did that.

          I’m sure there are other examples of companies and products that are focused on building what works best, on what delivers the best user experience, and I would think we’d find they do very well in the high end segment of their market, they have healthy margins, and high customer satisfaction rates.

          1. And Tesla is as profitable as Apple? Breville is as successful as Apple?

            The companies you describe are niche players. In general, companies that painstakingly pay attention to detail are either luxury, niche, or somewhere in between. Tesla isn’t even profitable if I’m not mistaken.

            The theories are not about whether a niche player can survive in the market. They are about which companies will dominate or maintain a powerful presence.

          2. Apple is also a niche player, it just happens their niche is about a billion consumers. By many measures Apple does not dominate. Apple is an interesting case, a company focused on a minority segment of a market, but that segment happens to be so large that Apple has grown enormous and gained power through the value of the segment they do dominate.

            “The theories are not about whether a niche player can survive in the market”

            I’m more interested in looking at whether user experience can ever be good enough in a specific segment of the consumer market, and I think there’s lots of examples that show the user experience can continually be improved, and that there’s a healthy appetite for this improvement/value.

          3. Yes, but by many measures Apple does dominate relative to other players in the same market. It not only dominates in profits but is also very, very strong in usage. It dominates in corporate mobile IT. It dominates in ecosystems.

            Otherwise, it wouldn’t be worth talking about, would it?

            People wouldn’t consider it to be an anomaly to Disruption Theory, would they?

          4. “Yes, but by many measures Apple does dominate relative to other players in the same market. It dominates not only in profits, but also is very very strong in usage. It dominates in corporate mobile IT. It dominates in ecosystems.”

            Yes, because Apple dominates what I call the Best Customer Segment. I suppose the tech landscape might be very different if other companies valued, culturally, Apple’s approach. But in general the tech industry thinks what Apple is doing is wrong. There’s a profound lack of respect for design, thoughtfulness, and the user experience, in most of the tech industry.

          5. I hope you didn’t fall out of your chair, I wasn’t trying to hurt you. 🙂

            Never denied that. See, I have an advantage of not being a fan of anyone. My objections against Apple have never been about that. The things I object to, and you know them very well, supersede the other desirable aspects. I want it ALL, especially at premium pricing.

          6. “The things I object to, and you know them very well, supersede the other desirable aspects.”

            And this is why you’re not a good fit with Apple. Much of what you object to forms an important part of the value within the user experience that Apple delivers.

            As Apple can deliver more ‘openness’ and reveal complexity, they will, as long as it does not negatively impact the user experience they are focused on. However, it is likely it will never be enough to satisfy your wants/needs. Apple is not an evil entity sitting in a dark room tenting its fingers and figuring out ways to screw you. But the value and user experience they deliver conflicts with many of your wants and needs.

            Simply wishing you could have it all does not make it practical. Also, for what Apple is delivering the price is not premium, it is fair for the value. The problem is it is not the value you seek, so you see Apple as premium when it comes to pricing. I’ve never felt that, I’ve always seen Apple as a value purchase.

          7. Actually it’s a win-win, they got the other guys to up their game. I do think Apple is self serving however, in unfathomable ways no less. And just like I would anyone else, I call them on it. Can’t just have the cheerleading section be heard.

          8. Apple is financially motivated to serve their customers well, but in doing so they are in conflict with what you want, so you see Apple as self-serving, when in fact that is not their focus. You’re seeing motives and conspiracies where there are none. You view Apple’s actions as unfathomable because you don’t understand why people want what Apple offers.

            You’re not so much calling them on it (whatever it is) as you are complaining that Apple is doing things you don’t want/like or find value in.

          9. And banning Wiley, however briefly, from their bookstore served who? How did that serve their customers?

            How does a singularly exclusive store serve the customer? Monopolies, even internal ones, are bad. Wiley is but one case in point.

            Actually, having bought an Apple device, is it possible to not give Apple more money if one chooses? Sure, just don’t buy anything. No Apps, no media, no books. Buying Apple is a huge commitment.

            Still, I know you disagree, and I know I’m not a good fit for Apple (thankfully). These points must be put through as long as the cheerleading continues.

          10. The edge cases you use only serve to demonstrate how flawed your thinking is. Apple isn’t perfect and never will be. I know you think you’re combatting the cheerleading, a noble cause indeed, but you’re really not. People are saying “Hey, this is where I get value from Apple” and then you essentially freak out because that isn’t where you find value.

            “Buying Apple is a huge commitment.”

            And others would say buying Apple is a huge value. You see commitment where others see value. That’s fine, but it isn’t cheerleading for other people to find value in what Apple does.

            Honestly, you come off as a jilted lover more than anything else.

          11. Pretend I’m your bartender and tell me all about ’em.

            Women, come in all perfect and then the more you get to know them, the more locked-in you are to their whole ecosystems.

            Amirite?

          12. The fact that most of everything that we use is not good enough from a user experience perspective is actually proof that even in consumer segments, user experience is generally not very important for selling products.

            Even if user experience is very important for selling smartphones, we have to recognize that that is not the norm for consumer products. Hence the general assumption that user experience is important for B-to-C but not for B-to-B is questionable at best.

          13. “The fact that most of everything that we use is not good enough from a user experience perspective, is actually proof that even in consumer segments, user experience is generally not very important for selling products.”

            Yes, I said this already, most people are okay with good enough or mediocre. But there is a segment which allows better goods and services to thrive. User experience matters, and sometimes you don’t even know that until somebody shows you something better. We probably all thought our phones were great before 2007, then we saw how bad they actually were.

          14. What happened in 2007 was not simply an improvement in user experience. It was a technical breakthrough. Similar to nylon stockings. There were huge improvements in functionality. The iPhone’s success was not because of just “user experience”. You should not confuse huge functional revolutions with simple user experience improvements.

          15. Ah, I think we’re talking about user experience in two completely different ways. When I say user experience, I mean the total experience of using the product and the associated jobs-to-be-done, and how that impacts the person using it.

            The technical breakthrough and huge improvements in functionality re: the iPhone were indeed a leap forward in the user experience. Huge functional revolutions change the user experience, in fact they are the user experience. It is this sense of user experience that I think can continually be improved upon.

          16. The problem with your broad definition is that it is not compatible with the article.

            When IBM/Microsoft “won” in the last era, it was largely due to IT buyers being the customers for PCs. This customer places a premium on cost, not experience. So, once the customer base shifted from corporate IT buyers, who favored cost, to true consumers, who favored experience, the basis of competition changed.

            Here, Ben states that IT buyers place a premium on cost not experience. If the definition of “experience” includes all the jobs-to-be-done and impacts on the person using the device, it will obviously include all productivity enhancements. Hence, using your definition, the above quote must be interpreted as “IT buyers do not care about much except price. They don’t care too much about any productivity improvements.”

            Nothing can be further from the truth.

            In the article, and in other writings by Ben Thompson, “user experience” is clearly related to the emotional aspects. The definition should be confined to this. Otherwise, the discussion becomes nonsensical.

          17. It might be nonsensical for you. I understand it perfectly. Looking at IT buyers doesn’t tell us anything, since in that case the buyers are not the end users. That is the key difference. When the end user is also the buyer, the user experience matters much more. I work with many mid-sized businesses, and I can assure you that the user experience is not paramount and productivity is indeed sacrificed for cost savings. I hear many complaints about the various systems employees have to use. Indeed, there are armies of consultants that make a living attempting to improve these terrible systems. Is it any surprise that iOS devices are infiltrating the enterprise when employees are allowed to choose their device?

            I digress, I’m also not concerned whether my understanding of user experience is compatible with the article. I know what my experience is as a user, I know what it encompasses. It is necessarily difficult to define narrowly but I think it comes down to whether the use of a product was a positive experience or not, and if positive, to what degree. There is an emotional aspect to this, certainly, but that is not all that it is.

          18. Well, I’m not concerned whether my definition is compatible with the article, but that doesn’t mean it isn’t. I think user experience is very simply defined as a person using a product or service. Is that a broad definition? Maybe it has to be, since each person will create their own experience in the different ways they use a thing.

            Now, that user experience has to encompass all aspects of the product, which again is very broad. Off the top of my head we’ve got marketing, the buying process, retail presence, product presentation, the industrial design of the product, all the jobs-to-be-done (which differ depending on the user), the hardware and software, the ecosystem of content, vertical integration, the support of the product, being able to extend my use of the product by purchasing other products from the same vendor (Apple provides computing devices from pocket to desktop), using all my products within one ecosystem, and so on. I’m sure there’s much I’m forgetting. But all of this impacts the experience of the person using the product, the user experience.

            I leave it to you and others to figure out if disruption theory can be applied here, but it seems to me that Apple is so user-focused that the user experience becomes the product. It is also obvious to me that the user experience can always be improved, there is no good enough.

            Back to segmentation, I don’t think user experience is the primary factor in the buying decision for much of the market. Many people tolerate a less good user experience in order to get cheap. But there is an incredibly valuable and powerful segment that does base decision-making on user experience. Good enough and cheap might not apply in this segment. You would have to first match the user experience that Apple delivers, and that is perhaps impossible. Certainly very difficult, and it doesn’t seem like any other company is even interested in matching Apple in this regard. Never mind the discussion about whether it’s possible. It may not be. It took Apple decades to develop the user experience it now delivers.

            The lie of course (that we hear from Apple detractors) is that you can get the same user experience at a lower cost by purchasing other products. But the user experience is not the same, it’s not even close. Some specific aspects are roughly equal, but the total experience is miles apart.

            So, can Apple be disrupted? I’m not sure. I think it would take breakthroughs on the technical/functional fronts that change/improve the user experience so dramatically that Apple is no longer delivering a better user experience. But again, the user experience is not based on one thing. It’s not just a faster chip or a better screen or this one killer app, it’s everything, the sum of the parts.

          19. I can’t change what the user experience is. It is what it is. I suppose we could say the definition is quite simple but the application of that definition is necessarily broad. It is the sum of the parts, that is reality.

            Applying this re: analysis of Apple seems obvious to me. Perhaps you’re trying to use a specific framework or bias when it comes to analysis that makes it difficult to incorporate the user experience as I’ve discussed it.

          20. “I’m not out to prove disruption theory wrong, I just think it may not apply to Apple.”

            I don’t know squat about disruption theory, or any business theory for that matter. Here’s what I do know:

            All it takes is one ugly fact to destroy a beautiful theory.

            As you suggest, the theory may not apply. If the theory does not apply, you need to either limit the theory or accept that the subject (Apple) is out of context. I lean toward the view that it is just a model, not quite elevated to the status of a theory.

          21. I think if we consider user experience as the product re: Apple, then disruption theory might still work. The point is that for a segment of the market, user experience can never be good enough.

          22. You don’t know crap.

            Anything about business – you don’t understand.

            You see visions in your head…. what you don’t realize is they’re delusions.

            You’re going to start hearing voices too.

    2. Apple is made of the same figurative and ethereal substances as the Industrial Revolution and as The Renaissance were. Through a novel orchestration of ethics, aesthetics and economics, it impacts the auditioning, hence the interpretation of an age-old symphony, human evolution.

      Apple is a coordinated event in History, therefore, one cannot search for a peer where and when space-time can afford none. Disruption theory wouldn’t have helped the dark ages and the artisan economies to forestall the disruptive nature of any form of enlightenment.

      Apple, as is the Industrial Revolution, as is The Renaissance, is…disruption theory distillate; the evolution-accelerating space-time entrant. The surround sound one hears then is the universal screeching of the incumbents’ wheels as they grind ever so slowly to a ‘relativity’ stop.

      1. Man, I can dig it. It’s all so purple. It’s like, ever since Mars went into retrograde the whole sky is singing. Hey, I’d love to sit and chat, but this rhododendron just asked me to dance.

        1. I can see that you and I are quite alike in that we both exclusively engage, and most authentically and honestly so, with comments just as authentic and honest as ours.

          Good faith is, for me, never anonymous. It strikes very close to home. berult.

    3. I have a few follow-up thoughts. Firstly, the disruptive narrative, in terms of being predictable, seems to follow a pattern in modular ecosystems. However, what has changed, and the point I’m making, is more about the idea that modular always “wins” or always gets the most market share. I think that assumption is the one in question. This seems to be something that is possible during a “phase” of a market’s maturity, but not the defining result once it is mature. Microsoft’s dominance was a result of circumstances which no longer exist. So whether the disruption was the result of a fundamental dynamic or just of the market evolving is, I think, the center of the discussion.

      But the point which cannot be ignored is that we have way too little consumer tech market history to make the assumptions many are making. We can apply the same idea, that Microsoft’s dominance was circumstantial, to Google. Perhaps “search” was the right business at the time, for the phase we were in, but not the end result. It is this consumer market evolution which is throwing many for a loop and challenging conventional thinking, because, to my point, conventional thinking isn’t actually all that conventional in reality. The danger lies in believing it was. Hence, as Horace and I have continually pointed out, the trick is in the application, and unfortunately many don’t know the markets well enough to apply said theories correctly.

      1. I’m not sure I fully understand your comment, but I hope I’m saying something relevant.

        “Modular” is not a single state. Microsoft’s dominance was only a single instance of many possible “modular” states.

        What happened is that in the early-2000s, the OS and basic productivity software (MS-Office) became “good enough”. Hence the value chain evolved towards the next step; solutions. Instead of mailing Word-documents around, and keeping customer information in a huge spreadsheet, SaaS solutions emerged that would provide project management solutions or CRM solutions. These solutions were decoupled from the underlying OS (many of them were provided on the Web), which meant that the OS did not have to be Windows anymore. This created a new state of “modularity”. This is the basis of the Cloud as we know it.

        My understanding is that by “modular always wins”, we are only saying that an integrated system will tend to be decoupled. The integrated system may be the hardware layer (Samsung) or it may be the hardware-OS layer (Apple) or it may be the OS-application software layer (Microsoft). Any of these layers, if they become “good enough”, will find themselves decoupled.

        From this, we can learn a lesson. For example Google. Google’s strength is their huge machine learning project that learns from all websites on the Internet, and also from emails in Gmail, location data in Android, etc. The knowledge powers Google Search, Google Now, and their advertising. Google’s power lies in the fact that all of this is integrated.

        Now by applying the “modular always wins” axiom, we can assume that at a certain point, Google’s value proposition will also be decoupled. We can already see this happening. Instead of simply returning a list of websites, Google now returns results based on Wikipedia, Yelp, and other human curated sources, none of which are based on a huge machine learning project. In fact, on mobile, people don’t even use Google but instead use the Wikipedia or Yelp app. Google is in the process of being modularised (decoupled) into an ecosystem of apps, none of which rely on a centralised machine learning project, but instead on human curated (or social) data.

        Basically, in Google, we are seeing “modular always wins” happen at the services-layer.

        My fundamental understanding is that the value chain is extremely long and it is constantly changing. There are many points where integration may span a few nodes, or where they may be broken up and decoupled. And when a collection of nodes is broken up, some of these nodes will connect with adjacent nodes and create a new point of integration. I think that this constant change is what is meant by “modular always wins”. It means that an integrated state can’t last forever.

        My conclusion is that “modular does always win”, but we don’t often realise that the state of “modular” is not always the same.

        As for Apple, I think they have wings. That’s why they can defy the laws of gravity (disruption theory) and fly so high. Most other companies don’t and are thus confined to a few meters off the ground at best. We need a separate discussion for their wings.

        1. I don’t see how the automobile industry which saw independent chassis fabricators, engine builders, and coachworks coalescing into integrated manufacturers supports “modular always wins”.

          I don’t see how cameras, camcorders, hi-fi, video game consoles, support “modular always wins”.

          Of course there’s no such thing as totally modular or totally integrated but it seems to me when it comes to complex, mass market consumer goods that operate in reasonably competitive market settings, the evolution of products is away from modularity and towards integration.

          Perhaps, very early in an industry products are almost totally integrated. The Wright brothers built everything– engine and airframe. One of the earliest cars built by Gottlieb Daimler was built by his company from the ground up. But one could argue these are prototypical or near-prototypical models. Economies of scale seem to dictate that industries undergo a modular phase at lower production levels and evolve towards integration as markets reach mass consumption levels. (Absent a monopolist that stifles competition and hence the need to push the performance envelope rapidly.)

          1. If you look into the components of cars for example, they are sourced from a very large number of suppliers. These suppliers often sell similar components to other companies. Tyres, brakes, etc. That looks quite modular.

            Furthermore, a single car company has many different brands. You might find a component in a Bentley to be the same as one in a cheap Volkswagen. That is again modular.

            Video games are another example of modular, where the chips come from IBM, ATI, etc. Games come from independent vendors so that is again modular.

            I’m not sure what you mean by your examples not being modular. Maybe you are looking at only a single interface?

          2. Look back to the last sentence of the third paragraph of my post: I said the evolution of the industry is towards integration. There is really no such thing as a completely integrated manufacturer. You mistake me for talking about integration vs. modularity in absolute terms; I am talking about them in relative terms.

            The way you define modularity is probably different from how I define it. By your definition, outsourcing a bolt represents just as much modularity as outsourcing a RAM chip or outsourcing an OS. I haven’t thought about it enough, but surely some distinction has to be made between these three cases. I would say that from outsourcing a bolt, to a RAM chip, to an OS, the level of “modularity” rises.

            When a car manufacturer starts manufacturing engines in-house when it used to buy it from an independent engine builder, can you agree with me that that is a case of a movement towards integrated manufacture and away from modularity? Notice that even if all the engine parts and components that are used in both cases (in-house vs outsourced engines) were purchased from an outside supplier, I would still consider the case as a movement towards integration.

            For me integrated vs modular in the production of goods is all about how much of the total production process is undertaken in-house as opposed to outsourced from other firms. And I think that is the operative definition in use when Disruption Theory talks about how industries start out with companies building the whole product from the ground up then turn to outsourcing (i.e. go modular) as the industry matures. That definition thus doesn’t include component interchangeability across a company’s product line, as in your example where the Volkswagen company uses the same component in a Bentley and a VW. This latter case is really one of modular design rather than modular production (although the two are related).

            Sorry, too long already.

          3. Since as you say, there is no such thing as completely integrated state, the discussion depends on which interface we are looking at. Each industry will look modular in one way and integrated in another.

            You could use your proposed measure of how much production is done in-house, but that seems quite arbitrary and I’m not sure why that would be a good measure.

            What you have to do is look into each element in the value chain. Some will be modular, some will be integrated, even in the early stages of innovation. “Modular always wins” means that as each element improves in performance, it will overshoot the required specs. At this point, it will make more sense to modularize that component. It does not mean that the total integration level of the final product, measured by some arbitrary metric, will decrease.

        2. Again, if you look upon Apple as an event of change instead as an institution of change, then, the laws of physics, determinism, ought be complemented by randomized considerations. Context, which epitomizes integration, comes fully into play.

          Apple…happens, …and throws everyone off. Disruption happens, and validates, ex post, a theory that it is, ex ante, no part of. It’s the paradox of un-bequeathed change; quantized evolution.

          1. Δx·Δp ≥ h/2π
            where x and p are any pair of complementary variables.
            You are correct! There is no trajectory!

        3. Yes, not sure I clearly communicated what is in my head. Deep philosophical stuff involved in this discussion.

          As I stated, we have to know when it applies. In this case we are talking about a variable related to performance, and what happens when performance becomes subjective. Hence in fashion, we would never say that $15 jeans are disruptive to $200 jeans. We know that fashion is subjective and therefore there are other dynamics at play. Similarly, in cars, a Toyota is not disruptive to a Porsche. (We have talked about this before on this site.)

          That’s the point of the dynamics that drove MS in this post. Performance was not subjective when IT customers were the buyers of PCs. It became subjective once consumers entered the market.

          The reason I think it plausible to say Microsoft was an anomaly is that we will forever have a very large pure consumer market for electronics. This is also the basis of the observation that we must now view the world in terms of market shares, not a single market share. Niches, and profitable ones at that, become sustainable and viable.

          Apple will never be disrupted for the same reason $15 jeans will not disrupt $200 jeans. What’s interesting is the number of humans on the planet willing to spend on a more expensive smartphone due to superior user experience. Hence this post on profitable niches for Insiders. http://techpinions.com/profitable-niches/35142

          Will Google get disrupted or obsoleted? I suggest the latter.

          Yet overall, knowing that a market now exists which is huge and has much more diversity, complexity, and subjectivity than any that has existed before in tech, we have to rethink these things freshly, simply because so much has changed.

          1. I appreciate your examples like jeans and cars. And I do agree that there is an emotional element for consumer products, as opposed to B-to-B products.

            My proposal is to try harder to add these examples on top of the Disruption framework instead of starting something fresh.

            I think consumer tech markets should be understood by adding an additional component to Disruption theory. My proposal is that the first hypothesis should be that Apple has wings, and although it too feels the forces of Disruption, it flaps them hard to keep flying. Hence instead of looking for a void in the earth’s gravitational field, we should try to find what bird’s wings look like, how you should flap them, and what muscles you need for that.

            If that proved completely impossible, if the hypothesis is rejected and if Disruption Theory itself makes understanding consumer markets impossible, then we can acknowledge a void and start ignoring Disruption Theory for these markets.

            Confucius said 温故知新; “He that would know what shall be must consider what has been.” I think unless proved otherwise, it is more appropriate to try a bit harder to consider what has been.

          2. This is why I’ve been an advocate of adding a behavioral science / human observation component to this template. Horace and the Disruption school at HBS have heard me articulate this. Because humans are diverse, the only way we can truly add something useful to this framework for a manager, executive, CEO, entrepreneur, VC, etc., those who actually need to understand this, is to add the element of human study. The challenge is, this is very hard for a lot of people. Humans are not clean cut, and often not easy to understand. Getting inside the minds of consumers is essential but also very hard and will challenge many. The theory helps make things clean but consumer markets are messy. And as I said, very young, therefore everyone still has a great deal to learn.

          3. I agree.

            In general, I think mainstream theories in most social sciences have unfortunately ignored good empirical behavioural studies, and for some strange reason emphasised either number crunching or armchair thinking. Mainstream economics, management theories and maybe marketing (although I’m less aware of good theories in marketing) are all guilty.

            Disruption Theory stands out among numerous business concepts in that it tries hard to build a theory that lasts, and very importantly does not rely on the “stupid manager theory”. It does not dig into behavioural aspects, but I would postulate that it may help to bring the microscopic conclusions that behavioural science would uncover, and show how they affect the bigger picture.

            The science is very young but there is hope.

    4. First, I fully agree with you: one example is not enough to discredit such a successful theory as Disruption. We need more than Apple to do that.

      >> integrated systems win during the early stages of the market when the
      product is constantly improving in a way that customers appreciate the
      advances.

      First, the iPhone is modular in many respects: the app ecosystem, the display, some of the electronics, etc.

      And why is modularity even important? Two main reasons: cost and rapid rates of innovation.

      So Apple does the smart thing: where a fast rate of innovation is important, it opens the platform. For example, apps.

      But I believe the rate of hardware innovation is far slower than software’s, and hardware advances usually get copied after some time. At this stage of the game (say, since Android 4.0/4.4, when Android finally started offering “good enough” design and apps adapted), in most cases hardware isn’t the differentiating feature for the common man. And I’m not sure how big a difference hardware in itself made even earlier, versus all the rest.

      Also, one thing Christensen doesn’t talk about is platform lock-in, network effects, etc. (probably better generalized as barriers to entry), which can make the bar for a product to be “good enough” impossibly high. This is what explains why Microsoft’s integrated OS is still winning against Linux, the modular OS.

      So maybe this has some relevance to Apple?

      1. Great comment!

        I totally agree.

        Regarding the observation that the iPhone is modular in many respects, Christensen described the concept of “law of conservation of attractive profits”, which has been used to describe how, when combining each element in the value chain, the aggregate level of modularity and integration tend to be conserved. In fact, the layer adjacent to an integrated one tends to be modular, and vice-versa.

        Comparing Android and iOS, Android is more open at the hardware-OS level, which means that the adjacent layer, the OS-App layer, tends to be more integrated. This is the fragmentation problem (which has been fixed to some degree by Google’s imaginative efforts, but is still an issue). Developers have to be careful about which OS version is running on the device, and what the hardware supports.

        On the other hand, iOS is integrated in hardware-OS layer, which means that the adjacent OS-App layer is more modular. This means that software developers do not have to worry too much about hardware or OS version fragmentation.

        As you mention, it is very likely that Apple carefully designs which layers should be integrated and which should be modular. I agree that network effects are likely an important factor in the decision. Having said that, it is also likely that technical innovation will eventually cause the modular/integration choices to shift.
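The fragmentation burden described in this thread can be made concrete. Below is a minimal, hypothetical Python sketch (all names are illustrative; this is not any real platform API) of how a developer sitting on top of a modular hardware-OS layer ends up doing the integration work themselves, through explicit version and capability checks:

```python
# Hypothetical sketch: when the layers below are modular, every
# variation in hardware and OS version becomes a branch the app
# developer must write and test. Names are illustrative only.

class Device:
    """A minimal stand-in for a device the app might run on."""
    def __init__(self, os_version, has_camera, screen_width):
        self.os_version = os_version      # e.g. (4, 4)
        self.has_camera = has_camera
        self.screen_width = screen_width  # pixels

def choose_layout(device):
    # The developer absorbs the integration work: screen diversity
    # below the app layer surfaces as branching in the app itself.
    if device.screen_width >= 1080:
        return "tablet-layout"
    return "phone-layout"

def can_scan_qr(device):
    # A feature may need gating on both OS version and hardware.
    return device.os_version >= (4, 4) and device.has_camera

old_phone = Device(os_version=(4, 0), has_camera=False, screen_width=720)
new_tablet = Device(os_version=(5, 0), has_camera=True, screen_width=1200)

print(choose_layout(old_phone))   # phone-layout
print(can_scan_qr(new_tablet))    # True
```

On an integrated platform, most of these branches never reach the developer, which is the point the comment above makes about iOS.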

        1. Thanks!

          Thinking a bit more about it, I’m not even sure that the integrated/modular dichotomy still holds.

          Why? Is the App Store open? Well yes, anybody can write apps. But is the App Store closed? Well yes, Apple has huge control over it, and its 30% cut of revenues is probably more than the profit of many app makers.

          But regarding your point about a new technical innovation that will change the modularity question, yes, it could happen. Maybe Project Ara will be it.

          As for fragmentation being a result of Christensen’s law? Not sure, because fragmentation is technical but Christensen’s law is mostly about business. For example, technically, Linux could have been Windows instead. But I’ll have to think more about it; it’s an interesting question.

          1. I am not aware of Christensen describing this “law” in detail, so what I say in the following is mostly my thinking.

            I think the easiest way to think of it is to go to an extreme. If everything was modular, then it would be up to the customer to put everything together. They would have to build their own phone from parts (if it is an Ara phone), they would have to choose their own carrier, stick their own SIM into it, configure their phone for that carrier maybe, install the smartphone OS and install basic apps. In this case, although the products are modular, integration itself has not gone away. The customer does the integration.

            Regarding Android fragmentation, in that ecosystem, it is the responsibility of the developer to test their apps against tens or even hundreds of devices to ensure compatibility. It is the responsibility of the developer to write code that takes into consideration the various screen sizes and hardware capabilities. Here, the developer is the integrator because his responsibility spans the hardware, the OS and his app. Modularity of the hardware and OS is causing integration headaches for the developer.

            The task is easier for iOS developers. Although there is some fragmentation, many developers choose to only write for the most recent iOS version. Hence they only have responsibility for the app. This is the modular situation. Apple takes care of the hardware – OS integration so that the developer doesn’t have to.

            As outlined above, integration flowing to the adjacent layer is, I think, mostly a technical issue. If you are familiar with object-oriented programming and the importance of encapsulation and interface boundaries, as well as why we sometimes need to violate these principles for performance reasons, I think it is easier to understand.
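For readers who want the encapsulation analogy made concrete, here is a minimal, hypothetical Python sketch (all names are illustrative) of a platform vendor doing the hardware-OS integration once, behind a stable interface, so that the adjacent app layer stays modular:

```python
# Hypothetical sketch of the encapsulation analogy: the platform
# integrates the messy lower layers behind one stable interface,
# and the app depends only on that interface. Names are illustrative.

from abc import ABC, abstractmethod

class Platform(ABC):
    """The stable boundary the app programs against."""
    @abstractmethod
    def take_photo(self) -> str:
        ...

class IntegratedPlatform(Platform):
    # The vendor does the hardware/OS integration once, inside
    # the boundary, instead of every app doing it separately.
    def take_photo(self) -> str:
        return "photo-bytes"

class App:
    # The app never looks behind the interface: this is the
    # modular position the comment above describes for developers.
    def __init__(self, platform: Platform):
        self.platform = platform

    def run(self) -> str:
        return self.platform.take_photo()

print(App(IntegratedPlatform()).run())  # photo-bytes
```

Violating that boundary (reaching around the interface for speed) is the performance trade-off the comment alludes to.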

      2. Not so sure just how integrated Windows is as an OS. The driver model is quite modular. So are all the APIs, which anyone can develop against as well. It’s the kernel that’s monolithic.

  9. The people who clearly took the wrong lesson to heart are a good chunk of the tech press, especially in the first decade of this Millennium. Everything was Highlanderism, where “there can be only one!”, and people churned out “Apple will be dead any day now” with depressing regularity because they all believed it was only a matter of a few months or years before Microsoft turned its gaze to music/laptops/mobile/tablets/etc. and crushed the new players with its mighty and somehow deserved monopoly powers.

    And given the way Microsoft attempted to portray itself during that time period, it truly believed that a silent majority was holding its breath, waiting for Microsoft to make its moves so that its massive client base would sweep it back to a majority position.

    It took most of that decade for Microsoft and the tech press to realise that that customer base did not actually exist. User base, yes, but not customers, not a billion people that actually went out of their way to buy Microsoft products.

  10. I have to disagree with the person saying Apple is the exception. It’s just the FIRST exception… to all the old rules. But look at PayPal, eBay, and Google: these were companies started to solve a simple problem. They worked so well they grew huge.

    What they lacked was the unified vision of Steve Jobs, which he ingrained in Apple through the internal Apple University, teaching executives to keep the focus on user experience and great products, not on quick profits.

    Sadly, what makes Apple great is right in front of us for any company to copy, but with the huge greed factor out there, most companies miss it.

