And yet, there it was. eBook sales in the US declined 17% last year, and printed book sales were up 4.5%. What happened to the previous forecasts for electronic publishing and the inevitable decline of print? Wasn’t that widely accepted as a foregone conclusion when Amazon’s first Kindle was released about 10 years back?
Of course, there are plenty of other similar examples. Remember when iPad sales were accelerating like a rocket, and PC sales were declining? Clearly, the death of the PC was close at hand.
And yet, five years later, iPad sales have been in continuous decline for years, while PC sales, though they did suffer some decline, have now stabilized, particularly in notebooks, which were seen as the most vulnerable category.
Then there’s the concept of virtually all computing moving to the cloud. That’s still happening, right?
Not exactly. In fact, the biggest industry buzz lately is about moving some of the cloud-based workloads out of the cloud and back down to “the edge,” where end devices and other types of computing elements live.
I could go on, but the point is clear. Many of the supposedly inevitable, foregone conclusions of the past about where the tech industry would be today have turned out to be mostly or completely wrong.
Beyond its historical interest, this issue is critical to understand when we look at many of the “inevitable” trends currently being predicted for our future.
A world populated by nothing but fully electric, autonomous cars, anyone? Sure, we’ll see an enormous impact from these vehicles, but their exact form and the timeline for their arrival are almost certainly going to be radically different from what many in the industry are touting.
The unimpeachable, long-term value of social media? Don’t even get me started. Yes, the rise of social media platforms like Facebook, Twitter, Snapchat, LinkedIn, and others has had a profound impact on our society, but there are already signs of cracks in that foundation, with more likely to come.
To be clear, I’m not naïvely suggesting that many of the key trends that are driving the tech industry forward today—from autonomy to AI, AR, IoT, and more—won’t come to pass. Nor am I suggesting that the influence of these trends won’t be widespread, because they surely will be.
I am saying, however, that the tech industry as a whole seems to fall prey to “guaranteed outcomes” on a surprisingly regular basis. While there’s nothing wrong with speculating on where things could head and making forceful claims for those predictions—after all, that’s essentially what I and other industry analysts do for a living—there is something fundamentally flawed with the presumption that all those speculations will come true.
When worthwhile conversations about potential scenarios that don’t match the “inevitable direction” are shut down by groupthink (sometimes driven by those with a vested interest), there’s clearly a problem.
The truth is, predicting the future is extraordinarily difficult and arguably even impossible to really do. The people who have successfully done so in the past were likely more lucky than smart. That doesn’t mean, however, that the exercise isn’t worthwhile. It clearly is, particularly in an industry that moves as quickly as this one.
Tech futurist Alan Kay famously and accurately said that “the best way to predict the future is to invent it.” We live and work in an incredibly exciting and fast-moving industry where that adage comes true every single day. But it takes a lot of hard work and focus to innovate, and there are always choices made along the way. In fact, many times it isn’t the “tech” part of an innovation that’s in question but, rather, the impact it may have on the people who use it and/or on society as a whole. Understanding those kinds of implications is significantly harder, and the challenge is only growing as more technology is integrated into our daily lives.
So, the next time you hear discussions about the “inevitable” paths the tech industry is headed down, remember that they’re hardly guaranteed.