Will the Future of Computing Emerge from the Fog?
Many scary movies have at least one scene where someone—or something—emerges from the fog. There's something about fog that's both a bit eerie and yet intriguing: we can't help but want to see what appears from it.
Surprisingly, the future of computing may actually be one of those things. But, in this case, its emergence shouldn’t be seen as frightening, but instead, as a very exciting and intriguing new development. I’m talking specifically about the relatively new, but little understood, concept of “fog computing.”
The basic idea of fog computing is to leverage the new software technologies, processes, and applications built to take advantage of cloud computing infrastructure, but deploy them on computing hardware closer to the edge of the network. In other words, it brings the cloud closer to the ground, hence "fog."
What’s fascinating about the concept of fog computing is that it brings together just about every major tech buzzword that you’ve likely heard over the last few years into one place. Cloud, edge, IoT, AI, virtualization, blockchain, containers, DevOps, 5G, analytics, autonomous cars, smart cities, and more all come together in fog computing, as do several other lesser-known but critically important concepts like TSN (time-sensitive networking—essentially a time-prioritized version of wired Ethernet), distributed computing (where computing applications are split across multiple environments), and IT-OT collaboration (that is, IT department and operations technology departments working together to solve real-world problems).
In fact, despite its somewhat hazy definition, fog computing leverages, and even unifies, a surprising number of these important and cutting-edge technologies into a comprehensive whole. To be sure, not every kind of computing application is, or will need to be, thought of as a "fog computing" opportunity, but a lot of the most interesting ones in business environments are going to be.
Fog computing isn’t being described and discussed just because it’s an interesting concept, either. Products and services are being built for it because of real-world problems. As I discussed in a previous column a few weeks back (see “Edge Computing Could Weaken the Cloud”), we’re starting to witness a massive shift away from the centralized computing model that enabled the cloud, and towards the distribution of more computing power back out to the edge of the network. The reason for this is that big-picture applications like autonomous driving, smart cities, smart agriculture, smart homes, and even remote medicine all have requirements that can’t always be addressed by the cloud in its current form.
Issues such as latency (small time delays), security, network reliability, performance, and privacy, among many others, are extremely difficult to completely overcome in centralized cloud computing models. As a result, companies are both reshaping traditional data center computing components and building completely new types of hardware to bring computing elements that previously only existed within the cloistered confines of data centers out to new types of devices and new types of environments. At the same time, software is being rethought, rearchitected, and freshly built to bring applications from the cloud out onto the edge of the network.
One of the many fascinating outcomes from this development is rethinking where and in what forms this new distributed computing hardware environment will exist. For example, can and should computing elements be integrated into network equipment (built by traditional network vendors) that sits out at the edge of the radio network? Or, should we start to see new types of micro-sized datacenters (built by more traditional computing companies) get deployed into entirely new types of places? The answer, of course, is that both are likely to happen, leading to some interesting new competitive situations.
From a software perspective, the challenge is figuring out how the kinds of complex workloads traditionally done in big data centers can be broken up and/or simplified. For example, we could start to see things like AI-based inferencing being done on something as simple as a Raspberry Pi circuit board. Doing so would likely leverage Docker-style containers, virtualization, and other complex software advancements in entirely new ways.
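To make the latency argument concrete, here is a minimal, purely illustrative Python sketch. All of the numbers (round-trip times, inference time) and the toy "model" are assumptions chosen for illustration, not measurements from any real deployment; the point is simply that moving inference from a distant data center to a nearby fog node removes most of the network round trip from the total response time.

```python
# Illustrative sketch: why moving AI inference from a distant cloud
# data center to a nearby fog node reduces end-to-end latency.
# All timing constants below are hypothetical, for illustration only.

CLOUD_RTT_MS = 80.0   # assumed round trip to a distant cloud data center
FOG_RTT_MS = 2.0      # assumed round trip to a fog node at the network edge
INFER_MS = 5.0        # assumed model inference time (same on either tier)

def tiny_model(sensor_reading: float) -> str:
    """Stand-in for an AI inference step: flag anomalous sensor readings."""
    return "anomaly" if sensor_reading > 0.9 else "normal"

def end_to_end_latency_ms(rtt_ms: float, infer_ms: float = INFER_MS) -> float:
    """Total response time = network round trip + inference time."""
    return rtt_ms + infer_ms

cloud_latency = end_to_end_latency_ms(CLOUD_RTT_MS)
fog_latency = end_to_end_latency_ms(FOG_RTT_MS)
print(f"cloud: {cloud_latency} ms, fog: {fog_latency} ms")
```

Under these assumed numbers, the fog path responds roughly an order of magnitude faster, which is the kind of margin that matters for applications like autonomous driving where decisions must be made in milliseconds.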
Even with all these interesting efforts, we aren't going to see traditional cloud, network, data center, or common endpoints going away. These new fog computing efforts are typically layered on top of these still-critical infrastructure elements, sometimes through the deployment of small dedicated fog nodes. In fact, several companies have started building proprietary solutions that leverage some of these ideas to create unique opportunities for themselves. As with most broad industry initiatives, however, it's going to take more open efforts—like those being driven by the OpenFog Consortium—to achieve widespread success.
Moving forward, though, it’s becoming surprisingly clear that the future is in “the fog.”