Megatrend alert: The rise of ubiquitous computing

I like to observe patterns that emerge around me, especially where technology is involved. The rise of the Internet has pushed many applications from distributed to centralized. Back in the 1990s, we were all evolving from mainframes to local area networking and client/server development.

That was a paradigm change, but now we’re moving back to centralized computing again. If you’re keeping track, we’ve gone from mainframe (legacy) computing, to small distributed systems (client/server), and now to cloud computing, which shares centralized resources.

With growing interest in edge computing, the Internet of Things (IoT), and 5G communications, we are moving from “centrally delivered” to “ubiquitous computing.” What the hell does that mean?