Hyperconnection and the Future of Computing

A great article in the Economist on March 12th, “After Moore’s law: The Future of Computing,” got me thinking.

According to Moore’s Law, “processing power doubles roughly every two years as smaller transistors are packed ever more tightly onto silicon wafers.” However, this rate, which held for roughly 50 years, has now slowed to a doubling about every 2.5 years.
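To see what that slowdown means in practice, here is a small illustrative calculation (my own sketch, not from the article) comparing cumulative growth under a 2-year versus a 2.5-year doubling period:

```python
# Illustrative sketch: cumulative growth under different doubling
# periods. The time horizon is hypothetical, chosen for comparison.

def growth_factor(years: float, doubling_period: float) -> float:
    """Multiplicative growth after `years` if capacity doubles
    every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over a decade, a 2-year doubling gives 2^5 = 32x growth,
# while a 2.5-year doubling gives only 2^4 = 16x -- half as much.
print(growth_factor(10, 2.0))   # 32.0
print(growth_factor(10, 2.5))   # 16.0
```

Even a modest stretch in the doubling period compounds into a large gap over time, which is why the slowdown matters.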

The author argues that computational progress will not actually slow, but will increasingly come from improved algorithms and deep learning, the connected cloud, and access to specialized chips embedded in the cloud.

The Economist forgot to name the revised law, so I’m calling it The Law of Hyperconnection.

Under the Law of Hyperconnection, overall network performance doubles every two years. According to the Digital Universe study, that is also how often the world’s data doubles, which suggests a correlation.

But what does network performance mean? Moore’s insight was well timed: in 1971, when the first commercial microprocessor was introduced, we entered a period in which computing power was directly tied to productivity and other business value. That linkage has since faded. In 2016, the endpoints that generate data and execute actions drive more computational progress than the central processor, particularly as those endpoints increasingly exchange data with the Internet of Things.

Today, though, it is the network linking stuff (things, software, and the cloud) with people and processes that creates the most value, and that value grows exponentially as the number of users increases (the network effect).
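The network-effect claim can be made concrete with a simple model. One common (and contested) formalization is Metcalfe’s law, which values a network in proportion to the number of possible pairwise connections between its users; the sketch below assumes that model purely for illustration:

```python
# Illustrative sketch of the network effect, assuming a Metcalfe-style
# model: value proportional to the number of pairwise links,
# n * (n - 1) / 2. The constant k is an arbitrary scaling factor.

def network_value(users: int, k: float = 1.0) -> float:
    """Estimated network value under Metcalfe's law."""
    return k * users * (users - 1) / 2

# Doubling the user base roughly quadruples the modeled value:
print(network_value(1_000))   # 499500.0
print(network_value(2_000))   # 1999000.0
```

The key property is superlinear growth: each new user adds value to every existing user, so value rises much faster than headcount.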

So, it’s the relationship of data, stuff, people and processes that powers the value equation of the Law of Hyperconnection.

Visit our site to read how our clients and others use Digital Business Innovation to turn Hyperconnection into competitive advantage.

Allan Adler, Managing Partner

#DigitalBusiness #Ecosystems
