Like history, the computer industry is cyclical, so I predict that cloud computing will last fifteen years. Why?
Computing started as a single logical unit with all resources available locally. Then came the mainframe, where computing resources were centralised and accessed through remote terminals. When mainframes became oversubscribed, resources moved back to the client side (though some services, admittedly, stayed remote). Cloud services are centralising services once again, with devices acting like terminals for those resources.
Each iteration of this cycle has driven the evolution of both the client and the centralised “server”, because some requirement has exceeded the limitations of the current setup. Mainframes couldn’t handle the need to deliver graphical interfaces and large amounts of I/O. Client setups can, but are not financially or collaboratively efficient. So what requirement will mean the abandonment of the “cloud” and bring resources back to the client? Some possibilities:
VR – developments like the Oculus Rift and HoloLens may bring data back to the client side because of the need to process huge amounts of data in real time,
Decentralisation – IPv6 and cheap hardware may mean that each device has all the resources it needs and talks to other devices in an always-on, peer-to-peer configuration. Centralised, static resources like file servers will be obsolete. Replication will replace backups, and file references will replace large data stores,
Disaster – the Internet breaks, and some resources are moved back to the clients. This may also come in response to a perceived data threat, or a virus that cripples the global network.