I see "cloud" computing and think "mainframe" and "dumb terminals" and "thin clients".
Yeah, I see this as well.
And in all honesty, what's good for businesses isn't always good for us home users. The 100%-online world is a bit of a dream, and there are still unresolved security, privacy, latency, bandwidth, and trust issues.
Frankly, I don't think going 100% cloud is even a philosophically good idea. There's nothing inherently evil or wrong about offline or hybrid applications, and I just don't get why people are so driven to push everything under the sun into the cloud.
I can understand it for some applications. But everything? Really? Why? And why "eventually 100%"? Why not eventually hybrid? What's so horrific about offline computing that we have to assume that 100 or 1000 years from now we'll all be "100% on the cloud"?
Frankly, I think it's more dogma than sense.
My opinion is that 100-1000 years from now, we're going to be using systems that work both online and offline: hybrid systems with seamless synchronization between local and remote state. That's my vision, and IMHO it makes a lot more sense.
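To make the hybrid idea concrete, here's a minimal sketch (in TypeScript) of the "outbox" pattern such a system might use: writes land in local state first, and a queue of pending operations is drained to the remote side whenever connectivity returns. All the names here (HybridStore, sync, the push callback) are hypothetical, and a real system would also need conflict resolution, which this deliberately skips.

```typescript
// Hypothetical sketch of an offline-first "outbox" pattern; not a real library.

type Op = { id: number; key: string; value: string; ts: number };

class HybridStore {
  private local = new Map<string, string>(); // authoritative local copy
  private outbox: Op[] = [];                 // writes not yet acknowledged remotely
  private nextId = 0;

  // push is the transport: e.g. a PUT to the remote replica.
  constructor(private push: (op: Op) => Promise<boolean>) {}

  // Writes always succeed immediately against local state,
  // whether or not the network is available.
  set(key: string, value: string): void {
    this.local.set(key, value);
    this.outbox.push({ id: this.nextId++, key, value, ts: Date.now() });
  }

  get(key: string): string | undefined {
    return this.local.get(key); // reads never block on the network
  }

  // Called whenever connectivity is (re)gained: drain the outbox in order.
  async sync(): Promise<void> {
    while (this.outbox.length > 0) {
      const op = this.outbox[0];
      const ok = await this.push(op);
      if (!ok) return;     // still offline; retry on the next sync()
      this.outbox.shift(); // acknowledged: safe to forget
    }
  }
}

// Usage: behaves identically offline; sync() flushes queued writes once online.
async function demo() {
  let online = false;
  const store = new HybridStore(async (_op) => online); // stub transport
  store.set("draft", "hello");     // offline write, instantly visible locally
  console.log(store.get("draft")); // "hello"
  await store.sync();              // no-op while offline
  online = true;
  await store.sync();              // outbox drains once connected
}
demo();
```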
And frankly, some things may never be totally or even partially online, even 1000 years from now. Sensitive medical info, business trade secrets, and classified information are the kinds of things it will always make sense to keep isolated from the general Internet.