"I have a mind like a steel... uh... thingy." Patrick Logan's weblog.

Monday, August 18, 2008

Between You and the Cloud

Damien Katz wonders...

"Cloud computing is great. Right up until you lose your connection, then what?"


Then you wish your software was a little more aware of the difference
between you and your cloud.

As programmers, a lot of us toss around the Fallacies of Distributed
Computing (http://en.wikipedia.org/wiki/Fallacies_of_Distributed_Computing),
but then expect the network to always be there anyway.
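A minimal sketch of what "aware of the difference between you and your cloud" might look like in practice. This is only an illustration, not anyone's actual API: the function name and the fallback scheme are made up, and it assumes nothing beyond Python's standard library. The point is that the first fallacy ("the network is reliable") shows up as an exception you choose to handle rather than ignore.

```python
import urllib.request
from urllib.error import URLError

def fetch_with_fallback(url, cached, timeout=2.0):
    """Try the cloud first; fall back to a local copy when the network fails.

    Hypothetical helper for illustration: `cached` stands in for whatever
    local state your software kept from the last successful sync.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read(), "live"
    except (URLError, OSError):
        # The connection is gone or the host is unreachable: degrade
        # gracefully instead of pretending the network is always there.
        return cached, "cached"

# ".invalid" is a reserved TLD that never resolves, so this simulates
# losing your connection to the cloud.
data, source = fetch_with_fallback("http://example.invalid/feed", b"stale copy")
print(source)  # "cached"
```

Software written "cloud style" tends to put the `except` branch nowhere at all; software aware of the user-to-cloud gap puts something useful there.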

I absolutely believe most people and businesses should be using cloud
computing for nearly everything... soon. But most of the software has
to catch up to cloud style, and cloud style has to evolve toward more
awareness of the sometimes-impassable distance between the cloud and
the cloud user.


That condition will not change anytime soon, so software has to become
more aware of the issue.

1 comment:

Craig said...

I personally feel the term Cloud Computing is a misnomer. What most refer to as Cloud Computing, I refer to as Domain Compute Services. We have Amazon, AppEngine and others, yet none of these services interoperate across domains (unless some clever hacker makes it so). It's more like /Clouds/ Computing than Cloud. And when one Cloud goes down, it rains on everyone's parade, because it crosses everyone's domain; that's the nature of the internet: inter-networking.

People still don't get that any single connecting point is a point of failure, and every day we create more connections. Yet these domain-specific services will and do fail, even when they're built to be redundant. Single points of service failure are common when those services see new usage patterns.

The answer is simple. Don't have servers. Don't have single connectors. Don't have single services. Multiple trusted peer-based networking - and I don't mean p2p as it's done currently - is where we need to be thinking. Not CloudS Computing.

There's an even greater problem looming here: huge data silos on the horizon. So much data in one place that, in order to use it, it's more efficient to move the code to the data. Yet then the compute power sitting with the data limits how it can be used. Scientists are already running into the problem of reproducing results derived from large datasets in order to verify them and build further studies on them. The LHC is going to become another classic example of this. The sensor net that's arriving, with all kinds of devices pumping their data online into silos, is only going to grow and worsen the problem.

There's an answer, and it's called convergence. The brain is an auto-associative network for a reason. Networking as we know it will need to change dramatically if we're to make real progress toward fault-tolerant inter-networking. I feel we need major changes at the protocol level, yet with legacy apps, NATs, content filters and other policies fucking up so much experimentation, it just feels like an uphill climb. A long slow climb along the stairway of hell to heaven, only then to become part of The Cloud.
Because Cloud Computing as we know it now certainly isn't anywhere close to heaven.

About Me

Portland, Oregon, United States
I'm usually writing from my favorite location on the planet, the Pacific Northwest of the U.S. I write for myself only, and unless otherwise specified my posts here should not be taken as representing an official position of my employer. Contact me at my gee mail account, username patrickdlogan.