Anyone who’s spent any time as a software engineer or managing IT will attest that, just like the clothing industry, computing has its trends and fashions. Some, like client-server architecture, have been around so long that it’s hard to imagine a time without them. Like the necktie, they’ve simply become the standard for how we do business. Others seemed long gone only to resurface in a new form, like VDI, with its pedigree stretching back to the time-sharing systems of old (bell bottoms, anyone?). The question then becomes “which trends to follow?” Do you blindly jump on every one that comes along, like a teenager desperately trying to fit in? Or do you stubbornly play the rebel and eschew the current fashion in favor of classic strategies and technologies?
The answer, of course, is “neither.” When it comes to technology trends, the enterprise can ill afford to embrace them blindly, nor can it afford to ignore them. Each must be carefully considered and tested, and ultimately a decision must be made based on the technical and, more importantly, the business impact it will have on the enterprise as a whole. I realize that to most of you this sounds like simple common sense, and it should. However, after 17 years in the industry I’ve learned that what is clearly common sense to the individual can become something very different to the organization, and no trend has illustrated this better than cloud computing.