Thursday, June 18, 2009

The Technical Vacuum of American Industry

Having been in the software development industry for my entire career, I've seen how computing has revolutionized the way some aspects of business and the economy function.

And I've seen that some businesses insist on remaining stuck in a 1960s-era mentality toward technology, treating it as a resource to be consumed only when necessary - like toner for a copier, or paper cups at the water cooler.

It says here that such a mentality is a deadly practice for any business wanting to leverage technology over the next two decades.

The new economy brought on by the advent of handheld personal computers and devices, tied together via a world-connected Internet, is, amazingly, still hidden from the eyes of many managers, with the value of technology-savvy individuals realized only in a mop-up mode. The problem is that technology is being managed by, well, managers; managers without a clue what they're managing, who, as a result, are piloting their companies into a technical vacuum.

The old-guard mentality around technology and software holds that you bring in an IT guy - the kind everyone likes to make fun of - after some process or project is well on its way, to see if maybe an Excel spreadsheet or a webpage might simplify, or at least publicize, some aspect of the operation. The worse model is the one where some half-baked notion of software is implemented without thought to design or scale, necessitating that expensive after-the-fact support be brought in to fix a disastrous implementation. The latter notion is frequently seen in companies that live under the idea that IT resources are largely interchangeable, disposable, but certainly never part of a company's "core competencies."

Such is precisely the model that must change in order for companies to adapt to 21st century technological realities. Technology experts must be present at every level of most enterprises to bring their expertise to bear on every aspect of an enterprise's operation. That expertise must be brought in as a partnering peer at the beginning of every new project, every new concept, every new plan undertaken.

This is not a temporary change. Technology is now an integrated part of everyday life, and forever will it remain so. A reinvention of the business management model that recognizes the mandatory inclusion of technology expertise up front is in order. How well businesses respond to the new world order of technology is unclear. Those who embrace the integration of technology will create for themselves a vital competitive advantage over the next twenty years. Those who don't will wonder why their grasp of technology seems perpetually inadequate, as they continue to bring in staff to close the technology gap only after the fact, increasing expenses while blaming the very technology they are unwilling to embrace.

Companies that think they are forward-thinking in this regard probably believe in their approach to IT because they've added a CIO or CTO position to their senior hierarchy, and leave such people to design networks, control desktop deployments, manage their printers, and staff helpdesks. They manage bulletin boards and field arguments over Microsoft versus Apple, Windows versus Linux. They're also the first to heed the call to cut IT costs when lean times arrive.

Hear this well: That is not enterprise technology integration.

A technology-integrated company mandates the assessment of the risk and benefit that technology can bring to every enterprise project, no matter how distant the technical angle might seem. It leverages the knowledge that, for example, electronically connected design teams can cross continental boundaries and timezones to achieve a continuously operating work force. It identifies how corporate intellectual property can be protected and leveraged to the profit of an entire company, setting trends for an entire industry. It gains insight into the operations of business elements across diverse domains of operation and learns how technology can build efficiencies into the enterprise.

Companies with the kind of forethought necessary to bring technology to bear from the top down, rather than backfilled, will be the Microsofts and Googles of the next generation. Those that don't will, in all likelihood, go the way of GM, hoping a misguided government will bail them out of their own shortsightedness.

Sunday, June 7, 2009

New designs, bad habits?

Here's a poser for the Object-Oriented development crowd.

OO tells us to encapsulate; to keep our object methods small (or, in the jargon, atomic). Build objects that have simple methods, and that helps make the objects reusable. By the same token, objects also have state that is reposed in one or more member variables that may or may not be exposed by public properties. State, however, is typically expensive, because persistence implies the memory and similar resources necessary to implement it.

Atomicity and state indirectly tend to work against each other. If my methods are too small, it necessarily suggests I'm going to push out elements to the class level. But if I push too much to the class level, I run the risk of creating classes that may need increasingly complex persistence mechanisms which, in turn, suggests a class that may be too broadly scoped. Yet if I decompose (or factor) an object too much, the fragmented design becomes a nightmare to maintain. It's not clearly a vicious circle, but it's a cautionary cliff to avoid.

Here's a shadowy example.

Suppose you have a class:

public class Something
{
    public void InterestingMethod1()
    {
        int ImportantVariable;
        // ...do something interesting...
    }
}


And, without typing them here, suppose you have several similar methods in this class, each with a similar "ImportantVariable" declaration. Now, the casual observer would probably suggest that the repetition of that variable indicates it should be declared at the class level, like so:


public class Something
{
    int ImportantVariable;

    public void InterestingMethod1()
    {
    }
}


If, however, we start referencing "ImportantVariable" in our atomic methods, don't we reintroduce an old villain into our nice, object-oriented code? It seems to me that in a class of any appreciable size that does any appreciable work, factoring out common variables to the class level starts to look a lot like our old nemesis - global variables. We all know they're bad, don't we? That is, a variable declared once in a substantive module, scoped across all of its methods, where a single change can wreak all manner of unintended consequences. But isn't that precisely what member variables are allowing us to do in even moderately complex classes?
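To make that coupling concrete, here's a hypothetical Java sketch (the class and method names are mine, invented for illustration, not from the example above) in which two methods silently interact through a shared field, exactly the way two procedures interact through a global:

```java
// Hypothetical sketch: two methods coupled through a shared field.
// Names (Counter, stepThenReport, resetQuietly) are illustrative only.
public class Counter {
    private int importantVariable; // class-level state, visible to every method

    // Result depends on whatever other methods ran before this one.
    public int stepThenReport() {
        importantVariable++;
        return importantVariable;
    }

    // A change here silently alters what stepThenReport returns next.
    public void resetQuietly() {
        importantVariable = 0;
    }
}
```

Call stepThenReport twice and you get 1, then 2; slip a resetQuietly in between and the second call returns 1 again. Neither method's signature hints at that ordering dependence - the same complaint we level at globals, just fenced in at class scope.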

I won't pretend that I have the answer here, nor that this microexample is anything but a strawman illustration of the point. So I'll throw out the question: what is the "right" answer? When should our atomic methods have elements like local variables factored up to class scope, risking global behavior; when should our classes have members pushed down into methods for the sake of atomicity?

When, indeed? The floor is open for debate...
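To seed the debate, here's one middle-ground position, sketched under my own assumptions (the names are again invented for illustration): keep the variable with the caller and pass it explicitly, promoting it to a member only when it genuinely describes the object's identity rather than one method's scratch work.

```java
// Illustrative sketch: state passed explicitly instead of reposed in a field.
// The caller owns the value; methods can't silently couple through it.
public class SomethingStateless {
    // Each result depends only on its argument - no hidden ordering
    // between methods, so no global-variable-style surprises.
    public int interestingMethod1(int importantVariable) {
        return importantVariable + 1;
    }

    public int interestingMethod2(int importantVariable) {
        return importantVariable * 2;
    }
}
```

The trade is visible in the signatures: every method that needs the value must declare it, which gets noisy when many methods share it - and that noise is, arguably, exactly the signal that the value has earned a place at class scope.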