Monday, February 04, 2008

Henry Ford and Software Assembly

Having used the Henry Ford analogy on numerous occasions myself, I found it interesting to read a recent JDJ article by Eric Newcomer.

The Henry Ford analogy to software goes something like this (quoting Eric)...

"The application of the Ford analogy to software is that if you can standardize application programming APIs and communications protocols, you can meet requirements for application portability and interoperability. If all applications could work on any operating system, and easily share data with all other applications, IT costs (which are still mainly labor) would be significantly reduced and mass production achievable."

Eric suggests that although the software industry has repeatedly pursued software re-usability, these attempts have failed. Whilst the Web Services initiative has, to some degree, increased interoperability, it has failed to deliver code re-use. Eric concludes that the whole analogy is wrong, and that rather than trying to achieve code re-use, the industry needs to focus on sophisticated tools to import code, check it for conformance and ready it for deployment within the context of a particular production environment.

This article triggered a number of thoughts:
  • Did the industry seriously expect WS-* to usher in a new era of code re-use? Surely Web Services are a way to achieve loose coupling between existing (and thus, by definition, stove-piped) monolithic applications? I guess the answer here partly depends on the granularity of re-use intended.
  • Perhaps JEE should have fared better? Generic or re-usable business logic that could be deployed to a general-purpose application server seems like just the thing! However, expensive, bloated JEE runtimes, and the associated complexity and restrictions, prompted the developer migration to Spring.
Do these experiences really point to a fundamental issue with the idea of code re-use, or are they an indication that the standards developed by the IT industry were simply not up to the job?

If the latter, then what is actually needed? Clearly:
  • It must be significantly simpler for developers to re-use existing code than to cut new code for the task at hand, which implies:
  1. The ability to rapidly search for existing components with the desired characteristics.
  2. The ability to rapidly access and include the desired components into new composite applications.
  3. Component dependency management must be robust and intuitive both during the development cycle and during the life-time of the application in production.
  • The runtime environment must be sufficiently flexible and simple that it offers little or no resistance to developers and their use of composite applications.
  • In addition to the runtime environment insulating applications from resource failure, and providing horizontal scale, the runtime must also track all components that are in use, and the context (the composite system) in which they are used.
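The three numbered requirements above can be made concrete with a toy sketch. Everything below is purely illustrative: the names `ComponentRegistry`, `Component` and `CompositeApplication`, and the tag-based search, are my own invention rather than any real product's API. The sketch shows the "search by characteristics, then compose, then track" workflow the list describes.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Toy component model: components advertise capability tags, and
// consumers search by tag rather than naming concrete implementations.
interface Component {
    String name();
    Set<String> capabilities();   // the "desired characteristics" to search on
}

class ComponentRegistry {
    private final List<Component> components = new ArrayList<>();

    void register(Component c) { components.add(c); }

    // Requirement 1: rapidly search for components with desired characteristics.
    List<Component> find(String capability) {
        List<Component> matches = new ArrayList<>();
        for (Component c : components) {
            if (c.capabilities().contains(capability)) matches.add(c);
        }
        return matches;
    }
}

// Requirement 2: rapidly include found components in a new composite application.
class CompositeApplication {
    private final Map<String, Component> parts = new HashMap<>();

    void include(String role, Component c) { parts.put(role, c); }

    // Requirement 3: the composite knows exactly which components it uses and
    // in which roles, so dependencies can be managed over its whole lifetime.
    Set<String> roles() { return parts.keySet(); }
    Component part(String role) { return parts.get(role); }
}
```

A real platform would add versioning, dependency resolution and life-cycle management on top of this skeleton; OSGi, discussed below, is the industry-standard form of exactly those additions.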

I'd argue that, unlike previous IT attempts, current industry initiatives are clearly moving in the right direction:
  • The OSGi service platform gives us a vendor-neutral industry standard for fine-grained component deployment and life-cycle management. Several excellent open source OSGi implementations are available, namely Knopflerfish, Apache Felix and Eclipse Equinox.
  • Service Component Architecture (SCA) provides a vendor-neutral industry standard for service composition.
  • Next-generation runtime environments like Infiniflow (itself built from the ground up using OSGi and SCA) replace static, stove-piped Grids, Application Servers and ESBs with cohesive, distributed, adaptive and dynamic runtime environments.
But are these trends sufficient to usher in the new era of code re-use?

Possibly - possibly not.

Rather than viewing code re-use simply in terms of "find - compose - deploy" activities, perhaps we need one more ingredient: the development framework itself should implicitly support the concept of code re-use! This message was convincingly delivered by Rickard Oberg in his presentation on the qi4j project at this year's JFokus conference.
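The central idea behind qi4j — composing objects out of small, independently re-usable fragments rather than inheriting from monoliths — can be gestured at in plain Java with a dynamic proxy. To be clear, the sketch below is my own toy illustration of the concept and none of these names come from qi4j's actual API: each "mixin" implements one small interface, and a hypothetical composer assembles them into a single object.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.HashMap;
import java.util.Map;

// Two small, independently re-usable fragments.
interface Greeter { String greet(String who); }
interface Counter { int next(); }

class GreeterMixin implements Greeter {
    public String greet(String who) { return "Hello, " + who; }
}

class CounterMixin implements Counter {
    private int n = 0;
    public int next() { return ++n; }
}

// Hypothetical composer (not qi4j's API): routes each interface method to
// the mixin instance implementing the interface that declares the method.
// (Object methods such as toString are deliberately not handled in this toy.)
class Composer {
    static Object compose(ClassLoader cl, Class<?>[] interfaces, Object... mixins) {
        Map<Class<?>, Object> byInterface = new HashMap<>();
        for (Object m : mixins) {
            for (Class<?> i : interfaces) {
                if (i.isInstance(m)) byInterface.put(i, m);
            }
        }
        InvocationHandler h = (proxy, method, args) ->
                method.invoke(byInterface.get(method.getDeclaringClass()), args);
        return Proxy.newProxyInstance(cl, interfaces, h);
    }
}
```

An object implementing both `Greeter` and `Counter` can then be assembled without any inheritance hierarchy; re-use happens at the level of individual fragments, which is precisely the framework-level support for re-use argued for above.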

But what would be the impact if these trends succeed? Will the majority of organizations build their applications from a common set of tried-and-tested, shrink-wrapped components? To what extent will third-party components be common across organizations, or in-house developed components be common across systems within organizations?

The result will almost certainly be adaptive radiation; an explosion in re-usable software components from external software companies and internal development groups. As with any such population, a power-law can be expected in terms of use, and so re-use: a few components being used by the vast majority of systems, whilst many components occupy unique niches, perhaps adapted or built to address the specific needs of a single specialist application in a single organization.

Going back to the Henry Ford analogy: whilst standardization of car components enabled the move to mass production, this was not, ultimately, at the expense of diversity. Indeed, the process of component standardization, whilst initially concerned with the production of Ford Model Ts (black only), eventually resulted in cars available for every lifestyle, for every budget and in any colour!
