Monthly Archives: May 2004

Islands of Misfit Tools

A week ago I had the pleasure of talking with David Sprott of CBDI Forum about IONA in general and about our new products, Artix and Mobile Orchestrator. I promised to call him up next time I’m in Dublin, and I think I will, since it looks like we have a lot in common.
Part of my job includes talking with reporters and analysts. Usually I try to be familiar with the previous work of each person I talk with, but last week was a hectic travel week (aren’t they all, I guess ;-), and this was one time I had not managed to get the background preparation done.
Anyway, after the call (and during this week’s hectic travel schedule ;-), I read the January and March issues of the CBDI Journal and found the articles to be a very good mixture of reporting and analysis.
For reasons that readers of my blog entry on related topics will understand, I was particularly pleased to find David’s article on “Adapting RUP to Support Service Orientation” in the January issue.
The article was based on some interviews with IBM/Rational folks who “…were quite open that the RUP was in need of considerable redevelopment to meet the requirements of the service oriented world.”
(The Rational Unified Process or RUP is the methodology behind the Rational toolset’s development model, which is basically implemented using UML and MDA.)
Some of the interesting bits follow (the online article requires a subscription, otherwise I’d include a link):
“The main granularity unit for RUP and OOA/D is the class. This is too fine grain. It should be the service and the component.”
“RUP tends to assume the focus is a single application…”
“We note a recent paper by Philippe Kruchten of IBM/Rational on RUP and Legacy, where it is argued that RUP supports Legacy strongly. However, the main thrust of the argument is that it is the RUP process framework which can assist, which we would agree with. However, there is no architecture or technique guidance in this area, nor in integration generally, which is the area of concern for most enterprises and the starting point for service orientation.”
In other words, the models, techniques, architectures, and development metaphors and concepts behind the successful OO tools are not the same as the ones needed for successful SO tools.
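To make the granularity point concrete, here is a rough sketch in Java terms (the names are mine, not from the CBDI article): the class-level unit of design that RUP and OOA/D revolve around, versus the coarser, document-centric contract that service orientation treats as the unit of design.

```java
// Illustrative sketch only; all class and service names are hypothetical.

// Class-level granularity: the unit of design in RUP and OOA/D.
class Account {
    private long balanceCents;
    long balance()                { return balanceCents; }
    void debit(long amountCents)  { balanceCents -= amountCents; }
    void credit(long amountCents) { balanceCents += amountCents; }
}

// Service-level granularity: one business operation, whole documents in and
// out, with the object model (Account and friends) hidden behind the contract.
class PaymentRequest      { String payerId; String payeeId; long amountCents; }
class PaymentConfirmation { String confirmationId; }

interface PaymentService {
    PaymentConfirmation submitPayment(PaymentRequest request);
}
```

The first is what an OO toolset is happiest modeling; the second is what an SO toolset needs to treat as the first-class thing.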
It may be possible for IBM/Rational to change from OO to SO, but I’m not convinced. David’s conclusion is more upbeat: “The RUP process framework will evolve fairly easily to SO.”
But in the next sentence he outlines what to me seems like a very big challenge: “Many of the workflows will need tweaking, and the main Analysis, Design and Implementation ones need to be replaced with Specification, Provisioning, and Assembly. This in turn will give rise for the need for different project templates.”
And furthermore “The more major issue for RUP users relates to the fundamental architecture, where there are some foundational concepts that need radical revision…”
To me this seems to imply that the people who brought us RUP are not likely to be the ones to bring us the next generation development model around SOA, since it is going to be too hard to keep the current generation of tools going while simultaneously building the next.
I think new tools are needed that are a better fit for SOA, along with new development approaches, models, and architectures. The old tools have the wrong focus: they are oriented toward the developer of applications that perform discrete units of work or functional aspects of larger systems. We need tools oriented toward building services that are of value to the consumer of the service instead.

Using Doc Literal for all Services

With all the discussion about the best encoding style to use for WSDL/SOAP, such as this excellent IBM DeveloperWorks article from October 2003, I have to wonder whether the best answer is really just to use doc/literal for everything.
The doc/literal style would seem to be the most abstract, or the most “loosely coupled,” since the message itself carries no encoded type information (data typing comes from the associated XML Schema) and no method name.
True, the burden of interpreting the XML document falls on the Web service implementation, but isn’t that how it should be? In developing services, as Don Box and others have recently been saying, shouldn’t we be thinking about the content of the message independently of the underlying implementation of the service? Whether it’s object-oriented, procedure-oriented, a message queue, an ERP package, Java, C#, COBOL, or Python? Then the implementation of the service, whether a direct or indirect mapping (that is, whether the execution stage involves transformation or multiple steps), can take responsibility for extracting the relevant information from the XML document and formatting it for the execution environment.
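As a minimal sketch of what that mapping might look like (assuming a hypothetical purchase-order document; the element names and the OrderService class are made up for illustration, not taken from any particular SOAP stack), the service implementation just pulls what it needs out of the incoming XML and hands it to whatever sits underneath:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Hypothetical doc/literal-style service: it consumes a whole XML document
// rather than a typed method call, so the caller never sees the signature of
// whatever object, queue, ERP package, or COBOL program does the actual work.
public class OrderService {

    public String handle(String xmlMessage) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xmlMessage.getBytes("UTF-8")));

        // The document content, not a remote method name, drives processing.
        String orderId  = doc.getElementsByTagName("orderId").item(0).getTextContent();
        String customer = doc.getElementsByTagName("customer").item(0).getTextContent();

        // Format the extracted data for the execution environment, whatever it is.
        return placeOrderInBackend(orderId, customer);
    }

    // Stand-in for the implementation-specific step (ERP call, queue put, etc.).
    private String placeOrderInBackend(String orderId, String customer) {
        return "<orderAck><orderId>" + orderId + "</orderId></orderAck>";
    }
}
```

Swap the backend out and the contract (the document and its schema) doesn’t change.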
Won’t this level of abstraction really make life easy? Maybe not for the folks who have to develop the services and the mapping of the documents, but for the developers at the next level up who have to stitch everything together. I mean, once enterprises are all service-oriented, and have adopted and implemented SOA using Web services and XML, won’t integrators just have to worry about the data being passed in the messages, and not about the objects, queues, methods, or programs underneath the services?
Once executable, compiled software systems are overlaid with XML-based Web services, and they start exchanging messages based on the data in them rather than on object signatures, productivity will go way up, won’t it?
Today, when integrating various bits and types of software and combining programs written using different languages and/or middleware systems, you have to know what it is you’re invoking and figure out how to structure your data into the arguments associated with the particular method or program name. I don’t think that’s very productive, at least not for those integrating services into applications and flows.
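To be concrete about what that looks like (hypothetical names again, just for contrast with the document-driven sketch above): the integrator has to know the exact operation and argument list of whatever is being invoked, and restructure the data to fit it.

```java
// Hypothetical tightly coupled stub: the caller must know the operation name
// and the exact argument types up front, and has to be regenerated and
// recompiled whenever that signature changes.
interface OrderBackendStub {
    String placeOrder(String orderId, String customer, int quantity, long unitPriceCents);
}
```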

Mass Assembly of Software

I first wrote about the mass assembly of software in 1995, in a chapter of The Future of Software called “The Keys to the Highway.” We used to call the Internet the Information Superhighway back in the day when Al Gore invented it ;-).
(The title is from a song by Big Bill Broonzy, by the way.)
At the time we were working on a project in the telecom industry called SPIRIT, which was a follow-on to MIA.
We demo’d SPIRIT and STDL on seven different platforms, including HP. (I apologize, but Web references to MIA and the Telecom ’95 event are a bit hard to find a decade later.)
The “keys to the highway” are standard APIs across any platform and an interoperability protocol capable of connecting any platform. Once those standards are in place (I mean really in place), the idea is that they enable mass-assembling applications out of interchangeable, multi-sourced “parts” (also known as “services” ;-).
The biggest single cost of IT remains labor. Standards for interchangeable software “parts” could have the same effect on the software industry that interchangeable parts had on hard-goods manufacturing: they turned it from a craft industry into an automated one, starting with the Model T, and drove an economic cycle of affordability.
We accomplished it with SPIRIT, at least in demo form, and I managed to get the STDL spec adopted by X/Open (now The Open Group). STDL is a kind of precursor to EJB inasmuch as it abstracted low-level distributed transaction processing details from the developer. It was implemented on top of seven different TP monitors.
We foresaw, proved the concept of, and wrote about a future in which applications would be mass-assembled instead of craft-built. All that’s needed is the equivalent of the threading, tooling, and fastener standards Henry Ford required of his parts suppliers.
However, SPIRIT was all procedure-oriented. Java had not been invented when we started the effort way back in 1990; neither had DCOM or CORBA. The software industry was not done innovating yet.
Today the industry is done innovating and is looking toward Web services to finally provide the long-sought solution for getting strategic value out of IT investments by automating the production of custom business process flows.
Will the promise finally be realized? Will the right commercial pressure finally be brought to bear? Or will it all collapse on itself, as the software industry tries to cling to outmoded business models and continues to compete on the basis of products rather than on conformance to the standards IT so desperately needs?