August 2002,

The end in mind ...

I often get asked these questions: where is this whole Business Process Modeling (BPM) thing going? What is it good for? Why do people think it should be based on open technologies? Here is my one-page answer. As usual, comments and feedback are welcome.

11/27/02 A related article from Scott Hebner

The infrastructure battle

The software industry has been at war since its very early days with players that wanted to dominate infrastructure (hardware, operating systems, distributed computing models, ...). I am too young to remember when it all started (though I "touched" my first computer -an IBM/360- when I was 5 years old in 1969 in Casablanca, when my dad let me into the machine room of his company; I still remember it as if it was yesterday), but it started a long, long time ago. This business was also a cash cow: a friend of mine told me that his grandfather retired on the money he made after buying some stock in the streets of a quiet Massachusetts town -Maynard- from a guy talking about a new thing -computers- who happened to be the founder of Digital... So the game and what is at stake have not really changed for the past 50 years. It is probably "softer" today, that's all. 

The past 15 years or so have seen major new players emerge, as well as old ones disappear, due to new infrastructure component types or flavors. IONA and BEA come to mind, and Microsoft or Oracle before them. Nowadays, infrastructure software includes operating systems, databases, application servers, web servers, EAI and B2B infrastructures, ... oops, I almost forgot web services, and many others like MOMs, LDAP repositories, ...

What we see is that people who build modern web-based applications (mission-critical apps or solutions) need almost all of the pieces of this infrastructure. And if you develop applications, it is hard to be successful if your infrastructure looks like a mosaic of components provided by different vendors. Within a year or two, because of the inherent complexity of the required infrastructure, one will be confronted with a choice: either adopt a semi-open infrastructure (built on open standards but with some proprietary fixes), or spend a lot of effort integrating so-called open components that do not quite work together or do not perform as well as an integrated infrastructure platform. A unified security model, for instance, comes to mind as something really hard to achieve with a mosaic of components. So let's face it, there will probably be only 3 infrastructure players left within a couple of years: IBM, Oracle (only if they care to be more than a database vendor) and Microsoft. Sounds familiar? As we say in French, when we cannot have the butter, the butter's money and the farmer's wife, we might as well have the right infrastructure.

BPM is a technology that helps write complex applications. It is part of the application model, as Intalio put it when it founded BPMI: a component that enables corporations to run and manage the process-oriented business logic of their applications at a common level, as opposed to the current situation where this type of business logic is buried in code in every application. 

The future of "the" application model

Every generation of developers and architects has dealt with the issue of "the" application model. What are the best ways to write applications? It started with finding the best language and development tools (I remember my dad telling me how he entered each instruction into his computer, one by one, by flipping a few switches), then with separating data management from the application itself and introducing libraries, then the client/server model, the n-tier model, ... I am again too young and not really interested in giving you a historical perspective on this. However, as an application developer, this is something that cannot be taken lightly. 

More recently a new trend has appeared in the software development community: metadata. More and more applications and modules are metadata driven. This trend has been made official by the OMG with the MDA: the Model Driven Architecture. This is a new approach to code reuse which is not based on reusing code per se, as libraries were in the past, but rather on writing code which can be "configured" by metadata to work in a given context. This code is based on a metamodel which is often described via UML or XML Schema. Of course the OMG goes further and is considering going directly from the metamodel to component deployment in different technologies (J2EE, CORBA, Web Services, ...).
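To make the idea concrete, here is a minimal sketch of metadata-driven code: the logic is generic and reusable, and the metadata "configures" it for a given context. The entity, field names and constraint vocabulary below are illustrative assumptions, not taken from any real MDA metamodel.

```python
# Metadata describing an "Order" entity, as it might be derived
# from a UML class or an XML Schema (names are made up here).
ORDER_METAMODEL = {
    "fields": {
        "order_id": {"type": str, "required": True},
        "quantity": {"type": int, "required": True},
        "notes":    {"type": str, "required": False},
    }
}

def validate(record, metamodel):
    """Generic validation code, reusable for any entity: the behavior
    comes from the metadata, not from entity-specific code."""
    errors = []
    for name, spec in metamodel["fields"].items():
        value = record.get(name)
        if value is None:
            if spec["required"]:
                errors.append(f"missing required field: {name}")
        elif not isinstance(value, spec["type"]):
            errors.append(f"bad type for field: {name}")
    return errors

print(validate({"order_id": "A-17", "quantity": 3}, ORDER_METAMODEL))  # []
print(validate({"quantity": "three"}, ORDER_METAMODEL))
```

The same `validate` function serves every entity in the application; adding a new entity means adding metadata, not code, which is exactly the reuse the MDA approach aims for.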

The reason, of course, why infrastructure is so important today is that one can no longer write an application from scratch, right on top of the operating system and a few libraries (remember POSIX?). We need to rely on a large number of services common to all applications (security, transactions, connection pooling, messaging ... to name a few). Productivity can be greatly enhanced by using the right pieces of infrastructure. We also need to point out that the risk of using pieces that you have no control over is that their functionality and performance may not match the requirements of your application. This risk is usually mitigated by selecting pieces that comply with an open standard. In this case you can at least expect that one vendor will match the performance (anybody who is using the XALAN XPath engine will understand what I mean) and implement more than 90% of the open standard specification. Only then does your application have a chance to operate properly in production, far from your developers' desktops.

So what does this application model look like today? How is BPM positioned in it? Historically, I think no one will argue that the MVC (Model-View-Controller) pattern has been the most influential in shaping the way we develop user-facing applications. 

The current forces when designing new applications are twofold: a) every application must be able to evolve rapidly -this is not so new-; b) most applications cannot be developed in isolation, they must integrate readily with their environment (typically other applications) -this is rather new, as both the cost of ownership and the value of the application strongly depend on how well they integrate with their environment.

Well, at this point, if you have not clicked away, you would probably tell me that Enterprise Application Integration has been around for almost a decade, again as an infrastructure component. It has seen its winners: NEON, Mercator, and more recently WebMethods (as Active) and CrossWorlds. But the pressure is higher today to integrate applications not only at the data level, but also at the process level. This is identified by the buzzword "Process Federation". In other words, what application users, managers, and developers want today is to be able to define business processes which span applications, just as the web client model has allowed us to build web-based user interfaces that span multiple applications seamlessly.

So BPM is right there at the convergence of metadata-driven components and new EAI requirements, actually killing two birds with one stone: A) your applications are more maintainable because the "process" is not hard coded but described by an XML definition, B) if most applications comply with the same business process metamodel, one has a greater chance of being able to compose and recompose these processes across applications, provided that their respective process engines interoperate.
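The first of these two points can be sketched in a few lines: the sequence of steps lives in an XML definition rather than in code, and a tiny "engine" interprets it. The element names, service names and process below are invented for illustration; this is not BPML or BPEL syntax.

```python
import xml.etree.ElementTree as ET

# A toy process definition: the orchestration logic lives in
# metadata, not in the application code (vocabulary is made up).
PROCESS_XML = """
<process name="order-fulfillment">
    <invoke service="check-inventory"/>
    <invoke service="charge-card"/>
    <invoke service="ship-order"/>
</process>
"""

# The application registers plain functions; the engine decides
# the order of invocation by reading the XML definition.
SERVICES = {
    "check-inventory": lambda ctx: ctx.setdefault("log", []).append("checked"),
    "charge-card":     lambda ctx: ctx["log"].append("charged"),
    "ship-order":      lambda ctx: ctx["log"].append("shipped"),
}

def run_process(xml_text, services):
    """A minimal 'process engine': walk the XML definition and
    invoke the named services in document order."""
    root = ET.fromstring(xml_text)
    context = {}
    for step in root.findall("invoke"):
        services[step.get("service")](context)
    return context

print(run_process(PROCESS_XML, SERVICES)["log"])  # ['checked', 'charged', 'shipped']
```

Reordering the steps, or inserting a new one, is now an edit to the XML, not to the application, which is what makes the process maintainable and, if the metamodel is shared, composable across applications.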

In addition, some BPM specifications also address another aspect of application development: the "coding" of long-running units of work. Historically, OLTP systems have provided a synchronous and therefore instantaneous link between the consumer of the data and the data. As applications become more and more integrated with their environment, one cannot expect to have the whole world synchronously (and transactionally) connected to one's application. Hence, more and more, you find code snippets in various applications which are little state machines that manage these asynchronous collaborations between your application and other applications. This code is typically harder to write, error prone and harder to debug. Some of the BPM specs have, consciously or unconsciously, provided a way to write this code as metadata. However, I don't feel that this is truly BPM; it is at best a glorified UML activity diagram. These units of work are, however, part of a business process and constitute the "work-flow" of the business process. 
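Here is a minimal sketch of the kind of hand-written little state machine described above, tracking where a long-running collaboration stands between asynchronous messages. The states, events and transition table are illustrative assumptions.

```python
# Transition table: (current state, incoming event) -> next state.
# State and event names are invented for this example.
TRANSITIONS = {
    ("awaiting_order",   "order_received"):   "awaiting_payment",
    ("awaiting_payment", "payment_received"): "shipping",
    ("awaiting_payment", "timeout"):          "cancelled",
    ("shipping",         "shipped"):          "done",
}

class Collaboration:
    """Tracks a long-running unit of work; each asynchronous
    message from a partner application advances the state."""
    def __init__(self):
        self.state = "awaiting_order"

    def on_event(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"event {event!r} not legal in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

c = Collaboration()
c.on_event("order_received")
c.on_event("payment_received")
print(c.on_event("shipped"))  # done
```

Scattering dozens of such snippets through an application is exactly the error-prone situation the text describes; expressing the same transitions as process metadata lifts them to a level where they can be inspected and managed.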

If we go back to the MVC pattern, BPM technologies are designed to implement metadata-driven controllers. We have yet to see a complete separation of the process-oriented business logic from the model-oriented business logic, but we are clearly on the way. There is probably a big market for tool vendors to look at this new application model and provide tools that complete the vision of a "business process managed" enterprise, because the application model itself (once complete) is only going to get us halfway there.

A less visible battle is happening in parallel at the level of user "tasks". The process-oriented logic can be divided into two broad categories: a) tasks, which represent the interactions between a user and a given unit of work, and b) business processes, which represent the composition of tasks, system interactions and B2B transactions. Several standards are playing in this field: XForms of course, the next generation of HTML forms, and the OASIS specification WSIA, currently being developed after two efforts merged (WSUI and some work done by IBM). Except for IBM, the big guys are completely absent from this field, which is probably as important as the BPM field: the paradox of automating your processes is that you still need user intervention to handle and resolve exceptions. It is also important to move this piece from code to the metadata level, as more and more complex "tasks" are being defined within applications. 

Now, where do web services fit in the picture? Well, web services are the edges of the business processes: they provide a homogeneous and loosely coupled invocation mechanism to the "units of work" implemented by applications (and traditionally exposed via proprietary APIs), enabling them to be composed within the various business processes. One could also turn this picture upside down and view a BPM engine as defined by BPML or BPEL as nothing less than a web service "broker" providing various system and business level services for web services composition. I like the first picture better though.

Is B2B part of the story?

Well, yes, B2B is definitely part of any BPM story: how many business processes within any corporation do not have touch points with a customer, supplier or channel partner? Is B2B seriously considered by the web services, BPEL and BPML gurus? It is unlikely. They will try to convince you that you can do B2B with web services; you might even have to spend a lot of money to realize that you are dealing with a low level technology (one level on top of HTTP), which cannot support large interoperable business communities. So yes, the authors of these specs can talk as much as they want about travel agents and airlines, but despite a few press releases, the Fords and Boeings are not betting their business on web services in the current state of the specifications. If you still think you should bet your business on them, you will quickly realize that you'd be better off faxing your XML documents to your business partners: you would get non-repudiation, guaranteed delivery, and sender/receiver authentication for the price of a phone call. The most expensive web service infrastructure cannot deliver on these simple business requirements while ensuring worldwide interoperability. B2B infrastructure is going to remain the realm of specialists like Sterling Commerce, Seeburger or CycloneCommerce, to mention a few, simply because there is already an existing infrastructure in place (EDI) and an inherent complexity that does not need to percolate through "the" application model we outlined above. When Microsoft starts looking at it this way, it could end up buying one of these companies to bring B2B expertise into its product line and offer something other than RPC calls.

Incidentally, if anybody looks at how the web services architecture of specifications is managed (i.e., everyone feels free to add their own spec) and at the ratio of B2B semantics to RPC semantics exhibited by the various specs, one can easily extrapolate that most of the web services specs are geared towards the next generation infrastructure and the new application model that will come out of it. The authors of these specs are on a mission (for their struggling companies), and if you compare the amount of money that will be made selling this new infrastructure as a next generation distributed computing platform with what a B2B infrastructure would bring, you understand where these guys are heading.

So where do Microsoft and IBM want us to go today?

Well, clearly we are still far from the promised land of "business processes" and "real-time" business process re-engineering. It is going to be a while before you convert your IT staff into business analysts who spend their time monitoring or re-engineering your processes with point & click tools. Everything that you see or can buy today is still at the infrastructure and application model level. This is going to remain the case for a while. For how long? Until a) infrastructure vendors stabilize their "business process management system" offerings, b) most applications are rewritten to take advantage of this new application model and completely isolate the process-oriented logic from the model-oriented logic, and c) tool vendors emerge with business process suites which provide the hooks to manage this infrastructure and metadata at the "business process level". Vendors like MEGA have been clear precursors of this approach; they might have been a few years too early though.

The immediate benefits are for your developers. They will become familiar with BPEL4WS and start using it to write and manage the process-oriented business logic. 

Phil Wainewright added some comments to the article.
