Thursday, June 23, 2005

Business Process Breakthrough Needed

This blog entry is an unusually long posting, as I try to first explain, and then place into perspective, the entire fifty-year history of the computer industry.

My goal is to describe what I see as the major breakthrough that will propel software development into the next decade of the 21st century.



Every now and again, the software industry has experienced some truly remarkable epiphanies. For example:
  • The invention of compiled programming languages, in which source code written in a high-level language is translated into object code or machine language that can be directly executed by a computer's CPU.

    • FORTRAN, developed back in 1957 by a team at IBM led by John Backus, is generally credited as the first successful compiled high-level language.

    • COBOL, created in 1960 by the CODASYL committee (drawing heavily on Grace Hopper's earlier FLOW-MATIC language), was an especially important early high-level language. It introduced the notion of separating DATA and PROCEDURE into distinct divisions of a program.

    • ALGOL, developed between 1958 and 1960, pioneered most of what are now considered modern programming language concepts, such as block structure, functions, and subroutines. Today's crop of popular languages, C and its descendants (C++, Java, and C#), still bear a striking resemblance to ALGOL.

  • In the 1970s, tremendous advances were made in the field of database management. None was more significant than the introduction of the ANSI/SPARC three-schema, two-transformation Data Architecture (a small Java sketch follows the definitions below):

    • external schema -- supports multiple views that control how individual programs or users access the data

    • conceptual schema -- defines a single, unified model that logically represents real-world entities and relationships

    • internal schema -- specifies how data is physically stored on disk, including access methods and related data structures such as pointers, indices, and hash tables
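
    Here is that sketch -- a minimal, hypothetical Java rendering (Java 16+ for the record syntax; the Employee example and every name in it are my own invention), in which the conceptual schema appears as a unified record type, an external schema as a restricted view over it, and the internal schema as a physical access path:

        import java.util.*;

        public class ThreeSchemaDemo {
            // Conceptual schema: one unified, logical model of a real-world entity.
            record Employee(int id, String name, String dept, double salary) {}

            // External schema: a restricted view for, say, a phone-directory
            // program (it deliberately hides the salary field).
            record DirectoryEntry(String name, String dept) {}

            static DirectoryEntry directoryView(Employee e) {
                return new DirectoryEntry(e.name(), e.dept());
            }

            public static void main(String[] args) {
                List<Employee> table = List.of(
                    new Employee(1, "Ada", "Engineering", 95000),
                    new Employee(2, "Grace", "Research", 98000));

                // Internal schema: a physical access path -- here, a hash index on id.
                Map<Integer, Employee> indexById = new HashMap<>();
                for (Employee e : table) indexById.put(e.id(), e);

                // Programs read through the external view, never the raw storage.
                System.out.println(directoryView(indexById.get(2)));
            }
        }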

  • The 1980s are perhaps best remembered, from a software perspective, for the emergence of multi-layered Network Architecture stacks with protocols and interfaces. A layer in one stack communicates with a peer layer in another stack through protocols. Within each stack, interfaces specify how one layer interacts with the layers immediately above and below it. Network Architecture is what made the Internet possible. The most important implementations (a small Java layering sketch follows this list) included:

    • ISO's OSI (Open Systems Interconnection)

    • IBM's SNA (Systems Network Architecture)

    • Digital Equipment Corp.'s DECnet (an implementation of DEC's Digital Network Architecture)

    • TCP/IP (Transmission Control Protocol/Internet Protocol)
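
    And here is the promised layering sketch -- a hypothetical Java miniature, not any real protocol implementation, in which each layer talks only to the layer directly beneath it through a narrow interface and wraps the payload in its own header, much the way real stacks do:

        // Each layer exposes the same narrow interface to the layer above it.
        interface Layer {
            String send(String payload);
        }

        // The bottom layer just "transmits" (here, returns) the finished frame.
        class PhysicalLayer implements Layer {
            public String send(String payload) { return payload; }
        }

        // Every other layer prepends its own header and delegates downward.
        class HeaderLayer implements Layer {
            private final String header;
            private final Layer below;
            HeaderLayer(String header, Layer below) {
                this.header = header;
                this.below = below;
            }
            public String send(String payload) {
                return below.send(header + payload);
            }
        }

        public class StackDemo {
            public static void main(String[] args) {
                // Assemble a miniature TCP/IP-style stack, top layer first.
                Layer stack = new HeaderLayer("TCP|",
                              new HeaderLayer("IP|",
                              new HeaderLayer("ETH|", new PhysicalLayer())));
                System.out.println(stack.send("hello")); // prints ETH|IP|TCP|hello
            }
        }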

  • The really big software breakthrough that occurred during the 1990s was the widespread adoption of object-oriented programming (OOP). Three core concepts lie at the heart of the OO paradigm (a short Java sketch illustrating all three follows the list):

    • encapsulation
      An encapsulated object acts as a "black box" for the other parts of a program that interact with it. An encapsulated object provides a service, but the calling objects do not need to know the details of how that service is accomplished. Objects present an interface which defines the set of services the object provides. Objects interact by sending messages back and forth to each other. Each message is defined by a signature, which specifies the number of parameters the message carries, plus a name and data type for each individual parameter.

    • polymorphism
      Objects can respond differently to the same message. One familiar form of polymorphism is overloading: the idea of allowing the same code to be used when incoming messages may be made up of different data types. Overloading allows multiple functions, based on different data types, to be defined using the same name. For example, consider the difference in functionality provided by the "+" (plus) operator depending on whether the operands are integers, floating-point numbers, or strings (for strings, "+" often specifies concatenation).

    • inheritance
      Inheritance is a relatively simple concept which allows one class to extend another and thereby inherit its characteristics. When a class is extended to create a sub-class, all of the properties (variables and methods) of the original class still exist within the new class, along with any new properties that have been added. The sub-class can override existing properties by reusing the same name as an inherited variable and/or method. In other words, new properties can be added AND existing inherited properties can be modified.
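
    Here is the promised sketch -- a minimal, hypothetical Java example (all class and method names are my own invention) showing encapsulation, overloading, and inheritance working together:

        // Encapsulation: Account hides its balance behind a small interface.
        class Account {
            private double balance;   // hidden state -- the "black box"

            void deposit(double amount) { balance += amount; }

            // Overloading: the same name accepts a different parameter type.
            void deposit(String amountText) {
                deposit(Double.parseDouble(amountText));
            }

            double getBalance() { return balance; }
        }

        // Inheritance: SavingsAccount extends Account, overriding one
        // inherited method and adding a brand-new one.
        class SavingsAccount extends Account {
            @Override
            void deposit(double amount) {          // override
                super.deposit(amount);
                System.out.println("deposited " + amount);
            }

            void addInterest(double rate) {        // new behavior
                deposit(getBalance() * rate);
            }
        }

        public class OopDemo {
            public static void main(String[] args) {
                Account acct = new SavingsAccount();
                acct.deposit(100.0);                   // runs the subclass override
                System.out.println(acct.getBalance()); // 100.0
            }
        }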

    Possibly even more valuable than the three concepts presented above -- encapsulation, polymorphism, and inheritance -- is object technology's support for a property called closure.

    I'm quite sure you're already familiar with this notion of closure. For example, consider how integer algebra supports the closure property. If you add two integers together, the resulting sum is always an integer. If you multiply two integers together, the resulting product is always an integer.

    Similarly, relational database technology supports the closure property. If you take two relational tables and join them together using relational algebra, the final result is always another relational table.

    With objects, if you take one object and embed it within another object, the result is always another object. This is enormously powerful in that it enables support for a feature called object composition. The beauty of this capability is how it allows programmers to build software the same way people usually describe their requirements -- by exception. That is, I want my program to behave just as it already does, only differently.

    When a software developer takes one object and wraps it around another object, the code in the outer wrapper object has to be able to handle all incoming messages. Now, imagine an incoming message arrives that matches a message supported by the inner wrapped object. The outer wrapper object has a choice. One possibility is to simply pass the incoming message along to the embedded inner object for processing. However, if new functionality is desired, the outer wrapper can intercept the incoming message and route it instead to new code of its own, thereby bypassing the embedded inner object. Note that it's also entirely possible, and often quite practical, for the outer wrapper object to add code supporting brand-new messages, thereby extending the original object's functionality.
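
    Here is how that wrapping technique might look in Java -- a bare-bones, hypothetical sketch in which the wrapper passes one message through untouched, intercepts another to change its behavior, and adds a brand-new message of its own:

        interface Order {
            double total();
            String describe();
        }

        class BasicOrder implements Order {
            private final double amount;
            BasicOrder(double amount) { this.amount = amount; }
            public double total() { return amount; }
            public String describe() { return "standard order"; }
        }

        // The outer wrapper object embeds the inner object and must be
        // prepared to handle every incoming message.
        class DiscountedOrder implements Order {
            private final Order inner;
            DiscountedOrder(Order inner) { this.inner = inner; }

            // Intercepted: new code runs instead of the inner object's.
            public double total() { return inner.total() * 0.75; }

            // Passed along: delegated unchanged to the wrapped object.
            public String describe() { return inner.describe(); }

            // Extended: a new message the original object never supported.
            public double savings() { return inner.total() - total(); }
        }

        public class WrapDemo {
            public static void main(String[] args) {
                DiscountedOrder order = new DiscountedOrder(new BasicOrder(100.0));
                System.out.println(order.total());    // 75.0
                System.out.println(order.savings());  // 25.0
            }
        }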

    In the interest of full disclosure, let me point out that development by exception can also be implemented using inheritance, and sometimes inheritance is a better approach for handling exceptions than composition. Explaining the differences between the two is beyond the scope of this discussion. If you want to explore the topic further, let me suggest you begin by reading Object Composition vs. Inheritance.

    There's one last point I'd like to discuss regarding objects before we switch our attention to business processes. Along with the emergence of object technology came a three-tier approach to distributed processing. Just as data in the 1970s evolved into three layers (i.e., external, conceptual, internal), objects in the 1990s were broken down into three categories:

    • Visual Objects -- refer to the external, visual appearance as seen by a user, particularly a user interacting through a graphical user interface. Today, most GUI environments are implemented using a design pattern known as MVC -- Model, View, Controller.

    • Business Objects -- represent the equivalent of a database conceptual schema. Each business object is generally described in terms of its properties (i.e., data fields), the messages it accepts (i.e., its behaviors), and the messages it sends (i.e., the other objects with which it collaborates). Another useful way of describing business objects is a technique called CRC -- Classes, Responsibilities, Collaborators.

      The most difficult problem in learning about object-oriented design is getting a programmer to give up the global knowledge of control that is possible with procedural programming languages, and to rely instead on the local knowledge of objects to accomplish tasks. Procedural designs can be characterized at an abstract level as having processes, data flows, and data stores. CRC represents a similar set of fundamental principles for object designs:

      • Classes create a vocabulary for discussing a design. Designing requires finding just the right set of words to describe objects, a set that is internally consistent and evocative in the context of the larger design environment.

      • Responsibilities identify problems to be solved. The responsibilities of an object are expressed by a handful of short phrases, each containing an active verb. The more that can be expressed by these phrases, the more powerful and concise the design.

      • Collaborators are the objects which will send, or be sent, messages in the course of satisfying responsibilities. No object is an island: all objects stand in relationship to others, on whom they rely for services and control.

    • Persistent Objects -- refer to how objects are stored on disk, often inside relational databases. Because of fundamental modeling differences between relational and object technologies, mapping software is often needed to translate between the different representational models.
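
    That last bullet's mapping problem is easy to picture with a small, hypothetical Java sketch (every name in it is invented for illustration): an object happily holds a collection, but relational rows are flat, so the mapping layer must normalize the collection into a separate child table keyed back to the parent.

        import java.util.*;

        public class MappingDemo {
            // A business object, as the running program sees it.
            static class Customer {
                final String name;
                final List<String> phoneNumbers;   // objects hold collections...
                Customer(String name, List<String> phones) {
                    this.name = name;
                    this.phoneNumbers = phones;
                }
            }

            // ...but a relational row is flat, so the parent maps to one table...
            static Map<String, Object> customerRow(int id, Customer c) {
                return Map.of("id", id, "name", c.name);
            }

            // ...and the collection normalizes into child rows with a foreign key.
            static List<Map<String, Object>> phoneRows(int customerId, Customer c) {
                List<Map<String, Object>> rows = new ArrayList<>();
                for (String phone : c.phoneNumbers)
                    rows.add(Map.of("customer_id", customerId, "phone", phone));
                return rows;
            }

            public static void main(String[] args) {
                Customer c = new Customer("Acme", List.of("555-0100", "555-0199"));
                System.out.println(customerRow(1, c));  // e.g. {name=Acme, id=1}
                System.out.println(phoneRows(1, c));
            }
        }
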
While the hope was originally huge that object technology would deliver on the promise of reusability, the actual results achieved proved significantly less spectacular. Two problems in particular arose. The first involved business object bloat: it turns out that, over time, business objects have a tendency to just keep growing bigger and bigger as more and more functionality gets added to the mix. The second issue was even more vexing. Developers found it extremely difficult to find the objects they were supposed to reuse. Basically, the tools provided to software developers for browsing class hierarchies turned out to be the weakest aspect of integrated development environments (IDEs). If existing objects can't easily be found, developers just go ahead and reinvent the wheel by creating entirely new object classes.

That pretty much brings us up to the first decade of the 21st century, sometimes referred to as the aughts.

The 21st century breakthrough that is moving computing beyond the realm of object technology is the realization that the most reusable facet of enterprise computing is not business objects, but rather business events.

The most effective method for leveraging business events is modeling business processes using state-transition diagrams. A stable state persists until a business event occurs. The business event is signaled by a message. In response to that message, business rules fire to determine whether or not a state-transition should take place.
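
Sketched in Java (the order example and every name in it are hypothetical), the mechanism might look like this: a stable state persists until a business-event message arrives, and a business rule decides whether the transition actually fires.

    import java.util.function.Predicate;

    enum OrderState { PENDING, APPROVED, SHIPPED }

    class OrderProcess {
        OrderState state = OrderState.PENDING;

        // A business event arrives as a message; a business rule (the
        // predicate) determines whether the state-transition takes place.
        void on(String event, double amount, Predicate<Double> rule) {
            if (state == OrderState.PENDING && event.equals("approve")
                    && rule.test(amount)) {
                state = OrderState.APPROVED;
            } else if (state == OrderState.APPROVED && event.equals("ship")) {
                state = OrderState.SHIPPED;
            }
            // Otherwise the message is ignored and the state remains stable.
        }
    }

    public class ProcessDemo {
        public static void main(String[] args) {
            OrderProcess p = new OrderProcess();
            Predicate<Double> creditRule = amt -> amt <= 10_000;  // declarative rule
            p.on("approve", 2_500, creditRule);
            System.out.println(p.state);   // APPROVED
        }
    }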

While business events represent the most reusable facet of computing, business rules reflect the most volatile aspect of computing -- business rules are continuously changing.

The best way to automate business events and business rules is with message triplets that specify:
  1. objects
  2. actions
  3. conditions
When a business event occurs, it triggers a message addressed to a business object. Included with the message is an action along with its associated parameters. Before the action is performed, conditions are tested based on business rules. If the test returns TRUE, the message is delivered and the corresponding action performed. On the other hand, if the test returns FALSE, the message is discarded and ignored.
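
That triplet maps directly onto code. In this hypothetical Java sketch (Java 16+ for the record syntax; all names are invented), the condition is tested against the target object before the action is ever applied; on FALSE the message is simply dropped:

    import java.util.function.*;

    public class TripletDemo {
        // A message triplet: the target object, the action to perform on it,
        // and the condition (business rule) that gates the action.
        record Triplet<T>(T object, Consumer<T> action, Predicate<T> condition) {
            void dispatch() {
                if (condition.test(object)) {
                    action.accept(object);   // TRUE: perform the action
                }                            // FALSE: discard and ignore
            }
        }

        static class Invoice {
            double amount = 500;
            boolean paid = false;
        }

        public static void main(String[] args) {
            Invoice inv = new Invoice();
            Triplet<Invoice> message = new Triplet<>(
                inv,
                i -> i.paid = true,          // action: mark the invoice paid
                i -> i.amount < 1_000);      // rule: auto-approve small amounts
            message.dispatch();
            System.out.println(inv.paid);    // true
        }
    }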

The elegance of this approach comes from the fact that at every step in a process, what's being manipulated are business objects. In other words, the process is providing a locality of reference so that objects can be found and hence reused. Should an existing object not provide exactly the functionality that's required, the software developer can use either composition or inheritance to extend the object's functionality by exception. If a business object already provides an 80% solution, then the developer only needs to concentrate on satisfying the remaining 20%.

Obviously, there's a tremendous synergy that exists between business objects and business events. Add in the declarative nature of business rules and the resulting development environment represents a major advancement in programmer productivity. Collectively, the promise of focusing on business processes is their ability to enable enterprises to achieve enormous and sustained operating performance improvements.

People who are committed to business processes, and who in fact comprehend their true nature, must ensure that others within the enterprise also share a correct and deep, as opposed to flawed or superficial, understanding of business process concepts.

Dr. Michael Hammer, a long-time industry pundit, points out that business processes ought to be specified end-to-end: not as a single procedure, not as just any sequence of activities, and not as mere routinization or automation. Indeed, Hammer views the end-to-end nature of business processes as their most distinguishing feature, the one that lies at the heart of their power. He believes that central to harnessing business processes is reorienting people's mindsets, focusing everyone's attention on customers, outcomes, and teams. In other words, business processes provide the big picture that enables people to see themselves and their work in context.

The analysis of business processes should not be limited to transactional back-room operations. It also applies to creative activities (like product development), support or enabling processes (e.g., HR), infrastructure processes (such as plant maintenance), and even managerial work (such as strategic planning). All of these benefit from the process disciplines of design, measurement, ownership, and ongoing management. According to Dr. Hammer, business process management entails precise design, honest diagnosis, disciplined remediation, and eternal vigilance.

Perhaps the most important point here is that executive management, the people Dr. Hammer is talking to, is hearing the same message IT people need to hear about the importance of business processes. Superior business processes are the result of creative design, precise implementation, and careful management of change. To succeed, an organization must have expertise in the full range of issues associated with process design and reengineering, from the conceptual (identifying and envisioning processes) through the technical (structuring and carrying out an implementation plan) to the organizational (anticipating and overcoming resistance to change).

BPEL and BPM

As Paul Harmon of BPTrends points out, there's a giant gap between the promise of business process management, as described above by Michael Hammer, and the reality of technology to deliver on this promise.

If you talk to business process modeling practitioners, you will discover that most assume BPEL (Business Process Execution Language) is, or soon will be, the language of choice for modeling and managing business processes.

The first, and thus far most successful, implementation of BPEL, called BPEL4WS (BPEL for Web Services), was originally developed jointly by BEA, IBM, and Microsoft. Unfortunately, BPEL suffers from a huge deficiency.

While many vendors claim to fully support BPEL, the sad truth is that each individual vendor has pretty much been forced to go off and build its own proprietary implementation, adding the numerous features needed to fill the gaping holes in the BPEL specification.

The major underlying problem is that OASIS, the standards group responsible for the BPEL specification, has decided to leave "holes" in the standard for individual vendors to fill. This guarantees that even complete, official BPEL implementations will require proprietary extensions in order to work, and that it will be impossible to interchange one vendor's version of BPEL with another's.

Most people naturally assume that BPEL should be able to manage all business processes -- automated as well as non-automated. But the current version of BPEL can only manage automated activities. This undermines the whole possibility of using BPEL as a way of passing business process descriptions from one tool to another, or passing a company's business process descriptions to its partners.

Conclusion

The road to business process management is going to be difficult. As Paul Harmon has pointed out, BPEL is still a very immature standard and desperately in need of dramatic improvements. As Michael Hammer indicated, the paradigm shift associated with business process modeling and management extends far beyond the boundaries of an enterprise's IT organization.

The ultimate goal is to bring object technology's development-by-exception philosophy into the realm of business process models, such that an existing best-practice workflow can be used as a starting point, and then easily extended or modified to accommodate local requirements. One can almost envision a world where a business-oriented developer, working collaboratively with an object-oriented software developer, could rapidly and almost effortlessly take existing business processes and business objects and make simple changes:
  • adding new capabilities
  • modifying existing capabilities
Add to this mix a declaratively-specified business rules engine that controls state-transitions, and the net result represents the ultimate software development platform in terms of productivity, speed of change, and responsiveness to business needs.
