Andrzej Parkitny is an Enterprise Integration Architect. Andrzej has fourteen years of software development and software architecture experience. He has designed and developed solutions in the health insurance, finance, engineering, retail, telecommunications, and healthcare industries. He holds an Honours Bachelor of Science in Software Engineering (Computer Science) from the University of Toronto. He has extensive experience in Service Oriented Architecture, Application Architecture, Solution Architecture, Software Development and Data Architecture. His expertise in SOA is supported by a SOACP certificate (Certified SOA Architect).
An Approach for Assessing SOA Maturity in the Enterprise Published: May 20, 2013 • Service Technology Magazine Issue LXXII
Abstract: As a large organization grows, system integration becomes an important aspect of the operational nature of its growth and IT maturity. Organizations may claim that they have implemented a service-oriented architecture, but that claim may require further questioning. SOA is, in essence, a journey. This article explores how adopting SOA maturity models such as the independent SOA Maturity Model (iSOAMM) as described by Rathfelder and Groenda [REF-3], and OSIMM by The Open Group [REF-5], can assist IT integrators in establishing a formal process by which integration gaps can be captured and subsequent plans made to remedy such gaps.
Where is your SOA Practice Headed?
As Information Technology integrators, we may at times feel that the effort to integrate the systems of an organization becomes too great as the organization grows. Although the practice of developing a service-oriented architecture has its roots in rigorous computer science, we still need to consider what is practical and achievable within the enterprise.
The implementation of an SOA practice is a journey at heart, and depending on where your organization is on that journey, it is important to ask a number of key questions.
As SOA integrators, we are responsible for moving the organization's IT systems to an architecture in which organizational agility is increased through a reduced IT burden [REF-1], ultimately leading to a tangible return on investment [REF-1]. The main challenge that we face is that IT departments are for the most part sponsored by project-focused business teams, meaning that grand visions of overhauling the IT landscape are rarely entertained.
We can determine the current integration maturity of our IT enterprise through an assessment of SOA maturity, either against an industry-accepted model or its derivative. Depending on the maturity of our integration landscape, we can use this knowledge to identify where gaps exist and propose a roadmap to fill those gaps. Let us start by discussing SOA governance.
SOA Governance, Service Ownership and Taxonomy
At a minimum, we can be sure that each IT department has a separation of responsibilities that is aligned to specific business needs but not necessarily to specific functional components. That is, many stakeholders focus on their specific line of business and adopt a siloed approach. They measure IT success based on how the systems in a particular silo perform to fulfill very specific business needs.
It is our responsibility, as integrators, to keep these measurements honest. This responsibility should be shared between the system-owning teams and the governance team to ensure that the strategic goals of increased intrinsic interoperability, federation, vendor diversification options, and business and technology domain alignment [REF-1] are well understood by all parties involved, so that the benefits can be successfully achieved.
At the inception of an SOA adoption, there may be a number of IT teams that start exposing the system capabilities of the domains they support as services. These may simply be components such as EJBs or, better yet, Web services described by Web Services Description Language (WSDL) files. It is important to note that SOA adoption is not instantaneous. It may be that your particular organization will need to implement service-oriented architecture at the domain level rather than at the enterprise level, before working towards a target architecture.
You can consider formalizing SOA governance by taking into account both design-time and runtime governance. This can be achieved by using a service repository that manages service governance through a service lifecycle workflow process. The individuals participating in the SOA governance activities should have clearly defined roles and responsibilities. Please refer to Table 1 for a high-level set of suggested roles and responsibilities, which can be tailored to your organization's specific needs.
Table 1 – Some suggested roles and responsibilities for service governance and management.
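As a minimal sketch of the design-time governance described above, a service repository typically enforces a lifecycle workflow for each service asset. The states and transitions below are illustrative assumptions, not the model of any particular governance product:

```python
# A hypothetical service lifecycle workflow, of the kind a service
# repository might enforce for design-time governance. The states and
# allowed transitions here are illustrative assumptions.
LIFECYCLE = {
    "proposed":   {"in-design"},
    "in-design":  {"in-review"},
    "in-review":  {"in-design", "approved"},  # review can send it back
    "approved":   {"published"},
    "published":  {"deprecated"},
    "deprecated": {"retired"},
    "retired":    set(),
}

def can_transition(current: str, target: str) -> bool:
    """Check whether the workflow permits moving a service asset
    from one lifecycle state to another."""
    return target in LIFECYCLE.get(current, set())

print(can_transition("in-review", "approved"))   # True
print(can_transition("proposed", "published"))   # False
```

Encoding the workflow explicitly, rather than leaving it to convention, is what allows the roles in Table 1 to be held accountable for specific transitions (for example, only the governance team approving a contract for publication).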
Once your organization has a framework by which proper SOA governance can be executed, I suggest that the SOA Governance team, with participation from service domain owners as led by technical leads such as the service provider prime, assess the current state of services in the organization.
For this, we can look at industry standards in order to classify service capabilities into categories of business processes. The TeleManagement Forum (TM Forum) in the telecommunications industry publishes the Application Framework (TAM) [REF-6] and the Shared Information/Data Model (SID) [REF-7], which can be used as a reference service model taxonomy to help assign a category to an existing service or component and/or a newly designed service.
In addition to specific industry standards, you may choose to use a maturity model to assess the current state of where your organization is, in terms of service-oriented architecture.
Consider how an industry-accepted taxonomy based upon TAM and SID can be tailored to categorize schema and service contract assets in a service inventory, as illustrated in Figure 1. Notice that the namespace has a root qualification that represents the organization, which in the example is “examplecomp.” The data domain assets are categorized under the data taxonomy tree, as shown on the left in Figure 1.
The data domain taxonomy tree can hold multiple levels of classification. For usability, I recommend limiting it to two levels in order to keep the asset qualifier character length relatively manageable. The service taxonomy tree has a similar structure to that shown for the data taxonomy tree. However, the service taxonomy is focused on the classification of capabilities, whereas the data taxonomy is focused on the data in the messages passed in a service.
Figure 1 – A graphical representation of how an application and data taxonomy based on an industry standard, such as TAM/SID, can be used to categorize services in a company's service inventory.
In Figure 1, two data schemas named Customer_v1_0.xsd and Address_v1_0.xsd have been identified. There is also a service called CustomerMgmtSrc under /service/CustomerManagement/CustomerInfoMgmt that has a WSDL that imports the Customer and Address schemas to be used in defining the service contract for this service.
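The taxonomy in Figure 1 can be thought of as a set of fully qualified asset paths under the organization root. A minimal sketch follows; the qualifier scheme and the category placement of the schemas are illustrative assumptions, not prescribed by TAM/SID:

```python
# A minimal, hypothetical sketch of the two-tree (data/service) taxonomy
# rooted at the organization qualifier ("examplecomp"), as in Figure 1.
ORG = "examplecomp"

def qualify(tree: str, *segments: str) -> str:
    """Build a fully qualified asset path under the organization root."""
    return "/".join([ORG, tree, *segments])

# Data domain assets (schemas) categorized under the data taxonomy tree.
# Placing them under "CustomerManagement" is an assumption for this sketch.
customer_schema = qualify("data", "CustomerManagement", "Customer_v1_0.xsd")
address_schema  = qualify("data", "CustomerManagement", "Address_v1_0.xsd")

# The service asset categorized under the service taxonomy tree,
# matching the path given in the text.
customer_service = qualify("service", "CustomerManagement",
                           "CustomerInfoMgmt", "CustomerMgmtSrc")

print(customer_schema)
print(customer_service)
```

Keeping the qualifier to an organization root plus two classification levels, as recommended above, keeps these paths short enough to be usable in repository searches and schema imports.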
As an organization matures, the number of artifacts in each of the categories defined for the data and service domains will grow. If this growth is managed properly, with acceptance and buy-in from multiple participants in the organization, the organization will achieve SOA benefits through service discoverability, service reuse, and service composability [REF-1].
A Practical Approach to SOA Maturity Models
In their piece on maturity models, Rathfelder and Groenda describe a reference model for assessing SOA maturity called the Independent SOA Maturity Model (iSOAMM) [REF-3]. The iSOAMM suggests looking at five aspects or viewpoints of SOA maturity. These are Service Architecture, Infrastructure, Enterprise Structure, Service Development, and Governance.
The Open Group has published The Open Group Service Integration Maturity Model (OSIMM) [REF-5], which specifies a model by which you can assess the maturity of service integration in an organization. OSIMM defines a set of dimensions representing the different views, such as business or architectural, of an organization. These dimensions are Business, Organization and Governance, Method, Application, Architecture, Information, and Infrastructure and Management.
OSIMM also defines seven SOA maturity levels that can be applied against each dimension. These SOA maturity levels are Silo, Integrated, Componentized, Service, Composite Services, Virtualized Services, and Dynamically Re-Configurable Services.
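To keep the dimensions and levels straight during an assessment, it can help to encode them explicitly. The following Python sketch lists the OSIMM names [REF-5]; the per-dimension scoring record and the "lowest level wins" summary are assumptions for illustration, not part of OSIMM:

```python
from enum import IntEnum

# The seven OSIMM maturity levels, ordered from least to most mature [REF-5].
class MaturityLevel(IntEnum):
    SILO = 1
    INTEGRATED = 2
    COMPONENTIZED = 3
    SERVICE = 4
    COMPOSITE_SERVICES = 5
    VIRTUALIZED_SERVICES = 6
    DYNAMICALLY_RECONFIGURABLE_SERVICES = 7

# The seven OSIMM dimensions [REF-5].
DIMENSIONS = [
    "Business",
    "Organization and Governance",
    "Method",
    "Application",
    "Architecture",
    "Information",
    "Infrastructure and Management",
]

# A hypothetical assessment outcome: one level per dimension.
assessment = {dim: MaturityLevel.SILO for dim in DIMENSIONS}
assessment["Business"] = MaturityLevel.INTEGRATED

# Summarizing the profile as the lowest level across dimensions is a
# simplifying assumption; OSIMM itself reports a per-dimension profile.
overall = min(assessment.values())
print(overall.name)  # SILO
```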
Let us consider a simplified approach to the assessment of SOA maturity. The proposed flowchart in the following figures can be considered against three major aspects, with the assumption that your organization has an appropriate governance model that is driven via an SOA lifecycle management suite, such as the ORACLE SOA Suite [REF-2] or SOA Software Governance Suite [REF-4]. The three aspects are business analysis, design-time analysis, and runtime analysis.
At the outset of the assessment process, we can look at the three streams individually (Figure 2). We may have one set of individuals scrutinizing the business analysis aspect, a set of individuals scrutinizing the design-time analysis aspect and another set of individuals scrutinizing the runtime aspect. It is highly recommended that each set of analysts has a high degree of competency in the stream they are assessing, which may be through education, certification, experience or a combination of these criteria.
The success of SOA practices is a reflection of the combined efforts of the participants, who need to have the appropriate skill set in order to ensure an effective execution of the SOA journey to achieve the relevant goals and reap the benefits.
I encourage you to use the OSIMM as a reference for generating a checklist by which you can assess each aspect. Also include any additional checklist items that are relevant to your specific organizational needs. For the purpose of this article I am assuming that your organization has an SOA governance team already established. In order to assess the level of SOA governance that your organization practices, you may consider referencing the OSIMM's Organization and Governance Dimension: Assessment Questions (p. 21) [REF-5].
For business analysis, consider referring to OSIMM's Business Dimension: Assessment Questions. For design-time, I recommend that you consider a number of dimension assessment questions that are described in Method Dimension: Assessment Questions, Application Dimension: Assessment Questions, Architecture Dimension: Assessment Questions, and Information Dimension: Assessment Questions [REF-5]. Lastly, for runtime, I recommend that you consider Infrastructure and Management Dimension: Assessment Questions [REF-5].
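The mapping just described, from each assessment stream to the OSIMM question sets its analysts should draw on, can be sketched as a simple lookup. The stream names are shorthand for the aspects discussed in this article; the mapping itself follows the recommendations above [REF-5]:

```python
# A mapping from the assessment streams discussed in this article to the
# OSIMM dimension question sets they draw on [REF-5]. The stream names
# are this article's shorthand, not OSIMM terminology.
STREAM_DIMENSIONS = {
    "governance":  ["Organization and Governance"],
    "business":    ["Business"],
    "design-time": ["Method", "Application", "Architecture", "Information"],
    "runtime":     ["Infrastructure and Management"],
}

def checklist_for(stream: str) -> list:
    """Return the OSIMM question-set headings the given stream's
    analysts should use as the basis of their checklist."""
    return ["%s Dimension: Assessment Questions" % d
            for d in STREAM_DIMENSIONS[stream]]

print(checklist_for("design-time"))
```

Each team would then extend its generated checklist with items specific to the organization, as suggested above.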
Figure 2 – A proposed process flow (entry) for an SOA Maturity Assessment.
Let us walk through a proposed method for the business analysis aspect, as illustrated in Figure 3. Determine whether business processes are documented across the enterprise, depending on the formal documentation processes in your organization. Any form of documentation, sourced from a business primeship, that articulates how they expect their business processes to be executed is a good starting point. If business process documentation exists, determine whether the documentation is isolated to a business silo or whether it is integrated with the processes of other business units in the enterprise.
If there is minimal to no business process documentation available, this fact should be captured, and it should be communicated to management that the organization will be challenged to move to a higher level of SOA maturity from a business process perspective.
If the business organization works in a way where all or most of the business groups integrate their business processes with each other, then the organization is classified as having an Integrated SOA maturity level from a business perspective. For the basic integrated maturity level, each respective group has its own processes and process components, and integration occurs at the periphery of the processes as a group for a specific business unit.
The next level of business SOA maturity, or the Componentized SOA maturity level, requires the business groups/units to share business process components. For the last two maturity levels there is tight coupling of processes.
In order to ensure that SOA strategic goals and benefits are achieved, we should encourage the business to adopt a paradigm of producer and consumer business functional roles that mirrors what we want to expose, from a services perspective, within all of the IT system domains. The concept of producer/consumer business functional roles corresponds to the Service SOA maturity level.
Once the organization has reached a Service SOA maturity level with respect to the business aspect of the organization, the next level of SOA maturity is the Composite Services SOA maturity level.
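The business-aspect decision flow of Figure 3 can be sketched as a sequence of yes/no questions that yield a classification. The questions and their ordering below are an illustrative reading of the flow described above, not a definitive implementation of it:

```python
# A hypothetical sketch of the business-aspect decision flow (Figure 3):
# answers about business process documentation and sharing are mapped to
# a maturity classification. The questions and ordering are illustrative.

def classify_business_maturity(has_process_docs: bool,
                               integrated_across_units: bool,
                               shares_process_components: bool,
                               producer_consumer_roles: bool) -> str:
    if not has_process_docs:
        # Documentation gap: capture it and escalate to management.
        return "Silo"
    if not integrated_across_units:
        return "Silo"
    if not shares_process_components:
        return "Integrated"
    if not producer_consumer_roles:
        return "Componentized"
    return "Service"

print(classify_business_maturity(True, True, True, False))   # Componentized
print(classify_business_maturity(True, False, False, False)) # Silo
```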
The beauty of composite services is that they are the ultimate representation of business processes, where the business is not burdened with understanding the specific technical details but rather focuses on the business processes exclusively. This can be achieved by having the business teams articulate business processes using a service orchestration notation such as Business Process Execution Language (BPEL). The BPEL artifacts can be translated into distinct business capabilities and mapped to business composite services that can orchestrate supporting domain services, which are managed by IT.
Figure 3 – A proposed process flow (business analysis aspect) for an SOA maturity assessment.
Let us now look at a method by which we can go through the design-time analysis aspect, as illustrated in Figure 4. Firstly, we should assess the inventory of IT components that exist in the organization to the best of our ability. If components are already being governed through an SOA governance platform, then some of the components can be discovered using the cataloging capability of the platform. For components that are not registered in a central repository, the exercise may be a bit more challenging.
If possible, each IT subdomain-owning team should document its IT components and provide the list to the SOA governance team for assessment and registration. For the purpose of our discussion, we can group IT components into a number of categories. One category is the legacy components grouping that contains all of the legacy components that are not service-oriented, for example mainframe components, proprietary vendor system components, stand-alone programs written in C, stand-alone programs written in Java, EJBs, CORBA components, and so forth.
A second category is the non-compliant components grouping, which contains all of the service components that have been designed in a bottom-up fashion for a specific business need without taking into account the SOA governance standards that exist in the organization. The third and ultimate category is the compliant components grouping, which contains all of the service components that have been designed using proper SOA design considerations and are compliant with the enterprise-defined SOA standards and guidelines.
These service components will have standardized service contracts [REF-1] that are designed to minimize message transformation through service federation [REF-1] and have message structures that reference canonical schema assets, following the Schema Centralization Pattern [REF-1] and the Domain Inventory Pattern [REF-1] as applied to entity schemas.
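The three-way design-time grouping described above can be sketched as a small classification function. The component attributes and rules below are illustrative assumptions; in practice the inputs would come from the governance platform's catalog and from the subdomain teams' inventories:

```python
# A hypothetical sketch of the design-time grouping described in the text:
# each inventoried IT component is placed into one of three categories.
# The attribute model and rules are illustrative assumptions.

LEGACY_KINDS = {"mainframe", "vendor-proprietary", "standalone-c",
                "standalone-java", "ejb", "corba"}

def categorize(kind: str, is_service: bool, meets_soa_standards: bool) -> str:
    """Classify a component as legacy, non-compliant, or compliant."""
    if kind in LEGACY_KINDS or not is_service:
        return "legacy"
    return "compliant" if meets_soa_standards else "non-compliant"

print(categorize("ejb", False, False))         # legacy
print(categorize("web-service", True, False))  # non-compliant
print(categorize("web-service", True, True))   # compliant
```

A tally of components per category then gives the governance team a concrete picture of how much of the inventory still needs to be brought up to the enterprise standards.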
Figure 4 – A proposed process flow (design-time aspect) for an SOA maturity assessment.
The last aspect that we will consider in our assessment of SOA maturity is runtime analysis, as illustrated in Figure 5. The initial step is similar to the one in the design-time analysis, where we do an inventory of runtime IT components that are governed under a runtime governance platform such as the one found in the ORACLE SOA Suite [REF-2] or SOA Software Governance Suite [REF-4].
The inventory of IT components is divided into a number of groupings: stand-alone application components, integrated application components (non-SOA), common reusable infrastructure (such as an organization-wide API that can be packaged into a Java archive (JAR)), project-based components (SOA-compliant), and federated SOA components.
The stand-alone components are assessed at a Silo SOA maturity level. The integrated application components are non-SOA components and are assessed at an Integrated SOA maturity level. The common reusable infrastructure, such as a shared API (a layer that may abstract low-level programmatic components), can be assessed at a Componentized SOA maturity level.
As the organization moves into a more mature integration environment, we can observe that project-based components that follow SOA standards and guidelines begin to be registered into the IT infrastructure. These components are classified at the Service maturity level.
Once the Service maturity level has been reached, the organization, which has been using activities and assets coming out of the business aspect such as formalized business process documentation, will be in a position to produce business services that are classified at the Composite Services SOA maturity level.
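The mapping from runtime component groupings to assessed maturity levels can be sketched as a lookup table. The grouping names are shorthand for the categories in the text, and summarizing the overall runtime level as the highest level among the groupings present is a simplifying assumption for illustration:

```python
# A hypothetical mapping from the runtime component groupings described
# above to the maturity level each is assessed at.
RUNTIME_GROUP_LEVEL = {
    "stand-alone application":           "Silo",
    "integrated application (non-SOA)":  "Integrated",
    "common reusable infrastructure":    "Componentized",
    "project-based (SOA-compliant)":     "Service",
    "federated SOA / business services": "Composite Services",
}

ORDER = ["Silo", "Integrated", "Componentized",
         "Service", "Composite Services"]

def assess_runtime(groups_present: list) -> str:
    """Overall runtime maturity, taken here as the highest level among
    the groupings present (a simplifying assumption, not OSIMM's rule)."""
    levels = [RUNTIME_GROUP_LEVEL[g] for g in groups_present]
    return max(levels, key=ORDER.index)

print(assess_runtime(["stand-alone application",
                      "project-based (SOA-compliant)"]))  # Service
```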
Figure 5 – A proposed process flow (runtime aspect) for an SOA maturity assessment.
After you have completed the SOA maturity assessment of your organization, review the outcome of the analysis of each aspect of SOA maturity that we have discussed, namely, business, design-time, and runtime. You will most likely have different levels of SOA maturity across the analyzed aspects. The next step is to articulate and document a formalized plan by which your organization can move to the next level of SOA maturity for each aspect, and through doing so have alignment across these aspects.
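This roadmap step can be sketched as computing, for each aspect, the next target level and flagging misalignment across aspects. The assessed levels in the example are hypothetical, and treating "one level up per aspect" as the plan is a simplifying assumption:

```python
# A hypothetical sketch of the roadmap step: given the assessed level per
# aspect, identify the next target level for each and flag misalignment.
ORDER = ["Silo", "Integrated", "Componentized", "Service",
         "Composite Services", "Virtualized Services",
         "Dynamically Re-Configurable Services"]

def next_target(level: str) -> str:
    """The next maturity level up, capped at the highest level."""
    i = ORDER.index(level)
    return ORDER[min(i + 1, len(ORDER) - 1)]

# Hypothetical assessment results for the three aspects discussed here.
assessed = {"business": "Integrated",
            "design-time": "Componentized",
            "runtime": "Integrated"}

plan = {aspect: next_target(level) for aspect, level in assessed.items()}
aligned = len(set(assessed.values())) == 1

print(plan)     # each aspect's next target level
print(aligned)  # False: the aspects are at different levels
```

When `aligned` is false, the plan should prioritize bringing the lagging aspects up before pushing the leading aspect further, so that the aspects advance together.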
For a heightened understanding of what you should be looking for when moving your organization to a higher SOA maturity level, review the SOA strategic goals and benefits as articulated by Thomas Erl [REF-1], a listing of which is provided for your convenience in Table 2. I encourage you to review the material further for your own understanding.
Table 2 – A list of service-oriented architecture strategic goals and benefits, as sourced from Thomas Erl's SOA Principles of Service Design [REF-1].
There is a definite benefit to performing an SOA maturity assessment. In keeping with the spirit of SOA strategic goals and benefits [REF-1], we can determine where our organization stands with respect to the kind of business and system integration that exists therein. Establishing SOA governance, applying SOA patterns that follow from SOA principles [REF-1], using industry standards to establish an appropriate data and service asset taxonomy, and implementing a reference SOA maturity model such as OSIMM [REF-5] can help elevate SOA maturity with respect to integration across business and IT system capabilities. Through the formalized plan that comes out of such an assessment, we can collaborate with IT and business stakeholders to work towards a better integration environment. Good luck in your SOA assessment efforts!
[REF-1] Erl, Thomas. SOA Principles of Service Design. Toronto: Prentice Hall, 2008.
[REF-2] ORACLE. Right from the Start: SOA Lifecycle Governance. (2010) http://www.oracle.com/us/dm/h2fy11/right-from-the-start-soa-194191.pdf
[REF-3] Rathfelder, C. and Groenda, H. 2008. iSOAMM: An Independent SOA Maturity Model. Distributed Applications and Interoperable Systems Lecture Notes in Computer Science Volume 5053: 1-15.
[REF-4] SOA Software. Integrated SOA Governance. (2013) http://www.soa.com/solutions/integrated_soa_governance
[REF-5] The Open Group. The Open Group Service Integration Maturity Model (OSIMM), Version 2. Berkshire, UK: The Open Group, 2011.
[REF-6] TM Forum. Application Framework (TAM) Concepts and Principles, GB929-CP. (2012) http://www.tmforum.org/DownloadRelease125/14239/home.html
[REF-7] TM Forum. Information Framework (SID), GB922 0-P, TM Forum Approved Version 1.6. (2011) http://www.tmforum.org/DownloadRelease125/14240/home.html