Leszek is a Master IT Architect who has worked for Hewlett-Packard since 2002 and has 17 years of experience in the computing industry, delivering projects for Financial Services customers. He has gained programming, solution development and project leadership experience building IVR systems, database projects and Web applications. Prior to joining HP, Leszek worked for Metrosoft, Inc., delivering solutions for major US brokerage houses and investment banks.
Representative accomplishments include: the SOA platform implementation for the National Bank of Poland, the Multi-Channel Platform and Corporate Electronic Banking application for Lukas Bank (Credit Agricole), the Investor Phone System for TD Waterhouse and Quick & Reilly, and the Bloomberg Investor Phone for GTE Airfone (Verizon). Leszek's current focus is on Internet Banking solutions, Service-Oriented Architecture featuring both JEE and .NET technologies, Enterprise Security and Master Data Management solutions.
SOA in Banking: A Review of Current Trends and Standards based on an Example of a Real-Life Integration Project Delivered to a Financial Services Customer
Published: June 11, 2014 • Service Technology Magazine Issue LXXXIV
Abstract: The following article is based on the example of an actual banking project, delivered by HP Professional Services to a major European bank. A similar approach has been taken in other integration projects to varying extents, depending on the legacy constraints and requirements of the customer.
Due to rapid growth in the market, a European bank decided to replace its IT application infrastructure, including its core processing system, integration platform and all front-end delivery channels. One of the challenges of this project was the integration of over 20 back-end systems being used by the bank, as well as integration with external service providers.
Three main challenges of the project were: (1) business related, to bridge the gap between business requirements and technical specifications, (2) management related, to shorten development time and meet the aggressive timeline of the project, and (3) technical, to ensure that performance requirements would be met on the given hardware platform. The performance of the transactional system, especially the middleware layer, is extremely important because of strict rules imposed by the systems being interfaced. For example, ATM requests processed in online mode are governed by strict time-out rules.
Two key areas of the solution were: (1) the definition of a unified, enterprise-wide data model for communication, and (2) the selection of tools allowing high-level definition of orchestrated services, meeting the agreed performance requirements in the production environment, and allowing business analysts to participate in the service design process.
The project team acknowledged that the resulting data model would need to follow business requirements while also complying with the constraints of the existing legacy systems. While basic data transformations can be performed directly in the service interfaces of the back-end systems, fine-grained business requirements can be fulfilled only by the orchestration of coarse-grained back-end services.
In contrast to business services exposed by the back-end applications, services resulting from the orchestration were called virtual services.
Unified Data Model
Building a unified data model was the first step on the transformation roadmap towards a service-oriented architecture. During an IT assessment, 21 different back-end applications were identified that provided various services in the bank's production environment. The applications were operated either internally by the bank itself or externally, usually by another bank exposing a set of remote services. After the distinct data areas had been identified, a "primary owner" or "source application" (the application that owns the particular data) was assigned to each data area, redundancies (multiple systems storing the same data) were identified, and finally the systems dependent on the selected data were found. As a result, the representation of each data item was defined as an XML schema type.
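As a sketch of the outcome, a single data area could be captured as an XSD type, with its provenance recorded in an annotation. All names below (type, owning system, namespace) are illustrative assumptions, not the bank's actual model:

```xml
<!-- Illustrative sketch only: names and the owning system are assumptions -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:cfm="urn:example:cfm"
           targetNamespace="urn:example:cfm"
           elementFormDefault="qualified">

  <xs:complexType name="CustomerType">
    <xs:annotation>
      <xs:documentation>
        Data area: Customer. Primary owner (source application): Core Banking System.
        Redundant copies identified in: CRM, Card Management.
      </xs:documentation>
    </xs:annotation>
    <xs:sequence>
      <xs:element name="CustomerId" type="xs:string"/>
      <xs:element name="Name" type="xs:string"/>
      <xs:element name="TaxId" type="xs:string" minOccurs="0"/>
    </xs:sequence>
  </xs:complexType>

</xs:schema>
```

Recording the primary owner directly in the schema keeps the ownership decision visible to every team that later reuses the type.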
How to Start the Process?
Common approaches to data modeling considered by the team were to: (1) adopt an existing industry standard end to end, (2) start from a reference data model and customize it, or (3) build a proprietary model from scratch.
The first option was ruled out because there was no single standard defining end-to-end banking operations. Also, country-specific regulations are very often inconsistent with particular standards, leading to the need for proprietary extensions.
The last option was ruled out because the effort (cost and time) estimated for the "build from scratch" approach far exceeded the available resources.
The Unified Data Model was created from a template: an XML Reference Data Model developed by our team, based on several industry standards applicable to the financial industry, such as SWIFT, IFX (Interactive Financial eXchange) and ISO 20022 (the ISO standard for financial services messaging), and drawing on our practical experience from previous projects.
This selection allowed for a pragmatic approach to the standardization process within the bank. Instead of enforcing one specific standard, it provided an effective framework and templates for the implementation teams in all the relevant areas of banking operations. It was positioned as an effective baseline for the gap analysis and subsequent analytical and design activities, with the final result being a full, customized XSD representation of the bank's actual environment. We have found such an approach very practical and effective in several other banking projects.
Another question to be answered was the scope of the project: whether to limit the modeling effort to a single business domain, or to run it as an enterprise-wide exercise covering all domains of the bank.
Although the project would follow similar steps and activities in both cases, an enterprise-wide project would require much stronger governance, in order to synchronize work performed by the teams focusing on particular domains.
The project referenced in this article was recognized as an enterprise-wide project, and the work was organized around the following activities: identification of business domains, definition of the design standards, organization of the physical assets of the project, structuring of the data dependencies on the analytical level, selection of tools and document templates, introduction of service governance, and definition of the physical data model.
Identification of Business Domains
Particular domains and sub-domains have been associated with different business units of the Bank.
Another aspect of such a split was the definition of data ownership. Data ownership can be viewed from an IT (technical) or a business perspective. IT is mainly focused on where the data is stored, while the business looks at the data from an operational perspective. Very often those two perspectives differ. For example, in the case of a bank using multiple core banking systems (CBS), data entities belonging to the same logical domain are stored in different core systems. From this perspective, the application of logical views allows for the identification of important links that are not visible on the technical level.
The following diagram presents different ways to split the data, depending on the viewpoints described in the next chapter.
Definition of the Design Standards
The consequence of the decision to follow a service-oriented design approach was the ability to involve the business at any stage of the project, from analysis through design, implementation and maintenance. To make such cooperation possible, the project team had to prepare a set of simple, yet extremely important design standards, such as naming conventions, a common type hierarchy and rules for organizing the physical assets of the model.
A rule of thumb is that naming conventions should be self-explanatory and human-readable. It is also good to agree on the consistent usage of a common business vocabulary in all names.
The first part of the example shows basic types that might be either enterprise-wide (such as Amount or Currency) or domain-specific (such as the Address type).
The second part shows domain-specific types, where the "basic" type always defines the minimum subset of data common to all sub-types. For example, the "personal" and "corporate" sub-types of the user demonstrate different identities of the same entity (e.g., the user can be identified as an employee of a corporation or as an individual customer of the bank). The "Permission" type is an example of a set of properties applicable to any user. Finally, UserType is a super-type that groups all possible details of the user together.
The third part of the example presents message parts that are later used to build services and service operations (functions).
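The example itself is not reproduced in the text; the following XSD fragment (assumed to sit inside the unified-model schema, with all type and element names being illustrative assumptions) sketches how the three parts described above could fit together:

```xml
<!-- Part 1: basic types -- enterprise-wide (Amount, Currency) or domain-specific (Address) -->
<xs:simpleType name="CurrencyType">
  <xs:restriction base="xs:string">
    <xs:pattern value="[A-Z]{3}"/> <!-- e.g. an ISO 4217 code such as EUR -->
  </xs:restriction>
</xs:simpleType>

<xs:complexType name="AmountType">
  <xs:simpleContent>
    <xs:extension base="xs:decimal">
      <xs:attribute name="Currency" type="cfm:CurrencyType" use="required"/>
    </xs:extension>
  </xs:simpleContent>
</xs:complexType>

<xs:complexType name="AddressType">
  <xs:sequence>
    <xs:element name="Street" type="xs:string"/>
    <xs:element name="City" type="xs:string"/>
    <xs:element name="PostalCode" type="xs:string"/>
  </xs:sequence>
</xs:complexType>

<!-- Part 2: domain-specific types; BasicUserType holds the minimum data common to all sub-types -->
<xs:complexType name="BasicUserType">
  <xs:sequence>
    <xs:element name="UserId" type="xs:string"/>
    <xs:element name="Name" type="xs:string"/>
  </xs:sequence>
</xs:complexType>

<!-- "personal" and "corporate" sub-types: different identities of the same entity -->
<xs:complexType name="PersonalUserType">
  <xs:complexContent>
    <xs:extension base="cfm:BasicUserType">
      <xs:sequence>
        <xs:element name="HomeAddress" type="cfm:AddressType"/>
      </xs:sequence>
    </xs:extension>
  </xs:complexContent>
</xs:complexType>

<xs:complexType name="CorporateUserType">
  <xs:complexContent>
    <xs:extension base="cfm:BasicUserType">
      <xs:sequence>
        <xs:element name="CompanyId" type="xs:string"/>
      </xs:sequence>
    </xs:extension>
  </xs:complexContent>
</xs:complexType>

<!-- PermissionType: properties applicable to any user -->
<xs:complexType name="PermissionType">
  <xs:sequence>
    <xs:element name="Resource" type="xs:string"/>
    <xs:element name="AccessLevel" type="xs:string"/>
  </xs:sequence>
</xs:complexType>

<!-- UserType: super-type grouping all possible details of the user -->
<xs:complexType name="UserType">
  <xs:sequence>
    <xs:element name="Personal" type="cfm:PersonalUserType" minOccurs="0"/>
    <xs:element name="Corporate" type="cfm:CorporateUserType" minOccurs="0"/>
    <xs:element name="Permission" type="cfm:PermissionType" minOccurs="0" maxOccurs="unbounded"/>
  </xs:sequence>
</xs:complexType>

<!-- Part 3: message parts, later assembled into services and operations -->
<xs:element name="GetUserRequest">
  <xs:complexType>
    <xs:sequence>
      <xs:element name="UserId" type="xs:string"/>
    </xs:sequence>
  </xs:complexType>
</xs:element>
<xs:element name="GetUserResponse" type="cfm:UserType"/>
```

Derivation by extension keeps each sub-type dependent only on the minimal base, so teams can evolve the "personal" and "corporate" views independently.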
Organization of the Physical Assets of the Project
Although it may sound like a minor decision, the organization of the data model in terms of folders and files can have significant consequences, both for IT and for the business. While the technical approach focuses on organizing files based on their technical properties (e.g. simple types, complex types, interfaces), the business approach would group the files based on data ownership.
Enterprise-scale projects usually require a number of teams working in parallel. Proper organization of the physical assets was very helpful in achieving a certain level of independence between the teams focusing on different business domains and working with different counterparties from the bank.
Structure of the Data Dependencies on the Analytical Level
Another fundamental decision is to distinguish between enterprise data that will be shared between various business domains and domain-specific data that will most likely be governed within the domain.
In both cases, common data will be organized in the form of dictionaries. While domain-specific dictionaries are governed locally by the team responsible for that particular domain, enterprise dictionaries will be shared between the teams and hence would require special governance procedures.
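One plausible way to express this split in XSD (the namespaces, file names and types below are assumptions for the sketch, not the project's actual layout) is to keep the enterprise dictionary in its own namespace and import it into each domain schema:

```xml
<!-- Illustrative sketch: a domain schema importing the shared enterprise dictionary -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:pay="urn:example:cfm:payments"
           xmlns:ent="urn:example:cfm:enterprise"
           targetNamespace="urn:example:cfm:payments"
           elementFormDefault="qualified">

  <!-- enterprise dictionary: shared across domains, centrally governed -->
  <xs:import namespace="urn:example:cfm:enterprise"
             schemaLocation="../enterprise/EnterpriseDictionary.xsd"/>

  <!-- domain-specific dictionary: governed locally by the domain team -->
  <xs:simpleType name="PaymentStatusType">
    <xs:restriction base="xs:string">
      <xs:enumeration value="Pending"/>
      <xs:enumeration value="Executed"/>
      <xs:enumeration value="Rejected"/>
    </xs:restriction>
  </xs:simpleType>

  <xs:complexType name="PaymentType">
    <xs:sequence>
      <xs:element name="Amount" type="ent:AmountType"/> <!-- from the enterprise dictionary -->
      <xs:element name="Status" type="pay:PaymentStatusType"/>
    </xs:sequence>
  </xs:complexType>

</xs:schema>
```

With this layout, a change to a domain dictionary never forces a re-release of the enterprise namespace, while enterprise changes flow to every importing domain through normal governance.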
Tools and Document Templates
The most important part of collaboration between the project teams is to focus on the business and functional contexts. This is why proper selection of the design tools and document templates is critical for effective collaboration between the teams, ensuring that the results can be easily shared and merged together on the enterprise level.
Selection of the Tools for Data Modeling
In the project referenced in this article, a choice was made between two tools: Sparx Enterprise Architect (EA), a general-purpose modeling tool, and Altova XMLSpy, a dedicated XML and XSD editor.
While those tools are significantly different in nature, both allow for the definition and maintenance of the data model; the main differences lie in the modeling approach and in the quality of the generated XSD.
The choice made on the project level was to use XMLSpy as the primary data modeling tool, because of the outstanding quality of the XSDs produced by this tool.
In another project, a similar decision related to the choice between EA and XMLSpy led to a mixed approach. Thanks to the high-quality API exposed by EA, a small application was developed to automatically synchronize work between the two tools.
Introduction of Service Governance
Service governance is essentially the central management of enterprise resources that span domains and business areas. All the design standards mentioned so far contribute to the overall governance, but cover only one aspect of it. Tools and templates won't work without strong management procedures that need to be followed at every stage of the project.
Unfortunately, even templates and procedures are not sufficient for overall success. It could be the case, especially during the analysis phase of the project, that work done in accordance with the agreed standards leads to meaningless results. In the project referenced in this article, a team working on the transaction authorization procedures reached a point where the processes were so complicated that there was no simple way to model them in any of the available tools.
Such situations illuminate the most important aspect of governance: strong leadership. It could be the Chief Architect or any other team member with a clear vision of the overall solution and the ability to assemble many small bits and pieces into the big picture representing the expected results of the project. In the referenced project, the decision was made to establish an Architecture Board, consisting of the Chief Architect and a few other team members with good overall knowledge of the project. Throughout the project, the Architecture Board was responsible for reviewing all the analytical documents to achieve overall consistency. In case of conflicts, members of the Architecture Board worked with the business to produce satisfactory solutions.
Definition of the Physical Data Model
The definition of the physical data model followed a combined top-down and bottom-up approach.
The primary focus of the top-down approach was to understand the future needs of the business stakeholders.
The bottom-up approach focused on the existing applications and the data structures and services they already exposed.
Design of the XML Schema
The results of the top-down and bottom-up analyses led to the creation of the unified data model, defined as an XML schema.
The prefix CFM stands for Common Message Format. In various banking projects, data models defined for particular banks were named individually for each bank. For example, the model for Bank X could be named BXMF.
Design of the Service Messages
Although XSD was used to define the data structures, the ultimate goal of the SOA project was the definition of services. This is why the data model was usually referred to as a "message model".
Message-specific data types extended common abstract base types defined separately for request and response messages.
The usage of common types for request and response messages brought another level of standardization. Apart from the technical data transferred in the SOAP header in the case of Web services, the common types defined the shared parts of the message body.
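A minimal sketch of this pattern (a fragment assumed to sit inside the cfm schema; all type and field names are illustrative assumptions, not the project's actual definitions):

```xml
<!-- Common header carried in the message body, in addition to the SOAP header -->
<xs:complexType name="MessageHeaderType">
  <xs:sequence>
    <xs:element name="MessageId" type="xs:string"/>
    <xs:element name="Channel" type="xs:string"/> <!-- e.g. ATM, Internet Banking -->
    <xs:element name="Timestamp" type="xs:dateTime"/>
  </xs:sequence>
</xs:complexType>

<!-- Abstract base types for all request and response messages -->
<xs:complexType name="AbstractRequestType" abstract="true">
  <xs:sequence>
    <xs:element name="Header" type="cfm:MessageHeaderType"/>
  </xs:sequence>
</xs:complexType>

<xs:complexType name="AbstractResponseType" abstract="true">
  <xs:sequence>
    <xs:element name="Header" type="cfm:MessageHeaderType"/>
    <xs:element name="StatusCode" type="xs:string"/>
  </xs:sequence>
</xs:complexType>

<!-- A message-specific type extends the common abstract base -->
<xs:complexType name="GetAccountBalanceRequestType">
  <xs:complexContent>
    <xs:extension base="cfm:AbstractRequestType">
      <xs:sequence>
        <xs:element name="AccountNumber" type="xs:string"/>
      </xs:sequence>
    </xs:extension>
  </xs:complexContent>
</xs:complexType>
```

Because every concrete message derives from the same abstract base, generic middleware components (logging, routing, correlation) can process the common header without knowing the specific operation.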
Design of Web Services
The Web services were defined separately for the "business services" exposed by the back-end systems and for the "virtual (orchestrated) services" exposed by the service orchestration engine; in both cases the same design conventions were applied.
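As an illustration (the service, namespace and message names are assumptions for the sketch), both kinds of services could be described with the same WSDL shape, so that only the binding and endpoint differ between a back-end adapter and the orchestration engine:

```xml
<wsdl:definitions xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
                  xmlns:tns="urn:example:cfm:account"
                  targetNamespace="urn:example:cfm:account"
                  name="AccountService">

  <!-- the request/response elements are assumed to be imported from the CFM schema -->
  <wsdl:message name="GetAccountBalanceRequest">
    <wsdl:part name="body" element="tns:GetAccountBalanceRequest"/>
  </wsdl:message>
  <wsdl:message name="GetAccountBalanceResponse">
    <wsdl:part name="body" element="tns:GetAccountBalanceResponse"/>
  </wsdl:message>

  <!-- the same portType shape serves business and virtual services;
       only the binding/endpoint distinguishes adapter vs. orchestration engine -->
  <wsdl:portType name="AccountPortType">
    <wsdl:operation name="GetAccountBalance">
      <wsdl:input message="tns:GetAccountBalanceRequest"/>
      <wsdl:output message="tns:GetAccountBalanceResponse"/>
    </wsdl:operation>
  </wsdl:portType>

</wsdl:definitions>
```

Keeping the interface contract identical for both service layers lets consumers be re-pointed from a raw business service to an orchestrated virtual service without code changes.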
Selection of Service Orchestration Tools
The ability to create fine-tuned business services, following detailed specifications prepared together with the business, was one of the fundamental requirements that justified adopting the service-oriented approach for the project.
The selection of the service orchestration tool was based on a set of goals highlighting the particular business and technical requirements of the project.
The tool selected for the project was based on a proprietary solution and is due to be replaced by an off-the-shelf software package. Currently, a number of leading ESB-class applications available on the market fulfill the requirements stated above.
The project resulted in the identification of approximately 140 business services provided by internal and external service providers. These services were further orchestrated into 500 highly specialized, fine-grained virtual services in support of various service consumers that were mainly front-end applications.
Thanks to the top-down (rather than bottom-up) approach, the services delivered to the front-end applications almost perfectly matched the requirements stated by the business stakeholders, while only minor modifications to the back-end systems were required.
All the data transformations between the internal data structures of the back-end systems and the unified data model were implemented in dedicated adapters exposing services from the particular systems. In some cases, services were implemented using the native capabilities of the back-end applications (e.g. internal scripting languages), while in other cases, adapters were deployed on top of the legacy systems as separate, add-on applications.
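A basic transformation of this kind can be sketched in XSLT; the flat legacy layout and the CFM element names below are illustrative assumptions, not the actual interfaces of the project:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:cfm="urn:example:cfm">

  <!-- map a flat legacy customer record into the unified CFM structure -->
  <xsl:template match="/LegacyCustomer">
    <cfm:Customer>
      <cfm:CustomerId><xsl:value-of select="CUST_NO"/></cfm:CustomerId>
      <cfm:Name>
        <xsl:value-of select="concat(FIRST_NAME, ' ', LAST_NAME)"/>
      </cfm:Name>
      <cfm:Address>
        <cfm:City><xsl:value-of select="ADDR/TOWN"/></cfm:City>
        <cfm:PostalCode><xsl:value-of select="ADDR/ZIP"/></cfm:PostalCode>
      </cfm:Address>
    </cfm:Customer>
  </xsl:template>

</xsl:stylesheet>
```

Keeping such mappings in the adapter layer means the orchestration engine only ever sees CFM structures, which is exactly what kept the orchestration overhead low.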
From a technical perspective, service orchestration was implemented on the ESB-class platform that was selected for the project.
Since most of the online communications, including the timeout-sensitive ATM channel, were based on orchestrated services, detailed performance testing was performed to measure the actual overhead of the orchestration.
The tests proved that properly designed service orchestration that focuses on business requirements and avoids extensive data transformations adds only minimal performance overhead.
Later on, the orchestration capabilities were used several times to deliver new capabilities required by the business. In one example, the creation of a brand-new line of "savings" products, combining term deposits with mutual funds, was completed in only three weeks, while a similar task would have required three months using the traditional, non-service-oriented approach.