Big Data as a Service

Varun Sharma


The current times are deeply impacted by social, mobile, and complementary technologies in multiple ways. This article showcases and highlights potential use cases and user scenarios in which B2C and B2B marketers and pioneers of IoT could use data from previously overlooked sources and pull information in great detail to influence buying and possibly many other user behaviors, touching upon the technical aspects of the business, data, and information. This article is a recommended source of guidance for building a framework for enterprises in the B2C and B2B space, helping them lay down a tiered Big Data reference architecture for capitalizing on the power and potential of Big Data while keeping a sharp focus on security and governance. It is a recommended read for architects and IT leaders who are on the verge of taking the Big Data plunge. There are many things you can redo in the world of IT, but you get limited chances, time, and bandwidth to build good architecture; this article aims to enable IT decision makers to build the Big Data architecture "right the first time." This article is NOT an endorsement of any brand, company, enterprise, analyst, and/or methodology. It is simply the author's view on how one could reuse and apply simple concepts from past experience to build a robust, expandable, and flexible Big Data reference architecture and transform business using Big Data-as-a-Service. The mutual reinforcement and convergence of disruptive forces from the underlying SMAC (social, mobile, analytics, and cloud) platforms over the last couple of years have unmistakably given rise to more new business scenarios through innovation and realignment than ever before. The ever-growing number of connected devices and machines is changing the IT and business landscapes...
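The tiered reference architecture the abstract alludes to can be pictured as distinct layers with a governance gate between them. The following is a minimal sketch in Python; the layer names, event sources, and policy check are illustrative assumptions, not details taken from the article.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Event:
    source: str   # e.g. "social", "mobile", "iot"
    payload: dict

class TieredPipeline:
    """Hypothetical tiered flow: ingest -> raw landing tier -> curated tier."""

    def __init__(self) -> None:
        self.raw: list[Event] = []     # landing zone: everything, as received
        self.curated: list[dict] = []  # cleansed, policy-compliant records

    def ingest(self, event: Event) -> None:
        self.raw.append(event)         # capture first, curate later

    def curate(self, allow: Callable[[Event], bool]) -> None:
        # Governance gate: only events that pass policy reach the curated
        # tier that analytics and downstream consumers read from.
        self.curated = [e.payload for e in self.raw if allow(e)]

pipeline = TieredPipeline()
pipeline.ingest(Event("iot", {"device": "thermostat-7", "temp_c": 21.5}))
pipeline.ingest(Event("social", {"user": "anon", "sentiment": 0.8}))
pipeline.curate(lambda e: e.source in {"iot", "social"})  # illustrative policy
print(len(pipeline.curated))  # -> 2

The point of the layering is that raw data is always captured, while security and governance are enforced at the boundary before data becomes consumable.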


SOA in Banking: A Review of Current Trends and Standards based on an Example of a Real-Life Integration Project Delivered to a Financial Services Customer

Leszek Jaskierny


The following article is based on an actual banking project delivered by HP Professional Services to a major European bank. A similar approach has been taken in other integration projects to varying extents, depending on the legacy constraints and requirements of the customer. Due to rapid growth in the market, the bank decided to replace its IT application infrastructure, including its core processing system, integration platform, and all front-end delivery channels. One of the challenges of this project was the integration of over 20 back-end systems used by the bank, as well as integration with external service providers. The three main challenges of the project were: (1) business related, to bridge the gap between business requirements and technical specifications; (2) management related, to shorten development time and meet the aggressive timeline of the project; and (3) technical, to ensure that performance requirements would be met on the given hardware platform. The performance of the transactional system, especially the middleware layer, is extremely important because of strict rules imposed by the systems being interfaced; for example, ATM requests processed in online mode are governed by strict time-out rules. Two key areas of the solution were: (1) the definition of a unified, enterprise-wide data model for communication, and (2) the selection of tools allowing high-level definition of orchestrated services that would meet the agreed performance requirements in the production environment and allow business analysts to participate in the service design process. The project team acknowledged that the resulting data model would need to follow business requirements but would also need to comply with the requirements of the existing legacy systems. While basic data transformations can be performed directly in the service interface of the back-end systems, fine-grained business requirements can be fulfilled only by the orchestration of coarse-grained back-end services. In contrast to business services exposed by the back-end applications...
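The time-out constraint described above lends itself to a small illustration. This sketch assumes a 3-second limit and a stubbed back-end call, both invented for the example, and uses only the Python standard library.

import concurrent.futures
import time

ATM_TIMEOUT_SECONDS = 3.0  # assumed limit; real values are dictated by the ATM network

def authorize_withdrawal(card: str, amount: float) -> str:
    """Stand-in for a coarse-grained back-end authorization service."""
    time.sleep(1.0)  # simulated back-end latency
    return f"APPROVED {card}: {amount:.2f}"

def handle_atm_request(card: str, amount: float) -> str:
    # The middleware must answer within the time-out or decline gracefully,
    # rather than leave the ATM (and the customer) waiting indefinitely.
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(authorize_withdrawal, card, amount)
        try:
            return future.result(timeout=ATM_TIMEOUT_SECONDS)
        except concurrent.futures.TimeoutError:
            return "DECLINED: host time-out"

print(handle_atm_request("4929-****-1234", 200.0))

In an orchestration layer the same principle would apply at every hop: each composed back-end call has to fit within the time budget of the channel that invoked it.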


Cloud and Virtual Data Storage Networking:
Data Footprint Reduction

Greg Schulz


Data footprints, or information about how data are accessed, used, manipulated, and saved in a given system, are important to maintain as networks store increasingly large quantities of data. If not managed carefully, this metadata can become an additional storage burden. Many different data footprint reduction technologies are being implemented to improve efficiency: archiving, compression, de-duplication, and thin provisioning. This article details the different technologies for data footprint reduction (DFR), as well as different techniques for data management. It also addresses data footprint impact, changing data access, changing lifecycles, and the economic benefits of DFR. The amount of active and inactive data that needs to be stored is continually growing. As this data growth rate climbs, your data footprint continues to expand and new business issues, challenges, and opportunities arise. Optimization (doing more with what you have, or with less) and virtualization are both vital methods for managing your data footprint. These trends enable more data to be retained in a cost-effective manner without compromising quality of service or increasing associated infrastructure resource management (IRM) complexities. By improving the way vast amounts of data are stored, the overall storage network maintains its quality and efficiency, even as more and more data is acquired. If it were not becoming increasingly necessary to process and store more data for longer periods of time in different locations, there would be little need for data storage, networking, IT clouds, or virtualization. Similarly, if there were not an ever-increasing dependence on information being accessible when and where needed, including data that was previously offline or not even available in a digital format, there would be no need for business continuance (BC), disaster recovery (DR), backup/restore, or archiving. However, as has been discussed in previous chapters, there is no such thing as a data recession, and dependence on information continues to grow. Countering...
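Of the DFR techniques listed, de-duplication is the easiest to show in a few lines. The sketch below assumes fixed-size blocks and SHA-256 content hashing; real products frequently use variable-length chunking and more compact fingerprints.

import hashlib

BLOCK_SIZE = 4096  # assumed block size; actual systems vary

def deduplicate(data: bytes) -> tuple[dict[str, bytes], list[str]]:
    """Split data into fixed-size blocks and store each unique block once."""
    store: dict[str, bytes] = {}  # content hash -> block bytes
    recipe: list[str] = []        # ordered hashes needed to rebuild the data
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

payload = b"A" * 16384 + b"B" * 4096  # highly redundant sample data
store, recipe = deduplicate(payload)
print(f"{len(recipe)} logical blocks, {len(store)} unique blocks stored")
# -> 5 logical blocks, 2 unique blocks stored

The footprint saving is the ratio of logical to unique blocks; reconstruction simply replays the recipe against the block store.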


An SOA Maturity Ecosystem

Hector Fabio Meza, Paul Escobar Mossos


Incorporating SOA in an organization is essentially a migration initiative, and as such it should have an associated transition process. An important part of this process is the establishment and periodic evaluation of an SOA maturity roadmap. Several models have been created to perform this task, most of them tied to a specific vendor, which limits their open use. The Open Group has a model that is an industry standard and, moreover, is not only a maturity model but a framework: it lays a foundation that can be extended and customized, and it comes with an associated method. This landscape has two consequences: first, it leaves organizations, independent consultants, open source vendors, and others without a tool to automate their maturity assessments; second, it leaves them lacking a broad, tested set of maturity indicators drawn from previous experience or vertical markets. To give organizations, independent consultants, open source vendors, and other actors interested in measuring the evolution of SOA migrations what they need to be agile, effective, systematic, and rigorous, and to let them build on the experience of their environments, thereby improving SOA implementations in organizations, a maturity model alone is not enough: additional elements are needed to complement, strengthen, and leverage the way we evaluate and define roadmaps for SOA initiatives, filling the void in assessment systematization and the lack of collaboration based on the experience of different sectors. This article describes how to address these gaps, detailing a maturity ecosystem for SOA comprised of three main elements that feed back into one another. Maturity framework: based on The Open Group's OSIMM, which is an ISO standard and can be used in any vertical market. Automation tool: an open source tool that implements the flexibility of the OSIMM framework so organizations and consultants can perform measurements and evolutions in a systematic way. It has its own community that supports its maintenance...
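The kind of assessment such an ecosystem would automate can be sketched briefly. The dimension and level names below are modeled on OSIMM's seven dimensions and seven maturity levels; the simple averaging and the sample scores are assumptions made for illustration only.

OSIMM_DIMENSIONS = [
    "Business", "Organization & Governance", "Method", "Application",
    "Architecture", "Information", "Infrastructure & Management",
]
MATURITY_LEVELS = {
    1: "Silo", 2: "Integrated", 3: "Componentized", 4: "Services",
    5: "Composite Services", 6: "Virtualized Services",
    7: "Dynamically Reconfigurable Services",
}

def assess(scores: dict[str, int]) -> str:
    """Average per-dimension scores (1-7) into an overall maturity level."""
    missing = set(OSIMM_DIMENSIONS) - scores.keys()
    if missing:
        raise ValueError(f"unscored dimensions: {missing}")
    overall = round(sum(scores.values()) / len(scores))
    return MATURITY_LEVELS[overall]

# Hypothetical assessment of one organization:
sample = dict(zip(OSIMM_DIMENSIONS, [3, 2, 3, 4, 3, 2, 3]))
print(assess(sample))  # -> Componentized

An automation tool of the kind the article proposes would replace the hard-coded sample with questionnaire-driven indicators and track scores across successive assessments.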


Issue LXXXIV, May/June 2014

From the Editor

Thomas Erl

Big Data technology and practices are becoming increasingly relevant to IT enterprises. Many are discovering the extent to which traditional data analysis and data science techniques have formed the foundation for what Big Data has become as a professional field of practice. What consistently distinguishes Big Data, however, are the orders of magnitude at which those established techniques now need to be applied and the sometimes extreme conditions under which massive volumes of data need to be processed. These and other necessities brought about by Big Data processing demands have led to further layers of innovation, in both practice and technology, built upon traditional data science foundations.

Thomas Erl, Series Editor and Site Editor

Download This Issue
The entire May/June 2014 issue of the Service Technology Magazine is now available for download as a high-resolution PDF.
SOA, Cloud Computing & Big Data Certification Workshops
To view the most current calendar of public SOA, Cloud Computing & Big Data Science Certified Professional workshops, visit www.arcitura.com/workshops