Hans Wierenga

Biography

Hans Wierenga is an enterprise architect working for DCE Consultants, an Amsterdam-based management and IT consultancy firm. He is one of the two founding authors of the Netherlands Governmental Reference Architecture (NORA), the official reference architecture for all layers of government in the Netherlands. He is also one of the three key authors of the GEMMA Information Architecture, in which the principles of the NORA have been translated into concrete designs that enable local governments across the Netherlands to act as the entry point for citizens and businesses to all government services. He is currently working on a project to connect all provincial governments in the Netherlands to the national databanks.

Other recent publications include 10 SOA Commandments and Architectural Information Economics.

Hans holds a Bachelor of Science (Honours) degree from the University of Tasmania and a Graduate Diploma in Computing Studies from Canberra University.

Email: hans.wierenga@dceconsultants.com


Information Security for SOA:
Why the Information Security Consultancy Industry Needs a Major Overhaul - Part I

Published: August 9, 2010 • SOA Magazine Issue XLII
 

Abstract: Current conceptions of what information security is all about - such as are embodied in CRAMM, the ISO 27000 family of standards and COBIT - are too systems-centric to be effective in the Internet age. The key terms - confidentiality, integrity and availability (CIA) - describe properties of systems and do not adequately address the collective business value of information security. We suggest replacing them with a service-centered approach based on the terms trust, respect and utility (TRU), in order that the security impact of the totality of our information systems can be adequately assessed and managed from a business perspective.


Introduction

On the 30th of April, 2009, a lone assailant attempted to crash his car into an open bus containing the Dutch royal family. The next working day, thousands of municipal workers, driven by nothing but personal curiosity, accessed the personal details of the assailant using the national citizens registry. They could see where he was born, who his parents were, whether he had ever been married, and his current and previous residential addresses. Was this an invasion of his personal privacy? Certainly. Should our information systems be designed to discourage such misuse? Hopefully. Did the access constitute a breach of information security? No, not according to the current state of wisdom of the information security consultancy industry, because it considers the confidentiality of information to be violated only when the information has been accessed by people who were not authorised to do so, and each and every one of the municipal workers was authorised to access the national citizens registry. In other words, nothing went wrong.

This example illustrates the importance of using appropriate terminology in order to address information security, or, for that matter, any other domain. If the vocabulary we use does not allow us to register that there is a discrepancy between what we are achieving and what we aim to achieve, it is unlikely that we will ever take measures to eliminate the discrepancy. Unfortunately, the current information security vocabulary - in particular, as embodied in standards such as the ISO 27000 family of standards [REF-5], CRAMM [REF-4] and COBIT [REF-3] - is structurally and fundamentally unsuitable for the expression of the information security requirements of the 21st century. The key terms of this vocabulary are confidentiality, integrity and availability, better known under the acronym CIA. As we shall show in this article, there are many, many goals which are not adequately covered by these terms, but nevertheless must be achieved in order for an organisation to have good information security in the Internet age.

However, the vocabulary is not the only problem with CIA: the way that it is applied is also inadequate. CIA is applied to the individual information assets of organisations, with little regard for the collective impact these assets have on the experiences of customers, suppliers and employees. But it is this collective impact that determines the business value of information security. In other words, the security consultancy industry standards do not just employ the wrong words, but they also apply them to the wrong things. The CIA-paradigm entirely ignores the fact that the whole is more than the sum of the parts, blithely assuming that if each individual information system is secure, then the whole is too. This way of thinking is hardwired into the standard approach of the information security consultancy industry, which involves making an inventory of the information systems and then working out how to make each of them secure.

As a paradigm, CIA lags decades behind developments in the business applications of information processing and technology. It was developed at a time when each information system could be considered as an island, complete with its own information security solutions. In the Internet age, however, the world is completely different. Stakeholders, whether they are customers, suppliers or your own employees, no longer interact with just one or two individual information systems, but with a whole host of them. And the information systems as such don't interest them, just the responses they get to their input, regardless of how many information systems and even organisations are required to produce that response. The systems-based approach of the information security consultancy industry is no longer appropriate.

To make things worse, the CIA-paradigm lacks metrics which can be used to determine how effective it is. What difference does it make to the business value that an organisation's information security delivers when it invests heavily in security awareness for its employees, a new firewall or better access control? There is no way to measure it. If all the money ever invested in implementing CIA was one giant waste, it wouldn't matter, because there is no way to tell. We may know the end result of this investment, but not what the result would have been without it. Using words which don't adequately express the goals we wish to achieve, applying an approach which considers only the parts but not the whole, and not measuring how effective we are is a recipe for ineffective solutions. That need not be a problem if the whole point of the exercise is to enable those responsible to claim that they took the best advice and did everything they could, but not everybody can afford to take such a position.

In this paper we shall discuss how the conventional wisdom of the information security consultancy industry can be improved upon in order to deliver measurable business value. We shall introduce more appropriate terms, which enable us to maximize this business value, and we shall introduce an approach which goes from the whole to the parts. The new terms - trust, respect and utility - enable us to focus on the business value of information security and lead to better information security solutions. We shall show how engendering trust, showing respect and delivering utility changes the information security landscape. We shall demonstrate how they improve on the CIA-goals and approach, and discuss whether it makes sense to incorporate the old wisdom into the new.

A new approach to information security is hardly possible without a new way of looking at information systems. In this paper we shall apply the service-oriented architecture paradigm for that purpose. The paradigm describes all interactions in terms of services, in which a requestor asks an agent for something to be done, and the agent ensures that it gets done and delivers a response to the requestor. This way of thinking can be applied at a business level, in order to describe interactions between organisations; at a functional level, to describe how the activities that make up business processes interact; and at the level of information systems, in order to describe how systems and parts of systems interact. Applying it at all levels enables an organisation to make the connection between each and every part of its information processing and the business value that it delivers. When it is applied properly - for some guidelines see Wierenga [REF-7] - it makes it possible to coherently account for the business value of IT.
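
To make the requestor-agent pattern concrete, the following minimal Java sketch shows one agent fulfilling a request by delegating to another, while the requestor sees only the response. The class and method names are purely illustrative; they are not taken from any particular SOA product or standard.

    // Minimal sketch of the requestor/agent service pattern: a requestor asks an
    // agent for something to be done; the agent ensures it is done and returns a
    // response. All names are illustrative.
    public class ServiceInteraction {

        record Request(String requestor, String subject) {}
        record Response(String outcome) {}

        // An agent may fulfil a request itself or delegate to other agents and
        // organisations; the requestor sees only the response, never the systems
        // behind it.
        interface ServiceAgent {
            Response handle(Request request);
        }

        public static void main(String[] args) {
            ServiceAgent addressLookup = req ->
                    new Response("registered address of " + req.subject());

            ServiceAgent frontOffice = req -> {
                // Delegation to the back-office agent is invisible to the requestor.
                Response backOffice = addressLookup.handle(req);
                return new Response("certificate issued using " + backOffice.outcome());
            };

            System.out.println(frontOffice.handle(
                    new Request("citizen-1234", "citizen-1234")).outcome());
        }
    }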


Engendering Trust

People do business with you only if they trust you. Lose that trust, or fail to gain it in the first place, and you may as well close your doors. Management guru Stephen M.R. Covey puts it this way: "The ability to establish, grow, extend, and restore trust with all stakeholders - customers, business partners, investors, and co-workers - is the key leadership competency of the new, global economy." Although Covey's research was addressed to the functioning of management and disregarded IT, similar considerations apply to information resources. Customers, suppliers and your own employees will use your information if they trust it. Lose that trust, and at best they will maintain their own versions of the information and just not tell you, and at worst they will stop working with you. Either way, that damages your effectiveness. You must engender their trust in your information and the systems with which you deliver it to them in order to prevent counterproductive behaviour.

It is possible to measure the trust that you have engendered and the business value which your stakeholders attach to it. All you have to do is ask them. That in itself makes it a more useful objective than integrity, the CIA alternative. It also covers more than integrity, which ISO 27000 equates with protecting the accuracy and completeness of anything of value to the organisation. CRAMM and COBIT employ similar definitions. Even the COBIT attributes of integrity, reliability of information and compliance combined do not cover all that trust covers, as we shall show in the following paragraphs. Covey identified thirteen behaviours which are strongly associated with high trust leaders. Half of them are just as relevant to the way in which our information systems engender trust: Create Transparency, Right Wrongs, Confront Reality, Clarify Expectations, Practice Accountability, and Keep Commitments. None of them are adequately covered by the term "Integrity", as defined by the information security consultancy industry. We shall discuss each of the above behaviours individually.


Create Transparency

There are two ways in which stakeholders can build up faith in your information resources. The first is that they immediately experience that the information you provide is evidently correct. This is the case if they already knew the answer, but had forgotten it; receipt of the information then results in recognition. It is also the case if they can independently verify that it is correct: the solution to a Sudoku puzzle falls into this category. But for the vast majority of information requirements, an experience of evident correctness does not belong to the ordinary run of things. That leaves the second way: to make the processes by means of which the information is produced and delivered transparent to the stakeholder. For all but the most suspicious types, this provides an assurance that everything is OK.

To create transparency, there are a few simple pointers. Ensure that your business processes are well thought out and simple. Let stakeholders know what they can expect, and what is expected of them. Report milestones and deviations from previously communicated plans to all concerned stakeholders. Give stakeholders the means to track the progress of the processes which concern them, whenever they want to. Grant stakeholders access to the information that you hold about them.

Given the importance of transparency in order to deliver business value, it is unfortunate that standard information security consultancy industry treatments of integrity do not cover transparency. Transparency is not, in the first instance, a means to protect the accuracy and completeness of information, but a means to turn that accuracy and completeness into business value.


Right Wrongs

There is nothing more deadly to the engendering of trust in the information that you supply to your stakeholders than situations in which they see and report errors in the information, but nothing is done to correct them. Almost as bad is the situation in which it takes us so long to correct the errors that stakeholders have no choice but to maintain their own version of the information. Ultimately, such situations lead to the stakeholders seeing and being irritated by the errors, but seeing no point in reporting them.

In theory, the ideal strategy to avoid the alienation that wrong information produces is to ensure that the information is always correct. In practice that is seldom possible. Even when it is possible, it is not always enough, because although we may be convinced of the accuracy of the information, the stakeholder may not be, and his perception is in itself a problem that must be addressed. It is therefore almost always advisable to establish a mechanism by means of which stakeholders are encouraged to report the errors that they detect, and the progress in assessing and correcting errors is communicated to them. Even if it is hardly ever used, the mere fact that it is there amounts to a statement that you are concerned to maintain the accuracy of the information. Trust is a perception, and to engender it we must do what we can in order to influence perceptions, regardless of the fact that the CIA-paradigm discourages us from even thinking about doing so.

Because trust is a perception, it is imperative that incidents that damage that perception are properly responded to. Damage control is imperative. It is in general better to acknowledge a failure, and in doing so emphasize how important you think information security is, than to allow the perception to take hold that you seem to think that security breaches are tolerable.


Confront Reality

Information systems are models of some administrative reality. We build them because they are easier to use than the reality which they model. For example, it is easier to count records of customers in a database than it is to count the actual customers. But we must never forget that it is the administrative reality that determines the model, and not the other way around.

In order to ensure that the information produced by our systems engender trust, we must keep the lines between that reality and our model of it as short as possible. When something in that reality changes, our model should be updated to reflect that change. Any options which ensure that we capture the update as soon as possible should be explored. Waiting until somebody decides to tell us about the change is a choice, not an imperative.

To confront reality in this way, we must be clear as to who is entitled to detect that the reality has been changed and to register at first hand the new information. Those people or organisations are the primary source of the information. We should endeavour to capture the information from them as soon as they register it, so that it is as up-to-date as possible. We should capture this information in such a way that we can be sure that it has not been modified by others since leaving the primary source, in other words that it is authentic. But that is not enough, because primary sources can and do make errors in capturing information. In general it is preferable to deal with such situations by getting them to validate the information when it is captured and asking them to correct invalid information before passing it on to us. If our information is useful for their validity checking, we should make it available to them, rather than waiting until they pass invalid data on to us. The closer to the source that information is corrected, the better. There is no good reason why organisation boundaries should get in the way.

Unfortunately it is quite possible to have a chain of information systems, each designed for maximum integrity, producing useless information because they don't trust each other's information. Each organisation can produce an auditor's statement to prove that they are "in control", but they have cut themselves off from the sources of the information that could be used to correct deficiencies in their data, because these sources are not under their control. The processes by means of which they collect and validate data are followed to the letter, but the connection to the data source is missing. It is only by extending trust that we can create a community in which we can work together to improve the reliability of information.

When we choose to confront reality, protecting the accuracy of our information is no longer our highest goal. We must not regard accuracy as being primarily something we have that we can lose, but as something we don't have but can gain. Only then will we learn to see our models as being determined by the administrative reality which they model, and be inclined to confront that reality at every opportunity.


Clarify Expectations

People trust information more readily when they know what it represents, so it makes sense to ensure that they have appropriate expectations of the information with which we supply them. Ideally, when we present people with information, we should make the meta-information (that is, the information as to the meaning of the information) available with it. With previous generations of information technology this was a real problem, because information and the knowledge as to what the information represented were separated more rigidly than racial groups during the heyday of apartheid. The information was stored in records and the knowledge of what it represents was built into computer programmes. Fortunately, we can go beyond the limitations of these technologies by using service-oriented architectures and, in particular, XML (the eXtensible Markup Language), which revolves around keeping information and meta-information together.
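
As a minimal illustration of information travelling together with its meta-information, consider the fragment below, parsed with the standard Java XML API. The element names, attributes and namespace are invented purely for the example; the point is that each value arrives accompanied by the name that says what it means.

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;

    // Sketch: in XML, every value travels with the name (and, via attributes,
    // further meaning) that says what it represents. The vocabulary below is
    // invented for illustration.
    public class SelfDescribingMessage {
        public static void main(String[] args) throws Exception {
            String xml =
                "<person xmlns='urn:example:citizens'>" +
                "  <familyName>Jansen</familyName>" +
                "  <dateOfBirth format='ISO-8601'>1975-04-30</dateOfBirth>" +
                "  <residentialAddress validFrom='2008-01-01'>Prinsengracht 1, Amsterdam</residentialAddress>" +
                "</person>";

            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

            NodeList children = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                if (children.item(i) instanceof Element field) {
                    // The meta-information (element name, attributes) is available to
                    // the recipient together with the information itself.
                    System.out.println(field.getTagName() + " = " + field.getTextContent());
                }
            }
        }
    }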

The information security consultancy industry does not appear to have caught up with this development. Clarifying expectations is not part of its agenda, and unclear expectations are not defined to be a problem. In part this is a terminology problem: integrity is concerned with the data as it is stored in the system, whereas trust is concerned with information as it is supplied to a person. Trust has a semantic and social context, integrity doesn't.

The social context of trust affects more than just our way of thinking; it also affects the scope of our thinking. The expectations of an information recipient are concerned with the information as such, not with the information systems required to produce it. But the whole approach of the information security consultancy industry focuses on these systems instead of on the information. It is standard practice to make an inventory of all an organisation's information systems, classify them according to their integrity requirements and then take measures to ensure that these requirements are met. It is not at all unusual to encounter information security experts who, in all seriousness, assess the integrity of individual information systems without ever considering the combined effects of these systems for the ultimate recipient.


Practice Accountability

Accountability is concerned with being responsible for what you do and being prepared and able to account to others for what you did and why you did it. It is vital to engendering trust, but only of peripheral interest to maintaining integrity.

Accountability is not the same thing as compliance, although they are related. Compliance is a term which is focussed on the auditors and authorities to whom it must be demonstrated. Accountability is broader: it is concerned with all stakeholders. It is of primary importance to be accountable to those who provide us with information. If we can satisfy them that we are using their information in accordance with the purposes for which they disclosed it to us, they are much more likely to trust us and our information.

Practicing accountability towards our business partners requires us to be able to demonstrate that we have done everything for them that we said we would do and that everything we told them was correct. Such a demonstration demands that neither we nor they can repudiate a message that the other side received, unless of course it really was sent by somebody else who was masquerading as the apparent sender. With integrity, non-repudiation is an extra, whereas with trust it is part of the package.
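
The mechanics behind such non-repudiation can be illustrated with the standard Java security API: a message signed with the sender's private key can later be shown to any third party, and the sender cannot plausibly deny having sent it unless the key was compromised. This is only a sketch - key distribution, certificates and the surrounding WS-Security plumbing are deliberately left out.

    import java.nio.charset.StandardCharsets;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.Signature;

    // Sketch of non-repudiation: the sender signs each outgoing message and both
    // sides archive the message together with the signature, so that it can be
    // verified by anyone at any later time.
    public class SignedMessage {
        public static void main(String[] args) throws Exception {
            KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
            gen.initialize(2048);
            KeyPair senderKeys = gen.generateKeyPair();

            byte[] message = "Order 4711: 100 units at EUR 12.50, delivery 2010-09-01"
                    .getBytes(StandardCharsets.UTF_8);

            // Sender signs the outgoing message and archives message + signature.
            Signature signer = Signature.getInstance("SHA256withRSA");
            signer.initSign(senderKeys.getPrivate());
            signer.update(message);
            byte[] signature = signer.sign();

            // The receiver, or an auditor years later, verifies the archived message.
            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(senderKeys.getPublic());
            verifier.update(message);
            System.out.println("Signature valid: " + verifier.verify(signature));
        }
    }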

Practicing accountability is in the first instance a matter of business, which should be implemented by recordkeeping at business level. A plethora of technical log files is no substitute for a record management system in which all incoming and outgoing messages are archived. Such messages can, by their nature, never be changed, although they may be rescinded using subsequent messages.
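
What such business-level recordkeeping amounts to can be sketched in a few lines: messages are only ever appended to the archive, never updated or deleted, and a correction is itself a new message that refers to the one it rescinds. All names below are illustrative.

    import java.time.Instant;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    // Sketch of business-level recordkeeping: incoming and outgoing messages are
    // appended to the archive and never changed; a correction is a later message
    // that rescinds an earlier one.
    public class MessageArchive {

        record ArchivedMessage(String id, Instant at, String direction,
                               String counterparty, String content,
                               String rescinds /* id of the message it rescinds, or null */) {}

        private final List<ArchivedMessage> archive = new ArrayList<>();

        void append(ArchivedMessage m) {
            archive.add(m);                     // append-only: no update, no delete
        }

        List<ArchivedMessage> all() {
            return Collections.unmodifiableList(archive);
        }

        public static void main(String[] args) {
            MessageArchive a = new MessageArchive();
            a.append(new ArchivedMessage("msg-1", Instant.now(), "IN", "customer-42",
                    "Please deliver 10 units to address X", null));
            a.append(new ArchivedMessage("msg-2", Instant.now(), "IN", "customer-42",
                    "Withdraw my previous order", "msg-1"));
            a.all().forEach(System.out::println);
        }
    }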


Keep Commitments

Nothing destroys trust more quickly than commitments which are not kept. Our information security measures should help us keep track of all the promises we have made, particularly those which we made to customers. They should help us ensure that we deliver what we promised to deliver, within the timeframes that we have led the customer to expect. And if, due to reasons beyond our control, we are unable to keep these commitments, we should at least inform the customer and compensate him.

The information security consultancy industry approach of considering each information system in isolation is entirely alien to customer oriented commitment management, as outlined above. It is quite possible for each information system to manage its commitments perfectly whilst the commitments to the customer are forgotten. The information security consultancy industry does not insist or even indicate as best practice that commitments to customers should be managed across all systems using, for example, a business process management system. More fundamentally, it does not provide the concepts with which the problem can be named. What does a commitment to a customer consist of? - nothing in the ISO 27000 family of standards, CRAMM or COBIT will tell you.

What's foreign to the systems-based thinking of the information security consultancy industry sits naturally with Service-oriented Architecture (SOA). In the SOA-paradigm, a business commitment is a transaction between a customer and a vendor, and the process by means of which it is implemented is a service. To understand how well SOA supports the logical consistency requirements, it is important to understand what a business transaction entails. A business transaction consists of the following elements:


  1. The vendor provides the customer with information on which the customer bases his purchase decision. Typically, the information is concerned with the properties of the goods and services on offer, the terms under which they may be acquired - including, of course, the prices - and their availability. In legal terms, the vendor represents this information to be true.
  2. The customer places a purchase order with the vendor, based on the representations made by the vendor.
  3. The vendor verifies that the information on which the purchase decision was based is still applicable, and if so, commits to the order. If there has been a change - perhaps because the goods or services are no longer available, perhaps because the prices have increased, perhaps because the specifications of the goods or services have changed - then some processing may be required in order to decide what to do: commit to the order anyway, negotiate amendments to the order, or perhaps just cancel it.
  4. The vendor and the customer each perform their side of the purchase agreement.

SOA maintains the logical consistency of such a business transaction in a number of ways, none of which is standard information security consultancy fare.

Firstly, all communication which implies a change to a previous situation can be done using a protocol that guarantees secure message delivery. Whenever a commitment is made by either the vendor or the customer, the act of making the commitment involves such a change. When both sides practice accountability as outlined earlier in this article, it is not possible for the customer and the vendor to have different perceptions as to the current state of the transaction.
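
The sketch below illustrates what such guaranteed delivery boils down to: the sender retries until it receives an acknowledgement, and the receiver recognises duplicate message identifiers so that retries are not processed twice. In practice this is provided by messaging middleware, for example a message queue or a reliable-messaging protocol; the classes here are illustrative only.

    import java.util.HashSet;
    import java.util.Set;

    // Sketch of guaranteed ("at-least-once") message delivery: the sender resends
    // until acknowledged, and the receiver ignores duplicates by message id, so
    // both sides end up with the same view of what was communicated.
    public class ReliableDelivery {

        static class Receiver {
            private final Set<String> processed = new HashSet<>();

            // Returns an acknowledgement; duplicates are accepted but not re-processed.
            boolean receive(String messageId, String body) {
                if (processed.add(messageId)) {
                    System.out.println("Processing " + messageId + ": " + body);
                }
                return true; // acknowledgement
            }
        }

        public static void main(String[] args) {
            Receiver receiver = new Receiver();
            String id = "commitment-2010-08-0042";
            boolean acked = false;
            int attempts = 0;
            while (!acked && attempts < 5) {     // retry until acknowledged
                attempts++;
                acked = receiver.receive(id, "We confirm delivery on 2010-09-01");
            }
            System.out.println("Acknowledged after " + attempts + " attempt(s)");
        }
    }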

Secondly, the logical consistency of databases and the record of process progress in the process management system can be maintained using a 2-phase commit protocol. Logical consistency across multiple databases is maintained by a sequence of such 2-phase commits. First database A is synchronised with the process management system, then this system synchronises with database B, and so on.

Thirdly, any changes in the vendor's reality between the time that representations were made to the customer and the placing of the order are dealt with using an optimistic concurrency control mechanism. This way of working is natural to SOA: the process of determining whether there has been a relevant change can be completely automated. And because SOA makes it feasible to access data at the moment it is used to make a decision, it is only rarely that the optimism is unfounded and manual intervention is necessary.
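
The essence of the optimistic check can be sketched as follows: the representations made to the customer carry a version number, and at commit time the order is accepted only if the underlying catalogue entry still has that version; otherwise the difference is handled explicitly. The 2-phase commit itself would normally be handled by the transaction manager and is not shown, and all names are illustrative.

    import java.util.concurrent.atomic.AtomicLong;

    // Sketch of optimistic concurrency control: the offer sent to the customer
    // records the version of the catalogue entry it was based on; the order is
    // committed only if that version is unchanged.
    public class OptimisticOrderCommit {

        static class CatalogueEntry {
            final AtomicLong version = new AtomicLong(1);
            volatile double price = 12.50;
        }

        record Offer(long versionSeen, double priceQuoted) {}

        static String commitOrder(CatalogueEntry entry, Offer offer) {
            if (entry.version.get() == offer.versionSeen()) {
                return "Order committed at EUR " + offer.priceQuoted();
            }
            // Reality changed since the representation was made: do not fail
            // silently; decide what to do with the customer's expectations.
            return "Offer no longer valid (price is now EUR " + entry.price + "), renegotiate";
        }

        public static void main(String[] args) {
            CatalogueEntry entry = new CatalogueEntry();
            Offer offer = new Offer(entry.version.get(), entry.price);

            // A price change occurs between the offer and the order.
            entry.price = 13.75;
            entry.version.incrementAndGet();

            System.out.println(commitOrder(entry, offer));
        }
    }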

Lastly, an abort of the business transaction itself - for example because the customer withdraws the order, turns out not to be able to pay, or dies - can be handled relatively easily using SOA. Because the documents which have been used or produced in the context of the business transaction are clearly identifiable, it is possible to determine which countervailing messages are required in order to correct the perceptions of customers and vendors. Because it is clear which database updates are directly related to the transaction, it is clear which changes in databases need to be rolled back using compensating transactions.
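
A minimal sketch of the compensation mechanism: every step performed on behalf of the business transaction registers the action that undoes it, and on abort the compensations are run in reverse order. The names are illustrative.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch of compensating transactions: each step registers its own undo
    // action; an abort replays the undo actions in reverse order.
    public class CompensatingTransaction {

        private final Deque<Runnable> compensations = new ArrayDeque<>();

        void perform(String step, Runnable compensation) {
            System.out.println("Performed: " + step);
            compensations.push(compensation);    // remember how to undo it
        }

        void abort() {
            System.out.println("Aborting: running compensations in reverse order");
            while (!compensations.isEmpty()) {
                compensations.pop().run();
            }
        }

        public static void main(String[] args) {
            CompensatingTransaction tx = new CompensatingTransaction();
            tx.perform("Reserve 10 units of stock",
                    () -> System.out.println("Release reserved stock"));
            tx.perform("Send order confirmation to customer",
                    () -> System.out.println("Send countervailing cancellation message"));

            // The customer withdraws the order before delivery.
            tx.abort();
        }
    }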

Note that the scope of a business transaction is limited to that which is done directly in order to handle a service request. SOA provides rules to distinguish between the things that are part of the business transaction and the things that are not. In SOA, the rule is to orchestrate your own processes and provide event notification to others. If the service is aborted, it may be necessary to notify them of the change. What they do with this information is entirely up to them.


Conclusion

The second part of this article will be published in the September 2010 issue of The SOA Magazine. It sets out what showing respect and delivering utility entail, and then discusses dysfunctional tendencies within the information security consultancy industry and how to correct for them.


References

[REF-1] Bell, David Elliott and La Padula, Leonard J.: "Secure Computer Systems: Mathematical Foundations", MITRE Corporation, 1973
[REF-2] Biba, K. J.: "Integrity Considerations for Secure Computer Systems", MTR-3153, The MITRE Corporation, April 1977
[REF-3] COBIT is available at http://www.isaca.org
[REF-4] CRAMM (CCTA Risk Analysis and Management Method): see www.cramm.com
[REF-5] ISO/IEC 27000 family of standards: for an overview see standards.iso.org/ittf/PubliclyAvailableStandards/c041933_ISO_IEC_27000_2009.zip
[REF-6] The Standard of Good Practice for Information Security is available at https://www.isfsecuritystandard.com/SOGP07/index.htm
[REF-7] Wierenga, Hans (2010): 10 SOA Commandments, available at http://www.infoq.com/articles/10-soa-commandments