Knowing theoretically how secure systems should be put together is just a first step towards actually securing anything.
In order to do information security properly, it is first and foremost important to understand what it is we are trying to protect. One of the most widely used models is the CIA triad. This sounds a lot more ominous than it is: the letters C, I and A stand for Confidentiality, Integrity and Availability, the three critical aspects of information that need to be balanced in order for non-public information to have value for an organisation.

Firstly, the information needs to remain confidential. This may be because it’s a strategic trade secret, private data about an organisation’s client (say, their bank balance), or a military secret such as real-time troop deployment information. But confidential information is only valuable if the people using it to make decisions can trust its validity. This is where Integrity comes in: the information needs to be complete and correct.

Both of these are easy enough to achieve. Simply store the information with the best encryption available, then switch off the computer, unplug it from the network and bury it somewhere. No-one will be able to copy or alter the data, so its confidentiality and integrity are guaranteed. The problem, of course, is that if no-one can access it, it might as well not be there at all. The third aspect, Availability, brings all kinds of headaches: for the information to have value, it must be available to the people who need it, when they need it. The three aspects are in permanent conflict with each other, and good IT security means making the right trade-offs depending on the type of information and the way it is utilised.
This classic and somewhat static view has been augmented by the work of security guru Bruce Schneier, who has stated that information security is about protecting information as well as is reasonably possible, detecting breaches in that protection, and ensuring a timely and adequate response to any detected breach. This additional view allows for a much more flexible approach: securing data ‘well enough’ at a certain cost, while taking calculated risks that are considered acceptable in terms of the cost trade-off.
Combining both models gives a good framework for implementing technology to safeguard information as required, while keeping it available to those who need it.
Awareness and training
Having the best technology does not make anything secure if the people using it do not know how to use it properly or are not motivated to follow the necessary procedures. Awareness among the users of any system of the importance of security is the foundation of any secure environment. Without it, all other efforts are useless. The most expensive network security equipment on the market can be defeated by passwords being shared (loudly!) with entire departments, including people not employed by the organisation in question. If people do not know or care about the importance of procedures, no technology can save you.
Beyond being motivated to do the right thing, people need to know what the right thing is. Often, end-user training is bought on the cheap because the budget has been spent on state-of-the-art technology. In my opinion, at least half of any security budget should be spent on awareness creation and training of the end users of any system. It’s the combination of good technology and people who are empowered, able and motivated that leads to secure environments. The image on the left shows how privacy and security result from the CIA triad being supported by good behaviour and well-implemented, well-maintained technology (so even the technology side is at least partly about people, since keeping the tech working properly depends on skilled and motivated people too).
Audits and what to do with them
When all this technology, these procedures and this knowledge are in place, there need to be regular checks to see whether they are being used properly. In other words: audits. Some of these may be automated (such as checking that users have sufficiently strong passwords and change them often enough), while other aspects may have to be checked by internal or externally hired human auditors. A good way of exposing weaknesses in the total combination of technology, procedures and people is often to let outsiders test the defences by trying to defeat them. This is referred to as penetration testing, and you can hire specialised companies to do it for you. These types of audits are meant to be educational, so organisations should only do them if they are willing and able to spend the time to learn from them. Otherwise they are a waste of time (or a very expensive form of entertainment).
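To make the idea of an automated audit concrete, here is a minimal sketch in Python of the kind of check described above: flagging accounts whose passwords are too short, too simple, or too old. The policy thresholds, the record layout and the field names are all assumptions for illustration; a real audit would read from a directory service or identity-management system and would never store passwords themselves, only metadata about them.

```python
from datetime import date, timedelta

# Hypothetical audit records: password metadata only, never the
# passwords themselves. Field names are made up for this sketch.
USERS = [
    {"name": "alice", "length": 16, "mixed_classes": True,
     "last_changed": date(2024, 1, 10)},
    {"name": "bob", "length": 6, "mixed_classes": False,
     "last_changed": date(2022, 5, 1)},
]

MIN_LENGTH = 12                 # assumed policy: at least 12 characters
MAX_AGE = timedelta(days=180)   # assumed policy: rotate twice a year


def audit(users, today):
    """Return (user, reason) findings for the human auditors to review."""
    findings = []
    for u in users:
        if u["length"] < MIN_LENGTH:
            findings.append((u["name"], "password too short"))
        if not u["mixed_classes"]:
            findings.append((u["name"], "password lacks character variety"))
        if today - u["last_changed"] > MAX_AGE:
            findings.append((u["name"], "password not changed recently"))
    return findings


for name, reason in audit(USERS, date(2024, 6, 1)):
    print(f"{name}: {reason}")
```

The point of such a script is not to punish users automatically but to produce a finding list that feeds into the educational follow-up the text describes.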
All this does not mean there should be no consequences to failing audits. If the impact of breaking rules is severe, there should be severe consequences. This needs to be clearly communicated to all stakeholders and be made a part of terms-of-employment and other relevant legal documents. It’s no good telling employees afterwards that they should have paid more attention to this procedure or that – tell them upfront.
Some leadership required
Our litmus test when talking to any organisation about security is always: do the rules apply to everyone, or only to the people making up to twice the minimum wage? In many places, basic protocol is violated by highly paid (and scarce) professionals without any corrective action from senior management, even when the possible consequences of this behaviour are known. The logic appears to be that it’s impossible to herd cats anyway, and that those individuals are crucial to the primary process of the organisation, so they are allowed to get away with it. This encourages everyone else to start circumventing the rules as well; soon things start falling apart, and there might as well be no rules at all.
So security guidelines need to be implemented from the top down and those at the top need to lead by example. Leading by example is generally a good approach if you need to motivate people to put up with a little inconvenience to achieve an abstract goal.