The levels of expenditure and the time spent on compliance issues continue to grow, and the associated punishments for noncompliance are increasingly draconian. Despite this, many organizations seem to approach security and privacy as though they are patching their favorite pair of jeans. You notice a hole, you patch it. You see a rip, you sew it up.
Many organizations still "design" their security policy by reacting to auditor noncompliance and the latest high-profile fraud reported on the Web. This approach will only result in an infrastructure that is stretched at the seams, and one day your dirty laundry will be on display to the whole world.
This article will explain basic principles and help you evaluate what your organization currently needs. Most organizations will find that a proactive, framework-based approach will create fewer holes in the long run.
The Security Functions of the OS
While writing this article, I could not help but recall a couple of questions asked recently on some of the online forums. The two very similar questions invited the assembled System i gurus to help the posters identify who had viewed a particular record in a file and when a record was last changed. On reading this, I imagined security specialists and in-house IT security administrators thinking, "You'd know if you had already activated the audit journal, or object auditing, or trigger processing...."
But clearly they hadn't. If they had, they would not have needed to ask. Operating systems like i5/OS have powerful security functionality, but you have to choose to use those tools before the need arises. A reactive approach to a security incident or an auditor request rarely works because it is unlikely that the analysis information you need will be there.
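To make the point concrete, here is a minimal, platform-neutral sketch (in Python rather than i5/OS commands, and with invented field names) of the kind of change-audit record that object auditing or a trigger program needs to be capturing before anyone ever asks the question.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ChangeAuditEntry:
    """One row of a change-audit trail: who touched what, when, and how."""
    user: str          # profile that made the change
    obj: str           # file or table name
    record_key: str    # key of the record that was read or changed
    action: str        # "READ", "UPDATE", "DELETE", ...
    interface: str     # "5250", "ODBC", "FTP", ...
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The trail can only answer "who changed this record?" if entries were being
# written *before* the incident -- switching auditing on afterward is too late.
audit_trail: List[ChangeAuditEntry] = []

def record_change(user: str, obj: str, key: str, action: str, interface: str) -> None:
    audit_trail.append(ChangeAuditEntry(user, obj, key, action, interface))
```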
When I first started as an IT manager on a System/38, life was a lot simpler—no real worries about external connectivity, mostly 5250 terminals, and users who were much less computer-aware. So even though I had a written security policy, it was pretty basic, covering the expected things such as system values, menu structures, user roles, and authorities.
Those Darn Auditors
Not only was the system environment simpler, but the requirements of the auditors were minimal. Often, they had limited knowledge of the midrange space and could easily be persuaded that everything was OK and there was no need to look too deeply.
Take a look around most IT departments these days, and you will see how much has changed; there is in-house knowledge of regulations such as HIPAA and Sarbanes-Oxley. Many IT departments have even taken the drastic step of giving someone a job title with "security" or "privacy" in it. Yet they still handle security and privacy issues the same way they always have. However, the current crop of auditors has teeth; they have specific requirements and expectations, and if you ignore them, beware.
To satisfy the auditing requirements imposed on organizations, it is not sufficient to do just the basics. There are standards and recommendations for settings such as system values and profile attributes, so those should not be a challenge. However, many organizations now need a thoroughly proactive approach to security and privacy.
For many, this will be a new way of looking at security and privacy. It may not be popular, but this will involve going back to simple security principles and then applying them to the nth degree. This is not a quick-and-dirty solution, but it will prepare the organization for the future. Put simply, it will involve identifying the information assets, the authorized users of each of those assets, the different ways of accessing them, and the level of auditing required of those accesses.
Asset Owners
As with any major framework project like this, one of the major challenges is in knowing the right starting point. My advice is to start with the information assets, but at the same time, a special category of people needs to be incorporated into the framework: the "owners" of the assets. The importance of the asset owners must not be downplayed. Someone must analyze the data—at a file level and also in some cases at a field level—as a first step in designing the framework. The data assets must be important to these data owners; they should want to protect them with their lives.
The asset owners must analyze the data and categorize the information held there. Some of it will be internal and fairly low-risk (think of items such as department numbers and the accounting calendar). These items are important to the running of the organization but would have minimal impact if exposed to the outside world. Another category is the sensitive internal data, such as prices, quotes, and the organization's financial data. These items should at all costs be retained in-house. A third category is information about other organizations or individuals: names, addresses, bank details, credit card numbers, personal information, historical data on buying habits, etc. This group of data elements will likely be covered by industry or government privacy restrictions. The categorization of the data assets can go down to many more levels than this, but I hope you get the idea.
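As a rough illustration of what the owners' analysis might produce, the sketch below uses Python and invented field names to tag individual fields with one of the three broad categories described above; a real framework would go into far more depth.

```python
from enum import Enum

class Sensitivity(Enum):
    INTERNAL_LOW_RISK = 1      # e.g., department numbers, accounting calendar
    INTERNAL_SENSITIVE = 2     # e.g., prices, quotes, financial data
    THIRD_PARTY_PERSONAL = 3   # e.g., names, bank details, card numbers

# Hypothetical field-level classification produced by the asset owners
field_classification = {
    "DEPT_NO":       Sensitivity.INTERNAL_LOW_RISK,
    "ACCT_CALENDAR": Sensitivity.INTERNAL_LOW_RISK,
    "QUOTE_PRICE":   Sensitivity.INTERNAL_SENSITIVE,
    "CUST_NAME":     Sensitivity.THIRD_PARTY_PERSONAL,
    "CARD_NUMBER":   Sensitivity.THIRD_PARTY_PERSONAL,
}

# Fields in the third category are the ones most likely to fall under
# industry or government privacy regulation.
regulated = [f for f, s in field_classification.items()
             if s is Sensitivity.THIRD_PARTY_PERSONAL]
```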
This ties in very neatly with a discussion that has been appearing more and more in the System i space. In the past, many System i sites have not seen a need for database administrators (DBAs), even though they are often an essential part of other environments. In those environments, DBAs perform a variety of tasks relating to their databases, such as problem-solving, monitoring and tuning, data analysis, integration, etc. In the System i space, we have been fortunate that the OS has always made handling databases so easy that looking after them was never a full-time job. However, a well-designed database is becoming increasingly critical as we share information with nontraditional user interfaces. So a DBA position could be useful in our systems—not just for advanced aspects such as information lifecycle management and aligning technology to support business goals, but also to help define the security and privacy elements of our information assets.
Users' Roles
Once the data assets have been defined, the organization can then move on to the broad range of users who need to access the environment. One of the most popular ways to attack this task is to group the users together using a role-based access control (RBAC) methodology. The use of RBAC could help many organizations, yet it is often ignored because it is misunderstood. Some see it as inappropriate for their organization or too difficult to incorporate. However, I believe that RBAC will actually simplify the definition of a security and privacy framework.
Some believe that RBAC tries to force-fit a perfect organizational structure into an imperfect environment. In reality, it is about defining the actual "types" of users who could have access. This does not mean that just because Sally and Bill have the same job title (e.g., AP Department Supervisor) they have the same access needs. Far from it. It is not unusual for users' access needs to drift away from their job titles because of other factors, such as seniority, favoritism, deputizing for an absent superior, etc. So there is nothing wrong with defining two roles, AP Department Supervisor 1 and AP Department Supervisor 2, even if there is only one person in each role. It is important, though, that the organization recognize that these two roles are different and understand which to use if a new employee takes the title AP Department Supervisor. When RBAC is used correctly, it is just a structured way to document reality.
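Here is a minimal sketch of that idea, with invented role and permission names: two roles can share a job title yet carry different entitlements, and the framework simply records that reality.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Role:
    name: str
    job_title: str
    permissions: frozenset

ap_supervisor_1 = Role(
    name="AP_SUPERVISOR_1",
    job_title="AP Department Supervisor",
    permissions=frozenset({"AP_INVOICE_READ", "AP_INVOICE_UPDATE"}),
)

ap_supervisor_2 = Role(
    name="AP_SUPERVISOR_2",
    job_title="AP Department Supervisor",
    permissions=frozenset({"AP_INVOICE_READ", "AP_INVOICE_UPDATE",
                           "AP_PAYMENT_APPROVE"}),  # deputizes for an absent superior
)

# A new hire with the same job title: the organization must decide which role
# applies, rather than cloning whatever Sally or Bill happens to have today.
user_roles = {"SALLY": ap_supervisor_1, "BILL": ap_supervisor_2}
```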
Of course, an organization could choose to document the users' needs person by person, but using roles will actually speed up the process. There are many users to include in this list of roles: internal users, external users, partners, ex-employees, investors, even the auditors themselves. It is important to add the auditors to the list of authorized users. They have a duty to check out parts of the system, and the security policy should ensure that they have the correct read-only access. In addition, if we refer back to the forum questions mentioned at the start of this article, auditors may insist on access to audit trail information for certain elements of the database.
For example, one of my clients in the pharmaceutical industry was asked by the auditor to retain and archive any audit trails that would prove, beyond a shadow of a doubt, who was the last person to change "batch A" or "ingredient formula B." And this data had to be available for different time periods; some data needed to be retained for up to five years, and some needed to be retained permanently! Now, that may have been a bad interpretation of an FDA ruling, but it created quite a stir.
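Purely as an illustration of how such retention requirements might be documented, the sketch below assumes two made-up audit-trail data elements with a five-year and a permanent retention period; the names and periods are not my client's actual rules.

```python
from datetime import timedelta

PERMANENT = None  # no expiry

# Illustrative retention rules for archived audit trails
retention_policy = {
    "BATCH_CHANGE_AUDIT":   timedelta(days=5 * 365),  # keep five years
    "FORMULA_CHANGE_AUDIT": PERMANENT,                # keep forever
}

def may_purge(data_element: str, age: timedelta) -> bool:
    """Return True only if a documented retention period exists and has elapsed."""
    limit = retention_policy.get(data_element)
    return limit is not None and age > limit
```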
Users' Data Needs
After the users have been documented, the security framework needs to define what those users' data needs are. The simplest way to look at this is using terms such as "read," "write," "update," etc. However, it is likely that the framework will be defined in much more detail than this. For instance, "update" might need to be split into different levels because one role may be permitted to change a data field but only within specific boundaries—for example, perhaps the changed value cannot be more than one million.
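To show what splitting "update" into levels could look like, here is a hedged sketch in which an update right carries an upper bound; the field name and the one-million limit are illustrative only.

```python
from typing import Optional

class BoundedUpdate:
    """An 'update' right that is only valid within a numeric boundary."""
    def __init__(self, field_name: str, max_value: Optional[float] = None):
        self.field_name = field_name
        self.max_value = max_value

    def allows(self, new_value: float) -> bool:
        return self.max_value is None or new_value <= self.max_value

# One role may change the credit limit, but never above one million.
credit_limit_update = BoundedUpdate("CREDIT_LIMIT", max_value=1_000_000)

assert credit_limit_update.allows(250_000)        # permitted
assert not credit_limit_update.allows(2_000_000)  # rejected by the framework
```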
Another aspect of the data asset definition is that some of the data may need to be stored or transmitted in an encrypted form. This should be defined and documented, even if the organization or system does not have the capability at this point to achieve this.
Finally, we need to consider how the user accesses the data. The recent upsurge in the use of System i exit point programs reflects the fact that a user who is authorized to change a file using a native program is not necessarily allowed to change the same file using an ODBC utility. Similarly, if a user is in a hotel room with VPN access to your system, do you want him to see exactly the same data he can see when he is at his desk? Are you concerned about shoulder-surfing?
When all of these definitions are complete, you can link the types of users to the types of data assets, and you have defined what the users can do with that data, using what type of user interface. At this point, the security framework will contain sets of associations that read like something from a game of Clue: We allow Colonel Mustard to view the payroll file using Microsoft Access.
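One way to express those Clue-style associations is as a simple rule table keyed by role, data asset, operation, and interface; the sketch below uses invented names and denies anything not explicitly listed.

```python
# (role, data asset, operation, interface) -> allowed?
# A missing entry means the access is denied by default.
policy = {
    ("COL_MUSTARD", "PAYROLL_FILE", "READ", "MS_ACCESS"): True,
    ("COL_MUSTARD", "PAYROLL_FILE", "UPDATE", "MS_ACCESS"): False,
    ("AP_SUPERVISOR_1", "AP_INVOICES", "UPDATE", "NATIVE_5250"): True,
    ("AP_SUPERVISOR_1", "AP_INVOICES", "UPDATE", "ODBC"): False,
}

def is_allowed(role: str, asset: str, operation: str, interface: str) -> bool:
    return policy.get((role, asset, operation, interface), False)

print(is_allowed("COL_MUSTARD", "PAYROLL_FILE", "READ", "MS_ACCESS"))  # True
print(is_allowed("COL_MUSTARD", "PAYROLL_FILE", "UPDATE", "ODBC"))     # False
```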
Building the Framework
During the definition steps, it is crucial that the organization ignore any known limitations to their systems, infrastructure, and applications. The framework must be built on the principle that it will encompass the best theoretical security for the organization. In reality, you may not be able to apply the entire framework to your current environment for a number of reasons:
- The vendor package you are using will not allow you to secure the system in the way you want.
- There is not enough capacity on the system to store a huge increase in audit data.
- The operating system does not allow you to restrict a user to one process but not another.
- No "at rest" encryption is available on your hardware.
- You have not yet found a security tool that will do what you need.
There are advantages to defining what you want in your ultimate configuration because things could change over time:
- You can choose your next vendor package so that it does not prevent you from achieving your security goals.
- You can justify upgrading to a bigger, more powerful system.
- Your OS provider may enhance the capabilities.
- New technologies might be added to your hardware.
- You might find a tool to get you nearer to your goal.
One of the things that auditors love to highlight is where their client has missed something critical that the client did not even know about. However, a full and honest framework that defines the organization's ideal security configuration will be applauded. You should know the limitations of your system, document how they need to change, acknowledge that for now they must be worked around, and finally describe what needs to happen, and when, before you can move nearer to that ideal.
Taking Privacy Seriously
Earlier in the article, I mentioned that a number of data assets may have been categorized as covered by privacy regulations. This raises an extra set of implications that we have not necessarily had to consider so far when building the framework for security compliance. Most companies have already realized that they should implement a sound privacy policy, but unfortunately not all will implement that policy before their name appears in the press. The online newsletters delight in another data loss here, another missing laptop there. It makes great press.
Surveys have tried to quantify the cost of not taking privacy seriously. One study group found that 83% of its respondents said they would stop doing business entirely with any company that misused their personal information. Therefore, it is critical that the framework cover privacy elements too.
There are a number of excellent resources for understanding these issues, one of which is an organization representing the people who may be checking compliance on the systems. The American Institute of Certified Public Accountants (AICPA) offers excellent guidelines on defining good privacy and security practices for personal information.
The AICPA's members consider certain elements to be critical in defining a privacy policy:
- Management—Identifies who ultimately takes responsibility for the policies and procedures
- Notice—Defines how the various parties are advised of that policy
- Collection and Use—Ensures that personal information (PI) is only collected and used for an agreed-upon purpose
- Access—Defines how individuals can see their own PI
- Disclosure—Defines sharing PI with third parties
- Security—Covers an acceptable security framework
- Enforcement—Ensures compliance with the policies and procedures on an ongoing basis
The privacy arena is extremely complex, with different regulations for each industry and different approaches from country to country. There is a global trend toward protecting privacy as a fundamental human right. However, some societies are more advanced than others in enforcing those rights. Using an advisor who knows the privacy landscape can help ensure that an organization's approach is appropriate for its market.
Be Proactive, Not Reactive
Despite the fact that this article has used examples from the System i arena, this framework needs to be extended enterprise-wide and across multiple platforms. It should also encompass any test and development environments. Many organizations use copies of live files as test and QA data for new applications; therefore, the security and privacy of data held on those systems must be included in the framework definitions too.
Management now needs to move away from thinking that the appropriate response to a new regulation or a known exposure is simply to close another hole. By defining a management framework to cover security and privacy policy, an organization is more likely to be protected already or to have the information necessary to remain in compliance. New exposures will appear, but good proactive preparation will normally ensure that the impact is lessened and that a detailed set of forensic information is available to analyze any impact.
Martin Norman is Senior Systems Engineer for SafeStone Technologies, an IBM BP specializing in compliance and identity management. As one of the original developers of SafeStone's security portfolio, Martin has performed security audits and advised on installations for clients throughout the United States and Europe. Martin can be contacted at