Last month's article on architecture focused on creating an application architecture for compliance. This month, we'll take a look at the rest of the organization.
Security Policy
The response to your organization's compliance requirements is found in your organization's security policy. The policy documents the organization's approval and assertion of its security posture. Without a policy, there is no way to know the stance the organization has taken on various aspects of security configuration. The policy is also a legally binding document. Because of this, I highly recommend that your organization's security policy be reviewed by legal counsel.
There's no exact science to what a security policy should say or the areas it should cover. Some that I've seen are very extensive; some cover just the basics. However, most laws and regulations have some common requirements for an organization's security policy. Let's take a look at the security policy requirements that the Payment Card Industry (PCI) documents in its Data Security Standard as an example. To be in compliance with the PCI requirements, your organization's security policy must address the following:
- Ways in which the organization is complying with all aspects of the Data Security Standard
- Annual security assessments
- Regular monitoring of the organization's systems and network security configuration for compliance
- Appropriate use of data and the organization's other resources, such as computer equipment
- Assignment of roles—who (the position) is responsible for each section of the policy, the policy itself, and policy enforcement
- A formal employee security awareness training program, including education on the policy itself
- A security incident-response plan that is invoked if a breach occurs
You can find numerous free or low-cost examples of specific sections of a security policy on the Web, including free examples from the SANS Institute, which can be downloaded and modified for your use.
System Settings and Processes
It's obvious how some requirements of the laws and regulations apply to your organization. However, other statements need to be interpreted before you know what they mean for configuring operating systems, including i5/OS. For example, the PCI requirement "Maintain a security policy that addresses information security" doesn't require much interpretation; however, the requirement "Implement strong access control measures" requires interpretation to determine the exact settings that need to be implemented on each operating system.
The key is to understand the intent of the requirement. Then, it's usually not difficult to determine the actual operating system settings that must be implemented. For example, the intent of PCI is to allow access to credit card data only to those individuals whose job responsibility requires it. All others are to be denied. In i5/OS terms, this means that the *PUBLIC authority of all files containing credit card information must be set to *EXCLUDE. And if users access this data through an application, only those users with a specific job responsibility should be given the menu options that allow access to these files. This requirement also implies that the number of users with *ALLOBJ special authority must be limited. (See last month's article, "Building an Architecture for Regulatory Compliance," for a discussion of the requirements for and implementation of appropriate access controls on files.)
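As a rough illustration (the library and file names here are hypothetical), the *PUBLIC authority of a file containing credit card data can be set and then verified with commands like these:

GRTOBJAUT OBJ(PAYLIB/CARDDATA) OBJTYPE(*FILE) USER(*PUBLIC) AUT(*EXCLUDE)
DSPOBJAUT OBJ(PAYLIB/CARDDATA) OBJTYPE(*FILE)

The Edit Object Authority (EDTOBJAUT) command provides an interactive alternative for reviewing and changing the same settings.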
Another intent of laws such as SOX or regulations such as the Data Security Standard is data integrity. Again, this requires a bit of interpretation, but in i5/OS terms, it means that QSECURITY needs to be set to level 40 or 50 and that QCRTAUT needs to be set to *EXCLUDE or *USE. A note of caution: If these system values are not currently set to these values on your organization's systems, do not change them without first evaluating and testing the effect of the change.
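For reference, here's one way to check and change those system values; remember the caution above, and note that a change to QSECURITY does not take effect until the next IPL:

DSPSYSVAL SYSVAL(QSECURITY)
CHGSYSVAL SYSVAL(QSECURITY) VALUE('40')
CHGSYSVAL SYSVAL(QCRTAUT) VALUE('*EXCLUDE')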
User Profiles
Many laws and regulations refer to user profiles as user accounts. Requirements for user accounts typically include these:
- Removal of profiles that have been inactive for 90 days
- User accounts for each individual (no sharing of accounts)
- No default passwords, even for vendor products or users servicing your system (this includes profiles created for IBM's use)
- Enablement of profiles used to service the system only at the time they are needed. Once again, this includes the profile created for IBM's use, QSRV. (I can't tell you how many times I've seen this profile have a password that never changes "because IBM needs to know what it is when they come in to service the system.") I've also seen vendors require that a profile with *ALLOBJ special authority be enabled at all times and have a default password so they can sign on to their clients' systems and apply application fixes. Both practices are in violation of PCI and possibly other regulations. (A sketch of disabling and re-enabling QSRV follows this list.)
- Profiles with just enough authority to perform the job function—no more. That is, users are given capabilities or access to data based on their job position's "need to know." In i5/OS terms, this means that users are not given any special authorities (especially not *ALLOBJ) unless their job requires the functionality provided by the special authority.
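As a minimal sketch of the service-profile point, QSRV can be left disabled and enabled only for the duration of a service call:

CHGUSRPRF USRPRF(QSRV) STATUS(*DISABLED)
CHGUSRPRF USRPRF(QSRV) STATUS(*ENABLED)

Run the second command only while IBM is actually servicing the system, then disable the profile again.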
Because users' capabilities and accesses are usually required to be reviewed at least annually, one recommendation to make this process easier is to implement role-based access. On i5/OS, that's most easily implemented through group profiles. Roles are usually defined within applications, but you can also define them at the system level, where there are typically fewer roles than in an application. Across my i5/OS clients, I usually see some set of the following roles: security officer, system administrator, operator, programmer, programming manager, database administrator, and analyst.

Once you define the roles in your organization, list the tasks each role is responsible for performing. For example, the operator role is responsible for backing up the system, responding to messages, and managing output queues; programmers write code and debug production issues. Once the tasks are defined, list the special authorities required to perform each task as well as any private authorities or authorities to authorization lists that are required. If you assign these authorities to the group profile that represents the role and then make each user a member of the appropriate group (as sketched below), you have implemented role-based access. To review access levels, managers simply have to review and approve the tasks associated with each role and the users associated with each role. This is far simpler than reviewing each individual's authorities.
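Here's a minimal sketch of that approach, using hypothetical profile names and assuming an operator role that needs *JOBCTL and *SAVSYS special authorities:

CRTUSRPRF USRPRF(OPERGRP) PASSWORD(*NONE) SPCAUT(*JOBCTL *SAVSYS) TEXT('Operator role group profile')
CHGUSRPRF USRPRF(JSMITH) GRPPRF(OPERGRP) SPCAUT(*NONE)

PASSWORD(*NONE) keeps anyone from signing on as the role itself, and SPCAUT(*NONE) on the member ensures the user's special authorities come from the group rather than the individual profile.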
Passwords
User account requirements don't stop with profile configuration; they continue with the password requirements. Again, while the policy states the high-level requirement, the implementation must be interpreted for each operating system. Here are the password requirements along with the i5/OS implementation; the corresponding system value commands are sketched after the list:
- No default passwords. Regularly run the Analyze Default Password (ANZDFTPWD) command to detect profiles whose password is still the default (the same as the profile name).
- No sharing of passwords
- Minimum length. Set the QPWDMINLEN system value to 7.
- Must contain letters and digits. Set the QPWDRQDDGT system value to 1 (or "yes").
- Frequency that a password must be changed. Set the QPWDEXPITV to 90.
- Frequency with which a password can be repeated. Set the QPWDRQDDIF system value to 8, which prevents a password from being reused until four other passwords have been used. However, best practices say to set this value to 1, which requires 32 other passwords before a password can be repeated.
- Number of sign-on attempts a user is allowed before being locked out. Set the QMAXSIGN system value to 6 (best practices say 3–5) and QMAXSGNACN to 2 or 3. A value of 2 sets the profile to STATUS(*DISABLED); a value of 3 sets the profile status to *DISABLED and also disables the virtual device.
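Assuming the best-practice values described in the list (adjust them to match your own policy), the corresponding commands look like this:

CHGSYSVAL SYSVAL(QPWDMINLEN) VALUE('7')
CHGSYSVAL SYSVAL(QPWDRQDDGT) VALUE('1')
CHGSYSVAL SYSVAL(QPWDEXPITV) VALUE('90')
CHGSYSVAL SYSVAL(QPWDRQDDIF) VALUE('1')
CHGSYSVAL SYSVAL(QMAXSIGN) VALUE('5')
CHGSYSVAL SYSVAL(QMAXSGNACN) VALUE('3')
ANZDFTPWD ACTION(*NONE)

The last command simply reports profiles whose password matches the profile name; ANZDFTPWD can also disable those profiles or expire their passwords if you prefer.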
Logging
While many of the logging (or auditing, in i5/OS terms) requirements are fulfilled through the implementation of an application, you need to look at the logging facilities of each operating system. i5/OS has a rich set of auditing functions that allow you to monitor almost any activity. For compliance requirements, you typically have to log, at a minimum, invalid access attempts (both to the system and to critical data files), changes to the system configuration, and actions taken by users with *ALLOBJ special authority. Beyond these actions, I prefer to audit a bit more so that I can be assured of being able to recreate a scenario using the audit journal as the forensic data. That means configuring two system values, which can be set with the commands sketched below:
- QAUDCTL is set to *AUDLVL and *NOQTEMP.
- QAUDLVL is set to *AUTFAIL, *CREATE, *DELETE, *SAVRST, *SERVICE, *SECCFG and *SECRUN.
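Assuming the QAUDJRN audit journal already exists in library QSYS (auditing requires it), those values can be set like this:

CHGSYSVAL SYSVAL(QAUDCTL) VALUE('*AUDLVL *NOQTEMP')
CHGSYSVAL SYSVAL(QAUDLVL) VALUE('*AUTFAIL *CREATE *DELETE *SAVRST *SERVICE *SECCFG *SECRUN')

Alternatively, the Change Security Auditing (CHGSECAUD) command can set the same values and will create QAUDJRN and an initial journal receiver if they don't already exist.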
While you are often required to log the actions of individuals with *ALLOBJ, you can configure additional auditing for any user. Simply run the following command, specifying the profile and the additional actions to audit. Note: There's no need to specify actions at a user level that are already being audited at the system level.
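On i5/OS, user-level action auditing is configured with the Change User Auditing (CHGUSRAUD) command; the profile name and audit values below are purely illustrative:

CHGUSRAUD USRPRF(PWRADMIN) AUDLVL(*CMD *OBJMGT)

Here, *CMD logs every CL command the user runs (useful for profiles with *ALLOBJ), and *OBJMGT logs object management operations such as moves and renames.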
If you are required to audit the use of a specific object (such as a file containing electronic protected healthcare information), add *OBJAUD to the QAUDCTL system value and run the following command, specifying the object to be audited:
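On i5/OS, object-level auditing is configured with the Change Object Auditing (CHGOBJAUD) command; the library and file names here are illustrative:

CHGOBJAUD OBJ(PRODLIB/PATIENTS) OBJTYPE(*FILE) OBJAUD(*ALL)

OBJAUD(*ALL) logs both read and change access, OBJAUD(*CHANGE) logs only changes, and OBJAUD(*USRPRF) defers the decision to each user profile's object-auditing setting.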
Note: If you have to audit an IFS object, use the Change Audit (CHGAUD) command.
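A comparable sketch for an IFS object, again with a made-up path:

CHGAUD OBJ('/healthdata/patients.csv') OBJAUD(*ALL)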
Save It
Laws and regulations often affect how your organization backs up its data. To be in compliance with many regulations, a well-thought-out save strategy is required. The financial industry has requirements that data and transactions be saved for seven years, yet the credit card industry encourages organizations to save credit card information for as little time as possible. A financial organization has to follow the laws and regulations for its industry and be prepared to explain to a PCI auditor how it has determined the length of time to retain its data. In addition to the length of time backups are retained, you may need to reconsider how data is saved. For example, to ensure that you can recover the audit logs easily and that they are retained for a sufficient period of time, you may want to save them on separate media from the rest of your data. Finally, if you choose to encrypt your data either before it's saved or during the save process, you may be exempt from most of the state breach-notification laws that require you to notify state residents if their data is lost or stolen.
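For example, detached audit journal receivers could be saved to their own media with something like the following, where the receiver name prefix, library, and tape device are all hypothetical:

SAVOBJ OBJ(AUDRCV*) LIB(AUDLIB) DEV(TAP02) OBJTYPE(*JRNRCV)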
In addition to a well-thought-out save strategy, laws and regulations require a documented and tested disaster recovery (DR) plan. This is not just a good practice to ensure compliance; it also makes good business sense. I've never seen a client return from a DR test without deciding to update their documentation or processes because of the information gained from the test. If you encrypt or are thinking of encrypting your backup media, don't forget to consider how this affects your DR processes. If you're using hardware to perform the encryption, it must be available at the DR site, along with the encryption keys and personnel with the authority to use them.
Best Practices
While I have focused on the security policy and i5/OS operating system requirements, laws and regulations may affect other areas of IT, such as the security settings of your network components, including firewall, router, and wireless access point configurations. Compliance requirements may demand that regular network (both wired and wireless) scans be performed and that certain connections be over encrypted sessions.
If you are overwhelmed with trying to keep up with all of the individual laws and regulations, one approach is to implement security best practices wherever possible. If you cannot implement best practices, write a business risk acceptance statement that documents why your organization cannot implement the best practice setting. Integrating security best practices into your architecture is your best chance of complying with new security laws and regulations that are being created seemingly every day.
Carol Woodbury is co-founder of SkyView Partners, Inc., a firm specializing in security policy compliance and assessment software as well as security services. Carol is the former Chief Security Architect for AS/400 for IBM in Rochester, Minnesota, and has specialized in security architecture, design, and consulting for more than 16 years. Carol speaks around the world on a variety of security topics and is coauthor of the book Experts' Guide to OS/400 and i5/OS Security.