
Use Qshell Tools to Clean Up Your IFS Automatically


Create a useful RPG program to purge unnecessary files from your IFS.

 

The year-end stuff is over, you've captured all your yearly snapshot data in their own files, and you're feeling pretty good about yourself. Next thing is to clean house. Reports are a good place to start. You likely have a lot of PDFs and Excel spreadsheets that should be purged from your IFS, and this article will walk you through developing a reusable program to easily do exactly that.

 

Qshell has a find command that we will use to delete files that are older than a certain age. We'll start by exploring the details of the find command interactively in Qshell. Then we'll build a simple program that uses it.

 

Let's jump right in with an example; then we'll break it apart. Start Qshell by entering STRQSH on your green-screen command line. For our first example, we'll simply list the files in a specific directory that are more than 90 days old.

 

find /Public/myDirectory -type f -mtime +90 -print

 

Here's a breakdown of the command above:

 

Command Segment       Description
find                  The command that will find the files
/Public/myDirectory   The directory to search for the files
-type f               Look only at files; no directories will be listed
-mtime +90            Include only files that are greater than 90 days old
-print                Display the results on the screen

 

Executing this command displays the list of files within the specified directory path that are older than 90 days. Most of the segments are self-explanatory, but the time parameter deserves a little more detail, including a few additional time options that may be of interest.

Different Times: mtime, ctime, and atime

There are several ways to specify the age of the files we are looking for, but for this article, I'll be discussing only the times that relate to days. Here's a brief overview of each one that's available:

 

Time    Description
mtime   Last Modified Time: the last time the data contents were changed
ctime   Last Change Time: the last time the attributes of the file were changed
atime   Last Access Time: the last time the contents of the file were read

 

For this example, we'll be using the last modified time with mtime, which is what I commonly use as my purging criteria. You may want to use another.
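
For instance, if reading activity matters more to you than modification, you could select on the last access time instead. This is just an illustrative variation; adjust the path and day count to suit your environment:

find /Public/myDirectory -type f -atime +365 -print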

 

In the command above, I used a positive (+) number to find older files: files that are more than the specified number of days old. If I wanted to find newer files, less than a specified number of days old, I would use a negative (-) number instead.
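
For example, to flip the selection around and list only the files modified within the last 7 days (an illustrative variation on the earlier command), you would use:

find /Public/myDirectory -type f -mtime -7 -print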

Filtering Files

Suppose your directory holds a mix of Excel spreadsheets, CSV files, and PDF files, with the file name extensions .xls, .csv, and .pdf, respectively, and you want to select only the Excel spreadsheets. You can do this by filtering the selected files with the -name parameter, as follows:

 

find /Public/myDirectory -type f -name '*.xls' -mtime +90 -print

 

Running this command lists all the files ending with the .xls file name extension. You can be creative with the file name pattern, so you're not limited to matching on file extensions alone.
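
For instance, assuming a hypothetical naming convention in which monthly reports begin with report, you could match on both a prefix and an extension at once:

find /Public/myDirectory -type f -name 'report*.pdf' -mtime +90 -print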

Preventing Subdirectory Recursion

By default, the find command will automatically search for files that are found in subdirectories of the directory location that you specify. This may or may not be desirable.

 

Some versions of the find command have a -maxdepth option that you can set to 1 to keep the command from descending into subdirectories. Unfortunately, this option is not available in Qshell, at least on my current release, V7R1.

 

With UNIX-type commands there are plenty of ways to get creative, but I've found the simplest solution is the -prune option of the find command. Along with the target path to search, you specify a path to prune, which eliminates all of the subdirectories.

 

find /Public/myDirectory -path '/Public/myDirectory/*' -prune
-type f -name '*.xls' -mtime +90 -print

 

The -path parameter, used together with -prune, specifies the paths that we want to exclude from the search. The -path test tells -prune which paths to mask from the search, which in this case is all of the subdirectories.

 

The key difference between the search path and the prune path is the slash/asterisk (/*) character combination, which indicates the subdirectories of the search path.

Deleting Files

Up to this point, we've only identified the files by listing them on the screen. This lets us verify that our filter criteria are correct without modifying any of the files. Next, we'll use that same selection to delete the older files and keep the directories clean.

 

I always list my files first to make sure that I'm not going to delete more files than I expect to, and I suggest that you do the same. Mass-deleting files can have disastrous results if you specify the wrong directory. This is also why it doesn't hurt to keep the *.xls filter, to ensure you're deleting only Excel files.

 

To delete the files, we replace the -print parameter of the find command with the -exec parameter, which executes the rm command to remove the files from the IFS, as follows:

 

find /Public/myDirectory -path '/Public/myDirectory/*' -prune
-type f -name '*.xls' -mtime +90 -exec rm {} \;

 

The -exec parameter indicates that we want to execute a command, which follows it. The curly braces ({}) are replaced with the name of each selected file, so the specified command runs once for every file in the list. The semicolon (;) marks the end of the -exec command's arguments, and the backslash (\) escapes it so the shell passes it through to find rather than interpreting it.
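
Before switching to rm, a variation I find handy is to put a harmless command such as ls in the -exec slot so you can see exactly which files would be removed; the file selection is identical to the delete version, only the action differs:

find /Public/myDirectory -path '/Public/myDirectory/*' -prune
-type f -name '*.xls' -mtime +90 -exec ls -l {} \;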

Writing the IFSCLEAN RPG Program

Now that we know how to do everything in Qshell, let's write an RPG program that we can easily reuse to take advantage of the find command to purge our directories.

 

We will encapsulate our Qshell code within a procedure named cleanIFS. To support the settings described earlier, our prototype for the procedure will look like this:

 

     D cleanIFS...
     D                 PR             1N
     D   argPath                   2000A   const varying
     D   argDays                      3S 0 const
     D   argRecurse                   1N   const options(*NOPASS:*OMIT)
     D   argFilter                  128A   const options(*NOPASS:*OMIT)
     D                                     varying
     D   argTime                      1A   const options(*NOPASS:*OMIT)

 

The parameters will give us the capability to access all of the options that we will want to control for the purging process. Here is the code for the cleanIFS procedure:

 

      *-----------------------------------------------------------------
      * cleanIFS: Delete files on IFS using Qshell
      *-----------------------------------------------------------------
     P cleanIFS...
     P                 B                   EXPORT
     D cleanIFS...
     D                 PI             1N
     D   argPath                   2000A   const varying
     D   argDays                      3S 0 const
     D   argRecurse                   1N   const options(*NOPASS:*OMIT)
     D   argFilter                  128A   const options(*NOPASS:*OMIT)
     D                                     varying
     D   argTime                      1A   const options(*NOPASS:*OMIT)
     D* Local Variables
     D   svReturn      S              1N
     D   svRecurse     S              1N   inz(*OFF)
     D   svFilter      S            128A
     D   svTime        S              1A   inz('m')
     D   svCmdString   S            512A
      /free
        svReturn = *OFF;
        //---------------------------------------------------------
        // Optional Safety Mechanism, only allow full paths in Public
        // <Insert Extensive Warning Here for Other Programmers>
        //---------------------------------------------------------
        if (%subst(argPath:1:8) <> '/Public/');
          return *ON;
        endif;

 

As a warning, the Qshell rm command can be very dangerous; if you aren't careful, you could delete system files, which would not be good. So I put in a safety mechanism that allows only full paths within the Public folder to be specified. I have structured my IFS so that all of my program output goes into subfolders of the Public directory off the root of the IFS. This structure isn't required; I mention it only to make you consciously aware of the need to be careful. Hard-coding a safety mechanism like this forces a programmer to review the code, see what it's doing, and consciously add to the permitted list or override it.

 

To implement this safety net, I am going to ensure that every path that is being purged begins with /Public. If it doesn't, the program will not execute the purge and will pass back a failure value to the calling program.

 

        //----------------------------------------------------------
        // Initialize local variable with parameters (if applicable)
        //----------------------------------------------------------
        if %parms > 2;
          if %addr(argRecurse) <> *NULL;
            svRecurse = argRecurse;
          endif;
        endif;
        if %parms > 3;
          if %addr(argFilter) <> *NULL;
            svFilter = %trim(argFilter);
          endif;
        endif;
        if %parms > 4;
          if %addr(argTime) <> *NULL;
            svTime = argTime;
          endif;
        endif;

 

The preceding segment of code is pretty standard for my procedures. Local variables are defined to hold preprocessed values that are used throughout the rest of the code. These variables may be set from the parameters that were passed in or left at their default values.

 

The next section of code starts building the string that will be passed to the STRQSH command. The string is initialized with the static value of the command itself, followed by the specified path to purge.

 

         //-------------------------------------------------------------
         // Start building the Qshell String
         svCmdString = 'STRQSH CMD(''find '
                     + %trim(argPath);

 

Next, we support the parameters that act as switches to turn certain options on or off. The recursive option determines whether the purge will recursively process all the subdirectories of the primary directory that we are purging. Note that the -path/-prune pair is appended before the -type f test, matching the order of the interactive example; find evaluates its tests from left to right, so the prune must come first for the subdirectory pruning to take effect.

 

         monitor;
           //-------------------------------------------------------------
           // Safer Road, Default to Not Recursive
           if not svRecurse;
             svCmdString = %trim(svCmdString)
                         + ' -path '''''
                         + %trim(argPath)
                         + '/*'''''
                         + ' -prune';
           endif;
           //-------------------------------------------------------------
           // Process files only; appended after -prune so the prune is
           // evaluated first and subdirectories are actually skipped
           svCmdString = %trim(svCmdString) + ' -type f';

 

I note here that it is safer to prohibit recursion, because any subdirectories may have a different purpose than the directory you are targeting. But that's just a safety tip. If you know you want to delete all files within that directory and all of its subdirectories, then you would allow recursion; doing so is common enough that recursion is the find command's default behavior. Being specific simply helps protect you against mistakes, and once you've tested your command interactively, this precaution becomes less important. The capability is still relevant, though, when you want to purge only certain files.

 

           //-------------------------------------------------------------
           // Safer Road, Filter on File Name Pattern
           if svFilter <> *BLANKS;
             svCmdString = %trim(svCmdString)
                         + ' -name '''''
                         + %trim(svFilter)
                         + '''''';
           endif;

 

I also note here that it is safer to use a filter. If you accidentally specified a system directory to purge, chances are you wouldn't find any Excel spreadsheets in it, and if you did, they probably wouldn't be critical files anyway. The same thinking applies here: once you've verified that your path is correct, the safety aspect becomes less important.

 

You may notice that there are a lot of single quotes in the string above. This is not a typo. When you want a single quote inside a literal without ending it, you double it up. Here that doubling happens at two levels: each quote that must reach Qshell has to be doubled so it survives inside the CL command string STRQSH CMD('...'), and each of those quotes has to be doubled again inside the RPG literal. Add the apostrophe that opens or closes the RPG literal itself, and you end up with runs of five (or six) quotes in the source.
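
To make that concrete, here is how the -name portion collapses at each layer, using the *.xls filter as an example (an illustrative trace, not additional code):

RPG source literal:       ' -name ''''' + %trim(svFilter) + ''''''
Value in svCmdString:      -name ''*.xls''
Command received by qsh:   -name '*.xls'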

 

The remaining code will set the number of days for the time selection and append the execution command. The command will then be executed within the RPG code:

 

           //-------------------------------------------------------------
           // Time: M, C or A
           svTime = %xlate('MCA':'mca':svTime);
           svCmdString = %trim(svCmdString) + ' -'
                       + svTime + 'time';
           //-------------------------------------------------------------
           // +/- Days
           if (argDays >= 0);
             svCmdString = %trim(svCmdString) + ' +';
           else;
             svCmdString = %trim(svCmdString) + ' -';
           endif;
           svCmdString = %trim(svCmdString)
                       + %trim(%editc(%abs(argDays):'J'));
           //-------------------------------------------------------------
           // -exec rm
           svCmdString = %trim(svCmdString)
                       + ' -exec rm {} \;'')';
           //-------------------------------------------------------------
           ExecuteCommand(%trim(svCmdString):%len(%trim(svCmdString)));
         on-error;
           // Exception
           svReturn = *ON;
         endmon;
        return svReturn;
      /end-free
     P                 E

 

The final part of the service program ensures that the time letter that is passed in is in lowercase and then attaches it to the time parameter of the find command. The days are then formatted to use a sign character as expected by the command. Then the rm execution segment is appended to the end of the command.

 

Once the string for the find command is built, it is executed with ExecuteCommand, which is just a prototyped call to the QCMDEXC API.
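
To make the end result concrete, with the sample values used in the test program below (/Public/myDirectory, 90 days, no recursion, a *.xls filter, and mtime), the assembled string would look roughly like this, followed by the find command that Qshell ultimately runs:

STRQSH CMD('find /Public/myDirectory -path ''/Public/myDirectory/*'' -prune -type f -name ''*.xls'' -mtime +90 -exec rm {} \;')

find /Public/myDirectory -path '/Public/myDirectory/*' -prune -type f -name '*.xls' -mtime +90 -exec rm {} \;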

 

To test our new procedure, we will use a simple program to call the new procedure. This program will simply initialize some variables to pass into the parameters of our new cleanIFS procedure. You can take this further by creating a physical file that could contain these settings for different directories. Then you just loop through the directories and pass in the purge settings.

 

     D displayBytes    S             52A
     D strPath         S            512A
     D strFilter       S            128A
     D strTime         S              1A
     D intDays         S             10I 0
     D boolRecurse     S              1N
     D* Prototype for QCMDEXC API
     D ExecuteCommand...
     D                 PR                  extPgm('QCMDEXC')
     D  argInCommand              65535A   const options(*varsize)
     D  argInLength                  15P 5 const
     D  argInDBCS                     3A   const options(*nopass)
     D*
     D* Prototype for cleaning IFS folders
     D cleanIFS...
     D                 PR             1N
     D   argPath                   2000A   const varying
     D   argDays                      3S 0 const
     D   argRecurse                   1N   const options(*NOPASS:*OMIT)
     D   argFilter                  128A   const options(*NOPASS:*OMIT)
     D                                     varying
     D   argTime                      1A   const options(*NOPASS:*OMIT)

 

      /free
       //-------------------------------------------------------------
       // Initialize Parameter Information
       strPath = '/Public/myDirectory';
       intDays = 90;
       boolRecurse = *OFF;
       strFilter = '*.xls';
       strTime = 'm';
       //-------------------------------------------------------------
       // Display Purging Information
       displayBytes = 'Path: ' + %trim(strPath);
       dsply displayBytes;
       displayBytes = 'Days: ' + %trim(%editc(intDays:'J'))
                    + ' (' + strTime + ')time';
       dsply displayBytes;
       displayBytes = 'Filter: (' + %trim(strFilter)
                    + ') Recursive: ' + boolRecurse;
       dsply displayBytes;
       //-------------------------------------------------------------
       if (cleanIFS(strPath: intDays: boolRecurse: strFilter: strTime));
         displayBytes = 'FAIL:( ' + %trim(strPath);
       else;
         displayBytes = 'PASS:) ' + %trim(strPath);
       endif;
       dsply displayBytes;

       *inlr = *ON;
      /end-free

 

If you run the program, you'll see the following output:

 

> call mcp049rpg
   DSPLY  Path: /Public/myDirectory
   DSPLY  Days: 90 (m)time
   DSPLY  Filter: (*.xls) Recursive: 0
   Command ended normally with exit status 0.
   DSPLY  PASS:) /Public/myDirectory

 

You will most likely need to change the code to support the directories that you are using for your files. I used the hard-coded /Public/ folder to prevent accidentally purging from an unintended directory. You can either remove or update this setting to match your directory structure.

 

Take the code for a test drive. You no longer have an excuse to postpone those cleanup processes you've been putting off. I strongly encourage you to first run the commands interactively in Qshell to list the files that qualify for purging, so you become familiar with the commands and can be sure you are deleting the files you intend to purge. Then purge away!

Download the Code

You can download the code used in this article by clicking here.

 

Thomas Snyder

Thomas Snyder has a diverse spectrum of programming experience encompassing IBM technologies, open source, Apple, and Microsoft and using these technologies with applications on the server, on the web, or on mobile devices.

Tom has more than 20 years' experience as a software developer in various environments, primarily in RPG, Java, C#, and PHP. He holds certifications in Java from Sun and PHP from Zend. Prior to software development, Tom worked as a hardware engineer at Intel. He is a proud United States Naval Veteran Submariner who served aboard the USS Whale SSN638 submarine.

Tom is the bestselling author of Advanced, Integrated RPG, which covers the latest programming techniques for RPG ILE and Java to use open-source technologies. His latest book, co-written with Vedish Shah, is Extract, Transform, and Load with SQL Server Integration Services.

Originally from and currently residing in Scranton, Pennsylvania, Tom is currently involved in a mobile application startup company, JoltRabbit LLC.


