
Practical SQL: More DB2 Services


IBM continues to add services to DB2, many of which provide programmatic access to system information.

A couple of years ago, I wrote an article in which I introduced some DB2 services. This was early in the development cycle for these wonderful additions to the IBM i toolkit. In the ensuing years, that particular pouch of tools has grown to an entire rolling tool chest. More and more collections of important system data have been surfaced by DB2 services, and today I'll present a couple that I find extremely useful.

Why Are These Services So Important?

Before we delve into the specifics, I think it's worth understanding why the very concept of a DB2 service matters. Why go to the trouble of learning a new technique? Some (although not all!) of these services mirror data that can be materialized using CL commands. For example, I'll be showing you a DB2 service that allows direct access to journal entries, a task you can also accomplish by running the DSPJRN command to an output file. However, with this same access packaged as a service, you can embed it directly in an RPG program without going through the traditional exercise of creating a work file and then reading through it. Yes, you'll need to use embedded SQL syntax, but I hope you're already using that today; if not, some of these services might be enough to give you a reason to learn it!

Add to this ease of use the fact that many of these services don't even have a direct CL counterpart. The QHST log service I'll present in a moment can be emulated only by executing DSPLOG QHST with its output directed to print and then scanning through the resulting spooled file. This is definitely not a future-proof solution! So let's take a look at some of the newest services.

Let's Get Historical!

One of my regular tasks involves checking the history log, QHST. I can't tell you how many times I've needed to spin through the history log in an attempt to track down certain events. To do that, I have to execute DSPLOG QHST, try to limit the scope using parameters to get near where I want to be, and then page back and forth. It gets more difficult when I'm looking for multiple events. Let's say, for example, I want to find the start and end dates and times for jobs by a specific user (a typical example would be an external database access). Since I can't use the DSPLOG parameters to filter by user, I instead have to use the technique I alluded to earlier: dump the log to a spooled file, copy the spooled file to a database file, and then finally scan through that database file. It works, but it's neither elegant nor future-proof, since the layout of the printed report may change in the next release.

Note: DSPLOG's first parameter is the log, which must be entered as QHST. Did IBM intend to have more? Enquiring minds want to know!

Anyway, let's take a look at a way to use the DB2 service associated with the history log, which has the appropriate name of history_log_info. With history_log_info, I can get a list of messages from the history log and manipulate them as needed. In this case, I want to get a list of all job start (CPF1124) and job completion (CPF1164) messages for a specific user. The user is QTFTP, the default user for Trivial File Transfer Protocol (TFTP) requests.

with t1 as
   (select * from
     table(history_log_info('2018-06-01-00.00.00.000000')) hli)
   , t2 as (select * from t1 where MESSAGE_ID = 'CPF1124')
   , t3 as (select * from t1 where MESSAGE_ID = 'CPF1164')
   select FROM_JOB, t2.MESSAGE_TIMESTAMP Start_Time,
     t3.MESSAGE_TIMESTAMP End_Time,
     int(t3.MESSAGE_TIMESTAMP - t2.MESSAGE_TIMESTAMP) Duration
   from t2 join t3 using (FROM_JOB) where t2.FROM_USER = 'QTFTP'

The only really tricky part is the initial subselect, in which we access the history_log_info table function. This table function is designed to return a table, and by putting that within the TABLE clause and performing a SELECT on it, we now have a subselect like any other subselect. The history_log_info function takes parameters in various ways (which we'll see in more detail in the second example), but in its simplest form, the first parameter is just a timestamp that identifies the earliest entry to process. So in simplest terms, the T1 common table expression (CTE) consists of all entries since the beginning of June 1.
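In its very simplest form, then, you can run the function all by itself. Here's a minimal sketch; I'm also assuming the optional second parameter (an ending timestamp), which lets you bound the range on both ends:

-- All history-log entries for June 1, first 10 rows only
-- (assumes the optional ending-timestamp second parameter)
select MESSAGE_ID, MESSAGE_TIMESTAMP, FROM_USER, MESSAGE_TEXT
from table(history_log_info('2018-06-01-00.00.00.000000',
                            '2018-06-02-00.00.00.000000')) hli
fetch first 10 rows only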

Other than the table access, the rest of the SQL is pretty straightforward. We create two additional CTEs: T2 for the job start messages and T3 for the job completion messages. Finally, we join those together showing job name, start, end, and duration for every job for user QTFTP. The result looks like this:

FROM_JOB                  START_TIME                   END_TIME                     DURATION
169703/QTFTP/QTTFT00017   2018-06-02-23.37.14.417034   2018-06-03-01.37.14.611797     20,000
169704/QTFTP/QTTFT00004   2018-06-02-23.37.14.421874   2018-06-03-02.07.14.679087     23,000

The first job lasted for two hours, the second for two hours and 30 minutes. Nothing to it!
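If those DURATION values look odd, it's because subtracting one timestamp from another in DB2 yields a decimal duration of the form yyyymmddhhmmss.ffffff rather than a number of seconds: 20,000 is hhmmss 02.00.00, or exactly two hours, and 23,000 is two hours and 30 minutes. You can check the arithmetic with a quick VALUES statement:

-- Timestamp subtraction returns a decimal duration (yyyymmddhhmmss.ffffff),
-- so a two-hour difference appears as 20000, not 7200 seconds
values int(timestamp('2018-06-03-01.37.14.000000')
         - timestamp('2018-06-02-23.37.14.000000'))
-- returns 20000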

Journal Mining

I also do a lot of journal mining: locating and analyzing entries in a journal for forensic examination. This might be done to help understand a business process better or to identify a program that is updating the database in an unexpected manner. Either way, it involves looking at specific data. One of my most common techniques is to send marker entries to the journal so that I can later retrieve the entries recorded within a specific period. For example, I'll start by executing a command to send an entry to the journal:

SNDJRNE JRN(JRNLIB/MYJRN) ENTDTA('TRANSACTION BOUNDARY - START 01')

This puts an entry into the journal. I then execute whichever function it is I'm analyzing. Then I send another entry:

SNDJRNE JRN(JRNLIB/MYJRN) ENTDTA('TRANSACTION BOUNDARY - END 01')

Now I just want to review all the entries between those two. In order to do that, I need to get the beginning and ending sequence numbers, which I do with this query:

select ENTRY_TIMESTAMP, SEQUENCE_NUMBER, char(ENTRY_DATA, 100) Data
from table(display_journal('JRNLIB','MYJRN',
   JOURNAL_ENTRY_TYPES => '00',
   STARTING_TIMESTAMP => '2018-06-02-09.00.00.000000') ) a

Note that, once again, we have a table function (in this case, it's display_journal), which we put inside a TABLE clause. This time, we don't need a CTE because we can do all the selection criteria within the call to display_journal. I used a specific syntax here in which I combined both positional parameters and keyword parameters. The first two parameters are the journal library and name, so I was able to supply those without keywords. However, a very nifty syntactical feature of table functions is that you can specify parameters by keyword. In this case, I was able to add a keyword to limit the selection to only my type 00 journal entries, and also to start at 9:00 a.m. on June 2 (I'll show a fully keyword-based variant after the results). The result looks like this:

ENTRY_TIMESTAMP              SEQUENCE_NUMBER   DATA
2018-06-03-15.29.02.675152   26,059,704        TRANSACTION BOUNDARY - START 01
2018-06-03-15.29.23.544480   26,059,801        TRANSACTION BOUNDARY - END 01
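As promised, here's the same call written entirely with keyword parameters. This is just a sketch; I'm assuming JOURNAL_LIBRARY and JOURNAL_NAME are the keyword names for those first two positional parameters:

select ENTRY_TIMESTAMP, SEQUENCE_NUMBER, char(ENTRY_DATA, 100) Data
from table(display_journal(
   JOURNAL_LIBRARY     => 'JRNLIB',
   JOURNAL_NAME        => 'MYJRN',
   JOURNAL_ENTRY_TYPES => '00',
   STARTING_TIMESTAMP  => '2018-06-02-09.00.00.000000') ) a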

Now, in order to see what happened, I only need to look at the entries between 26,059,704 and 26,059,801. Not surprisingly, I can specify those values on a second call to the table function and further limit the scope with more parameters, perhaps narrowing it to a single file or program.
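Here's a sketch of what that follow-up query might look like. The file MYLIB/MYFILE is a hypothetical example, I'm assuming the STARTING_SEQUENCE and OBJECT_* filtering parameters are available on your release, and since I'm not aware of an ending-sequence parameter, the upper bound goes in the WHERE clause:

-- Entries between the two markers, limited to one (hypothetical) file
select ENTRY_TIMESTAMP, JOURNAL_CODE, JOURNAL_ENTRY_TYPE,
       char(ENTRY_DATA, 100) Data
from table(display_journal('JRNLIB','MYJRN',
   STARTING_SEQUENCE => 26059704,
   OBJECT_LIBRARY    => 'MYLIB',
   OBJECT_NAME       => 'MYFILE',
   OBJECT_OBJTYPE    => '*FILE') ) a
where SEQUENCE_NUMBER <= 26059801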

The nice thing about this technique is that I can run it inside a program. I could theoretically run a SELECT statement like this every night to review all the changes to a specific file. I could review the data and notify someone if a database anomaly is identified. As I noted earlier, I could have done this with DSPJRN to an outfile, but this is so much easier, especially when it comes to automating repetitive queries.
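As a sketch of that nightly idea (again, the journal and file names are placeholders), something like this could run on a schedule, with the count checked against an expected range:

-- Hypothetical nightly check: count record-level changes (journal code 'R')
-- made to one file over the past day
select count(*) as CHANGE_COUNT
from table(display_journal('JRNLIB','MYJRN',
   JOURNAL_CODES      => 'R',
   OBJECT_LIBRARY     => 'MYLIB',
   OBJECT_NAME        => 'MYFILE',
   OBJECT_OBJTYPE     => '*FILE',
   STARTING_TIMESTAMP => current timestamp - 1 day) ) a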

So this article just brings to light a couple more DB2 services provided by the developers at IBM. The one thing I'd recommend to everyone is to keep an eye out, release to release, for more of these great services!
