
Practical SQL: Using Old World Tools with New World Data


Using DDL to define your files provides a wealth of new features, but just which features should you embrace? Some of that depends on the tools you use, and this article explains a couple of pitfalls.


The ongoing SQL vs. native I/O debate really consists of two different debates: how to define your data and how to access it. Access is either via native I/O or SQL, but that's not the topic today. Today, I'm focused on the data definition debate, either the older Data Definition Specifications (DDS) or the more modern SQL-based Data Definition Language (DDL). I firmly believe this dispute has been decided in favor of DDL, but DDL is not perfect. Some potential pitfalls exist, and this article will address a couple of the more onerous ones.

When Is DDL Appropriate?

I'm going to go out on a limb here and say that you should create all of your new tables using DDL. Every one. If you can provide a situation where DDL doesn't do something that DDS does, I can almost guarantee that your application could be rewritten relatively easily in such a way as to no longer need that feature. Of course, the key word "relatively" covers a lot of ground, and there are those diehard DDS fans who point out that DDL just doesn't have all the features of DDS. This is a valid criticism, especially if you like to use select and omit on your logical views and then run them through a program with level breaks. But I'm not going to debate the point; for today's discussion, let's assume you want to use DDL.

What sort of things might you want to look out for? Well, sometimes you'll have to use some non-standard keywords to do things in DDL that are specific to the IBM i. For example, you'll need to know some extra keywords to generate a record format name that's different from the file name, or to put both column headings and descriptive text on a field. Look for tips on those specific features another day.


For me, one of the biggest things has been to be careful about the attributes of my fields (or columns, in SQL parlance). You may find yourself wanting to use a field type in SQL that makes sense for the situation but causes some inadvertent side effects. I advise that, whenever you want to implement a new data type, you test it thoroughly first. The problem is determining just how to test it and where you might get bitten. Let me give you two examples.

Bad Dates

Sometimes I just can't help quoting favorite old movies (in case it doesn't ring a bell, that's from Raiders of the Lost Ark). Anyway, my problem wasn't exactly bad dates, but more how to handle uninitialized dates. When you move to date data types (type L in DDS, and type DATE in DDL), you have to consider what will happen with an uninitialized date. In the olden days of CCYYMMDD fields (which in turn replaced MMDDYY fields, but we don't talk about that), I could write a record to a file, and any uninitialized date fields would end up as zero. This was especially true if I used the technique of writing to a logical view that didn't define all the fields; my date fields would end up with all zeros. In SQL, you do the same thing by performing an INSERT that does not include that field: any numeric fields get initialized to zero.
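To make the old-world behavior concrete, here is a minimal sketch (the table and column names are invented) of a numeric CCYYMMDD "date" that quietly falls back to zero when an INSERT omits it:

```sql
-- Hypothetical old-style table: the "date" is an 8-digit number.
CREATE TABLE ORDERHDR (
  ORDERNO  INTEGER      NOT NULL,
  SHIPDATE DECIMAL(8,0) NOT NULL DEFAULT 0   -- CCYYMMDD as a number
);

-- SHIPDATE is omitted, so the row is written with SHIPDATE = 0.
INSERT INTO ORDERHDR (ORDERNO) VALUES (1001);
```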


But with the date data type, that's no longer an option. The L date field doesn't support a value of zeros. In fact, the low value of a date field is 0001-01-01, January 1st of the year 1. But here's the tricky bit: if you write to a file and don't explicitly initialize the date field, you don't get the low value. You get the current date! That's great when you want a current timestamp, not so much when you want a low value. In SQL, there are two ways to handle an uninitialized date: use the low value of 0001-01-01, or make the field null-capable and leave the value null. I'm going to avoid a long discussion on the pros and cons of null-capable fields and stick to the practical aspects (this is "Practical SQL," after all). The biggest drawback to null-capable fields is that they require a little programming gymnastics to implement properly, whether you use native I/O or embedded SQL. What RPG does works, but it always seems a little unnatural to me. The best example of how to implement null indicators in embedded SQL was written several years ago by one of my longtime programming heroes, Ted Holt.
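If you do go the null-capable route, plain SQL can at least paper over the nulls at read time. A sketch, assuming a null-capable CHANGED column in a hypothetical CUSTOMER table:

```sql
-- Substitute the low value for NULL when reading a null-capable date,
-- so the program never has to test a null indicator.
SELECT CUSTNO,
       COALESCE(CHANGED, DATE('0001-01-01')) AS CHANGED
FROM   CUSTOMER;
```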


I'm still not a fan of nulls, though, so I found a better way around the problem. If you define a date field in DDL, a couple of keywords will initialize the field to the low value without having to make it null-capable. Take a look at my DDL definition for the field named CHANGED:
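The keywords in question are NOT NULL and DEFAULT. A minimal sketch of the column definition (the rest of the table is omitted here):

```sql
-- CHANGED is initialized to the date low value, not the current date,
-- whenever an INSERT leaves it out.
CHANGED DATE NOT NULL DEFAULT '0001-01-01'
```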




Ta da! Now if you INSERT a record and don't explicitly initialize the CHANGED field, it will get the low value of 0001-01-01. The same technique works for TIME and TIMESTAMP fields if you don't want to deal with nulls. The format is crucial: DATE fields use '0001-01-01', TIME fields should be initialized with '00.00.00', and TIMESTAMP fields with '0001-01-01-00.00.00'. Be very careful with the punctuation: separate date parts with dashes and time parts with dots, and separate the date from the time in a TIMESTAMP with a dash.
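Putting all three temporal types together, here is a sketch of a table (the names are invented) whose date, time, and timestamp columns all default to their low values:

```sql
CREATE TABLE AUDITLOG (
  ENTRYID  INTEGER   NOT NULL,
  CHGDATE  DATE      NOT NULL DEFAULT '0001-01-01',
  CHGTIME  TIME      NOT NULL DEFAULT '00.00.00',
  CHGSTAMP TIMESTAMP NOT NULL DEFAULT '0001-01-01-00.00.00'
);
```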

My INT Is Bigger Than Your INT

Another problem I ran into was with the data type BIGINT. This is used frequently in SQL tables for counter fields or unique IDs (one day I'll write an entire article on using unique IDs instead of key fields). A BIGINT is a 64-bit binary field. Such a field spans 2^64 possible values, about eighteen quintillion, or nearly 2 x 10^19; a signed BIGINT tops out at 2^63 - 1, roughly nine quintillion. If you're familiar with integer numbers in RPG, you know they are defined as either I (for signed integer) or U (unsigned integer). When you specify the size of the field, you specify it in decimal digits: 3 (8-bit), 5 (16-bit), 10 (32-bit), or 20 (64-bit). So, at first glance, it would seem that you can create a 64-bit BIGINT field in DDL and then use that field in an RPG program. And for the most part, that's true. But this is a perfect example of why you need to test all the possible scenarios. I created a file with a BIGINT field, and it worked flawlessly, right up until I went to look at the data in the table. Not one IBM i database tool would show it to me. WRKDBF failed (which is not a knock on Bill Reger's fantastic utility). DBU failed. Even the standard IBM i utility UPDDTA wouldn't let me modify a file with a BIGINT field.


The upshot is that once I put a BIGINT into a file, I can access it only via SQL, whether through STRSQL on the green-screen or through some other standard SQL tool such as SQuirreL SQL. I have to admit that I didn't see that one coming. I did come up with a workaround: you can use a 20-digit numeric field (packed or zoned) in place of the BIGINT. You get the full BIGINT range (and then some), and now the DDS-oriented tools can work with the data.
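As a sketch (the table and column names are invented), the workaround is simply a swap of the column type:

```sql
-- Instead of:  CUSTID BIGINT NOT NULL
-- use a 20-digit packed decimal, which covers the full BIGINT range
-- and stays visible to DDS-oriented tools such as UPDDTA.
CREATE TABLE CUSTOMER (
  CUSTID   DECIMAL(20,0) NOT NULL,
  CUSTNAME CHAR(30)      NOT NULL DEFAULT ' '
);
```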


So the moral of the story is that while DDL is the data definition technique of the future, not all features of DDL are equally seamless to us legacy programmers. That should not be an excuse, however, to cling to DDS! Go forth and use DDL! Just be forewarned that you may need to do a little testing before you put it into production.

Joe Pluta

Joe Pluta is the founder and chief architect of Pluta Brothers Design, Inc. He has been extending the IBM midrange since the days of the IBM System/3. Joe uses WebSphere extensively, especially as the base for PSC/400, the only product that can move your legacy systems to the Web using simple green-screen commands. He has written several books, including Developing Web 2.0 Applications with EGL for IBM i, E-Deployment: The Fastest Path to the Web, Eclipse: Step by Step, and WDSC: Step by Step. Joe performs onsite mentoring and speaks at user groups around the country.



