Practical SQL: Making Connections

The connection string is the bridge between the SQL world and the QSYS world of the IBM i.

I have recently been spending a lot of time working with a couple of concepts: data warehouses (and other external database access) and disaster recovery. These two concepts are fundamentally related, and like so many things in the IBM i world, some well-planned architecture decisions up front will make the back end very easy. In this case, we'll be talking about the connection string used by external processes to talk to your enterprise data on the IBM i.

Some Assumptions

I've already begun making some assumptions in the opening paragraph, so I might as well make the entire list:

  • All enterprise data resides on the IBM i.
  • Data in external sources (such as data warehouses) is staged from the IBM i.
  • Business logic resides on the IBM i, not in the data warehouse.

From a process standpoint, this means that the data warehouse is responsible only for manipulation and presentation of data provided by the i. I call that the "slicing and dicing," in which users can see the enterprise data in various groupings and aggregations. This doesn't mean there are no calculations in the data warehouse. On the contrary, I'm all for providing transactional data (such as sales) to external tools that can do the computational heavy lifting of forecasts and then provide all the nice graphical dashboards that management needs to make decisions.

But note that the enterprise data—the day-to-day underlying bones of the business, from purchasing to production to payments—resides on the IBM i. We enter it there, we store it there, and we make it available to whoever needs it. We may be able to simply lift data points, or we may need to do some complex data-driven logic, but at the end of the day, the data comes from the enterprise, and the data warehouse packages and presents it.

The Bright Red Line

So where is the line that separates calculations on the server from calculations in the data warehouse? As always in architectural decisions, there may be a bit of wiggle room, but I do have a pretty firm guideline: if a calculation conditionally accesses different tables under different use cases, then it definitely belongs on the server (and in my world, it should be written in RPG, not SQL).

Let's consider an example. My environment uses two different costing models: standard cost and average cost. The type of cost used in a given situation depends on a number of factors. Add to that the fact that some cases may need the current cost, while others may require a point-in-time cost, where you specify a date and the system tells you the cost of that item on that date. Given that complexity, I wrote an RPG program to determine the cost, and for external data consumers I wrapped it in a user-defined function (UDF).

And that's what we're going to be talking about today: how to access stored procedures and UDFs using an IBM i-aware connection.

Libraries Are Like Schemas, Except When They're Not

Creating your first stored procedure and UDF isn't really all that difficult; a while back I explained the process in two articles. But like so many things in programming, it's not the initial proof of concept that requires the architectural underpinnings; it's the nuances of using those concepts in a live environment.

My environment separates the object library and the data library. This isn't a problem on the IBM i; that’s what library lists are for! But it does present the first challenge for external access, since standard SQL usually thinks about only a single schema (a schema is roughly equivalent to a library in QSYS terms). I could get around this at first by specifying the data library as the primary schema and then qualifying the function. I might say something like this:

SELECT IMITEM, IMDESC, PRODOBJ.ITM_COST(IMITEM) FROM ITMMST

The problem is that the ITM_COST function calls a stored procedure, ITM_COSTGET. That procedure is also in PRODOBJ, and that means it won't be found unless I qualify the library in the function (I literally tell the function ITM_COST to call ITM_COSTGET in PRODOBJ). So now I've got the library name PRODOBJ hard-coded in a couple of places. This is the stuff that architectural nightmares are made of. The first thing that goes wrong is when you need to test changes. In the IBM i world, we tend to test by putting the test library on the top of our library list. We just add TESTLIB on top and any programs in that library take precedence over the versions in PRODOBJ.
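To make that concrete, here's a rough sketch of what those two objects might look like. The parameter lists, data types, and the ITMCSTR program name are stand-ins of mine, but the two hard-coded references to PRODOBJ are the point:

-- Sketch only: the RPG cost program registered as an external stored procedure.
-- The library is hard-coded in the external program name.
CREATE PROCEDURE PRODOBJ.ITM_COSTGET (IN ITEM CHAR(15), OUT COST DECIMAL(11,4))
  LANGUAGE RPGLE
  PARAMETER STYLE GENERAL
  EXTERNAL NAME 'PRODOBJ/ITMCSTR';

-- The UDF that external consumers call, with PRODOBJ hard-coded a second time
-- on the procedure it calls.
CREATE FUNCTION PRODOBJ.ITM_COST (ITEM CHAR(15))
  RETURNS DECIMAL(11,4)
  LANGUAGE SQL
  BEGIN
    DECLARE COST DECIMAL(11,4);
    CALL PRODOBJ.ITM_COSTGET(ITEM, COST);
    RETURN COST;
  END;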

Unfortunately, our UDF call is qualified, and it calls a qualified program, so there's literally no way to make it use the test program. Instead, we'll need a different version of the UDF and a different call to that UDF. That's not entirely horrible at first, but then you have to promote the function. Well, you actually have to create a new object and change the qualified name of the called procedure. The real danger here is that you're essentially required to do double maintenance, keeping two completely separate objects in sync (but with slight differences!). The first time you forget to update the production version—or worse, put the test version into production—you'll understand just how painful that can be.
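Assuming the test objects live in TESTLIB, the test copy ends up looking like this: the body is identical to the production version except for the library names, which is exactly the kind of small difference that gets missed during promotion.

-- Sketch only: the test copy of the UDF, identical to production except
-- for the hard-coded library on the CALL (and the library it lives in).
CREATE FUNCTION TESTLIB.ITM_COST (ITEM CHAR(15))
  RETURNS DECIMAL(11,4)
  LANGUAGE SQL
  BEGIN
    DECLARE COST DECIMAL(11,4);
    CALL TESTLIB.ITM_COSTGET(ITEM, COST);
    RETURN COST;
  END;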

The problems only multiply when you're talking about the database files themselves, especially when they exist in multiple libraries. For example, our primary test box has three environments, each with the same database file names but in different libraries, and there are multiple libraries per environment. For the green-screen, this is really easy: just set your library list according to the environment you wish to use. But with SQL's single-schema concept, there's no such shortcut. Every SELECT statement has to qualify all of the tables except those in the one default schema, and in order to use a different set of libraries, you have to change all of those qualified names.
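As a small illustration (TESTDTA1 and TESTLIB are stand-in library names here), the same statement has to be edited by hand for each environment:

-- Production: ITMMST resolves to the one default schema, but the UDF
-- still has to be qualified.
SELECT IMITEM, IMDESC, PRODOBJ.ITM_COST(IMITEM) FROM ITMMST

-- Test: every qualified name in every statement has to change.
SELECT IMITEM, IMDESC, TESTLIB.ITM_COST(IMITEM) FROM TESTDTA1.ITMMST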

It's a disaster waiting to happen.

I Wish There Were a Schema List!

So what's the solution? Is there a way to make an SQL connection library list–aware? As it turns out, there is, but it isn't entirely intuitive. One part is pretty straightforward, and the other is much less obvious. Also, the setup is slightly different between the two primary connection types, ODBC and JDBC. I'll walk you through the JDBC connection here, using one of my favorite tools, DBeaver.

Figure 1: Start by creating a DB2 for iSeries connection.

DBeaver is really helpful in that it understands what an iSeries is and is able to create most of the defaults for you. Okay, it still thinks we work on iSeries (and the “AS 400,” whatever that is), but still, that's better than not knowing what the system is at all.

Figure 2: In the Database/Schema prompt, enter *LIBL followed by your library list separated by commas.

Specifying the library list is the straightforward part. The only weirdness is having to enter *LIBL to tell the system that it's a library list, but it works as shown. Now any stored procedures in PRODOBJ and any tables in PRODDTA1 or PRODDTA2 can all be accessed without having to qualify the library name. To use different libraries, set up a different connection with the required libraries.

Figure 3: Change the naming property from sql to system.

This is the nonintuitive part. In the Driver properties tab, you'll see the naming property. It has two values, sql and system, and the default is sql. You don't want the default; you want the other value, system. So select that value.
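If you're curious what's happening under the covers, DBeaver builds a JDBC URL for the IBM Toolbox (jt400) driver, and the two settings from Figures 2 and 3 correspond to the driver's libraries and naming connection properties. A rough equivalent, with MYIBMI standing in for your host name, looks like this:

jdbc:as400://MYIBMI;naming=system;libraries=*LIBL,PRODOBJ,PRODDTA1,PRODDTA2

Any JDBC client can pass the same properties, which matters when it's the data warehouse's ETL tool, rather than a person at a workstation, making the connection.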

Figure 4: That's all that's required. Now you can hit Finish.

That's it. This connection will now honor the library list you specified in Figure 2. And the beautiful part of this is that now you can run the same statement in a completely different environment just by creating a new connection with the required library list and connecting to it instead.
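For example, the cost query from earlier no longer needs any library names at all, and the identical statement runs against the test environment if you simply connect with a library list that names the test libraries:

SELECT IMITEM, IMDESC, ITM_COST(IMITEM) FROM ITMMST

One small note on system naming: on the occasions when you do want to qualify an object, the traditional qualifier is the slash (for example, PRODDTA2/ITMMST) rather than the dot.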

Additional Nuances

There are some additional nuances that I'll cover in subsequent articles. In particular, creating objects (tables, views, stored procedures, and functions) requires a little more work. But this is all you need to start creating a robust multi-platform environment where SQL and the IBM i work together seamlessly.

 
