TechTip: Node.js on Linux with JDBC Connection to DB2 for i

Real-world needs demand real-world "what happens if I poke here?" solutions.

One thing I love about my job at Krengeltech is doing discovery work to determine whether a collection of technologies can work together to accomplish a business need. Such was the case today. A business needed to communicate from Node.js on CentOS to DB2 for i. There are many options available to accomplish this task, and I've listed some of them below, along with my thoughts on each.

  • Use DB2 Connect from IBM. This requires the purchase of licenses from IBM for each developer and server. I've used this before for another IBM i customer that had Node.js on Ubuntu, so I know it is a solid solution.
  • Communicate via the Node.js iToolkit from IBM. Under the covers, this communicates via HTTP to XMLSERVICE on the IBM i and allows for interaction with DB2 and RPG. Compared to 1-tier database access, though, it isn't as fast as you'd probably like.
  • Utilize the jdbc npm. I've always considered the JDBC driver to be a solid solution for connecting Java to DB2 for i. This does require a Java Virtual Machine (JVM) to be loaded on the machine. Not a huge issue, but it is another technology in the stack (the fewer the better in my opinion).
  • Create Node.js microservices on IBM i that can be invoked from Node.js on CentOS. These "microservices" could be thick or thin. What I mean by that is you could have a thin Node.js program that simply receives an SQL string, executes it against DB2 for i, and returns the result set(s), or a thicker Node.js program that invokes business functions (e.g., getCustomerList) instead of allowing raw SQL to be run. A rough sketch of the thin flavor follows this list.
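
Here is a minimal sketch of the "thin" flavor of that last option. It's illustrative only: it assumes Express is available on the IBM i side (with built-in JSON body parsing), and runSql() is a hypothetical stub standing in for whatever local database access you would actually use there.

// Hypothetical "thin" SQL microservice on IBM i (illustrative only).
var express = require('express');
var app = express();
app.use(express.json()); // assumes a recent Express with built-in JSON body parsing

// Placeholder: swap this stub for a real DB2 for i call on the IBM i side.
function runSql(sql, cb) {
  cb(null, [{ note: 'rows for: ' + sql }]);
}

// Accept an SQL string, run it, and return the result set as JSON.
app.post('/sql', function(req, res) {
  runSql(req.body.sql, function(err, rows) {
    if (err) {
      res.status(500).json({ error: err.message });
    } else {
      res.json(rows);
    }
  });
});

app.listen(8080);

The thicker flavor would expose routes like /customers instead of a raw /sql endpoint, which keeps SQL out of the caller's hands entirely.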

After a brief customer call, we decided to pursue the jdbc npm. The install instructions said to run the following npm command; the cd and mkdir are just me setting up a project directory. Note that I am running these commands on my Mac, not on the IBM i.

$ cd git && mkdir node_jdbc && cd node_jdbc

$ npm i --save jdbc

npm WARN deprecated node-uuid@1.4.7: use uuid module instead

> java@0.8.0 install /Users/aaronbartell/s3/git/node_jdbc/node_modules/jdbc/node_modules/java
> node-gyp rebuild

gyp WARN download NVM_NODEJS_ORG_MIRROR is deprecated and will be removed in node-gyp v4, please use NODEJS_ORG_MIRROR
gyp WARN download NVM_NODEJS_ORG_MIRROR is deprecated and will be removed in node-gyp v4, please use NODEJS_ORG_MIRROR
gyp WARN download NVM_NODEJS_ORG_MIRROR is deprecated and will be removed in node-gyp v4, please use NODEJS_ORG_MIRROR
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/java.o
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/javaObject.o
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/javaScope.o
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/methodCallBaton.o
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/nodeJavaBridge.o
  CXX(target) Release/obj.target/nodejavabridge_bindings/src/utils.o
  SOLINK_MODULE(target) Release/nodejavabridge_bindings.node
clang: warning: libstdc++ is deprecated; move to libc++ with a minimum deployment target of OS X 10.9

> java@0.8.0 postinstall /Users/aaronbartell/s3/git/node_jdbc/node_modules/jdbc/node_modules/java
> node postInstall.js

jdbc@0.4.0 node_modules/jdbc
├── node-uuid@1.4.7
├── winston@2.3.1 (cycle@1.0.3, eyes@0.1.8, stack-trace@0.0.9, isstream@0.1.2, async@1.0.0, colors@1.0.3)
├── async@2.1.4
├── …
└── java@0.8.0 (nan@2.4.0, find-java-home@0.1.3, async@2.0.1, glob@7.1.1, …)

Success! At this point, we have the jdbc npm installed on the laptop. If you're on Windows, I'd highly recommend you pursue "Git for Windows" from GitHub. It gives you a Linux-like shell that lets you run commands like the ones above. There's a separate tutorial that walks through getting set up with Node.js on Windows 10.

Having used JDBC for many things in the past, I knew I'd need the jt400.jar file to use as the database driver. You can obtain jt400.jar from the JTOpen project. Specifically, download the jtopen_9_1.zip file from the project's downloads page and unzip it at a location of your choosing.

Next, run the following commands to get the jt400.jar file into your project.

$ cd node_jdbc

$ mkdir drivers

$ cp path/to/jtopen_9_1/lib/jt400.jar drivers

At this point, I believe I have everything I need, so I go looking for example code in the GitHub repo for the jdbc npm project. I search for "as400" in the repo and come up empty-handed. I searched for "as400" because prior experience told me that's what the URL connection string contains.
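
For reference, here are the two IBM i-specific strings that search was really after; both show up again in the full example below (the host name and schema are placeholders):

// JDBC connection string and driver class for DB2 for i (provided by jt400.jar)
var url = 'jdbc:as400://my.ibmi.host/MYSCHEMA';
var drivername = 'com.ibm.as400.access.AS400JDBCDriver';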

From there, I start piecing together code from other examples in the jdbc npm GitHub repo. I'll often look for MySQL examples and hack them to work for DB2 for i because they tend to be the closest match.

Below is the full working example.

----app.js----

//
// node app.js <server> <schema> <user> <password>
//
var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');
var asyncjs = require('async');

if (!jinst.isJvmCreated()) {
  jinst.addOption("-Xrs");
  jinst.setupClasspath(['./drivers/jt400.jar']);
}

var server = process.argv[2];
var schema = process.argv[3];
var user = process.argv[4];
var password = process.argv[5];

var config = {
  url: 'jdbc:as400://' + server + '/' + schema,
  drivername: 'com.ibm.as400.access.AS400JDBCDriver',
  minpoolsize: 10,
  maxpoolsize: 100,
  properties: {
    user: user,
    password: password
  }
};

var ibmi = new JDBC(config);

ibmi.initialize(function(err) {
  if (err) {
    console.log(err);
  }
});

ibmi.reserve(function(err, connObj) {
  // The connection returned from the pool is an object with two fields
  // {uuid: <uuid>, conn: <Connection>}
  if (connObj) {
    console.log("Using connection: " + connObj.uuid);
    // Grab the Connection for use.
    var conn = connObj.conn;

    // Adjust some connection options. See connection.js for a full set of
    // supported methods.
    asyncjs.series([
      function(callback) {
        conn.setAutoCommit(false, function(err) {
          if (err) {
            callback(err);
          } else {
            callback(null);
          }
        });
      },
      function(callback) {
        conn.setSchema(schema, function(err) {
          if (err) {
            callback(err);
          } else {
            callback(null);
          }
        });
      }
    ], function(err, results) {
      // Process result
    });

    // Query the database.
    asyncjs.series([
      function(callback) {
        // CREATE SQL.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.executeUpdate("CREATE TABLE blah "
                                  + "(id int, name varchar(10), date DATE, "
                                  + " time TIME, timestamp TIMESTAMP)",
                                  function(err, count) {
              if (err) {
                callback(err);
              } else {
                console.log('SUCCESS: create table');
                callback(null, count);
              }
            });
          }
        });
      },
      function(callback) {
        // Insert statement.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.executeUpdate("INSERT INTO blah "
                                  + "VALUES (1, 'Jason', CURRENT_DATE, "
                                  + "CURRENT_TIME, CURRENT_TIMESTAMP)",
                                  function(err, count) {
              if (err) {
                callback(err);
              } else {
                console.log('SUCCESS: insert into');
                callback(null, count);
              }
            });
          }
        });
      },
      function(callback) {
        // Update statement.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.executeUpdate("UPDATE blah "
                                  + "SET id = 2 "
                                  + "WHERE name = 'Jason'",
                                  function(err, count) {
              if (err) {
                callback(err);
              } else {
                console.log('SUCCESS: update');
                callback(null, count);
              }
            });
          }
        });
      },
      function(callback) {
        // Select statement example.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            // Adjust some statement options before use. See statement.js for
            // a full listing of supported options.
            statement.setFetchSize(100, function(err) {
              if (err) {
                callback(err);
              } else {
                statement.executeQuery("SELECT * FROM blah",
                                       function(err, resultset) {
                  if (err) {
                    callback(err);
                  } else {
                    console.log('SUCCESS: select');
                    resultset.toObjArray(function(err, results) {
                      if (results.length > 0) {
                        console.log("ID: " + results[0].ID);
                      }
                      callback(null, resultset);
                    });
                  }
                });
              }
            });
          }
        });
      },
      function(callback) {
        // Delete statement.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.executeUpdate("DELETE FROM blah WHERE id = 2", function(err, count) {
              if (err) {
                callback(err);
              } else {
                console.log("SUCCESS: DELETE FROM blah WHERE id = 2");
                callback(null, count);
              }
            });
          }
        });
      },
      function(callback) {
        // Drop the table to clean up.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.executeUpdate("DROP TABLE blah", function(err, count) {
              if (err) {
                callback(err);
              } else {
                console.log("SUCCESS: DROP TABLE blah");
                callback(null, count);
              }
            });
          }
        });
      }
    ], function(err, results) {
      console.log('results2:' + results);
      ibmi.release(connObj, function(err) {
        if (err) {
          console.log(err.message);
        }
      });
    });
  }
});

 

As you can see, there is a lot going on in this code. I won't go into things like how Node.js communicates with Java or how the async npm (required above as asyncjs) synchronizes the calling of SQL statements one after another, beyond the quick aside below. After that, I'll call out a few things that are unique to IBM i.
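
The aside: the async npm's series() runs callback-style tasks one at a time, in order, and stops at the first error, which is exactly what you want when each SQL statement depends on the one before it. A stripped-down illustration (nothing IBM i-specific here):

var asyncjs = require('async');

asyncjs.series([
  function(callback) {
    // This task finishes before the next one starts.
    callback(null, 'first');
  },
  function(callback) {
    callback(null, 'second');
  }
], function(err, results) {
  // err is set if any task reported an error; results is ['first', 'second'].
  console.log(err, results);
});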

The first thing to notice is the config variable's url and drivername property values. They are both IBM i-specific, and I usually need to Google for them to remember the makeup of the strings. The other thing I had to do to the example code was remove the trailing semicolons from the SQL statements so DB2 would not error out.
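
If you need more control over the connection, the Toolbox JDBC driver lets you append properties to the URL with semicolons. The property names below (naming, libraries, errors) are ones I recall from the jt400 documentation, so treat them as a starting point and verify against the Toolbox docs:

var config = {
  url: 'jdbc:as400://' + server + '/' + schema
     + ';naming=sql'            // qualify objects as schema.table rather than lib/file
     + ';libraries=' + schema   // library list for unqualified names
     + ';errors=full',          // return fuller DB2 error text
  drivername: 'com.ibm.as400.access.AS400JDBCDriver',
  minpoolsize: 10,
  maxpoolsize: 100,
  properties: {
    user: user,
    password: password
  }
};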

Now it's time to give the program a try, as shown below.

$ node app.js 192.168.0.10 SCHEMA USER PASSWORD

module.js:327
    throw err;
    ^

Error: Cannot find module 'async'
    at Function.Module._resolveFilename (module.js:325:15)
    at Function.Module._load (module.js:276:25)
    at Module.require (module.js:353:17)
    at require (internal/module.js:12:17)
    at Object.<anonymous> (/Users/aaronbartell/s3/git/sofn_node_jdbc/app.js:32:15)
    at Module._compile (module.js:409:26)
    at Object.Module._extensions..js (module.js:416:10)
    at Module.load (module.js:343:32)
    at Function.Module._load (module.js:300:12)
    at Function.Module.runMain (module.js:441:10)

Shoot! I forgot to install the async npm. Let's do that now.

$ npm install async

async@2.1.4 node_modules/async
└── …

OK, now to run the program again.

$ node app.js 192.168.0.10 SCHEMA USER PASSWORD

Using connection: 33dda6bd-xxxx-9999-xxxx-4d616777f555
SUCCESS: create table
SUCCESS: insert into
SUCCESS: update
SUCCESS: select
ID: 2
SUCCESS: DELETE FROM blah WHERE id = 2
SUCCESS: DROP TABLE blah
results2:0,1,1,[object Object],1,0

Rock. On. As you can see, it worked and we are now communicating from Node.js on Mac/Linux to DB2 for i using JDBC. #winning

The above code accomplished a variety of database interactions for the sake of proof. I should note that I spent the better part of 10 minutes trying to figure out why I couldn't connect, only to learn my profile was disabled. I convey this so you know the error reporting might not be what you expect.
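
Related to that, the example above only acts on connObj, so a failed reserve() can go unnoticed. If you hit connection trouble, logging the err parameter on both initialize() and reserve() (the same callbacks already used in app.js) surfaces problems like a disabled profile much faster:

ibmi.initialize(function(err) {
  if (err) {
    console.log('initialize failed: ' + err);
    return;
  }
  ibmi.reserve(function(err, connObj) {
    if (err || !connObj) {
      // A disabled profile or bad password shows up here.
      console.log('reserve failed: ' + err);
      return;
    }
    console.log('Using connection: ' + connObj.uuid);
    // ...run statements, then ibmi.release(connObj, ...) as before.
  });
});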

I hope it is obvious that you need Java installed on the machine where this is being run. On that note, IBM i has a good JVM, so Node.js on IBM i could use this same approach to communicate with other databases. I have not tried that. Maybe one of you will give it a try and add your results to the comments section below?
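
If someone does try it, the nice part is that only the driver jar, url, and drivername should need to change. For example, pointing the same jdbc npm at PostgreSQL might look roughly like the following. I have not run this; the jar name and connection details are assumptions, though the driver class and URL format are standard PostgreSQL JDBC values.

var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');

if (!jinst.isJvmCreated()) {
  jinst.addOption('-Xrs');
  // Assumes the PostgreSQL JDBC jar was copied into ./drivers
  jinst.setupClasspath(['./drivers/postgresql.jar']);
}

var pg = new JDBC({
  url: 'jdbc:postgresql://some.host:5432/somedb',
  drivername: 'org.postgresql.Driver',
  minpoolsize: 10,
  maxpoolsize: 100,
  properties: { user: 'dbuser', password: 'dbpassword' }
});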

If you have any questions or comments, then please comment below or reach out to me via email.

Aaron Bartell

Aaron Bartell is Director of IBM i Innovation for Krengel Technology, Inc. Aaron facilitates adoption of open-source technologies on IBM i through professional services, staff training, speaking engagements, and the authoring of best practices within industry publications and www.litmis.com. With a strong background in RPG application development, Aaron covers topics that enable IBM i shops to embrace today's leading technologies, including Ruby on Rails, Node.js, Git for RPG source change management, and RSpec for unit testing RPG. Aaron is a passionate advocate of vibrant technology communities and the corresponding benefits available for today's modern application developers. Connect with Aaron via email.

Aaron lives with his wife and five children in southern Minnesota. He enjoys the vast amounts of laughter that having a young family brings, along with camping and music. He believes there's no greater purpose than to give of our life and time to help others.
