Cut Your Migration Time from Oracle to DB2 UDB for iSeries

The free IBM DB2 Migration Toolkit for iSeries (MTK) can significantly reduce the effort required to migrate an Oracle database solution to the iSeries and thus shorten the time to market for your applications. The MTK provides a single development environment that supports migration from Oracle 8 to DB2 UDB EE/EEE V7 and V8 as well as DB2 UDB for iSeries V5R2. The toolkit, which runs on Windows NT or Windows 2000 and is available only in English, provides all the functions needed to graphically create, build, and deploy the migrated database objects on the target platform.

The MTK Basics

The MTK manages work by using projects, which store information about the source and target database server connections and maintain a set of files used for migration purposes. Generally, the toolkit first converts the source metadata into DDL scripts, which can later be deployed in the target DB2 UDB database. The core element of the utility is the SQL translator. The translator is used to convert Oracle SQL object definitions (for example, table, trigger, or stored procedure definitions) and SQL queries to equivalent constructs in DB2. The translator is invoked automatically by the toolkit when migrating the entire Oracle database.

The translator takes as input a sequence of scripts containing Oracle SQL statements and generates corresponding DB2 output scripts. It also generates metadata information for each source object definition as well as each corresponding DB2 object definition. The metadata summarizes important properties of the database objects (such as column names and types for a table) and is used by the toolkit to generate an overview of source and target objects. Usually, the migration is an iterative process and the scripts generated by the toolkit require some level of manual fine tuning. The refine process in the toolkit helps you identify the inconsistencies in the generated scripts so that they can be easily corrected.
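As a hedged illustration of what the translator does (the query below is hypothetical and not taken from the sample database scripts), an Oracle expression and its DB2 equivalent might look like this:

```sql
-- Oracle source:
--   SELECT Ename, NVL(Comm, 0), SYSDATE FROM Emp;
-- An equivalent DB2 UDB form the translator could produce:
SELECT Ename, COALESCE(Comm, 0), CURRENT TIMESTAMP FROM Emp
```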

The data itself can also be exported and loaded into DB2. On the iSeries, the MTK utilizes the native Copy From Import File (CPYFRMIMPF) command and its many options for deploying data. The command has been recently enhanced so that BLOB/CLOB data types are also supported.
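As a rough sketch of such a load (the library, table, and stream-file names are hypothetical, and the parameter names should be verified against the command help on your release), a fixed-format import driven by a field definition file might be invoked like this:

```
CPYFRMIMPF FROMSTMF('/QIBM/UserData/MTK/projects/testp/emp.txt')
           TOFILE(DB2USER/EMP) MBROPT(*REPLACE)
           DTAFMT(*FIXED)
           FLDDFNFILE('/QIBM/UserData/MTK/projects/testp/emp.fdf')
```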

The final step is to run the scripts, thereby deploying the database to DB2. Generally, you have several choices to consider during deployment. For example, the data can be loaded in DB2 during the same deployment as the metadata. You also have the option to deploy only the DB2 metadata.

Figure 1 illustrates the MTK's architecture.


Figure 1: Overview of the MTK's architecture

MTK Test Drive

In this section, I'll take you on a quick tour of the migration toolkit's most important features. The source database is Oracle's sample database, which resides on an Oracle 8 server. The sample database has been enhanced with several nontrivial database objects, such as sequences, triggers, stored procedures, and views.

As I mentioned, MTK manages its work by projects. When you open a new project, MTK prompts you for some initial data, such as project name, project path on the local workstation, source database, and DB2 target database. In this case, I selected DB2 UDB for iSeries V5R2 as the target. Figure 2 shows the settings for the test project.

Figure 2: Settings for a new project


Once I created the project testp, I could start the migration process. The main MTK dialog window has several tabs, each corresponding to a migration step. The first step is to acquire the source scripts that contain the Oracle PL/SQL statements to be migrated to DB2. This is accomplished on the Specify Source tab. The source scripts can be either extracted directly from an Oracle database or imported from an existing metadata source file. In this case, I extracted the sources directly from an Oracle database. You can use either native JDBC or ODBC to establish a connection with the Oracle source. The JDBC option requires that the classes111.zip file be added to the CLASSPATH environment variable on the local workstation; classes111.zip can usually be found in the [ORACLE_HOME]\jdbc\lib directory. The ODBC option requires that a data source name (DSN) for the source database be registered through the ODBC administrator. Additionally, I strongly recommend that you run statistics on the SYS.DEPENDENCY$, SYS.OBJ$, and SYS.USER$ tables before extracting. This can be done through Oracle's administration utilities, such as DBA Studio.
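For example, the statistics can be gathered directly with Oracle's ANALYZE statement (DBA Studio issues equivalent operations under the covers):

```sql
ANALYZE TABLE SYS.DEPENDENCY$ COMPUTE STATISTICS;
ANALYZE TABLE SYS.OBJ$ COMPUTE STATISTICS;
ANALYZE TABLE SYS.USER$ COMPUTE STATISTICS;
```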

Once the MTK successfully connects to the Oracle database, the Extract dialog box appears. The tree of all Oracle objects available for extraction is shown in the left panel under Available objects. I selected all objects found in the schema JAREK. The selected objects are then moved to the Objects to extract panel, as shown in Figure 3.

Figure 3: The Extract dialog box


When invoked, the Extract utility (see Figure 1) does its magic behind the scenes. It retrieves the metadata from Oracle's catalogs and generates the appropriate SQL statements for all selected objects. The statements are written into a flat file called testp.src, located in the project's directory on the local workstation (C:\code\MTK_AS400\projects\testp). This file is used by the translator to produce the DB2 metadata.

Next, I moved to the second tab, Convert. This dialog has several options that can be used to fine-tune the conversion process. For example, the Global Type Mapping function provides a list of default type mappings. Some of these mappings can be changed to better reflect the database design or to improve performance. Unfortunately, the current beta version of the toolkit doesn't allow you to change the default VARCHAR2 to VARCHAR mapping. I recommend, for performance reasons, that you remap VARCHAR2 to CHAR on iSeries. Luckily, the Refine step, discussed later in this section, allows you to do so. (Later in this article, I'll tell you where to find more details on the optimal Oracle-to-iSeries data mapping.)
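To make the remapping concrete, here is a hedged sketch (the column definition is hypothetical) contrasting the default conversion with the recommended one:

```sql
-- Oracle source column:     ENAME VARCHAR2(10)
-- Default MTK mapping:      ENAME VARCHAR(10)
-- Recommended for iSeries:  ENAME CHAR(10)
CREATE TABLE EMP_EXAMPLE (ENAME CHAR(10))
```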

The translator is invoked by selecting the Convert button on the Convert dialog. Once the conversion process is finished, the MTK automatically switches to the Refine tab. Up to this point, the migration is mostly a hands-off process. The Refine step is critical for the quality of the resulting DB2 code and requires a fairly high level of expertise in both the source and target database systems. Usually, refine-convert is an iterative process in which the source file is manually "tweaked" and then reconverted. This step is repeated until the conversion results are satisfactory or cannot be improved any further by modifying the source script.

I started the analysis of the conversion results by looking at the Messages tab on the Refine dialog. The messages generated by the translator are sorted by category: Input Error, Translator Information, Translator Limitation, and Translator Omission. The order of the message categories reflects their relative importance. The upper levels require analysis and possibly modification of the source metadata, while the lower levels are just informational and require little or no action. This is illustrated in Figure 4.

Figure 4: Working with translator messages


For example, the first message under the Input Error category indicates that Oracle's %ROWTYPE and %TYPE constructs are not supported by the translator as input parameters of a stored procedure or function. Note, however, that these constructs are supported in variables and cursors in the body of a procedure or function. As a workaround for this limitation, you can manually rewrite the source procedure so that %ROWTYPE is expanded into a list of variables matching the record fields. Here's the original version of the stored procedure, which uses the %ROWTYPE and %TYPE syntax:

CREATE OR REPLACE PROCEDURE Get_emp_rec (Emp_number IN Emp.Empno%TYPE,
Emp_ret OUT Emp%ROWTYPE) IS
BEGIN
SELECT Empno, Ename, Job, Mgr, Hiredate, Sal, Comm, Deptno
INTO Emp_ret
FROM Emp
WHERE Empno = Emp_number;
END;


And here's the modified version:

CREATE OR REPLACE PROCEDURE Get_emp_rec (Emp_number IN NUMBER, o_Empno OUT NUMBER,
o_Ename OUT VARCHAR2, o_Job OUT VARCHAR2, o_Mgr OUT NUMBER, o_Hiredate OUT DATE,
o_Sal OUT NUMBER, o_Comm OUT NUMBER, o_Deptno OUT NUMBER) IS
BEGIN
SELECT Empno, Ename, Job, Mgr, Hiredate, Sal, Comm, Deptno
  INTO o_Empno, o_Ename, o_Job, o_Mgr, o_Hiredate, o_Sal, o_Comm, o_Deptno
FROM Emp
WHERE Empno = Emp_number;
END;


Similarly, the message under Translator Omission indicates that the CREATE SEQUENCE statement is not translated. Currently, DB2 UDB for iSeries does not support sequence objects, and the translator recognizes that fact, so I removed the CREATE SEQUENCE from the source script altogether. The sequence is used in the Oracle application to draw consecutive numbers that are then used as employee numbers (column EMPNO in the EMP table). DB2 UDB for iSeries supports identity columns, which provide functionality similar to Oracle's sequence objects, so I used an identity column in place of the sequence in the DB2 version of the application. This change, however, requires modification of the DB2 target script and could be performed only after I had successfully reconverted the final version of the source script.
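For reference, the Oracle pattern being replaced looks roughly like this (the sequence name EMPNO_SEQ is hypothetical):

```sql
-- Oracle: create a sequence, then draw the next employee number from it
CREATE SEQUENCE EMPNO_SEQ START WITH 8000;
SELECT EMPNO_SEQ.NEXTVAL FROM DUAL;
```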

Then, I moved on to the Generate Data Transfer Scripts tab. Before performing this step, though, I needed to manually modify the DB2 script to accommodate the identity column. The DB2 script name is listed in the right panel under View Output File. This is shown in Figure 5.

Figure 5: The Generate Data Transfer Scripts dialog


As mentioned, the data type of the EMPNO column in the EMP table needs to be changed to identity, so I edited the target testp.db2 script. Selecting View Script File (see Figure 5) opens the file in the default text editor. Here's the altered table definition:

CREATE TABLE EMP(
    EMPNO INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY,
    ENAME CHAR(10),
    JOB CHAR(9),
    MGR INTEGER,
    HIREDATE TIMESTAMP,
    SAL DECIMAL(7,2),
    COMM DECIMAL(7,2),
    DEPTNO INTEGER)


Note that the ENAME and JOB column types have also been manually changed from VARCHAR to CHAR. Additionally, the HIRE_EMP stored procedure has been adjusted so that it uses the generated identity value in place of the sequence. Here's the relevant code snippet:

INSERT INTO EMP (ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO)
  VALUES(NAME, JOB1, ORA8.ROUND(MGR1), HIREDATE1, SAL1, COMM1, ORA8.ROUND(DEPTNO1));
SET NEW_EMPNO = (SELECT IDENTITY_VAL_LOCAL() FROM SYSIBM.SYSDUMMY1);


Since the EMPNO column is defined as an identity column, it is intentionally omitted from the INSERT statement; the value for this column is generated by DB2 at insert time. The IDENTITY_VAL_LOCAL() function then returns the most recently assigned value for the identity column.

Once the DB2 script was manually tuned, I continued on the Generate Data Transfer Scripts dialog. I selected Store data scripts on iSeries (see Figure 5), which instructs the toolkit to create the data transfer scripts on the target iSeries system. Note the location of the scripts in the iSeries IFS file system (/QIBM/UserData/MTK/projects/testp/DataOutScripts/). At this step, only the Field Definition Files (FDFs) are generated and stored on the target system. An FDF defines the format of the data import file and is required by the CPYFRMIMPF command for a fixed-format data load. It records where each column begins and ends and identifies whether the column is nullable. You can look up the contents of the FDF files on the target iSeries using iSeries Navigator.

The final step in the migration process is to deploy, using the Deploy to DB2 tab. There are several options to consider during the deployment phase. For example, you can deploy only the DB2 metadata, or you can load only the data if the metadata has already been deployed. I selected the options that combine the metadata deployment with the data transfer. See Figure 6.

Figure 6: The Deploy to DB2 dialog


The central panel of the Deploy to DB2 dialog contains a summary of the conversion process. Generally, messages in the Input Error or Translator Omission categories should not be ignored. In this case, however, I had already provided fixes for the conversion messages appearing in this panel by modifying the DB2 scripts, so I could continue with the deployment.

Note that the database schemas (collections) referred to in the deployment scripts (there are two such schemas in this example: DB2USER and PUBLIC) must exist before the deployment step is initiated.
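If a schema does not already exist, it can be created ahead of time on the iSeries. Here is a minimal sketch, assuming you are connected to the target system with sufficient authority:

```sql
CREATE COLLECTION DB2USER
```

The same applies to any other schema referenced by the scripts; the ORA8 schema, by contrast, is created automatically during deployment.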

Several tasks are performed by the toolkit during the deployment step:

  • The data is extracted from the Oracle source database and stored on the iSeries in the project's directory.
  • The source user-defined functions (UDFs) are created on the iSeries. These functions are provided to emulate Oracle functions that do not exist on DB2. The UDFs reside in the ORA8 schema that is created at this step.
  • The DB2 script testp.db2 is executed so that the DB2 metadata gets created on the iSeries.
  • The source data is loaded into the DB2 tables.
  • The integrity of the database is checked.
  • The deployment process is verified.


The MTK generates two report files: the Log and the Verify. The Verify report's content automatically appears in a browser window immediately after the deployment step has finished, as shown in Figure 7.

Figure 7: The Verify report


In this example, the MTK reported no deployment issues. All of the original Oracle objects were successfully migrated to DB2 UDB for iSeries, and the data was extracted from the source database and loaded into the DB2 tables. This concludes the migration process.

Prerequisites

As mentioned, the MTK supports OS/400 V5R2. Additionally, I strongly recommend that you install the latest version of the DB2 UDB for iSeries database FixPak (SF99502). Also make sure that the critical PTFs SI06748 and SI06675 are loaded on the target system.

Additional Information

The MTK comes with detailed help documentation. Review it before performing any migration work. In particular, carefully review the Summary of Features section, which contains a list of all the Oracle features currently supported by the toolkit.

The PartnerWorld for Developers database technology team has put together a series of database porting guides. Please refer to the following paper for an in-depth discussion on the Oracle to iSeries porting issues: DB2 UDB Universal Database for iSeries Porting Guide: Oracle to IBM eServer iSeries. Also, the ITSO offers two Redbooks that can be helpful to those who want to learn more about the programming techniques for SQL procedures, triggers, and functions on DB2: Developing Cross-Platform DB2 Stored Procedures (SG24-5485) and Stored Procedures and Triggers on DB2 UDB for iSeries (SG24-6503).

Feedback

Please feel free to send your questions concerning the beta version of the MTK for iSeries directly to the development team.

Jarek Miszczyk is a Senior Software Engineer with PartnerWorld for Developers, IBM Rochester.


