Transforming Data in an HA Environment

High Availability / Disaster Recovery

When data volumes were small, it was possible to simply query the data in the production systems. But as volumes grew, previously fast, efficient queries turned into monsters that consumed inordinate system resources, not just because of the quantity of data, but also because the data was structured for operational processing rather than for query and analysis. As data grew and people needed it in a format that afforded easier access, data warehouses were developed. Nightly processes extracted data from production servers, transformed it into the needed format, and then loaded it into the data warehouse, using either in-house developed code or a class of software known as Extract, Transform, and Load (ETL) tools.

Data volumes and schema optimization were not the only problems that organizations faced as their business intelligence requirements matured. Information silos sprang up because of the use of independent, best-of-breed applications that ran on incompatible platforms, or because corporate mergers created conglomerations of diverse hardware and software in the merged enterprise, or for a host of other reasons. Consequently, those ETL programs had to be capable of extracting data from a variety of platforms, transforming it, and loading it into a single data warehouse or operational data store. (There are some differences between the two, but, for brevity, the remainder of this article will use "data warehouse" to refer to an operational data store as well.)

As the volume and complexity of the data swelled, and as demands accelerated for the precise bits of information required to make more informed business decisions, the ETL processes became increasingly intricate, greatly increasing the time required to extract data from the production systems. This placed a heavy load on the systems intended to run the business and therefore threatened the organization's ability to support the 24x7 high-performance applications it needed.

What's more, the nightly extracts resulted in data warehouses that contained day-old data that was inadequate for certain types of decision-making. Hence, business executives and managers began to demand real-time, or what some industry analysts have called "right-time," data for their business intelligence purposes.

To address this requirement, a new generation of data sharing software, which provides functionality that goes by the name of "data replication," entered the market. This class of software replicates in real time just the data needed, and only the incremental changes to that data, from the production server to the data warehouse. Data replication thus virtually eliminates the need for high-volume data extracts, and transferring only incremental changes also allows for much faster transformation and load times. In this way, the technology of real-time Change Data Capture (CDC) has solved the problem of data that was too voluminous and not available in an easily accessible format for timely business analysis.
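
To make this concrete, here is a minimal, hypothetical sketch of the "apply" side of change-based replication, written in Python with the built-in sqlite3 module standing in for the data warehouse. Instead of re-extracting whole tables, the target consumes a stream of captured change records; the table, columns, and change-record layout are illustrative only, not any product's format.

```python
# Minimal sketch: applying an incremental change stream to a warehouse copy.
# All names and the change-record layout are invented for illustration.
import sqlite3

warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

# Changes captured on the production system since the last apply cycle.
change_stream = [
    ("INSERT", 1, ("Acme Ltd", "Chicago")),
    ("INSERT", 2, ("Globex", "Berlin")),
    ("UPDATE", 1, ("Acme Ltd", "Dallas")),
    ("DELETE", 2, None),
]

for op, key, values in change_stream:
    if op == "INSERT":
        warehouse.execute("INSERT INTO customers VALUES (?, ?, ?)", (key, *values))
    elif op == "UPDATE":
        warehouse.execute(
            "UPDATE customers SET name = ?, city = ? WHERE id = ?", (*values, key))
    elif op == "DELETE":
        warehouse.execute("DELETE FROM customers WHERE id = ?", (key,))
warehouse.commit()

print(warehouse.execute("SELECT * FROM customers").fetchall())
# [(1, 'Acme Ltd', 'Dallas')]
```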

Most DBMS products include data replication features, but they typically support replication only to another database managed by the same brand of DBMS. This does nothing to fulfill the need to share data among, or merge it from, unlike platforms. For that, custom-built or, more likely, third-party replication software is required.

Change-based replication functionality, which is also an integral part of high availability (HA) software, provides an added benefit beyond the loading of data warehouses. Because it creates a replica, usually on another server, backups can be taken from the replica rather than the production servers, thereby removing the impact that backups have on production applications. This provides a true 24x7 environment for production servers, while also reducing the time needed to populate business intelligence systems.

Functionality Requirements

When looking for a solution to provide real-time replication, obvious considerations come into play. These include the four Cs: cost, complexity, compatibility, and company. For the solution to be a good fit in your environment and provide the greatest value, all four of these issues need to be addressed; they are discussed briefly below. Before evaluating those factors, however, you must first ensure that the product itself meets a core set of criteria for managing your mission-critical data. The following features are a benchmark for a quality replication solution:

Data Capture

At a minimum, every replication solution, regardless of whether it performs bulk extracts or change-based replication, must be capable of capturing the required data on the source system. In the past, and possibly still today, the most common data capture approach was to have the local database administrator hand-code SQL routines that extract, at predefined intervals, the data required by business users. The extracted data was then sent across to the target system, where it could be used for business intelligence purposes.
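
A rough, hypothetical sketch of that hand-coded approach follows, using Python's built-in sqlite3 module in place of a production database. At each scheduled interval, a SQL routine pulls the subset of data the business users asked for and stages it in a flat file that a transfer job would ship to the target system; the table, columns, and file name are invented.

```python
# Hypothetical hand-coded extract: pull only the data business users need
# and stage it in a flat file for transfer to the target system.
import csv
import sqlite3

production = sqlite3.connect(":memory:")
production.executescript("""
CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL, sold_on TEXT);
INSERT INTO sales VALUES
  (1, 'EMEA', 120.0, '2024-01-01'),
  (2, 'APAC', 300.0, '2024-01-02'),
  (3, 'EMEA',  90.0, '2024-01-03');
""")

# Extract: only the columns and summary the business users requested.
rows = production.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()

# Stage the extract for transfer; loading happens on the target system.
with open("daily_sales_extract.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["region", "total_amount"])
    writer.writerows(rows)
```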

As the familiarity with and demand for business intelligence grows, more data is needed and more people want it, causing the extract and transmit processes to expand until they become unwieldy. When this happens, change-based replication using database triggers, journaling, or some other change capture method that is tied closely to the database becomes necessary.
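
For illustration only, here is one minimal way trigger-based capture can work, again sketched with sqlite3: a database trigger records each change in a shadow change table that the replication process later reads and drains. The names are invented, and, as noted above, journaling or another log-based method may be used instead of triggers.

```python
# Trigger-based change capture, sketched with invented table names: an
# AFTER UPDATE trigger writes each change into a shadow change table.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE items (id INTEGER PRIMARY KEY, price REAL);
CREATE TABLE items_changes (item_id INTEGER, new_price REAL, op TEXT);

CREATE TRIGGER items_after_update AFTER UPDATE ON items
BEGIN
    INSERT INTO items_changes VALUES (NEW.id, NEW.price, 'UPDATE');
END;
""")

db.execute("INSERT INTO items VALUES (1, 9.99)")
db.execute("UPDATE items SET price = 12.49 WHERE id = 1")

# The replication process reads (and would then clear) the captured changes.
print(db.execute("SELECT * FROM items_changes").fetchall())
# [(1, 12.49, 'UPDATE')]
```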

Graphical Interface

The simplicity of using a GUI to create the data mappings between the source and target databases and to define the data flow requests significantly lowers the costs of the initial setup, maintenance, and learning curve compared to the hard-coding of separate programs to fulfill each specific data requirement.

At a minimum, the interface should include the following capabilities:

  • Allow administrators to make changes to the replication environment while replication is active.
  • Allow administrators to start and stop the replication processes.
  • Provide version control.
  • Provide a scripting facility that allows custom-developed transformation routines to be applied to the data as it is copied from the source to the target, in order to meet unique enterprise information requirements (a minimal sketch of such a routine follows this list).
  • Be password protected, with one level of access for developers and another for operational staff, to satisfy security and regulatory requirements.
  • Most of all, be intuitive about how to create the relationship between the source table structures and the desired target schema.
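
The sketch below shows the general shape of the scripting hook mentioned above: a user-supplied transformation routine is applied to each row as it moves from source to target. The function names, row layout, and transformation rules are hypothetical, not any vendor's scripting API.

```python
# Hypothetical scripting hook: a custom routine transforms each row in flight.
def transform_row(row):
    """Custom routine: standardize the region code and convert cents to dollars."""
    row["region"] = row["region"].upper()
    row["amount"] = row["amount_cents"] / 100.0
    del row["amount_cents"]
    return row

def replicate(rows, transform, load):
    # The replication engine calls the user hook on every row it copies.
    for row in rows:
        load(transform(dict(row)))

source_rows = [{"region": "emea", "amount_cents": 1999},
               {"region": "apac", "amount_cents": 525}]
target = []
replicate(source_rows, transform_row, target.append)
print(target)
# [{'region': 'EMEA', 'amount': 19.99}, {'region': 'APAC', 'amount': 5.25}]
```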

Enterprise Support

Before choosing a solution, be sure that it supports the hardware, operating systems, and databases you currently run, as well as those that figure into your future plans. Because the future usually cannot be accurately foreseen, it is best to be sure the software supports all of the major database vendors. Also make sure the vendor you are considering is stable and invests in the continued development that will be necessary to support future technologies.

Data Transformation

The needs of business intelligence are often best served by data that is not in the same format as in the operational databases. For example, in global companies, the same data items may be stored differently in different local databases, such as American versus metric measurements and product names that are in different languages in different countries. All of this data must be homogenized as it is combined into an integrated data warehouse. Therefore, the product must accommodate all such data transformations and any others that will be needed to support business intelligence requirements.
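
As a hedged illustration of that homogenization, the snippet below converts weights recorded in pounds to kilograms and maps local-language product names to a single warehouse name. The conversion factor is standard; the name mappings and record layout are invented.

```python
# Illustrative homogenization: unify units of measure and product names
# before the record is loaded into the integrated data warehouse.
POUNDS_TO_KG = 0.45359237
NAME_MAP = {"Schraubenzieher": "Screwdriver", "Destornillador": "Screwdriver"}

def homogenize(record):
    if record["weight_unit"] == "lb":
        record["weight"] = round(record["weight"] * POUNDS_TO_KG, 3)
        record["weight_unit"] = "kg"
    record["product_name"] = NAME_MAP.get(record["product_name"], record["product_name"])
    return record

print(homogenize({"product_name": "Schraubenzieher", "weight": 2.0, "weight_unit": "lb"}))
# {'product_name': 'Screwdriver', 'weight': 0.907, 'weight_unit': 'kg'}
```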

Schema Flexibility

The nature of the business intelligence process is vastly different from other business applications. Operational transactions usually access one row or just a few rows of data in a limited number of tables. Business intelligence and reporting applications, on the other hand, typically need to access all or a large portion of the rows in many joined tables. In addition, online operational applications are typically update-intensive, whereas business intelligence applications usually perform read-only tasks.

Because of these differences, when business intelligence applications run against databases designed for operational purposes, the results can be disastrous. Response times and the strain on system resources are often grossly unacceptable because the operational database is not designed for this type of functionality.

The replication solution must, therefore, support data movement between operational and data warehouse databases that use different schemas. You might, for example, intentionally denormalize the data warehouse in order to eliminate the need for the business intelligence applications to perform frequent, costly table joins.
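
The sketch below, again using in-memory sqlite3 databases as stand-ins, shows the idea: normalized operational tables are joined once during replication and loaded into a single denormalized warehouse table, so reporting queries no longer pay for the join. All schemas are illustrative.

```python
# Illustrative schema mapping: normalized source tables feed one
# denormalized warehouse table.
import sqlite3

prod = sqlite3.connect(":memory:")
prod.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme Ltd');
INSERT INTO orders VALUES (10, 1, 500.0), (11, 1, 75.0);
""")

dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE order_facts (order_id INTEGER, customer_name TEXT, amount REAL)")

# The join cost is paid once, during replication, not on every report.
rows = prod.execute("""
    SELECT o.id, c.name, o.amount
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchall()
dw.executemany("INSERT INTO order_facts VALUES (?, ?, ?)", rows)
dw.commit()

print(dw.execute("SELECT * FROM order_facts").fetchall())
# [(10, 'Acme Ltd', 500.0), (11, 'Acme Ltd', 75.0)]
```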

Heterogeneous Database Support

Just as the database format and schema might vary between the operational databases and the data warehouse, so might the database engine itself. The best database engine for data warehousing is not always the best engine for operational processing. What's more, your operational systems might not all run on the same database, so the replication solution must be able to transfer data between differing DBMSs.
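
As a hedged sketch of what such cross-DBMS movement can look like in code, the loop below assumes two Python drivers that follow the standard DB-API: ibm_db_dbi for a Db2 for i source and psycopg2 for a PostgreSQL warehouse. The connection details, table, and columns are placeholders, and a real replication product would hide this plumbing entirely.

```python
# Assumed drivers and connection details; not runnable without real servers.
import ibm_db_dbi   # assumed driver for the Db2 for i operational database
import psycopg2     # assumed driver for the PostgreSQL data warehouse

src = ibm_db_dbi.connect("DATABASE=PRODDB;HOSTNAME=prodhost;UID=user;PWD=secret;")
dst = psycopg2.connect(host="dwhost", dbname="warehouse", user="user", password="secret")

src_cur, dst_cur = src.cursor(), dst.cursor()
src_cur.execute("SELECT id, region, amount FROM sales")

while True:
    batch = src_cur.fetchmany(500)   # copy in manageable batches
    if not batch:
        break
    dst_cur.executemany(
        "INSERT INTO sales (id, region, amount) VALUES (%s, %s, %s)", batch)

dst.commit()
src.close()
dst.close()
```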

Automation

Finally, the importance of business intelligence databases has become too great to rely solely on manually executed data extraction processes. Doing so imposes an unacceptable burden on the IT department. Instead, the solution must be capable of running without manual intervention, whether that is the periodic running of extract and load operations or the continuous replication of changes.

Cost

Cost is a factor in every software acquisition. It may be difficult to assess the value of the features and functions of the various vendors' offerings, but at least the price tags are usually clear. That's not true with in-house developed code. The preliminary cost analysis of this option compared to the price tags of vendors' offerings may make the idea of in-house development seductive, but the analysis is often deceptive.

What matters most is not the sticker price, but rather the total cost of ownership. In-house development incurs a number of costs that are not incurred when buying a vendor's offering:

  • The necessary data extraction skills must be acquired through either training or hiring. Employee attrition turns these into recurring costs.
  • Business requirements analysis is required whether buying or building a solution, but the analysis, design, development, testing, debugging, and documentation of the programs that will fulfill those requirements are necessary only when developing software in-house.
  • The programs must be supported on an ongoing basis.
  • Business benefits are deferred until the software is developed and put into production. Consequently, the organization incurs opportunity costs because the investment begins to generate value much later than it would have if pre-built software had been used.

When evaluated in full, including the lifetime cost of maintaining and supporting the replication software, in-house development typically turns out to be much more costly than a purchased turnkey solution.

Complexity

Part of the high cost of in-house development derives from the inevitable complexity of the solution. The software has to be capable of transforming data and possibly transferring it between different database schemas, including differing table and column names and even between differently structured tables. Thus, when evaluating software, consider how easily—preferably through an intuitive, point-and-click GUI—it accommodates this complexity.

Compatibility

Most organizations support multiple DBMSs running on a variety of hardware platforms and operating systems for their operational environment. All of these diverse platforms must feed the data warehouse, which may run on yet another platform. What's more, the technologies will undoubtedly continue to evolve. The extensiveness of the software's platform support and the ease with which it can incorporate new platforms into its runtime environment are therefore important considerations when evaluating replication software.

Company

You will be betting your company's long-term success on the infrastructure supporting your business intelligence systems. Thus, don't look only at the features and functions of the various software offerings in your consideration set. Consider also the companies behind them. Because the solution will be providing the timely data needed to make critical business decisions, make sure that the vendor provides support 24 hours a day, seven days a week.

You also need to be confident that the vendor you choose will be there in the future to provide support, upgrade its software, and accommodate any important new technologies that come along. There are no foolproof tests of permanence, but the vendor's track record and its financial stability are good indicators of its ability to continue to develop, maintain, and support its products.
