Db2 for z/OS Requirements

Updated October 31, 2023

This section describes the requirements, access privileges, and other features of Fivetran Local Data Processing when using 'Db2 for z/OS' for replication.

For information about compatibility and supported versions of Db2 for z/OS with Local Data Processing platforms, see Supported Platforms.

For the Capabilities supported by Local Data Processing on Db2 for z/OS, see Capabilities for Db2 for z/OS.

For information about the supported data types and mapping of data types in source DBMS to the corresponding data types in target DBMS or file format, see Data Type Mapping.

For the character encodings supported by Local Data Processing on Db2 for z/OS, see the Character Encodings page.

Introduction

To capture from Db2 for z/OS, Local Data Processing must be installed on a separate machine (64-bit Linux on Intel, 64-bit Windows on Intel, or 64-bit AIX on PowerPC) from which it accesses the Db2 for z/OS machine. Additionally, the Local Data Processing stored procedures must be installed on the Db2 for z/OS machine to access the Db2 log files. For the steps to install the stored procedures on the Db2 for z/OS machine, see section Installing Capture Stored Procedures.

Local Data Processing requires 'Db2 Connect' for connecting to Db2 for z/OS.

[Diagram: Local Data Processing location architecture for Db2 for z/OS]

Prerequisites for Fivetran Local Data Processing Machine

Local Data Processing requires the IBM Data Server Driver for ODBC and CLI, a Db2 client, a Db2 server, or Db2 Connect to be installed on the machine from which Local Data Processing connects to Db2 for z/OS. The Db2 client must have an instance to store the data required for the remote connection.

To install the IBM Data Server Driver for ODBC and CLI, refer to the IBM documentation.
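If you use the IBM Data Server Driver for ODBC and CLI, you can also register the remote database as a data source so that it can later be referenced by a DSN alias (see the validation commands below). The following is a minimal sketch using the driver's db2cli writecfg command with hypothetical values (host zoshost.example.com, port 5021, database FTDB, DSN alias FTDSN); verify the exact options against the IBM documentation for your driver version:

db2cli writecfg add -database FTDB -host zoshost.example.com -port 5021
db2cli writecfg add -dsn FTDSN -database FTDB -host zoshost.example.com -port 5021

The first command registers the remote database in the driver configuration; the second associates the DSN alias FTDSN with it.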

To set up a Db2 client, a Db2 server, or Db2 Connect, use the following commands to catalog the TCP/IP node and the remote database:

db2 catalog tcpip node nodename remote hostname server portnumber
db2 catalog database databasename at node nodename
  • nodename is the local nickname for the remote machine that contains the database you want to catalog.
  • hostname is the name of the host/node where the target database resides.
  • portnumber is the port number on which the remote Db2 server listens for connections.
  • databasename is the name of the database you want to catalog.
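For example, assuming a hypothetical z/OS host zoshost.example.com listening on port 5021, a database FTDB, and a local node nickname zosnode, the catalog commands might look as follows; substitute the names and port for your environment:

db2 catalog tcpip node zosnode remote zoshost.example.com server 5021
db2 catalog database FTDB at node zosnode
db2 terminate

Running db2 terminate afterward ends the CLP back-end process so that the new catalog entries take effect.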

For more information about configuring a Db2 client, a Db2 server, or Db2 Connect, refer to the IBM documentation.

Verifying Connection to the Db2 Server

To test the connection to the Db2 server on the z/OS machine, use the following command:

  • When using the IBM Data Server Driver for ODBC and CLI:

    db2cli validate -database databasename:servername:portnumber -connect -user username -passwd password

    or

    db2cli validate -dsn dsnname -connect -user username -passwd password

  • When using a Db2 client, a Db2 server, or Db2 Connect:

    db2 connect to databasename user userid
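As an illustration, with hypothetical values (database FTDB, host zoshost.example.com, port 5021, DSN alias FTDSN) and a placeholder user hvruser, the checks might be run as follows; the db2 connect variant prompts for the password, or you can append 'using password' to supply it inline:

db2cli validate -database FTDB:zoshost.example.com:5021 -connect -user hvruser -passwd mypassword
db2cli validate -dsn FTDSN -connect -user hvruser -passwd mypassword
db2 connect to FTDB user hvruser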

Related Articles

  • Installing Capture Stored Procedures
  • Db2 for z/OS as Source
  • Db2 for z/OS as Target
  • Location Connection for Db2 for z/OS

