
Installing Databridge Client on UNIX

This section provides instructions for installing and upgrading Databridge Client, including the Client Console and daemon, on a UNIX system.


Install Databridge Client on UNIX

Use this procedure for first-time installations and upgrades on all UNIX, Linux, and related systems. In this procedure, you'll install the Client, create a Client working directory, and specify a user ID for the daemon.

If you are running the Oracle Client against a non-English database, the system detects this automatically by reading the Oracle NLS parameters, and adjusts the affected client parameters accordingly. Similarly, if the database uses the UTF-8 character set, the system detects this from the Oracle NLS parameters and adjusts the session parameters to accommodate UTF-8.

Important

Commands, filenames, and often passwords are case-sensitive on UNIX systems. Type the commands exactly as shown.

To install the Client on UNIX

  1. Log on using a userid that has root privileges.

  2. If a previous version of Databridge Client is running on your system, stop it.

  3. Create a new install directory for Databridge 7.0 (for example, "/opt/dbridge70").

  4. Move the appropriate tar files (Client, Client Console) to the UNIX system using binary FTP. See Description of Files on the Installation Image.

  5. Change the directory to the install directory you created in step 3 and extract the tar file:

    tar -xvf filename

    where filename is the full tar filename, including location.

    The extract program creates a client subdirectory named after the database (for example, Oracle). Files are extracted to this subdirectory and to the directory you specified.
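    Steps 3 through 5 can be tried end to end with stand-in paths. In this sketch the tar file is a dummy archive built on the spot, and the install directory lives under a temporary directory so no root privileges are needed; in a real install you would use /opt/dbridge70 and the tar file you transferred with binary FTP.

    ```shell
    # Build a stand-in tar file (in a real install this is the file you
    # transferred with binary FTP).
    SANDBOX=$(mktemp -d)
    mkdir -p "$SANDBOX/stage/Oracle"
    echo demo > "$SANDBOX/stage/Oracle/README"
    tar -cf "$SANDBOX/client.tar" -C "$SANDBOX/stage" Oracle

    # Step 3: create the install directory (stand-in for /opt/dbridge70).
    INSTALLDIR="$SANDBOX/opt/dbridge70"
    mkdir -p "$INSTALLDIR"

    # Step 5: change to the install directory and extract the tar file;
    # files land in a subdirectory named after the database (e.g. Oracle).
    cd "$INSTALLDIR"
    tar -xvf "$SANDBOX/client.tar"
    ```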

  6. Next, change the owner and group for the files in the install directory to the userid (and the group to which the userid belongs) designated to run Databridge Client (that is, the USERID specified in the file globalprofile.ini). To do this, type the following command:

    chown -R dbridge:users *

    where dbridge is the user ID and users is the corresponding group.

    Caution

    To prevent the files in the install directory from being accidentally deleted, we recommend that you leave them set to read-only (the default). Before you update the software, to prevent the extraction from failing, either remove all of the files from the install directory or make them writeable until the update is successfully installed.
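    Step 6 and the caution above can be sketched as follows. The dbridge:users pairing is the example from the text; the chown line is commented out because it requires root, and the chmod lines show the read-only/writable toggle recommended around updates.

    ```shell
    # Stand-in install directory so the sketch runs without root.
    SANDBOX=$(mktemp -d)
    INSTALLDIR="$SANDBOX/dbridge70"
    mkdir -p "$INSTALLDIR/Oracle"
    touch "$INSTALLDIR/Oracle/dbutility"
    cd "$INSTALLDIR"

    # chown -R dbridge:users *      # step 6; requires root privileges

    chmod -R a-w "$INSTALLDIR"      # keep the files read-only day to day
    chmod -R u+w "$INSTALLDIR"      # make them writable before an update
    ```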

  7. In the editor, open the sample daemon script file (dbdaemon) in the root of the install directory, update the necessary environment variables, and then save dbdaemon to the following location:

    • On Linux and Solaris: /etc/init.d
  8. Create the directory /etc/MicroFocus/Databridge/7.0

  9. In the editor, open the globalprofile.smp file in the root of the install directory and make the following changes:

    Warning

    This file is critical for client operations. If this file is missing or contains the wrong information, the client will not run.

    • For INSTALLDIR, specify the install directory you created in step 3.
    • For WORKINGDIR, specify the full name of the Client's working directory for 7.0. If you're upgrading from a version older than 6.2, make sure that the working directory for 7.0 is different from the older working directory. You create the actual directory later in this procedure.
    • For USERID, specify the user under which you'll run the daemon. When you start the daemon, you must log in as this user or as the root user. Clients can only be run under this userid.
    • To enable file security for the Client working directory, specify a value for umask. The bits specified get removed from the default file security bits. A umask of 027 (the default) indicates that the owner bits are unchanged, but the group's w and x bits are reset and all 3 bits are reset for other users. For a stronger mask, specify a value such as 077.
  10. Save the file as globalprofile.ini to the directory you created in step 8:

    /etc/MicroFocus/Databridge/7.0
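    Steps 8 through 10 can be sketched as follows. ETC stands in for /etc so the commands run without root, and the parameter lines are illustrative only; take the exact syntax from the supplied globalprofile.smp file.

    ```shell
    # Step 8: create the configuration directory (ETC stands in for /etc).
    ETC=$(mktemp -d)
    CFGDIR="$ETC/MicroFocus/Databridge/7.0"
    mkdir -p "$CFGDIR"

    # Steps 9-10: save the edited file as globalprofile.ini. The values
    # below are examples; copy the real syntax from globalprofile.smp.
    cat > "$CFGDIR/globalprofile.ini" <<'EOF'
    INSTALLDIR = /opt/dbridge70
    WORKINGDIR = /home/dbridge/client70
    USERID = dbridge
    umask = 027
    EOF
    ```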

  11. Do one of the following:

    • (New installations) In the home directory, use the editor to update your profile (for example, ".profile" or ".bash_profile") and include the environment variable ORACLE_HOME. Consult the Oracle database administrator to determine what this variable should be set to (typically, it's "/opt/oracle..."). You must also include the directory $ORACLE_HOME/bin in the PATH to ensure that the bulk loader sqlldr can be located. You may also want to include the environment variable INSTALLDIR, which points to the directory created in step 3. You can then add the Client program directory $INSTALLDIR/Oracle to the PATH, making it much easier to run the Client from the command line.

    • (Upgrades - Optional) Update your profile so that the PATH points to the newly-installed Client.
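    For a new installation, the profile additions in step 11 might look like the lines below. The file here is a temporary stand-in rather than your real ~/.profile, and the ORACLE_HOME path is an example; get the real value from your Oracle database administrator.

    ```shell
    PROFILE=$(mktemp)    # stand-in for ~/.profile or ~/.bash_profile

    cat >> "$PROFILE" <<'EOF'
    ORACLE_HOME=/opt/oracle/product/19c        # example; ask your DBA
    INSTALLDIR=/opt/dbridge70
    PATH=$PATH:$ORACLE_HOME/bin:$INSTALLDIR/Oracle
    export ORACLE_HOME INSTALLDIR PATH
    EOF
    ```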

  12. For Databridge Client to find the Oracle shared libraries, add the Oracle lib directory to the appropriate environment variable for your system, as shown in the following table. The ORACLE_HOME environment variable is used instead of typing the complete Oracle lib directory name.

    | To update | Add the following to your profile |
    | --- | --- |
    | All clients except on AIX 7.1 or newer | LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ORACLE_HOME/lib |
    | Clients on AIX 7.1 or newer | LIBPATH=$LIBPATH:$ORACLE_HOME/lib |

    Note

    The UNIX shell you use may require that you add a line that exports the environment variable. For instructions, consult your UNIX documentation.
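    The table above translates into profile lines like these; the export note applies here, and ORACLE_HOME is given an example default so the sketch is self-contained.

    ```shell
    ORACLE_HOME="${ORACLE_HOME:-/opt/oracle}"   # example default

    # All clients except on AIX 7.1 or newer:
    LD_LIBRARY_PATH="${LD_LIBRARY_PATH:-}:$ORACLE_HOME/lib"
    export LD_LIBRARY_PATH

    # Clients on AIX 7.1 or newer would use LIBPATH instead:
    # LIBPATH="${LIBPATH:-}:$ORACLE_HOME/lib"; export LIBPATH
    ```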

  13. Using the value you specified in step 9b, create a Client working directory for 7.0 that contains the three subdirectories config, locks and logs. (For upgrades, the Migrate program creates a new, secure working directory and subdirectories from this value. See Run the Migrate Program on UNIX.)
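    For a new installation, step 13 amounts to the following; the working directory path is a stand-in, and the umask value is the one set in globalprofile.ini.

    ```shell
    WORKINGDIR="$(mktemp -d)/client70"   # stand-in for the real path

    umask 027                            # match the umask in globalprofile.ini
    mkdir -p "$WORKINGDIR/config" "$WORKINGDIR/locks" "$WORKINGDIR/logs"
    ```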

  14. When installation is complete, verify that the Client directory files are correctly installed. For a list of installed files, see Files Installed with Databridge Client and Client Console.

  15. If you run the Client from a command line, you must also update the environment variable PATH in your profile to include the directory that contains Databridge Client. For example, if you installed the Client to the directory /opt/dbridge70, add ":/opt/dbridge70/Oracle" to the PATH:

    PATH=$PATH:$ORACLE_HOME/bin:/opt/dbridge70/Oracle

    where each specified directory is separated by a colon.

    Individual clients use the database names as the directory names (for example, Oracle).

    Caution

    Make sure that the mknod utility's directory (usually /usr/sbin) is included in the PATH. If the shell scripts used to clone data sets cannot find this utility, files will be used in place of named pipes, resulting in bulk loader problems such as broken pipes.
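    The mknod check in the caution above can be automated with a small guard in your profile, sketched here:

    ```shell
    # Append /usr/sbin to PATH if it is not already there, so the clone
    # scripts can find mknod and create named pipes.
    case ":$PATH:" in
        *:/usr/sbin:*) ;;                       # already on the PATH
        *) PATH="$PATH:/usr/sbin"; export PATH ;;
    esac

    # Report whether mknod is now reachable.
    command -v mknod || echo "mknod not found; add its directory to PATH"
    ```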

  16. Do one of the following:

    • If you're upgrading, proceed to Run the Migrate Program on UNIX to complete the installation.

    • If you're installing for the first time, install the Client Console to configure the Client parameters.

    • The auto_reclone parameter has been deprecated, as this feature did not work correctly with online garbage collection reorgs. The two available options are to either stop the client when a garbage collection is detected, or let the client deselect the affected data sets and reclone them later, when all the reorgs are completed. The -y option for the process command makes it easy to reclone these data sets without having to do anything else.

    • The max_wait_secs parameter now accepts two values; the second is optional and defaults to 0. When the second value is non-zero, it enables code that breaks up a single long wait-and-retry loop in the engine into smaller wait-and-retry loops. The client repeatedly issues DBWait remote procedure calls until the period of inactivity exceeds the first value. This allows you to break up a one-hour loop into 60 one-minute loops, which ensures that the line does not go idle for long periods of time.

    • The index suffixes for the Oracle client now can use the environment variable $(INDEX_NAME) which gets replaced by the actual index name when the suffix is applied. See the Databridge Client Administrator's Guide section on SQL suffixes in Appendix C: Client Configuration.

    • The client implements row filtering for secondary tables derived from items (or GROUPs) that have OCCURS clauses that are not flattened. This is done by using a text file that specifies the filtering criteria for such tables using a SQL-like syntax.

    • The client implements the flattening of unsigned NUMBER items and ALPHA items with OCCURS clauses to a CHAR or VARCHAR column instead of making each occurrence of the item into a separate column. This process is referred to as flattening the OCCURS to a string. You can have fixed-format strings and CSV-format strings, where the delimiter can be selected.

    • The DMSII time handling code now supports NUMBER(12) items that represent time in the format "HHMMSSmmmmmm", where the last 6 digits are the fractional part of the seconds.

  17. The SQL Server client can now handle table names that are TRANSACT-SQL reserved words by enclosing them in square brackets in all SQL statements it uses. To maintain backward compatibility, this feature is enabled by the configuration parameter bracket_tabnames.

  18. The -k option was added to force the Client to drop tables, rather than run cleanup scripts, when recloning data sets in multi-source environments. This is designed to be used after a DMSII reorganization that requires certain data sets to be recloned. Using the -k option for the first such clone makes the process a lot simpler, as manually dropping the tables to get them recreated with the new layouts is no longer required.


Installing Databridge Client Patches

Overwriting the old software is not always advisable; we recommend renaming the old install directory first and then creating a new install directory with the same name. For example, if you installed the base release of 7.0 to /opt/dbridge70, rename the directory dbridge70_save and create a new /opt/dbridge70 directory before installing the patch. The tar files for the patch contain all the files that were in the base release, so this procedure is perfectly safe. Upload the appropriate tar file for the Client from the hot fix, update, or service pack to a temporary directory. (The Client and Console should be installed in separate directories to facilitate maintenance.) If you use Windows to extract the tar file from the zip file, you must transfer the tar file to UNIX using binary FTP. Change the current directory to the install directory and use the following command to extract the files:

    tar -xvf filename

where filename is the full name of the tar file. This command replaces the files in the Databridge install directory with updated files from the tar file.
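The rename-and-recreate procedure described above can be sketched with stand-in paths (SANDBOX takes the place of /opt, and the final extraction is commented out because the patch tar filename is site-specific):

```shell
SANDBOX=$(mktemp -d)                  # stand-in for /opt
mkdir -p "$SANDBOX/dbridge70"         # pretend this is the base release

# Set the old install directory aside, then recreate it empty.
mv "$SANDBOX/dbridge70" "$SANDBOX/dbridge70_save"
mkdir "$SANDBOX/dbridge70"

# Extract the patch tar file into the fresh directory.
cd "$SANDBOX/dbridge70"
# tar -xvf /tmp/patch.tar             # hypothetical patch filename
```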

Note

To avoid accidentally deleting the Databridge applications, we recommend that you always keep the install directory and the working directory separate.


Run the Migrate Program on UNIX

Use this procedure to upgrade Databridge Client 6.1 and earlier on UNIX systems. If you are upgrading from version 6.3, 6.2, or 6.1 SP3 and are using the daemon, you do not have to run migrate or dbfixup; you can use the existing working directory and configuration files.

To run the Migrate program

  1. Install Databridge Client 7.0. See Install Databridge Client on UNIX.

  2. Make sure that the file globalprofile.ini includes values for INSTALLDIR, USERID, WORKINGDIR. If you want to enable file security for the working directory, make sure that the umask value is set. (For instructions, see Install Databridge Client on UNIX). The Client runs under the specified USERID in the globalprofile.ini.

  3. Open a command session and type migrate. If the Client directory wasn't added to the PATH during installation, you must type the full path to migrate.

    The Migrate program first creates a new global working directory and then creates a working directory for each migrated data source. Next, it moves the existing user scripts and configuration files to this directory after updating the configuration file parameters. It also creates the daemon configuration file which contains any scheduling information defined for the individual data sources.

  4. The program asks if you are upgrading from version 6.0 or 6.1 and use the daemon. If the answer is yes, enter the full filename of the global working directory. Make sure that you enter a different name for the 7.0 working directory. After a new working directory structure is created for the service, data sources from the earlier version will be migrated to subdirectories in this working directory. Skip the remaining steps.

  5. When prompted, enter the name of each data source. Next, enter the location of the working directory for this data source either as an absolute path or a name relative to the current directory.

    The Migrate program creates a working directory for each data source. Existing settings are used to create a new binary configuration file in the config subdirectory and user scripts are moved to the scripts subdirectory. The Migrate program starts the dbfixup program, which upgrades the client control tables and populates the dbscripts directory for each data source. Lastly, the Migrate program applies file security to the working directory, data source directories, and all Client files.

    If the read_null_records parameter is set to True, the Migrate program copies the Null Record files for the data sources to the config subdirectories. A new daemon configuration file is created (dbcontrol.cfg), which includes updated data sources and any existing scheduling parameters. The Migrate program runs a dbutility generate command for each data source to populate the dbscripts directories.

  6. Repeat step 5 for each data source. When you're done, enter an empty line (that is, <CR>).

  7. To run the Client using the daemon, install the Client Console, preferably on a different machine than the relational database, to avoid using valuable system resources.

You can run the command-line program (dbutility) or the daemon after you create a script for the daemon that includes the updated ORACLE_HOME, INSTALLDIR, and WORKING_DIR parameters.
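The daemon script mentioned above needs those three variables exported; a minimal sketch with example values (the real ones come from your installation):

```shell
ORACLE_HOME=/opt/oracle/product/19c   # example; ask your DBA
INSTALLDIR=/opt/dbridge70
WORKING_DIR=/home/dbridge/client70    # example working directory
export ORACLE_HOME INSTALLDIR WORKING_DIR
```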

Note

If you need to re-run the migrate program at any point, first delete the working directory that was created by the migrate program.