Appendix

This appendix lists the configuration parameters for Databridge Kafka Client. Refer to Appendix C: Client Configuration in the Databridge Client Administrator's Guide for a complete description of Client Configuration files.


Client Configuration Files

The Databridge Kafka Client uses binary configuration files. The configuration file is named dbridge.cfg and resides in the config subdirectory of the data source's working directory. You can create a text version of this file using the export command, which can be edited and converted into a binary file using the import command.
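
For example, a typical edit cycle run from the data source's working directory might look like the following sketch. The exact arguments accepted by the export and import commands are described in the Databridge Client Administrator's Guide; the sequence below is only an illustration.

dbutility export        (writes a text version of the binary config/dbridge.cfg)
                        (edit the exported text file with any text editor)
dbutility import        (converts the edited text file back into the binary dbridge.cfg)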


Command-Line Options

The following command-line options have no equivalent configuration parameter. A combined usage sketch follows the table.

| Option | dbutility Command | Description |
|---|---|---|
| ? | | Displays short help. |
| -d | All | Default tracing. |
| -f filename | All | Specifies the configuration file name. |
| -h | | Displays long help. |
| -k | reload | Makes the command preserve the stateinfo of data sets that have a ds_mode of 2 and have not been reorganized. |
| -m | All | Includes a 5-digit millisecond timer in all output messages. |
| -t mask | All | Tracing options. |
| -u | configure, define, redefine, and dropall | Unconditionally performs the requested command, overriding any warnings that would otherwise be displayed. |
| -v | All | Causes the client to log and/or print some additional messages. |
| -w | clone or process | Toggles the setting of the use_dbwait parameter. |
| -x | clone | Clones all active data sets except those specified on the command line. |
| -y | process | Instructs the client to reclone all data sets whose ds_mode has a value of 11 or 12. |
| -z | clone or process | Instructs the client to reclone all data sets whose ds_mode has a value of 11 or 12. |
| -B | display | Causes the display command to quit after displaying the DATASETS client control table records. |
| -D database | | Specifies the Oracle database name to connect to. |
| -F afn | process | Makes the client act as if a QUIT AFTER afn command had been executed. It applies to the process and clone commands only. The range of allowed values is 1-9999. |
| -K | process | Prevents the audit file removal WFL from being run on the mainframe after the Engine finishes processing an audit file. |
| -L | All | Forces the Client to start using a new log file. |
| -P password | All | Sets the password associated with the user ID for the Oracle database. The password is limited to 30 characters. |
| -R | redefine | Forces all data sets to be redefined. |
| -T | All | Forces the program to create a new trace file when tracing is enabled. |
| -U userid | All | Specifies the user ID for the Oracle database. The user ID must have the appropriate resource privileges for the designated relational database. |
| -X | define, redefine, clone, and process | Specifies the host password. |
| -Y reclone_all | process | Causes all active data sets to be recloned. |
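
As a hypothetical illustration only, several of these options can be combined on one dbutility command line. The configuration file name, the trace mask value, and the data source name BANKDB below are placeholders; see the Databridge Client Administrator's Guide for the full command syntax.

dbutility -f config/myconfig.ini -t 0x0801 process BANKDB

Here -f selects an alternate configuration file, -t enables tracing using a hexadecimal mask, and process is the dbutility command being run.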

Syntax

Follow these conventions in the configuration file:

  • For hexadecimal values, use the 0xnnnn format.

  • A semicolon (;), except within double-quoted strings, indicates that the remainder of the current line is a comment.

  • Enclose section headers in square brackets.

  • Section headers and parameter names are not case-sensitive.

  • Spaces and tabs between entries are ignored; however, spaces within double quoted values (for example, password values) are read.

  • If you are not using a parameter, either comment the parameter out or delete the corresponding line in the configuration file. Do not leave an uncommented parameter without a value after the equals sign (=); doing so results in a syntax error. (A brief fragment illustrating these conventions follows this list.)
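
A minimal configuration fragment illustrating these conventions might look like the following. The parameter names are taken from the sample file later in this appendix; the values are illustrative only.

[Log_File]                            ; section header enclosed in square brackets
file_name_prefix     = "db"           ; quoted string value
;max_file_size       = 0              ; commented out, so the default is used

[params]                              ; same section as [Params]; headers are not case-sensitive
default_user_columns = 0x00000000     ; hexadecimal value in 0xnnnn format
use_dbwait           = false          ; Boolean parameter (True or False)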

You can specify some of these parameters only in the Client configuration file. Other parameters have equivalent command-line options and environment variables. For a complete list of configuration file parameters, their equivalent command-line options, and their related Client command, see [params].

Sample Kafka Client Configuration File

You can view the configuration file by using the export command. Refer to the Databridge Client Administrator's Guide for details on the export command.

To use a parameter that is commented out, delete the semicolon (;) and, after the equals sign (=), enter a value appropriate for your site. Boolean parameters can be represented by True or False.

In the example below, some of the commented-out parameters have a value of -1. These are the DBEngine control file parameters that the Client can override (the commit frequency parameters and engine workers). A value of -1 indicates that the corresponding parameter in the Databridge Engine (or Server) control file will not be overridden by the Client. Do not uncomment these lines unless you supply an actual value; otherwise, the Client will issue an error.
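
For example, to have the Client override one of these Engine settings, remove the semicolon and replace -1 with a real value; the value shown below is purely illustrative.

;commit_time_inc        = -1          ; as exported: the Engine control file setting is not overridden
commit_time_inc         = 5           ; uncommented with a real value: the Client overrides the Engine setting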

;
; Databridge Client, Version 7.0 Kafka configuration file -- generated programmatically
;

[Signon]
;user                   = USERID
;password               = PASSWORD
;database               = DATABASE
;hostpasswd             = HOSTPASSWD

[Log_File]
file_name_prefix        = "db"
;max_file_size          = 0
logsw_on_size           = false
logsw_on_newday         = false
newfile_on_newday       = true
single_line_log_msgs    = false

[Trace_File]
file_name_prefix        = "trace"
;max_file_size          = 0

[Kafka]
;
;  Kafka client parameters
;
;default_topic          = ""
kafka_broker            = ""
;kafka_debug            = ""
ltrim_zeroes            = true
real_format             = scientific,11,6
rtrim_spaces            = true
security_protocol       = "PLAINTEXT"
;ssl_ca_location        = ""
;ssl_cert_location      = ""
;ssl_key_location       = ""
;ssl_key_passwd         = ""
;sasl_kerberos_srvc_name= ""
;sasl_kerberos_keytab   = ""
;sasl_kerberos_principal= ""
span_date_delim         = ""
span_date_format        = 21
span_date_scale         = 0
treat_real_as           = real
use_plus_sign           = false

[Params]
;
;  (1) define/redefine command parameters
;
allow_nulls             = true
automate_virtuals       = false
convert_ctrl_char       = false
default_user_columns    = 0x00000000
enable_dms_links        = false
;external_column[n]     = ["name"][,[sql_type][,[sql_length][,"default"]]]
extract_embedded        = false
flatten_all_occurs      = true
force_aa_value_only     = 0
miser_database          = false
read_null_records       = true
sec_tab_column_mask     = 0x00000000
split_varfmt_dataset    = false
strip_ds_prefixes       = false
suppress_new_columns    = false
suppress_new_datasets   = true
use_binary_aa           = false
use_column_prefixes     = false
use_date                = true
use_decimal_aa          = false
use_nullable_dates      = false
;
; (2) process/clone command parameters
;
alpha_error_cutoff      = 10
aux_stmts               = 100
;batch_job_period       = 00:00, 00:00
century_break           = 50
;commit_absn_inc        = -1
;commit_idle_database   = -1
;commit_longtrans       = -1
;commit_time_inc        = -1
;commit_txn_inc         = -1
;commit_update_inc      = -1
controlled_execution    = false
;convert_reversals      = -1
correct_bad_days        = 0
dbe_dflt_origin         = direct
defer_fixup_phase       = false
discard_data_errors     = false
display_bad_data        = false
enable_af_stats         = false
enable_doc_records      = false
;engine_workers         = -1
error_display_limits    = 10,100
inhibit_8_bit_data      = false
inhibit_console         = false
inhibit_ctrl_chars      = false
keep_undigits           = false
linc_century_base       = 1957
max_discards            = 0,100
max_retry_secs          = 20
max_srv_idle_time       = 0
max_wait_secs           = 3600,60
min_check_time          = 600
null_digit_value        = 9
numeric_date_format     = 23
set_blanks_to_null      = false
set_lincday0_to_null    = false
show_perf_stats         = true
show_statistics         = true
show_table_stats        = true
sql_exec_timeout        = 180,0
statistics_increment    = 100000,10000
stop_after_fixups       = false
stop_after_gc_reorg     = false
stop_after_given_afn    = false
stop_on_dbe_mode_chg    = false
track_vfds_nolinks      = true
use_ctrl_tab_sp         = true
use_dbwait              = false
use_latest_si           = false
;
; (3) Server options
;
;shutdown {until | for} hh:mm after stop
;stop {before | after} task "name"
;stop {before | after} time hh:mm[:ss]
;
; (5) miscellaneous command parameters
;
display_active_only     = true
;
; (6) user scripts
;
user_script_bu_dir      = ""
user_script_dir         = "scripts/"
;
; (7) external data translation parameters
;
use_ext_translation     = false
eatran_dll_name         = "DBEATRAN.SO"

[Scheduling]
;
; dbutility process command only
;
;daily                  = 08:00, 12:00, 17:00, 24:00
;sched_delay_secs       = 600
exit_on_error           = false
sched_minwait_secs      = 0
sched_retry_secs        = 60
;blackout_period        = 00:00, 02:00

[EbcdicToAscii]
; e1 = a1
; e2 = a2
;  ...
; en = an
;
[DBConfig]
default_date_fmt        = 21
global_type0_changes    = true

[Encryption]
ca_file                 = ""
ca_path                 = ""
certify_server_name     = false
enable_encryption       = false
tls_host_name           = ""

Configuring Databridge Client for Kafka Parameters

Certain Kafka-specific parameters in the dbridge.ini file must be configured before use; they are outlined below. All of the Kafka-related parameters are in the [Kafka] section of the dbridge.ini file. The sample dbridge.ini excerpt below shows how the Databridge Kafka Client might be configured to use Kerberos authentication.

[Kafka]
;
; Kafka client parameters
;
default_topic           = "TESTDB"
kafka_broker            = "kafkabuild.kafkalab.net:9093"
;kafka_debug            = ""
ltrim_zeroes            = true
real_format             = scientific,11,6
rtrim_spaces            = true
security_protocol       = "SASL_PLAINTEXT"
;ssl_ca_location        = ""
;ssl_cert_location      = ""
;ssl_key_location       = ""
;ssl_key_passwd         = ""
sasl_kerberos_srvc_name = "kafka"
sasl_kerberos_keytab    = "/etc/security/keytabs/dbclient.keytab"
sasl_kerberos_principal = "dbclient/oel.kafkalab.net"
span_date_delim         = ""
span_date_format        = 21
span_date_scale         = 0
treat_real_as           = real
use_plus_sign           = false

Refer to the [Kafka] descriptions below for more information on the Kafka Client parameters.
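
For comparison, a site using SSL rather than Kerberos might fill in the ssl_* parameters instead of the sasl_kerberos_* ones. The security_protocol value and the file paths below are assumptions for illustration; verify the supported values against the [Kafka] parameter descriptions.

[Kafka]
default_topic           = "TESTDB"
kafka_broker            = "kafkabuild.kafkalab.net:9093"
security_protocol       = "SSL"                              ; assumed value, not taken from the samples above
ssl_ca_location         = "/etc/kafka/certs/ca-cert.pem"     ; illustrative path
ssl_cert_location       = "/etc/kafka/certs/client-cert.pem" ; illustrative path
ssl_key_location        = "/etc/kafka/certs/client-key.pem"  ; illustrative path
ssl_key_passwd          = "client-key-password"              ; illustrative value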

Processing Order

Configuration file options override environment variables. Command-line options override both environment variables and configuration file options.

The parameter processing order is as follows; a short illustration of the resulting precedence appears after the list:

  1. The operating system login name (user ID) is used as the lowest-level default for the database user ID.

  2. Environment variables (DBUSERID, DBPASSWD, and DBHOSTPW).

  3. Command-line options -d (for full tracing), -v (for verbose messages), -t (for creating a Databridge Client trace file), -T (for forcing the client to start a new trace file), and -f (for specifying a configuration file other than the default dbridge.cfg). These options are processed in the order in which they appear on the command line.

  4. Parameters specified in the configuration file.

    You can specify the configuration file via the -f option. If you do not specify a configuration file name via the -f option, the Databridge Client tries to open the default configuration file (dbridge.cfg in the config subdirectory of the data source's working directory); if the file does not exist, the Databridge Client uses the default values for each configuration file parameter. The absence of a configuration file is not treated as an error when you run the command-line client; if you use the service or daemon, the absence of a configuration file named dbridge.cfg is treated as an error.

  5. All remaining command-line options.

    In the final pass, a command-line option with a configuration file equivalent overrides the configuration file entry.
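
As a hypothetical illustration of this precedence, suppose the database user ID is supplied in all three places (the names opsuser, batchuser, adminuser, and BANKDB are placeholders):

Environment variable:    DBUSERID=opsuser
Configuration file:      user = "batchuser"        (in the [Signon] section of dbridge.cfg)
Command line:            dbutility -U adminuser process BANKDB

The Client signs on as adminuser, because the -U command-line option overrides both the configuration file entry and the environment variable. Without -U it would use batchuser; with neither, it would fall back to opsuser, and finally to the operating system login name.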