Debugging Your Models

The Design Tool provides several ways to test and debug host application models. Use the options on the Debug menu to confirm that your model is operating as you expect and is ready to deploy. The log monitor is a troubleshooting utility that you can run outside of the Design Tool.

Before You Deploy Your Model

Here are some additional ideas for validating that your model will work as you expect and will be easy to maintain.

Use offline mode to move to each entity you have created, and use the Operations tab to review each operation.

  • Confirm that your naming conventions are consistent with the operation performed. Are your constant values correctly spelled and specified? Review the Modeling Tips.
  • Remove checkOperationConditions commands unless there is an actual condition to check. Removing any extra checks of this sort will improve model performance.
  • Make sure all constant values use TransmitToAttr (as opposed to DefaultValue) when an attribute is available.
  • Delete any unneeded entities, attributes, and operations. You may have extraneous commands that were inadvertently included when you were in Record mode.
  • Some preference settings that apply when a model is tested in the Design Tool may differ after the model is deployed on the Host Integrator Server. In particular, check the setting for Apply default value to server.

Validator

The Host Integrator Validator evaluates your host application model for errors. You must resolve all errors reported by the Validator before you can deploy the model to a Host Integrator Server. The Validator evaluates the following aspects of a host application model:

Variables

All variables must have a unique name and be complete.

Tables

Model tables are tested for the following:

  • All table names are unique.
  • The minimum value of all integer table columns is less than the maximum value.
  • The minimum length of all text table columns is less than the maximum length.
  • No type mismatches exist between filter parameters and the output parameters they are mapped to. For example, a mismatch occurs if a filter parameter is a string type and the output parameter it is mapped to is an integer.
  • All filter, data, and output parameters are used in the procedure.
  • All branch entities in a procedure are either the primary or alternate destination of the selected operation.
  • A route exists between all entities in a procedure.
  • All attributes marked as required for an operation are mapped to input parameters in a procedure.
  • All attributes mapped to input parameters are writable.
  • All attributes mapped to output parameters are readable.

Entities

All entities must be unique, and there must be a valid operation that is able to navigate to every entity in the model.

Note

A warning appears if navigation does not exist between an entity and its home entity.

Event Handlers

All event handlers must be up to date with the associated libraries. Click Rebuild on the Events menu to rebuild the event handler libraries.

Validating a model

  1. Open the model you want to validate in the Design Tool and from the Debug menu, select Validate. All imported models are automatically validated unless you clear that option on the Import tab of the Preferences dialog box.

  2. In Validator, select the model elements you want to evaluate. You can evaluate Variables, Tables, Entities, and Event Handlers.

  3. Click Validate. When complete, a validation report appears in the results box.

Validation Report

The validation report details any problems found in your host application model that would prevent the Host Integrator from accepting it as deployable. The report lists any problems found with model variables, entities, tables, procedures, or paths. Before you can deploy your model to a Host Integrator Server, the Validator must report no problems.

Interpreting Validation Reports

If no errors are found in your host application model, the Validation Report is empty. If errors are found, the following symbols appear in the validation report:

Error. An error that prevents the Design Tool from validating this model was detected.

Caution. Alerts you to a possible error in your model. The Design Tool will validate a model that has cautions, but no errors.

The report is organized as a tree: variables, entities, tables, and paths appear at the top level, with any sub-categories of these model elements below them, each describing the problem detected.

Signature Analyzer

Use the Signature Analyzer to verify that all entities are uniquely identified. Select Signature Analyzer from the Debug menu to open the Signature Analyzer dialog box. The options available on the Pattern and Validation tabs of this dialog box can be used to compare and validate the signatures of any two entities in your host application model. The Design Tool performs the evaluation by comparing the patterns that have been defined for any two entities.

Pattern Tab

In the left portion of the dialog box, select the two entities to compare: click the down arrow next to the Entity and Compare to lists, and then click the entities you want to compare. Miniature versions of the selected entities help you visualize the comparison. At the top center, two icons represent the primary entity and the entity you're comparing it to. If the two entities match, both icons are green; if the compared entity does not match the primary, its icon appears yellow.

  • Save Log button

    To create a log file of the selected entity compared to the current screen, click the Save Log button. By default, the Save As dialog box appears and prompts you to save your model log file as <model name>_siglog.xhtml in your \<VHI install directory>\models\<model name> folder.

  • Compare All button

    To compare all defined entities to the current screen, click the Compare All button. A log file is written that describes whether any of the defined entities match the current screen, and whether more than one entity is defined for the current screen. By default, the Save As dialog box appears and prompts you to save this model log file as <model name>_siglogall.xhtml in your \<VHI install directory>\models\<model name> folder.

The Signature Analyzer displays the following information about the entities you are comparing:

  • Patterns from

    The Patterns from box lists all the patterns defined for the primary entity. For each pattern, the Design Tool tells you whether it detects a match between any patterns in the primary entity and the one you are comparing it to. As you click the patterns in the list, notice that information about the selected pattern is displayed.

  • Position

    Specifies whether the pattern is in a specified region of the screen or relative to the cursor, and lists the location and size of the pattern (row, column, height, and width).

  • Properties

    Field type tells you whether the field is protected or unprotected. If you incorporated the text color into the pattern definition, the color appears next to Text color.

    The Pattern evaluation box specifies whether the selected pattern is defined as present or not present. If the pattern evaluation is defined as "Screen properties not present", then the screen signature is not considered to be complete unless this pattern is absent. The purpose of this setting is to allow you to specify the absence of a particular pattern as an indicator that an entity is unique.

    The Text box displays the text that forms this pattern, and specifies whether it is case sensitive or defined as <blank>, <Any text>, <Any number> or <User specified>.

The Design Tool is unable to recognize patterns under the following conditions:

  • A pattern containing a decimal point with a text property type defined as <Any number>.
  • A pattern containing blank spaces with a text property type defined as <Any text>.

Validation Tab

To review or modify Validation tab settings, select Signature Analyzer from the Debug menu, and click the Validation tab on the Signature Analyzer dialog box. Use the Validation tab to check that any validation patterns, conditions, or cursor positioning wait options defined as part of the entity signature are being validated. These options will only appear if they have been defined on the Validation tab of the Advanced Entity Properties dialog box.

Navigator

Select Navigator on the Debug menu to open the Navigator. The Navigator dialog box displays a graphical representation of the entities and operations in your model. Use the Navigator to review the structure of your host application and to test traversal operations, ensuring that a valid traversal operation can reach every entity in your model. You can use the Navigator either while you are connected to the host or while running in offline mode.

Note

Select the Use for navigation commands to this destination check box on the Operation tab so the Navigator is aware of an operation between two entities. Only one operation between two entities can have this option selected.

The Navigator uses the following symbols to represent entities and operations in a model:

  • entity: Entity
  • main entity: Selected entity
  • invalid entity: There is no operation available to navigate away from this entity

Use Root view at to change the root entity currently displayed in the Model window. Select Home entity, Current entity, or Custom, which lets you choose any entity in the model.

Testing Traversal Operations

To test the Host Integrator's ability to navigate to an entity, click the entity in the Navigator window. If you are connected to the host, you will see the Host Integrator navigate to that entity in the host application in the Model window.

Identifying Operations

To determine the name of an operation, click it in the Navigator window. The name of the operation appears next to Selected operation in the top portion of the Navigator dialog box.

Debug Command List

Select Debug Command List on the Debug menu or click the Debug button on the Operation tab to open the Command Debug dialog box. This dialog box is used for debugging an operation or command list (for example, a move cursor, login, or logout command list) created in the Design Tool.

To see the entire list, click Copy from the toolbar and paste the list into a text editor.

Tip

Click the Apply button before debugging a command.

To test and debug a command list

  1. Select the operation or command list to test from the Name field and click Run. The Design Tool runs through each command listed and then reports the operation's success or failure in the Status box.
  2. Click Step to execute the highlighted command, or click Skip to pass over it.
  3. To stop execution of the command list as it is running, click Stop.

Attributes Test

You can test the Host Integrator's ability to read attribute data from the terminal screen, and to store and write attribute data to it, before deploying your model to a Host Integrator Server. Select Attributes Test on the Debug menu or click the Test button on the Attribute tab to open the Attributes Test dialog box.

Read Attributes

When you click the Read tab, all the readable attributes for the current entity are included. By default, all attributes are enabled for the test. This test is particularly valuable if you are implementing event handlers and need to confirm that an attribute is being read successfully.

Click Execute. The checked attributes are read from the current entity and the resulting values are displayed.
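
The Read test simulates what a client application does through a connector at run time. A minimal Java sketch of that read is shown below; the HostSession interface is a hypothetical stand-in for the AppConn session class, and the server, model, and attribute names are illustrative, so consult the AppConn API reference for the actual class and method signatures.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;

    /** Hypothetical stand-in for the AppConn session class; not the actual connector API. */
    interface HostSession {
        void connectToModel(String server, String model) throws Exception;
        Map<String, String> getAttributes(List<String> names) throws Exception;
        void disconnect() throws Exception;
    }

    class ReadAttributesSketch {
        static void readCurrentEntity(HostSession session) throws Exception {
            session.connectToModel("vhi-server", "MyModel");  // illustrative server/model names
            // Read two attributes from the current entity, as the Read tab test does.
            Map<String, String> values =
                    session.getAttributes(Arrays.asList("AccountNumber", "Balance"));
            values.forEach((name, value) -> System.out.println(name + " = " + value));
            session.disconnect();
        }
    }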

Executing a Read Attributes Test

  1. To perform a read attribute test, you must either be connected to the host and have access to the application the model is based on, or you must load the model in the Host Emulator and connect to it.

  2. Select an entity that contains configured readable attributes, open the Attributes Test dialog box, and click the Read tab. The names of the readable attributes configured for the selected entity appear in the Name column.

  3. Click Execute. If the correct value for the attribute is read from the screen, the model is working properly.

Note

Reading attributes associated with non-display fields such as passwords will return blank strings.

Write Attributes

Click the Write tab. The attribute write test in the Design Tool simulates the capabilities of the SetAttributes method provided with the AppConn APIs; it uses UpdateAttribute and UpdateAttributes commands in conjunction with attribute input commands in operations. By simulating these calls, you can accurately test these methods against the host application model before using the connectors to create a client application.

Some preliminary steps may be required before running a write attribute test on a host application model. If you choose to enable Cache all attribute writes:

  • For a specific entity: Select the Cache all attribute writes check box on the General tab of the Advanced Entity Properties dialog box.
  • Throughout your model: Select this same option on the Entity tab of the Preferences Setup dialog box.

Then, create operations that contain one or more of the following commands:

  • an UpdateAttributes command
  • a series of UpdateAttribute commands
  • attribute input commands such as DefaultValue and TransmitToAttr

Note

When the Cache all attribute writes check box is selected, the Design Tool will not write attributes to the terminal screen unless an UpdateAttributes command or a series of UpdateAttribute commands is included in the attribute input operation.
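
The following Java sketch shows the client-side write that this test simulates. It is a sketch only: the HostSession interface is a hypothetical stand-in for the AppConn session class (whose SetAttributes method this test mirrors), and the attribute and operation names are illustrative.

    import java.util.HashMap;
    import java.util.Map;

    /** Hypothetical stand-in for the AppConn session class; not the actual connector API. */
    interface HostSession {
        void setAttributes(Map<String, String> values) throws Exception; // mirrors SetAttributes
        void performOperation(String name) throws Exception;
    }

    class WriteAttributesSketch {
        static void writeToEntity(HostSession session) throws Exception {
            Map<String, String> values = new HashMap<>();
            values.put("CustomerName", "Smith");     // illustrative attribute names and values
            values.put("AccountType", "Checking");
            // With Cache all attribute writes enabled, these values are cached and reach the
            // terminal screen only when an operation runs an UpdateAttributes command.
            session.setAttributes(values);
            session.performOperation("SubmitChanges"); // illustrative operation containing UpdateAttributes
        }
    }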

Executing a Write Attributes Test

  1. To perform a write attribute test, you must either be connected to the host and have access to the application the model is based on, or you must load the model in the Host Emulator and connect to it.
  2. Select an entity that contains configured attributes, open the Attributes Test dialog box, and click the Write tab. The names of the attributes configured for the selected entity appear in the Name column.
  3. Select a configured operation from the Operation list and click the Execute button.
  4. If the operation successfully navigates to its expected destination and writes the cached attribute values to the terminal screen, the model is working properly.

Recordset Test

The Recordset Test dialog box allows you to test recordsets before the model file is deployed to the Host Integrator Server and accessible to client applications written using the connector APIs. All of the recordset actions listed below simulate the capabilities of recordset methods provided with the connectors. This dialog box allows you to test these methods with the host application model before using the connectors to create a client application.
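
For context, the Java sketch below shows how a client might drive the same recordset actions through a connector. The HostSession interface is a hypothetical stand-in for the AppConn session class, and the filter condition syntax is illustrative; build real conditions in the Condition Edit dialog box.

    import java.util.List;
    import java.util.Map;

    /** Hypothetical stand-ins for the recordset calls this dialog box simulates. */
    interface HostSession {
        void setCurrentRecordIndex(int index) throws Exception;  // 0 = before the first record
        List<Map<String, String>> fetchRecords(String filter, int maxRows) throws Exception;
        void insertRecord(Map<String, String> fields) throws Exception;  // other actions look similar
    }

    class RecordsetTestSketch {
        static void fetchSome(HostSession session) throws Exception {
            session.setCurrentRecordIndex(0);  // reset the recordset, as the note below explains
            // Fetch up to 20 records matching an illustrative filter condition.
            List<Map<String, String>> rows = session.fetchRecords("LastName = \"Smith\"", 20);
            rows.forEach(row -> System.out.println(row));
        }
    }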

Configure the following options to create a test of the current recordset:

Recordset

Displays a list of defined recordsets. By default, the recordset that appears in the Name box is selected.

Action

Displays a list of defined actions for the selected recordset. Use the model example, located in your \<VHI install directory>\models folder, to demonstrate recordset actions.

See Adding Entities for information on these actions:

  • Fetch Records
  • Select Record
  • Insert Record
  • Update Current Record
  • Set Current Record Index
  • Reset Current Recordset

Note

After executing the first Fetch Records action, you must reset the recordset by executing the Set Current Record Index action. When Host Integrator arrives at an entity with a recordset, it always indicates that the current record is at record index 0, which means before the first record. Any subsequent fetch of data will then start with the first record and proceed normally. In addition, the Set Current Record Index action available in this dialog box works independently of operations configured on the Operation tab. If you want to test an operation that is not related to a recordset, use the Execute button on the Operation tab instead of the options available in this dialog box.

Filter string

To customize your test, you can create a string in the Condition Edit dialog box to narrow your fetch to a certain record or records in the recordset. To open the Condition Edit dialog box, click the Edit button.

Limit fetch to __ records

Select this check box and type the number of records to fetch. The fetch will begin with the line number indicated by the Current record index at the bottom of this dialog box. Example: If you set your current record index to line 5, and you want to fetch 20 records beginning at line 5, select this check box and type 20 in the box. Select Fetch Records and click Execute; the first 20 records beginning at line 5 should be returned in the Fetch returned box.

Synchronize button

Click this button to synchronize after an entity change. This button is available after a recordset fetch terminates due to a change in an entity.

Copy fetch results button

Click this button to copy your fetch results onto the Windows Clipboard.

Execute button

Click this button to execute the selected action.

Edit button

Click this button to open the Condition Edit dialog box.

Fetch returned

Displays the results of the data fetch test.

Perform Scrolling Operations

Click any of the available buttons to test scrolling operations that have been defined for the current recordset.

Note

Testing recordsets works only while you are connected to a host or using the Host Emulator.

Procedure Test

Use the Procedure Test to test the procedures for a table. This allows you to debug your procedure definitions before deploying your model. For a basic introduction, see the procedure example in the tutorial.
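
The Java sketch below shows the client-side procedure call that this test simulates. The HostSession interface is a hypothetical stand-in for the connector API, and the table, procedure, and filter names are illustrative.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    /** Hypothetical stand-in for the connector procedure call this dialog box exercises. */
    interface HostSession {
        List<Map<String, String>> performTableProcedure(
                String table, String procedure, Map<String, String> filterValues) throws Exception;
    }

    class ProcedureTestSketch {
        static void runSelect(HostSession session) throws Exception {
            Map<String, String> filters = new HashMap<>();
            filters.put("AccountNumber", "167439");  // a required filter must have a value
            List<Map<String, String>> output =
                    session.performTableProcedure("Accounts", "SelectByNumber", filters);
            System.out.println(output.size() + " records returned");
        }
    }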

To Test a Procedure

To test a table's procedure, you must either be connected to the host and have access to the application the model is based on, or you must load the model in the Host Emulator and connect to it.

  1. Load the host application model containing the procedure to test into the Design Tool.

  2. Connect to the host.

  3. Select Procedure Test from the Debug menu to display the Procedure Test dialog box.

  4. Click the down arrow next to Table and select the table containing the procedure you want to test.

  5. Click the down arrow next to Procedure and select the procedure you want to test. All filter and data parameters for the selected procedure appear in the Procedure filters box. If the filter is defined as required, a check mark appears in the box to the left of the filter.

  6. In the Value column, enter data for the parameter. If a parameter is required, you must enter a value for that parameter.

  7. Click Execute.

The Design Tool will test the procedure. If this is a SELECT procedure, the Design Tool will display the output in the Procedure outputs dialog box. The record count is displayed at the bottom of the dialog box.

If the Terminal window is visible, you will see the Design Tool navigate to the appropriate host screen while it tests the procedure.

Procedure options

  • You can limit the number of records to be returned for the procedure test. Type in a number for Limit results to <x> records. The total record count is displayed with the procedure results.
  • After executing the procedure, click Copy to copy the procedure output to the Clipboard.
  • If an event handler that includes an [Execute Procedure] call is attached, a lightning bolt is displayed in front of the procedure name. When you click Execute, the associated event handler is invoked as part of the procedure test. The Execute Default button is available so that you can test the procedure without testing the event handler at the same time. Click Execute Default to test the procedure as displayed in the Procedure Editor, without testing the associated event handler.

Debugging a Procedure

If you experience problems with a procedure used to satisfy an SQL query or you want to fully investigate the behavior of a procedure, you can debug it in the Debug Procedure dialog box.

  1. Click the down arrow next to Table and select the table containing the procedure you want to test.
  2. Click the down arrow next to Procedure and select the procedure you want to test. All filters for the selected procedure appear in the Procedure filters box. If the filter is defined as required, a check mark appears in the box to the left of the filter.
  3. In the Value column, enter data to test against the procedure. If a parameter is required, you must enter a value for that parameter.
  4. Click Debug to open the Debug Procedure dialog box with the procedure displayed.

SQL Test

Use the Test SQL dialog box to test SQL queries on the tables you created for your host application model. This allows you to debug your table and procedure definitions before deploying your model. Before you can run an SQL test on a host application model, you must first create tables derived from the model that contain the data that you want to query.

When you test an SQL query, you simply enter a supported SQL-92 statement in the SQL statement box of the Test SQL dialog box and the Design Tool returns the results.
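
The sketch below shows the equivalent client-side query, assuming a hypothetical HostSession stand-in for the connector API; the SQL statement itself uses illustrative table and column names.

    import java.util.List;
    import java.util.Map;

    /** Hypothetical stand-in for the connector SQL call this dialog box exercises. */
    interface HostSession {
        List<Map<String, String>> executeSQL(String statement, int maxRows) throws Exception;
    }

    class SqlTestSketch {
        static void query(HostSession session) throws Exception {
            // A supported SQL-92 SELECT statement; table and column names are illustrative.
            String sql = "SELECT AccountNumber, Balance FROM Accounts WHERE LastName = 'Smith'";
            List<Map<String, String>> rows = session.executeSQL(sql, 50);  // limit to 50 records
            System.out.println(rows.size() + " records returned");
        }
    }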

For a basic introduction, see the SQL example in the tutorial.

Tip

If you're using SQL, note the Tables dialog box setting for Allow SQL SELECT to return a subset of columns.

Executing an SQL Test

To test an SQL query on your model tables, you must either be connected to the host and have access to the application the model is based on, or you must load the model in the Host Emulator and connect to it.

To run a test SQL query on a table, follow these steps:

  1. Load the host application model containing the table to test into the Design Tool.

  2. Select SQL Test from the Debug menu to display the SQL Test dialog box.

  3. Connect to the host.

  4. In the SQL statement box, type a supported SQL statement.

  5. Click Resolve. The Design Tool determines the procedure or procedures that could be used to satisfy the query you entered and displays them in the Procedures box.

  6. Click Execute. The Design Tool executes the procedure or procedures and displays the output in the Output recordset dialog box. If the Terminal window is visible, you will see the Design Tool navigate to the appropriate host screen while it fulfills the SQL query.

    Note

    The total record count is displayed with the output recordset results. You can limit the number of results for your test. Type in a number for Limit results to <x> records.

  7. Click Copy to copy the results to the Clipboard. This is handy if you need to evaluate and compare results from multiple SQL queries.

  8. Click Reset to perform another test or Close to close the Test SQL dialog box.

Debugging a Procedure

If you experience problems with a procedure used to satisfy an SQL query or you want to fully investigate the behavior of a procedure, you can debug it in the Debug Procedure dialog box.

  1. In the SQL statement box, type a supported SQL statement.

  2. Click Resolve. The Design Tool determines the procedure or procedures that could be used to satisfy the query you entered and displays them in the Procedures box.

  3. Click Debug. The procedure or procedures display in the Debug Procedure dialog box.

Connection Events Test

Click Connection Events Test on the Debug menu to confirm that the life cycle event handlers and model event handlers you have added to your model are performing as you expect. Use this dialog box to test sequences of events that require a reset of the terminal session: connecting to the host and establishing a client connection. You can perform tests in both dedicated model and pooled session environments. In order to use the Connection Events Test, the model must be connected to the host, but login is not required.

  • Dedicated Session

    Click New Client to simulate the sequence of one client application disconnecting and another client connecting to this model using the ConnectToModel API.

  • Pooled Session

    Click New Session to simulate destroying and creating a pooled session, then connecting to the session using the ConnectToSession API.

    Click New Client to simulate the sequence of one client application disconnecting and another client reconnecting to this model using the ConnectToSession API.

  • Credentials

    If the event handler requires a user name and password, specify the client user credentials here. An Authenticate User event will be sent to the event handler based on the credentials shown here. The initial values are based on those provided on the Environment tab in the Event Handler Settings dialog box, but you can change them. You can also update the Environment tab settings if you make changes here. The credentials you use for this test remain current until another test or a reset.

    You can also specify or review host credentials. Click the Model Variables button to view or change other model variable values.

  • Event Handlers

    The Connection Events Test dialog box tests the event handlers currently associated with life cycle or model events, which are displayed here.

  • Testing Event Handlers

    After you have specified your test parameters, the Edit, Rebuild, and Reload options are available as you debug the event handlers.

  • Edit Button

    Click the Edit button to open the selected event handler in your default editor.

  • Rebuild

    After editing an event handler, click this button to recompile the event handlers and update the default JAR/assembly file.

  • Reload

    Click this button to reload all event handler libraries to match the latest copies stored with the model.

Connection Events Test Options

The Connection Events Test dialog box tests sequences of events that require a reset of the terminal session: loading a model, connecting to the host, and establishing a client connection. The startup and shutdown sequences below illustrate the differences between each of the test options.
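
Because each test option fires its events in a fixed order, an event handler that simply logs each event is a quick way to confirm the sequences listed below. The sketch uses hypothetical interface and method names; see the Event Handler API reference for the actual Java types.

    /** Hypothetical event interface; actual handler types are in the Event Handler API. */
    interface ModelEvents {
        void clientDisconnected();
        void clientConnected();
        void authenticateUser(String user, String password) throws Exception;
    }

    /** Logs each event so the firing order can be compared against the lists below. */
    class LoggingModelEvents implements ModelEvents {
        @Override public void clientDisconnected() { System.out.println("Client Disconnected"); }
        @Override public void clientConnected()    { System.out.println("Client Connected"); }
        @Override public void authenticateUser(String user, String password) throws Exception {
            System.out.println("Authenticate User: " + user);
            if (user == null || user.isEmpty()) {
                // Errors thrown here appear in the Event Handler Console during the test.
                throw new Exception("missing user name");
            }
        }
    }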

Dedicated Session: New Client

This option simulates the sequence of one client application disconnecting and another client connecting to this model using the ConnectToModel method:

  1. The Client Disconnected event is fired.
  2. The terminal session navigates to the home entity, which could involve Execute Operation events.
  3. The logout process is executed, which could involve firing the Execute Logout event.
  4. The host session is disconnected from the host.
  5. Steps 1-7 of standard reset processing are executed.
  6. The host session is connected to the host.
  7. The login process is executed, which could involve firing the Execute Login event.
  8. The terminal session navigates to the home entity, which could involve Execute Operation events.
  9. The Client Connected event is fired.

Steps 1-4 can also be run by clicking the Logout button on the toolbar, and steps 5-9 by clicking the Login button.

Pooled Session: New Session

The New Session option simulates destroying and creating a pooled session, then connecting to the session using the ConnectToSession method.

With this option, a session pool host session is created, connected to the host, and logged in before a client session is actually created. This simulates the real-world behavior of a session pool session.

  1. The Client Disconnected event is fired.
  2. The terminal session navigates to the home entity, which could involve Execute Operation events.
  3. The logout process is executed, which could involve firing the Execute Logout event.
  4. The host session is disconnected from the host.
  5. The Client Session Destroyed event is fired.
  6. The Host Session Destroyed event is fired.
  7. The Host Session Created event is fired.
  8. The login process is executed, which could involve firing the Execute Login event.
  9. The terminal session navigates to the home entity, which could involve Execute Operation events.
  10. The Client Connected event is fired.
  11. The Authenticate User event is fired, with any resulting errors displayed in the Event Handler Console.
  12. The Client Session Created event is fired.

Pooled Session: New Client

This option simulates the sequence of one client application disconnecting and another client reconnecting to this model using the ConnectToSession method.

In this case, only the client session events are fired because a session pool host session is not normally logged out or destroyed between client invocations.

  1. The Client Disconnected event is fired.
  2. The terminal session is navigated home, which could involve Execute Operation events.
  3. The Client Session Destroyed event is fired.
  4. An Authenticate User event is fired.
  5. The Client Session Created event is fired.
  6. The Client Connected event is fired.