FREQUENTLY ASKED QUESTIONS
- How can I use off-the-shelf business intelligence tools against data in DMSII?
- How can I replicate MCP data that is not stored in DMSII?
- Can a data integration solution let me have several target databases sourced by one DMSII database?
- Can a data integration tool really reduce MIPS costs?
- If I am using an ETL tool against DMSII, can I reduce costs further?
- How can I protect the business-critical data stored on MCP while still getting the benefits of ETL?
- How does audit mirroring aid in disaster recovery?
- What other attributes should I look for in an ETL solution?
- Can one data integration solution really give me all the results discussed above?
- What target databases are supported by Databridge?
- What source databases are supported by Databridge?
- How much data storage will I need for Databridge?
How can I use off-the-shelf business intelligence tools against data in DMSII?
Most common off-the-shelf business intelligence (BI) tools support relational databases but do not support hierarchical databases such as DMSII. When evaluating integration solutions, make sure you look for one that can extract DMSII information into a relational database, enabling you to use common off-the-shelf BI tools against that data.
How can I replicate MCP data that is not stored in DMSII?
Sometimes it's prudent to extract application files, communication transaction trails, system summary logs, and other such files to a secondary system for data analysis. An example of this need is the examination of LINC logs to detect insider fraud. Make sure your solution can replicate these files in a real-time feed from the DMSII environment, via SQL, to an anti-fraud (or anti-money-laundering) package with simple SQL scripting.
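As a hypothetical illustration of that kind of simple SQL scripting (the table, column, and value names below are invented, not part of any actual product), replicated log records could be scanned off-host for suspicious activity before being handed to an anti-fraud package:

```python
import sqlite3

# Hypothetical: replicated LINC log records land in a relational table.
# An in-memory SQLite database stands in for the real target database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE linc_log (user_id TEXT, action TEXT, hour INTEGER)"
)
conn.executemany(
    "INSERT INTO linc_log VALUES (?, ?, ?)",
    [
        ("alice", "UPDATE_ACCOUNT", 10),  # normal business hours
        ("bob",   "UPDATE_ACCOUNT", 2),   # 2 a.m. update -> suspicious
        ("carol", "READ_ACCOUNT",   23),  # reads are not flagged here
    ],
)

# Simple SQL scripting: flag account updates outside 06:00-20:00 as
# candidate insider-fraud events for downstream analysis.
flagged = conn.execute(
    "SELECT user_id FROM linc_log "
    "WHERE action = 'UPDATE_ACCOUNT' AND (hour < 6 OR hour > 20)"
).fetchall()
print(flagged)  # → [('bob',)]
```

Because the query runs against the replicated copy, the fraud screening adds no load to the MCP host.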
Can a data integration solution let me have several target databases sourced by one DMSII database?
Yes. In fact, many organizations are successfully moving selected data from the host to multiple databases. If that capability is important in your IT environment, don't hesitate to insist upon it.
Can a data integration tool really reduce MIPS costs?
In today's metered environments, any actions taken on the Unisys ClearPath MCP platform translate into charges based on millions of instructions per second (MIPS). MIPS usage is a constant target for cost cutting, so expensive queries against DMSII data are subject to scrutiny. (Because the majority of business-critical information is stored in DMSII, this constraint can result in conflicting IT demands.) The right data integration tool will let you extract queryable data to a relational database and keep it synchronized without intervention. With that capability, you can move those queries off the MCP platform and substantially reduce MIPS costs.
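To make the idea concrete (the table and column names here are invented for illustration), a reporting query that once ran against DMSII on the host can instead run on the off-host relational copy:

```python
import sqlite3

# Hypothetical off-host relational copy of replicated DMSII data.
# An in-memory SQLite database stands in for Oracle, SQL Server, etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO customer_orders VALUES (?, ?)",
    [("EAST", 120.0), ("EAST", 80.0), ("WEST", 200.0)],
)

# This aggregate query now consumes cycles on the off-host database
# server rather than metered MIPS on the MCP mainframe.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM customer_orders "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('EAST', 200.0), ('WEST', 200.0)]
```

Every query moved off the host in this way is a query that no longer contributes to the MIPS bill.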
If I am using an ETL tool against DMSII, can I reduce costs further?
Yes, but only if your data integration solution offloads host processing to a separate platform that does not require MCP processing. The most advanced solutions give you the option of installing the server component either on the server portion of a Unisys MCP-hosted mainframe or on a separate machine that has visibility to the mainframe disk units. You will conserve mainframe resources by letting the enterprise server component reside between the host and your choice of client, performing all host-related processing and I/O operations before sending data on to client systems. This approach saves even more MIPS than simply moving queries and business intelligence workloads to a relational database.
How can I protect the business-critical data stored on MCP while still getting the benefits of ETL?
The right data integration solution will not put your MCP data at risk; one-way replication requires only read access and never writes back to the mainframe.
How does audit mirroring aid in disaster recovery?
Mirroring audit files in the background to a Windows platform or secondary MCP lets you recover your database from the last tape backup, plus any changes stored in the audit files. Audit mirroring for real-time data recovery is a must-have feature in today's data integration market.
What other attributes should I look for in an ETL solution?
First, buy from a company you trust. Look for a provider with significant experience in solving Unisys access and integration problems. If a vendor says they can make DMSII data integration less costly, more efficient, or more dynamic, make sure they have the skill to stand behind those claims.
When evaluating ETL products, you should insist on attributes like these:
- Support for your source and target databases.
- Advanced filtering capabilities.
- Advanced data mapping capabilities.
- Ability to monitor and modify ETL workflow.
- Support for many targets from one source.
Look for a true ETL solution that securely integrates DMSII and non-DMSII data into a secondary system. By moving selected data from the host to a relational database (or multiple databases), organizations can combine data from several external sources, perform trend analysis, and generate a wide variety of reports for improved decision support.
Can one data integration solution really give me all the results discussed above?
Yes. Databridge provides all the above capabilities and more. See below for specific technical details on Databridge.
What target databases are supported by Databridge?
Databridge has several client options supporting many databases on their various platforms:
- Databridge Client for Oracle on Windows
- Databridge Client for SQL Server on Windows
- Databridge Client for Oracle on HP-UX
- Databridge Client for Oracle on SunOS
- Databridge Client for Oracle on Linux
- Databridge Twin for Unisys MCP Systems
- Databridge DMSII Client for Unisys MCP Systems
- Databridge Client for Oracle on IBM AIX
- Databridge Client for IBM DB2 on IBM AIX
What source databases are supported by Databridge?
Databridge replicates data from DMSII databases on Unisys ClearPath MCP systems. As noted above, it can also replicate non-DMSII MCP files such as application files and system logs.
How much data storage will I need for Databridge?
The best way to plan for disk space on the relational database is to have double the amount of space that's required on the DMSII side. (For example, if you are replicating 2 GB of data, you would want 4 GB free for data on the relational database side.)
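That 2:1 rule of thumb is easy to capture in a small sizing helper (the function name and the example figures are ours, purely for illustration):

```python
def relational_storage_needed_gb(dmsii_data_gb, factor=2.0):
    """Estimate free space needed on the relational database side,
    using the rule of thumb of double the DMSII data size."""
    return dmsii_data_gb * factor

# Replicating 2 GB of DMSII data -> plan for 4 GB free on the
# relational database side.
print(relational_storage_needed_gb(2))  # → 4.0
```

The `factor` parameter leaves room to adjust the estimate if your own measurements show a different expansion ratio.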