Bulk Load dialog box
- WorkPath For Bulk Loading - This path is mandatory for the following options:
- When using the Content Manager Bulk Loader, which is available as part of the Content Manager SDK. This path is required for the Index Text function.
NOTE: Additional information on the SDK can be found at:
SDK Documents: https://content-manager-sdk.github.io/Community/
Samples: https://github.com/content-manager-sdk/Community/
- When using Data Port’s Use Bulk Loader feature. This path is used for importing Electronic Documents.
- When migrating data from an existing dataset to a new dataset. This path is used to temporarily store the loading script and data files for importing.
- The path must be a UNC path so that the Workgroup Server(s) and/or Database Server can access the files contained within it.
Full permissions to the given path must be granted to one or more of the following logins:
- the network login running the Bulk Loader program
- the network login running the Workgroup Server service
- the network login running the Database Instance
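Before starting a load, it can be useful to confirm that a given login can actually write to the work path. The following is a minimal sketch, not part of the product; the UNC path shown at the end is a hypothetical example:

```python
import os
import tempfile

def check_work_path(path):
    """Return True if the current login can create, write to, and
    delete a file under the given work path (e.g. a UNC share)."""
    if not os.path.isdir(path):
        return False
    try:
        # Create a throwaway file to prove write and delete access.
        fd, probe = tempfile.mkstemp(dir=path)
        os.write(fd, b"bulk-load probe")
        os.close(fd)
        os.remove(probe)
        return True
    except OSError:
        return False

# Hypothetical UNC work path; replace with your own share:
# check_work_path(r"\\fileserver\BulkLoadWork")
```

Run the check under each of the logins listed above (Bulk Loader, Workgroup Server service, Database Instance) to verify the permission requirements are met.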
- Use Bulk loading for temporary tables - For large operations, temporary tables are used to optimise result sets. Enable bulk loading to load these temporary tables more efficiently. If no threshold value is specified, a default of 1000 URIs is used; otherwise, the specified threshold value applies.
- Use direct path load for Oracle - for large operations, a direct path load is more efficient than a conventional path load.
- Use Azure blob store for bulk insert - select this option to enable Azure blob storage for bulk insert. If it is not selected, bulk insert is not available for Azure SQL databases, and less efficient multiple insert commands are used where bulk insert would otherwise be required.
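The threshold behaviour described for the Use Bulk loading for temporary tables option can be sketched as follows. This is a hypothetical helper for illustration only, not the product's code; the default of 1000 URIs comes from the text above:

```python
DEFAULT_THRESHOLD = 1000  # default threshold in URIs, per the dialog

def use_bulk_loading(uri_count, threshold=None):
    """Return True when the result set is large enough that the
    temporary tables should be loaded via bulk loading.

    The comparison operator is an assumption; the documentation only
    says the threshold value "will be used".
    """
    effective = DEFAULT_THRESHOLD if threshold is None else threshold
    return uri_count >= effective
```

For example, with no threshold configured, an operation touching 1,500 URIs would use bulk loading while one touching 500 would not; setting an explicit threshold of 400 would bring the smaller operation above the line as well.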
Azure blob store configuration
- Container name - enter the container name. This is a separate container in the Azure blob storage that holds the data files required by the bulk insert command. The external data source is configured in the Azure SQL Server to access this container.
- Connection string - enter the connection string for the Azure blob storage.
- Paste - paste the generated Azure blob storage Connection string from Azure.
- Set Shared Access Signature - set the SAS token value from the Azure Portal - Storage account - Shared access signature page.
- Test - test the Azure blob storage connection and the external data source. If the data source has not been created, it is created during this test.
NOTE: If the provided Connection string includes the Shared Access Signature, there is no need to fill in the Shared access signature field; the value is picked up automatically from the Connection string.
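The note above can be illustrated with a small sketch that checks whether a connection string already carries an embedded SAS token. The helper and the sample values are illustrative only, not the product's implementation:

```python
def has_shared_access_signature(connection_string):
    """Return True if an Azure storage connection string already
    contains a SharedAccessSignature component.

    Azure connection strings are semicolon-separated key=value pairs;
    only the key before the first '=' matters here, since the SAS
    value itself contains '=' characters.
    """
    parts = (p for p in connection_string.split(";") if p)
    keys = {p.split("=", 1)[0].strip() for p in parts}
    return "SharedAccessSignature" in keys

# Illustrative examples (account names and token values are made up):
# "BlobEndpoint=https://acct.blob.core.windows.net;SharedAccessSignature=sv=..."
#   -> SAS embedded, no need to fill in the Shared access signature field
# "DefaultEndpointsProtocol=https;AccountName=acct;AccountKey=..."
#   -> no SAS, set the Shared access signature field separately
```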
In the Create or Register New Dataset dialogs, click Next to continue to the Document Storage dialog box.