Build XML Scripting for Dataset
- Updated on 08 Apr 2024
Overview
XML Scripting is imported to the SAP Data Services system for use in the extraction, transformation, and loading of data. This section is relevant for projects using ETL Tool = SAP Data Services.
View the following page by navigating to Migrate > Mappings from the Syniti Migrate Homepage.
Any time after the Dataset has been imported and the relevant Target tables are determined, the user may build XML scripting for all of the target processing. This includes the Project, Work Flow, Data Flow, and Query structures for any of the target-related processes. Once the fields are mapped and reports are created against the fields, the user should build the XML again so that the reporting is included in the XML scripting.
Note
The Source XML scripting information is provided within the section Migrate > Mappings > Build XML Code for Source Data. Refer to that section for details on building, viewing, and importing that data.
Prerequisites for Building XML Scripting for Dataset
A few steps are required prior to building the XML Scripts for the Dataset. These include:
Ensure that the Migration Prefix field contains a value on the Dataset Details page. Refer to the Catalog > Dataset Design > Create & Maintain Datasets section of the Migration application. See the notes below.
Ensure that the correct Working Database is assigned to the Working Instance of the Reporting Environment. Refer to Administer > Setup > Environments > Manage Development Areas for more details of this assignment and validation.
Ensure that the Target Source mapping table has been assigned so the Working table may be created as part of this set of steps. Refer to Migrate > Mappings > Create & Maintain Target Sources for more details of this process.
Build XML Button
While viewing the Mappings section, click Edit for a Dataset to view the Release Dataset Details window. Click the Build XML button to build the automation XML scripting for the Dataset. The system also creates the Target tables and Working tables for the Dataset within the database at this time.
Note
Within Project Setup, the Development Areas must be completed prior to building XML. The Development Area is a required field, and this field ties to the Environment and the Working Database where the DDL Scripting builds the tables for this Mapping Object. The Working Datastore assists in the scripting of the XML code. Refer to section Maintaining the Development Areas for details.
Note
The CREATE TARGET TABLES button builds the target table within the working database separately from the XML build. Creating the Target Table is set up as a separate task in case the automated process fails. Refer to section Migrate > Mappings > Create Target Tables for details of this process.
Click the OK button to proceed.
Once the job is initiated, a "Build XML Successful" message appears at the bottom right of the Mappings multi-panel page. View the XML scripting, and check the Job Queue and Debug Log to ensure that the DDL automation builds for the tables completed successfully. Refer to the Monitor > Job Queues and Monitor > Debug Log sections for more details on these topics.
Note
Within Mapping Details, the BUILD status is automatically set to complete for Simple Mappings (Copy, Xref, Default, Not Used, or Internal) when the user clicks BUILD XML.
Note
Click the DOWNLOAD XML button to download a local file containing the XML scripting for the entire build. Refer to the DOWNLOAD XML Button section within this portion of the help for more details of this process. Individual portions of the XML, such as the target tables, reports, or exports, may be viewed on the page.
Note
The target table is built within the database during this process if a working database is set up and the connection to the database is active. If the target table already exists, the process does not rebuild it. If new fields have been added to the Dataset and imported to the Mappings page, the process alters the target table to append these fields at the end of the columns. Within the Debug Log for this Job Queue, the DDL scripting is provided as a step if the table is being built.
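The create-or-alter behavior described in the note above can be sketched as follows. This is a minimal illustration, not the actual Build XML implementation; the table name, columns, and DDL wording are hypothetical examples.

```python
def build_target_ddl(table, desired_cols, existing_cols):
    """Sketch of the described DDL behavior: create the table if it is
    missing; otherwise append only the new columns at the end and never
    rebuild or drop the existing table."""
    if existing_cols is None:
        # Table does not exist yet -> a single CREATE statement.
        cols = ", ".join(f"{name} {dtype}" for name, dtype in desired_cols)
        return [f"CREATE TABLE {table} ({cols})"]
    # Table exists: emit ALTER statements only for fields not yet present.
    known = {name for name, _ in existing_cols}
    return [f"ALTER TABLE {table} ADD {name} {dtype}"
            for name, dtype in desired_cols if name not in known]

# Hypothetical usage: first build creates the table, a later build with
# one extra field appends only that field.
print(build_target_ddl("tgtMaterial",
                       [("MATNR", "NVARCHAR(18)"), ("MTART", "NVARCHAR(4)")],
                       None))
print(build_target_ddl("tgtMaterial",
                       [("MATNR", "NVARCHAR(18)"), ("MTART", "NVARCHAR(4)")],
                       [("MATNR", "NVARCHAR(18)")]))
```

This mirrors the documented guarantees: an existing table is never overwritten, and new fields always land after the existing columns.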
Datastore for Target Reporting
The XML load file for the Dataset calls on the Datastore xAppMigrate_Webservice. This datastore connects to the database of the same name that is stored within the Migration database of the Syniti Migrate application on SQL Server. The Admin builds the datastore as part of the initial setup of the Data Services system. This datastore provides Data Services with the tables required for Target Reports (COR_REPORTS). The imported XML files do not run successfully if this datastore is not in place.
Refer to section Migrate > Build xAppMigrate Datastores for more details of this setup and use.
Data Element Translation using Datatype XREF
In the process of building the XML for use in SAP Data Services, the Data Elements go through a translation from the source code to the target code. There are a few notes to keep in mind:
All Datasets that are built within Syniti Migrate Dataset Design show the Data Elements formatted as SAP on the Tables and Fields tabs, and as the target database (Oracle, HANA DB, or SQL Server) on the Migration tab.
If the user imports the Dataset from another project, the translation of the Data Elements is copied from the originating project. In this case, the user may find that the DDL scripting defaults all fields to NVARCHAR(255).
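The translation step can be pictured as a lookup table keyed by source data type, with a fallback for unmapped entries. The entries below are illustrative examples only; the real Datatype XREF is maintained inside Syniti Migrate and its exact mappings may differ.

```python
# Illustrative Datatype XREF: SAP-style source types -> SQL Server
# column types. These example entries are assumptions, not the actual
# XREF table shipped with the product.
DATATYPE_XREF = {
    "CHAR": "NVARCHAR({length})",
    "NUMC": "NVARCHAR({length})",
    "DEC":  "DECIMAL({length},{decimals})",
    "DATS": "NVARCHAR(8)",
}

def translate(source_type, length=0, decimals=0):
    """Translate a source data element to a target column type.
    Unmapped types fall back to NVARCHAR(255), matching the default
    seen when a Dataset is imported from another project."""
    template = DATATYPE_XREF.get(source_type, "NVARCHAR(255)")
    return template.format(length=length, decimals=decimals)

print(translate("CHAR", length=18))      # a character field
print(translate("DEC", 13, 2))           # a packed decimal field
print(translate("SOME_UNKNOWN_TYPE"))    # falls back to NVARCHAR(255)
```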
Job Queue Details for Dataset Build XML
The Build XML process within Mappings > Dataset produces an Executable Job within the Job Queue as well as multiple Debug Log postings.
Note
The BUILD XML job only builds tables and views that do not already exist in the database. If the user adds more fields to a table, the BUILD XML job appends these new fields at the end of the columns of the existing table. It does not overwrite tables.
Note
The import of XML scripting from a BUILD XML job into SAP Data Services OVERWRITES all existing scripting with the new import.
The user needs to monitor both the Job Queue and the Debug Log to determine whether the job contains errors. In a few cases the issue is clearly defined, but at times it may take pasting the CREATE statement into the database to produce a detailed error message. However, most of the common errors are identified and given an accurate description.
Although the Job may show a Status of ERROR, the build may still be complete. The ERROR is likely in the actual building of the target table in the database. At this point, it is helpful to attempt to run the same script manually in the database to see the issue and find a resolution. From within the Debug Log, the DDL table scripting may be copied and then pasted into an open query in the database to attempt to build the table manually.
The result is that the database gains the new Target tables.
Additional Reference Information on the Process for Building XML within a Dataset
For details of the process of importing XML scripting into the ETL tool, see the associated ETL tool documentation:
Auto generating DDL script into the Oracle Database tool. Refer to section The Data Migration Process > Migration Using Oracle as the Database.
Auto generating DDL script into the HANA Database tool. Refer to section The Data Migration Process > Migration Using SAP HANA DB as the Database.
View XML Button
While viewing the Release Dataset window, click the VIEW XML button, which is visible once XML has been built for this dataset, to view the automation XML and DDL for the Dataset target. The fields Load XML ID, Load File Name, and Load File Created also store values once the Build XML is complete.
Note
Source field mapping does not have to be in progress for this XML build because it is all related to the target. The job attempts to build both the Target tables and the Working tables.
The XML Types are as follows:
XML Type | Definition |
---|---|
LOAD FILE | The top-level XML file that contains all subordinate Data Services objects that were generated from the Dataset-level automation "Build XML" process. |
TargetTable | The table definition XML that contains the Data Services metadata for the ETL working database target table |
Reporting Tables | The table definition XML for COR_REPORT (Stores all reporting) |
TargetExportFormat | The table definition XML that contains the Data Services metadata for the target export flat file that contains load-ready target data |
TargetTableExportDF | The target XML that contains the Data Services Data Flow and embedded Transforms to copy active, load-ready data from the target table to the target export flat file |
TargetTableExportWF | The target table XML that contains the Data Services Work Flow container necessary for the TargetTableExport Data Flow |
ObjectReportingDF | The object XML that contains the Data Services Data Flow containers necessary for the Dataset Reporting for Data Ready, Stage Ready, Postload, Preload, Remediation, and Target Ready Reports. |
ObjectReadbackWF | The object XML that contains the Data Services Work Flow container necessary for any Dataset object table readbacks from the target system |
ObjectExportWF | The object XML that contains the Data Services Work Flow container necessary for the TargetTableExport Data Flow |
ObjectPostLoadReportWF | The object XML that contains the Data Services Work Flow container necessary for the object post load report Data Flows |
ObjectTransformTargetPostLoadWF | The object XML that contains the Data Services Work Flow container necessary for the object post load Data Flows |
ObjectTransformTargetRuleWF | The object XML that contains the Data Services Work Flow container necessary for the object Data Flow |
ObjectTargetReadyWF | The object XML that contains the Data Services Work Flow for the Target Ready Reports, which also re-runs the Preload reports since they may be affected by Target Ready rule updates. |
ObjectTransformProject | The object XML that contains the Data Services Project container necessary for the subordinate Jobs |
ObjectTestJob | The object XML that contains the Data Services Job container necessary for the testing Work Flows |
ObjectPostLoadJob | The object XML that contains the Data Services Job container necessary for the ObjectReadbackWF Work Flow |
ObjectExportJob | The object XML that contains the Data Services Job container necessary for the ObjectExportWF Work Flow |
ObjectTransformJob | The object XML that contains the Data Services Job container necessary for the ObjectTransformTargetRuleWF Work Flow |
ObjectLoadJob | The object XML that contains the Data Services Job container necessary for the target data load Work Flows |
ObjectReleaseSubVar | The object XML for Substitution Variables for the Release ID and the Export pathway |
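Because the LOAD FILE is the top-level container for all of the objects in the table above, a quick structural check before importing can confirm what a generated file actually contains. The sketch below uses Python's standard XML parser on a hypothetical miniature load file; the element names (`DIJob`, `DIWorkflow`, `DIDataflow`) and structure are assumptions for illustration, not the actual SAP Data Services export schema.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical miniature load file. A real Data Services load file is far
# larger and follows SAP's own export schema; this is for illustration only.
loadfile = """<export>
  <DIJob name="ObjectLoadJob"/>
  <DIWorkflow name="ObjectExportWF"/>
  <DIWorkflow name="ObjectReadbackWF"/>
  <DIDataflow name="ObjectReportingDF"/>
</export>"""

def summarize(xml_text):
    """Count the subordinate object types contained in a load file,
    e.g. to sanity-check a download before importing it to the ETL tool."""
    root = ET.fromstring(xml_text)
    return Counter(child.tag for child in root)

print(summarize(loadfile))
```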
XML Scripting to Include SAP S/4HANA LTMC Export Process
Should the object for migration require use of the SAP S/4HANA Legacy Transfer Migration Cockpit process for loading data to the system, the XML Scripting includes a change to the Export portion of this build:
The user needs to create a Datastore within Data Services that serves as the data source for the HANA DB used to store the LTMC Staging tables. This Datastore is named "xAppLTMC" and is a connection to the HANA DB with the Schema of the LTMC Staging tables.
During the import of XML file(s) for the Dataset, the Export section of the project reflects a table to template table transfer of rows where the Template table is the object staging table in HANA DB.
Refer to section The Data Migration Process > LTMC - SAP S/4HANA Migration Cockpit for details of using the LTMC Staging option for SAP S/4HANA system projects.
Download XML Files
Click the GET LOADFILE button to generate and download the XML file to the local downloads folder. This file may now be imported to the ETL tool to build out the overall project and components to convert the data at the target level.
Depending upon the overall size of this build, there may be more than one LOADFILE. If there are two or more LOADFILE lines, the GET LOADFILE button is disabled and cannot be used to download the files. The user must click each of the XML buttons within the LOADFILE lines to download them individually. These files are numbered (for example, 1 of 2 and 2 of 2) to indicate the sequence in which to import them to Data Services.
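Since the import sequence matters when there are multiple parts, the "1 of 2" / "2 of 2" labels can be used to order the downloaded files before importing. This is a small illustrative helper, assuming the label format described above; the actual file naming may vary.

```python
import re

def import_order(labels):
    """Sort multi-part LOADFILE labels such as '2 of 2', '1 of 2' into
    the sequence in which they should be imported to Data Services.
    Labels without an 'N of M' marker sort first unchanged."""
    def key(label):
        match = re.search(r"(\d+)\s+of\s+(\d+)", label)
        return int(match.group(1)) if match else 0
    return sorted(labels, key=key)

print(import_order(["LOADFILE 2 of 2", "LOADFILE 1 of 2"]))
```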
If there are issues with the automated build to the database system, the user may view the scripting built in the Debug Log of the Job Queue for this build and run the DDL scripting within the database to create the target tables and views manually. Most of the time, this process runs in the background using batch files and the Job Queue without any issue.
Note
Should the rows of Type = LOAD FILE be blank in the DSXML panel, the file will not download. It may be taking longer than usual to produce this file from the database. Exit and re-enter the process; once the DSXML scripting is displayed for the LOADFILE rows, download them to a local drive.
Download Individual XML Files
Download an XML file by clicking the XML button adjacent to the selected XML file type. After clicking the button, the message "Generated XML file successfully" appears at the bottom of the page and the file is downloaded to the local download folder.
DOWNLOAD XML Button
Should the XML scripting be large and difficult to view on the View XML page, the user has the option to click the DOWNLOAD XML button on the Release Dataset Details page. This action produces a file that is saved to the user's local drive. This option avoids viewing limitations caused by the size of the data stored in that Dataset.
Subsequent Steps
Once the Target and Working tables are built within the working database, the mapping may begin. Refer to section Mapping the Fields of a Datasource for details of subsequent steps in this process.