Post Import Maintenance of Project


Overview

Once the imports are completed, there are steps to take to ensure that the project will execute properly within SAP Data Services.  The project will be assembled with a Project at the top, and then multiple Jobs / Workflows / Dataflows for each section of the conversion.  Each of the jobs is identified in the table below:

Priority   Job Name
010        Transform
500        TargetReadiness
600        Export
700        Load
800        Postload
900        TestRunner

Project: Job hierarchy

Update the Transform to Add Source Workflows

Now add the source workflows that were imported for this project to the Transform job, in the specific order in which they should run, ahead of all Target table workflows.

Locate the imported source workflows on the Workflow tab of the Local Object Library.  Select each workflow and drag it into the working panel of the open Transform job.  As the note in the example below indicates, all source workflows should precede the target workflows in the Transform job.

Project: Transform: Add Source Workflows

Substitution Parameters

From the main page in SAP Data Services, follow the menu path Tools > Substitution Parameter Configurations...

Validate the following parameters:

RELEASE_ID

The import process will build a substitution parameter for $$RELEASE_ID. The purpose of this parameter is to ensure that reporting is built correctly by Release.  If this field is not set correctly for the objects being converted, the reports will not function.

Note

As part of the Dataset XML Build, this Substitution Variable should import as part of the Object Import process.

Substitution Parameter Configurations: RELEASE_ID

Note

The Release ID may be found within Migrate by following the pathway Migrate > Project Setup > Releases > Edit. The Release ID will display in the lower left corner for the relevant release.  Copy and paste this entry into the Substitution Parameters.

Project Setup: Release ID
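
To illustrate why an incorrect value breaks reporting, consider a sketch like the following, where the report table and column names are purely hypothetical (this article does not define the reporting schema): conversion reports filter on the release, so a mismatched $$RELEASE_ID simply returns no rows.

    -- Hypothetical reporting query; RPT_CONVERSION_STATUS and its columns
    -- are illustrative placeholders, not objects defined by this article.
    -- If $$RELEASE_ID does not match the release stamped on the converted
    -- objects, the filter matches nothing and the report appears empty.
    SELECT OBJECT_NAME, STATUS
    FROM   RPT_CONVERSION_STATUS
    WHERE  RELEASE_ID = 'R01';  -- the value held in $$RELEASE_ID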

MIGRATE_EXPORT_PATH

The import process will build a substitution parameter for $$MIGRATE_EXPORT_PATH.  This path ensures that all Export files will be stored in the same logical location.  Generally, the IT Lead or Project Manager will provide this information as it becomes available.  Enter the Export Path in the $$MIGRATE_EXPORT_PATH field on this page.

Note

As part of the Dataset XML Build, this Substitution Variable should import as part of the Object Import process.

Substitution Parameter Configurations: MIGRATE EXPORT PATH

USER_ID

The import process will build a substitution parameter for $$USER_ID. 

Note

As part of the Dataset XML Build, this Substitution Variable should import as part of the Object Import process.

Validate Design is Complete

As each section of the XML scripting is imported into Data Services, the user should validate that the new portion of the project will run successfully.

Update Transform to Include each Source Workflow process

As stated in the prior section, each source workflow imported to the Local Repository will need to be attached to the Transform Job prior to any of the existing target workflows.

Validate all Dataflows for Transform Job

Source Insert Dataflow - This example attempts to connect to a working table that has not yet been defined.  Add the Datastore and import the table and fields.  The user will then need to delete the failed target (shown with a red X) and attach the active working table from the Local Object Library to this dataflow.

XREF Update Dataflow - This example again shows a dataflow that is not complete due to missing Datastore tables.  The source and target tables in this dataflow will need to be re-inserted and attached after the working table is active within the Datastore.

Note

Should the Datastore already exist within Data Services with the tables for this object, the issue is typically that the EXT_SERVER Database Version does not match the Database Version in the SAP DS Datastore.

  • For SQL Server, the value is set to the same version stored in Syniti Migrate for that server (2019)

  • For Oracle, the value is 12c

  • For HANA, the value is 1.x

Make sure it matches the version of the database in the DS Datastore.
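
To confirm the actual database version before comparing it with the Datastore setting, a standard system query on each platform will return it:

    -- SQL Server: returns the product version (e.g., 15.0.x = SQL Server 2019)
    SELECT SERVERPROPERTY('ProductVersion');

    -- Oracle: lists the version banner for the instance
    SELECT banner FROM v$version;

    -- SAP HANA: returns the database version string (e.g., 1.00.x or 2.00.x)
    SELECT version FROM "SYS"."M_DATABASE";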

XREF Insert Dataflow - As an example, in this imported source table dataflow for XREF Insert, the translation is not complete: the source table is not attached to the rest of the Insert.  In this case, there are no XREF rules mapped, and therefore there are no XREF translates, which creates the disconnect.

Source Workflow: XREF Insert Dataflow


The user will continue through each workflow, dataflow, and query to validate that everything is set up and able to run successfully.  At any point, the user may copy a dataflow into the TestRunner job to execute it individually.

Dataflow Insert - Loading Options

Within the dataflow for the Insert of data from the WRK table to the TGT table, the XML scripting sets the target table options to not use the Bulk Loader.  This makes use of the Data Services SQL pushdown to the database: where the source and target databases are stored on the same server, the Bulk Loader is not necessary, and Data Services streamlines the insert for the most efficient execution.  However, where the source and target databases are not on the same server, the XML scripting is set up for the Bulk Loader (this is the case for the Import from SYS source to SRC source for Datasources, and for the Insert from the Source table to the Working table for Migration projects).

In the example below, the Insert is from the SRC table to the WRK table, and since each resides on a different server, the setup updates the Bulk Loader Options tab as follows:

Source Workflow: Insert Src Dataflow: Target table details: Bulk Loader Options tab

In this case, the Bulk Load option is selected, and Mode = Truncate, as this is the first or only insert from the Src table to the Wrk table.
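
Conceptually, and with an illustrative working-table name only, Truncate mode behaves like the following on the target side before the engine bulk-inserts the rows across servers:

    -- Conceptual equivalent of Bulk Load with Mode = Truncate
    -- (WRK_MATERIAL is an illustrative working-table name).
    TRUNCATE TABLE WRK_MATERIAL;
    -- Data Services then bulk-inserts the SRC rows into the emptied
    -- working table; no single-statement pushdown occurs because the
    -- source and target sit on different servers.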

In the next example, the XML scripting is set up for an Insert from the Wrk table to the Tgt table, and the databases are both on the same server.  In this case, the Options tab will show the field 'Delete data from table before loading' as selected.  This clears out the Target table before the newly processed conversion data is inserted.

Source Workflow:  Insert Tgt Dataflow: Target table details: Options tab

The Bulk Loader Options tab is not used in this scenario.  This enables the SQL pushdown logic.

Source Workflow:  Insert Tgt Dataflow: Target table details: Bulk Loader Options tab
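
For orientation, the full pushdown in this same-server scenario is conceptually a pair of statements executed entirely in the database; the table and field names below are illustrative only:

    -- 'Delete data from table before loading' clears the target first
    -- (TGT_MATERIAL and WRK_MATERIAL are illustrative names).
    DELETE FROM TGT_MATERIAL;

    -- The insert then runs as a single pushed-down statement, with no
    -- rows staged through the Data Services engine.
    INSERT INTO TGT_MATERIAL (MATNR, MAKTX, ZSOURCE)
    SELECT MATNR, MAKTX, ZSOURCE
    FROM   WRK_MATERIAL
    WHERE  ZACTIVE = 1;

A statement of this shape is what the Optimized SQL display in the next section should show.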

Optimized SQL

While viewing the dataflow for the Insert from the Wrk table to the Tgt table, the user may validate this logic via the menu path Validation > Display Optimized SQL...

Source Workflow:  Insert Tgt Dataflow

The user may review the generated insert in the Optimized SQL window to confirm that the logic is pushed down as expected.

Source Workflow:  Insert Tgt Dataflow: Validation: Optimized SQL

Build the Intricate Queries Manually

As indicated in prior sections, the import of XML scripting into Data Services is meant to build out the structure.  The user will next need to add the more intricate rules and updates to the conversion.

Within each source workflow there will be a standard set of dataflows:

  • SrcIns - Insert data from the source table to the working table

  • Upd_XREF - Update the values in the working table for all XREF fields (zFieldName_XREF) to zFieldName using the XTVALUEMAP table, which stores all base XREF translations from source to target values by Datasource_ID (see the SQL sketch after this list)

  • Ins_XREF - Insert into the XTVALUEMAP table all new entries pulled from the source table for the XREF fields by Datasource_ID

  • TargetDelete - A function that deletes from the target table all entries by zSOURCE.  This ensures that each source dataflow removes and loads only records for its own zSOURCE ID.

  • TgtIns - Insert all ZACTIVE = 1 records from the working table to the target table
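
As a rough SQL sketch of these steps (the working and target table names and the zPLANT field are illustrative, and the XTVALUEMAP column names are assumed, since this article does not define them; the syntax is SQL Server flavored):

    -- Upd_XREF: resolve each zFieldName_XREF into zFieldName via XTVALUEMAP.
    UPDATE w
    SET    w.zPLANT = m.TARGET_VALUE
    FROM   WRK_MATERIAL w
    JOIN   XTVALUEMAP  m
      ON   m.SOURCE_VALUE  = w.zPLANT_XREF
     AND   m.DATASOURCE_ID = w.DATASOURCE_ID;

    -- Ins_XREF: capture source values that have no mapping yet.
    INSERT INTO XTVALUEMAP (DATASOURCE_ID, SOURCE_VALUE)
    SELECT DISTINCT w.DATASOURCE_ID, w.zPLANT_XREF
    FROM   WRK_MATERIAL w
    WHERE  NOT EXISTS (SELECT 1
                       FROM   XTVALUEMAP m
                       WHERE  m.DATASOURCE_ID = w.DATASOURCE_ID
                       AND    m.SOURCE_VALUE  = w.zPLANT_XREF);

    -- TargetDelete: remove this source's prior load from the target.
    DELETE FROM TGT_MATERIAL WHERE ZSOURCE = 'SRC01';

    -- TgtIns: load only the active records.
    INSERT INTO TGT_MATERIAL
    SELECT * FROM WRK_MATERIAL WHERE ZACTIVE = 1;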

These dataflows will handle most of the simple translations, such as copying a field, defaulting a value to a field, and the cross-referencing process.  However, for the more intricate rules and XREF setups in mapping, the user will need to first build a view within the database system, and then import that view to use as the source for an update query within Data Services.  Each of these rules will be added to the appropriate source workflow prior to the insert-to-target dataflow.
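
As an example of this pattern (all object names here are hypothetical), an intricate rule could be built as a database view and then imported into the Datastore to drive an update query:

    -- Hypothetical view implementing a rule too complex for the
    -- generated dataflows: derive the target plant from a lookup
    -- table plus a condition on material type.
    CREATE VIEW V_RULE_MATERIAL_PLANT AS
    SELECT w.ID,
           CASE WHEN w.MTART = 'ROH' THEN p.RAW_PLANT
                ELSE p.DEFAULT_PLANT
           END AS zPLANT
    FROM   WRK_MATERIAL w
    JOIN   REF_PLANT_MAP p
      ON   p.WERKS = w.WERKS;

After importing the view into the Datastore, it becomes the source of an update query placed in the source workflow ahead of the TgtIns dataflow.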

