A
ALE
Application Link Enabling (ALE), a load type, is an SAP technology that facilitates the exchange of data between different systems.
Application
A logical grouping of data migration activities that are related to a particular business application or system. It serves as a container for all the data objects, rules, and processes that pertain to that particular business application. For example, an Application could represent an ERP system like SAP or Oracle, where data migration activities are focused on moving data into or out of that specific ERP system.
B
Batch Load
A load type that refers to the process of transferring a batch of data records into the target system in a single operation.
Business Ready
A report that ensures data aligns with business processes and rules, supporting overarching processes such as financial reconciliations or asset depreciation calculations.
C
Catalog Asset
Assets registered in the Catalog module, such as Databases or Systems.
Check Table
Reference tables used to validate and control the values entered into fields during data migration from source to target.
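A minimal sketch of how a check-table lookup might work, using an in-memory SQLite database; the table and field names (a country check table) are illustrative, not taken from any specific system:

```python
import sqlite3

# In-memory database standing in for the staging environment.
conn = sqlite3.connect(":memory:")

# Hypothetical check table of valid country codes.
conn.execute("CREATE TABLE chk_country (country_code TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO chk_country VALUES (?)", [("US",), ("DE",), ("JP",)])

def is_valid_country(code: str) -> bool:
    """Validate a migrated field value against the check table."""
    row = conn.execute(
        "SELECT 1 FROM chk_country WHERE country_code = ?", (code,)
    ).fetchone()
    return row is not None
```

Values not present in the check table would typically be reported as errors rather than loaded.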
Conditional
A tier in Data Classification used to describe data that is dependent on certain conditions or contexts. It can include rules, policies, or hierarchies that are applied to other data tiers to provide additional context or to drive business logic.
Construct
A field mapping type where data must be manually constructed by the business users when target values do not exist in any legacy system, and cannot be defined by a mapping rule.
Construct Pages
Web pages used to build, cleanse, and enrich data as part of the migration process.
D
Data Mapping
The process of identifying relevant and in-scope source data to be moved into the target system, identifying gaps in data, establishing rules for moving data, and creating missing data.
Data Ready
A report indicating that data has passed all readiness checks, including staging, transformation, and business rule validations, and is ready for the final load into the production environment.
Datasources
A connection to a system at a specific level for transferring data between that source and the client database.
Datastore
A datastore represents a set of data, typically a database, schema, or an API connection, within a system.
DDL
Data Definition Language, a standard for commands that define the different structures in a database.
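For example, CREATE, ALTER, and DROP are core DDL commands: they define structures rather than manipulate the rows inside them. A small illustration using Python's built-in SQLite driver (the table name is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: define a table, then alter its structure.
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("ALTER TABLE customers ADD COLUMN city TEXT")

# Inspect the resulting structure (column name is field 1 of table_info).
columns = [row[1] for row in conn.execute("PRAGMA table_info(customers)")]
```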
Delta Postload
A report showing the results of a supplemental data or "delta" load, identifying any changes or updates that occurred during the load process. It helps to reconcile the final state of the data in the target system with expected outcomes.
Deployment
Deployments group together data for the purpose of assignment to Users. They are used within Enrich and Reporting to segregate data requiring security (e.g., HR employee records, cost centers).
Deployment Reports
Reports that focus on the operational aspect of moving the data into the production environment of the target system. These reports include an additional filter by Deployment that separates the data into sections, producing multiple reports. This is useful when the data has security requirements. For example, GL Accounts or Cost Centers could be split by the Deployments for Company code.
Development Areas
A combination of a Working Datastore and a Milestone that produces a unique staging environment for transforming data that is pertinent to a Milestone.
Double-Byte Characters
Characters that use two bytes (16 bits) of data to represent a single character in a computer encoding system. They are used to handle large character sets, such as those required for languages with complex scripts (for example, Chinese, Japanese, and Korean characters).
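The byte cost is easy to observe in Python, using UTF-16 as a representative fixed-width two-byte encoding and UTF-8 for contrast:

```python
# In a fixed-width two-byte encoding such as UTF-16, every character in the
# Basic Multilingual Plane occupies exactly two bytes.
ascii_char = "A"
cjk_char = "\u6f22"  # the Chinese character 漢

utf16_ascii = ascii_char.encode("utf-16-le")  # 2 bytes, even for ASCII
utf16_cjk = cjk_char.encode("utf-16-le")      # 2 bytes

# In a variable-width encoding such as UTF-8, the same CJK character
# needs three bytes while ASCII needs only one.
utf8_cjk = cjk_char.encode("utf-8")
utf8_ascii = ascii_char.encode("utf-8")
```

Byte-length differences like these are why field-length checks on double-byte data must count bytes, not characters.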
E
Enrich/Enrichment
A field mapping type where a business user may be required to review and modify or enrich existing legacy data that is incomplete or in an inconsistent format.
Environment/Instance
A specific environment of the Application such as Development, Test, and Production. Each Instance has its own set of configurations, data mappings, and transformation rules that are appropriate for that stage of the lifecycle. For example, you might have a Development Instance for rule creation and testing, a Test Instance for user acceptance testing, and a Production Instance for execution.
ERP
Enterprise Resource Planning is a business process management software that allows an organization to use a system of integrated applications to manage the business and automate many back office functions related to technology, services and human resources.
ETL Tool
Extract, Transform, and Load (ETL) Tool is a software component that is used during the data migration process to extract, transform, and load data.
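A toy illustration of the three ETL stages; the records and cleansing rules are invented for the example and stand in for real source extracts and load targets:

```python
# Extract: pull raw records from the source (here, an in-memory list).
source_rows = [
    {"cust_id": " 001", "name": "acme corp"},
    {"cust_id": "002 ", "name": "globex"},
]

def transform(row: dict) -> dict:
    """Transform: apply cleansing rules before loading."""
    return {
        "cust_id": row["cust_id"].strip(),  # trim stray whitespace
        "name": row["name"].title(),        # standardize capitalization
    }

# Load: write the transformed records to the target (another list here).
target_rows = [transform(r) for r in source_rows]
```

A real ETL tool adds connectivity, scheduling, logging, and error handling around this same extract/transform/load pattern.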
F
Foundational
A tier in Data Classification representing the most basic and core data elements essential for the operation of business systems. Foundational data is often related to the configuration and setup of systems and applications.
G
GDPR
The General Data Protection Regulation (GDPR) is a comprehensive data protection law that came into effect on May 25, 2018, in the European Union (EU). It is designed to protect the privacy and personal data of EU citizens and affects organizations worldwide that process or hold personal data of individuals residing in the EU.
Syniti ensures that all data migration activities comply with GDPR regulations, ensuring personal data is handled according to GDPR principles throughout the migration process.
Governance Rule
A guideline created by business users to evaluate data health and ensure that business requirements are met. Governance rules are used to assess data quality, document compliance with requirements, and indicate the completion of necessary manual tasks, among other purposes.
I
IDOC
Intermediate Document (IDOC), a load type, is a standard data structure used in SAP systems for transferring data between SAP systems or between an SAP system and an external system.
Info
A report providing informational insights into the data, often used for audit purposes or to give stakeholders an overview of the data's current state without indicating specific errors or readiness.
Informational
A tier in Data Classification that includes data derived from processing and analyzing other data tiers. This can include aggregated data, reports, analytics, and business intelligence that provide insights into business performance, trends, and decision-making.
L
LTMC Staging
Legacy Transfer Migration Cockpit (LTMC) Staging, a load type, is a tool provided by SAP for data migration, particularly for SAP S/4HANA migrations. It is used to stage and load data from legacy systems into SAP.
M
Manual
A load type that refers to the process of entering data directly into the target system manually through the system's user interface.
Master Data
A tier in Data Classification that represents a consistent set of identifiers and extended attributes describing the core entities of the business, such as customers, suppliers, products, and employees. This data is typically non-transactional in nature and is used across multiple systems and processes.
MDG
Master Data Governance (MDG) is a state-of-the-art master data management solution that provides out-of-the-box, domain-specific master data governance to centrally create, change, and distribute, or to consolidate, master data across your complete enterprise system landscape.
Metrics Scorecard
A scorecard used to quickly assess the effectiveness of the migration activities and to identify areas that may require attention or improvement. It provides a quick overview of the migration's current status against predefined benchmarks.
Migration Reports
Detailed Reports that provide insights into the data migration process. These reports are used to track the progress of the migration, identify and resolve data issues, and ensure that the migration aligns with the project timelines and quality standards.
Milestone
An action or event marking a significant change or stage in development. In Migration, the Milestone refers to each scheduled run of the migration data ETL process.
O
OTC
Order-to-Cash (also abbreviated O2C), an integration point between Finance and Sales. It is a business process that runs from a customer's sales order through delivery and invoicing, comprising the sales order, delivery, post goods issue, and billing to the customer.
P
PMO Dashboard
A centralized dashboard used to capture progress of a migration project based upon set relevancy metrics. It is primarily used by the Team leads and Project Manager to provide the client with the status of the Data Migration Project.
PTP
Procure-to-Pay, a business process containing all activities involved in procuring goods and services from external suppliers and paying for them.
R
Release
A release refers to a specific point in time at which a set of data is moved from one environment to another. This could be from a staging environment to a production environment or from a legacy system to a new system.
Relevancy Attributes
Attributes that determine if the object type record is relevant for conversion and loading to the target system. These attributes assist in maintenance of the fields that define relevancy for Master Data objects like Customer, Vendor, or Material Master.
Remediation
The process of correcting or fixing errors in the data. In Migration, remediation involves correcting data errors by supplying default values or replacement logic to ensure that a row of data does not fail during loading.
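A sketch of default-value remediation; the field names and default values below are hypothetical, not from any project rulebook:

```python
# Hypothetical remediation rules: defaults applied when a field is
# missing or empty, so the row does not fail during loading.
DEFAULTS = {"currency": "USD", "payment_terms": "NET30"}

def remediate(row: dict) -> dict:
    """Fill empty or missing fields with agreed default values."""
    fixed = dict(row)  # leave the original row untouched
    for field, default in DEFAULTS.items():
        if not fixed.get(field):
            fixed[field] = default
    return fixed
```

Replacement logic (substituting known-bad values with corrected ones) follows the same shape, with a lookup in place of a static default.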
Report Cache Database
The Report Cache database stores every active report that has been run successfully as part of an ETL. The project within the ETL tool populates each table (representing each active report) as the job runs. These tables serve as a static view of the data at the time the conversion was run.
RICEFW
Abbreviation for Reports, Interfaces, Conversions, Enhancements, Forms, and Workflows. It is a method used to catalog and uniquely identify all the technical objects that need to be built during a large and complex project.
RTR
Record-to-Report is a Finance and Accounting management process that involves collecting, processing, and delivering relevant, timely, and accurate information used to provide strategic, financial, and operational feedback on how a business is performing.
Rule
A field mapping type that requires transformation logic to calculate the target value from the legacy value.
Rule Xref
A field mapping type where the legacy source value is manipulated to derive a unique legacy value, which is then translated through a cross-reference table to the correct target value.
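A sketch of a Rule Xref: a rule first normalizes the legacy data into a unique derived value, which is then translated through a cross-reference; the vendor names and target numbers are invented:

```python
# Hypothetical cross-reference keyed on a derived legacy value.
XREF = {"ACME-US": "100001", "GLOBEX-DE": "100002"}

def rule_xref(vendor_name: str, country: str):
    """Apply a rule to derive the unique lookup key, then cross-reference it."""
    key = f"{vendor_name.strip().upper()}-{country.strip().upper()}"
    return XREF.get(key)  # None signals a value that still needs mapping
```

The rule step (trim, uppercase, concatenate) is what distinguishes a Rule Xref from a plain Xref, which looks the legacy value up directly.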
Rulebook
A report that details the transformation and validation rules applied to the data, serving as a reference for understanding how the data has been processed. This report is provided to the client as part of the deliverables.
S
Sampling Method
A method that allows for the validation of a specific percentage of records in the Target Reports based on the ANSI/ASQ Standard Z1.4-2003. This standard uses a lot size (e.g., record count) and an inspection level (e.g., General 1-3 or Special 1-4). The sample size is set based on the inspection level and report record count, providing a random sample size that adheres to the specification.
SAP Data Services
An ETL tool provided by SAP that can be used instead of Syniti Migrate's ETL option. It allows for the extraction of source data and facilitates data integration, transformation, quality management, and data cleansing.
Snapshot Management
The process of capturing data at a set point in time, usually by moving data from one datastore to another staging datastore.
Stage Ready
A report that identifies records containing a Critical Data Error that prevents the record from loading, based on information available BEFORE the start of the data loads. For example: Missing Required Fields, Invalid Configuration Values, Missing Master Data based on errors in upstream Staging Tables, and more.
Subject Area
A logical grouping of related data objects associated with a specific business function or domain within an organization. Subject Areas help organize the data migration process by categorizing data into meaningful sections based on business operations.
T
Target Ready
A report that identifies records containing a Critical Data Error that prevents the record from loading, based on information available only AFTER the start of the data loads. For example: Missing Master Data in the Target System (not in tgtECC.dbo.MARA).
T-Code
The transaction code used in the Target table, typically within SAP systems, to uniquely identify a specific transaction or process.
Transactional
A tier in Data Classification that refers to data generated from the day-to-day operations of the business. This data is dynamic and records the transactions that occur within the business, such as sales orders, purchase orders, deliveries, and financial transactions. Transactional data is often time-stamped and can be used to track the activity and performance of the business over time.
V
Validation Hub
A list of validation scripts used during each milestone of the migration project. These scripts group tasks related to an object to ensure that the validation of data during the preload or post-load stages is completed accurately.
Value Mapping
The process of converting or translating data values from the source system to their corresponding values in the target system during a data migration project. This is particularly necessary when the source and target systems use different formats, nomenclature, or standards for representing data.
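In its simplest form, value mapping is a lookup from source values to target values; the marital-status codes below are invented for illustration:

```python
# Hypothetical mapping: the legacy system stores marital status as words,
# while the target system expects single-letter codes.
VALUE_MAP = {"Single": "S", "Married": "M", "Divorced": "D"}

def map_value(source_value: str, default: str = "U") -> str:
    """Translate a source value to its target equivalent.

    Unmapped values fall back to a default ("U" for unknown) so they can
    be flagged for review rather than silently dropped.
    """
    return VALUE_MAP.get(source_value, default)
```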
W
Working Databases
A database instance used to host the Migrate Working Databases, where customer data that is being processed is temporarily stored. These databases are essential for storing data that will be worked on during the migration process.
Working Table
Temporary intermediary tables used during the data migration process. They act as a staging area for data, storing it after extraction from the source system and before it is transformed using ETL rules and loaded into the target system.
X
Xref
A field mapping type in which a legacy source value is translated through a cross-reference table to obtain the corresponding target value. This is used to ensure accurate mapping between the source and target systems when their data formats or values differ.
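A cross-reference table can be modeled as a two-column lookup; the legacy plant codes and target codes below are illustrative only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical cross-reference table: legacy plant codes -> target plant codes.
conn.execute(
    "CREATE TABLE xref_plant (legacy_value TEXT PRIMARY KEY, target_value TEXT)"
)
conn.executemany(
    "INSERT INTO xref_plant VALUES (?, ?)",
    [("PL01", "1000"), ("PL02", "2000")],
)

def xref(legacy_value: str):
    """Translate a legacy value through the cross-reference table."""
    row = conn.execute(
        "SELECT target_value FROM xref_plant WHERE legacy_value = ?",
        (legacy_value,),
    ).fetchone()
    return row[0] if row else None  # None signals an unmapped value
```

Unmapped values (those returning None) would typically surface in mapping reports for business users to resolve.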