Stay updated with Syniti's upcoming releases and enhancements in enterprise data management. The features described here are planned for a future release of the Syniti Knowledge Platform (SKP), but their availability in next month's release is not guaranteed.
To see the latest updates, check the current month's release notes under What's New to discover new features and improvements we have rolled out for you.
Coming Soon!
Tricentis Tosca Cloud Integration
We're excited to announce the upcoming integration of the Syniti Knowledge Platform (SKP) with Tricentis Tosca Cloud. The integration combines SKP's data transformation, quality, and governance capabilities with Tosca Cloud's automated testing framework to streamline SAP system transformations, such as SAP S/4HANA implementations and upgrades.
Key Capabilities
Production-Grade Test Data Provisioning: Generate valid test datasets that maintain complex SAP referential integrity across multiple interconnected business objects. SKP leverages production data snapshots to create test data that accurately represents the intricate relationships required for SAP business processes.
Data-to-Test Pipeline: Eliminate manual test data preparation by directly populating Tosca Cloud test cases with SKP-transformed data. This integration bridges the gap between test design and data provisioning, ensuring test scenarios reflect real-world business conditions.
Compliance and Data Governance: Enforce GDPR and PII compliance throughout the testing lifecycle. All test data used in non-production environments can be:
Masked according to data privacy requirements
Scrambled using standard Transformation logic
Filtered or segmented based on Sources, Deployments, or Organizational Hierarchies
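To make these controls concrete, here is a minimal, purely illustrative sketch of what masking, scrambling, and source-based filtering can look like. This is not SKP's actual implementation; the field names, token format, and scrambling rule are hypothetical examples.

```python
import hashlib

# Illustrative sketch only: SKP configures masking, scrambling, and filtering
# inside the platform; these field names and rules are hypothetical examples.

PII_FIELDS = {"customer_name", "email"}   # masked per privacy requirements
SCRAMBLE_FIELDS = {"street"}              # scrambled via transformation logic

def mask(value: str) -> str:
    """Replace a PII value with an irreversible token."""
    return "MASKED_" + hashlib.sha256(value.encode()).hexdigest()[:8]

def scramble(value: str) -> str:
    """Deterministically rearrange characters so the field shape is kept."""
    return value[::-1]

def prepare_test_rows(rows, source_filter=None):
    """Mask PII, scramble sensitive text, and filter rows by source."""
    prepared = []
    for row in rows:
        if source_filter and row.get("source") != source_filter:
            continue
        cleaned = {}
        for field, value in row.items():
            if field in PII_FIELDS:
                cleaned[field] = mask(value)
            elif field in SCRAMBLE_FIELDS:
                cleaned[field] = scramble(value)
            else:
                cleaned[field] = value
        prepared.append(cleaned)
    return prepared
```

The key idea is that masking is irreversible while scrambling preserves the field's shape, so referentially linked test data still behaves like production data in downstream business processes.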
Benefits
Faster Testing Cycles: Integrated test data provisioning removes bottlenecks in SAP testing workflows
Improved Test Accuracy: Production-grade data ensures test scenarios match actual business operations
Reduced Project Risk: Valid test datasets prevent failures caused by incomplete or incorrect data relationships
Compliance Assurance: Built-in data masking and governance eliminate regulatory concerns in testing environments
Snapshot Management Workspace
We're excited to announce the upcoming release of the Snapshot Management workspace, a dedicated workspace in Replicate Preview that replaces the legacy Snapshot Management user interface within Migrate. This change allows Migrate to use the capabilities offered by Replicate Preview instead of the Syniti Replicate desktop version. It enhances the user experience by consolidating data replication capabilities within a unified interface while maintaining full compatibility with existing Migrate workflows, such as replications in ETL Jobs and target datastores on the Value Mapping page.
The workspace provides a centralized location where users can view, manage, and monitor all snapshots across their infrastructure, with streamlined navigation and intuitive controls for faster and more efficient snapshot management.
Additional functionalities not available in the legacy UI include:
Configuring replication dependencies
Creating a replication group
Running pre- and post-replication business rules
Replicating from Syniti Drive (flat files)
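As a rough illustration of how replication dependencies and pre/post business rules interact within a replication group, consider the sketch below. It is a conceptual model only, not SKP's actual API: the function names and the use of a topological sort are assumptions.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Conceptual sketch: a replication group runs its tables in dependency order,
# invoking optional pre/post business rules around each replication.
# Names and structure are illustrative, not SKP's actual interface.

def run_replication_group(tables, dependencies, replicate,
                          pre_rule=None, post_rule=None):
    """Replicate tables in dependency order.

    dependencies maps a table to the set of tables that must replicate first.
    """
    order = list(TopologicalSorter(dependencies).static_order())
    # Tables with no declared dependencies still need to run.
    order += [t for t in tables if t not in order]
    executed = []
    for table in order:
        if table not in tables:
            continue
        if pre_rule:
            pre_rule(table)     # e.g. truncate staging, validate filters
        replicate(table)
        if post_rule:
            post_rule(table)    # e.g. row-count reconciliation
        executed.append(table)
    return executed
```

The topological sort guarantees that, for example, a line-item table never replicates before its header table, which is the point of configuring dependencies in the first place.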
Key Changes
Provisioning: The Snapshot Management workspace is automatically provisioned for new tenants. For existing tenants, all configuration resources are automatically migrated.
Terminology Updates:
Snapshot Datasources → Target Datastores (prefixed with SRC** or TGT***)
System Datasources → Source Datastores
Snapshot Tables/System Snapshot Tables → Replications (including tables and Orchestrate workflows)
Catalog Integration: Target datastores configured in Replicate Preview appear on the Datasources page with Datasource Purpose set to Snapshot.
Field Mappings: The Do Not Import toggle is replaced by the Field Mappings feature in the Replications panel for excluding columns during replication.
MIGRATE_APP Tables: Now registered as OData endpoints with views prefixed with MIG_ for Central Relevancy tables, System Reports, and Rulebook views.
Logs: Replication logs are managed via the Progress Status column in the Replications panel, where you can click a status to view execution history.
Unstructured Data Quality
We’re excited to introduce Unstructured Data Quality, a new capability that extends data governance beyond structured databases to your business documents. Scan, categorize, and measure the quality and accuracy of unstructured data such as contracts, statements of work, purchase orders, and other critical business documents, directly within SKP.
Benefits
Unstructured data comprises 80-90% of institutional business data, yet traditional data quality tools focus only on structured data. Unstructured Data Quality bridges this gap, enabling you to:
Gain complete visibility across all your business data, both structured and unstructured
Enforce consistency through automated quality checks on document content
Reduce risk by identifying policy violations, missing information, and contradictions
Streamline workflows with immediate feedback on document quality at upload time
Key Features
Access and Categorize Documents: Documents are accessed from your connected repositories. Each document is automatically classified into a category (for example, contract type) to route it through the appropriate quality checks.
Extract Key Data: The system intelligently extracts key fields and data from your documents—such as payment terms, jurisdiction, and signature requirements—and routes this information through standard data quality processes.
Run Quality Checks: Two types of checks ensure document quality:
Passive Checks: Continuously monitor your document library in the background.
Active Checks: Validate documents at upload or publish time, providing immediate feedback to authors.
Review Results: Quality check results display pass or fail status with cited evidence from the document. If a document fails a check, suggested remediations help you address the issue.
Analyze Trends: The Unstructured Data Quality dashboard aggregates quality results, allowing you to filter and drill into problem areas by document, category, or rule.
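To illustrate the extract-then-check flow described above, here is a small sketch of an "active" check that extracts key fields from a document's text and returns pass/fail with cited evidence. The field names, patterns, and rule are hypothetical examples, not SKP's actual extraction logic.

```python
import re

# Illustrative sketch of an active document quality check: extract key
# fields from raw contract text, then validate them, returning a result
# with cited evidence. Field names and patterns are hypothetical.

def extract_fields(text):
    """Pull key fields out of raw document text with simple patterns."""
    fields = {}
    m = re.search(r"Payment terms:\s*(Net \d+)", text)
    if m:
        fields["payment_terms"] = m.group(1)
    m = re.search(r"Jurisdiction:\s*([A-Za-z ]+)", text)
    if m:
        fields["jurisdiction"] = m.group(1).strip()
    return fields

def check_completeness(text, required=("payment_terms", "jurisdiction")):
    """Active check: fail with evidence if a required field is missing."""
    fields = extract_fields(text)
    missing = [f for f in required if f not in fields]
    if missing:
        return {"status": "fail", "evidence": f"missing fields: {missing}"}
    return {"status": "pass", "evidence": fields}
```

A passive check would run the same validation periodically across a whole document library; an active check, as sketched here, runs at upload time so the author gets immediate feedback.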
Connectivity Upgrade: New Connector for SKP
We’re excited to announce the upcoming release of a major step forward in simplifying and modernizing connectivity for SKP! The new SKP Connector eliminates the need for complex VPNs and networking headaches by introducing a lightweight, secure connector. This upgrade streamlines deployment and enhances security.
Key Features
Simplified Deployment: The new connector simplifies installation and upgrades on both Linux and Windows.
Enhanced Security & Compliance: Supports FIPS-compliant algorithms, robust audit logging, and passes SAP PQ (Veracode) requirements.
Automatic Updates: Allows for automatic scheduled updates and connector log visibility from the SKP.
Benefits
Eliminates the need for WireGuard, UDP port management, and complex security discussions.
Simplifies the upgrade and installation process.
Provides a scalable, auditable, and secure way to connect tenants to the SKP.
Master Priority Fields
In a coming release, the Match module will support Master Priority Fields, giving you explicit control over master record selection within match groups.
Key Features
Define priority rules during Match job setup to automatically designate the master record in each match group based on your business requirements. The system evaluates records against your specified criteria and flags the highest-priority record as the master.
New fields on the Create Match Job and Edit Match Job pages:
Master Priority Order—Assign priority ranking (1, 2, 3, etc.) to establish evaluation sequence
Master Priority Type—Select the sorting direction for each priority field
Benefits
Improved data quality—Automate master record selection based on business-defined rules such as completeness, recency, or value.
Reduced manual effort—Eliminate manual review and rework by applying consistent prioritization logic across all match groups.
Enhanced transparency—Clearly understand why specific records are designated as masters through visible, auditable priority rules.
How It Works
During Match job execution, the system converts your priority field mappings and applies them to sort records within each match group. Priority fields are excluded from recommended match settings to ensure they influence only master record selection, not matching logic.
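The sort-and-select step above can be sketched as follows. This is an illustrative model of priority-based master selection, not SKP's implementation; the field names, directions, and numeric values are hypothetical.

```python
# Conceptual sketch of master-record selection by priority fields.
# Each priority field has a direction: 'desc' means the highest value wins,
# 'asc' means the lowest value wins. Field names here are hypothetical.

def select_master(records, priority_fields):
    """Return the record that wins on the ordered priority fields.

    priority_fields: list of (field, direction) evaluated in order,
    mirroring Master Priority Order and Master Priority Type.
    """
    def sort_key(record):
        key = []
        for field, direction in priority_fields:
            value = record.get(field, 0)
            key.append(-value if direction == "desc" else value)
        return tuple(key)

    ranked = sorted(records, key=sort_key)
    return ranked[0]  # highest-priority record becomes the master
```

Because later priority fields only break ties among earlier ones, the order in which fields are listed is as important as the fields themselves.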
Data Quality Dimensions
SKP will soon include Data Quality Dimensions, a powerful enhancement that enables structured assessment and classification of data quality issues across your subject areas, rules, datasets, and business processes. This feature brings industry-standard data governance practices directly into your data quality workflow.
Key Features
Classify Data Quality Issues by Dimension
Data quality rules can be tagged with standardized dimensions that represent different aspects of data quality:
Accuracy—Data correctly represents real-world values.
Completeness—Data is not missing required values.
Conformity—Data conforms to specified formats.
Consistency—Data provides non-conflicting information.
Integrity—Data maintains important relationship linkages.
Timeliness—Data is sufficiently up-to-date.
Uniqueness—Data is not duplicated.
Dimension Management
Administrators can:
Enable or disable the Data Quality Dimensions feature through the Admin module.
Create custom dimensions with unique names and descriptions.
Edit, enable, or disable custom dimensions.
Control which dimensions are available for rule tagging.
Enhanced Catalog Visibility
When Data Quality Dimensions are enabled, you can:
View dimension-specific charts on catalog pages.
Filter rule search results by dimension.
Click any dimension chart to see all rules tagged with that dimension.
Gain insights into which aspects of data quality need attention.
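The tagging, filtering, and chart behavior described above can be pictured with a small sketch. The rule names below are hypothetical examples, and this is a conceptual model rather than SKP's catalog implementation.

```python
from collections import Counter

# Illustrative sketch: rules tagged with quality dimensions, aggregated the
# way a dimension chart might summarize them. Rule names are hypothetical.

rules = [
    {"name": "Email format check", "dimension": "Conformity"},
    {"name": "Mandatory material group", "dimension": "Completeness"},
    {"name": "Duplicate vendor check", "dimension": "Uniqueness"},
    {"name": "Plant reference exists", "dimension": "Integrity"},
    {"name": "Missing payment terms", "dimension": "Completeness"},
]

def rules_by_dimension(rules, dimension):
    """Filter rule search results by a single dimension tag."""
    return [r["name"] for r in rules if r["dimension"] == dimension]

def dimension_chart(rules):
    """Count of rules per dimension, as a catalog chart might display."""
    return Counter(r["dimension"] for r in rules)
```

Clicking a segment of a dimension chart is effectively the filter operation: the chart shows the counts, and the filter returns the underlying rules.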
Benefits
Better Prioritization—Understand which types of data quality issues are most prevalent and prioritize remediation efforts accordingly.
Standardized Reporting—Align your data quality metrics with industry-recognized standards for more meaningful reporting to stakeholders.
Improved Insights—Analyze data quality issues by dimension to identify patterns and systemic problems across your data landscape.
Flexible Customization—Extend the default dimensions with custom categories that reflect your organization's specific data quality needs.
Getting Started
Once available, administrators will be able to enable Data Quality Dimensions in Admin. Users can then begin tagging rules with appropriate dimensions and viewing dimension-based reports in the catalog.
Import Datastores
In the coming weeks, we’re introducing the new Import Datastores feature to help streamline your workflow. It lets you select a connection and choose the relevant databases and schemas to create the necessary datastore(s), all in one place.
This is useful in cases where multiple databases exist for a connection (such as databases used in Migrate) or when you need to quickly browse for the desired schemas. Additionally, selecting specific schemas from a database allows you to import only the labels relating to those schemas.
Importing datastores will save you time and effort, making datastore creation faster and more efficient than ever!
Address Verification Module
We’re excited to announce the future launch of the Address Verification module, a new standalone capability designed to streamline and simplify the address validation process across the SKP.
Address validation is a critical process for ensuring data quality and improving record matching. The new Address Verification module makes this standard workflow accessible and efficient.
Key Features
Smart Data Mapping & Intelligent Verification
Map your input data to standard address fields with an intuitive interface
Automate verification of address data with configurable thresholds
Review & Decision Tools
Review suggestions with clear, readable confidence levels
Edit and accept recommendations directly within the interface
Configure automatic approval based on return codes for streamlined workflows
Seamless Data Flow
Import data from system datastore tables
Export verified results directly to your database
Track verification statuses and changes for full transparency
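As a rough illustration of the mapping and configurable auto-approval described above, consider the sketch below. The column names, return codes, and threshold are hypothetical, not SKP's actual configuration values.

```python
# Illustrative sketch of address field mapping and confidence-based
# auto-approval. Field names, return codes, and the threshold below are
# hypothetical examples, not SKP's actual values.

FIELD_MAP = {"street_1": "address_line_1", "town": "city", "zip": "postal_code"}

def map_to_standard(row):
    """Map source columns onto standard address fields; pass others through."""
    return {FIELD_MAP.get(col, col): val for col, val in row.items()}

def decide(result, auto_approve_codes=frozenset({"VERIFIED"}), threshold=0.9):
    """Auto-approve high-confidence verified results; queue the rest."""
    if result["return_code"] in auto_approve_codes and result["confidence"] >= threshold:
        return "approved"
    return "needs_review"
```

The point of the two-step design is that only results meeting both the configured return code and the confidence threshold bypass manual review, keeping humans in the loop for ambiguous addresses.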
Benefits
Improved Data Quality—Ensure addresses are accurate and standardized across your datasets
Enhanced Matching—Better address data leads to more accurate record matching
Time Savings—Automate validation workflows with configurable approval rules
Availability
The Address Verification module will be available in an upcoming release. Watch this page for future announcements with specific availability details.
Data Migration Quality Assurance
We’re excited to announce the upcoming launch of the Quality Assurance feature, which enhances data migration project delivery by providing automated, consistent governance to ensure adherence to the SynitiONE Methodology and best practices across projects. It evaluates potential errors in application configurations and database development work, consolidates findings into an interactive dashboard, and enables users to quickly identify and resolve issues without relying on manual code reviews.
By integrating quality insights directly into the delivery workflow, it supports faster error remediation, more efficient communication, and better assurance of project success.
Key Features
Automated Quality Checks—Executes validation reports against application tables, working database objects, and Migrate registrations to detect deviations from methodology and best practices.
Multi-Level Visibility—Presents high-level quality metrics in an interactive dashboard for project leadership and detailed issue breakdowns for other users, such as Migrate Developers.
Issue Reporting—Organizes findings into seven areas—Project Setup, Extraction, Dataset Design, Mapping, Development, Reporting, and Load Program—aligned with project phases.
Severity Classification—Assigns Critical, Info, or Debug severity levels to help teams prioritize remediation.
Trend Analysis—Tracks quality performance over time to highlight improvements or emerging risks.
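To show how severity classification and area-based reporting can combine in a dashboard view, here is a conceptual sketch. The finding records and helper functions are hypothetical illustrations, not the Quality Assurance feature's actual data model.

```python
from collections import Counter

# Illustrative sketch of severity-aware issue reporting. The seven areas
# come from the feature description; the findings and helpers are
# hypothetical examples, not the actual Quality Assurance data model.

AREAS = ("Project Setup", "Extraction", "Dataset Design", "Mapping",
         "Development", "Reporting", "Load Program")
SEVERITY_RANK = {"Critical": 0, "Info": 1, "Debug": 2}

def summarize(findings):
    """Count findings per (area, severity) pair for a dashboard view."""
    return Counter((f["area"], f["severity"]) for f in findings)

def prioritized(findings):
    """Order findings so Critical issues surface first for remediation."""
    return sorted(findings, key=lambda f: SEVERITY_RANK[f["severity"]])
```

Grouping by area answers "where are the problems?", while the severity ranking answers "what do we fix first?", which together cover the leadership and developer views the feature describes.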
