Infoworks Release Notes

v6.1.2

Date of Release: January 2025

This section describes the new features and enhancements introduced in this release.

  • Azure AD OAuth: Users can now authenticate with Snowflake using user-based tokens instead of client-based tokens by selecting the appropriate access type in the Azure AD Snowflake profile configuration.
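
The token handoff above can be sketched as follows. This is a minimal, hypothetical illustration using the public `snowflake-connector-python` OAuth flow; the account name, role, and the token-acquisition step are placeholders, not Infoworks internals, and the access type chosen in the profile determines whether the Azure AD token is user-based or client-based.

```python
from typing import Optional


def connection_params(account: str, token: str,
                      role: Optional[str] = None) -> dict:
    """Build parameters for snowflake.connector.connect() from an OAuth token.

    With authenticator='oauth', Snowflake accepts an externally issued Azure AD
    access token; the same parameter shape works for user-based and
    client-based (service principal) tokens.
    """
    params = {"account": account, "authenticator": "oauth", "token": token}
    if role:
        params["role"] = role
    return params


# Usage (token acquisition via MSAL or similar is out of scope here):
# import snowflake.connector
# conn = snowflake.connector.connect(**connection_params("myorg-myacct", token))
```
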

Resolved Issues

This section lists the issues resolved in this release:

  • IPD-27102: Automatically handle the case of in-progress workflows after the Azure Postgres maintenance.
  • IPD-27387: Pipeline jobs page showing 'No Jobs Found' despite previous runs.

Known Issues

  • Parallel segment ingestion for segmented loads fails when history is enabled.
  • The target table of a SQL pipeline created through SQL import is not available as a reference table.
  • Insert Overwrite mode is not supported in transformation pipelines.
  • For tables created using query as a table with incremental mode Insert Overwrite, CDC jobs fail when column names contain spaces.
  • For tables created using query as a table with incremental mode Insert Overwrite and a derived split-by configured, CDC jobs fail.
  • An ERROR message appears when opening preview data in a custom target node in a pipeline.
  • Streaming jobs that have been stopped may still show a running state for the cluster job. Users can verify that the job is actually stopped by observing that the number of batches run for that job does not increase after stopping it; more details here.
  • Micro-batch processing stops for streaming ingestion if the source stops streaming data on Databricks runtime version 14.3.
  • Pipeline builds fail when reading from a merged/deduplicated table is selected.
  • Pipeline node preview data requests time out for the first few tries.
  • An error message appears when attempting to delete an ACT CRM source.
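
The stopped-streaming-job check described above (confirming that the batch count no longer advances) can be sketched as a simple two-sample poll. `get_batch_count` is a hypothetical accessor standing in for however the batch metric is read from the cluster UI or API:

```python
import time
from typing import Callable


def appears_stopped(get_batch_count: Callable[[], int],
                    wait_seconds: float = 0.0) -> bool:
    """Return True if the batch count does not advance between two samples.

    A still-running streaming job would process new micro-batches during the
    wait, so an unchanged count suggests the job has actually stopped.
    """
    before = get_batch_count()
    time.sleep(wait_seconds)  # allow time for a new micro-batch to land
    return get_batch_count() == before
```

In practice `wait_seconds` should comfortably exceed the job's trigger interval, so that a running job is guaranteed to have produced at least one new batch between samples.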

Limitations

  • CDC SCD2 pipeline builds fail intermittently on a 6.1.0 Unity environment.
  • When a table is ingested using Databricks 14.3, previewing its data in a pipeline with either runtime 11.3 or 14.3 times out at the API.
  • Streaming is not supported on a shared cluster.
  • For TPT jobs running on a shared cluster, it is the user's responsibility to install TPT; otherwise, the job will not work due to a Databricks limitation.
  • In a non-Unity Catalog environment, the Databricks SQL execution type is supported only on DBFS storage.
  • Target tables used in SQL pipelines without a create-table query will not be available in data models.
  • The Spark execution type does not support SQL pipelines.
  • Jobs on the Databricks Unity environment fail with "Error code: FILE_NOT_FOUND_FAILURE." Refer here.
  • Preview Data is not supported on intermediate transformation nodes (that is, nodes other than source and target) for pipelines configured with the DBSQL execution engine.
  • In pipelines using Snowflake Metadata Sync sources, case-sensitive column names are not supported.

For Kubernetes-based installation, refer to Infoworks Installation on Azure Kubernetes Service (AKS).

For more information, contact support@infoworks.io.

Upgrade

For upgrading Azure Kubernetes, refer to Upgrading Infoworks from 6.1.1 to 6.1.2 for Azure Kubernetes.

PAM

The Product Availability Matrix (PAM) is available here.
