Date of Release: July 2024
This section describes the new features and enhancements introduced in this release.
Infoworks now supports Databricks Unity Catalog as a metastore.
Infoworks now supports reordering columns during ingestion directly from the UI. Users can specify the order in which columns should land in the target.
Infoworks now supports TLS version 1.3 for all its internal communications, helping enforce customer compliance requirements.
Infoworks now supports setting custom tags on ingestion, which can be used to identify and segregate resources based on tags.
Users can now specify a lower and an upper watermark for a particular run. Read more about this under Onboarding Data.
Users' access to secrets is now restricted by associating them with domains. Refer here for details.
Users' access to Data Environments is now restricted to their assigned domains. For details, refer here and to Adding Secret Store/Keyvault.
Users can now submit jobs and workflows during an upgrade.
Secrets from the secret store can be accessed in the Bash node of a workflow. Refer to the Bash Script section for details.
Secrets from the secret store can be accessed in job pre/post hooks and in the custom target for pipeline extension. Refer to the Managing Job Hooks section.
For the prerequisites of Network Policy on an AKS cluster, refer here.
Confluent Cloud now supports using Azure AD as an identity provider. You can view the details here.
This section lists the resolved issues in this release:
JIRA ID | Issue |
---|---|
IPD-25709 | Solution to reduce/eliminate Integer and Decimal datatype mismatches between Teradata source and target Snowflake |
IPD-26474 | Regression: Redshift pipeline build failing on Azure Databricks 14.3 |
IPD-25719 | Data validation for failed failure |
IPD-25837 | API to change table group scheduled user is not working |
IPD-25839 | GET table group call gives a refresh token of the scheduled user in the response |
IPD-25840 | Source_schema_name and Source_table_name are interchanged in the ingestion metrics response in v5.5.0.5 |
IPD-25868 | Incorrect success response of verify refresh token API |
IPD-25869 | Add table action of the file mapping doesn't validate the uniqueness of the target DB, schema, and table name combination |
IPD-26017 | Ingestion jobs fail when access control list parameter is configured with 'group' |
IPD-25952 | Status.sh showing connection error warnings/exception stack trace when all services are in stopped state |
IPD-26115 | CICD failing for Pipeline group migration |
IPD-26110 | The upload schema option is erroring if the user tries to update the column name |
IPD-26143 | Issue with executing snowflake stored procedure |
IPD-25940 | Snowflake pipeline export job on the existing artifacts (created in 5.0) creating the DB/SCHEMA/TABLE names as case sensitive (in lower case) |
IPD-26224 | Case of column names converted to lower case on pipeline build to cosmos db target |
IPD-26178 | Merge to Snowflake table fails SQL compilation error |
IPD-26285 | Ingestion failing with java.lang.NullPointerException in v5.5.1.2 |
IPD-26304 | Ingestion on user-managed tables failing with schema mismatch error post 5.5.1.2 upgrade |
IPD-26278 | v5.5.1.3 - Add user-agent to databricks API calls |
This section lists the improvements in this release:
JIRA ID | Improvements |
---|---|
IPD-25709 | Solution to reduce/eliminate Integer and Decimal datatype mismatches between Teradata source and target Snowflake |
IPD-24208 | Optimize the incremental query to retrieve a maximum watermark value for segmented load tables |
Set `PYTHON_REQUESTS_TIMEOUT` with the value "60" in the orchestrator deployment. For Kubernetes-based installation, refer to Infoworks Installation on Azure Kubernetes Service (AKS).
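For Kubernetes-based installations, a minimal sketch of setting this variable is shown below. The deployment name `orchestrator` and namespace `infoworks` are assumptions; substitute the names used in your environment (see the AKS installation guide for the actual resource names).

```shell
# Hypothetical example: inject PYTHON_REQUESTS_TIMEOUT into the orchestrator
# deployment. Deployment/namespace names are assumed, not documented values.
kubectl set env deployment/orchestrator \
  PYTHON_REQUESTS_TIMEOUT=60 \
  --namespace infoworks

# Verify the variable is present in the pod spec
kubectl get deployment orchestrator --namespace infoworks \
  -o jsonpath='{.spec.template.spec.containers[0].env}'
```

Updating the deployment this way triggers a rolling restart of the orchestrator pods so the new timeout takes effect.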
For more information, contact support@infoworks.io.
For upgrading from 5.5.1.x to 6.0.0 for Azure Kubernetes, refer to Upgrading Infoworks from 5.5.1.x to 6.0.0 for Azure Kubernetes.