Infoworks Release Notes

v5.4.0.5

Date of Release: April 2023

Resolved Issues

  • IPD-21604: User is unable to use Snowflake secondary roles in pipelines.
  • IPD-21703: Jobs intermittently fail due to Service Account Authorization Token expiry.
  • IPD-21277: During BTEQ conversion, the join condition contains invalid column references, resulting in a validation error.
  • IPD-21280: The SQL Import adds the IN and UNION conditions to the WHERE clause, which throws a validation error.
  • IPD-21475: During BTEQ conversion, pipelines create additional target tables before the union node.
  • IPD-21613: The SQL Import fails with a null pointer exception for the aggregate node.
  • IPD-21618: The job_object.json file from Infoworks ingestion/pipeline job logs contains the Databricks token in clear text.
  • IPD-21729: The password is shown in the logs when a Snowflake source is used.
  • IPD-21186: During BTEQ conversion, the "insert into..." query is created in overwrite mode for the existing table.
  • IPD-21275: The NOT IN condition is added in the join node.
  • IPD-21309: If an SQL query has multiple CASE statements without a column alias, only one derivation is created instead of the appropriate number of column ports.
  • IPD-21407: While configuring SFTP, if the user selects "Using Private Key" as the Authentication Mechanism type and selects the Private Key option button, an internal server error occurs afterwards.
  • IPD-21420: The feature to retrieve the private key for an SFTP source from the Key Vault secret store is not available.
  • IPD-21427: The pipeline build adds null columns to the "insert into..." query while using the reference table.
  • IPD-21441: There is no configuration to automatically deselect all the ziw columns on the target node.
  • IPD-21450: The Operations Dashboard does not display speedometer charts at the top of the page.
  • IPD-21451: Pipeline Preview Data fails as it is unable to connect to the Databricks cluster.
  • IPD-21510: No refresh tokens are generated for users created via the SAML or LDAP authentication flow.
  • IPD-21503: The Operations Dashboard displays a different count on the speedometer chart than in the actual list of jobs.
  • IPD-20984: When user-defined tables are selected for the ingestion target, Infoworks attempts to create error tables in the target schema instead of the staging schema.
  • IPD-21178: Sub-queries are not supported in the IN/NOT IN node during SQL Import.
  • IPD-21201: The SQL Import fails with a duplicate ziw_row_id column while running the BTEQ converter.
  • IPD-21266: The pipeline lineage API does not return the source schema and source table name for pipelines built in a Snowflake environment.
  • IPD-21319: Some upstream pipeline columns are not projected to the target table during a table update, resulting in pipeline failure at runtime.
  • IPD-21260: The column name is not extracted properly when it has a schema_name.table_name value.
  • IPD-21272: The Domain Summary page hangs when there are more than 100 pipelines in the domain.
  • IPD-21383: Infoworks overwrites the Spark driver and executor Java options, preventing users from adding custom Java options.
  • IPD-21247: The pipeline build fails when the Derive node (preceding the target node) is not part of a pipeline configured in Update mode.
  • IPD-21299: The Staging Schema Name field is not visible on the source setup page for Kafka sources on Snowflake.

Upgrade

Prerequisites

NOTE Before proceeding with the prerequisites in this section, ensure that all the Prerequisites for Installing Infoworks on AKS are validated.

  • Ensure the current deployment’s chart is present in /opt/infoworks.
  • Ensure that Python 3.8 or a later version, with the pip module, is installed on the Bastion VM.
  • Ensure stable internet connectivity on the Bastion VM to download the required Python packages from the Python repository during installation/upgrades.
  • Validate that the version of the old chart is 5.4.0.
Command

Output
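The validation command itself is not preserved on this page. As a sketch, assuming the old chart's metadata lives at /opt/infoworks/iw-k8s-installer/Chart.yaml (a hypothetical path based on the installer directory referenced later), the chart version could be checked like this:

```shell
# Assumed chart location; adjust to match your deployment.
CHART_FILE=/opt/infoworks/iw-k8s-installer/Chart.yaml
if [ -f "$CHART_FILE" ]; then
  # The old chart's version should read 5.4.0 before this upgrade.
  grep -E '^(version|appVersion):' "$CHART_FILE"
else
  echo "Chart.yaml not found at $CHART_FILE; verify the chart location."
fi
```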

Ensure that you take a backup of MongoDB Atlas and PostgresDB PaaS. If you do not take the backup, jobs will fail after a rollback operation. For more information, refer to MongoDB Backup and PostgresDB PaaS Backup.

Upgrade Instructions

To upgrade Infoworks on Kubernetes:

It is assumed that the existing chart is placed in the /opt/infoworks directory and that the user has access permissions to it.

Command
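The command is not reproduced here; a minimal sketch, assuming /opt/infoworks is the Infoworks home directory:

```shell
# Point IW_HOME at the assumed Infoworks home directory.
export IW_HOME=/opt/infoworks
# Confirm the directory exists and is accessible before proceeding.
ls -ld "$IW_HOME" 2>/dev/null || echo "No access to $IW_HOME; check permissions."
```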

Before selecting the type of upgrade, execute the following commands.

Step 1: Create the required directories and change the path to that directory.

Command
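The exact command is not preserved; a sketch of Step 1, assuming the downloads directory used by the later steps sits under $IW_HOME:

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"  # assumed Infoworks home
# Create the downloads directory referenced in the steps below and enter it.
# (If $IW_HOME is not writable in your session, fix permissions first.)
if mkdir -p "$IW_HOME/downloads" 2>/dev/null; then
  cd "$IW_HOME/downloads"
fi
pwd
```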

Internet-free Upgrade

NOTE If you are upgrading via the Internet-based procedure, skip to the next section.

Step 1: Download the upgrade tar files shared by the Infoworks team to the Bastion (Jump host) VM and place them under $IW_HOME/downloads.

Step 2: To configure Internet-free upgrade, execute the following command:

Command
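The configure command itself is not reproduced on this page. As a sanity check before configuring, one might confirm that the tar file from Step 1 is actually in place (the file name is assumed from the extraction step further below):

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"  # assumed Infoworks home
# For an Internet-free upgrade, the updater tar must already be present.
TARBALL="$IW_HOME/downloads/iwx_updater_k8s_5.4.0.5.tar.gz"
[ -f "$TARBALL" ] && echo "Found $TARBALL" || echo "Missing $TARBALL"
```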

Internet-based Upgrade

Step 1: Download the Update script tar file.

Command
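The download URL is shared by the Infoworks team and is not reproduced here; a hedged sketch with a placeholder URL:

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"     # assumed Infoworks home
UPDATER_URL="<url-shared-by-infoworks>"  # placeholder; substitute the real link
# Fetch the updater tar into the downloads directory (curl is illustrative;
# wget works equally well).
curl -fSL -o "$IW_HOME/downloads/iwx_updater_k8s_5.4.0.5.tar.gz" "$UPDATER_URL" \
  || echo "Download failed; verify the URL and network access."
```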

Common Steps for Both Internet-free and Internet-based

NOTE Once you have selected the type of upgrade, the below-mentioned steps are common to both Internet-free and Internet-based upgrades.

Step 1: Extract the iwx_updater_k8s_5.4.0.5.tar.gz file under $IW_HOME/downloads.

Do not extract the tar file to /opt/infoworks/iw-k8s-installer as it would result in loss of data.

Bash
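A sketch of the extraction, assuming the tar was placed in $IW_HOME/downloads:

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"  # assumed Infoworks home
cd "$IW_HOME/downloads" 2>/dev/null || echo "Missing $IW_HOME/downloads"
TARBALL=iwx_updater_k8s_5.4.0.5.tar.gz
# Extract in place -- NOT into /opt/infoworks/iw-k8s-installer (see the
# data-loss warning above).
if [ -f "$TARBALL" ]; then
  tar -xzf "$TARBALL"
  ls update-k8s.sh configure.sh  # the two files the archive should yield
else
  echo "Missing $TARBALL; download it first."
fi
```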

This should create two new files: update-k8s.sh and configure.sh.

Step 2: Run the script.

Command
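A sketch of Step 2, assuming update-k8s.sh was extracted into $IW_HOME/downloads in Step 1:

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"  # assumed Infoworks home
cd "$IW_HOME/downloads" 2>/dev/null || true
# Run the updater; it ends with a 30-second Y/N prompt about triggering
# the helm upgrade (see the NOTE that follows).
if [ -f ./update-k8s.sh ]; then
  bash ./update-k8s.sh
else
  echo "update-k8s.sh not found; complete the extraction step first."
fi
```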

NOTE At the end of the above command's execution, if you want to run the helm upgrade manually, type N and press Enter. A 30-second timeout is set for abandoning the deployment of the upgraded version; if no input is received within that time, the deployment is triggered.

Output

Step 3 (Applicable only for Databricks Persistent Clusters): A change in the Infoworks jar requires the libraries to be uninstalled and the cluster restarted; without this step, stale jars will remain. Perform the following steps:

(i) Go to the Databricks workspace, navigate to the Compute page, and select the cluster that has stale jars.

(ii) In the Libraries tab, select all the Infoworks jars and click Uninstall.

(iii) From Infoworks UI or Databricks dashboard, select Restart Cluster.

Rollback

Prerequisites

  • Before executing the rollback script, ensure that IW_HOME variable is set.
  • Assuming Infoworks home directory is /opt/infoworks, run the below command to set the IW_HOME variable.
  • Validate that the version of the old chart is 5.4.0.5.
Command
  • Ensure the current deployment’s chart is present in /opt/infoworks.
  • Execute the below command to check the appVersion.
Command

Output
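The commands for the bullets above are not preserved; a sketch, assuming /opt/infoworks as the home directory and the chart metadata at iw-k8s-installer/Chart.yaml (a hypothetical path):

```shell
# Set IW_HOME as required by the rollback script (assumed home directory).
export IW_HOME=/opt/infoworks
# Check the deployed chart's appVersion; it should read 5.4.0.5 before
# rolling back. The Chart.yaml location is an assumption.
CHART_FILE="$IW_HOME/iw-k8s-installer/Chart.yaml"
[ -f "$CHART_FILE" ] && grep '^appVersion:' "$CHART_FILE" \
  || echo "Chart.yaml not found at $CHART_FILE"
```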

Ensure that you restore MongoDB Atlas and PostgresDB PaaS. If the backup is not restored, jobs will fail after the rollback operation. For more information, refer to MongoDB Restore and PostgresDB Restore.

Rollback Instructions

NOTE Since IW_HOME was exported in the Prerequisites section above, the following steps can be executed from any location by users with read/write access to that IW_HOME directory.

Step 1: Download the rollback script.

Command

Step 2: Place the rollback script in the same directory as the existing iw-k8s-installer.

Step 3: Ensure that you have write permission on the $IW_HOME directory.

Step 4: Give executable permission to the rollback script using the below command.

Command

Step 5: Run the script.

Command
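Steps 1, 4, and 5 might be sketched together as follows; the download URL and the script name rollback-k8s.sh are assumptions, since neither is preserved on this page:

```shell
IW_HOME="${IW_HOME:-/opt/infoworks}"                 # set in Prerequisites
ROLLBACK_URL="<rollback-script-url-from-infoworks>"  # placeholder URL
SCRIPT="$IW_HOME/rollback-k8s.sh"                    # assumed name/location
# Step 1: download the rollback script next to iw-k8s-installer.
curl -fSL -o "$SCRIPT" "$ROLLBACK_URL" || echo "Download failed; verify the URL."
# Step 4: give the script executable permission.
chmod +x "$SCRIPT" 2>/dev/null || echo "Could not chmod $SCRIPT"
# Step 5: run it; expect the Y/N prompt described in Step 6.
[ -x "$SCRIPT" ] && bash "$SCRIPT" || echo "$SCRIPT is not ready to run."
```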

Step 6: At the following prompt, "Enter N to skip running the above command to upgrade the helm deployment. (timeout: 30 seconds): ", type Y and press Enter.

NOTE If you want to run helm upgrade manually, then type N and press Enter.

Prompt

NOTE There is a 30-second timeout set to abandon the deployment of the downgraded version. If no input is received within the timeout duration, the deployment is triggered.