Infoworks Release Notes

v5.3.0.13

Date of Release: April 2023

Features and Enhancements

IPD-20373: Infoworks provides REST APIs to create and delete file mappings for Mainframe sources. The sample curl request can be found here.
IPD-20276: Infoworks now supports ingestion from Variable Block (VB) Mainframe files. To add a variable block mainframe file mapping, select Variable Block from the Record Type dropdown.
IPD-20175: Infoworks can now ingest Copybook files that have incorrect indentation. You can change the values of comment_upto_char (default: 6) and comment_after_char (default: 6).
IPD-20173, IPD-20097, and IPD-20088: Infoworks supports ingesting all types of mainframe files with a filter.
IPD-20099: Infoworks supports ingesting Copybook files that have FILLER as a column name.
IPD-20098: Infoworks supports flattening of complex datatypes.
IPD-20685: Infoworks supports registering and de-registering Hive/Metastore UDFs. The following key-value pairs control registration and de-registration; an illustrative example follows the list.

should_register_hive_udfs=true/false

should_deregister_hive_udfs=true/false

hive_udfs_to_register=<functionName1>:<fullyQualifiedClassNameImplementingFunction1>;<functionName2>:<fullyQualifiedClassNameImplementingFunction2>

hive_udfs_to_deregister=<functionName1>;<functionName2>
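For example, a hypothetical UDF class com.example.udf.MaskEmail (both the function name and the class below are illustrative, not shipped with Infoworks) could be registered with:

should_register_hive_udfs=true
hive_udfs_to_register=mask_email:com.example.udf.MaskEmail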

Resolved Issues

IPD-21571: Backticks (`) are not preserved in pipelines while importing a SQL query.
IPD-21574: The API to create Query as a Table does not work in the BigQuery environment.
IPD-21423: Pipeline build fails with an "Unable to find table metadata for node" error message.
IPD-20206: Stopping the Orchestrator using the stop.sh script stops all the components except the Orchestrator Engine Worker.
IPD-20921: Backlog jobs get stuck in the pending state and do not progress, blocking other jobs as well.
IPD-21003: During the pipeline build, Infoworks is unable to read the timestamp in the BigQuery output.
IPD-21093: For a fixed-length file, the first column gets moved to the last position during ingestion.
IPD-20889: In a TPT-based Teradata table, there is a discrepancy between the actual job run time and the job duration shown on the Job Metrics page.
IPD-20979: The BigQuery pushdown attempts to read from and write to the specified parent project instead of the project configured in the environment.
IPD-20087: The clustering columns are missing in the target BigQuery table.
IPD-20536: Data Analyst and Data Modeler have permissions to preview the data in a pipeline, view sample data, and generate sample data when configuring a table.
IPD-20022: Despite dataset creation being disabled in the pipeline configuration, the pipeline still creates the schema.
IPD-19943: There is no provision to configure the disk space for the Dataproc clusters.
IPD-19929: In a few scenarios, upgrading from 5.3.0 to 5.3.0.5 crashes the Ingestion service.
IPD-19945: Infoworks does not fetch the correct datatypes for the CData sources.
IPD-19821: When the service credential used in the BigQuery target differs from the service credential used to create the environment, Sync to Target fails with an "Invalid JWT Signature" error.
IPD-19766: For BigQuery export files, Sync to Target fails when the table schema contains an array type.
IPD-19853: If the number of characters in the table name exceeds 27, export to Teradata fails with a "table_name_temp already exist" error.
IPD-19751: Infoworks does not disable query caching while fetching the schema from BigQuery.
IPD-19815: There are incorrect log messages in Sync to Target for Teradata jobs in 5.3.
IPD-19474: The API POST call to Pipeline Config-Migration fails with a generic error.
IPD-19545: The list data connections and GET data connections APIs are accessible only to Admin users.
IPD-19542: When running ingestion on a BigQuery environment, the error table is not created in the BigQuery dataset if the source has only one error record.
IPD-19339: Despite cluster creation being complete, the Creating Cluster timer duration keeps increasing.
IPD-19663: Workflows failing due to a request timeout go directly to the failed state without executing any of the tasks.
IPD-19701: The API call to trigger Sync to Target for a table group configured with a target data connection fails.
IPD-19753: For JSON and streaming sources, if the CDC data has a column with empty values, it is marked as an error record.
IPD-20202: Mainframe ingestion fails if you manually change the datatype of a column after the metacrawl.
IPD-20174 and IPD-20172: If a COBOL layout file does not have a header field, Infoworks is unable to crawl/ingest the EBCDIC file.
IPD-20684: Google changed the return message for exception handling of autoscaling policies, resulting in job failures.
IPD-20570: The "Add tables to crawl" API does not work for the BigQuery Sync source.
IPD-20455: The dt advanced configuration dt_batch_spark_coalesce_partitions, used to merge partitions, does not take effect in the pipeline job.
IPD-20432: Workloads run in the Compute project rather than the Storage project where the datasets are persisted.
IPD-20397: Pipeline build fails when a source table column has a trailing "%".
IPD-20207: Data Analyst and Data Modeler are unable to crawl metadata.
IPD-20371: While configuring the BigQuery target, the columns get ordered alphabetically irrespective of the order the user chooses.

Upgrade

Procedure

Step 1: Stop all running jobs.

Step 2: Change the directory to the tmp folder.

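A minimal sketch, assuming the standard /tmp path:

cd /tmp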

Step 3: Change the user to infoworks.

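A minimal sketch, assuming infoworks is the service user on this node:

su - infoworks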

Step 4: To download the tar file, execute the following command.

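A sketch using wget; the URL below is a placeholder for the download link provided by Infoworks:

wget "<patch-tar-download-url>"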

Step 5: Extract the tar files.

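For example, assuming the downloaded tarball is named as below (the file name is illustrative):

tar -xzf infoworks-5.3.0.13.tar.gz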

Step 6: Set the IW_HOME environment variable.

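A sketch assuming Infoworks is installed under /opt/infoworks; adjust the path to your installation:

export IW_HOME=/opt/infoworks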

Step 7: Create a backup directory.

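For example, a backup directory under /tmp (the name is illustrative):

mkdir -p /tmp/iw_5.3.0.13_backup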

Step 8: Move the REST API and DT files to the backup folder.

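A sketch with placeholder paths; substitute the REST API and DT file names delivered with the patch:

mv $IW_HOME/<rest-api-files> /tmp/iw_5.3.0.13_backup/
mv $IW_HOME/<dt-files> /tmp/iw_5.3.0.13_backup/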

Step 9: Copy the patch files to the respective original directories.

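A sketch with placeholder paths, copying each patch file over its original location:

cp <extracted-patch-dir>/<rest-api-files> $IW_HOME/<rest-api-directory>/
cp <extracted-patch-dir>/<dt-files> $IW_HOME/<dt-directory>/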

Step 10: Restart the REST API and DT services.

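A sketch assuming the bundled stop.sh and start.sh scripts manage these services; the service names below are illustrative:

$IW_HOME/bin/stop.sh restapi dt
$IW_HOME/bin/start.sh restapi dt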

Rollback

To roll back to the previous checkpoint version:

Step 1: Stop all running jobs.

Step 2: Change the directory to the tmp folder.

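As in the upgrade procedure, a minimal sketch:

cd /tmp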

Step 3: Change the user to infoworks.

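A minimal sketch, assuming infoworks is the service user:

su - infoworks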

Step 4: Set the IW_HOME environment variable.

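A sketch assuming the same installation path as in the upgrade procedure:

export IW_HOME=/opt/infoworks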

Step 5: Remove patch specific files and folders.

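A sketch with placeholder names; remove the files and folders that the patch installed:

rm -rf $IW_HOME/<patched-rest-api-files> $IW_HOME/<patched-dt-files>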

Step 6: Move the files and folders from the backup directory to the respective original directories.

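A sketch assuming the backup directory created during the upgrade (names are placeholders):

mv /tmp/iw_5.3.0.13_backup/<rest-api-files> $IW_HOME/<rest-api-directory>/
mv /tmp/iw_5.3.0.13_backup/<dt-files> $IW_HOME/<dt-directory>/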

Step 7: Restart the REST API and DT services.

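As in the upgrade procedure, a sketch with illustrative service names:

$IW_HOME/bin/stop.sh restapi dt
$IW_HOME/bin/start.sh restapi dt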