Date of Release: March 2023
JIRA ID | Issue |
---|---|
IPD-20373 | Infoworks provides REST APIs to create and delete file mappings for Mainframe sources. A sample curl request is available in the documentation. |
IPD-20276 | Infoworks now supports ingestion from Variable Block (VB) Mainframe files. To add a variable block mainframe file mapping, select Variable Block from the Record Type dropdown. |
IPD-20175 | Infoworks can now ingest Copybook files that have incorrect indentation. You can change the values of comment_upto_char (Default Value: 6) and comment_after_char (Default Value: 6). |
IPD-20173, IPD-20097, and IPD-20088 | Infoworks supports ingesting all types of mainframe files with filters. |
IPD-20099 | Infoworks supports ingesting Copybook files that have FILLER as a column name. |
IPD-20098 | Infoworks supports flattening of complex datatypes. |
IPD-20685 | Infoworks supports registering and de-registering Hive/Metastore UDFs. The following key-value pair is used to register/de-register: hive_udfs_to_register= |
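For IPD-20175 above, the indentation-tolerance settings would be supplied as key-value configurations; a hedged illustration using the default values from the table:

```
comment_upto_char=6
comment_after_char=6
```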
JIRA ID | Issue |
---|---|
IPD-21423 | Pipeline build fails with "Unable to find table metadata for node" error message. |
IPD-20206 | If a user tries to stop the Orchestrator using the stop.sh script, it stops all components except the Orchestrator Engine Worker. |
IPD-20921 | Backlog jobs get stuck in the pending state and do not progress, resulting in other jobs also getting blocked. |
IPD-21003 | During the pipeline build, Infoworks is unable to read the timestamp in the BigQuery output. |
IPD-21093 | For a fixed length file, the first column gets moved to the last during ingestion. |
IPD-20889 | In a TPT-based Teradata table, there is a discrepancy between the actual job run time and the job duration shown on the Job Metrics page. |
IPD-20979 | The BigQuery pushdown attempts to read and write to the specified parent project instead of the configured project in the environment. |
IPD-20087 | The Clustering columns are missing in the Target BigQuery table. |
IPD-20536 | Data Analyst and Data Modeler have permissions to preview the data in a pipeline, view sample data, and generate sample data when configuring a table. |
IPD-20022 | Despite disabling the dataset creation in the pipeline configuration, the pipeline still creates the schema. |
IPD-19943 | There is no provision to configure the disk space for the Dataproc clusters. |
IPD-19929 | In a few scenarios, upgrading from 5.3.0 to 5.3.0.5 crashes the Ingestion service. |
IPD-19945 | Infoworks does not fetch the correct datatypes for the CDATA sources. |
IPD-19821 | When the service credential used in the BigQuery target differs from the service credential used to create the environment, Sync to Target fails with an "Invalid JWT Signature" error. |
IPD-19766 | For BigQuery export files, Sync to Target fails when the table schema contains an array type. |
IPD-19853 | If the table name exceeds 27 characters, export to Teradata fails with a "table_name_temp already exist" error. |
IPD-19751 | Infoworks does not disable query caching while fetching schema from BigQuery. |
IPD-19815 | There are incorrect log messages in Sync to Target for Teradata jobs in 5.3. |
IPD-19474 | The API POST call to Pipeline Config-Migration fails with a generic error. |
IPD-19545 | The list of data connections and GET data connections APIs are accessible only to Admin users. |
IPD-19542 | When running ingestion on a BigQuery environment, the error table is not created in the BigQuery dataset if the source has only one error record. |
IPD-19339 | Despite cluster creation being complete, the Creating Cluster timer duration keeps increasing. |
IPD-19663 | Workflows fail due to request timeouts and go directly to the failed state without executing any tasks. |
IPD-19701 | The API call to trigger Sync to Target for the table group configured with target data connection fails. |
IPD-19753 | For JSON and streaming sources, if the CDC data has a column with empty values, it is marked as an error record. |
IPD-20202 | If you manually change the datatype for a column after the metacrawl, Mainframe ingestion fails. |
IPD-20174 and IPD-20172 | If a COBOL layout file does not have a header field, Infoworks is unable to crawl/ingest the EBCDIC file. |
IPD-20684 | Google has changed the return message for exception handling of autoscaling policies resulting in job failure. |
IPD-20570 | The "Add tables to crawl" API is not working for BigQuery Sync source. |
IPD-20455 | The dt advanced configuration dt_batch_spark_coalesce_partitions to merge partitions does not take effect in the pipeline job. |
IPD-20432 | The workloads were running in the Compute project rather than the Storage project where datasets are persisted. |
IPD-20397 | Pipeline build fails when the source table column has trailing "%". |
IPD-20207 | Data Analyst and Data Modeler are unable to crawl metadata. |
IPD-20371 | While configuring the BigQuery target, the columns are ordered alphabetically irrespective of the order the user chooses. |
Assuming the IW_HOME variable is set to /opt/infoworks.
To support rollback after metadata migration, you need to take a backup of the metadata. Following are the steps:
Step 1: Install/download the MongoDB tool mongodump (if needed).

Step 2: Create a directory to store the database backup dump using the below commands:

```shell
mkdir -p $IW_HOME/mongo_bkp
cd $IW_HOME/mongo_bkp
```

Step 3: Use the below command to take a dump (backup) of the databases from the MongoDB server.

If MongoDB is hosted on Atlas:

```shell
mongodump "mongodb+srv://<username>:<password>@<mongodb_server_hostname>/<db_name>"
```

If MongoDB is installed with Infoworks on the same VM:

```shell
mongodump "mongodb://infoworks:IN11**rk@localhost:27017/infoworks-new"
```
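Before relying on the backup, it helps to confirm that the dump actually wrote BSON files. A minimal sketch: the verify_dump helper and the throwaway directory layout below are illustrative, not part of Infoworks or MongoDB tooling.

```shell
#!/usr/bin/env bash
# Hypothetical sanity check: a successful mongodump run leaves *.bson files
# under its output directory.
verify_dump() {
  dump_dir="$1"
  count=$(find "$dump_dir" -name '*.bson' | wc -l)
  [ "$count" -gt 0 ] && echo "OK: $count BSON file(s) in $dump_dir"
}

# Demo against a throwaway layout that mimics mongodump's output tree.
demo=$(mktemp -d)
mkdir -p "$demo/dump/infoworks-new"
touch "$demo/dump/infoworks-new/users.bson"
verify_dump "$demo/dump"
```

In a real run you would point verify_dump at $IW_HOME/mongo_bkp/dump instead of the demo directory.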
For upgrading from 5.3.0 to 5.3.0.12, execute the following commands:

Step 1: Use the deployer to upgrade from 5.3.0 to 5.3.0.12.

Step 2: Go to the iw_home/scripts folder of the test machine.

Step 3: Remove any existing copy of the update script:

```shell
rm update_5.3.0.12.sh
```

Step 4: Download the update_5.3.0.12.sh script:

```shell
wget https://iw-saas-setup.s3.us-west-2.amazonaws.com/5.3/update_5.3.0.12.sh
```

Step 5: Give update_5.3.0.12.sh executable permission:

```shell
chmod +x update_5.3.0.12.sh
```

Step 6 (Optional): This patch requires the Mongo metadata to be migrated. To migrate the metadata, run export METADB_MIGRATION=Y; otherwise, run export METADB_MIGRATION=N. Alternatively, you can enter the choice at the prompt while running the script.
Step 7: Update the package to the hotfix:

```shell
source iw_home/bin/env.sh
./update_5.3.0.12.sh -v 5.3.0.12-ubuntu2004
```

You will receive a "Please select whether metadb migration needs to be done([Y]/N)" message. If you need to perform metadb migration, enter Y; else, enter N.
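Before running the updater, a quick pre-flight check can catch a missing or non-executable script and an unset METADB_MIGRATION. The preflight helper and the temp directory standing in for iw_home/scripts are illustrative assumptions, not part of the product.

```shell
#!/usr/bin/env bash
# Hypothetical pre-flight check before running the updater (sketch only).
preflight() {
  dir="$1"
  if [ ! -x "$dir/update_5.3.0.12.sh" ]; then
    echo "updater missing or not executable in $dir"
    return 1
  fi
  case "${METADB_MIGRATION:-}" in
    Y|N) echo "METADB_MIGRATION=${METADB_MIGRATION}" ;;
    *)   echo "METADB_MIGRATION not set; the script will prompt" ;;
  esac
}

# Demo in a temp directory standing in for iw_home/scripts.
scripts=$(mktemp -d)
touch "$scripts/update_5.3.0.12.sh" && chmod +x "$scripts/update_5.3.0.12.sh"
METADB_MIGRATION=Y preflight "$scripts"
```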
To rollback the migrated metadata:

Step 1: Install/download the MongoDB tool mongorestore (if needed).

Step 2: Switch to the directory where the backup is saved on the local system:

```shell
cd ${IW_HOME}/mongo_bkp/dump
```

Step 3: Use the below command to restore the dump (backup) of the databases to the MongoDB server.

If MongoDB is hosted on Atlas:

```shell
mongorestore "mongodb+srv://<username>:<password>@<mongodb_server_hostname>/<db_name>" --drop ./<db_name>
```

If MongoDB is installed with Infoworks on the same VM:

```shell
mongorestore "mongodb://infoworks:IN11**rk@localhost:27017/infoworks-new" --drop ./<db_name>
```
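mongodump writes one sub-directory per database under dump/, and the ./<db_name> argument to mongorestore is simply that directory name. A runnable sketch that lists the restorable database directories; the fake layout is for illustration only:

```shell
#!/usr/bin/env bash
# Build a throwaway tree mimicking mongodump output, then list the database
# directories under dump/ (each one is a valid <db_name> for mongorestore).
demo=$(mktemp -d)
mkdir -p "$demo/dump/infoworks-new"
cd "$demo/dump"
for db in */; do
  echo "restorable database: ${db%/}"
done
```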
To go back to the previous checkpoint version:

Step 1: In a web browser, go to your Infoworks system, scroll down to the bottom, and click the Infoworks icon.

Step 2: The Infoworks Manifest Information page opens in a new tab. Scroll down and check the Last Checkpoint Version.

Step 3: SSH to the Infoworks VM and switch to {{IW_USER}}.
Step 4: Initialize the variables in the bash shell:

```shell
full_version=5.3.0.12
unpatched_version=$(echo $full_version | cut -d "." -f 1-3)
major_version=$(echo $full_version | cut -d "." -f 1-2)
previous_version=<Previous Version> # Last Checkpoint Version from Step 2
os_suffix=<OS Suffix> # One of [ ubuntu2004 amazonlinux2 rhel8 ]
```

Step 5: Download the required deployer for the currently applied patch:

```
https://iw-saas-setup.s3-us-west-2.amazonaws.com/${major_version}/deploy_${unpatched_version}.tar.gz
```
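With the variables from Step 4, the deployer URL expands as follows. This sketch only prints the resulting URL rather than downloading anything:

```shell
full_version=5.3.0.12
unpatched_version=$(echo "$full_version" | cut -d "." -f 1-3)   # 5.3.0
major_version=$(echo "$full_version" | cut -d "." -f 1-2)       # 5.3
# prints https://iw-saas-setup.s3-us-west-2.amazonaws.com/5.3/deploy_5.3.0.tar.gz
echo "https://iw-saas-setup.s3-us-west-2.amazonaws.com/${major_version}/deploy_${unpatched_version}.tar.gz"
```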
Step 6: Use scp to copy the downloaded deploy_${unpatched_version}.tar.gz file to the ${IW_HOME}/scripts/ directory.
Step 7: Extract the deployer tar file, removing any existing iw-installer directory first:

```shell
cd ${IW_HOME}/scripts
[[ -d iw-installer ]] && rm -rf iw-installer
tar xzf deploy_${unpatched_version}.tar.gz
cd iw-installer
```
Step 8: Initialize the environment variables:

```shell
source ${IW_HOME}/bin/env.sh
export IW_PLATFORM=saas
```

Step 9: Run the rollback command:

```shell
./rollback.sh -v ${previous_version}-${os_suffix}
```
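For example, if the Last Checkpoint Version read in Step 2 were 5.3.0.11 on Ubuntu 20.04 (both hypothetical values), the rollback command would expand as this sketch shows:

```shell
previous_version=5.3.0.11   # hypothetical Last Checkpoint Version
os_suffix=ubuntu2004        # hypothetical OS suffix
echo "./rollback.sh -v ${previous_version}-${os_suffix}"
# prints: ./rollback.sh -v 5.3.0.11-ubuntu2004
```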