Date of Release: June 2023
JIRA ID | Issue |
---|---|
IPD-23216* | Unable to unlock locked entities. |
IPD-22964 | The connection.schema_registry_authentication_details.password field is now part of the iw_migration script. |
IPD-22983 | For a source table fetched from a Confluent source with the incremental mode set to Append, the pipeline source query no longer fetches the entire dataset on every run, irrespective of the provided query.
IPD-22963 | For the 5.3.x and 5.4.x versions, when Incremental Load is enabled and the Sync Type is set to append, the second build of the pipeline no longer copies duplicate records.
IPD-22038 | Fixed the unlock functionality for the Admin users. |
IPD-22113 | You can now create pipelines via the API by using Environment Name, Environment Storage Name, and Environment Compute Template Name. |
IPD-22817 | The batch_engine key now validates the user input during pipeline creation via API.
IPD-22721 | For Confluent Kafka source, the streaming_group_id_prefix configuration is now working as expected. |
IPD-22615 | The Partition and the Clustering details are now appearing in the BigQuery table created via Infoworks pipeline. |
IPD-22036 | Pipeline build now succeeds even when the target table for the BigQuery external target already exists and is clustered. |
IPD-22090 | For the Delimited File target, the timestamp format can be configured as per the user requirements. This applies to all timestamp columns of that table.
IPD-22351 | Infoworks has added advanced configurations for setting BigQuery Session Project IDs for Data Transformation and Ingestion jobs (dt_bigquery_session_project_id/ingestion_bigquery_session_project_id).
IPD-21449 | The Import SQL API is now picking the correct table (even if a table with the same schema/table name is present in multiple data environments). |
IPD-21534 | The Initialize & Ingest and Truncate jobs can now reset the value of the last_merged_watermark key. |
IPD-21584 | The Import SQL command is now able to fetch queries that contain backtick (`). |
IPD-21700 | Fixed the pipeline deletion issue. |
IPD-21792 | Duplicate tables can no longer be onboarded on the Hive Metadata Sync source.
Assuming the IW_HOME variable is set to /opt/infoworks.
To support rollback after metadata migration, you need to take a backup of the metadata. Follow the steps below:
Step 1: Install or download the MongoDB tool mongodump (if needed).
Step 2: Create a directory to store the database backup dump using the below commands.
mkdir -p $IW_HOME/mongo_bkp
cd $IW_HOME/mongo_bkp
Step 3: Use the below command to take a dump (backup) of the databases from the MongoDB server.
If MongoDB is hosted on Atlas
mongodump "mongodb+srv://<username>:<password>@<mongodb_server_hostname>/<db_name>"
If MongoDB is installed with Infoworks on the same VM
mongodump "mongodb://infoworks:IN11**rk@localhost:27017/infoworks-new"
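The two mongodump invocations differ only in their connection string. As a minimal sketch (the function and variable names below are illustrative, not Infoworks settings), the URI can be composed from its parts so one script covers both the Atlas and the local case:

```shell
#!/usr/bin/env bash
# Illustrative helper (not an Infoworks utility): compose the mongodump
# connection string from its parts. All names/values are assumptions.
set -euo pipefail

build_dump_uri() {
  local scheme="$1" user="$2" pass="$3" host="$4" db="$5"
  printf '%s://%s:%s@%s/%s\n' "$scheme" "$user" "$pass" "$host" "$db"
}

# Atlas-hosted MongoDB uses the mongodb+srv scheme:
build_dump_uri "mongodb+srv" "admin" "secret" "cluster0.example.net" "infoworks-new"
# A local install uses plain mongodb with an explicit port:
build_dump_uri "mongodb" "infoworks" "secret" "localhost:27017" "infoworks-new"
```

The resulting string is what you would quote and hand to mongodump, e.g. `mongodump "$(build_dump_uri ...)"`.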
For upgrading from 5.4.1/5.4.1.x to 5.4.1.4, execute the following commands:
Step 1: Use the deployer to upgrade from 5.4.1 to 5.4.1.4.
Step 2: Go to the iw_home/scripts folder of the test machine.
Step 3: To ensure that there is no pre-existing update script, execute the following command:
[[ -f update_5.4.1.4.sh ]] && rm update_5.4.1.4.sh
Step 4: Download the update_5.4.1.4.sh script:
wget https://iw-saas-setup.s3.us-west-2.amazonaws.com/5.4/update_5.4.1.4.sh
Step 5: Give update_5.4.1.4.sh executable permission:
chmod +x update_5.4.1.4.sh
Step 6 (Optional): This patch requires the Mongo metadata to be migrated. Run export METADB_MIGRATION=Y to ensure that the metadata is migrated; otherwise, run export METADB_MIGRATION=N.
Alternatively, you can enter the choice at the prompt while running the script.
Step 7: Source iw_home/bin/env.sh and update the package to the hotfix version:
./update_5.4.1.4.sh -v 5.4.1.4-ubuntu2004
You will receive a "Please select whether metadb migration needs to be done([Y]/N)" prompt. To perform metadb migration, enter Y; otherwise, enter N.
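Because the updater either reads METADB_MIGRATION from the environment or prompts for a Y/N answer, a wrapper script can validate that value before invoking it. A small sketch (the helper name is hypothetical, not part of update_5.4.1.4.sh):

```shell
#!/usr/bin/env bash
# Hypothetical guard: accept only Y or N for METADB_MIGRATION before
# calling the updater non-interactively.
validate_migration_flag() {
  case "$1" in
    Y|y|N|n) return 0 ;;   # valid answers to the migration prompt
    *)       return 1 ;;   # anything else should stop the wrapper
  esac
}

validate_migration_flag "Y" && echo "flag accepted"
validate_migration_flag "maybe" || echo "flag rejected"
```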
To rollback the migrated metadata:
Step 1: Install or download the MongoDB tool mongorestore (if needed).
Step 2: Switch to the directory where the backup is saved on the local system.
cd ${IW_HOME}/mongo_bkp/dump
Step 3: Use the below command to restore the dump (backup) of the databases to the MongoDB server.
If MongoDB is hosted on Atlas
mongorestore "mongodb+srv://<username>:<password>@<mongodb_server_hostname>/<db_name>" --drop ./<db_name>
If MongoDB is installed with Infoworks on the same VM
mongorestore "mongodb://infoworks:IN11**rk@localhost:27017/infoworks-new" --drop ./<db_name>
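Since mongorestore --drop replaces the target database, it is worth confirming that the dump directory for that database actually exists before restoring. A minimal sketch, assuming the dump/<db_name> layout mongodump creates (the helper and demo paths are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: guard the restore by checking that dump/<db_name> exists.
# The mktemp directory below is a stand-in for ${IW_HOME}/mongo_bkp.
check_dump_exists() {
  local dump_root="$1" db_name="$2"
  [ -d "${dump_root}/${db_name}" ]
}

demo_root=$(mktemp -d)                       # stand-in backup location
mkdir -p "${demo_root}/dump/infoworks-new"   # what mongodump would create
check_dump_exists "${demo_root}/dump" "infoworks-new" && echo "dump found"
```

Only after this check succeeds would the wrapper run the mongorestore command shown above.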
To go back to the previous checkpoint version:
Step 1: In a web browser, go to your Infoworks system, scroll down to the bottom, and click the Infoworks icon.
Step 2: The Infoworks Manifest Information page opens in a new tab. Scroll down and check the Last Checkpoint Version.
Step 3: SSH to the Infoworks VM and switch to {{IW_USER}}.
Step 4: Initialize the variables in the bash shell.
full_version=5.4.1.4
major_version=$(echo $full_version | cut -d "." -f 1-2)
previous_version=<Previous Version> # Last Checkpoint Version from step 1
os_suffix=<OS Suffix> # One of [ ubuntu2004 amazonlinux2 rhel8 ]
Step 5: Download the required deployer for the currently applied patch:
https://iw-saas-setup.s3-us-west-2.amazonaws.com/${major_version}/deploy_${full_version}.tar.gz
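The variables from step 4 are enough to compose that URL in the shell; a small sketch that only echoes the address the download step would fetch:

```shell
#!/usr/bin/env bash
# Sketch: derive major_version from full_version and compose the deployer
# URL shown above; echoes it rather than downloading anything.
full_version="5.4.1.4"
major_version=$(echo "${full_version}" | cut -d '.' -f 1-2)
deployer_url="https://iw-saas-setup.s3-us-west-2.amazonaws.com/${major_version}/deploy_${full_version}.tar.gz"
echo "${deployer_url}"
```

The echoed URL can then be passed to wget or curl.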
Step 6: Use the SCP command to copy the deploy_${full_version}.tar.gz file to the ${IW_HOME}/scripts/ directory.
Step 7: Extract the deployer tar file, removing any pre-existing iw-installer directory first.
cd ${IW_HOME}/scripts
[[ -d iw-installer ]] && rm -rf iw-installer
tar xzf deploy_${full_version}.tar.gz
cd iw-installer
Step 8: Initialize the environment variables.
source ${IW_HOME}/bin/env.sh
export IW_PLATFORM=saas
Step 9: Run the Rollback command.
./rollback.sh -v ${previous_version}-${os_suffix}
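A typo in os_suffix only surfaces when rollback.sh tries to fetch a non-existent package, so a wrapper can check it up front against the list from step 4. A hedged sketch (the helper is illustrative, not part of rollback.sh):

```shell
#!/usr/bin/env bash
# Illustrative check: restrict os_suffix to the values listed in step 4
# before invoking ./rollback.sh -v ${previous_version}-${os_suffix}.
valid_os_suffix() {
  case "$1" in
    ubuntu2004|amazonlinux2|rhel8) return 0 ;;
    *) return 1 ;;
  esac
}

valid_os_suffix "ubuntu2004" && echo "suffix ok"
valid_os_suffix "centos7" || echo "unsupported suffix"
```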