To upgrade to 5.3.0, you need a newly generated license that enables the supported environments on the installation. For more information, contact support@infoworks.io.
Set the environment variables used throughout this document:
export IW_HOME=/opt/infoworks
export IW_USER=infoworks
Ensure that you have upgraded the OS on the system to Ubuntu 20.04 if you are on an Ubuntu based system.
Ensure that GNU bc arbitrary precision calculator language (bc) is installed.
Ensure that the Default Java version is pointing to Java 8.
Ensure that all files in ${IW_HOME} are owned by the Infoworks user.
Ensure that all services are up and running by checking ${IW_HOME}/bin/status.sh.
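The prerequisite checks above can be sketched as a quick shell pass. This is a non-authoritative sketch: the Java 8 version-string match (`"1.8`) is an assumption about your JDK's output, and the checks only warn rather than fail.

```shell
# Sketch of the prerequisite checks (warnings only; adjust to your node).
IW_HOME="${IW_HOME:-/opt/infoworks}"
IW_USER="${IW_USER:-infoworks}"

# bc must be installed
command -v bc >/dev/null 2>&1 || echo "WARNING: bc is not installed"

# Default Java should be Java 8 (assumes the version string contains "1.8")
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | grep -q '"1\.8' || echo "WARNING: default Java is not Java 8"
else
  echo "WARNING: java not found on PATH"
fi

# List a few files under IW_HOME not owned by IW_USER (quiet if tree is absent)
find "$IW_HOME" ! -user "$IW_USER" -print 2>/dev/null | head -5
echo "prerequisite check done"
```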
The default value of ${IW_HOME} is /opt/infoworks and ${IW_USER} is infoworks.
Execute the following command to recursively change the ownership of the directory if it is not owned by ${IW_USER}: chown -R ${IW_USER}: ${IW_HOME}/resources/nginx-portable
Navigate to ${IW_HOME}/resources/nginx-portable/conf/nginx.conf and check whether the user in the file is ${IW_USER}. If not, you must set it to ${IW_USER}.
Restart Infoworks services using the following command: ${IW_HOME}/bin/stop.sh all mongo && ${IW_HOME}/bin/start.sh all mongo
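The nginx.conf check above can be sketched as follows. This demo works against a temporary copy so it can run anywhere; on a real node, point NGINX_CONF at ${IW_HOME}/resources/nginx-portable/conf/nginx.conf instead (the fixture lines are for illustration only).

```shell
# Sketch: verify and fix the `user` directive in nginx.conf.
IW_USER="${IW_USER:-infoworks}"
NGINX_CONF="${NGINX_CONF:-/tmp/nginx.conf.demo}"

# Demo fixture standing in for a real nginx.conf (skip on a real node)
printf 'user someoneelse;\nworker_processes 1;\n' > "$NGINX_CONF"

# If the user directive is not IW_USER, rewrite it in place (keeps a .bak copy)
if ! grep -Eq "^user[[:space:]]+${IW_USER};" "$NGINX_CONF"; then
  sed -i.bak -E "s/^user[[:space:]]+[^;]+;/user ${IW_USER};/" "$NGINX_CONF"
fi
grep '^user' "$NGINX_CONF"
```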
Ensure that you have the new license key for the Infoworks version you are upgrading to. If you do not have it, contact support@infoworks.io to obtain one.
You must take a backup of ${IW_HOME} before starting the upgrade process.
To back up Infoworks services, perform the following steps:
Step 1: Stop Infoworks services by executing the following command as ${IW_USER}:
${IW_HOME}/bin/stop.sh all mongo
Step 2: Switch to root user.
sudo su -
Step 3: Take a backup of the Infoworks installation by executing the following command:
cp -r ${IW_HOME}/../infoworks ${IW_HOME}/../infoworks.bkp
Step 4: Switch back to IW_USER.
sudo su - ${IW_USER}
Step 5: Start Infoworks services by executing the following command:
${IW_HOME}/bin/start.sh mongo all
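Steps 1 through 5 above can be consolidated into a single script. The sketch below builds a throwaway stub installation so it can run anywhere; on a real node, set IW_HOME=/opt/infoworks, drop the stub section, and run the copy step as root as described above.

```shell
#!/usr/bin/env bash
# Sketch of the backup sequence as one script (stubbed for demonstration).
set -eu
IW_HOME="${IW_HOME:-/tmp/iwbk/infoworks}"

# Stub install tree so the sketch can run anywhere (skip on a real node)
mkdir -p "$IW_HOME/bin"
printf '#!/bin/sh\necho "%s $*"\n' stop  > "$IW_HOME/bin/stop.sh"
printf '#!/bin/sh\necho "%s $*"\n' start > "$IW_HOME/bin/start.sh"
chmod +x "$IW_HOME"/bin/*.sh

"$IW_HOME/bin/stop.sh" all mongo                            # Step 1: stop services
cp -r "$IW_HOME/../infoworks" "$IW_HOME/../infoworks.bkp"   # Step 3: cold backup
"$IW_HOME/bin/start.sh" mongo all                           # Step 5: restart
```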
Before you upgrade, it is recommended to take a backup of your Mongo Atlas Infoworks databases, if any.
Perform the following steps for internet-based upgrade:
Step 1: Download the upgrade script using the following commands:
cd ${IW_HOME}/scripts
wget https://iw-saas-setup.s3-us-west-2.amazonaws.com/5.3/update.sh --no-check-certificate
Step 2: Provide execute permission to the script using the following command:
chmod +x update.sh
Step 3: Execute the following script to source the env.sh file. This loads the Infoworks environment variables, such as path variables.
source ${IW_HOME}/bin/env.sh
Step 4: Export the following environment variables:
Field | Description | Details |
---|---|---|
METADB_MANAGED | Indicates whether the Mongo installation is managed by Infoworks. | Enter Y (default) for Infoworks-managed, or N for an external Mongo. export METADB_MANAGED= |
METADB_USE_SRV | Indicates whether MONGO_HOST corresponds to a DNS SRV record. | Enter N (default) for No (prefix "mongodb://"), or Y for Yes (prefix "mongodb+srv://"). export METADB_USE_SRV= |
METADB_FORCE_DROP | Indicates whether the installer may overwrite existing databases. | Enter N (default) for No (the provided databases must be empty), or Y to allow dropping databases. export METADB_FORCE_DROP=N |
MONGO_HOST | The Mongo host URL to connect to. | Enter the Mongo server or seed DNS hostname (without prefix). export MONGO_HOST= |
MONGO_USER | The Mongo user used to authenticate against the Infoworks databases. | Enter a user with at least read/write permissions on the databases mentioned (defaults to Infoworks). export MONGO_USER= |
MONGO_PASS | The password of the aforementioned MONGO_USER. | Enter the password of MONGO_USER (defaults to the Infoworks Mongo password). export MONGO_PASS= |
MONGO_DB | The database name that will store Infoworks metadata. | Enter a valid database name. export MONGO_DB= |
MONGO_QUARTZDB | The database name that will store Quartzio metadata. | Enter a valid database name. export MONGO_QUARTZDB= |
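For illustration, the exports for an external (non-managed) Mongo deployment might look like the following. Every value below is a placeholder, not a recommended setting; substitute your own hostname, credentials, and database names.

```shell
# Example exports for an external Mongo (all values are placeholders)
export METADB_MANAGED=N        # Mongo is not managed by Infoworks
export METADB_USE_SRV=Y        # MONGO_HOST is a DNS SRV record (mongodb+srv://)
export METADB_FORCE_DROP=N     # never drop existing databases
export MONGO_HOST=cluster0.example.mongodb.net
export MONGO_USER=infoworks
export MONGO_PASS='changeme'
export MONGO_DB=infoworks-db
export MONGO_QUARTZDB=quartzio
echo "connecting to $MONGO_HOST as $MONGO_USER"
```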
Step 5: Execute the following script:
./update.sh -v {{version}}
For example, to upgrade to version 5.3.0, use one of the following commands, depending on your operating system:
./update.sh -v 5.3.0-ubuntu2004
./update.sh -v 5.3.0-rhel8
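The build suffix in the version argument can be derived from the OS. The helper below is a sketch (the function name and the assumption that only Ubuntu 20.04 and RHEL 8 are supported targets are ours, based on the two commands above); on a real node you would feed it the ID field from /etc/os-release.

```shell
# Sketch: map an OS identifier to the update.sh version argument.
pick_suffix() {
  case "$1" in
    ubuntu) echo "5.3.0-ubuntu2004" ;;
    rhel)   echo "5.3.0-rhel8" ;;
    *)      echo "unsupported" ;;
  esac
}

# On a real node: . /etc/os-release; ./update.sh -v "$(pick_suffix "$ID")"
pick_suffix ubuntu
```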
Step 6: Upgrade the license soon after logging into the UI.
As a part of the upgrade, MongoDB dumps are taken on the local machine at ${IW_HOME}/temp/backup-5.3.0/mongo. After the upgrade, it is recommended to test the migration to MongoDB Atlas thoroughly by running Infoworks ingestion and pipeline jobs. These dumps can be deleted after the tests are complete.
Step 7: Update the valid sections in the security.ini file manually: vi /opt/infoworks/platform/conf/security.ini. Replace only the [roles] and [permissions] sections:
[roles]
admin = domains, clusters, sources, databricks_cluster_templates, notification_subscribers, auth_configs, locked_entities, metadata_cleanup_policies, backups_mongo, configs, users, custom_tags
dbadmin = sources, custom_tags, configs
modeller = domains, pipelines, cubes, workflows, custom_tags, configs
analyst = domains, pipelines, cubes, workflows, custom_tags, configs
operations_analyst = domains, sources, pipelines, workflows, locked_entities, clusters, jobs, custom_tags, configs
[permissions]
admin = cluster_metadata_create, cluster_metadata_read, cluster_metadata_update, cluster_metadata_delete, source_metadata_create, domain_metadata_create, admin_entity_metadata_create, admin_entity_metadata_read, admin_entity_metadata_update, admin_entity_metadata_delete, bulk_source_metadata_read, bulk_domain_metadata_read, bulk_admin_entity_metadata_read, bulk_subscribers_metadata_read, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
dbadmin = source_metadata_read, source_metadata_update, source_metadata_delete, source_data_update, bulk_source_metadata_read, bulk_subscribers_metadata_read, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
modeller = domain_artifacts_metadata_create, domain_metadata_read, pipeline_metadata_read, pipeline_metadata_update, pipeline_metadata_delete, table_metadata_read, table_metadata_write, inter_domain_table_metadata_read, cube_metadata_read, cube_metadata_update, cube_metadata_delete, workflow_metadata_read, workflow_metadata_update, workflow_metadata_delete, bulk_domain_metadata_read, bulk_subscribers_metadata_read, pipeline_data_update, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
analyst = domain_artifacts_metadata_create, domain_metadata_read, pipeline_metadata_read, pipeline_metadata_update, pipeline_metadata_delete, table_metadata_read, table_metadata_write, inter_domain_table_metadata_read, cube_metadata_read, cube_metadata_update, cube_metadata_delete, workflow_metadata_read, workflow_metadata_update, workflow_metadata_delete, bulk_domain_metadata_read, bulk_subscribers_metadata_read, pipeline_data_update, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
operations_analyst = bulk_dashboard_metadata_read, dashboard_workflow_restart, dashboard_workflow_cancel, dashboard_job_cancel, dashboard_job_resubmit, dashboard_locked_entity_delete, dashboard_locked_entity_read, cluster_metadata_read, bulk_source_metadata_read, bulk_domain_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
After saving the file, restart the platform service: cd ${IW_HOME}/bin && ./stop.sh platform && ./start.sh platform
Step 8: If there are any changes, update the files in ${IW_HOME}/conf/*.json, using the backup in ${IW_HOME}/../infoworks.bkp as reference.
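The comparison in this step can be sketched as a small loop that flags conf JSONs differing from the backup, so each can be merged by hand. The demo below works on throwaway fixture directories; on a real node set IW_HOME=/opt/infoworks and BKP=${IW_HOME}/../infoworks.bkp and skip the fixture lines.

```shell
# Sketch: flag conf JSON files that differ from the backup.
IW_HOME="${IW_HOME:-/tmp/iwdemo/infoworks}"
BKP="${BKP:-/tmp/iwdemo/infoworks.bkp}"

# Demo fixtures: one changed file, one unchanged (skip on a real node)
mkdir -p "$IW_HOME/conf" "$BKP/conf"
echo '{"a":1}' > "$IW_HOME/conf/same.json";    echo '{"a":1}' > "$BKP/conf/same.json"
echo '{"a":2}' > "$IW_HOME/conf/changed.json"; echo '{"a":1}' > "$BKP/conf/changed.json"

# Print each conf JSON that differs from (or is missing in) the backup
for f in "$IW_HOME"/conf/*.json; do
  diff -q "$f" "$BKP/conf/$(basename "$f")" >/dev/null 2>&1 \
    || echo "review: $(basename "$f")"
done
```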
Before you upgrade, it is recommended to take a backup of your Mongo Atlas Infoworks databases, if any.
Perform the following steps for internet-free upgrade:
For an Ubuntu 20.04 machine, run the following command:
export os=ubuntu2004
For a RHEL 8 machine, run the following command:
export os=rhel8
Step 1: Get the Upgrader Tarball using the following link:
https://iw-saas-setup.s3.us-west-2.amazonaws.com/5.3/iwx_upgrader_internet_free_${os}_5.3.0.tar.gz
Step 2: SCP this tarball to the /home/infoworks path of the Edge Node.
Step 3: Untar the Upgrader Tarball using the following command:
tar -xvf iwx_upgrader_internet_free_${os}_5.3.0.tar.gz
The tarball contains the upgrade scripts and packages used in the following steps.
Step 4: Create the downloads and scripts directories for the Infoworks package tar:
mkdir -p ${IW_HOME}/temp/downloads
mkdir -p ${IW_HOME}/scripts/
Step 5: Copy the Infoworks Package Tar to target downloads location:
cp infoworks-5.3.0-${os}.tar.gz ${IW_HOME}/temp/downloads
Step 6: Copy the Configure Script, Update Script and Deployer Tar to the target scripts directory:
cp configure.sh internet-free-update.sh deploy_5.3.0.tar.gz ${IW_HOME}/scripts/
Step 7: Edit the configure.sh file:
cd ${IW_HOME}/scripts/
vi configure.sh
Configure the parameters as described in the table given below, and then save the file:
Field | Description | Details |
---|---|---|
METADB_MANAGED | Indicates whether the Mongo installation is managed by Infoworks. | Enter Y (default) for Infoworks-managed, or N for an external Mongo. export METADB_MANAGED= |
METADB_USE_SRV | Indicates whether MONGO_HOST corresponds to a DNS SRV record. | Enter N (default) for No (prefix "mongodb://"), or Y for Yes (prefix "mongodb+srv://"). export METADB_USE_SRV= |
METADB_FORCE_DROP | Indicates whether the installer may overwrite existing databases. | Enter N (default) for No (the provided databases must be empty), or Y to allow dropping databases. export METADB_FORCE_DROP=N |
MONGO_HOST | The Mongo host URL to connect to. | Enter the Mongo server or seed DNS hostname (without prefix). export MONGO_HOST= |
MONGO_USER | The Mongo user used to authenticate against the Infoworks databases. | Enter a user with at least read/write permissions on the databases mentioned (defaults to Infoworks). export MONGO_USER= |
MONGO_PASS | The password of the aforementioned MONGO_USER. | Enter the password of MONGO_USER (defaults to the Infoworks Mongo password). export MONGO_PASS= |
MONGO_DB | The database name that will store Infoworks metadata. | Enter a valid database name. export MONGO_DB= |
MONGO_QUARTZDB | The database name that will store Quartzio metadata. | Enter a valid database name. export MONGO_QUARTZDB= |
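As an illustration, a configure.sh for an Infoworks-managed Mongo might keep the defaults shown in the table. All values below are placeholders for the sketch, not recommended settings.

```shell
# Example configure.sh fragment for an Infoworks-managed Mongo
# (placeholder values; adjust before running the upgrader)
export METADB_MANAGED=Y       # Mongo bundled with Infoworks
export METADB_USE_SRV=N       # plain mongodb:// host, no SRV lookup
export METADB_FORCE_DROP=N    # the provided databases must be empty
export MONGO_HOST=localhost
export MONGO_USER=infoworks
export MONGO_PASS='changeme'
export MONGO_DB=infoworks-db
export MONGO_QUARTZDB=quartzio
```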
Step 8: Provide execute permission to the upgrade script:
chmod +x ${IW_HOME}/scripts/internet-free-update.sh
Step 9: Perform the following steps for upgrade:
source ${IW_HOME}/bin/env.sh
cd ${IW_HOME}/scripts/
./internet-free-update.sh -v 5.3.0-${os}
Step 10: Perform a cleanup of the /home/infoworks directory, using the following command:
rm -rf infoworks-5.3.0-${os}.tar.gz deploy_5.3.0.tar.gz internet-free-update.sh
Step 11: Upgrade the license soon after logging into the UI.
As a part of an upgrade or install for a non-managed Mongo setup, the following MongoDB artifacts are deleted from the local machine:
As a part of the upgrade, MongoDB dumps are taken on the local machine at ${IW_HOME}/temp/backup-5.3.0/mongo. After the upgrade, it is recommended to test the migration to MongoDB Atlas thoroughly by running Infoworks ingestion and pipeline jobs. These dumps can be deleted after the tests are complete.
As a part of the upgrade, a Postgres backup is taken on the local machine at ${IW_HOME}/resources/postgres/pgsql.old. After the upgrade, it is recommended to test the migration to Postgres thoroughly by running Infoworks workflow jobs. These backups can be deleted after the tests are complete.
Step 12: Update the valid sections in the security.ini file manually: vi /opt/infoworks/platform/conf/security.ini. Replace only the [roles] and [permissions] sections:
[roles]
admin = domains, clusters, sources, databricks_cluster_templates, notification_subscribers, auth_configs, locked_entities, metadata_cleanup_policies, backups_mongo, configs, users, custom_tags
dbadmin = sources, custom_tags, configs
modeller = domains, pipelines, cubes, workflows, custom_tags, configs
analyst = domains, pipelines, cubes, workflows, custom_tags, configs
operations_analyst = domains, sources, pipelines, workflows, locked_entities, clusters, jobs, custom_tags, configs
[permissions]
admin = cluster_metadata_create, cluster_metadata_read, cluster_metadata_update, cluster_metadata_delete, source_metadata_create, domain_metadata_create, admin_entity_metadata_create, admin_entity_metadata_read, admin_entity_metadata_update, admin_entity_metadata_delete, bulk_source_metadata_read, bulk_domain_metadata_read, bulk_admin_entity_metadata_read, bulk_subscribers_metadata_read, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
dbadmin = source_metadata_read, source_metadata_update, source_metadata_delete, source_data_update, bulk_source_metadata_read, bulk_subscribers_metadata_read, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
modeller = domain_artifacts_metadata_create, domain_metadata_read, pipeline_metadata_read, pipeline_metadata_update, pipeline_metadata_delete, table_metadata_read, table_metadata_write, inter_domain_table_metadata_read, cube_metadata_read, cube_metadata_update, cube_metadata_delete, workflow_metadata_read, workflow_metadata_update, workflow_metadata_delete, bulk_domain_metadata_read, bulk_subscribers_metadata_read, pipeline_data_update, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
analyst = domain_artifacts_metadata_create, domain_metadata_read, pipeline_metadata_read, pipeline_metadata_update, pipeline_metadata_delete, table_metadata_read, table_metadata_write, inter_domain_table_metadata_read, cube_metadata_read, cube_metadata_update, cube_metadata_delete, workflow_metadata_read, workflow_metadata_update, workflow_metadata_delete, bulk_domain_metadata_read, bulk_subscribers_metadata_read, pipeline_data_update, bulk_cluster_template_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
operations_analyst = bulk_dashboard_metadata_read, dashboard_workflow_restart, dashboard_workflow_cancel, dashboard_job_cancel, dashboard_job_resubmit, dashboard_locked_entity_delete, dashboard_locked_entity_read, cluster_metadata_read, bulk_source_metadata_read, bulk_domain_metadata_read, bulk_custom_tag_metadata_read, bulk_general_configs_metadata_read
After saving the file, restart the platform service: cd ${IW_HOME}/bin && ./stop.sh platform && ./start.sh platform
Step 13: If there are any changes, update the files in ${IW_HOME}/conf/*.json, using the backup in ${IW_HOME}/../infoworks.bkp as reference.