Workflow Configuration Migration

For a straightforward workflow configuration migration, the sources and table groups, the pipelines and their domains, and the cubes and their domains referenced in the exported file must exactly match the names in the environment into which the workflow configuration is being imported. It is also possible to manually map the sources and table groups, pipelines, and cubes in the exported file to the corresponding entities in the target environment.

This feature:

  • Allows you to map a table group to any table group in the source.
  • Allows you to map a pipeline to any pipeline in the domain.
  • Allows you to map a cube to any cube in the domain.

NOTE During workflow migration, the system automatically selects the active version of the pipeline, regardless of the version specified in the exported JSON file. This ensures that the pipeline version is always pre-filled on the mapping page. You can change the version during the mapping process if desired.

Exporting Workflow Configuration

To export a workflow configuration, follow these steps:

  1. Click the Domains menu and click the domain in which the workflow was created.
  2. On the Workflows page of the respective domain, click the workflow whose configuration must be exported.
  3. Click the Settings icon.
  4. On the Settings page, locate the Configuration Migration section.
  5. Click Export. The workflow configuration is downloaded in .json format.
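
After the export completes, you can optionally inspect the downloaded file before importing it, to verify that it is valid JSON and to see which entity names it references. A minimal sketch, assuming the file was saved as workflow_config.json (the actual file name will differ) and that jq is available:

jq 'keys' workflow_config.json

grep -o '"name" *: *"[^"]*"' workflow_config.json | sort -u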

Importing Workflow Configuration

To import a workflow configuration, follow these steps in the target environment:

  1. Click the Domains menu and click the domain into which the workflow configuration must be imported.
  2. On the Workflows page of the domain, click the workflow into which the configuration must be imported.
  3. Click the Settings icon.
  4. In the Settings page, locate the Configuration Migration section.
  5. Click Choose A File and select the required workflow configuration file.

The mapping page is displayed.

NOTE This page displays the mapping of table groups, pipelines, cubes, and workflows between the imported file and the corresponding entities created in the target environment.

If a table group, pipeline, cube, or workflow mapping is not available, an error is displayed next to that mapping. In this case, select the desired option from the respective drop-down menu.

Follow the steps below to manually map the entities from the import file to the target environment:

  1. Select the required table group from the Table Group in this Source drop-down list to map it to the Table Group in the Uploaded Configuration.
  2. Select the required pipeline from the Pipeline in this Domain drop-down list to map it to the Pipeline in the Uploaded Configuration.
  3. Select the required cube from the Cubes in this Domain drop-down list to map it to the Cube in the Uploaded Configuration.
  4. Select the required workflow from the Workflow in this Domain drop-down list to map it to the Workflow in the Uploaded Configuration.
  5. Click Import Configurations. A success message is displayed.

Limitations

  • Importing a workflow that is already open in a workflow editor is not supported.
  • Modifying the design of a workflow, or importing a workflow configuration with a different design, while an instance of the workflow is running might lead to intermittent failures.
  • Workflow metrics for workflow runs prior to version 2.6.0 are not available.

Orchestrator Log Files Cleanup

To archive old logs and to avoid disk space issues while running a workflow, clean up the orchestrator log files using the following steps:

  1. Stop the orchestrator services using the following command: $W_HOME/bin/stop.sh orchestrator
  2. Ensure that no orchestrator services are running using the following commands:

ps -ef | grep airflow

ps -ef | grep orchestrator

  3. If any processes are still running, use the following command to kill them: pkill -f airflow; pkill -f orchestrator
  4. Clean up the log directories using the following commands:
  • To delete the complete logs for all historical dates: rm -rf $W_HOME/orchestrator-engine/logs/scheduler/*
  • To delete the logs for specific historical dates: rm -rf $W_HOME/orchestrator-engine/logs/scheduler/<date folders to remove>
  5. Clean up the scheduler log files using the following commands:

rm -rf $W_HOME/orchestrator-engine/airflow-scheduler.log

rm -rf $W_HOME/orchestrator-engine/airflow-scheduler.out
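
The manual steps above can be combined into a single script. The following is a minimal sketch, assuming the same $W_HOME layout used in the commands above, a 7-day retention period for scheduler logs (an arbitrary choice; adjust as needed), and date-named folders under logs/scheduler as created by the default Airflow scheduler layout:

#!/bin/bash
# Sketch: stop the orchestrator, prune old scheduler logs, and remove scheduler log files.
set -e

"$W_HOME/bin/stop.sh" orchestrator
pkill -f airflow || true        # pkill exits non-zero when nothing matches
pkill -f orchestrator || true

# Delete scheduler log folders older than 7 days instead of the full history
find "$W_HOME/orchestrator-engine/logs/scheduler" -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +

rm -f "$W_HOME/orchestrator-engine/airflow-scheduler.log" \
      "$W_HOME/orchestrator-engine/airflow-scheduler.out"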

Deleting Workflow

On the Settings page, click the Delete button to delete the workflow.

If deleting the workflow impacts any other workflow, a pop-up message indicating the impact is displayed.

REST API

We have added an additional key, map_active_version, to the payload of the PUT API endpoint to ensure consistent behavior. The payload now includes the following:

Endpoint: v3/domains/:domainId/workflows/:workflowId/config-migration

Payload

The map_active_version key automatically provides the active version of the pipeline during the recommendation phase for mapping.
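
A minimal sketch of calling this endpoint with curl. The host, the bearer-token authentication header, and the boolean type of map_active_version are assumptions, and the rest of the payload (the uploaded configuration and entity mappings) is omitted here; refer to the API reference for the full schema:

# Hypothetical host, token, and payload shape; only the map_active_version key is documented above
curl -X PUT "https://<host>/v3/domains/<domainId>/workflows/<workflowId>/config-migration" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"map_active_version": true}'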

Reference Video

The demo of Workflow Deletion is available here.
