Create and Modify Workflows

Prepare Your Work Environment

After you have installed and started FRINX Machine (see “https://github.com/FRINXio/FRINX-machine”), you will want to modify existing workflows or add new ones to meet your needs. We refer to the machine running the FRINX Machine containers as the host. Typically the host is a VM running on your laptop, in your private cloud, or in a public/virtual private cloud. Here is how to get started.

Go to the directory on your host into which you have cloned the FRINX Machine repository. In our case it looks like this:

$ ls -al
total 116
drwxrwxr-x 12 gwieser gwieser 4096 Nov 23 23:38 .
drwxr-xr-x 15 gwieser gwieser 4096 Nov 24 00:03 ..
drwxrwxr-x 21 gwieser gwieser 4096 Nov 23 23:40 conductor
-rwxrwxr-x  1 gwieser gwieser  151 Nov 23 23:38 docker-compose.min.yml
-rwxrwxr-x  1 gwieser gwieser 2514 Nov 23 23:38 docker-compose.yml
drwxrwxr-x  2 gwieser gwieser 4096 Nov 23 23:38 dynomite
drwxrwxr-x  2 gwieser gwieser 4096 Nov 23 23:38 elasticsearch
-rw-rw-r--  1 gwieser gwieser   25 Nov 23 23:38 .env
drwxrwxr-x  5 gwieser gwieser 4096 Nov 23 23:38 frinx-uniconfig-ui
drwxrwxr-x  9 gwieser gwieser 4096 Nov 23 23:38 .git
-rw-rw-r--  1 gwieser gwieser  390 Nov 23 23:38 .gitmodules
-rwxrwxr-x  1 gwieser gwieser 3904 Nov 23 23:38 health_check.sh
-rwxrwxr-x  1 gwieser gwieser 1955 Nov 23 23:38 install.sh
drwxrwxr-x  2 gwieser gwieser 4096 Nov 23 23:38 kibana
drwxrwxr-x  3 gwieser gwieser 4096 Nov 23 23:38 logstash
drwxrwxr-x  3 gwieser gwieser 4096 Nov 23 23:38 microservices
drwxrwxr-x  2 gwieser gwieser 4096 Nov 23 23:38 portainer
-rw-rw-r--  1 gwieser gwieser 3830 Nov 23 23:38 README.md
drwxrwxr-x  3 gwieser gwieser 4096 Nov 23 23:38 sample-topology
-rwxrwxr-x  1 gwieser gwieser 1910 Nov 23 23:38 startup.sh
-rwxrwxr-x  1 gwieser gwieser  135 Nov 23 23:38 teardown.sh
drwxrwxr-x  2 gwieser gwieser 4096 Nov 23 23:38 test
drwxrwxr-x  4 gwieser gwieser 4096 Nov 23 23:38 uniconfig
-rwxrwxr-x  1 gwieser gwieser 4080 Nov 23 23:38 wait_for_it.sh

To prepare the development environment on your host go to the FRINX-machine directory and enter:

sudo apt-get install python-setuptools
cd conductor/client/python/
sudo python setup.py install

Go to the folder that includes the python workers that are used for the workflow execution logic.

$ cd microservices/netinfra_utils/workers/
$ ls -al
total 180
drwxrwxr-x 3 gwieser gwieser  4096 Nov 24 00:04 .
drwxrwxr-x 3 gwieser gwieser  4096 Nov 23 23:39 ..
-rw-rw-r-- 1 gwieser gwieser 11835 Nov 23 23:39 cli_worker.py
-rw-rw-r-- 1 gwieser gwieser  9469 Nov 24 00:04 cli_worker.pyc
-rw-rw-r-- 1 gwieser gwieser   667 Nov 24 00:03 frinx_rest.py
-rw-rw-r-- 1 gwieser gwieser   921 Nov 24 00:04 frinx_rest.pyc
drwxrwxr-x 2 gwieser gwieser  4096 Nov 23 23:39 .idea
-rw-rw-r-- 1 gwieser gwieser 11134 Nov 23 23:39 inventory_worker.py
-rw-rw-r-- 1 gwieser gwieser  8725 Nov 24 00:04 inventory_worker.pyc
-rw-rw-r-- 1 gwieser gwieser  5569 Nov 23 23:39 lldp_worker.py
-rw-rw-r-- 1 gwieser gwieser  4554 Nov 24 00:04 lldp_worker.pyc
-rw-rw-r-- 1 gwieser gwieser   721 Nov 23 23:39 main.py
-rw-rw-r-- 1 gwieser gwieser  8932 Nov 23 23:39 netconf_worker.py
-rw-rw-r-- 1 gwieser gwieser  8932 Nov 24 00:04 netconf_worker.pyc
-rw-rw-r-- 1 gwieser gwieser  1284 Nov 23 23:39 platform_worker.py
-rw-rw-r-- 1 gwieser gwieser  1616 Nov 24 00:04 platform_worker.pyc
-rw-rw-r-- 1 gwieser gwieser 13780 Nov 23 23:39 uniconfig_worker.py
-rw-rw-r-- 1 gwieser gwieser 10557 Nov 24 00:04 uniconfig_worker.pyc
-rw-rw-r-- 1 gwieser gwieser  7658 Nov 23 23:39 unified_worker.py
-rw-rw-r-- 1 gwieser gwieser  6374 Nov 24 00:04 unified_worker.pyc
-rw-rw-r-- 1 gwieser gwieser   338 Nov 23 23:39 workers.iml

Open the file “frinx_rest.py” and change the three URL variables as shown below (the container hostnames are commented out and replaced with “localhost”). If the file can’t be written to, you might have erroneously run the install as sudo. Make sure you run the FRINX Machine install script as a regular user, without sudo, and you will be able to edit and save the microservice files.

import json

#odl_url_base = "http://uniconfig:8181/rests"
#elastic_url_base = "http://elasticsearch:9200"
#conductor_url_base = "http://conductor-server:8080/api"

odl_url_base = "http://localhost:8181/rests"
elastic_url_base = "http://localhost:9200"
conductor_url_base = "http://localhost:8080/api"

odl_credentials = ("admin", "admin")
odl_headers = {"Content-Type": "application/json"}

def parse_response(r):
    decode = r.content.decode('utf8')
    try:
        response_json = json.loads(decode if decode else "{}")
    except ValueError as e:
        response_json = json.loads("{}")

    response_code = r.status_code
    return response_code, response_json

Note

“localhost” here is the hostname of the FRINX Machine host VM. If you are running and developing the workers remotely, use the IP address of the FRINX Machine host instead of “localhost”.
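To see what parse_response returns without running any of the containers, here is a small standalone check. The FakeResponse stub below is ours, standing in for the response objects the workers normally receive from the requests library:

```python
import json


# Minimal stand-in for the response object the real workers receive;
# this class is ours, for illustration only.
class FakeResponse:
    def __init__(self, status_code, content):
        self.status_code = status_code
        self.content = content


def parse_response(r):
    # Same logic as frinx_rest.parse_response: decode the body and
    # fall back to an empty dict on an empty or invalid JSON body.
    decode = r.content.decode('utf8')
    try:
        response_json = json.loads(decode if decode else "{}")
    except ValueError:
        response_json = {}
    return r.status_code, response_json


code, body = parse_response(FakeResponse(200, b'{"node": "xr5"}'))
print(code, body)          # 200 {'node': 'xr5'}
code, body = parse_response(FakeResponse(204, b''))
print(code, body)          # 204 {}
```

Every worker funnels its REST responses through this helper, so a non-JSON or empty body never crashes a task; it simply yields an empty dict alongside the HTTP status code.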

Start the FRINX Machine.

The changes you made in the file above cause the worker tasks to execute on your host (the machine running the FRINX Machine containers) instead of inside the microservice container. To keep the containerized workers from competing for the same tasks, stop the micros container with this command:

$ sudo docker stop micros

Save the file with the changes and start the python workers on your host with the following command:

$ python main.py

You will see output similar to the one shown below, indicating that the workers are running and ready for execution.

[...]
Polling for task UNICONFIG_write_structured_data_as_tasks at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_delete_structured_device_data at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_commit at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_dryrun_commit at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_checked_commit at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_calculate_diff at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_sync_from_network at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_replace_config_with_oper at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_create_snapshot at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_delete_snapshot at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_replace_config_with_snapshot at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Polling for task UNICONFIG_check_uniconfig_node_exists at a 1000.000000 ms interval with 1 threads for task execution, with worker id as gns3vm
Starting common workers
[...]

As mentioned above, stopping the microservice container “micros” is only necessary if you want to modify existing workers. If you create entirely new workers, this step can be skipped.

$ sudo docker-compose stop micros
Stopping micros ... done

FRINX Machine is now using the workers running on your host instead of the workers running in the FRINX Machine container. This allows you to modify existing workers and add new workers to your workflows directly on your host.

New workflows and tasks are created through the REST API of Conductor. All workflows are stored in the folder microservices/netinfra_utils/workflows (https://github.com/FRINXio/FRINX-machine/tree/master/microservices/netinfra_utils/workflows).

Create a New Workflow

Now that our environment is prepared, we can create a first simple workflow. The first task in our workflow receives two input parameters (id1 and id2), adds them, and returns the result. The execution logic of the task will be implemented in a small python function.

The second task in our workflow is an HTTP call to a test API. We use the output of the first task to select the id of our test resource. E.g., if the output of the first task is 5, the second task should call https://jsonplaceholder.typicode.com/posts/5; if the output is 6, it should call https://jsonplaceholder.typicode.com/posts/6.

The output of our workflow will be the value of the “title” parameter in the response from our test API.

For a full documentation of tasks, workflows and the capabilities of Netflix Conductor, please go to https://netflix.github.io/conductor/

Workflows consist of one or multiple tasks. Conductor supports two different kinds of tasks: system tasks, which are executed within the conductor server JVM, and worker tasks, which run outside of the conductor JVM.

Conductor maintains a registry of worker task types. A worker task type MUST be registered before it can be used in a workflow. In the following example we register a new worker task: POST creates a new task definition, PUT updates an existing one.

POST /api/metadata/taskdefs HTTP/1.1
Host: localhost
Content-Type: application/json
cache-control: no-cache
Postman-Token: 9cd87d64-679f-49e2-8873-6459d26b8033
[
    {
        "name": "add_two_integers",
        "retryCount": 0,
        "timeoutSeconds": 30,
        "inputKeys": [
            "id_1",
            "id_2"
        ],
        "timeoutPolicy": "TIME_OUT_WF",
        "retryLogic": "FIXED",
        "retryDelaySeconds": 0,
        "responseTimeoutSeconds": 30
    }
]

You can use any HTTP tool (e.g. curl, …) to create the new task in Conductor via its API, or you can register the task from a python worker.
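As an illustration, here is a sketch of that registration call using only the Python standard library. CONDUCTOR_API assumes you are working on the FRINX Machine host (adjust it otherwise), register_taskdefs is our illustrative name, and the actual request is left commented out so the snippet can be inspected without a running Conductor:

```python
import json
import urllib.request

CONDUCTOR_API = "http://localhost:8080/api"  # adjust if not on the host

# Same payload as the request above; a list, because the endpoint
# accepts several task definitions at once.
task_definitions = [
    {
        "name": "add_two_integers",
        "retryCount": 0,
        "timeoutSeconds": 30,
        "inputKeys": ["id_1", "id_2"],
        "timeoutPolicy": "TIME_OUT_WF",
        "retryLogic": "FIXED",
        "retryDelaySeconds": 0,
        "responseTimeoutSeconds": 30,
    }
]


def register_taskdefs(defs, base_url=CONDUCTOR_API):
    # POST creates new definitions; PUT (with a single object instead
    # of a list) updates an existing one.
    req = urllib.request.Request(
        base_url + "/metadata/taskdefs",
        data=json.dumps(defs).encode("utf8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# register_taskdefs(task_definitions)  # uncomment with Conductor running
```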

For our second task, we use an existing task called “http_get_generic”. It is already part of our collection and uses the HTTP system task function, so you do not have to create it. Its definition looks like this.

{
    "createTime": 1543026743415,
    "name": "http_get_generic",
    "retryCount": 3,
    "timeoutSeconds": 10,
    "timeoutPolicy": "TIME_OUT_WF",
    "retryLogic": "FIXED",
    "retryDelaySeconds": 5,
    "responseTimeoutSeconds": 10
}

Now we can create our first workflow by stringing together the two tasks in sequence. Create a new file EXAMPLE_add_integers_and_GET_HTTP.json in folder microservices/netinfra_utils/workflows.

We use the following definition for our workflow.

{
  "name": "EXAMPLE_add_integers_and_GET_HTTP",
  "description": "Adds two integers it receives from input and calls a sample API - EXAMPLE",
  "version": 1,
  "tasks": [
    {
      "name": "add_two_integers",
      "taskReferenceName": "add_two_integers_1st_instance",
      "inputParameters": {
        "id1": "${workflow.input.id1}",
        "id2": "${workflow.input.id2}"
      },
      "type": "SIMPLE",
      "startDelay": 0
    },
    {
      "name": "http_get_generic",
      "taskReferenceName": "http_get_generic_1st_instance",
      "inputParameters": {
        "http_request": {
          "uri": "https://jsonplaceholder.typicode.com/posts/${add_two_integers_1st_instance.output.result}",
          "method": "GET"
        }
      },
      "type": "HTTP",
      "startDelay": 0
    }
  ],
  "inputParameters": [
    "id1[This is the first addend of the addition][2]",
    "id2[This is the second addend of the addition][3]"
  ],
  "outputParameters": {
    "title": "${http_get_generic_1st_instance.output.response.body.title}"
  },
  "restartable": true,
  "schemaVersion": 2
}

We can declare LABELS in the description body. These labels should be added after a dash symbol at the end of the description.

After saving the file, we can use the script

$ microservices/netinfra_utils/importWorkflow.sh -l

to import the new workflow into FRINX Machine.
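Under the hood, the script pushes each workflow JSON file to Conductor's metadata API. A rough, hedged Python equivalent for a single file is sketched below; build_import_request and import_workflow are our illustrative names, not part of the repository:

```python
import json
import urllib.request

CONDUCTOR_API = "http://localhost:8080/api"  # adjust if not on the host


def build_import_request(definition, base_url=CONDUCTOR_API):
    # Conductor's metadata API takes a JSON list of workflow
    # definitions; PUT creates or updates them.
    return urllib.request.Request(
        base_url + "/metadata/workflow",
        data=json.dumps([definition]).encode("utf8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )


def import_workflow(path):
    # Read one workflow JSON file and push it to Conductor.
    with open(path) as f:
        definition = json.load(f)
    with urllib.request.urlopen(build_import_request(definition)) as resp:
        return resp.status


# import_workflow('microservices/netinfra_utils/workflows/'
#                 'EXAMPLE_add_integers_and_GET_HTTP.json')
```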

We can now find our new workflow in the FRINX-UI.

[screenshot]

The next step is to create the execution logic in python. First we create a new file called “add_integer_worker.py” in the workers directory with the following content.

from __future__ import print_function


def execute_add_two_integers(task):
    addend_one = task['inputData']['id1']
    addend_two = task['inputData']['id2']
    result = int(addend_one) + int(addend_two)
    return {'status': 'COMPLETED', 'output': {'result': result}, 'logs': []}

def start(cc):
    print('Starting add_two_integers worker')
    cc.register('add_two_integers', {
        "name": "add_two_integers",
        "retryCount": 0,
        "timeoutSeconds": 30,
        "inputKeys": [
            "id_1",
            "id_2"
        ],
        "timeoutPolicy": "TIME_OUT_WF",
        "retryLogic": "FIXED",
        "retryDelaySeconds": 0,
        "responseTimeoutSeconds": 30
    })
    cc.start('add_two_integers', execute_add_two_integers, False)

The name of your task in Conductor needs to match the name passed to cc.register and cc.start in the python worker (“add_two_integers”). Next, you need to associate a python function with the task.

The “task” object contains all structures passed from Conductor to the python worker. The return object must be consistent with the format expected by Conductor. For more detailed information see https://netflix.github.io/conductor/
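Before wiring the worker into Conductor, you can sanity-check the execution logic locally with a mocked task dict; the fake_task below mirrors the inputData structure Conductor passes in:

```python
# Self-contained check of the worker's execution logic using a mocked
# Conductor task dict; no Conductor server is needed for this.
def execute_add_two_integers(task):
    addend_one = task['inputData']['id1']
    addend_two = task['inputData']['id2']
    result = int(addend_one) + int(addend_two)
    return {'status': 'COMPLETED', 'output': {'result': result}, 'logs': []}


fake_task = {'inputData': {'id1': '3', 'id2': '4'}}
outcome = execute_add_two_integers(fake_task)
print(outcome['output']['result'])   # 7
```

The returned dict is exactly what Conductor expects back from a worker: a status, an output map (whose keys later feed `${add_two_integers_1st_instance.output.result}` in the workflow), and a list of logs.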

Finally, we need to register our new python worker. Add the import of add_integer_worker and the call to add_integer_worker.start(cc) in main.py, as shown below.

import time
import worker_wrapper
from frinx_rest import conductor_url_base
import cli_worker
import inventory_worker
import lldp_worker
import netconf_worker
import platform_worker
import uniconfig_worker
import unified_worker
import common_worker
import add_integer_worker


def main():
   print('Starting FRINX workers')
   cc = worker_wrapper.ExceptionHandlingConductorWrapper(conductor_url_base, 1, 1)
   register_workers(cc)

   # block
   while 1:
      time.sleep(1000)


def register_workers(cc):
   cli_worker.start(cc)
   netconf_worker.start(cc)
   platform_worker.start(cc)
   lldp_worker.start(cc)
   inventory_worker.start(cc)
   unified_worker.start(cc)
   uniconfig_worker.start(cc)
   common_worker.start(cc)
   add_integer_worker.start(cc)


if __name__ == '__main__':
   main()

Save your changes and (re)start main.py with the following command.

$ python main.py

In the following pictures we see how our workflow is executed from the FRINX UI. The UI entry form is auto-generated from the workflow definition.

[screenshots]

The last picture shows the two input variables that we entered through the UI, and the output: the title that we retrieved from the test API service.

Our workflow can also be executed via the REST API from the command line. This is a two-step process: first we start the workflow and receive a workflow id in the response; then we use the workflow id to retrieve the status and output of the workflow.

$ curl -X POST \
>   http://localhost:8080/api/workflow/EXAMPLE_add_integers_and_GET_HTTP \
>   -H 'Content-Type: application/json' \
>   -H 'Postman-Token: 9eb2de1c-1668-489f-b933-93ae202c48a7' \
>   -H 'cache-control: no-cache' \
>   -d '{
> "id1": "3",
> "id2": "4"
> }
> '
25876e82-6883-4af1-9db3-35888c2c4a23
$
$ curl -X GET   http://localhost:8080/api/workflow/25876e82-6883-4af1-9db3-35888c2c4a23 \
> -H 'Content-Type: application/json' \
> -H 'Postman-Token: 5b783994-1812-4415-87ce-bf2b2cc690ed' \
> -H 'cache-control: no-cache' | json_pp
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  5957    0  5957    0     0   831k      0 --:--:-- --:--:-- --:--:--  831k
{
   "output" : {
      "title" : "magnam facilis autem"
   },
   "endTime" : 1575290757316,
   "workflowVersion" : 1,
   "startTime" : 1575290756511,
   "priority" : 0,
   "input" : {
      "id2" : "4",
      "id1" : "3"
   },
   "updateTime" : 1575290757316,
   "tasks" : [
      {
         "responseTimeoutSeconds" : 30,
         "retried" : false,
         "taskDefName" : "add_two_integers",
         "workflowTask" : {
            "name" : "add_two_integers",
            "optional" : false,
            "startDelay" : 0,
            "taskDefinition" : {
               "retryCount" : 0,
               "retryDelaySeconds" : 0,
               "rateLimitFrequencyInSeconds" : 1,
               "retryLogic" : "FIXED",
               "name" : "add_two_integers",
               "inputKeys" : [
                  "id_1",
                  "id_2"
               ],
               "timeoutSeconds" : 30,
               "rateLimitPerFrequency" : 0,
               "createTime" : 1575289837649,
               "timeoutPolicy" : "TIME_OUT_WF",
               "responseTimeoutSeconds" : 30
            },
            "inputParameters" : {
               "id1" : "${workflow.input.id1}",
               "id2" : "${workflow.input.id2}"
            },
            "type" : "SIMPLE",
            "taskReferenceName" : "add_two_integers_1st_instance",
            "asyncComplete" : false
         },
         "workerId" : "a9794fbfae83",
         "scheduledTime" : 1575290756534,
         "rateLimitPerFrequency" : 0,
         "startTime" : 1575290756775,
         "endTime" : 1575290756793,
         "queueWaitTime" : 241,
         "inputData" : {
            "id1" : "3",
            "id2" : "4"
         },
         "retryCount" : 0,
         "updateTime" : 1575290756775,
         "rateLimitFrequencyInSeconds" : 0,
         "executed" : true,
         "taskType" : "add_two_integers",
         "pollCount" : 1,
         "taskId" : "6abd6a94-2d7e-45a0-8634-19175fc881ec",
         "seq" : 1,
         "referenceTaskName" : "add_two_integers_1st_instance",
         "workflowType" : "EXAMPLE_add_integers_and_GET_HTTP",
         "taskStatus" : "COMPLETED",
         "outputData" : {
            "result" : 7
         },
         "taskDefinition" : {
            "present" : true
         },
         "workflowInstanceId" : "25876e82-6883-4af1-9db3-35888c2c4a23",
         "status" : "COMPLETED",
         "callbackFromWorker" : true,
         "workflowPriority" : 0,
         "startDelayInSeconds" : 0,
         "callbackAfterSeconds" : 0
      },
      {
         "seq" : 2,
         "workflowType" : "EXAMPLE_add_integers_and_GET_HTTP",
         "referenceTaskName" : "http_get_generic_1st_instance",
         "pollCount" : 1,
         "taskId" : "10a0b519-4d5a-4131-a669-2f7218f342e4",
         "callbackFromWorker" : true,
         "workflowInstanceId" : "25876e82-6883-4af1-9db3-35888c2c4a23",
         "taskDefinition" : {
            "present" : true
         },
         "status" : "COMPLETED",
         "callbackAfterSeconds" : 0,
         "startDelayInSeconds" : 0,
         "workflowPriority" : 0,
         "outputData" : {
            "response" : {
               "reasonPhrase" : "OK",
               "headers" : {
                  "Server" : [
                     "cloudflare"
                  ],
                  "CF-RAY" : [
                     "53ed66a10e0edffb-FRA"
                  ],
                  "Set-Cookie" : [
                     "__cfduid=d7cf4fbaeb4fb34808a98c89fc8cba6461575290757; expires=Wed, 01-Jan-20 12:45:57 GMT; path=/; domain=.typicode.com; HttpOnly"
                  ],
                  "X-Content-Type-Options" : [
                     "nosniff"
                  ],
                  "Via" : [
                     "1.1 vegur"
                  ],
                  "Expires" : [
                     "-1"
                  ],
                  "Pragma" : [
                     "no-cache"
                  ],
                  "Expect-CT" : [
                     "max-age=604800, report-uri=\"https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct\""
                  ],
                  "CF-Cache-Status" : [
                     "HIT"
                  ],
                  "Content-Type" : [
                     "application/json; charset=utf-8"
                  ],
                  "X-Powered-By" : [
                     "Express"
                  ],
                  "Connection" : [
                     "keep-alive"
                  ],
                  "Accept-Ranges" : [
                     "bytes"
                  ],
                  "Age" : [
                     "6236"
                  ],
                  "Content-Length" : [
                     "225"
                  ],
                  "Date" : [
                     "Mon, 02 Dec 2019 12:45:57 GMT"
                  ],
                  "Etag" : [
                     "W/\"e1-wrK4SLERwov0EbpkNAKTHsvGWBs\""
                  ],
                  "Access-Control-Allow-Credentials" : [
                     "true"
                  ],
                  "Cache-Control" : [
                     "max-age=14400"
                  ],
                  "Vary" : [
                     "Origin, Accept-Encoding"
                  ]
               },
               "statusCode" : 200,
               "body" : {
                  "id" : 7,
                  "title" : "magnam facilis autem",
                  "body" : "dolore placeat quibusdam ea quo vitae\nmagni quis enim qui quis quo nemo aut saepe\nquidem repellat excepturi ut quia\nsunt ut sequi eos ea sed quas",
                  "userId" : 1
               }
            }
         },
         "taskStatus" : "COMPLETED",
         "queueWaitTime" : 328,
         "endTime" : 1575290757306,
         "scheduledTime" : 1575290756794,
         "rateLimitPerFrequency" : 0,
         "startTime" : 1575290757122,
         "responseTimeoutSeconds" : 0,
         "retried" : false,
         "workflowTask" : {
            "type" : "HTTP",
            "taskReferenceName" : "http_get_generic_1st_instance",
            "asyncComplete" : false,
            "taskDefinition" : {
               "name" : "http_get_generic",
               "retryLogic" : "FIXED",
               "timeoutSeconds" : 10,
               "rateLimitFrequencyInSeconds" : 1,
               "retryCount" : 3,
               "retryDelaySeconds" : 5,
               "timeoutPolicy" : "TIME_OUT_WF",
               "createTime" : 1575289837626,
               "responseTimeoutSeconds" : 10,
               "rateLimitPerFrequency" : 0
            },
            "startDelay" : 0,
            "inputParameters" : {
               "http_request" : {
                  "uri" : "https://jsonplaceholder.typicode.com/posts/${add_two_integers_1st_instance.output.result}",
                  "method" : "GET"
               },
               "asyncComplete" : false
            },
            "optional" : false,
            "name" : "http_get_generic"
         },
         "workerId" : "b1f0c4cf9080",
         "taskDefName" : "http_get_generic",
         "rateLimitFrequencyInSeconds" : 1,
         "executed" : true,
         "updateTime" : 1575290757123,
         "retryCount" : 0,
         "taskType" : "HTTP",
         "inputData" : {
            "http_request" : {
               "uri" : "https://jsonplaceholder.typicode.com/posts/7",
               "method" : "GET"
            },
            "asyncComplete" : false
         }
      }
   ],
   "createTime" : 1575290756511,
   "workflowDefinition" : {
      "restartable" : true,
      "workflowStatusListenerEnabled" : false,
      "outputParameters" : {
         "title" : "${http_get_generic_1st_instance.output.response.body.title}"
      },
      "tasks" : [
         {
            "asyncComplete" : false,
            "type" : "SIMPLE",
            "taskReferenceName" : "add_two_integers_1st_instance",
            "inputParameters" : {
               "id2" : "${workflow.input.id2}",
               "id1" : "${workflow.input.id1}"
            },
            "taskDefinition" : {
               "timeoutSeconds" : 30,
               "retryLogic" : "FIXED",
               "inputKeys" : [
                  "id_1",
                  "id_2"
               ],
               "name" : "add_two_integers",
               "retryDelaySeconds" : 0,
               "retryCount" : 0,
               "rateLimitFrequencyInSeconds" : 1,
               "responseTimeoutSeconds" : 30,
               "createTime" : 1575289837649,
               "timeoutPolicy" : "TIME_OUT_WF",
               "rateLimitPerFrequency" : 0
            },
            "startDelay" : 0,
            "optional" : false,
            "name" : "add_two_integers"
         },
         {
            "optional" : false,
            "name" : "http_get_generic",
            "inputParameters" : {
               "asyncComplete" : false,
               "http_request" : {
                  "method" : "GET",
                  "uri" : "https://jsonplaceholder.typicode.com/posts/${add_two_integers_1st_instance.output.result}"
               }
            },
            "taskDefinition" : {
               "responseTimeoutSeconds" : 10,
               "timeoutPolicy" : "TIME_OUT_WF",
               "createTime" : 1575289837626,
               "rateLimitPerFrequency" : 0,
               "timeoutSeconds" : 10,
               "name" : "http_get_generic",
               "retryLogic" : "FIXED",
               "rateLimitFrequencyInSeconds" : 1,
               "retryCount" : 3,
               "retryDelaySeconds" : 5
            },
            "startDelay" : 0,
            "asyncComplete" : false,
            "type" : "HTTP",
            "taskReferenceName" : "http_get_generic_1st_instance"
         }
      ],
      "schemaVersion" : 2,
      "version" : 1,
      "updateTime" : 1575289863912,
      "inputParameters" : [
         "id1[This is the first addend of the addition][2]",
         "id2[This is the second addend of the addition][3]"
      ],
      "description" : "Adds two integers it receives from input and calls a sample API - EXAMPLE",
      "name" : "EXAMPLE_add_integers_and_GET_HTTP"
   },
   "workflowName" : "EXAMPLE_add_integers_and_GET_HTTP",
   "workflowType" : "EXAMPLE_add_integers_and_GET_HTTP",
   "status" : "COMPLETED",
   "schemaVersion" : 2,
   "version" : 1,
   "workflowId" : "25876e82-6883-4af1-9db3-35888c2c4a23"
}
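The same two-step interaction can also be scripted. Below is a minimal sketch using only the Python standard library; start_workflow and get_workflow are our illustrative helpers, with endpoint paths taken from the curl calls above:

```python
import json
import urllib.request

CONDUCTOR_API = "http://localhost:8080/api"  # adjust if not on the host


def start_workflow(name, payload, base_url=CONDUCTOR_API):
    # Step 1: POST the input document; Conductor answers with the
    # workflow id as plain text.
    req = urllib.request.Request(
        "%s/workflow/%s" % (base_url, name),
        data=json.dumps(payload).encode("utf8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf8")


def get_workflow(workflow_id, base_url=CONDUCTOR_API):
    # Step 2: GET the status and output using the returned id.
    with urllib.request.urlopen("%s/workflow/%s" % (base_url, workflow_id)) as resp:
        return json.loads(resp.read().decode("utf8"))


# With FRINX Machine running:
# wf_id = start_workflow("EXAMPLE_add_integers_and_GET_HTTP",
#                        {"id1": "3", "id2": "4"})
# print(get_workflow(wf_id)["output"]["title"])
```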

This example provides a useful starting point for writing your own workflows. Please see the Conductor project documentation for more information on tasks and workflows: https://netflix.github.io/conductor/

Create a New Workflow using Builder

The goal is to create the same workflow, consisting of the tasks:

  • add_two_integers

  • http_get_generic

using the Workflow Builder, a feature for creating, modifying and executing workflows inside the Uniconfig UI.

To create a new workflow using the Workflow Builder, we press the New button in the Workflows section:

[screenshot]

We will be prompted to enter a unique name for the workflow, in our case it is EXAMPLE_add_integers_and_GET_HTTP.

We can optionally declare additional workflow information (see Workflow information), then we click Save.

[screenshot]

To start adding simple tasks to our workflow, we can search for them and drag them onto the canvas from the side menu on the left side of the screen (see Workflow Tasks).

We are provided with two options:

  • adding an already existing task from the library, which must have a registered worker with execution logic (see Register new task)

  • adding a special LAMBDA task, where we can define our own execution logic written in JavaScript (see Lambda Task).

Since, in this case, we don’t want to reuse our example task add_two_integers in any other workflow, we don’t need to register it; instead, we use a LAMBDA task to define the execution logic locally for this specific task.

We drag and drop the LAMBDA task from the side menu onto the canvas, double-click on it, and fill in a custom name and taskReferenceName (which must be unique across the workflow).

[screenshots]

Next we switch to the Input parameters tab, where we define the input parameters (provided on execution) and the execution logic in JavaScript:

[screenshot]

Note

lambdaValue is the default example parameter; we can use it, or ignore it and define our own parameters.

Execution logic in JavaScript:

var result = parseInt($.id1) + parseInt($.id2);

return result.toString()
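This LAMBDA body does in JavaScript what our earlier python worker did. To convince yourself of the equivalence, here is a hypothetical python rendering of the same logic (lambda_equivalent is our illustrative name):

```python
def lambda_equivalent(inputs):
    # Python rendering of the LAMBDA body above: parse both ids as
    # integers, add them, and return the sum as a string (the JS
    # version ends with result.toString()).
    return str(int(inputs['id1']) + int(inputs['id2']))


print(lambda_equivalent({'id1': '3', 'id2': '4'}))   # 7
```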

Next we add the second task, http_get_generic, and similarly fill in its general and input parameters:

[screenshots]

We link the tasks together by dragging a line from output ports to input ports (see Linking tasks):

[screenshot]

Now the workflow is ready to be executed. However, we may want to set descriptions or default parameters first (see Adding workflow information).

We click Edit general in the upper right corner, open the Defaults & descriptions tab, and set default values and descriptions for the input parameters id1 and id2:

[screenshots]

Finally, we can look at the workflow definition by clicking Definition, and execute the workflow by clicking Execute in the upper right corner (see Executing workflow):

Note

Executing the workflow will also save it. You can also just save the workflow without executing it by pressing Save in the upper left corner.

[screenshots]