Open the modelling environment for building pipelines via the SAP Data Hub Modeler. To access the SAP Data Hub Launchpad on AWS, GCP, or Azure, follow chapters 3.3 and 3.4 of the Getting Started with SAP Data Hub, trial edition guide. From the SAP Data Hub Launchpad you can access the SAP Data Hub Modeler.
As the URL above is a local URL, it is accessible only if you are following the tutorials and have already configured the hosts file. If not, please refer to the Getting Started with SAP Data Hub, trial edition 2.5 guide.
Enter DEFAULT as the Tenant, DATAHUB as the Username, and the password you selected during system setup as the Password to log on.
Create a new graph and add a Workflow Trigger operator, two Pipeline operators, and a Workflow Terminator operator to the graph by drag and drop.
Connect the output (out) port of the Workflow Trigger to the input (in) port of the first Pipeline operator. Connect the output (out) port of the first Pipeline operator to the input (in) port of the second Pipeline operator. Connect the output (out) port of the second Pipeline operator to the stop (in) port of the Workflow Terminator operator.
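For reference, the Modeler stores each graph as a JSON description of its processes and connections. The wiring described above corresponds roughly to the sketch below; the process names, component IDs, and port names are illustrative assumptions, not the exact identifiers your installation will generate:

```json
{
  "processes": {
    "trigger":    { "component": "com.sap.dh.trigger" },
    "pipeline1":  { "component": "com.sap.dh.pipeline" },
    "pipeline2":  { "component": "com.sap.dh.pipeline" },
    "terminator": { "component": "com.sap.dh.terminator" }
  },
  "connections": [
    { "src": { "process": "trigger",   "port": "output" },
      "tgt": { "process": "pipeline1", "port": "input"  } },
    { "src": { "process": "pipeline1", "port": "output" },
      "tgt": { "process": "pipeline2", "port": "input"  } },
    { "src": { "process": "pipeline2", "port": "output"  },
      "tgt": { "process": "terminator", "port": "stop"  } }
  ]
}
```

You normally never edit this JSON by hand for this tutorial; the drag-and-drop connections in the Modeler canvas produce the equivalent description automatically.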
Right-click the first Pipeline operator and choose Open Configuration. For the parameter Graph name, select the graph that we created in the tutorial Create Workflow (part 1), Enrich data with Data Transform. In our case, we have named it Workflow 1.
Similarly, for the second Pipeline operator, under the parameter Graph name select the graph that we created in the tutorial Create Workflow (part 2), Aggregate data with Data Transform. In our case, we have named it Workflow 2. Also increase the parameter Retry interval from 20 to 200 for both Pipeline operators.