Workflows for SAP AI Core are created using YAML or JSON files that are compatible with the SAP AI Core schema. Let's start with a simple workflow that outputs a log file containing: Hello from SAP AI Core.
In your GitHub repository, click Add file > Create new file.
Type LearningScenarios/hello_pipeline.yaml into the Name your file field. This will automatically create the folder LearningScenarios and a workflow named hello_pipeline.yaml inside it.
CAUTION Do not use the name of your workflow file (hello_pipeline.yaml) as any other identifier within SAP AI Core.
Now copy and paste the following snippet into the editor. The code is also available by following this link.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: first-pipeline # Executable ID (max length 64 lowercase-hyphen-separated), please modify this to any value if you are not the only user of your SAP AI Core instance. Example: `first-pipeline-1234`
  annotations:
    scenarios.ai.sap.com/description: "Introduction to SAP AI Core"
    scenarios.ai.sap.com/name: "Tutorial"
    executables.ai.sap.com/description: "Greets the user"
    executables.ai.sap.com/name: "Hello Pipeline"
  labels:
    scenarios.ai.sap.com/id: "learning"
    ai.sap.com/version: "1.0"
spec:
  entrypoint: mypipeline
  templates:
  - name: mypipeline
    steps:
    - - name: greet
        template: greeter
  - name: greeter
    container:
      image: docker.io/python:latest
      command:
        - python3
        - '-c'
      args:
        - |
          print("Hello from SAP AI Core")
CAUTION The key metadata > name specifies your executable ID. In the example above, the value first-pipeline becomes your executable ID. This executable ID is a unique identifier for your workflow within SAP AI Core. The executable ID of each workflow must be unique across all other workflows and GitHub repositories that you sync with your SAP AI Core instance. If your SAP AI Core instance is shared with other users, edit the value to make sure it is unique, for example first-pipeline-1234.
Scroll to the bottom of the page and click Commit new file.
The workflow contains annotations, which are identifiers for SAP AI Core. Your AI use case is termed a scenario (Tutorial in this case), and within each scenario you create executables, that is, workflows (Hello Pipeline in this case). These workflows are used for training, serving, or batch inferencing.
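The excerpt below repeats the relevant lines from the workflow above, with comments indicating what each key maps to in SAP AI Core.

  annotations:
    scenarios.ai.sap.com/name: "Tutorial"                          # scenario name (your AI use case)
    scenarios.ai.sap.com/description: "Introduction to SAP AI Core"
    executables.ai.sap.com/name: "Hello Pipeline"                  # executable (workflow) name
    executables.ai.sap.com/description: "Greets the user"
  labels:
    scenarios.ai.sap.com/id: "learning"                            # scenario ID that groups executables
    ai.sap.com/version: "1.0"                                      # version of the executable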
IMPORTANT Recall that the executable ID (the value of name, first-pipeline in this case) within your workflow must be unique across all GitHub repositories onboarded to your SAP AI Core instance. If your SAP AI Core instance is shared, update the value to first-pipeline-<some-number>.
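If you rename the executable, only the metadata section changes; the rest of the workflow stays the same. A minimal sketch, assuming the hypothetical suffix -1234:

metadata:
  name: first-pipeline-1234 # unique executable ID: lowercase, hyphen-separated, max length 64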
The executable follows a step-by-step flow built from templates (blocks of code). In the example, execution proceeds through the values mypipeline > greet > greeter: the entrypoint mypipeline runs the step greet, which refers to the template greeter.
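As a hedged sketch (not part of the tutorial workflow), a second step could be chained after greet by adding another entry under steps. In Argo Workflows, each group introduced by - - runs only after the previous group has finished; the names farewell and fareweller below are hypothetical.

  - name: mypipeline
    steps:
    - - name: greet          # first step group
        template: greeter
    - - name: farewell       # second step group, runs after greet completes
        template: fareweller # a second container template you would define, analogous to greeter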
The code pulls a public Docker image of Python, runs the Python interpreter inside it, and prints an output.
What is a Docker Image?
A Docker image is a portable Linux environment, similar to a virtual machine. Docker images are layered environments, which means one Docker image may contain just a Linux OS, while another Docker image has Python layered on top of that Linux.
While the code in this tutorial is written directly in the workflow, in actual production you will store the code scripts within your Docker image. The number of code files and the programming language are up to you.
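For illustration, a production-style template might look like the sketch below, where the script is baked into a custom Docker image instead of being written inline. The image name and script path are hypothetical placeholders.

  - name: trainer
    container:
      image: docker.io/<your-docker-user>/my-training:1.0 # hypothetical custom image containing your code
      command:
        - python3
      args:
        - /app/train.py # hypothetical path to the script inside the image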