Prompt optimization
- How to prepare datasets and object stores for prompt optimization.
Prerequisites
- BTP Account: Set up your SAP Business Technology Platform (BTP) account.
  - For internal SAP stakeholders, refer to: How to create BTP Account For Internal SAP Employee (SAP AI Core Internal Documentation).
  - For external developers, customers, or partners, follow this tutorial to set up your environment and entitlements: External Developer Setup Tutorial (SAP AI Core External Documentation).
- Create BTP Instance and Service Key for SAP AI Core: Follow the steps to create an instance and generate a service key for SAP AI Core: Create Service Key and Instance - AI Core Setup Guide.
- AI Core Setup: Follow the step-by-step guide to set up and get started with SAP AI Core: AI Core Setup Tutorial. Note that an Extended SAP AI Core service plan is required, as the Generative AI Hub is not available in the Free or Standard tiers. For more details, refer to SAP AI Core Service Plans.
- How to create and register prompt templates in the Prompt Registry.
- How to configure and run prompt optimization via AI Launchpad, Bruno, and the Python SDK.
- How to monitor executions, review metrics, and save optimized prompts for reuse.
Pre-Read
Before starting this tutorial, ensure that you:
- Understand the basics of Generative AI workflows in SAP AI Core.
- Are familiar with creating and managing prompt templates, artifacts, and object stores.
- Have the required roles, such as genai_manager or custom_evaluation.
- Have completed the Quick Start tutorial or equivalent setup for SAP AI Core and AI Launchpad access.
Architecture Overview
- Prompt Optimization in SAP AI Core connects the Prompt Registry, Object Store, and ML Tracking Service to form an end-to-end optimization workflow.
- The dataset (for example, Test-Data.json) is stored in the Object Store and registered as an artifact.
- During execution, the system uses the selected prompt template, metric, and model to evaluate multiple prompt variants.
- Metrics are tracked in the ML Tracking Service, and both the optimized prompt and the results are saved back to the registry and object store.
- This process runs as an execution and is model-specific, ensuring the optimized prompt aligns with the target model's behavior.
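As a concrete illustration of the dataset step above, the sketch below writes a small evaluation dataset to a local Test-Data.json before it would be uploaded to the Object Store and registered as an artifact. The record schema (`input` / `expected_output`) is an illustrative assumption for this sketch, not a documented format.

```python
import json
from pathlib import Path

# Illustrative evaluation records; the field names are an assumption
# for this sketch, not a documented SAP AI Core schema.
records = [
    {
        "input": "Summarize: SAP AI Core runs ML workloads in the cloud.",
        "expected_output": "SAP AI Core executes ML workloads in the cloud.",
    },
    {
        "input": "Summarize: The Object Store holds datasets for optimization runs.",
        "expected_output": "Datasets for optimization runs live in the Object Store.",
    },
]

path = Path("Test-Data.json")
path.write_text(json.dumps(records, indent=2), encoding="utf-8")

# Sanity-check that the file round-trips before uploading it
loaded = json.loads(path.read_text(encoding="utf-8"))
print(len(loaded))  # number of records written
```

Once the file is in the object store, it can be registered as a dataset artifact and referenced by the optimization configuration.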

Notebook Reference
For hands-on execution and end-to-end reference, use the accompanying Prompt Optimization Notebook. It includes complete Python code examples that align with each step of this tutorial — from dataset preparation and artifact registration to configuration creation, execution, and result retrieval.
💡 Even though this tutorial provides stepwise code snippets for clarity, the notebook contains all required imports, object initializations, and helper functions to run the flow seamlessly in one place.
To use the notebook:
- Download and open the notebook in your preferred environment (for example, VS Code or JupyterLab).
- Configure your environment variables, such as AICORE_BASE_URL, AICORE_AUTH_TOKEN, and your object store credentials.
- Execute each cell in order to reproduce the complete prompt optimization workflow demonstrated in this tutorial.
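The environment variables listed above can be wired up at the top of the notebook roughly as follows. The placeholder values and the bearer-token header layout are assumptions for this sketch (a common REST pattern), not an excerpt from the SDK; replace them with your real service-key values.

```python
import os

# Seed placeholder values for the sketch; in practice these come from
# your shell profile, a .env file, or your CI secret store.
os.environ.setdefault("AICORE_BASE_URL", "https://api.ai.example.hana.ondemand.com")
os.environ.setdefault("AICORE_AUTH_TOKEN", "<paste-token-here>")

base_url = os.environ["AICORE_BASE_URL"]

# Standard bearer-token header layout (an assumption for this sketch);
# the resource group is account-specific, so adjust it to yours.
headers = {
    "Authorization": f"Bearer {os.environ['AICORE_AUTH_TOKEN']}",
    "AI-Resource-Group": "default",
}
print(base_url)
```

With `base_url` and `headers` prepared, the subsequent notebook cells can issue their requests without repeating credential handling.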