Using foundational models on SAP AI Core

In this tutorial, you will learn how to consume large language models (LLMs) deployed on SAP AI Core.
You will learn
  • How to run inference on foundational models on SAP AI Core
Dhrubajyoti Paul (dhrubpaul), February 10, 2025
Created by sharmaneeleshsap, July 12, 2024
Contributors: sharmaneeleshsap, dhrubpaul, rbrainey

Prerequisites

  • A BTP Global Account
    If you are an SAP developer or SAP employee, please refer to the following links (for internal SAP stakeholders only):
    How to create a BTP Account (internal)
    SAP AI Core
    If you are an external developer, a customer, or a partner, refer to this tutorial.
  • SAP AI Core setup and basic knowledge: Link to documentation
  • An SAP AI Core instance with the Standard or Extended plan
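
All of the steps below call a deployed model through the AI API. A minimal setup sketch, assuming the client-credentials OAuth flow of a standard AI Core service key (the host, client ID/secret, and resource-group values are placeholders):

```python
import base64
import json
import urllib.request

def fetch_token(auth_url: str, client_id: str, client_secret: str) -> str:
    """Fetch an OAuth access token via the client-credentials flow
    using the credentials from the AI Core service key."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    req = urllib.request.Request(
        f"{auth_url}/oauth/token?grant_type=client_credentials",
        method="POST",
        headers={"Authorization": f"Basic {creds}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def inference_headers(token: str, resource_group: str = "default") -> dict:
    """Headers sent with every AI Core inference request."""
    return {
        "Authorization": f"Bearer {token}",
        "AI-Resource-Group": resource_group,
        "Content-Type": "application/json",
    }

if __name__ == "__main__":
    # Placeholders: take these values from your AI Core service key.
    token = fetch_token("https://<auth-host>", "<client-id>", "<client-secret>")
    print(inference_headers(token))
```

Each model is then reached at its own deployment URL (visible in the deployment details), of the form `https://<ai-api-host>/v2/inference/deployments/<deployment-id>`.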
  • Step 1

    For more information on the models refer to Claude 3 Family.
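
A minimal sketch of a Claude 3 request body, assuming the Bedrock-style Anthropic messages format and an `/invoke` path on the deployment URL (both are assumptions to verify against your deployment's documentation; the URL is a placeholder):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_claude_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Bedrock-style Anthropic messages payload (assumed format)."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    # POST this body to f"{DEPLOYMENT_URL}/invoke" with the inference headers.
    print(json.dumps(build_claude_payload("Hello, Claude!"), indent=2))
```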

  • Step 2

    For more information on the models refer to GPT-4o mini.

  • Step 3

    For more information on the models refer to Llama 3.1.
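
Open-source models such as Llama 3.1 are typically exposed through an OpenAI-compatible chat-completions interface. A minimal sketch under that assumption (the deployment URL and path are placeholders to verify against your own deployment):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_llama_chat_payload(prompt: str, max_tokens: int = 256) -> dict:
    """OpenAI-style chat payload (assumed interface for Llama deployments)."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

if __name__ == "__main__":
    # POST this body to f"{DEPLOYMENT_URL}/chat/completions" with the inference headers.
    print(json.dumps(build_llama_chat_payload("What is SAP AI Core?"), indent=2))
```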

  • Step 4

    For more information on the models refer to Embeddings - OpenAI.
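
For the OpenAI embedding models, a sketch assuming an Azure-OpenAI-style `/embeddings` endpoint (the `api-version` value and deployment URL are placeholders):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"
API_VERSION = "2023-05-15"  # assumption; check your deployment's documentation

def build_embeddings_payload(text: str) -> dict:
    """OpenAI-style embeddings request body (assumed format)."""
    return {"input": text}

if __name__ == "__main__":
    # POST to f"{DEPLOYMENT_URL}/embeddings?api-version={API_VERSION}"
    # with the inference headers; the vector comes back under data[0].embedding.
    print(json.dumps(build_embeddings_payload("SAP AI Core"), indent=2))
```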

  • Step 5

    For more information on the models refer to Text embeddings API.
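
For the text-embeddings API, a hedged sketch assuming a Vertex-style `instances` request body (both the schema and the prediction path are assumptions; verify them against your deployment's documentation):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_text_embedding_payload(text: str) -> dict:
    """Vertex-style 'instances' request body (an assumption; verify the schema)."""
    return {"instances": [{"content": text}]}

if __name__ == "__main__":
    # POST this body to the deployment's prediction endpoint with the inference headers.
    print(json.dumps(build_text_embedding_payload("SAP AI Core"), indent=2))
```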

  • Step 6

    For more information on the models refer to Text embeddings API.

  • Step 7

    For more information on the models refer to Mixtral-8x7B-Instruct-v0.1.
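
Mixtral deployments are commonly served behind the same OpenAI-compatible chat interface; a sketch under that assumption, with an explicit sampling temperature (URL and path are placeholders):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_mixtral_payload(prompt: str, max_tokens: int = 256,
                          temperature: float = 0.7) -> dict:
    """OpenAI-style chat payload (assumed interface for the Mixtral deployment)."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

if __name__ == "__main__":
    # POST this body to f"{DEPLOYMENT_URL}/chat/completions" with the inference headers.
    print(json.dumps(build_mixtral_payload("Explain mixture-of-experts briefly."), indent=2))
```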

  • Step 8

    For more information on the models refer to Amazon Titan Text models.
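
Amazon Titan text models use a Bedrock-style body rather than the chat-completions shape. A sketch assuming the `inputText` / `textGenerationConfig` schema and an `/invoke` path (the URL is a placeholder):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_titan_payload(prompt: str, max_tokens: int = 256,
                        temperature: float = 0.7) -> dict:
    """Bedrock-style Titan text request body (assumed format)."""
    return {
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": temperature,
        },
    }

if __name__ == "__main__":
    # POST this body to f"{DEPLOYMENT_URL}/invoke" with the inference headers.
    print(json.dumps(build_titan_payload("Write a haiku about clouds."), indent=2))
```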

  • Step 9

    For more information on the models refer to Claude 3 Family.
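
This step returns to the Claude 3 family; one common variant is streaming the response instead of waiting for the full completion. A hedged sketch, assuming a Bedrock-style `/invoke-with-response-stream` path next to `/invoke` (an assumption; the request body is the same messages payload as before):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_claude_stream_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Same Bedrock-style messages body as the non-streaming call (assumed format)."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    # Assumption: POST to f"{DEPLOYMENT_URL}/invoke-with-response-stream" and read
    # the response incrementally instead of waiting for the full body.
    print(json.dumps(build_claude_stream_payload("Tell me a short story."), indent=2))
```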

  • Step 10

    For more information on the models refer to Meta-Llama-3-70B-Instruct.
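
For Meta-Llama-3-70B-Instruct, the same OpenAI-compatible chat interface is assumed; this sketch adds a system message to steer the model (URL and path are placeholders to verify):

```python
import json

# Placeholder: copy the deployment URL from your AI Core deployment details.
DEPLOYMENT_URL = "https://<ai-api-host>/v2/inference/deployments/<deployment-id>"

def build_llama3_payload(system: str, prompt: str, max_tokens: int = 256) -> dict:
    """OpenAI-style chat payload with a system message (assumed interface)."""
    return {
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }

if __name__ == "__main__":
    payload = build_llama3_payload("You are a concise assistant.", "What is BTP?")
    # POST this body to f"{DEPLOYMENT_URL}/chat/completions" with the inference headers.
    print(json.dumps(payload, indent=2))
```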

  • Step 11

    For more information on the models refer to Claude 3 Family.

  • Step 12

    For more information on the models refer to Amazon Titan Text models.
