The SAP Cloud Application Programming Model uses Core Data Services (CDS) to define the artifacts in the database module. Because this model is meant to be database-agnostic (i.e., to work with any database), it does not allow you to leverage features that are specific to SAP HANA Cloud. For this reason, the two tables you will create here do not require any advanced data types.
Step 1: Create database entities
-
In the db folder, right-click and choose New File.
-
Use the following name: interactions.cds
-
Use the following content in this new file:
namespace app.interactions;

using { Country } from '@sap/cds/common';

type BusinessKey : String(10);
type SDate : DateTime;
type LText : String(1024);

entity Interactions_Header {
  key ID        : Integer;
      ITEMS     : Composition of many Interactions_Items on ITEMS.INTHeader = $self;
      PARTNER   : BusinessKey;
      LOG_DATE  : SDate;
      BPCOUNTRY : Country;
};

entity Interactions_Items {
  key INTHeader : Association to Interactions_Header;
  key TEXT_ID   : BusinessKey;
      LANGU     : String(2);
      LOGTEXT   : LText;
};
What is going on?
You are declaring two entities with relationships between each other. The design-time artifacts declared in this file will be converted to run-time, physical artifacts in the database. In this example, the entities will become tables.
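If you want a quick preview of the DDL that will eventually be generated from this model, you can compile it from the terminal (a sketch, assuming a recent @sap/cds-dk is installed; the output format varies by version):

cds compile db/interactions.cds --to sql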
Step 2: Create service interface
-
In the srv folder, create another file and name it interaction_srv.cds
-
Use the following content in this new file:
using app.interactions from '../db/interactions';

service CatalogService {
  entity Interactions_Header
    as projection on interactions.Interactions_Header;
  entity Interactions_Items
    as projection on interactions.Interactions_Items;
}
-
Save all.
What is going on?
You are declaring services to expose the database entities you declared in the previous step.
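Although this step only declares the service, the same definition can already be served locally for a quick sanity check (a sketch, assuming an in-memory SQLite database is available for local testing; the exact OData path, for example /catalog/Interactions_Header or /odata/v4/catalog/Interactions_Header, depends on your CAP version):

cds watch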
-
From the terminal, issue the following command:

cds build
-
Look at the console to see the progress. You can scroll up to see what has been built.
Step 3: Explore generated design-time artifacts
-
If you pay attention to the build log in the console, you will see that the CDS artifacts were converted to hdbtable and hdbview artifacts. You will find those artifacts in a new folder under src called gen.
-
You will now deploy those objects into the HANA database, creating tables and views. We will use the SAP HANA Projects view to do this. Expand this view to see your project.
-
We need to bind our project to a Database Connection and HDI container instance. Press the bind icon to begin the process.
-
The bind process will start a wizard where you will be prompted for values via the command palette at the top of the SAP Business Application Studio screen. You might be asked to confirm your Cloud Foundry endpoint and credentials, depending on how long it has been since you last logged in.
-
You might be presented with options for existing service instances (if you have completed other tutorials or performed other HANA development). For this exercise, choose Create a new service instance.
-
Press Enter to accept the generated service name.
-
It will take a minute or two for the service to be created in HANA. A progress bar will be shown in the message dialog.
-
Upon completion, Database Connections will show the binding to the service instance the wizard just created.
-
We are now ready to deploy the development content into the database. Press the Deploy button (which looks like a rocket) at the db folder level in the SAP HANA Projects view.
-
Scroll up in the console to see what the deploy process has done.
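Depending on your project setup, an equivalent deployment can also be triggered from the terminal with the CDS tooling (a hedged alternative; it assumes you are logged in to Cloud Foundry and may create or reuse an HDI service instance):

cds deploy --to hana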
What is going on?
CDS stands for Core Data Services. This is an infrastructure that allows you to create persistency and services using a declarative language. Notice how you are using the same syntax to define both the persistency and the services.
You can find more information on CDS in the help.
You defined a CDS artifact: an abstraction, a design-time declaration of the entities to be represented in a database and the services that expose them.

The original .cds file was translated into hdbtable artifacts, which use SQL DDL syntax specific to SAP HANA, when you ran cds build.
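For orientation, the generated design-time files typically look something like this (an illustrative sketch; the exact file names are derived from the namespace, entity, and service names and can differ between CDS versions):

db/src/gen/
  app.interactions.Interactions_Header.hdbtable
  app.interactions.Interactions_Items.hdbtable
  CatalogService.Interactions_Header.hdbview
  CatalogService.Interactions_Items.hdbview
  sap.common.Countries.hdbtable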

These hdbtable files were then translated into runtime objects, such as tables, in the HANA database.
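As a concrete, illustrative example, the hdbtable artifact for the header entity contains HANA SQL DDL roughly like the following (a sketch, not the exact generated file; column names and type mappings depend on your CDS version, and the managed association to Country becomes a generated foreign-key column):

COLUMN TABLE APP_INTERACTIONS_INTERACTIONS_HEADER (
  ID INTEGER NOT NULL,
  PARTNER NVARCHAR(10),
  LOG_DATE SECONDDATE,
  BPCOUNTRY_CODE NVARCHAR(3),
  PRIMARY KEY(ID)
)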
If you checked the services in your space, you would see the service for your HDI container.
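For example, you could list the service instances in your space from a terminal (assuming you are logged in with the Cloud Foundry CLI; the instance name will be the one generated by the binding wizard):

cf services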
You can find a similar example and further context on Core Data Services in this explanatory video.
Step 4: Check the Database Explorer
You can now check the generated tables and views in the Database Explorer.
-
In the SAP HANA Projects view, press the Open HDI Container button
-
The Database Explorer will open in a new browser tab and automatically select the database entry for your project’s HDI container.
-
Once open, navigate to the Tables section and click on the Header table.
-
Note that the name of the table matches the generated hdbtable artifacts. You will also see the physical schema managed by the HDI container.
Unless a name is specified during deployment, HDI containers are automatically created with names relative to the project and user generating them. This allows developers to work on different versions of the same HDI container at the same time.

Step 5: Load data into your tables
-
Download the header file and the items file into your local file system.
-
Right-click again on the Header table and choose Import Data.
-
Choose Import Data and press Step 2
-
Choose Local for the Import Data From option. Browse for the Header file and click Step 3.
-
Keep the default import target and click Step 4.
-
Keep the default table mapping and press Step 5.
-
Keep the default error handling and press Review
-
Choose Import into Database.
-
You will see confirmation that 4 records were imported successfully.
-
Repeat the process to import the Items.csv file into the Items table.
Step 6: Check data loaded into the tables
-
You can now check the data loaded into the tables. Right-click on the Items table and click Generate Select Statement.
-
Add the following WHERE clause to the SELECT statement and execute it to complete the validation below.
where "LOGTEXT" like '%happy%';