
Execute Batch Payload Operations Using Java Database Connectivity (JDBC) Receiver Adapter

Learn how to connect, interact, and perform batch operations on your database via JDBC receiver adapter.
You will learn
  • How to connect the Java Database Connectivity (JDBC) receiver adapter to your database (on the receiver system) and send, receive, or update data on that database.
  • How to interact with your database using SQL queries and XML.
  • How to perform operations on a database using the Batch Payload functionality.
HarshitaSuryavanshi | December 30, 2022
Created by HarshitaSuryavanshi, September 7, 2022
Contributors: HarshitaSuryavanshi, jmmargo

Prerequisites

The data that you transfer (send or receive) is called the payload. The payload sent to the JDBC adapter can modify data in the database: you can modify multiple records with a single payload using the INSERT, UPDATE, or DELETE modes. This is achieved using the Batch Payload functionality.

There are two types of Batch operations:

  • Atomic: This mode treats the whole batch as a single unit. Either every operation in the batch is applied successfully, or the entire batch is rolled back to its initial state if any operation fails.

  • Non-Atomic: The behaviour of this mode varies depending on the database and its driver. Generally, it commits all records that execute successfully and throws an exception if any record fails; the successfully executed records are not reverted to their initial state.

By following the steps below, you will set up a sample database, create an integration flow, and upload the drivers and data sources required for the JDBC receiver adapter. Then you can create a payload, execute it, and benefit from the Batch Payload functionality.

  • Step 1

    Based on your requirements, choose a database you wish to work with. The JDBC receiver adapter supports more than ten databases. To know more, see JDBC: Supported Databases.

    If you’re connecting to an on-premise database, you must validate the connection in the Cloud Connector. For more information, see JDBC: Access Control.

  • Step 2

    The JDBC adapter supports third-party and in-house databases in the Neo and Cloud Foundry environments of SAP Business Technology Platform. Before you establish a connection to these databases or create an integration flow, you must upload drivers (applicable to SAP HANA, ASE, and PostgreSQL only) and add data sources.

    Refer to Configure JDBC Drivers to understand the process of uploading the drivers for these databases.

    Uploading Drivers

    Refer to Managing JDBC Data Sources to understand the process of creating a data source.

    Deploying Data Sources

  • Step 3

    Now, create a basic integration flow to understand the basic function of the JDBC adapter. In this scenario, a sender is connected to the Start event via the HTTPS sender adapter, and the End event is connected to the receiver via the JDBC adapter. The JDBC adapter connects the integration flow to the receiver’s database and allows you to send, receive, or update data in that database.

    Initial Integration Flow

    You must set the following properties:

    • For the HTTPS adapter: Under the Connection tab > Address, define your endpoint.
    The HTTPS settings

    To learn about other settings, refer to HTTPS Sender Adapter.

    • For the JDBC adapter: Under the Connection tab > JDBC Data Source Alias, enter the name of the data source to which the adapter should connect.
    The JDBC settings

    Note: The data source must be added before you can connect to the database.

  • Step 4

    To execute this scenario, create a table with a few columns.

    1. Add a Content Modifier and add the following query to its Message Body. To prevent duplicate row entries in your database, make one of the columns a primary key.

      create table samplejdbctest (emp_id integer primary key, emp_name varchar(255));

      Add Content Modifier and input Query
    2. Deploy the integration flow. Now, try to execute it via Postman (you can use any other HTTPS client application too).
      You can find the endpoint in the Manage Integration Content section, as shown below.

      Find Endpoint

      In the Request section, enter this endpoint and send a request with a blank body. This sets up the connection between Postman and your integration flow and executes the query from the Content Modifier.

      Send query from Postman
    3. Before proceeding to the next step, remove the Content Modifier from your integration flow. It was needed only to create the table.

    For simplicity, Postman is used to push the payload; later, you will also see how to push the payload using a Groovy script.

  • Step 5
    1. In your integration flow, edit the settings of the JDBC receiver adapter. Under the Connection tab, select the Batch Payload checkbox and set Batch Operation to Atomic. Note that batch mode supports XML payloads in the INSERT, UPDATE, and DELETE modes only; native SQL queries are supported only with prepared statements.

      Batch Payload Settings
    2. Open your HTTP client (in this case, Postman) to create a payload. Enter the username and password under the Authorization tab.

      Postman- Authorization
    3. To test the Batch Payload scenario, use the INSERT mode and create a payload for the table samplejdbctest. To use these modes, you must follow the XML SQL format syntax. Send the following payload from the Request > Body tab of Postman:

      <root>
      	<insert_statement1>
      		<dbTableName action="INSERT">
      			<table> samplejdbctest</table>
      			<access>
      				<emp_id>1</emp_id>
      				<emp_name hasQuot="Yes">v XXXXX</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement1>
      	<insert_statement2>
      		<dbTableName action="INSERT">
      			<table>samplejdbctest</table>
      			<access>
      				<emp_id>2</emp_id>
      				<emp_name hasQuot="Yes"> g28</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement2>
      </root>
      
      Sample Payload

    Notice that the JDBC adapter provides the content from the database in a specific format, as described under Defining XML Documents for Message Protocol XML SQL Format.
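
    This tutorial demonstrates the INSERT action; UPDATE and DELETE payloads follow the same XML structure. The sketch below is only an illustrative, untested example that reuses the samplejdbctest table and columns from this tutorial. Verify the exact semantics of the access block (new values) and key block (WHERE condition) against the XML SQL Format documentation linked above.

      <root>
      	<update_statement>
      		<dbTableName action="UPDATE">
      			<table>samplejdbctest</table>
      			<access>
      				<emp_name hasQuot="Yes">updated name</emp_name>
      			</access>
      			<key>
      				<emp_id>1</emp_id>
      			</key>
      		</dbTableName>
      	</update_statement>
      	<delete_statement>
      		<dbTableName action="DELETE">
      			<table>samplejdbctest</table>
      			<key>
      				<emp_id>2</emp_id>
      			</key>
      		</dbTableName>
      	</delete_statement>
      </root>

    As with INSERT, multiple statements in a single payload are processed together as one batch when Batch Payload is enabled.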

    To check the Message Processing Log (MPL) of this integration flow from your tenant, go to Monitoring > Overview > All Integration flows. Choose the latest record with your artefact name.

    Payload Processed

    Be sure to check the timestamp to determine which record to check.

  • Step 6

    In Atomic mode, no action is performed on any record in the payload if an error is encountered. To verify this, follow these steps:

    1. Copy the above payload and modify just one record so that it becomes unique. For example, here the emp_id of the second record is set to 3, making that record unique while the first record remains a duplicate. Send this payload.

      <root>
      	<insert_statement1>
      		<dbTableName action="INSERT">
      			<table> samplejdbctest</table>
      			<access>
      				<emp_id>1</emp_id>
      				<emp_name hasQuot="Yes">v XXXXX</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement1>
      	<insert_statement2>
      		<dbTableName action="INSERT">
      			<table>samplejdbctest</table>
      			<access>
      				<emp_id>3</emp_id>
      				<emp_name hasQuot="Yes"> g28</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement2>
      </root>
      
      Atomic Batch Operation

      Postman displays an error. For more clarity, check the MPL.

      MPL for Atomic Batch Operation

    2. To confirm whether the unique record (with emp_id 3) was inserted in the database, uncheck the Batch Mode checkbox and deploy your integration flow.

      Batch Mode Off

    Then, send the following query from Postman:

    select * from samplejdbctest

    Atomic Batch Operation Verification

    The response shows that no new data has been inserted, which verifies the Atomic behaviour.

  • Step 7

    The behaviour of Non-Atomic mode differs depending on your driver (which in turn depends on your choice of database). For an SQL Server database, for example, the unique record is inserted and only the duplicate records fail, so the complete payload doesn’t fail. For some other databases and drivers, the mode may behave like Atomic mode and fail the payload even if a single error is encountered.

    1. In your integration flow, edit the settings of the JDBC receiver adapter. Under the Connection tab, select the Batch Payload checkbox and set Batch Operation to Non-Atomic.

      Batch Payload Setting Non-Atomic
    2. Send this payload from Postman. Here the emp_id of the second record is set to 4, making that record unique while the first record remains a duplicate.

      <root>
      	<insert_statement1>
      		<dbTableName action="INSERT">
      			<table> samplejdbctest</table>
      			<access>
      				<emp_id>1</emp_id>
      				<emp_name hasQuot="Yes">v XXXXX</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement1>
      	<insert_statement2>
      		<dbTableName action="INSERT">
      			<table>samplejdbctest</table>
      			<access>
      				<emp_id>4</emp_id>
      				<emp_name hasQuot="Yes"> g28</emp_name>
      
      			</access>
      		</dbTableName>
      	</insert_statement2>
      </root>
      
      Non-Atomic Payload

      Check the MPL too.

      Payload MPL
    3. To confirm whether the unique record (with emp_id 4) was inserted in the database, uncheck the Batch Mode checkbox and deploy your integration flow.

    4. Send the following query from Postman:
      select * from samplejdbctest

      Non-Atomic Batch Mode Verification

    This shows that in Non-Atomic mode the new record is inserted while the duplicate record fails, so the whole payload doesn’t fail, unlike with the Atomic Batch operation.

    The behaviour of the Non-Atomic Batch operation may depend on the choice of database.

  • Step 8

    You can also send the payload using a Groovy script. In this case, you must use prepared statements.

    1. Uncheck Batch Mode in the integration flow.

      Batch Mode Off

      Now, create a new database table. This time, do it directly from Postman instead of via a Content Modifier. Send the following query from Postman.

      create table testjdbcsample (emp_id integer primary key, emp_name varchar (255), join_date varchar(255), a1 integer, a2 integer, email varchar(255));

      Create database from Postman
    2. In the integration flow, insert the Groovy script to pass the payload and click Select.

      Insert Groovy Script
      import com.sap.gateway.ip.core.customdev.util.Message;
      import java.util.Arrays;
      
      // Insert script: sends a batched prepared statement to the JDBC receiver adapter
      def Message processData(Message message) {
      
          // Each list is one row of parameter values for the prepared statement below
          List paramList = new ArrayList();
          paramList.add(Arrays.asList(108, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(107, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(111, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(105, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          // The CamelSqlParameters header carries the batch parameters to the adapter
          message.setHeader("CamelSqlParameters", paramList);
      
          // Body: the prepared statement; each '?' is filled from the parameter rows above
          message.setBody("INSERT INTO testjdbcsample (emp_id,emp_name,join_date,a1,a2,email) VALUES(?,?,?,?,?,?)");
          return message;
      }
      
    3. Select the Batch Mode checkbox and click Deploy.
    4. Send a blank request from Postman to trigger the integration flow; the Groovy script then supplies the payload.

      Send Payload from Postman for Groovy Script

    5. Now, verify that the payload has been sent to the database successfully. Remove the Groovy script, deselect Batch Mode, and deploy the integration flow. Then, run the following query from Postman:
      select * from testjdbcsample

      Groovy Script Payload Verification

    This displays the table data, confirming that the payload was sent successfully via the Groovy script. You can now alter the table data and verify the Atomic and Non-Atomic behaviours as before.

  • Step 9

    Replicate the above Atomic scenario with a Groovy Script.

    1. Choose Atomic Batch Mode and add the following script in Groovy Script as explained in Step 8.

      Here, the third and fourth records of the original Groovy script have been modified.

          import com.sap.gateway.ip.core.customdev.util.Message;
          import java.util.Arrays;
      
          // Insert script: emp_id 108 and 107 are duplicates, 112 and 106 are new
          def Message processData(Message message) {
      
              // Each list is one row of parameter values for the prepared statement below
              List paramList = new ArrayList();
              paramList.add(Arrays.asList(108, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
              paramList.add(Arrays.asList(107, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
              paramList.add(Arrays.asList(112, 'test112', '2021-01-01', 1, 1, 'test@gmail.com'));
              paramList.add(Arrays.asList(106, 'test106', '2021-01-01', 1, 1, 'test@gmail.com'));
              message.setHeader("CamelSqlParameters", paramList);
      
              // Body: the prepared statement; each '?' is filled from the parameter rows above
              message.setBody("INSERT INTO testjdbcsample (emp_id,emp_name,join_date,a1,a2,email) VALUES(?,?,?,?,?,?)");
              return message;
          }
      
    2. Send a blank query in Postman to send the entered payload via Groovy Script.
    3. Check the error from MPL.

      MPL for Atomic Groovy Script
    4. Remove the Groovy script, switch off Batch Mode, and deploy the integration flow. Then, run the following query from Postman to check for any database updates:

    select * from testjdbcsample

    Atomic Groovy Script Verification

    This proves the Atomic mode functionality: the database table is not updated and is the same as it was after Step 8. The new records were not inserted because the payload contained duplicate records.

  • Step 10
    1. Choose Non-Atomic Batch Mode and add the following script in Groovy Script as explained in Step 8.
      import com.sap.gateway.ip.core.customdev.util.Message;
      import java.util.Arrays;
      
      // Insert script: emp_id 108 and 107 are duplicates, 112 and 106 are new
      def Message processData(Message message) {
      
          // Each list is one row of parameter values for the prepared statement below
          List paramList = new ArrayList();
          paramList.add(Arrays.asList(108, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(107, 'test', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(112, 'test112', '2021-01-01', 1, 1, 'test@gmail.com'));
          paramList.add(Arrays.asList(106, 'test106', '2021-01-01', 1, 1, 'test@gmail.com'));
          message.setHeader("CamelSqlParameters", paramList);
      
          // Body: the prepared statement; each '?' is filled from the parameter rows above
          message.setBody("INSERT INTO testjdbcsample (emp_id,emp_name,join_date,a1,a2,email) VALUES(?,?,?,?,?,?)");
          return message;
      }
      
    2. Send a blank query in Postman to send the entered payload via Groovy Script.
    3. Check the error from MPL.

      MPL for Non-Atomic Groovy Script

    4. Remove the Groovy script, switch off Batch Mode, and deploy the integration flow. Then, run the following query from Postman to check for any database updates:

    select * from testjdbcsample

    Non-Atomic Groovy Script Verification

    This shows that the new records have been appended to the database table while the duplicates failed, proving the Non-Atomic mode behaviour.

    Congratulations! You’ve learned how to use Batch Mode functionality. Now, you can try to implement the desired type of Batch Mode in your business requirements.

