Data Collaboration in SAP IBP — Read and Write Key figures(Part 2)
OData APIs in SAP Integrated Business Planning for reading and writing data
A set of REST-based OData APIs is available in SAP IBP to extract key figures and master data. These APIs can also be used from an external middleware to extract planning data from SAP IBP.
In the previous blog, I explained the need for data collaboration and also presented a high-level overview of such a collaboration setup. In this exercise, we use Google’s Apigee as a middleware platform for extracting, transforming and loading planning data from SAP IBP. Your organization may use a different tool for this purpose; nevertheless, the steps involved in calling the APIs remain the same for any middleware.
Google’s Apigee platform provides a façade for consuming backend APIs such as SAP IBP’s OData service. This is done by creating an API Proxy on the Apigee platform. An API Proxy is an HTTP endpoint on Apigee that developers can use to access backend services. In this exercise, we create a simple API Proxy on Apigee to consume planning data from SAP IBP using its EXTRACT_ODATA_SRV API.
Step 1 — Create a Proxy
Sign in to your Apigee account and start by creating a new Proxy. Select the Reverse proxy option in the wizard, which routes inbound traffic to a backend API. In this case, our backend API is the standard SAP IBP API for extracting planning data. As a test, you can use the API URL for getting the service metadata. The URL for this GET call would look like:
https://{Name-of-your-SAP_IBP-host}/sap/opu/odata/IBP/EXTRACT_ODATA_SRV/$metadata
You would have to use your own hostname. Provide this URL as the target URL while building the API Proxy in Apigee. Once you have deployed and finished the API Proxy, you can open it for editing. Go to the Develop section of the API Proxy and add an Assign Message policy to the PreFlow. In the XML for the new Assign Message policy, set the Authorization header; optionally, you can also fetch the x-csrf-token. An example policy would look like this:
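A minimal sketch of such an Assign Message policy, assuming basic authentication with an IBP communication user; the policy name and the credentials placeholder are illustrative:

```xml
<AssignMessage async="false" continueOnError="false" enabled="true" name="AM-Set-Auth-Headers">
  <DisplayName>AM-Set-Auth-Headers</DisplayName>
  <Set>
    <Headers>
      <!-- Replace the placeholder with base64("user:password") of your communication user -->
      <Header name="Authorization">Basic {your-base64-encoded-credentials}</Header>
      <!-- Optional: ask the backend to return a CSRF token with this GET call -->
      <Header name="x-csrf-token">fetch</Header>
    </Headers>
  </Set>
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
  <AssignTo createNew="false" transport="http" type="request"/>
</AssignMessage>
```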
This single API Proxy step would look something like this,
Once you have saved this version of the API Proxy, you can deploy it. Copy the URL of the API Proxy you just created and open it in Postman or your browser. If the credentials are right, you will see the EDMX document from the OData service as a response. Here, the call was made from your browser to the API Proxy you just created in Apigee. The call was then forwarded, with the headers you set in the Assign Message policy, to the target endpoint where SAP IBP’s extract OData service is running. In this way, one can build proxies for each method supported in the EXTRACT_ODATA_SRV API.
A Word on CSRF Token
If you want to modify or create data in the backend, you need a CSRF token. In the Assign Message policy, we have one additional header which fetches the CSRF token using the header x-csrf-token. It is possible to fetch the CSRF token on any GET call to the backend. The response from the backend will contain the token and a cookie from the communication server of the SAP IBP application. We need to extract these two details and reuse them in the following calls. If you do not reuse the cookie in the API Proxy, your target endpoint call will fail with a CSRF token validation error, even if you use the new CSRF token. The response contains two header parameters:
a. x-csrf-token
b. set-cookie
You can also observe this behaviour in Postman when you fetch the x-csrf-token in a GET call. Take a look at the response headers for “set-cookie”. If you reuse the CSRF token in Postman, it automatically picks the set-cookie header, assigns it to a new header called Cookie, and makes your modify or create operation. However, if you write this logic yourself in Node.js or any other language, you need to extract and set this header manually. Apigee does not do this for you automatically like Postman does, but you can use the ExtractVariables policy to extract these two headers and assign them to the subsequent calls.
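The extract-and-reuse logic described above can be sketched in plain Python (the same thing the ExtractVariables policy would do in Apigee); the function and header values are illustrative:

```python
def build_write_headers(csrf_response_headers, authorization):
    """Build the headers for a modifying call from a prior CSRF-fetch GET response.

    `csrf_response_headers` is the header dict of the GET response that was made
    with `x-csrf-token: fetch`. The token alone is not enough: the session cookie
    returned in `set-cookie` must be sent back as `Cookie`, otherwise the backend
    rejects the call with a CSRF token validation error.
    """
    return {
        "Authorization": authorization,
        "x-csrf-token": csrf_response_headers["x-csrf-token"],
        # Reuse the session cookie from the CSRF-fetch response
        "Cookie": csrf_response_headers["set-cookie"],
    }
```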
Step 2 — Create Proxies for writing data
Once you have extracted the data from SAP IBP, you can modify it or mix it with other data that is relevant in your context. After all the transformation is done in your application, it is now time to write the data back to SAP IBP. This is done by executing a sequence of API calls to the backend.
Step 2.1 — Prepare the payload for Writing data to SAP IBP
Once you have read the data from SAP IBP, your code can modify that data, analyse it, or correlate and collaborate with external data that is relevant to it. After you have finalized the content, you are ready to write the data back to SAP IBP. Steps 2.1 to 2.5 explain the sequence of steps you need to perform in order to post this modified data back to SAP IBP.
Step 2.2 — Get a Transaction ID for sending your data back
After your payload is prepared, you need to pair it with a transaction ID. This ID is provided by SAP IBP for your write operation. A GET call is available in the EXTRACT_ODATA_SRV API for this purpose. The URL would look like:
https://{Name-of-your-SAP_IBP-host}/sap/opu/odata/ibp/EXTRACT_ODATA_SRV/getTransactionID?$format=json&P_EntityName={Name-of-your-planning-area}
Use the hostname of your SAP IBP instance; you will also need the name of the planning area to which you would like to write the modified key figure data. The response contains the transaction ID in a JSON payload. If you remove the $format URL parameter from your request, the response will be an XML payload.
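The getTransactionID call above can be wrapped in a small helper that builds the URL; the host and planning-area values used here are placeholders:

```python
def transaction_id_url(host, planning_area):
    """Build the getTransactionID URL for a given planning area.

    $format=json asks for a JSON response instead of the default XML.
    """
    return (
        f"https://{host}/sap/opu/odata/ibp/EXTRACT_ODATA_SRV/getTransactionID"
        f"?$format=json&P_EntityName={planning_area}"
    )
```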
Step 2.3 Now make the POST call to send the data
To write the modified data back to SAP IBP, one has to reuse the transaction ID received in the previous step in the payload of a POST call. The payload would look like this:
AggregationLevelFieldsString contains the names of the key figures you would like to write back to SAP IBP. In this example, the capacity supply is recalculated by an external algorithm using additional data such as real-time availability of the shop floor, custom optimizations, etc. Since capacity supply is associated with location, resource and also the time bucket, all four parameters are needed for one specific time bucket. We pack the names of these fields into the AggregationLevelFieldsString and then, inside the Nav{Planning Area} JSON object, we pack the values of the four attributes. The URL for this POST call would look like:
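A sketch of how such a payload could be assembled. The field names (LOCID, RESID, PERIODID, CAPASUPPLY) and the navigation-property name NavSAPIBP1 are illustrative assumptions; the exact property names come from the $metadata of your planning area’s Trans entity set:

```python
def build_write_payload(transaction_id, records):
    """Assemble a write payload from a transaction ID and a list of records.

    Each record is a dict keyed by the aggregation-level fields, e.g.
    {"LOCID": ..., "RESID": ..., "PERIODID": ..., "CAPASUPPLY": ...}.
    """
    # The field list is derived from the record keys, comma-separated
    agg_fields = ",".join(records[0].keys())
    return {
        "Transactionid": transaction_id,
        "AggregationLevelFieldsString": agg_fields,
        # "NavSAPIBP1" stands in for the Nav{Planning Area} object
        "NavSAPIBP1": records,
    }
```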
https://{Name-of-your-SAP_IBP-host}/sap/opu/odata/ibp/EXTRACT_ODATA_SRV/{Name-of-your-planning-area}Trans
You might have a situation where there are more than a million records you would like to write back to SAP IBP. In that case, you would break the payload into multiple packages and make a POST call for each package. These calls can also be made in parallel. Either way, you reuse the same transaction ID for the whole series of POSTs.
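The packaging step can be sketched as a simple chunking helper; the package size of 100,000 is an arbitrary example, not a documented limit:

```python
def chunk_records(records, package_size=100_000):
    """Split a large record set into packages for separate POST calls.

    All packages are posted under the same transaction ID, so the backend
    treats them as one write operation.
    """
    return [records[i:i + package_size] for i in range(0, len(records), package_size)]
```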
Step 2.4 Commit your transaction
Once all your payload is successfully sent to the backend, the data is temporarily stored in SAP IBP for this transaction. You have to let the SAP IBP system know that your modification is complete and that you have reached the end of all records you want to send back to the system. For this purpose, a commit method is available. You have to pass the transaction ID and the name of your planning area as query parameters of this GET commit call. The API call would look like:
https://{Name-of-your-SAP_IBP-host}/sap/opu/odata/ibp/EXTRACT_ODATA_SRV/commit?P_EntityName={Name-of-your-planning-area}&P_TransactionID={your-transaction-id}&P_AggregationLevelFieldsString=''&P_ReplaceScopeFieldsString=''
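Mirroring the commit URL above, a hypothetical helper for building it; host, planning area and transaction ID are placeholders:

```python
def commit_url(host, planning_area, transaction_id):
    """Build the commit URL that finalizes a write transaction.

    The two empty string parameters are passed as '' just as in the
    manual call shown above.
    """
    return (
        f"https://{host}/sap/opu/odata/ibp/EXTRACT_ODATA_SRV/commit"
        f"?P_EntityName={planning_area}&P_TransactionID={transaction_id}"
        "&P_AggregationLevelFieldsString=''&P_ReplaceScopeFieldsString=''"
    )
```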
Step 2.5 Check the transaction
Once all the data is posted and committed to the backend, it is possible to cross-check whether all the data was received properly and conforms to the data types in the backend. A separate method is available for this purpose. The URL for this would look like,
I created separate API Proxies for each of the calls from steps 2.2 to 2.5. In the end, one could create a single API Proxy and use service callouts to trigger these steps sequentially. In the next blog, I will explain how this is done using Apigee, and also how read and write can be triggered from the Excel macro we created in the previous blog.
Domnic Savio Benedict