Use a middleware to develop, trace and orchestrate multiple API calls between SAP IBP and external services.

In the first part of this blog series, I explained the need for data collaboration in supply chain planning and how to extract data from SAP IBP. In the second blog of this series, we touched on the different APIs that are available for writing data back to SAP IBP. All of these processes need to be orchestrated by a middleware, which can be custom code or a framework that handles this job. In this exercise, we used Google's Apigee for this purpose.

For Apigee to initiate the read from SAP IBP, a trigger needs to be sent. Usually it is the planner who would like to get some insights or associate the planning data with a valid context, in order to simulate a scenario or situation for which they need to be prepared. It could also be a manager or any executive involved in planning. Hence, it is recommended to trigger the Apigee API proxy from the tool or application that person already uses. In this exercise, we assumed that the planner uses Excel as the tool of choice for planning, with SAP IBP as the backend system. In Excel, we added a button on the SAP IBP ribbon that makes an HTTP call to Apigee's API proxy (please refer to part 1 for the details).
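To illustrate, here is a minimal sketch (in Python rather than the actual Excel macro, and with a hypothetical proxy URL and credentials) of the single call the ribbon button effectively makes:

```python
# Minimal sketch of the trigger the Excel ribbon button sends to Apigee.
# The proxy URL below is a placeholder; the real endpoint comes from your Apigee org.
import requests

APIGEE_PROXY_URL = "https://my-org-test.apigee.net/planners-trigger"  # hypothetical

def trigger_planners_proxy(user: str, password: str) -> int:
    # The basic auth credentials are forwarded by Apigee to SAP IBP, never stored in the proxy.
    response = requests.post(APIGEE_PROXY_URL, auth=(user, password), timeout=120)
    response.raise_for_status()
    return response.status_code
```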

We created a series of API proxies for the different steps of writing data back to SAP IBP (please refer to part 2). These proxies are individual API calls from an external system to Apigee. Now we create a new API proxy which we call "Planners Trigger". This proxy contains a series of service callouts, with payload extracts and message assignments between them. In the end, it exposes a single API proxy endpoint which can be used in the Excel macro to initiate the read, collaborate and write processes. The orchestration plan looks similar to the following illustration:

Fig. 1. Steps with a set of callouts, extracts and assignments before and after data collaboration.

In the above diagram, an API proxy called Planners Trigger initially makes a call to the metadata API (step 1). This step is used to extract the CSRF token and the set-cookie header (step 2). Using these details, we read the planning data, which is usually a key figure with its attributes from a specific planning area (step 3). The response of this call is then passed through an Extract Message policy to get the JSON payload, which is then assigned to the collaboration call with an external application. The external application could be anything that suits your requirement (step 4). In this example, the capacity supply key figure is extracted from a planning area along with its attributes such as location, resource and time bucket information. This data is combined with external data and algorithms to recalculate a new capacity supply value.
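To make steps 1 to 4 more concrete, here is a rough Python sketch of the equivalent HTTP calls the proxy's service callouts perform. The host, service path, entity set and field names are placeholders, not the actual names from my planning area:

```python
# Sketch of steps 1-4: fetch the CSRF token and cookies, then read the key figure data.
# Host, service and entity names below are placeholders for illustration only.
import requests

IBP_BASE = "https://my-ibp-host/sap/opu/odata/IBP/EXTRACT_ODATA_SRV"  # hypothetical

def read_capacity_supply(user: str, password: str):
    session = requests.Session()
    session.auth = (user, password)

    # Steps 1-2: request a CSRF token; the session object keeps the set-cookie header.
    meta = session.get(f"{IBP_BASE}/$metadata", headers={"x-csrf-token": "fetch"})
    meta.raise_for_status()
    csrf_token = meta.headers["x-csrf-token"]

    # Step 3: read the key figure with its attributes (entity and field names are placeholders).
    read = session.get(
        f"{IBP_BASE}/CapacitySupply",
        params={"$select": "LOCID,RESID,PERIODID,CAPASUPPLY", "$format": "json"},
    )
    read.raise_for_status()

    # Step 4: the JSON payload is handed to the external application for recalculation.
    return session, csrf_token, read.json()
```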

Once this data is available, a transaction ID is requested from the SAP IBP instance (step 5). This ID is then used together with the new values, which are posted back to SAP IBP using the write method (step 6). Once all the data has reached the system, a commit GET call is made to verify the payload and modify the key figure in the planning area (step 7). A separate API is available to check the status of the transaction (step 8).
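Continuing the read sketch above, the write-back could look roughly like this. The paths for the transaction ID, commit and status calls are placeholders rather than the documented names; part 2 of this series covers the actual calls:

```python
# Sketch of steps 5-8, reusing IBP_BASE and the authenticated session and CSRF token
# from the read sketch. Function import paths are placeholders for illustration only.

def write_capacity_supply(session, csrf_token, new_rows):
    headers = {"x-csrf-token": csrf_token, "Content-Type": "application/json"}

    # Step 5: request a transaction ID from SAP IBP (placeholder path and response shape).
    tid_resp = session.get(f"{IBP_BASE}/GetTransactionID", headers=headers)
    tid_resp.raise_for_status()
    transaction_id = tid_resp.json()["d"]["TransactionID"]

    # Step 6: post the recalculated key figure values under that transaction ID.
    for row in new_rows:
        row["TransactionID"] = transaction_id
        session.post(f"{IBP_BASE}/CapacitySupply", json=row, headers=headers).raise_for_status()

    # Step 7: commit GET call to verify the payload and modify the key figure.
    session.get(f"{IBP_BASE}/Commit", params={"TransactionID": transaction_id}).raise_for_status()

    # Step 8: check the status of the transaction with the separate status API.
    status = session.get(f"{IBP_BASE}/GetStatus", params={"TransactionID": transaction_id})
    status.raise_for_status()
    return status.json()
```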

My Planners Trigger API proxy looked like this:

Fig. 2. Example Planners Trigger API proxy implementing the above steps.

It is recommended to test each of these service callouts individually before designing the final orchestrated API proxy. It is not mandatory that reading and writing be done in a single orchestration; they can be split into two or more steps. The credentials for making the API calls to the SAP IBP system are never stored in any of the API proxies. The CSRF tokens and cookie data are generated on the fly by the calls to the backend. In this example, we used basic authorization: the API proxies receive an Authorization header containing the credentials from the external application that initiated the trigger. In our case, it was the Excel macro that set this header (refer to part 1). It is also possible to use certificate-based authentication with SAP IBP.
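For completeness, this is how such a Basic Authorization header is built on the caller side; Apigee simply forwards it to SAP IBP. The user name and password below are placeholders:

```python
# Illustration only: building the Basic Authorization header that the caller sends
# and that Apigee forwards to SAP IBP without storing it anywhere.
import base64

def basic_auth_header(user: str, password: str) -> dict:
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# Usage (placeholder credentials): headers = basic_auth_header("PLANNER01", "s3cret")
```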

Next Steps

While writing this blog, I was using the EXTRACT_ODATA_SRV service in SAP IBP. Since the EXTRACT_ODATA_SRV API contains methods for both reading and writing data in SAP IBP, it would be renamed and further enhanced as the new API, PLANNING_DATA_API_SRV. However, the sequence of steps for the write operation would remain the same. The new API would offer additional features, for example:

  1. The external tool can bring its own transaction ID, which can be accepted by SAP IBP.
  2. An auto-commit flag would help speed up the process in situations where there is only a small payload and a single POST call.

The new API is not limited to these features and has more potential for enhancement; further capabilities would perhaps come up in 2021 and the following sprints.

Summary

This blog series explained the reasons for data collaboration. It also outlined how planning data can be exported out of SAP IBP. Using a cloud-based middleware platform such as Google's Apigee, it showed how the different APIs can be orchestrated through API proxies. We also explained in detail how to initiate the API proxy for collaboration via a trigger from a rudimentary planner's tool. I hope you were able to get a glimpse of how extensible SAP Integrated Business Planning is for data collaboration. Thanks for your time visiting this blog; I would be happy to hear your comments, and feel free to reach out.

Domnic Savio Benedict

--

Domnic Savio

Writes code, loves Netflix and coffee. He has a passion for new technologies, especially things that connect. https://www.linkedin.com/in/domnic-savio-benedict/