AlexDong
Product and Topic Expert

Background


We know that one of the biggest advantages of SAP Integration Suite is its monitoring functionality. You can monitor different KPI figures, such as how many messages completed in the past 12 hours or how many messages failed in the past hour, as the following screenshot shows.


However, sometimes we don't want to check manually every day to see if something goes wrong with messages. We want a notification mechanism so that when, for example, the message count goes beyond some threshold, we receive an email notification. SAP Alert Notification Service can help here.

The IFlow will look like this.



Prerequisites



  1. You have basic knowledge of SAP Integration Suite

  2. You already have an SAP Integration Suite instance

  3. You already have the entitlement for SAP Alert Notification Service


Steps


First of all, we need an API that can show the message status during, for instance, the last 12 hours. Fortunately, SAP already provides such an API, called Message Processing Logs.


The first one returns the full message details for a given filter condition; e.g. you can set Status eq 'FAILED' to get the message processing logs with status 'FAILED'. Let's try the third, simpler one, which you can leverage to return just the message count for a given filter.

You can test this API very conveniently via Postman; a minimal request sketch also follows the list below.

  • API host is your SAP Integration Suite runtime address, with which you must be very familiar if you are an integration developer.

  • API path is /api/v1/MessageProcessingLogs/$count

  • API filter is $filter=LogEnd gt datetime'2023-01-28T20:17:57' and LogEnd le datetime'2023-01-29T08:17:57'. Of course, the start time and end time should be dynamic, and we will set them up within this IFlow.

  • API payload is just the message count, e.g. 96 here.

  • API authorization is the same as the one used for your other IFlows, e.g. the OAuth2 client ID and secret.
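Putting these pieces together, a standalone call would look roughly like the sketch below (plain Groovy outside of CPI, just for testing; the host and the bearer token are placeholders you have to replace with your own values):

import java.net.URLEncoder

// Minimal sketch of the $count call, assuming you already fetched an OAuth2 access token
def host   = "https://<your-integration-suite-runtime-host>"   // placeholder
def filter = "LogEnd gt datetime'2023-01-28T20:17:57' and LogEnd le datetime'2023-01-29T08:17:57'"
def url    = host + "/api/v1/MessageProcessingLogs/\$count?\$filter=" + URLEncoder.encode(filter, "UTF-8")

def conn = new URL(url).openConnection()
conn.setRequestProperty("Authorization", "Bearer <access-token>")  // token obtained with your OAuth2 client ID and secret
println "Message count: " + conn.inputStream.text                  // response body is just the number, e.g. 96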



 

We also need to have SAP Alert Notification Service (ANS) running so that we get an email if the message count within the last 12 hours exceeds the threshold we specified. SAP ANS is not complicated, and you can search the SAP Community for information on how to create an instance, how to set it up, and finally how to test it. For example, here is a nice blog post on how to set it up in the Cloud Foundry environment. Regarding cost and pricing, you can refer to the Discovery Center. Once your SAP ANS instance is ready, you fill in the email subject and payload and send them to SAP ANS, and SAP ANS then sends the email to the recipients specified in SAP ANS.

Here is the event condition I set.
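As a rough sketch (assuming you reuse the event payload shown later in this post), the subscription condition can simply match on the event type:

  Condition: eventType Is Equal To CPIIntegrationFlowExecutionFailure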


 

Now we have the API endpoint for Message Logs and SAP ANS for sending out notification emails.

Let's start composing the Integration Flow with a timer, which will trigger this IFlow every 12 hours.



 

I also added a content modifier named "Initialize", which is pretty simple: it is where you specify the message threshold. If the message count within the 12 hours exceeds this threshold, we trigger the notification; otherwise we do nothing. This is an externalized parameter, so you can configure it whenever you want to change it to a new value.
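As a sketch, the content modifier creates an exchange property along these lines (the property name just has to match what the Router condition uses later; the double curly braces are the externalized parameter, which you can change at configuration time):

  Exchange Property: MessageThreshold = {{MessageThreshold}}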


In the next step we calculate the time range for the last 12 hours via a Groovy script. Here is the code I used. The two headers "CURRENT_TIME_FRAME_START" and "CURRENT_TIME_FRAME_END" will later be consumed in the Message Processing Logs API filter. You can also see that "MAX_TIME_FRAME_SIZE_MS" is set to 12 hours.
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.transform.Field

import java.text.ParseException
import java.text.SimpleDateFormat

@Field final String TIME_ZONE = "UTC"
@Field final String TIME_FORMAT = "yyyy-MM-dd'T'HH:mm:ss"
@Field final SimpleDateFormat DATE_FORMATTER = new SimpleDateFormat(TIME_FORMAT)
@Field final String LOG_PROPERTY_KEY = 'Log'
@Field final String LAST_TIME_FRAME_END_HEADER_NAME = "LAST_TIME_FRAME_END"
@Field final String CURRENT_TIME_FRAME_END_HEADER_NAME = "CURRENT_TIME_FRAME_END"
@Field final String CURRENT_TIME_FRAME_START_HEADER_NAME = "CURRENT_TIME_FRAME_START"
@Field final Integer MAX_TIME_FRAME_SIZE_MS = 12 * 60 * 60 * 1000

Message processData(Message message) {
    StringBuilder logMessage = new StringBuilder('Calculate Time Frame:\n')

    configureDateFormatter()
    Calendar now = Calendar.getInstance()

    // Determine the start and end of the time frame and format them for the OData filter
    String currentTimeFrameStart = DATE_FORMATTER.format(retrieveCurrentTimeFrameStart(message, now))
    String currentTimeFrameEnd = DATE_FORMATTER.format(retrieveCurrentTimeFrameEnd(now))

    logMessage.append("Executing for time frame: ${currentTimeFrameStart} - ${currentTimeFrameEnd}\n")
    message.setHeader(CURRENT_TIME_FRAME_START_HEADER_NAME, currentTimeFrameStart)
    message.setHeader(CURRENT_TIME_FRAME_END_HEADER_NAME, currentTimeFrameEnd)
    message.setProperty(LOG_PROPERTY_KEY, logMessage.toString())
    return message
}

void configureDateFormatter() {
    DATE_FORMATTER.setTimeZone(TimeZone.getTimeZone(TIME_ZONE))
}

// Start of the time frame: the end of the previous run if it is available and recent,
// otherwise "now" minus 12 hours
Date retrieveCurrentTimeFrameStart(Message message, Calendar now) {
    String lastTimeFrameEnd = getStringHeader(message, LAST_TIME_FRAME_END_HEADER_NAME)
    if (lastTimeFrameEnd?.trim()) {
        return getCurrentTimeFrameStart(lastTimeFrameEnd, now)
    } else {
        return getDefaultCurrentTimeFrameStart(now)
    }
}

Date getCurrentTimeFrameStart(String lastTimeFrameEnd, Calendar now) {
    try {
        Date currentTimeFrameStart = DATE_FORMATTER.parse(lastTimeFrameEnd)
        // Never look back further than MAX_TIME_FRAME_SIZE_MS (12 hours)
        return getTimeFrameSize(now.getTime(), currentTimeFrameStart) > MAX_TIME_FRAME_SIZE_MS ? getDefaultCurrentTimeFrameStart(now) : currentTimeFrameStart
    } catch (ParseException e) {
        return getDefaultCurrentTimeFrameStart(now)
    }
}

Date getDefaultCurrentTimeFrameStart(Calendar now) {
    Calendar nowClone = now.clone() as Calendar
    nowClone.add(Calendar.MILLISECOND, MAX_TIME_FRAME_SIZE_MS * -1)
    return nowClone.getTime()
}

static long getTimeFrameSize(Date to, Date from) {
    return to.getTime() - from.getTime()
}

static Date retrieveCurrentTimeFrameEnd(Calendar now) {
    return now.getTime()
}

static String getStringHeader(Message message, String headerName) {
    return message.getHeader(headerName, String.class)
}

Afterwards, let's drag a "request reply" adapter so as to call the Message Processing Logs API. Here are the connection details. As you can see, the two headers from the last step have been filled into the API filter.
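As a rough sketch (the host is a placeholder and the header names come from the Groovy script above), the address and query of that HTTP receiver channel look like this:

  Address: https://<your-runtime-host>/api/v1/MessageProcessingLogs/$count
  Query:   $filter=LogEnd gt datetime'${header.CURRENT_TIME_FRAME_START}' and LogEnd le datetime'${header.CURRENT_TIME_FRAME_END}'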

For credentials, as mentioned, they are the same as for your other IFlows. Here we use OAuth2 authentication.


Next, we drag a "content modifier" adapter and name it "Set Message Count": it takes the count number returned by the API call and stores it in a property.
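A minimal sketch of that step, assuming the count is taken straight from the response body of the previous call:

  Exchange Property: MessageCount = ${in.body}   (type Expression; the $count response is just the number)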


Now that we have the message count of the last 12 hours and we specified the message threshold at the beginning, we can compare the two values. Let's drag a "Router" adapter and give it a condition like ${property.MessageCount} > ${property.MessageThreshold}. This means we will send out the email notification only under this condition.


Let's drag another content modifier and name it "Prepare Notification Body", where we set the email subject and body.


Here is the complete body.
{
    "eventType": "CPIIntegrationFlowExecutionFailure",
    "resource": {
        "resourceName": "Spark Usage Integration Flow",
        "resourceType": "flow",
        "tags": {
            "env": "develop environment"
        }
    },
    "severity": "FATAL",
    "category": "ALERT",
    "subject": "Spark Usage Detected - Alert",
    "body": "Integration flow message number increases by 10% yesterday",
    "tags": {
        "ans:correlationId": "30118",
        "ans:status": "CREATE_OR_UPDATE",
        "customTag": "42"
    }
}

 

And don't forget to add a message header to the adapter, as the following screenshot shows.


Finally, let's drag another "request reply" adapter to call the SAP ANS endpoint. Here are the connection details.


No magic here, except that you need to know the SAP ANS endpoint and the credential information, both of which can be obtained from your SAP ANS instance.
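For orientation only (treat this as an assumption and verify it against your own ANS service key): with the ANS REST producer API, the call is a POST to a URL of roughly this shape, authenticated with the OAuth2 client credentials from the service key:

  POST https://<url-from-your-ANS-service-key>/cf/producer/v1/resource-events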

Test


Let's do a simple test. Set the message threshold to a small value like 0 and deploy the IFlow. For testing purposes you can set the timer to run once after deployment. Then you should get a notification email as long as the IFlow processed messages in the last 12 hours.



Summary


This is just a simple use case and you can extend it according to your own requirements. For example, you may want to monitor the "Failed" messages in the past hour. But please note that this is a custom IFlow, which means this notification IFlow itself also consumes message volume.
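For that variant, essentially only the API filter changes, roughly like this (a sketch, reusing the same time-frame headers but with the window in the Groovy script reduced to one hour):

  $filter=Status eq 'FAILED' and LogEnd gt datetime'${header.CURRENT_TIME_FRAME_START}' and LogEnd le datetime'${header.CURRENT_TIME_FRAME_END}'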

If you don't want to spend money on SAP ANS, you can also leverage another email service, e.g. Gmail, to send the notification.




Are you a developer or integration designer who already knows their way around SAP BTP? Then find out how to build integrations from and to cloud applications with SAP's free learning content on SAP Integration Suite. Check out even more role-based learning resources and opportunities to get certified in one place on the SAP Learning site.