
PUBLIC

openSAP
Modernize Integration with SAP Integration Suite

Week 3 Unit 1

00:00:05 - Hello, everyone. Welcome to week three,


00:00:08 which is all about APIs and Event-Based Integration. In unit one, we talk about API
provisioning.
00:00:18 In week one of our course, we went through the five pillars of API Management
00:00:23 as illustrated in the slide. In this week's chapter,
00:00:27 we will deep-dive into API provisioning that involves taking a closer look at API providers,
00:00:34 securely exposing APIs from the systems, and ensuring that there is a uniform facade
00:00:41 for applications and consumers to interact with these APIs. At this point, you probably know
that the API portal
00:00:51 exposes a default virtual host on the apimanagement.region.hana.ondemand.com subdomain.
00:00:59 What you could also do is provision additional virtual hosts that could either be on the default
subdomain
00:01:07 or on custom domains of your choice. It should be pointed out
00:01:12 that this is not the same as custom domain service on BTP. This comes at no additional
charge,
00:01:20 and all we need would be your domain certificates and private keys to be securely provided
00:01:27 to our operations team. We do support extensions like SNI and SAN
00:01:33 over wildcard certificates, if that's okay with you. We do support client certificate based
authentication.
00:01:40 If you need additional security, we have out-of-the-box policies
00:01:46 to read certificate attributes for advanced ACLing if that's your use case.
00:01:53 Another facet of an API Management solution would be to facilitate URL rewriting on the fly
00:02:00 so that dynamic routing decisions can be made. As you can see in the screenshot,
00:02:05 attributes and resource path parameters on the API provider can be parameterized and
dynamically populated.
00:02:14 Another nifty feature would be to facilitate load balancing across multiple API providers
00:02:20 and distribute traffic to these API providers via multiple algorithms like round-robin,
00:02:27 so on and so forth. Not just the API portal, the developer portal,
00:02:33 or the API Business Hub Enterprise can also carry a custom host name,
00:02:39 for example, api.developers.bestrun.com, so that your organization
00:02:46 can have a truly heterogeneous API catalog. Let's deep-dive into API security.
00:02:57 You probably know by now that API providers are nothing but ways to define types of systems

00:03:05 and set the connectivity to these systems. For example, you could have internet-enabled LOB systems,
00:03:12 Cloud Integration systems, from where integration flows can be discovered,
00:03:17 Open Connectors providers, to discover APIs that point to non-SAP systems, and so on.
00:03:24 You can set up MTLS between the API gateway and API providers, if you like.
00:03:31 Now, a prominent use case is to be able to connect to on-prem systems
00:03:36 like S/4HANA or ECC. And for that,
00:03:40 we implicitly end up using the Cloud Connector, and we do support standards like passing the
user principal
00:03:48 from the BTP connectivity context all the way to on-premise systems,
00:03:53 and we have a demo of that coming up now. Not just that,
00:03:58 we have many customers that need connectivity to systems like S/4HANA Cloud,
SuccessFactors,
00:04:05 beyond just basic authentication and certificates. And we do have many variants of policy
templates
00:04:12 where advanced techniques based on OAuth token exchange and identity federation can be
enabled.
00:04:20 Now, let's have a look at some of these aspects in a live demo.
00:04:26 I'm connected to the API Portal of the Integration Suite, and in my Create API Experience,
00:04:33 you see that over and above the default host name, I have the custom domain introduced here
as well.
00:04:40 So we do support additional domain, and if you were to create your API,
00:04:45 it actually gets published on either of these. So I have this API client certificate test,
00:04:52 and as you can see, it actually points to a custom domain. And if I did invoke the API proxy
URL,
00:05:00 you would see that it actually ends up in a failure. That's because I did not explicitly pass a
certificate
00:05:06 while making this call. So you see that the endpoint is actually configured
00:05:12 to accept a two-way SSL certificate. Now, let's see that in action.
00:05:16 So I'm actually now invoking a curl, and on the curl, you see that I'm actually passing
00:05:23 my client certificate and private key. And when I make a call, you should see that as a
response,
00:05:32 it actually gives me back a set of parameters. As a demonstration,
00:05:36 I'm actually printing out the certificate common name, the distinguished name, the certificate
fingerprint,
00:05:42 raw certificate details, and so on, and this is something that we can test as well.
00:05:48 So if I were to just do a quick OpenSSL command to read the certificate attributes,
00:05:54 you see that it actually matches with the response that I just got in my previous invocation.
00:06:00 So this does go on to show that we do honor a two-way SSL on an API proxy endpoint with a
custom domain.
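
For reference, the same two-way SSL call and the certificate check can be reproduced in a few lines of Python. This is a minimal sketch, assuming the requests and cryptography packages; the proxy URL and file names are placeholders, not values from the demo.

    import requests
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes

    # Placeholder proxy URL on a custom domain, plus local client certificate and key files.
    PROXY_URL = "https://api.developers.bestrun.com/v1/cert-echo"
    CERT_FILE = "client.crt"
    KEY_FILE = "client.key"

    # Two-way SSL: the client certificate and private key are presented with the request.
    response = requests.get(PROXY_URL, cert=(CERT_FILE, KEY_FILE))
    print(response.status_code)
    print(response.text)  # the demo proxy echoes back common name, distinguished name, fingerprint

    # Read the same attributes locally (the equivalent of the openssl check in the demo)
    # to compare them with what the proxy returned.
    with open(CERT_FILE, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    print("Subject:", cert.subject.rfc4514_string())
    print("SHA-1 fingerprint:", cert.fingerprint(hashes.SHA1()).hex())
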
00:06:10 Let's move forward. And next coming up on the demo,
00:06:15 I have a system where I've actually configured a principal propagation end to end
00:06:22 with an Azure identity provider. And as you can see,
00:06:26 this is an API provider that actually connects to an SAP S/4HANA on-prem simulation system.

00:06:31 And this is not a real system, something that I've just created as a simulation,
00:06:35 and you see that it's actually connected to an httpbin.local endpoint.
00:06:40 But the interesting thing is that it needs to actually go via Cloud Connector,
00:06:45 and that it's configured for principal propagation. And if I were to now go to my Cloud
Connector setting,
00:06:53 what you see is that the virtual host is indeed connected to my internet host,

00:07:00 and this is something which is set for principal type X.509. Excellent, this is what we wanted.
00:07:06 And on the configuration, you see that we have of type on-premise
00:07:11 and we've actually configured the system and other certificates needed,
00:07:17 and it actually checks for a common name on the parameterized user's email address,
00:07:23 and it actually creates a certificate on the fly, and then propagates this all the way to the
backend.
00:07:28 So that's the way how we are set up. And to test this,
00:07:31 and here is our endpoint on Microsoft Azure, and you see that we are actually set up
00:07:36 for an OpenID Connect invocation. And once I authenticate with my Azure AD,
00:07:42 what you see is that the system comes back with a JWT token, and this token actually has the
user's principal
00:07:50 that is actually of a Microsoft user. And once I invoke my API on API Management
00:07:58 with the bearer token in place and I send this, you should see that the system actually
00:08:05 takes this connection all the way to our on-prem backend. And while that's trivial,
00:08:11 it is important to note that we in fact were successfully able to pass
00:08:17 the user's common name, which is what I am about to extract now.
00:08:22 So on my Cloud Connector logs, if I were to just look at the log entries
00:08:29 and if I look for X.509, what you see is that it in fact does show the fact
00:08:38 that the certificate was generated for my Outlook user, and this is something
00:08:42 that goes all the way to our on-prem system. Excellent.
00:08:47 So you saw that it was so simple to pass a user principal all the way to an on-prem system.
00:08:56 And to achieve this, what you see here are a bunch of policies,
00:09:01 and these are all policies that actually we've published as templates on API Business Hub,
00:09:07 and we are actually set up for a user principal creation, and then exchanging that for a token with XSUAA.
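
Put differently, the client-side part of this flow is just two steps: obtain a user token from the identity provider, then present it as a bearer token to the API proxy. Below is a rough Python sketch under those assumptions, using the msal package for the Azure AD sign-in; the tenant, client ID, and proxy URL are placeholders and do not come from the demo.

    import msal
    import requests

    # Placeholder values for illustration only.
    TENANT_ID = "<azure-tenant-id>"
    CLIENT_ID = "<registered-app-client-id>"
    API_PROXY_URL = "https://api.developers.bestrun.com/v1/principal-propagation-demo"

    # Interactive OpenID Connect sign-in against Azure AD to obtain a JWT for the user.
    app = msal.PublicClientApplication(
        CLIENT_ID, authority=f"https://login.microsoftonline.com/{TENANT_ID}"
    )
    result = app.acquire_token_interactive(scopes=["openid", "profile", "email"])

    # Present the bearer token to the API proxy; the deployed policy templates take over
    # from there, exchanging the token and generating the short-lived X.509 certificate
    # that the Cloud Connector forwards to the on-prem backend.
    response = requests.get(
        API_PROXY_URL,
        headers={"Authorization": f"Bearer {result['access_token']}"},
    )
    print(response.status_code, response.text)
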
00:09:15 So that was the primary example there. And if I can go back to my API portal,
00:09:21 and if I show, for example, now my next API proxy, which is set up for connectivity to BTP,
00:09:28 in this case, S/4HANA Cloud, you see that there is a communication arrangement
00:09:33 which is set up for OAuth and not just username and password.
00:09:38 And as you can see, there are OAuth credentials that the communication arrangement
exposes,
00:09:44 and this is something that we will have API Management honor in this demonstration.
00:09:51 And if I can head back to my destinations, you should be able to see that we are set up for this

00:10:00 and the proxy has an endpoint. And within my target endpoint,
00:10:06 you see that we actually are connecting to the S/4HANA Cloud instance.
00:10:12 Excellent. And yes, this is the crux of the connectivity piece.
00:10:17 You see that now we have the s4hana_oauth_email as the destination,
00:10:25 and you see that it's actually set up to talk to the S/4HANA endpoint,
00:10:29 and it's actually based on OAuth to SAML bearer exchange, and this is something that API
Management will honor.
00:10:36 And we do this by virtue of having a bunch of policy templates.
00:10:42 Again, something that we've documented and you see that all that the user needs to do
00:10:46 is to actually set up the destination name within the configuration file and that's about it.
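
For orientation only, here is the kind of configuration such a destination typically carries, written out as a Python dictionary purely for illustration. The property names reflect common OAuth2SAMLBearerAssertion destination settings, and every host name and credential below is a placeholder; verify the exact properties against the SAP BTP destination documentation and your own communication arrangement.

    # Illustrative sketch of an OAuth2SAMLBearerAssertion destination (not the demo's actual values).
    s4hana_oauth_email_destination = {
        "Name": "s4hana_oauth_email",          # the name referenced in the policy template configuration
        "Type": "HTTP",
        "ProxyType": "Internet",
        "URL": "https://<s4hana-cloud-api-host>",
        "Authentication": "OAuth2SAMLBearerAssertion",
        "audience": "https://<s4hana-cloud-host>",
        "tokenServiceURL": "https://<s4hana-cloud-api-host>/sap/bc/sec/oauth2/token",
        "clientKey": "<communication-user-client-id>",
        "tokenServiceUser": "<communication-user>",
        "tokenServicePassword": "<communication-user-password>",
        "nameIdFormat": "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress",
        "userIdSource": "email",
    }
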
00:10:57 And as I invoke the API, you'll see that the system will do the translation.
00:11:03 I'm actually calling the API business partner endpoint. And you see that it actually comes back

00:11:08 with a token to be exchanged, and this token goes to BTP, and BTP would then set up
00:11:16 the necessary authentication to S/4HANA, and then you see that it actually comes back
00:11:21 with my response set. I could have done something very similar
00:11:26 also with SuccessFactors. So here in this case,
00:11:28 you see that I'm actually invoking the SuccessFactors employee payroll endpoint.
00:11:33 It follows exactly the same approach: I get an access token exchanged,
00:11:39 and with this access token, when I present this to API Management,
00:11:43 API Management presents it to BTP on my behalf, and then eventually it comes back
00:11:49 with an authentication to SuccessFactors. And this is something that you will see in a minute.
00:11:54 And you see that it actually came back with records from SuccessFactors,
00:11:58 and it follows pretty much the same nuances of destination, like I mentioned.
00:12:04 And you see that it's actually connected to API sales demo, and pretty much the same policy
set.
00:12:11 But just that in this case, the destination is set for SuccessFactors and not S/4HANA
00:12:17 and this is the crux. All right, with that demonstration,
00:12:24 let's head back to our summary. All right, so to summarize: what we've learned so far
00:12:31 is that API Management in Integration Suite provides a myriad of options to seamlessly connect
00:12:38 to heterogeneous systems. We have rich out-of-the-box policies,
00:12:43 and many ready-to-deploy policy templates that can help mediate the security handshakes
00:12:50 for clients that seek to connect to backend services within and outside the BTP context.
00:12:56 See you in the next unit.

Week 3 Unit 2

00:00:06 - Hello, everyone. Welcome to week three, unit two.


00:00:09 In this unit, we talk about API consumption. In our previous unit,
00:00:16 we went through aspects of API provisioning. In this week's unit, we look at the other spectrum

00:00:22 of the API management lifecycle, which focuses on API consumers.


00:00:28 These could be partners, end users, or even systems of engagement
00:00:32 that would want to discover and publish the APIs and consume them per the security
standards
00:00:40 defined by API administrators. What used to be simply called a developer portal
00:00:49 in the past now takes a new avatar, called API Business Hub Enterprise.
00:00:55 Please note that in this unit, we don't intend to walk you through
00:01:00 each and every feature and function of the API Business Hub Enterprise
00:01:04 as this is something we've done in previous iterations of our openSAP course.
00:01:09 Instead, we will focus here only on certain novel aspects and features
00:01:15 that we've added from the recent past. The idea is that there is something here for every
persona,
00:01:24 ranging from a cloud-native pro-code developer who needs code snippets, samples,
00:01:30 and client SDKs to get going, to a business app developer who needs a rich set
00:01:36 of strongly typed API specs and definitions that he can pull into any low-code tool,
00:01:42 or even a citizen developer who needs access to all enterprise APIs across SAP and non-SAP
installations
00:01:50 in their organization. And this needs to be in a catalog format,
00:01:54 so that their no-code tools can have access to these APIs. Of course, while we start with a
listing of APIs,
00:02:03 at some point, the API Business Hub Enterprise will have a rich catalog of events, business
objects,
00:02:11 the ability to call actions on these APIs, and so on and so forth, just like
00:02:16 the SAP Business Accelerator Hub does. While we make it possible to implicitly have
00:02:22 APIs and data graphs from the underlying API Management system,
00:02:27 it is totally possible to have APIs listed from third-party API gateways as well as unmanaged
resources
00:02:36 without putting an additional layer of our own gateway on top of this.
00:02:45 There have been recent enhancements on the API Business Hub Enterprise
00:02:50 in terms of its ability to take customizations. We have added site editor capabilities
00:02:57 and a new design that allows customers to rebrand the portal to match with their own
enterprise themes.
00:03:06 We have added graph as a capability that lets customers navigate and explore data graphs
00:03:14 that they would have provisioned on their tenant. Here is an example of the customization
panel
00:03:24 where one could change site headers and banner settings, themes, colors, logos, CSS, and so
on.
00:03:36 In the previous unit, we saw how API Management would interact with API providers
00:03:42 via various connectivity and destination settings. And from an API consumer's point of view,
00:03:49 it is equally easy to invoke the APIs from an app context running in SAP BTP.

00:03:56 Apps that are deployed beyond BTP context as well will have a host of options to connect.
00:04:04 For example, via API keys, certificates, and open standards based on OAuth framework
00:04:11 and OpenID Connect, SAML, and so on. Let's have a quick demonstration
00:04:18 and look at some of these aspects and build on top of our previous week's showcase.
00:04:27 Let's start by invoking the API Business Hub Enterprise's URL in a new browser tab, and as
you can see
00:04:35 over and above the default identity provider, I could actually authenticate
00:04:39 via different identity providers on different protocols as well.
00:04:44 So in this case, I chose to go with OpenID Connect and I present my username
00:04:50 and the system actually determines that for my SAP user, it needs to actually go to Okta.
00:04:56 And you see that I actually sign in with Okta and the system signs me in
00:05:01 and it lets me into the developer portal and you see that I'm actually authenticated
00:05:06 and that I get to see my list of products. Excellent.
00:05:09 But I could have also signed in with my Outlook user and what you see is that the system
00:05:16 will then authenticate me to Microsoft Azure and it actually presents me the same developer
experience,
00:05:22 but just that in this case, I'm authenticated with the Azure user with absolutely no additional
effort.
00:05:29 The platform handles this for me free of cost. I could have also authenticated with SAML,
00:05:35 which is what I do now, and you see that in this case, Okta is also set up as a SAML identity
provider
00:05:42 and I am able to successfully authenticate and log into the same portal with SAML as well.
00:05:50 And here is how the view of the developer portal would look from an administrator standpoint.
00:05:57 What you see here are a bunch of connectivity options called Enterprise Manager, and here
for example,
00:06:03 I could see and manage my platform users, connections to multiple API portals,
00:06:10 so you see that in this case, there is one registered user, and likewise, I could connect
00:06:16 to multiple API portals as well. I could maintain domain categories,
00:06:21 I could maintain notifications and other regular updates that my developers ought to have.
00:06:27 So it's really a rich set of data and portal that allows for developer communication
00:06:34 and developer coordination. Now we do have the site editing capabilities.
00:06:49 So what you see over here is a form, and I could actually edit and update the website settings.

00:06:57 I could add my own branding, my logo, I could manage descriptions,


00:07:01 I could also introduce new colors, themes, and so on. And this is something that would be retained

00:07:07 and would then continue for developers who sign in to the portal subsequently.
00:07:14 And here is a demonstration of the API consumption. And you see that I'm actually logged in to
an application,
00:07:23 to an instance of Business Application Studio. And actually I'm signed in with my SAP IAS
user,
00:07:30 and what is important to note over here is that it's actually a default application
00:07:36 generated per template and it's actually connected to a Northwind destination on the URL
00:07:44 that actually falls under the API Management domain. And it's a pretty standard application,
00:07:49 just generated out of the box. There's nothing special going on here,
00:07:53 and you see that within the manifest descriptor, it's actually connecting to the API resource.
00:08:00 And if I were to now take you through the destination settings, we use a similar destination
00:08:07 as we did in the previous unit. I'm calling destination Northwind,

00:08:12 and it's actually set for OAuth 2.0 JWT bearer. This is something that clients do understand.
00:08:18 It's something that the BTP platform supports, and something that API Management
00:08:23 would then grab and then exchange the presented token and create a new session to
authenticate to the backend.
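
Behind that destination setting sits the standard OAuth 2.0 JWT bearer grant (RFC 7523): the user's incoming JWT is exchanged at a token endpoint for a new token issued on the user's behalf. A minimal Python sketch of that exchange is shown below; the token URL and credentials are placeholders that would normally come from an XSUAA service key, and none of the values are taken from the demo.

    import requests

    # Placeholders; in practice these come from the XSUAA service key of the application.
    TOKEN_URL = "https://<subdomain>.authentication.<region>.hana.ondemand.com/oauth/token"
    CLIENT_ID = "<xsuaa-client-id>"
    CLIENT_SECRET = "<xsuaa-client-secret>"
    user_jwt = "<JWT of the signed-in user>"

    # JWT bearer grant: present the user's token as the assertion and receive a new token
    # that can be used to authenticate onwards on the user's behalf.
    response = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
            "assertion": user_jwt,
        },
        auth=(CLIENT_ID, CLIENT_SECRET),
    )
    print(response.json().get("access_token"))
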
00:08:31 And that's exactly what we'll do now. And if I go to my application API,
00:08:38 you see that it's actually the Northwind API, and if I show you the target endpoint,
00:08:46 it's actually connected to my on-prem system. So the target endpoint entails the API provider
00:08:55 and this is the same API provider that we used in the previous example as well.
00:09:00 An on-prem system connected to Northwind, set for principal propagation.
00:09:06 Excellent. Now let's actually go back to the app and invoke this.
00:09:11 So I'm invoking this in local mode and you see that it actually will start up an application.
00:09:18 The application eventually would then load a UI page and it comes back with a response
00:09:24 and this goes all the way to the backend. It actually has come back with a list of products for
me.
00:09:29 And if I were to create a small filter, show me product ID name, show me quantity per unit,
00:09:36 and that actually comes back with a record set. Excellent.
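
That small filter is simply an OData $select query. A hedged Python sketch against a placeholder URL, assuming an OData V2 Northwind-style service exposed through the API proxy:

    import requests

    # Placeholder URL; in the demo the service sits behind the API Management domain.
    BASE_URL = "https://api.developers.bestrun.com/northwind/v2/Products"

    # "Show me product ID, name, and quantity per unit" expressed as an OData query.
    params = {"$select": "ProductID,ProductName,QuantityPerUnit", "$top": "5", "$format": "json"}
    response = requests.get(BASE_URL, params=params)
    for product in response.json()["d"]["results"]:  # OData V2-style JSON envelope
        print(product["ProductID"], product["ProductName"], product["QuantityPerUnit"])
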
00:09:41 And let's just go back to Cloud Connector and then verify what connections came in.
00:09:46 So if I went to my logs and trace files, downloaded logs, and in this case what you would see
00:09:55 is that the user's application, which in fact is a user from SAP IAS
00:10:02 is the one that the application would then present to the backend.
00:10:07 And this is something that I would filter for, look for an X.509 certificate, look for the user
issuer,
00:10:15 and you see that this is actually pointing to this user who actually comes from my IAS.
00:10:22 Excellent. So we saw that the client connection
00:10:25 came in all the way to the backend and it's so simple for us to create this application
00:10:31 and then have it consumed via API management. So to summarize, what we've learned in this
unit
00:10:39 is that API Management and Integration Suite offers a rich design-time experience
00:10:45 for developers to discover APIs and data graphs from the API Business Hub Enterprise
00:10:51 in a secure and governed manner and we enable a cloud-native experience
00:10:56 for clients to build apps on top of these APIs, something that they can connect
00:11:02 to an enterprise backend system.

Week 3 Unit 3

00:00:05 - Hello everyone. Welcome to week three, unit three.


00:00:09 In this unit we talk about API lifecycle management. In our previous units, we focused on
facade and security
00:00:19 as the main pillars that drive the handshake between API consumers and providers.
00:00:25 In this week's unit, we will deep dive into API governance, transformation
00:00:30 and insights as the supporting pillars that sort of complete all the aspects of the full API lifecycle.
00:00:40 Beyond just the capability to be able to deploy, or undeploy an API,
00:00:46 it is important that your developers know that an API can hold other secondary states as well.
00:00:52 For example, an API that is marked as alpha or beta could be implied to have an SLA
00:01:00 that is totally different from regular APIs, or your developers need to know that as an API
provider,
00:01:08 you have chosen to have multiple versions of an API and that a lower version could be set to a
deprecated state
00:01:16 and yet developers could continue to use these APIs, and a client that makes such a call
00:01:24 will have to deal with a custom HTTP error code.
00:01:28 And if an API were to be later decommissioned, you could deal with it accordingly.
00:01:33 Likewise, just beyond a regular versioning, it will be possible now to carve out multiple
revisions
00:01:40 of the same API and be able to view and cycle through deployed instances of these revisions.

00:01:48 Likewise, also on the API Business Hub Enterprise, a developer may want to periodically
rotate API keys
00:01:57 so that there are no security breaches on their APIs. Access control on APIs and products can
be fine-grained too.
00:02:07 As you can see here on the slide, I have a product that authorizes a read,
00:02:13 write and delete scope, which would mean that access tokens to a developer would have to be
granted
00:02:20 only based on the requested scope. And if the developer tried to perform an operation
00:02:26 that falls outside the scope of the request that was called out,
00:02:36 the call will be rejected. Likewise, an admin could explicitly enable permissions
00:02:42 to be able to discover and subscribe for APIs based on the actual group assignment
00:02:48 that a user's login assertion presents. Here are some aspects of governance
00:02:56 as built into the API Business Hub Enterprise. There are two fundamental pieces here,
00:03:03 one that an admin should be able to look at the users that are registered
00:03:08 as developers in their tenant, look at the subscriptions, perform an on-behalf task like onboard,

00:03:16 offboard users as necessary. The second piece


00:03:19 is the ability to connect to multiple API portals and create one unified API Hub Enterprise.
00:03:28 We want you to continue having multiple API gateways if you have chosen so,
00:03:34 and yet be able to create a unified catalog of APIs that is able to federate across
00:03:41 all your multiple API portal installations. API transformations can be done in multiple ways,
00:03:51 and we have many out-of-the-box policies, like assign message, extract variables,
00:03:57 service call-out and script-based policies for header, payload, and URL manipulation.
00:04:06 We don't necessarily talk about it here in this course. What we want to focus on here

00:04:11 is the ability to conduct API transformation via Graph, which offers first-class, semantically managed and curated APIs
00:04:22 from SAP that really simplify the way you interact with SAP and your non-SAP systems,

00:04:30 helping you build relationships, associations between APIs, being able to filter, query
00:04:37 and select data across multiple systems in an OData V4-compliant way,
00:04:42 or even in a Graph query manner. We do this by building what we call
00:04:49 a business data graph. What you see here on the slide
00:04:52 is basically metadata that describes these objects as data sources and the relationship
between them.
00:05:00 We do support being able to enhance the data graph with your own custom data sources and
model extensions.
00:05:12 Let's have a look at some of these aspects at play in this very quick demonstration.
00:05:19 Let's understand what exactly is a business data graph. So one of the nuances of Graph is that
we have access
00:05:29 to what we call mirrored entities and unified entities. Mirrored entities are basically projections
of data objects.
00:05:36 And as you can see here, we have an entity from SAP S/4HANA,
00:05:42 we call this business partner which is something that you get out of the box.
00:05:47 We have unified entities on our different namespace which are abstracted, and these are
simplified entities
00:05:54 that may connect multiple data models and think of this as a least common denominator
00:06:00 that really makes it simple for you to access all these heterogeneous systems together.
00:06:05 Now we also have what is called a model extension. And model extensions
00:06:11 are nothing but enablers of custom entities, and you actually can define your own entities, for
example,
00:06:18 that may combine attributes from multiple entity sources. You may want to simplify
00:06:24 or rename or hide the complexities from APIs. That's something that model extensions let you
do
00:06:32 and you may as well connect to other sources as well using these associations.
00:06:40 Let's take a look, let's have a quick example of what a custom entity may look like.
00:06:44 So here is an entity called FlyingPartner in the bestrun namespace.
00:06:50 And as you can see, it has a bunch of attributes. And these attributes essentially come
00:06:56 via associations with for example, an S/4 business partner entity.
00:07:01 And you may have a custom data source in which, in this case I call this a frequent flyer entity.

00:07:10 And you see that there are attributes that actually come from this custom data source,
00:07:15 and I blend everything together and then I actually get this custom entity as a result.
00:07:23 Let's take a quick look at how this pans out on the application.
00:07:28 So what you see on the screen is the graph entity within the Integration Suite, Canvas, and
within Graph.
00:07:37 Now you see that there are essentially two portions: We have what is called
00:07:42 the business data graph and the model extension. And let's have a look at each one of these.
00:07:49 The model extension is basically the part where I could create entities from let's say a custom
source.
00:07:57 And the way I have done this is by having this JSON expression,
00:08:03 which is typically file-based, in XML or JSON, but we do have plans of actually
enhancing this,

00:08:11 and then having a user interface to let you enable such a definition.
00:08:18 What you see here on the screen is the frequent flyer entity from the custom data source,
00:08:23 and then we have the business partner entity from S/4, and you could essentially create a
composition,
00:08:29 you could reorder them. So it really is a very powerful way
00:08:34 to be able to create these custom extensions. And not just in a tabular format:
00:08:40 We have a graphical representation as well, for you to look at these relationships.
00:08:50 Now that we've created our model extension, the next step would be to actually create a
business data graph with this.
00:08:57 And to do that, let's create a new business data graph. This is something that I've already
done,
00:09:02 so I'm calling this "flying-partner", and you see that it actually has a call out
00:09:08 to the my.custom namespace and my S/4 namespace as the data sources that I've bound this
to.
00:09:15 That's pretty simple. And once this is created and deployed,
00:09:24 let's basically change our persona and let's don the role of an application developer.
00:09:31 And as an application developer, I head to Graph Navigator. And what you see on the
navigator on the left hand side
00:09:37 are the data graphs that I have access to as a developer. And you see that I have a bunch of
data graphs here.
00:09:45 Let's look at the flying-partner, the one that we just created.
00:09:49 And like I mentioned, you have access to all the abstracted namespaces
00:09:53 in the sap.graph namespace, and we have the mirrored entities as well.
00:09:59 So those are all the entities that you have access to on this data graph.
00:10:05 So that's business partner. And then these are all the other operations
00:10:08 on the business partner. And then I have my custom data source,
00:10:13 the frequent-flyer entity. And last but not least, I have the bestrun namespace,
00:10:18 which eventually would represent the flying-partner object. And you see
00:10:23 that the schema calls out the multiple properties, simple properties that encapsulate this entity,

00:10:30 partner ID, membership, location, all simple types. And what you see on the right-hand side
00:10:37 is an example of how the semantic model for this object plays out.
00:10:43 Again, it's a simple type, and what I could do is to also in real time try out this API,
00:10:51 so I go to the tryout console, and you see that I have an OData query
00:10:57 that I could actually make on the fly. I call this object, and what you see here
00:11:03 is a representation of the OData that I get. What we also have is a GraphQL client,
00:11:10 so in this case I'm looking at Altair, a standalone client for GraphQL visualizations,
00:11:16 and it lets us call out a GraphQL query, very similar to the one that we just did on OData V4.
00:11:24 So I call the bestrun FlyingPartner entity, call out the different nodes,
00:11:30 I query for the top one response. And lo and behold, you actually get a response back
00:11:37 in a GraphQL semantic. So that's as easy as it gets.
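
Purely to convey the shape of those two calls, here is a rough Python sketch. The host, URL pattern, authentication, and the exact GraphQL schema are all assumptions about a generic setup rather than the tenant in the demo, and the attribute names only approximate the ones shown on the slide (partner ID, membership, location).

    import requests

    # Placeholders: the Graph host, the business data graph ID ("flying-partner"),
    # and the access token would come from your own tenant.
    GRAPH_BASE = "https://<graph-host>/api/flying-partner/v1"
    HEADERS = {"Authorization": "Bearer <access-token>"}

    # OData-style request for the custom entity in the bestrun namespace.
    odata = requests.get(f"{GRAPH_BASE}/bestrun/FlyingPartner", params={"$top": 1}, headers=HEADERS)
    print(odata.json())

    # The same request expressed as a GraphQL query, similar to the one run in Altair.
    graphql_query = """
    {
      bestrun {
        FlyingPartner(top: 1) {
          partnerID
          membership
          location
        }
      }
    }
    """
    gql = requests.post(f"{GRAPH_BASE}/graphql", json={"query": graphql_query}, headers=HEADERS)
    print(gql.json())
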
00:11:42 So what you see is that Graph really makes it easy for you to be calling out
00:11:47 OData as well as GraphQL-like object responses. So with that, let's head back to our
summary.
00:11:56 And what do we summarize into? So we summarize that API Management capabilities
00:12:02 within the Integration Suite offer a multitude of features to govern an API's lifecycle and status.

00:12:09 And as we saw, Graph offers a semantic model on APIs from SAP entities and custom entities
as well.
00:12:19 And with that, I end this unit. Thank you very much.

Week 3 Unit 4

00:00:06 Hello and welcome to a closer look at the eventing infrastructure.


00:00:11 I was already telling you earlier, in the introductory session, that we have an end-to-end view,
00:00:19 and you can see it's detailed. So there are quite a number of event sources,
00:00:26 infrastructure components, and event consumers in there. And it's actually so many,
00:00:33 that I need my glasses, and I will put them on now. And let's start on the left-hand side
00:00:39 by looking at the event sources, as part of SAP's event-driven ecosystem today.
00:00:46 You can see that most major back ends, major business applications S/4HANA,
00:00:51 S/4HANA Cloud, ERP, SuccessFactors, Fieldglass are already event enabled.
00:00:58 They're event enabled using different means for some of those back ends or business
applications.
00:01:07 There are standard events: S/4HANA Cloud and S/4HANA have roughly 500 standard events, for example.
00:01:13 You can build custom events. For ERP, the main approach to event enablement
00:01:19 is to use an add-on that allows you to create custom events.
00:01:23 SuccessFactors reuses the Intelligent Services events that are there.
00:01:29 Fieldglass uses an add-on, again, and, and, and. The point is,
00:01:34 and this is the one thing I want you to remember, most SAP business applications,
00:01:40 all major SAP business applications and back ends, are event enabled using different means.
00:01:48 We'll deep dive into that later. Let's switch to the right-hand side,
00:01:52 to the event consumer side. There are a number
00:01:56 of event consumers that are there, BTP. Quite a few BTP services and apps are event
enabled.
00:02:04 Integration Suite allows you to consume events and to mediate those events, to work
00:02:11 on those events, to trigger next steps based on events. BTP, Kyma Runtime is event enabled,
00:02:19 Data Intelligence. Then, there are two things that I want to highlight,
00:02:26 two event consumers I want to highlight, and that is SAP back ends.
00:02:30 On the left-hand side with the event sources, we're talking about outbound events, exposing
events
00:02:36 from business applications, and from back ends, meaning a significant change takes place and
it's exposed.
00:02:44 On the right-hand side, we're talking about inbound, meaning that several SAP back ends are
able
00:02:50 to consume events, as well. This holds true for S/4HANA Cloud and S/4HANA and for ECC.
00:02:59 There are different means of getting this done. We'll look at that in another unit.
00:03:05 And, there's another thing I quickly want to highlight, Azure Event Grid there.
00:03:10 As an event consumer, we have just finished a beta program and as said, it has been
embedded.
00:03:18 In the future, it will be available to consume events via SAP Event Mesh into Azure Event Grid
as well.
00:03:27 At the bottom of this slide, there's quite some know-how as well.
00:03:31 The integration solution advisory methodology includes event-driven architecture.
00:03:36 There's lots of learning materials, and at the center of this event-driven ecosystem, there's
SAP Event Mesh
00:03:45 and SAP Integration Suite Advanced Event Mesh as our eventing infrastructure.
00:03:53 Let's look at those two event brokers, event meshes, in more detail.

00:04:00 So we're talking about fully managed services for enabling enterprise-grade, event-driven
architecture.
00:04:07 Let's start with SAP Event Mesh on the left-hand side. It's, in the end, our starting option
00:04:16 with a low entry barrier to event-driven architecture, and pay per use pricing.
00:04:22 It's used to integrate and extend SAP integrations in an event-driven way.
00:04:29 And, it's deployed on BTP, and it's a native event broker for S/4HANA.
00:04:34 And there's a free add-on for custom ECC events there. It combines openness and focus,
00:04:41 so it supports the major protocols like MQTT, AMQP, REST and it provides a few extra
benefits for the SAP ecosystem.
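
To give a feel for what the REST option looks like on the wire, here is a hedged Python sketch of publishing a message to a topic. The host, topic, and token are placeholders from a hypothetical service key, and the REST path and x-qos header follow the Event Mesh messaging REST API as commonly documented; verify them against the documentation of your own instance.

    import requests

    # Placeholders; real values come from the service key of your SAP Event Mesh instance.
    MESSAGING_HOST = "https://enterprise-messaging-pubsub.cfapps.<region>.hana.ondemand.com"
    TOPIC = "bestrun%2Fsales%2Forder%2Fcreated"  # topic segments URL-encoded ("/" becomes "%2F")
    TOKEN = "<oauth-access-token>"

    response = requests.post(
        f"{MESSAGING_HOST}/messagingrest/v1/topics/{TOPIC}/messages",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
            "x-qos": "1",  # at-least-once delivery
        },
        json={"orderId": "12345", "status": "created"},
    )
    print(response.status_code)
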
00:04:51 It does scale well. It has certain set limits though,
00:04:55 because there's a shared infrastructure behind the scene. And the shared infrastructure does
have a few
conditions that you need to stick to. Advanced Event Mesh complements SAP Event
Mesh
00:05:12 for more demanding scenarios. It's actually not only an event broker, it's an Event Mesh.
00:05:20 So it's a network of event brokers. You can combine the event brokers of Advanced Event
Mesh
00:05:26 into a network with events flowing back and forth. It's completely enterprise grade
00:05:34 in respect to performance, reliability, security. So it always has extra offerings,
00:05:42 extra benefits, in comparison to SAP Event Mesh, and it scales up from small to very large use
cases.
00:05:55 As said, it's a distributed mesh of event brokers. It can be deployed across environments in
private,
00:06:02 and in public clouds. So public clouds could be Azure, AWS, Google Cloud,
00:06:07 and so on, private cloud. You just need a Kubernetes environment,
00:06:11 and you can deploy it there. It holds a full purpose set of eventing services,
00:06:16 and it has great monitoring, great analysis tools.
00:06:22 On top of that, quite a number of advanced features, like fine-grained filtering,
00:06:26 like dynamic message routing. And as said, it's, in the end,
00:06:33 the basis for the foundation for a full-blown, end-to-end eventing infrastructure,
00:06:42 and an eventing strategy there. Important point, the brokers come in t-shirt sizes,
00:06:51 so it's not shared infrastructure. You do get your own broker that you use exclusively.
00:06:58 You can spin up a number of brokers, can combine those brokers to form a mesh,
00:07:04 a network, and those brokers can come in different t-shirt sizes, so that they can cater
00:07:09 to small use cases to ultra large use cases there. What do I want to add to that?
00:07:21 I think we've pretty much mentioned things. Important, network, you've got a lot of deployment
options.
00:07:29 It's your own brokers, so no shared infrastructure, they completely scale.
00:07:36 Looking at this in more detail, I think the important parts are the BTP deployment
00:07:44 on the Event Mesh side, versus the private and public cloud deployment on Advanced Event
Mesh.
00:07:50 So you use BTP to trigger the deployment. The actual deployment takes place
00:07:56 in the hyperscaler cloud provider. So if you look at the message size
00:08:00 and storage here, to just pick a few points, that is a major difference if you look at it.
00:08:08 SAP Event Mesh supports one megabyte of event size and 10 gigabytes for storage,
00:08:18 meaning for buffering of events. At the same time, Advanced Event Mesh,
00:08:23 we're talking about up to 30 megabytes of message size, of event size, and you get up to six
terabytes for buffering.

00:08:32 And if you look at the advanced section, there are offerings, features there,
00:08:41 that SAP Event Mesh does not offer. For example, advanced filtering in respect
00:08:47 to the events, monitoring and analysis, event replay. To a certain extent, you can replay
events, for example,
00:08:56 for a queue that you enable for event replay. And later on, you can then replay those events
00:09:04 so that you can consume them again. And very important, in respect to the SAP event
sources,
00:09:11 SuccessFactors is already available as of today. S/4HANA direct integration is done today
00:09:22 using an add-on that you can deploy on S/4HANA on-premise. Starting with the next version in
the fall,
00:09:33 Advanced Event Mesh will be the second native event broker for S/4HANA on-premise in the
fall, and then for
00:09:41 S/4HANA Cloud at the beginning of 2024. With that said, let's look at Advanced Event Mesh.
00:09:52 What you see here is SAP Integration Suite, Advanced Event Mesh.
00:09:56 You can already see that there are a number of options there. There's an event management
part, holding a catalog,
00:10:03 an event portal, event discovery here. Then there is a section named Event Streaming
00:10:11 that offers you a Cluster Manager and a Mesh Manager, and there's an event insight part
00:10:17 that allows you to do analysis and see what's going on
00:10:21 across your event-driven environment there. Let's start with the cluster manager right there.
00:10:28 This is where you do spin up brokers. You can see I've already created three brokers here,
00:10:35 one based on AWS, one based on Google Cloud, and one based on Azure, and all of this in a
global setup.
00:10:45 If I would like to spin up another broker, I would just click here.
00:10:50 I would give a name to the broker, then I can pick t-shirt sizes.
00:10:56 You can see there are different broker sizes starting from the Enterprise 250, as the smallest
one.
00:11:03 You can, in the end, pick and you get more storage, and so on, based on this t-shirt size.
00:11:11 And in the next step, you would choose the cloud. We could, for example, select Azure,
00:11:18 and you then see the different data centers. I could select one in the US,
00:11:24 and then I could just click Create Service, and we would be spinning up the broker.
00:11:31 I will cancel this very quickly, because I don't want to spin up the broker,
00:11:36 and we will switch over to the Mesh Manager. What is the Mesh Manager about?
00:11:42 I already mentioned that you can form a network of event brokers, and this is what I have done
with our brokers here.
00:11:50 So the US-based one, the Europe-based one, and the Asia-based ones, I have connected to
form a network,
00:11:57 so that events are flowing back and forth between those brokers to wherever they are needed.

00:12:04 And the idea is that you can put an event broker close to an event source, event gets handed
over,
00:12:13 and then it gets distributed on a global level. If we go back to the Cluster Manager,
00:12:19 we can have a very quick look at how we would now manage things.
00:12:25 And this is the UI for queue creation, for really going into details there.
00:12:39 Let's go back and very quickly look into the insights part. This allows you to do analysis here,
and the nice thing
00:12:51 is that you have very detailed analysis using Datadog. So if I go here,

00:12:59 I have different dashboards, I have monitors, I have alerts, and I can perfectly, in detail, see
what's going on
00:13:09 in my event-driven environment. To summarize things,
00:13:15 SAP provides an end-to-end event-driven ecosystem, which is already quite extensive.
00:13:22 It consists of different event sources, our main back ends, quite a number
00:13:28 of business applications there. It consists of eventing infrastructure,
00:13:31 named the SAP Event Mesh and SAP Integration Suite Advanced Event Mesh.
00:13:36 And there are quite a number of event consumers there. Our infrastructure offerings
complement each other,
00:13:43 with SAP Integration Suite Advanced Event Mesh being the enterprise-level offering
00:13:51 that really allows you to completely scale on a global level, and it offers a number
00:14:00 of sophisticated features, it offers distribution of brokers, and it offers ultra high scale.
00:14:09 With that said, let's move on to the next unit.

Week 3 Unit 5

00:00:06 - Hello and welcome to a closer look at SAP's event sources.


00:00:12 And before we go there, I would like to have a closer look at event types, as well.
00:00:18 I was already mentioning when introducing event-driven architectures
00:00:23 that there are different kinds of events, I had mentioned notification and data events.
00:00:29 A notification event just informs you about a significant change.
00:00:34 It does not contain all the data that you need. The advantage is the event is very small
00:00:41 and the data access is controlled, meaning you receive the event.
00:00:46 If you're interested in this event you will have to do an additional API call
00:00:51 and you go through authentication and authorization in the back-end system.
00:00:57 So data access is very controlled. There are customers that like this approach.
00:01:02 Our S/4HANA colleagues like this a lot as well. There are some challenges in this, though.
00:01:09 There is this additional synchronous step. So you get informed of the event in an
asynchronous way,
00:01:16 nevertheless, you have to do the API call and, obviously, on top of a suitable event,
00:01:22 you need a suitable API, as well. And there might be additional configurations for API access.

00:01:31 Then there are data events. All the data that you need is included in the event,
00:01:38 meaning no need for an additional API call. You receive the event, you get informed
00:01:43 of a significant change and you can immediately take action and execute.
00:01:50 This is really good - the data event approach - when the full dataset is required in the majority
of cases.
00:01:57 Nevertheless, once again, there are a few challenges to this.
00:02:02 It raises topics like data access and protection, so you want to watch how and with whom
00:02:09 you're sharing your data. And, obviously, there's a higher resource consumption
00:02:15 on the source side, on the event broker, and on the consumer side -
00:02:20 obviously, because of the larger event size. Both of these event types make sense.
00:02:28 In some use cases, notification events might be the better approach.
00:02:32 In some use cases, data events are the better approach, and SAP back ends typically support
both approaches.
00:02:42 We will see later on what that means in respect to the specific back ends.
00:02:49 Now, standard event versus custom event. Standard events are provided by SAP business
applications
00:02:58 or back ends as part of the SAP standard. S/4HANA, for example, provides roughly 500
standard events.
00:03:06 For S/4HANA, you can look these events up in the SAP Business Accelerator Hub, api.sap.com.
00:03:14 For SuccessFactors, in the Intelligent Services documentation.
00:03:20 For other business applications it's typically the Business Accelerator Hub as well.
00:03:27 Typically these days, it's notification events for S/4HANA. In future, there will be more and
more data events.
00:03:40 For some back ends, no adjustments to these events are possible today.
00:03:44 In the future, we are working on making adjustments to these events possible.
00:03:53 This again, holds true specifically for S/4HANA there. And, it's a great approach
00:04:00 because it scales very well. It takes you roughly a few seconds

00:04:06 up to minutes to just enable a standard event. Nevertheless, quite often, customers need
events
00:04:14 that are tailored for specific use cases. For those use cases you can create custom events.
00:04:22 So customers create these where no standard events are available.
00:04:27 There are different ways of creating these custom events and they differ from back end to back
end.
00:04:33 Typical approach for S/4HANA on-prem would be the event-enablement add-on.
00:04:39 This you can use for ECC as well. Or there's a new S/4HANA Cloud and on-prem approach,
00:04:47 the RAP-based approach, where you can have RAP-based events
00:04:52 that you can tailor to your needs. And typically, you can completely and totally tailor
00:05:00 these events to fit your customer-specific needs. Both of these event types are needed.
00:05:14 In the end, in future, standard events will be there for most of the use cases.
00:05:19 You will be able to switch them on easily, you will be able to adjust them easily.
00:05:25 And in case that you need a custom event, you will be able to create custom events.
00:05:31 Now, there's a different perspective here, again: outbound versus inbound events.
00:05:37 Currently, outbound events are the 90% case. Meaning, events originate from an SAP
business application
00:05:46 or back end and are exposed to the SAP ecosystem or the outside world.
00:05:54 Standard outbound events are enabled with just one click,
00:05:58 and custom outbound events you can create with low-code/no-code to pro-code approaches.
00:06:05 The 10% case are inbound events, meaning, the SAP back ends, the SAP business
applications,
00:06:12 consume events from other business applications, from other back ends, from third-party
applications.
00:06:21 Typically, the effort is higher. I mean, keep in mind this is usually a custom approach.
00:06:32 So you typically have actions in mind when you receive an event that you need
00:06:37 to code into the back end. So, usually it's pro-code development
00:06:42 and you need to specify exactly what to do when receiving an event.
00:06:48 And in most cases, quite often, calling an API is seen as an easier alternative.
00:06:54 So you need to think through whether you really want an inbound event
00:06:58 or whether you just want to call an API. There are lots of advantages though,
00:07:04 this asynchronous approach in respect to creating inbound events, as well.
00:07:11 Now, how does this translate into an overview of SAP's event sources?
00:07:18 You can see some of the main business applications, back ends here, you can see ECC on
the left-hand side,
00:07:25 you can see S/4HANA Cloud, S/4HANA in the middle, and you can see SuccessFactors.
00:07:31 And there's a certain overlap in respect to the approaches. The definitions that I was giving earlier
00:07:40 hold true for those back ends. Let me quickly walk you through
00:07:45 the approaches to event-enable these event sources. SAP ECC uses the event-enablement
add-on.
00:07:54 The event-enablement add-on allows you to create notification and data events.
00:08:01 It uses a low-code to pro-code approach, so you can really pick whether you want to use
00:08:07 a low-code/no-code approach in 20 minutes to create a basic event,
00:08:12 or whether you really want to define using coding what the events do.
00:08:17 You can do inbound and outbound events and this holds true for ECC, S/4HANA Cloud and
S/4HANA.
00:08:26 The events are in CloudEvents format. The only business application that handles

00:08:32 this differently currently is SuccessFactors, that provides SOAP format.
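
To make the format concrete, a small notification event in CloudEvents form might look roughly like the example below. The type, source, and id values are purely illustrative placeholders; the point is that the data block stays small, here carrying only a business partner ID, which is exactly the notification-event pattern described above.

    {
      "specversion": "1.0",
      "type": "sap.s4.beh.businesspartner.v1.BusinessPartner.Changed.v1",
      "source": "/default/sap.s4.beh/<system-id>",
      "id": "a823e884-5edc-4194-a81b-7b4a5f2df114",
      "time": "2023-05-22T10:15:00Z",
      "datacontenttype": "application/json",
      "data": {
        "BusinessPartner": "1000042"
      }
    }
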
00:08:38 For S/4HANA, this event-enablement add-on approach is possible as well.
00:08:43 So you can create custom events based on the event-enablement add-on.
00:08:49 S/4HANA and S/4HANA Cloud do have two other approaches as well, though.
00:08:54 Standard events, they both have roughly 500 standard notification events available
00:09:00 that you can just switch on. Currently these events cannot be adjusted
00:09:05 so you need to ask SAP Development to do this for you. And new events are provided
00:09:12 by request to SAP Development and then there is the way forward: the RAP-based events.
00:09:20 This is the new approach for S/4HANA and S/4HANA Cloud, allowing extensible standard
events in future.
00:09:29 So you will have standard events and you will be able to adjust those
00:09:34 in future versions of S/4HANA and S/4HANA Cloud. Currently you can create custom
notification
00:09:45 and data events using this RAP-based approach, as the goal in future is standard events.
00:09:52 You're able to switch those on easily. At the same time, you can adjust them to your specific
needs.
00:09:58 And then there's SAP SuccessFactors. There, we are reusing the Intelligent Services events.
00:10:05 These are adjustable standard events and you can turn them into notification and data events.

00:10:13 You can adjust the size to your need. The one thing you need to keep in mind for
SuccessFactors:
00:10:20 Custom events are not possible, so you need to use the standard events and adjust them.
00:10:28 With that said, let's quickly look at a demo. What I would like to show you now
00:10:39 is the Business Accelerator Hub. Here, you can look up available events
00:10:45 for your different business applications and back-end systems.
00:10:49 Let me now click on S/4HANA Cloud and click on events. And you do see that there are quite
00:10:58 a number of events available here. For a start, I would like to click
00:11:03 on the business partner events. And you can see that there are two events available:
00:11:09 Business Partner Changed, and Business Partner Created. You can now go into the details
and look at the payload.
00:11:18 And once again, you can see that the payload is very limited here.
00:11:24 So it just contains the business partner ID. Let's now go back and let's look at sales order.
00:11:47 I'm afraid I need the asterisk here. And it's going to find sales order events.
00:11:55 And this is why I'm showing you specifically, business partner and sales order.
00:12:01 Business partner: these are notification events and you can see it's not a lot of events
available.
00:12:08 Sales order is in the end a front runner. You can see lots of events available,
00:12:14 very detailed on a very detailed level. Not only Sales Order Created, Sales Order Changed.
00:12:20 As well on item level: Item Created, Item Changed, Item Profit Center Changed, and so on.
00:12:28 And if we now look at the payload of the event, you can see that the events are getting bigger.

00:12:36 So they already - these are standard events - as part of the standard contain way more
information here.
00:12:46 And this allows you to either already execute and take action, including the data
00:12:53 on the data that you have been delivered. Or alternatively, it allows you
00:12:57 to base your API call to collect additional data from the back-end system on more accurate
information,
00:13:05 on information that gives you more background. Let's switch over to S/4.

00:13:15 And let me show you how easy it is to expose events, to add events getting exposed from S/4.

00:13:23 You can see that currently, we're exposing business partner events,
00:13:28 and this asterisk here indicates that we're exposing all kinds of business partner events.
00:13:33 So, Business Partner Changed, Business Partner Created. What I would like to do now,
00:13:38 I would like to add sales order events, as well. By the way, as a sidetrack,
00:13:44 you can see there are quite a number of events getting exposed from the S/4 system.
00:13:50 I'm interested in sales order events, so I will just limit this slightly
00:13:58 and I will go and select sales order change events. I click create, and if I go back now,
00:14:08 you can see that now I'm not only exposing all business partner events,
00:14:14 I'm exposing all sales order events, as well. And you've seen how quick that was
00:14:19 to add exposing events for sales orders. You can obviously be more specific
00:14:25 and expose only selected events. For simplicity I'm exposing
00:14:30 all kind of sales order events here. Let's now look at how things work in SAP SuccessFactors.

00:14:40 And there you do go to the Intelligent Services Center. On the left-hand side you can see all
kinds of events
00:14:52 and it's quite a list: staff probation, new work order,
00:14:55 new position, and, and, and. Once you have selected an event - employee hire
00:15:02 in our case - you define the trigger type, which is Intelligent Services, the destination type,
00:15:08 REST, SuccessFactors as the source type, obviously. And then JSON as the format.
00:15:14 As said earlier it's not CloudEvents format, still a specific format.
00:15:18 And we're using the REST protocol. Now, we're looking at the Employee Hire event.
00:15:28 You can see that you have a number of fields that you can select there.
00:15:35 And in our case we're just going to stick with what we have.
00:15:40 We are going to click Select and in the next step, we will walk through the process
00:15:46 to create these events all the way: destination settings and review and run towards the end.
00:15:57 So, that was a very quick look at how to expose events in SuccessFactors.
00:16:04 What you should have seen is there are quite a number of events there and it's very easy to
expose them.
00:16:13 And with that said, let's wrap up the demo here and switch back to the slides.
00:16:23 To summarize things: Most SAP back ends and business applications
00:16:27 are event enabled. There are different approaches:
00:16:30 standard events, RAP-based custom events, and custom events using the event-enablement add-on
00:16:39 allow you to expose and consume events out of back ends and business applications.
00:16:46 And the future will be standard events that are extensible and adjustable.

Week 3 Unit 6

00:00:05 - Hello and welcome to event-driven use cases. You've heard a lot about event-driven architecture
00:00:12 and event-driven integrations. Now let's look at what you can do with what you have learned,
00:00:19 and at what our customers and what we at SAP typically do with respect to event-driven
00:00:26 architecture. You have already seen this
00:00:30 example as part of the introduction. So it's a digital twin for our car fleet.
00:00:38 So, for every single car in our car fleet we have events that get raised, that flow around, and that update
00:00:47 the digital twin. The point here is that this does help the car fleet.
00:00:54 Nevertheless, it was the first scenario that SAP has implemented,
00:00:59 and SAP IT has moved forward since then, coming up with more and more scenarios.
00:01:05 What I would like to point out here is: Get started! Identify a scenario that makes sense,
00:01:12 and then take the next step, and you will identify more and more use cases that make sense to you.
00:01:20 Currently, there are a number of front-runner industries, and it's highly interesting to see which industries
00:01:27 you see again and again. It's typically retail, it's consumer goods, it's travel,
00:01:32 and it's pharma and life sciences. There are other industries that
00:01:38 are looking into event-driven architectures - oil and gas, for example.
00:01:44 And specifically here, I even experience this in the real world.
00:01:52 So if I look at retail, there's not a single retailer around my house
00:01:59 that does not employ event-driven architecture in one way or the other.
00:02:04 So if I go grocery shopping, I know what's happening behind the scenes, because these are really
00:02:13 companies that make strong use of event-driven architecture. Travel, it's the same.
00:02:20 Whenever I travel these days, I look at what's going on and I can
00:02:26 tell these companies are using event-driven architecture behind the scenes for
00:02:33 informing customers about changes and, in the end, for steering customers to be on time.
00:02:41 It's highly interesting to think about why that is, why it's specifically these industries.
00:02:48 Quite often it's the need to be very efficient. Quite often it's the real-time aspect,
00:02:56 being able to base your decisions on data that is completely up to date.
00:03:03 And if I look at pharma and life sciences, for example, there it is really about the importance of decisions.
00:03:11 You need completely up-to-date data because the impact of the data not being a hundred percent
00:03:17 up to date would be quite big. Let's look at a few selected use cases.
00:03:23 And let's start on the top left with very specific advanced event mesh use cases.
00:03:32 Advanced event mesh comes out of the financial services industry,
00:03:38 and there, speed is highly important.
00:03:42 If you think about how much money a delay would cost you
00:03:49 in financial services - at the stock exchange and so on -
00:03:53 you can see that speed, real time, is absolutely needed. At the same time,
00:03:58 if an event does not get delivered, if you miss out on an event,
00:04:05 you might lose money as well. So speed and reliability are highly important there.

00:04:11 And advanced event mesh has been put to the test there and has really developed into a completely
00:04:19 reliable event mesh and event broker.
00:04:25 Let's look at a few other scenarios. I was talking about retailers.
00:04:29 Availability is a big topic there. So if I'm interested in buying a new
00:04:36 chair or a new table, I can check on the website whether it's available in a certain store.
00:04:44 And if my table - the table that I want to buy - is available in that store,
00:04:49 I can go there and pick it up immediately. The worst thing that could happen
00:04:54 to this retailer that is offering this check would be that the data's not up to date:
00:05:04 the customer checking on the website, going to the store, and then the table is not available.
00:05:08 So there are retailers that update their stock in real time there.
00:05:14 Supply chain is a very typical use case. You can
00:05:19 track in real time the containers around the globe while they are on big container ships,
00:05:27 while they are on trucks. And based on this real-time information,
00:05:32 you can update not only the arrival dates at your plants, but also the dates at which
00:05:41 your products will make it into the stores,
00:05:49 to the consumers in the end: to the end of the supply chain. Data distribution is always a big topic.
00:05:56 Something happens in one back-end system, and you want to inform extension applications, other
00:06:04 SAP back ends, or potentially third-party business applications
00:06:13 about changes, and distribute the data - potentially even into a data lake.
00:06:19 And then there is one of my favorite examples: customer experience, and I'm typically using
00:06:26 the travel industry as an example here. I was traveling two or three years ago,
00:06:34 and it was a package deal, and it was really during Covid, so I had not been traveling that much,
00:06:41 and I made it to the airport, and we got updates in real time: which gate to go to,
00:06:50 when the flight was ready to board. When we landed at the destination airport,
00:06:55 we were told where to pick up the luggage, meaning exactly which belt to pick it up at,
00:07:04 and where the bus was waiting to transfer us to the hotel. And on the way back, there was a delay of the flight.
00:07:10 They gave us an update in real time: Sorry, your flight is delayed.
00:07:15 You can go and have an extended breakfast or you can still go to the pool, and in two hours
00:07:20 you will be picked up. So that was quite nice,
00:07:22 this customer experience perspective. And I actually learned later on that the
00:07:28 company that we were traveling with was using event-driven architecture
00:07:34 for this behind the scenes. One step further,
00:07:38 and this is one thing I learned when we traveled to TechEd end of last year,
00:07:44 so I was going to Las Vegas. There were different flights going there -
00:07:49 connecting flights - and the approach was very similar. It's becoming standard throughout the industry now.
00:07:56 They were giving us updates: Go to this gate, pick up your luggage at this belt, and so on.
00:08:06 I took it as customer experience and was happy as always about it.
00:08:10 What I did understand on that trip, though, is that you can use it for optimizing your business processes,
00:08:16 and this is what the airline actually did, from my perspective.

00:08:20 They were making sure that the customers were at the gate on time, that they would board the plane on time,
00:08:28 that the plane would leave on time. So this company was taking the next step.
00:08:35 They were really optimizing their business processes, trying to save money,
00:08:40 and trying to make sure everything would work very smoothly. Again,
00:08:47 later on, I learned that this airline is using event-driven architecture
00:08:52 behind the scenes for that. Let's switch over.
00:08:57 And I was talking about life science and pharma,
00:09:02 how important it is to have this real-time aspect in mind. And there's a software-as-a-service offering
00:09:10 named SAP Cell and Gene Therapy Orchestration, and our colleagues there help life science companies to
00:09:20 organize, plan, and execute cell and gene therapy treatments.
00:09:24 And the important point here is that they need to track the supply chain completely.
00:09:32 So they need full supply chain visibility. Why?
00:09:36 Because this therapy, this medication, needs to make it to the customer - which is the patient in the end -
00:09:45 in 72 hours, and they need to be able to fully track it,
00:09:49 meaning the pharma companies need to fully track the entire supply chain
00:09:55 and be able to prove what has been going on. Our colleagues have included 30 different kinds
00:10:05 of events into this software-as-a-service offering. And it's not only outbound events,
00:10:10 they're using inbound events as well. And I really like it because it does help people.
00:10:17 It makes sure that the medication makes it to the patients in time,
00:10:22 that it makes it to the patients in the right quality, and that you can really prove that this is the case.
00:10:29 With that said, I think what has become quite clear is that event-driven architecture
00:10:36 supports quite a variety of use cases. There are a number of front-runner industries
00:10:44 that are moving forward. They have specific requirements - typically real time -
00:10:51 because of which they specifically benefit from event-driven architecture.
00:10:57 And in an SAP context, you can use event-driven architectures in the more obvious data distribution use cases.
00:11:07 You can use event-driven architecture for real-time scenarios.
00:11:12 And on top of that - and I quickly want to mention this towards the end - you can even use it
00:11:19 to adjust communication styles. If you think about it, in your private life
00:11:25 you're using Twitter style, WhatsApp style. Event-driven architecture allows us
00:11:31 to bring this approach into the business world as well. And I know this is just the tip of the iceberg:
00:11:39 There will be more and more use cases and scenarios that will get addressed with event-driven architecture.
00:11:48 With that said, we are at the end of our weekly topic on API and Event-Based Integration.
00:11:55 I hope you had an insightful session and I wish you the very best
00:12:00 for your weekly assignment and the rest of the course. I hope that I will see you on the discussion forum.
00:12:07 Thank you very much.

© 2023 SAP SE or an SAP affiliate company. All rights reserved.
See Legal Notice on www.sap.com/legal-notice for use terms,
disclaimers, disclosures, or restrictions related to SAP Materials
for general audiences.
