
Power Automate guidance documentation
Power Automate guidance documentation provides best practice information from the
team that works with our enterprise customers. We regularly add and update the
guidance content.

Planning a Power Automate project

OVERVIEW

Introduction

Planning phase

Designing phase

Making phase

Testing phase

Deploying and refining phase

Robotic Process Automation with SAP

OVERVIEW

Introduction

Use pro-code RPA with SAP GUI

Use low-code RPA with SAP GUI

Use no-code RPA with SAP GUI

Conclusion

Automation Kit for Power Platform

OVERVIEW

Overview
Setup

Use the Automation kit

Automation adoption best practices

OVERVIEW

Overview

Holistic Enterprise Automation Techniques (HEAT)

Automation Maturity Model

HOW-TO GUIDE

Automation admin and governance

Automation Kit for Power Platform

Manage Power Automate for desktop on Windows

Building custom actions for Power Automate for desktop

Related content

OVERVIEW

Create your first flow

What is a desktop flow?

Power Platform Center of Excellence (CoE) kit

Power Platform ALM guide


Introduction: Planning a Power
Automate project
Article • 02/16/2022

You can use Power Automate to automate your manual and repetitive processes so that
you can focus on higher-value tasks. It's a unique service that you can use to unify cloud
services, desktop applications, and legacy systems.

Do you have a process or task that you want to automate, but aren't quite sure how?
This documentation can help you plan and design an automation project, whether
you're a business user, an IT pro, or a professional app developer who has never worked
on an automation project before.

In these articles, you'll learn about automating your business processes with Power
Automate. The basic steps are as follows:

1. Plan: Identify the who, what, when, and why.

2. Design: Design your new automated process "on paper," and consider various
methods of automation.

3. Make: Create the Power Automate flows.

4. Test: Try out the automation you created.

5. Deploy and refine: Start using the automation in production, identify processes
that can be refined, and decide what to change or add.
Types of process automation
Article • 02/16/2022

This video provides a quick overview of process automation with Power Automate
https://www.microsoft.com/en-us/videoplayer/embed/RE4G4jU?postJsllMsg=true

There are two main types of automation available in Power Automate:

API-based digital process automation: cloud flows

UI-based robotic process automation (RPA): desktop flows

Cloud-based digital process automation (DPA)


You can use Power Automate to automate processes in over 380 applications by using
API-based connectors provided out of the box. Additionally, software engineers can
create new, custom connectors to any application that has an API available. Modern
applications (including cloud-based services) use APIs to provide programmatic access
to data and functionality. The API declares a set of rules for requests, and programmers
use the API to interact with the application.

Without writing any code yourself, you can use connectors to access data and use a
wide variety of application functionality in your automation. For example, you can use
the connectors for SharePoint and your email program to automate the process of
adding a new item to a SharePoint list when you receive an email that has a specific
subject line.
Desktop-based RPA
The next question you might have is: what if I have an application that Power Automate
doesn't have a connector for, and I can't create a custom connector because the app
doesn't have an API? This is where robotic process automation (RPA) comes in. You can
use RPA to create automation even in older systems that don't have an API. With RPA,
you automate applications by teaching Power Automate to mimic the mouse
movements and keyboard entries of a human user, as if a robot were using the
computer. In other words, whereas digital process automation provides API connectors
so you can tell the application what to do, with RPA you show it what to do.

DPA or RPA? Or both?


When using Power Automate to automate processes, we recommend that you use
digital process automation for any applications that have API-based connectors
available, because APIs are meant to be stable even as the application changes over
time. Software vendors work hard to avoid making changes that break the way existing
API rules work.

Conversely, RPA is susceptible to breaking when things change, such as when updates
are applied to a local computer's environment or the layout of an application's screens.
Additionally, you must take great care to ensure that you've been clear in your
instructions to the robot. For example, if you selected cell B3 in a worksheet, do you
want the robot to select cell B3 every time? Do you want it to select the first empty cell
in column B? Do you want it to select the cell in column B for the row where column A is
set to a specific value? When using RPA, it's easy to give incomplete instructions, or to rely
on decisions that aren't obvious just from recording your mouse clicks and keyboard
entries. It might take some iteration to ensure that you've provided all the necessary
instructions, including what to do in case of errors.
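To see why the instructions must be explicit, the cell-selection question above can be sketched as code. This is a minimal illustration, not Power Automate syntax; the "worksheet" is a hypothetical list of rows, and in a real desktop flow you'd express the same rule with Excel actions.

```python
# Make the implicit decision explicit: "select the first empty cell
# in column B," rather than "select cell B3 every time."

def first_empty_row_in_column_b(rows):
    """Return the 1-based index of the first row whose column B is empty."""
    for index, row in enumerate(rows, start=1):
        if row.get("B", "") == "":
            return index
    return len(rows) + 1  # no empty cell found: append after the last row

rows = [
    {"A": "Lee", "B": "Approved"},
    {"A": "Nick", "B": ""},        # first empty cell in column B
    {"A": "Abhay", "B": "Pending"},
]

print(first_empty_row_in_column_b(rows))  # → 2
```

A recording alone would only have captured "click B3"; writing the rule out forces you to decide what the robot should do when the data moves.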

Power Automate provides both digital process automation and RPA, to bridge what you
can automate between modern, API-based services and the websites and desktop
applications for which you don't have an API-based connector.
Planning phase
Article • 02/16/2022

Planning is the most important part in automating your process or task. When planning,
you should consider the following:

What problem will the process automation solve?

Who will use the process automation?

What goals and objectives will it meet for users?

Knowing the answers helps you stay on track when you design your process
automation. It's easy to fall into the trap of automating the process as the objective,
rather than solving the problem.

The elements of planning your process automation are very similar to how you plan
your app creation using Power Apps, and include:

Identifying the business problem to solve (the use case)


Deeply understanding the business process
Optimizing the business process with your solution
Deciding whether it's worthwhile to automate the process
Creating a project plan

For detailed information about each of these elements in the planning phase, read the
planning section in Power Apps guidance content: Planning phase.

After reading the planning section in Power Apps guidance content, come back and
continue reading the Designing phase in this guidance content.
Process design
Article • 02/10/2023

When you design a process automation, the first step is to decide when and what to
automate. Looking at the business process you currently have, you should first identify
which part of the process to automate.

Identifying automation areas


The types of benefits you can potentially gain by automation fall into these categories:

Consistently apply standardized business rules

Reduce manual work on repetitive processes

Reduce human error

Streamline approvals

Gain efficiency in high-volume transactions

Efficiently move data between systems (reduce manual data entry)

Maximize the use of available resources

Increase throughput

Apply standardized business rules


Business rules are the if/then logic that applies your business policies. Automating them
ensures that they'll be followed consistently every time.

In our sample expense reporting scenario, a business rule requires that if an expense
report amount exceeds $10,000, it needs to also be approved by the CFO. By
automating the process, Abhay ensures that no high-dollar expense report will slip by
unnoticed.
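The CFO rule is the kind of if/then logic you'd express with a Condition action in a flow. As a hedged sketch (the threshold and approver names are taken from the scenario; everything else is illustrative):

```python
CFO_APPROVAL_THRESHOLD = 10_000  # assumption: amounts are in dollars

def required_approvers(amount):
    """Return the approvers an expense report needs, in order."""
    approvers = ["manager"]          # illustrative: every report gets a first-line approval
    if amount > CFO_APPROVAL_THRESHOLD:
        approvers.append("cfo")      # high-dollar reports also go to the CFO
    return approvers

print(required_approvers(12_500))  # → ['manager', 'cfo']
```

Because the rule is encoded once, it's applied identically to every report; no high-dollar expense depends on someone remembering the policy.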

Automate repetitive processes


Automating repetitive processes can help your employees avoid mental and physical
burnout. Processes that are done the same way every time should be high on your list to
consider for automation.
For example, Abhay the accountant must collect expense forms and receipts from
everyone. The receipts might be a paper receipt from a restaurant or paper invoices
from vendors. Abhay has to manually scan these papers into a PDF file and store it.
Abhay also needs to enter what's written on the paper and post it in the financial system
for every expense report submitted.

Reduce human error


Tasks like copying and pasting values from one system to another, or keying in data
from paper forms, are processes where human errors can occur.

An example case for the expense reporting scenario is where Abhay needs to reimburse
cash to the employee by looking up the employee banking details, then accessing the
banking system.

Streamline approvals
A different type of error occurs when people forget to perform their tasks. You can set
up automation to remind them to work on the task or process that they've been
assigned.

For example, Lee has submitted an expense report but Nick hasn't responded to the
approval request for some time. An automation can be set up to remind Nick to make a
decision, and even provide a button to respond directly from the reminder.
Gain efficiency in high-volume processes
Another area you may want to automate is high-volume processes: processes that occur
many times a day. High volume is closely tied to repetitiveness, but it's slightly different.
A process might have only a step or two that can be automated; however, if that process
must be done many times, even small improvements can have large impacts.

For example, if the expense reporting scenario holds for 1,000 salespeople, each minute
of improvement would equate to two working days' worth of time saved. Analyzing the
actual impact can be done by using the analytics features.
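The two-working-days figure works out as follows (assuming an 8-hour working day, which the article doesn't state):

```python
# Checking the claim: for 1,000 salespeople, one minute saved each
# is roughly two working days of time.

salespeople = 1_000
minutes_saved_each = 1
hours_per_working_day = 8  # assumption

total_hours = salespeople * minutes_saved_each / 60
working_days = total_hours / hours_per_working_day
print(round(working_days, 1))  # → 2.1
```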

Automate data entry


You might be manually entering data because you have two systems that don't talk to
each other. In our expense reporting scenario, this is where Abhay inputs data into the
accounting system by reentering values from the submitted expense report.
Maximize the use of available resources
Additional good automation candidates are those processes that can be run
independent of human interaction. This type of process isn't as easy to identify, so the
best way is to imagine whether there are processes that can be completed outside your
normal business hours. Such an automation acts as a "multiplier" of your people and
fully uses your other resources (such as PCs).

You might also use automation for processes that take longer than a human would, but
where the delay is tolerable if the automation runs at night, when there's no rush for
the automation to finish. For example, if a person starts their day by processing orders
that came in overnight, you might create automation that processes the orders as soon
as they come in, so that your team can start fulfilling the orders first thing in the
morning.

Increase throughput
Similar to maximizing the use of available resources, automation also helps you increase
the throughput of a particular process. With this type of automation, your current
process can be performed by humans in parallel with the automation.

For example, Abhay the accountant may be the only person processing the expense
reports, and their standard work hours may be 9 AM to 6 PM. By setting up an
automation, you could have Power Automate process the expense reports as well, so
that both Abhay and the automation work through them, resulting in a higher
throughput.
Example scenario
When all of the automation areas are applied, the example below shows how an
expense reporting business process can be covered by Power Automate, with
improvements to throughput, maximized use of available resources, automated data
entry, and streamlined approvals.

Next step: Determining which automation method to use


Determining which automation method
to use
Article • 03/17/2022

After the process design is complete, the next step is architectural design, where you
focus on how you'll automate that process.

First you determine what kind of connector you can use (if any), and then choose a
trigger to start the automation.

Choose an automation method


Ideally, all the systems you want to automate will have Power Automate connectors.
Check the list of connectors to see whether connectors are available for the system you
plan to automate. After you find the connector, make sure that the actions you need are
available for that connector. For example, a connector for an email system will need
actions for "send," "reply," and "delete."

If there are no connectors available, you have the following options to choose from:

Create a custom connector: This is the preferred method of automation if you're a
developer or your organization has a developer who can create custom
connectors. A custom connector allows the automation to interact with the target
system via a published API. This API should be resilient to system changes. More
information: Create a custom connector from scratch

Use the HTTP connector: If you're a developer and have one-off scenarios where
you need to connect to systems that have no connectors available—but you don't
want to set up custom connectors—your next-best method is to use an HTTP
connector. More information from Azure Logic Apps documentation: Add an HTTP
trigger

Create a web browser automation: If you can't find a connector, and if the system
is a web browser–based application or a website, you should consider web browser
automation. Web browser automation mimics keyboard inputs and mouse
movements as if a human were using the browser. You can build a browser
automation process with Power Automate Desktop.

Create a desktop application automation: If you can't find a connector, and if the
system is a desktop application on a PC, this is the automation method to use.
Power Automate has capabilities that mimic human keyboard inputs and mouse
movements. For desktop application automation, you create a new Power
Automate Desktop process with Power Automate Desktop.

The following table compares the different methods.

| Method | Ease of use | Requires a development background? | Easily affected by system changes? | Requires setup or development time? |
| --- | --- | --- | --- | --- |
| Connector | Easiest | No | No | None |
| Custom connector | Easy | Yes | No | Yes |
| HTTP connector | Easy | Yes | No | No |
| Web browser automation | Easy | No, but a basic knowledge of CSS and HTML is preferable | Yes | Yes |
| Desktop application automation | Easy | No | Yes | Yes |

In complex automation scenarios, you can combine all these methods.

Choose a trigger to start the automation


With all the automation methods discussed earlier, you need to consider how to trigger
(start) these automations. The ways you can trigger an automation include:

Automated triggers
Instant or manual triggers
Scheduled triggers

Automated triggers
With an automated trigger, the system automatically starts the automation when a
condition is met. (Note that not all connectors include automated triggers.)

Examples of automated triggers include:

When an email is received in Outlook


When a new file is moved to OneDrive
When a new row is created in Microsoft Dataverse
When an item is modified in a custom SharePoint list

An example use case for the expense report might be to set an automated trigger to
start an approval flow when a new row is created in the Expense Approvals table in
Dataverse. This ensures, for example, that when a form is created with Power Apps,
which creates a new row in Dataverse, an approval flow is automatically triggered.

Instant or manual triggers


An instant or manual trigger is one that a user starts manually, or that's started instantly
on demand. It can be started directly from an instant flow or from a Microsoft service.

Scheduled triggers
Scheduled triggers run at a specific date and time, and are repeated periodically. They're
useful for situations where you need to automate a task that occurs daily, weekly, or
monthly.
In the expense report example, the accounting team might use a scheduled trigger to
send an automated email every Friday when the weekly BI report is ready.

Next step
Attended and unattended scenarios
Attended and unattended scenarios
Article • 02/16/2022

With any of the automation methods you use, the automation is going to be either
attended or unattended.

Attended (human-initiated) scenarios

In these scenarios, the automation is executed when users are in front of their
computers. This is suitable when you want to automate tasks and processes at an
individual level. The automation is often triggered manually whenever the user wants to
run it. The process might require human interaction or decisions between steps.

Unattended (fully automated) scenarios

In these scenarios, a designated computer or a server is set up to run the automation
on behalf of a user. The whole automation process is run fully by Power Automate, and
no interaction or decision is made by a human (with the exception of approval flows, in
which the person doing the approving is considered to be technically a "third party" to
the automation). The automation process can be triggered automatically from another
system or service, or on a schedule.

The following table summarizes the two types of automation scenarios.

| Attended | Unattended |
| --- | --- |
| Requires human interaction or decisions | No human interaction or decisions required |
| Manually triggered | Automatically triggered |
| Sign-in isn't required because the automation assumes that the system is already signed in | Windows sign-in is automated with predefined user credentials |

You can use a combination of attended and unattended automation in your solution.

In the expense report example, the approval process can be automated with unattended
automation. The cash reimbursement process might be better suited to attended
automation, because Abhay might want to check the details of the bank transaction as a
final confirmation.
Next step: Separate flows into smaller automated processes
Separate flows into smaller automated
processes
Article • 02/16/2022

When you're setting up an automation, try to architect your flows so that you don't have
a single automation that covers the entire process. There are several reasons why you
should make multiple, smaller flows:

Maintenance is easier.

Error handling doesn't need to be as sophisticated.

Multiple people can work on the automation.

There's no need to restart the automation from the beginning if a step fails.

In the example below, one automation has been set up for an approval process,
covering multiple processes with a single automation.

If, for example, the cash reimbursement process fails, the whole automation will be
considered a failure. If a requirement or specification for looking up employee banking
details changes, the whole process will have to be suspended until the updates are in
place.

Instead, you can separate the automation into modules, as shown in the following
image.
In this example, Automation #2 depends on the previous automation to set the status of
the expense report to "Compliance check complete." However, if there's a problem with
the mail system and Automation #2 fails, the tasks in Automation #1 will still be
completed. Only tasks in Automation #2 will need to be restarted.

Next step: Authentication and security


Authentication and security
Article • 10/30/2023

Your automation will probably access data and systems that are protected by requiring
users to sign in. Different automation scenarios require Power Automate to use different
authentication methods.

Before you set up your automation, you should ask yourself how you currently sign in to
the systems or computers to do the tasks manually. Below are some examples of
different types of authentication (sign-ins) that can be used when automating with
Power Automate.

To set up the automation, make sure you have the necessary authentication (sign-in)
information ready. You'll need this information when you're making a new connection to
set up your automation, setting up to access data via an on-premises data gateway, or
when using desktop flows.

On-premises data gateway


For desktop and website automation, an on-premises data gateway is required so that
programs that are installed on the on-premises computers (for example, browser
extensions and Power Automate Desktop) can be accessed from the Power Automate
cloud-based service. More information: Install an on-premises data gateway

Authentication by using Microsoft 365 and


Microsoft Entra ID
This is the authentication for any automation that you use with Microsoft services. When
the automation is run, it runs on behalf of the user who's running the automation and
not the user who set up the automation originally.

Authentication by using a username and


password
This type of authentication is used for systems and services that have an independent
system other than what's used with Microsoft 365 and Microsoft Entra ID, with a
separate username and password. Sign-ins for services such as Google, Facebook, and
Twitter all have their own methods for authentication. Some enterprise systems provide
single sign-on (SSO).
In the expense reporting example, the online banking system has its own sign-in ID and
password.

Authentication by using an on-premises system


or Windows sign-in
This type of authentication will be required if you're planning to automate with the
Power Automate Desktop application or desktop flows. It's separate from Microsoft 365
or Microsoft Entra ID. If a computer is connected to a corporate network, it's highly
likely that it uses Windows Server Active Directory.

Authentication by using a shared key


This authentication is usually used for online services and is used for system-to-system
(API) automation where the services are shared across the company. This is typically
provided and set up by your IT department, where the connector is shared with you.

Next step: Defining inputs and outputs


Defining inputs and outputs
Article • 02/16/2022

In any automation, there will be an input and an output. Before you start automating
processes with Power Automate, you need to define what these are.

The following example shows how you can define the inputs and outputs.

In the expense approval scenario, Abhay must take the following steps to reimburse an
applicant who submits an expense form:

1. Abhay receives an approval request for an expense report.

2. Abhay decides whether to approve or decline the request.

3. If the request is approved, Abhay sends an email to the employee to let them
know.

The following table shows the information required in this scenario.

| Information required | Input or output? | Purpose |
| --- | --- | --- |
| Employee's name | Input | To send an email if the expense is approved |
| Employee's email | Input | To send an email if the expense is approved |
| Employee's employee number | Input | To search the employee management system for the banking number |
| Approval result | Output | To send an email if the expense is approved |
| Approver's name | Output | To send an email if the expense is approved |
| Approver's email | Output | To send an email if the expense is approved |
| Approval date and time | Output | To send an email if the expense is approved |

This might look overwhelming, but the majority of the inputs can be retrieved
automatically. For example, the employee's name and email can be retrieved if the
automation is triggered manually by the employee.

Securing inputs and outputs


If you're handling sensitive data such as sign-in IDs, passwords, and banking
information, you can use the secure inputs and outputs feature in Power Automate.

By default in Power Automate, you can see inputs and outputs in the run history for a
flow. When you enable secure inputs and outputs, this data is protected: anyone who
tries to look at the inputs and outputs sees the message "Content not shown due to
security configuration" instead.

Next step: Handling sensitive text


Transforming and formatting data
Article • 02/16/2022

It may well be that some of the data you retrieve from a system needs to be
transformed to be understood by other systems used later in the process. For example,
you might need to convert local time to Coordinated Universal Time (UTC), or convert
one currency to another. To make your data understandable in another system, you can
convert it into a different format. Be sure to take data formats (and format conversions)
into account when you design your process automation architecture.
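The time-zone conversion mentioned above can be sketched outside of Power Automate. This is an illustrative Python sketch using the standard library, not flow syntax; the date and time zones are made up for the example.

```python
# Convert a local timestamp to UTC before handing it to another system.
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Hypothetical expense-report timestamp entered in Tokyo local time (UTC+9).
local = datetime(2022, 2, 16, 9, 30, tzinfo=ZoneInfo("Asia/Tokyo"))
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())  # → 2022-02-16T00:30:00+00:00
```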

Here are some of the ways you can transform and format your data.

Built-in actions
You can use built-in actions to convert values and strings to different formats.

Expressions
Expressions are Excel-like formulas that you can use to convert and manipulate data.
The available functions fall into the categories listed below:

String functions

Collection functions

Logical comparison functions

Conversion functions

Implicit data type conversions

Math functions

Date and time functions

Workflow functions

URI parsing functions

Manipulation functions: JSON & XML


For the full list, go to Reference guide to using functions in expressions for Azure Logic
Apps and Power Automate.
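A few expressions from those categories, as they'd appear in a flow (the time-zone names and inputs here are illustrative):

```
concat('Expense report from ', 'Abhay')
formatDateTime(utcNow(), 'yyyy-MM-dd')
convertTimeZone(utcNow(), 'UTC', 'Tokyo Standard Time')
int('10000')
```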

Next step: Formalizing messages and alerts


Formalizing messages and alerts
Article • 02/16/2022

When sending email notifications and Microsoft Teams messages, sometimes you might
not want to send them as yourself but you still want to ensure that people who receive
them can reach you.

Use shared senders


Sending messages from Power Automate as yourself is fine in small cases, but as the
process gets more formalized, we recommend sending messages as a shared sender.
This helps recipients know the message was sent via automation rather than as a
personal nag. It has the added benefit that people won't try to bug you directly in response to
something that's meant to be purely informational. For the Microsoft Teams connector,
we have a few "Post as the Flow bot" actions that are well suited to this. Outlook has a
"Send an email from a shared mailbox" action, though you'll need to bring your own
mailbox. This advice also applies to updating tickets, creating records, and so on, but the
specifics will vary by connector.

Add a signature
When using automation to send emails and post messages, you want people to know
where they're coming from. A shared mailbox helps the recipient realize that the
message isn't coming from you directly. However, in case the automation breaks or
starts triggering too quickly, you should be easily reachable to correct the problem. This
is especially important if the automation works with people outside your organization or
in external systems where the recipient might not be aware of your flow. People might
even want to contact you to suggest ways to improve your flow! We use a signature like
"Sent with Power Automate. Contact <your email> with questions." You might also find
it helpful to link to the specific flow so that you can find it quickly if someone forwards
you the email.

Next step: Reducing risk and planning for error handling


Reducing risk and planning for error
handling
Article • 02/16/2022

Always assume your automation can fail.

No system is perfect. When you're designing your first set of automated processes, it's
easy to forget the importance of designing for when things fail to work correctly.

You should always design your automation so that there's a plan B—to make sure your
business process can continue even if the automation doesn't work. This isn't to suggest
that Power Automate is an unreliable system, but connecting with different systems
increases the risk of failure, which can be caused by reasons unrelated to Power
Automate.

In general, you should consider using connectors whenever possible because they're
more robust and aren't as fragile or easily affected by screen design changes as web and
desktop application automation. If no connectors are available, but you do have web
APIs or other methods of system integration, you should consult your IT pro or
development teams to help you set up custom connectors.

Possible failures with automation by using connectors

Shutdown of connecting systems due to maintenance

System unavailability due to software bugs

Changes to how systems are connected (API versions change)

Possible failures with web application automation

Screen design changes (so the bot can't tell how to proceed)

System unavailability due to regression

Possible failures with desktop application automation

Screen design changes (so the bot can't tell how to proceed)

Operating system updates

System unavailability due to regression

Possible failures common with any automation


Changes to passwords

Momentary network issues

Retry policy
You can use this feature of Power Automate to set up policies that will automatically
retry an action if it fails. By default, this is set to retry four times, but you can change it if
you need to.
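In Power Automate you set the retry policy declaratively per action; the logic it applies is roughly the following. This is an illustrative sketch, and both the helper function and the failing action are hypothetical.

```python
# Try an action, then retry up to `retries` more times after a failure,
# waiting between attempts (here with exponential backoff).
import time

def run_with_retries(action, retries=4, first_interval=1.0):
    for attempt in range(retries + 1):
        try:
            return action()
        except Exception:
            if attempt == retries:
                raise                                    # out of retries: surface the failure
            time.sleep(first_interval * (2 ** attempt))  # back off before retrying

calls = {"count": 0}
def flaky_action():
    """Hypothetical action that fails twice, then succeeds."""
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky_action, first_interval=0.01))  # → ok
```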

Set up custom failure notifications


If actions still fail, standard capabilities in Power Automate notify the owners of the
automation with a message similar to the following image.
However, if you'd like to send a custom notification, you can set it up by adding actions
that run only if the previous steps have failed.
By default, each action runs only if the previous step was successful. You can change this
behavior by setting an action to run only when the previous step failed, so that, for
example, an email is sent to a custom list of recipients after a failed action.
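In the workflow definition behind a cloud flow, this "run after" setting corresponds to a runAfter property on the action. A hedged sketch of what that fragment looks like (the action names are hypothetical):

```json
{
  "Send_failure_email": {
    "type": "ApiConnection",
    "runAfter": {
      "Process_expense_report": [ "Failed", "TimedOut" ]
    }
  }
}
```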

Assign multiple owners


Having a single owner for a particular automation can be a risk from an organizational
and administrative perspective. If that owner is absent or away from the office when a
problem occurs, no one else can fix the issue. You can prevent this by setting up
multiple users or groups as owners, to make sure more than one person can edit the
automation. More information: Share a flow

Reduce risk and increase throughput by setting


up a cluster
For a business-critical automation, one of the ways to reduce failures and risks is by
setting up a cluster. A cluster is a group of computers that you can use to run your
automation. Power Automate provides clustering capabilities to run the automation
concurrently. This is particularly useful for unattended scenarios, where you have more
than a single computer available to run your automation.

Next step: Adding analytical data to Microsoft Dataverse


Adding analytical data to Microsoft
Dataverse
Article • 02/16/2022

To identify bottlenecks in an automation, you can set up actions within the automation
that log the start time of each activity or step. You can do this by creating a table that
records the step name, start time, and end time.

This way, you can keep track of how long it took for each end-to-end run of automation
to be completed, and possibly find ways to make your automation even better.

If you store this data in Dataverse, you can use Power BI to identify which part of the
process took the longest time to complete.
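The timing records the article describes are simple rows of step name, start time, and end time. A minimal sketch (the "table" here is a plain list standing in for a Dataverse table, and all names are illustrative):

```python
from datetime import datetime, timezone

timing_log = []  # stand-in for a Dataverse table

def log_step(step_name, start, end):
    """Record one step's timings, as a row you could write to Dataverse."""
    timing_log.append({
        "step": step_name,
        "start": start.isoformat(),
        "end": end.isoformat(),
        "seconds": (end - start).total_seconds(),
    })

start = datetime(2022, 2, 16, 9, 0, tzinfo=timezone.utc)
end = datetime(2022, 2, 16, 9, 0, 42, tzinfo=timezone.utc)
log_step("Compliance check", start, end)
print(timing_log[0]["seconds"])  # → 42.0
```

With rows like these accumulated over many runs, a Power BI report can surface which step consistently takes the longest.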

Next step: Decision-making flowchart for your design


Decision-making flowchart for your
design
Article • 02/16/2022

Based on the design considerations mentioned previously in this section, this flowchart
can help you determine how to architect your automation.

Next step: Making phase


Making phase
Article • 02/09/2023

You have now planned and designed how you'd like to automate your process. The next
step is to set up the automation.

Basic steps for cloud flows


1. Create a cloud flow in Power Automate .

2. Specify an event or trigger to start the flow with connectors. Based on the way you
want to trigger the automation, you'll specify what event you want to use to
trigger your Power Automate flow.

3. Add actions and conditions.

More information: Create a cloud flow in Power Automate, Create a flow from a
template in Power Automate

Basic steps for desktop flows


1. Create a new desktop flow in Power Automate Desktop.

2. Add actions and set up configuration.

3. Define inputs and outputs.

4. Test the desktop flow you created.

5. If needed, set up triggers and links between cloud flows and Power Automate
Desktop.

Next step: Testing strategy


Testing strategy
Article • 02/16/2022

After you've finished making your flows and automation, the next step is to test them.
You should consider testing all possible patterns and outcomes of your flows, because a
flow might not simply fail outright; it might run but produce unexpected results.
Testing all patterns reduces this risk.

If you're new to building flows in Power Automate, testing the automation each time
you add a new step is the best way to ensure that you catch mistakes, rather than
attempting to build the entire flow and then testing it.

Let's take a look at the example shown in the following illustration.

We recommend that you record your results in the Actual result column in a table like
the following, to make sure you've covered all possible combinations that might fail.

| Case No. | Step details | Condition | Expected result | Actual result |
| --- | --- | --- | --- | --- |
| 1-1 | Check whether report meets compliance | Compliance met | Status updated to "Compliance check complete" | |
| 1-2 | Check whether report meets compliance | Compliance not met | Email sent to employee to fix the expense report | |
| 1-3 | Check whether report meets compliance | Compliance check fails | Notified flow maker and logged failure to the "flow runs" feature | |
| 2 | Status updated to "Compliance check complete" | Status update fails | Notified flow maker and logged failure to the "flow runs" feature | |
| 3 | Email sent to employee to fix the expense report | Email send fails | Notified flow maker and logged failure to the "flow runs" feature | |

 Tip

To simulate email send failures, try sending a test email to a nonexistent address.
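Because the test matrix is just data, it can help to keep it in machine-readable form and check that every case has a recorded actual result before you call testing complete. This is a sketch with hypothetical field names, not a Power Automate feature.

```python
# Each entry mirrors the columns of the test table: case, condition, expected, actual.
test_cases = [
    {"case": "1-1", "condition": "Compliance met", "expected": "Status updated", "actual": None},
    {"case": "1-2", "condition": "Compliance not met", "expected": "Email sent", "actual": None},
    {"case": "1-3", "condition": "Compliance check fails", "expected": "Maker notified", "actual": None},
]

def untested(cases):
    # Return the case numbers that still have no recorded actual result.
    return [c["case"] for c in cases if c["actual"] is None]

test_cases[0]["actual"] = "Status updated"  # record a result after running case 1-1
print(untested(test_cases))  # → ['1-2', '1-3']
```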

Testing in "live only" environments


Ideally, all tests should be done in test environments. However, there might be
situations where you don't have an environment to test separately from live systems. In
these cases, you can use the following methods:

For lookups: Use static text as the result to mimic a lookup.

For data entry: Create a step to make a new record, followed by another flow to
delete the same record.

For sending data: If possible, set up a test environment on the system you want to
send the data to.

Testing with users


After you've completed the systematic tests, you should also run a final check with your
users (ideally the same people who were working on the process prior to the
automation). This helps ensure your automation does what you expect and presents
consistent outcomes.

Next step: Tools to test your automation


Tools to test your automation
Article • 02/16/2022

This article describes the tools you can use to check your flows for basic errors and
detect errors that occur when the automation runs.

Flow Checker
This tool checks for issues and errors in the automation you've created. After you feel
that you've completed setting up your automation, run the Flow Checker to see if you've
made any mistakes. More information: Find and fix errors with Flow Checker

Repair tips
If your automation fails, repair tips are automatically sent to whoever created or owns
the automation. These tips contain actionable information about the failure. More
information: Troubleshooting a flow

Custom error notifications


If repair tips don't meet your need for error notifications—for example, if you need to
inform more people than just the owner about any failures—you can set up custom
error notifications by setting an action that runs only when the previous step has failed.

In the example below, when the Get manager (V2) action fails to run, the Send an
email notification (V3) action is executed.
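Under the hood, a cloud flow stores this "run after" configuration in its workflow definition. The fragment below is a simplified sketch using the action names from the example above; surrounding properties of the definition are omitted.

```json
{
  "Send_an_email_notification_(V3)": {
    "type": "OpenApiConnection",
    "runAfter": {
      "Get_manager_(V2)": [ "Failed" ]
    }
  }
}
```

Changing the values in the `runAfter` list (for example, adding "TimedOut") is what the designer does when you select additional conditions in the "Configure run after" settings.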
Next step: Deploying and refining phase
Deploying and refining phase
Article • 02/16/2022

When you're deploying an automation, it's important to consider how you'll replace the
current business process with the new automation, to avoid disrupting the business.

Deploying your automation to production

Add redundant owners


If you have a flow that's used by your entire team, you don't want people calling you up
while you're on vacation if it breaks. Make sure you add a couple of co-owners so that
they can update the flow in your absence. Who you add depends on how the flow is
used. Probably you'll at least want to add your direct manager, who can act as your
proxy, and maybe your manager's manager if all their reports are relying on your
automation. If your group is large enough, or if you have a lot of team flows, you might
consider creating a security group of two or three people who are willing to set aside a
small amount of time to keep an eye on all the team flows. Don't add your entire
organization as a co-owner, though; that just invites more people to mess up the flow. If
your company has a Center of Excellence for Microsoft Power Platform, they might have
guidelines for flow ownership.

Keep in mind that the access applies not just to the flows themselves but to the
connections they use. For example, if your flow sends mail from a shared mailbox, make
sure that the co-owners have access to that mailbox in case they need to re-create the
connection.

Use solutions
Solutions are a great way to organize flows to manage versions and migrate from one
environment to another. You'll need to start by adding (or asking your admin to add) a
Microsoft Dataverse database to your environment. After that's done, you can go to the
Solutions tab to create a new solution for your team, or you can create multiple
solutions if you have a lot of flows that you'd like to further organize. There are a
number of other benefits too, such as native storage for your data, child flows to reuse
functionality, and solution export as a backup. Solutions do have some known
limitations, though, so this might not apply to all your flows.

Mark it as production
Solutions are the recommended way to organize flows, but sometimes your flow can't
go in a solution, or sometimes your solution will get crowded with other drafts and
proofs of concept. Either way, we recommend prefixing the names of your production
flows with "[PROD]" so that co-owners know to leave it be unless it has issues.

Deploy the automation in stages


To make sure your deployment is successful, you should consider taking the following
approach:

1. Use the automation with a small number of people.

2. Check that there are no issues for those people.

3. Have the remaining people start using the automation.

Next step: Assessing the business impact of the automation


Assessing the business impact of the
automation
Article • 02/16/2022

After you've successfully deployed your automation, you can assess its impact by
comparing your original business process against your new process, using the success
metrics you defined. Your automation results are stored for 30 days, during which you
can view them to analyze the total number of actions and runs in a day.

To view flow analytics

1. Go to My flows.

2. Select the flow you want to analyze.

3. Select Analytics.

4. Select the Usage tab.


Get the number of flow runs
The Usage tab shows you how many times in a day the flow has been used to automate
a particular business process.

To export run data to an Excel workbook

1. Hover over the graph that shows the data you want to export.

2. Select More Options.

3. Select Export data.

4. Select the file format you want (Excel workbook or CSV).

5. Select Export.
Get the flow run duration
To see the duration of each flow run

1. Go to My flows.

2. Select the flow you want to analyze.

3. In the 28-day run history, select All runs.

4. If you want to export the information, select Get .csv file.

The CSV file includes the start time and end time of each run. You can use Excel to
recalculate the duration and do additional analysis (such as finding the average
duration). To get the duration in seconds, use the following formula:

(Run end time cell − Run start time cell) × 24 × 60 × 60

To get the average duration, total the durations and divide by the number of runs (the
number of rows).

Now that you have the number of runs and the duration of each run, you can find out
how much time your automation has saved by comparing it with the previous manual
process.
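The same calculation can be scripted instead of done in Excel. This sketch assumes a CSV with "Start time" and "End time" columns in ISO format; the actual export's column names and date format may differ, so adjust accordingly.

```python
import csv, io
from datetime import datetime
from statistics import mean

# A stand-in for the exported run-history CSV.
sample_csv = """Start time,End time
2022-02-16T09:00:00,2022-02-16T09:00:40
2022-02-16T09:05:00,2022-02-16T09:05:50
"""

def run_durations(csv_text):
    # Equivalent of the Excel formula: (end − start) expressed in seconds.
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        (datetime.fromisoformat(r["End time"])
         - datetime.fromisoformat(r["Start time"])).total_seconds()
        for r in rows
    ]

durations = run_durations(sample_csv)
print(durations)        # → [40.0, 50.0]
print(mean(durations))  # → 45.0
```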

Example scenario
Let's take the scenario from cash reimbursement as an example.

Previously, Abhay's accounting team had to key in accounting codes, enter transactions,
and post transactions in the accounting system based on the information they received
in the approval email attached to the expense report. Let's say that in the automation
planning phase, Abhay measured how long this takes and recorded that it took three
minutes to look up the employee's banking details and another five minutes to
reimburse the applicant from the online banking website.

Based on the flow run analytics, Abhay can see that the automation ran between 91
and 110 times a day, for an average of 107 runs.
The duration of the automation obtained is 40 seconds on average. Therefore, the time
reduced per run is:

Time before automation (3 minutes + 5 minutes) − time after automation (40 seconds)
= 440 seconds

Because the automation ran 3,226 times in 30 days, total time saved is:

Reduced time (440 seconds) × number of runs (3,226 times) = 1,419,440 seconds =
394.28 hours
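Abhay's arithmetic can be checked in a few lines; every figure below is taken directly from the scenario above.

```python
manual_seconds = (3 + 5) * 60   # lookup (3 min) + reimbursement (5 min) before automation
automated_seconds = 40          # average flow run duration
runs_in_30_days = 3226

saved_per_run = manual_seconds - automated_seconds
total_saved_seconds = saved_per_run * runs_in_30_days

print(saved_per_run)                           # → 440
print(total_saved_seconds)                     # → 1419440
print(round(total_saved_seconds / 3600, 1))    # → 394.3
```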

Next step: Diagnosing performance issues


Diagnosing performance issues
Article • 02/16/2022

If you're experiencing delays or slowdowns during your flow execution, it might be that
you've reached your Power Automate limits for the day. More information: Request
limits and allocations

Action analytics
To review whether your automation has reached its limits, you can use action analytics
to get better insight into how much you're using your actions.

To see action analytics

1. Go to My flows.

2. Select the flow you want analytics for.

3. Select Analytics.

4. Select the Actions tab.

When your automation has slowed down, it's a good idea to revisit your flow design and
check for additional efficiency that can help reduce the number of actions being
executed.

For flows that are consistently getting delayed due to overages, flow owners also receive
a notification informing them about these overages with tips and tricks about how to
keep flow run execution from being delayed.

The following image shows an example of an email that was sent for a flow that was
consistently running up against action limits.

Limits from connected services


Similar to Power Automate, most web services and apps also tend to implement service
protection limits and abuse-detection algorithms.

A misconfigured flow can sometimes reach these limits, which usually manifest as
HTTP 429 (throttling) errors or 5xx timeouts in your flow runs. It's important to note
that these limits vary based on the connector or service you're using within your flow.
Introduction to SAP GUI–based RPA in
Power Automate Desktop
Article • 02/16/2022

Robotic process automation (RPA) enables you to automate mundane, rules-based
tasks. With RPA, you can automate legacy software without APIs, which opens the world
of automation to include software that's old or new, on-premises or in the cloud.

Executives who implemented RPA in their organizations have experienced the positive
impact it brings. Increasing the level of automation is a top strategic priority at most
organizations.

Many of these organizations use SAP to manage their finance, supply chain,
production, and human resources processes. They're looking for ways to automate their
most frequent, mundane, and rules-based tasks. That's exactly what we'll be focusing on
in this playbook: SAP GUI automation patterns and best practices using Microsoft Power
Automate, Power Automate Desktop, and desktop flows.

Here's an introductory video for the series of automating SAP GUI-based applications
with Power Automate Desktop:
https://www.microsoft.com/en-us/videoplayer/embed/RWJE53?postJsllMsg=true

Lifecycle of a typical enterprise RPA bot


Power Automate empowers everyone to automate while providing security, compliance,
and control over the usage and execution of automation across the IT ecosystem,
whether on-premises or in the cloud.

This playbook takes you through prototyping the automation of an example SAP
scenario. However, it's important to understand that building sophisticated, robust, and
impactful RPA solutions that span multiple legacy systems takes time. And, as shown in
the following image, most of this time is spent on production readiness, including
advanced retry and exception handling.
Next step: Prerequisites for automating RPA SAP GUI
Prerequisites for automating SAP GUI-
based workloads
Article • 08/01/2023

The following prerequisites need to be met before you can start automating your SAP
GUI–based workloads.

License requirements
To build RPA solutions with Power Automate, you'll need one or more of the following
licenses or add-ons:

Power Automate Premium (or trial, previously Power Automate per user plan with
attended RPA)

Power Automate Process plan (previously Power Automate per flow and Power
Automate unattended RPA add-on)

Software requirements
Before you can use your device to create desktop flows and Power Automate Desktop
processes, you'll need to ensure that it meets the requirements outlined in Set up Power
Automate Desktop.

The following software components are required on Windows 10 Pro devices:

The latest version of .NET Framework (a reboot might be required)

The latest version of desktop flows, which includes Power Automate Desktop and
browser extensions (make sure you've enabled the browser extensions)

Microsoft Edge or Google Chrome browser

On-premises data gateway (make sure the data gateway region matches your
environment's region), or the latest direct machine connectivity option

SAP GUI for Windows (ask your administrator for details).

SAP GUI scripting configuration


Before you can use the SAP scripting engine, configure or confirm the following:
1. Enable SAP scripting.

a. Open SAP GUI.

b. Open transaction RZ11.

c. Enter sapgui/user_scripting into the Parameter Name field.

d. Select Display.

e. Confirm that under Value of Profile Parameter sapgui/user_scripting, Current
Value is set to TRUE. If it's FALSE, select Change Value, enter TRUE in the New
Value field, and then select Save.

f. Confirm with the SAP team that S_SCR authorization is assigned to all
automating users.

7 Note

After you've changed the value, you might get a warning that says,
"Change not permanent, will be lost at server restart." To avoid this issue,
make the configuration permanent on the server side by using transaction
RZ10 instead. You'll need to restart the SAP server for these settings to take
effect.

2. Open SAP GUI Options, go to Accessibility & Scripting > Accessibility > Use
accessibility mode, and then select any other checkboxes that you need.

3. Open SAP GUI Options > Accessibility & Scripting > Scripting, and under User
Settings, select Enable scripting. Clear all other options.

4. On the SAP GUI Options screen, go to Security > Security Settings, and under
Security Module, select Open Security Configuration. Change the Default Action
to meet your specific requirements, and then select Ok.

 Tip

You can select Allow as Default Action to avoid a security dialog appearing
during file save operations.

5. Gather use-case reference test data: Search for an active employee in your SAP
system and make a note of their Personnel number. Also make a note of a valid
Info subtype (for example, 2 = Temporary address).
7 Note

The address format we're using in the sample use case is based on US
requirements. Depending on your requirements, the field list and mandatory
fields might be different, so make sure you select controls that are relevant to
your setup.

6. Close all SAP sessions and windows.

Azure Key Vault credentials (optional)


Although this configuration step isn't mandatory for creating and running desktop
flows, we highly recommend that you use Azure Key Vault as a central cloud
repository for your secure strings, such as SAP passwords and SAP usernames. For the
scenario in this playbook, we've created four use-case–specific secrets on Key Vault.
We'll use these secrets later to pass to our desktop flow as secure inputs. More
information: Key Vault

Next step: Core components for Power Automate RPA SAP GUI automation
Core components for Power Automate
RPA SAP GUI automation
Article • 09/08/2022

Here are the four components needed to automate SAP GUI with Power Automate:

Power Automate
Desktop flows
Power Automate Desktop
On-premises data gateway

Power Automate
Let's start with the core platform service called Power Automate. Power Automate is an
enterprise service that helps you create automated workflows by using your favorite
apps and services to synchronize files, get notifications, collect data, and more. Learn
more in Get started with Power Automate and in our learning catalog.

The Power Automate designer is shown in the following image.


Desktop flows
Desktop flows bring robotic process automation (RPA) capabilities to Power Automate.
You can use desktop flows to automate repetitive tasks in Windows and web
applications. A desktop flow can record and play back UI actions (such as clicks and
keyboard input) for applications that don't have easy-to-use or complete APIs available.

A list of desktop flows is shown in the following image.


Power Automate Desktop
It's quicker and easier than ever to automate with the new, intuitive Power Automate
Desktop. You can use Power Automate Desktop to automate legacy applications, such as
terminal emulators, and to interact with modern web and desktop applications, files,
folders, and more.

Power Automate Desktop broadens the existing RPA capabilities in Power Automate. In
conjunction with desktop flows, all repetitive desktop processes can be automated.

You can use prebuilt drag-and-drop actions or record your own desktop flows to run later.

The following image shows an example of the Power Automate Desktop console,
showing desktop flows to which an individual has access.

The following image shows an example of the Power Automate Desktop designer, from
which you can create desktop flows.
On-premises data gateway
The on-premises data gateway acts as a bridge. It provides quick and secure data
transfer between on-premises data (data that isn't in the cloud) and several Microsoft
cloud services. These cloud services include Power BI, Power Apps, Power Automate,
Azure Analysis Services, and Azure Logic Apps.

By using a gateway, organizations can keep databases and other data sources on-
premises, while securely using that on-premises data in cloud services.

Direct machine connectivity


In addition to the on-premises data gateway, there is a new connectivity option that
allows physical or virtual machines to directly connect to Power Automate. When you
connect your machine to Power Automate, you can instantly start your desktop
automation with any of the wide array of available triggers.

Learn more about direct machine connectivity.

Next step: Sample SAP GUI automation for this tutorial


Sample automation of SAP GUI
Article • 02/16/2022

We've provided the following simplified example, which we'll use as the base for our
automation tutorials.

Let's say your organization doesn't have employee self-service functionality, but you
want to allow employees to add a second address to their personnel profile by using a
flow that they manually trigger.

7 Note

The following procedure was developed as an example for this playbook. Your HR
department can provide you with the exact steps for you to follow to add the
second address to SAP.

These are the steps:

1. Enter the transaction code PA30, and then select Enter.

2. Select or enter the Personnel number for the employee.

3. On the Basic personal data tab, do the following:


a. For Infotype, enter 0006 (= Addresses).
b. For STy, enter 2 (= Temporary Address).
c. For From, enter a date value.

4. Select the Create (F5) button on the toolbar.


5. In the opened Create Addresses form, provide all relevant address fields, such as
Address line 1, City, ZipCode, and State.

6. Select Save (Ctrl+S) on the toolbar.

7. On the Maintain HR Master Data form, select Back (F3) to return to the starting
point of your process recording.

8. Optionally, sign out of SAP and close all SAP windows.

Next step: Pro-code RPA with SAP GUI in Power Automate Desktop
Pro-code RPA with SAP GUI in Power
Automate Desktop
Article • 02/16/2022

The VBScript-based approach is well-suited for RPA Center of Excellence (CoE) teams
because they typically consist of a mix of IT pros, pro developers, security specialists,
and business analysts whose charter is to create, maintain, secure, and scale enterprise
automation solutions across the organization. With these diverse skill sets, they can
pursue more complex SAP GUI automation techniques than those that citizen RPA
developers undertake.

One of these techniques involves using VBScript to interact with the underlying SAP GUI
Scripting API . In fact, SAP includes its own proprietary SAP GUI automation engine
that generates VBScript output that's based on user interactions that are captured
during screen recording.

The good news is that you can use the VBScript that the SAP GUI automation engine
generates in a Power Automate Desktop action. To use it, all you have to do is replace
the manually entered text that was captured during recording with dynamic inputs in
Power Automate Desktop.

Take a look at this video (episode 8 of the series) to learn more about the pro-code
approach for automating SAP GUI-based applications:
https://www.microsoft.com/en-us/videoplayer/embed/RWJZLc?postJsllMsg=true

Next step: Record VBScript with the SAP GUI automation engine
Use SAP GUI automation engine to
record VBScript
Article • 02/16/2022

1. Confirm that all SAP GUI scripting configurations are done.

2. Open SAP Logon, and then select the SAP system to which you want to sign in.

3. Select Customize Local Layout (Alt+F12), and then select Script Recording and
Playback.

4. Select More.

5. Under Save To, provide the path and file name where you want to store the
captured user interactions.
6. Select Record Script to start the screen-capture process.
Every interaction you perform in SAP is captured as repeatable VBScript commands.

7. Follow the steps outlined in Sample SAP GUI automation for this tutorial to
produce a recording.

8. Select the Record and Playback dialog, select Stop Recording, and then close the
dialog.

9. Close all SAP windows now, if you wish.

SAP's scripting engine records each interaction as VBScript commands and saves
them to the output file you provided. Open the file in your code editor of choice to
examine its contents.

Next step: Review the generated code


Review the generated code
Article • 02/16/2022

The generated VBScript code is shown in the following image.

The following image shows the anatomy of an SAP GUI automation script.

Next step: Add variables to your VBScript


Add variables to your VBScript
Article • 02/16/2022

In this step of the RPA Playbook for SAP GUI Automation with Power Automate tutorial,
before we switch over to Power Automate Desktop, let's review all the hard-coded value
references in your VBScript and decide which ones to replace with dynamic input
variables.

Identify hard-coded values


Variables are used within desktop flow processes to store data for further processing,
and their names are enclosed within percentage signs, %. Almost every action receives
at least one variable as input or produces a variable as output. Every variable has a
unique name. Variable names can contain letters, numbers, and underscore ( _ )
characters, and aren't case-sensitive.

Some variable naming examples include:

%NewVar%

%file_path%

%Street%

The following image shows an example of replacing a hard-coded value with a variable.

Your script should look like the following after you've introduced variables.
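Mechanically, the substitution that Power Automate Desktop performs at run time can be sketched as follows. The `findById` control path is a typical SAP GUI control ID shown only as an illustration, and the substitution logic is a simplified stand-in for what the Run VBScript action does with `%Variable%` tokens.

```python
import re

# One line of recorded VBScript with a hard-coded value...
recorded = 'session.findById("wnd[0]/usr/txtP0006-STRAS").text = "123 Main Street"'

# ...and the same line after the hard-coded value is swapped for a flow variable.
template = 'session.findById("wnd[0]/usr/txtP0006-STRAS").text = "%Street%"'

def fill(template, variables):
    # Replace every %Name% token with its value, mimicking how Power Automate
    # Desktop resolves input variables before the Run VBScript action executes.
    return re.sub(r"%(\w+)%", lambda m: variables[m.group(1)], template)

print(fill(template, {"Street": "123 Main Street"}) == recorded)  # → True
```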
Next step: Create a desktop flow that connects to SAP
Create a desktop flow that connects to
SAP
Article • 02/16/2022

In this section, we'll create a new flow with Power Automate Desktop and use the
previously prepared VBScript code in a Run VBScript action within the flow.

1. Select New flow in Power Automate Desktop.

2. Enter a name for the flow, and then select Create.

3. Select the Power Automate Desktop designer window, and then select Variables.

4. Select the plus sign (+), and then select Input to create several input variables.
You'll pass these variables into the flow from a desktop flow.
5. First, we'll create a few technical SAP variables, which will be needed in almost all
SAP-based automation flows. For each variable, enter the Variable name, External
name, and Description, and then select Update:

SAPPassword

SAPUser

SAPClient

SAPSystemId
6. Create the following use-case–specific variables:

EmployeeId

AddressType

EffectiveDate

Street

City

State

ZipCode

CountryCode
7. In the Actions pane, search for Run application and then drag it onto the design
canvas to create our first process action.
8. Enter the following information in the parameter list, and then select Save:

Application Path: C:\Program Files (x86)\SAP\FrontEnd\SapGui\sapshcut.exe

Command line arguments: start -system=%SAPSystemId%
-client=%SAPClient% -user=%SAPUser% -pw=%SAPPassword% -maxgui

Window style: Maximized

After application launch: Wait for application to complete

9. Now search for the Wait action, drag it onto the design canvas, enter a wait time
of 10 seconds in the Duration field, and then select Save.
10. Search for the Run VBScript action, drag it onto the design canvas, and paste the
previously generated and optimized VBScript into the VBScript to run field, and
then select Save.

11. Open the SAP Logon 760 (your version might differ) application, connect to an
SAP system, and then sign in to SAP Easy Access.

12. In Power Automate Desktop, select the UI elements icon on the right pane, expand
Add UI element, and then select Add a screen.
13. Bring SAP Easy Access to the foreground, and then hover over the outermost
frame of the SAP Easy Access window until a red border labeled Window appears.
While the border is active, hold down Ctrl, and then click to select the window.

14. Repeat steps 12 and 13 for the SAP Logon 760 (your version might differ) window.
You should now see the following in the UI elements pane.
15. Search for the Close Window action, drag it onto the design canvas, and then in
the Window dropdown menu, select Window 'SAP Easy Access'.

16. Repeat step 15, but this time select Window 'SAP Logon 760'.

17. Select Save. Your authoring canvas should look like this now.

That's it! You've just configured your first SAP GUI automation desktop flow with Power
Automate Desktop. In the next step, we'll set up a cloud flow that provides the input
variables based on the employee's request.
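Before moving on, it's worth seeing how the Run application action's command line arguments resolve: they're a template that the flow fills from your input variables before launching sapshcut.exe. The sketch below reproduces that substitution; the system ID, client, and user values are placeholders, not real credentials.

```python
args_template = (
    "start -system=%SAPSystemId% -client=%SAPClient% "
    "-user=%SAPUser% -pw=%SAPPassword% -maxgui"
)

def resolve(template, values):
    # Substitute each %Variable% the way the flow does before running sapshcut.exe.
    for name, value in values.items():
        template = template.replace(f"%{name}%", value)
    return template

cmd = resolve(args_template, {
    "SAPSystemId": "NPL", "SAPClient": "001",
    "SAPUser": "DEMOUSER", "SAPPassword": "***",
})
print(cmd)  # → start -system=NPL -client=001 -user=DEMOUSER -pw=*** -maxgui
```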

Next step: Create the cloud flow with the Power Automate portal
Create the cloud flow with the Power
Automate
Article • 02/09/2023

You can close both Power Automate Desktop windows and then go to the Power
Automate designer . Here, we'll create a Power Automate cloud flow that calls our
desktop flow by using secure input parameters from the cloud.

7 Note

This cloud flow is designed as a "happy path," which means it has no exception
handling, scoping, or try-catch-finally patterns. You can find a more resilient design
approach in the Low-code RPA with SAP GUI in Power Automate Desktop section.

1. Go to Power Automate , sign in, and confirm that you're in the same Microsoft
Dataverse environment as the one you were in when you created the previous flow
in Power Automate Desktop.

2. On the left pane, select My flows, select New flow, and then select Instant cloud
flow.
3. In the Build an instant flow dialog, enter a flow name, select the Manually trigger
a flow trigger from the list, and then select Create.

4. This opens the designer, and it should look similar to the following image.
) Important

The next steps will involve configuring action components. To securely pass
parameters into our desktop flow, we'll be following the optional, but
recommended, approach of using Azure Key Vault secrets.

7 Note

If you don't have access to Key Vault, you can skip the following steps that
show you how to configure Key Vault and provide your credentials and other
parameters later as clear text. Microsoft does not recommend that you use
clear text credentials in production environments.

5. Select New step.

6. Search for Azure Key Vault Get secret, and then select Get secret action.
7. If this is the first time you've configured Key Vault in a flow, you'll get a prompt to
set up a connection. You can choose to connect either through user credentials or
a service principal account (which we recommend for production scenarios).

Establishing a connection with user credentials

Establishing a connection with service principal


8. After you configure the connection, select the appropriate Name of the secret
from the list.
9. Select More (...), and then select Settings.

10. Turn on Secure Inputs and Secure Outputs, and then select Done.
7 Note

These settings will hide sensitive text from the run flow history.
11. Select ..., and then select Rename to enter a more specific action name—for
example, Get SAP User.

12. Select ..., and then select Copy to my clipboard.

13. Select New step.

14. Select My Clipboard, and then select the name of the previously copied action—in
our example, Get SAP User.

15. Select ..., and then select Rename. Enter a more specific action name—for example,
Get SAP Password.

16. Repeat steps 14 through 18 for all other Key Vault–based secrets.
17. Select New step.

18. Search for Desktop flows, and then select Run a flow built with Power Automate
Desktop.
19. If this is the first time you've used the desktop flow action, you'll be prompted to
create a connection. Provide your Gateway name, Domain and username
(DOMAIN\User), and Password.

20. After the connection is established, select the previously created desktop flow.
21. Select Attended – Runs when you're signed in as Run Mode.

22. Enter each parameter field and either select the appropriate data from the
Dynamic content pop-up window (such as SAP System Id, SAP Client, SAP User,
and SAP Password) or manually enter placeholder data (for all the other
parameters in the list). If you aren't using the Key Vault option for your SAP
credentials, enter them manually and in free text instead.
23. After you've renamed the action to something meaningful and supplied all
parameters, the action should look like this.
Test the flow
1. Close all SAP windows.

2. Select Save, and then wait until the flow is saved.

3. Select Test, and then under Test Flow, select Test again.

4. Authorize any connections, if needed, and then select Continue.


5. Select Run flow.

6. Select Done.

Important

Don't interact with your mouse or keyboard until the process completes.

7. Watch the desktop flow execution.


Congratulations, you've just created and launched a secure desktop flow from the cloud
to add an employee address to SAP.

Next step: Validate data with the SAP ERP connector


Validate data with the SAP ERP
connector
Article • 02/16/2022

Important

This is an optional step.

This optional topic covers the certified SAP ERP connector, which allows
automation developers to connect to SAP through Business APIs (BAPIs) and remote
function calls (RFCs). The prerequisites you'll need to meet before performing this check
are described in a blog post.

Let's extend our current scenario to include a validation check that uses the SAP ERP
connector to check whether the personnel number supplied to the flow is valid and that
the employee is in an active employment state. Depending on the results of this check,
we'll decide whether the flow ends.

Here's how the adjusted flow looks.

There are several benefits to employing this approach:

Avoids unnecessary RPA processing and complex UI-based exception handling.

Provides a better user experience through near–real-time data validation feedback.

Frees up virtual machine and bot capacity to run only on validated data.

Employs data loss prevention policies, allowing or disallowing this connector to be
used in conjunction with others.
Follow these steps to adjust the existing flow to incorporate the validation check.

1. Edit the SAP RPA Playbook Demo Flow you created in Create the cloud flow with
the Power Automate portal.

2. Under the Get SAP Client action, select New step.

3. Search for SAP, and then select Call SAP function (preview).

4. For Authentication Type, select SAP Authentication. Enter the Data Gateway, SAP
Username, and SAP Password.

Note

You'll need to supply your credentials manually. You can't use Azure Key Vault
secrets here because Power Automate validates the connection at design
time.
5. Select More (...), and then select Settings.

6. Turn on Secure Inputs and Secure Outputs, and then select Done.
 Tip

Use these settings to hide sensitive text from the run flow history.

7. Rename the action to Check whether the personnel number exists.

8. Enter the following information:

Enter AS host, Client, and AS System Number.


For SAP function name, select BAPI_EMPLOYEE_GETDATA.
For Stateful Session, select No.
For EMPLOYEE_ID, enter a valid personnel number.

9. Under Check whether the personnel number exists, select New step.

10. Search for, and select, Condition. Rename the condition to something meaningful,
and then select the dynamic content TYPE for the Choose value field.
11. Drag the desktop flow action SAP RPA Playbook onto the If yes box.

12. In the If no box, select Add an action, search for Send an email and configure the
email action as shown in the following screenshot.
13. In the If no box and below the email action, select Add an action. Search for
Terminate and configure the Terminate action as follows.

14. Go back up to the SAP RPA Playbook Demo Desktop flow action and enter a
personnel number that doesn't exist into the EMPLOYEE ID field.
15. Select Save, and then select Test to test your flow with the nonexistent personnel
number.

16. The resulting flow run should look like the following image.
17. Select Test again, but this time use a valid personnel number. Confirm that the
results look like the following image now.

That's it. By incorporating the SAP ERP connector, we've made the automation more
efficient, intelligent, and user-friendly.

Next step: Low-code RPA with SAP GUI in Power Automate Desktop
Use low-code RPA with SAP GUI in
Power Automate Desktop
Article • 02/16/2022

With the addition of Power Automate Desktop, it's quicker and easier to develop low-
code SAP GUI automation. You can use Power Automate Desktop to build from simple
to highly sophisticated end-to-end automation. To showcase this, we'll perform the
steps outlined in Sample SAP GUI automation for this tutorial by using Power Automate
Desktop actions. We’ll do some minor tweaks afterward in the designer. To make our
automation dynamic, first we'll need to create some variables that we'll use to fill in data
—like street, city, and country—and also system variables to connect to your SAP
system.

Follow along in the low-code approach in the video series:


https://www.microsoft.com/en-us/videoplayer/embed/RWJKYQ?postJsllMsg=true

Next step: Create an SAP desktop flow with Power Automate Desktop
Create an SAP desktop flow with Power
Automate Desktop
Article • 02/16/2022

1. To create a desktop flow, open Power Automate Desktop, and then select New
flow.

2. Enter a name for the desktop flow, and then select Create.

3. Select Variables in the Power Automate Desktop designer.


4. Select the plus sign (+), and then select Input.

You'll create several input variables which will be passed into this desktop flow
from a cloud flow.

5. First, we'll create a few technical SAP variables, which will be needed in almost all
SAP-based automation flows. For each variable in the following list, enter the
Variable name, External name and Description, and then select Update.

SAPPassword

SAPUser

SAPClient

SAPSystemId
6. Next, create the following use-case-specific variables.

EmployeeId

AddressType

EffectiveDate

Street

City

State

ZipCode

CountryCode
7. Next, we'll create our first process action. Search for, and then drag the Run
application action onto the design surface.
8. Enter the following information into the parameter list, and then select Save.

Application Path: C:\Program Files (x86)\SAP\FrontEnd\SapGui\sapshcut.exe

Command line arguments: start -system=%SAPSystemId% -client=%SAPClient% -user=%SAPUser% -pw=%SAPPassword% -maxgui

Window style: Maximized

After application launch: Wait for application to complete
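After Power Automate Desktop resolves the %...% input variables, the resulting sapshcut.exe command line looks like the string built below. This is a Python illustration only; the variable values are placeholders, not real credentials:

```python
# Placeholder values standing in for the desktop flow's input variables.
sap_system_id, sap_client, sap_user, sap_password = "NPL", "001", "DEMO_USER", "*****"

# Mirror of the Command line arguments field after %...% substitution.
command_line = (
    f"start -system={sap_system_id} -client={sap_client} "
    f"-user={sap_user} -pw={sap_password} -maxgui"
)
print(command_line)  # → start -system=NPL -client=001 -user=DEMO_USER -pw=***** -maxgui
```

Passing credentials this way avoids an interactive SAP Logon screen, which is why the flow can run unattended once the variables are supplied from the cloud flow.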

9. Search for the Wait action, drag it onto the designer, enter 10 (seconds) into the
Duration field, and then select Save.
10. Now, open SAP GUI and sign in to your system of choice.

11. In Power Automate Desktop, search for the populate action in the Actions search
box, and then drag the Populate text field in window action onto the canvas.

12. Select Add new UI element.


13. The Tracking session dialog opens, which tracks the individual controls you select
on a screen.

14. Select the SAP main window to give it focus.


15. Hover over the transaction code field. When a red frame surrounds Gui Ok Code
Field, hold down Ctrl while you click. This adds the control to the UI elements list
in the Add UI elements dialog in Power Automate Desktop.

16. In the SAP window, hover over the green check mark button, and then hold down
Ctrl and click.
17. Enter transaction code PA30 into the list box, and then select the green check mark
button.

18. We'll continue our field selection process on the next screen.

19. Select the following fields and button:


20. Verify that the dialog includes the following control names now. Select Done to
close the dialog and to return to the design canvas.
21. This is the dialog you should see after you've closed the previous dialog.
22. Open the Text box dropdown menu, select Gui Ok Code Field 'okcd', and then
select Select.

23. Enter the transaction code PA30, and then select Save.
24. On the right pane, select the UI elements icon. Select each control in the list and
rename it.

 Tip

This step isn't required, but it's highly recommended because your control
library might include dozens of controls, making it difficult to identify them by
their system names.

25. Here's the renamed control list.


26. Enter press button into the Action search box, and then drag the Press button in
window action onto the canvas.

27. Select Continue.

28. Select Save.


29. Enter wait for into the Actions search box, and then drag the Wait for window
content action onto the canvas.

30. Under UI element select Employee id, and then select Select.
31. Select Save.
32. Enter populate into the Actions search box, and then drag the Populate text field
window action onto the canvas.

33. Under UI element select Employee id, and then select Select.
34. Select the fx icon in the Text to fill-in text box, and then select
EmployeeId.

35. Select Save.


36. Repeat steps 32 through 35 for the controls Infotype, Info subtype, and Effective
date, and select the variables or provide a fixed value as shown in the following
images.
37. Enter press button into the Actions search box, and then drag the Press button in
window action onto the canvas.
38. Under UI element, select New address, and then select Save.

39. Select Save, and then select OK.


40. Enter wait for into the Actions search box, and then drag the Wait for window
content action onto the canvas.

41. Select UI element, and then select the Add new UI element button to bring up the
Tracking session dialog.
42. The Tracking session dialog appears.
Important

For the following steps, you'll need valid SAP reference test data as outlined in
step 5 of the SAP GUI scripting configuration prerequisites.

43. Enter values for Personnel number, Period From, Infotype (always provide 0006,
because this is a standard type in SAP systems), STy (this is the Infotype subtype),
and then select Create (F5).
44. Hover over Address line 1. A red frame surrounds the field Gui Ok Text Field. Hold
down Ctrl, and then click to add the control to the UI elements list, which shows
up in the Tracking session dialog.
45. Repeat step 44 for these fields as well: City/county, State, Zip code, Country key,
and the Save (Ctrl+S) button.
46. In the Tracking session dialog, select Done.

Important

Be sure to leave the SAP form open.


47. Select UI element, select Gui Text Field 'P0006-STRAS', and then select
Save to close the dialog.

48. On the right pane, select the UI elements icon. Select each of the newly added
controls in the list, and then rename them. This isn't mandatory but highly
recommended because your control library might include dozens of controls,
making it difficult to identify them by their system names.
49. Enter populate in the Actions search box, and then drag the Populate text field
window action onto the canvas.
50. Select UI element, and then select Street.
51. On the Text to fill-in box, select the fx icon, and then double-click to select the
Street variable.
52. Select Save.

53. Repeat steps 49 through 52 with these controls: City, State, and ZipCode.
Note

SAP GUI combo boxes need special handling, which requires us to use a
combination of actions to select the correct list items within combo boxes.

54. Enter click UI into the Actions search box, and then drag the Click UI element in
window action onto the canvas.
55. Select UI element, and then Country.
56. Keep the default value for Click type.

57. Enter send keys into the Actions search box, and then drag the Send keys action
onto the canvas.
58. In the Text to send field, select the fx icon, and then double-click to select
CountryCode.
59. Amend the %CountryCode% text by typing {Enter} right after the variable name.
The {Enter} reference mimics an Enter keystroke on your keyboard.

60. Drag another Send keys action onto the end of the flow.
61. Enter {Enter} into the Text to send field, and then select Save.
62. Enter press button into the Actions search box, and then drag the Press button in
window action onto the canvas.

63. Under UI element, select Save.


64. Select Save.

Important

Confirm that the save operation is complete for the SAP record before you
interact with other controls or windows. To make sure the operation is
complete, add a Wait action to wait for a control to become visible.
65. Enter wait into the Actions search box, and then drag the Wait for window
content action onto the canvas.

66. Under UI element, select Employee id, and then select Select.
67. Select Save.
68. Enter press button into the Actions search box, and then drag the Press button in
window action onto the canvas.

69. Under UI element, select Add new UI element.


70. Hover over the Back (F3) button.

71. Hold down Ctrl while clicking to add the control to the UI elements list in the
Tracking session dialog. (Note: we kept this form open in step 46.)
72. Now, select the Back button (without the Ctrl key) to go back to the previous
screen.

If you see a "Data will be lost" message, confirm by selecting Yes.


73. You should be on the following screen now. In the Tracking session dialog, select
Done.

74. Select Save.

75. Rename the button we've just added to the control library to Back button.
76. Enter close window into the Actions search box, and then drag the Close window
action onto the canvas.
77. Under UI element, select Window "SAP Easy Access", and then select Save.

78. Enter close window into the Actions search box, and then drag the Close window
action onto the canvas.
79. Under Window, select Add a new UI element.
80. Hover over the outer window frame of the SAP Logon 760 window. Hold down
Ctrl and click to add the window element to the UI elements list, which appears in
the Tracking session dialog.

81. Under UI element, select Window "SAP Logon 760" (the number 760 reflects the
SAP GUI version, so this might differ in your environment), and then select Save.
82. Select Save.

83. Select Save to save the flow.


Next step: Create a subflow for SAP GUI automation
Create a subflow for SAP GUI
automation
Article • 02/16/2022

Now, let's modularize our current flow by moving actions that carry out specific
interdependent tasks (in our scenario, filling out a form) into a new subflow.

1. Create a new subflow, name it ProcessHRMasterForm, and then select Save.

2. Go to the Main tab for the Subflows area, select the highlighted actions shown in
the following image (rows 4 through 11), right-click, and then select Cut.
3. Go to the ProcessHRMasterForm tab, right-click to select it, and then select Paste.

4. Go back to the Main tab.

5. In the Actions search box, enter run subflow, and then drag the Run subflow
action onto the canvas. For Call subflow, select ProcessHRMasterForm, and then
select Save.
6. Create another subflow, name it ProcessEmployeeAddressForm, and then select
Save.
7. Go back to the Main tab.

8. On the Main tab, select the highlighted actions shown in the following image
(rows 4 through 14), right-click, and then select Cut.
9. Go to the ProcessEmployeeAddressForm tab, right-click to select it, and then
select Paste.
10. Go back to the Main tab.

11. In the Actions search box, enter run subflow, and then drag the Run subflow
action onto the canvas. For Call subflow, select ProcessEmployeeAddressForm,
and then select Save.
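If you're more familiar with code, the refactoring above is analogous to extracting functions from one long script. This Python sketch is an analogy only; the function names mirror the subflows and the bodies are placeholders:

```python
def process_hr_master_form():
    # Stands in for the actions moved in step 2 (filling out the PA30 header form).
    return "HR master form processed"

def process_employee_address_form():
    # Stands in for the actions moved in step 8 (filling out the address form).
    return "employee address form processed"

def main():
    # Main now only orchestrates the two Run subflow actions.
    return [process_hr_master_form(), process_employee_address_form()]

print(main())
```

Modularizing this way makes each form-filling task independently testable and reusable in other desktop flows.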

Next step: Get ready to debug


Get ready to debug
Article • 02/16/2022

Before we continue with the primary flow in the Power Automate portal, let's test the
flow by temporarily assigning a default value to the variables that we defined earlier.

1. In the Variables pane, select More (…) next to the AddressType variable name, and
then select Edit.

2. For Default value, enter 2 (for example, a temporary address).

3. Select Update.

4. Repeat steps 1 through 3 for the other 11 variables.


5. Select Save, close the confirmation dialog, and then select Run.

If you run into an error as your flow is executed for this test run, observe the error status
bar in the lower part of the Power Automate Desktop designer and apply the
appropriate fix.

Important

Delete all default values before you leave the desktop flow authoring experience in
Power Automate Desktop.

Next step: Create the cloud flow to update an address with Power Automate
Create the cloud flow to update an
address with Power Automate
Article • 02/09/2023

Now, we'll create the cloud flow that employees use to request an updated address. This
cloud flow passes the input variables to the desktop flow you created in the previous
section.

We highly recommended that you create cloud flows, desktop flows and other Microsoft
Power Platform artifacts within solutions to allow for better portability, application
lifecycle management (ALM), and encapsulation.

1. Navigate to https://make.powerautomate.com and sign in with your Azure
Active Directory (Azure AD) credentials.

2. Confirm that you're in the same environment as the one in which you built the
desktop flow with Power Automate Desktop, and then select Solutions > + New
solution.

3. Enter a Display name, select a Publisher, and then select Create.


4. Open the solution by selecting its name.

5. Select + Add existing, select Desktop flow, select the flow you created in Power
Automate Desktop, and then select Add.
6. Select + New > Cloud flow.

7. Give your flow a name, and select Manually trigger a flow as the trigger.
8. Select + Add an input, and then select the appropriate data type to create the
eight inputs listed in step 9.
9. Add the following inputs to the flow trigger.
10. Select New step.

Note
The following Azure Key Vault action configurations are optional, so if you aren't
using Key Vault today, or you just want to test your desktop flow without it, feel
free to skip them.

11. Enter azure key vault into the search box.

12. If you don't have an existing Key Vault connection, you'll be prompted to create
one. You can sign in either with an Azure AD user account or a Service Principal
(recommended).
13. Let's assume you select Connect with Service Principal.
14. After you establish the connection, add four Get secret Key Vault actions to the
canvas, select the secret, and then rename the actions as shown in the following
image.
15. Select More (…) next to the action name, select Settings, turn on Secure Inputs
and Secure Outputs, and then select Done. Repeat this step for the other three
Get secret actions.
16. Select the plus sign (+), and then select Add an action.
17. Enter scope into the search box, and then select the Scope action.
18. Drag all your Key Vault actions onto the Scope container, and then rename it to
Try.
19. Search for, and then select, the initialize variable action to add it under the trigger.
20. In the Initialize variable dialog, make the following settings, and then select OK in
the Expression dialog:

For Name, enter Bot failed.


For Type, select Boolean.
For Value, enter false.

21. Add two scope actions. Name them Catch and Finally.
22. In the upper-right corner of the Catch scope, select …, and then select Configure
run after.

23. Select has failed, is skipped, and has timed out, and then select Done.

24. In the upper-right corner of the Finally scope, select …, select Configure run after,
and then select the is successful, has failed, is skipped, and has timed out
checkboxes. Select Done.
25. Search for, and then add, the set variable action to the Catch scope container.
26. Select the Bot failed variable, enter true in the Expression dialog, and then select
OK.
27. Search for the condition action, and then add it to the Finally block.

28. Select the Bot failed variable from the Dynamic content list and then assign it to
the Choose a value field.
29. Set the expression to false, and then assign it to the value field.

30. In the If yes section, add a Send an email (V2) action.


31. Select User email from the Dynamic content list and add it to the To field, and
then enter a subject and an email body.
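The three scopes emulate classic try/catch/finally exception handling. Here's a hedged Python sketch of the same pattern; the stub function and list are placeholders for the flow's actions, not real APIs:

```python
def run_try_scope():
    # Placeholder for the Try scope: Key Vault reads plus the desktop flow run.
    raise RuntimeError("desktop flow failed")  # simulate a failure for this sketch

sent_emails = []
bot_failed = False  # mirrors the "Bot failed" flow variable, initialized to false

try:
    run_try_scope()
except Exception:
    bot_failed = True          # Catch scope: runs when Try fails, is skipped, or times out
finally:
    if bot_failed:             # Finally scope: always runs; notify only on failure
        sent_emails.append("failure notification")

print(sent_emails)  # → ['failure notification']
```

The "Configure run after" settings are what wire the Catch and Finally scopes to the failure and completion states, just as except and finally do here.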

32. In the Try container, select New step. Search for Desktop flows, and then select
the Run a flow built by Power Automate Desktop action to add it to the flow.
33. Select your data gateway and then enter a domain, username, and password for an
account that has sufficient privileges to run your desktop flows.

Important

Your on-premises data gateway must be deployed into the same region as
your environment; otherwise, it won't appear in the dropdown list.

34. For Desktop flow, select SAP RPA Playbook Demo. For Run mode, select
Attended – Runs when you're signed in.
35. Select the System Id field, and then select value from the Get SAP System Id
action output in the Dynamic content list.
36. Check the expected date time format in SAP and make adjustments if needed by
using the formatDateTime function. For example, use
formatDateTime(triggerBody()['date'],'dd.MM.yyyy') to get a date formatted day-
month-year, as in the German-formatted date of 13.10.2020.
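For reference, here's the same conversion sketched in Python; the ISO input value is just a sample:

```python
from datetime import datetime

# formatDateTime(triggerBody()['date'], 'dd.MM.yyyy') reshapes an ISO date
# into the German-style day.month.year format expected by this SAP system.
iso_date = "2020-10-13"  # sample trigger value
sap_date = datetime.strptime(iso_date, "%Y-%m-%d").strftime("%d.%m.%Y")
print(sap_date)  # → 13.10.2020
```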
37. Supply the data for all other fields by selecting the appropriate property from the
trigger Dynamic content list for the cloud flow.
38. Select Save to save the flow.

39. Select Test.

40. Select I'll perform the trigger action, and then select Save & Test.
41. Authenticate if needed, and then select Continue.

42. Supply values for all variables, and then select Run flow.
43. Select Done.

Note
Don't interact with your mouse or keyboard until the process is completed.

The cloud flow starts and calls the desktop flow, which enters data into SAP.
Congratulations! You've successfully implemented two SAP GUI automation techniques.
We're excited to see what you build next with SAP and Power Automate.

Next step: No-code RPA with SAP GUI in Power Automate Desktop
Use no-code RPA with SAP GUI in Power
Automate Desktop
Article • 02/16/2022

You can use the Power Automate Desktop recorder to record SAP GUI interactions. The
desktop recorder translates each mouse click and keystroke into Power Automate
Desktop actions, and then adds these actions to your desktop flow.

If you want to see the new desktop recording experience in action, follow these steps to
reconfigure the action-based automation you built in the Low-code RPA with SAP GUI in
Power Automate Desktop section of this playbook.

You can also follow along in the no-code approach in episode 6 of the video series:
https://www.microsoft.com/en-us/videoplayer/embed/RWJEiJ?postJsllMsg=true

1. Select the ProcessHRMasterForm subflow tab, and then select Desktop recorder.

2. Open SAP, bring the SAP Easy Access window to the foreground, and then in the
Desktop recorder window, select Start recording.
3. Enter PA30 into the Transaction field, and then select Enter.

4. Enter a Personnel number, and then select Enter.


5. Select Addresses as the Infotype Text, and then enter a value in STy (such as 2 or
any other value suitable for your use case).

6. Enter a date in the From field, and then select the New icon.
7. Enter a street name along with a house number in Address line 1.

8. Enter values for City, State/zip code and Country key, and then select Save.
9. Select the Back (F3) icon.

10. This completes the address creation process, so select Finish in the desktop
recorder window.
11. Your desktop flow script should look similar to the following image.
12. During recording, you might have accidentally selected windows or other UI
elements that aren't relevant to your flow. If this happens, you can remove
duplicate or unnecessary action steps from the script.

13. Highlight all actions that were manually defined in Low-code RPA with SAP GUI
in Power Automate Desktop, right-click, and then select
Delete selection to remove them from the subflow.
14. Edit the Populate text field in window action for Employee Id, and replace the
Text to fill-in value with the previously defined EmployeeId variable.
15. Edit the Populate text field in window action for Info subtype, and replace the
Text to fill-in value with the previously defined AddressType variable.
16. Edit and replace the hard-coded text with variables in the Populate text field in
window actions for Effective Date, Street, City, State, ZipCode, and Country.

17. Highlight the actions that you'll need for the employee address creation subflow,
right-click the actions you selected, and then select Cut.
18. Open the ProcessEmployeeAddressForm subflow, select all actions, right-click,
and then select Delete.

19. In ProcessEmployeeAddressForm subflow, right-click, and then select Paste.


20. Under the Variables pane, edit all variables and then provide Default values. These
default values will be used to test the updated desktop flow.
21. Select Save, and then close the confirmation message.

22. Select Run.

Awesome! With this new desktop recording option, some minor action tweaks, and a bit
of refactoring, you've reduced development time and simplified the overall desktop flow
action definition process.

Important
Delete all previously defined default values before you leave the desktop flow
authoring experience.

Next step: Extract data from the SAP GUI UI with Power Automate
Extract data from the SAP GUI UI with
Power Automate
Article • 02/16/2022

When you create or update records in SAP, it generates status information which
includes the newly generated record IDs. SAP displays this status information in the
lower part of the SAP UI.

Here's a sample status message that SAP displays after you create an SAP PM
notification.

This status information might be useful for later automation process steps.
Therefore, you should extract and assign this status data to variables for downstream
data processing.

There are multiple ways to achieve this, depending on your development approach:
low-code or pro-code techniques.

Pro-code approach
The pro-code approach uses VBScript commands to extract information from SAP UI
elements.

To extract SAP status data, such as a newly created purchase requisitions or plant
maintenance notification numbers, you need access to the status bar UI element.

Here's how you access the status bar UI element.

Visual Basic Script

session.findById("wnd[0]/sbar/pane[0]").Text

Let's walk through the end-to-end experience. We'll record how to create an SAP PM
Notification and modify the code that's generated to return the newly created Id to
Power Automate Desktop.

In case you've never heard of SAP's PM Notification or you don't have access to it, don't
worry; you'll still be able to understand the steps needed to extract such data for your own
scenario.
Here are the steps:

1. Confirm that all SAP GUI scripting configurations are done.

2. Open SAP Logon and select the SAP system to which you want to sign in.

3. Select Customize Local Layout (Alt+F12), and then select Script Recording and
Playback....

4. Select More.

5. Under Save To, provide the file path and filename where you want to store the
captured user interactions.

6. Select the Record Script button to start the screen capturing process. Every
interaction you do now in SAP is captured as repeatable VBScript commands.

Note
If you recorded steps and saved to this file before, you'll need to confirm whether
you want to overwrite the file.

7. Enter transaction code IW21, and then select Enter.

8. Provide the Notification type, and then select Enter.

9. Enter a Short Text, Planner Group, Main WorkCtr, and any other field that you
need.

10. Select Save (Ctrl+S) on the toolbar.

11. Back in the Create PM Notification initial form, you'll notice a new status message
in the status bar in the lower-left corner of the UI.

12. Select Exit (Shift+F3) on the toolbar, and then stop the recording.

13. Optionally, log off from SAP and close all SAP windows.

Let's examine the generated VBScript code:


Now, let's adjust the generated code to include a step that extracts, trims, and then
returns the new notification ID to the Run VBScript action.

This sets the VBScriptOutput variable of the Run VBScript action to the newly created
notification ID.
If you want to see the contents of the VBScriptOutput, you can use a display message
action, similar to the output in the following images.
Low-code approach
The low-code approach uses actions and custom selectors to extract newly generated
record IDs or other status messages that are important for downstream flow processing.

The following steps won't go into detail about how to record or use manual action
design to create a new SAP PM Notification record. Please review Low-code or No-code
RPA with SAP GUI in Power Automate Desktop if you need a refresher on how to
do that.

Follow these steps for a low-code approach:

1. Use the desktop recorder or manual action design to capture all controls that you
need for the SAP PM Notification process.
2. Create a notification record, and capture the status text that appears after you
select Save on the toolbar of the Create PM Notification screen.

After the item is saved, you'll be redirected to the previous screen where you
should see a new notification number in the status bar text.

3. Now, go back to Power Automate Desktop, search for an action named Get details
of a UI element in window, and add it to your authoring canvas.

4. Select the UI element dropdown menu, and then select Add a new UI element
button.

5. Select the SAP Easy Access window and hover over the status bar until a red
border labeled Gui Statusbar appears. While the border is active, hold down Ctrl
and then click to select the status bar.
6. Select Save.

7. Search for the Replace text action in the Actions pane, and drag it onto the design
canvas.

8. In Text to find, enter Notification. Under Replace with, enter a blank string by
entering the following characters: %''%
9. Add another Replace text action, and rename the Variables produced to
%NotificationId%. In Text to find, enter save, and under Replace with, enter a
blank string (%''%).

10. Search for the Trim text action on the Actions pane, drag it onto the design
canvas, select %NotificationId% as Text to trim value and rename Variables
produced to %TrimmedNotificationID%.
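Together, the two Replace text actions and the Trim text action perform this string cleanup, sketched here in Python; the sample status text is an assumption:

```python
status_text = "Notification 10000612 saved"  # sample SAP status bar text

# Two replacements (mirroring the Replace text actions with a %''% blank string)
# followed by a trim (mirroring the Trim text action).
notification_id = status_text.replace("Notification", "").replace("saved", "").strip()
print(notification_id)  # → 10000612
```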
11. Search for the Display message action on the Actions pane, and drag it onto the
design canvas. Set Message box title and Message to display to suit your needs.

12. Run the automation that creates a new SAP PM Notification and extracts its newly
generated notification ID that can be displayed as shown in the following image.
Next step: Summary of the benefits of VBScript vs. actions for SAP GUI
automation
Summary of the benefits of VBScript vs.
actions for SAP GUI automation
Article • 02/16/2022

As with many other technologies and techniques, there are pros and cons for each
approach. Depending on the type of developer you are, you might find that a certain
approach better fits your intended use case and skill set.

There will be occasions when combining both techniques will be beneficial, like adding a
VBScript action with some Excel VBA code to support advanced formatting as part of
your action-based automation.

Here's a summary of the criteria to weigh when you choose between the VBScript and
actions approaches:

- Citizen developer play
- IT pro and pro developer play
- Faster development
- Faster execution
- Lower maintenance
- Multilevel exception-handling support
- Resilience to UI changes and versioning
- Reusability support
- Whether individual UI element capturing is required
- Whether SAP scripting configurations are required
- Support for silent and resilient sign-in with sapshcut.exe
- Use of the SAP internal recording engine

Next step: Conclusion of RPA Playbook for SAP GUI automation tutorial with
Power Automate
Conclusion of RPA Playbook for SAP GUI
automation tutorial with Power
Automate
Article • 02/16/2022

By taking advantage of known and proven SAP recording techniques, combined with
the latest and greatest cloud and desktop automation authoring tools in Power
Automate, we were able to build an end-to-end SAP RPA prototype in less than an hour!
Of course, we've only looked at the happy path for this use case, but it already includes
some important security best practices and recording tips and tricks.

Now that you understand the basics of using Power Automate to automate SAP, you
can use the Power Automate documentation and best practices guidance (including the
Automate It! video series) to automate your most frequent, mundane, and rules-
based tasks.
Tips for using Automation Kit for Power
Platform with ALM
Article • 09/20/2022

One or more developers create approved automation projects in a development


environment. One or more automation projects can be part of the same automation
solution. More information about solutions can be found here: Solutions overview.

Organizations can use any of the following three ways to implement application lifecycle
management (ALM).

1. ALM based on manual actions.


2. ALM based on automated actions.
3. ALM based on a combination of manual actions and automated actions.

Manual Actions
Manual activities can include the following.

1. Export the solution.


2. Import the solution.
3. Store the solution in a source control repository.
4. Download the solution from a source control repository.

Automated Actions
Azure DevOps and the Microsoft Power Platform Build Tools for Azure DevOps provide a
great way to automate manual ALM activities and more. Together with Power Automate,
you can orchestrate a complete set of activities.

Here are a few orchestrations for you to consider.

1. Example 1 - Clean start

a. Get an unmanaged solution from a source control repository.

b. Import the unmanaged solution into a development environment.

2. Example 2 - Commit work

a. Export an unmanaged solution from a development environment.


b. Store the unmanaged solution in a source control repository.

3. Example 3 - Create a production version

a. Commit work (see the previous example)

b. Export the managed solution from a development environment.

c. Store the managed solution in a source control repository.

4. Example 4 - Update a testing or production environment

a. Get a managed solution from your source control repository.

b. Import the managed solution into a testing or production environment.
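The "Commit work" orchestration above can be sketched with the Power Platform CLI (pac). This is a minimal sketch under stated assumptions, not the ALM Accelerator's own pipeline: the environment URL and solution name are placeholders, pac is assumed to be installed and able to authenticate interactively, and parameter names reflect the CLI at the time of writing.

```shell
#!/bin/sh
# Sketch of "Example 2 - Commit work" using the Power Platform CLI (pac).
# ENV_URL and SOLUTION are placeholders -- replace them with your own values.
ENV_URL="https://contoso-dev.crm.dynamics.com"
SOLUTION="AutomationProject"

if command -v pac >/dev/null 2>&1; then
  # a. Export the unmanaged solution from the development environment
  #    (export is unmanaged by default).
  pac auth create --url "$ENV_URL"
  pac solution export --name "$SOLUTION" --path "./$SOLUTION.zip"

  # b. Store the unmanaged solution in a source control repository.
  git add "$SOLUTION.zip"
  git commit -m "Commit work: export $SOLUTION from dev"
else
  echo "Power Platform CLI (pac) not found - dry run only"
fi
```

The same pair of commands in reverse (`pac solution import`) covers "Example 1 - Clean start".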

For organizations that prefer to use GitHub instead of Azure DevOps, there's a preview
version of GitHub Actions for Microsoft Power Platform, which offers a subset of the
Azure DevOps functionality. For more information, watch the Power Platform Build Tools
webinar and live demo .

The ALM Accelerator for Power Platform includes a set of prebuilt tools and templates
to accelerate your ability to build and deploy automation solutions.
Overview of the Automation Kit
Article • 12/01/2022

The Automation Kit is a set of tools that accelerates the use and support of Power
Automate for desktop for automation projects. The kit provides tools that help you
manage automation projects and monitor them to estimate money saved and return on
investment (ROI).

The Automation Kit applies Holistic Enterprise Automation Techniques (HEAT) to
support your organization.

The kit is especially useful to an Automation Center of Excellence (CoE) team, which is a
team of experts who support automation within your organization. They have good
knowledge about Power Automate for desktop, set up and maintain the Automation Kit,
and maintain the configuration data such as departments, process categories, goals, and
more.

The goal of the Automation Kit is to enable organizations to manage, govern, and scale
automation platform adoption based on best practices. The Automation Kit provides the
following items in support of your Automation Center of Excellence.

Near-real time ROI / SLA: Short and long-term analytics to drive towards your
business goals.
Tools for all users: These tools are for makers (citizen and pro developers), your
Automation CoE team, and executive sponsors.
End-to-end automation lifecycle: These tools help to automate and manage all
aspects of hyperautomation scenarios, including ALM and templates to drive
consistency.
Enterprise readiness: Helps to secure, govern, audit, and monitor your automation
deployment.

Automation Kit components


The Automation Kit supports an automation CoE with the following components:
1. Automation Project: This project is a canvas app that supports requesting
automation projects and submitting them for approval.
2. Automation Center: This is a model-driven app that organizations can use to
create and maintain automation assets, such as master data records, map
resources and environments, and assign roles to employees.
3. Automation Solution Manager: This is a canvas app in satellite environments that
enables the metering of solutions and their artifacts.
4. Cloud flows: These cloud flows use Dataverse tables to sync data from satellite
environments, in near real time, to the main environment.
5. A Power BI dashboard that provides insights and monitors your automation assets.

These two solutions contain the components in the kit.

The main solution, which you deploy to the main environment.


The satellite solution, which you deploy in each satellite environment.

Use satellite environments to develop and test your automation projects before you
deploy them to production. The production satellite monitors and meters the solutions
and solution artifacts for an automation project.

The data from the metered solutions syncs to the main environment in near real time for
monitoring on a dashboard.

Conceptual design
The Automation Kit has the following conceptual design components.
The key element of the solution is the Power Platform main environment.

There are usually several satellite production environments that run your automation
projects. Depending on your environment strategy, these could also be development or
test environments.

Between these environments there is a near-real-time synchronization process that


includes cloud or desktop flow telemetry, machine and machine group usage, and audit
logs. The Power BI dashboard for the Automation Kit displays this information.

Near real-time data synchronization


The synchronization processes the minimum data that's required to calculate the ROI
and SLA. It doesn't create a complete inventory of all low-code assets.

The components in satellite environments could be organized by geography or


capability. The cloud flows in these environments push information about metered
cloud and desktop flows in near real-time to the automation main environment.
Automation Center of Excellence
strategy with the Automation Kit
Article • 12/01/2022

The Automation Kit, along with the ALM Accelerator from the Power Platform CoE Kit,
supports your Automation Center of Excellence strategy.

Corporate automation strategy

Impactful automation requires executive sponsorship and deep collaboration across


many roles in the organization.

The process can start by setting goals for your overall automation ambitions, such as
target savings over one, two, or three years, or target efficiency gains.

The executive sponsor from a business unit, who could be a manager or managing
director, would collaborate to share responsibility and own the overall business process
improvement ambitions.

The automation owner, who could be the head of an individual department (for
example, Finance) or sub-department (for example, Accounts Payable), specifies and
develops the automation solutions.

The automation team often needs collaboration with the Microsoft 365 or Power
Platform Center of Excellence team to enable the Automation Center of Excellence.
These teams can provide guardrails, governance, and best practices. For unattended
automation, they will need infrastructure provisioning, networking, and security to
support the operation of the deployed solutions.

For the risk and compliance side of the diagram, the owners of corporate rules,
regulations, cybersecurity, data privacy, and auditing requirements must be consulted
and involved in the process.

Automation lifecycle
The Automation Kit supports the journey from ideation and project definition to
production automation with SLA and ROI measured with near real-time reporting.

Here is a summary of the relevant roles:

CVP or Area Manager: Defines the automation goals


Process Owner: Creates the automation projects in the system, defining process
volume and frequency, the characteristics and throughput of the current process,
and the expected ROI
Reviewer / Approver: Validates idea to approve that the idea should be
implemented and is justifiable from an effort, maintenance, and compliance
perspective
Automation CoE: Sets up and maps solution to a specific environment
Process Owner: Builds the solution, promotes it to User Acceptance Testing (UAT)
and Production, and maps the deployed solution to the Automation Project for
metering. Metering allows reporting on the overall status and how often the
actions are carried out, and measures the automation against the SLA, the
expected ROI, and corporate automation goals.
Automation Center Of Excellence
This diagram describes many of the components and activities that are involved with an
Automation CoE and how these map to the Automation Kit and the ALM Accelerator.

The left side of the diagram shows various stakeholders or participants in the process.
The center of the diagram displays specific components that are related to each part of
the process.

Stakeholders and partners of the Automation CoE

Bot developers

This includes citizen developers who need a guided experience to go from identifying an
opportunity for building a bot, to building, deploying, and maintaining it.

Automation CoE members


Automation CoE members can build out patterns and templates that allow citizen
developers to focus on solving the business problems.
The Automation CoE team deals with automation of DevOps pipelines and code review
processes to accelerate the citizen developers' bot-building journey.

Infrastructure Ops

Partner with the infrastructure operations team to ensure that an efficient provisioning
process exists.

This could include provisioning virtual machines, which could be unattended in


production or attended in development. Plan for keeping the operating system and
Power Automate for desktop software up to date.

Power Platform and Microsoft 365 CoEs


Collaborate with your Power Platform or Microsoft 365 Center of Excellence teams for
environment management and creation of Data Loss Prevention (DLP) policies that need
to be put in place for the organization to scale.

An Automation CoE may not always have full access to perform these activities.

Executive sponsors

Monitor and track outcomes against goals. The Automation Kit helps you track projects
from early in the bot development lifecycle. Key elements can include assigning the
business case to the project and tracking the impact that the automation project is
delivering.

In the Automation Kit dashboards, you can view the outcome for a particular use case
and the value that's being obtained through the investments.

How the Automation Kit and the Power Platform CoE


Starter Kit support the strategy
The Automation Kit and components of the CoE Starter Kit, including the ALM
Accelerator, can be mapped to the Automation Center Of Excellence diagram as follows.

Bot Development Lifecycle

Ideation: The Automation Kit provides a process to create potential automation


projects and an approval process to determine which automation projects need
investments.
Build: The ALM Accelerator provides the ability to build a managed deployment of
an RPA solution with versioning applied from solutions stored in source control.
Deploy: The ALM Accelerator helps you configure and deploy solutions between
the development, test, and production environments. The deployment process
makes use of Azure DevOps and includes the ability to apply branching and
merging to apply a source-controlled governance and a review process as a
deployment progresses to production.

ALM components

Code Review: Reviews for solutions in Azure DevOps with the ALM Accelerator.
The ALM Accelerator Azure DevOps extensions allow the solution to be unpacked
and the script definitions from the exported solution versions to be compared for
changes side by side.
Monitoring: The Automation Kit provides a near real-time tracking process to
measure the impact of deployed solutions.
Data Loss Prevention: Determine the impact of DLP rules on deployed desktop
flows using the Automation Kit.
Dashboard and Data ETL: Synchronize data from multiple environments to a
centralized dashboard to monitor the impact of deployed solutions.

Automation Maturity Model and the


Automation Kit
The Power Platform automation maturity model describes the capabilities and the levels
in organizational automation maturity.

The Automation Kit and the ALM Accelerator can be combined to help customers grow
their maturity in adopting best practices.
The dotted blue boxes indicate the maturity model areas that the Automation Kit
addresses using the Holistic enterprise automation techniques (HEAT). Dotted green
boxes indicate areas addressed by the ALM Accelerator.

Empower
The Automation Kit can be used with Hackathons to help validate possible automation
projects and monitor the impact of the Hackathon experiments.

Discover and plan


Use the Automation Kit to plan for automation projects and potential impact. Use the
approval process to determine which projects to invest in and monitor the impact.

The ALM Accelerator lets you create profiles that enable RPA solutions to be
deployed across development, test, and production environments.

Design and document


Design and document enables makers to quickly experiment in development
environments and then measure the impact of their experiments to assess which viral
automation projects are making a large impact.

Use the ALM Accelerator to define deployment environments and deployment settings
for machines and machine groups along with environment variables.
Build and implement
Use the ALM Accelerator to automate the deployment of RPA solutions so that the
Automation Kit can monitor them.

Use the near real-time ROI automation to monitor the impact of automation projects.

See also
ALM Accelerator for Power Platform - The ALM Accelerator is released as part of
the CoE Kit and provides tools and templates to provide enterprise-scale end-to-
end Application Lifecycle Management.
Admin and Governance Whitepaper
Manage Power Automate for desktop on
Windows https://aka.ms/padonwindowspnp
RPA Migration Whitepaper https://aka.ms/PAD/RPAMigrationWhitepaper
Prerequisites to install and use the
automation kit
Article • 08/01/2023

The following prerequisites are required to install and use the automation kit.

An administrative account, which is called Automation CoE Admin or similar.

The automation kit requires access to your Power Platform environments, and
some Azure resources, such as Key Vault and app registration. The account you set
up as the Automation CoE Admin needs the following.

Roles
Microsoft Power Platform service admin or Dynamics 365 service admin.
Account must be mail enabled.
Azure contributor role (for Key Vault and app registration).

Azure app registration


An Azure app registration is used for an application user for the Dataverse Web API in
each of the satellite environments.

Azure Key Vault


The automation kit uses the new Azure Key Vault secrets (preview).

) Important

Azure Key Vault is required only for automation kit satellite release March 2023 or
earlier. Starting with the April 2023 release, Azure Key Vault is no longer required as
a prerequisite.

Azure Key Vaults are used to store secrets for the Azure app registration, depending on
your requirements. There might be one Key Vault per satellite environment. Here's an
example of how you might name your Key Vaults.

KV-Contoso-Dev
KV-Contoso-Test
KV-Contoso-Prod

To store your secrets:

1. Register the Microsoft.PowerPlatform resource provider in your Azure


subscription. Follow these steps to verify and configure it: Azure resource providers
and types.

Azure Key Vault must have Get secret access policy set for the Dataverse service
principal.

2. Select Access policies.

3. Select Create.

4. Under Secret permissions select Get and click Next.

5. In the service principal search box, search for Dataverse.

6. Select the Dataverse service principal with the 00000007-0000-0000-c000-


000000000000 identity.

7. Select Next > Next > Create.
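The portal steps above can also be scripted. The following is a minimal sketch, assuming the Azure CLI is installed and you've already run `az login`; the vault name is a placeholder, and `00000007-0000-0000-c000-000000000000` is the Dataverse service principal app ID from step 6.

```shell
#!/bin/sh
# Grant the Dataverse service principal "Get" access to secrets in a Key Vault.
# VAULT_NAME is a placeholder -- substitute your own Key Vault name.
VAULT_NAME="KV-Contoso-Dev"
DATAVERSE_APP_ID="00000007-0000-0000-c000-000000000000"

if command -v az >/dev/null 2>&1; then
  az keyvault set-policy \
    --name "$VAULT_NAME" \
    --spn "$DATAVERSE_APP_ID" \
    --secret-permissions get
else
  echo "Azure CLI (az) not found - dry run only"
fi
```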

License requirements
All users must have one of the following licenses:

Microsoft 365 license (E3, E5).

Power Automate Premium license (non-trial, previously Power Automate per user
with attended RPA).

Optional Power Automate Process license (non-trial, previously Power Automate


with unattended RPA).

Power App Pay As You Go, Per User or Per App license (non-trial).

Power BI Pro license.

Enable code components


The Automation Kit uses the Power Platform Creator Kit, which was developed to
bootstrap and enhance the look and feel of canvas apps. The Creator Kit uses Fluent UI references
and guidelines. To learn more about Fluent, go to Fluent Design System .
The Creator Kit uses a component library and code components. You must enable code
components inside all the environments into which the automation kit will be installed.

2 Warning

You'll have to uninstall and potentially lose all data if the Power Apps component
framework for canvas apps isn't turned on for the environments where the
automation kit is installed or upgraded. Enable the component framework before
installing or upgrading.

1. Sign in to the Power Platform admin center .

2. Select an environment where you want to enable this feature. This is needed in
both the main and all satellite environments.

3. Select Settings at the top of the screen.

4. Below Product, select Features.

5. Turn on Allow publishing of canvas apps with code components.

6. Select Save.
Automation Kit setup checklist
Article • 12/01/2022

The following checklist provides an overview of the key steps, to assist you in setting up
the Automation Kit.

Use this checklist to ensure you have followed the key steps in the setup instructions.

Checklist for main


Import the Power Platform Creator Kit (CreatorKitCore_x.x.x.x_managed).
Verify that the Creator Kit is installed correctly.
Import AutomationCoEMain_x_x_x_x_managed into your main environment.
Provision the approvals solution (optional).
Confirm that all flows in the solution are turned on.
Assign the security roles that follow.
Share the canvas apps with appropriate users, using the guidance that follows.

Checklist for satellite


Import the Power Platform Creator Kit (CreatorKitCore_x.x.x.x_managed).
Verify that the Creator Kit is installed correctly.
Create an Azure AD app registration (Dataverse API) for each satellite.
Define the satellite environments (new or existing).
Create application users in all satellite environments.
Import AutomationCoESatellite_x_x_x_x_managed.
Configure the environment variables that follow.
Confirm that all flows in the solution are enabled.
Share the canvas apps with appropriate users, using the guidance that follows.

Checklist for configuration data


Configure the general data defined below.
Define the ROI data.
Configure the console data.
Define the Flow Exception Rules Framework.

ROI calculations
Example data:

FTE Cost: $50/hr
Time to Process: 60 mins
Frequency: Daily*
# of FTEs to Process: 1
Error Rate: 10%
# of FTEs to Fix: 1
Time to Fix: 25 mins
Overhead: 15%

*Frequency per year: Hourly = 2008, Daily = 251, Weekly = 52, Monthly = 12, Quarterly = 4

Calculation formula

Total Cost per Year = Cost to Complete (per Year) + Cost to Remediate (per Year)

Cost to Complete (per Year): ((FTE Cost / 60) x # of FTEs to Process) x Time to Process (mins) x Frequency

Cost to Remediate (per Year): (Frequency x (Error Rate / 100)) x Time to Fix (mins) x (FTE Cost / 60) x # of FTEs to Fix

Full formula: (((FTE Cost / 60) x # of FTEs to Process) x Time to Process (mins) x Frequency) + ((Frequency x (Error Rate / 100)) x Time to Fix (mins) x (FTE Cost / 60) x # of FTEs to Fix)

ROI calculation example

Summary of cost calculations

Cost to Complete (per Year): $12,550 = ((FTE Cost / 60) x # of FTEs to Process) x Time to Process (mins) x Frequency

Cost to Remediate (per Year): $522.92 = (Frequency x (Error Rate / 100)) x Time to Fix (mins) x (FTE Cost / 60) x # of FTEs to Fix

Cost to Remediate (per Instance): $20.83 = Cost to Remediate (per Year) / (Frequency x (Error Rate / 100))

Total Cost per Year: $13,072.92 = Cost to Complete (per Year) + Cost to Remediate (per Year)

Total Cost per Month: $1,089.41 = Total Cost per Year / 12

Cost per Instance: $52.08 = Total Cost per Year / Frequency

Summary of savings per instance for bot

Successful Run, no errors: $44.27 total savings (Cost per Instance minus Overhead %)

Failure: -$28.65 total savings (Cost to Remediate per Instance minus Overhead %)

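To sanity-check the formulas, here's a small shell sketch (using awk for the arithmetic) that reproduces the example figures from the sample inputs above.

```shell
#!/bin/sh
# Reproduce the ROI example: FTE cost $50/hr, 60 mins to process, daily
# frequency (251/yr), 1 FTE, 10% error rate, 25 mins and 1 FTE to fix.
awk 'BEGIN {
  fte_cost = 50; time_to_process = 60; frequency = 251; ftes = 1
  error_rate = 10; time_to_fix = 25; ftes_to_fix = 1

  complete  = ((fte_cost / 60) * ftes) * time_to_process * frequency
  remediate = (frequency * (error_rate / 100)) * time_to_fix * (fte_cost / 60) * ftes_to_fix
  total     = complete + remediate

  printf "Cost to Complete (per Year):  $%.2f\n", complete           # $12550.00
  printf "Cost to Remediate (per Year): $%.2f\n", remediate          # $522.92
  printf "Total Cost per Year:          $%.2f\n", total              # $13072.92
  printf "Total Cost per Month:         $%.2f\n", total / 12         # $1089.41
  printf "Cost per Instance:            $%.2f\n", total / frequency  # $52.08
}'
```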

Install the automation kit using the
command line
Article • 07/01/2024

To install the latest version of the automation kit using the command line, follow the
steps in this article. If you're unable to use the command line tools, you can use the
manual steps in Set up the automation kit and Set up satellites.

1. Ensure that you have enabled the Power Apps component framework feature in
the environments that you want to install the automation kit for both main and
satellite environments.

2. Ensure that the Creator Kit is installed into the target environment

3. Open the latest release from the automation kit GitHub releases.

4. From the Assets section, download AutomationKitInstall.zip by selecting it.

5. On Windows Explorer, select the downloaded AutomationKitInstall.zip and open


the Properties dialog.

6. Select Unblock.

7. Select AutomationKitInstall.zip > Extract All.

A folder named AutomationKitInstall will be automatically created. You'll use this


folder in step 11.

8. Ensure that you have the Power Platform CLI installed.

9. Execute the PowerShell script using the following command line.

Windows Command Prompt

cd AutomationKitInstall
powershell Install_AutomationKit.ps1

7 Note

Depending on your PowerShell execution policy, you might need to run the
following command.

Windows Command Prompt


Unblock-File Install_AutomationKit.ps1

10. The PowerShell script will prompt you to create an installation configuration file
using the Automation Kit for Power Platform—Setup pages (English version).

If you'd like to view the automation kit for Power Platform pages in a language
other than English, choose from the following AI generated versions:

Danish, Dutch (Netherlands), French, German, Italian, Japanese, Korean,
Norwegian, Polish, Simplified Chinese, Spanish, Swedish, Thai.

The setup configuration pages guide you through the following steps:

Select configuration for main or satellite solutions.

Select users to assign to security groups.

Create connections required to install the solution.

Define environment variables.

(Optional) Define if sample data should be imported.

(Optional) Choose whether the Power Automate flows contained in the
solutions should be enabled.

11. After you complete the website setup steps, copy the downloaded automation-kit-
main-install.json or automation-kit-satellite-install.json file to the
AutomationKitInstall folder.

The AutomationKitInstall folder was automatically created when you extracted


files in step 7.

12. Once the file is downloaded, the script will prompt for y to install the main solution
and n to install the satellite solution.

The installer will then start the installation with the defined settings.

Set up the automation kit control center
Article • 08/08/2023

When you install the managed Power Platform control center solution, make sure you
follow these required steps.

1. Ensure the Power Apps component framework is enabled by following the steps in
Enable the Power Apps component framework feature.

2. Install the Creator Kit from App Source into the target environment.

3. In the Assets section of the latest GitHub release of the automation kit,
download AutomationKitControlCenter_*_managed.zip by selecting it.

4. Import the downloaded AutomationKitControlCenter_*_managed.zip file.

To learn how to import, go to Import solutions.


Set up the automation kit main
component
Article • 04/22/2023

This section includes the manual install process for the main solution of the automation
kit. There's an automated command line installer that will also guide you through this
process.

Once you've completed the main solution setup, you can set up the following optional
components:

Satellite: Allows you to meter deployed Automation Project solutions.

Scheduler - Provides a calendar view of recurring cloud flows that include Power
Automate Desktop flows.

) Important

Perform all steps using the Automation CoE Admin account mentioned in the
Prerequisites article.

Import the main solution into the main


environment
Create an environment in which to set up the automation kit.

1. Sign in to the Power Platform admin center .

2. Select Environments > + New, and then enter a name, type, and purpose.

3. Select Yes to create the database, and then select Next.

4. Keep sample apps and data set to No.

5. Select Save.

Enable the Power Apps component framework


Once the environment is created, enable the Power Apps component framework.
1. Sign in to the Power Platform admin center .

2. Select an environment where you want to enable this feature.

You need to do this for main and all satellite environments.

3. On the top pane, select Settings.

4. Select Product > Features.

5. Turn on Allow publishing of canvas apps with code components.

6. Select Save.

Import the Creator Kit


Next, import the Power Platform Creator Kit.

1. Download the Power Platform Creator Kit .

2. Sign in to Power Automate .

3. Go to the environment you just created in which the main solution will be
imported. For this example, we're importing to the environment named
Contoso_Main.

4. On the left pane, select Solutions.

5. Select Import > Browse.

6. Select the Creator Kit solution named CreatorKitCore_x_x_x_x_managed.zip.

7. Select Import.

Wait for the Creator Kit to finish importing before continuing to the next step.

Import the automation kit main solution


1. Download the most recent release of the automation kit main managed solution
from the Assets section of the latest automation kit GitHub release.

2. On the left pane, select Solutions.

3. Select Import > Browse.

4. Select the Automation CoE main solution


(AutomationCoEMain_x_x_x_x_managed.zip).
5. After the compressed (.zip) file loads, select Next.

6. Review the information, and then select Next.

7. Establish connections to activate your solution.

If you create a new connection, you must select Refresh. You won't lose your
import progress.

8. Select Import. The import process can take 10 to 20 minutes to complete.

9. After the import completes, verify that all the flows are turned on, and share the
apps with the appropriate users.

(Optional) Provision the approvals solution


This step is optional.

In new environments, Power Automate approvals functionality isn't configured by


default. After Power Automate initiates an approval, the approvals solution gets created.

This process usually takes 5 to 10 minutes. You can do this step while the main solution
is being imported.

1. Sign in to Power Automate . This is where the main solution is being imported.

2. On the menu to the left, select My flows > New flow > Instant cloud flow.

3. Under the button trigger, add the Start and wait for approval action to the flow.

4. Fill in your details to trigger the flow.

Here's an example of how you can configure the Start and wait for approval
action.
To learn more about approvals provisioning, go to Power Automate Approvals
Provisioning Overview and Troubleshooting .

Assign security roles


Once the import completes, assign security roles to members of the organization.
The main solution comes with three security roles.

Review roles and assign roles based on responsibility.

Automation Project Admin: Maintains the configuration data in the automation kit
and maps automation projects to environments.

Automation Project Contributor: Generates or requests new automation projects.

Automation Project Viewer: The business process owner who approves or rejects
automation project requests.

To assign roles:

1. Sign in to the Power Platform admin center .

2. Select your main environment.


3. Under Security roles on the Access card, select See all.

4. Select the security role, and then add members to the security role.

Sync environments
Follow these steps to sync environments. The main solution has a cloud flow called Sync
Environments.

1. Select the Sync Environments flow.

2. Trigger it if it hasn’t run.

3. Wait for the run to complete.

Import the desktop flow actions CSV

Import all the desktop flow actions from the CSV file into the Desktop Flow Action table.

This must be done for all environments in which the automation kit is installed, such as
main and all satellites.

1. Sign in to Power Automate .

2. Go to your environment where the solution is installed.

3. Select the Solutions tab.

4. Find and then select the Automation COE Main solution.

5. Select the Desktop Flow Action table.

6. Expand the Import list near the top.

7. Select Import data from Excel.

8. After the popup opens, select Upload, and then upload the included Excel file
named autocoe_desktopflowactions.csv.

9. Wait for the mapping status to show as successful.

10. Select Import.

11. After the import completes, verify that the data was imported.
Set up the automation kit satellite
components
Article • 10/30/2023

This article includes the manual install process for the Satellite solution of the
automation kit. There's an automated command line installer that will also guide you
through this process.

Create a Microsoft Entra app registration to


connect to Dataverse Web API
Use the following steps to create an app registration that will be used by flows in the
satellite environment.

1. Sign in to Azure .

2. Go to Microsoft Entra > App registrations.

3. Select New Registration.

4. Enter a name (for example, Automation CoE Dataverse API), leave everything else,
and then select Register.

5. In the Overview tab, select Add an Application ID URI.

6. Select Set, leave the default, and then select Save.

Add a new client secret


Starting from the April 2023 release, a client secret isn't required.

) Important

Azure Key Vault is required only for automation kit satellite release March 2023 or
earlier. Starting with the April 2023 release, Azure Key Vault is no longer required as
a prerequisite.

1. Select Certificates & secrets > New client secret.


2. Enter description (for example, Auto CoE Dataverse), and then select appropriate
expiry value.

3. Select Add.

4. Copy the secret value that's generated.

This secret will be added to Azure Key Vault in a later step.

5. Go back to the Overview tab, and then copy down the following information.

Application (client) ID
Directory (tenant) ID

6. Go to your Azure Key Vault. This is where we'll store the values so that Power
Automate can use them to call the Dataverse Web API.

Create secrets for the client ID and tenant ID


you copied earlier
Starting from the April 2023 release, a client secret isn't required.

1. Inside the Secrets tab, select Generate/Import.

2. Use a descriptive name for each secret. Here are some examples:

KVS-AutomationCoE-ClientID
KVS-AutomationCoE-TenantID
KVS-AutomationCoE-Secret
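These secrets can also be created from the command line. The following is a minimal sketch, assuming the Azure CLI is installed and signed in; the vault name and the three values are placeholders that you'd replace with the IDs and secret copied in the earlier steps.

```shell
#!/bin/sh
# Create the three Key Vault secrets used by the satellite flows.
# VAULT_NAME and the three values are placeholders -- substitute your own.
VAULT_NAME="KV-Contoso-Prod"
CLIENT_ID="<application (client) ID>"
TENANT_ID="<directory (tenant) ID>"
CLIENT_SECRET="<client secret value>"

if command -v az >/dev/null 2>&1; then
  az keyvault secret set --vault-name "$VAULT_NAME" --name "KVS-AutomationCoE-ClientID" --value "$CLIENT_ID"
  az keyvault secret set --vault-name "$VAULT_NAME" --name "KVS-AutomationCoE-TenantID" --value "$TENANT_ID"
  az keyvault secret set --vault-name "$VAULT_NAME" --name "KVS-AutomationCoE-Secret"   --value "$CLIENT_SECRET"
else
  echo "Azure CLI (az) not found - dry run only"
fi
```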

Create a new environment or use an existing environment for your satellite
Microsoft recommends that you have the satellite solution imported inside your
production environment. If you create a new environment, follow the steps in Set up the
automation kit.

Import the Creator Kit
Enable the Power Apps component framework, and then import the Power Platform
Creator Kit.

1. Enable the Power Apps component framework.

a. Sign in to the Power Platform admin center .

b. Select an environment where you want to enable this feature. (You need to do
this for both the main environment and all satellite environments.)

c. Select Settings in the top pane.

d. Select Product > Features.

e. Turn on Allow publishing of canvas apps with code components.

f. Select Save.

2. Import the Power Platform Creator Kit.

a. Download the Power Platform Creator Kit .

b. Sign in to Power Automate .

c. Go to the environment you just created into which the main solution will be
imported. For this example, you're importing to the environment named
Contoso_Prod.

d. On the left pane, select Solutions > Import > Browse.

e. Select the Creator Kit solution named CreatorKitCore_x_x_x_x_managed.zip.

f. Select Import.

Wait for the Creator Kit to finish importing before continuing to next step.

Create an application user inside Dataverse
Create an application user inside Dataverse, per satellite environment.

1. Go to the Power Platform admin center .

2. Select the satellite environment, and then select Settings.

3. Select Users + permissions > Application Users > New app user.

4. Select Add an app.

5. Select the app registration that was created in previous steps.

Tip

If you're unsure, verify the AppID.

6. Select a business unit.

7. Add the System Administrator security role.

8. Select Create.

Import the satellite solution into the satellite environment
Follow these steps to import the satellite solution.

1. Sign in to Power Automate .

2. Select your designated environment for the satellite solution. For this example,
we're importing to the environment named Contoso_Prod.

3. Download the most recent release of the automation kit satellite managed solution
from the Assets section of the automation kit file.

4. On the left pane, select Solutions.

5. Select Import, and then Browse.

6. Select the Automation CoE satellite solution named
AutomationCoESatellite_x_x_x_x_managed.zip.

7. When the compressed (.zip) file loads, select Next.

8. Review the information, and then select Next.

9. Establish connections to activate your solution.

If you create a new connection, you must select Refresh. You won't lose your
import progress.

10. Configure the environment variables.

Import the desktop flow actions CSV
Import all the desktop flow actions from the CSV file into the Desktop Flow Action
table. This must be done for all environments in which the automation kit is installed,
such as main and all satellites.
1. Sign in to Power Automate .

2. Select the environment where the solution is installed.

3. Select Solutions.

4. Find and then select the Automation CoE main solution.

5. Select the Desktop Flow Action table.

6. Near the top, expand the Import list.

7. Select Import data from Excel.

8. After the popup opens, select Upload, and then upload the included CSV file
named autocoe_desktopflowactions.csv.

9. Wait for the mapping status to show as successful.

10. Select Import.

11. After the import completes, verify that the data was imported.

Configure automation kit environment variables
Article • 04/22/2023

In this article, you'll learn how to configure the automation kit environment variables. To
get the information for the environment variables, open a new tab.

1. Sign in to Power Apps .

2. In the Environments dropdown menu in the title bar, select your satellite environment.

Important

Azure Key Vault is required only for automation kit satellite release March
2023 or earlier. Starting with the April 2023 release, Azure Key Vault is no
longer required as a prerequisite. Environment variables for this are no longer
required.

3. On the top right side of the title bar, select Settings (the gear icon) > Developer
resources.

The information on this panel will be copied to the Key Vault secrets that follow.

Get the URL path for your Azure Key Vault secrets
The Azure Key Vault secrets use the Azure Key Vault secret environment variable type. These
environment variables must be in the following format.

Azure CLI

/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.KeyVault/vaults/{Key Vault Name}/secrets/{Secret Name}

To get the format, follow these steps:

1. Sign in to Azure .

2. Under Security, open your Key Vault with the secrets for your app registration.
3. Select the Secrets tab.

4. In the URL field, copy the URL.

5. Open Notepad and paste the URL.

6. Remove everything from https:// to /resource.

7. At the end of the URL, enter /{SecretName}.

8. Replace {secretname} with your secret name.

9. Repeat steps 6 through 8 for all three Azure Key Vault secrets (Client ID, Client
Secret, Tenant ID).

Use the following example as a guide. These reference strings are needed for three
environment variables.
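The trimming in steps 6 through 8 amounts to assembling an Azure resource ID for each secret. As a rough sketch (the subscription, resource group, and vault names below are placeholders, not values from this guide), the reference string can be built like this:

```python
def keyvault_secret_reference(subscription_id, resource_group, vault_name, secret_name):
    """Build the Azure Key Vault secret reference string expected by the
    Azure Key Vault secret environment variables."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.KeyVault/vaults/{vault_name}"
        f"/secrets/{secret_name}"
    )

# Placeholder values for illustration only.
ref = keyvault_secret_reference(
    "00000000-0000-0000-0000-000000000000",
    "rg-autocoe",
    "kv-autocoe",
    "KVS-AutomationCoE-ClientID",
)
print(ref)
```

Paste the resulting string (with your real subscription, resource group, vault, and secret names) into the corresponding environment variable.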

Environment variable summary
Use the information in the following table for the environment variables.

| Environment variable name | Description |
| --- | --- |
| AKV Client ID Secret | Azure Key Vault secret for the client ID (application ID) from the app registration: /subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.KeyVault/vaults/{Key Vault Name}/secrets/{Secret Name} |
| AKV Client Secret Secret | Azure Key Vault secret for the secret from the app registration |
| AKV Tenant ID Secret | Azure Key Vault secret for the tenant ID from the app registration |
| Automation CoE Alert Email Recipient | The email address where operational reports and alerts should be sent for this environment. (See Flow exception rules framework.) |
| Automation Project App ID | Enter the Automation Project app ID of the Power Apps app that's deployed with the main solution (main environment) |
| Desktop Flows Base URL | Follow the steps in Get the desktop flow base URL |
| Environment ID | Use Session details to find this value from the current environment that you're importing into (satellite) |
| Environment Name | Display name of the current environment (satellite) |
| Environment Region | Region of the satellite. Can be found in the Power Platform admin center |
| Environment Unique Name | Use Session details to find this value from the current environment that you're importing into (satellite). You must also add .crm to the end of the string. Example: unq08ed139e532b4edc8f38851fd1bb3279.crm. Note that the extension ('crm', 'crm[x]', and so on) is region dependent. See datacenter regions |
| Environment Unique Name of CoE Main | Use Session details to find this value from main. You must also add .crm to the end of the string. For example: unq08ed139e532b4edc8f38851fd1bb3279.crm. Note that the extension ('crm', 'crm[x]', and so on) is region dependent. See datacenter regions |
| Environment URL | Open a new tab and then sign in to the Power Platform admin center |
| Flow Session Trace Record Owner Id | The UserID (GUID) from the Users table inside the satellite environment for the admin account. Select Tables under Data on the left > User > Data. Change the view to "All columns". Find the Auto CoE Admin account, and then copy the value under User. |

Get the desktop flow base URL
Follow these steps to get the desktop flow base URL.

1. Sign in to Power Automate .

2. On the left navigation menu, select My flows.

3. On the address bar, copy the web address up to …environments/.

You can also get the environment ID from this URL.

4. On the Environment dropdown menu, select your environment.

5. Select the environment URL.

6. Select Copy link.
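The trimming in steps 3 and 4 can be sketched as follows (the address below is a made-up example, not a real environment): splitting the address at environments/ yields both the base URL and the environment ID.

```python
def split_flow_url(address):
    """Split a Power Automate address into the desktop flows base URL
    (everything up to and including 'environments/') and the environment ID."""
    marker = "environments/"
    head, _, tail = address.partition(marker)
    base_url = head + marker
    environment_id = tail.split("/")[0]
    return base_url, environment_id

# Hypothetical example address.
base, env_id = split_flow_url(
    "https://make.powerautomate.com/environments/1234abcd-0000-0000-0000-000000000000/flows"
)
print(base)    # https://make.powerautomate.com/environments/
print(env_id)  # 1234abcd-0000-0000-0000-000000000000
```
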

Post environment variable setup
After you've configured all the environment variables, you need to import them.

1. Select Import.

2. After the import completes, verify the variables for accuracy.

3. Turn on all the flows.

4. Share all apps with the administrators or administrator group of the satellite
environment.
Set up the Automation Kit security roles and permissions
Article • 12/01/2022

Assign security roles
Once the import is complete, assign the following roles, based on responsibility.

For each of the security roles that follow, execute these steps:

1. Sign in to the Power Platform admin center .

2. Select your satellite environment.
3. Select See all under Security roles on the Access card.
4. Select the Security role > Search for and assign the security roles.

Flow session exception admin
This is an admin role that provides full CRUD permissions to the flow session exception
data captured in the following tables:

Flow session exception rule configuration
Flow session exception

Desktop flow definition admin
The desktop flow definition admin role provides full CRUD permissions to the following
tables:

Desktop flow action
Desktop flow definition
Desktop flow DLP impact profile

Assign column security profiles
Inside the solution, there's a column security profile. This profile enables users to see the
script field inside the Desktop Flow Definition table. This role should only be assigned
to CoE Admins due to the sensitive information that may be visible.
The script field is synced and stored inside the Desktop Flow Definition table by default.
If you don't want to sync this information, follow the steps in Disable syncing of desktop
flows script. Otherwise, follow these steps to assign the security profile:

1. Select the column security profile, Desktop Flow Script Field Security.

2. Under Members, select Users.

3. Select Add.

4. Once the popup opens, search for the user to whom you want to assign this
profile.

5. Select the checkbox for the user.

6. Select Select.

The user should show in the Selected records list.

7. Select Add.
Disable syncing of desktop flows script (optional)
Article • 12/01/2022

By default, the Desktop flows definition script is stored inside the Desktop Flow
Definition table. This field could contain sensitive information.

Warning

System Administrators and higher can view the definition tables and fields by
default.

Follow these steps in the environment that contains the solution to disable sync.

1. Select the Solutions tab.
2. Select the Default Solution, and then select Environment Variables.
3. Find and then select Store Extracted Script.
4. Under Current value, select New value.
5. Change the value to No.
6. Select Save.

Note

You must stop and then start all flows in an environment after you make changes to
environment variables for the flows to get the new environment variable values.
See Limitations.

Important

The script field isn't synced back to main due to security. You can extend the
Automation Kit to configure your environment to sync data back to main, if
needed.
Frequently asked questions about the automation kit setup guidance
Article • 04/22/2023

This article provides answers to some of the most common questions about the
Automation Kit.

What are the datacenter region codes?
These values are crucial when configuring a satellite’s environment variables. Each
region has a different URL. The following is a list of regions and their URLs.

| Region | URL |
| --- | --- |
| NAM | crm.dynamics.com |
| DEU | crm.microsoftdynamics.de |
| SAM | crm2.dynamics.com |
| CAN | crm3.dynamics.com |
| EUR | crm4.dynamics.com |
| FRA | crm12.dynamics.com |
| APJ | crm5.dynamics.com |
| OCE | crm6.dynamics.com |
| JPN | crm7.dynamics.com |
| IND | crm8.dynamics.com |
| GCC | crm9.dynamics.com |
| GCC High | crm.microsoftdynamics.us |
| GBR | crm11.dynamics.com |
| ZAF | crm14.dynamics.com |
| UAE | crm15.dynamics.com |
| GER | crm16.dynamics.com |
| CHE | crm17.dynamics.com |
| CHN | crm.dynamics.cn |

For more information, go to Datacenter regions.
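When scripting environment variable values, the table above can be captured as a simple lookup. The snippet below is an illustrative sketch (the unq… unique name is a placeholder) showing how the region-dependent suffix from the URL feeds into the Environment Unique Name value described earlier:

```python
# Region code -> Dataverse URL, per the table above.
REGION_URLS = {
    "NAM": "crm.dynamics.com", "DEU": "crm.microsoftdynamics.de",
    "SAM": "crm2.dynamics.com", "CAN": "crm3.dynamics.com",
    "EUR": "crm4.dynamics.com", "FRA": "crm12.dynamics.com",
    "APJ": "crm5.dynamics.com", "OCE": "crm6.dynamics.com",
    "JPN": "crm7.dynamics.com", "IND": "crm8.dynamics.com",
    "GCC": "crm9.dynamics.com", "GCC High": "crm.microsoftdynamics.us",
    "GBR": "crm11.dynamics.com", "ZAF": "crm14.dynamics.com",
    "UAE": "crm15.dynamics.com", "GER": "crm16.dynamics.com",
    "CHE": "crm17.dynamics.com", "CHN": "crm.dynamics.cn",
}

def environment_unique_name(unique_name, region):
    """Append the region-dependent crm suffix (the first URL segment)
    to an environment's unique name, for example unq...crm6 for OCE."""
    suffix = REGION_URLS[region].split(".")[0]
    return f"{unique_name}.{suffix}"

print(environment_unique_name("unq08ed139e532b4edc8f38851fd1bb3279", "OCE"))
```

For example, an environment in the OCE region gets the .crm6 suffix, matching the troubleshooting guidance below.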

How can the RPA CLI be used to extend the Automation Kit?
For more information about how to use the RPA CLI, go to RPA CLI .

No organization matches the given dataset: unq0a5fac6XXXXXXXXXXXXX.crm
This issue might be caused by an incorrectly entered value for the Environment Unique
Name of CoE Main environment variable. If your environment is provisioned in Australia,
for instance, you need to add the Australian region suffix to the crm domain. For
example, enter crm6 instead of crm. After you fix this, you can turn on all cloud flows.

You can also review Environment variables aren't editable after you import a solution
and Environment variables continue to use the old values after a manual change to learn
more.

GetDataverseSolutionArtifacts.Run failed

Details

This error happens inside the Automation Solution Manager app (inside a satellite) when
you try to view the solution artifacts.

Answer
First check out these flows:

Get Dataverse Bearer Token (Azure KeyVault Env)
Get Dataverse Solution Artifacts

The error in the flow might be like the following screenshot.

There could be two main causes of this error:

The application user isn't created in the satellite environment. Create an
application user inside Dataverse (per satellite environment).

The satellite environment variables aren't configured properly. Configure
automation kit environment variables.
Configure Automation Kit
Article • 12/01/2022

Define configuration data for the Automation Kit
Use the Automation Kit admin account, go to the main environment, and then open the
Automation Center app. The Automation Center is where you configure several aspects of
how the Automation Kit functions.

There are a few things you need to configure when you first import the kit.

First, add an automation goal. Follow these steps to add an automation goal.

1. Sign in with your Automation CoE Admin account, and then go to the main
environment.

2. Open the Automation Center app.

You'll use the automation center app to configure how the Automation Kit
operates.

3. Inside the Automation Center, select the Corporate Goals tab.

4. Select New.

5. Fill in the required fields. The following table presents some sample data as a
guide.

| Field | Value |
| --- | --- |
| Goal Name | Cost Savings through Automation |
| Period From | 11/1/2021 |
| Period To | 11/1/2022 |
| Target Efficiency Gain % | 20 |
| Target Total Savings | 50000 |

6. Select Save & Close.

Next, we'll make further configurations. This configuration is split into the following
three sections:
1. General - Configuration that's related to resources and processes.
2. ROI calculation - Configuration that's used as a scoring metric to better determine
the estimated ROI and complexity of an automation project.
3. Console configuration - Information that pertains to the Automation Kit apps. This
configuration is used to build a console where users can launch the related apps.

Steps to configure general
Select "Automation Center" in the bottom navigation bar, and then switch to "Setup".

Environments

1. Select a Satellite Environment on the Environment tab.

2. Change Is Satellite Environment to Yes.

3. Select Save & Close.

4. Repeat the previous three steps for all of your Satellite environments (DEV, TEST,
PROD, and so on).
Departments
1. On the Department tab, select New.

2. Create as many departments as your organization needs. Here's a sample of the
departments that an organization may need.

Accounting
Enterprise Integration
Finance
Human Resources
Information Technology
Logistics
Operational Change Management
Purchasing

Process categories

1. On the Process Categories tab, select New.

2. Create top-level categories that will have child categories (subcategories). Here
are some sample names of categories that an organization may need.

Artificial Intelligence
Claims
Invoicing
Legacy System

Process subcategories
Define the subcategories that belong under each top-level category.

1. Select New Process Sub Category and define the subcategories as your
organization needs. Here are some samples you can use as a guide.

2. Artificial Intelligence

Forms Processing
Forms Validation

3. Claims

Return
Warranty

4. Invoicing

Internal
External
Other

5. Legacy System

Complex
No API
Other
Roles (app roles)
These roles don't give access to anything. They're used as lookups for the fallback
record, if needed.

1. Create the Automation CoE Admin account as the CoE Owner.

| Field | Value |
| --- | --- |
| Display Name | Anything (Auto CoE Owner) |
| Type | CoE Owner |
| User Principal Name | The email address for the user |

2. Create CoE Admins - At least one CoE Admin must be initialized to configure
the fallback.

| Field | Value |
| --- | --- |
| Display Name | Anything (Auto CoE Admin) |
| Type | CoE Admin |
| User Principal Name | The email of the user |

3. Create developer role – This role is optional and is used to sync maker information
back to main.

| Field | Value |
| --- | --- |
| Display Name | Anything (Miles Gibbs - Dev) |
| Type | Developer |
| User Principal Name | The email of the user |

Base configuration - fallback
This table should have only one record defined. This record is used if any of the values
that are needed to complete processing are null or not defined.

1. Define one fallback record. Use the following screenshot as a guide for your
fallback record.
Steps to configure ROI calculation
Inside the Setup page, you can find the ROI calculation configuration. These tables are
used to calculate the complexity of each Automation Project (scores). Some values are
also used for estimated ROI and savings.

Processing frequency scores
The following table displays a value and score for each record. You must use these
values. You can modify the scores.

| Value | Score |
| --- | --- |
| Daily | 12 |
| Hourly | 18 |
| Monthly | 3 |
| Quarterly | 1 |
| Weekly | 5 |
Average automation steps scores
The following table contains some examples, and it's fully customizable based on your
organization's needs and processes. The table is used to get the score for the average
automation steps value that the user enters when they request a new automation project.
You can modify this table per your needs.

| Range | Value From | Value To | Score |
| --- | --- | --- | --- |
| >= 1, < 5 | 1 | 5 | 1 |
| >= 250 | 250 | 999999999 | 10 |
| >= 5, < 250 | 5 | 250 | 5 |

Processing peaks scores
This table has a value and score for each record. Here are the values that should be
used. You can modify the scores to suit your needs.

| Value | Score |
| --- | --- |
| daily | 15 |
| hourly | 20 |
| monthly | 5 |
| quarterly | 2 |
| weekly | 10 |
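Taken together, the three tables above drive a simple lookup-based score. The sketch below mirrors them; the way the kit combines the individual scores into the final complexity score isn't specified here, so the total shown is a plain sum for illustration only:

```python
# Lookups taken directly from the tables above.
FREQUENCY_SCORES = {"Daily": 12, "Hourly": 18, "Monthly": 3, "Quarterly": 1, "Weekly": 5}
PEAK_SCORES = {"daily": 15, "hourly": 20, "monthly": 5, "quarterly": 2, "weekly": 10}

# (value_from, value_to, score) rows for average automation steps.
STEP_RANGES = [(1, 5, 1), (5, 250, 5), (250, 999_999_999, 10)]

def steps_score(average_steps):
    """Return the score for the range that contains average_steps."""
    for value_from, value_to, score in STEP_RANGES:
        if value_from <= average_steps < value_to:
            return score
    return 0

def illustrative_complexity(frequency, average_steps, peak):
    # Plain sum for illustration; the kit's own weighting may differ.
    return FREQUENCY_SCORES[frequency] + steps_score(average_steps) + PEAK_SCORES[peak]

print(illustrative_complexity("Daily", 42, "weekly"))  # 12 + 5 + 10 = 27
```
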

Configuration - console
The automation console app and the project approval automation use these tables.

| Name | App description | AppID / App Link |
| --- | --- | --- |
| Automation Project | Create and manage your automation projects | Follow these steps. |
| Automation Center | Manage and configure all aspects of your automation resources | Follow these steps. |
How to get canvas app URL/ID
Sign in to the maker portal, and then perform the following steps:

1. Select the Apps tab on the left navigation bar.

2. Select the ... (more commands) on the app.

3. Select Details.

4. Copy the AppID.

5. Copy the web link.

How to get model-driven app URL/ID

Sign in to the maker portal, and then perform the following steps:

1. Select the Apps tab on the left navigation bar.

2. Select the ... (more commands) on the app.

3. Select Details.
4. Select Properties.

5. Copy Unified Interface URL.

Important

You may need to switch to classic mode to see the properties pane.
Flow exception rules framework
Article • 02/09/2023

The flow exception rules framework is a combination of components in the satellite
solution.

This custom-built framework introduces new automation capabilities that are aimed at
automation Center of Excellence (CoE) or operations teams. It allows you to define
custom exception handling rules that are automatically applied to failed desktop flow
runs that meet certain threshold criteria.

Feature details
Desktop flow execution results, together with their statuses, are automatically stored in a
Dataverse table named process (flowsession is the internal name), which allows you to
build custom solutions that further process its data in automation. A common use case
is to automatically turn off a parent cloud flow if its child desktop flow reports three
consecutive errors with the same error code, such as WindowsIdentityIncorrect or
NoUnlockedActiveSessionForAttended.
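The threshold logic described above can be sketched as follows. This isn't the kit's actual implementation, just a minimal model of "N consecutive failures with the same error code" over flow session results (ordered newest first); the field names and sample data are assumptions for illustration:

```python
def should_turn_off(sessions, error_code, threshold):
    """Return True if the most recent `threshold` sessions all failed
    with the given error code (sessions ordered newest first)."""
    recent = sessions[:threshold]
    return (
        len(recent) == threshold
        and all(s["status"] == "Failed" and s["error_code"] == error_code
                for s in recent)
    )

runs = [  # newest first; illustrative data only
    {"status": "Failed", "error_code": "WindowsIdentityIncorrect"},
    {"status": "Failed", "error_code": "WindowsIdentityIncorrect"},
    {"status": "Failed", "error_code": "WindowsIdentityIncorrect"},
    {"status": "Succeeded", "error_code": None},
]
print(should_turn_off(runs, "WindowsIdentityIncorrect", 3))  # True
```
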

Components
These components are part of the automation satellite solution.

| Type | Name | Description |
| --- | --- | --- |
| Table | Flow Session Exception Rule Configuration | This table is used to define exception rules. |
| Table | Flow Session Exception | This table is used for flow exception logs and downstream analytics. |
| Cloud flow | Flow Session Exception Sync | This is the main processing flow that takes processing instructions from the rule configuration. |
| Security role | Flow Session Exception Admin | Provides full CRUD permissions to the flow session exception data captured in the above tables. |
| Environment variable | Automation CoE Alert Email Recipient | Defines the email address or distribution list where operational reports and alerts should be sent to. |

Configure flow exception rules framework
Important

These steps must be done by a user with the following roles.

Flow Session Exception Admin
Basic User
Environment Maker

1. Sign in to Power Automate.

2. Go to your satellite environment.

3. Select the Solutions tab.

4. Find and then select Automation CoE Satellite.

5. Select Tables, and then find these tables.

Flow Session Exception Rule Configuration
Flow Session Exception

Note

If you don't see the tables mentioned in the previous step, confirm that you have
the roles mentioned.

1. Select the Flow Session Exception Rule Configuration table.

2. Select the Data tab.

3. Select Add record.

4. Here's an example configuration rule. If you want to monitor multiple error codes,
you must create a record for each error code that you want to monitor.

This table contains the field names and the corresponding values.

| Field | Value |
| --- | --- |
| Rule Name | CantLogin |
| Exception Code | SessionExistsForTheUserWhenUnattended |
| Consecutive Exception Count | 2 |
| Turn Off Cloud Flow | Yes |
| Send Alert | Yes |
| Requires Acknowledgment | Yes |

Flow exceptions dashboard
To visualize and report on the configured flow exceptions, use the flow exceptions
dashboard.

When you launch the Power BI template, enter the URL for the environment you wish to
report on.
Note

The URL must be entered in the following format:
[organization].crm[N].dynamics.com. For example, contosotest.crm.dynamics.com.
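A quick way to check a URL against this format before pasting it into the template (a hypothetical helper, not part of the kit):

```python
import re

# [organization].crm[N].dynamics.com, where the number is optional
# (for example, crm.dynamics.com in NAM).
URL_PATTERN = re.compile(r"^[a-z0-9-]+\.crm\d*\.dynamics\.com$", re.IGNORECASE)

def is_valid_environment_url(url):
    """Return True if the URL matches the expected Dataverse host format."""
    return URL_PATTERN.fullmatch(url) is not None

print(is_valid_environment_url("contosotest.crm.dynamics.com"))   # True
print(is_valid_environment_url("https://contoso.crm4.dynamics.com"))  # False: enter the host only, without a scheme
```
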

Flow Exceptions tab
The Flow Exceptions tab allows you to filter by date, exception rule, cloud flow name,
desktop flow name, or host name.

Here's a summary of the information provided in this tab.

Status Overall: Overall count and percentage of failed vs. successful flows for the
environment.
Total Exceptions by Rule: Displays the total number of flow runs, categorized by each
exception rule set up through the flow exceptions framework.

Failure Rate by Type: Displays the total number of errors for the environment across all
flows, the number of successful runs, and the percent of frequency for each error
compared to all other errors.

Exceptions by Month: Displays the total number of exceptions triggered for each rule
according to the flow exception framework, categorized by month.

Flow Exception Detail: Displays the detailed information for the flow exceptions.

Flow Exceptions Tree
The Flow Exceptions (Tree) tab allows you to filter by date, exception rule, cloud flow
name, desktop flow name, or host name.

Here's a summary of the information provided in this tab.

Flow Exceptions Tree Diagram: Displays and allows you to drill into specific flows
organized into the following categories: cloud flow name, desktop flow name, error
code, error message, run mode, or host name.

Flow Exception Detail: Provides detailed information on the flow exceptions.

DLP impact analysis for desktop flows
Article • 12/01/2022

Feature details
The recently launched preview of DLP support for Power Automate for desktop actions
is a critical and highly requested governance feature addition, as most organizations
expect the same governance breadth and depth for RPA as they have for cloud-based
Power Automate flows.

This new DLP support allows even the most risk-averse organizations to enable Power
Automate for desktop. Power Automate for desktop enables citizen automation
developers to achieve unprecedented productivity gains and save thousands of hours by
automating highly repetitive and error-prone tasks, leading to higher employee and
customer satisfaction.

Administrators and CoE teams can define which action groups and individual actions
can be used as part of desktop flows created with Power Automate for desktop. In the
case of policy violations (for example, VBScript isn't allowed, but it's used in a desktop
flow), the platform notifies the maker that the action is disabled by the policy and
prevents the flow from being saved. However, it’s important to note that bots that are
already developed and deployed might also be affected by policy changes, potentially
causing production bots to stop without prior notice.

Components

| Component | Description |
| --- | --- |
| Canvas app | DLP Impact Analysis for Power Automate for desktop |
| Cloud flows | Remove Deleted Action from DLP Profile; Sync Flow Definition |
| Column security profiles | Desktop Flow Script Field Security |
| Custom API | Desktop Flow Definition Analysis |
| Custom API request parameters | Desktop Flow Definition ID; Store Extracted Desktop Flow Script |
| Environment variables | Desktop Flows Base URL; Environment ID; Environment URL; Store Extracted Script |
| Plug-in assemblies | AutoCoE.Extensibility.Plugins |
| Security roles | Desktop Flow Definition Admin |
| Tables | Desktop Flow Action; Desktop Flow Definition; Desktop Flow DLP Impact Profile |
| Power BI | Advanced Power Automate for desktop DLP Impact Analysis |

How to use DLP impact analysis
The canvas app is a single screen app that's used for basic filtering and visibility into the
actions used in desktop flows. With this app, we can easily see which desktop flows will
be impacted if we decide to disable specific modules or actions within the Data loss
prevention (DLP) policies (preview).

You can access this app in the main or any satellite environment.

Some of the fields in the filter include the following:

Module: This is the module to which the action belongs (for example, the scripting
module).
Action: The individual actions that are under Module > Scripting. These include
Run DOS command, Run JavaScript, Run Python script, and more.

Scenario – Analyze impact of the scripting module

Note

The Power Automate for desktop DLP impact analysis app is in both main and
satellite environments. The satellite versions only display desktop flows that are in
the current environment. Open the app from main to get an overview of all the
satellite environments.

Note
The script field in the Desktop Flow Definitions table doesn't sync to main from
satellite environments.

Use the filter pane on the left to filter by Scripting module, to see which desktop flows
would be impacted.

All scripting actions

The following screenshot displays all scripting actions.

Only Python actions

The following screenshot displays only the Python actions.

Tip
Select the desktop flow name to go directly to the flow.

Advanced Power Automate for desktop DLP Impact Analysis Dashboard
You can visualize the data by using the Automation Kit Power BI dashboard.
Use the Automation Kit
Article • 12/01/2022

This article details how to use each component in the Automation Kit:

Automation Console app
Automation Project app
Automation Center app
Automation Solution Manager app
Automation Kit Power BI dashboard

Automation Console app
Functionality
The Automation Console app is used to launch Automation Kit apps. You must update
the information for the apps manually. The setup process covers configuring the
Automation Console.

Features
The automation console is a console-like dashboard of all apps to enable you to launch
any of them from one place.

Automation Project app
The purpose of the Automation Project app is to request and approve new automation
projects. The approver receives a deep link to the project screen to view all automation
project details.

Employees can submit an idea for an automation project.

The project submitter enters data to enable the solution to calculate:

The complexity score
The money saved

The designated business owner must approve the automation project before
development begins.
The Power BI dashboard contains a scatter plot of all saved or submitted automation
projects, which is useful to decide which automation projects are good candidates to
develop.

Project dashboard (home screen)

| Role | What you see |
| --- | --- |
| Project Admin | Sees all automation project requests. |
| Project Contributor | Sees only automation project requests that you created. |
| Project Viewer | Sees all automation projects in view mode. |

Main screen
On this screen, you can perform the following tasks.

Create a new automation project.
Edit an existing automation project.
View project details.

1. Project information section: Fill out the information as it relates to the automation
project.
2. Business owner field: This is the approver for the request. If no business owner is
selected, the fallback is used.
3. ROI information section: Provide this information as it relates to ROI for the
automation project.
4. Command bar: Use for new, save, edit, and submit tasks on the form. This submit
button is available after the automation project has been saved.

Some fields are required to save the form. This is because when you submit a request, a
flow (Calculate ROI saving potential for automation project) runs. This flow calculates
the complexity score and then populates this information when you select the save
button.

Automation Center app
CoE admins use the Automation Center app to maintain the configuration and map
automation projects to environments. You can also access flow sessions and metered
artifacts in the automation center app.

Learn more about how to create and maintain the configuration data in setting up the
Automation Kit.

Map automation projects to environments
CoE admins will map automation projects to environments after the request is
approved.

1. Select the Automation Projects tab.

2. Select the record you want to map.

3. Select the Related tab > Environments.

4. Select Add Existing Environments.

5. Select the environment you want to use, or create a new one.

6. Select Add > Save & Close.

Automation Solution Manager app
System Administrators (Sys Admins) use the Automation Solution Manager app to
enable the metering of solutions and their artifacts.

After a solution is created in or imported into the satellite environment, a CoE admin
maps the solution to an automation project.

Data syncs from the satellite environment to the main environment using real-time
trigger flows inside the satellite. Only solutions that have been mapped (using the
Automation Solution Manager app) sync data back to the main environment.

Turn on metering for a solution
After you create a solution in an environment, it appears in the list. The + icon displays
when metering is turned off for a solution.
1. Select the "+" on the solution you want to meter.

2. In the new screen, select your automation project from the list. If it's not listed,
select the Refresh button until it appears.

3. Select Submit, and then select Yes on the confirmation screen that appears.

Rename a cloud flow to match naming convention
Follow these steps to rename a cloud flow to match the naming convention from the
Automation Solution Manager app home screen.

1. Open one of the solutions by selecting the solution name.

2. Select the cloud flow that you want to rename.

3. Select Rename flow (only available for unmanaged solutions).

4. Select Save.

The naming convention should apply automatically.

Note

The last 3 digits will default to 001. If you have multiple solutions for a single project,
you could increase that number by 1 for each additional solution if you wish to
distinguish them. Read more about the naming convention.

Meter solution artifacts


This section describes how to meter the artifacts for flow sessions so that telemetry
shows up in your main environment.

We can get to the solution screen from the home screen by selecting the name of the
mapped solution.

Once selected, the + icon switches to a meter. Now, data flows to main (flow sessions). If
this option is grayed out, the cloud flow doesn't follow the naming schema.
[Rename the cloud flow](./use-automation-kit.md#rename-a-cloud-flow-to-match-naming-convention), if needed.

Bypass Flow naming convention


To bypass the naming convention, select the Disable flow naming convention
checkbox, and acknowledge the warning.

Now, you can meter the flow.

Automation Kit Power BI dashboard


You use the Automation Kit's Power BI dashboard to monitor your automation projects
in production.

Main dashboard
The main Power BI dashboard has the following sections:

Home: This screen provides an overview of key KPIs for the Automation Kit.

Project Backlog: Provides details of the ideas and projects submitted, status, and
ranks based on estimated savings and complexity.
Business KPI: Displays business details for savings realized, efficiency, hours saved,
and other business metrics.

Goals: Outlines savings and efficiency goals for the organization and the status by
department and project.
ROI: Overall ROI for the projects implemented.

ROI Financials: Displays actual ROI compared to estimate by year, quarter, and
month.
Solutions: Provides an overview of solutions in production, hours saved, error
rates, and bot success KPIs.

Machines: Displays detailed information on machine utilization and activity.


Flow Overview: Summarizes the number of flows created, runs, status, and top 10
makers, machines, and flows.

Flow Run Detail: Detailed information on flow runs, durations, status, run modes,
hosts, and errors.
Run Performance: Shows a graphical display of run performance day over day.

Control Chart: Displays an overview of flow processing time averages and
operational performance within control points.
Flow Exceptions: Enables you to filter by date, exception rule, cloud flow name,
desktop flow name, or host name.

Flow Exceptions Tree Diagram: Displays and allows you to drill into specific flows
organized by categories, including cloud flow name, desktop flow name, error
code, error message, run mode, or host name.
Action Usage Analysis: Includes functionality similar to the Automation Kit DLP
impact analysis Power App but with added Power BI filters.

Action Decomposition Tree Analysis: Tree-like diagram that shows how each
action module or flow relates.
ROI Calculations: Contains examples on how calculations related to ROI and
efficiency are determined throughout the dashboards.

Detail pages
Every detail page consists of the following items:

Filters (Department, Project, Solution)


High-level statistics
Useful visuals
Overview of the Control Center
Article • 08/08/2023

The Automation Kit control center complements the existing Monitor desktop flow runs
experience. The key focus of the control center is an orchestrator view that lets support
analysts and organizations monitor, take action, and alert if necessary.

The control center is an optional component that can be installed independently of, or
alongside, the Automation Kit Main or Satellite solutions.

Key features
Key features of the Automation Kit control center:

Scheduler - Provides a schedule of recurring desktop flows called from a cloud
flow.
Monitoring - Filter by cloud flow to view and monitor the status and run data.
Audit - Query and filter Dataverse audit log data for configured tables.
Voluminous Data - Provides the optional ability to integrate with a data warehouse
to provide data analytics.

Power BI comparison
Compared to the Daily Flow Run Power BI template and the Automation Kit Power BI
dashboard, the control center is focused on interactive monitoring. The page provides
contextual actions to act on the most recent transactional data. The Power BI reports are
more focused on analytical use cases, where the most recent data isn't always needed,
or on sharing with different organizational stakeholders.

Roadmap
The following items are being considered for the control center:

Alert notification - The ability to configure notifications for schedules and
monitored flows
Monitoring - Provide actions to act on desktop flow data. For example, run a cloud
or desktop flow
Automated Test - Execute and monitor automated test results
Environment Selection - The ability to select the environment so that the same
application can be used across multiple environments
Work Queue integration - The ability to monitor work queue metrics and schedules
Control Center desktop flow audit logs
Article • 08/08/2023

The Control Center desktop flow audit logs feature allows you to retrieve audit data for
Power Automate desktop flows. You can configure audit logs for specific Microsoft
Dataverse tables. Doing this provides valuable insights into activities related to Power
Automate desktop flows.

Use audit log monitoring to observe changes to the following Dataverse tables: Flow
Session, Flow Machine, Flow Machine Group, Work Queue, Work Queue Item, Desktop
Flow Module, Desktop Flow Binary, Flow Machine Image, Flow Machine Image Version,
and Flow Machine Network.

Prerequisites
Before using Control Center Desktop Flow Audit Logs, make sure to complete the
following prerequisites:

Configure auditing for an environment: To set up auditing, follow the instructions
in the Microsoft documentation.

Configure Auditing for Tables and Columns in Power Apps: To enable auditing for
specific tables and columns in Power Apps, refer to the Microsoft documentation.

Add Users to Desktop Flows Support Analyst Role: The Control Center installs a
new role called Desktop Flows Support Analyst. Users need to be added to this
custom security role to get access to the provision of new Audit logs.
Default audit logs data
By default, the Audit Logs page loads data for the last week. However, you have the
flexibility to filter logs based on specific dates and view all changed attributes.
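If you need the same audit data outside the Control Center app, it can also be read through the Dataverse Web API. The sketch below only builds the query URL; the `audits` entity set and the `objecttypecode`/`createdon` columns are the standard Dataverse names, but treat the exact table code values (for example, `flowsession`) and the environment URL as assumptions to verify against your environment's metadata.

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def build_audit_query(env_url: str, table: str, days: int = 7) -> str:
    """Build a Dataverse Web API URL that lists audit rows for one table.

    Sketch only: 'audits' is the standard Dataverse entity set for the
    audit table; verify the objecttypecode value (for example,
    'flowsession') against your environment's metadata.
    """
    since = (datetime.now(timezone.utc) - timedelta(days=days)).strftime(
        "%Y-%m-%dT%H:%M:%SZ"
    )
    query = urlencode({
        "$filter": f"objecttypecode eq '{table}' and createdon ge {since}",
        "$orderby": "createdon desc",
    })
    return f"{env_url}/api/data/v9.2/audits?{query}"

# Default one-week window, matching the page's default filter.
url = build_audit_query("https://yourorg.crm.dynamics.com", "flowsession")
```

Send the resulting URL with a bearer token in the `Authorization` header; the `value` array of the JSON response contains the audit records.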

Learn more
To learn more about configuring auditing and utilizing Control Center desktop flow
audit logs, check out the following Microsoft documentation:

Configure auditing for an environment


Configure auditing for one or more tables and columns in Power Apps

With Control Center desktop flow audit logs, you can gain valuable insights into your
Power Automate desktop flows' activities and ensure better traceability and compliance.
Control Center desktop flow monitoring
Article • 08/08/2023

In the Automation Kit, the Control Center offers comprehensive desktop flow
monitoring capabilities, allowing users to check the health of all running processes. This
monitoring feature provides users with the option to view the overall status of processes
at a general level and the ability to drill down to individual process details.

Security
To see desktop flows monitoring data, the following must be set up:

1. The cloud flow and desktop flow must be part of a solution.
2. The user must be an owner or have had the flow shared with them.
3. The user must belong to a security role, like System Customizer, to see all flows in
the environment.

Access desktop flow monitoring


To access the desktop flow monitoring features, follow these steps:

1. In the Control Center interface, select the Monitoring tab.

2. Select Desktop Flows to access the monitoring dashboard specifically for desktop
flow processes.
Detailed run log and transactional data
The monitoring dashboard provides detailed run logs and transactional data for all
desktop flow processes. Users can visualize the performance and status of each process,
including the number of successful and failed executions.
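As a rough illustration of that rollup, the sketch below counts runs by status. The row shape (one `statuscode` label per run) is an assumption for illustration, not the exact Dataverse schema.

```python
from collections import Counter

def summarize_runs(flow_sessions: list) -> Counter:
    """Count desktop flow runs by status label.

    `flow_sessions` is assumed to be rows already fetched from the
    environment; the 'statuscode' field name is illustrative.
    """
    return Counter(row.get("statuscode", "Unknown") for row in flow_sessions)

sample = [
    {"statuscode": "Succeeded"},
    {"statuscode": "Failed"},
    {"statuscode": "Succeeded"},
]
summary = summarize_runs(sample)  # 2 Succeeded, 1 Failed
```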

Default data loaded for the last day


By default, the monitoring dashboard loads data for the last day. Users can easily
analyze the recent performance of their desktop flow processes without the need to
manually adjust the date range.

Auto refresh with 60-second cadence


The monitoring dashboard supports an autorefresh feature with a cadence of 60
seconds. This feature allows users to stay up-to-date with real-time data, ensuring they
have the latest insights into the performance of their desktop flow processes.

Visualize and filter at individual process level


With the Control Center's desktop flow monitoring, users can easily visualize and filter
data at the individual process level. This capability enables users to focus on specific
processes and gain in-depth insights into their performance.

Overall, with the desktop flow monitoring feature, users can efficiently manage and
optimize their desktop flow processes, ensuring smooth and reliable automation within
their organization.
Control Center desktop flow scheduler
Article • 08/08/2023

In the Automation Kit, the Control Center offers comprehensive desktop flow scheduling
capabilities, allowing users to check the Power Automate desktop flows that are
scheduled for execution by a recurring Power Automate cloud flow. This scheduling
feature provides users with the option to view the overall status of scheduled flows and
restart either the cloud flow or the desktop flow.

Features
The scheduler page of the control center provides the following functionality:

View the schedule of recurring Power Automate Cloud flows contained within a
solution that call a Power Automate Desktop flow.
A schedule view by day
View the status of scheduled flows yet to start, succeeded or failed
Filter by machine or cloud flow status
Calendar view by Month, Week or Day
Run the desktop or cloud flow now
Open the Power Automate portal desktop flow run monitoring page for deeper
analysis

Cloud flows
As noted, only cloud flows that are included as part of a solution are displayed. The
blog post More manageable cloud flows with Dataverse solutions by default
(https://powerautomate.microsoft.com/blog/more-manageable-cloud-flows-with-dataverse-solutions-by-default/)
includes information on how to use the preview of "Dataverse solutions by default" to
help ensure that cloud flows are included in solutions. Using this feature can assist users
in ensuring that the scheduled cloud flows they create are visible in the scheduler.

Calendar views

Day, week, and month views


The day, week, and month views display information on desktop and cloud flow runs
that are color-coded as follows:

Green indicates successful run

Red indicates failed run

Blue indicates a scheduled future run.

The status and run information is available with a long touch, or by hovering the mouse
over the event.
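The color rule above can be expressed as a small lookup. The status labels here are illustrative assumptions; the Control Center derives them from the underlying run records.

```python
def event_color(status: str, is_future: bool = False) -> str:
    """Map a calendar event to the color coding described above.

    Assumption: status labels ('Succeeded', 'Failed') are illustrative.
    """
    if is_future:
        return "blue"   # scheduled future run
    return "green" if status == "Succeeded" else "red"
```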

Schedule
The schedule view includes the set of cloud flows from the current time onward, and
future scheduled flows over the next days.

Run Now
The current version of Run Now executes the selected Power Automate desktop flow.
It's assumed that there are no parameters required to execute the desktop flow.
Open Grid View

Users can navigate to the desktop flow runs page in the Power Automate portal from
the Control Center home page.

Security
To see scheduled desktop flows, the following must be set up:

1. The cloud flow and desktop flow must be part of a solution.
2. The user must be an owner or have had the flow shared with them.
3. The user must belong to a security role, like System Customizer, to see all flows in
the environment.

Error messages
Possible error messages that could occur when executing run flow.

Error message: "InvalidArgument - Cannot find a valid connection associated with the
provided connection reference."

Description
This error message typically indicates that there's an issue with the connection reference
provided in the code or configuration. The system can't locate a valid connection
associated with the reference, which prevents it from executing the requested action.

Causes
There are several potential causes for this error message, including:

Incorrect or invalid connection reference: The provided connection reference may
be invalid or incorrect, which can cause the system to fail to locate a valid
connection associated with it.

Connection deleted or changed: If the connection used in the code or
configuration has been deleted or modified, it can cause the system to fail to
locate a valid connection associated with the reference.

Permissions issue: The user account executing the code or configuration may not
have the necessary permissions to access the connection or the resources
associated with it.

Resolution
To resolve this issue, you can take the following steps:

Verify the connection reference: Check the connection reference provided in the
code or configuration and ensure that it's valid and correct.

Delete existing connections and recreate: When the Flow Checker warns that a
connection reference hasn't been used, you can use the flow checker to delete
existing connections. Once the connections are deleted, you can recreate
connection references to the Machine or Machine group to enable the flow to be
run.

Notes
For the current release, the following notes apply:

1. Only Power Automate desktop flows and cloud flows contained within a solution
are displayed.
2. At least one Power Automate desktop flow has been registered and executed.
Automation Kit Control Center -
Historical Voluminous Data Monitoring
Article • 08/08/2023

The Automation Kit Control Center offers two powerful options for historical data
monitoring: "Historical Analytics Dashboard using Data Warehouse Approach" and "TDS
Desktop Flows Monitoring." These options are suitable when monitoring Desktop flows
at scale for large volumes of transactional logs.

Historical Analytics Dashboard using Data Warehouse Approach

When you need to monitor Desktop flows at scale and deal with voluminous historical
data, the "Historical Analytics Dashboard" is an excellent choice. This option requires a
set of prerequisites to be completed:

Synapse Link Setup: Configure the Azure Synapse Link to connect your Data
Warehouse with the Control Center.

Storage Compression using Delta Lake: Choose Delta Lake as the storage
compression option for efficient data management.

Azure Resource Configuration with ARM Deployment: Automate the setup and
deployment of all Azure resources required for the monitoring.

Historical Analytics Dashboard


To further enhance the monitoring capabilities, the Power BI dashboard of the
BYODL_FlowMonitoring_MMYYYY report should be published with an SQL endpoint
and configured with a refresh schedule.

Customized SQL View Script


The wizard provides a customized SQL view script, which, when run in the Synapse SQL
DB, enhances performance and reduces the burden of query processing and cost. This
optimization is crucial when dealing with vast amounts of historical data.

Azure Synapse SQL Endpoint


With the power of Azure Synapse Link and the efficient BI reporting, the Automation Kit
Control Center empowers you to make data-driven decisions and monitor your Desktop
flows with ease and accuracy.

Power BI Dashboard
The setup wizard simplifies the installation of Azure resources, configuration of Azure
Synapse Link, and customizes the SQL view script to optimize performance and reduce
query processing costs.

TDS Desktop Flows Monitoring


The "TDS Desktop Flows Monitoring" option enables you to set up the
FlowMonitoring_MMYYYY Power BI report to connect with your environment URL.

However, before using this option, ensure that TDS (Tabular Data Stream) is enabled at
the environment level.

The setup wizard guides you through the process, making it easier to set up the BI
report, configure the Synapse SQL endpoint, and connect with your environment.

By using the "Historical Analytics Dashboard using Data Warehouse Approach" or "TDS
Desktop Flows Monitoring" options in the Control Center Setup Wizard, you can
effectively monitor your Desktop flows, handle large volumes of historical data, and gain
valuable insights for better decision-making and process optimization.
Project end-to-end scenario
Article • 12/01/2022

The following test cases are full end-to-end processes.

1. Create an automation project request.


2. The request is approved or rejected. Approvals are sent to the business process
owner or fallback if no business process owner is selected.
3. Project admin maps the automation project to a new or existing environment.
4. The maker creates a solution and develops the automation project.
5. When the solution is deployed to prod (manually until ALM Automation actions
are available) the automation admin maps the solution to project by metering the
solution and its artifacts.
6. After metering is turned on for the solution, the flow / process can be triggered,
and data will sync back to main in near real time.
7. Verify in the main solution that the flow sessions are being synced.
8. Use Power BI to verify that the data calculations are correct.

User roles definition


Refer to the following table to see what roles and permissions are needed for each step.

| Name | Security roles |
| --- | --- |
| Maker | Automation project contributor, basic user, environment maker |
| Approver/business owner | Automation project viewer |
| CoE admin | Power Platform admin (or system admin for all environments used) |

Order materials and services (example 1)

Request automation project (maker)


1. Sign in to Power Automate and then change to your main environment.

2. Launch the Automation Project app from either of the following apps.

Automation Console app


Automation Project app
3. Create a new automation project request by selecting the + or selecting the New
Project tab.

4. Fill in the details as the following table indicates.

| Question | Value |
| --- | --- |
| Project Name | Order materials and services |
| Process Challenges | Complex, No API |
| Department | ANY |
| Improvement Driver | ANY |
| Processing Time | 85 |
| Process Category | ANY |
| Processing Frequency | Daily |
| Processing Peaks | Monthly |
| Process Sub Category | ANY |
| Volume Per Process | 5 |
| Business Owner | Business Approver or leave blank for Fallback |

ROI

| Question | Value |
| --- | --- |
| Automation Steps | 14 |
| Number Of FTEs Needed | 2 |
| Rework Time in Minutes | 35 |
| AVG Error Rate % | 15 |
| Hourly Cost Per FTE | 50 |
| Working Hours Per Day | 8 |
| Working Days Per Year | 200 |
| FTEs Needed for Rework | 1 |
| Automation Goal | ANY |
| Maintenance Overhead % | 10 |
| Development Costs | 1200 |
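The kit calculates the ROI potential from these inputs inside Dataverse and Power BI; the exact formula isn't reproduced in this article. The following is only an illustrative sketch of how an annual-savings figure could be combined from the sample values above.

```python
def estimated_annual_savings(
    processing_minutes: float,        # Processing Time per run
    volume_per_process: int,          # Volume Per Process
    working_days_per_year: int,       # Working Days Per Year
    hourly_cost_per_fte: float,       # Hourly Cost Per FTE
    maintenance_overhead_pct: float,  # Maintenance Overhead %
    development_costs: float,         # Development Costs
) -> float:
    """Illustrative only: not the kit's exact ROI formula."""
    manual_hours = processing_minutes * volume_per_process * working_days_per_year / 60
    gross_savings = manual_hours * hourly_cost_per_fte
    net = gross_savings * (1 - maintenance_overhead_pct / 100) - development_costs
    return round(net, 2)

# Sample values from the table above.
savings = estimated_annual_savings(85, 5, 200, 50, 10, 1200)
```

The Excel ROI calculator that ships with the kit remains the reference for validating the actual numbers, as described in the verification steps at the end of this scenario.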


1. Select Save. After the save operation completes, the dashboard displays a list of
flow runs and then calculates the ROI potential and complexity score, based on the
information provided.

2. Select the Refresh button within the app until the ROI and score show up. These
should show up in a few seconds.

3. Now, select the project record for approval, and then select the Submit Project
button.

Now that the request has been submitted, it can be approved or rejected by the
business owner, or if none was provided, the fallback approver is used.

If this is the first time an approval is being used within this environment, it takes around
5 minutes for the approval solutions to initialize. This is a one-time event, and you can
avoid it by following the setup instructions.

Approve automation project request (approver)


1. Sign in as an approver in the main environment.

2. Open the approval from one of the following locations.

Microsoft Teams (preferred)


Power Automate actions tab
Outlook email

3. Approve the request.

Map the automation project to an environment (dev)


1. Sign in as a CoE admin in the main environment.

2. Open the automation center app.

3. Select the automation project that was just approved.

4. Select the Related tab > Environments.

5. Map the record to an environment (dev).

) Important

Perform step six only after the automation project is deployed to the test
environment.
6. Map the automation project to the test environment.

) Important

Perform step seven only after the automation project has been deployed to the
production environment.

7. Map the automation project to the production environment.

Create / Export solution (DEV/TEST)


1. Sign in as a maker into the satellite environment.

2. Go to the DEV environment and create a solution.

3. If the department publisher does not exist, create one for the department to use.

4. Create a desktop flow.

5. Create a cloud flow that will trigger the desktop flow. Name the cloud flow using
the following naming convention: [CloudFlowName]_[AutomationProjectNumber]_
[3digits].

| Part | Description |
| --- | --- |
| CloudFlowName | Your meaningful name for your process |
| AutomationProjectNumber | Displayed in the automation project app |
| 3digits | The last 3 digits can be used for advanced use cases and are typically 001 if you only have one solution per automation project. If you have multiple solutions for a single project, you could increase that number by 1 for each additional solution to distinguish them. |

7 Note

The renaming process can also be done easier inside the Automation
Solution Manager app.

The cloud flow should trigger the desktop flow based on the frequency defined in the
automation request.

1. Test by running the cloud or desktop flows.


2. Deploy to test (manual).

Automation project gets mapped to the test environment


1. Maker exports manually and deploys to test.
2. Automation project gets mapped to the TEST environment manually by the CoE
admin.
3. Maker does basic functional testing.

Automation project gets mapped to PROD app


Perform the following steps as a CoE admin in the satellite environment (PROD).

CoE admin maps the solution to automation project by metering the solution and its
artifacts.

1. Open the Automation Solution Manager app.


2. Select the solution for your automation project and meter it by selecting the + icon.
3. Once metered, meter the artifact (the trigger cloud flow).
4. Go to the metered artifact (cloud flow), and then trigger it. (This will sync a flow
session to main).
5. Wait for run to complete.

Verify data sync to main


Perform the following steps as a CoE admin in the main environment.

1. Open the Automation Center app.


2. Open the Flow Sessions tab.
3. Filter on newest complete time if needed, and then verify that the run we triggered
is there.

Next, we can validate the ROI calculations using Power BI and the Excel ROI calculator.

1. Take the same information you entered into the automation project app and enter
it into the Excel ROI calculator.
2. Compare Power BI with the results from the Excel ROI calculator.
Limitations and resolutions for issues in
the Automation Kit
Article • 04/22/2023

This article contains some of the limitations in the Automation Kit.

Limitations and resolutions

Environment variables aren't editable after you import a


solution
Issue
You can't update the values for environment variables from within the solution
because the solution is Managed.

Resolution
Use the following steps to update environment variables.

1. Go to Power Automate .
2. On the left pane, select Solutions.
3. Select the Default Solution and change the filter to show Environment Variables.
4. Select a variable that you want to update, and then configure its Current Value.

Environment variables continue to use the old values


after a manual change
Issue
When someone changes environment variable values directly within an environment,
instead of through an ALM operation like solution import, flows continue to use the
previous value.

Resolution
For canvas apps, the new value will be used during the next session (for example,
closing the app and then playing it again).

7 Note

You must deactivate and then reactivate cloud flows for them to use the updated
value.
Can't meter non-solution aware flows
Issue
At this time, the current solution can't meter any flows that aren't inside a solution.

Resolution
Put all flows that need to be metered inside a solution.

Cloud flows don't support metering


Issue
Cloud flows must follow a specific naming convention before they're used for metering.

Resolution
All solution-aware cloud flows that you want to be metered must follow a new naming
convention that is internally validated using RegEx (AP-[0-9]{9}_[0-9]{3}\b). Here's
the expected format.

[CloudFlowName]_AP-[9digits]_[3digits]

For example: MostLikleyTheBest-CLoUdfLoW_AP-000001013_001. You can use the
last 3 digits for advanced scenarios; they're typically 001 if you only have one solution
per automation project. If you have multiple solutions for a single project, you could
increase that number by 1 for each additional solution if you wish to distinguish them.
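If you want to pre-check names before metering, the convention can be tested locally. This sketch assumes the underscore-separated form shown in the worked example above (nine digits, underscore, three digits); confirm against the kit's own validation before relying on it.

```python
import re

# Assumption: pattern follows the worked example above
# (..._AP-<9 digits>_<3 digits>); confirm against the kit's validation.
METER_NAME = re.compile(r"AP-[0-9]{9}_[0-9]{3}\b")

def can_be_metered(cloud_flow_name: str) -> bool:
    """Return True when a cloud flow name matches the metering convention."""
    return METER_NAME.search(cloud_flow_name) is not None
```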

 Tip

You can select the disable naming convention button inside the solution manager
app to bypass this requirement.

Unexpected behaviors with the flow exception framework


Issue
Unable to disable or suspend desktop flows.

Resolution
Turn off parent cloud flows if they're configured to allow them to be turned off.

Issue
Flow runs remain in the waiting stage.
Resolution
If you configure your flows to require acknowledgements when they fail, all flow runs
remain in a waiting stage until either you acknowledge the email or the flow times out.

Desktop flows impact analysis sync limitation (No data in app)

Issue
There's a limitation where only new or modified desktop flows will be analyzed and
shown in the app.

Resolution
Do any of the following tasks:

Modify the desktop flow.


Import the desktop flow (solution).
Create a new desktop flow.

7 Note

You can use a solution to extend the Automation Kit to sync all desktop flows in an
environment with the RPA CLI .

See also
RPA CLI for the Automation Kit
Released versions for the Automation
Kit
Article • 12/01/2022

The Automation Kit is published in GitHub. Feature requests and bug fixes are worked
on in Issues .

View the Latest Releases to learn what bugs have been fixed and what new features
have been introduced.
Feedback and support
Article • 01/11/2023

Although the underlying features and components used to build the Automation Kit
(such as Microsoft Dataverse, admin APIs, and connectors) are fully supported, the kit
itself represents sample implementations of these features. Our customers and
community can use and customize these features to support an Automation CoE in their
organizations.

If you face issues with:

Using the kit


Report your issue at aka.ms/automation-kit-issues . (Microsoft Support isn't
able to assist with issues related to the Automation Kit, but they're able to help
with related, underlying platform and feature issues.)

The core features in Microsoft Power Platform, Power Automate, or Power Apps
Use your standard channel to contact Microsoft Support.
Automation adoption best practices
overview
Article • 10/07/2022

Microsoft Power Platform adoption best practices for Automation CoEs provide proven
guidance for establishing and scaling an Automation Center of Excellence in your
organization. It consists of best practices, documentation, and tools.

By using these best practices, organizations can better align their business and technical
strategies to ensure success. Automation CoE members, RPA developers, Cloud
architects, IT professionals, and business decision makers use this information to achieve
their automation adoption goals.

These adoption best practices from Microsoft employees, partners, and customers are a
set of tools, guidance, and narratives to help shape your technology, business, and
people strategies to get the best business outcomes for your automation rollout.

Read the following articles in the Power Automate guidance center to learn more about
how to roll out the Microsoft automation platform at enterprise scale.

Strategy – The role of an Automation CoE in your adoption strategy


Holistic enterprise automation techniques (HEAT) – Best practices for deploying an
automation platform and managing and the end-to-end lifecycle of automation
projects
Whitepaper: Automation administration and governance – Technical whitepaper
that shows how to use the principles of HEAT for planning, deploying, and
managing an Automation Center of Excellence (CoE) for robotic process
automation (RPA) and other hyperautomation scenarios using Microsoft Power
Automate
Whitepaper: Manage Power Automate for desktop on Windows – Manage the
lifecycle of Power Automate for desktop, using Microsoft Endpoint Manager tools
such as Intune, SCCM, and ring deployment techniques to deploy, monitor, and
audit Power Automate for desktop
Automation CoE starter kit (preview) – Tools to help you automate, manage,
govern, and drive adoption of the automation platform
Automation adoption strategy
Article • 07/01/2024

Microsoft Power Automate includes powerful workflow automation directly in your apps
with a no-code approach that connects to hundreds of popular apps and services. It
covers the full range of hyperautomation scenarios, which includes automating using
APIs, UI, and AI. The cloud-first robotic process automation (RPA) platform delivers
fundamental technology benefits that can help your enterprise execute multiple
business strategies.

As you scale adoption of the platform, you will need to ensure your system is managed
and governed according to your organization’s strategic priorities. This is where the
Automation CoE comes in.

Automation Center of Excellence (CoE)


The Automation CoE ensures that the organization can realize its automation and
productivity objectives safely and efficiently. It does this by employing:

Governance: Security, data integrity, auditing: Ensures that only the right people
have the access to specific data that's governed in a specific way to avoid any risk
to the organization itself.
Repeatable patterns and templates: Error handling, instrumentation,
components: Avoids duplication and rework and enables everyone to get the
benefits of automation with the least effort.
Consistent benefits realization: KPIs/tracking/metrics, process rationalization:
Defines strategic benefit indicators, tracks and measures the return on investment
(such as time saved), and ensures that the processes being automated are
optimized.

As a result of these activities, the CoE helps the organization to scale by enabling
automation for everyone. Innovation doesn’t happen while standing in a line waiting for
your automation needs to be serviced!

Some of common stakeholders and their roles and responsibilities within the
automation platform and projects are:

Executive Sponsors: Organization leaders who set goals and business KPIs for an
automation initiative.
Automation CoE team: Automation CoE team comprised of admins, pro-devs,
RPA-devs, architects, and administrators to deploy, manage, and scale the
automation platform.
Citizen developers: Business users, citizen developers, process owners, and pro-
developers who can propose new project ideas, develop, and deploy projects into
production.

Getting started with your Automation CoE


To get started, we recommend that you learn about the HEAT techniques and how they
can be applied for automation administration and governance, and that you install the
Automation CoE Starter Kit (Preview).

Additional strategy resources


Building an RPA and Automation Center of Excellence (video):
https://aka.ms/autocoestrategy
Customer presentations:
Coca-Cola HQ Webinar: https://aka.ms/cocacolaautomationplatform
Coca-Cola United: https://aka.ms/cocacolaunitedrpa
TC Energy Bot Wars: https://aka.ms/Bot-Wars
Customer stories: https://aka.ms/powerautomatestories


Holistic enterprise automation
techniques (HEAT)
Article • 02/09/2023

HEAT is guidance that's designed to help you deploy the automation platform and
manage the entire lifecycle of an automation project. Cloud architects, RPA developers,
IT professionals, and business decision makers use these best practices, documentation,
and tools to achieve their cloud adoption goals.

By using HEAT, organizations can better align their business and technical strategies to
ensure success.

HEAT includes these stages:

Empower
Discover & plan
Design
Build & test
Deploy & manage
Secure & govern
Nurture

We have recorded a special series on our video channel Automate It where we go into
each of these stages in detail.

Begin with this video: Introducing HEAT


Learn how to apply the HEAT techniques in your Automation Center of Excellence (CoE)
with the Administering a Low-Code Intelligent Automation Platform whitepaper.

Each stage of the HEAT technique is described below, with additional resources.

Empower
The start of any successful automation project is to ensure that the key stakeholders
understand the automation capabilities of the platform. In this stage, users new to
Microsoft Power Automate can learn about the automation capabilities in Power
Automate.

Watch the video: Power Automate & Power Platform

Discover & plan


In this stage, CoEs and business collaborate on what processes to automate based on
the expected ROI, form the development team, and set up the Power Automate
environment.

While automation enables organizations to become more efficient, deciding which
processes to automate is still often a challenge. It is impossible to automate every
process, so CoEs can use the Automation Project app to manage the backlog and pick
an automation based on ROI.

Power Automate provides environments of different types (for example, production,
sandbox, and trial). Each environment has a defined set of users and roles. Admins can
bring their own compute infrastructure to install Power Automate for desktop and the
necessary software.

Watch the video: Empower, discover, & plan

Resources for determining ROI, process discovery, and setting up Power Automate
environment:

Discover which process to automate using process advisor


Use the Automation Project app for curating and managing ideas
Power Automate for desktop
Power Automate IP address configuration
Service administrator roles (Microsoft 365 Global / Power Platform/ Dynamics 365/
Azure/ Power BI admin)
Assign Power Automate RPA attended user plan through Microsoft 365 admin
center
Manage unattended RPA add-on capacity in Power Platform admin center

Design
Building robust automation solutions requires well-defined design principles that lay
the foundation for scale, security, and compliance.

Watch the video: Design phase


Some other design principles (not a comprehensive list) to consider are:

Design for scale, throughput, and resiliency


Fundamentals – logging and credential management and testing
Error handling and retries strategy
Using API vs UI for automation
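The error handling and retries principle above can be sketched in plain code as well. The following Python sketch is illustrative only (the function name and defaults are not part of any Power Automate API): it shows a simple exponential-backoff retry wrapper of the kind a resilient automation step would use.

```python
import time

def run_with_retries(action, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Run `action`, retrying transient failures with exponential backoff.

    Delays double on each failed attempt: 1s, 2s, 4s, ... The `sleep`
    parameter is injectable so the wrapper can be tested without waiting.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return action()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the error for logging/alerting
            sleep(base_delay * 2 ** (attempt - 1))
```

In a cloud flow, the equivalent pattern is the built-in retry policy on an action combined with Configure run after; the sketch only illustrates the design principle.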

Build & test


Building is the heart of the automation lifecycle.

Watch the video: Build & test


More resources for building automations:

RPA playbook for SAP GUI automation -​ Microsoft AI Builder


Automate It video series: "Power Automate for desktop and SAP"
Use sensitive text in Power Automate for desktop + Azure Key Vault
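To illustrate the Azure Key Vault item above, here is a minimal Python sketch of reading a secret with the azure-identity and azure-keyvault-secrets packages. The vault and secret names are placeholders, and package installation plus Azure access configuration are assumed; this is a sketch, not the Power Automate integration itself.

```python
def vault_url(vault_name: str) -> str:
    """Build the public-cloud Key Vault URL for a vault name."""
    return f"https://{vault_name}.vault.azure.net"

def read_secret(vault_name: str, secret_name: str) -> str:
    """Fetch a secret value.

    Requires: pip install azure-identity azure-keyvault-secrets
    """
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(vault_url=vault_url(vault_name),
                          credential=DefaultAzureCredential())
    return client.get_secret(secret_name).value
```

The point of the pattern is that credentials never live in the flow definition; in Power Automate for desktop, the retrieved value would be held as sensitive text.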

Robotic process automation (RPA) with SAP shows how to build an enterprise-grade
invoice processing automation solution. This intelligent automation solution processes
invoices in SAP and showcases key enterprise scenarios such as logging, auditing,
tracking each invoice, human-in-the-loop review, ROI calculation, and more.

Watch the video series: SAP GUI based RPA in Power Automate for desktop
Deploy & manage
Power Automate provides a rich set of capabilities for admins and developers
throughout the deployment cycle for a given automation, including detailed information
on the success or failure of each individual run and the ability to schedule, queue, and
prioritize an automation.

Developers can set up CI/CD with test integration to deploy automation and avoid
accidental changes that would break the automation in production.

Power Automate also helps users manage their automations. All execution data is
available in Dataverse, with reports and views that visualize this data. Power Automate
provides real-time information on individual bots and the machines and clusters they
run on, giving more-detailed visibility into the full automation health, bot health, and
infrastructure health.
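As a hedged sketch of what such monitoring can compute, the function below aggregates simplified run records into basic health metrics. The field names are illustrative, not the exact Dataverse flow-session schema, and fetching the rows (for example, via the Dataverse Web API) is assumed to happen elsewhere.

```python
from datetime import datetime

def summarize_runs(runs):
    """Aggregate flow-run records into simple health metrics.

    Each record is a dict shaped like (illustrative fields):
      {"status": "Succeeded", "start": "2024-01-01T00:00:00",
       "end": "2024-01-01T00:01:00"}
    """
    total = len(runs)
    succeeded = sum(1 for r in runs if r["status"] == "Succeeded")
    durations = [
        (datetime.fromisoformat(r["end"])
         - datetime.fromisoformat(r["start"])).total_seconds()
        for r in runs
    ]
    return {
        "total_runs": total,
        "success_rate": succeeded / total if total else 0.0,
        "avg_duration_s": sum(durations) / total if total else 0.0,
    }
```

Metrics like these feed the dashboards and views described above; the real reports add per-machine and per-queue breakdowns.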

Watch the video: Deploy & manage

Some resources for deploying, monitoring, and managing automations:

ALM with Microsoft Power Platform using Azure DevOps or GitHub


Run desktop flows as attended/unattended
Monitor desktop flow runs
Prioritize desktop flow using queues
Sharing desktop flows
Microsoft Dataverse auditing
Video: On-premises data gateway monitoring status
Video: Automate on-premises data gateway installs
Video: Clustering your gateways
Video: ALM with RPA in Power Automate
Video: "Power Automate desktop monitoring dashboard"

Secure & govern


In this stage, the Automation CoE can use security controls to establish guard rails to
scale RPA across their organization. They can leverage Azure Active Directory, a key
foundation that allows admins to create and manage access controls on users and
resources centrally. Power Automate provides rich governance and security controls to
ensure that you can run your business-critical processes in a trusted and compliant
manner.

The platform provides a rich set of auditing logs that help admins keep track of what
happened in the system. The deep integration of Microsoft Power Platform with Azure
and Microsoft 365 allows IT admins to define reactive and proactive policies and
procedures to track automation activity.

Some helpful resources for security, governance, and adoption nurturing:

Power Platform compliance and data privacy


Microsoft Trust Center
Establishing a data loss prevention (DLP) strategy
Audit Power Automate flow events through Microsoft 365 Security & Compliance
Center

Nurture
CoEs should establish their strategy to nurture and upskill their employees. They can
create a community of champions, train them, run bot wars, and evangelize success
stories. Power Automate provides a rich set of learning resources (documentation,
videos, tutorials, labs, courses, certifications, whitepapers, etc.). Microsoft Power Platform
Adoption Center provides guidance, workbooks, and tools to accelerate adoption
within your enterprise.

Apply HEAT techniques


Learn how to apply the HEAT techniques in your Automation CoE with the Administering
a low-code intelligent automation platform whitepaper .
More resources to start automating
Download Power Automate for desktop
Power Automate documentation
Get help with Power Automate for desktop on Forums
Watch and subscribe to the Automate It series
Follow us on Twitter: @MSPowerAutomate
Automation administration and
governance
Article • 10/04/2022

The Administering a Low-Code Intelligent Automation Platform whitepaper outlines key
considerations for planning, deploying, and managing an Automation Center of
Excellence (CoE) for hyperautomation scenarios with Microsoft Power Automate.

The purpose of this whitepaper is to streamline the exploration, implementation,
security, governance, and scaling of process automation across your organization. This
practical guidance helps you understand your network setup, build an environment to
support automation, proactively plan for automations being developed and deployed,
and run administrative tasks. The whitepaper is based on holistic enterprise automation
techniques (HEAT), a collection of learnings from deploying hyperautomation solutions
at many enterprises.

Download your copy of the whitepaper now.

Sample pages of the Administering a Low-Code Intelligent Automation Platform whitepaper.
Overview of the Automation Kit
Article • 12/01/2022

The Automation Kit is a set of tools that accelerates the use and support of Power
Automate for desktop for automation projects. The kit provides tools that help you
manage automation projects and monitor them to estimate money saved and return on
investment (ROI).

The Automation Kit applies the HEAT (Holistic Enterprise Automation Techniques) to
support your organization.

The kit is especially useful to an Automation Center of Excellence (CoE) team, which is a
team of experts who support automation within your organization. They have good
knowledge about Power Automate for desktop, set up and maintain the Automation Kit,
and maintain the configuration data such as departments, process categories, goals, and
more.

The goal of the Automation Kit is to enable organizations to manage, govern, and scale
automation platform adoption based on best practices. The Automation Kit provides the
following items in support of your Automation Center of Excellence.

Near-real time ROI / SLA: Short and long-term analytics to drive towards your
business goals.
Tools for all users: These tools are for makers (citizen and pro developers), your
Automation CoE team, and executive sponsors.
End-to-end automation lifecycle: These tools help to automate and manage all
aspects of hyperautomation scenarios, including ALM and templates to drive
consistency.
Enterprise readiness: Helps to secure, govern, audit, and monitor your automation
deployment.

Automation Kit components


The Automation Kit supports an automation CoE with the following components:
1. Automation Project: This project is a canvas app that supports requesting
automation projects and submitting them for approval.
2. Automation Center: This is a model-driven app that organizations can use to
create and maintain automation assets, such as master data records, map
resources and environments, and assign roles to employees.
3. Automation Solution Manager: This is a canvas app in satellite environments that
enables the metering of solutions and their artifacts.
4. Cloud flows: These cloud flows use Dataverse tables to sync data from satellite
environments, in near real time, to the main environment.
5. A Power BI dashboard that provides insights and monitors your automation assets.

Two solutions contain the components in the kit:

The main solution, which you deploy to the main environment.


The satellite solution, which you deploy in each satellite environment.

Use satellite environments to develop and test your automation projects before you
deploy them to production. The production satellite monitors and meters the solutions
and solution artifacts for an automation project.

The data from the metered solutions syncs to the main environment in near real time for
monitoring on a dashboard.

Conceptual design
The Automation Kit has the following conceptual design components.
The key element of the solution is the Power Platform main environment.

There are usually several satellite production environments that run your automation
projects. Depending on your environment strategy, these could also be development or
test environments.

Between these environments there is a near-real-time synchronization process that
includes cloud or desktop flow telemetry, machine and machine group usage, and audit
logs. The Power BI dashboard for the Automation Kit displays this information.

Near real-time data synchronization


The synchronization processes the minimum data that's required to calculate the ROI
and SLA. It doesn't create a complete inventory of all low code assets.
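A minimal sketch of the kind of ROI arithmetic involved might look like the following. The formula and inputs are illustrative assumptions, not the kit's actual calculation, which may weigh additional factors such as error rates and rework avoided.

```python
def automation_roi(runs_per_month, minutes_saved_per_run, hourly_rate,
                   monthly_platform_cost):
    """Estimate monthly time saved, net savings, and ROI for one process."""
    hours_saved = runs_per_month * minutes_saved_per_run / 60
    gross_savings = hours_saved * hourly_rate
    net_savings = gross_savings - monthly_platform_cost
    roi = net_savings / monthly_platform_cost if monthly_platform_cost else float("inf")
    return {"hours_saved": hours_saved,
            "net_savings": net_savings,
            "roi": roi}
```

For example, a process that runs 200 times a month and saves 15 minutes per run at a $40 hourly rate, against $500 of monthly platform cost, saves 50 hours and yields a 3x return under these assumptions.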

The components in satellite environments could be organized by geography or
capability. The cloud flows in these environments push information about metered
cloud and desktop flows in near real time to the automation main environment.
Manage Power Automate for desktop
on Windows
Article • 10/04/2022

If you’re managing Power Automate for desktop on Windows at scale throughout your
organization, our PDF playbook and eight-episode YouTube video series introduce you
to:

Concepts for managing the lifecycle for Power Automate on desktop.


Leveraging Microsoft Endpoint Manager tools such as Intune and System Center
Configuration Manager (SCCM) and ring deployment techniques to deploy,
monitor, and audit Power Automate for desktop.

The whitepaper and video series include these topics:

Overview of managing Power Automate for desktop on Windows.


Create Intune Package for Power Automate for desktop.
Configure Intune for deploying the Power Automate for desktop package.
Deploy Power Automate for desktop and manage the lifecycle.
Manage Power Automate for desktop through System Center Configuration
Manager (SCCM).
Silent registration for Power Automate for desktop machines with Power
Automate.
Analytics for Power Automate for desktop app in Intune.
Auditing and compliance for Power Automate for desktop app in Intune.

Watch the Automate It Special Series: Managing Power Automate for desktop video
series on YouTube.

Download your copy of the Managing Power Automate for desktop on Windows
playbook .
Power Platform automation maturity
model overview
Article • 07/01/2024

Offering low-code, drag-and-drop tools and hundreds of pre-built connectors, Power
Automate empowers business users—citizen developers—to automate repetitive,
mundane tasks with ease. To help organizations establish and scale successful
automation implementations, Microsoft created a set of automation adoption best
practices . Microsoft also developed the automation maturity model, inspired by
holistic enterprise automation techniques and the capability maturity model . The
automation maturity model can help your organization and its partners think through
ways to improve your automation capabilities and align them to business outcomes.

This article explains the automation maturity levels. More information about the
capabilities at each level is available in Power Platform automation maturity model
details.

Level 100 – Initial


In this phase: Failing forward, experimentation, feasibility analysis, cautious optimism

The organization begins its journey with Power Automate in the Initial phase. During
this phase, the organization evaluates its alignment and integration with its architecture
and automation vision. Representatives from technology and business verticals discuss
the overarching technology and business goals, encompassing the networking,
information security, legal, and privacy requirements that are specific to the
organization. Technology upskilling endeavors are launched to support Power Automate
adoption across the organization. Key performance indicators (KPIs) are established to
measure the benefits of automation and its subsequent influence on operational
metrics. Recording progress is still manual in this phase.

Level 200 – Repeatable


In this phase: Organization "on-board," digital stewardship, bias to action

In the Repeatable phase, the organization codifies lessons learned from the Initial phase.
An automation Center of Excellence (CoE) is established to evangelize adoption, serve as
a liaison between technology and business stakeholders, and define best practices. The
organization understands the value proposition that automation offers, promotes Power
Automate training across the board, and drives change management initiatives. Based
on their experiences during the Initial phase, teams highlight potential automation
opportunities.

Level 300 – Defined


In this phase: Iterative standardization, feedback collection, gap fulfillment, business and
technology resilience

In the Defined phase, the organization standardizes and refines processes that were
established in the Repeatable phase. Organizations may redefine goals based on lessons
learned and progress made in earlier phases. The groundwork established earlier is
evolved with a bias towards scale, security, and resilience. The CoE drives initiatives to
automate governance and platform-related concerns to promote scale. Platform
governance-related initiatives, encompassing environment provisioning, Data Loss
Prevention (DLP) policy requests, licensing, and machine management are implemented
in this phase.

The organization's maturity to address network security concerns should be gauged
periodically. Potential policy violations, exploits, and vulnerabilities should be addressed
promptly. As the practice matures and Power Automate emerges as an integral part of
the organization's technology landscape, the need to build fault-tolerant processes is
inevitable. The automation CoE formalizes internal guard rails by defining business
continuity plans if a failure occurs.

Level 400 – Capable


In this phase: Strategic benchmarking, technology-driven optimization, "business led, CoE
supported"

In the Capable phase, the organization has established processes for monitoring and
managing automation health from both a quantitative and a qualitative perspective. The
organization is successful in democratizing automation across both business and
technical verticals. Automation initiatives shift; no longer driven by the CoE, now citizen
developers drive them. The organization builds on and enriches its technology
capabilities. Process mining tools are used to optimize processes, and pro developers
collaborate to build APIs that enable citizen developers to build automation. AI-based
capabilities are implemented to solve business problems in the context of process
automation. Organizations can attend to a wider array of use cases based on the fusion
of technologies.
Level 500 – Efficient
In this phase: Community beacons, cross-team collaboration, addressing gaps hindering
excellence

In the Efficient phase, the organization is at a mature state in its automation journey
from a business process, technology enablement, and automation adoption perspective.
In the context of automation, the organization has established its capabilities to
harmoniously integrate Power Automate into its digital landscape. Governance, security,
KPI dashboarding, and application lifecycle management are thoroughly automated,
with minimal to no manual effort required to sustain them.

A well-established community of experts helps to resolve issues and questions that
citizen developers have. Fusion teams involving experts from cloud, AI, and other
capabilities collaborate to build elegant automation processes that build on the benefits
that the combination of these technologies offers. Organizations at the efficient phase
have executive sponsorship and they play a pivotal role in the digitization of the
organization. Efficient organizations serve as beacons for other organizations, sharing
best practices and helping them to overcome any challenges they may face.

Automation maturity model details


The detailed Automation Maturity Model captures goals, indicators, and required
resources across the various phases of maturity.

Next step: Detailed capabilities

Resources for rolling out an automation program
Automation CoE Blueprint
Automation CoE Strategy
HEAT
HEAT video series
Admin and Governance Whitepaper
Manage Power Automate for desktop on Windows
Hyper-automation SAP playbook and video series

Customer Stories
Coca-Cola
Coca-Cola United
TC Energy Bot Wars
Inter Pipeline Botwars
Chemours
Customer stories collection

Learning resources
Automate It
Holistic enterprise automation techniques (HEAT)
RPA in a Day training
Power Automate learn


Power Platform automation maturity
model details
Article • 07/01/2024

The following sections present detailed characteristics of an organization at each
maturity level of the Power Platform automation maturity model for each type of
capability.

Empower

Level Details

100: Initial Goal: Training resources are available, and the Automation CoE has a fair grasp
of tool capabilities and usage.

Indicators:

The CoE has undergone initial training (RPA in a Day, knowledge articles).

Resources: Maker Learning Resources

200: Repeatable Goal: Organizations have a standard training curriculum that gives new
users a starting point. All makers have undergone basic Power Automate training.

Indicators:

Leverage Microsoft's training resources and accredit knowledge gained through certification programs

Resources: Microsoft Certified: Power Platform Fundamentals

300: Defined Goal: Extend lessons learned and share knowledge across the various verticals
internal to the organization.

Indicators:

Internal Monthly hackathons/ know-how sessions are conducted.


Leverage Power Platform Yammer groups
Lunch and learn sessions

Resources: Organize Hackathons

400: Capable Goal: Contribute to the community of Power Automate practitioners external to
the organization.

Indicators:

Engage and involve in Power Automate community based external events.

Resources: Microsoft Power Automate Community

500: Efficient Goal: "Help us - Help you" - give Microsoft active feedback.

Indicators:

Provide product feedback and showcase product capabilities to Microsoft


organized initiatives such as Ignite and likewise.

Resources: Microsoft Ignite

Discover & Plan



Level Details

100: Initial Goal: Initiate technology readiness discussions with supporting IT/Business teams.

Indicators:

Architectural review meetings are established involving technology and business stakeholders.
Key decision makers and stakeholders are identified

Resources: Power Automate Architecture - High Level



200: Repeatable Goal: Organizations document accepted priorities and concerns; an initial
draft addressing foundational aspects of Power Automate is in place. A strategy plan is
defined with a bias towards a "CoE led - Business supported" model.

Indicators:

High-level guidelines to address network, security, and infrastructure concerns are defined
Stakeholder roles and responsibilities are defined (Business, CoE, Security,
Compliance and Admin)
High level ROI factors are defined

Resources:

Azure VM deployment
Power Automate Desktop Pre-Requisites
Design consideration for VM scale sets

300: Defined Goal: Strategic planning document is enriched such that organizational-level
security and governance parameters are addressed.

Indicators:

Network provisioning parameters including VM compute, machine groups, and VNET setup are defined
Credential/Access Management strategy is well defined - makers,
administrators and CoE users and their corresponding roles and privileges
are documented.
Security Controls and RBAC policies are documented
Data Encryption/Retention and Management strategy defined
Business continuity and Disaster recovery plan in place

Resources:

Azure Role Based access control


Data Protection Policy
Data Residency

400: Capable Goal: Strategic planning document includes ideation around advanced
reporting constructs, AI-based automation strategy, and process mining tools, with a bias
towards a "Business led - CoE supported" model.

Indicators:

Detailed operational and functional level analytics including KYC, machine
management, ROI calculator, License Utilization, Common exceptions list,
top makers are defined.
Opportunities to leverage AI builder.
Process discovery through process mining

Resources:

AI Builder
Process mining
Power Automate Analytics

500: Efficient Goal: Strategic planning document is complete, with revisions made from
time to time as needed. Business and technology teams are aligned on strategy and
corresponding investments.

Indicators:

Strategy to automate governance-based manual processes - ALM, Machine
Management, DLP, Access Management, and License Management.
Design best practices shared amongst the Power Automate community.

Resources:

Solution Overview
Machine Management

Design

Level Details

100: Initial Goal: Initial design is scoped to support experimentation with Power Automate as
the organization familiarizes with the tool.

Indicators:

Design perspectives around the usage of API versus UI are explored


Environment demarcation (Dev/Test/Prod) isn't well established.

Resources: Overview of different types of flows

200: Repeatable Goal: Design is scoped to support a few bots in the production
environment, addressing basic automation needs within a department. Organization is
still very early in its automation journey.

Indicators:

Design considerations around logging and credential management are still at a rudimentary stage.
Clear demarcation between (Dev/Test/Prod) is established at this point.

Practitioners have a clear understanding of when to use API versus UI from a design perspective.

Resources:

Environment Strategy
API versus UI Approach

300: Defined Goal: Design is scoped towards supporting many bots for production usage needs
for the organization. Organization is maturing in its automation journey.

Indicators:

Logging and credential management is well established from a design perspective.
Identity Security and Access Management tool integration to the
infrastructure are well laid out.
Code Review standards are well defined.
Designing exception handling models using Try-Catch-Finally pattern
Consider storage technology that scales proportionally

Resources:

Try-Catch-Finally pattern

400: Capable Goal: Design is scoped towards supporting many bots across
cross-functional teams for production usage - leveraging AI/ML, custom connectors, and
advanced error handling.

Indicators:

Design considerations around templatization of common patterns and practices are well established.
Establishing clear design strategies/blueprint on the usage of custom
connectors is defined.
Designing application health probes to check availability of the critical part
of the systems such as load balancers and traffic managers.

Resources:

Scalable Azure Architecture


On Premise Gateway Cluster
Template Gallery

500: Efficient Goal: Design is at a mature state and addresses all architectural challenges from
an infrastructure, security, and governance perspective in conformance to the
overarching organizational guidelines.

Indicators:

Auto-scaling capabilities - design thinking around use of machine groups based on process volume are well laid out.
Advanced auditing capabilities to support proactive monitoring is supported
Design considerations to handle transient faults in a cloud infrastructure

Resources: Handling Transient Faults in Azure

Build & Test



Level Details

100: Initial Goal: Initial pockets of success are realized from an implementation standpoint.
CoE validates feasibility of the solution by building proof of concepts to support
simple use case scenarios.

Indicators:

Practitioners build basic cloud and desktop flows to understand ground-level functionality of the tool.
Testing is confined to a PoC level.
Monitoring is manual.
Implementation is confined to a generic development environment.

Resources: Basic cloud and desktop flows

200: Repeatable Goal: Implementation is targeted towards building a few bots for
production usage. Organization continues to explore Power Automate capabilities.

Indicators:

Code isn't adequately modularized/organized at this point.


Code promotion between environments is manual.
Power Automate Desktop installation and setup is manual
Implementation covers both DPA (Digital Process Automation) and RPA
(Robotic Process Automation) scenarios
On-premises Gateway / Direct Machine connectivity implemented
ROI calculation is primarily manual at this point

Resources: On-premises Gateway / Direct Machine connectivity

300: Defined Goal: Implementation is targeted towards building many bots for production
usage. Organization is maturing in its automation journey.

Indicators:

Code is well modularized/organized - main flows and sub flows.


Code promotion between environments is automated for both managed
and unmanaged solutions
Power Automate Desktop installation and setup is automated
Credentials are managed through Azure Key Vault (or equivalent)
Exception Handling best practices are implemented for:
Cloud Flows: using Configure run after, Try/Catch, Error
Desktop Flows: Action level and block level exception handling

Resources:

Power Automate Desktop installation


Exception Handling - Cloud Flows
Exception Handling- Desktop Flows

400: Capable Goal: Implementation is targeted towards building many bots supporting
cross-functional teams for production usage, with a bias towards a high level of resilience
and reusability.

Indicators:

Custom connectors, API support (for in house applications) are built by pro-
developers to facilitate citizen developers to build automation.
Reusable templates for both cloud and desktop flows are utilized.
ROI calculation is automated.
Processes are implemented leveraging AI Builder.
Parallel execution of workloads to improve throughput are implemented
Business Component Testing is well executed (validating other components,
workload management, process branching, exception handling and
performance measurement)

Resources:

Custom Connectors
AI Builder in RPA

500: Efficient Goal: Implementation maturity is at an advanced state. Organization is well
equipped to build complex processes with a high degree of resilience.

Indicators:

Implementing distributed execution of workloads utilizing machine groups across auto-scaled virtual machines.
Fusion teams involving developers from Cloud, AI and other technology
capabilities build hybrid automation solutions.

Resources: Fusion Teams


Deploy & Manage

Level Details

100: Initial Goal: Scope of deployment/management is to support automation at a PoC level.

Indicators:

ALM is at a rudimentary stage; bot development is confined to a generic development environment.
Network/Infrastructure deployment isn't scalable - confined to a bot(s)
Automation is primarily executed in attended mode
Bot monitoring and management are manual
Source Control not implemented

Resources: Deployment and Management Guidelines

200: Repeatable Goal: Scope of deployment/management is to support a few bots for
production usage.

Indicators:

Efforts to automate application deployment is underway.


Version Control using source control software is implemented
Backup and restore environments established
Automation deployment supports both attended and unattended modes of
execution

Resources:

GitHub Power Platform Actions


Back Up and Restore

300: Defined Goal: Scope of deployment/management is targeted towards supporting many bots for
production usage. Organization is maturing in its automation journey.

Indicators:

ALM is automated - export, deploy, import pipelines in place


Static Analyzers not yet implemented, validation and verification are manual
Role privileges defining citizen developer/admin/auditor across
environments are implemented.
Configuration management and patch updates across multiple machines are
managed using tools like Intune or SCCM.
Monitoring/Reporting metrics (using Power BI/equivalent reporting tool)
from a bot deployment perspective is in place (may not be real time)

Resources:

ALM Basics
Configuration management

400: Capable Goal: Scope of deployment is targeted towards supporting bots serving
cross-functional teams for production usage, with a bias towards optimization and
efficiency.

Indicators:

Leverage Dataverse to build use case specific dashboards.


Add VM Insights solution to your Log Analytics workspace
Configure connectivity tests using Connection Monitor in Azure

Network deployment from a high-availability standpoint is complete - Azure availability zones and Azure disaster recovery implementation in place.
Monitoring/Reporting metrics (using Power BI/equivalent reporting tool)
from a bot deployment perspective is in place (is real time)

Resources:

Azure Disaster Recovery
VM Insights Solution
Connection Monitor in Azure

500: Efficient Goal: Scope of deployment/management is at a mature state. Organization
is well equipped to deploy and manage solutions efficiently, ensuring a high degree of
resilience.

Indicators:

Utilize Azure Monitor for monitoring workloads and resource utilization.
Detect anomalous behavior by setting alerts, visual logs, and monitoring.
Log network traffic using NSG (network security group) flow logs with Azure Network Watcher.

Resources:

Log network traffic using NSG
Azure Monitor

Secure & Govern


Level Details

100: Initial Goal: Security and Governance are envisioned to support automation at a
foundational level to facilitate future growth.

Indicators:

Governance/security implementation maturity is at a rudimentary stage as there are fewer bots to oversee.
Sensitive data that needs to be encrypted is identified in the bot building/exploration process.
Access rights required to connect to external systems are identified.

Resources: Encrypt Sensitive Data

200: Repeatable Goal: Security and Governance are envisioned towards supporting a few
bots for production usage.

Indicators:

Conditional access policies - define policies in Azure for Power Automate to grant or block based on user/group, device, location.
Encrypt data at rest and in transit.
Enforce HTTPS-only communication for internet-facing applications.
Encrypt connections to virtual machines using RDP.
Firewall protection - define network security groups to allow/deny outbound/inbound traffic.
Manage sensitive data in desktop flows - encrypt the data by using input of text type.

Resources:

Conditional Access Policies
URL / IP Allow-listing
Encrypt data

300: Defined Goal: Security and Governance are envisioned towards supporting many bots for
production usage. Organization is maturing in its automation journey.

Indicators:

Gateway/machine connectivity is automated.
License management is an automated process; licenses are assigned/unassigned based on team/department utilization needs.
Secure credential storage and rotation in place.
Access management is automated to manage access as required to file paths, artifacts, and applications.
Usage of Dataverse teams and Microsoft Entra groups to effectively govern user permissions.
Review connector usage in the Power Platform admin center (PPAC).

Resources:

Access Management
Microsoft Entra groups
Power Platform Admin Center Analytics

400: Capable Goal: Security and Governance are targeted towards supporting bots serving
cross-functional teams for production usage with a bias towards optimization and
efficiency.

Indicators:

Proactive monitoring in place.
Implement cross-tenant block to restrict third-party tenant access.
Implement granular endpoint DLP definitions for connectors.
Ensure compliance standards are met based on industry (for example, PCI DSS - Payment Card Industry Data Security Standard).

Resources:

Enable Cross Tenant Isolation
Granular End Point DLP
Compliance and Data Privacy

500: Efficient Goal: Security and Governance are at a mature state. Organization is well
equipped to secure and govern solutions efficiently ensuring a high degree of
resilience.

Indicators:

Implement Security Information and Event Management (SIEM) to deliver intelligent security analytics and threat intelligence.
Establish an app discovery process to identify new app connections (using application security management tools like Microsoft Cloud App Security).

Resources: Security Information and Event Management (SIEM)

Nurture

Level State of Fusion Teams

100: Initial Goal: Organization has just begun its automation journey, the objective is to
evangelize Power Automate adoption at a foundational level.

Indicators:

Identify teams that need to be part of the automation journey.
Advocate the importance and impact of automation at the team level.

Resources: Digital Transformation

200: Repeatable Goal: Organization is expanding its automation footprint, whilst still
evaluating feasibility from a scale perspective. Nurture is geared to support and
promote makers who have a fair grasp of Power Automate.

Indicators:

Allocate roles and responsibilities based on individual job functions:
  Automation product champions
  Automation makers
  Automation CoE team
Establish points of contact and stream leads whose objective is to evangelize
and support automation across the organization.

Resources: Allocate Roles and responsibilities

300: Defined Goal: Organization ascertains Power Automate as a viable solution. The
organization is maturing in its automation journey and with it comes many "learning
moments". "Nurture" at this level is attuned to support growing pains typical to
increasing maturity.

Indicators:

Beacons of automation support other citizen developers as they progress through a "try-fail-succeed" cycle.
Citizen developers strive towards promoting solutions to production, while being supported through each step by their peers.

Resources: Beacons of automation

400: Capable Goal: Organization is increasingly tending to a mature state in its
automation journey.

Indicators:

Teams share success stories; citizen developers advocate ROI gains across the organization because of automation.
Teams initially hesitant about automation begin to embrace it, convinced by its value proposition.

Resources: Real-World Automation Success Stories

500: Efficient Goal: Organization is at a mature state in its automation journey.

Indicators:

Offer individual recognition and career paths to promising citizen developers.
Encourage citizen developers to blog about their learnings and automation
journey to benefit other enthusiasts who are starting out.

Resources: Career Paths for citizen developers

7 Note

You can download a printable version of the Power Platform Automation
maturity model.



How to build custom actions in Power
Automate for desktop
Article • 11/30/2023

Enhance productivity, reusability, and extensibility with custom actions in Power
Automate for desktop. This article discusses how custom actions in Power Automate for
desktop can help makers create their own reusable actions that can be used across
multiple flows. Makers create custom actions by composing a sequence of steps or
functions into a new action. Custom actions are created using the Power Automate for
desktop actions SDK, which provides a set of APIs that allow makers to create custom
actions using .NET language C#. Custom actions can also be shared with other users
through the custom actions section in Power Automate (make.powerautomate.com). In
this article, find detailed walkthroughs of how to create, build, deploy, use, and update
custom actions.

) Important

While the essential features utilized in creating custom actions are supported, the
provided solutions, assets, and sample scripts mentioned here serve as an example
implementation of these features and don't include support.

Overview
Custom actions capability in Power Automate for desktop allows you to create your own
reusable actions that can be used across multiple desktop flows. Custom actions save
you time and effort by allowing you to reuse complex or frequently used actions without
having to re-create them each time you build a new flow. Makers can apply their
existing skills and knowledge to create custom actions that integrate with other systems
and services. Additionally, pro-developers can wrap the existing functions or code
libraries to make a new custom action that results in increased reusability of
organizational assets.

You create custom actions by composing a sequence of methods or functions into a
new action. Once you create a custom action, use it in any desktop flow by dragging
and dropping it onto the Power Automate desktop designer canvas.

Custom actions can be shared with other users through the custom actions section in
Power Automate, which provides a central repository for sharing and discovering
custom actions. This means that users can benefit from the expertise and knowledge of
others in the organization and can easily find and use custom actions created by other
makers.

Overall, custom actions in Power Automate for desktop provide a powerful way to
extend the functionality of the product, streamline the flow-building process, and foster
collaboration and innovation within the organization.

Prerequisites
Latest version of Power Automate for desktop – Install Power Automate
C# authoring tool such as Visual Studio Community/Professional/Enterprise
2022 with the .NET desktop development workload
Custom actions SDK – NuGet Gallery |
Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK
Digital certificate:
Generate a self-signed certificate – Generate Self-Signed Certificates Overview –
.NET
Enterprise deployment – Your organization’s trusted certificate from certificate
authority – Digital signatures and certificates – Office Support
SignTool:
Using SignTool to sign a file – Win32 apps
SignTool
Windows PowerShell Script (.ps1) – Create custom actions – Power Automate

Create your own custom action


1. Open Visual Studio to create a new project using template of Class Library (.NET
Framework).

2. Configure your new project with a project name, file location, and set the
Framework as .NET Framework 4.7.2.
7 Note

Make sure to follow the naming conventions. More information: Custom
module name conventions

3. In Visual Studio, select Tools > NuGet Package Manager > Package Manager
console.

4. Open a PowerShell window and install the NuGet package
   PowerAutomate.Desktop.Actions.SDK using this PowerShell command.

PowerShell

Find-Package Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK
NuGet\Install-Package Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK

5. Follow the steps in Create custom actions to create the Class file for your custom
action.

Information you can use as reference for your action
Reference solution archive: .NET Module Solution

Action: Write a message to a local file.

Input parameters: File name, message to write to the file.

Output parameters: Status code – true if successful and false if not successful.

Class Definition:

C#
using System;
using System.IO;
using Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK;
using Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK.Attributes;

namespace ModulesLogEvent
{
    [Action(Id = "LogEventToFile", Order = 1, Category = "Logging")]
    [Throws("LogEventError")]
    public class LogEventToFile : ActionBase
    {
        [InputArgument]
        public string LogFileName { get; set; }

        [InputArgument]
        public string LogMessage { get; set; }

        [OutputArgument]
        public bool StatusCode { get; set; }

        public override void Execute(ActionContext context)
        {
            try
            {
                // Append the message text to the file
                File.AppendAllText(LogFileName, LogMessage);
                StatusCode = true;
            }
            catch (Exception e)
            {
                if (e is ActionException) throw;

                throw new ActionException("LogEventError", e.Message, e.InnerException);
            }
        }
    }
}

Resources: This table lists the descriptions and friendly names for parameters in a
Resources.resx file.

Name | Value | Comment
LogEventToFile_Description | Custom action to log message to the supplied file | Action description
LogEventToFile_FriendlyName | LogEventToFile | Action name
LogEventToFile_LogFileName_Description | Input parameter of text type | Action input description
LogEventToFile_LogFileName_FriendlyName | LogFileName | Action input name
LogEventToFile_LogMessage_Description | Input parameter of text type | Action input description
LogEventToFile_LogMessage_FriendlyName | LogMessage | Action input name
LogEventToFile_StatusCode_Description | Output parameter of boolean type | Action output description
LogEventToFile_StatusCode_FriendlyName | StatusCode | Action output name
ModulesLogEvent_Description | Module to manage log events | Module description
ModulesLogEvent_FriendlyName | LogEvent | Module name
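As an illustrative sketch (not part of the official sample files), each Name/Value pair in the table above maps to a data element in the module's Resources.resx file, the standard .NET resource format. The entry names must match the module, action, and argument names defined in the C# class; only a few representative entries are shown:

```xml
<!-- Resources.resx fragment (illustrative) - entry names must match the
     module, action, and argument names defined in the C# class -->
<data name="ModulesLogEvent_FriendlyName" xml:space="preserve">
  <value>LogEvent</value>
</data>
<data name="LogEventToFile_FriendlyName" xml:space="preserve">
  <value>LogEventToFile</value>
</data>
<data name="LogEventToFile_Description" xml:space="preserve">
  <value>Custom action to log message to the supplied file</value>
</data>
<data name="LogEventToFile_LogFileName_FriendlyName" xml:space="preserve">
  <value>LogFileName</value>
</data>
```

Visual Studio generates and maintains these entries automatically when you edit the Resources.resx file through its resource editor.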

Build the package and deploy your custom action
Create the package and deploy to Power Automate.

1. Acquire the digital certificate so the custom action DLL file can be signed.

) Important

Self-signed certificates are only for test purposes and aren't recommended for
production use. For organizational deployment of custom actions in your
environment, we recommend you use a trusted digital certificate that follows
your organizational guidelines.

 Tip

To streamline the process of developing and using custom actions for Power
Automate for desktop across your organization, you can bundle a trusted
digital certificate with the Power Automate for desktop installer that is
distributed through SCCM/Appstore. This enables the certificate to be
installed automatically on both maker and unattended runtime machines
that require Power Automate for desktop, without any need for additional
actions.

For this example, a self-signed certificate is used.

a. Create a self-signed certificate using this script.

PowerShell

$certPFXFileName = "C:\PADCustomAction\PADCustomActionCert.pfx";
$certCERFileName = "C:\PADCustomAction\PADCustomActionCert.cer";
$certStoreLocation = "Cert:\LocalMachine\AuthRoot";
$certname = "PADCustomActionSelfSignCert"

## Create certificate
$cert = New-SelfSignedCertificate -CertStoreLocation Cert:\CurrentUser\My -Type CodeSigningCert -Subject "CN=$certname" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256
$mypwd = ConvertTo-SecureString -String <YOUR CERT PASSWORD GOES HERE> -Force -AsPlainText

## Export certificate
$certPFXFile = Export-PfxCertificate -Cert $cert -FilePath $certPFXFileName -Password $mypwd
$certCERFile = Export-Certificate -Cert $cert -FilePath $certCERFileName -Type CERT -Verbose -Force

b. Import the certificate into certificate store using this command.

PowerShell

##Import certificate
Import-Certificate -CertStoreLocation $certStoreLocation -FilePath $certCERFile

c. Validate that the imported certificate appears under Trusted Root Certification
Authorities > Certificates in Certificates Microsoft Manager Console (MMC)
snap-in.

2. Finalize the custom module created by signing the DLL file using a trusted
certificate. Use Visual Studio’s developer command prompt to use the Signtool for
this activity.

PowerShell

Signtool sign /f "C:/PADActions/PADCustomActionCert.pfx" /p <YOURCERTPASSWORD> /fd SHA256 "C:/PADActions/PADCustomActionEventLog/Modules.LogEvent.dll"

3. To deploy the custom action, build the package the contents into a cabinet file
(.cab) by using this PowerShell script.

PowerShell

.\BuildCAB.ps1 "C:/PADActions/PADCustomActionEventLog"
"C:/PADActions/PADCustomActionEventLog" PADCustomActionEventLog.cab

Go to the sample script file BuildCAB.ps1
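The linked sample script isn't reproduced here, but as a rough sketch of what a BuildCAB-style script typically does under the hood, Windows' built-in makecab tool can package the signed DLL into a cabinet by using a directive (.ddf) file. The file and folder names below are illustrative assumptions, not the contents of the official sample:

```
; PADCustomActionEventLog.ddf - illustrative makecab directive file
.Set CabinetNameTemplate=PADCustomActionEventLog.cab
.Set DiskDirectory1=C:\PADActions\PADCustomActionEventLog
.Set Cabinet=on
.Set Compress=on
"C:\PADActions\PADCustomActionEventLog\Modules.LogEvent.dll"
```

Running `makecab /f PADCustomActionEventLog.ddf` would then produce the cabinet file that gets signed in the next step; the actual BuildCAB.ps1 sample may structure this differently.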

4. Sign the generated cabinet file using Signtool.

PowerShell

Signtool sign /f "C:/PADActions/PADCustomActionCert.pfx" /p <YOURCERTPASSWORD> /fd SHA256 "C:/PADActions/PADCustomActionEventLog/PADCustomActionEventLog.cab"

5. Go to the Power Automate custom action section to upload the custom action that
you created. Provide the name, description, and cabinet file and then select
Upload.

You receive a notification when the action is successfully uploaded.

Following these steps, the custom action module is packaged into a cabinet file and
signed with a trusted certificate. Additionally, the custom action cabinet file is uploaded
to the custom action library in Power Automate.

More information: Upload custom actions

Use your custom action activity in desktop flow using Power Automate for desktop
1. Create a new desktop flow, and then select the Assets Library in the designer.

2. Inspect the custom action available in the assets library. Notice the action
previously created and uploaded to the custom actions section of Power
Automate.

Select Add to add this custom action to the Actions section of the designer.

3. Validate that the custom action is added successfully. Search for it on the Actions
search bar in Power Automate for desktop's designer.

4. Drag the custom action or double-click it to add to the desktop flow.

5. Provide the input parameters and additional steps to test the custom action.

Sample desktop flow using the custom action.

6. Test the flow to see the custom action working in real time.
7 Note

Import the certificate used to sign the cabinet file to the machine used to build
desktop flows with custom actions and to each of the runtime machines that will
run the desktop flows.

Following these steps, you created a custom action, packaged the module into a cabinet
file, signed it with a trusted certificate, uploaded it to the custom action library in Power
Automate, and created and tested a desktop flow that uses the custom action for a
successful run.

Update and redeploy the custom action


Update the functionality of the custom action to reflect the updated capability by
following these steps.

1. Update the class file in Visual Studio solution with new action functionality. More
information: Updated .NET Module solution

Modified the signature of the class file to take in a third input parameter as shown.

C#

using System;
using System.IO;
using Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK;
using Microsoft.PowerPlatform.PowerAutomate.Desktop.Actions.SDK.Attributes;

namespace ModulesLogEvent
{
    [Action(Id = "LogEventToFile", Order = 1, Category = "Logging")]
    [Throws("LogEventError")]
    public class LogEventToFile : ActionBase
    {
        [InputArgument]
        public string LogFileName { get; set; }

        [InputArgument]
        public string LogMessage { get; set; }

        [InputArgument]
        public string LogLevel { get; set; }

        [OutputArgument]
        public bool StatusCode { get; set; }

        public override void Execute(ActionContext context)
        {
            try
            {
                // Append the log level and message text to the file
                File.AppendAllText(LogFileName, LogLevel + ": " + LogMessage);
                StatusCode = true;
            }
            catch (Exception e)
            {
                if (e is ActionException) throw;

                throw new ActionException("LogEventError", e.Message, e.InnerException);
            }
        }
    }
}

2. Use similar steps described earlier where you sign the DLL file, create the cabinet
file, sign the cabinet file, and upload the cabinet file to custom actions section in
Power Automate. More information: Build the package and deploy your custom
action

7 Note

Before uploading the updated custom action cabinet file, make sure to
analyze the impact of this change as desktop flows with this action will be
updated with new capabilities.

3. Update the desktop flow as required.

To validate the update capability, we added a third input parameter to the custom
action. Notice that the custom action activity is marked as Error in the designer, and it
needs to be updated with the new input parameter.


4. Test the flow to see the updated custom action working in real time.

In this section, you updated the underlying functionality of the custom action, built the
package, deployed to Power Automate, refactored the desktop flow, and validated the
functionality by running the desktop flow with updated capabilities of the custom action
in Power Automate for desktop.
Business approvals templates overview
(preview)
Article • 11/20/2023

[This topic is prerelease documentation and is subject to change.]

Approvals are a key use case present across every industry, organization, and
department. Use Power Automate to streamline your business processes by digitalizing
the approvals experience.

) Important

This is a preview feature.


Preview features aren’t meant for production use and may have restricted
functionality. These features are available before an official release so that
customers can get early access and provide feedback.
This feature is being gradually rolled out across regions and might not be
available in your region.

Business approvals templates as part of the approvals kit provide no-code templates
that are built on top of Microsoft Power Platform components. Use these templates to
accelerate the speed at which your organization creates sophisticated approvals
workflows that include conditional branching, delegation, administrative overrides, and
more. Since the templates are no-code, almost anyone in your organization can use
them to meet your approval needs.
Organizations often need approvals for key business processes. These processes might
include expense reporting, time sheet management, business travel requests,
procurement orders and sales discounts.

With Power Platform, you can automate approvals needs by combining components
from Power Apps and Power Automate. As the complexity of your approvals
increases, so does your configuration and the need to assemble multiple low-code
components together. The business approvals templates are available as a
configured collection of components and tools that are designed to help organizations
automate their approvals processes quickly.

These templates cover many of the typical requirements in most organizations, such as
the ability to:

Configure multi-stage approvals
Delegate others to approve
View approvals progress and history
Manage out of office for approvers
Support versions of approval processes
Publish versioned approval workflows
Handle work days and public holidays

7 Note

Depending on your role in the organization, you may need an assigned Microsoft
365 license and a Premium Power Platform license to design, store, and run the
cloud services related to the business approvals template. Refer to the setup guide
for specific license requirements.

Disclaimer
Although the underlying features and components used to build the kit (such as
Dataverse, Power Apps, Power Automate, and connectors) are fully supported, the kit
itself, which is managed by Power CAT, represents a sample implementation of these
features. Our customers and community can use and customize these features to
implement approvals workflows in their organizations.

If you face issues with:

Using the kit: Report your issue on the Power CAT Business Approvals Kit forum.
Microsoft Support won't help you with issues related to this kit, but they'll help
with related, underlying platform and feature issues.

The core features in Microsoft Power Platform: Use your standard channel to
contact Support.
User Journey
Article • 11/20/2023

In the content below we'll explore an end-to-end user journey using the business
approvals templates kit.

Charlotte is a business user who wants to make approvals with several considerations:


Scenario | Notes
SAP invoices | When SAP invoices are submitted, a manual approval is required based on data inside the invoice.
Manager approvals | Manager approval is required for SAP invoices above the line manager approval limits.
Approver availability | Depending on where the approver is in the world, there are different holiday or working schedules. Waiting for approvals can cause delays until an approver returns.
Handling holidays | Approvers may be on holiday when a request is submitted. In this case, there could be delays in processing the approval request or confusion about who is an alternate approver.
Power Platform permissions | Charlotte has limited permissions to build and deploy approvals solutions to test and production environments.
COE team | Charlotte works with Gibson as her technical contact and mentor to help install solutions and create low-code Power Platform solutions.

Charlotte evaluates these considerations and chooses to use the business approval kit
because it provides support for multi level approvals and handling for different approver
scenarios. Working with Gibson from the Center of Excellence team, Charlotte installs
the business approvals kit in an environment. Once the kit is installed, Charlotte models
the approval process. Because SAP Power Automate is a newer area for Charlotte, she
works with Gibson to build the required triggers to obtain the SAP data and trigger the
approval workflow. Approvals are generated by using the workflow and Power
Automate Cloud flow for approvers to accept and decline requests.

User journey persona summary

Reviewing this user journey example, there are three major personas - approver,
approvals administrator, and Power Platform administrator - as part of the user journey
using the approvals kit.


Persona | Role | User type | Description
Power Platform administrator | IT Manager | Pro Developer | Creates/assigns environments and imports the approvals kit solution as system administrator or system customizer.
Approvals administrator | Sales Director | Power User | Configures the kit to match the business requirements. Analyzes the approval processes for improvement opportunities.
Approver | Sales Manager | Business User | Receives approvals to accept or decline.

Example
An example user journey can be illustrated as a summarized diagram in the following
series of steps:

1. Gibson creates an environment for the Business Approvals kit.
2. Charlotte configures the kit to match the business requirements within the Sales department.
3. Charlotte or Gibson creates a flow to trigger approvals integrated with SAP and internal systems.
4. Rebecca can use the kit to approve requests without additional training.
5. Charlotte analyzes the approval processes for improvement opportunities.


Role | Actions
Pro Developer/Admin | Gibson creates or nominates an environment to install the Business Approvals kit. Requires a role of System Customizer for the approvals kit to be installed.
Approvals Administrator | Charlotte configures the kit to match the business requirements within the sales dept.
Maker | Gibson or another Power Platform maker creates a flow to trigger approvals integrated with SAP and internal systems.
Approver | Rebecca can use Microsoft Teams to approve requests without extra training.
Approvals Administrator | Charlotte analyzes the approval processes for improvement opportunities.

Experience

Install
Gibson reviews the Setup Guide to ensure the prerequisites are in place. Using System
administrator or System customizer permissions, the approvals kit can be installed into
an environment and provide access for Approvals Administrators to access the
application.

Configure the business approval process


Charlotte uses the Process Designer to create a new business approval workflow and
define approval stages, conditions, and approval steps within each stage that match the
business requirements within the Sales department.
More information: Configure preset approvals

Approval triggers
Gibson, another maker, or professional developers within the organization can easily
integrate their existing solutions and IT systems. Triggers can include connections such
as SQL Server and Azure Functions. These individuals can start the approvals process by
adding a new cloud flow or modifying existing cloud flow actions.

The rest of the approval processes are automatically handled and managed in the
approvals kit.

More information: Trigger approvals

Integration with business applications


Common approval scenarios require integrations with business applications such as SAP
and Dynamics 365. Native connectors are available to integrate with these services.

Integration with RPA


Use desktop flows to streamline your approval processes with legacy systems,
automating your approval experiences in a modern way.

Process approval requests


Rebecca and other configured approvers receive messages as adaptive cards in
Microsoft Outlook and Microsoft Teams. These approvals include details provided by the
business approval designer, Charlotte, to determine if the request should be approved
or declined.

More information: Process approval requests, Setup out of office and delegation.
Business approvals templates approvals
kit comparison
Article • 11/20/2023

This document outlines the differences between the business approvals templates
approvals kit and the Power Platform approvals connector. The approvals kit is an
innovative low-code enhancement that seamlessly integrates the approvals connector
with user-friendly, no-code configurable Power Platform solution components.
The approvals kit enhances the approvals connector significantly, offering a range of
features to help manage complex approval processes efficiently. Use the Approvals kit
to:

Configure multi-stage approvals
Delegate decision-making authority
Monitor approval progress
Handle employee absences
Accommodate various approval process versions
Publish versioned workflows
Account for workdays and public holidays

Whether you're a seasoned workflow expert or just getting started, the approvals kit
serves as a valuable resource to streamline your approval processes effectively.

Feature comparison
This table provides a comparison between the approvals connector action and the
approvals kit. It highlights the core use case, core product features, triggers, premium
connectors, and other requirements for each option. Use this information to determine
which option is best suited for your specific needs and requirements.


Feature | Approvals Connector | Approvals Kit
Core use case | Simple approvals without need for advanced approval features | No-code workflow designer with extra features
Core product feature | Yes | No (requires install)
Trigger via Power Automate | Yes | Yes
Requires Azure application registration | No | Yes
Premium connector(s) | No | Yes (requires Dataverse and custom connector)
Requires Dataverse | No | Yes
Requires Power Apps license | No | Yes, to build and manage approvals workflows
Multi-stage approvals | Not by default. Requires multiple cloud flows or other custom implementation | Yes
Delegated approvers | Not by default. Requires custom implementation | Yes
Design / authoring | Copilot and Power Automate designer | No-code Power Apps Process Designer
Versioning | Cloud flow versions | Dataverse-based workflow stages and data versions
Workdays / holidays | Not by default, requires custom implementation | Yes

Flow comparison
This table summarizes different approaches to constructing approval processes using
the Power Platform. The options range from using out-of-the-box Copilot features and
interactive designers to custom solution creation with business process flows.


Approach | Notes
Cloud flow | Makers can use the out-of-the-box Copilot features or the interactive designer to create triggers, add actions, and include the approval actions to build approval decision-making workflows.
Business process flows | Provide a streamlined user experience that leads people through the processes their organization defines for interactions that need to be advanced to a conclusion. Modify business process flows in a custom solution to include approvals. More information: Business process flows overview
Approvals Kit - Business Process | Provides a ready-made process with no-code templates and patterns that business users can use to define a trigger and approval workflow with the advanced features listed in the feature comparison. The kit combines premade Power Apps, Power Automate, and Dataverse components to offer a no-code Process Designer to manage and monitor approvals.
Business approvals kit manual setup
Article • 01/20/2024

The business approvals kit and guidance are targeted towards the person or department
responsible for setting up an approval system in your organization. Key sections walk
you through the prerequisites, setup instructions, and individual components of the
approvals kit.

Overview
The business approvals kit is a collection of components that are designed to help you
get started with digitalizing your organization's approval processes using Microsoft
Power Platform. More information about individual components can be found in the
business approvals kit.

7 Note

The approvals kit can currently only be used in Dataverse environments; setting it up in Dataverse for Teams environments and default environments isn't supported.

Prerequisites
Microsoft Dataverse environment (Default Environment can't be used)

Required licenses:

Power Apps per user or per app license for users who will:

Configure approvals

Approve requests AND need to check progress between each approval step

Create approval requests AND need to check progress between each approval step

Administer approval processes

Power Automate Process license for users who:

Approve requests but DO NOT need to check progress between each approval step using the template

Create approval requests but DO NOT need to check progress between each approval step using the template

Power Apps and Power Automate pay as you go plans offer alternatives to
monthly user, application or flow licenses. More information: Licensing overview
for Microsoft Power Platform

A data loss prevention (DLP) policy categorized so that the following connectors can be used together in the same grouping (Business or Non-Business):

Approvals

Microsoft Dataverse

Office 365 Users

Custom connectors with allowed pattern https://*.crm.dynamics.com (for a multiple-environment policy)

Approvals Kit custom connector (for an environment-level policy)

More information: Data loss prevention policies
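To illustrate the grouping requirement above, the following hypothetical Python check verifies that the required connectors, plus any custom connector matching the allowed host pattern, all fall into a single DLP group. This is only a sketch of the rule, not how Power Platform actually evaluates DLP policies; the function and data shapes are assumptions.

```python
from fnmatch import fnmatch

# Connectors the Approvals Kit needs; all must sit in one DLP group.
REQUIRED = ["Approvals", "Microsoft Dataverse", "Office 365 Users"]
HOST_PATTERN = "https://*.crm.dynamics.com"

def kit_connectors_grouped(policy):
    """policy maps a connector name (or custom connector host URL) to its
    DLP group, e.g. "Business" or "Non-Business". Returns True when every
    required connector and every matching custom connector share one group."""
    groups = {policy.get(name) for name in REQUIRED}
    groups |= {group for name, group in policy.items()
               if fnmatch(name, HOST_PATTERN)}
    # None in groups means a required connector is missing from the policy.
    return None not in groups and len(groups) == 1
```

A policy that places all kit connectors in the Business group passes the check; moving any one of them to Non-Business fails it.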

Power Automate approvals connector capability enabled (see section on


enabling Power Automate approvals capability for steps)

Persona licenses
Mapping personas from the user journey to licenses

Persona | User Journey Reference | License

Approver | Rebecca | Microsoft Office 365 (for Office 365 or Microsoft Teams) or Power Platform standard user license (for the Power Automate maker portal).

Approval administrator | Charlotte | Microsoft Power Apps license (per user, per app, or pay-as-you-go).

Maker | Charlotte or Gibson | Power Automate Premium to author cloud flows.

Environment administrator | Gibson | Assigned Power Automate license to execute cloud flows with premium connectors included.

7 Note

Refer to Compare Power Automate Plans for plans that include the ability to
include premium connectors.

(Optional) Set up a new environment to install


1. Create an environment in which to set up the approvals kit.

a. Go to the Power Platform admin center.

b. Select Environments, then + New and enter a name, type, and purpose.

c. Select Yes to create the database, and then choose Next.

d. Ensure Sample apps and data set is set to No.

e. Select Save.

2. Go to your new environment to import the Approvals Kit solution for a manual
install.

Enable Power Automate approvals capability


The approvals kit relies on out-of-the-box approvals functionality from Power Automate. If you're using the approvals function for the first time, you must first enable it, either by using the Power Platform command line interface or manually by running a cloud flow that includes an approval action.

Command line setup


Use Power Platform Command Line to install the flow approvals solution into the
environment. In the environment allocated for the Approvals Kit, use the following
PowerShell commands as a starter script:
pwsh

$envs = (pac admin list --json | ConvertFrom-Json) |
    Where-Object { $_.DisplayName -eq "Your Environment Name" }
pac application install --environment $envs[0].EnvironmentId --application-name "msdyn_FlowApprovals"

Manual Setup
In new environments, use the Power Platform Admin Center to install the Power
Automate Approvals feature:

1. Open the Power Platform Admin center .

2. In the left navigation, select Resources.

3. Select Dynamics 365 apps.

4. From the list, select Microsoft Flow Approvals.

5. Select the ... menu for Microsoft Flow Approvals.

6. Select Install.

7. Select the environment that the Approvals Kit will be installed in.

8. Review the terms of service.

9. Agree to the terms of service and select Install.

7 Note

It can take up to 10 minutes to install.

Install the core components


We recommend that you set up a designated Approvals Kit environment for all users
within your organization who need to access the Business Approvals process. Further
information on environments is available in environments overview. If you're a business
user, you'll typically need a person with administrative access to Power Platform to
create an environment for you. Reach out to your IT department to ask for assistance on
environment setup.
7 Note

The Approvals Kit uses solution management capabilities of Dataverse to package up all assets.

To ensure consistency and the same experience for every customer, the template is provided as managed solutions.

If you would like to extend the template, you will need to use a separate unmanaged solution because you can't directly modify this template.

Import the Creator Kit


Install the creator kit using one of the options in Install the Creator Kit.

Import the solution


Using an account with System Customizer permissions, open https://make.powerapps.com and go to the environment you created or were allocated, in which the Approvals Kit should be hosted.

The first step of the installation process is to install the solution. This step is required for every other component in the Approvals Kit to work. You'll either need to create an environment or import into an existing environment (excluding the Default environment).

1. Open the Power CAT business approvals kit GitHub release site .

2. From the expanded section Assets for the latest release, download the Approvals
Kit file BusinessApprovalsKit_*_managed.zip.

3. Go to Power Apps .

4. On the left pane, select Solutions.

5. Select Import, then choose Browse.

6. Select the Approvals Kit core components solution from File Explorer.

7. Once the compressed (.zip) file is available, select Next.

8. Review the information, and then select Next.


9. Establish connections to activate your solution. If the connections don't exist,
create new connections to proceed with import.

7 Note

If you create a new connection, you must select Refresh.

10. Select Import.

Once import is complete, you should see the Business Approvals Kit in the list of solutions.

7 Note

The import can take up to 10 minutes to complete.

After import steps


Once the approvals kit solution is imported to an environment successfully, you must
update the Approvals kit custom connector to point to the target tenant Identity
provider and turn on cloud flows.

Update custom connector


You must have an app registered to interact with the Dataverse tables and the Custom API.

App Registration
Follow these steps to perform the app registration.

1. Open the Microsoft Entra admin center in a new window.

2. Select App Registration from the Application section under Identity.

3. Select New registration and provide a name, then select Register.

4. Under API permission, select Add a permission and choose Dynamics CRM.

5. Choose Delegated permission and select user_impersonation.

6. Select Add Permissions.


App registration admin consent
For the API permissions, you might need to grant tenant-wide permissions for the created application. Follow the guidance in Grant Admin Consent to provide the required permissions.

If administrator consent isn't granted, when users attempt to create a connection with
the custom connector they can receive an error similar to the following:

7 Note

[email protected]

Need admin approval

Needs permission to access resources in your organization that only an admin can
grant. Please ask an admin to grant permission to this app before you can use it.

Have an admin account? Sign in with that account.

Return to the application without granting consent.

App registration secret

For the created application, add the application secret that will be used by the custom connector using the following steps:
1. Create a secret by moving to Certificates and Secrets section and selecting New
client secret.
2. Add a description and select an appropriate expiry date.
3. Select Add.

) Important

Copy the secret value and save it. You'll use the copied value when
configuring custom connector in the next section.
You'll also need the Client ID from the Overview section.

Update the Approvals kit with a custom connector

Now you'll edit the Approvals kit custom connector present inside Business Approval
solution.

1. Under the General tab, modify the following:

Specify that the Host is the host name of your Dataverse instance. For example, contoso.crm.dynamics.com.

2. Under the Security tab, modify the following:

Select OAuth 2.0 as the Authentication type.

Enter the Client ID.
Enter the Secret noted in the previous section.
Specify the environment URL in the Resource URL section. This resource URL contains the link to your environment, in the format https://yourenv.crm[x].dynamics.com, where [x] is an optional region number.

3. Copy the Redirect URL.

4. Open the created Entra App Registration.

5. Select Authentication.

6. In the Web Redirect URIs, add the Redirect URL.

7. Select Save to update the App Registration.

8. Switch back to the custom connector.

9. Select Update connector.


10. Under the Test tab, create a New connection.

Specify the account details for the connection and allow access if prompted.
Edit the Custom connector again and test the GetPublishedWorkflow
operation.

The operation should run successfully with a status of 200.

7 Note

Not sure of your region? You can review /power-platform/admin/new-datacenter-regions.

You can obtain your environment URL from https://aka.ms/ppac under environments, or from the Power Apps portal in the settings of the environment.

Issue #144 ([Business Approvals Kit - BUG] Approvals Kit Upgrade - Documents the need to update connector with OAuth Secret) is tracking the need to update the custom connector after upgrade.

Activate the core cloud flows

The template includes multiple core components that are used to manage the approval
experience. To use the template, you must turn on the cloud flows that came with the
template.

1. Open make.powerapps.com in a new window.

2. Select Solutions, and open the Business Approvals Kit solution to view the flows.

3. Activate the cloud flows in the order listed to ensure no errors occur, as there are dependencies across the flows. Some cloud flows might already be enabled from importing the solution in the previous steps.
a. Turn on: BACore | Approval Time-out
b. Turn on: BACore | Approver OOF
c. Turn on: BACore | Cascade Process Status
d. Turn on: BACore | Cascade Publishing Activation
e. Turn on: BACore | Child | Get Dynamic Approver
f. Turn on: BACore | Child | Get Dynamic Data Instance
g. Turn on: BACore | Child | Get Default Settings
h. Turn on: BACore | Child | Log Runs
i. Turn on: BACore | Child | Evaluate Rule
j. Turn on: BACore | Daily | Calculate Approval Timeouts
k. Turn on: BACore | Publish Process
l. Turn on: BACore | Runtime -- Start Approval
m. Turn on: BACore | Runtime -- Start Node
n. Turn on: BACore | Runtime -- Start Stage
o. Turn on: BACore | Runtime -- Start Workflow
p. Turn on: BACore | Runtime -- Update Approval
q. Turn on: BACore | Runtime -- Update Node Instance
r. Turn on: BACore | Runtime -- Update Stage Instance
s. Turn on: BACore | Sync Approver OOF

Once installation is complete for the core components, your next step is to set up the
approval processes in How to use Approvals Kit section.
Configure preset approvals
Article • 01/13/2024

The business approvals kit allows you to predefine approval workflows so that you don't
need to set up each approver manually.

To configure preset approvals, you must set up:

The Approvals Process


The Application Data
The Stage
The Condition (Optional)
The Node
The Approver

Define processes
First step of configuring preset approvals is to set up Processes. Processes are where all
information related to the particular approval process is defined.

1. Go to Power Apps .

2. Select Apps, then choose Business Approval Management.

7 Note

When you open the application for the first time, it prompts you to give
consent to the Office 365 connector.

3. Switch to Approvals Designer in the bottom-left corner of the screen.

4. Select Processes.

5. Select New (Process Designer).

6. Enter the Name of the process, the Category (optional), and the Description
(optional).

7. In the Default Approver, start typing the name and select from list.

8. Select Save.
Define application data
In a typical approval request, you often need to also submit information about the approval. For example: amount, project category, department, accommodation, chart of account number, or cost center code.

The approvals kit allows you to use these types of data from Power Platform, other
applications, and systems. Use data retrieved via connectors or manually enter the data
when you make the request. To use the data in your approval process, define them in
Application Data.

1. Enter the Field Name

2. Select the Data Type from the following:

Text
Number
Boolean
Date/Time
User (Email) - Can be used as approval user

3. (Optional) Enter the Default Value.

4. (Optional) Enter the Description.

5. Select + Add.

6. Repeat these steps until you add all required application data.

Define workflow stages and nodes


In the Approvals kit, you can define stages in a workflow process. A minimum of one
stage is required even for a simple workflow. Each stage then includes at least one node.
The node is where you define who is going to be the actual approver. You can define
multiple nodes within one stage, and each node is run sequentially.

After the first stage is defined, you can add more stages with conditions, which allows
you to branch out to different nodes for sophisticated scenarios.

Define the first workflow stage


1. Select Workflow > New Stage.
2. Enter a Name and Description.
3. Select Save & Back.

Define the first node


Once you define your first stage, you can add the first node.

1. Select the + sign.

2. Enter a Name and Description.

3. Select the Approval Type.

Approval type | Description

Approve/Reject - Everyone must approve | Every person included in this node has to approve to proceed to the next steps.

Approve/Reject - First to respond | Only a single person included in this node has to approve to proceed to the next steps.

Custom Responses - Wait for all responses | Define multiple responses beyond Approve/Decline. Every person included in this node has to respond to proceed to the next steps.

Custom Responses - Wait for one response | Define multiple responses beyond Approve/Decline. Only a single person included in this node has to respond to proceed to the next steps.

4. Select either User or Dynamic.

User: You can select a specific user/employee.


Dynamic: The user information is automatically retrieved and set as approver.
5. Select Notification.
a. Default
b. None

6. Select a Delegation Rule.

Delegation rule | Description

None | No delegation settings set by the approver are applied to this workflow.

Time-out | Delegation settings set by the approver are automatically applied if the assigned approver doesn't respond within the number of days defined.

Out of Office | Delegation settings set by the approver are automatically applied if the approver is out of office at the time the approval is received.

Time-out or Out of Office | Delegation settings set by the approver are automatically applied if the assigned approver doesn't respond within the number of days defined, or if the approver is out of office at the time the approval is received.

7. Select a Time Out Setting.

Actual Days: The timeout calculation is the actual number of days since the
node is started, and doesn't take holidays into consideration.

Business Days: The timeout calculation considers the number of business


days passed based on the approvers work profile settings, and company
holiday settings.

8. Select Save & Back.

Repeat steps if you would like to add more nodes in the same stage.

7 Note

You aren't limited to one approver in each node; you can add multiple approvers together in a single node.
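The Business Days time-out calculation described above can be sketched as follows. This is an illustrative Python approximation under stated assumptions, not the kit's implementation; the real calculation also consults the approver's work profile settings.

```python
from datetime import date, timedelta

def business_days_elapsed(start, today, holidays=()):
    """Count business days between start and today, skipping weekends
    and any company holidays (simplified sketch of Business Days)."""
    days, current = 0, start
    while current < today:
        current += timedelta(days=1)
        if current.weekday() < 5 and current not in holidays:
            days += 1
    return days

def timed_out(start, today, limit_days, holidays=()):
    """True when the node has waited at least limit_days business days."""
    return business_days_elapsed(start, today, holidays) >= limit_days
```

By contrast, the Actual Days setting would simply use `(today - start).days`, ignoring weekends and holidays.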

Add a conditional stage with a switch condition


After adding the first stage of the workflow, you can add conditional approval by defining conditions in stages. A switch condition allows you to have 2-5 different paths that the flow branches to depending on the condition you set.

1. Select Add Stage.

2. Enter a Name and Description.

3. Change the Condition to Switch.

4. Select the number of Paths.

5. Select a Source.

Request Data: The system automatically retrieves the application data from
the approval request to use as condition.

Previous Node Outcome: The system automatically retrieves the outcome of


the previous node (such as Approve/Reject or any custom options you
defined) to use as condition.

6. Fill in the options for each path.

Select either Static value or Request Data.

Static value: You must define the condition yourself.


Request Data: The system compares the Request Data defined in step 5
against what request data you define here.

Switch example
This example shows a scenario where the expense approval branches off to different nodes depending on which expense reimbursement category was selected in the original request.
Once defined, the Process Configurator shows three paths to add nodes to. If Gift is selected, Jamie from General Affairs is the approver; if Equipment is selected, Karen from Procurement is the first approver, followed by her manager (Procurement Director). If neither option is selected, the request goes to the next stage.
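The routing in this switch example can be sketched as a simple lookup. The approver names and categories come from the scenario above; the function and data structure are illustrative assumptions, not kit APIs.

```python
# Hypothetical approver chains for each switch path in the scenario.
SWITCH_PATHS = {
    "Gift": ["Jamie (General Affairs)"],
    "Equipment": ["Karen (Procurement)", "Procurement Director"],
}

def route(category):
    """Return the approver chain for a category; an empty list means the
    request falls through to the next stage."""
    return SWITCH_PATHS.get(category, [])
```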

Add a conditional stage with an If/Else Condition


After adding the first stage of the workflow, you can add conditional approval by defining conditions in stages. An If/Else condition allows you to outline a specific requirement (for example, an amount greater than 5,000 USD, or a company code that starts with 10).

1. Select Add Stage.


2. Enter a Name and Description.

3. Change the Condition to If/Else.

4. Select a Source.

Request Data: The system automatically retrieves the application data from
the approval request to use as condition.
Previous Node Outcome: The system automatically retrieves the outcome of
the previous node (such as Approve/Reject or any custom options you
defined) to use as condition.

5. Select an operand.

7 Note

The available operands depend on the type of data you selected in step 4.

6. Fill in the condition.

Select either Static Value or Request Data.

Static Value: You must define the condition yourself.


Request Data: The system compares the Request Data defined in step 4
against what request data you define here.

If/Else example
This example shows a scenario where expense approval branches off to two paths depending on the amount of expense reimbursement requested. The condition is for expenses greater than or equal to 5,000 USD.
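The If/Else evaluation can be sketched as applying the chosen operand to a request-data field. This is an illustrative sketch; the field names and operand symbols are assumptions, not the kit's internal representation.

```python
import operator

# Operands an If/Else stage might offer (symbols are illustrative).
OPERANDS = {">=": operator.ge, ">": operator.gt, "=": operator.eq,
            "<=": operator.le, "<": operator.lt}

def evaluate_if_else(request_data, field, operand, static_value):
    """Compare a request-data field against a static value, mirroring
    the If/Else condition configured above."""
    return OPERANDS[operand](request_data[field], static_value)
```

For the scenario above, a 6,500 USD request satisfies the `>= 5000` condition and takes the conditional path, while a 1,200 USD request does not.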
Publishing workflow
Once you finish configuring workflows, you must publish the workflow to be able to use
it. The system automatically checks and validates that the workflow doesn't have any
errors.

1. Open process you want to publish.


2. Select Process Designer > Publish.
3. For Activate process when published, select Yes.
4. Select Publish.

Once you select publish, a new version is stored and the system begins the publishing
process. When complete, the system shows the Activation status as Active.

You're now ready to take approval requests.


Set up out of office and delegation
Article • 11/20/2023

The approvals kit has settings to manage your out of office status and approval delegations so that approval requests received while you're away can be delegated to somebody else to make the approval decisions.

Out of office setup steps


To set up the out of office and delegation, follow these steps:

1. Open Approvals Manager app.

2. Select Out of Office Settings.

3. Select Create new.

4. Choose the desired start and end dates/times of when you're out of office.

5. (Optional) Enter a title for your out of office.

6. Choose the response type and select Next.

7. Choose whether you're able to access emails while you're away and select Next.

8. Select the contacts you would like to add while you're away from the dropdown list. If a contact isn't listed in the dropdown, follow these steps:

a. Select Add contact.

b. Search for the contact you would like to add.

c. Select the contact.

d. Select Done.

9. (Optional) Enter how each contact should be referred to while you're away.

10. Select Next.

11. Select the person you would like to delegate your approval requests to from the
dropdown list.

12. (Optional) Modify the generated message to customize what you would like to
send to the person you're delegating to.
13. Select Next.

14. Review the message used to automatically reply when a new email arrives.

15. Select Submit.

While your out of office settings are enabled, any new approval requests are delegated
to whoever you specified within the date and time you selected.

7 Note

Delegation doesn't occur if:

You set your out of office settings to off.

The request doesn't fall within the out of office dates and times you specified.
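The delegation rule above can be summarized in a small sketch. The function and field names are assumptions for illustration; the kit stores these settings in Dataverse and applies them through cloud flows.

```python
from datetime import datetime

def delegate_to(request_time, oof):
    """Return the delegate for a new approval request, or None when the
    out of office settings are off or the request falls outside the
    configured window."""
    if not oof.get("enabled"):
        return None
    if not (oof["start"] <= request_time <= oof["end"]):
        return None
    return oof["delegate"]
```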
Process approval requests
Article • 11/20/2023

There are multiple options to process approval requests and multiple results to choose
from.

The approvals kit has standard approve and decline options and several other options to
accommodate complex approval scenarios. Review this content to understand how the
system handles the approval request based on results.

Approve from Microsoft Teams


You can respond directly to an approval request from Microsoft Teams. A notification is sent in Teams when you receive a request. You can also respond by checking the Teams Approvals app.

More information: Respond to an approval in Microsoft Teams

Approve from Outlook


You can approve requests directly from Outlook or Outlook online. When an approval request is received, an approval email is sent to the approver.

Approve from Power Automate mobile app


If you would like to make approvals while you're on the go, you can use the Power Automate mobile app. To install it, go to the Power Automate mobile website .

Once you download the app:

1. Open Power Automate mobile app.

2. Select Activity.

3. Select Approvals.

4. Select the approval request you would like to address.

7 Note
The screens on Android, iOS and Windows Phone may differ slightly; however, the
functionality is the same on all devices.

Approve from Power Automate approvals center

You can approve a request from the Power Automate approvals center. To make an approval, follow these steps:

1. Go to Power Automate website .

2. Select Action items.

3. Select Approvals.

4. Select the approval request you would like to address.

5. Choose your response.

6. Select Confirm.

Approvals status reference


The Business Approval Management app uses icons as a visual guide to approval status.
Status | Description

Approve | Approves the request. The Approvals Kit automatically sends the request to the next approver, or completes the approval with status set to Approved.

Approve with condition | Approves the request with a condition. The Approvals Kit automatically sends the request to the next approver, or completes the approval with status set to Approved with condition.

Reject | Declines the request. The Approvals Kit automatically terminates the approval request and sets the approval status to Declined.

Send back | Sends the request back to the previous approver. The Approvals Kit sets the current approval request status back to Not Started, and looks up the previous approval request to override its existing result to Pending so the previous approver can look at the request again. If two or more requests occurred in the previous step simultaneously, all of the previous steps are processed in the same way.

Reassign | Reassigns the approval request to whoever you specify. The approval request remains Pending.

In addition to your decisions, there are also four other statuses that can be shown in the approval status.

Status | Description

Not Started | The approval request hasn't started and is awaiting the previous request to complete.

Pending | The approval status shows as Pending when the approval request is awaiting a decision from the approver.

Pending (Timeout) | The approver hasn't responded to the approval request within the first 30 days, and the Power Automate cloud flow run that is managing this request is restarted automatically. Once restarted, the status changes back to Pending.

System Processing | This status shows if the cloud flow is still processing the results.
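The Send back behavior in the status table can be sketched as a simple status transition. This is illustrative Python, not the kit's cloud-flow implementation; the data structure is an assumption.

```python
def send_back(requests, index):
    """Reset the current request (at index) to Not Started and reopen
    every request in the immediately preceding step as Pending, so the
    previous approvers see the request again."""
    requests[index]["status"] = "Not Started"
    previous_step = requests[index]["step"] - 1
    for request in requests:
        if request["step"] == previous_step:
            request["status"] = "Pending"
    return requests
```

When the previous step had two simultaneous approvals, both are reopened, matching the behavior described in the table.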
Nurturing and educating the user community and the technical approvals community
Article • 11/20/2023

Are you looking to streamline your organization's approval processes and reduce manual effort? Look no further than the Approvals Kit, a low-code solution that allows you to automate approvals across various platforms. In this article, we discuss setting up the Approvals Kit, aligning it with your environments, and scaling adoption. Once you're up and running, we show you how to take advantage of the Approvals Kit learn module. Using the learn module, you can learn and apply automated approvals processes either individually or as a group. Finally, we discuss how to scale your automated approvals with the help of workshop instructor guides and mentors from earlier cohorts of learning. Let's get started!

Learning levels
Example learning levels

https://aka.ms/approvals-kit/learn

https://aka.ms/approvals-kit https://aka.ms/approvals-
kit/instructor

https://aka.ms/approvals-kit/setup

Level 100 Level 200 Level 300


Get hands on / Apply to your Scale and Nurture-
Understand what we have problem Help others use it

Level | Description | Related content

Level 100 | Typically involves learning what the Approvals Kit is and how it's set up and applied to your organization | User Journey, Setup Guide

Level 200 | Involves getting hands on and using the Approvals Kit. Once you have mastery of the key concepts, apply the learnings to automating your approvals process | Approvals Kit in Power Automate workshop overview

Level 300 | Scale and nurture others in the organization to adopt and use the Approvals Kit | Instructor Guide

It's not enough to simply set up the Approvals Kit and expect everyone to start using it.
To truly reap the benefits of this low-code solution, it's important to create a process to
nurture and educate both the user community and the technical community. In addition,
it's important to involve the governance team to ensure that the Approvals Kit aligns
with your organization's policies and regulations. Microsoft provides a set of best
practices for nurturing adoption of Power Platform solutions, which can be extended to
the Approvals Kit. These best practices include creating a center of excellence,
identifying champions, providing training and support, and measuring success. By
following these best practices and tailoring them to the Approvals Kit, you can ensure
that your organization fully embraces this powerful tool.

Learning personas
To effectively nurture and support the approvals user journey, it's important to consider
the needs of each persona involved.

For the Power Platform administrator, the focus should be on setting up the Approvals Kit. Consider how you govern and monitor its usage and ensure that service level agreements with the business are met. From a technical point of view, this includes creating and assigning environments and importing the approvals kit solution as a system administrator or customizer.

For the Approvals administrator, the focus should be on configuring the kit to match the
business requirements and analyzing the approval processes for improvement opportunities. Learning goals include how to design, model, and build low-code approval automation, and how to work with other makers to trigger and respond to approvals.

For the Approver, the focus should be on understanding how to set up out of office and
delegate approvals to others.

Finally, for the instructor persona, the focus should be on setting up, provisioning, and monitoring a group of learners as they apply the Approvals Kit. The Instructor Guide covers these activities.
By creating a learning path tailored to each persona's needs, you can ensure that
everyone involved in the approvals process is equipped with the knowledge and skills
they need to succeed.

Factors to consider
When adopting the Approvals Kit as part of your low-code solution, there are several factors to consider to ensure a successful implementation. As a starter for discussion with your user and center of excellence communities, three key points to consider are included for each factor.

By considering these factors and addressing them appropriately, you can ensure a
successful adoption of the Approvals Kit as part of your low-code solution.

Environment strategy
Determine the appropriate number of environments across development, test and
production for approvals process.
Access to data and isolation requirements for approval workflows.
Setup of security roles and permissions for different personas.

Data loss prevention policy


Understand the data that needs to be integrated with the Approvals Kit and how that data is protected.
Consider the impact of data loss prevention policies on the approvals process and
ensure that they're properly configured.
Ensure that data is properly classified and labeled to prevent data leakage.

Automation skills and patterns


Determine the appropriate level of automation for your approvals processes.
Identify common patterns and use cases for approval automation and develop
reusable components.
Ensure that automation is properly tested and validated to prevent errors and
ensure reliability.

Exception management
Develop a process for handling exceptions and errors in the approvals process.
Ensure that exceptions are properly logged and tracked for analysis and
improvement.
Develop contingency plans for handling unexpected events or failures.

Monitoring
Develop a process for monitoring the performance and usage of the Approvals Kit.
Identify key metrics to track, such as approval time and volume.
Ensure that monitoring is properly configured and alerts are set up for critical
events.
Frequently asked questions
Article • 11/20/2023

This article provides you with answers to frequently asked questions and tips on setting
up and using the Business Approvals Kit.

Why is this feature in preview?


The underlying components that the Kit is created from are generally available. To help
balance improvement with moving from preview to public usage, we're introducing this
feature as a preview feature prior to general availability.

If you're an early adopter and think this feature could be useful to you, try it out and
help test the feature. We recommend that you don't depend on it yet, and first try it out
in a dedicated test environment. Users evaluating features and providing feedback helps
us validate that the kit meets your needs. The preview process allows you to ensure that
we're not introducing unintended side effects and we can then progress to removing
the preview label for the kit.

Your feedback is critical to this process. Post your feedback by raising an issue on
GitHub.

What are the license requirements and costs for using the Approvals Kit?

License costs depend on the number of environments that you install the Approvals Kit
into and the different personas involved with the kit in your user journey.

What permissions are required for the Approvals Kit?
Refer to the Setup guide for user permissions and Microsoft Entra application
permissions to set up and use the kit.

Are there limitations in terms of how many stages can be created?
The main limit isn't on the stages of your workflow but at a request limit level. The
limitations guide provides more information on rate limits that could apply.

What are the key elements I need to consider when setting up the kit?

The user journey outlines the Power Platform administrator persona, and the setup guide
includes information on dependencies and data loss prevention policies.

Why is the kit not "free", and why does this kit
require premium licenses?
We don't build any of our kits to drive licenses or avoid licenses. We look at which
features of the platform best enable us to build our kits and run them at the scale we
anticipate customers using them.

We invest in the ongoing maintenance and support cost of building and maintaining the
Approvals Kit. The kit is open source; you're always welcome to use it as a reference
implementation to build your own.

The kit is built with cloud flows that run independently, outside the context of a
Power App. The approvals workflows are typically automated backend activities (not
run manually by a person). As a result, these automated backend activities require a
Power Automate process license that supports premium connectors.

To author and publish Power Automate cloud flows using the Premium connectors, you
need a Power Automate license that supports Premium connectors running as a user.

You can refer to our Frequently asked questions about Power Automate licensing with
further licensing related questions.

I would like to try the Approvals Kit for nonproduction usage. What options are there?

Some options that could be explored are using the following licenses together with
a Development environment:

Power Automate Trials


Power Apps Developer Plan
Limitations
Article • 11/20/2023

The Approvals Kit is created to address some of the most common approval patterns;
however, there are limits to what a template can do to accommodate every possible
approval scenario. The kit is intended to act as a base solution for approvals. Some
requirements might not fit your organization out of the box; you can tailor the
template to your needs by customizing it.

There are some technical limitations to this template, and the following sections
describe them.

Business approval management


The following limits apply to the business approval management application:

Conditions can't be applied to the first stage of a workflow.

Power Automate action request limits


To help ensure service levels, availability, and quality, there are entitlement limits for the
number of requests users can make each day across Power Apps and Power Automate.

The Approvals Kit consumes approximately 30 actions in Power Automate per approval.
This equates to approximately:

160 approvals per day per user for the Power Apps per-user plan
30 approvals per day per user for the Power Apps per-app plan

If you exceed this request amount, you might consider add-on licenses. More
information: Requests limits and allocations
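The relationship between the plan figures above can be sketched as a simple division: approvals per day equals the plan's daily request allocation divided by the roughly 30 actions one approval consumes. The allocations below are illustrative assumptions back-solved from the figures above, not authoritative plan limits; always check the current limits documentation.

```python
# Sketch: approvals per day = daily request allocation // actions per approval.
# The allocations used here are illustrative assumptions only.
ACTIONS_PER_APPROVAL = 30  # approximate actions the Approvals Kit consumes per approval


def approvals_per_day(daily_request_allocation: int) -> int:
    """Return how many kit approvals fit within a plan's daily request allocation."""
    return daily_request_allocation // ACTIONS_PER_APPROVAL


# Hypothetical allocations consistent with the figures above.
print(approvals_per_day(4800))  # assumed per-user plan allocation -> 160
print(approvals_per_day(900))   # assumed per-app plan allocation -> 30
```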

Supported languages
The Approvals kit solutions aren't localized, and only support English. Add the English
language pack to your environment to make sure all apps and flows work. More
information: Regional and language options for your environment
Approvals kit content overview
Article • 11/20/2023

The Approvals Kit is built on the out-of-the-box Approvals connector using a Power
Platform solution. The solution includes a set of Power Apps, Power Automate, and
Dataverse components that make business approval processes easier to author
and trigger.

The kit enables you to rapidly make changes without the need to update or deploy a
Power Platform solution. You can create variations for one approval without impacting
other approval processes. Additionally, Dataverse provides auditing features to record
approvals processes.

Key components of the kit:

Process designer: A Power App that allows business users to create and version
approval workflows with input application data, stages and conditions. Data used
by the process designer is stored in Dataverse design time tables.
Custom connector: Provides a simple way for makers to use a Power Automate
trigger to start the process of a business approval based on the Power Platform
connectors and actions.
Dataverse: A set of custom tables that allow workflows to be defined and
monitored.
Power Automate cloud flows: A set of cloud flows that react to changes in the
Dataverse tables to manage the end to end approval process.
Consuming apps / flows: Power Platform solutions can create a connection
reference to the approvals Kit connector to begin a business approval workflow.
Power Automate
This section summarizes the Power Automate components that make up the approvals
kit.

Cloud flows
The following list describes each cloud flow in the kit and its purpose.

BACore | Approval Time-out: Handles the timeout counter reaching zero by using a Dataverse trigger that retrieves a list of business approvers. It then checks whether the delegate is out of office and assigns the backup delegate if necessary. If there's no delegate or backup delegate, it retrieves the default approver. Finally, it creates an override for the delegate if one exists.

BACore | Approver OOF: Subscribes to a Dataverse trigger on business approvers and listens for changes in a specific entity. When a change occurs, it retrieves a list of delegates and their out-of-office status. It then selects an available delegate and creates an override for any running approval instances assigned to the original approver who is out of office. The override sets the delegate as the new approver and provides a reason for the override.

BACore | Cascade Process Status: Cascades process status from the business approval process, handling application data, stage status, node status, and conditions.

BACore | Cascade Publishing Activation: Updates the associated runtime data status. When triggered, it retrieves all associated runtime data records for a specific process version and updates their state code to match the trigger's state code.

BACore | Child | Activate Published Workflow: Copies business approval runtime data to create business approval published workflows and business approval published runtime data.

BACore | Child | Evaluate Rule: Evaluates a business approval node condition.

BACore | Child | Get Default Settings: Gets the business approval settings, or default values if none are set.

BACore | Child | Get Dynamic Approver: Looks up a user or manager from the Office graph to add approvers to business approvals.

BACore | Child | Get Dynamic Data Instance: Gets approver or business approval data instance data.

BACore | Child | Log Runs: Logs data to business approval instance logs.

BACore | Copy Process: Triggered with requests from a Power App (V2). It retrieves a list of records from an entity named business approvers from Dataverse.

BACore | Daily | Calculate Approval Timeouts: Decrements the timeout counter of business approval instances. The counter is decremented by one day if the timeout mode is set to business days and it's a workday with no holidays; otherwise, the counter is decremented by one calendar day.

BACore | Publish Process: Copies the process definition, stages, conditions, and nodes.

BACore | Runtime -- Initialize Workflow Queue: Subscribes to a Dataverse trigger on business approval workflow queues and creates a new workflow instance for a business approval process. It retrieves data from a runtime data instance and creates a new record for each item in the data instance. The flow also retrieves the active version of the business approval process and creates a new record for the workflow instance.

BACore | Runtime -- Start Approval: Subscribes to a Dataverse trigger on business approval instance and creates a new workflow instance for a business approval process. It retrieves data from a runtime data instance and creates a new record for each item in the data instance. The flow also retrieves the active version of the business approval process and creates a new record for the workflow instance.

BACore | Runtime -- Start Node: Subscribes to a Dataverse trigger on business runtime node approval instance that creates approvals, stages, and conditions. It includes the ability to move to a new stage based on conditions. A node can complete a business approval workflow or a defined stage.

BACore | Runtime -- Start Stage: Subscribes to a Dataverse trigger on business approval runtime stage instance. It sets the first node to process in the workflow if nodes are defined, or completes the stage if no nodes are defined.

BACore | Runtime -- Start Workflow: Subscribes to a Dataverse trigger on business approval workflow. It initializes the first stage, or saves an error if no stages are defined.

BACore | Runtime -- Update Approval: Subscribes to a Dataverse trigger on business approval instance. When a change occurs, it checks the status of the instance and performs different actions based on the outcome. If the outcome is Approve, it cancels other running instances and updates the node instance status. If the outcome is Reject, it cancels other running instances and updates the node instance status to Canceled.

BACore | Runtime -- Update Node Instance: Subscribes to a Dataverse trigger on business approval runtime node instance. When a change occurs, it checks the instance status field and performs different actions based on its value: it either creates a new node instance, completes the runtime stage instance, or cancels the runtime stage instance.

BACore | Runtime -- Update Stage Instance: Subscribes to a Dataverse trigger on business approval runtime stage instance. Depending on the status, it either creates a new stage instance or updates the workflow instance to complete.

BACore | Sync Approver OOF: Runs a daily task to set or clear the out-of-office state.

BACore | Update Active Published Workflow: Subscribes to a Dataverse trigger on business approval version and calls the child flow Activate Published Workflow.

Dataverse
This section summarizes the Dataverse components that make up the approvals kit.

Tables
There are several types of tables available:

Process Definition Tables - Define the approval processes. These tables are used to
look up and configure approval processes to fit your business needs.
Version Tables - Define the versions of published processes.
Reference Tables - Define approvers, work profile and calendar dates.
Runtime Tables - Store the results/status of the approvals.

Process definition tables

The following tables are used for definition.

Business Approvals Process: One record per approval scenario that the approvals kit manages. Example: Invoice Approval.

Business Approval Data: One record per data item used to define the input fields that are used within the approval process. Examples: request amount, transportation method, and department.

Business Approval Stage: One record per stage within the approval process. Examples: Manager Approval or Line Manager Approval.

Business Approval Condition: Defines an optional condition for a stage. Examples: None, If, Switch.

Business Approval Node: Defines each step of the approval, including the type of approval of that step and the approver.

Reference tables

Business Approver: Defines the approver or approvers associated with each node within the approval process.

Business Approval Work Profile: Set up for each approver to define settings such as out of office.

Business Approval Holiday Calendar: Defines the company holidays, such as nonworking days and weekends (for example, Saturdays).

Business Approval Public Holidays: Defines public holidays separately from the organization's holiday calendar.

Business Approver OOF: Defines the user's out-of-office settings.

Version tables
Business Approval Version: A saved version of a business approval process.

Business Approval Published Workflow: Defines a published approval process version.

Business Approval Published Runtime Data: Defines the data for a published approval process version.

Business Approval Published Runtime Stage: Defines the stage or stages for a published approval process version.

Business Approval Published Runtime Node: Defines the node or nodes for a published approval process stage.

Runtime tables
For each approval request that is made, data is stored into runtime instance tables.
These entries are based on the configuration of the definition version tables.

Business Approval Workflow: Used to manage all incoming approval requests and the related tables used for the triggered request. A record is created for a version of a published business approval process, with the approval generated from data in your source system.

Business Approval Runtime Data Instance: Stores the runtime values of the fields defined in Approval Data. For example, if you define "Request Amount" in the Approval Data table, the instance data stores the actual request amount for the request.

Business Approval Runtime Stage Instance: Creates a working copy of the Business Approval Stage version. The record is used to hold each transactional stage of the approval; any changes made during runtime are saved here.

Business Approval Runtime Node Instance: Creates a working copy of nodes as a transactional reference to the approval request. Used to hold each transactional step of the approval, including the type of approval of that step and the approver; any changes made during runtime are saved here.

Business Approval Instance: Stores a transactional reference of the approval for a specific node instance.

Business Approval Instance Log: Stores a transactional reference of the approval reference and outcome.

Business Approval Instance Override: Stores a transactional reference to a new approver that overrides the original approval instance.

Data model

An entity relationship diagram shows the relationships between the key tables.

Notes:

Tables in green are the current working version of a Business Approval Process
Tables in blue are the related tables that save information about a version of a
process
Tables in black are the related tables that save transactional information for a
specific approval process
Custom connector
The Approvals Kit includes a custom connector to help the process of starting a new
business approval process.

The custom connector provided as part of the kit uses Custom code support to query
the defined workflow processes and application data (variables) required by the
workflow.

Dynamic parameters
The custom connector makes use of multiple OpenAPI actions to get published
workflows, get data fields and start a business approval process.

Get published workflows

The get published workflow action queries Dataverse to return currently active and
published workflows so that the maker can define which approvals workflow should be
started.

Get approval data fields

To get the data fields for a workflow, the custom connector makes use of dynamic
schema support, so that x-ms-dynamic-schema calls the custom code action.

The custom code actions include the C# code to query the data fields so that the maker
can provide the required fields.

Create workflow instance

The create workflow instance action saves the business approval workflow data as a
JSON serialized string so that the approvals process is started.

Operations
This connector depends on three operations:

GetPublishedWorkflows
GetApprovalDataFields
CreateWorkflowInstance

GetPublishedWorkflows

This operation helps in listing all the published business approval templates by calling
the native API and reading the table Business Approval Published Workflows.

/api/data/v9.2/cat_businessapprovalpublishedworkflows
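As a minimal sketch of what this read looks like outside the connector, the following Python snippet calls the same collection URL against the Dataverse Web API. It assumes you already have the environment URL and an OAuth bearer token (for example, acquired via MSAL); token acquisition is out of scope here.

```python
# Hedged sketch: list published business approval workflows via the Dataverse Web API.
# Assumes env_url is your environment root and token is a valid OAuth bearer token.
import json
import urllib.request


def published_workflows_url(env_url: str) -> str:
    """Build the collection URL that the GetPublishedWorkflows operation reads."""
    return f"{env_url}/api/data/v9.2/cat_businessapprovalpublishedworkflows"


def get_published_workflows(env_url: str, token: str) -> list:
    req = urllib.request.Request(
        published_workflows_url(env_url),
        headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Dataverse collection responses wrap the rows in a "value" array.
        return json.load(resp)["value"]
```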
GetApprovalDataFields
This operation helps in fetching the list of dynamic schemas applicable for the selected
workflow. Under the hood, it calls a Custom API ( GetDynamicParameters ) which returns
the schema definition based on the workflow ID provided/selected.

/api/data/v9.2/cat_GetDynamicParameters(ProcessID={processID})

7 Note

The return values of the API, which are of type Entity, are further modified using
custom code for custom connector to support and align with the Open API schema.
For more information, see the ModifySchema() method in the Script.csx file.

CreateWorkflowInstance
This operation helps in creating a record in businessapprovalworkflowqueues by making
a POST operation to the native API.

/api/data/v9.2/cat_businessapprovalworkflowqueues

Here, a CreateWorkflowQueue() method in the custom connector's custom code prepares
and parses the JSON, which holds all the parameter values and the workflow ID (also
known as ProcessID).

7 Note

This operation performs the final POST call to create the record.

Here are the two values that are updated during the POST call:

JSON

"cat_runtimedata": "[{\"id\":\"9a664958-c656-ee11-be6f-0022482a97de\",\"value\":\"True\"},{\"id\":\"7ddc1057-c656-ee11-be6f-0022482a91f4\",\"value\":\"123\"},{\"id\":\"49dc1057-c656-ee11-be6f-0022482a91f4\",\"value\":\"\"},{\"id\":\"9b664958-c656-ee11-be6f-0022482a97de\",\"value\":\"10/11/2023 8:49:51 AM\"},{\"id\":\"7cdc1057-c656-ee11-be6f-0022482a91f4\",\"value\":\"Another title\"}]",
"cat_processid": "57b2ee16-ea51-ee11-be6f-0022482a97de"
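The key detail in this payload is that cat_runtimedata holds nested JSON serialized into a single string. The sketch below shows one way that record could be assembled; the function name is hypothetical (it is not the kit's CreateWorkflowQueue() implementation), and the IDs used are taken from the sample payload above purely for illustration.

```python
# Hedged sketch of assembling the workflow-queue record: the runtime data is a
# list of {id, value} pairs serialized into one JSON string, as in the payload above.
import json


def build_workflow_queue_record(process_id: str, runtime_data: list) -> dict:
    """Assemble a record shaped like the POST body shown above (illustrative only)."""
    return {
        # Nested JSON must be a serialized string, not an embedded object.
        "cat_runtimedata": json.dumps(runtime_data),
        "cat_processid": process_id,
    }


record = build_workflow_queue_record(
    "57b2ee16-ea51-ee11-be6f-0022482a97de",
    [{"id": "9a664958-c656-ee11-be6f-0022482a97de", "value": "True"}],
)
```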
