SQL Server Integration Services
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
SQL Server Integration Services is a platform for building enterprise-level data integration and data
transformation solutions. Use Integration Services to solve complex business problems by copying or
downloading files, loading data warehouses, cleansing and mining data, and managing SQL Server objects and
data.
Integration Services can extract and transform data from a wide variety of sources such as XML data files, flat files,
and relational data sources, and then load the data into one or more destinations.
Integration Services includes a rich set of built-in tasks and transformations, graphical tools for building packages,
and the Integration Services Catalog database, where you store, run, and manage packages.
You can use the graphical Integration Services tools to create solutions without writing a single line of code. You
can also program the extensive Integration Services object model to create packages programmatically and code
custom tasks and other package objects.
Resources
Get help in the SSIS forum
Get help on Stack Overflow
Follow the SSIS team blog
Report issues & request features
Get the docs on your PC
What's New in Integration Services in SQL Server 2016
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic describes the features that have been added or updated in SQL Server 2016 Integration Services. It also
includes features added or updated in the Azure Feature Pack for Integration Services (SSIS) during the SQL Server
2016 time frame.
Manageability
Better deployment
SSISDB Upgrade Wizard
Run the SSISDB Upgrade Wizard to upgrade the SSIS Catalog database, SSISDB, when the database is older than
the current version of the SQL Server instance. This occurs when one of the following conditions is true.
You restored the database from an older version of SQL Server.
You did not remove the database from an Always On Availability Group before upgrading the SQL Server
instance. This prevents the automatic upgrade of the database. For more info, see Upgrading SSISDB in an
availability group.
For more info, see SSIS Catalog (SSISDB).
Support for Always On in the SSIS Catalog
The Always On Availability Groups feature is a high-availability and disaster-recovery solution that provides an
enterprise-level alternative to database mirroring. An availability group supports a failover environment for a
discrete set of user databases known as availability databases that fail over together. For more information, see
Always On Availability Groups.
In SQL Server 2016, SSIS introduces new capabilities that let you easily deploy to a centralized SSIS Catalog (that is,
the SSISDB user database). To provide high availability for the SSISDB database and its contents - projects,
packages, execution logs, and so on - you can add the SSISDB database to an Always On Availability Group, just
like any other user database. When a failover occurs, one of the secondary nodes automatically becomes the new
primary node.
For a detailed overview and step-by-step instructions for enabling Always On for SSISDB, see SSIS Catalog.
Incremental package deployment
The Incremental Package Deployment feature lets you deploy one or more packages to an existing or new project
without deploying the whole project. You can incrementally deploy packages by using the following tools.
Deployment Wizard
SQL Server Management Studio (which uses the Deployment Wizard)
SQL Server Data Tools (Visual Studio) (which also uses the Deployment Wizard)
Stored procedures
The Management Object Model (MOM) API
For more info, see Deploy Integration Services (SSIS) Projects and Packages.
Support for Always Encrypted in the SSIS Catalog
SSIS already supports the Always Encrypted feature in SQL Server. For more info, see the following blog posts.
SSIS with Always Encrypted
Lookup transformation with Always Encrypted
Better debugging
New ssis_logreader database-level role in the SSIS catalog
In previous versions of the SSIS catalog, only users in the ssis_admin role could access the views that contain
logging output. There is now a new ssis_logreader database-level role that you can use to grant users who aren't
administrators permission to access the views that contain logging output.
There is also a new ssis_monitor role. This role supports Always On and is for internal use only by the SSIS
catalog.
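For example, giving a non-administrator read access to the logging views is a matter of adding that person's database user to the new role. The following is a minimal sketch (not from the original article); the login and user names are placeholders:

USE SSISDB;
GO
-- Create a database user for an existing login, then grant access to the logging views.
CREATE USER [MyDomain\ReportViewer] FOR LOGIN [MyDomain\ReportViewer];
ALTER ROLE ssis_logreader ADD MEMBER [MyDomain\ReportViewer];
GO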
New RuntimeLineage logging level in the SSIS catalog
The new RuntimeLineage logging level in the SSIS catalog collects the data required to track lineage information
in the data flow. You can parse this lineage information to map the lineage relationship between tasks. ISVs and
developers can build custom lineage mapping tools with this information.
New custom logging level in the SSIS catalog
Previous versions of the SSIS catalog let you choose from four built-in logging levels when you run a package:
None, Basic, Performance, or Verbose. SQL Server 2016 adds the RuntimeLineage logging level. In addition,
you can now create and save multiple customized logging levels in the SSIS catalog, and pick the logging level to
use every time you run a package. For each customized logging level, select only the statistics and events you want
to capture. Optionally include the event context to see variable values, connection strings, and task properties. For
more info, see Enable Logging for Package Execution on the SSIS Server.
Column names for errors in the data flow
When you redirect rows in the data flow that contain errors to an error output, the output contains a numeric
identifier for the column in which the error occurred, but does not display the name of the column. There are now
several ways to find or display the name of the column in which the error occurred.
When you configure logging, select the DiagnosticEx event for logging. This event writes a data flow
column map to the log. You can then look up the column name in this column map by using the column
identifier captured by an error output. For more info, see Error Handling in Data.
In the Advanced Editor, you can see the column name for the upstream column when you view the
properties of an input or output column of a data flow component.
To see the names of the columns in which the error occurred, attach a Data Viewer to an error output. The
Data Viewer now displays both the description of the error and the name of the column in which the error
occurred.
In the Script Component or a custom data flow component, call the new GetIdentificationStringByID method
of the IDTSComponentMetaData100 interface.
For more info about this improvement, see the following blog post by SSIS developer Bo Fan: Error Column
Improvements for SSIS Data Flow.
NOTE
(This support has been expanded in subsequent releases. For more info, see Expanded support for error column names and
New IDTSComponentMetaData130 interface in the API.)
NOTE
(In RC0, this method has been moved to the new IDTSComponentMetaData130 interface. For more info, see New
IDTSComponentMetaData130 interface in the API.)
Connectivity
Expanded connectivity on premises
Support for OData v4 data sources
The OData Source and the OData Connection Manager now support the OData v3 and v4 protocols.
For the OData v3 protocol, the component supports the ATOM and JSON data formats.
For the OData v4 protocol, the component supports the JSON data format.
For more info, see OData Source.
Explicit support for Excel 2013 data sources
The Excel Connection Manager, the Excel Source and the Excel Destination, and the SQL Server Import and Export
Wizard now provide explicit support for Excel 2013 data sources.
Support for the Hadoop file system (HDFS)
Support for HDFS contains connection managers to connect to Hadoop clusters and tasks to do common HDFS
operations. For more info, see Hadoop and HDFS Support in Integration Services (SSIS).
Expanded support for Hadoop and HDFS
The Hadoop Connection Manager now supports both Basic and Kerberos authentication. For more info, see
Hadoop Connection Manager.
The HDFS File Source and the HDFS File Destination now support both the Text and Avro formats. For more info,
see HDFS File Source and HDFS File Destination.
The Hadoop File System task now supports the CopyWithinHadoop option in addition to the CopyToHadoop
and the CopyFromHadoop options. For more info, see Hadoop File System Task.
HDFS File Destination now supports ORC file format
The HDFS File Destination now supports the ORC file format in addition to Text and Avro. (The HDFS File Source
supports only Text and Avro.) For more info about this component, see HDFS File Destination.
ODBC components updated for SQL Server 2016
The ODBC Source and Destination components have been updated to provide full compatibility with SQL Server
2016. There is no new functionality and there are no changes in behavior.
Explicit support for Excel 2016 data sources
The Excel Connection Manager, the Excel Source, and the Excel Destination now provide explicit support for Excel
2016 data sources.
Connector for SAP BW for SQL Server 2016 released
The Microsoft® Connector for SAP BW for Microsoft SQL Server® 2016 has been released as part of the SQL
Server 2016 Feature Pack. To download components of the Feature Pack, see Microsoft® SQL Server® 2016
Feature Pack.
Connectors v4.0 for Oracle and Teradata released
The Microsoft Connectors v4.0 for Oracle and Teradata have been released. To download the connectors, see
Microsoft Connectors v4.0 for Oracle and Teradata.
Connectors for Analytics Platform System (PDW) Appliance Update 5 released
The destination adapters for loading data into PDW with AU5 have been released. To download the adapters, see
Analytics Platform System Appliance Update 5 Documentation and Client Tools.
Expanded connectivity to the cloud
Azure Feature Pack for SSIS released for SQL Server 2016
The Azure Feature Pack for Integration Services has been released for SQL Server 2016. The feature pack contains
connection managers to connect to Azure data sources and tasks to do common Azure operations. For more info,
see Azure Feature Pack for Integration Services (SSIS).
Support for Microsoft Dynamics online resources released in Service Pack 1
With SQL Server 2016 Service Pack 1 installed, the OData Source and OData Connection Manager now support
connecting to the OData feeds of Microsoft Dynamics AX Online and Microsoft Dynamics CRM Online.
Support for Azure Data Lake Store released
The latest version of the Azure Feature Pack includes a connection manager, source, and destination to move data
to and from Azure Data Lake Store. For more info, see Azure Feature Pack for Integration Services (SSIS)
Support for Azure Synapse Analytics released
The latest version of the Azure Feature Pack includes the Azure SQL DW Upload task for populating Azure Synapse
Analytics with data. For more info, see Azure Feature Pack for Integration Services (SSIS)
IMPORTANT
If you develop custom extensions for SSIS, see Support multi-targeting in your custom components and Getting your SSIS
custom extensions to be supported by the multi-version support of SSDT 2015 for SQL Server 2016.
Get help
UserVoice: Have suggestions for improving SQL Server?
Microsoft Q & A (SQL Server)
DBA Stack Exchange (tag sql-server): Ask SQL Server questions
Stack Overflow (tag sql-server): Answers to SQL development questions
Reddit: General discussion about SQL Server
Microsoft SQL Server License Terms and Information
Support options for business users
Contact Microsoft
Additional SQL Server help and feedback
See Also
What's New in SQL Server 2016
Editions and Supported Features for SQL Server 2016
What's New in Integration Services in SQL Server 2017
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic describes the features that have been added or updated in SQL Server 2017 (14.x) Integration Services.
NOTE
SQL Server 2017 also includes the features of SQL Server 2016 and the features added in SQL Server 2016 updates. For info
about the new SSIS features in SQL Server 2016, see What's New in Integration Services in SQL Server 2016.
Integration Services features supported by the editions of SQL Server
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic provides details about the features of SQL Server Integration Services (SSIS) supported by the different
editions of SQL Server.
For features supported by Evaluation and Developer editions, see features listed for Enterprise Edition in the
following tables.
For the latest release notes and what's new information, see the following articles:
SQL Server 2016 release notes
What's New in Integration Services in SQL Server 2016
What's New in Integration Services in SQL Server 2017
Try SQL Server 2016!
The SQL Server Evaluation edition is available for a 180-day trial period.
1 If you run packages that require Enterprise-only features in Scale Out, the Scale Out Workers must also run on instances of SQL Server Enterprise edition.
1 DTSWizard.exe is not provided with SQL on Linux. However, dtexec on Linux can be used to execute a package
created by DTSWizard on Windows.
Integration Services
The following Integration Services features are available only in Enterprise edition (not in Standard, Web, Express with Advanced Services, or Express):
High-performance Oracle source and destination by Attunity
High-performance Teradata source and destination by Attunity
Dimension processing destination
Partition processing destination
1 The Change Data Capture components by Attunity require Enterprise edition. The Change Data Capture Service
and the Change Data Capture Designer, however, do not require Enterprise edition. You can use the Designer and
the Service on a computer where SSIS is not installed.
Integration Services Developer Documentation
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Integration Services includes a completely rewritten object model, which has been enhanced with many features
that make extending and programming packages easier, more flexible, and more powerful. Developers can extend
and program almost every aspect of Integration Services packages.
As an Integration Services developer, there are two fundamental approaches that you can take to Integration
Services programming:
You can extend packages by writing components that become available within SSIS Designer to provide
custom functionality in a package.
You can create, configure, and run packages programmatically from your own applications.
If you find that the built-in components in Integration Services do not meet your requirements, you can extend the
power of Integration Services by coding your own extensions. In this approach, you have two discrete options:
For ad hoc use in a single package, you can create a custom task by writing code in the Script task, or a
custom data flow component by writing code in the Script component, which you can configure as a source,
transformation, or destination. These powerful wrappers write the infrastructure code for you and let you
focus exclusively on developing your custom functionality; however, they are not easily reusable elsewhere.
For use in multiple packages, you can create custom Integration Services extensions such as connection
managers, tasks, enumerators, log providers, and data flow components. The managed Integration Services
object model contains base classes that provide a starting point and make developing custom extensions
easier than ever.
If you want to create packages dynamically, or to manage and run Integration Services packages outside the
development environment, you can manipulate packages programmatically. You can load, modify, and run existing
packages, or you can create and run entirely new packages programmatically. In this approach, you have a
continuous range of options:
Load and run an existing package without modification.
Load an existing package, reconfigure it (for example, specify a different data source), and run it.
Create a new package, add and configure components, making changes object by object and property by
property, save it, and then run it.
These approaches to Integration Services programming are described in this section and demonstrated with
examples.
In This Section
Integration Services Programming Overview
Describes the roles of control flow and data flow in Integration Services development.
Understanding Synchronous and Asynchronous Transformations
Describes the important distinction between synchronous and asynchronous outputs and the components that use
them in the data flow.
Working with Connection Managers Programmatically
Lists the connection managers that you can use from managed code, and the values that the connection managers
return when the code calls the AcquireConnection method.
Extending Packages with Scripting
Describes how to extend the control flow by using the Script task, or the data flow by using the Script component.
Extending Packages with Custom Objects
Describes how to create and program custom tasks, data flow components, and other package objects for use in
multiple packages.
Building Packages Programmatically
Describes how to create, configure, and save Integration Services packages programmatically.
Running and Managing Packages Programmatically
Describes how to enumerate, run, and manage Integration Services packages programmatically.
Reference
Integration Services Error and Message Reference
Lists the predefined Integration Services error codes, together with their symbolic names and descriptions.
Related Sections
Troubleshooting Tools for Package Development
Describes the features and tools that Integration Services provides for troubleshooting packages during
development.
External Resources
CodePlex samples, Integration Services Product Samples, on www.codeplex.com/MSFTISProdSamples
See Also
SQL Server Integration Services
Integration Services Backward Compatibility
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic describes changes between versions of SQL Server Integration Services. It covers features that are no
longer available or are scheduled to be removed in a future release. It also describes changes to the product that
are known to break, or to change the behavior of, an existing application that includes Integration Services
functionality.
Deploy an SSIS project with SQL Server Management Studio (SSMS)
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use SQL Server Management Studio (SSMS) to connect to the SSIS Catalog database, and
then run the Integration Services Deployment Wizard to deploy an SSIS project to the SSIS Catalog.
SQL Server Management Studio is an integrated environment for managing any SQL infrastructure, from SQL
Server to SQL Database. For more info about SSMS, see SQL Server Management Studio (SSMS).
Prerequisites
Before you start, make sure you have the latest version of SQL Server Management Studio. To download SSMS,
see Download SQL Server Management Studio (SSMS).
The validation described in this article for deployment to Azure SQL Database requires SQL Server Data Tools
(SSDT) version 17.4 or later. To get the latest version of SSDT, see Download SQL Server Data Tools (SSDT).
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
SETTING | SUGGESTED VALUE | MORE INFO
Server name | The fully qualified server name | If you're connecting to an Azure SQL Database server, the name is in this format: <server_name>.database.windows.net.
Login | The server admin account | This account is the account that you specified when you created the server.
Password | The password for your server admin account | This password is the password that you specified when you created the server.
Next steps
Consider other ways to deploy a package.
Deploy an SSIS package with Transact-SQL (SSMS)
Deploy an SSIS package with Transact-SQL (VS Code)
Deploy an SSIS package from the command prompt
Deploy an SSIS package with PowerShell
Deploy an SSIS package with C#
Run a deployed package. To run a package, you can choose from several tools and languages. For more info,
see the following articles:
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Deploy an SSIS project from SSMS with Transact-SQL
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use SQL Server Management Studio (SSMS) to connect to the SSIS Catalog
database, and then use Transact-SQL statements to deploy an SSIS project to the SSIS Catalog.
SQL Server Management Studio is an integrated environment for managing any SQL infrastructure, from SQL
Server to SQL Database. For more info about SSMS, see SQL Server Management Studio (SSMS).
Prerequisites
Before you start, make sure you have the latest version of SQL Server Management Studio. To download SSMS,
see Download SQL Server Management Studio (SSMS).
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
You cannot use the information in this quickstart to deploy an SSIS package to Azure SQL Database. The
catalog.deploy_project stored procedure expects the path to the .ispac file in the local (on-premises) file system.
For more info about deploying and running packages in Azure, see Lift and shift SQL Server Integration Services
workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
SETTING | SUGGESTED VALUE | MORE INFO
Login | The server admin account | This account is the account that you specified when you created the server.
Password | The password for your server admin account | This password is the password that you specified when you created the server.
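The deployment itself is a call to the catalog.deploy_project stored procedure, which reads the compiled project deployment file (.ispac) as a varbinary(max) value. The script below is a minimal sketch rather than the article's original listing; the folder name, project name, and .ispac path are placeholders borrowed from the other quickstarts in this collection, and the target folder is assumed to exist already:

DECLARE @ProjectBinary varbinary(max);
DECLARE @operation_id bigint;

-- Read the project deployment file from the local file system.
SET @ProjectBinary = (SELECT * FROM OPENROWSET(BULK 'C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac', SINGLE_BLOB) AS BinaryData);

-- Deploy the project to an existing folder in the SSIS Catalog.
EXEC [SSISDB].[catalog].[deploy_project]
    @folder_name = N'Project1Folder',
    @project_name = N'Integration Services Project1',
    @project_stream = @ProjectBinary,
    @operation_id = @operation_id OUTPUT;

SELECT @operation_id;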
Next steps
Consider other ways to deploy a package.
Deploy an SSIS package with SSMS
Deploy an SSIS package with Transact-SQL (VS Code)
Deploy an SSIS package from the command prompt
Deploy an SSIS package with PowerShell
Deploy an SSIS package with C#
Run a deployed package. To run a package, you can choose from several tools and languages. For more info,
see the following articles:
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Deploy an SSIS project from Visual Studio Code with Transact-SQL
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use Visual Studio Code to connect to the SSIS Catalog database, and then use
Transact-SQL statements to deploy an SSIS project to the SSIS Catalog.
Visual Studio Code is a code editor for Windows, macOS, and Linux that supports extensions, including the mssql
extension for connecting to Microsoft SQL Server, Azure SQL Database, or Azure Synapse Analytics. For more info
about VS Code, see Visual Studio Code.
Prerequisites
Before you start, make sure you have installed the latest version of Visual Studio Code and loaded the mssql
extension. To download these tools, see the following pages:
Download Visual Studio Code
mssql extension
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
You cannot use the information in this quickstart to deploy an SSIS package to Azure SQL Database. The
catalog.deploy_project stored procedure expects the path to the .ispac file in the local (on-premises) file system.
For more info about deploying and running packages in Azure, see Lift and shift SQL Server Integration Services
workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
SETTING | SUGGESTED VALUE | MORE INFO
User name | The server admin account | This account is the account that you specified when you created the server.
Password (SQL Login) | The password for your server admin account | This password is the password that you specified when you created the server.
Enter a name for this profile | A profile name, such as mySSISServer | A saved profile name speeds your connection on subsequent logins.
5. Press the ESC key to close the info message that informs you that the profile is created and connected.
6. Verify your connection in the status bar.
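From here, deployment uses the same catalog.deploy_project pattern sketched in the preceding SSMS quickstart. In the editor window connected to SSISDB, a minimal script looks like the following sketch (placeholder names and a placeholder .ispac path, so adjust them for your environment):

DECLARE @ProjectBinary varbinary(max) =
    (SELECT * FROM OPENROWSET(BULK 'C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac', SINGLE_BLOB) AS BinaryData);
DECLARE @operation_id bigint;
EXEC [SSISDB].[catalog].[deploy_project]
    @folder_name = N'Project1Folder',
    @project_name = N'Integration Services Project1',
    @project_stream = @ProjectBinary,
    @operation_id = @operation_id OUTPUT;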
Deploy an SSIS project from the command prompt with ISDeploymentWizard.exe
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to deploy an SSIS project from the command prompt by running the Integration Services Deployment Wizard, ISDeploymentWizard.exe.
For more info about the Integration Services Deployment Wizard, see Integration Services Deployment Wizard.
Prerequisites
The validation described in this article for deployment to Azure SQL Database requires SQL Server Data Tools
(SSDT) version 17.4 or later. To get the latest version of SSDT, see Download SQL Server Data Tools (SSDT).
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
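An invocation typically follows the pattern below. This is a sketch, not the article's original example: treat the switch names as assumptions to verify against the Integration Services Deployment Wizard article, and replace the placeholder .ispac path, server name, and catalog folder path with your own values. Running the wizard with /Silent performs the deployment without showing the wizard pages.

ISDeploymentWizard.exe /Silent /ModelType:Project /SourcePath:"C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac" /DestinationServer:"localhost" /DestinationPath:"/SSISDB/Project1Folder/Integration Services Project1"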
Next steps
Consider other ways to deploy a package.
Deploy an SSIS package with SSMS
Deploy an SSIS package with Transact-SQL (SSMS)
Deploy an SSIS package with Transact-SQL (VS Code)
Deploy an SSIS package with PowerShell
Deploy an SSIS package with C#
Run a deployed package. To run a package, you can choose from several tools and languages. For more info,
see the following articles:
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Deploy an SSIS project with PowerShell
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use a PowerShell script to connect to a database server and deploy an SSIS
project to the SSIS Catalog.
Prerequisites
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
# Variables
$TargetInstanceName = "localhost\default"
$TargetFolderName = "Project1Folder"
$ProjectFilePath = "C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac"
$ProjectName = "Integration Services Project1"
PowerShell script
Provide appropriate values for the variables at the top of the following script, and then run the script to deploy the
SSIS project.
NOTE
The following example uses Windows Authentication to deploy to a SQL Server on premises. To use SQL Server
authentication, replace the Integrated Security=SSPI; argument with User ID=<user name>;Password=<password>; . If
you're connecting to an Azure SQL Database server, you can't use Windows authentication.
# Variables
$SSISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
$TargetServerName = "localhost"
$TargetFolderName = "Project1Folder"
$ProjectFilePath = "C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac"
$ProjectName = "Integration Services Project1"
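# The lines below are a hedged sketch of the remaining deployment steps (not the
# article's original script). The assembly version and public key token are taken
# from the GAC path shown in the C# quickstart later in this collection.
[System.Reflection.Assembly]::Load("$SSISNamespace, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91") | Out-Null

# Connect to the target server with Windows Authentication.
$sqlConnectionString = "Data Source=$TargetServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString

# Create the IntegrationServices object and get the SSISDB catalog.
$integrationServices = New-Object "$SSISNamespace.IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs["SSISDB"]

# Create the target folder (skip this step if the folder already exists).
$folder = New-Object "$SSISNamespace.CatalogFolder" ($catalog, $TargetFolderName, "Folder description")
$folder.Create()

# Read the .ispac file and deploy the project into the folder.
Write-Host "Deploying $ProjectName project ..."
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
$folder.DeployProject($ProjectName, $projectFile) | Out-Null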
Write-Host "Done."
Next steps
Consider other ways to deploy a package.
Deploy an SSIS package with SSMS
Deploy an SSIS package with Transact-SQL (SSMS)
Deploy an SSIS package with Transact-SQL (VS Code)
Deploy an SSIS package from the command prompt
Deploy an SSIS package with C#
Run a deployed package. To run a package, you can choose from several tools and languages. For more info,
see the following articles:
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Deploy an SSIS project with C# code in a .NET app
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to write C# code to connect to a database server and deploy an SSIS project.
To create a C# app, you can use Visual Studio, Visual Studio Code, or another tool of your choice.
Prerequisites
Before you start, make sure you have Visual Studio or Visual Studio Code installed. Download the free Community
edition of Visual Studio, or the free Visual Studio Code, from Visual Studio Downloads.
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to deploy an SSIS project to the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to deploy an SSIS package to SQL Server on Linux. For more info
about running packages on Linux, see Extract, transform, and load data on Linux with SSIS.
Add references
1. In Solution Explorer, right-click the References folder and select Add Reference . The Reference Manager
dialog box opens.
2. In the Reference Manager dialog box, expand Assemblies and select Extensions .
3. Select the following two references to add:
Microsoft.SqlServer.Management.Sdk.Sfc
Microsoft.SqlServer.Smo
4. Click the Browse button to add a reference to Microsoft.SqlServer.Management.IntegrationServices.
(This assembly is installed only in the global assembly cache (GAC).) The Select the files to reference dialog
box opens.
5. In the Select the files to reference dialog box, navigate to the GAC folder that contains the assembly.
Typically this folder is
C:\Windows\assembly\GAC_MSIL\Microsoft.SqlServer.Management.IntegrationServices\14.0.0.0__89845dcd8080cc91 .
6. Select the assembly (that is, the .dll file) in the folder and click Add .
7. Click OK to close the Reference Manager dialog box and add the three references. To make sure the
references are there, check the References list in Solution Explorer.
NOTE
The following example uses Windows Authentication. To use SQL Server authentication, replace the
Integrated Security=SSPI; argument with User ID=<user name>;Password=<password>; . If you're connecting to an
Azure SQL Database server, you can't use Windows authentication.
using Microsoft.SqlServer.Management.IntegrationServices;
using System;
using System.Data.SqlClient;
using System.IO;
namespace deploy_ssis_project
{
class Program
{
static void Main(string[] args)
{
// Variables
string targetServerName = "localhost";
string targetFolderName = "Project1Folder";
string projectName = "Integration Services Project1";
string projectFilePath = @"C:\Projects\Integration Services Project1\Integration Services Project1\bin\Development\Integration Services Project1.ispac";
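// The lines below are a hedged sketch of the remaining deployment steps (not the
// article's original sample), using the Microsoft.SqlServer.Management.IntegrationServices
// object model.
string connectionString = "Data Source=" + targetServerName +
    ";Initial Catalog=master;Integrated Security=SSPI;";
SqlConnection connection = new SqlConnection(connectionString);

// Create the IntegrationServices object and get the SSISDB catalog.
IntegrationServices integrationServices = new IntegrationServices(connection);
Catalog catalog = integrationServices.Catalogs["SSISDB"];

// Create the target folder (skip this step if the folder already exists).
CatalogFolder folder = new CatalogFolder(catalog, targetFolderName, "Folder description");
folder.Create();

// Read the .ispac file and deploy the project into the folder.
Console.WriteLine("Deploying " + projectName + " project.");
byte[] projectFile = File.ReadAllBytes(projectFilePath);
folder.DeployProject(projectName, projectFile);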
Console.WriteLine("Done.");
}
}
}
Next steps
Consider other ways to deploy a package.
Deploy an SSIS package with SSMS
Deploy an SSIS package with Transact-SQL (SSMS)
Deploy an SSIS package with Transact-SQL (VS Code)
Deploy an SSIS package from the command prompt
Deploy an SSIS package with PowerShell
Run a deployed package. To run a package, you can choose from several tools and languages. For more info,
see the following articles:
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Run an SSIS package with SQL Server Management Studio (SSMS)
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use SQL Server Management Studio (SSMS) to connect to the SSIS Catalog
database, and then run an SSIS package stored in the SSIS Catalog from Object Explorer in SSMS.
SQL Server Management Studio is an integrated environment for managing any SQL infrastructure, from SQL
Server to SQL Database. For more info about SSMS, see SQL Server Management Studio (SSMS).
Prerequisites
Before you start, make sure you have the latest version of SQL Server Management Studio (SSMS). To download
SSMS, see Download SQL Server Management Studio (SSMS).
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to run an SSIS package on the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift
SQL Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to run an SSIS package on Linux. For more info about running
packages on Linux, see Extract, transform, and load data on Linux with SSIS.
SETTING | SUGGESTED VALUE | MORE INFO
Server name | The fully qualified server name | If you're connecting to an Azure SQL Database server, the name is in this format: <server_name>.database.windows.net.
Login | The server admin account | This account is the account that you specified when you created the server.
Password | The password for your server admin account | This password is the password that you specified when you created the server.
Run a package
1. In Object Explorer, select the package that you want to run.
2. Right-click and select Execute . The Execute Package dialog box opens.
3. Configure the package execution by using the settings on the Parameters , Connection Managers , and
Advanced tabs in the Execute Package dialog box.
4. Click OK to run the package.
Next steps
Consider other ways to run a package.
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Run an SSIS package from SSMS with Transact-SQL
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use SQL Server Management Studio (SSMS) to connect to the SSIS Catalog
database, and then use Transact-SQL statements to run an SSIS package stored in the SSIS Catalog.
SQL Server Management Studio is an integrated environment for managing any SQL infrastructure, from SQL
Server to SQL Database. For more info about SSMS, see SQL Server Management Studio (SSMS).
Prerequisites
Before you start, make sure you have the latest version of SQL Server Management Studio (SSMS). To download
SSMS, see Download SQL Server Management Studio (SSMS).
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to run an SSIS package on the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to run an SSIS package on Linux. For more info about running
packages on Linux, see Extract, transform, and load data on Linux with SSIS.
SETTING | SUGGESTED VALUE | MORE INFO
Server name | The fully qualified server name | If you're connecting to an Azure SQL Database server, the name is in this format: <server_name>.database.windows.net.
Login | The server admin account | This account is the account that you specified when you created the server.
Password | The password for your server admin account | This password is the password that you specified when you created the server.
Run a package
Run the following Transact-SQL code to run an SSIS package.
1. In SSMS, open a new query window and paste the following code. (This code is the code generated by the
Script option in the Execute Package dialog box in SSMS.)
2. Update the parameter values in the catalog.create_execution stored procedure for your system.
3. Make sure that SSISDB is the current database.
4. Run the script.
5. In Object Explorer, refresh the contents of SSISDB if necessary and check for the project that you
deployed.
Declare @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'Package.dtsx',
@execution_id=@execution_id OUTPUT,
@folder_name=N'Deployed Projects',
@project_name=N'Integration Services Project1',
@use32bitruntime=False,
@reference_id=Null
Select @execution_id
DECLARE @var0 smallint = 1
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id,
@object_type=50,
@parameter_name=N'LOGGING_LEVEL',
@parameter_value=@var0
EXEC [SSISDB].[catalog].[start_execution] @execution_id
GO
Next steps
Consider other ways to run a package.
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Run an SSIS package from Visual Studio Code with Transact-SQL
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use Visual Studio Code to connect to the SSIS Catalog database, and then
use Transact-SQL statements to run an SSIS package stored in the SSIS Catalog.
Visual Studio Code is a code editor for Windows, macOS, and Linux that supports extensions, including the
mssql extension for connecting to Microsoft SQL Server, Azure SQL Database, or Azure Synapse Analytics. For
more info about VS Code, see Visual Studio Code.
Prerequisites
Before you start, make sure you have installed the latest version of Visual Studio Code and loaded the mssql
extension. To download these tools, see the following pages:
Download Visual Studio Code
mssql extension
Supported platforms
You can use the information in this quickstart to run an SSIS package on the following platforms:
SQL Server on Windows.
Azure SQL Database. For more info about deploying and running packages in Azure, see Lift and shift SQL
Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to run an SSIS package on Linux. For more info about running
packages on Linux, see Extract, transform, and load data on Linux with SSIS.
IMPORTANT
Before continuing, make sure that you have your server, database, and login information ready. If you change your focus
from Visual Studio Code after you begin entering the connection profile information, you have to restart creating the
connection profile.
SETTING | SUGGESTED VALUE | MORE INFO
Server name | The fully qualified server name | If you're connecting to an Azure SQL Database server, the name is in this format: <server_name>.database.windows.net.
User name | The server admin account | This account is the account that you specified when you created the server.
Password (SQL Login) | The password for your server admin account | This password is the password that you specified when you created the server.
Enter a name for this profile | A profile name, such as mySSISServer | A saved profile name speeds your connection on subsequent logins.
5. Press the ESC key to close the info message that informs you that the profile is created and connected.
6. Verify your connection in the status bar.
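The Transact-SQL you run in the editor window is the same catalog.create_execution and catalog.start_execution pattern shown in the SSMS quickstart earlier in this collection. A compact sketch, with placeholder folder, project, and package names:

DECLARE @execution_id bigint;
EXEC [SSISDB].[catalog].[create_execution]
    @package_name = N'Package.dtsx',
    @folder_name = N'Deployed Projects',
    @project_name = N'Integration Services Project1',
    @use32bitruntime = False,
    @reference_id = Null,
    @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;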
Next steps
Consider other ways to run a package.
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Run an SSIS package with C#
Run an SSIS package from the command prompt with DTExec.exe
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to run an SSIS package from the command prompt by running DTExec.exe
with the appropriate parameters.
NOTE
The method described in this article has not been tested with packages deployed to an Azure SQL Database server.
Supported platforms
You can use the information in this quickstart to run an SSIS package on the following platforms:
SQL Server on Windows.
The method described in this article has not been tested with packages deployed to an Azure SQL Database
server. For more info about deploying and running packages in Azure, see Lift and shift SQL Server Integration
Services workloads to the cloud.
You cannot use the information in this quickstart to run an SSIS package on Linux. For more info about running
packages on Linux, see Extract, transform, and load data on Linux with SSIS.
With the parameter values used in the following example, the program runs the package in the specified folder
path on the SSIS server - that is, the server that hosts the SSIS Catalog database (SSISDB). The /Server
parameter provides the server name. The program connects as the current user with Windows Integrated
Authentication. To use SQL Authentication, specify the /User and /Password parameters with appropriate values.
1. Open a Command Prompt window.
2. Run DTExec.exe and provide values at least for the /ISServer and the /Server parameters, as shown in the following example:
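A minimal invocation follows this pattern (a sketch, not the article's original example): the catalog path \SSISDB\<folder>\<project>\<package>.dtsx below uses the placeholder folder, project, and package names from the other quickstarts in this collection, so adjust the path and server name for your environment.

DTExec.exe /ISServer "\SSISDB\Project1Folder\Integration Services Project1\Package.dtsx" /Server "localhost"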
Next steps
Consider other ways to run a package.
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package with PowerShell
Run an SSIS package with C#
Run an SSIS package with PowerShell
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to use a PowerShell script to connect to a database server and run an SSIS
package.
Prerequisites
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Supported platforms
You can use the information in this quickstart to run an SSIS package on the following platforms:
SQL Server on Windows.
SSIS integration runtime (IR) in Azure Data Factory (ADF), where SSIS catalog (SSISDB) is hosted by Azure
SQL Managed Instance (MI). For more info about deploying and running packages in Azure, see Lift and
shift SQL Server Integration Services workloads to the cloud.
You cannot use the information in this quickstart to run an SSIS package on Linux. For more info about running
packages on Linux, see Extract, transform, and load data on Linux with SSIS.
PowerShell script
Provide appropriate values for the variables at the top of the following script, and then run the script to run the
SSIS package.
NOTE
The following example uses Windows Authentication. To use SQL Server authentication, replace the
Integrated Security=SSPI; argument with User ID=<user name>;Password=<password>; . If you're connecting to an
Azure SQL Database server, you can't use Windows authentication.
# Variables
$SSISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
$TargetServerName = "localhost"
$TargetFolderName = "Project1Folder"
$ProjectName = "Integration Services Project1"
$PackageName = "Package.dtsx"
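# The lines below are a hedged sketch of the remaining steps (not the article's
# original script): load the SSIS management assembly, connect with Windows
# Authentication, look up the package in the catalog, and start it.
[System.Reflection.Assembly]::Load("$SSISNamespace, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91") | Out-Null

$sqlConnectionString = "Data Source=$TargetServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$SSISNamespace.IntegrationServices" $sqlConnection

# Navigate catalog -> folder -> project -> package, then start the execution.
$package = $integrationServices.Catalogs["SSISDB"].Folders[$TargetFolderName].Projects[$ProjectName].Packages[$PackageName]
$executionId = $package.Execute($false, $null)
Write-Host "Started execution $executionId ..."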
Write-Host "Done."
Next steps
Consider other ways to run a package.
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with C#
Run an SSIS package with C# code in a .NET app
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This quickstart demonstrates how to write C# code to connect to a database server and run an SSIS package.
You can use Visual Studio, Visual Studio Code, or another tool of your choice to create a C# app.
Prerequisites
Before you start, make sure you have Visual Studio or Visual Studio Code installed. Download the free
Community edition of Visual Studio, or the free Visual Studio Code, from Visual Studio Downloads.
An Azure SQL Database server listens on port 1433. If you're trying to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
Add references
1. In Solution Explorer, right-click the References folder and select Add Reference . The Reference Manager
dialog box opens.
2. In the Reference Manager dialog box, expand Assemblies and select Extensions .
3. Select the following two references to add:
Microsoft.SqlServer.Management.Sdk.Sfc
Microsoft.SqlServer.Smo
4. Click the Browse button to add a reference to Microsoft.SqlServer.Management.IntegrationServices.
(This assembly is installed only in the global assembly cache (GAC).) The Select the files to reference
dialog box opens.
5. In the Select the files to reference dialog box, navigate to the GAC folder that contains the assembly.
Typically this folder is
C:\Windows\assembly\GAC_MSIL\Microsoft.SqlServer.Management.IntegrationServices\14.0.0.0__89845dcd8080cc91 .
6. Select the assembly (that is, the .dll file) in the folder and click Add .
7. Click OK to close the Reference Manager dialog box and add the three references. To make sure the
references are there, check the References list in Solution Explorer.
NOTE
The following example uses Windows Authentication. To use SQL Server authentication, replace the
Integrated Security=SSPI; argument with User ID=<user name>;Password=<password>; . If you're connecting to an
Azure SQL Database server, you can't use Windows authentication.
using Microsoft.SqlServer.Management.IntegrationServices;
using System.Data.SqlClient;
namespace run_ssis_package
{
class Program
{
static void Main(string[] args)
{
// Variables
string targetServerName = "localhost";
string folderName = "Project1Folder";
string projectName = "Integration Services Project1";
string packageName = "Package.dtsx";
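// The lines below are a hedged sketch of the remaining steps (not the article's
// original sample): connect with Windows Authentication, look up the package in
// the SSIS Catalog, and start an execution.
string connectionString = "Data Source=" + targetServerName +
    ";Initial Catalog=master;Integrated Security=SSPI;";
SqlConnection connection = new SqlConnection(connectionString);
IntegrationServices integrationServices = new IntegrationServices(connection);

// Navigate catalog -> folder -> project -> package, then start the execution.
PackageInfo package = integrationServices.Catalogs["SSISDB"]
    .Folders[folderName].Projects[projectName].Packages[packageName];
long executionId = package.Execute(false, null);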
}
}
}
Next steps
Consider other ways to run a package.
Run an SSIS package with SSMS
Run an SSIS package with Transact-SQL (SSMS)
Run an SSIS package with Transact-SQL (VS Code)
Run an SSIS package from the command prompt
Run an SSIS package with PowerShell
Lift and shift SQL Server Integration Services workloads to the cloud
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can now move your SQL Server Integration Services (SSIS) projects, packages, and workloads to the Azure
cloud. Deploy, run, and manage SSIS projects and packages in the SSIS Catalog (SSISDB) on Azure SQL
Database or SQL Managed Instance with familiar tools such as SQL Server Management Studio (SSMS).
Benefits
Moving your on-premises SSIS workloads to Azure has the following potential benefits:
Reduce operational costs and reduce the burden of managing infrastructure that you have when you run
SSIS on-premises or on Azure virtual machines.
Increase high availability with the ability to specify multiple nodes per cluster, as well as the high
availability features of Azure and of Azure SQL Database.
Increase scalability with the ability to specify multiple cores per node (scale up) and multiple nodes per
cluster (scale out).
LOCATION | STORAGE | RUNTIME | SCALABILITY
On premises | SQL Server | SSIS runtime hosted by SQL Server | SSIS Scale Out (in SQL Server 2017 and later)
On Azure | SQL Database or SQL Managed Instance | Azure-SSIS Integration Runtime, a component of Azure Data Factory | Scaling options for the Azure-SSIS Integration Runtime
NOTE
The Azure-SSIS Integration Runtime is not yet available in all Azure regions. For info about the supported regions, see
Products available by region - Microsoft Azure.
Scale up and out . When you provision the Azure-SSIS IR, you can scale up and scale out by specifying values
for the following options:
The node size (including the number of cores) and the number of nodes in the cluster.
The existing instance of Azure SQL Database to host the SSIS Catalog Database (SSISDB), and the service
tier for the database.
The maximum parallel executions per node.
Improve performance . For more info, see Configure the Azure-SSIS Integration Runtime for high
performance.
Reduce costs . To reduce costs, run the Azure-SSIS IR only when you need it. For more info, see How to
schedule starting and stopping of an Azure SSIS integration runtime.
Design packages
You continue to design and build packages on-premises in SSDT, or in Visual Studio with SSDT installed.
Connect to data sources
To connect to on-premises data sources from the cloud with Windows authentication , see Connect to data
sources and file shares with Windows Authentication from SSIS packages in Azure.
To connect to files and file shares, see Open and save files on premises and in Azure with SSIS packages
deployed in Azure.
Available SSIS components
When you provision an instance of SQL Database to host SSISDB, the Azure Feature Pack for SSIS and the
Access Redistributable are also installed. These components provide connectivity to various Azure data
sources and to Excel and Access files, in addition to the data sources supported by the built-in components.
You can also install additional components - for example, you can install a driver that's not installed by default.
For more info, see Customize setup for the Azure-SSIS integration runtime.
If you have an Enterprise Edition license, additional components are available. For more info, see Provision
Enterprise Edition for the Azure-SSIS Integration Runtime.
If you're an ISV, you can update the installation of your licensed components to make them available on Azure.
For more info, see Install paid or licensed custom components for the Azure-SSIS integration runtime.
Transaction support
With SQL Server on premises and on Azure virtual machines, you can use Microsoft Distributed Transaction
Coordinator (MSDTC) transactions. To configure MSDTC on each node of the Azure-SSIS IR, use the custom
setup capability. For more info, see Custom setup for the Azure-SSIS integration runtime.
With Azure SQL Database, you can only use elastic transactions. For more info, see Distributed transactions
across cloud databases.
Monitor packages
To monitor running packages, use the following reporting options in SSMS.
Right-click SSISDB , and then select Active Operations to open the Active Operations dialog box.
Select a package in Object Explorer, right-click and select Reports, then Standard Reports, then All Executions.
To monitor the Azure-SSIS Integration Runtime, see Monitor the Azure-SSIS integration runtime.
Schedule packages
To schedule the execution of packages deployed in Azure, you can use a variety of tools. For more info, see
Schedule the execution of SQL Server Integration Services (SSIS) packages deployed in Azure.
Next steps
To get started with SSIS workloads on Azure, see the following articles:
Tutorial: Deploy and run a SQL Server Integration Services (SSIS) package in Azure
Provision the Azure-SSIS Integration Runtime in Azure Data Factory
Tutorial: Deploy and run a SQL Server Integration Services (SSIS) package in Azure
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This tutorial shows you how to deploy a SQL Server Integration Services (SSIS) project to the SSIS Catalog in
Azure SQL Database, run a package in the Azure-SSIS Integration Runtime, and monitor the running package.
Prerequisites
Before you start, make sure you have version 17.2 or later of SQL Server Management Studio. To download the
latest version of SSMS, see Download SQL Server Management Studio (SSMS).
Also make sure that you have set up the SSISDB database in Azure and provisioned the Azure-SSIS Integration
Runtime. For info about how to provision SSIS on Azure, see Deploy SQL Server Integration Services packages to
Azure.
IMPORTANT
An Azure SQL Database server listens on port 1433. If you are attempting to connect to an Azure SQL Database server
from within a corporate firewall, this port must be open in the corporate firewall for you to connect successfully.
SETTING | SUGGESTED VALUE | MORE INFO
Server name | The fully qualified server name | The name should be in this format: mysqldbserver.database.windows.net. If you need the server name, see Connect to the SSISDB Catalog database on Azure.
Login | The server admin account | The account that you specified when you created the server.
Password | The password for your server admin account | The password that you specified when you created the server.
3. Connect to the SSISDB database. Select Options to expand the Connect to Server dialog box. In the
expanded Connect to Server dialog box, select the Connection Properties tab. In the Connect to
database field, select or enter SSISDB.
4. Then select Connect . The Object Explorer window opens in SSMS.
5. In Object Explorer, expand Integration Services Catalogs and then expand SSISDB to view the objects in
the SSIS Catalog database.
NOTE
Deployment to Azure only supports the project deployment model.
NOTE
If you get the error message There is no active worker agent. (.Net SqlClient Data Provider) , make sure the
Azure-SSIS Integration Runtime is running. This error occurs if you try to deploy while the Azure-SSIS IR is in a
stopped state.
5. After the deployment process is complete, the Results page opens. This page displays the success or failure
of each action.
If the action failed, select Failed in the Result column to display an explanation of the error.
Optionally, select Save Report... to save the results to an XML file.
Select Close to exit the wizard.
if ($folders.Count -gt 0)
{
    foreach ($filefolder in $folders)
    {
        Write-Host "Creating Folder $($filefolder.Name) ..."
    }
}
Run a package
1. In Object Explorer in SSMS, select the package that you want to run.
2. Right-click and select Execute to open the Execute Package dialog box.
3. In the Execute Package dialog box, configure the package execution by using the settings on the
Parameters , Connection Managers , and Advanced tabs.
4. Select OK to run the package.
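If you prefer to script the execution instead of using the Execute Package dialog box, you can call the SSISDB catalog stored procedures. This is a minimal sketch; the folder, project, and package names are placeholders.
-- Create and start an execution for a deployed package
DECLARE @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution]
    @folder_name = N'folderName', @project_name = N'projectName',
    @package_name = N'packageName', @use32bitruntime = 0,
    @execution_id = @execution_id OUTPUT
EXEC [SSISDB].[catalog].[start_execution] @execution_id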
Next steps
Learn how to schedule package execution. For more info, see Schedule SSIS package execution on Azure
Connect to the SSIS Catalog (SSISDB) in Azure
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Find the connection information you need to connect to the SSIS Catalog (SSISDB) hosted on an Azure SQL
Database server. You need the following items to connect:
fully qualified server name
database name
login information
IMPORTANT
You can't create the SSISDB Catalog database on Azure SQL Database at this time independently of creating the Azure-SSIS
Integration Runtime in Azure Data Factory. The Azure-SSIS IR is the runtime environment that runs SSIS packages on Azure.
For a walkthrough of the process, see Deploy and run an SSIS package in Azure.
Prerequisites
Before you start, make sure you have version 17.2 or later of SQL Server Management Studio (SSMS). If the
SSISDB Catalog database is hosted on SQL Managed Instance, make sure you have version 17.6 or later of SSMS.
To download the latest version of SSMS, see Download SQL Server Management Studio (SSMS).
4. If you have forgotten the login information for the SQL Database server, navigate to the SQL Database
server page. There you can view the server admin name and, if necessary, reset the password.
Server name | The fully qualified server name | The name should be in this format: mysqldbserver.database.windows.net.
Login | The server admin account | This is the account that you specified when you created the server.
Password | The password for your server admin account | This is the password that you specified when you created the server.
3. Connect to the SSISDB database. Select Options to expand the Connect to Server dialog box. In the expanded Connect to Server dialog box, select the Connection Properties tab. In the Connect to database field, select or enter SSISDB.
IMPORTANT
If you don't select SSISDB when you connect, you may not see the SSIS Catalog in Object Explorer.
4. Then select Connect .
5. In Object Explorer, expand Integration Services Catalogs and then expand SSISDB to view the objects in
the SSIS Catalog database.
Next steps
Deploy a package. For more info, see Deploy an SSIS project with SQL Server Management Studio (SSMS).
Run a package. For more info, see Run an SSIS package with SQL Server Management Studio (SSMS).
Schedule a package. For more info, see Schedule SSIS packages in Azure
Validate SQL Server Integration Services (SSIS)
packages deployed to Azure
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
When you deploy a SQL Server Integration Services (SSIS) project to the SSIS Catalog (SSISDB) on an Azure
server, the Package Deployment Wizard adds an additional validation step after the Review page. This validation
step checks the packages in the project for known issues that may prevent the packages from running as expected
in the Azure SSIS Integration Runtime. Then the wizard displays any applicable warnings on the Validate page.
IMPORTANT
The validation described in this article occurs when you deploy a project with SQL Server Data Tools (SSDT) version 17.4 or
later. To get the latest version of SSDT, see Download SQL Server Data Tools (SSDT).
For more info about the Package Deployment Wizard, see Deploy Integration Services (SSIS) Projects and
Packages.
CONNECTION MANAGER | WINDOWS AUTHENTICATION | FILE PATH | UNC PATH | HOSTNAME | PROVIDER OR DRIVER
Ftp ✓
Wmi ✓
Next steps
To learn how to schedule package execution on Azure, see Schedule SSIS packages in Azure.
Run SQL Server Integration Services (SSIS) packages
deployed in Azure
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can run SSIS packages deployed to the SSISDB Catalog on an Azure SQL Database server by choosing one of
the methods described in this article. You can run a package directly, or run a package as part of an Azure Data
Factory pipeline. For an overview about SSIS on Azure, see Deploy and run SSIS packages in Azure.
Run a package directly
Run with SSMS
Run with stored procedures
Run with script or code
Run a package as part of an Azure Data Factory pipeline
Run with the Execute SSIS Package activity
Run with the Stored Procedure activity
NOTE
Running a package with dtexec.exe has not been tested with packages deployed to Azure.
Next steps
Learn about options for scheduling SSIS packages deployed to Azure. For more info, see Schedule SSIS packages
in Azure.
Schedule the execution of SQL Server Integration
Services (SSIS) packages deployed in Azure
11/2/2020 • 4 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can schedule the execution of SSIS packages deployed to the SSISDB Catalog on an Azure SQL Database
server by choosing one of the methods described in this article. You can schedule a package directly, or schedule a
package indirectly as part of an Azure Data Factory pipeline. For an overview about SSIS on Azure, see Lift and
shift SQL Server Integration Services workloads to the cloud.
Schedule a package directly
Schedule with the Schedule option in SQL Server Management Studio (SSMS)
SQL Database elastic jobs
SQL Server Agent
Schedule a package indirectly as part of an Azure Data Factory pipeline
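For the SQL Database elastic jobs option, the snippet below adds a job step to an existing job. If the job does not exist yet, you create it first; the following is a minimal sketch that assumes the elastic job agent, database-scoped credential, and target group are already configured, and the job name is a placeholder.
-- Create the elastic job that the following snippet adds a step to
EXEC jobs.sp_add_job @job_name = 'ExecutePackageJob', @description = 'Run an SSIS package deployed to SSISDB'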
-- Add a job step to create/start SSIS package execution using SSISDB catalog stored procedures
EXEC jobs.sp_add_jobstep @job_name='ExecutePackageJob',
@command=N'DECLARE @exe_id bigint
EXEC [SSISDB].[catalog].[create_execution]
@folder_name=N''folderName'', @project_name=N''projectName'',
@package_name=N''packageName'', @use32bitruntime=0,
@runinscaleout=1, @useanyworker=1,
@execution_id=@exe_id OUTPUT
EXEC [SSISDB].[catalog].[start_execution] @exe_id, @retry_count=0',
@credential_name='YourDBScopedCredentials',
@target_group_name='TargetGroup'
-- Add the SSISDB database on your Azure SQL Database as a linked server to your SQL Server on
premises
EXEC sp_addlinkedserver
@server='myLinkedServer', -- Name your linked server
@srvproduct='',
@provider='sqlncli', -- Use SQL Server native client
@datasrc='<server_name>.database.windows.net', -- Add your Azure SQL Database server endpoint
@location='',
@provstr='',
@catalog='SSISDB' -- Add SSISDB as the initial catalog
For more info, see Create Linked Servers and Linked Servers.
Create a SQL Server Agent job
To schedule a package with SQL Server Agent on premises, create a job with a job step that calls the SSIS Catalog
stored procedures [catalog].[create_execution] and then [catalog].[start_execution] . For more info, see SQL
Server Agent Jobs for Packages.
1. In SQL Server Management Studio, connect to the on-premises SQL Server database on which you want to
create the job.
2. Right-click on the SQL Ser ver Agent node, select New , and then select Job to open the New Job dialog
box.
3. In the New Job dialog box, select the Steps page, and then select New to open the New Job Step dialog
box.
4. In the New Job Step dialog box, select SSISDB as the Database.
5. In the Command field, enter a Transact-SQL script similar to the script shown in the following example:
-- T-SQL script to create and start SSIS package execution using SSISDB stored procedures
DECLARE @return_value int, @exe_id bigint
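-- (A sketch of how such a script can continue: the calls typically go through the linked server
-- created earlier, and the linked server, folder, project, and package names are placeholders.)
EXEC @return_value = [myLinkedServer].[SSISDB].[catalog].[create_execution]
    @folder_name = N'folderName', @project_name = N'projectName',
    @package_name = N'packageName', @use32bitruntime = 0,
    @execution_id = @exe_id OUTPUT
EXEC [myLinkedServer].[SSISDB].[catalog].[start_execution] @exe_id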
Next steps
Review the options for running SSIS packages deployed to Azure. For more info, see Run SSIS packages in Azure.
Schedule the execution of SSIS packages deployed in
Azure with SQL Server Management Studio (SSMS)
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can use SQL Server Management Studio (SSMS) to schedule SSIS packages deployed to Azure SQL Database.
SQL Server on premises and SQL Managed Instance have SQL Server Agent and Managed Instance Agent
respectively as a first-class SSIS job scheduler. SQL Database, on the other hand, does not have a built-in first-class
SSIS job scheduler. The SSMS feature described in this article provides a familiar user interface that's similar to SQL
Server Agent for scheduling packages deployed to SQL Database.
If you're using SQL Database to host the SSIS catalog, SSISDB , you can use this SSMS feature to generate the Data
Factory pipelines, activities, and triggers required to schedule SSIS packages. Later you can optionally edit and
extend these objects in Data Factory.
When you use SSMS to schedule a package, SSIS automatically creates three new Data Factory objects, with names
based on the name of the selected package and the timestamp. For example, if the name of the SSIS package is
MyPackage , SSMS creates new Data Factory objects similar to the following:
OBJECT | NAME
Pipeline | Pipeline_MyPackage_2018-05-08T09_00_00Z
Trigger | Trigger_MyPackage_2018-05-08T09_00_00Z
Prerequisites
The feature described in this article requires SQL Server Management Studio version 17.7 or higher. To get the
latest version of SSMS, see Download SQL Server Management Studio (SSMS).
3. On the Package page of the New Schedule dialog box, select optional run-time settings and a run-time
environment.
4. On the Schedule page of the New Schedule dialog box, provide the schedule settings such as frequency,
time of day, and duration.
5. After you finish creating the job in the New Schedule dialog box, a confirmation appears to remind you
about the new Data Factory objects that SSMS is going to create. If you select Yes in the confirmation dialog box, the new Data Factory pipeline opens in the Azure portal for you to review and customize.
6. To customize the scheduling trigger, select New/Edit from the Trigger menu.
The Edit Trigger blade opens for you to customize the scheduling options.
Next steps
To learn about other methods for scheduling an SSIS package, see Schedule the execution of an SSIS package on
Azure.
To learn more about Azure Data Factory pipelines, activities, and triggers, see the following articles:
Pipelines and activities in Azure Data Factory
Pipeline execution and triggers in Azure Data Factory
Migrate on-premises SSIS workloads to SSIS in ADF
11/2/2020 • 2 minutes to read • Edit Online
This article lists the processes and tools that can help you migrate SQL Server Integration Services (SSIS) workloads to SSIS in ADF.
The migration overview highlights the overall migration process for your ETL workloads from on-premises SSIS to SSIS in ADF.
The migration process consists of two phases: Assessment and Migration.
Assessment
Data Migration Assistant (DMA) is a freely downloadable tool for this purpose that can be installed and run locally. You can create a DMA assessment project of type Integration Services to assess SSIS packages in batches and identify compatibility issues.
Get Data Migration Assistant, and perform the package assessment.
Migration
Depending on the storage types of source SSIS packages and the migration destination of database workloads, the
steps to migrate SSIS packages and SQL Server Agent jobs that schedule SSIS package executions may vary. For
more information, see this page.
Next steps
Migrate SSIS packages to Azure SQL Managed Instance.
Migrate SSIS jobs to Azure Data Factory (ADF) with SQL Server Management Studio (SSMS).
Install Integration Services (SSIS)
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
SQL Server provides a single setup program to install any or all of its components, including Integration Services.
Use Setup to install Integration Services with or without other SQL Server components on a single computer.
This article highlights important considerations that you should know before you install Integration Services.
Information in this article helps you evaluate your installation options so that your selection results in a successful
installation.
NOTE
Some SQL Server components that you can select for installation on the Feature Selection page of the Setup Wizard
install a partial subset of Integration Services components. These components are useful for specific tasks, but the
functionality of Integration Services is limited. For example, the Database Engine Ser vices option installs the Integration
Services components required for the SQL Server Import and Export Wizard. To ensure a complete installation of
Integration Services, you must select Integration Ser vices on the Feature Selection page.
Next steps
Installing Integration Services Versions Side by Side
What's New in Integration Services in SQL Server 2016
What's New in Integration Services in SQL Server 2017
Installing Integration Services Versions Side by Side
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can install
SQL Server 2019 (15.x) Integration Services (SSIS) side-by-side with earlier versions of SSIS. This topic describes
some limitations of side-by-side installations.
When you add an existing package to an existing project, the package is converted to the format targeted by the project.
Running packages
You can use the SQL Server 2019 (15.x) version of the dtexec utility or of SQL Server Agent to run Integration
Services packages created by earlier versions of the development tools. When these SQL Server 2019 (15.x) tools
load a package that was developed in an earlier version of the development tools, the tool temporarily converts the
package in memory to the package format that the current version of Integration Services uses. If the package
has issues that prevent a successful conversion, the SQL Server 2019 (15.x) tool can't run the package until those
issues are resolved. For more info, see Upgrade Integration Services Packages.
Upgrade Integration Services
11/2/2020 • 9 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
If SQL Server 2008 Integration Services (SSIS) or later is currently installed on your computer, you can upgrade to
SQL Server 2017 Integration Services (SSIS).
When you upgrade to SQL Server 2017 Integration Services (SSIS) on a machine that has one of these earlier
versions of Integration Services installed, SQL Server 2017 Integration Services (SSIS) is installed side-by-side with
the earlier version.
With this side-by-side install, multiple versions of the dtexec utility are installed. To ensure that you run the correct
version of the utility, at the command prompt run the utility by entering the full path (<drive>:\Program
Files\Microsoft SQL Server\<version>\DTS\Binn). For more information about dtexec, see dtexec Utility.
NOTE
In previous versions of SQL Server, by default when you installed SQL Server all users in the Users group had access to the
Integration Services service. When you install SQL Server 2019 (15.x), users do not have access to the Integration Services
service. The service is secure by default. After SQL Server 2019 (15.x) is installed, the SQL Server administrator must run the
DCOM Configuration tool (Dcomcnfg.exe) to grant specific users access to the Integration Services service. For more
information, see Integration Services Service (SSIS Service).
NOTE
Support for migrating or running Data Transformation Services (DTS) packages has been discontinued in SQL Server 2012.
The following DTS functionality has been discontinued.
DTS runtime
DTS API
Package Migration Wizard for migrating DTS packages to the next version of Integration Services
Support for DTS package maintenance in SQL Server Management Studio
Execute DTS 2000 Package task
Upgrade Advisor scan of DTS packages.
For information about other discontinued features, see Discontinued Integration Services Functionality in SQL Server 2016.
NOTE
Although the data moves to a different system table, the upgrade process does not migrate packages to the
new format.
Moves folder metadata from the msdb.sysdtsfolders90 system table to the msdb.sysssisfolders
system table.
Moves log data from the msdb.sysdtslog90 system table to the msdb.sysssislog system table.
Removes the msdb.sysdts*90 system tables and the stored procedures that are used to access them after moving the data to the new msdb.sysssis* tables. However, upgrade replaces the sysdtslog90 table with a view that is also named sysdtslog90. This new sysdtslog90 view exposes the new msdb.sysssislog system table. This ensures that reports based on the log table continue to run without interruption (see the example query after this list).
To control access to packages, creates three new fixed database-level roles: db_ssisadmin, db_ssisltduser, and
db_ssisoperator. The SQL Server 2005 (9.x) Integration Services roles of db_dtsadmin, db_dtsltduser, and
db_dtsoperator are not removed, but are made members of the corresponding new roles.
If the SSIS package store (that is, the file system location managed by the Integration Services service) is the
default location under \SQL Server\90, \SQL Server\100, \SQL Server\110, or \SQL Server\120, moves those packages to the new default location under \SQL Server\130.
Updates the Integration Services service configuration file to point to the upgraded instance of the Database
Engine.
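As noted in the list above, reports that query the sysdtslog90 log table continue to run against the replacement view after the upgrade. A minimal sketch of such a query:
-- The sysdtslog90 compatibility view still returns log rows after the upgrade
SELECT TOP (10) starttime, source, event, message
FROM msdb.dbo.sysdtslog90
ORDER BY starttime DESC;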
What the Upgrade Process Does Not Do
The upgrade process does not do the following tasks:
Does not remove the SQL Server 2008 Integration Services (SSIS) or later service.
Does not migrate existing Integration Services packages to the new package format that SQL Server 2019
(15.x) uses. For information about how to migrate packages, see Upgrade Integration Services Packages.
Does not move packages from file system locations, other than the default location, that have been added to
the service configuration file. If you have previously edited the service configuration file to add more file
system folders, packages that are stored in those folders will not be moved to a new location.
In SQL Server Agent job steps that call the dtexec utility (dtexec.exe) directly, does not update the file
system path for the dtexec utility. You have to edit these job steps manually to update the file system path
to specify the SQL Server 2019 (15.x) location for the dtexec utility.
What You Can Do After Upgrading
After the upgrade process finishes, you can do the following tasks:
Run SQL Server Agent jobs that run packages.
Use Management Studio to manage Integration Services packages that are stored in an instance of SQL
Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x). You need to modify
the service configuration file to add the instance of SQL Server 2008, SQL Server 2008 R2, SQL Server 2012
(11.x), or SQL Server 2014 (12.x) to the list of locations managed by the service.
NOTE
Early versions of Management Studio cannot connect to SQL Server 2017 Integration Services (SSIS) Service.
Identify the version of packages in the msdb.dbo.sysssispackages system table by checking the value in the packageformat column, which identifies the version of each package. A value of 3 indicates a SQL Server 2008 Integration Services (SSIS) package. Until you migrate packages to the new package format, the value in the packageformat column does not change. (See the example query after this list.)
You cannot use the SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014
(12.x) tools to design, run, or manage Integration Services packages. The SQL Server 2008, SQL Server 2008
R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) tools include the respective versions of SQL Server
Data Tools (SSDT), the SQL Server Import and Export Wizard, and the Package Execution Utility
(dtexecui.exe). The upgrade process does not remove the SQL Server 2008, SQL Server 2008 R2, SQL
Server 2012 (11.x), or SQL Server 2014 (12.x) tools. However, you will not be able to use these tools to continue
to work with SQL Server 2008 Integration Services (SSIS) or later packages on a server that has been
upgraded.
By default, in an upgrade installation, Integration Services is configured to log events that are related to the
running of packages to the Application event log. This setting might generate too many event log entries
when you use the Data Collector feature of SQL Server 2019 (15.x). The events that are logged include
EventID 12288, "Package started," and EventID 12289, "Package finished successfully." To stop logging these
two events to the Application event log, open the registry for editing. Then in the registry, locate the
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\130\SSIS node, and change the
DWORD value of the LogPackageExecutionToEventLog setting from 1 to 0.
NOTE
Although the packages in the upgraded instance of the Database Engine have not yet been migrated to the new
package format, they are not discoverable by the SQL Server 2008 tools. Therefore, the packages cannot be used by
the SQL Server 2008 tools.
You cannot use SQL Server 2008 Integration Services (SSIS) on other computers to run packages that are
stored in msdb on the upgraded instance of the Database Engine.
You cannot use SQL Server Agent jobs on SQL Server 2008 computers to run SQL Server 2008 Integration
Services (SSIS) packages that are stored in the upgraded instance of the Database Engine.
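As referenced in the list above, a quick way to check the format of the packages stored in msdb is to query the sysssispackages table directly; a minimal sketch:
-- List packages and their packageformat values (a value of 3 indicates a SQL Server 2008 package)
SELECT name, packageformat
FROM msdb.dbo.sysssispackages
ORDER BY name;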
External Resources
Blog entry, Making your Existing Custom SSIS Extensions and Applications Work in Denali, on blogs.msdn.com.
Upgrade Integration Services Packages
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
When you upgrade an instance of SQL Server 2008 to the current release of SQL Server, your existing SQL Server
2008 Integration Services (SSIS) packages are not automatically upgraded to the package format that the current release of SQL Server Integration Services uses. You will have to select an upgrade method and manually upgrade your packages.
For information on upgrading packages when you convert a project to the project deployment model, see Deploy
Integration Services (SSIS) Projects and Packages
NOTE
When you run a SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) package using
the dtexec utility (dtexec.exe) that is installed with the current release of SQL Server, the temporary package upgrade
increases the execution time. The rate of increase in package execution time varies depending on the size of the package. To
avoid an increase in the execution time, it is recommended that you upgrade the package before running it.
UPGRADE METHOD | TYPE OF UPGRADE
Use the dtexec utility (dtexec.exe) that is installed with the current release of SQL Server to run a SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) package. | The package upgrade is temporary. The changes cannot be saved.
Open a SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) package file in SQL Server Data Tools (SSDT). | The package upgrade is permanent if you save the package; otherwise, it is temporary if you do not save the package.
Add a SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) package to an existing project in SQL Server Data Tools (SSDT). | The package upgrade is permanent.
Open a SQL Server 2008 Integration Services (SSIS) or later project file in Visual Studio, and then use the SSIS Package Upgrade Wizard to upgrade multiple packages in the project. | The package upgrade is permanent.
Use the Upgrade method to upgrade one or more Integration Services packages. | The package upgrade is permanent.
NOTE
To identify which packages have the issues listed in this table, run Upgrade Advisor.
COMPONENT OR FEATURE | UPGRADE RESULTS
Connection strings | For SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 (11.x), or SQL Server 2014 (12.x) packages, the names of certain providers have changed and require different values in the connection strings. To update the connection strings, use one of the following procedures:
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
You can upgrade packages that were created in earlier versions of Integration Services to the Integration Services
format that SQL Server 2019 (15.x) uses. SQL Server provides the SSIS Package Upgrade Wizard to help in this
process. Because you can configure the wizard to back up your original packages, you can continue to use the
original packages if you experience upgrade difficulties.
The SSIS Package Upgrade Wizard is installed when Integration Services is installed.
NOTE
The SSIS Package Upgrade Wizard is available in the Standard, Enterprise, and Developer Editions of SQL Server.
For more information about how to upgrade Integration Services packages, see Upgrade Integration Services
Packages.
The remainder of this topic describes how to run the wizard and back up the original packages.
NOTE
When you open an Integration Services project that contains SQL Server 2008 Integration Services (SSIS) or later
packages, Integration Services automatically opens the SSIS Package Upgrade Wizard.
NOTE
The backup option in the wizard only works with packages that have been stored to the file system.
2. In SQL Server Management Studio or at the command prompt, run the SSIS Package Upgrade Wizard.
3. On the Select Source Location page of the wizard, set the Package source property to File System .
4. On the Select Destination Location page of the wizard, select Save to source location to save the
upgraded packages to the same location as the original packages.
NOTE
The backup option in the wizard is available only when the upgraded packages are stored in the same folder as the
original packages.
5. On the Select Package Management Options page of the wizard, select the Backup original
packages option.
To back up the original packages in SQL Server Data Tools
1. Save the original packages to a location on the file system.
2. On the Select Package Management Options page of the wizard, select the Backup original
packages option.
WARNING
The Backup original packages option is not displayed when you open a SQL Server 2008 Integration Services
(SSIS) or later project in SQL Server Data Tools (SSDT), which automatically launches the wizard.
3. In SQL Server Data Tools (SSDT), run the SSIS Package Upgrade Wizard.
Integration Services (SSIS) Development and
Management Tools
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Integration Services includes two studios for working with packages:
SQL Server Data Tools (SSDT) for developing the Integration Services packages that a business solution
requires. SQL Server Data Tools (SSDT) provides the Integration Services project in which you create
packages.
SQL Server Management Studio for managing packages in a production environment.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
SQL Server provides SQL Server Data Tools (SSDT) for the development of Integration Services packages.
Integration Services packages reside in projects. To create and work with Integration Services projects, you must
install SQL Server Data Tools. For more information, see Install Integration Services.
When you create a new Integration Services project in SQL Server Data Tools (SSDT), the New Project dialog box
includes an Integration Ser vices Project template. This project template creates a new project that contains a
single package.
TIP
By default, when you create a new project in SQL Server Data Tools, the solution is not shown in the Solution Explorer pane. To change this default behavior, on the Tools menu, click Options. In the Options dialog box, expand Projects and Solutions, and then click General. On the General page, select Always show solution.
The following table describes the folders that appear in an Integration Services project.
Package Parts | Contains Package Parts that can be reused or imported. For more information, see Reuse Control Flow across Packages by Using Control Flow Package Parts.
NOTE: To view and change the selected source control plug-in and to configure the source control
environment, click Options on the Tools menu, and then expand the Source Control node.
8. Click OK to add the solution to Solution Explorer and add the project to the solution.
Next steps
Download and install SQL Server Data Tools.
SSIS How to Create an ETL Package
SQL Server Integration Services (SSIS) DevOps Tools
Azure DevOps extension
11/2/2020 • 11 minutes to read • Edit Online
Properties
Project path
Path of the project folder or file to be built. If a folder path is specified, the SSIS Build task searches for all dtproj files recursively under this folder and builds them all.
Project path cannot be empty; set it to . to build from the root folder of the repository.
Project configuration
Name of the project configuration to be used for build. If not supplied, it defaults to the first defined project
configuration in each dtproj file.
Output path
Path of a separate folder to save build results, which can be published as a build artifact via the Publish Build Artifacts task.
Limitations and known issues
The SSIS Build task relies on Visual Studio and the SSIS designer, which are mandatory on build agents. Thus, to run the SSIS Build task in the pipeline, you must choose vs2017-win2016 for Microsoft-hosted agents, or install Visual Studio and the SSIS designer (either VS2017 + SSDT2017, or VS2019 + the SSIS Projects extension) on self-hosted agents.
To build SSIS projects using any out-of-box components (including the SSIS Azure Feature Pack and other third-party components), those out-of-box components must be installed on the machine where the pipeline agent is running. For a Microsoft-hosted agent, you can add a PowerShell Script task or Command Line Script task to download and install the components before the SSIS Build task is executed. Below is a sample PowerShell script to install the Azure Feature Pack:
# Install the Azure Feature Pack MSI silently, log the installation, then display the log
start -Wait -FilePath msiexec -Args "/i AFP.msi /quiet /l* log.txt"
cat log.txt
Protection levels EncryptSensitiveWithPassword and EncryptAllWithPassword are not supported in the SSIS Build task. Make sure all SSIS projects in the codebase are not using these two protection levels, or the SSIS Build task will stop responding and time out during execution.
Domain name
Domain name to access the specified file system. This property is only visible when Destination type is File System.
You can leave it empty when the user account to run the self-hosted agent has been granted read/write access to
the specified destination path.
Username
Username to access the specified file system or SSISDB. This property is visible when Destination type is File
System or Authentication type is SQL Server Authentication or Active Directory - Password. You can leave it empty
when the destination type is file system, and the user account to run the self-hosted agent has been granted
read/write access to the specified destination path.
Password
Password to access the specified file system or SSISDB. This property is visible when destination type is file system
or authentication type is SQL Server authentication or Active Directory - password. You can leave it empty when
destination type is file system, and the user account to run the self-hosted agent has been granted read/write
access to the specified destination path.
Overwrite existing projects or SSISDeploymentManifest files of the same names
Specify whether to overwrite existing projects or SSISDeploymentManifest files of the same names. If 'No', the SSIS Deploy task will skip deploying those projects or files.
Continue deployment when error occurs
Specify whether to continue deployment for the remaining projects or files when an error occurs. If 'No', the SSIS Deploy task will stop immediately when an error occurs.
Limitations and known issues
SSIS Deploy Task doesn't support the following scenarios currently:
Configure environment in SSIS catalog.
Deploy ispac to Azure SQL Server or Azure SQL Managed Instance, which only allows multi-factor
authentication (MFA).
Deploy packages to MSDB or SSIS Package Store.
Username
Username to access the target SQL Server. This property is only visible when Authentication type is SQL Server
Authentication or Active Directory - Password.
Password
Password to access the target SQL Server. This property is only visible when Authentication type is SQL Server
Authentication or Active Directory - Password.
Define configuration JSON
The configuration JSON schema has three layers:
catalog
folder
project and environment
JSON schema
Catalog Attributes
PROPERTY | DESCRIPTION | NOTES
folders | An array of folder objects. Each object contains configuration information for a catalog folder. | See Folder Attributes for the schema of a folder object.
Folder Attributes
name | Name of the catalog folder. | Folder will be created if it does not exist.
description | Description of the catalog folder. | The value of null will be skipped.
projects | An array of project objects. Each object contains configuration information for a project. | See Project Attributes for the schema of a project object.
environments | An array of environment objects. Each object contains configuration information for an environment. | See Environment Attributes for the schema of an environment object.
Project Attributes
parameters | An array of parameter objects. Each object contains configuration information for a parameter. | See Parameter Attributes for the schema of a parameter object.
references | An array of reference objects. Each object represents an environment reference to the target project. | See Reference Attributes for the schema of a reference object.
Parameter Attributes
Reference Attributes
environmentFolder | Folder name of the environment. Value can be ".", which represents the parent folder of the project that references the environment. | Folder will be created if it does not exist.
Environment Attributes
Variable Attributes
description | Description of the environment variable. | The value of null will be skipped.
value | Value of the environment variable. | This attribute supports any valid boolean, number, and string JSON values. The value will be converted to the type specified by the type attribute. An error will occur if the conversion fails. The value of null is invalid; the task will skip this environment variable object and give a warning.
Release notes
Version 1.0.3
Release Date: October 21, 2020
Allow specifying connection string suffix for SSIS Deploy task and SSIS Catalog Configuration task.
Version 1.0.2
Release Date: May 26, 2020
Fixed an issue that the SSIS Catalog Configuration task may fail in some cases after the configuration work is done.
Version 1.0.1
Release Date: May 9, 2020
Fixed an issue that the SSIS Build task always builds the whole solution even if only a single dtproj file is specified as the project path.
Version 1.0.0
Release Date: May 8, 2020
General Availability (GA) release.
Added a restriction of the minimum .NET Framework version on the agent. Currently the minimum .NET Framework version is 4.6.2.
Refined the descriptions of the SSIS Build task and SSIS Deploy task.
Version 0.2.0 Preview
Release Date: March 31, 2020
Add SSIS Catalog Configuration task.
Version 0.1.3 Preview
Release Date: January 19, 2020
Fixed an issue that prevented ispac from being deployed if its original file name was changed.
Version 0.1.2 Preview
Release Date: January 13, 2020
Added more detailed exception information in the SSIS Deploy task log when the destination type is SSISDB.
Fixed the example destination path in the help text of the property Destination path of SSIS Deploy task.
Version 0.1.1 Preview
Release Date: January 6, 2020
Added a restriction of the minimum agent version requirement. Currently the minimum agent version for this product is 2.144.0.
Fixed some incorrect display text for SSIS Deploy task.
Refined some error messages.
Version 0.1.0 Preview
Release Date: December 5, 2019
Initial release of SSIS DevOps Tools. This is a preview release.
Next steps
Get SSIS DevOps extension
Standalone SQL Server Integration Service (SSIS)
DevOps Tools (Preview)
11/2/2020 • 4 minutes to read • Edit Online
Standalone SSIS DevOps Tools provide a set of executables to do SSIS CICD tasks. Without the dependency on
the installation of Visual Studio or SSIS runtime, these executables can be easily integrated with any CICD platform.
The executables provided are:
SSISBuild.exe: build SSIS projects in project deployment model or package deployment model.
SSISDeploy.exe: deploy ISPAC files to SSIS catalog, or DTSX files and their dependencies to file system.
Installation
.NET framework 4.6.2 or higher is required.
Download the latest installer from download center, then install via wizard or command line:
Install via wizard
Double-click the .exe file to install, then specify a folder to extract the executables and dependency files.
SSISBuild.exe
Syntax
PARAMETER | DESCRIPTION
-project |-p:<dtproj file path> File path of the dtproj file to be built.
-configuration|-c:<configuration name> Name of the project configuration to be used for build. If not
supplied, it defaults to the first defined project configuration in
the dtproj file.
-projectPassword|-pp:<project password> Password of the SSIS project and its packages. This argument
is only valid when the protection level of the SSIS project and
packages is EncryptSensitiveWithPassword or
EncryptAllWithPassword. For package deployment model, all
packages must share the same password specified by this
argument.
-output|-o:<output path> Output path of the build artifact. The value of this argument
will overwrite the default output path in the project
configuration.
Examples
Build a dtproj with the first defined project configuration, not encrypted with password:
SSISBuild.exe -p:"C:\projects\demo\demo.dtproj"
Build a dtproj with configuration "DevConfiguration", encrypted with password, and output the build
artifacts to a specific folder:
Build a dtproj with configuration "DevConfiguration", encrypted with password, striping its sensitive data,
and log level DIAG:
Parameters
PARAMETER | DESCRIPTION
-source|-s:<source path> Local file path of the artifacts to be deployed. ISPAC, DTSX, a path of a folder for DTSX, and SSISDeploymentManifest are allowed.
-authType|-at:<auth type name> Authentication type to access SQL Server. Mandatory for
CATALOG destination. Following types are supported:
WIN: Windows Authentication
SQL: SQL Server Authentication
ADPWD: Active Directory - Password
ADINT: Active Directory - Integrated
-connectionStringSuffix|-css:<connection string suffix> Suffix of the connection string, which is used to connect to the
SSIS catalog.
-log|-l:<log level>[;<log path>] Log related settings for running this utility.
log level: Only logs with equal or higher logging level will be
written to the log file. There are four logging levels (from low
to high): DIAG, INFO, WRN, ERR. The default logging level is
INFO if it's not specified.
log path: Path of the file to persist logs. Log file will not be
generated if the path is not specified.
Examples
Deploy a single ISPAC not encrypted with password to SSIS catalog with Windows authentication.
Deploy a single ISPAC encrypted with password to SSIS catalog with SQL authentication, and rename the
project name.
Deploy a single SSISDeploymentManifest and its associated files to Azure file share.
SSISDeploy.exe -s:D:\myfolder\mypackage.SSISDeploymentManifest -d:file;\\myssisshare.file.core.windows.net\destfolder -u:Azure\myssisshare -p:storagekey
Release notes
Version 0.1.0 Preview
Release Date: October 16, 2020
Initial preview release of standalone SSIS DevOps Tools.
Next steps
Get standalone SSIS DevOps Tools
If you have questions, visit Q&A
Integration Services User Interface
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
In addition to the design surfaces on the SSIS Designer tabs, the user interface provides access to the following
windows and dialog boxes for adding features to packages and configuring the properties of package objects:
The dialog boxes and windows that you use to add functionality such as logging and configurations to
packages.
The custom editors for configuring the properties of package objects. Almost every type of container, task,
and data flow component has its own custom editor.
The Advanced Editor dialog box, a generic editor that provides more detailed configuration options for
many data flow components.
SQL Server Data Tools (SSDT) also provides windows and dialog boxes for configuring the environment and
working with packages.
Getting Started | Access samples, tutorials, and videos. | On the design surface of the Control Flow tab or the Data Flow tab, right-click and then click Getting Started.
Configure SSIS Logs | Configure logging for a package and its tasks by adding logs and setting logging details. | On the SSIS menu, click Logging.
Package Configuration Organizer | Add and edit package configurations. You run the Package Configuration Wizard from this dialog box. | On the SSIS menu, click Package Configurations.
Digital Signing | Sign a package or remove the signature from the package. | On the SSIS menu, click Digital Signing.
Set Breakpoints | Enable breakpoints on tasks and set breakpoint properties. | On the design surface of the Control Flow tab, right-click a task or container, and then click Edit Breakpoints. To set a breakpoint on the package, right-click anywhere on the design surface of the Control Flow tab, and then click Edit Breakpoints.
The Getting Started window provides links to samples, tutorials, and videos. To add links to additional content, modify the SamplesSites.xml file that is included with the current release of SQL Server Integration Services. It is
recommended that you not modify the <GettingStartedSamples> element value that specifies the RSS feed URL.
The file is located in the <drive>:\Program Files\Microsoft SQL Server\110\DTS\Binn folder. On a 64-bit computer,
the file is located in the <drive>:\Program Files(x86)\Microsoft SQL Server\110\DTS\Binn folder
If the SamplesSites.xml file does become corrupted, replace the xml in the file with the following default xml.
<?xml version="1.0" ?>
<SamplesSites>
  <GettingStartedSamples>https://go.microsoft.com/fwlink/?LinkID=203147</GettingStartedSamples>
  <ToolboxSamples>
    <Site>https://go.microsoft.com/fwlink/?LinkID=203286&query=SSIS%20{0}</Site>
  </ToolboxSamples>
</SamplesSites>
This table lists the windows that are available from the SSIS and View menus and the design surfaces of SSIS
Designer.
WINDOW | PURPOSE | ACCESS
Variables | Add and manage custom variables. | On the SSIS menu, click Variables.
Log Events | View log entries at run time. | On the SSIS menu, click Log Events.
Custom Editors
Integration Services provides a custom dialog box for most containers, tasks, sources, transformations, and
destinations.
The following table describes how to access custom dialog boxes.
EDITOR TYPE | ACCESS
Container. For more information, see Integration Services Containers. | On the design surface of the Control Flow tab, double-click the container.
Task. For more information, see Integration Services Tasks. | On the design surface of the Control Flow tab, double-click the task.
Source. | On the design surface of the Data Flow tab, double-click the source.
Transformation. For more information, see Integration Services Transformations. | On the design surface of the Data Flow tab, double-click the transformation.
Destination. | On the design surface of the Data Flow tab, double-click the destination.
Advanced Editor
The Advanced Editor dialog box is a user interface for configuring data flow components. It reflects the
properties of the component using a generic layout. The Advanced Editor dialog box is not available to
Integration Services transformations that have multiple inputs.
To open this editor, click Show Advanced Editor in the Properties window, or right-click a data flow component and then click Show Advanced Editor.
If you create a custom source, transformation, or destination but do not want to write a custom user interface, you
can use the Advanced Editor instead.
Related Tasks
For information about how to create packages in SQL Server Data Tools (SSDT), see Create Packages in SQL Server
Data Tools
See Also
SSIS Designer
SSIS Designer
11/2/2020 • 8 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
SSIS Designer is a graphical tool that you can use to create and maintain Integration Services packages. SSIS
Designer is available in SQL Server Data Tools (SSDT) as part of an Integration Services project.
You can use SSIS Designer to perform the following tasks:
Construct the control flow in a package.
Construct the data flows in a package.
Add event handlers to the package and package objects.
View the package content.
At run time, view the execution progress of the package.
The following diagram shows SSIS Designer and the Toolbox window.
Integration Services includes additional dialog boxes and windows for adding functionality to packages, and SQL
Server Data Tools (SSDT) provides windows and dialog boxes for configuring the development environment and
working with packages. For more information, see Integration Services User Interface.
SSIS Designer has no dependency on the Integration Services service, the service that manages and monitors
packages, and it is not required that the service be running to create or modify packages in SSIS Designer. However,
if you stop the service while SSIS Designer is open, you can no longer open the dialog boxes that SSIS Designer
provides and you may receive the error message "RPC server is unavailable." To reset SSIS Designer and continue
working with the package, you must close the designer, exit SQL Server Data Tools (SSDT), and then reopen SQL
Server Data Tools (SSDT), the Integration Services project, and the package.
Undo and Redo
You can undo and redo up to 20 actions in the SSIS Designer. For a package, undo/redo is available in the Control
Flow , Data Flow , Event Handlers , and Parameters tabs, and in the Variables window. For a project, undo/redo
is available in the Project Parameters window.
You can't undo/redo changes to the new SSIS Toolbox .
When you make changes to a component using the component editor, you undo and redo the changes as a set
rather than undoing and redoing individual changes. The set of changes appears as a single action in the undo and
redo drop-down list.
To undo an action, click the undo toolbar button, click the Edit/Undo menu item, or press CTRL+Z. To redo an action, click the redo toolbar button, click the Edit/Redo menu item, or press CTRL+Y. You can undo and redo multiple actions by clicking the arrow next to the toolbar button, highlighting multiple actions in the drop-down list, and then clicking in the list.
IMPORTANT: Parameters are available only to projects developed for the project deployment model.
Therefore, you will see the Parameters tab only for packages that are part of a project configured to use the
project deployment model.
NOTE: To enable or disable the display of messages on the Progress tab, toggle the Debug Progress
Repor ting option on the SSIS menu.
Options
Executable
Select the executable for which you want to build an event handler. The executable can be the package, or a task or container in the package.
Event handler
Select a type of event handler. Create the event handler by dragging items from the Toolbox .
Delete
Select an event handler, and remove it from the package by clicking Delete .
Click here to create an <event handler name> for the executable <executable name>
Click to create the event handler.
Create the control flow by dragging graphical objects that represent SSIS tasks and containers from the Toolbox to
the design surface of the Event Handlers tab, and then connecting the objects by using precedence constraints to
define the sequence in which they run.
Additionally, to add annotations, right-click the design surface, and then on the menu, click Add Annotation .
Progress tab
Use the Progress tab of SSIS Designer to view the progress of execution of an Integration Services package when
you run the package in SQL Server Data Tools (SSDT). The Progress tab lists the start time, the finish time, and the
elapsed time for validation and execution of the package and its executables; any information or warnings for the
package; progress notifications; the success or failure of the package; and any error messages that are generated
during package execution.
To enable or disable the display of messages on the Progress tab, toggle the Debug Progress Repor ting option
on the SSIS menu. Disabling progress reporting can help improve performance while running a complex package
in SQL Server Data Tools.
After the package stops running, the Progress tab becomes the Execution Results tab.
Related Tasks
Create Packages in SQL Server Data Tools
See Also
Integration Services User Interface
Advanced Editor
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the Advanced Editor dialog box to configure properties for the selected Integration Services object.
The Advanced Editor is available for most Integration Services objects that have configurable properties. It is the
only editor available for those objects that do not expose a custom user interface.
Integration Services data flow objects have properties that can be set at the component level, the input and output
level, and the input and output column level. The Advanced Editor enumerates all the common and custom
properties of the selected object and displays them on up to four of the following five tabs as applicable:
Connection Managers -- use this tab to set connection properties
Component Proper ties -- use this tab to set component-level properties
Column Mappings -- use this tab to map available columns as output columns
Input Columns -- use this tab to select input columns
Input and Output Proper ties -- use this tab to set input and output properties; and to add and remove
outputs, select or remove columns for inputs and outputs, and set properties for inputs and outputs
The properties displayed vary by component. For more information on the properties that may be displayed in the
Advanced Editor , see the following topics:
Common Properties
Transformation Custom Properties
Path Properties
For more information about the specific component that you are editing, see the description of the component in
the Data Flow Elements section of the Integration Services Objects and Concepts documentation:
Integration Services Transformations
See Also
Integration Services Error and Message Reference
Group or Ungroup Components
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Control Flow, Data Flow, and Event Handlers tabs in SSIS Designer support collapsible grouping. If a
package has many components, the tabs can become crowded, making it difficult to view all the components at
one time and to locate the item with which you want to work. The collapsible grouping feature can conserve space
on the work surface and make it easier to work with large packages.
You select the components that you want to group, group them, and then expand or collapse the groups to suit
your work. Expanding a group provides access to the properties of the components in the group. The precedence
constraints that connect tasks and containers are automatically included in the group.
The following are considerations for grouping components.
To group components, the control flow, data flow, or event handler must contain more than one component.
Groups can also be nested, making it possible to create groups within groups. The design surface can contain multiple un-nested groups, but a component can belong to only one group, unless the groups are nested.
When a package is saved, SSIS Designer saves the grouping, but the grouping has no effect on package
execution. The ability to group components is a design-time feature; it does not affect the run-time behavior
of the package.
To group components
1. In SQL Server Data Tools (SSDT), open the Integration Services project that contains the package you want.
2. In Solution Explorer, double-click the package to open it.
3. Click the Control Flow , Data Flow , or Event Handlers tab.
4. On the design surface of the tab, select the components you want to group, right-click a selected component,
and then click Group .
5. To save the updated package, click Save Selected Items on the File menu.
To ungroup components
1. In SQL Server Data Tools (SSDT), open the Integration Services project that contains the package you want.
2. In Solution Explorer, double-click the package to open it.
3. Click the Control Flow , Data Flow , or Event Handlers tab.
4. On the design surface of the tab, select the group that contains the component you want to ungroup, right-
click, and then click Ungroup .
5. To save the updated package, click Save Selected Items on the File menu.
See Also
Add or Delete a Task or a Container in a Control Flow
Connect Tasks and Containers by Using a Default Precedence Constraint
Use Annotations in Packages
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The SSIS Designer provides annotations, which you can use to make packages self-documenting and easier to
understand and maintain. You can add annotations to the control flow, data flow, and event handler design surfaces
of SSIS Designer. The annotations can contain any type of text, and they are useful for adding labels, comments,
and other descriptive information to a package. Annotations are a design-time feature only. For example, they are
not written to logs.
When you press ENTER, the text wraps to the next line. The annotation box automatically increases in size as you
add additional lines of text. Package annotations are persisted as clear text in the CDATA section of the package file.
For more information about changes to the format of the package file, see SSIS Package Format.
When you save the package, SSIS Designer saves the annotations in the package.
NOTE
If you add no text, the text block is removed when you click outside the block.
5. To change the size or format of the text in the annotation, right-click the annotation and then click Set Text
Annotation Font .
6. To add an additional line of text, press ENTER.
The annotation box automatically increases in size as you add additional lines of text.
7. To add an annotation to a group, right-click the annotation and then click Group .
8. To save the updated package, on the File menu, click Save All .
SSIS Toolbox
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
All components installed on the local machine automatically appear in the SSIS Toolbox . When you install
additional components, right-click inside the toolbox and then click Refresh Toolbox to add the components.
When you create a new SSIS project or open an existing project, the SSIS Toolbox displays automatically. You can
also open the toolbox by clicking the toolbox button that is located in the top-right corner of the package design
surface, or by clicking VIEW -> Other Windows -> SSIS Toolbox.
NOTE
If you can't see the toolbox, go to VIEW -> Other Windows -> SSIS Toolbox.
Get more information about a component in the toolbox by clicking the component to view its description at the
bottom of the toolbox. For some components you can also access samples that demonstrate how to configure and
use the components. The samples are available on MSDN. To access the samples from the SSIS Toolbox , click the
Find Samples link that appears below the description.
NOTE
You can't remove installed components from the toolbox.
Toolbox categories
In the SSIS Toolbox , control flow and data flow components are organized into categories. You can expand and
collapse categories, and rearrange components. Restore the default organization by right-clicking inside the toolbox
and then clicking Restore Toolbox Defaults .
The Favorites and Common categories appear in the toolbox when you select the Control Flow , Data Flow , and
Event Handlers tabs. The Other Tasks category appears in the toolbox when you select the Control Flow tab or
the Event Handlers tab. The Other Transforms , Other Sources , and Other Destinations categories appear in
the toolbox when you select the Data Flow tab.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the General page of the Integration Services Designers page in the Options dialog box to specify the
options for loading, displaying, and upgrading packages.
To open the General page, in SQL Server Data Tools (SSDT), on the Tools menu, click Options , expand Business
Intelligence Designers , and select Integration Services Designers .
Options
Check digital signature when loading a package
Select to have Integration Services check the digital signature when loading a package. Integration Services will
only check whether the digital signature is present, is valid, and is from a trusted source. Integration Services will
not check whether the package has been changed since the package was signed.
If you set the BlockedSignatureStates registry value, this registry value overrides the Check digital signature
when loading a package option. For more information, see Implement a Signing Policy by Setting a Registry
Value.
For more information about digital certificates and packages, see Identify the Source of Packages with Digital
Signatures.
Show warning if package is unsigned
Select to display a warning when loading a package that is not signed.
Show precedence constraint labels
Select which label (Success, Failure, or Completion) to display on precedence constraints when viewing packages in
SQL Server Data Tools (SSDT).
Scripting language
Select the default scripting language for new Script tasks and Script components.
Update connection strings to use new provider names
When opening or upgrading SQL Server 2005 Integration Services (SSIS) packages, update connection strings to
use the names for the following providers, for the current release of SQL Server Integration Services:
Analysis Services OLE DB provider
SQL Server Native Client
The SSIS Package Upgrade Wizard updates only connection strings that are stored in connection managers. The
wizard does not update connection strings that are constructed dynamically by using the Integration Services
expression language, or by using code in a Script task.
Create new package ID
When upgrading SQL Server 2005 Integration Services (SSIS) packages, create new package IDs for the upgraded
versions of the packages.
See Also
Security Overview (Integration Services)
Extending Packages with Scripting
Integration Services (SSIS) Packages
11/2/2020 • 8 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A package is an organized collection of connections, control flow elements, data flow elements, event handlers,
variables, parameters, and configurations that you assemble by using the graphical design tools that SQL
Server Integration Services provides, or that you build programmatically. You then save the completed package to SQL
Server, the SSIS Package Store, or the file system, or you can deploy the Integration Services project to the SSIS server.
The package is the unit of work that is retrieved, executed, and saved.
When you first create a package, it is an empty object that does nothing. To add functionality to a package, you
add a control flow and, optionally, one or more data flows to the package.
The following diagram shows a simple package that contains a control flow with a Data Flow task, which in turn
contains a data flow.
After you have created the basic package, you can add advanced features such as logging and variables to extend
package functionality. For more information, see the section about Objects that Extend Package Functionality.
The completed package can then be configured by setting package-level properties that implement security,
enable restarting of packages from checkpoints, or incorporate transactions in package workflow. For more
information, see the section about Properties that Support Extended Features.
Contents of a package
Tasks and containers (control flow). A control flow consists of one or more tasks and containers that execute
when the package runs. To control order or define the conditions for running the next task or container in the
package control flow, you use precedence constraints to connect the tasks and containers in a package. A subset
of tasks and containers can also be grouped and run repeatedly as a unit within the package control flow. For
more information, see Control Flow.
Data sources and destinations (data flow). A data flow consists of the sources and destinations that extract
and load data, the transformations that modify and extend data, and the paths that link sources, transformations,
and destinations. Before you can add a data flow to a package, the package control flow must include a Data Flow
task. The Data Flow task is the executable within the SSIS package that creates, orders, and runs the data flow. A
separate instance of the data flow engine is opened for each Data Flow task in a package. For more information,
see Data Flow Task and Data Flow.
Connection managers (connections). A package typically includes at least one connection manager. A
connection manager is a link between a package and a data source that defines the connection string for
accessing the data that the tasks, transformations, and event handlers in the package use. Integration Services
includes connection types for data sources such as text and XML files, relational databases, and Analysis Services
databases and projects. For more information, see Integration Services (SSIS) Connections.
Related Tasks
Integration Services includes two graphical tools, SSIS Designer and SQL Server Import and Export Wizard, in
addition to the SSIS object model for creating packages. See the following topics for details.
Import and Export Data with the SQL Server Import and Export Wizard
Create Packages in SQL Server Data Tools
See Building Packages Programmatically in the Developer Guide.
Create Packages in SQL Server Data Tools
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
In SQL Server Data Tools (SSDT), you can create a new package by using one of the following methods:
Use the package template that Integration Services includes.
Use a custom template
To use custom packages as templates for creating new packages, you simply copy them to the
DataTransformationItems folder. By default, this folder is in C:\Program Files\Microsoft Visual Studio
10.0\Common7\IDE\PrivateAssemblies\ProjectItems\DataTransformationProject.
Copy an existing package.
If existing packages include functionality that you want to reuse, you can build the control flow and data
flows in the new package more quickly by copying and pasting objects from other packages. For more
information about using copy and paste in Integration Services projects, see Reuse of Package Objects.
If you create a new package by copying an existing package or by using a custom package as a template,
the name and the GUID of the existing package are copied as well. You should update the name and the
GUID of the new package to help differentiate it from the package from which it was copied. For example, if
packages have the same GUID, it is more difficult to identify the package to which log data belongs. You
can regenerate the GUID in the ID property and update the value of the Name property by using the
Properties window in SQL Server Data Tools (SSDT). For more information, see Set Package Properties and
dtutil Utility.
Use a custom package that you have designated as a template.
Run the SQL Server Import and Export Wizard
The SQL Server Import and Export Wizard creates a complete package for a simple import or export. This
wizard configures the connections, source, and destination, and adds any data transformations that are
required to let you run the import or export immediately. You can optionally save the package to run it
again later, or to refine and enhance the package in SQL Server Data Tools. However, if you save the
package, you must add the package to an existing Integration Services project before you can change the
package or run the package in SQL Server Data Tools.
The packages that you create in SQL Server Data Tools (SSDT) using SSIS Designer are saved to the file system. To
save a package to SQL Server or to the package store, you need to save a copy of the package. For more
information, see Save a Copy of a Package.
For a video that demonstrates how to create a basic package using the default package template, see Creating a
Basic Package (SQL Server Video).
Create a package in SQL Server Data Tools using the Package Template
1. In SQL Server Data Tools (SSDT), open the Integration Services project in which you want to create a
package.
2. In Solution Explorer, right-click the SSIS Packages folder, and then click New SSIS Package .
3. Optionally, add control flow, data flow tasks, and event handlers to the package. For more information, see
Control Flow, Data Flow, and Integration Services (SSIS) Event Handlers.
4. On the File menu, click Save Selected Items to save the new package.
NOTE
You can save an empty package.
You can create, maintain, and run packages that target SQL Server 2016, SQL Server 2014, or SQL Server 2012.
Add Copy of Existing Package
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the Add Copy of Existing Package dialog box to add a copy of a package stored in SQL Server, the file
system, or the SSIS Package Store to an Integration Services project.
Options
Package location
Select the type of storage location from which to copy the package.
Server
If copying from SQL Server or the SSIS Package Store, type a server name or select a server from the list.
Authentication type
If copying from SQL Server, select an authentication type.
User name
If using SQL Server Authentication, provide a user name.
Password
If using SQL Server Authentication, provide a password.
Package path
Type the package path, or click the browse button (...) and locate the package to copy.
See Also
Save Copy of Package
Save Packages
Integration Services Service (SSIS Service)
Set Package Properties
11/2/2020 • 7 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
When you create a package in SQL Server Data Tools (SSDT) by using the graphical interface that Integration
Services provides, you set the properties of the package object in the Properties window.
The Properties window provides a categorized and alphabetical list of properties. To arrange the Properties
window by category, click the Categorized icon.
When arranged by category, the Properties window groups properties in the following categories:
Checkpoints
Execution
Forced Execution Value
Identification
Misc
Security
Transactions
Version
For information about additional package properties that you cannot set in the Properties window, see Package.
To set package properties in the Properties window
Set the Properties of a Package
Properties by Category
The following tables list the package properties by category.
Checkpoints
You can use the properties in this category to restart the package from a point of failure in the package control
flow, instead of rerunning the package from the beginning of its control flow. For more information, see Restart
Packages by Using Checkpoints.
CheckpointFileName The name of the file that captures the checkpoint information
that enables a package to restart. When the package finishes
successfully, this file is deleted.
NOTE
The /CheckPointing on option of dtexec is equivalent to setting the SaveCheckpoints property of the package to True,
and the CheckpointUsage property to Always. For more information, see dtexec Utility.
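For illustration only, a dtexec invocation along the following lines enables checkpoints from the command line as the note describes; the package path is a placeholder, and this is a sketch rather than a definitive command.
dtexec /FILE "C:\Packages\MyPackage.dtsx" /CheckPointing on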
Execution
The properties in this category configure the run-time behavior of the package object.
DisableEventHandlers Specifies whether the package event handlers run. The default
value of this property is False .
MaxConcurrentExecutables The number of executable files that the package can run
concurrently. The default value of this property is -1 , which
indicates that there is no limit.
PackagePriorityClass The Win32 thread priority class of the package thread. The
values are Default , AboveNormal, Normal, BelowNormal,
Idle . The default value of this property is Default . For more
information, see DTSPriorityClass.
Identification
The properties in this category provide information such as the unique identifier and name of the package.
CreatorComputerName The name of the computer on which the package was created.
Misc
The properties in this category are used to access the configurations and expressions that a package uses and to
provide information about the locale and logging mode of the package. For more information, see Use Property
Expressions in Packages.
Expressions Click the browse button (...) to create expressions for package
properties.
Note that you can create property expressions for all the
package properties that the object model includes, not just the
properties listed in the Properties window.
ForceExecutionResult The execution result of the package. The values are None ,
Success , Failure , and Completion . The default value of this
property is None . For more information, see
Microsoft.SqlServer.Dts.Runtime.DTSForcedExecResult.
Security
The properties in this category are used to set the protection level of the package. For more information, see
Access Control for Sensitive Data in Packages.
Transactions
The properties in this category configure the isolation level and the transaction option of the package. For more
information, see Integration Services Transactions.
IsolationLevel The isolation level of the package transaction. The values are
Unspecified , Chaos , ReadUncommitted ,
ReadCommitted , RepeatableRead , Serializable , and
Snapshot . The default value of this property is Serializable .
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
In SSIS Designer, the Package Explorer tab provides an explorer view of the package. The view reflects the
container hierarchy of the Integration Services architecture. The package container is at the top of the hierarchy,
and you expand the package to view the connections, executables, event handlers, log providers, precedence
constraints, and variables in the package.
The executables, which are the containers and tasks in the package, can include event handlers, precedence
constraints, and variables. Integration Services supports a nested hierarchy of containers, and the For Loop,
Foreach Loop, and Sequence containers can include other executables.
If a package includes a data flow, the Package Explorer lists the Data Flow task and includes a Components
folder that lists the data flow components.
From the Package Explorer tab, you can delete objects in a package and access the Properties window to view
object properties.
The following diagram shows a tree view of a simple package.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic describes how to create a new Integration Services package by copying an existing package, and how to
update the Name and GUID properties of the new package.
To copy a package
1. In SQL Server Data Tools (SSDT), open the Integration Services project that contains the package that you
want to copy.
2. In Solution Explorer, double-click the package.
3. Verify that either the package to copy is selected in Solution Explorer, or that the tab in SSIS Designer that contains the
package is the active tab.
4. On the File menu, click Save <package name> As .
NOTE
The package must be opened in SSIS Designer before the Save As option appears on the File menu.
See Also
Save a Copy of a Package
Create Packages in SQL Server Data Tools
Integration Services (SSIS) Packages
Copy Package Objects
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
This topic describes how to copy control flow items, data flow items, and connection managers within a package or
between packages.
To copy control and data flow items
1. In SQL Server Data Tools (SSDT), open the Integration Services project that contains the packages that you
want to work with.
2. In Solution Explorer, double-click the packages that you want to copy between.
3. In SSIS Designer, click the tab for the package that contains the items to copy and click the Control Flow ,
Data Flow , or Event Handlers tab.
4. Select the control flow or data flow items to copy. You can either select items one at a time by pressing the
Shift key and clicking the item or select items as a group by dragging the pointer across the items you want
to select.
IMPORTANT
The precedence constraints and paths that connect items are not selected automatically when you select the two
items that they connect. To copy an ordered workflow (a segment of control flow or data flow), make sure to also copy
the precedence constraints and the paths.
IMPORTANT
You cannot copy a data flow to a package unless the package contains at least one Data Flow task.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
In SQL Server Data Tools (SSDT) you build packages by using SSIS Designer and save the packages to the file
system as XML files (.dtsx files). You can also save copies of the package XML file to the msdb database in SQL
Server or to the package store. The package store represents the folders in the file system location that the
Integration Services service manages.
If you save a package to the file system, you can later use the Integration Services service to import the package
to SQL Server or to the package store. For more information, see Integration Services Service (SSIS Service).
You can also use a command prompt utility, dtutil , to copy a package between the file system and msdb. For
more information, see dtutil Utility.
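For example, a command along the following lines copies a file-system package into the msdb database; the path and package name are placeholders, and the full option syntax is covered in the dtutil Utility topic.
dtutil /FILE "C:\Packages\MyPackage.dtsx" /COPY SQL;MyPackage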
NOTE
You can verify the path and file name where the package was saved in the Properties window.
NOTE
The package must be opened in SSIS Designer before you can save a copy of the package.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Save a commonly used control flow task or container to a standalone part file - a ".dtsxp" file - and reuse it
multiple times in one or more packages by using control flow package parts. This reusability makes SSIS packages
easier to design and maintain.
A new part file with the ".dtsxp" extension is created under the Package Parts | Control Flow folder. At the same
time, a new item with the same name is also added to the SSIS toolbox. (The toolbox item is only visible while you
have a project that contains the part open in Visual Studio.)
Add a copy of an existing control flow package part or a reference to an existing part
To add a copy of an existing part in the file system to a package, in Solution Explorer, expand the Package Parts
folder. Right-click on Control Flow and select Add Existing Control Flow Package Part .
Options
Package Part path
Type the path to the part file, or click the browse button (...) and locate the part file to copy or to reference.
Add as a reference
If selected, the part is added to the Integration Services project as a reference. Select this option when you
want to reference a single copy of a part file in multiple Integration Services projects.
If cleared, a copy of the part file is added to the project.
NOTE
If you want to remove a part from an Integration Services project, but continue to use it in other projects, use the Exclude
from Project option instead of the Delete option.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Packages frequently include functionality that you want to reuse. For example, if you created a set of tasks, you might want
to reuse the items together as a group, or you might want to reuse a single item such as a connection manager that
you created in a different Integration Services project.
Related Tasks
Copy Package Objects
Copy Project Items
Save a Package as a Package Template
Delete Packages
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
In SQL Server Data Tools (SSDT), you can delete packages saved to the file system. If you delete a package, it is
deleted permanently and it cannot be restored to an Integration Services project.
NOTE
If you want to remove packages from an Integration Services project, but use them in other projects, then you should use
the Exclude From Project option instead of the Delete option.
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The dtutil command prompt utility is used to manage SQL Server Integration Services packages. The utility can
copy, move, delete, or verify the existence of a package. These actions can be performed on any SSIS package that
is stored in one of three locations: a Microsoft SQL Server database, the SSIS Package Store, or the file system. If
the utility accesses a package that is stored in msdb , the command prompt may require a user name and a
password. If the instance of SQL Server uses SQL Server Authentication, the command prompt requires both a
user name and a password. If the user name is missing, dtutil tries to log on to SQL Server using Windows
Authentication. The storage type of the package is identified by the /SQL , /FILE , and /DTS options.
The dtutil command prompt utility does not support the use of command files or redirection.
The dtutil command prompt utility includes the following features:
Remarks in the command prompt, which makes the command prompt action self-documenting and easier
to understand.
Overwrite protection, to prompt for a confirmation before overwriting an existing package when you are
copying or moving packages.
Console help, to provide information about the command options for dtutil .
NOTE
Many of the operations that are performed by dtutil can also be performed visually in SQL Server Management Studio
when you are connected to an instance of Integration Services. For more information, see Package Management (SSIS
Service).
The options can be typed in any order. The pipe ("|") character is the OR operator and is used to show possible
values. You must use one of the options that are delimited by the OR pipe.
All options must start with a slash (/) or a minus sign (-). However, do not include a space between the slash or
minus sign and the text for the option; otherwise, the command will fail.
Arguments must be strings that are either enclosed in quotation marks or contain no white space.
Double quotation marks within strings that are enclosed in quotation marks represent escaped single quotation
marks.
Options and arguments, except for passwords, are not case sensitive.
Installation Considerations on 64-bit Computers
On a 64-bit computer, Integration Services installs a 64-bit version of the dtexec utility (dtexec.exe) and the dtutil
utility (dtutil.exe). To install 32-bit versions of these Integration Services tools, you must select either Client Tools
or SQL Server Data Tools (SSDT) during setup.
By default, a 64-bit computer that has both the 64-bit and 32-bit versions of an Integration Services command
prompt utility installed will run the 32-bit version at the command prompt. The 32-bit version runs because the
directory path for the 32-bit version appears in the PATH environment variable before the directory path for the
64-bit version. (Typically, the 32-bit directory path is <drive>:\Program Files(x86)\Microsoft SQL
Server\130\DTS\Binn, while the 64-bit directory path is <drive>:\Program Files\Microsoft SQL
Server\130\DTS\Binn.)
NOTE
If you use SQL Server Agent to run the utility, SQL Server Agent automatically uses the 64-bit version of the utility. SQL
Server Agent uses the registry, not the PATH environment variable, to locate the correct executable for the utility.
To ensure that you run the 64-bit version of the utility at the command prompt, you can take one of the following
actions (a sample command follows this list):
Open a Command Prompt window, change to the directory that contains the 64-bit version of the utility
(<drive>:\Program Files\Microsoft SQL Server\130\DTS\Binn), and then run the utility from that location.
At the command prompt, run the utility by entering the full path (<drive>:\Program Files\Microsoft SQL
Server\130\DTS\Binn) to the 64-bit version of the utility.
Permanently change the order of the paths in the PATH environment variable by placing the 64-bit path
(<drive>:\Program Files\Microsoft SQL Server\130\DTS\Binn) before the 32-bit path (<drive>:\ Program
Files(x86)\Microsoft SQL Server\130\DTS\Binn) in the variable.
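For example, invoking the utility by its full path, as in the second action above, might look like the following; the drive letter and package path are placeholders.
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\dtutil.exe" /FILE "C:\Packages\MyPackage.dtsx" /EXISTS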
Syntax
dtutil /option [value] [/option [value]]...
Parameters
/Dec[rypt] password (Optional). Sets the decryption password that is used when
you load a package with password encryption.
/Del[ete] Deletes the package specified by the SQL, DTS or FILE option.
If dtutil cannot delete the package, the program ends.
/DestP[assword] password Specifies the password that is used with the SQL option to
connect to a destination SQL Server instance using SQL
Server Authentication. An error is generated if
DESTPASSWORD is specified in a command line that does not
include the DTSUSER option.
/DestS[erver] server_instance Specifies the server name that is used with any action that
causes a destination to be saved to SQL Server. It is used to
identify a non-local or non-default server when saving an
SSIS package. It is an error to specify DESTSERVER in a
command line that does not have an action associated with
SQL Server. Actions such as SIGN SQL, COPY SQL, or MOVE
SQL options would be appropriate commands to combine
with this option.
/DestU[ser] username Specifies the user name that is used with the SIGN SQL,
COPY SQL, and MOVE SQL options to connect to a SQL
Server instance that uses SQL Server Authentication. It is an
error to specify DESTUSER in a command line that does not
include the SIGN SQL, COPY SQL, or MOVE SQL option.
/Dump process ID (Optional) Causes the specified process, either the dtexec
utility or the dtsDebugHost.exe process, to pause and
create the debug dump files, .mdmp and .tmp.
To find the process ID for the process that you want to pause,
use Windows Task Manager.
FILE
SQL
SOURCEUSER
SOURCEPASSWORD
SOURCESERVER
/En[crypt] {SQL | FILE}; Path;ProtectionLevel[;password] (Optional). Encrypts the loaded package with the specified
protection level and password, and saves it to the location
specified in Path. The ProtectionLevel determines whether a
password is required.
FILE - Path is the fully-qualified path and file name for the
package.
ProtectionLevel options:
/FC[reate] {SQL | DTS};ParentFolderPath;NewFolderName (Optional). Create a new folder that has the name that you
specified in NewFolderName. The location of the new folder is
indicated by the ParentFolderPath.
/FDe[lete] {SQL | DTS}[;ParentFolderPath;FolderName] (Optional). Deletes from SQL Server or SSIS the folder that
was specified by the name in FolderName. The location of the
folder to delete is indicated by the ParentFolderPath.
/FDi[rectory] {SQL | DTS};FolderPath[;S] (Optional). Lists the contents, both folders and packages, in a
folder on SSIS or SQL Server. The optional FolderPath
parameter specifies the folder that you want to view the
content of. The optional S parameter specifies that you want
to view a listing of the contents of the subfolders for the
folder specified in FolderPath.
/FE[xists] {SQL | DTS};FolderPath (Optional). Verifies if the specified folder exists on SSIS or SQL
Server. The FolderPath parameter is the path and name of the
folder to verify.
/Fi[le] filespec This option specifies that the SSIS package to be operated on
is located in the file system. The filespec value can be
provided as either a Universal Naming Convention (UNC)
path or local path.
DTS
SQL
SOURCEUSER
SOURCEPASSWORD
SOURCESERVER
/FR[ename] {SQL | DTS}[;ParentFolderPath;OldFolderName;NewFolderName] (Optional). Renames a folder on the SSIS or SQL Server. The
ParentFolderPath is the location of the folder to rename. The OldFolderName is the current name of the folder, and
NewFolderName is the new name to give the folder.
/H[elp] option Displays extensive text help that shows the dtutil options
and describes their use. The option argument is optional. If
the argument is included, the Help text includes detailed
information about the specified option. The following examples
display help for all options and for a single option, respectively:
dtutil /H
dtutil /H Q
/I[DRegenerate] Creates a new GUID for the package and updates the
package ID property. When a package is copied, the package
ID remains the same; therefore, the log files contain the same
GUID for both packages. This action creates a new GUID for
the newly-copied package to distinguish it from the original.
/M[ove] {SQL | File | DTS}; pathandname Specifies a move action on an SSIS package. To use this
parameter, first specify the location of the package using the
/FI , /SQ , or /DT option. Next, specify the Move action. This
action requires two arguments, which are separated by a
semicolon:
/Si[gn] {SQL | File | DTS}; path; hash Signs an SSIS package. This action uses three required
arguments, which are separated by semicolons: destination,
path, and hash.
/SourceP[assword] password Specifies the password that is used with the SQL and
SOURCEUSER options to enable the retrieval of an SSIS
package that is stored in a database on a SQL Server instance
that uses SQL Server Authentication. It is an error to specify
SOURCEPASSWORD in a command line that does not include
the SOURCEUSER option.
/SourceS[erver] server_instance Specifies the server name that is used with the SQL option to
enable the retrieval of an SSIS package that is stored in SQL
Server. It is an error to specify SOURCESERVER in a command
line that does not include the SIGN SQL, COPY SQL, or
MOVE SQL option.
/SourceU[ser] username Specifies the user name that is used with the SOURCESERVER
option to enable the retrieval of an SSIS package stored in
SQL Server using SQL Server Authentication. It is an error to
specify SOURCEUSER in a command line that does not
include the SIGN SQL, COPY SQL, or MOVE SQL option.
/SQ[L] package_path Specifies the location of an SSIS package. This option indicates
that the package is stored in the msdb database. The
package_path argument specifies the path and name of the
SSIS package. Folder names are terminated with back slashes.
DTS
FILE
SOURCEUSER
SOURCEPASSWORD
SOURCESERVER
Remarks
You cannot use command files or redirection with dtutil .
The order of the options within the command line is not significant.
Examples
The following examples detail typical command line usage scenarios.
Copy Examples
To copy a package that is stored in the msdb database on a local instance of SQL Server using Windows
Authentication to the SSIS Package Store, use the following syntax:
To copy a package from a location on the File system to another location and give the copy a different name, use
the following syntax:
To copy a package on the local file system to an instance of SQL Server hosted on another computer, use the
following syntax:
Because the /DestU[ser] and /DestP[assword] options were not used, Windows Authentication is assumed.
To create a new ID for a package after it is copied, use the following syntax:
To create a new ID for all the packages in a specific folder, use the following syntax:
Use a single percent sign (%) when typing the command at the command prompt. Use a double percent sign
(%%) if the command is used inside a batch file.
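The command listings for the copy scenarios above are not reproduced in this text. The following sketches are representative only and are not the original listings; all server, folder, and package names are placeholders, and Windows Authentication is assumed unless credentials are shown.
REM Copy from msdb (Windows Authentication) to the SSIS Package Store
dtutil /SQL srcPackage /COPY DTS;destFolder\destPackage
REM Copy a file-system package to another location under a new name
dtutil /FILE C:\myPackages\myPackage.dtsx /COPY FILE;C:\myTestPackages\myNewPackage.dtsx
REM Copy a file-system package to an instance of SQL Server on another computer
dtutil /FILE C:\sourcePackage.dtsx /DestServer <destinationServer> /COPY SQL;destPackage
REM Create a new package ID after the copy
dtutil /I /FILE C:\myTestPackages\myNewPackage.dtsx
REM Create new IDs for all packages in a folder (use %% instead of % inside a batch file)
for %f in (C:\myTestPackages\*.dtsx) do dtutil /I /FILE %f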
Delete Examples
To delete a package that is stored in the msdb database on an instance of SQL Server that uses Windows
Authentication, use the following syntax:
To delete a package that is stored in the msdb database on an instance of SQL Server that uses SQL Server
Authentication, use the following syntax:
To delete a package that is stored in the SSIS Package Store, use the following syntax:
To delete a package that is stored in the file system, use the following syntax:
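As hedged sketches only, representative commands for the delete scenarios above follow in order; the package names, user name, and password are placeholders.
REM Delete from msdb (Windows Authentication)
dtutil /SQL delPackage /DELETE
REM Delete from msdb (SQL Server Authentication)
dtutil /SQL delPackage /SOURCEUSER srcUserName /SOURCEPASSWORD srcPassword /DELETE
REM Delete from the SSIS Package Store
dtutil /DTS delPackage.dtsx /DELETE
REM Delete from the file system
dtutil /FILE C:\delPackage.dtsx /DELETE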
Exists Examples
To determine whether a package exists in the msdb database on a local instance of SQL Server that uses
Windows Authentication, use the following syntax:
To determine whether a package exists in the msdb database on a local instance of SQL Server that uses SQL
Server Authentication, use the following syntax:
NOTE
To determine whether a package exists on a named server, include the SOURCESERVER option and its argument. You can
only specify a server by using the SQL option.
To determine whether a package exists in the local package store, use the following syntax:
To determine whether a package exists in the local file system, use the following syntax:
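As hedged sketches only, representative commands for the existence checks above follow in order; the package names and credentials are placeholders.
REM msdb, Windows Authentication
dtutil /SQL srcPackage /EXISTS
REM msdb, SQL Server Authentication
dtutil /SQL srcPackage /SOURCEUSER srcUserName /SOURCEPASSWORD srcPassword /EXISTS
REM SSIS Package Store
dtutil /DTS srcPackage.dtsx /EXISTS
REM File system
dtutil /FILE C:\srcPackage.dtsx /EXISTS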
Move Examples
To move a package that is stored in the SSIS Package Store to the msdb database on a local instance of SQL
Server that uses Windows Authentication, use the following syntax:
To move a package that is stored in the msdb database on a local instance of SQL Server that uses SQL Server
Authentication to the msdb database on another local instance of SQL Server that uses SQL Server
Authentication, use the following syntax:
dtutil /SQL srcPackage /SOURCEUSER srcUserName /SOURCEPASSWORD $Hj45jhd@X /MOVE SQL;destPackage /DESTUSER
destUserName /DESTPASSWORD !38dsFH@v
NOTE
To move a package from one named server to another, include the SOURCESERVER and the DESTSERVER options and their arguments.
You can only specify servers by using the SQL option.
To move a package that is stored in the SSIS Package Store, use the following syntax:
To move a package that is stored in the file system, use the following syntax:
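As hedged sketches only, representative commands for the remaining move scenarios (the first, third, and fourth described above; the SQL Server Authentication scenario already has a listing) might look like the following; all names are placeholders.
REM Move from the SSIS Package Store to msdb (Windows Authentication)
dtutil /DTS srcPackage.dtsx /MOVE SQL;destPackage
REM Move within the SSIS Package Store
dtutil /DTS srcPackage.dtsx /MOVE DTS;destPackage.dtsx
REM Move within the file system
dtutil /FILE C:\sourcePackage.dtsx /MOVE FILE;C:\destPackage.dtsx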
Sign Examples
To sign a package that is stored in a SQL Server database on a local instance of SQL Server that uses Windows
Authentication, use the following syntax:
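A representative command is sketched below; the package name is a placeholder and the hash shown is not a real certificate thumbprint (see the note that follows).
dtutil /SQL srcPackage /SIGN SQL;srcPackage;0123456789ABCDEF0123456789ABCDEF01234567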
To locate information about your certificate, use CertMgr . The hash code can be viewed in the CertMgr utility by
selecting the certificate, and then clicking View to view the properties. The Details tab provides more
information about the certificate. The Thumbprint property is used as the hash value, with spaces removed.
NOTE
The hash used in this example is not a real hash.
For more information, see the CertMgr section in Signing and Checking Code with Authenticode.
Encrypt Examples
The following sample encrypts the file-based PackageToEncrypt.dtsx to the file-based EncryptedPackage.dts using
full package encryption, with a password. The password that is used for the encryption is EncPswd.
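A representative command is sketched below; it assumes that protection level 3 is the encrypt-all-with-password option, and it reuses the file names and password from the scenario description.
dtutil /FILE PackageToEncrypt.dtsx /ENCRYPT FILE;EncryptedPackage.dts;3;EncPswd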
See Also
Run Integration Services (SSIS) Packages
SSIS Package Upgrade Wizard F1 Help
11/2/2020 • 8 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the SSIS Package Upgrade Wizard to upgrade packages created by earlier versions of SQL Server to the
package format for the current release of SQL Server Integration Services.
To run the SSIS Package Upgrade Wizard
Upgrade Integration Services Packages Using the SSIS Package Upgrade Wizard
NOTE
This page is only available when you run the SSIS Package Upgrade Wizard from SQL Server Management Studio or at the
command prompt.
Static Options
Package source
Select the storage location that contains the packages to be upgraded. This option has the values listed in the
following table.
SSIS Package Store Indicates that the packages to be upgraded are in the package
store. The package store consists of the set of file system
folders that the Integration Services service manages. For
more information, see Package Management (SSIS Service).
Microsoft SQL Server Indicates the packages to be upgraded are from an existing
instance of SQL Server.
Folder
Type the name of a folder that contains the packages you want to upgrade or click Browse and locate the folder.
Browse
Browse to locate the folder that contains the packages you want to upgrade.
Package Source Dynamic Options
Package source = SSIS Package Store
Server
Type the name of the server that has the packages to be upgraded, or select this server in the list.
Package source = Microsoft SQL Server
Server
Type the name of the server that has the packages to be upgraded, or select this server from the list.
Use Windows authentication
Select to use Windows Authentication to connect to the server.
Use SQL Server authentication
Select to use SQL Server Authentication to connect to the server. If you use SQL Server Authentication, you must
provide a user name and password.
User name
Type the user name that SQL Server Authentication will use to connect to the server.
Password
Type the password that SQL Server Authentication will use to connect to the server.
NOTE
This page is only available when you run the SSIS Package Upgrade Wizard from SQL Server Management Studio or at the
command prompt.
Static Options
Save to source location
Save the upgraded packages to the same location as specified on the Select Source Location page of the wizard.
If the original packages are stored in the file system and you want the wizard to back up those packages, select the
Save to source location option. For more information, see Upgrade Integration Services Packages Using the
SSIS Package Upgrade Wizard.
Select new destination location
Save the upgraded packages to the destination location that is specified on this page.
Package source
Specify where the upgrade packages are to be stored. This option has the values listed in the following table.
SSIS Package Store Indicates that the upgraded packages are to be saved to the
Integration Services package store. The package store consists
of the set of file system folders that the Integration Services
service manages. For more information, see Package
Management (SSIS Service).
Microsoft SQL Server Indicates that the upgraded packages are to be saved to an
existing instance of SQL Server.
Folder
Type the name of a folder to which the upgraded packages will be saved, or click Browse and locate the folder.
Browse
Browse to locate the folder to which the upgraded packages will be saved.
Package Source Dynamic Options
Package source = SSIS Package Store
Server
Type the name of the server to which the upgrade packages will be saved, or select a server in the list.
Package source = Microsoft SQL Server
Server
Type the name of the server to which the upgrade packages will be saved, or select this server in the list.
Use Windows authentication
Select to use Windows Authentication to connect to the server.
Use SQL Server authentication
Select to use SQL Server Authentication to connect to the server. If you use SQL Server Authentication, you must
provide a user name and password.
User name
Type the user name to be used when using SQL Server Authentication to connect to the server.
Password
Type the password to be used when using SQL Server Authentication to connect to the server.
NOTE
This option is available only when you specify that the original packages and the upgraded packages are stored in the file
system and in the same folder.
Select Packages page
Use the Select Packages page to select the packages to upgrade. This page lists the packages that are stored in
the location that was specified on the Select Source Location page of the wizard.
Options
Existing package name
Select one or more packages to upgrade.
Upgrade package name
Provide the destination package name, or use the default name that the wizard provides.
NOTE
You can also change the destination package name after upgrading the package. In SQL Server Data Tools (SSDT) or SQL
Server Management Studio, open the upgraded package and change the package name.
Password
Specify the password that is used to decrypt the selected upgrade packages.
Apply to selection
Apply the specified password to decrypt the selected upgrade packages.
See Also
Upgrade Integration Services Packages
Integration Services (SSIS) Package and Project
Parameters
11/2/2020 • 11 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Integration Services (SSIS) parameters allow you to assign values to properties within packages at the time of
package execution. You can create project parameters at the project level and package parameters at the package
level. Project parameters are used to supply any external input the project receives to one or more packages in the
project. Package parameters allow you to modify package execution without having to edit and redeploy the
package.
In SQL Server Data Tools you create, modify, or delete project parameters by using the Project.params window.
You create, modify, and delete package parameters by using the Parameters tab in the SSIS Designer. You
associate a new or an existing parameter with a task property by using the Parameterize dialog box. For more
about using the Project.params window and the Parameters tab, see . For more information about the
Create Parameters
Server Value: The value assigned to the parameter within the scope of the project, after the project is deployed to the
Integration Services server. This value overrides the design default. The value can be a literal or an environment
variable reference.
You can use a single parameter to assign a value to multiple package properties. A single package property can be
assigned a value only from a single parameter.
Executions and Parameter Values
The execution is an object that represents a single instance of package execution. When you create an execution,
you specify all of the details necessary to run a package, such as execution parameter values. You can also modify
the parameter values for existing executions.
When you explicitly set an execution parameter value, the value is applicable only to that particular instance of
execution. The execution value is used instead of a server value or a design value. If you do not explicitly set an
execution value, and a server value has been specified, the server value is used.
When a parameter is marked as required, a server value or execution value must be specified for that parameter.
Otherwise, the corresponding package does not execute. Although the parameter has a default value at design
time, it will never be used once the project is deployed.
Environment Variables
If a parameter references an environment variable, the literal value from that variable is resolved through the
specified environment reference and applied to the parameter. The final literal parameter value that is used for
package execution is referred to as the execution parameter value. You specify the environment reference for an
execution by using the Execute dialog box.
If a project parameter references an environment variable and the literal value from the variable cannot be
resolved at execution, the design value is used. The server value is not used.
To view the environment variables that are assigned to parameter values, query the catalog.object_parameters
view. For more information, see catalog.object_parameters (SSISDB Database).
Determining Execution Parameter Values
The following Transact-SQL views and stored procedure can be used to display and set parameter values.
catalog.execution_parameter_values (SSISDB Database)(view)
Shows the actual parameter values in a specific execution.
catalog.get_parameter_values (SSISDB Database) (stored procedure)
Resolves and shows the actual values for the specified package and environment reference.
catalog.object_parameters (SSISDB Database) (view)
Displays the parameters and properties for all packages and projects in the Integration Services catalog, including
the design default and server default values.
catalog.set_execution_parameter_value (SSISDB Database)
Sets the value of a parameter for an instance of execution in the Integration Services catalog.
You can also use the Execute Package dialog box in SQL Server Data Tools (SSDT) to modify the parameter value.
For more information, see Execute Package Dialog Box.
You can also use the dtexec /Parameter option to modify a parameter value. For more information, see dtexec
Utility.
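As a hedged sketch only, the following dtexec invocation assigns an execution value of 5 to a package parameter for a package deployed to the SSIS Catalog; the folder, project, package, server, and parameter names are placeholders.
dtexec /ISServer "\SSISDB\SampleFolder\SampleProject\MyPackage.dtsx" /Server "localhost" /Parameter $Package::MyParameter;5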
Parameter Validation
If parameter values cannot be resolved, the corresponding package execution will fail. To help avoid failures, you
can validate projects and packages by using the Validate dialog box in SQL Server Data Tools (SSDT). Validation
allows you to confirm that all parameters have the necessary values or can resolve the necessary values with
specific environment references. Validation also checks for other common package issues.
For more information, see Validate Dialog Box.
Parameter Example
This example describes a parameter named pkgOptions that is used to specify options for the package in which it
resides.
During design time, when the parameter was created in SQL Server Data Tools, a default value of 1 was assigned
to the parameter. This default value is referred to as the design default. If the project was deployed to the SSISDB
catalog and no other values were assigned to this parameter, the package property corresponding to the
pkgOptions parameter would be assigned the value of 1 during package execution. The design default persists
with the project throughout its life cycle.
While preparing a specific instance of package execution, a value of 5 is assigned to the pkgOptions parameter.
This value is referred to as the execution value because it applies to the parameter only for that particular instance
of execution. When execution starts, the package property corresponding to the pkgOptions parameter is
assigned the value of 5.
Create parameters
You use SQL Server Data Tools (SSDT) to create project parameters and package parameters. The following
procedures provide step-by-step instructions for creating package/project parameters.
NOTE: If you are converting a project that you created using an earlier version of Integration Services to the
project deployment model, you can use the Integration Ser vices Project Conversion Wizard to create
parameters based on configurations. For more information, see Deploy Integration Services (SSIS) Projects
and Packages.
Default value The default value for the parameter assigned at design
time. This is also known as the design default.
NOTE: When you deploy a project to the catalog, several more properties become associated with the
project. To see all properties for all parameters in the catalog, use the catalog.object_parameters
(SSISDB Database) view.
4. Save the project to save changes to parameters. Parameter values are stored in the project file.
WARNING: You can edit values in place in the list or use the Properties window to modify the values of
parameter properties. You can delete a parameter by using the Delete (X) toolbar button. Using the
last toolbar button, you can specify a value for a parameter that is used only when you execute the
package in SQL Server Data Tools.
NOTE: If you re-open the package file without opening the project in SQL Server Data Tools, the
Parameters tab will be empty and disabled.
4. Enter values for the Name , Data Type , Value , Sensitive , and Required properties.
Default value The default value for the parameter assigned at design
time. This is also known as the design default.
5. Save the project to save changes to parameters. Parameter values are stored in configurations in the project
file. Save the project file to commit to disk any changes in the parameter values.
WARNING: You can edit values in place in the list or use the Properties window to modify the values of
parameter properties. You can delete a parameter by using the Delete (X) toolbar button. Using the
last toolbar button to open the Manage Parameter Values dialog box, you can specify a value for a
parameter that is used only when you execute the package in SQL Server Data Tools.
Options
Property
Select the property of the task that you want to associate with a parameter. This list is populated with all the
properties that can be parameterized.
Use existing parameter
Select this option to associate the property of the task with an existing parameter, and then select the parameter from the
drop-down list.
Do not use parameter
Select this option to remove a reference to a parameter. The parameter is not deleted.
Create new parameter
Select this option to create a new parameter that you want to associate with the property of the task.
Name
Specify the name of the parameter you want to create.
Description
Specify the description for the parameter.
Value
Specify the default value for the parameter. This is also known as the design default, which can be overridden later
at the deployment time.
Scope
Specify the scope of the parameter by selecting either Project or Package option. Project parameters are used to
supply any external input the project receives to one or more packages in the project. Package parameters allow
you to modify package execution without having to edit and redeploy the package.
Sensitive
Specify whether the parameter is sensitive by checking or clearing the check box. Sensitive parameter values are
encrypted in the catalog and appear as a NULL value when viewed with Transact-SQL or SQL Server Management
Studio.
Required
Specify whether the parameter requires that a value, other than the design default, is specified before the package
can execute.
Related Content
Blog entry, SSIS Quick Tip: Required Parameters, on mattmasson.com.
Integration Services (SSIS) Connections
11/2/2020 • 16 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Microsoft SQL Server Integration Services packages use connections to perform different tasks and to
implement Integration Services features:
Connecting to source and destination data stores such as text, XML, Excel workbooks, and relational
databases to extract and load data.
Connecting to relational databases that contain reference data to perform exact or fuzzy lookups.
Connecting to relational databases to run SQL statements such as SELECT, DELETE, and INSERT
commands and also stored procedures.
Connecting to SQL Server to perform maintenance and transfer tasks such as backing up databases
and transferring logins.
Writing log entries to text and XML files and to SQL Server tables, and writing package configurations to SQL
Server tables.
Connecting to SQL Server to create temporary work tables that some transformations require to do
their work.
Connecting to Analysis Services projects and databases to access data mining models, process cubes
and dimensions, and run DDL code.
Specifying existing or creating new files and folders to use with Foreach Loop enumerators and tasks.
Connecting to message queues and to Windows Management Instrumentation (WMI), SQL Server
Management Objects (SMO), Web, and mail servers.
To make these connections, Integration Services uses connection managers, as described in the next section.
Connection Managers
Integration Services uses the connection manager as a logical representation of a connection. At design time,
you set the properties of a connection manager to describe the physical connection that Integration Services
creates when the package runs. For example, a connection manager includes the ConnectionString
property that you set at design time; at run time, a physical connection is created using the value in the
connection string property.
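For example, an OLE DB connection manager's ConnectionString property might hold a value along the following lines; the server, database, and provider shown here are illustrative and are not taken from this topic.
Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI11.1;Integrated Security=SSPI;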
A package can use multiple instances of a connection manager type, and you can set the properties on each
instance. At run time, each instance of a connection manager type creates a connection that has different
attributes.
SQL Server Integration Services provides different types of connection managers that enable packages to
connect to a variety of data sources and servers:
There are built-in connection managers that Setup installs when you install Integration Services.
There are connection managers that are available for download from the Microsoft website.
You can create your own custom connection manager if the existing connection managers do not meet
your needs.
Package level and project level connection managers
A connection manager can be created at the package level or at the project level. A connection manager
created at the project level is available to all the packages in the project, whereas a connection manager created
at the package level is available only to that specific package.
You use connection managers that are created at the project level in place of data sources, to share
connections to sources. To add a connection manager at the project level, the Integration Services project
must use the project deployment model. When a project is configured to use this model, the Connection
Managers folder appears in Solution Explorer , and the Data Sources folder is removed from Solution
Explorer .
NOTE
If you want to use data sources in your package, you need to convert the project to the package deployment model.
For more information about the two models, and about converting a project to the project deployment model, see
Deploy Integration Services (SSIS) Projects and Packages.
CACHE: Reads data from the data flow or from a cache file (.caw), and can save data to the cache file. See Cache
Connection Manager.
FLATFILE: Connects to data in a single flat file. See Flat File Connection Manager.
MULTIFILE: Connects to multiple files and folders. See Multiple Files Connection Manager.
MULTIFLATFILE: Connects to multiple data files and folders. See Multiple Flat Files Connection Manager.
IMPORTANT
The connection managers listed in the following table work only with Microsoft SQL Server 2012 Enterprise and
Microsoft SQL Server 2012 Developer.
ORACLE: Connects to an Oracle <version info> server. The Oracle connection manager is the connection manager
component of the Microsoft Connector for Oracle by Attunity. The Microsoft Connector for Oracle by Attunity also
includes a source and a destination. For more information, see the download page, Microsoft Connectors for Oracle
and Teradata by Attunity.
Analysis Services Connection Manager: Add Analysis Services Connection Manager Dialog Box UI Reference
Multiple Files Connection Manager: Add File Connection Manager Dialog Box UI Reference
Flat File Connection Manager: Flat File Connection Manager Editor (General Page)
Multiple Flat Files Connection Manager: Multiple Flat Files Connection Manager Editor (General Page)
SQL Server Compact Edition Connection Manager: SQL Server Compact Edition Connection Manager Editor (Connection Page)
The connection manager you added will show up under the Connection Managers node in the
Solution Explorer . It will also appear in the Connection Managers tab in the SSIS Designer
window for all the packages in the project. The name of the connection manager in this tab will have a
(project) prefix in order to differentiate this project level connection manager from the package level
connection managers.
4. Optionally, right-click the connection manager under the Connection Managers node in the Solution
Explorer window, or in the Connection Managers tab of the SSIS Designer window, click
Rename, and then modify the default name of the connection manager.
NOTE
In the Connection Managers tab of the SSIS Designer window, you won't be able to overwrite the
(project) prefix from the connection manager name. This is by design.
NOTE
You can also delete a project level connection manager from the Connection Manager tab of the SSIS
Designer window opened for any package in the project. You do so by right-clicking the connection manager
in the tab and then by clicking Delete .
Related Content
Video, Leverage Microsoft Attunity Connector for Oracle to enhance Package Performance, on
technet.microsoft.com
Wiki articles, SSIS Connectivity, on social.technet.microsoft.com
Blog entry, Connecting to MySQL from SSIS, on blogs.msdn.com.
Technical article, Extracting and Loading SharePoint Data in SQL Server Integration Services, on
msdn.microsoft.com.
Technical article, You get "DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER"
error message when using Oracle connection manager in SSIS, on support.microsoft.com.
Data Sources for Integration Services packages
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
SQL Server Data Tools (SSDT) includes a design-time object that you can use in Microsoft Integration Services
packages: the data source.
A data source object is a reference to a connection, and at a minimum, it includes a connection string and a data
source identifier. It can also include additional metadata such as a description, a name, a user name, and a password.
NOTE: You can add data sources only to projects that are configured to use the package deployment model. If a
project is configured to use the project deployment model, you use connection managers created at the project
level to share connections, in place of using data sources.
For more information about the deployment models, see Deployment of Projects and Packages. For more
information about converting a project to the project deployment model, see Deploy Projects to Integration
Services Server.
The advantages of using data sources in Integration Services packages include the following:
A data source has project scope, which means that a data source created in an Integration Services project is
available to all the packages in the project. A data source can be defined one time and then referenced by
connection managers in multiple packages.
A data source offers synchronization between the data source object and its package references. If the data
source and the packages that reference it reside in the same project, the connection string property of the
data source references is automatically updated when the data source changes.
ADO Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An ADO connection manager enables a package to connect to ActiveX Data Objects (ADO) objects, such as a
recordset. This connection manager is typically used in custom tasks written in an earlier version of a language,
such as Microsoft Visual Basic 6.0, or in custom tasks that are part of an existing application that uses ADO to
connect to a data source.
When you add an ADO connection manager to a package, Microsoft SQL Server Integration Services creates a
connection manager that will resolve to an ADO connection at run time, sets the connection manager properties,
and adds the connection manager to the Connections collection on the package. The
ConnectionManagerType property of the connection manager is set to ADO .
time, datetimeoffset: The package fails unless the package uses parameterized SQL commands. To use
parameterized SQL commands, use the Execute SQL Task in your package. For more information, see
Execute SQL Task and Parameters and Return Codes in the Execute SQL Task.
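For example, the Execute SQL Task uses question-mark parameter markers when it runs over an ADO connection manager. The following statement is only a sketch; the table and column names are hypothetical:

-- ? is mapped to a package variable on the Parameter Mapping page of the Execute SQL Task.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate > ?;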
NOTE
For more information about SQL Server data types and how they map to Integration Services data types, see Data Types
(Transact-SQL) and Integration Services Data Types.
See Also
Integration Services (SSIS) Connections
ADO.NET connection manager
11/2/2020 • 6 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An ADO.NET connection manager enables a package to access data sources by using a .NET provider. Typically,
you use this connection manager to access data sources such as Microsoft SQL Server. You can also access data
sources exposed through OLE DB and XML in custom tasks that are written in managed code, by using a
language such as C#.
When you add an ADO.NET connection manager to a package, SQL Server Integration Services creates a
connection manager that is resolved as an ADO.NET connection at runtime. It sets the connection manager
properties, and adds the connection manager to the Connections collection on the package.
The ConnectionManagerType property of the connection manager is set to ADO.NET . The value of
ConnectionManagerType is qualified to include the name of the .NET provider that the connection manager uses.
time, datetimeoffset: The package fails unless the package uses parameterized SQL commands. To use
parameterized SQL commands, use the Execute SQL Task in your package. For more information, see
Execute SQL Task and Parameters and Return Codes in the Execute SQL Task.
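For example, with an ADO.NET connection manager the Execute SQL Task uses named parameter markers. The statement below is only a sketch; the table, column, and parameter names are hypothetical:

-- @OrderDate is mapped to a package variable on the Parameter Mapping page of the Execute SQL Task.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate > @OrderDate;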
NOTE
For more information about SQL Server data types and how they map to Integration Services data types, see Data Types
(Transact-SQL) and Integration Services Data Types.
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
Configure ADO.NET connection manager
Use the Configure ADO.NET Connection Manager dialog box to add a connection to a data source that can
be accessed by using a .NET Framework data provider. For example, one such provider is the SqlClient provider.
The connection manager can use an existing connection, or you can create a new one.
To learn more about the ADO.NET connection manager, see ADO.NET Connection Manager.
Options
Data connections
Select an existing ADO.NET data connection from the list.
Data connection properties
View properties and values for the selected ADO.NET data connection.
New
Create an ADO.NET data connection by using the Connection Manager dialog box.
Delete
Select a connection, and then delete it by selecting Delete .
Managed identities for Azure resources authentication
When running SSIS packages on Azure-SSIS integration runtime in Azure Data Factory, you can use the
managed identity associated with your data factory for Azure SQL Database or Azure SQL Managed Instance
authentication. The designated factory can access and copy data from or to your database by using this identity.
NOTE
When you use Azure Active Directory (Azure AD) authentication (including managed identity authentication) to connect to
Azure SQL Database or Azure SQL Managed Instance, you might encounter a problem related to package execution
failure or unexpected behavior change. For more information, see Azure AD features and limitations.
To use managed identity authentication for Azure SQL Database, follow these steps to configure your database:
1. Provision an Azure Active Directory administrator for your Azure SQL server on the Azure portal, if you
haven't already done so. The Azure AD administrator can be an Azure AD user or an Azure AD group. If you
grant an admin role to a group that contains the managed identity, skip steps 2 and 3. The administrator will have full
access to the database.
2. Create contained database users for the data factory managed identity. Connect to the database from or
to which you want to copy data by using tools like SSMS, with an Azure AD identity that has at least
ALTER ANY USER permission. Run the following T-SQL:
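For reference, the contained-user statement this step refers to is the same one shown in the OLE DB connection manager section later in this document; substitute the name of your data factory:

-- Run in the target database, connected with an Azure AD identity that has at least ALTER ANY USER permission.
CREATE USER [your data factory name] FROM EXTERNAL PROVIDER;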
3. Grant the data factory managed identity needed permissions, as you normally do for SQL users and
others. Refer to Database-Level Roles for appropriate roles. Run the following code. For more options, see
this document.
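For reference, the grant this step refers to takes the form of the role-membership statement shown after these instructions; both bracketed values are placeholders:

ALTER ROLE [role name e.g., db_owner] ADD MEMBER [your data factory name];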
To use managed identity authentication for Azure SQL Managed Instance, follow these steps to configure your
database:
1. Provision an Azure Active Directory administrator for your managed instance on the Azure portal, if you
haven't already done so. The Azure AD administrator can be an Azure AD user or an Azure AD group. If you
grant an admin role to a group that contains the managed identity, skip steps 2 through 4. The administrator will have full
access to the database.
2. Create logins for the data factory managed identity. In SQL Server Management Studio (SSMS), connect
to your Managed Instance using a SQL Server account that is a sysadmin . In master database, run the
following T-SQL:
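The login-creation statement for this step is expected to follow the same pattern as the contained-user statement used elsewhere in this topic; treat the following as a sketch and substitute the name of your data factory:

-- Run in the master database of the managed instance.
CREATE LOGIN [your data factory name] FROM EXTERNAL PROVIDER;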
3. Create contained database users for the data factory managed identity. Connect to the database from or
to which you want to copy data, run the following T-SQL:
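A sketch of the contained-user statement for this step (substitute the name of your data factory); mapping the user to the login created in step 2 with FROM LOGIN is an equally valid alternative:

-- Run in the database from or to which you want to copy data.
CREATE USER [your data factory name] FROM EXTERNAL PROVIDER;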
4. Grant the data factory managed identity needed permissions as you normally do for SQL users and
others. Run the following code. For more options, see this document.
ALTER ROLE [role name e.g., db_owner] ADD MEMBER [your data factory name];
Finally, configure managed identity authentication for the ADO.NET connection manager. Here are the options to
do this:
Configure at design time. In SSIS Designer, right-click the ADO.NET connection manager, and select
Properties. Update the property ConnectUsingManagedIdentity to True.
NOTE
Currently, the connection manager property ConnectUsingManagedIdentity doesn't take effect (indicating that
managed identity authentication doesn't work) when you run SSIS package in SSIS Designer or Microsoft SQL
Server.
Configure at runtime. When you run the package via SQL Server Management Studio (SSMS) or Azure
Data Factory Execute SSIS Package activity, find the ADO.NET connection manager. Update its property
ConnectUsingManagedIdentity to True .
NOTE
In Azure-SSIS integration runtime, all other authentication methods (for example, integrated authentication and
password) preconfigured on the ADO.NET connection manager are overridden when managed identity
authentication is used to establish a database connection.
NOTE
To configure managed identity authentication on existing packages, the preferred way is to rebuild your SSIS project with
the latest SSIS Designer at least once. Redeploy that SSIS project to your Azure-SSIS integration runtime, so that the new
connection manager property ConnectUsingManagedIdentity is automatically added to all ADO.NET connection
managers in your SSIS project. The alternative way is to directly use a property override with property path
\Package.Connections[{the name of your connection manager}].Properties[ConnectUsingManagedIdentity] at runtime.
See also
Integration Services (SSIS) Connections
Analysis Services Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An SQL Server Analysis Services connection manager enables a package to connect to a server that runs an
Analysis Services database or to an Analysis Services project that provides access to cube and dimension data.
You can only connect to an Analysis Services project while developing packages in SQL Server Data Tools (SSDT).
At run time, packages connect to the server and the database to which you deployed the Analysis Services
project.
Tasks, such as the Analysis Services Execute DDL task and the Analysis Services Processing task, and
destinations, such as the Data Mining Model Training destination, use an Analysis Services connection manager.
For more information about Analysis Services databases, see Multidimensional Model Databases (SSAS).
NOTE
If you use SSIS in Azure Data Factory (ADF) and want to connect to an Azure Analysis Services (AAS) instance, you cannot
use an account with Multi-Factor Authentication (MFA) enabled; you must use either an account that does not require any
interactivity/MFA or a service principal instead. To use the latter, see here to create one and assign the server administrator
role to it, then select Use a specific user name and password to log on to the server in your connection manager,
and finally enter User name app:YourApplicationID and Password YourAuthorizationKey.
Indicate whether the connection that is created from the connection manager is retained at run time.
You can set properties through SSIS Designer or programmatically.
For more information about the properties that you can set in SSIS Designer, see the following topic:
Add Analysis Services Connection Manager Dialog Box UI Reference
For information about configuring a connection manager programmatically, see ConnectionManager and Adding
Connections Programmatically.
Add Analysis Services Connection Manager Dialog
Box UI Reference
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the Add Analysis Services Connection Manager dialog box to create a connection to a server running
SQL Server Analysis Services, or to edit connection properties.
To learn more about the Analysis Services connection manager, see Analysis Services Connection Manager.
Options
Create a connection to a computer running Analysis Services
Use the default connection to a server that is running an instance of Analysis Services, or create a new connection
by clicking Edit .
Edit
Use the Connection Manager dialog box to create a connection to a server that is running an instance of
Analysis Services, and to edit connection properties.
Create a connection to an Analysis Services project in this solution
Specify that the connection will use an Analysis Services project in the open solution.
NOTE
Analysis Services tabular model projects are not supported for this scenario.
See Also
Integration Services Error and Message Reference
Integration Services (SSIS) Connections
Azure Data Lake Analytics connection manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A SQL Server Integration Services (SSIS) package can use the Azure Data Lake Analytics connection manager to
connect to a Data Lake Analytics account with one of the two following authentication types:
Azure Active Directory (Azure AD) User Identity
Azure AD Service Identity
The Data Lake Analytics connection manager is a component of the SQL Server Integration Services (SSIS) Feature
Pack for Azure.
NOTE
When you select the Azure AD User Identity authentication option, multi-factor authentication and Microsoft
account authentication are not supported.
Azure Data Lake Store Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A SQL Server Integration Services (SSIS) package can use the Azure Data Lake Store Connection Manager to
connect to an Azure Data Lake Storage Gen1 account with one of the two following authentication types:
Azure AD User Identity
Azure AD Service Identity
The Azure Data Lake Store Connection Manager is a component of the SQL Server Integration Services (SSIS)
Feature Pack for Azure.
NOTE
To ensure that the Azure Data Lake Store Connection Manager and the components that use it - that is, the Data Lake
Storage Gen1 source and the Data Lake Storage Gen1 destination - can connect to services, make sure you download the
latest version of the Azure Feature Pack here.
3. In the Authentication field, choose the appropriate authentication type to access the data in Data Lake
Storage Gen1.
a. If you select the Azure AD User Identity authentication option, do the following things:
a. Provide values for the User Name and Password fields.
b. To test the connection, select Test Connection . If you or the tenant administrator didn't
previously consent to allow SSIS to access your Data Lake Storage Gen1 data, select Accept
when prompted. For more information about this consent experience, see Integrating
applications with Azure Active Directory.
NOTE
When you select the Azure AD User Identity authentication option, multi-factor authentication and
Microsoft account authentication are not supported.
b. If you select the Azure AD Ser vice Identity authentication option, do the following things:
a. Create an Azure Active Directory (AAD) application and service principal to access the Data
Lake Storage Gen1 data.
b. Assign appropriate permissions to let this AAD application access your Data Lake Storage
Gen1 resources. For more information about this authentication option, see Use portal to
create Active Directory application and service principal that can access resources.
c. Provide values for the Client Id , Secret Key , and Tenant Name fields.
d. To test the connection, select Test Connection .
4. Select OK to close the Azure Data Lake Store Connection Manager Editor dialog box.
Azure HDInsight Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Azure HDInsight Connection Manager enables an SSIS package to connect to an Azure HDInsight cluster.
The Azure HDInsight Connection Manager is a component of the SQL Server Integration Services (SSIS)
Feature Pack for Azure.
To create and configure an Azure HDInsight Connection Manager , follow the steps below:
1. In the Add SSIS Connection Manager dialog box, select AzureHDInsight , and click Add .
2. In the Azure HDInsight Connection Manager Editor dialog box, specify the Cluster DNS name (without
the protocol prefix), Username , and Password for the HDInsight cluster to connect to.
3. Click OK to close the dialog box.
4. You can see the properties of the connection manager you created in the Properties window.
Azure Resource Manager Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Azure Resource Manager Connection Manager enables an SSIS package to manage Azure resources
using a service principal.
The Azure Resource Manager Connection Manager is a component of the SQL Server Integration Services
(SSIS) Feature Pack for Azure.
To create and configure an Azure Resource Manager Connection Manager , follow the steps below:
1. In the Add SSIS Connection Manager dialog box, select AzureResourceManager , and click Add .
2. In the Azure Resource Manager Connection Manager Editor dialog box, specify the Application ID ,
Application Key , and Tenant ID for the service principal. For details about these properties, please refer to this
article.
3. Click OK to close the dialog box.
4. You can see the properties of the connection manager you created in the Properties window.
Azure Storage connection manager
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Azure Storage connection manager enables a SQL Server Integration Services (SSIS) package to connect to an
Azure Storage account. The connection manager is a component of the SQL Server Integration Services (SSIS)
Feature Pack for Azure.
In the Add SSIS Connection Manager dialog box, select AzureStorage > Add .
The following properties are available.
Service: Specifies the storage service to connect to.
Account name: Specifies the storage account name.
Authentication: Specifies the authentication method to use. AccessKey, ServicePrincipal, and
SharedAccessSignature authentication are supported.
AccessKey: For this authentication method, specify the Account key.
ServicePrincipal: For this authentication method, specify the Application ID, Application key, and
Tenant ID of the service principal. For Test Connection to work, the service principal should be
assigned at least the Storage Blob Data Reader role on the storage account. For more information, see
Grant access to Azure blob and queue data with RBAC in the Azure portal.
SharedAccessSignature: For this authentication method, specify at least the Token of the shared
access signature. To test the connection, additionally specify the resource scope to test against. It may be
Service, Container, or Blob. For Container and Blob, specify the container name and blob path,
respectively. For more information, see Azure Storage shared access signature overview.
Environment: Specifies the cloud environment hosting the storage account.
Configure at runtime. When you run the package via SQL Server Management Studio (SSMS) or Azure
Data Factory Execute SSIS Package activity, find the Azure Storage connection manager. Update its property
ConnectUsingManagedIdentity to True .
NOTE
In Azure-SSIS integration runtime, all other authentication methods (for example, access key and service principal)
preconfigured on the Azure Storage connection manager are overridden when managed identity authentication is
used for storage operations.
NOTE
To configure managed identity authentication on existing packages, the preferred way is to rebuild your SSIS project with the
latest SSIS Designer at least once. Redeploy that SSIS project to your Azure-SSIS integration runtime, so that the new
connection manager property ConnectUsingManagedIdentity is automatically added to all Azure Storage connection
managers in your SSIS project. The alternative way is to directly use a property override with property path
\Package.Connections[{the name of your connection manager}].Properties[ConnectUsingManagedIdentity] at
runtime.
See also
Integration Services (SSIS) Connections
Azure Subscription Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Azure Subscription connection manager enables an SSIS package to connect to an Azure subscription by
using the values you specify for the properties: Azure Subscription ID and Management Certificate.
The Azure Subscription connection manager is a component of the SQL Server Integration Services (SSIS)
Feature Pack for Azure.
1. In the Add SSIS Connection Manager dialog box, select Azure Subscription, and click Add. The
Azure Subscription Connection Manager Editor dialog box opens.
2. Enter your Azure subscription ID, which uniquely identifies an Azure subscription, for Azure
subscription ID. The value can be found in the Azure Management Portal on the Settings page.
3. Choose the Management certificate store location and Management certificate store name from the
drop-down lists.
4. Enter the Management certificate thumbprint, or click Browse... to choose a certificate from the
selected store. The certificate must be uploaded as a management certificate for the subscription. To do so,
click Upload on the corresponding page of the Azure Portal (see this MSDN post for more detail).
Excel Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An Excel connection manager enables a package to connect to a Microsoft Excel workbook file. The Excel source
and the Excel destination that Microsoft SQL Server Integration Services includes use the Excel connection
manager.
IMPORTANT
For detailed info about connecting to Excel files, and about limitations and known issues for loading data from or to Excel
files, see Load data from or to Excel with SQL Server Integration Services (SSIS).
When you add an Excel connection manager to a package, Integration Services creates a connection manager
that is resolved as an Excel connection at run time, sets the connection manager properties, and adds the
connection manager to the Connections collection on the package.
The ConnectionManagerType property of the connection manager is set to EXCEL .
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
For this solution to work reliably, you might have to also modify the registry settings. The main.cmd file is
as follows:
Save the file in CSV format and change the SSIS package to support a CSV import.
Related Tasks
Load data from or to Excel with SQL Server Integration Services (SSIS)
Excel Source
Excel Destination
File Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data
Factory
A File connection manager enables a package to reference an existing file or folder, or to create a file or
folder at run time. For example, you can reference an Excel file. Certain components in Microsoft SQL
Server Integration Services use information in files to perform their work. For example, an Execute SQL
task can reference a file that contains the SQL statements that the task executes. Other components
perform operations on files. For example, the File System task can reference a file to copy it to a new
location.
NOTE
You can set the ConnectionString property for the File connection manager by specifying an expression in the
Properties window of SQL Server Data Tools (SSDT). However, to avoid a validation error when you use an
expression to specify the file or folder, in the File Connection Manager Editor , for File/Folder , add a file or
folder path.
To learn more about the File connection manager, see File Connection Manager.
Options
Usage Type
Specify whether the File Connection Manager connects to an existing file or folder or creates a new
file or folder.
File / Folder
If File , specify the file to use.
If Folder , specify the folder to use.
Browse
Select the file or folder by using the Select File or Browse for Folder dialog box.
Add File Connection Manager Dialog Box UI
Reference
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the Add File Connection Manager dialog box to define a connection to a group of files or folders.
To learn more about the Multiple Files connection manager, see Multiple Files Connection Manager.
NOTE
The built-in tasks and data flow components in Integration Services do not use the Multiple Files connection manager.
However, you can use this connection manager in the Script task or Script component.
Options
Usage type
Specify the type of files to use for the multiple files connection manager.
Files / Folders
View the files or folders that you have added by using the buttons described as follows.
Add
Add a file by using the Select Files dialog box, or add a folder by using the Browse for Folder dialog box.
Edit
Select a file or folder, and then replace it with a different file or folder by using the Select Files or Browse for
Folder dialog box.
Remove
Select a file or folder, and then remove it from the list by using the Remove button.
Arrow buttons
Select a file or folder, and then use the arrow buttons to move it up or down to specify the sequence of access.
See Also
Integration Services Error and Message Reference
Suggest Column Types Dialog Box UI Reference
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Use the Suggest Column Types dialog box to identify the data type and length of columns in a Flat File
Connection Manager based on a sampling of the file content.
To learn more about the data types used by Integration Services, see Integration Services Data Types.
Options
Number of rows
Type or select the number of rows in the sample that the algorithm uses.
Suggest the smallest integer data type
Clear this check box to skip the assessment. If selected, determines the smallest possible integer data type for
columns that contain integral numeric data.
Suggest the smallest real data type
Clear this check box to skip the assessment. If selected, determines whether columns that contain real numeric data
can use the smaller real data type, DT_R4.
Identify Boolean columns using the following values
Type the two values that you want to use as the Boolean values true and false. The values must be separated by a
comma, and the first value represents True.
Pad string columns
Select this check box to enable string padding.
Percent padding
Type or select the percentage of the column lengths by which to increase the length of columns for character data
types. The percentage must be an integer.
See Also
Integration Services Error and Message Reference
Flat File Connection Manager
Flat File Connection Manager
11/2/2020 • 13 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A Flat File connection manager enables a package to access data in a flat file. For example, the Flat File
source and destination can use Flat File connection managers to extract and load data.
The Flat File connection manager can access only one file. To reference multiple files, use a Multiple Flat Files
connection manager instead of a Flat File connection manager. For more information, see Multiple Flat Files
Connection Manager.
Column Length
By default, the Flat File connection manager sets the length of string columns to 50 characters. In the Flat
File Connection Manager Editor dialog box, you can evaluate sample data and automatically resize the
length of these columns to prevent truncation of data or excess column width. Also, unless you subsequently
resize the column length in a Flat File source or a transformation, the column length of a string column
remains the same throughout the data flow. If these string columns map to destination columns that are
narrower, warnings appear in the user interface. Moreover, at run time, errors may occur due to data
truncation. To avoid errors or truncation, you can resize the columns to be compatible with the destination
columns in the Flat File connection manager, the Flat File source, or a transformation. To modify the length of
output columns, you set the Length property of the output column on the Input and Output Properties
tab in the Advanced Editor dialog box.
If you update column lengths in the Flat File connection manager after you have added and configured the
Flat File source that uses the connection manager, you do not have to manually resize the output columns in
the Flat File source. When you open the Flat File Source dialog box, the Flat File source provides an option
to synchronize the column metadata.
VALUE | DESCRIPTION
Ragged right | Ragged right files are files in which every column has a fixed width, except for the last column, which is delimited by the row delimiter.
Text qualifier
Specify the text qualifier to use. For example, you can specify that text fields are enclosed in quotation marks.
NOTE
After you select a text qualifier, you cannot re-select the None option. Type None to de-select the text qualifier.
Header row delimiter
Select from the list of delimiters for header rows, or enter the delimiter text.
VALUE | DESCRIPTION
Vertical bar {|} | The header row is delimited by a vertical bar.
Column delimiter
Select from the list of available column delimiters, or enter the delimiter text.
Refresh
View the effect of changing the delimiters to skip by clicking Refresh . This button only becomes visible after
you have changed other connection options.
Preview rows
View sample data in the flat file, divided into columns and rows by using the options selected.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
Format = Fixed Width
Font
Select the font in which to display the preview data.
Source data columns
Adjust the width of the row by sliding the vertical red row marker, and adjust the width of the columns by
clicking the ruler at the top of the preview window
Row width
Specify the length of the row before adding delimiters for individual columns. Or, drag the vertical red line in
the preview window to mark the end of the row. The row width value is automatically updated.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
Format = Ragged Right
NOTE
Ragged right files are files in which every column has a fixed width, except for the last column. It is delimited by the
row delimiter.
Font
Select the font in which to display the preview data.
Source data columns
Adjust the width of the row by sliding the vertical red row marker, and adjust the width of the columns by
clicking the ruler at the top of the preview window
Row delimiter
Select from the list of available row delimiters, or enter the delimiter text.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
DataType | Select from the list of available data types. For more information, see Integration Services Data Types.
TextQualified | True: Text data in the flat file is qualified. False: Text data in the flat file is NOT qualified.
New
Add a new column by clicking New . By default, the New button adds a new column at the end of the list. The
button also has the following options, available in the drop-down list.
Delete
Select a column, and then remove it by clicking Delete .
Suggest Types
Use the Suggest Column Types dialog box to evaluate sample data in the file and to obtain suggestions
for the data type and length of each column. For more information, see Suggest Column Types Dialog Box UI
Reference.
FTP Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An FTP connection manager enables a package to connect to a File Transfer Protocol (FTP) server. The FTP task
that SQL Server Integration Services includes uses this connection manager.
When you add an FTP connection manager to a package, Integration Services creates a connection manager that
can be resolved as an FTP connection at run time, sets the connection manager properties, and adds the
connection manager to the Connections collection on the package.
The ConnectionManagerType property of the connection manager is set to FTP .
You can configure the FTP connection manager in the following ways:
Specify a server name and server port.
Specify anonymous access, or provide a user name and a password for basic authentication.
IMPORTANT
The FTP connection manager supports only anonymous authentication and basic authentication. It does not
support Windows Authentication.
Set the time-out, number of retries, and the amount of data to copy at a time.
Indicate whether the FTP connection manager uses passive or active mode.
Depending on the configuration of the FTP site to which the FTP connection manager connects, you may need to
change the following default values of the connection manager:
The server port is set to 21. You should specify the port that the FTP site listens to.
The user name is set to "anonymous". You should provide the credentials that the FTP site requires.
Active/Passive Modes
An FTP connection manager can send and receive files using either active mode or passive mode. In active mode,
the server initiates the data connection, and in passive mode, the client initiates the data connection.
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
To learn more about the FTP connection manager, see FTP Connection Manager.
Options
Server name
Provide the name of the FTP server.
Server port
Specify the port number on the FTP server to use for the connection. The default value of this property is 21.
User name
Provide a user name to access the FTP server. The default value of this property is anonymous .
Password
Provide the password to access the FTP server.
Time-out (in seconds)
Specify the number of seconds the task takes before timing out. A value of 0 indicates an infinite amount of time.
The default value of this property is 60 .
Use passive mode
Specify whether the server or the client initiates the connection. The server initiates the connection in active
mode, and the client initiates the connection in passive mode. The default value of this property is active
mode.
Retries
Specify the number of times the task attempts to make a connection. A value of 0 indicates no limit to the
number of attempts.
Chunk size (in KB)
Provide a chunk size in kilobytes for transmitting data.
Test Connection
After configuring the FTP Connection Manager, confirm that the connection is viable by clicking Test
Connection .
See Also
FTP Task
Integration Services (SSIS) Connections
Hadoop Connection Manager
11/2/2020 • 4 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Hadoop Connection Manager enables a SQL Server Integration Services (SSIS) package to connect to a
Hadoop cluster, by using the values you specify for the properties.
C:> Ksetup
default realm = REALM.COM (external)
REALM.com:
kdc = <your_kdc_server_address>
Option 2: Enable mutual trust between the Windows domain and the Kerberos realm
Requirements:
The gateway computer must join a Windows domain.
You need permission to update the domain controller's settings.
How to configure:
NOTE
Replace REALM.COM and AD.COM in the following tutorial with your own respective realm and domain controller, as
needed.
[libdefaults]
default_realm = REALM.COM
dns_lookup_realm = false
dns_lookup_kdc = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
[realms]
REALM.COM = {
kdc = node.REALM.COM
admin_server = node.REALM.COM
}
AD.COM = {
kdc = windc.ad.com
admin_server = windc.ad.com
}
[domain_realm]
.REALM.COM = REALM.COM
REALM.COM = REALM.COM
.ad.com = AD.COM
ad.com = AD.COM
[capaths]
AD.COM = {
REALM.COM = .
}
2. Establish trust from the Windows domain to the Kerberos realm. In the following example, [password] is
the password for the principal krbtgt/REALM.COM@AD.COM.
C:> netdom trust REALM.COM /Domain: AD.COM /add /realm /password:[password]
d. Use the Ksetup command to specify the encryption algorithm to be used on the specific realm.
C:> ksetup /SetEncTypeAttr REALM.COM DES-CBC-CRC DES-CBC-MD5 RC4-HMAC-MD5 AES128-CTS-HMAC-SHA1-96
AES256-CTS-HMAC-SHA1-96
4. To use the Kerberos principal in the Windows domain, create the mapping between the domain account
and Kerberos principal.
a. Go to Administrative tools > Active Director y Users and Computers .
b. Configure advanced features by selecting View > Advanced Features .
c. Locate the account to which you want to create mappings, right-click to view Name Mappings , and
then select the Kerberos Names tab.
d. Add a principal from the realm.
See also
Hadoop Hive Task
Hadoop Pig Task
Hadoop File System Task
HTTP Connection Manager
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An HTTP connection enables a package to access a Web server by using HTTP to send or receive files. The Web
Service task that SQL Server Integration Services includes uses this connection manager.
When you add an HTTP connection manager to a package, Integration Services creates a connection manager
that will resolve to an HTTP connection at run time, sets the connection manager properties, and adds the
connection manager to the Connections collection on the package.
The ConnectionManagerType property of the connection manager is set to HTTP.
You can configure the HTTP connection manager the following ways:
Use credentials. If the connection manager uses credentials, its properties include the user name,
password, and domain.
IMPORTANT
The HTTP connection manager supports only anonymous authentication and basic authentication. It does not
support Windows Authentication.
Use a client certificate. If the connection manager uses a client certificate, its properties include the
certificate name.
Provide a time-out for connecting to the server and a chunk size for writing data.
Use a proxy server. The proxy server can also be configured to use credentials and to bypass the proxy
server and use local addresses instead.
IMPORTANT
The HTTP connection manager supports only anonymous authentication and basic authentication. It does not support
Windows Authentication.
To learn more about the HTTP connection manager, see HTTP Connection Manager. To learn more about a
common usage scenario for the HTTP Connection Manager, see Web Service Task.
Options
Server URL
Type the URL for the server.
If you plan to use the Download WSDL button on the General page of the Web Service Task Editor to
download a WSDL file, type the URL for the WSDL file. This URL ends with "?wsdl".
Use credentials
Specify whether you want the HTTP Connection Manager to use the user's security credentials for
authentication.
User name
If the HTTP Connection Manager uses credentials, you must specify a user name, password, and domain.
Password
If the HTTP Connection Manager uses credentials, you must specify a user name, password, and domain.
Domain
If the HTTP Connection Manager uses credentials, you must specify a user name, password, and domain.
Use client certificate
Specify whether you want the HTTP Connection Manager to use a client certificate for authentication.
Certificate
Select a certificate from the list by using the Select Certificate dialog box. The text box displays the name
associated with this certificate.
Time-out (in seconds)
Provide a time-out for connecting to the Web server. The default value of this property is 30 seconds.
Chunk size (in KB)
Provide a chunk size for writing data.
Test Connection
After configuring the HTTP Connection Manager, confirm that the connection is viable by clicking Test
Connection .
See Also
Web Service Task
Integration Services (SSIS) Connections
MSMQ Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An MSMQ connection manager enables a package to connect to a message queue that uses Message Queuing
(also known as MSMQ). The Message Queue task that Microsoft SQL Server Integration Services includes uses
an MSMQ connection manager.
When you add an MSMQ connection manager to a package, Integration Services creates a connection manager
that will resolve to an MSMQ connection at run time, sets the connection manager properties, and adds the
connection manager to the Connections collection on the package. The ConnectionManagerType property
of the connection manager is set to MSMQ .
You can configure an MSMQ connection manager in the following ways:
Provide a connection string.
Provide the path of the message queue to connect to.
The format of the path depends on the type of queue. For example, the path to a public queue has the form
<computer name>\<queue name>, and the path to a private queue has the form <computer name>\Private$\<queue name>.
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
NOTE
The MSMQ connection manager supports local public and private queues and remote public queues. It does not support
remote private queues. For a workaround that uses the Script Task, see Sending to a Remote Private Message Queue with
the Script Task.
Options
Name
Provide a unique name for the MSMQ connection manager in the workflow. The name provided will be
displayed within SSIS Designer.
Description
Describe the connection manager. As a best practice, describe the connection manager in terms of its purpose, to
make packages self-documenting and easier to maintain.
Path
Type the complete path of the message queue. The format of the path depends on the type of queue, as described
earlier in this topic.
See Also
Message Queue Task
Integration Services (SSIS) Connections
Multiple Files Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A Multiple Files connection manager enables a package to reference existing files and folders, or to create files
and folders at run time.
NOTE
The built-in tasks and data flow components in Integration Services do not use the Multiple Files connection manager.
However, you can use this connection manager in the Script task or Script component. For information about how to use
connection managers in the Script task, see Connecting to Data Sources in the Script Task. For information about how to
use connection managers in the Script component, see Connecting to Data Sources in the Script Component.
Multiple Flat Files Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A Multiple Flat Files connection manager enables a package to access data in multiple flat files. For example, a
Flat File source can use a Multiple Flat Files connection manager when the Data Flow task is inside a loop
container, such as the For Loop container. On each loop of the container, the Flat File source loads data from
the next file name that the Multiple Flat Files connection manager provides.
When you add a Multiple Flat Files connection manager to a package, SQL Server Integration Services creates
a connection manager that will resolve to a Multiple Flat Files connection at run time, sets the properties on
the Multiple Flat Files connection manager, and adds the Multiple Flat Files connection manager to the
Connections collection of the package.
The ConnectionManagerType property of the connection manager is set to MULTIFLATFILE.
You can configure the Multiple Flat Files connection manager in the following ways:
Specify the files, locale, and code page to use. The locale is used to interpret locale-sensitive data such
as dates, and the code page is used to convert string data to Unicode.
Specify the file format. You can use a delimited, fixed width, or ragged right format.
Specify a header row, data row, and column delimiters. Column delimiters can be set at the file level
and overwritten at the column level.
Indicate whether the first row in the files contains column names.
Specify a text qualifier character. Each column can be configured to recognize a text qualifier.
Set properties such as the name, data type, and maximum width on individual columns.
When the Multiple Flat Files connection manager references multiple files, the paths of the files are separated
by the pipe (|) character. The ConnectionString property of the connection manager has the following
format:
<path>|<path>
You can also specify multiple files by using wildcard characters. For example, to reference all the text files on
the C drive, the value of the ConnectionString property can be set to C:\*.txt.
If a Multiple Flat Files connection manager references multiple files, all the files must have the same format.
By default, the Multiple Flat Files connection manager sets the length of string columns to 50 characters. In the
Multiple Flat Files Connection Manager Editor dialog box, you can evaluate sample data and
automatically resize the length of these columns to prevent truncation of data or excess column width. Unless
you resize the column length in a Flat File source or a transformation, the column length remains the same
throughout the data flow. If these columns map to destination columns that are narrower, warnings appear in
the user interface, and at run time, errors may occur due to data truncation. You can resize the columns to be
compatible with the destination columns in the Flat File connection manager, the Flat File source, or a
transformation. To modify the length of output columns, you set the Length property of the output column
on the Input and Output Properties tab in the Advanced Editor dialog box.
If you update column lengths in the Multiple Flat Files connection manager after you have added and
configured the Flat File source that uses the connection manager, you do not have to manually resize the
output columns in the Flat File source. When you open the Flat File Source dialog box, the Flat File source
provides an option to synchronize the column metadata.
VALUE | DESCRIPTION
Ragged right | Ragged right files are files in which every column has a fixed width, except for the last column, which is delimited by the row delimiter, specified on the Columns page.
Text qualifier
Specify the text qualifier to use. For example, you can specify to enclose text with quotation marks.
Header row delimiter
Select from the list of delimiters for header rows, or enter the delimiter text.
VALUE | DESCRIPTION
Vertical bar {|} | The header row is delimited by a vertical bar.
Column delimiter
Select from the list of available column delimiters, or enter the delimiter text.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
Format = Fixed Width
Font
Select the font in which to display the preview data.
Source data columns
Adjust the width of the row by sliding the vertical row marker, and adjust the width of the columns by clicking
the ruler at the top of the preview window
Row width
Specify the length of the row before adding delimiters for individual columns. Or, drag the vertical line in the
preview window to mark the end of the row. The row width value is automatically updated.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
Format = Ragged Right
NOTE
Ragged right files are those in which every column has a fixed width, except for the last column. It is delimited by the
row delimiter.
Font
Select the font in which to display the preview data.
Source data columns
Adjust the width of the row by sliding the vertical row marker, and adjust the width of the columns by clicking
the ruler at the top of the preview window
Row delimiter
Select from the list of available row delimiters, or enter the delimiter text.
Reset Columns
Remove all but the original columns by clicking Reset Columns .
DataType | Select from the list of available data types. For more information, see Integration Services Data Types.
New
Add a new column by clicking New . By default, the New button adds a new column at the end of the list. The
button also has the following options, available in the dropdown list.
Delete
Select a column, and then remove it by clicking Delete .
Suggest Types
Use the Suggest Column Types dialog box to evaluate sample data in the first selected file and to obtain
suggestions for the data type and length of each column. For more information, see Suggest Column Types
Dialog Box UI Reference.
See Also
Flat File Source
Flat File Destination
Integration Services (SSIS) Connections
OData Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Connect to an OData data source with an OData connection manager. An OData Source component uses an OData
connection manager to connect to an OData data source and consume data from the service. For more info, see
OData Source.
The user name and password values that you enter in the editor are persisted in the package. The password value
is encrypted according to the package protection level.
There are several ways to parameterize the username and password values or to store them outside the package.
For example, you can use parameters, or set the connection manager properties directly when you run the package
from SQL Server Management Studio.
ODBC Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An ODBC connection manager enables a package to connect to a variety of database management systems using
the Open Database Connectivity specification (ODBC).
When you add an ODBC connection to a package and set the connection manager properties, SQL Server
Integration Services creates a connection manager and adds the connection manager to the Connections
collection of the package. At run time the connection manager is resolved as a physical ODBC connection.
The ConnectionManagerType property of the connection manager is set to ODBC .
You can configure the ODBC connection manager in the following ways:
Provide a connection string that references a user or system data source name.
Specify the server to connect to.
Indicate whether the connection is retained at run time.
For information about configuring a connection manager programmatically, see ConnectionManager and Adding
Connections Programmatically.
See Also
Integration Services (SSIS) Connections
OLE DB connection manager
11/2/2020 • 7 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An OLE DB connection manager enables a package to connect to a data source by using an OLE DB provider.
For example, an OLE DB connection manager that connects to SQL Server can use the Microsoft OLE DB
Provider for SQL Server.
NOTE
The SQL Server Native Client 11.0 OLE DB provider doesn't support the new connection string keywords
(MultiSubnetFailover=True) for multi-subnet failover clustering. For more information, see the SQL Server Release
Notes.
NOTE
If the data source is Microsoft Office Excel 2007 or Microsoft Office Access 2007, the data source requires a different
data provider than earlier versions of Excel or Access. For more information, see Connect to an Excel Workbook and
Connect to an Access Database.
Several SQL Server Integration Services tasks and data flow components use an OLE DB connection manager.
For example, the OLE DB source and OLE DB destination use this connection manager to extract and load
data. The Execute SQL task can use this connection manager to connect to a SQL Server database to run
queries.
You can also use the OLE DB connection manager to access OLE DB data sources in custom tasks written in
unmanaged code that uses a language such as C++.
When you add an OLE DB connection manager to a package, Integration Services creates a connection
manager that resolves to an OLE DB connection at runtime, sets the connection manager properties, and
adds the connection manager to the Connections collection on the package.
The ConnectionManagerType property of the connection manager is set to OLEDB .
Configure the OLE DB connection manager in the following ways:
Provide a specific connection string configured to meet the requirements of the selected provider.
Depending on the provider, include the name of the data source to connect to.
Provide security credentials as appropriate for the selected provider.
Indicate whether the connection created from the connection manager is retained at runtime.
NOTE
If the data source is Microsoft Office Excel 2007, the data source requires a different connection manager than earlier
versions of Excel. For more information, see Connect to an Excel Workbook.
If the data source is Microsoft Office Access 2007, the data source requires a different OLE DB provider than earlier
versions of Access. For more information, see Connect to an Access Database.
To learn more about the OLE DB connection manager, see OLE DB Connection Manager.
Options
Data connections
Select an existing OLE DB data connection from the list.
Data connection properties
View properties and values for the selected OLE DB data connection.
New
Create an OLE DB data connection by using the Connection Manager dialog box.
Delete
Select a data connection, and then delete it by selecting Delete .
Managed identities for Azure resources authentication
When running SSIS packages on Azure-SSIS integration runtime in Azure Data Factory, use the managed
identity associated with your data factory for Azure SQL Database or Managed Instance authentication. The
designated factory can access and copy data from or to your database by using this identity.
NOTE
When you use Azure Active Directory (Azure AD) authentication (including managed identity authentication) to
connect to Azure SQL Database or Managed Instance, you might encounter a problem related to package execution
failure or unexpected behavior change. For more information, see Azure AD features and limitations.
To use managed identity authentication for Azure SQL Database, follow these steps to configure your
database:
1. Provision an Azure Active Directory administrator for your Azure SQL server on the Azure portal, if
you haven't already done so. The Azure AD administrator can be an Azure AD user or Azure AD group.
If you grant the group that contains the managed identity an admin role, skip steps 2 and 3. The administrator will
have full access to the database.
2. Create contained database users for the data factory managed identity. Connect to the database from
or to which you want to copy data by using tools like SSMS, with an Azure AD identity that has at least
ALTER ANY USER permission. Run the following T-SQL:
CREATE USER [your data factory name] FROM EXTERNAL PROVIDER;
3. Grant the data factory managed identity needed permissions, as you normally do for SQL users and
others. Refer to Database-Level Roles for appropriate roles. Run the following code. For more options,
see this document.
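For example, a minimal sketch of such a statement (the role name and data factory name are placeholders):
-- assign an appropriate database role to the user created in step 2; db_owner is only an example
ALTER ROLE [db_owner] ADD MEMBER [your data factory name];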
To use managed identity authentication for Azure SQL Managed Instance, follow these steps to configure
your database:
1. Provision an Azure Active Directory administrator for your managed instance on the Azure portal, if
you haven't already done so. The Azure AD administrator can be an Azure AD user or Azure AD group.
If you grant the group that contains the managed identity an admin role, skip steps 2 through 4. The administrator
will have full access to the database.
2. Create logins for the data factory managed identity. In SQL Server Management Studio (SSMS),
connect to your Managed Instance using a SQL Server account that is a sysadmin . In master
database, run the following T-SQL:
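A minimal sketch of the statement this step refers to (substitute the name of your data factory):
-- run in the master database of the managed instance
CREATE LOGIN [your data factory name] FROM EXTERNAL PROVIDER;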
3. Create contained database users for the data factory managed identity. Connect to the database from
or to which you want to copy data, run the following T-SQL:
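A minimal sketch of the statement this step refers to (again, the data factory name is a placeholder):
-- run in the user database that the package reads from or writes to
CREATE USER [your data factory name] FROM EXTERNAL PROVIDER;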
4. Grant the data factory managed identity needed permissions as you normally do for SQL users and
others. Run the following code. For more options, see this document.
ALTER ROLE [role name e.g., db_owner] ADD MEMBER [your data factory name];
Then configure the OLE DB provider for the OLE DB connection manager. Here are the options to do this:
Configure at design time. In SSIS Designer, double-click the OLE DB connection manager to open
the Connection Manager window. In the Provider drop-down list, select Microsoft OLE DB
Driver for SQL Server.
NOTE
Other providers in the drop-down list might not support managed identity authentication.
Configure at runtime. When you run the package via SQL Server Management Studio (SSMS) or
Azure Data Factory Execute SSIS Package activity, find the connection manager property
ConnectionString for the OLE DB connection manager. Update the connection property Provider to
MSOLEDBSQL (that is, Microsoft OLE DB Driver for SQL Server).
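For illustration only, a connection string updated in this way might resemble the following sketch; the server and database names are placeholders:
Data Source=yourserver.database.windows.net;Initial Catalog=yourdatabase;Provider=MSOLEDBSQL;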
Finally, configure managed identity authentication for the OLE DB connection manager. Here are the options
to do this:
Configure at design time. In SSIS Designer, right-click the OLE DB connection manager, and select
Properties. Update the property ConnectUsingManagedIdentity to True.
NOTE
Currently, the connection manager property ConnectUsingManagedIdentity doesn't take effect (indicating
that managed identity authentication doesn't work) when you run the SSIS package in SSIS Designer or in Microsoft
SQL Server.
Configure at runtime. When you run the package via SSMS or an Execute SSIS Package activity,
find the OLE DB connection manager, and update its property ConnectUsingManagedIdentity to True .
NOTE
In Azure-SSIS integration runtime, all other authentication methods (for example, integrated security and
password) preconfigured on the OLE DB connection manager are overridden when managed identity
authentication is used to establish a database connection.
NOTE
To configure managed identity authentication on existing packages, the preferred way is to rebuild your SSIS project
with the latest SSIS Designer at least once. Redeploy that SSIS project to your Azure-SSIS integration runtime, so that
the new connection manager property ConnectUsingManagedIdentity is automatically added to all OLE DB
connection managers in your SSIS project. The alternative way is to directly use a property override with property path
\Package.Connections[{the name of your connection manager}].Properties[ConnectUsingManagedIdentity] at runtime.
See also
OLE DB Source
OLE DB Destination
Execute SQL Task
Integration Services (SSIS) Connections
SAP BW Connection Manager
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The SAP BW connection manager is the connection manager component of the Microsoft Connector 1.1 for SAP
BW. Thus, the SAP BW connection manager provides the connectivity to an SAP Netweaver BW version 7 system
that the source and destination components of the Microsoft Connector 1.1 for SAP BW need. (The SAP BW source
and destination that are part of the Microsoft Connector 1.1 for SAP BW package are the only Integration Services
components that use the SAP BW connection manager.)
IMPORTANT
The documentation for the Microsoft Connector 1.1 for SAP BW assumes familiarity with the SAP Netweaver BW
environment. For more information about SAP Netweaver BW, or for information about how to configure SAP Netweaver
BW objects and processes, see your SAP documentation.
When you add an SAP BW connection manager to a package, the ConnectionManagerType property of the
connection manager is set to SAPBI .
IMPORTANT
The SAP BW connection manager provides an option to log the RFC function calls that the SAP BW components make (see
the Log directory option described later in this topic). If the data that is transferred contains sensitive
information, the log files will also contain that sensitive information.
If you do not know all the values that are required to configure the connection manager, you might have to ask
your SAP administrator.
For a walkthrough that demonstrates how to configure and use the SAP BW connection manager, source, and
destination, see the white paper, Using SQL Server 2008 Integration Services with SAP BI 7.0. This white paper also
shows how to configure the required objects in SAP BW.
Using the SSIS Designer to Configure the Connection Manager
For more information about the properties of the SAP BW connection manager that you can set in SSIS Designer,
click the following topic:
SAP BW Connection Manager Editor
IMPORTANT
The documentation for the Microsoft Connector 1.1 for SAP BW assumes familiarity with the SAP Netweaver BW
environment. For more information about SAP Netweaver BW, or for information about how to configure SAP Netweaver
BW objects and processes, see your SAP documentation.
NOTE
If you do not know all the values that are required to configure the connection manager, you might have to ask your SAP
administrator.
Client
Specify the client number of the system.
Language
Specify the language that the system uses. For example, specify EN for English.
User name
Specify the user name that will be used to connect to the system.
Password
Specify the password that will be used with the user name.
Use single application server
Connect to a single application server.
To connect to a group of load-balanced servers, use the Use load balancing option instead.
Host
If connecting to a single application server, specify the host name.
NOTE
This option is only available if you have selected the Use single application server option.
System number
If connecting to a single application server, specify the system number.
NOTE
This option is only available if you have selected the Use single application server option.
Group
If connecting to a group of load-balanced servers, specify the name of the server group.
NOTE
This option is only available if you have selected the Use load balancing option.
SID
If connecting to a group of load-balanced servers, specify the System ID for the connection.
NOTE
This option is only available if you have selected the Use load balancing option.
Log directory
Enable logging for the components of the Microsoft Connector 1.1 for SAP BW.
To enable logging, specify a directory for the log files that are created before and after each RFC function call. (This
logging feature creates many log files in XML format. As these log files also contain all the rows of data that are
transferred, these log files may consume a large amount of disk space.)
IMPORTANT
If the data that is transferred contains sensitive information, the log files will also contain that sensitive information.
To specify the log directory, you can either enter the directory path manually, or click Browse and browse to the
log directory.
If you do not select a log directory, logging is not enabled.
Browse
Browse to select a folder for the log directory.
Test Connection
Test the connection using the values that you have provided. After clicking Test Connection , a message box
appears and indicates whether the connection succeeded or failed.
See Also
Microsoft Connector for SAP BW Components
SMTP Connection Manager
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An SMTP connection manager enables a package to connect to a Simple Mail Transfer Protocol (SMTP) server.
The Send Mail task that Microsoft SQL Server Integration Services includes uses an SMTP connection manager.
When using Microsoft Exchange as the SMTP server, you may need to configure the SMTP connection manager
to use Windows Authentication. Exchange servers may be configured to not allow unauthenticated SMTP
connections.
IMPORTANT
The SMTP connection manager supports only anonymous authentication and Windows Authentication. It does not
support basic authentication.
Specify whether to encrypt communication using Transport Layer Security (TLS), previously known as
Secure Sockets Layer (SSL), when sending e-mail messages.
You can set properties through SSIS Designer or programmatically.
For more information about the properties that you can set in SSIS Designer, see SMTP Connection Manager Editor.
For information about configuring a connection manager programmatically, see ConnectionManager and Adding
Connections Programmatically.
NOTE
When using Microsoft Exchange as the SMTP server, you may need to set Use Windows Authentication to True .
Exchange servers may be configured to disallow unauthenticated SMTP connections.
SMO Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
An SMO connection manager enables a package to connect to a SQL Management Object (SMO) server. The
transfer tasks that SQL Server Integration Services includes use an SMO connection manager. For example, the
Transfer Logins task that transfers SQL Server logins uses an SMO connection manager.
When you add an SMO connection manager to a package, Integration Services creates a connection manager
that will resolve to an SMO connection at run time, sets the connection manager properties, and adds the
connection manager to the Connections collection on the package. The ConnectionManagerType property
of the connection manager is set to SMOSer ver .
You can configure an SMO connection manager in the following ways:
Specify the name of a server on which SQL Server is installed.
Select the authentication mode for connecting to the server.
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
See Also
Integration Services (SSIS) Connections
SQL Server Compact Edition Connection Manager
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A SQL Server Compact connection manager enables a package to connect to a SQL Server Compact database.
The SQL Server Compact destination that Microsoft SQL Server Integration Services includes uses this
connection manager to load data into a table in a SQL Server Compact database.
NOTE
On a 64-bit computer, you must run packages that connect to SQL Server Compact data sources in 32-bit mode. The
SQL Server Compact provider that Integration Services uses to connect to SQL Server Compact data sources is available
only in a 32-bit version.
VALUE DESCRIPTION
Shared Read Specifies that other users can read from the database at the
same time.
WMI Connection Manager
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A WMI connection manager enables a package to use Windows Management Instrumentation (WMI) to manage
information in an enterprise environment. The WMI Data Reader task and the WMI Event Watcher task that Microsoft
SQL Server Integration Services includes use a WMI connection manager.
When you add a WMI connection manager to a package, Integration Services creates a connection manager that
will resolve to a WMI connection at run time, sets the connection manager properties, and adds the connection
manager to the Connections collection on the package. The ConnectionManagerType property of the
connection manager is set to WMI .
For information about configuring a connection manager programmatically, see ConnectionManager and
Adding Connections Programmatically.
See Also
Web Service Task
Integration Services (SSIS) Connections
Control Flow
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
A package consists of a control flow and, optionally, one or more data flows. SQL Server Integration Services
provides three different types of control flow elements: containers that provide structures in packages, tasks
that provide functionality, and precedence constraints that connect the executables, containers, and tasks into
an ordered control flow.
For more information, see Precedence Constraints, Integration Services Containers, and Integration Services
Tasks.
The following diagram shows a control flow that has one container and six tasks. Five of the tasks are defined
at the package level, and one task is defined at the container level. The task is inside a container.
The Integration Services architecture supports the nesting of containers, and a control flow can include
multiple levels of nested containers. For example, a package could contain a container such as a Foreach Loop
container, which in turn could contain another Foreach Loop container and so on.
Event handlers also have control flows, which are built using the same kinds of control flow elements.
Related Tasks
Add or Delete a Task or a Container in a Control Flow
Set the Properties of a Task or Container
Group or Ungroup Components
Add or Delete a Task or a Container in a Control
Flow
11/2/2020 • 4 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
When you are working in the control flow designer, the Toolbox in SSIS Designer lists the tasks that
Integration Services provides for building control flow in a package. For more information about the
Toolbox, see SSIS Toolbox.
A package can include multiple instances of the same task. Each instance of a task is uniquely identified in
the package, and you can configure each instance differently.
If you delete a task, the precedence constraints that connect the task to other tasks and containers in the
control flow are also deleted.
The following procedures describe how to add or delete a task or container in the control flow of a package.
NOTE
Most properties can be set by typing a value directly in the text box, or by selecting a value from a list.
However, some properties are more complex and have a custom property editor. To set the property, click in
the text box, and then click the build (...) button to open the custom editor.
6. Optionally, create property expressions to dynamically update the properties of the task or container.
For more information, see Add or Change a Property Expression.
7. To save the updated package, click Save Selected Items on the File menu.
Set the properties of a task or container with the task or container editor
1. In SQL Server Data Tools (SSDT), open the Integration Services project that contains the package you
want.
2. In Solution Explorer, double-click the package to open it.
3. Click the Control Flow tab.
4. On the design surface of the Control Flow tab, right-click the task or container, and then click Edit
to open the corresponding task or container editor.
For information about how to configure the For Loop container, see Configure a For Loop Container.
For information about how to configure the Foreach Loop container, see Configure a Foreach Loop
Container.
NOTE
The Sequence container has no custom editor.
5. If the task or container editor has multiple nodes, click the node that contains the property that you
want to set.
6. Optionally, click Expressions and, on the Expressions page, create property expressions to
dynamically update the properties of the task or container. For more information, see Add or Change
a Property Expression.
7. Update the property value.
8. To save the updated package, click Save Selected Items on the File menu.
See Also
Integration Services Tasks
Integration Services Containers
Control Flow
Integration Services Containers
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Containers are objects in SQL Server Integration Services that provide structure to packages and services to
tasks. They support repeating control flows in packages, and they group tasks and containers into meaningful
units of work. Containers can include other containers in addition to tasks.
Packages use containers for the following purposes:
Repeat tasks for each element in a collection, such as files in a folder, schemas, or SQL Server Management
Objects (SMO) objects. For example, a package can run Transact-SQL statements that reside in multiple
files.
Repeat tasks until a specified expression evaluates to false . For example, a package can send a different e-
mail message seven times, one time for every day of the week.
Group tasks and containers that must succeed or fail as a unit. For example, a package can group tasks that
delete and add rows in a database table, and then commit or roll back all the tasks when one fails.
Container Types
Integration Services provides four types of containers for building packages. The following table lists the
container types.
CONTAINER DESCRIPTION
Sequence Container Groups tasks and containers into control flows that are
subsets of the package control flow.
Packages and events handlers are also types of containers. For information see Integration Services (SSIS)
Packages and Integration Services (SSIS) Event Handlers.
Summary of Container Properties
All container types have a set of properties in common. If you create packages using the graphical tools that
Integration Services provides, the Properties window lists the following properties for the Foreach Loop, For
Loop, and Sequence containers. The task host container properties are configured as part of configuring the task
that the task host encapsulates. You set the Task Host properties when you configure the task.
IsolationLevel The isolation level of the container transaction. The values are
Unspecified, Chaos, ReadUncommitted, ReadCommitted, RepeatableRead, Serializable, and
Snapshot. The default value of this property is Serializable.
For more information, see IsolationLevel.
To learn about all the properties that are available to Foreach Loop, For Loop, Sequence, and Task Host containers
when you configure them programmatically, see the following Integration Services API topics:
Microsoft.SqlServer.Dts.Runtime.ForEachLoop
Microsoft.SqlServer.Dts.Runtime.ForLoop
Microsoft.SqlServer.Dts.Runtime.Sequence
Microsoft.SqlServer.Dts.Runtime.TaskHost
Break Points
When you set a breakpoint on a container and the break condition is Break when the container receives the
OnVariableValueChanged event, define the variable in the container scope.
See Also
Control Flow
Foreach Loop Container
11/2/2020 • 28 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Foreach Loop container defines a repeating control flow in a package. The loop implementation is similar to
Foreach looping structure in programming languages. In a package, looping is enabled by using a Foreach
enumerator. The Foreach Loop container repeats the control flow for each member of a specified enumerator.
SQL Server Integration Services provides the following enumerator types:
Foreach ADO enumerator to enumerate rows in tables. For example, you can get the rows in an ADO
recordset.
The Recordset destination saves data in memory in a recordset that is stored in a package variable of
Object data type. You typically use a Foreach Loop container with the Foreach ADO enumerator to
process one row of the recordset at a time. The variable specified for the Foreach ADO enumerator must
be of Object data type. For more information about the Recordset destination, see Use a Recordset
Destination.
Foreach ADO.NET Schema Rowset enumerator to enumerate the schema information about a data source.
For example, you can enumerate and get a list of the tables in the AdventureWorks2012 SQL Server
database.
Foreach File enumerator to enumerate files in a folder. The enumerator can traverse subfolders. For
example, you can read all the files that have the *.log file name extension in the Windows folder and its
subfolders. Note that the order in which the files are retrieved cannot be specified.
Foreach From Variable enumerator to enumerate the enumerable object that a specified variable contains.
The enumerable object can be an array, an ADO.NET DataTable , an Integration Services enumerator, and
so on. For example, you can enumerate the values of an array that contains the name of servers.
Foreach Item enumerator to enumerate items that are collections. For example, you can enumerate the
names of executables and working directories that an Execute Process task uses.
Foreach Nodelist enumerator to enumerate the result set of an XML Path Language (XPath) expression.
For example, this expression enumerates and gets a list of all the authors in the classical period:
/authors/author[@period='classical'] .
Foreach SMO enumerator to enumerate SQL Server Management Objects (SMO) objects. For example,
you can enumerate and get a list of the views in a SQL Server database.
Foreach HDFS File Enumerator to enumerate HDFS files in the specified HDFS location.
Foreach Azure Blob enumerator to enumerate blobs in a blob container in Azure Storage.
Foreach ADLS File enumerator to enumerate files in a directory in Azure Data Lake Store.
Foreach Data Lake Storage Gen2 File enumerator to enumerate files in a directory in Azure Data Lake
Storage Gen2.
The following diagram shows a Foreach Loop container that has a File System task. The Foreach loop uses the
Foreach File enumerator, and the File System task is configured to copy a file. If the folder that the enumerator
specifies contains four files, the loop repeats four times and copies four files.
You can use a combination of variables and property expressions to update the property of the package object
with the enumerator collection value. First you map the collection value to a user-defined variable, and then you
implement a property expression on the property that uses the variable. For example, the collection value of the
Foreach File enumerator is mapped to a variable called MyFile and the variable is then used in the property
expression for the Subject property of a Send Mail task. When the package runs, the Subject property is updated
with the name of a file each time that the loop repeats. For more information, see Use Property Expressions in
Packages.
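For instance, such a Subject property expression might look like the following sketch (the literal text is arbitrary):
"Processing file: " + @[User::MyFile]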
Variables that are mapped to the enumerator collection value can also be used in expressions and scripts.
A Foreach Loop container can include multiple tasks and containers, but it can use only one type of enumerator.
If the Foreach Loop container includes multiple tasks, you can map the enumerator collection value to multiple
properties of each task.
You can set a transaction attribute on the Foreach Loop container to define a transaction for a subset of the
package control flow. In this way, you can manage transactions at the level of the Foreach Loop instead of the
package level. For example, if a Foreach Loop container repeats a control flow that updates dimensions and fact
tables in a star schema, you can configure a transaction to ensure that all fact tables are updated successfully, or
none are updated. For more information, see Integration Services Transactions.
Enumerator Types
Enumerators are configurable, and you must provide different information, depending on the enumerator.
The following table summarizes the information each enumerator type requires.
ENUMERATOR CONFIGURATION REQUIREMENTS
Foreach ADO Specify the ADO object source variable and the enumerator
mode. The variable must be of Object data type.
Foreach ADO.NET Schema Rowset Specify the connection to a database and the schema to
enumerate.
Foreach File Specify a folder and the files to enumerate, the format of the
file name of the retrieved files, and whether to traverse
subfolders.
Foreach From Variable Specify the variable that contains the objects to enumerate.
Foreach Item Define the items in the Foreach Item collection, including
columns and column data types.
Foreach Nodelist Specify the source of the XML document and configure the
XPath operation.
Foreach SMO Specify the connection to a database and the SMO objects to
enumerate.
Foreach HDFS File Enumerator Specify a folder and the files to enumerate, the format of the
file name of the retrieved files, and whether to traverse
subfolders.
Foreach Azure Blob Specify the Azure blob container that contains the blobs to be
enumerated.
Foreach ADLS File Specify the Azure Data Lake Store directory that contains the
files to be enumerated.
Foreach Data Lake Storage Gen2 File Specify the Azure Data Lake Storage Gen2 directory that
contains the files to be enumerated, along with other
options.
NOTE
To add a new row, click anywhere outside the cell in which you typed.
NOTE
If a value is not compatible with the column data type, the text is highlighted.
To use the Foreach ADO enumerator, select an existing variable or click New variable in the ADO
object source variable list to specify the variable that contains the name of the ADO object to
enumerate, and select an enumeration mode option.
If creating a new variable, set the variable properties in the Add Variable dialog box.
To use the Foreach ADO.NET Schema Rowset enumerator, select an existing ADO.NET connection or
click New connection in the Connection list, and then select a schema.
Optionally, click Set Restrictions and select schema restrictions, select the variable that contains
the restriction value or type the restriction value, and click OK .
To use the Foreach From Variable enumerator, select a variable in the Variable list.
To use the Foreach NodeList enumerator, click DocumentSourceType and select the source type
from the list, and then click DocumentSource. Depending on the value selected for
DocumentSourceType, select a variable or a file connection from the list, create a new variable or
file connection, or type the XML source in the Document Source Editor .
Next, click EnumerationType and select an enumeration type from the list. If EnumerationType is
Navigator, Node, or NodeText , click OuterXPathStringSourceType and select the source type,
and then click OuterXPathString. Depending on the value set for OuterXPathStringSourceType,
select a variable or a file connection from the list, create a new variable or file connection, or type
the string for the outer XML Path Language (XPath) expression.
If EnumerationType is ElementCollection , set OuterXPathStringSourceType and OuterXPathString
as described above. Then, click InnerElementType and select an enumeration type for the inner
elements, and then click InnerXPathStringSourceType. Depending on the value set for
InnerXPathStringSourceType, select a variable or a file connection, create a new variable or file
connection, or type the string for the inner XPath expression.
To use the Foreach SMO enumerator, select an existing ADO.NET connection or click New
connection in the Connection list, and then either type the string to use or click Browse . If you
click Browse , in the Select SMO Enumeration dialog box, select the object type to enumerate
and the enumeration type, and click OK .
6. Optionally, click the browse button (...) in the Expressions text box on the Collection page to create
expressions that update property values. For more information, see Add or Change a Property Expression.
NOTE
The properties listed in the Proper ty list vary by enumerator.
7. Optionally, click Variable Mappings to map object properties to the collection value, and then do the
following things:
a. In the Variables list, select a variable or click <New Variable> to create a new variable.
b. If you add a new variable, set the variable properties in the Add Variable dialog box and click OK .
c. If you use the For Each Item enumerator, you can update the index value in the Index list.
NOTE
The index value indicates which column in the item to map to the variable. Only the For Each Item
enumerator can use an index value other than 0.
8. Optionally, click Expressions and, on the Expressions page, create property expressions for the
properties of the Foreach Loop container. For more information, see Add or Change a Property
Expression.
9. Click OK .
Options
Name
Provide a unique name for the Foreach Loop container. This name is used as the label in the task icon and in the
logs.
NOTE
Object names must be unique within a package.
Description
Type a description of the Foreach Loop container.
Static Options
Enumerator
Select the enumerator type from the list. This property has the options listed in the following table:
VALUE DESCRIPTION
Foreach File Enumerator Enumerate files. Selecting this value displays the dynamic
options in the section, Foreach File Enumerator .
Foreach Item Enumerator Enumerate values in an item. Selecting this value displays the
dynamic options in the section, Foreach Item
Enumerator .
Foreach ADO Enumerator Enumerate tables or rows in tables. Selecting this value
displays the dynamic options in the section, Foreach ADO
Enumerator .
Foreach ADO.NET Schema Rowset Enumerator Enumerate a schema. Selecting this value displays the
dynamic options in the section, Foreach ADO.NET
Enumerator .
Foreach From Variable Enumerator Enumerate the value in a variable. Selecting this value
displays the dynamic options in the section, Foreach From
Variable Enumerator .
Foreach Nodelist Enumerator Enumerate nodes in an XML document. Selecting this value
displays the dynamic options in the section, Foreach
Nodelist Enumerator .
Foreach SMO Enumerator Enumerate a SMO object. Selecting this value displays the
dynamic options in the section, Foreach SMO
Enumerator .
Foreach HDFS File Enumerator Enumerate HDFS files in the specified HDFS location.
Selecting this value displays the dynamic options in the
section, Foreach HDFS File Enumerator .
Foreach Azure Blob Enumerator Enumerate blob files in the specified blob location. Selecting
this value displays the dynamic options in the section,
Foreach Azure Blob Enumerator .
Foreach ADLS File Enumerator Enumerate files in the specified Data Lake Store directory.
Selecting this value displays the dynamic options in the
section, Foreach ADLS File Enumerator .
Foreach Data Lake Storage Gen2 File Enumerator Enumerate files in the specified Data Lake Storage Gen2
directory. Selecting this value displays the dynamic options in
the section, Foreach Data Lake Storage Gen2 File
Enumerator .
Expressions
Click or expand Expressions to view the list of existing property expressions. Click the ellipsis button (...) to add
a property expression for an enumerator property, or edit and evaluate an existing property expression.
Related Topics: Integration Services (SSIS) Expressions, Property Expressions Editor, Expression Builder
Enumerator Dynamic Options
Enumerator = Foreach File Enumerator
You use the Foreach File enumerator to enumerate files in a folder. For example, if the Foreach Loop includes an
Execute SQL task, you can use the Foreach File enumerator to enumerate files that contain SQL statements that
the Execute SQL task runs. The enumerator can be configured to include subfolders.
The content of the folders and subfolders that the Foreach File enumerator enumerates might change while the
loop is executing because external processes or tasks in the loop add, rename, or delete files while the loop is
executing. These changes may cause a number of unexpected situations:
If files are deleted, the actions of one task in the Foreach Loop may affect a different set of files than the
files used by subsequent tasks.
If files are renamed and an external process automatically adds files to replace the renamed files, the
actions of tasks in the Foreach Loop may affect the same files twice.
If files are added, it may be difficult to determine which files the Foreach Loop affected.
Folder
Provide the path of the root folder to enumerate.
Browse
Browse to locate the root folder.
Files
Specify the files to enumerate.
NOTE
Use wildcard characters (*) to specify the files to include in the collection. For example, to include files with names that
contain "abc", use the following filter: *abc*.
When you specify a file name extension, the enumerator also returns files that have the same extension with additional
characters appended. (This is the same behavior as that of the dir command in the operating system, which also compares
8.3 file names for backward compatibility.) This behavior of the enumerator could cause unexpected results. For
example, if you want to enumerate only Excel 2003 files and you specify "*.xls", the enumerator also returns Excel
2007 files because those files have the extension ".xlsx".
You can use an expression to specify the files to include in a collection, by expanding Expressions on the Collection
page, selecting the FileSpec property, and then clicking the ellipsis button (...) to add the property expression.
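For example (the variable name here is hypothetical), a FileSpec property expression could build the file filter from a variable:
"*" + @[User::FileNamePattern] + "*.txt"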
Fully qualified
Select to retrieve the fully qualified path of file names. If wildcard characters are specified in the Files option, then
the fully qualified paths that are returned match the filter.
Name only
Select to retrieve only the file names. If wildcard characters are specified in the Files option, then the file names
returned match the filter.
Name and extension
Select to retrieve the file names and their file name extensions. If wildcard characters are specified in the Files
option, then the name and extension of files returned match the filter.
Traverse Subfolders
Select to include the subfolders in the enumeration.
Enumerator = Foreach Item Enumerator
You use the Foreach Item enumerator to enumerate items in a collection. You define the items in the collection by
specifying columns and column values. The columns in a row define an item. For example, an item that specifies
the executables that an Execute Process task runs and the working directory that the task uses has two columns,
one that lists the names of executables and one that lists the working directory. The number of rows determines
the number of times that the loop is repeated. If the table has 10 rows, the loop repeats 10 times.
To update the properties of the Execute Process task, you map variables to item columns by using the index of
the column. The first column defined in the enumerator item has the index value 0, the second column 1, and so
on. The variable values are updated with each repeat of the loop. The Executable and WorkingDirectory
properties of the Execute Process task can then be updated by property expressions that use these variables.
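For example (a sketch; the variable names are hypothetical and assumed to be mapped to index 0 and index 1), the property expressions on the Execute Process task could simply reference the mapped variables:
Executable: @[User::ExecutablePath]
WorkingDirectory: @[User::WorkingFolder]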
Define the items in the For Each Item collection
Provide a value for each column in the table.
NOTE
A new row is automatically added to the table after you enter values in row columns.
NOTE
If the values provided are not compatible with the column data type, the text is colored red.
NOTE
The variable must have the Object data type, otherwise an error occurs.
IMPORTANT
The ADO.NET connection manager must use a .NET provider for OLE DB. If connecting to SQL Server, the recommended
provider to use is the SQL Server Native Client, listed in the .Net Providers for OleDb section of the Connection
Manager dialog box.
OuterXPathStringSourceType
Select the source type of the XPath string. This property has the options listed in the following table:
VALUE DESCRIPTION
OuterXPathString
If OuterXPathStringSourceType is set to Direct input , provide the XPath string.
If OuterXPathStringSourceType is set to File connection , select a File connection manager, or click <New
connection...> to create a new connection manager.
Related Topics: File Connection Manager, File Connection Manager Editor
If OuterXPathStringSourceType is set to Variable , select an existing variable, or click <New variable...> to
create a new variable.
Related Topics: Integration Services (SSIS) Variables, Add Variable.
InnerElementType
If EnumerationType is set to ElementCollection , select the type of inner element in the list.
InnerXPathStringSourceType
Select the source type of the inner XPath string. This property has the options listed in the following table:
VALUE DESCRIPTION
InnerXPathString
If InnerXPathStringSourceType is set to Direct input , provide the XPath string.
If InnerXPathStringSourceType is set to File connection , select a File connection manager, or click <New
connection...> to create a new connection manager.
Related Topics: File Connection Manager, File Connection Manager Editor
If InnerXPathStringSourceType is set to Variable , select an existing variable, or click <New variable...> to
create a new variable.
Related Topics: Integration Services (SSIS) Variables, Add Variable.
Enumerator = Foreach SMO Enumerator
You use the Foreach SMO enumerator to enumerate SQL Server Management Object (SMO) objects. For
example, if the Foreach Loop includes an Execute SQL task, you can use the Foreach SMO enumerator to
enumerate the tables in the AdventureWorks database and run queries that count the number of rows in each
table.
Connection
Select an existing ADO.NET connection manager, or click <New connection...> to create a new connection
manager.
Related Topics: ADO.NET Connection Manager, Configure ADO.NET Connection Manager
Enumerate
Specify the SMO object to enumerate.
Browse
Select the SMO enumeration.
Related Topics:
Select SMO Enumeration Dialog Box
The Microsoft SQL Server Integration Services tutorial, Creating a Simple ETL Package Tutorial, includes a lesson
that teaches you to add and configure a Foreach Loop.
Options
Variable
Select an existing variable, or click New variable... to create a new variable.
NOTE
After you map a variable, a new row is automatically added to the Variable list.
Delete
Select a variable, and then click Delete .
See Also
Control Flow
Integration Services Containers
Loop through Excel Files and Tables with a Foreach
Loop Container
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The procedures in this topic describe how to loop through the Excel workbooks in a folder, or through the tables in
an Excel workbook, by using the Foreach Loop container with the appropriate enumerator.
IMPORTANT
For detailed info about connecting to Excel files, and about limitations and known issues for loading data from or to Excel
files, see Load data from or to Excel with SQL Server Integration Services (SSIS).
If you do not use a variable for the Extended Properties argument, then you must add it manually to the
expression that contains the connection string.
3. Add a Foreach Loop container to the Control Flow tab. For information about how to configure the
Foreach Loop Container, see Configure a Foreach Loop Container.
4. On the Collection page of the Foreach Loop Editor , select the Foreach File enumerator, specify the folder
in which the Excel workbooks are located, and specify the file filter (ordinarily *.xlsx).
5. On the Variable Mapping page, map Index 0 to a user-defined string variable that will receive the current
Excel path and file name on each iteration of the loop. (The sample expression shown later in this procedure
uses the variable name ExcelFile .)
6. Close the Foreach Loop Editor .
7. Add an Excel connection manager to the package as described in Add, Delete, or Share a Connection
Manager in a Package. Select an existing Excel workbook file for the connection to avoid validation errors.
IMPORTANT
To avoid validation errors as you configure tasks and data flow components that use this Excel connection manager,
select an existing Excel workbook in the Excel Connection Manager Editor . The connection manager will not use
this workbook at run time after you configure an expression for the ConnectionString property as described in the
following steps. After you create and configure the package, you can clear the value of the ConnectionString
property in the Properties window. However, if you clear this value, the connection string property of the Excel
connection manager is no longer valid until the Foreach Loop runs. Therefore you must set the DelayValidation
property to True on the tasks in which the connection manager is used, or on the package, to avoid validation
errors.
You must also use the default value of False for the RetainSameConnection property of the Excel connection
manager. If you change this value to True , each iteration of the loop will continue to open the first Excel workbook.
8. Select the new Excel connection manager, click the Expressions property in the Properties window, and
then click the ellipsis.
9. In the Proper ty Expressions Editor , select the ConnectionString property, and then click the ellipsis.
10. In the Expression Builder, enter the following expression:
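The exact expression is not reproduced here, so the following is only a representative sketch. It assumes the ACE OLE DB provider, the ExcelFile variable mapped in step 5, and a second, hypothetical string variable named ExtProperties that holds the Extended Properties value:
"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + @[User::ExcelFile] + ";Extended Properties=" + @[User::ExtProperties]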
Note the use of the escape character "\" to escape the inner quotation marks required around the value of
the Extended Properties argument.
The Extended Properties argument is not optional. If you do not use a variable to contain its value, then you
must add it manually to the expression, as in the following example:
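A sketch of such a manually built expression (the Extended Properties value shown is only an example; adjust it to your workbook format):
"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + @[User::ExcelFile] + ";Extended Properties=\"Excel 12.0 XML\""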
11. Create tasks in the Foreach Loop container that use the Excel connection manager to perform the same
operations on each Excel workbook that matches the specified file location and pattern.
7. On the Variable Mappings page, map Index 2 to the string variable created earlier to hold the name of the
current table.
8. Close the Foreach Loop Editor .
9. Create tasks in the Foreach Loop container that use the Excel connection manager to perform the same
operations on each Excel table in the specified workbook. If you use a Script Task to examine the
enumerated table name or to work with each table, remember to add the string variable to the
ReadOnlyVariables property of the Script task.
See Also
Load data from or to Excel with SQL Server Integration Services (SSIS)
Configure a Foreach Loop Container
Add or Change a Property Expression
Excel Connection Manager
Excel Source
Excel Destination
Working with Excel Files with the Script Task
For Loop Container
11/2/2020 • 6 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The For Loop container defines a repeating control flow in a package. The loop implementation is similar to the
For looping structure in programming languages. In each repeat of the loop, the For Loop container evaluates an
expression and repeats its workflow until the expression evaluates to False .
The For Loop container uses the following elements to define the loop:
An optional initialization expression that assigns values to the loop counters.
An evaluation expression that contains the expression used to test whether the loop should stop or
continue.
An optional iteration expression that increments or decrements the loop counter.
The following diagram shows a For Loop container with a Send Mail task. If the initialization expression is
@Counter = 0 , the evaluation expression is @Counter < 4 , and the iteration expression is @Counter = @Counter + 1 ,
the loop repeats four times and sends four e-mail messages.
NOTE
The expression must evaluate to a Boolean. When the expression evaluates to false , the loop stops running.
NOTE
Object names must be unique within a package.
Description
Provide a description of the For Loop container.
See Also
Control Flow
Integration Services (SSIS) Expressions
Sequence Container
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Sequence container defines a control flow that is a subset of the package control flow. Sequence containers
group the package into multiple separate control flows, each containing one or more tasks and containers that run
within the overall package control flow.
The Sequence container can include multiple tasks in addition to other containers. Adding tasks and containers to a
Sequence container is similar to adding them to a package, except you drag the tasks and containers to the
Sequence container instead of to the package container. If the Sequence container includes more than one task or
container, you can connect them using precedence constraints just as you do in a package. For more information,
see Precedence Constraints.
There are many benefits of using a Sequence container:
Disabling groups of tasks to focus package debugging on one subset of the package control flow.
Managing properties on multiple tasks in one location by setting properties on a Sequence container
instead of on the individual tasks.
For example, you can set the Disable property of the Sequence container to True to disable all the tasks
and containers in the Sequence container.
Providing scope for variables that a group of related tasks and containers use.
Grouping many tasks so you can more easily manage them by collapsing and expanding the Sequence
container.
You can also create task groups, which expand and collapse using the Group box. However, the Group box
is a design-time feature that has no properties or run-time behavior. For more information, see Group or
Ungroup Components
Set a transaction attribute on the Sequence container to define a transaction for a subset of the package
control flow. In this way, you can manage transactions at a more granular level.
For example, if a Sequence container includes two related tasks, one task that deletes data in a table and
another task that inserts data into a table, you can configure a transaction to ensure that the delete action is
rolled back if the insert action fails. For more information, see Integration Services Transactions.
Related Tasks
For information about how to set properties of the component in the SQL Server Data Tools (SSDT), see Set the
Properties of a Task or Container.
See Also
Add or Delete a Task or a Container in a Control Flow
Connect Tasks and Containers by Using a Default Precedence Constraint
Integration Services Containers
Task Host Container
11/2/2020 • 2 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The task host container encapsulates a single task. In SSIS Designer, the task host is not configured separately;
instead, it is configured when you set the properties of the task it encapsulates. For more information about the
tasks that the task host containers encapsulate, see Integration Services Tasks.
This container extends the use of variables and event handlers to the task level. For more information, see
Integration Services (SSIS) Event Handlers and Integration Services (SSIS) Variables.
Related Tasks
Set the Properties of a Task or Container
See Also
Integration Services Containers
Integration Services Tasks
11/2/2020 • 3 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
Tasks are control flow elements that define units of work that are performed in a package control flow. A SQL
Server Integration Services package is made up of one or more tasks. If the package contains more than one
task, they are connected and sequenced in the control flow by precedence constraints.
You can also write custom tasks using a programming language that supports COM, such as Visual Basic, or a
.NET programming language, such as C#.
The SSIS Designer, the graphical tool in SQL Server Integration Services for working with packages, provides
the design surface for creating package control flow, and provides custom editors for configuring tasks. You
can also program the SQL Server Integration Services object model to create packages programmatically.
Types of Tasks
Integration Services includes the following types of tasks.
Data Flow Task
The task that runs data flows to extract data, apply column level transformations, and load data.
Data Preparation Tasks
These tasks do the following processes: copy files and directories; download files and data; run Web methods;
apply operations to XML documents; and profile data for cleansing.
Workflow Tasks
The tasks that communicate with other processes to run packages, run programs or batch files, send and
receive messages between packages, send e-mail messages, read Windows Management Instrumentation
(WMI) data, and watch for WMI events.
SQL Server Tasks
The tasks that access, copy, insert, delete, and modify SQL Server objects and data.
Scripting Tasks
The tasks that extend package functionality by using scripts.
Analysis Services Tasks
The tasks that create, modify, delete, and process Analysis Services objects.
Maintenance Tasks
The tasks that perform administrative functions such as backing up and shrinking SQL Server databases,
rebuilding and reorganizing indexes, and running SQL Server Agent jobs.
Custom Tasks
Additionally, you can write custom tasks using a programming language that supports COM, such as Visual
Basic, or a .NET programming language, such as C#. If you want to access your custom task in the SSIS
Designer, you can create and register a user interface for the task. For more information, see Developing a
Custom Task.
Configuration of Tasks
An Integration Services package can contain a single task, such as an Execute SQL task that deletes records in
a database table when the package runs. However, packages typically contain several tasks, and each task is
set to run within the context of the package control flow. Event handlers, which are workflows that run in
response to run-time events, can also have tasks.
For more information about adding a task to a package using SSIS Designer, see Add or Delete a Task or a
Container in a Control Flow.
For more information about adding a task to a package programmatically, see Adding Tasks Programmatically.
Each task can be configured individually using the custom dialog boxes for each task that SSIS Designer
provides, or the Properties window included in SQL Server Data Tools (SSDT). A package can include multiple
tasks of the same type (for example, six Execute SQL tasks), and each task can be configured differently. For
more information, see Set the Properties of a Task or Container.
Related Tasks
Add or Delete a Task or a Container in a Control Flow
Analysis Services Execute DDL Task
11/2/2020 • 5 minutes to read • Edit Online
Applies to: SQL Server (all supported versions) SSIS Integration Runtime in Azure Data Factory
The Analysis Services Execute DDL task runs data definition language (DDL) statements that can create, drop, or
alter mining models and multidimensional objects such as cubes and dimensions. For example, a DDL statement
can create a partition in the Adventure Works cube, or delete a dimension in Adventure Works DW
Multidimensional 2012, the sample Analysis Services database included in SQL Server.
The Analysis Services Execute DDL task uses an Analysis Services connection manager to connect to an instance of
Analysis Services or an Analysis Services project. For more information, see Analysis Services Connection
Manager.
Integration Services includes a number of tasks that perform business intelligence operations, such as processing
analytic objects and running data mining prediction queries.
For more information about related business intelligence tasks, click one of the following topics:
Analysis Services Processing Task
Data Mining Query Task
DDL Statements
The DDL statements are represented as statements in Analysis Services Scripting Language (ASSL), and framed in
an XML for Analysis (XMLA) command.
ASSL is used to define and describe an instance of Analysis Services and the databases and database
objects it contains. For more information, see Analysis Services Scripting Language (ASSL for XMLA).
XMLA is a command language that is used to send action commands, such as Create, Alter, or Process, to an
instance of Analysis Services. For more information, see XML for Analysis (XMLA) Reference.
If the DDL code is stored in a separate file, the Analysis Services Execute DDL task uses a File connection manager
to specify the path of the file. For more information, see File Connection Manager.
Because DDL statements can contain passwords and other sensitive information, a package that contains one or
more Analysis Services Execute DDL tasks should use the package protection level Encr yptAllWithUserKey or
Encr yptAllWithPassword . For more information, see Integration Services (SSIS) Packages.
DDL Examples
The following three DDL statements were generated by scripting objects in the Adventure Works DW
Multidimensional 2012, the Analysis Services database included in SQL Server.
The following DDL statement deletes the Promotion dimension.
<Delete xmlns="https://schemas.microsoft.com/analysisservices/2003/engine">
<Object>
<DatabaseID>Adventure Works DW Multidimensional 2012</DatabaseID>
<DimensionID>Dim Promotion</DimensionID>
</Object>
</Delete>
The following DDL statement processes the Adventure Works DW Multidimensional 2012 cube.
<Batch xmlns="https://schemas.microsoft.com/analysisservices/2003/engine">
<Parallel>
<Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-
instance">
<Object>
<DatabaseID>Adventure Works DW Multidimensional 2012</DatabaseID>
</Object>
<Type>ProcessFull</Type>
<WriteBackTableCreation>UseExisting</WriteBackTableCreation>
</Process>
</Parallel>
</Batch>
The following DDL statement creates the Forecasting mining model in the Forecasting mining structure of Adventure
Works DW Multidimensional 2012.
<Create xmlns="https://schemas.microsoft.com/analysisservices/2003/engine">
<ParentObject>
<DatabaseID>Adventure Works DW Multidimensional 2012</DatabaseID>
<MiningStructureID>Forecasting</MiningStructureID>
</ParentObject>
<ObjectDefinition>
<MiningModel xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-
instance">
<ID>Forecasting</ID>
<Name>Forecasting</Name>
<Algorithm>Microsoft_Time_Series</Algorithm>
<AlgorithmParameters>
<AlgorithmParameter>
<Name>PERIODICITY_HINT</Name>
<Value xsi:type="xsd:string">{12}</Value>
</AlgorithmParameter>
</AlgorithmParameters>
<Columns>
<Column>
<ID>Amount</ID>
<Name>Amount</Name>
<SourceColumnID>Amount</SourceColumnID>
<Usage>Predict</Usage>
</Column>
<Column>
<ID>Model Region</ID>
<Name>Model Region</Name>
<SourceColumnID>Model Region</SourceColumnID>
<Usage>Key</Usage>
</Column>
<Column>
<ID>Quantity</ID>
<Name>Quantity</Name>
<SourceColumnID>Quantity</SourceColumnID>
<Usage>Predict</Usage>
</Column>
<Column>
<ID>Time Index</ID>
<Name>Time Index</Name>
<SourceColumnID>Time Index</SourceColumnID>
<Usage>Key</Usage>
</Column>
</Columns>
<Collation>Latin1_General_CS_AS_KS</Collation>
</MiningModel>
</ObjectDefinition>
</Create>
Configuration of the Analysis Services Execute DDL Task
You can set properties through SSIS Designer or programmatically.
For more information about the properties that you can set in SSIS Designer, click the following topic:
Expressions Page
For more information about setting these properties in SSIS Designer, click the following topic:
Set the Properties of a Task or Container