Gradle User Manual

Version 8.12.1
Table of Contents
OVERVIEW
  Gradle User Manual
RELEASES
  Installing Gradle
  Compatibility Matrix
  The Feature Lifecycle
UPGRADING
  Upgrading your build from Gradle 8.x to the latest
RUNNING GRADLE BUILDS
CORE CONCEPTS
  Gradle Basics
  Gradle Wrapper Basics
  Command-Line Interface Basics
  Settings File Basics
  Build File Basics
  Dependency Management Basics
  Task Basics
  Plugin Basics
  Gradle Incremental Builds and Build Caching
  Build Scans
AUTHORING GRADLE BUILDS
CORE CONCEPTS
  Gradle Directories
  Multi-Project Build Basics
  Build Lifecycle
  Writing Settings Files
  Writing Build Scripts
  Using Tasks
  Writing Tasks
  Using Plugins
  Writing Plugins
GRADLE TYPES
  Understanding Properties and Providers
  Understanding Collections
  Understanding Services and Service Injection
STRUCTURING BUILDS
  Structuring Projects with Gradle
  Declaring Dependencies between Subprojects
  Sharing Build Logic between Subprojects
  Composite Builds
  Configuration On Demand
DEVELOPING TASKS
  Understanding Tasks
  Controlling Task Execution
  Organizing Tasks
  Configuring Tasks Lazily
  Developing Parallel Tasks
  Advanced Tasks
  Using Shared Build Services
DEVELOPING PLUGINS
  Understanding Plugins
  Understanding Implementation Options for Plugins
  Implementing Pre-compiled Script Plugins
  Implementing Binary Plugins
  Testing Gradle plugins
  Publishing Plugins to the Gradle Plugin Portal
OTHER TOPICS
  Working With Files
  Initialization Scripts
  Dataflow Actions
  Testing Build Logic with TestKit
  Using Ant from Gradle
OPTIMIZING BUILD PERFORMANCE
  Configuring the Build Environment
  Gradle-managed Directories
  Logging
  Improve the Performance of Gradle Builds
  Configuration cache
  Continuous Builds
  Inspecting Gradle Builds
  Isolated Projects
  File System Watching
THE BUILD CACHE
  Build Cache
  Use cases for the build cache
  Build cache performance
  Important concepts
  Caching Java projects
  Caching Android projects
  Debugging and diagnosing cache misses
  Solving common problems
DEPENDENCY MANAGEMENT
CORE CONCEPTS
  1. Declaring dependencies
  2. Dependency Configurations
  3. Declaring repositories
  4. Centralizing dependencies
  5. Dependency Constraints and Conflict Resolution
  6. Dependency Resolution
  7. Variant Aware Dependency Resolution
DECLARING DEPENDENCIES
  Declaring Dependencies Basics
  Viewing Dependencies
  Declaring Versions and Ranges
  Declaring Dependency Constraints
  Declaring Dependency Configurations
DECLARING REPOSITORIES
  Declaring Repositories Basics
  Centralizing Repository Declarations
  Repository Types
  Metadata Formats
  Supported Protocols
  Filtering Repository Content
CENTRALIZING DEPENDENCIES
  Platforms
  Version Catalogs
  Using Catalogs with Platforms
MANAGING DEPENDENCIES
  Locking Versions
  Using Resolution Rules
  Modifying Dependency Metadata
  Dependency Caching
UNDERSTANDING DEPENDENCY RESOLUTION
  Understanding the Dependency Resolution Model
  Capabilities
  Variants and Attributes
CONTROLLING DEPENDENCY RESOLUTION
  Dependency Resolution Basics
  Dependency Graph Resolution
  Artifact Resolution
  Artifact Transforms
PUBLISHING LIBRARIES
  Publishing a project as module
  Understanding Gradle Module Metadata
  Signing artifacts
  Customizing publishing
  The Maven Publish Plugin
  The Ivy Publish Plugin
OTHER TOPICS
  Verifying dependencies
  Aligning dependency versions
  Modeling library features
PLATFORMS
JVM BUILDS
  Building Java & JVM projects
  Testing in Java & JVM projects
  Managing Dependencies of JVM Projects
JAVA TOOLCHAINS
  Toolchains for JVM projects
  Toolchain Resolver Plugins
JVM PLUGINS
  The Java Library Plugin
  The Application Plugin
  The Java Platform Plugin
  The Groovy Plugin
  The Scala Plugin
INTEGRATION
  Gradle & Third-party Tools
REFERENCE
  Gradle Wrapper Reference
  Gradle Daemon
  Command-Line Interface Reference
GRADLE DSL/API
  A Groovy Build Script Primer
  Gradle Kotlin DSL Primer
CORE PLUGINS
  Gradle Plugin Reference
HOW TO GUIDES
  How to share outputs between projects
LICENSE INFORMATION
  License Information
OVERVIEW
Gradle User Manual
Gradle Build Tool

Gradle Build Tool is a fast, dependable, and adaptable open-source build automation tool with an
elegant and extensible declarative build language.

In this User Manual, Gradle Build Tool is abbreviated as Gradle.

Why Gradle?

Gradle is a widely used and mature tool with an active community and a strong developer
ecosystem.

• Gradle is the most popular build system for the JVM and is the default build system for Android
and Kotlin Multiplatform projects. It has a rich community plugin ecosystem.

• Gradle can automate a wide range of software build scenarios using either its built-in
functionality, third-party plugins, or custom build logic.

• Gradle provides a high-level, declarative, and expressive build language that makes it easy to
read and write build logic.

• Gradle is fast, scalable, and can build projects of any size and complexity.

• Gradle produces dependable results while benefiting from optimizations such as incremental
builds, build caching, and parallel execution.

Gradle, Inc. provides a free service called Build Scan® that provides extensive information and
insights about your builds. You can view scans to identify problems or share them for debugging
help.
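
For example, a Build Scan can be published for any build by adding the --scan flag; the invocation
below assumes a project that uses the Gradle Wrapper:

❯ ./gradlew build --scan

Gradle prints a link to the scan at the end of the build (on first use you are asked to accept the
Build Scan terms of service).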

Supported Languages and Frameworks

Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.

Compatible IDEs

All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.

You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your
continuous integration (CI) server.
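
For example, assuming a project in the current directory, typical CLI invocations look like this
(substitute ./gradlew for gradle in projects that use the Gradle Wrapper):

❯ gradle tasks   ① 
❯ gradle build   ②

① Lists the tasks available in the project.

② Runs the build task and everything it depends on.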

Releases

Information on Gradle releases and how to install Gradle is found on the Installation page.

User Manual

The Gradle User Manual is the official documentation for the Gradle Build Tool:

• Running Gradle Builds — Learn how to use Gradle with your project.

• Authoring Gradle Builds — Learn how to develop tasks and plugins to customize your build.

• Working with Dependencies — Learn how to add dependencies to your build.

• Authoring JVM Builds — Learn how to use Gradle with your Java project.

• Optimizing Builds — Learn how to use caches and other tools to optimize your build.

Education

• Training Courses — Head over to the courses page to sign up for free Gradle training.

Support

• Forum — The fastest way to get help is through the Gradle Forum.

• Slack — Community members and core contributors answer questions directly on our Slack
Channel.

Licenses

Gradle Build Tool source code is open and licensed under the Apache License 2.0. The Gradle User
Manual and DSL reference manual are licensed under the Creative Commons
Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright

Copyright © 2024 Gradle, Inc. All rights reserved. Gradle is a trademark of Gradle, Inc.

For inquiries related to commercial use or licensing, contact Gradle Inc. directly.
RELEASES
Installing Gradle
Gradle Installation

If all you want to do is run an existing Gradle project, then you don’t need to install Gradle if the
build uses the Gradle Wrapper. This is identifiable by the presence of the gradlew or gradlew.bat
files in the root of the project:

. ①
├── gradle
│ └── wrapper ②
├── gradlew ③
├── gradlew.bat ③
└── ⋮

① Project root directory.

② Gradle Wrapper.

③ Scripts for executing Gradle builds.

If the gradlew or gradlew.bat files are already present in your project, you do not need to install
Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.

You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version
for your project. Please use the Gradle Wrapper to upgrade Gradle.
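
With the Wrapper, an upgrade is a single command; the version shown here is only an example:

❯ ./gradlew wrapper --gradle-version 8.12.1

This updates gradle/wrapper/[Link] so that subsequent gradlew invocations
download and use the requested Gradle version.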

Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle
separately when only working within that IDE.

If you do not meet the criteria above and decide to install Gradle on your machine, first check
whether Gradle is already installed by running gradle -v in your terminal. If the command is not
found, Gradle is not installed, and you can follow the instructions below.

You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done
manually or using a package manager like SDKMAN! or Homebrew.

You can find all Gradle releases and their checksums on the releases page.

Prerequisites

Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or
higher to run. You can check the compatibility matrix for more information.

To check, run java -version:

❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)

Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your
project. In this example, the $PATH points to JDK17:

❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin

You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory.
This is especially useful when multiple JDKs are installed:

❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80

❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/[Link]/Contents/Home

Gradle supports Kotlin and Groovy as the main build languages. Gradle ships with its own Kotlin
and Groovy libraries, therefore they do not need to be installed. Existing installations are ignored
by Gradle.

See the full compatibility notes for Java, Groovy, Kotlin, and Android.

Linux installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:

❯ sdk install gradle

Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle
that is incompatible or incomplete when compared to the official version.

▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)
• Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle [Link]
❯ ls /opt/gradle/gradle-8.12.1
LICENSE NOTICE bin README init.d lib media

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/opt/gradle/gradle-8.12.1/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.

export GRADLE_HOME=/opt/gradle/gradle-8.12.1
export PATH=${GRADLE_HOME}/bin:${PATH}

macOS installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:

❯ sdk install gradle

Using Homebrew:

❯ brew install gradle

Using MacPorts:
❯ sudo port install gradle

Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc.

▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)

• Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /usr/local/gradle
❯ unzip [Link] -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.12.1
LICENSE NOTICE README bin init.d lib

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/usr/local/gradle/gradle-8.12.1/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.

It’s a good idea to edit .bash_profile in your home directory to add the GRADLE_HOME variable:

export GRADLE_HOME=/usr/local/gradle/gradle-8.12.1
export PATH=$GRADLE_HOME/bin:$PATH

Windows installation
▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)

• Complete (all) with docs and sources

We recommend downloading the bin file.

Step 2 - Unpack the distribution

Create a new directory C:\Gradle with File Explorer.

Open a second File Explorer window and go to the directory where the Gradle distribution was
downloaded. Double-click the ZIP archive to expose the content. Drag the content folder
gradle-8.12.1 to your newly created C:\Gradle folder.

Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of
your choice.

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be in your Path.

In File Explorer right-click on the This PC (or Computer) icon, then click Properties → Advanced
System Settings → Environmental Variables.

Under System Variables select Path, then click Edit. Add an entry for C:\Gradle\gradle-8.12.1\bin.
Click OK to save.

Alternatively, you can add the environment variable GRADLE_HOME and point this to the unzipped
distribution. Instead of adding a specific version of Gradle to your Path, you can add
%GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the
GRADLE_HOME environment variable.

Verify the installation

Open a console (or a Windows command prompt) and run gradle -v to display the installed Gradle
version, e.g.:

❯ gradle -v

------------------------------------------------------------
Gradle 8.12.1
------------------------------------------------------------

Build time: 2024-06-17 [Link] UTC
Revision: 6028379bb5a8512d0b2c1be6403543b79825ef08
Kotlin: 1.9.23
Groovy: 3.0.21
Ant: Apache Ant(TM) version 1.10.13 compiled on January 4 2023
Launcher JVM: 11.0.23 (Eclipse Adoptium 11.0.23+9)
Daemon JVM: /Library/Java/JavaVirtualMachines/[Link]/Contents/Home (no JDK
specified, using current Java home)
OS: Mac OS X 14.5 aarch64

You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available
from the releases page) and following these verification instructions.
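The check itself can be sketched as follows. This demonstration uses a stand-in file and Linux's sha256sum (on macOS, use shasum -a 256 instead); in practice, compare the real distribution ZIP against the .sha256 file downloaded from the releases page:

```shell
# Demonstration with a stand-in file; in practice, use the real
# gradle-8.12.1-bin.zip and the published .sha256 checksum.
cd "$(mktemp -d)"
printf 'demo distribution' > gradle-8.12.1-bin.zip
sha256sum gradle-8.12.1-bin.zip | awk '{print $1}' > published.sha256

# Verification: recompute the hash and compare it with the published value.
# Note the two spaces between the hash and the file name.
echo "$(cat published.sha256)  gradle-8.12.1-bin.zip" | sha256sum -c -
```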

Compatibility Matrix
The sections below describe Gradle’s compatibility with several integrations. Versions not listed
here may or may not work.

Java Runtime

Gradle runs on the Java Virtual Machine (JVM), which is often provided by either a JDK or JRE. A
JVM version between 8 and 23 is required to execute Gradle. JVM 24 and later versions are not yet
supported.

Executing the Gradle daemon with JVM 16 or earlier has been deprecated and will become an error
in Gradle 9.0. The Gradle wrapper, Gradle client, Tooling API client, and TestKit client will remain
compatible with JVM 8.

JDK 6 and 7 can be used for compilation. Testing with JVM 6 and 7 is deprecated and will not be
supported in Gradle 9.0.

Any fully supported version of Java can be used for compilation or testing. However, the latest Java
version may only be supported for compilation or testing, not for running Gradle. Support is
achieved using toolchains and applies to all tasks supporting toolchains.
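For example, a hedged sketch of a toolchain declaration in a Kotlin DSL build script (the JDK version is illustrative, not a recommendation):

```kotlin
// build.gradle.kts — Gradle itself runs on whatever JVM launched it, while
// compilation and test tasks use the declared toolchain JDK.
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(21)
    }
}
```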

See the table below for the Java version supported by a specific Gradle release:

Table 1. Java Compatibility

Java version   Support for toolchains   Support for running Gradle
8              N/A                      2.0
9              N/A                      4.3
10             N/A                      4.7
11             N/A                      5.0
12             N/A                      5.4
13             N/A                      6.0
14             N/A                      6.3
15             6.7                      6.7
16             7.0                      7.0
17             7.3                      7.3
18             7.5                      7.5
19             7.6                      7.6
20             8.1                      8.3
21             8.4                      8.5
22             8.7                      8.8
23             8.10                     8.10
24             N/A                      N/A

Kotlin

Gradle is tested with Kotlin 1.6.10 through 2.1.0-Beta2. Beta and RC versions may or may not work.

Table 2. Embedded Kotlin version

Embedded Kotlin version   Minimum Gradle version   Kotlin Language version
1.3.10                    5.0                      1.3
1.3.11                    5.1                      1.3
1.3.20                    5.2                      1.3
1.3.21                    5.3                      1.3
1.3.31                    5.5                      1.3
1.3.41                    5.6                      1.3
1.3.50                    6.0                      1.3
1.3.61                    6.1                      1.3
1.3.70                    6.3                      1.3
1.3.71                    6.4                      1.3
1.3.72                    6.5                      1.3
1.4.20                    6.8                      1.3
1.4.31                    7.0                      1.4
1.5.21                    7.2                      1.4
1.5.31                    7.3                      1.4
1.6.21                    7.5                      1.4
1.7.10                    7.6                      1.4
1.8.10                    8.0                      1.8
1.8.20                    8.2                      1.8
1.9.0                     8.3                      1.8
1.9.10                    8.4                      1.8
1.9.20                    8.5                      1.8
1.9.22                    8.7                      1.8
1.9.23                    8.9                      1.8
1.9.24                    8.10                     1.8
2.0.20                    8.11                     1.8
2.0.21                    8.12                     1.8

Groovy

Gradle is tested with Groovy 1.5.8 through 4.0.0.

Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy
DSL build scripts.

Android

Gradle is tested with Android Gradle Plugin 7.3 through 8.7. Alpha and beta versions may or may
not work.

The Feature Lifecycle


Gradle is under constant development. New versions are delivered on a regular and frequent basis
(approximately every six weeks) as described in the section on end-of-life support.

Continuous improvement combined with frequent delivery allows new features to be available to
users early. Early users provide invaluable feedback, which is incorporated into the development
process.

Getting new functionality into the hands of users regularly is a core value of the Gradle platform.

At the same time, API and feature stability are taken very seriously and considered a core value of
the Gradle platform. Design choices and automated testing are engineered into the development
process and formalized by the section on backward compatibility.

The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of
Gradle what the state of a feature is. The term feature typically means an API or DSL method or
property in this context, but it is not restricted to this definition. Command line arguments and
modes of execution (e.g. the Build Daemon) are two examples of other features.

Feature States

Features can be in one of four states:


1. Internal

2. Incubating

3. Public

4. Deprecated

1. Internal

Internal features are not designed for public use and are only intended to be used by Gradle itself.
They can change in any way at any point in time without any notice. Therefore, we recommend
avoiding the use of such features. Internal features are not documented. If it appears in this User
Manual, the DSL Reference, or the API Reference, then the feature is not internal.

Internal features may evolve into public features.

2. Incubating

Features are introduced in the incubating state to allow real-world feedback to be incorporated into
the feature before making it public. It also gives users willing to test potential future changes early
access.

A feature in an incubating state may change in future Gradle versions until it is no longer
incubating. Changes to incubating features for a Gradle release will be highlighted in the release
notes for that release. The incubation period for new features varies depending on the feature’s
scope, complexity, and nature.

Features in incubation are clearly indicated. In the source code, all methods, properties, and
classes that are incubating are annotated with @Incubating. This results in a special mark for
them in the DSL and API references.

If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the
incubating state.

Feature Preview API

The feature preview API allows certain incubating features to be activated by adding
enableFeaturePreview('FEATURE') in your settings file. Individual preview features will be
announced in release notes.

When incubating features are either promoted to public or removed, the feature preview flags for
them become obsolete, have no effect, and should be removed from the settings file.
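For illustration, a preview flag is activated in the settings file like this (STABLE_CONFIGURATION_CACHE is just an example of a flag name that has existed; consult the release notes for the flags available in your Gradle version):

```kotlin
// settings.gradle.kts — the flag name is an example; check the release notes
// for currently available preview features.
enableFeaturePreview("STABLE_CONFIGURATION_CACHE")
```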

3. Public

The default state for a non-internal feature is public. Anything documented in the User Manual, DSL
Reference, or API reference that is not explicitly said to be incubating or deprecated is considered
public. Features are said to be promoted from an incubating state to public. The release notes for
each release indicate which previously incubating features are being promoted by the release.

A public feature will never be removed or intentionally changed without undergoing deprecation.
All public features are subject to the backward compatibility policy.

4. Deprecated

Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such
features will eventually be removed from Gradle after being deprecated. A deprecated feature may
become stale until it is finally removed according to the backward compatibility policy.

Deprecated features are indicated to be so. In the source code, all methods/properties/classes that
are deprecated are annotated with “@[Link]” which is reflected in the DSL and API
References. In most cases, there is a replacement for the deprecated element, which will be
described in the documentation. Using a deprecated feature will result in a runtime warning in
Gradle’s output.

The use of deprecated features should be avoided. The release notes for each release indicate any
features being deprecated by the release.

Backward compatibility policy

Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.). Once a public
feature is introduced in a Gradle release, it will remain indefinitely unless deprecated. Once
deprecated, it may be removed in the next major release. Deprecated features may be supported
across major releases, but this is not guaranteed.

Release end-of-life Policy

Every day, a new nightly build of Gradle is created.

This contains all of the changes made through Gradle’s extensive continuous integration tests
during that day. Nightly builds may contain new changes that may or may not be stable.

The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or
major release. When no problems are found after a short time (usually a week), the release
candidate is promoted to a general availability (GA) release. If a regression is found in the release
candidate, a new RC distribution is created, and the process repeats. Release candidates are
supported for as long as the release window is open, but they are not intended to be used for
production. Bug reports are greatly appreciated during the RC phase.

The Gradle team may create additional patch releases to replace the final release due to critical bug
fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.

Once a release candidate has been made, all feature development moves on to the next release for
the latest major version. As such, each minor Gradle release causes the previous minor releases in
the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or
feature backports.

For major versions, Gradle will backport critical fixes and security fixes to the last minor in the
previous major version. For example, when Gradle 7 was the latest major version, several releases
were made in the 6.x line, including Gradle 6.9 (and subsequent releases).
As such, each major Gradle release causes:

• The previous major version to become maintenance only. It will only receive critical bug
fixes and security fixes.

• The major version before the previous one to become end-of-life (EOL); that release line
will not receive any new fixes.

UPGRADING
Upgrading your build from Gradle 8.x to the latest
This chapter provides the information you need to migrate your Gradle 8.x builds to the latest
Gradle release. For migrating from Gradle 4.x, 5.x, 6.x, or 7.x, see the older migration guide first.

We recommend the following steps for all users:

1. Try running gradle help --scan and view the deprecations view of the generated build scan.

This lets you see any deprecation warnings that apply to your build.

Alternatively, you can run gradle help --warning-mode=all to see the deprecations in the
console, though it may not report as much detailed information.

2. Update your plugins.

Some plugins will break with this new version of Gradle because they use internal APIs that
have been removed or changed. The previous step will help you identify potential problems by
issuing deprecation warnings when a plugin tries to use a deprecated part of the API.

3. Run gradle wrapper --gradle-version 8.12.1 to update the project to 8.12.1.

4. Try to run the project and debug any errors using the Troubleshooting Guide.

Upgrading from 8.11 and earlier

Potential breaking changes

Upgrade to Kotlin 2.0.21

The embedded Kotlin has been updated from 2.0.20 to Kotlin 2.0.21.
Upgrade to Ant 1.10.15

Ant has been updated to Ant 1.10.15.

Upgrade to Zinc 1.10.4

Zinc has been updated to 1.10.4.

Swift SDK discovery

To determine the location of the Mac OS X SDK for Swift, Gradle now passes the --sdk macosx
arguments to xcrun. This is necessary because the SDK could be discovered inconsistently without
this argument across different environments.

Source level deprecation of [Link] methods

Eager task creation methods on the TaskContainer interface have been marked @Deprecated and will
generate compiler and IDE warnings when used in build scripts or plugin code. There is not yet a
Gradle deprecation warning emitted for their use.

However, if the build is configured to fail on warnings during Kotlin script or plugin code
compilation, this behavior may cause the build to fail.

A standard Gradle deprecation warning will be printed upon use when these methods are fully
deprecated in a future version.

Deprecations

Deprecated Ambiguous Transformation Chains

Previously, when at least two equal-length chains of artifact transforms were available that would
produce compatible variants that would each satisfy a resolution request, Gradle would arbitrarily,
and silently, pick one.

Now, Gradle emits a deprecation warning that explains this situation:

There are multiple distinct artifact transformation chains of the same length that
would satisfy this request. This behavior has been deprecated. This will fail with an
error in Gradle 9.0.
Found multiple transformation chains that produce a variant of 'root project :' with
requested attributes:
- color 'red'
- texture 'smooth'
Found the following transformation chains:
- From configuration ':squareBlueSmoothElements':
- With source attributes:
- artifactType 'txt'
- color 'blue'
- shape 'square'
- texture 'smooth'
- Candidate transformation chains:
- Transformation chain: 'ColorTransform':
- 'BrokenColorTransform':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
- Transformation chain: 'ColorTransform2':
- 'BrokenColorTransform2':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
Remove one or more registered transforms, or add additional attributes to them to
ensure only a single valid transformation chain exists.

In such a scenario, Gradle has no way to know which of the two (or more) possible transformation
chains should be used. Picking an arbitrary chain can lead to inefficient performance or
unexpected behavior changes when seemingly unrelated parts of the build are modified. This is
potentially a very complex situation and the message now fully explains the situation by printing
all the registered transforms in order, along with their source (input) variants for each candidate
chain.

When encountering this type of failure, build authors should either:

1. Add additional, distinguishing attributes when registering transforms present in the chain, to
ensure that only a single chain will be selectable to satisfy the request

2. Request additional attributes to disambiguate which chain is selected (if they result in non-
identical final attributes)

3. Remove unnecessary registered transforms from the build

This will become an error in Gradle 9.0.

init must run alone

The init task must run by itself. This task should not be combined with other tasks in a single
Gradle invocation.

Running init in the same invocation as other tasks will become an error in Gradle 9.0.

For instance, this will not be allowed:

> gradlew init tasks

Calling [Link]() from a task action

Calling [Link]() from a task action at execution time is now deprecated and will be made
an error in Gradle 10.0. This method can still be used during configuration time.

The deprecation is only issued if the configuration cache is not enabled. When the configuration
cache is enabled, calls to [Link]() are reported as configuration cache problems instead.

This deprecation was originally introduced in Gradle 7.4 but was only issued when the
STABLE_CONFIGURATION_CACHE feature flag was enabled. That feature flag no longer controls this
deprecation. This is another step towards moving users away from idioms that are incompatible
with the configuration cache, which will become the only mode supported by Gradle in a future
release.

Please refer to the configuration cache documentation for alternatives to invoking


[Link]() at execution time that are compatible with the configuration cache.

Groovy "space assignment" syntax

Currently, there are multiple ways to set a property with Groovy DSL syntax:

---
propertyName = value
setPropertyName(value)
setPropertyName value
propertyName(value)
propertyName value
---

The last of these, "space-assignment" (propertyName value), is a Gradle-specific feature that is
not part of the Groovy language. In regular Groovy, this is just a method call,
propertyName(value), and Gradle generates a propertyName method at runtime if such a method is
not already present. This feature may be a source of confusion (especially for new users) and
adds an extra layer of complexity for users and the Gradle codebase without providing any
significant value. Sometimes, classes declare methods with the same name, and these may even
have semantics that differ from a plain assignment.

These generated methods are now deprecated and will be removed in Gradle 10.0, and both
propertyName value and propertyName(value) will stop working unless the explicit method
propertyName is defined. Use explicit assignment propertyName = value instead.

For explicit methods, consider using the propertyName(value) syntax instead of propertyName value
for clarity. For example, jvmArgs "some", "arg" can be replaced with jvmArgs("some", "arg") or with
jvmArgs = ["some", "arg"] for Test tasks.

If you have a big project, to replace occurrences of space-assignment syntax you can use, for
example, the following sed command:

---
find . -name '[Link]' -type f -exec sed -[Link] -E 's/([^A-Za-z]|^)(replaceme)[
\t]*([^= \t{])/\1\2 = \3/g' {} +
---

You should replace replaceme with one or more property names you want to replace, separated by |,
e.g. (url|group).

[Link]

The method was deprecated because it was not intended for public use in build scripts.

[Link]

[Link](), [Link](File), and [Link](Object) were deprecated. They should be replaced with the
[Link]() property.

Upgrading from 8.10 and earlier

Potential breaking changes

Upgrade to Kotlin 2.0.20

The embedded Kotlin has been updated from 1.9.24 to Kotlin 2.0.20. Also see the Kotlin 2.0.10 and
Kotlin 2.0.0 release notes.

The default kotlin-test version in JVM test suites has been upgraded to 2.0.20 as well.

Kotlin DSL scripts are still compiled with Kotlin language version set to 1.8 for backward
compatibility.

Gradle daemon JVM configuration via toolchain

The type of the property [Link] is now Property<JavaLanguageVersion>.

If you configured the task in a build script, you will need to replace:

jvmVersion = JavaVersion.VERSION_17

With:

jvmVersion = [Link](17)

Using the CLI options to configure which JVM version to use for the Gradle Daemon is not affected by this change.

Name matching changes

The name-matching logic has been updated to treat numbers as word boundaries for camelCase
names. Previously, a request like unique would match both uniqueA and unique1. Such a request will
now fail due to ambiguity. To avoid issues, use the exact name instead of a shortened version.

This change impacts:

• Task selection
• Project selection

• Configuration selection in dependency report tasks

Deprecations

Deprecated JavaHome property of ForkOptions

The JavaHome property of the ForkOptions type has been deprecated and will be removed in Gradle
9.0.

Use JVM Toolchains, or the executable property instead.
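Both suggested replacements can be sketched in a Kotlin DSL build script (the JDK version and compiler path are illustrative):

```kotlin
// build.gradle.kts — option 1: declare a toolchain and let Gradle pick the JDK
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(17)
    }
}

// option 2: point the fork options at a compiler executable directly,
// instead of the deprecated javaHome property
tasks.withType<JavaCompile>().configureEach {
    options.isFork = true
    options.forkOptions.executable = "/path/to/jdk/bin/javac" // illustrative path
}
```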

Deprecated mutating buildscript configurations

Starting in Gradle 9.0, mutating configurations in a script’s buildscript block will result in an error.
This applies to project, settings, init, and standalone scripts.

The buildscript configurations block is only intended to control buildscript classpath resolution.

Consider the following script that creates a new buildscript configuration in a Settings script and
resolves it:

buildscript {
configurations {
create("myConfig")
}
dependencies {
"myConfig"("org:foo:1.0")
}
}

val files = [Link]["myConfig"].files

This pattern is sometimes used to resolve dependencies in Settings, where there is no other way to
obtain a Configuration. Resolving dependencies in this context is not recommended. Using a
detached configuration is a possible but discouraged alternative.

The above example can be modified to use a detached configuration:

val myConfig = [Link](
    [Link]("org:foo:1.0")
)

val files = [Link]

Selecting Maven variants by configuration name

Starting in Gradle 9.0, selecting variants by name from non-Ivy external components will be
forbidden.
Selecting variants by name from local components will still be permitted; however, this pattern is
discouraged. Variant aware dependency resolution should be preferred over selecting variants by
name for local components.

The following dependencies will fail to resolve when targeting a non-Ivy external component:

dependencies {
implementation(group: "[Link]", name: "example", version: "1.0",
configuration: "conf")
implementation("[Link]:example:1.0") {
targetConfiguration = "conf"
}
}

Deprecated manually adding to configuration container

Starting in Gradle 9.0, manually adding configuration instances to a configuration container will
result in an error. Configurations should only be added to the container through the eager or lazy
factory methods. Detached configurations and copied configurations should not be added to the
container.

Calling the following methods on ConfigurationContainer will be forbidden:

• add(Configuration)

• addAll(Collection)

• addLater(Provider)

• addAllLater(Provider)

Deprecated ProjectDependency#getDependencyProject()

The ProjectDependency#getDependencyProject() method has been deprecated and will be removed in
Gradle 9.0.

Accessing the mutable project instance of other projects should be avoided.

To discover details about all projects that were included in a resolution, inspect the full
ResolutionResult. Project dependencies are exposed in the DependencyResult. See the user guide
section on programmatic dependency resolution for more details on this API. This is the only
reliable way to find all projects that are used in a resolution. Inspecting only the declared
`ProjectDependency`s may miss transitive or substituted project dependencies.

To get the identity of the target project, use the new Isolated Projects safe project path method:
ProjectDependency#getPath().

To access or configure the target project, consider this direct replacement:

val projectDependency: ProjectDependency = getSomeProjectDependency()

// Old way:
val someProject = [Link]

// New way:
val someProject = [Link]([Link])

This approach will not fetch project instances from different builds.

Deprecated [Link]() and [Link]()

The [Link]() and [Link]() methods have been deprecated and will be
removed in Gradle 9.0.

These deprecated methods do not track task dependencies, unlike their replacements.

val deprecated: Set<File> = [Link]
val replacement: FileCollection = [Link]

val lenientDeprecated: Set<File> = [Link]
val lenientReplacement: FileCollection = [Link] {
    isLenient = true
}.files

Deprecated AbstractOptions

The AbstractOptions class has been deprecated and will be removed in Gradle 9.0. All classes
extending AbstractOptions will no longer extend it.

As a result, the AbstractOptions#define(Map) method will no longer be present. This method exposes
a non-type-safe API and unnecessarily relies on reflection. It can be replaced by directly setting the
properties specified in the map.

Additionally, CompileOptions#fork(Map), CompileOptions#debug(Map), and
GroovyCompileOptions#fork(Map), which depend on define, are also deprecated for removal in
Gradle 9.0.

Consider the following example of the deprecated behavior and its replacement:

[Link](JavaCompile) {
// Deprecated behavior
[Link](encoding: 'UTF-8')
[Link](memoryMaximumSize: '1G')
[Link](debugLevel: 'lines')

// Can be replaced by
[Link] = 'UTF-8'

[Link] = true
[Link] = '1G'

[Link] = true
[Link] = 'lines'
}

Deprecated Dependency#contentEquals(Dependency)

The Dependency#contentEquals(Dependency) method has been deprecated and will be removed in
Gradle 9.0.

The method was originally intended to compare dependencies based on their actual target
component, regardless of whether they were of different dependency type. The existing method
does not behave as specified by its Javadoc, and we do not plan to introduce a replacement that
does.

Potential migrations include using [Link](Object) directly, or comparing the fields of
dependencies manually.

Deprecated Project#exec and Project#javaexec

The Project#exec(Closure), Project#exec(Action), Project#javaexec(Closure), and
Project#javaexec(Action) methods have been deprecated and will be removed in Gradle 9.0.

These methods are scheduled for removal as part of the ongoing effort to make writing
configuration-cache-compatible code easier. There is no way to use these methods without breaking
configuration cache requirements, so it is recommended to migrate to a compatible alternative. The
appropriate replacement for your use case depends on the context in which the method was
previously called.

At execution time, for example in @TaskAction or doFirst/doLast callbacks, the use of Project
instance is not allowed when the configuration cache is enabled. To run external processes, tasks
should use an injected ExecOperations service, which has the same API and can act as a drop-in
replacement. The standard Java/Groovy/Kotlin process APIs, like [Link] can be
used as well.

At configuration time, when the configuration cache is enabled, only dedicated Provider-based
APIs may be used to run external processes. You can use [Link] and
[Link] to obtain the output of the process. A custom ValueSource implementation
can be used for more sophisticated scenarios. The configuration cache guide has a more elaborate
example of using these APIs.
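A sketch of both replacements in Kotlin DSL follows (the task class, tool, and names are illustrative, not part of any Gradle API):

```kotlin
// build.gradle.kts

// Execution time: inject ExecOperations; its exec(...) mirrors the deprecated
// Project.exec(...) API and works with the configuration cache.
abstract class PrintGitVersion @javax.inject.Inject constructor(
    private val execOps: org.gradle.process.ExecOperations
) : DefaultTask() {
    @TaskAction
    fun run() {
        execOps.exec {
            commandLine("git", "--version")
        }
    }
}
tasks.register<PrintGitVersion>("printGitVersion")

// Configuration time: the Provider-based API defers and tracks the process
// call, keeping it compatible with the configuration cache.
val gitVersion = providers.exec {
    commandLine("git", "--version")
}.standardOutput.asText
```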

Detached Configurations should not use extendsFrom

Detached configurations should not extend other configurations using extendsFrom.

This behavior has been deprecated and will become an error in Gradle 9.0.

To create extension relationships between configurations, use non-detached configurations
created via the other factory methods on the project’s ConfigurationContainer.

Deprecated customized Gradle logging

The Gradle#useLogger(Object) method has been deprecated and will be removed in Gradle 9.0.

This method was originally intended to customize logs printed by Gradle. However, it only allows
intercepting a subset of the logs and cannot work with the configuration cache. We do not plan to
introduce a replacement for this feature.

Unnecessary options on compile options and doc tasks have been deprecated

Gradle’s API allowed some properties that represented nested groups of properties to be replaced
wholesale with a setter method. This was awkward and unusual to do and would sometimes
require the use of internal APIs. The setters for these properties will be removed in Gradle 9.0 to
simplify the API and ensure consistent behavior. Instead of using the setter method, these
properties should be configured by calling the getter and configuring the object directly or using
the convenient configuration method. For example, in CompileOptions, instead of calling the
setForkOptions setter, you can call getForkOptions() or forkOptions(Action).

The affected properties are:

• [Link]

• [Link]

• [Link]

• [Link]

• [Link]

• [Link]
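Using the CompileOptions.forkOptions example from above, the nested object is configured in place rather than replaced (the memory value is illustrative):

```kotlin
// build.gradle.kts — use the getter or the forkOptions(Action) configure
// method instead of the removed setForkOptions(...) setter
tasks.withType<JavaCompile>().configureEach {
    options.isFork = true
    options.forkOptions {
        memoryMaximumSize = "1g"
    }
    // equivalently: options.forkOptions.memoryMaximumSize = "1g"
}
```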

Deprecated [Link]() and [Link](boolean)

These methods on Javadoc have been deprecated and will be removed in Gradle 9.0.

• isVerbose() is replaced by getOptions().isVerbose()

• Calling setVerbose(boolean) with true is replaced by getOptions().verbose()

• Calling setVerbose(false) did nothing.

Upgrading from 8.9 and earlier

Potential breaking changes

JavaCompile tasks may fail when using a JRE even if compilation is not necessary

The JavaCompile tasks may sometimes fail when using a JRE instead of a JDK. This is due to changes
in the toolchain resolution code, which enforces the presence of a compiler when one is requested.
The java-base plugin uses the JavaCompile tasks it creates to determine the default source and target
compatibility when sourceCompatibility/targetCompatibility or release are not set. With the new
enforcement, the absence of a compiler causes this to fail when only a JRE is provided, even if no
compilation is needed (e.g., in projects with no sources).

This can be fixed by setting the sourceCompatibility/targetCompatibility explicitly in the java extension, or by setting sourceCompatibility/targetCompatibility or release in the relevant task(s).

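
A minimal sketch of the first option (version 17 is an arbitrary choice):

```kotlin
// build.gradle.kts — pin compatibility explicitly so no compiler lookup is needed
java {
    sourceCompatibility = JavaVersion.VERSION_17
    targetCompatibility = JavaVersion.VERSION_17
}
```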
Upgrade to Kotlin 1.9.24

The embedded Kotlin has been updated from 1.9.23 to Kotlin 1.9.24.

Upgrade to Ant 1.10.14

Ant has been updated to Ant 1.10.14.

Upgrade to JaCoCo 0.8.12

JaCoCo has been updated to 0.8.12.

Upgrade to Groovy 3.0.22

Groovy has been updated to Groovy 3.0.22.

Deprecations

Running Gradle on older JVMs

Starting in Gradle 9.0, Gradle will require JVM 17 or later to run. Most Gradle APIs will be compiled
to target JVM 17 bytecode.

Gradle will still support compiling Java code to target JVM version 6 or later. The target JVM version
of the compiled code can be configured separately from the JVM version used to run Gradle.

All Gradle clients (wrapper, launcher, Tooling API and TestKit) will remain compatible with JVM 8
and will be compiled to target JVM 8 bytecode. Only the Gradle daemon will require JVM 17 or later.
These clients can be configured to run Gradle builds with a different JVM version than the one used
to run the client:

• Using Daemon JVM criteria (an incubating feature)

• Setting the org.gradle.java.home Gradle property

• Using the ConfigurableLauncher#setJavaHome method on the Tooling API

Alternatively, the JAVA_HOME environment variable can be set to a JVM 17 or newer, which will run
both the client and daemon with the same version of the JVM.
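
For example, the daemon JVM can be selected via a Gradle property in gradle.properties (the path below is a placeholder for a local JDK 17+ installation):

```properties
# gradle.properties — path is a placeholder
org.gradle.java.home=/path/to/jdk-17
```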

Running Gradle builds with --no-daemon or using ProjectBuilder in tests will require JVM version 17 or later. The worker API will remain compatible with JVM 8, so running JVM tests will continue to require only JVM 8.

We decided to upgrade the minimum version of the Java runtime for a number of reasons:

• Dependencies are beginning to drop support for older versions and may not release security
patches.

• Significant language improvements between Java 8 and Java 17 cannot be used without
upgrading.

• Some of the most popular plugins already require JVM 17 or later.

• Download metrics for Gradle distributions show that JVM 17 is widely used.
Deprecated consuming non-consumable configurations from Ivy

In prior versions of Gradle, it was possible to consume non-consumable configurations of a project using published Ivy metadata. An Ivy dependency may sometimes be substituted for a project dependency, either explicitly through the DependencySubstitutions API or through included builds. When this happens, configurations in the substituted project could be selected that were marked as non-consumable.

Consuming non-consumable configurations in this manner is deprecated and will result in an error
in Gradle 9.0.

Deprecated extending configurations in the same project

In prior versions of Gradle, it was possible to extend a configuration in a different project.

The hierarchy of a Project’s configurations should not be influenced by configurations in other projects. Cross-project hierarchies can lead to unexpected behavior when configurations are extended in a way that is not intended by the configuration’s owner.

Projects should also never access the mutable state of another project. Since Configurations are
mutable, extending configurations across project boundaries restricts the parallelism that Gradle
can apply.

Extending configurations in different projects is deprecated and will result in an error in Gradle
9.0.

Upgrading from 8.8 and earlier

Potential breaking changes

Change to toolchain provisioning

In previous versions of Gradle, toolchain provisioning could leave a partially provisioned toolchain in place, with a marker file indicating that the toolchain was fully provisioned. This could lead to strange behavior with the toolchain. In Gradle 8.9, the toolchain is fully provisioned before the marker file is written. However, so that potentially broken toolchains are not treated as valid, a different marker file (.ready) is used. This means all your existing toolchains will be re-provisioned the first time you use them with Gradle 8.9. Gradle 8.9 also writes the old marker file ([Link]) to indicate that the toolchain was fully provisioned, so if you return to an older version of Gradle, an 8.9-provisioned toolchain will not be re-provisioned.

Upgrade to Kotlin 1.9.23

The embedded Kotlin has been updated from 1.9.22 to Kotlin 1.9.23.

Change the encoding of daemon log files

In previous versions of Gradle, the daemon log file, located at $GRADLE_USER_HOME/daemon/8.12.1/, was encoded with the default JVM encoding. This file is now always encoded with UTF-8 to prevent clients who may use different default encodings from reading data incorrectly. This change may affect third-party tools trying to read this file.

Compiling against Gradle implementation classpath

In previous versions of Gradle, Java projects that had no declared dependencies could implicitly
compile against Gradle’s runtime classes. This means that some projects were able to compile
without any declared dependencies even though they referenced Gradle runtime classes. This
situation is unlikely to arise in projects since IDE integration and test execution would be
compromised. However, if you need to utilize the Gradle API, declare a gradleApi dependency or
apply the java-gradle-plugin plugin.
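
A sketch of the explicit declaration:

```kotlin
// build.gradle.kts — make the Gradle API an explicit compile-time dependency
dependencies {
    implementation(gradleApi())
}
```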

Configuration cache implementation packages now under [Link]

References to Gradle types not part of the public API should be avoided, as their direct use is
unsupported. Gradle internal implementation classes may suffer breaking changes (or be renamed
or removed) from one version to another without warning.

Users need to distinguish between the API and internal parts of the Gradle codebase. This is
typically achieved by including internal in the implementation package names. However, before
this release, the configuration cache subsystem did not follow this pattern.

To address this issue, all code initially under the [Link]* packages has been
moved to new internal packages ([Link].*).

File-system watching on macOS 11 (Big Sur) and earlier is disabled

Since Gradle 8.8, file-system watching has only been supported on macOS 12 (Monterey) and later.
We added a check to automatically disable file-system watching on macOS 11 (Big Sur) and earlier
versions.

Possible change to JDK8-based compiler output when annotation processors are used

The Java compilation infrastructure has been updated to use the Problems API. This change will
supply the Tooling API clients with structured, rich information about compilation issues.

The feature should not have any visible impact on the usual build output, with JDK 8 being an exception. When annotation processors are used, the compiler output messages differ slightly from previous versions.

The change mainly manifests itself in the type names printed. For example, Java standard types like String will be reported as java.lang.String instead of String.

Upgrading from 8.7 and earlier

Deprecations

Deprecate mutating configuration after observation

To ensure the accuracy of dependency resolution, Gradle checks that Configurations are not
mutated after they have been used as part of a dependency graph.

• Resolvable configurations should not have their resolution strategy, dependencies, hierarchy,
etc., modified after they have been resolved.
• Consumable configurations should not have their dependencies, hierarchy, attributes, etc.
modified after they have been published or consumed as a variant.

• Dependency scope configurations should not have their dependencies, constraints, etc.,
modified after a configuration that extends from them is observed.

In prior versions of Gradle, many of these circumstances were detected and handled by failing the
build. However, some cases went undetected or did not trigger build failures. In Gradle 9.0, all
changes to a configuration, once observed, will become an error. After a configuration of any type
has been observed, it should be considered immutable. This validation covers the following
properties of a configuration:

• Resolution Strategy

• Dependencies

• Constraints

• Exclude Rules

• Artifacts

• Role (consumable, resolvable, dependency scope)

• Hierarchy (extendsFrom)

• Others (Transitive, Visible)

Starting in Gradle 8.8, a deprecation warning will be emitted in cases that were not already an
error. Usually, this deprecation is caused by mutating a configuration in a beforeResolve hook. This
hook is only executed after a configuration is fully resolved but not when it is partially resolved for
computing task dependencies.

Consider the following code that showcases the deprecated behavior:

build.gradle.kts

plugins {
    id("java-library")
}

configurations.runtimeClasspath {
    // `beforeResolve` is not called before the configuration is partially resolved for
    // build dependencies, but only before a full graph resolution.
    // Configurations should not be mutated in this hook
    incoming.beforeResolve {
        // Add a dependency on `com:foo` if not already present
        if (dependencies.none { it.group == "com" && it.name == "foo" }) {
            dependencies.add(project.dependencies.create("com:foo:1.0"))
        }
    }
}

tasks.register("resolve") {
    val conf: FileCollection = configurations["runtimeClasspath"]

    // Wire build dependencies
    dependsOn(conf)

    // Resolve dependencies
    doLast {
        assert(conf.files.map { it.name } == listOf("foo-1.0.jar"))
    }
}

For the following use cases, consider these alternatives when replacing a beforeResolve hook:

• Adding dependencies: Use a DependencyFactory and addLater or addAllLater on DependencySet.

• Changing dependency versions: Use preferred version constraints.

• Adding excludes: Use Component Metadata Rules to adjust dependency-level excludes, or withDependencies to add excludes to a configuration.

• Roles: Configuration roles should be set upon creation and not changed afterward.

• Hierarchy: Configuration hierarchy (extendsFrom) should be set upon creation. Mutating the hierarchy prior to resolution is highly discouraged but permitted within a withDependencies hook.

• Resolution Strategy: Mutating a configuration’s ResolutionStrategy is still permitted in a beforeResolve hook; however, this is not recommended.
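
As a sketch of the first alternative (coordinates and configuration name are illustrative), a dependency can be added lazily instead of being injected from a beforeResolve hook:

```kotlin
// build.gradle.kts — illustrative coordinates
configurations.named("implementation") {
    // The provider is only queried when the dependency set is realized,
    // so no mutation happens after the configuration is observed
    dependencies.addLater(provider {
        project.dependencyFactory.create("com:foo:1.0")
    })
}
```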

Filtered Configuration file and fileCollection methods are deprecated

In an ongoing effort to simplify the Gradle API, the following methods that support filtering based
on declared dependencies have been deprecated:

On Configuration:

• files(Dependency…)

• files(Spec)

• files(Closure)

• fileCollection(Dependency…)

• fileCollection(Spec)

• fileCollection(Closure)

On ResolvedConfiguration:
• getFiles(Spec)

• getFirstLevelModuleDependencies(Spec)

On LenientConfiguration:

• getFirstLevelModuleDependencies(Spec)

• getFiles(Spec)

• getArtifacts(Spec)

To mitigate this deprecation, consider the example below that leverages the ArtifactView API along
with the componentFilter method to select a subset of a Configuration’s artifacts:

build.gradle.kts

val conf by configurations.creating

dependencies {
    conf("com.example:foo:1.0")
    conf("com.example:bar:1.0")
}

tasks.register("filterDependencies") {
    val files: FileCollection = conf.incoming.artifactView {
        componentFilter {
            when (it) {
                is ModuleComponentIdentifier ->
                    it.group == "com.example" && it.module == "foo"
                else -> false
            }
        }
    }.files

    doLast {
        assert(files.map { it.name } == listOf("foo-1.0.jar"))
    }
}

build.gradle

configurations {
    conf
}

dependencies {
    conf "com.example:foo:1.0"
    conf "com.example:bar:1.0"
}

tasks.register("filterDependencies") {
    FileCollection files = configurations.conf.incoming.artifactView {
        componentFilter {
            it instanceof ModuleComponentIdentifier
                && it.group == "com.example"
                && it.module == "foo"
        }
    }.files

    doLast {
        assert files*.name == ["foo-1.0.jar"]
    }
}

Contrary to the deprecated Dependency filtering methods, componentFilter does not consider the
transitive dependencies of the component being filtered. This allows for more granular control over
which artifacts are selected.

Deprecated Namer of Task and Configuration

Task and Configuration have a Namer inner class (also called Namer) that can be used as a common way to retrieve the name of a task or configuration. Now that these types implement Named, these classes are no longer necessary and have been deprecated. They will be removed in Gradle 9.0. Use Named.Namer instead.

The super interface, Namer, is not being deprecated.

Unix mode-based file permissions deprecated

A new API for defining file permissions has been added in Gradle 8.3, see:

• FilePermissions.

• ConfigurableFilePermissions.

The new API has now been promoted to stable, and the old methods have been deprecated:

• [Link]

• [Link]

• [Link]

• [Link]

• [Link]

• [Link]
Deprecated setting retention period directly on local build cache

In previous versions, cleanup of the local build cache entries ran every 24 hours, and this interval
could not be configured. The retention period was configured using
[Link].

In Gradle 8.0, a new mechanism was added to configure the cleanup and retention periods for
various resources in Gradle User Home. In Gradle 8.8, this mechanism was extended to permit the
retention configuration of local build cache entries, providing improved control and consistency.

• Specifying [Link] or [Link] will now prevent or force the cleanup of the local
build cache

• Build cache entry retention is now configured via an init-script, in the same manner as other
caches.

If you want build cache entries to be retained for 30 days, remove any calls to the deprecated
method:

buildCache {
    local {
        // Remove this line
        removeUnusedEntriesAfterDays = 30
    }
}

Add a file like this in ~/.gradle/init.d:

beforeSettings {
    caches {
        buildCache.setRemoveUnusedEntriesAfterDays(30)
    }
}

Calling [Link] is deprecated, and this method will be removed in Gradle 9.0. If set to a non-default value, this deprecated setting will take precedence over [Link]().

Deprecated Kotlin DSL gradle-enterprise plugin block extension

In settings.gradle.kts (Kotlin DSL), you can use gradle-enterprise in the plugins block to apply the Gradle Enterprise plugin with the same version as gradle --scan.

plugins {
    `gradle-enterprise`
}

There is no equivalent to this in settings.gradle (Groovy DSL).


Gradle Enterprise has been renamed Develocity, and the com.gradle.enterprise plugin has been renamed com.gradle.develocity. Therefore, the gradle-enterprise plugin block extension has been deprecated and will be removed in Gradle 9.0.

The Develocity plugin must be applied with an explicit plugin ID and version. There is no
develocity shorthand available in the plugins block:

plugins {
    id("com.gradle.develocity") version "3.17.3"
}

If you want to continue using the Gradle Enterprise plugin, you can specify the deprecated plugin
ID:

plugins {
    id("com.gradle.enterprise") version "3.17.3"
}

We encourage you to use the latest released Develocity plugin version, even when using an older
Gradle version.

Potential breaking changes

Changes in the Problems API

We have implemented several refactorings of the Problems API, including a significant change in
how problem definitions and contextual information are handled. The complete design
specification can be found here.

In implementing this spec, we have introduced the following breaking changes to the ProblemSpec
interface:

• The label(String) and description(String) methods have been replaced with the id(String,
String) method and its overloaded variants.

Changes to collection properties

The following incubating APIs introduced in 8.7 have been removed:

• [Link]*(…)

• [Link]*(…)

Replacements that better handle conventions are under consideration for a future 8.x release.

Upgrade to Groovy 3.0.21

Groovy has been updated to Groovy 3.0.21.

Since the previous version was 3.0.17, the changes from 3.0.18, 3.0.19, and 3.0.20 are also included. Some changes in static type checking have resulted in source-code incompatibilities. Starting with 3.0.18, if you cast a closure to an Action without generics, the closure parameter will be Object instead of any explicit type specified. This can be fixed by adding the appropriate type to the cast, after which the now-redundant parameter declaration can be removed:

// Before
tasks.register("foo", { Task it -> it.description = "Foo task" } as Action)

// Fixed
tasks.register("foo", { it.description = "Foo task" } as Action<Task>)

Upgrade to ASM 9.7

ASM was upgraded from 9.6 to 9.7 to ensure earlier compatibility for Java 23.

Upgrading from 8.6 and earlier

Potential breaking changes

Upgrade to Kotlin 1.9.22

The embedded Kotlin has been updated from 1.9.10 to Kotlin 1.9.22.

Upgrade to Apache SSHD 2.10.0

Apache SSHD has been updated from 2.0.0 to 2.10.0.

Replacement and upgrade of JSch

JSch has been replaced by com.github.mwiede:jsch and updated from 0.1.55 to 0.2.16.

Upgrade to Eclipse JGit 5.13.3

Eclipse JGit has been updated from 5.7.0 to 5.13.3.

This includes reworking the way that Gradle configures JGit for SSH operations by moving from
JSch to Apache SSHD.

Upgrade to Apache Commons Compress 1.25.0

Apache Commons Compress has been updated from 1.21 to 1.25.0. This change may affect the
checksums of the produced jars, zips, and other archive types because the metadata of the
produced artifacts may differ.

Upgrade to ASM 9.6

ASM was upgraded from 9.5 to 9.6 for better support of multi-release jars.

Upgrade of the version catalog parser

The version catalog parser has been upgraded and is now compliant with version 1.0.0 of the TOML
spec.

This should not impact catalogs that use the recommended syntax or were generated by Gradle for
publication.

Deprecations

Deprecated registration of plugin conventions

Using plugin conventions has been emitting warnings since Gradle 8.2. Now, registering plugin
conventions will also trigger deprecation warnings. For more information, see the section about
plugin convention deprecation.

Referencing tasks and domain objects by "name"() in Kotlin DSL

In Kotlin DSL, it is possible to reference a task or other domain object by its name using the
"name"() notation.

There are several ways to look up an element in a container by name:

tasks {
"wrapper"() // 1 - returns TaskProvider<Task>
"wrapper"(Wrapper::class) // 2 - returns TaskProvider<Wrapper>
"wrapper"(Wrapper::class) { // 3 - configures a task named wrapper of type Wrapper
}
"wrapper" { // 4 - configures a task named wrapper of type Task
}
}

The first notation is deprecated and will be removed in Gradle 9.0. Instead of using "name"() to
reference a task or domain object, use named("name") or one of the other supported notations.

The above example would be written as:

tasks {
named("wrapper") // returns TaskProvider<Task>
}

The Gradle API and Groovy build scripts are not impacted by this.

Deprecated invalid URL decoding behavior

Before Gradle 8.3, Gradle would decode a CharSequence given to Project.uri(Object) using an algorithm that accepted invalid URLs and improperly decoded others. Gradle now uses the URI class to parse and decode URLs, but falls back to the legacy behavior in the event of an error.

Starting in Gradle 9.0, the fallback will be removed, and an error will be thrown instead.

To fix a deprecation warning, invalid URLs that require the legacy behavior should be re-encoded
to be valid URLs, such as in the following examples:

Table 3. Legacy URL Conversions

• Original input: file:relative/path → New input: relative/path — The file scheme does not support relative paths.

• Original input: file:relative/path%21 → New input: relative/path! — Without a scheme, the path is taken as-is, without decoding.

• Original input: [Link]/my folder/ → New input: [Link]/my%20folder/ — Spaces are not valid in URLs.

• Original input: [Link]/my%%badly%encoded%path → New input: [Link]/my%25%25badly%25encoded%25path — % must be encoded as %25 in URLs, and no %-escapes should be invalid.

Deprecated SelfResolvingDependency

The SelfResolvingDependency interface has been deprecated for removal in Gradle 9.0. This type dates back to the first versions of Gradle, where some dependencies could be resolved independently. Now, all dependencies should be resolved as part of a dependency graph using a Configuration.

Currently, ProjectDependency and FileCollectionDependency implement this interface. In Gradle 9.0, these types will no longer implement SelfResolvingDependency. Instead, they will both directly implement Dependency.

As such, the following methods of ProjectDependency and FileCollectionDependency will no longer be available:

• resolve

• resolve(boolean)

• getBuildDependencies

Consider the following scripts that showcase the deprecated interface and its replacement:

build.gradle.kts

plugins {
    id("java-library")
}

dependencies {
    implementation(files("bar.txt"))
    implementation(project(":foo"))
}

tasks.register("resolveDeprecated") {
    // Wire build dependencies (calls getBuildDependencies)
    dependsOn(configurations["implementation"].dependencies.toList())

    // Resolve dependencies
    doLast {
        configurations["implementation"].dependencies.withType<FileCollectionDependency> {
            assert(resolve().map { it.name } == listOf("bar.txt"))
            assert(resolve(true).map { it.name } == listOf("bar.txt"))
        }

        configurations["implementation"].dependencies.withType<ProjectDependency> {
            // These methods do not even work properly.
            assert(resolve().map { it.name } == listOf<String>())
            assert(resolve(true).map { it.name } == listOf<String>())
        }
    }
}

tasks.register("resolveReplacement") {
    val conf = configurations["runtimeClasspath"]

    // Wire build dependencies
    dependsOn(conf)

    // Resolve dependencies
    doLast {
        assert(conf.files.map { it.name } == listOf("bar.txt", "foo.jar"))
    }
}

Deprecated members of the org.gradle.util package now report their deprecation

These members will be removed in Gradle 9.0.

• [Link](Collection)

Upgrading from 8.5 and earlier

Potential breaking changes

Upgrade to JaCoCo 0.8.11

JaCoCo has been updated to 0.8.11.


DependencyAdder renamed to DependencyCollector

The incubating DependencyAdder interface has been renamed to DependencyCollector. A getDependencies method has been added to the interface that returns all declared dependencies.

Deprecations

Deprecated calling registerFeature using the main source set

Calling registerFeature on the java extension using the main source set is deprecated and will
change behavior in Gradle 9.0.

Currently, features created while calling usingSourceSet with the main source set are initialized
differently than features created while calling usingSourceSet with any other source set. Previously,
when using the main source set, new implementation, compileOnly, runtimeOnly, api, and
compileOnlyApi configurations were created, and the compile and runtime classpaths of the main
source set were configured to extend these configurations.

Starting in Gradle 9.0, the main source set will be treated like any other source set. With the java-library plugin applied (or any other plugin that applies the java plugin), calling usingSourceSet with the main source set will throw an exception. This is because the java plugin already configures a main feature. Only if the java plugin is not applied will the main source set be permitted when calling usingSourceSet.

Code that currently registers features with the main source set, such as:

build.gradle.kts

plugins {
    id("java-library")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets["main"])
    }
}

build.gradle

plugins {
    id("java-library")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets.main)
    }
}

should instead create a separate source set for the feature and register the feature with that source set:

build.gradle.kts

plugins {
    id("java-library")
}

sourceSets {
    create("feature")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets["feature"])
    }
}

build.gradle

plugins {
    id("java-library")
}

sourceSets {
    feature
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets.feature)
    }
}

Deprecated publishing artifact dependencies with explicit name to Maven repositories

Publishing dependencies with an explicit artifact with a name different from the dependency’s
artifactId to Maven repositories has been deprecated. This behavior is still permitted when
publishing to Ivy repositories. It will result in an error in Gradle 9.0.
When publishing to Maven repositories, Gradle will interpret the dependency below as if it were
declared with coordinates org:notfoo:1.0:

build.gradle.kts

dependencies {
    implementation("org:foo:1.0") {
        artifact {
            name = "notfoo"
        }
    }
}

build.gradle

dependencies {
    implementation("org:foo:1.0") {
        artifact {
            name = "notfoo"
        }
    }
}

Instead, this dependency should be declared as:

build.gradle.kts

dependencies {
    implementation("org:notfoo:1.0")
}

build.gradle

dependencies {
    implementation("org:notfoo:1.0")
}
Deprecated ArtifactIdentifier

The ArtifactIdentifier class has been deprecated for removal in Gradle 9.0.

Deprecate mutating DependencyCollector dependencies after observation

Starting in Gradle 9.0, mutating dependencies sourced from a DependencyCollector after those dependencies have been observed will result in an error. The DependencyCollector interface is used to declare dependencies within the test suites DSL.

Consider the following example where a test suite’s dependency is mutated after it is observed:

build.gradle.kts

plugins {
    id("java-library")
}

testing.suites {
    named<JvmTestSuite>("test") {
        dependencies {
            // Dependency is declared on a `DependencyCollector`
            implementation("com:foo")
        }
    }
}

configurations.testImplementation {
    // Calling `all` here realizes/observes all lazy sources, including the `DependencyCollector`
    // from the test suite block. Operations like resolving a configuration similarly realize lazy sources.
    dependencies.all {
        if (this is ExternalDependency && group == "com" && name == "foo" && version == null) {
            // Dependency is mutated after observation
            version {
                require("2.0")
            }
        }
    }
}

In the above example, the build logic uses iteration and mutation to try to set a default version for a
particular dependency if the version is not already set. Build logic like the above example creates
challenges in resolving declared dependencies, as reporting tools will display this dependency as if
the user declared the version as "2.0", even though they never did. Instead, the build logic can avoid
iteration and mutation by declaring a preferred version constraint on the dependency’s
coordinates. This allows the dependency management engine to use the version declared on the
constraint if no other version is declared.

Consider the following example that replaces the above iteration with an indiscriminate preferred
version constraint:

build.gradle.kts

dependencies {
    constraints {
        testImplementation("com:foo") {
            version {
                prefer("2.0")
            }
        }
    }
}

Upgrading from 8.4 and earlier

Potential breaking changes

Upgrade to Kotlin 1.9.20

The embedded Kotlin has been updated to Kotlin 1.9.20.

Changes to Groovy task conventions

The groovy-base plugin is now responsible for configuring source and target compatibility version
conventions on all GroovyCompile tasks.

If you are using this task without applying groovy-base, you will have to set compatibility versions on these tasks manually. In general, the groovy-base plugin should be applied whenever you work with Groovy language tasks.
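
If groovy-base is not applied, a sketch of the manual configuration (version 17 is an arbitrary choice):

```kotlin
// build.gradle.kts — set compatibility manually on GroovyCompile tasks
tasks.withType<GroovyCompile>().configureEach {
    sourceCompatibility = "17"
    targetCompatibility = "17"
}
```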

[Link]

The type of argument passed to [Link] is changed from Predicate to Spec for a more
consistent API. This change should not affect anyone using [Link] with a lambda
expression. However, this might affect plugin authors if they don’t use SAM conversions to create a
lambda.

Deprecations

Deprecated members of the org.gradle.util package now report their deprecation

These members will be removed in Gradle 9.0:


• [Link](String)

• [Link](VersionNumber)

Deprecated depending on resolved configuration

When resolving a Configuration, it is sometimes possible to select that same configuration as a variant. Configurations should be used for one purpose (resolution, consumption, or dependency declarations), so this can only occur when a configuration is marked as both consumable and resolvable.

This can lead to circular dependency graphs, as the resolved configuration is used for two purposes.

To avoid this problem, plugins should mark all resolvable configurations as canBeConsumed=false or
use the resolvable(String) configuration factory method when creating configurations meant for
resolution.

In Gradle 9.0, consuming configurations in this manner will no longer be allowed and result in an
error.
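
A sketch of both options (configuration names are hypothetical):

```kotlin
// build.gradle.kts — hypothetical configuration names
// Option 1: mark a resolution-only configuration as non-consumable
configurations.create("toolClasspath") {
    isCanBeConsumed = false
    isCanBeResolved = true
}

// Option 2: use the incubating factory method, which sets the role up front
configurations.resolvable("toolClasspathResolvable")
```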

Including projects without an existing directory

Gradle will warn if a project is added to the build where the associated projectDir does not exist or
is not writable. Starting with version 9.0, Gradle will not run builds if a project directory is missing
or read-only. If you intend to dynamically synthesize projects, make sure to create directories for
them as well:

settings.gradle.kts

include("project-without-directory")
project(":project-without-directory").projectDir.mkdirs()

settings.gradle

include 'project-without-directory'
project(":project-without-directory").projectDir.mkdirs()

Upgrading from 8.3 and earlier

Potential breaking changes

Upgrade to Kotlin 1.9.10

The embedded Kotlin has been updated to Kotlin 1.9.10.


XML parsing now requires recent parsers

Gradle 8.4 now configures XML parsers with security features enabled. If your build logic depends
on old XML parsers that don’t support secure parsing, your build may fail. If you encounter a
failure, check and update or remove any dependency on legacy XML parsers.

If you are an Android user, please upgrade your AGP version to 8.3.0 or higher to fix the issue
caused by AGP itself. See the Update XML parser used in AGP for Gradle 8.4 compatibility for more
details.

If you are unable to upgrade XML parsers coming from your build logic dependencies, you can force the use of the XML parsers built into the JVM. In OpenJDK, for example, this can be done by adding the following to gradle.properties:

systemProp.javax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl
systemProp.javax.xml.transform.TransformerFactory=com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl
systemProp.javax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl

See the CVE-2023-42445 advisory for more details and ways to enable secure XML processing on
previous Gradle versions.

EAR plugin with customized JEE 1.3 descriptor

Gradle 8.4 forbids external XML entities when parsing XML documents. If you use the EAR plugin, configure the application.xml descriptor via the EAR plugin’s DSL, customize the descriptor using withXml {}, and use asElement() in the customization block, then the build will now fail for security reasons.

build.gradle.kts

plugins {
id("ear")
}
ear {
deploymentDescriptor {
version = "1.3"
withXml {
asElement()
}
}
}
build.gradle

plugins {
id("ear")
}
ear {
deploymentDescriptor {
version = "1.3"
withXml {
asElement()
}
}
}

If you happen to use asNode() instead of asElement(), then nothing changes, given asNode() simply
ignores external DTDs.

You can work around this by running your build with the javax.xml.accessExternalDTD system property set to http.

On the command line, add this to your Gradle invocation:

-Djavax.xml.accessExternalDTD=http

To make this workaround persistent, add the following line to your gradle.properties:

systemProp.javax.xml.accessExternalDTD=http

Note that this will enable HTTP access to external DTDs for the whole build JVM. See the JAXP
documentation for more details.

Deprecations

Deprecated GenerateMavenPom methods

The following methods on GenerateMavenPom are deprecated and will be removed in Gradle 9.0. They
were never intended to be public API.

• getVersionRangeMapper

• withCompileScopeAttributes

• withRuntimeScopeAttributes
Upgrading from 8.2 and earlier

Potential breaking changes

Deprecated Project.buildDir can cause script compilation failure

With the deprecation of Project.buildDir, buildscripts that are compiled with warnings as errors could fail if the deprecated field is used.
See the deprecation entry for details.

TestLauncher API no longer ignores build failures

The TestLauncher interface is part of the Tooling API, specialized for running tests. It is a logical
extension of the BuildLauncher that can only launch tasks. A discrepancy has been reported in their
behavior: if the same failing test is executed, BuildLauncher will report a build failure, but
TestLauncher won’t. Originally, this was a design decision in order to continue the execution and
run the tests in all test tasks and not stop at the first failure. At the same time, this behavior can be
confusing for users as they can experience a failing test in a successful build. To make the two APIs
more uniform, we made TestLauncher also fail the build, which is a potential breaking change.
Tooling API clients should explicitly pass --continue to the build to continue the test execution even
if a test task fails.
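As an illustration, a minimal Tooling API client sketch that restores the continue-past-failures behavior could look like the following. This assumes a dependency on the gradle-tooling-api library; the project directory and test class name are placeholders, and running it requires a real Gradle build to connect to:

```kotlin
import org.gradle.tooling.GradleConnector
import java.io.File

fun main() {
    // Connect to a Gradle build; the directory is a placeholder
    val connection = GradleConnector.newConnector()
        .forProjectDirectory(File("/path/to/project"))
        .connect()
    try {
        connection.newTestLauncher()
            .withJvmTestClasses("com.example.MyTest") // placeholder test class
            .withArguments("--continue")              // keep running other test tasks on failure
            .run()                                    // now throws on test failure, like BuildLauncher
    } finally {
        connection.close()
    }
}
```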

Fixed variant selection behavior with ArtifactView and ArtifactCollection

The dependency resolution APIs for selecting different artifacts or files
([Link]().artifactView { } and [Link]().getArtifacts())
captured immutable copies of the underlying Configuration's attributes to use for variant
selection. If the Configuration's attributes were changed after these methods were called, the
artifacts selected by these methods could be unexpected.

Consider the case where the set of attributes on a Configuration is changed after an ArtifactView is
created:

[Link]

tasks {
    myTask {
        [Link]([Link] {
            attributes {
                // Add attributes to select a different type of artifact
            }
        }.files)
    }
}

configurations {
    classpath {
        attributes {
            // Add more attributes to the configuration
        }
    }
}

The inputFiles property of myTask uses an artifact view to select a different type of artifact from the
configuration classpath. Since the artifact view was created before the attributes were added to the
configuration, Gradle could not select the correct artifact.

Some builds may have worked around this by also putting the additional attributes into the artifact
view. This is no longer necessary.

Upgrade to Kotlin 1.9.0

The embedded Kotlin has been updated from 1.8.20 to Kotlin 1.9.0. The Kotlin language and API
levels for the Kotlin DSL are still set to 1.8 for backward compatibility. See the release notes for
Kotlin 1.8.22 and Kotlin 1.8.21.

Kotlin 1.9 dropped support for Kotlin language and API level 1.3. If you build Gradle plugins written
in Kotlin with this version of Gradle and need to support Gradle <7.0 you need to stick to using the
Kotlin Gradle Plugin <1.9.0 and configure the Kotlin language and API levels to 1.3. See the
Compatibility Matrix for details about other versions.

Eager evaluation of Configuration attributes

Gradle 8.3 updates the [Link] and [Link] attributes of JVM
Configurations to be present at the time of creation, as opposed to previously, when they were
only present after the Configuration had been resolved or consumed. In particular, the value for
[Link] relies on the project's configured toolchain, meaning that querying the
value for this attribute will finalize the value of the project's Java toolchain.

Plugins or build logic that eagerly queries the attributes of JVM configurations may now cause the
project’s Java toolchain to be finalized earlier than before. Attempting to modify the toolchain after
it has been finalized will result in error messages similar to the following:

The value for property 'implementation' is final and cannot be changed any further.
The value for property 'languageVersion' is final and cannot be changed any further.
The value for property 'vendor' is final and cannot be changed any further.

This situation may arise when plugins or build logic eagerly query an existing JVM Configuration’s
attributes to create a new Configuration with the same attributes. Previously, this logic would have
omitted the two above-noted attributes entirely, while now, the same logic will copy the attributes
and finalize the project’s Java toolchain. To avoid early toolchain finalization, attribute-copying
logic should be updated to query the source Configuration’s attributes lazily:
[Link]

fun <T> copyAttribute(attribute: Attribute<T>, from: AttributeContainer, to: AttributeContainer) =
    [Link]<T>(attribute, provider { [Link](attribute)!! })

val source = configurations["runtimeClasspath"].attributes

configurations {
    create("customRuntimeClasspath") {
        [Link]().forEach { key ->
            copyAttribute(key, source, attributes)
        }
    }
}

[Link]

def source = [Link]

configurations {
    customRuntimeClasspath {
        [Link]().each { key ->
            [Link](key, provider { [Link](key) })
        }
    }
}

Deprecations

Deprecated [Link] is to be replaced by [Link]

The [Link] property is deprecated. It uses eager APIs and has ordering issues if the value
is read in build logic and then later modified. It could result in outputs ending up in different
locations.

It is replaced by a DirectoryProperty found at [Link]. See the ProjectLayout
interface for details.

Note that, at this stage, Gradle will not print deprecation warnings if you still use [Link].
We know this is a big change, and we want to give the authors of major plugins time to stop using it.

Switching from a File to a DirectoryProperty requires adaptations in build logic. The main impact is
that you cannot use the property inside a String to expand it. Instead, you should leverage the dir
and file methods to compute your desired location.
Here is an example of creating a file, where the following:

[Link]

// Returns a [Link]
file("$buildDir/[Link]")

[Link]

// Returns a [Link]
file("$buildDir/[Link]")

Should be replaced by:

[Link]

// Compatible with a number of Gradle lazy APIs that also accept [Link]
val output: Provider<RegularFile> = [Link]("[Link]")

// If you really need the [Link] for a non-lazy API
[Link]().asFile

// Or a path for a lazy String-based API
[Link] { [Link] }

[Link]

// Compatible with a number of Gradle lazy APIs that also accept [Link]
Provider<RegularFile> output = [Link]("[Link]")

// If you really need the [Link] for a non-lazy API
[Link]().asFile

// Or a path for a lazy String-based API
[Link] { [Link] }

Here is another example of creating a directory, where the following:


[Link]

// Returns a [Link]
file("$buildDir/outputLocation")

[Link]

// Returns a [Link]
file("$buildDir/outputLocation")

Should be replaced by:

[Link]

// Compatible with a number of Gradle APIs that accept a [Link]
val output: Provider<Directory> = [Link]("outputLocation")

// If you really need the [Link] for a non-lazy API
[Link]().asFile

// Or a path for a lazy String-based API
[Link] { [Link] }

[Link]

// Compatible with a number of Gradle APIs that accept a [Link]
Provider<Directory> output = [Link]("outputLocation")

// If you really need the [Link] for a non-lazy API
[Link]().asFile

// Or a path for a lazy String-based API
[Link] { [Link] }

Deprecated ClientModule dependencies

ClientModule dependencies are deprecated and will be removed in Gradle 9.0.

Client module dependencies were originally intended to allow builds to override incorrect or
missing component metadata of external dependencies by defining the metadata locally. This
functionality has since been replaced by Component Metadata Rules.

Consider the following client module dependency example:

[Link]

dependencies {
    implementation(module("org:foo:1.0") {
        dependency("org:bar:1.0")
        module("org:baz:1.0") {
            dependency("com:example:1.0")
        }
    })
}

[Link]

dependencies {
    implementation module("org:foo:1.0") {
        dependency "org:bar:1.0"
        module("org:baz:1.0") {
            dependency "com:example:1.0"
        }
    }
}

This can be replaced with the following component metadata rule:

build-logic/src/main/kotlin/[Link]

@CacheableRule
abstract class AddDependenciesRule @Inject constructor(val dependencies: List<String>) : ComponentMetadataRule {
    override fun execute(context: ComponentMetadataContext) {
        listOf("compile", "runtime").forEach { base ->
            [Link](base) {
                withDependencies {
                    [Link] {
                        add(it)
                    }
                }
            }
        }
    }
}

[Link]

dependencies {
    components {
        withModule<AddDependenciesRule>("org:foo") {
            params(listOf(
                "org:bar:1.0",
                "org:baz:1.0"
            ))
        }
        withModule<AddDependenciesRule>("org:baz") {
            params(listOf("com:example:1.0"))
        }
    }

    implementation("org:foo:1.0")
}

build-logic/src/main/groovy/[Link]

@CacheableRule
abstract class AddDependenciesRule implements ComponentMetadataRule {

    List<String> dependencies

    @Inject
    AddDependenciesRule(List<String> dependencies) {
        [Link] = dependencies
    }

    @Override
    void execute(ComponentMetadataContext context) {
        ["compile", "runtime"].each { base ->
            [Link](base) {
                withDependencies {
                    [Link] {
                        add(it)
                    }
                }
            }
        }
    }
}

[Link]

dependencies {
    components {
        withModule("org:foo", AddDependenciesRule) {
            params([
                "org:bar:1.0",
                "org:baz:1.0"
            ])
        }
        withModule("org:baz", AddDependenciesRule) {
            params(["com:example:1.0"])
        }
    }

    implementation "org:foo:1.0"
}

Earliest supported Develocity plugin version is 3.13.1

Starting in Gradle 9.0, the earliest supported Develocity plugin version is 3.13.1. The plugin versions
from 3.0 up to 3.13 will be ignored when applied.

Upgrade to version 3.13.1 or later of the Develocity plugin. You can find the latest available version
on the Gradle Plugin Portal. More information on the compatibility can be found here.
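As a sketch, the plugin of that generation is applied in the settings file; in the 3.13.x line the plugin id was com.gradle.enterprise (newer releases use com.gradle.develocity), so verify both the id and the latest version on the Plugin Portal before copying this:

```kotlin
// settings.gradle.kts
plugins {
    // 3.13.1 is the earliest version that Gradle 9.0 will still honor
    id("com.gradle.enterprise") version "3.13.1"
}
```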

Upgrading from 8.1 and earlier

Potential breaking changes

Upgrade to Kotlin 1.8.20

The embedded Kotlin has been updated to Kotlin 1.8.20. For more information, see What’s new in
Kotlin 1.8.20.

Note that there is a known issue with Kotlin compilation avoidance that can cause OutOfMemory
exceptions in compileKotlin tasks if the compilation classpath contains very large JAR files. This
applies to builds applying the Kotlin plugin v1.8.20 or the kotlin-dsl plugin.

You can work around it by disabling Kotlin compilation avoidance in your [Link] file:

[Link]=false

See KT-57757 for more information.

Upgrade to Groovy 3.0.17

Groovy has been updated to Groovy 3.0.17.


Since the previous version was 3.0.15, the 3.0.16 changes are also included.

Upgrade to Ant 1.10.13

Ant has been updated to Ant 1.10.13.

Since the previous version was 1.10.11, the 1.10.12 changes are also included.

Upgrade to CodeNarc 3.2.0

The default version of CodeNarc has been updated to CodeNarc 3.2.0.

Upgrade to PMD 6.55.0

PMD has been updated to PMD 6.55.0.

Since the previous version was 6.48.0, all changes since then are included.

Upgrade to JaCoCo 0.8.9

JaCoCo has been updated to 0.8.9.

Plugin compatibility changes

A plugin compiled with Gradle >= 8.2 that makes use of the Kotlin DSL functions [Link]<T>(),
[Link](KClass) or [Link]<T> {} cannot run on Gradle <= 6.1.

Deferred or avoided configuration of some tasks

When performing dependency resolution, Gradle creates an internal representation of the
available Configurations. This requires inspecting all configurations and artifacts. Processing
artifacts created by tasks causes those tasks to be realized and configured.

This internal representation is now created more lazily, which can change the order in which tasks
are configured. Some tasks may never be configured.

This change may cause code paths that relied on a particular order to no longer function, such as
conditionally adding attributes to a configuration based on the presence of certain attributes.

This impacted the bnd plugin and JUnit5 build.

We recommend not modifying domain objects (configurations, source sets, tasks, etc) from
configuration blocks for other domain objects that may not be configured.

For example, avoid doing something like this:

configurations {
    val myConfig = create("myConfig")
}

[Link]("myTask") {
    // This is not safe, as the execution of this block may not occur, or may
    // not occur in the order expected
    configurations["myConfig"].attributes {
        attribute(Usage.USAGE_ATTRIBUTE, [Link](Usage::[Link], Usage.JAVA_RUNTIME))
    }
}
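A safer pattern, sketched below under the assumption that the attributes are known when the configuration is created, is to configure a domain object's own state at creation time rather than from another object's configuration block:

```kotlin
// Configure the attributes directly where the configuration is created,
// so no task-configuration block needs to mutate it later
configurations {
    create("myConfig") {
        attributes {
            attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage::class.java, Usage.JAVA_RUNTIME))
        }
    }
}
```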

Deprecations

CompileOptions method deprecations

The following methods on CompileOptions are deprecated:

• getAnnotationProcessorGeneratedSourcesDirectory()

• setAnnotationProcessorGeneratedSourcesDirectory(File)

• setAnnotationProcessorGeneratedSourcesDirectory(Provider<File>)

Current usages of these methods should migrate to the DirectoryProperty
getGeneratedSourceOutputDirectory()
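As a migration sketch, the replacement DirectoryProperty can be wired lazily on compile tasks; the output location shown is an arbitrary example, not a required path:

```kotlin
tasks.withType<JavaCompile>().configureEach {
    // DirectoryProperty replacement for the deprecated File-based setters
    options.generatedSourceOutputDirectory.set(
        layout.buildDirectory.dir("generated/sources/annotationProcessor")
    )
}
```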

Using configurations incorrectly

Gradle will now warn at runtime when methods of Configuration are called inconsistently with the
configuration’s intended usage.

This change is part of a larger ongoing effort to make the intended behavior of configurations more
consistent and predictable and to unlock further speed and memory improvements.

Currently, the following methods should only be called with these listed allowed usages:

• resolve() - RESOLVABLE configurations only

• files(Closure), files(Spec), files(Dependency…), fileCollection(Spec), fileCollection(Closure),
fileCollection(Dependency…) - RESOLVABLE configurations only

• getResolvedConfigurations() - RESOLVABLE configurations only

• defaultDependencies(Action) - DECLARABLE configurations only

• shouldResolveConsistentlyWith(Configuration) - RESOLVABLE configurations only

• disableConsistentResolution() - RESOLVABLE configurations only

• getDependencyConstraints() - DECLARABLE configurations only

• copy(), copy(Spec), copy(Closure), copyRecursive(), copyRecursive(Spec), copyRecursive(Closure) -
RESOLVABLE configurations only

Intended usage is noted in the Configuration interface’s Javadoc. This list is likely to grow in future
releases.

Starting in Gradle 9.0, using a configuration inconsistently with its intended usage will be
prohibited.
Also note that although it is not currently restricted, the getDependencies() method is only intended
for use with DECLARABLE configurations. The getAllDependencies() method, which retrieves all
declared dependencies on a configuration and any superconfigurations, will not be restricted to
any particular usage.
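Build logic that may receive arbitrary configurations can guard resolution-only calls by checking the declared usage first; a minimal sketch (the configuration name is just an example):

```kotlin
val conf = configurations.getByName("runtimeClasspath")
if (conf.isCanBeResolved) {
    // resolve() is only allowed on RESOLVABLE configurations
    val files = conf.resolve()
    logger.lifecycle("Resolved ${files.size} files")
}
```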

Deprecated access to plugin conventions

The concept of conventions is outdated and superseded by extensions to provide custom DSLs.

To reflect this in the Gradle API, the following elements are deprecated:

• [Link]()

• [Link]

• [Link]

Gradle Core plugins still register their conventions in addition to their extensions for backwards
compatibility.

It is deprecated to access any of these conventions and their properties. Doing so will now emit a
deprecation warning. This will become an error in Gradle 9.0. You should prefer accessing the
extensions and their properties instead.

For specific examples, see the next sections.

Prominent community plugins have already migrated to using extensions to provide custom DSLs. Some
of them still register conventions for backward compatibility. Registering conventions does not yet
emit a deprecation warning, to provide a migration window; future Gradle versions will.

Also note that plugins compiled with Gradle <= 8.1 that make use of the Kotlin DSL functions
[Link]<T>(), [Link](KClass) or [Link]<T> {} will emit a deprecation warning
when run on Gradle >= 8.2. To fix this, these plugins should be recompiled with Gradle >= 8.2 or
changed to access extensions directly using [Link]<T>() instead.

Deprecated base plugin conventions

The convention properties contributed by the base plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.

The conventions are replaced by the base { } configuration block backed by BasePluginExtension.
The old convention object defines the distsDirName, libsDirName, and archivesBaseName properties
with simple getter and setter methods. Those methods are available in the extension only to
maintain backward compatibility. Build scripts should solely use the properties of type Property:

[Link]

plugins {
    base
}

base {
    [Link]("gradle")
    [Link]([Link]("custom-dist"))
    [Link]([Link]("custom-libs"))
}

[Link]

plugins {
    id 'base'
}

base {
    archivesName = "gradle"
    distsDirectory = [Link]('custom-dist')
    libsDirectory = [Link]('custom-libs')
}

Deprecated application plugin conventions

The convention properties the application plugin contributed have been deprecated and scheduled
for removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.

The following code will now emit deprecation warnings:

[Link]

plugins {
    application
}

applicationDefaultJvmArgs = listOf("-[Link]=en") // Accessing a convention

[Link]

plugins {
    id 'application'
}

applicationDefaultJvmArgs = ['-[Link]=en'] // Accessing a convention
This should be changed to use the application { } configuration block, backed by JavaApplication,
instead:

[Link]

plugins {
    application
}

application {
    applicationDefaultJvmArgs = listOf("-[Link]=en")
}

[Link]

plugins {
    id 'application'
}

application {
    applicationDefaultJvmArgs = ['-[Link]=en']
}

Deprecated java plugin conventions

The convention properties the java plugin contributed have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.

The following code will now emit deprecation warnings:

[Link]

plugins {
    id("java")
}

configure<JavaPluginConvention> { // Accessing a convention
    sourceCompatibility = JavaVersion.VERSION_18
}

[Link]

plugins {
    id 'java'
}

sourceCompatibility = 18 // Accessing a convention

This should be changed to use the java { } configuration block, backed by JavaPluginExtension,
instead:

[Link]

plugins {
    id("java")
}

java {
    sourceCompatibility = JavaVersion.VERSION_18
}

[Link]

plugins {
    id 'java'
}

java {
    sourceCompatibility = JavaVersion.VERSION_18
}

Deprecated war plugin conventions

The convention properties contributed by the war plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.

The following code will now emit deprecation warnings:

[Link]

plugins {
    id("war")
}

configure<WarPluginConvention> { // Accessing a convention
    webAppDirName = "src/main/webapp"
}

[Link]

plugins {
    id 'war'
}

webAppDirName = 'src/main/webapp' // Accessing a convention

Clients should configure the war task directly. Also, [Link]([Link]).configureEach(…) can
be used to configure each task of type War.

[Link]

plugins {
    id("war")
}

[Link] {
    [Link](file("src/main/webapp"))
}

[Link]

plugins {
    id 'war'
}

war {
    webAppDirectory = file('src/main/webapp')
}
Deprecated ear plugin conventions

The convention properties contributed by the ear plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.

The following code will now emit deprecation warnings:

[Link]

plugins {
    id("ear")
}

configure<EarPluginConvention> { // Accessing a convention
    appDirName = "src/main/app"
}

[Link]

plugins {
    id 'ear'
}

appDirName = 'src/main/app' // Accessing a convention

Clients should configure the ear task directly. Also, [Link]([Link]).configureEach(…) can
be used to configure each task of type Ear.

[Link]

plugins {
    id("ear")
}

[Link] {
    [Link](file("src/main/app"))
}

[Link]

plugins {
    id 'ear'
}

ear {
    appDirectory = file('src/main/app') // use application metadata found in this folder
}

Deprecated project-report plugin conventions

The convention properties contributed by the project-reports plugin have been deprecated and
scheduled for removal in Gradle 9.0. For more context, see the section about plugin convention
deprecation.

The following code will now emit deprecation warnings:

[Link]

plugins {
    `project-report`
}

configure<ProjectReportsPluginConvention> {
    projectReportDirName = "custom" // Accessing a convention
}

[Link]

plugins {
    id 'project-report'
}

projectReportDirName = "custom" // Accessing a convention

Configure your report task instead:

[Link]

plugins {
    `project-report`
}

[Link]<HtmlDependencyReportTask>() {
    [Link]([Link]("reports/custom"))
}

[Link]

plugins {
    id 'project-report'
}

[Link](HtmlDependencyReportTask) {
    projectReportDirectory = [Link]("reports/custom")
}

Configuration method deprecations

The following method on Configuration is deprecated for removal:

• getAll()

Obtain the set of all configurations from the project’s configurations container instead.
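For example, instead of calling getAll() on a single configuration, iterate the project's container directly; a minimal sketch:

```kotlin
// Replacement for someConfiguration.all (the deprecated getAll()):
// ask the project's configurations container for every configuration
configurations.forEach { c ->
    logger.lifecycle("Configuration: ${c.name}")
}
```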

Relying on automatic test framework implementation dependencies

In some cases, Gradle will load JVM test framework dependencies from the Gradle distribution to
execute tests. This existing behavior can lead to test framework dependency version conflicts on
the test classpath. To avoid these conflicts, this behavior is deprecated and will be removed in
Gradle 9.0. Tests using TestNG are unaffected.

To prepare for this change in behavior, either declare the required dependencies explicitly or
migrate to Test Suites, where these dependencies are managed automatically.

Test Suites

Builds that use test suites will not be affected by this change. Test suites manage the test framework
dependencies automatically and do not require dependencies to be explicitly declared. See the user
manual for further information on migrating to test suites.

Manually declaring dependencies

In the absence of test suites, dependencies must be manually declared on the test runtime
classpath:

• If using JUnit 5, an explicit runtimeOnly dependency on junit-platform-launcher is required in
addition to the existing implementation dependency on the test engine.

• If using JUnit 4, only the existing implementation dependency on junit 4 is required.

• If using JUnit 3, a test runtimeOnly dependency on junit 4 is required in addition to a
compileOnly dependency on junit 3.

[Link]

dependencies {
    // If using JUnit Jupiter
    testImplementation("[Link]:junit-jupiter:5.9.2")
    testRuntimeOnly("[Link]:junit-platform-launcher")

    // If using JUnit Vintage
    testCompileOnly("junit:junit:4.13.2")
    testRuntimeOnly("[Link]:junit-vintage-engine:5.9.2")
    testRuntimeOnly("[Link]:junit-platform-launcher")

    // If using JUnit 4
    testImplementation("junit:junit:4.13.2")

    // If using JUnit 3
    testCompileOnly("junit:junit:3.8.2")
    testRuntimeOnly("junit:junit:4.13.2")
}

[Link]

dependencies {
    // If using JUnit Jupiter
    testImplementation '[Link]:junit-jupiter:5.9.2'
    testRuntimeOnly '[Link]:junit-platform-launcher'

    // If using JUnit Vintage
    testCompileOnly 'junit:junit:4.13.2'
    testRuntimeOnly '[Link]:junit-vintage-engine:5.9.2'
    testRuntimeOnly '[Link]:junit-platform-launcher'

    // If using JUnit 4
    testImplementation 'junit:junit:4.13.2'

    // If using JUnit 3
    testCompileOnly 'junit:junit:3.8.2'
    testRuntimeOnly 'junit:junit:4.13.2'
}
BuildIdentifier and ProjectComponentSelector method deprecations

The following methods on BuildIdentifier are deprecated:

• getName()

• isCurrentBuild()

You could use these methods to distinguish between different project components with the same
name but from different builds. However, for certain composite build setups, these methods do not
provide enough information to guarantee uniqueness.

Current usages of these methods should migrate to [Link]().

Similarly, the method [Link]() is deprecated. Use
[Link]() instead.

Upgrading from 8.0 and earlier

[Link] files are created in global cache directories

Gradle now emits a [Link] file in some global cache directories, as specified in Cache
marking.

This may cause these directories to no longer be searched or backed up by some tools. To disable it,
use the following code in an init script in the Gradle User Home:

[Link]

beforeSettings {
    caches {
        // Disable cache marking for all caches
        [Link]([Link])
    }
}

[Link]

beforeSettings { settings ->
    [Link] {
        // Disable cache marking for all caches
        markingStrategy = [Link]
    }
}
Configuration cache options renamed

In this release, the configuration cache feature was promoted from incubating to stable. As such, all
properties originally mentioned in the feature documentation (which had an unsafe part in their
names, e.g., [Link]-cache) were renamed, in some cases, by removing the
unsafe part of the name.

Incubating property                                    Finalized property

org.gradle.unsafe.configuration-cache                  org.gradle.configuration-cache
org.gradle.unsafe.configuration-cache-problems         org.gradle.configuration-cache.problems
org.gradle.unsafe.configuration-cache.max-problems     org.gradle.configuration-cache.max-problems

Note that the original org.gradle.unsafe.configuration-cache… properties continue to be honored
in this release, and no warnings will be produced if they are used, but they will be deprecated and
removed in a future release.
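For example, enabling the feature with the finalized property names in gradle.properties looks like this (property names as documented for Gradle 8.1; verify them against your Gradle version):

```
org.gradle.configuration-cache=true
org.gradle.configuration-cache.problems=warn
```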

Potential breaking changes

Kotlin DSL scripts emit compilation warnings

Compilation warnings from Kotlin DSL scripts are printed to the console output. For example, the
use of deprecated APIs in Kotlin DSL will emit warnings each time the script is compiled.

This is a potentially breaking change if you are consuming the console output of Gradle builds.

Configuring Kotlin compiler options with the kotlin-dsl plugin applied

If you are configuring custom Kotlin compiler options on a project with the kotlin-dsl plugin
applied you might encounter a breaking change.

In previous Gradle versions, the kotlin-dsl plugin was adding required compiler arguments on
afterEvaluate {}. Now that the Kotlin Gradle Plugin provides lazy configuration properties, our
kotlin-dsl plugin switched to adding required compiler arguments to the lazy properties directly.
As a consequence, if you were setting freeCompilerArgs the kotlin-dsl plugin is now failing the
build because its required compiler arguments are overridden by your configuration.

[Link]

plugins {
    `kotlin-dsl`
}

[Link](KotlinCompile::class).configureEach {
    kotlinOptions { // Deprecated non-lazy configuration options
        freeCompilerArgs = listOf("-Xcontext-receivers")
    }
}
With the configuration above you would get the following build failure:

* What went wrong:
Execution failed for task ':compileKotlin'.
> Kotlin compiler arguments of task ':compileKotlin' do not work for the `kotlin-dsl`
  plugin. The 'freeCompilerArgs' property has been reassigned. It must instead be
  appended to. Please use '[Link](\"your\", \"args\")' to fix this.

You must change this to add your custom compiler arguments to the lazy configuration
properties of the Kotlin Gradle Plugin so that they are appended to the ones required by the
kotlin-dsl plugin:

[Link]

plugins {
    `kotlin-dsl`
}

[Link](KotlinCompile::class).configureEach {
    compilerOptions { // New lazy configuration options
        [Link]("-Xcontext-receivers")
    }
}

If you were already adding to freeCompilerArgs instead of setting its value, you should not
experience a build failure.

New API introduced may clash with existing Gradle DSL code

When a new property or method is added to an existing type in the Gradle DSL, it may clash with
names already used in user code.

When a name clash occurs, one solution is to rename the element in user code.

This is a non-exhaustive list of API additions in 8.1 that may cause name collisions with existing
user code.

• [Link]()

• [Link]()

Using unsupported API to start external processes at configuration time is no longer allowed with the
configuration cache enabled

Since Gradle 7.5, using [Link], [Link], and standard Java and Groovy APIs to run
external processes at configuration time has been considered an error only if the feature preview
STABLE_CONFIGURATION_CACHE was enabled. With the configuration cache promotion to a stable
feature in Gradle 8.1, this error is detected regardless of the feature preview status. The
configuration cache chapter has more details to help with the migration to the new provider-based
APIs to execute external processes at configuration time.

Builds that do not use the configuration cache, or only start external processes at execution time
are not affected by this change.

Deprecations

Mutating core plugin configuration usage

The allowed usage of a configuration should be immutable after creation. Mutating the allowed
usage on a configuration created by a Gradle core plugin is deprecated. This includes calling any of
the following Configuration methods:

• setCanBeConsumed(boolean)

• setCanBeResolved(boolean)

These methods now emit deprecation warnings on these configurations, except for certain special
cases which make allowances for the existing behavior of popular plugins. This rule does not yet
apply to detached configurations or configurations created in buildscripts and third-party plugins.
Calling setCanBeConsumed(false) on apiElements or runtimeElements is not yet deprecated in order to
avoid warnings that would be otherwise emitted when using select popular third-party plugins.

This change is part of a larger ongoing effort to make the intended behavior of configurations more
consistent and predictable, and to unlock further speed and memory improvements in this area of
Gradle.

The ability to change the allowed usage of a configuration after creation will be removed in Gradle
9.0.

Reserved configuration names

Configuration names "detachedConfiguration" and "detachedConfigurationX" (where X is any
integer) are reserved for internal use when creating detached configurations.

The ability to create non-detached configurations with these names will be removed in Gradle 9.0.

Calling select methods on the JavaPluginExtension without the java component present

Starting in Gradle 8.1, calling any of the following methods on JavaPluginExtension without the
presence of the default java component is deprecated:

• withJavadocJar()

• withSourcesJar()

• consistentResolution(Action)

This java component is added by the JavaPlugin, which is applied by any of the Gradle JVM plugins
including:
• java-library

• application

• groovy

• scala

Starting in Gradle 9.0, calling any of the above listed methods without the presence of the default
java component will become an error.

WarPlugin#configureConfiguration(ConfigurationContainer)

Starting in Gradle 8.1, calling WarPlugin#configureConfiguration(ConfigurationContainer) is
deprecated. This method was intended for internal use and was never intended to be used as part
of the public interface.

Starting in Gradle 9.0, this method will be removed without replacement.

Relying on conventions for custom Test tasks

By default, when applying the java plugin, the testClassesDirs and classpath of all Test tasks have
the same convention. Unless otherwise changed, the default behavior is to execute the tests from
the default test TestSuite by configuring the task with the classpath and testClassesDirs from the
test suite. This behavior will be removed in Gradle 9.0.

While this existing default behavior is correct for the use case of executing the default unit test
suite under a different environment, it does not support the use case of executing an entirely
separate set of tests.

If you wish to continue including these tests, use the following code to avoid the deprecation
warning in 8.1 and prepare for the behavior change in 9.0. Alternatively, consider migrating to test
suites.

[Link]

val test by [Link](JvmTestSuite::class)

[Link]<Test>("myTestTask") {
    testClassesDirs = files([Link] { [Link] })
    classpath = files([Link] { [Link] })
}

[Link]

[Link] {
    testClassesDirs = [Link]
    classpath = [Link]
}
Modifying Gradle Module Metadata after a publication has been populated

Altering the GMM (e.g., changing a component's configuration variants) after a Maven or Ivy
publication has been populated from their components is now deprecated. This feature will be
removed in Gradle 9.0.

Eager population of the publication can happen if the following methods are called:

• Maven

◦ [Link]()

• Ivy

◦ [Link]()

◦ [Link]()

◦ [Link](Action)

Previously, the following code did not generate warnings, but it created inconsistencies between
published artifacts:

[Link]

publishing {
    publications {
        create<MavenPublication>("maven") {
            from(components["java"])
        }
        create<IvyPublication>("ivy") {
            from(components["java"])
        }
    }
}

// These calls eagerly populate the Maven and Ivy publications
([Link]["maven"] as MavenPublication).artifacts
([Link]["ivy"] as IvyPublication).artifacts

val javaComponent = components["java"] as AdhocComponentWithVariants

[Link](configurations["apiElements"]) { skip() }
[Link](configurations["runtimeElements"]) { skip() }

build.gradle

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
        ivy(IvyPublication) {
            from components.java
        }
    }
}

// These calls eagerly populate the Maven and Ivy publications

publishing.publications.maven.artifacts
publishing.publications.ivy.artifacts

components.java.withVariantsFromConfiguration(configurations.apiElements) { skip() }
components.java.withVariantsFromConfiguration(configurations.runtimeElements) { skip() }

In this example, the Maven and Ivy publications will contain the main JAR artifacts for the project,
whereas the GMM module file will omit them.

Running tests on JVM versions 6 and 7

Running JVM tests on JVM versions older than 8 is deprecated. Testing on these versions will
become an error in Gradle 9.0.

Applying Kotlin DSL precompiled scripts published with Gradle < 6.0

Applying Kotlin DSL precompiled scripts published with Gradle < 6.0 is deprecated. Please use a
version of the plugin published with Gradle >= 6.0.

Applying the kotlin-dsl together with Kotlin Gradle Plugin < 1.8.0

Applying the kotlin-dsl together with Kotlin Gradle Plugin < 1.8.0 is deprecated. Please let Gradle
control the version of kotlin-dsl by removing any explicit kotlin-dsl version constraints from your
build logic. This will let the kotlin-dsl plugin decide which version of the Kotlin Gradle Plugin to
use. If you explicitly declare which version of the Kotlin Gradle Plugin to use for your build logic,
update it to >= 1.8.0.
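For build logic in buildSrc or an included build, this means declaring the plugin without an explicit version; a minimal sketch:

```kotlin
// buildSrc/build.gradle.kts — the kotlin-dsl plugin is declared without a
// version, so Gradle itself supplies the matching Kotlin Gradle Plugin
plugins {
    `kotlin-dsl`
}

repositories {
    gradlePluginPortal()
}
```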

Accessing libraries or bundles from dependency version catalogs in the plugins {} block of a Kotlin script

Accessing libraries or bundles from dependency version catalogs in the plugins {} block of a Kotlin
script is deprecated. Please only use versions or plugins from dependency version catalogs in the
plugins {} block.
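As a sketch, assuming hypothetical catalog entries named myPlugin and myLibrary, only the plugin alias belongs in the plugins {} block:

```kotlin
// build.gradle.kts — catalog accessor names below are hypothetical
plugins {
    alias(libs.plugins.myPlugin)   // OK: a plugin alias from the catalog
}

dependencies {
    implementation(libs.myLibrary) // libraries belong here, not in plugins {}
}
```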
Using ValidatePlugins task without a Java Toolchain

Using a task of type ValidatePlugins without applying the Java Toolchains plugin is deprecated, and
will become an error in Gradle 9.0.

To avoid this warning, please apply the plugin to your project:

build.gradle.kts

plugins {
    id("jvm-toolchains")
}

build.gradle

plugins {
    id 'jvm-toolchains'
}

The Java Toolchains plugin is applied automatically by the Java library plugin and other JVM
plugins, so applying any of them to your project will also fix the warning.

Deprecated members of the org.gradle.util package now report their deprecation

These members will be removed in Gradle 9.0.

• WrapUtil.toDomainObjectSet(…)

• GUtil.toCamelCase(…)

• GUtil.toLowerCamelCase(…)

• ConfigureUtil

Deprecated JVM vendor IBM Semeru

The enum constant JvmVendorSpec.IBM_SEMERU is now deprecated and will be removed in Gradle 9.0.

Please replace it by its equivalent JvmVendorSpec.IBM to avoid warnings and potential errors in the
next major version release.

Setting custom build layout on StartParameter and GradleBuild

Following the related previous deprecation of the behaviour in Gradle 7.1, it is now also deprecated
to use related StartParameter and GradleBuild properties. These properties will be removed in
Gradle 9.0.

Setting custom build file using buildFile property in GradleBuild task has been deprecated.
Please use the dir property instead to specify the root of the nested build. Alternatively, consider
using one of the recommended alternatives for GradleBuild task.

Setting a custom build layout using the StartParameter methods setBuildFile(File) and
setSettingsFile(File), as well as the counterpart getters getBuildFile() and getSettingsFile(), has
been deprecated.

Please use standard locations for settings and build files:

• settings file in the root of the build

• build file in the root of each subproject

Deprecated org.gradle.cache.cleanup property

The org.gradle.cache.cleanup property in gradle.properties under Gradle User Home has been
deprecated. Please use the cache cleanup DSL instead to disable or modify the cleanup
configuration.

Since the org.gradle.cache.cleanup property may still be needed for older versions of Gradle, this
property may still be present and no deprecation warnings will be printed as long as it is also
configured via the DSL. The DSL value will always take preference over the
org.gradle.cache.cleanup property. If the desired configuration is to disable cleanup for older
versions of Gradle (using org.gradle.cache.cleanup), but to enable cleanup with the default values
for Gradle versions at or above Gradle 8, then cleanup should be configured to use
Cleanup.DEFAULT:

init.gradle

if (GradleVersion.current() >= GradleVersion.version('8.0')) {
    apply from: "gradle8/cache-settings.gradle"
}

init.gradle.kts

if (GradleVersion.current() >= GradleVersion.version("8.0")) {
    apply(from = "gradle8/cache-settings.gradle.kts")
}

gradle8/cache-settings.gradle

beforeSettings { settings ->
    settings.caches {
        cleanup = Cleanup.DEFAULT
    }
}

gradle8/cache-settings.gradle.kts

beforeSettings {
    caches {
        cleanup.set(Cleanup.DEFAULT)
    }
}

Deprecated using relative paths to specify Java executables

Using relative file paths to point to Java executables is now deprecated and will become an error in
Gradle 9. This is done to reduce confusion about what such relative paths should resolve against.
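A sketch of the distinction, assuming a JDK installed under /usr/lib/jvm/temurin-17 (a hypothetical location):

```kotlin
// build.gradle.kts — use an absolute path; a relative path such as
// "jdk/bin/java" is deprecated because its base directory is ambiguous
tasks.withType<JavaExec>().configureEach {
    executable = "/usr/lib/jvm/temurin-17/bin/java" // hypothetical install location
}
```

Where possible, Java toolchains are a more portable way to select the JVM a task runs with.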

Calling [Link](), [Link]() from a task action

Calling [Link](), [Link]() from a task action at execution time is now deprecated and will be made
an error in Gradle 9.0.

See the configuration cache chapter for details on how to migrate these usages to APIs that are
supported by the configuration cache.

Deprecated running test task successfully when no test executed

Running the Test task successfully when no test was executed is now deprecated and will become
an error in Gradle 9. Note that it is not an error when no test sources are present; in that case, the
test task is simply skipped. It is only an error when test sources are present but no test was
selected for execution. This change avoids accidental successful test runs caused by erroneous
configuration.

Changes in the IDE integration

Workaround for false positive errors shown in Kotlin DSL plugins {} block using version catalog is not
needed anymore

Version catalog accessors for plugin aliases in the plugins {} block aren’t shown as errors in IntelliJ
IDEA and Android Studio Kotlin script editor anymore.

If you were using the @Suppress("DSL_SCOPE_VIOLATION") annotation as a workaround, you can now
remove it.

If you were using the Gradle Libs Error Suppressor IntelliJ IDEA plugin, you can now uninstall it.

After upgrading Gradle to 8.1 you will need to clear the IDE caches and restart.
Also see the deprecated usages of version catalogs in the Kotlin DSL plugins {} block above.
RUNNING GRADLE BUILDS
CORE CONCEPTS
Gradle Basics
Gradle automates building, testing, and deployment of software from information in build
scripts.

Gradle core concepts

Projects

A Gradle project is a piece of software that can be built, such as an application or a library.

Single project builds include a single project called the root project.

Multi-project builds include one root project and any number of subprojects.

Build Scripts

Build scripts detail to Gradle what steps to take to build the project.

Each project can include one or more build scripts.

Dependency Management

Dependency management is an automated technique for declaring and resolving external
resources required by a project.

Each project typically includes a number of external dependencies that Gradle will resolve during
the build.
Tasks

Tasks are a basic unit of work, such as compiling code or running your tests.

Each project contains one or more tasks defined inside a build script or a plugin.
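A minimal task defined directly in a build script might look like this:

```kotlin
// build.gradle.kts — registering a simple task with one action
tasks.register("hello") {
    doLast {
        println("Hello from a task!")
    }
}
```

It can then be executed with gradle hello (or ./gradlew hello).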

Plugins

Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.

Gradle project structure

Many developers will interact with Gradle for the first time through an existing project.

The presence of the gradlew and gradlew.bat files in the root directory of a project is a clear
indicator that Gradle is used.

A Gradle project will look similar to the following:

project
├── gradle ①
│   ├── libs.versions.toml ②
│   └── wrapper
│       ├── gradle-wrapper.jar
│       └── gradle-wrapper.properties
├── gradlew ③
├── gradlew.bat ③
├── settings.gradle(.kts) ④
├── subproject-a
│   ├── build.gradle(.kts) ⑤
│   └── src ⑥
└── subproject-b
    ├── build.gradle(.kts) ⑤
    └── src ⑥

① Gradle directory to store wrapper files and more

② Gradle version catalog for dependency management

③ Gradle wrapper scripts

④ Gradle settings file to define a root project name and subprojects

⑤ Gradle build scripts of the two subprojects - subproject-a and subproject-b

⑥ Source code and/or additional files for the projects

Invoking Gradle

IDE

Gradle is built-in to many IDEs including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
Gradle can be automatically invoked when you build, clean, or run your app in the IDE.

It is recommended that you consult the manual for the IDE of your choice to learn more about how
Gradle can be used and configured.

Command line

Gradle can be invoked in the command line once installed. For example:

$ gradle build

NOTE Most projects do not use the installed version of Gradle.

Gradle Wrapper

The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to
execute a Gradle build. It is found in the project root directory as a gradlew or gradlew.bat file:

$ gradlew build      // Linux or OSX

$ gradlew.bat build  // Windows

Next Step: Learn about the Gradle Wrapper >>

Gradle Wrapper Basics


The recommended way to execute any Gradle build is with the Gradle Wrapper.
The Wrapper script invokes a declared version of Gradle, downloading it beforehand if necessary.

The Wrapper is available as a gradlew or gradlew.bat file.

The Wrapper provides the following benefits:

• Standardizes a project on a given Gradle version.

• Provisions the same Gradle version for different users.

• Provisions the Gradle version for different execution environments (IDEs, CI servers…).

Using the Gradle Wrapper

It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and
standardized execution of the build.

Depending on the operating system, you run gradlew or gradlew.bat instead of the gradle command.

Typical Gradle invocation:

$ gradle build

To run the Wrapper on a Linux or OSX machine:

$ ./gradlew build

To run the Wrapper on Windows PowerShell:

$ .\gradlew.bat build

The command is run in the same directory that the Wrapper is located in. If you want to run the
command in a different directory, you must provide the relative path to the Wrapper:
$ ../gradlew build

The following console output demonstrates the use of the Wrapper on a Windows machine, in the
command prompt (cmd), for a Java-based project:

$ gradlew.bat build

Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip to C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle

BUILD SUCCESSFUL in 12s


1 actionable task: 1 executed

Understanding the Wrapper files

The following files are part of the Gradle Wrapper:

.
├── gradle
│   └── wrapper
│       ├── gradle-wrapper.jar ①
│       └── gradle-wrapper.properties ②
├── gradlew ③
└── gradlew.bat ④

① gradle-wrapper.jar: This is a small JAR file that contains the Gradle Wrapper code. It is
responsible for downloading and installing the correct version of Gradle for a project if it’s not
already installed.

② gradle-wrapper.properties: This file contains configuration properties for the Gradle Wrapper,
such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or
TARBALL).

③ gradlew: This is a shell script (Unix-based systems) that acts as a wrapper around
gradle-wrapper.jar. It is used to execute Gradle tasks on Unix-based systems without needing to
manually install Gradle.

④ gradlew.bat: This is a batch script (Windows) that serves the same purpose as gradlew but is used
on Windows systems.
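For illustration, a typical gradle-wrapper.properties file looks roughly like this; the Gradle version in the distribution URL varies per project:

```properties
# gradle/wrapper/gradle-wrapper.properties — values are illustrative
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.12.1-bin.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
```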

IMPORTANT You should never alter these files.


If you want to view or update the Gradle version of your project, use the command line. Do not edit
the wrapper files manually:

$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2

$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2

Consult the Gradle Wrapper reference to learn more.

Next Step: Learn about the Gradle CLI >>

Command-Line Interface Basics


The command-line interface is the primary method of interacting with Gradle outside the IDE.

Use of the Gradle Wrapper is highly encouraged.

Substitute ./gradlew (in macOS / Linux) or gradlew.bat (in Windows) for gradle in the following
examples.

Executing Gradle on the command line conforms to the following structure:

gradle [taskName...] [--option-name...]

Options are allowed before and after task names.


gradle [--option-name...] [taskName...]

If multiple tasks are specified, you should separate them with a space.

gradle [taskName1 taskName2...] [--option-name...]

Options that accept values can be specified with or without = between the option and argument.
The use of = is recommended.

gradle [...] --console=plain

Options that enable behavior have long-form options with inverses specified with --no-. The
following are opposites.

gradle [...] --build-cache


gradle [...] --no-build-cache

Many long-form options have short-option equivalents. The following are equivalent:

gradle --help
gradle -h

Command-line usage

The following sections describe the use of the Gradle command-line interface. Some plugins also
add their own command line options.

Executing tasks

To execute a task called taskName on the root project, type:

$ gradle :taskName

This will run the single taskName and all of its dependencies.

Specify options for tasks

To pass an option to a task, prefix the option name with -- after the task name:

$ gradle taskName --exampleOption=exampleValue

Consult the Gradle Command Line Interface reference to learn more.


Next Step: Learn about the Settings file >>

Settings File Basics


The settings file is the entry point of every Gradle project.

The primary purpose of the settings file is to add subprojects to your build.

Gradle supports single and multi-project builds.

• For single-project builds, the settings file is optional.

• For multi-project builds, the settings file is mandatory and declares all subprojects.

Settings script

The settings file is a script. It is either a settings.gradle file written in Groovy or a
settings.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

The settings file is typically found in the root directory of the project.

Let’s take a look at an example and break it down:

settings.gradle.kts

rootProject.name = "root-project" ①

include("sub-project-a") ②
include("sub-project-b")
include("sub-project-c")

① Define the project name.

② Add subprojects.

settings.gradle

rootProject.name = 'root-project' ①

include('sub-project-a') ②
include('sub-project-b')
include('sub-project-c')

① Define the project name.

② Add subprojects.

1. Define the project name

The settings file defines your project name:

rootProject.name = "root-project"

There is only one root project per build.

2. Add subprojects

The settings file defines the structure of the project by including subprojects, if there are any:

include("app")
include("business-logic")
include("data-model")

Consult the Writing Settings File page to learn more.

Next Step: Learn about the Build scripts >>

Build File Basics


Generally, a build script details build configuration, tasks, and plugins.
Every Gradle build comprises at least one build script.

In the build file, two types of dependencies can be added:

1. The libraries and/or plugins on which Gradle and the build script depend.

2. The libraries on which the project sources (i.e., source code) depend.

Build scripts

The build script is either a build.gradle file written in Groovy or a build.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

Let’s take a look at an example and break it down:

build.gradle.kts

plugins {
id("application") ①
}

application {
mainClass = "com.example.Main" ②
}

① Add plugins.

② Use convention properties.


build.gradle

plugins {
id 'application' ①
}

application {
mainClass = 'com.example.Main' ②
}

① Add plugins.

② Use convention properties.

1. Add plugins

Plugins extend Gradle’s functionality and can contribute tasks to a project.

Adding a plugin to a build is called applying a plugin and makes additional functionality available.

plugins {
id("application")
}

The application plugin facilitates creating an executable JVM application.

Applying the Application plugin also implicitly applies the Java plugin. The java plugin adds Java
compilation along with testing and bundling capabilities to a project.

2. Use convention properties

A plugin adds tasks to a project. It also adds properties and methods to a project.

The application plugin defines tasks that package and distribute an application, such as the run
task.

The Application plugin provides a way to declare the main class of a Java application, which is
required to execute the code.

application {
mainClass = "com.example.Main"
}

In this example, the main class (i.e., the point where the program’s execution begins) is
com.example.Main.

Consult the Writing Build Scripts page to learn more.


Next Step: Learn about Dependency Management >>

Dependency Management Basics


Gradle has built-in support for dependency management.

Dependency management is an automated technique for declaring and resolving external
resources required by a project.

Gradle build scripts define the process to build projects that may require external dependencies.
Dependencies refer to JARs, plugins, libraries, or source code that support building your project.

Version Catalog

Version catalogs provide a way to centralize your dependency declarations in a libs.versions.toml
file.

The catalog makes sharing dependencies and version configurations between subprojects simple. It
also allows teams to enforce versions of libraries and plugins in large projects.

The version catalog typically contains four sections:

1. [versions] to declare the version numbers that plugins and libraries will reference.

2. [libraries] to define the libraries used in the build files.

3. [bundles] to define a set of dependencies.

4. [plugins] to define plugins.

[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"

[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }

[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }

The file is located in the gradle directory so that it can be used by Gradle and IDEs automatically.
The version catalog should be checked into source control: gradle/libs.versions.toml.

Declaring Your Dependencies

To add a dependency to your project, specify a dependency in the dependencies block of your
build.gradle(.kts) file.

The following build.gradle.kts file adds a plugin and two dependencies to the project using the
version catalog above:

plugins {
    alias(libs.plugins.androidApplication) ①
}

dependencies {
    // Dependency on a remote binary to compile and run the code
    implementation(libs.googleMaterial) ②

    // Dependency on a remote binary to compile and run the test code
    testImplementation(libs.mockitoCore) ③
}

① Applies the Android Gradle plugin to this project, which adds several features that are specific to
building Android apps.

② Adds the Material dependency to the project. Material Design provides components for creating
a user interface in an Android App. This library will be used to compile and run the Kotlin
source code in this project.

③ Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java
code. This library will be used to compile and run the test source code in this project.

Dependencies in Gradle are grouped by configurations.

• The material library is added to the implementation configuration, which is used for compiling
and running production code.

• The mockito-core library is added to the testImplementation configuration, which is used for
compiling and running test code.

NOTE There are many more configurations available.
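For example, the java-library plugin adds further configurations such as api. A sketch with illustrative coordinates and versions:

```kotlin
// build.gradle.kts — common dependency configurations; the coordinates
// and versions below are illustrative, not prescribed
plugins {
    `java-library`
}

dependencies {
    api("com.google.guava:guava:33.0.0-jre")                  // exposed to consumers
    implementation("org.apache.commons:commons-lang3:3.14.0") // internal detail
    compileOnly("org.projectlombok:lombok:1.18.30")           // compile time only
    runtimeOnly("org.postgresql:postgresql:42.7.1")           // runtime only
}
```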

Viewing Project Dependencies

You can view your dependency tree in the terminal using the ./gradlew :app:dependencies
command:

$ ./gradlew :app:dependencies

> Task :app:dependencies

------------------------------------------------------------
Project ':app'
------------------------------------------------------------

implementation - Implementation only dependencies for source set 'main'. (n)
\--- com.google.android.material:material:1.1.0-alpha05 (n)

testImplementation - Implementation only dependencies for source set 'test'. (n)
\--- org.mockito:mockito-core:2.16.0 (n)

...

Consult the Dependency Management chapter to learn more.

Next Step: Learn about Tasks >>

Task Basics
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
You run a Gradle build task using the gradle command or by invoking the Gradle Wrapper
(./gradlew or gradlew.bat) in your project directory:

$ ./gradlew build

Available tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in the project by running the following command in the terminal:

$ ./gradlew tasks

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

...

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
...

Other tasks
-----------
compileJava - Compiles main Java source.

...

Running tasks

The run task is executed with ./gradlew run:

$ ./gradlew run

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes

> Task :app:run


Hello World!

BUILD SUCCESSFUL in 904ms


2 actionable tasks: 2 executed

In this example Java project, the output of the run task is a Hello World statement printed on the
console.

Task dependency

Many times, a task requires another task to run first.

For example, for Gradle to execute the build task, the Java code must first be compiled. Thus, the
build task depends on the compileJava task.

This means that the compileJava task will run before the build task:

$ ./gradlew build

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

BUILD SUCCESSFUL in 764ms


7 actionable tasks: 7 executed

Build scripts can optionally define task dependencies. Gradle then automatically determines the
task execution order.
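A sketch of declaring such a dependency between two hypothetical tasks:

```kotlin
// build.gradle.kts — "report" declares a dependency on "prepare",
// so Gradle always runs "prepare" first
val prepare = tasks.register("prepare") {
    doLast { println("Preparing...") }
}

tasks.register("report") {
    dependsOn(prepare)
    doLast { println("Reporting...") }
}
```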

Consult the Task development chapter to learn more.

Next Step: Learn about Plugins >>

Plugin Basics
Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a
sophisticated dependency resolution engine. The rest of its functionality comes from plugins.

A plugin is a piece of software that provides additional functionality to the Gradle build system.

Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-
related capabilities:

The Java Library Plugin - java-library


Used to define and build Java libraries. It compiles Java source code with the compileJava task,
generates Javadoc with the javadoc task, and packages the compiled classes into a JAR file with
the jar task.
The Google Services Gradle Plugin - com.google.gms:google-services

Enables Google APIs and Firebase services in your Android application with a configuration
block called googleServices{} and a task called generateReleaseAssets.

The Gradle Bintray Plugin - com.jfrog.bintray

Allows you to publish artifacts to Bintray by configuring the plugin using the bintray{} block.

Plugin distribution

Plugins are distributed in three ways:

1. Core plugins - Gradle develops and maintains a set of Core Plugins.

2. Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.

3. Local plugins - Gradle enables users to create custom plugins using APIs.

Applying plugins

Applying a plugin to a project allows the plugin to extend the project’s capabilities.

You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a
version:

plugins {
id «plugin id» version «plugin version»
}

1. Core plugins

Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These
plugins provide essential functionality for building and managing projects.

Some examples of core plugins include:

• java: Provides support for building Java projects.

• groovy: Adds support for compiling and testing Groovy source files.

• ear: Adds support for building EAR files for enterprise applications.

Core plugins are unique in that they provide short names, such as java for the core JavaPlugin,
when applied in build scripts. They also do not require versions. To apply the java plugin to a
project:

build.gradle.kts

plugins {
id("java")
}
There are many Gradle Core Plugins users can take advantage of.

2. Community plugins

Community plugins are plugins developed by the Gradle community, rather than being part of the
core Gradle distribution. These plugins provide additional functionality that may be specific to
certain use cases or technologies.

The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java
applications.

To apply the org.springframework.boot plugin to a project:

build.gradle.kts

plugins {
    id("org.springframework.boot") version "3.1.5"
}

Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can
easily discover and use them.

3. Local plugins

Custom or local plugins are developed and used within a specific project or organization. These
plugins are not shared publicly and are tailored to the specific needs of the project or organization.

Local plugins can encapsulate common build logic, provide integrations with internal systems or
tools, or abstract complex functionality into reusable components.

Gradle provides users with the ability to develop custom plugins using APIs. To create your own
plugin, you’ll typically follow these steps:

1. Define the plugin class: create a new class that implements the Plugin<Project> interface.

// Define a 'HelloPlugin' plugin
class HelloPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Define the 'hello' task
        val helloTask = project.tasks.register("hello") {
            doLast {
                println("Hello, Gradle!")
            }
        }
    }
}

2. Build and optionally publish your plugin: generate a JAR file containing your plugin code and
optionally publish this JAR to a repository (local or remote) to be used in other projects.
// Publish the plugin
plugins {
`maven-publish`
}

publishing {
publications {
create<MavenPublication>("mavenJava") {
from(components["java"])
}
}
repositories {
mavenLocal()
}
}

3. Apply your plugin: when you want to use the plugin, include the plugin ID and version in the
plugins{} block of the build file.

// Apply the plugin


plugins {
    id("com.example.hello") version "1.0"
}

Consult the Plugin development chapter to learn more.

Next Step: Learn about Incremental Builds and Build Caching >>

Gradle Incremental Builds and Build Caching


LEARN: Incremental Builds and Build Caching with Gradle >

Gradle uses two main features to reduce build time: incremental builds and build caching.
Incremental builds

An incremental build is a build that avoids running tasks whose inputs have not changed since the
previous build. Re-executing such tasks is unnecessary if they would only reproduce the same
output.

For incremental builds to work, tasks must define their inputs and outputs. Gradle will determine
whether the inputs or outputs have changed at build time. If they have changed, Gradle will execute
the task. Otherwise, it will skip execution.
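As a sketch, a custom task can declare its inputs and outputs with annotated properties, which makes it eligible for up-to-date checks (the task and file names below are hypothetical):

```kotlin
// build.gradle.kts — a task whose input and output are declared,
// so Gradle skips it when neither has changed
abstract class CopyNotes : DefaultTask() {
    @get:InputFile
    abstract val source: RegularFileProperty

    @get:OutputFile
    abstract val destination: RegularFileProperty

    @TaskAction
    fun copy() {
        destination.get().asFile.writeText(source.get().asFile.readText())
    }
}

tasks.register<CopyNotes>("copyNotes") {
    source = layout.projectDirectory.file("notes.txt")
    destination = layout.buildDirectory.file("notes.txt")
}
```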

Incremental builds are always enabled, and the best way to see them in action is to turn on verbose
mode. With verbose mode, each task state is labeled during a build:

$ ./gradlew compileJava --console=verbose

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE


> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava UP-TO-DATE
BUILD SUCCESSFUL in 374ms
12 actionable tasks: 12 up-to-date

When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is
printed next to the task.

TIP: To permanently enable verbose mode, add org.gradle.console=verbose to your
gradle.properties file.

Build caching

Incremental Builds are a great optimization that helps avoid work already done. If a developer
continuously changes a single file, there is likely no need to rebuild all the other files in the project.

However, what happens when the same developer switches to a new branch created last week? The
files are rebuilt, even though the developer is building something that has been built before.

This is where a build cache is helpful.

The build cache stores previous build results and restores them when needed. It prevents the
redundant work and cost of executing time-consuming and expensive processes.
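Besides the --build-cache command-line flag, the cache can be enabled for every build via a property:

```properties
# gradle.properties — enable the build cache for all builds in this project
org.gradle.caching=true
```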

When the build cache has been used to repopulate the local directory, the tasks are marked as
FROM-CACHE:

$ ./gradlew compileJava --build-cache

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE


> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava FROM-CACHE
> Task :utilities:compileJava FROM-CACHE
> Task :app:compileJava FROM-CACHE

BUILD SUCCESSFUL in 364ms


12 actionable tasks: 3 from cache, 9 up-to-date

Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and
not FROM-CACHE.
The build cache allows you to share and reuse unchanged build and test outputs across teams. This
speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new
code changes.
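Sharing typically involves a remote cache node; a sketch in the settings file, assuming a hypothetical cache server URL:

```kotlin
// settings.gradle.kts — remote build cache configuration; the URL is hypothetical
buildCache {
    remote<HttpBuildCache> {
        url = uri("https://cache.example.com/cache/")
        isPush = false // often only CI is allowed to push entries
    }
}
```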

Consult the Build cache chapter to learn more.

Next Step: Learn about Build Scans >>

Build Scans
LEARN: How to Use Build Scans >

A build scan is a representation of metadata captured as you run your build.

Build Scans

Gradle captures your build metadata and sends it to the Build Scan Service. The service then
transforms the metadata into information you can analyze and share with others.
The information that scans collect can be an invaluable resource when troubleshooting,
collaborating on, or optimizing the performance of your builds.

For example, with a build scan, it’s no longer necessary to copy and paste error messages or include
all the details about your environment each time you want to ask a question on Stack Overflow,
Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.
Enable Build Scans

To enable build scans for a Gradle command, add --scan to the command line:

./gradlew build --scan

You may be prompted to agree to the terms to use Build Scans.

Visit the Build Scans page to learn more.

Next Step: Start the Tutorial >>


AUTHORING GRADLE BUILDS
CORE CONCEPTS
Gradle Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory
and the Project Root directory.

Gradle User Home directory

By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global
configuration properties, initialization scripts, caches, and log files.

It can be set with the environment variable GRADLE_USER_HOME.
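For example, on a CI agent you might point the Gradle User Home at a persistent cache volume; the path below is illustrative:

```shell
# Relocate the Gradle User Home for this shell session.
export GRADLE_USER_HOME="$HOME/ci-gradle-home"
# Subsequent Gradle invocations (e.g., ./gradlew build) will read and write
# caches, logs, and wrapper distributions under this directory.
echo "$GRADLE_USER_HOME"
```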

TIP Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.

It is roughly structured as follows:

├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle ⑤
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧

① Global cache directory (for everything that is not project-specific).

② Version-specific caches (e.g., to support incremental builds).

③ Shared caches (e.g., for artifacts of dependencies).

④ Registry and logs of the Gradle Daemon.

⑤ Global initialization scripts.

⑥ JDKs downloaded by the toolchain support.

⑦ Distributions downloaded by the Gradle Wrapper.

⑧ Global Gradle configuration properties.

Consult the Gradle Directories reference to learn more.

Project Root directory

The project root directory contains all source files from your project.

It also contains files and directories Gradle generates, such as .gradle and build, as well as the
Gradle configuration directory: gradle.

TIP gradle and .gradle directories are different.

While gradle is usually checked into source control, build and .gradle directories contain the
output of your builds, caches, and other transient files Gradle uses to support features like
incremental builds.

The anatomy of a typical project root directory looks as follows:

├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle.kts ⑦
├── subproject-one ⑧
│ └── build.gradle.kts ⑨
├── subproject-two ⑧
│ └── build.gradle.kts ⑨
└── ⋮

① Project-specific cache directory generated by Gradle.

② Version-specific caches (e.g., to support incremental builds).

③ The build directory of this project into which Gradle generates all build artifacts.

④ Contains the JAR file and configuration of the Gradle Wrapper.

⑤ Project-specific Gradle configuration properties.

⑥ Scripts for executing builds using the Gradle Wrapper.

⑦ The project’s settings file where the list of subprojects is defined.

⑧ Usually, a project is organized into one or multiple subprojects.

⑨ Each subproject has its own Gradle build script.

Consult the Gradle Directories reference to learn more.

Next Step: Learn how to structure Multi-Project Builds >>

Multi-Project Build Basics


Gradle supports multi-project builds.

While some small projects and monolithic applications may contain a single build file and source
tree, it is more common for a project to be split into smaller, interdependent modules.
The word "interdependent" is vital, as you typically want to link the many modules together
through a single build.

Gradle supports this scenario through multi-project builds. This is sometimes referred to as a
multi-module project. Gradle refers to modules as subprojects.

A multi-project build consists of one root project and one or more subprojects.

Multi-Project structure

The following represents the structure of a multi-project build that contains three subprojects.

The directory structure should look as follows:

├── .gradle
│ └── ⋮
├── gradle
│ ├── libs.versions.toml
│ └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts ①
├── sub-project-1
│ └── build.gradle.kts ②
├── sub-project-2
│ └── build.gradle.kts ②
└── sub-project-3
  └── build.gradle.kts ②

① The settings.gradle.kts file should include all subprojects.

② Each subproject should have its own build.gradle.kts file.


Multi-Project standards

The Gradle community has two standards for multi-project build structures:

1. Multi-Project Builds using buildSrc - where buildSrc is a subproject-like directory at the
Gradle project root containing all the build logic.

2. Composite Builds - a build that includes other builds, where build-logic is a build directory at
the Gradle project root containing reusable build logic.

1. Multi-Project Builds using buildSrc

Multi-project builds allow you to organize projects with many modules, wire dependencies between
those modules, and easily share common build logic amongst them.

For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation
could be structured as follows:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
  └── build.gradle.kts

The modules will have dependencies between them such as web-app and mobile-app depending on
lib. This means that in order for Gradle to build web-app or mobile-app, it must build lib first.

In this example, the root settings file will look as follows:

settings.gradle.kts

include("mobile-app", "web-app", "api", "lib", "documentation")

settings.gradle

include("mobile-app", "web-app", "api", "lib", "documentation")

NOTE The order in which the subprojects (modules) are included does not matter.

The buildSrc directory is automatically recognized by Gradle. It is a good place to define and
maintain shared configuration or imperative build logic, such as custom tasks or plugins.

buildSrc is automatically included in your build as a special subproject if a build.gradle(.kts) file is
found under buildSrc.

If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java
is put in the classpath of the root build script, making it available to any subproject (web-app,
mobile-app, lib, etc.) in the build.

Consult how to declare dependencies between subprojects to learn more.
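For instance, the dependency of web-app on lib described above could be declared in web-app's build script; a minimal sketch in the Kotlin DSL, assuming the java plugin's implementation configuration:

```kotlin
// web-app/build.gradle.kts
dependencies {
    // Gradle must now build :lib before it can compile :web-app
    implementation(project(":lib"))
}
```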

2. Composite Builds

Composite Builds, also referred to as included builds, are best for sharing logic between builds (not
subprojects) or isolating access to shared build logic (i.e., convention plugins).

Let’s take the previous example. The logic in buildSrc has been turned into a project that contains
plugins and can be published and worked on independently of the root project build.

The plugin is moved to its own build called build-logic with a build script and settings file:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── build-logic
│ ├── settings.gradle.kts
│ └── conventions
│   ├── build.gradle.kts
│   └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
  └── build.gradle.kts

NOTE The fact that build-logic is located in a subdirectory of the root project is irrelevant.
The folder could be located outside the root project if desired.

The root settings file includes the entire build-logic build:

settings.gradle.kts

pluginManagement {
includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")

Consult how to create composite builds with includeBuild to learn more.
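Once build-logic is included, a subproject can apply one of its convention plugins by ID. The plugin ID below (myproject.java-conventions) is a hypothetical name for illustration:

```kotlin
// mobile-app/build.gradle.kts
plugins {
    // Resolved from the included build-logic build, not from a remote repository
    id("myproject.java-conventions")
}
```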

Multi-Project path

A project path has the following pattern: it starts with an optional colon, which denotes the root
project.

The root project, :, is the only project in a path not specified by its name.

The rest of a project path is a colon-separated sequence of project names, where the next project is
a subproject of the previous project:

:sub-project-1

You can see the project paths when running gradle projects:

------------------------------------------------------------
Root project 'project'
------------------------------------------------------------
Root project 'project'
+--- Project ':sub-project-1'
\--- Project ':sub-project-2'

Project paths usually reflect the filesystem layout, but there are exceptions. Most notably for
composite builds.
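Nested paths come from nested includes; a minimal sketch of a settings file entry:

```kotlin
// settings.gradle.kts
// Maps the directory services/webservice to the project path :services:webservice,
// making :services a parent project of :services:webservice.
include("services:webservice")
```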

Identifying project structure

You can use the gradle projects command to identify the project structure.

As an example, let’s use a multi-project build with the following structure:

$ gradle -q projects

Projects:

------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------

Root project 'multiproject'


+--- Project ':api'
+--- Project ':services'
| +--- Project ':services:shared'
| \--- Project ':services:webservice'
\--- Project ':shared'

To see a list of the tasks of a project, run gradle <project-path>:tasks.

For example, try running gradle :api:tasks.

Multi-project builds are collections of tasks you can run. The difference is that you may want to
control which project’s tasks get executed.

The following sections will cover your two options for executing tasks in a multi-project build.

Executing tasks by name

The command gradle test will execute the test task in any subprojects relative to the current
working directory that has that task.

If you run the command from the root project directory, you will run test in api, shared,
services:shared and services:webservice.

If you run the command from the services project directory, you will only execute the task in
services:shared and services:webservice.

The basic rule behind Gradle’s behavior is to execute all tasks down the hierarchy with this
name, and to complain if no such task is found in any of the subprojects traversed.

NOTE Some task selectors, like help or dependencies, will only run the task on the project
they are invoked on and not on all the subprojects, to reduce the amount of information
printed on the screen.

Executing tasks by fully qualified name

You can use a task’s fully qualified name to execute a specific task in a particular subproject. For
example: gradle :services:webservice:build will run the build task of the webservice subproject.

The fully qualified name of a task is its project path plus the task name.

This approach works for any task, so if you want to know what tasks are in a particular subproject,
use the tasks task, e.g. gradle :services:webservice:tasks.

Multi-Project building and testing

The build task is typically used to compile, test, and check a single project.

In multi-project builds, you may often want to do all of these tasks across various projects. The
buildNeeded and buildDependents tasks can help with this.

In this example, the :services:person-service project depends on both the :api and :shared
projects. The :api project also depends on the :shared project.
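These project dependencies could be declared roughly as follows (a sketch; the real build scripts would contain more configuration):

```kotlin
// api/build.gradle.kts
dependencies {
    implementation(project(":shared"))
}

// services/person-service/build.gradle.kts
dependencies {
    implementation(project(":api"))
    implementation(project(":shared"))
}
```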

Assuming you are working on a single project, the :api project, you have been making changes but
have not built the entire project since performing a clean. You want to build any necessary
supporting JARs but only perform code quality and unit tests on the parts of the project you have
changed.

The build task does this:

$ gradle :api:build
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build

BUILD SUCCESSFUL in 0s
If you have just gotten the latest version of the source from your version control system, which
included changes in other projects that :api depends on, you might want to build all the projects
you depend on AND test them too.

The buildNeeded task builds AND tests all the projects from the project dependencies of the
testRuntime configuration:

$ gradle :api:buildNeeded
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :shared:assemble
> Task :shared:compileTestJava
> Task :shared:processTestResources
> Task :shared:testClasses
> Task :shared:test
> Task :shared:check
> Task :shared:build
> Task :shared:buildNeeded
> Task :api:buildNeeded

BUILD SUCCESSFUL in 0s

You may want to refactor some part of the :api project used in other projects. If you make these
changes, testing only the :api project is insufficient. You must test all projects that depend on the
:api project.

The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime
configuration) on the specified project:

$ gradle :api:buildDependents
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :services:person-service:compileJava
> Task :services:person-service:processResources
> Task :services:person-service:classes
> Task :services:person-service:jar
> Task :services:person-service:assemble
> Task :services:person-service:compileTestJava
> Task :services:person-service:processTestResources
> Task :services:person-service:testClasses
> Task :services:person-service:test
> Task :services:person-service:check
> Task :services:person-service:build
> Task :services:person-service:buildDependents
> Task :api:buildDependents

BUILD SUCCESSFUL in 0s

Finally, you can build and test everything in all projects. Any task you run in the root project folder
will cause that same-named task to be run on all the children.

You can run gradle build to build and test ALL projects.

Consult the Structuring Builds chapter to learn more.

Next Step: Learn about the Gradle Build Lifecycle >>

Build Lifecycle
As a build author, you define tasks and specify dependencies between them. Gradle guarantees that
tasks will execute in the order dictated by these dependencies.

Your build scripts and plugins configure this task dependency graph.

For example, if your project includes tasks such as build, assemble, and createDocs, you can
configure the build script so that they are executed in the order: build → assemble → createDocs.

Task Graphs

Gradle builds the task graph before executing any task.

Across all projects in the build, tasks form a Directed Acyclic Graph (DAG).
This diagram shows two example task graphs, one abstract and the other concrete, with
dependencies between tasks represented as arrows.

Both plugins and build scripts contribute to the task graph via the task dependency mechanism and
annotated inputs/outputs.
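For example, the createDocs task mentioned earlier could be ordered after assemble with an explicit task dependency; a minimal sketch:

```kotlin
// build.gradle.kts
tasks.register("createDocs") {
    dependsOn("assemble") // adds an edge to the task graph: assemble runs first
    doLast {
        println("Generating documentation...")
    }
}
```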

Build Phases

A Gradle build has three distinct phases.

Gradle runs these phases in order:

Phase 1. Initialization
• Detects the settings.gradle(.kts) file.

• Creates a Settings instance.

• Evaluates the settings file to determine which projects (and included builds) make up the
build.

• Creates a Project instance for every project.


Phase 2. Configuration
• Evaluates the build scripts, build.gradle(.kts), of every project participating in the build.

• Creates a task graph for requested tasks.

Phase 3. Execution
• Schedules and executes the selected tasks.

• Dependencies between tasks determine execution order.

• Execution of tasks can occur in parallel.

Example

The following example shows which parts of settings and build files correspond to various build
phases:

settings.gradle.kts

rootProject.name = "basic"
println("This is executed during the initialization phase.")

build.gradle.kts

println("This is executed during the configuration phase.")


tasks.register("configured") {
println("This is also executed during the configuration phase, because
:configured is used in the build.")
}

tasks.register("test") {
doLast {
println("This is executed during the execution phase.")
}
}

tasks.register("testBoth") {
doFirst {
println("This is executed first during the execution phase.")
}
doLast {
println("This is executed last during the execution phase.")
}
println("This is executed during the configuration phase as well, because
:testBoth is used in the build.")
}

settings.gradle

rootProject.name = 'basic'
println 'This is executed during the initialization phase.'

build.gradle

println 'This is executed during the configuration phase.'

tasks.register('configured') {
println 'This is also executed during the configuration phase, because
:configured is used in the build.'
}

tasks.register('test') {
doLast {
println 'This is executed during the execution phase.'
}
}

tasks.register('testBoth') {
doFirst {
println 'This is executed first during the execution phase.'
}
doLast {
println 'This is executed last during the execution phase.'
}
println 'This is executed during the configuration phase as well, because
:testBoth is used in the build.'
}

The following command executes the test and testBoth tasks specified above. Because Gradle only
configures requested tasks and their dependencies, the configured task is never configured:

> gradle test testBoth


This is executed during the initialization phase.

> Configure project :


This is executed during the configuration phase.
This is executed during the configuration phase as well, because :testBoth is used in
the build.

> Task :test


This is executed during the execution phase.

> Task :testBoth


This is executed first during the execution phase.
This is executed last during the execution phase.

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

Phase 1. Initialization

In the initialization phase, Gradle detects the set of projects (root and subprojects) and included
builds participating in the build.

Gradle first evaluates the settings file, settings.gradle(.kts), and instantiates a Settings object.
Then, Gradle instantiates Project instances for each project.

Phase 2. Configuration

In the configuration phase, Gradle adds tasks and other properties to the projects found by the
initialization phase.

Phase 3. Execution

In the execution phase, Gradle runs tasks.

Gradle uses the task execution graphs generated by the configuration phase to determine which
tasks to execute.

Next Step: Learn how to write Settings files >>

Writing Settings Files


The settings file is the entry point of every Gradle build.

Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root
directory.

When the settings file settings.gradle(.kts) is found, Gradle instantiates a Settings object.

One of the purposes of the Settings object is to allow you to declare all the projects to be included in
the build.
Settings Scripts

The settings script is either a settings.gradle file in Groovy or a settings.gradle.kts file in Kotlin.

Before Gradle assembles the projects for a build, it creates a Settings instance and executes the
settings file against it.

As the settings script executes, it configures this Settings. Therefore, the settings file defines the
Settings object.

IMPORTANT There is a one-to-one correspondence between a Settings instance and a
settings.gradle(.kts) file.

The Settings Object

The Settings object is part of the Gradle API.

• In the Groovy DSL, the Settings object documentation is found here.

• In the Kotlin DSL, the Settings object documentation is found here.

Many top-level properties and blocks in a settings script are part of the Settings API.

For example, we can set the root project name in the settings script using the rootProject.name
property:

settings.rootProject.name = "application"

Which is usually shortened to:

settings.gradle.kts

rootProject.name = "application"

settings.gradle

rootProject.name = 'application'
Standard Settings properties

The Settings object exposes a standard set of properties in your settings script.

The following table lists a few commonly used properties:

Name Description
buildCache The build cache configuration.
plugins The container of plugins that have been applied to the settings.
rootDir The root directory of the build. The root directory is the project directory of the root project.
rootProject The root project of the build.
settings Returns this settings object.

The following table lists a few commonly used methods:

Name Description
include() Adds the given projects to the build.
includeBuild() Includes a build at the specified path to the composite build.

Settings Script structure

A Settings script is a series of method calls to the Gradle API that often use { … }, a special
shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a
closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:

plugins(function() {
id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda
and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the
corresponding method. The this object of the id("plugin") method invocation is of type
PluginDependenciesSpec.

The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script
line by line, top to bottom.

Let’s take a look at an example and break it down:


settings.gradle.kts

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

plugins { ②
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

rootProject.name = "simple-project" ③

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")

settings.gradle

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

plugins { ②
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

rootProject.name = 'simple-project' ③

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")

① Define the location of plugins

② Apply settings plugins.

③ Define the root project name.

④ Define dependency resolution strategies.

⑤ Add subprojects to the build.

1. Define the location of plugins

The settings file can manage plugin versions and repositories for your build using the
pluginManagement block. It provides a way to define which plugins should be used in your project
and from which repositories they should be resolved.

settings.gradle.kts

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

settings.gradle

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

2. Apply settings plugins

The settings file can optionally apply plugins that are required for configuring the settings of the
project. Common examples are the Develocity plugin and the Toolchain Resolver plugin, the latter
of which is used in the example below.

Plugins applied in the settings file only affect the Settings object.
settings.gradle.kts

plugins { ②
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

settings.gradle

plugins { ②
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

3. Define the root project name

The settings file defines your project name using the rootProject.name property:

settings.gradle.kts

rootProject.name = "simple-project" ③

settings.gradle

rootProject.name = 'simple-project' ③

There is only one root project per build.

4. Define dependency resolution strategies

The settings file can optionally define rules and configurations for dependency resolution across
your project(s). It provides a centralized way to manage and customize dependency resolution.

settings.gradle.kts

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

settings.gradle

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

You can also include version catalogs in this section.
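For example, a version catalog named libs could be declared in the same block; the alias and coordinates below are illustrative:

```kotlin
// settings.gradle.kts
dependencyResolutionManagement {
    repositories {
        mavenCentral()
    }
    versionCatalogs {
        create("libs") {
            // Alias "guava" for the given group:artifact:version coordinates
            library("guava", "com.google.guava:guava:32.1.1-jre")
        }
    }
}
```

Build scripts can then reference the dependency as libs.guava.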

5. Add subprojects to the build

The settings file defines the structure of the project by adding all the subprojects using the include
statement:

settings.gradle.kts

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")

settings.gradle

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")

You can also include entire builds using includeBuild.

Settings File Scripting

There are many more properties and methods on the Settings object that you can use to configure
your build.

It’s important to remember that while many Gradle scripts are typically written in short Groovy or
Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings
object in the Gradle API:
include("app")

Is actually:

settings.include("app")

Additionally, the full power of the Groovy and Kotlin languages is available to you.

For example, instead of using include many times to add subprojects, you can iterate over the list of
directories in the project root folder and include them automatically:

rootDir.listFiles().filter { it.isDirectory && File(it, "build.gradle.kts").exists() }.forEach {
    include(it.name)
}

TIP This type of logic should be developed in a plugin.

Next Step: Learn how to write Build scripts >>

Writing Build Scripts


The initialization phase in the Gradle Build lifecycle finds the root project and subprojects included
in your project root directory using the settings file.

Then, for each project included in the settings file, Gradle creates a Project instance.

Gradle then looks for a corresponding build script file, which is used in the configuration phase.
Build Scripts

Every Gradle build comprises one or more projects; a root project and subprojects.

A project typically corresponds to a software component that needs to be built, like a library or an
application. It might represent a library JAR, a web application, or a distribution ZIP assembled
from the JARs produced by other projects.

On the other hand, it might represent a thing to be done, such as deploying your application to
staging or production environments.

Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).

A build script configures a project and is associated with an object of type Project.

As the build script executes, it configures Project.

The build script is either a *.gradle file in Groovy or a *.gradle.kts file in Kotlin.

IMPORTANT Build scripts configure Project objects and their children.

The Project object

The Project object is part of the Gradle API:

• In the Groovy DSL, the Project object documentation is found here.

• In the Kotlin DSL, the Project object documentation is found here.

Many top-level properties and blocks in a build script are part of the Project API.

For example, the following build script uses the project.name property to print the name of the
project:

build.gradle.kts

println(name)
println(project.name)

build.gradle

println name
println project.name

$ gradle -q check
project-api
project-api

Both println statements print out the same property.

The first uses the top-level reference to the name property of the Project object. The second
statement uses the project property available to any build script, which returns the associated
Project object.

Standard project properties

The Project object exposes a standard set of properties in your build script.

The following table lists a few commonly used properties:

Name Type Description


name String The name of the project directory.
path String The fully qualified name of the project.
description String A description for the project.
dependencies DependencyHandler Returns the dependency handler of the project.

repositories RepositoryHandler Returns the repository handler of the project.

layout ProjectLayout Provides access to several important locations for a project.


group Object The group of this project.
version Object The version of this project.

The following table lists a few commonly used methods:

Name Description
uri() Resolves a file path to a URI, relative to the project directory of this project.
task() Creates a Task with the given name and adds it to this project.
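A few of these properties in use; the values below are illustrative:

```kotlin
// build.gradle.kts
group = "com.example"           // coordinates used when publishing
version = "1.0.0"
description = "A sample library"

println(layout.buildDirectory.get()) // prints the project's build directory
```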

Build Script structure

The Build script is composed of { … }, a special object in both Groovy and Kotlin. This object is
called a lambda in Kotlin or a closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:
plugins(function() {
id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda
and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the
corresponding method. The this object of the id("plugin") method invocation is of type
PluginDependenciesSpec.

The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes
the script line by line, top to bottom.

Let’s take a look at an example and break it down:

build.gradle.kts

plugins { ①
id("application")
}

repositories { ②
mavenCentral()
}

dependencies { ③
    testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")
    implementation("com.google.guava:guava:32.1.1-jre")
}

application { ④
    mainClass = "com.example.Main"
}

tasks.named<Test>("test") { ⑤
useJUnitPlatform()
}

tasks.named<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}

tasks.register<Zip>("zip-reports") {
    from("Reports/")
    include("*")
    archiveFileName.set("Reports.zip")
    destinationDirectory.set(file("/dir"))
}

build.gradle

plugins { ①
id 'application'
}

repositories { ②
mavenCentral()
}

dependencies { ③
    testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
    implementation 'com.google.guava:guava:32.1.1-jre'
}

application { ④
    mainClass = 'com.example.Main'
}

tasks.named('test', Test) { ⑤
useJUnitPlatform()
}

tasks.named('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}

tasks.register('zip-reports', Zip) {
from 'Reports/'
include '*'
    archiveFileName = 'Reports.zip'
destinationDirectory = file('/dir')
}

① Apply plugins to the build.

② Define the locations where dependencies can be found.

③ Add dependencies.

④ Set properties.
⑤ Register and configure tasks.

1. Apply plugins to the build

Plugins are used to extend Gradle. They are also used to modularize and reuse project
configurations.

Plugins can be applied using the PluginDependenciesSpec plugins script block.

The plugins block is preferred:

build.gradle.kts

plugins { ①
id("application")
}

build.gradle

plugins { ①
id 'application'
}

In the example, the application plugin, which is included with Gradle, has been applied, describing
our project as a Java application.

2. Define the locations where dependencies can be found

A project generally has a number of dependencies it needs to do its work. Dependencies include
plugins, libraries, or components that Gradle must download for the build to succeed.

The build script lets Gradle know where to look for the binaries of the dependencies. More than one
location can be provided:

build.gradle.kts

repositories { ②
mavenCentral()
}
build.gradle

repositories { ②
mavenCentral()
}

In the example, the guava library will be downloaded from the Maven Central Repository.

3. Add dependencies

A project generally has a number of dependencies it needs to do its work. These dependencies are
often libraries of precompiled classes that are imported in the project’s source code.

Dependencies are managed via configurations and are retrieved from repositories.

Use the DependencyHandler returned by the Project.getDependencies() method to manage
dependencies. Use the RepositoryHandler returned by the Project.getRepositories() method to
manage the repositories.

build.gradle.kts

dependencies { ③
testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
testRuntimeOnly("org.junit.platform:junit-platform-launcher")
implementation("com.google.guava:guava:32.1.1-jre")
}

build.gradle

dependencies { ③
testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
implementation 'com.google.guava:guava:32.1.1-jre'
}

In the example, the application code uses Google’s guava libraries. Guava provides utility methods
for collections, caching, primitives support, concurrency, common annotations, string processing,
I/O, and validations.
4. Set properties

A plugin can add properties and methods to a project using extensions.

The Project object has an associated ExtensionContainer object that contains all the settings and
properties for the plugins that have been applied to the project.

In the example, the application plugin added an application property, which is used to detail the
main class of our Java application:

build.gradle.kts

application { ④
mainClass = "[Link]"
}

build.gradle

application { ④
mainClass = '[Link]'
}

5. Register and configure tasks

Tasks perform some basic piece of work, such as compiling classes, or running unit tests, or zipping
up a WAR file.

While tasks are typically defined in plugins, you may need to register or configure tasks in build
scripts.

Registering a task adds the task to your project.

You can register tasks in a project using the TaskContainer.register() method:

build.gradle.kts

tasks.register<Zip>("zip-reports") {
from("Reports/")
include("*")
archiveFileName.set("[Link]")
destinationDirectory.set(file("/dir"))
}
build.gradle

tasks.register('zip-reports', Zip) {
from 'Reports/'
include '*'
archiveFileName = '[Link]'
destinationDirectory = file('/dir')
}

You may have seen usage of the TaskContainer.create() method, which should be
avoided.

tasks.create<Zip>("zip-reports") { }

TIP register(), which enables task configuration avoidance, is preferred over create().

You can locate a task to configure it using the TaskCollection.named() method:

build.gradle.kts

tasks.named<Test>("test") { ⑤
useJUnitPlatform()
}

build.gradle

tasks.named('test', Test) { ⑤
useJUnitPlatform()
}

The example below configures the Javadoc task to automatically generate HTML documentation
from Java code:

build.gradle.kts

tasks.named<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}

build.gradle

tasks.named('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}

Build Scripting

A build script is made up of zero or more statements and script blocks:

println([Link]);

Statements can include method calls, property assignments, and local variable definitions:

version = '[Link]'

A script block is a method call which takes a closure/lambda as a parameter:

configurations {
}

The closure/lambda configures some delegate object as it executes:

repositories {
google()
}

A build script is also a Groovy or a Kotlin script:

build.gradle.kts

tasks.register("upper") {
doLast {
val someString = "mY_nAmE"
println("Original: $someString")
println("Upper case: ${someString.uppercase()}")
}
}

build.gradle

tasks.register('upper') {
doLast {
String someString = 'mY_nAmE'
println "Original: $someString"
println "Upper case: ${someString.toUpperCase()}"
}
}

$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME

It can contain elements allowed in a Groovy or Kotlin script, such as method definitions and class
definitions:

build.gradle.kts

tasks.register("count") {
doLast {
repeat(4) { print("$it ") }
}
}

build.gradle

tasks.register('count') {
doLast {
4.times { print "$it " }
}
}

$ gradle -q count
0 1 2 3
Flexible task registration

Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:

build.gradle.kts

repeat(4) { counter ->
tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}

build.gradle

4.times { counter ->
tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}

$ gradle -q task1
I'm task number 1

Gradle Types

In Gradle, types, properties, and providers are foundational for managing and configuring build
logic:

• Types: Gradle defines types (like Task, Configuration, File, etc.) to represent build components.
You can extend these types to create custom tasks or domain objects.

• Properties: Gradle properties (e.g., Property<T>, ListProperty<T>, SetProperty<T>) are used for
build configuration. They allow lazy evaluation, meaning their values are calculated only when
needed, enhancing flexibility and performance.

• Providers: A Provider<T> represents a value that is computed or retrieved lazily. Providers are
often used with properties to defer value computation until necessary. This is especially useful
for integrating dynamic, runtime values into your build.
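As a sketch of how these pieces fit together, the following hypothetical build-script fragment wires a Provider into a Property so the value is only computed when a task actually reads it (the task name, property name, and message are invented for illustration):

```kotlin
// Hypothetical example: `message` is a lazy Property<String>.
abstract class GreetingTask : DefaultTask() {
    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun greet() = println(message.get()) // the provider is evaluated here, at execution time
}

tasks.register<GreetingTask>("greeting") {
    // The lambda below does not run at configuration time;
    // it runs only when message.get() is called.
    message.set(providers.provider { "computed lazily" })
}
```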

You can learn more about this in Understanding Gradle Types.


Declare Variables

Build scripts can declare two variables: local variables and extra properties.

Local Variables

Declare local variables with the val keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Kotlin language.

Declare local variables with the def keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Groovy language.

build.gradle.kts

val dest = "dest"

tasks.register<Copy>("copy") {
from("source")
into(dest)
}

build.gradle

def dest = 'dest'

tasks.register('copy', Copy) {
from 'source'
into dest
}

Extra Properties

Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined
properties.

Add, read, and set extra properties via the owning object’s extra property. Alternatively, you can
access extra properties via Kotlin delegated properties using by extra.

Add, read, and set extra properties via the owning object’s ext property. Alternatively, you can use
an ext block to add multiple properties simultaneously.

build.gradle.kts

plugins {
id("java-library")
}

val springVersion by extra("[Link]")


val emailNotification by extra { "build@[Link]" }

sourceSets.all { extra["purpose"] = null }

sourceSets {
main {
extra["purpose"] = "production"
}
test {
extra["purpose"] = "test"
}
create("plugin") {
extra["purpose"] = "production"
}
}

tasks.register("printProperties") {
val springVersion = springVersion
val emailNotification = emailNotification
val productionSourceSets = provider {
sourceSets.matching { it.extra["purpose"] == "production" }.map {
it.name }
}
doLast {
println(springVersion)
println(emailNotification)
productionSourceSets.get().forEach { println(it) }
}
}

build.gradle

plugins {
id 'java-library'
}

ext {
springVersion = "[Link]"
emailNotification = "build@[Link]"
}

sourceSets.all { ext.purpose = null }

sourceSets {
main {
purpose = "production"
}
test {
purpose = "test"
}
plugin {
purpose = "production"
}
}

tasks.register('printProperties') {
def springVersion = springVersion
def emailNotification = emailNotification
def productionSourceSets = provider {
sourceSets.matching { it.purpose == "production" }.collect { it.name
}
}
doLast {
println springVersion
println emailNotification
productionSourceSets.get().each { println it }
}
}

$ gradle -q printProperties
[Link]
build@[Link]
main
plugin

This example adds two extra properties to the project object via by extra. Additionally, this
example adds a property named purpose to each source set by setting extra["purpose"] to null. Once
added, you can read and set these properties via extra.

This example adds two extra properties to the project object via an ext block. Additionally, this
example adds a property named purpose to each source set by setting ext.purpose to null. Once
added, you can read and set all these properties just like predefined ones.

Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows
Gradle to recognize when a script attempts to set a property that does not exist. You can access
extra properties anywhere where you can access their owning object. This gives extra properties a
wider scope than local variables. Subprojects can access extra properties on their parent projects.
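For instance, a subproject can read an extra property declared in its parent's build script. A minimal sketch in the Kotlin DSL (the property name and value are invented for illustration):

```kotlin
// Root project's build.gradle.kts: declare the extra property
val sharedVersion by extra("1.2.3")

// A subproject's build.gradle.kts: read it from the root project
val sharedVersion: String by rootProject.extra
println(sharedVersion)
```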

For more information about extra properties, see ExtraPropertiesExtension in the API
documentation.
Configure Arbitrary Objects

The example greet() task shows an example of arbitrary object configuration:

build.gradle.kts

class UserInfo(
var name: String? = null,
var email: String? = null
)

tasks.register("greet") {
val user = UserInfo().apply {
name = "Isaac Newton"
email = "isaac@[Link]"
}
doLast {
println(user.name)
println(user.email)
}
}

build.gradle

class UserInfo {
String name
String email
}

tasks.register('greet') {
def user = configure(new UserInfo()) {
name = "Isaac Newton"
email = "isaac@[Link]"
}
doLast {
println user.name
println user.email
}
}

$ gradle -q greet
Isaac Newton
isaac@[Link]
Closure Delegates

Each closure has a delegate object. Groovy uses this delegate to look up variable and method
references to nonlocal variables and closure parameters. Gradle uses this for configuration closures,
where the delegate object refers to the object being configured.

build.gradle

dependencies {
assert delegate == project.dependencies
testImplementation('junit:junit:4.13')
delegate.testImplementation('junit:junit:4.13')
}
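The Kotlin DSL achieves the same effect with lambdas-with-receivers rather than closure delegates: inside a configuration block, this refers to the object being configured. A rough Kotlin-DSL equivalent of the example above:

```kotlin
// build.gradle.kts: inside the block, `this` is the DependencyHandler
dependencies {
    // `this` here plays the role Groovy gives to `delegate`
    add("testImplementation", "junit:junit:4.13")
}
```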

Default imports

To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.

As a result, instead of writing throw new org.gradle.api.tasks.StopExecutionException(), you can
simply write throw new StopExecutionException().

Gradle implicitly adds the following imports to each script:

import org.gradle.api.*
import org.gradle.api.artifacts.*
import org.gradle.api.file.*
import org.gradle.api.plugins.*
import org.gradle.api.tasks.*
…

Next Step: Learn how to use Tasks >>

Using Tasks
The work that Gradle can do on a project is defined by one or more tasks.
A task represents some independent unit of work that a build performs. This might be compiling
some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.

When a user runs ./gradlew build on the command line, Gradle will execute the build task along
with any other tasks it depends on.

List available tasks

Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks:

> Task :tasks

------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------

Build Setup tasks


-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project
'myTutorial'.
...

Tasks either come from build scripts or plugins.


Once we apply a plugin to our project, such as the application plugin, additional tasks become
available:

build.gradle.kts

plugins {
id("application")
}

build.gradle

plugins {
id 'application'
}

$ ./gradlew tasks

> Task :tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.

Other tasks
-----------
compileJava - Compiles main Java source.

...

Many of these tasks, such as assemble, build, and run, should be familiar to a developer.
Task classification

There are two classes of tasks that can be executed:

1. Actionable tasks have some action(s) attached to do work in your build: compileJava.

2. Lifecycle tasks are tasks with no actions attached: assemble, build.

Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at
once.
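As an illustration, a lifecycle task can be sketched as a task with no actions of its own that merely aggregates actionable tasks (the task names here are invented):

```kotlin
// Actionable task: has a doLast action that performs work
val generateDocs = tasks.register("generateDocs") {
    doLast { println("generating docs") }
}

// Lifecycle task: no actions attached, it only depends on other tasks
tasks.register("docs") {
    dependsOn(generateDocs)
}
```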

Task registration and action

Let’s take a look at a simple "Hello World" task in a build script:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello world!")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello world!'
}
}

In the example, the build script registers a single task called hello using the TaskContainer API,
and adds an action to it.

If the tasks in the project are listed, the hello task is available to Gradle:

$ ./gradlew app:tasks --all

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.

You can execute the task in the build script with ./gradlew hello:

$ ./gradlew hello
Hello world!

When Gradle executes the hello task, it executes the action provided. In this case, the action is
simply a block containing some code: println("Hello world!").

Task group and description

The hello task from the previous section can be detailed with a description and assigned to a
group with the following update:

build.gradle.kts

tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
}

build.gradle

tasks.register('hello') {
group = 'Custom'
description = 'A lovely greeting task.'
doLast {
println 'Hello world!'
}
}

Once the task is assigned to a group, it will be listed by ./gradlew tasks:


$ ./gradlew tasks

> Task :tasks

Custom tasks
------------------
hello - A lovely greeting task.

To view information about a task, use the help --task <task-name> command:

$ ./gradlew help --task hello

> Task :help


Detailed task information for hello

Path
:app:hello

Type
Task (org.gradle.api.Task)

Options
--rerun Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom

As we can see, the hello task belongs to the custom group.

Task dependencies

You can declare tasks that depend on other tasks:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello world!")
}
}
tasks.register("intro") {
dependsOn("hello")
doLast {
println("I'm Gradle")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
tasks.register('intro') {
dependsOn tasks.hello
doLast {
println "I'm Gradle"
}
}

$ gradle -q intro
Hello world!
I'm Gradle

The dependency of taskX on taskY may be declared before taskY is defined:

build.gradle.kts

tasks.register("taskX") {
dependsOn("taskY")
doLast {
println("taskX")
}
}
tasks.register("taskY") {
doLast {
println("taskY")
}
}

build.gradle

tasks.register('taskX') {
dependsOn 'taskY'
doLast {
println 'taskX'
}
}
tasks.register('taskY') {
doLast {
println 'taskY'
}
}

$ gradle -q taskX
taskY
taskX

The hello task from the previous example is updated to include a dependency:

build.gradle.kts

tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}

build.gradle

tasks.register('hello') {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}

The hello task now depends on the assemble task, which means that Gradle must execute the
assemble task before it can execute the hello task:

$ ./gradlew :app:hello
> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:startScripts UP-TO-DATE
> Task :app:distTar UP-TO-DATE
> Task :app:distZip UP-TO-DATE
> Task :app:assemble UP-TO-DATE

> Task :app:hello


Hello world!

Task configuration

Once registered, tasks can be accessed via the TaskProvider API for further configuration.

For instance, you can use this to add dependencies to a task at runtime dynamically:

build.gradle.kts

repeat(4) { counter ->
tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}
tasks.named("task0") { dependsOn("task2", "task3") }

build.gradle

4.times { counter ->
tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}
tasks.named('task0') { dependsOn('task2', 'task3') }

$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0

Or you can add behavior to an existing task:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello Earth")
}
}
tasks.named("hello") {
doFirst {
println("Hello Venus")
}
}
tasks.named("hello") {
doLast {
println("Hello Mars")
}
}
tasks.named("hello") {
doLast {
println("Hello Jupiter")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello Earth'
}
}
tasks.named('hello') {
doFirst {
println 'Hello Venus'
}
}
tasks.named('hello') {
doLast {
println 'Hello Mars'
}
}
tasks.named('hello') {
doLast {
println 'Hello Jupiter'
}
}

$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter

TIP The calls doFirst and doLast can be executed multiple times. They add an action to the
beginning or the end of the task’s actions list. When the task executes, the actions in
the action list are executed in order.

Here is an example of the named method being used to configure a task added by a plugin:

build.gradle.kts

tasks.dokkaHtml {
outputDirectory.set(buildDir)
}

build.gradle

tasks.named("dokkaHtml") {
outputDirectory.set(buildDir)
}

Task types

Gradle tasks are a subclass of Task.

In the build script, the HelloTask class is created by extending DefaultTask:

build.gradle.kts

// Extend the DefaultTask class to create a HelloTask class


abstract class HelloTask : DefaultTask() {
@TaskAction
fun hello() {
println("hello from HelloTask")
}
}

// Register the hello Task with type HelloTask


tasks.register<HelloTask>("hello") {
group = "Custom tasks"
description = "A lovely greeting task."
}

build.gradle

// Extend the DefaultTask class to create a HelloTask class


class HelloTask extends DefaultTask {
@TaskAction
void hello() {
println("hello from HelloTask")
}
}

// Register the hello Task with type HelloTask


tasks.register("hello", HelloTask) {
group = "Custom tasks"
description = "A lovely greeting task."
}

The hello task is registered with the type HelloTask.

Executing our new hello task:

$ ./gradlew hello

> Task :app:hello


hello from HelloTask

Now the hello task is of type HelloTask instead of type Task.

The Gradle help task reveals the change:

$ ./gradlew help --task hello

> Task :help


Detailed task information for hello
Path
:app:hello

Type
HelloTask (Build_gradle$HelloTask)

Options
--rerun Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom tasks

Built-in task types

Gradle provides many built-in task types with common and popular functionality, such as copying
or deleting files.

This example task copies *.war files from the source directory to the target directory using the Copy
built-in task:

build.gradle.kts

tasks.register<Copy>("copyTask") {
from("source")
into("target")
include("*.war")
}

build.gradle

tasks.register('copyTask', Copy) {
from("source")
into("target")
include("*.war")
}

There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar,
JacocoReport, Sign, or Delete, which are available in the DSL reference.

Next Step: Learn how to write Tasks >>


Writing Tasks
Gradle tasks are created by extending DefaultTask.

However, the generic DefaultTask provides no action for Gradle. If users want to extend the
capabilities of Gradle and their build script, they must either use a built-in task or create a custom
task:

1. Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc…

2. Custom task - Gradle allows users to subclass DefaultTask to create their own task types.

Create a task

The simplest and quickest way to create a custom task is in a build script. To create a task,
inherit from the DefaultTask class and implement a @TaskAction handler:

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("[Link]")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

build.gradle

class CreateFileTask extends DefaultTask {


@TaskAction
void action() {
def file = new File("[Link]")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}

The CreateFileTask implements a simple set of actions. First, a file called "[Link]" is created in
the main project. Then, some text is written to the file.
Register a task

A task is registered in the build script using the TaskContainer.register() method, which allows it
to be then used in the build logic.

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("[Link]")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

tasks.register<CreateFileTask>("createFileTask")

build.gradle

class CreateFileTask extends DefaultTask {


@TaskAction
void action() {
def file = new File("[Link]")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}

tasks.register("createFileTask", CreateFileTask)

Task group and description

Setting the group and description properties on your tasks can help users understand how to use
your task:

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("[Link]")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create [Link] in the current directory"
}

build.gradle

class CreateFileTask extends DefaultTask {


@TaskAction
void action() {
def file = new File("[Link]")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}

tasks.register("createFileTask", CreateFileTask) {
group = "custom"
description = "Create [Link] in the current directory"
}

Once a task is added to a group, it is visible when listing tasks.

Task input and outputs

For the task to do useful work, it typically needs some inputs. A task typically produces outputs.

build.gradle.kts

abstract class CreateAFileTask : DefaultTask() {


@get:Input
abstract val fileText: Property<String>

@Input
val fileName = "[Link]"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}

build.gradle

abstract class CreateAFileTask extends DefaultTask {


@Input
abstract Property<String> getFileText()

@Input
final String fileName = "[Link]"

@OutputFile
final File myFile = new File(fileName)

@TaskAction
void action() {
myFile.createNewFile()
myFile.text = fileText.get()
}
}

Configure a task

A task is optionally configured in a build script using the TaskCollection.named() method.

The CreateAFileTask class is updated so that the text in the file is configurable:

build.gradle.kts

abstract class CreateAFileTask : DefaultTask() {


@get:Input
abstract val fileText: Property<String>

@Input
val fileName = "[Link]"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}

tasks.register<CreateAFileTask>("createAFileTask") {
group = "custom"
description = "Create [Link] in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}

tasks.named<CreateAFileTask>("createAFileTask") {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}

build.gradle

abstract class CreateAFileTask extends DefaultTask {


@Input
abstract Property<String> getFileText()

@Input
final String fileName = "[Link]"

@OutputFile
final File myFile = new File(fileName)

@TaskAction
void action() {
myFile.createNewFile()
myFile.text = fileText.get()
}
}

tasks.register("createAFileTask", CreateAFileTask) {
group = "custom"
description = "Create [Link] in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}

tasks.named("createAFileTask", CreateAFileTask) {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
In the named() method, we find the createAFileTask task and set the text that will be written to the
file.

When the task is executed:

$ ./gradlew createAFileTask

> Configure project :app

> Task :app:createAFileTask

BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date

A text file called [Link] is created in the project root folder:

[Link]

HELLO FROM THE NAMED METHOD

Consult the Developing Gradle Tasks chapter to learn more.

Next Step: Learn how to use Plugins >>

Using Plugins
Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with
Gradle, third-party plugins, and script plugins defined within builds.

Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g.,
locating Java source at src/main/java), and extend core or other plugin objects.

Plugins in Gradle are essential for automating common build tasks, integrating with external tools
or services, and tailoring the build process to meet specific project needs. They also serve as the
primary mechanism for organizing build logic.

Benefits of plugins

Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins
offer several advantages over adding logic directly to the build script:

• Promotes Reusability: Reduces the need to duplicate similar logic across projects.

• Enhances Modularity: Allows for a more modular and organized build script.

• Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.
Plugin distribution

You can leverage plugins from Gradle and the Gradle community or create your own.

Plugins are available in three ways:

1. Core plugins - Gradle develops and maintains a set of Core Plugins.

2. Community plugins - Gradle plugins shared in a remote repository such as Maven or the
Gradle Plugin Portal.

3. Custom plugins - Gradle enables users to create plugins using APIs.

Types of plugins

Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:

1. Script Plugins

Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build script
using the apply from: syntax. They are applied inline within a build script to add functionality or
customize the build process. They are not recommended, but it’s important to understand how
they work:

build.gradle.kts

// Define a plugin
class HelloWorldPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.tasks.register("helloWorld") {
group = "Example"
description = "Prints 'Hello, World!' to the console"
doLast {
println("Hello, World!")
}
}
}
}

// Apply the plugin


apply<HelloWorldPlugin>()

2. Precompiled Script Plugins

Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as Java
class files packaged in some library. They are meant to be consumed as a binary Gradle plugin, so
they are applied to a project using the plugins {} block. The plugin ID by which the precompiled
script can be referenced is derived from its name and optional package declaration.
plugin/src/main/kotlin/my-plugin.gradle.kts

// This script is automatically exposed to downstream consumers as the `my-plugin` plugin

tasks {
register("myCopyTask", Copy::class) {
group = "sample"
from("[Link]")
into("build/copy")
}
}

consumer/build.gradle.kts

plugins {
id("my-plugin") version "1.0"
}

3. BuildSrc and Convention Plugins

These are a hybrid of precompiled plugins and binary plugins that provide a way to reuse complex
logic across projects and allow for better organization of build logic.

buildSrc/src/main/kotlin/shared-build-conventions.gradle.kts

plugins {
java
}

repositories {
mavenCentral()
}

dependencies {
testImplementation("[Link]:junit-jupiter:5.8.1")
implementation("[Link]:guava:30.1.1-jre")
}

tasks.named<Test>("test") {
useJUnitPlatform()
}

tasks.register<Copy>("backupTestXml") {
from("build/test-results/test")
into("/tmp/results/")
exclude("binary/**")
}
app/build.gradle.kts

plugins {
application
id("shared-build-conventions")
}

4. Binary Plugins

Binary plugins are compiled plugins typically written in Java or Kotlin DSL that are packaged as
JAR files. They are applied to a project using the plugins {} block. They offer better performance
and maintainability compared to script plugins or precompiled script plugins.

plugin/src/main/kotlin/plugin/MyPlugin.kt

class MyPlugin : Plugin<Project> {


override fun apply(project: Project) {
project.run {
tasks {
register("myCopyTask", Copy::class) {
group = "sample"
from("[Link]")
into("build/copy")
}
}
}
}
}

consumer/build.gradle.kts

plugins {
id("my-plugin") version "1.0"
}

The difference between a binary plugin and a script plugin lies in how they are shared and
executed:

• A binary plugin is compiled into bytecode, and the bytecode is shared.

• A script plugin is shared as source code, and it is compiled at the time of use.

Binary plugins can be written in any language that produces JVM bytecode, such as Java, Kotlin, or
Groovy. In contrast, script plugins can only be written using Kotlin DSL or Groovy DSL.

However, there is also a middle ground: precompiled script plugins. These are written in Kotlin
DSL or Groovy DSL, like script plugins, but are compiled into bytecode and shared like binary
plugins.

A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes
more valuable, it’s migrated to a binary plugin that can be easily tested and shared between
multiple projects or organizations.

Using plugins

To use the build logic encapsulated in a plugin, Gradle needs to perform two steps. First, it needs to
resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.

1. Resolving a plugin means finding the correct version of the JAR that contains a given plugin
and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build
script. Script plugins are self-resolving in that they are resolved from the specific file path or
URL provided when applying them. Core binary plugins provided as part of the Gradle
distribution are automatically resolved.

2. Applying a plugin means executing the plugin’s Plugin.apply(T) on a project.

The plugins DSL is recommended to resolve and apply plugins in one step.

Resolving plugins

Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of
its distribution, which means they are automatically resolved.

Core plugins are applied in a build script using the plugin name:

plugins {
id «plugin name»
}

For example:

plugins {
id("java")
}

Non-core plugins must be resolved before they can be applied. Non-core plugins are identified by a
unique ID and a version in the build file:

plugins {
id «plugin id» version «plugin version»
}

And the location of the plugin must be specified in the settings file:
settings.gradle.kts

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

settings.gradle

pluginManagement { ①
repositories {
gradlePluginPortal()
}
}

There are additional considerations for resolving and applying plugins:

1. Apply a plugin to a project: use the plugins block in the build file.

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

2. Apply a plugin to multiple subprojects: use the subprojects or allprojects blocks in the root
build file. Not recommended.

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}
allprojects {
    apply(plugin = "org.barfuin.gradle.taskinfo")
    repositories {
        mavenCentral()
    }
}

3. Apply a plugin to multiple subprojects: use a convention plugin in the buildSrc directory.
Recommended.

plugins {
    id("my-convention-plugin")
}

4. Apply a plugin needed for the build script itself: use the buildscript block in the build file.
Legacy.

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath("org.barfuin.gradle.taskinfo:gradle-taskinfo:2.1.0")
    }
}
apply(plugin = "org.barfuin.gradle.taskinfo")

5. Apply a script plugin: use the legacy apply() method in the build file. Not recommended.

apply<MyCustomBarfuinTaskInfoPlugin>()

1. Applying plugins using the plugins{} block

The plugin DSL provides a concise and convenient way to declare plugin dependencies.

The plugins block configures an instance of PluginDependenciesSpec:

plugins {
application // by name
java // by name
id("java") // by id - recommended
id("org.jetbrains.kotlin.jvm") version "1.9.0" // by id - recommended
}

Core Gradle plugins are unique in that they provide short names, such as java for the core
JavaPlugin.

To apply a core plugin, the short name can be used:

build.gradle.kts

plugins {
java
}

build.gradle

plugins {
id 'java'
}

All other binary plugins must use the fully qualified form of the plugin id (e.g., org.springframework.boot).

To apply a community plugin from Gradle plugin portal, the fully qualified plugin id, a globally
unique identifier, must be used:

build.gradle.kts

plugins {
    id("org.springframework.boot") version "3.3.1"
}

build.gradle

plugins {
    id 'org.springframework.boot' version '3.3.1'
}

See PluginDependenciesSpec for more information on using the Plugin DSL.

Limitations of the plugins DSL

The plugins DSL provides a convenient syntax for users and the ability for Gradle to determine
which plugins are used quickly. This allows Gradle to:

• Optimize the loading and reuse of plugin classes.

• Provide editors with detailed information about the potential properties and values in the build
script.

However, the DSL requires that plugins be defined statically.

There are some key differences between the plugins {} block mechanism and the "traditional"
apply() method mechanism. There are also some constraints and possible limitations.
The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the
settings.gradle(.kts) file. It must appear before any other block. It cannot be used in script plugins
or init scripts.

Constrained Syntax

The plugins {} block does not support arbitrary code.

It is constrained to be idempotent (produce the same result every time) and side effect-free (safe for
Gradle to execute at any time).

The form is:

plugins {
id(«plugin id») ①
id(«plugin id») version «plugin version» ②
}

① for core Gradle plugins or plugins already available to the build script

② for binary Gradle plugins that need to be resolved

Where «plugin id» and «plugin version» must be constant, literal strings.

The plugins{} block must also be a top-level statement in the build script. It cannot be nested inside
another construct (e.g., an if-statement or for-loop).

2. Applying plugins to all subprojects{} or allprojects{}

If you have a multi-project build, you probably want to apply plugins to some or all of the
subprojects in your build, but not to the root project.

While the default behavior of the plugins{} block is to immediately resolve and apply the plugins,
you can use the apply false syntax to tell Gradle not to apply the plugin to the current project.
Then, use the plugins{} block without the version in subprojects' build scripts:

settings.gradle.kts

include("hello-a")
include("hello-b")
include("goodbye-c")

build.gradle.kts

plugins {
    // These plugins are not automatically applied.
    // They can be applied in subprojects as needed (in their respective build files).
    id("com.example.hello") version "1.0.0" apply false
    id("com.example.goodbye") version "1.0.0" apply false
}

allprojects {
    // Apply the common 'java' plugin to all projects (including the root)
    plugins.apply("java")
}

subprojects {
    // Apply the 'java-library' plugin to all subprojects (excluding the root)
    plugins.apply("java-library")
}

hello-a/build.gradle.kts

plugins {
    id("com.example.hello")
}

hello-b/build.gradle.kts

plugins {
    id("com.example.hello")
}

goodbye-c/build.gradle.kts

plugins {
    id("com.example.goodbye")
}

settings.gradle

include 'hello-a'
include 'hello-b'
include 'goodbye-c'

build.gradle

plugins {
    // These plugins are not automatically applied.
    // They can be applied in subprojects as needed (in their respective build files).
    id 'com.example.hello' version '1.0.0' apply false
    id 'com.example.goodbye' version '1.0.0' apply false
}

allprojects {
    // Apply the common 'java' plugin to all projects (including the root)
    apply(plugin: 'java')
}

subprojects {
    // Apply the 'java-library' plugin to all subprojects (excluding the root)
    apply(plugin: 'java-library')
}

hello-a/build.gradle

plugins {
    id 'com.example.hello'
}

hello-b/build.gradle

plugins {
    id 'com.example.hello'
}

goodbye-c/build.gradle

plugins {
    id 'com.example.goodbye'
}

You can also encapsulate the versions of external plugins by composing the build logic using your
own convention plugins.

3. Applying convention plugins from the buildSrc directory

buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins)
used in building the main project. You can apply plugins that reside in a project’s buildSrc directory
as long as they have a defined ID.

The following example shows how to tie the plugin implementation class my.MyPlugin, defined in
buildSrc, to the id "my-plugin":
buildSrc/build.gradle.kts

plugins {
`java-gradle-plugin`
}

gradlePlugin {
plugins {
create("myPlugins") {
id = "my-plugin"
implementationClass = "my.MyPlugin"
}
}
}

buildSrc/build.gradle

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
myPlugins {
id = 'my-plugin'
implementationClass = 'my.MyPlugin'
}
}
}

The plugin can then be applied by ID:

build.gradle.kts

plugins {
id("my-plugin")
}

build.gradle

plugins {
id 'my-plugin'
}

4. Applying plugins using the buildscript{} block

To define libraries or plugins used in the build script itself, you can use the buildscript block. The
buildscript block is also used for specifying where to find those dependencies.

This approach is less common with newer versions of Gradle, as the plugins {} block simplifies
plugin usage. However, buildscript {} may be necessary when dealing with custom or non-standard
plugin repositories, as well as with library dependencies:

build.gradle.kts

import org.yaml.snakeyaml.Yaml

buildscript {
    repositories {
        maven {
            url = uri("[Link]")
        }
        mavenCentral() // Where to find the plugin
    }
    dependencies {
        classpath("org.yaml:snakeyaml:1.19") // The library's classpath dependency
        classpath("com.gradleup.shadow:shadow-gradle-plugin:8.3.4") // Plugin dependency for legacy plugin application
    }
}

// Applies legacy Shadow plugin
apply(plugin = "com.gradleup.shadow")

// Uses the library in the build script
val yamlContent = """
name: Project
""".trimIndent()
val yaml = Yaml()
val data: Map<String, Any> = yaml.load(yamlContent)

build.gradle

import org.yaml.snakeyaml.Yaml

buildscript {
    repositories { // Where to find the plugin or library
        maven {
            url = uri("[Link]")
        }
        mavenCentral()
    }
    dependencies {
        classpath 'org.yaml:snakeyaml:1.19' // The library's classpath dependency
        classpath 'com.gradleup.shadow:shadow-gradle-plugin:8.3.4' // Plugin dependency for legacy plugin application
    }
}

// Applies legacy Shadow plugin
apply plugin: 'com.gradleup.shadow'

// Uses the library in the build script
def yamlContent = """
name: Project Name
"""
def yaml = new Yaml()
def data = yaml.load(yamlContent)

5. Applying script plugins using the legacy apply() method

A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is
applied using the legacy application method:

build.gradle.kts

class MyPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        println("Plugin ${this.javaClass.simpleName} applied on ${project.name}")
    }
}

apply<MyPlugin>()

build.gradle

class MyPlugin implements Plugin<Project> {

    @Override
    void apply(Project project) {
        println("Plugin ${getClass().getSimpleName()} applied on ${project.name}")
    }
}

apply plugin: MyPlugin

Plugin Management

The pluginManagement{} block is used to configure repositories for plugin resolution and to define
version constraints for plugins that are applied in the build scripts.

The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first
block in the file:

settings.gradle.kts

pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = "plugin-management"

settings.gradle

pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = 'plugin-management'

The block can also be used in an Initialization Script:

init.gradle.kts

settingsEvaluated {
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}

init.gradle

settingsEvaluated { settings ->

    settings.pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}

Custom Plugin Repositories

By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.

Many build authors would also like to resolve plugins from private Maven or Ivy repositories
because they contain proprietary implementation details or to have more control over what
plugins are available to their builds.

To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:

settings.gradle.kts

pluginManagement {
repositories {
maven(url = file("./maven-repo"))
gradlePluginPortal()
ivy(url = file("./ivy-repo"))
}
}

settings.gradle

pluginManagement {
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}

This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and
then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository. If you
don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line. Finally, the
Ivy repository at ./ivy-repo will be checked.

Plugin Version Management

A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in
a single location. Plugins can then be applied by id to any build script via the plugins{} block.

One benefit of setting plugin versions this way is that the pluginManagement.plugins{} block does not have
the same constrained syntax as the build script plugins{} block. This allows plugin versions to be
taken from gradle.properties, or loaded via another mechanism.

Managing plugin versions via pluginManagement:

settings.gradle.kts

pluginManagement {
val helloPluginVersion: String by settings
plugins {
id("com.example.hello") version "${helloPluginVersion}"
}
}
build.gradle.kts

plugins {
id("com.example.hello")
}

gradle.properties

helloPluginVersion=1.0.0

settings.gradle

pluginManagement {
plugins {
id 'com.example.hello' version "${helloPluginVersion}"
}
}

build.gradle

plugins {
id 'com.example.hello'
}

gradle.properties

helloPluginVersion=1.0.0

The plugin version is loaded from gradle.properties and configured in the settings script, allowing
the plugin to be added to any project without specifying the version.

Plugin Resolution Rules

Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing
the requested version or explicitly specifying the implementation artifact coordinates.

To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:

settings.gradle.kts

pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == "com.example") {
useModule("com.example:sample-plugins:1.0.0")
}
}
}
repositories {
maven {
url = uri("./maven-repo")
}
gradlePluginPortal()
ivy {
url = uri("./ivy-repo")
}
}
}

settings.gradle

pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'com.example') {
useModule('com.example:sample-plugins:1.0.0')
}
}
}
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}

This tells Gradle to use the specified plugin implementation artifact instead of its built-in default
mapping from plugin ID to Maven/Ivy coordinates.

Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts
that implement the plugin. Read Gradle Plugin Development Plugin for more information on
publishing plugins to custom repositories.

See PluginManagementSpec for complete documentation for using the pluginManagement{} block.
Plugin Marker Artifacts

Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id
and version properties, Gradle needs a way to look up the coordinates of the plugin implementation
artifact.

To do so, Gradle will look for a Plugin Marker Artifact with the coordinates
plugin.id:plugin.id.gradle.plugin:plugin.version. This marker needs to have a dependency on the
actual plugin implementation. Publishing these markers is automated by the java-gradle-plugin.
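Concretely, a plugin with the id com.example.hello published at version 1.0.0 (the sample id used below) would have a marker artifact at the following coordinates; this illustrates the naming scheme, it is not an additional artifact you publish by hand:

```
com.example.hello:com.example.hello.gradle.plugin:1.0.0
```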

For example, the following complete sample from the sample-plugins project shows how to publish
a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository
using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish
plugin.

build.gradle.kts

plugins {
`java-gradle-plugin`
`maven-publish`
`ivy-publish`
}

group = "com.example"
version = "1.0.0"

gradlePlugin {
    plugins {
        create("hello") {
            id = "com.example.hello"
            implementationClass = "com.example.hello.HelloPlugin"
        }
        create("goodbye") {
            id = "com.example.goodbye"
            implementationClass = "com.example.goodbye.GoodbyePlugin"
        }
    }
}

publishing {
repositories {
maven {
url = uri(layout.buildDirectory.dir("maven-repo"))
}
ivy {
url = uri(layout.buildDirectory.dir("ivy-repo"))
}
}
}
build.gradle

plugins {
id 'java-gradle-plugin'
id 'maven-publish'
id 'ivy-publish'
}

group = 'com.example'
version = '1.0.0'

gradlePlugin {
    plugins {
        hello {
            id = 'com.example.hello'
            implementationClass = 'com.example.hello.HelloPlugin'
        }
        goodbye {
            id = 'com.example.goodbye'
            implementationClass = 'com.example.goodbye.GoodbyePlugin'
        }
    }
}

publishing {
repositories {
maven {
url = layout.buildDirectory.dir('maven-repo')
}
ivy {
url = layout.buildDirectory.dir('ivy-repo')
}
}
}

Running gradle publish in the sample directory creates a Maven repository layout containing the
plugin JARs and their plugin marker artifacts (the Ivy layout is similar).
Legacy Plugin Application

With the introduction of the plugins DSL, users should have little reason to use the legacy method
of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to
restrictions in how it currently works.

build.gradle.kts

apply(plugin = "java")

build.gradle

apply plugin: 'java'

Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to
apply the JavaPlugin.

Rather than using a plugin id, plugins can also be applied by simply specifying the class of the
plugin:

build.gradle.kts

apply<JavaPlugin>()
build.gradle

apply plugin: JavaPlugin

The JavaPlugin symbol in the above sample refers to the JavaPlugin. This class does not strictly need
to be imported as the org.gradle.api.plugins package is automatically imported in all build scripts
(see Default imports).

In Kotlin, one needs to append the ::class suffix to identify a class literal, instead of .class in Java.

In Groovy, unlike Java, it is unnecessary to append .class to identify a class literal.

You may also see the apply method used to include an entire build file:

build.gradle.kts

apply(from = "other.gradle.kts")

build.gradle

apply from: 'other.gradle'

Using a Version Catalog

When a project uses a version catalog, plugins can be referenced via aliases when applied.

Let’s take a look at a simple Version Catalog:

libs.versions.toml

[versions]
groovy = "3.0.5"
checkstyle = "8.37"

[libraries]
groovy-core = { module = "org.codehaus.groovy:groovy", version.ref = "groovy" }
groovy-json = { module = "org.codehaus.groovy:groovy-json", version.ref = "groovy" }
groovy-nio = { module = "org.codehaus.groovy:groovy-nio", version.ref = "groovy" }
commons-lang3 = { group = "org.apache.commons", name = "commons-lang3", version = { strictly = "[3.8, 4.0[", prefer = "3.9" } }

[bundles]
groovy = ["groovy-core", "groovy-json", "groovy-nio"]

[plugins]
versions = { id = "com.github.ben-manes.versions", version = "0.45.0" }

Then a plugin can be applied to any build script using the alias method:

build.gradle.kts

plugins {
`java-library`
alias(libs.plugins.versions)
}

build.gradle

plugins {
id 'java-library'
alias(libs.plugins.versions)
}

TIP Gradle generates type safe accessors for catalog items.

Next Step: Learn how to write Plugins >>

Writing Plugins
If Gradle or the Gradle community does not offer the specific capabilities your project needs,
creating your own custom plugin could be a solution.

Additionally, if you find yourself duplicating build logic across subprojects and need a better way to
organize it, convention plugins can help.
Script plugin

A plugin is any class that implements the Plugin interface. For example, this is a "hello world"
plugin:

build.gradle.kts

abstract class SamplePlugin : Plugin<Project> { ①

    override fun apply(project: Project) { ②
        project.tasks.register("ScriptPlugin") {
            doLast {
                println("Hello world from the build file!")
            }
        }
    }
}

apply<SamplePlugin>() ③

build.gradle

class SamplePlugin implements Plugin<Project> { ①

    void apply(Project project) { ②
        project.tasks.register("ScriptPlugin") {
            doLast {
                println("Hello world from the build file!")
            }
        }
    }
}

apply plugin: SamplePlugin ③

① Extend the org.gradle.api.Plugin interface.

② Override the apply method.

③ apply the plugin to the project.

1. Extend the org.gradle.api.Plugin interface

Create a class that extends the Plugin interface:


build.gradle.kts

abstract class SamplePlugin : Plugin<Project> {


}

build.gradle

class SamplePlugin implements Plugin<Project> {


}

2. Override the apply method

Add tasks and other logic in the apply() method:

build.gradle.kts

override fun apply(project: Project) {

build.gradle

void apply(Project project) {

3. apply the plugin to your project

When SamplePlugin is applied in your project, Gradle calls the apply() method you defined. This
adds the ScriptPlugin task to your project:

build.gradle.kts

apply<SamplePlugin>()
build.gradle

apply plugin: SamplePlugin

Note that this is a simple hello-world example and does not reflect best practices.

IMPORTANT Script plugins are not recommended.

The best practice for developing plugins is to create convention plugins or binary plugins.

Pre-compiled script plugin

Pre-compiled script plugins offer an easy way to rapidly prototype and experiment. They let you
package build logic as *.gradle(.kts) script files using the Groovy or Kotlin DSL. These scripts
reside in specific directories, such as src/main/groovy or src/main/kotlin.

To apply one, simply use its ID derived from the script filename (without .gradle). You can think of
the file itself as the plugin, so you do not need to subclass the Plugin interface in a precompiled
script.

Let’s take a look at an example with the following structure:

.
└── buildSrc
    ├── build.gradle.kts
    └── src
        └── main
            └── kotlin
                └── my-create-file-plugin.gradle.kts

Our my-create-file-plugin.gradle.kts file contains the following code:

buildSrc/src/main/kotlin/my-create-file-plugin.gradle.kts

abstract class CreateFileTask : DefaultTask() {

    @get:Input
    abstract val fileText: Property<String>

    @Input
    val fileName = "myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}

tasks.register<CreateFileTask>("createMyFileTaskInConventionPlugin") {
    group = "from my convention plugin"
    description = "Create myfile.txt in the current directory"
    fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}

buildSrc/src/main/groovy/my-create-file-plugin.gradle

abstract class CreateFileTask extends DefaultTask {

    @Input
    abstract Property<String> getFileText()

    @Input
    String fileName = "myfile.txt"

    @OutputFile
    File getMyFile() {
        return new File(fileName)
    }

    @TaskAction
    void action() {
        myFile.createNewFile()
        myFile.write(fileText.get())
    }
}

tasks.register("createMyFileTaskInConventionPlugin", CreateFileTask) {
    group = "from my convention plugin"
    description = "Create myfile.txt in the current directory"
    fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}

The pre-compiled script can now be applied in the build.gradle(.kts) file of any subproject:

build.gradle.kts

plugins {
id("my-create-file-plugin") // Apply the pre-compiled convention plugin
`kotlin-dsl`
}

build.gradle

plugins {
id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
id 'groovy' // Apply the Groovy DSL plugin
}

The createMyFileTaskInConventionPlugin task from the plugin is now available in your subproject.

Binary Plugins

A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR
file. It is resolved as a dependency rather than compiled from source.

For most use cases, convention plugins must be updated infrequently. Having each developer
execute the plugin build as part of their development process is wasteful, and we can instead
distribute them as binary dependencies.

There are two ways to update the convention plugin in the example above into a binary plugin.

1. Use composite builds:

settings.gradle.kts

includeBuild("my-plugin")

2. Publish the plugin to a repository:

build.gradle.kts

plugins {
id("com.example.my-plugin") version "1.0.0"
}

Let’s go with the second solution. This plugin has been re-written in Kotlin and is called
MyCreateFileBinaryPlugin.kt. It is still stored in buildSrc:

buildSrc/src/main/kotlin/MyCreateFileBinaryPlugin.kt

import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File

abstract class CreateFileTask : DefaultTask() {

    @get:Input
    abstract val fileText: Property<String>

    @Input
    val fileName = project.rootDir.toString() + "/myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}

class MyCreateFileBinaryPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        project.tasks.register("createFileTaskFromBinaryPlugin", CreateFileTask::class.java) {
            group = "from my binary plugin"
            description = "Create myfile.txt in the current directory"
            fileText.set("HELLO FROM MY BINARY PLUGIN")
        }
    }
}

The plugin can be published and given an id using a gradlePlugin{} block so that it can be
referenced in the root:

buildSrc/build.gradle.kts

group = "com.example"
version = "1.0.0"

gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "com.example.my-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}

publishing {
repositories {
mavenLocal()
}
}

buildSrc/build.gradle

group = 'com.example'
version = '1.0.0'

gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "com.example.my-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}

publishing {
repositories {
mavenLocal()
}
}

Then, the plugin can be applied in the build file:

build.gradle.kts

plugins {
    id("my-create-file-plugin") // Apply the pre-compiled convention plugin
    id("com.example.my-binary-plugin") // Apply the binary plugin
    `kotlin-dsl`
}
build.gradle

plugins {
    id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
    id 'com.example.my-binary-plugin' // Apply the binary plugin
    id 'groovy' // Apply the Groovy DSL plugin
}

Consult the Developing Plugins chapter to learn more.


GRADLE TYPES
Understanding Properties and Providers
Gradle provides properties that are important for lazy configuration. When implementing a custom
task or plugin, it’s imperative that you use these lazy properties.

Gradle represents lazy properties with two interfaces:

1. Property - Represents a value that can be queried and changed.

2. Provider - Represents a value that can only be queried and cannot be changed.

Properties and providers manage values and configurations in a build script.

In this example, configuration is a Property<String> that is set to the configurationProvider
Provider<String>. The configurationProvider lazily provides the value "Hello, Gradle!":

build.gradle.kts

abstract class MyIntroTask : DefaultTask() {

    @get:Input
    abstract val configuration: Property<String>

    @TaskAction
    fun printConfiguration() {
        println("Configuration value: ${configuration.get()}")
    }
}

val configurationProvider: Provider<String> = project.provider { "Hello, Gradle!" }

tasks.register("myIntroTask", MyIntroTask::class) {
    configuration.set(configurationProvider)
}

build.gradle

abstract class MyIntroTask extends DefaultTask {

    @Input
    abstract Property<String> getConfiguration()

    @TaskAction
    void printConfiguration() {
        println "Configuration value: ${configuration.get()}"
    }
}

Provider<String> configurationProvider = project.provider { "Hello, Gradle!" }

tasks.register("myIntroTask", MyIntroTask) {
    configuration.set(configurationProvider)
}

Understanding Properties

Properties in Gradle are variables that hold values. They can be defined and accessed within the
build script to store information like file paths, version numbers, or custom values.

Properties can be set and retrieved using the project object:

build.gradle.kts

// Setting a property
val simpleMessageProperty: Property<String> = project.objects.property(String::class)
simpleMessageProperty.set("Hello, World from a Property!")
// Accessing a property
println(simpleMessageProperty.get())

build.gradle

// Setting a property
def simpleMessageProperty = project.objects.property(String)
simpleMessageProperty.set("Hello, World from a Property!")
// Accessing a property
println(simpleMessageProperty.get())

Properties:

• Properties with these types are configurable.

• Property extends the Provider interface.

• The method Property.set(T) specifies a value for the property, overwriting whatever value may
have been present.

• The method Property.set(Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together Provider
and Property instances before the values are configured.

• A Property can be created by the factory method ObjectFactory.property(Class).
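As an illustrative sketch of the wiring described above (the names versionProvider and versionProperty are not from the manual), set(Provider) connects a Provider to a Property without computing the value:

```kotlin
// build.gradle.kts -- illustrative sketch
val versionProvider: Provider<String> = project.provider { "1.0.0" }

val versionProperty: Property<String> = project.objects.property(String::class.java)
versionProperty.set(versionProvider) // wires the provider; nothing is computed yet

println(versionProperty.get()) // the provider is evaluated only here
```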

Understanding Providers

Providers are objects that represent a value that may not be immediately available. Providers are
useful for lazy evaluation and can be used to model values that may change over time or depend on
other tasks or inputs:

build.gradle.kts

// Setting a provider
val simpleMessageProvider: Provider<String> = project.provider { "Hello, World from a Provider!" }
// Accessing a provider
println(simpleMessageProvider.get())

build.gradle

// Setting a provider
def simpleMessageProvider = project.provider { "Hello, World from a Provider!" }
// Accessing a provider
println(simpleMessageProvider.get())

Providers:

• Properties with these types are read-only.

• The method Provider.get() returns the current value of the property.

• A Provider can be created from another Provider using Provider.map(Transformer).

• Many other types extend Provider and can be used wherever a Provider is required.
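For example, Provider.map can derive one provider from another; a minimal sketch with illustrative names:

```kotlin
// build.gradle.kts -- illustrative sketch
val baseNameProvider: Provider<String> = project.provider { "report" }

// The transformation runs lazily, each time the derived provider is queried
val fileNameProvider: Provider<String> = baseNameProvider.map { name -> "$name.txt" }

println(fileNameProvider.get()) // prints: report.txt
```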

Using Gradle Managed Properties

Gradle’s managed properties allow you to declare properties as abstract getters (Java, Groovy) or
abstract properties (Kotlin).
Gradle then automatically provides the implementation for these properties, managing their state.

A property may be mutable, meaning that it has both a get() method and set() method:

build.gradle.kts

abstract class MyPropertyTask : DefaultTask() {

    @get:Input
    abstract val messageProperty: Property<String> // message property

    @TaskAction
    fun printMessage() {
        println(messageProperty.get())
    }
}

tasks.register<MyPropertyTask>("myPropertyTask") {
    messageProperty.set("Hello, Gradle!")
}

build.gradle

abstract class MyPropertyTask extends DefaultTask {

    @Input
    abstract Property<String> getMessageProperty() // message property

    @TaskAction
    void printMessage() {
        println(messageProperty.get())
    }
}

tasks.register('myPropertyTask', MyPropertyTask) {
    messageProperty.set("Hello, Gradle!")
}

Or read-only, meaning that it has only a get() method. The read-only properties are providers:

build.gradle.kts

abstract class MyProviderTask : DefaultTask() {

    val messageProvider: Provider<String> = project.provider { "Hello, Gradle!" } // message provider

    @TaskAction
    fun printMessage() {
        println(messageProvider.get())
    }
}

tasks.register<MyProviderTask>("MyProviderTask")

build.gradle

abstract class MyProviderTask extends DefaultTask {

    final Provider<String> messageProvider = project.provider { "Hello, Gradle!" }

    @TaskAction
    void printMessage() {
        println(messageProvider.get())
    }
}

tasks.register('MyProviderTask', MyProviderTask)

Mutable Managed Properties

A mutable managed property is declared using an abstract getter method of type Property<T>,
where T can be any serializable type or a fully managed Gradle type. The property must not have
any setter methods.

Here is an example of a task type with a uri property of type URI:

Download.java

public abstract class Download extends DefaultTask {


@Input
public abstract Property<URI> getUri(); // abstract getter of type Property<T>

@TaskAction
void run() {
System.out.println("Downloading " + getUri().get()); // Use the `uri` property
}
}

Note that for a property to be considered a mutable managed property, the property’s getter
methods must be abstract and have public or protected visibility.

The property type must be one of the following:

Property Type                Note

Property<T>                  Where T is typically Double, Integer, Long, String, or Boolean
RegularFileProperty          Configurable regular file location, whose value is mutable
DirectoryProperty            Configurable directory location, whose value is mutable
ListProperty<T>              List of elements of type T
SetProperty<T>               Set of elements of type T
MapProperty<K, V>            Map of K type keys with V type values
ConfigurableFileCollection   A mutable FileCollection which represents a collection of file system locations
ConfigurableFileTree         A mutable FileTree which represents a hierarchy of files
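A brief sketch of a few of these configurable types in a build script, using the Kotlin DSL convenience factories (the property names here are illustrative):

```kotlin
// build.gradle.kts -- illustrative sketch
val servers: ListProperty<String> = project.objects.listProperty(String::class.java)
servers.add("staging")
servers.addAll(listOf("prod-1", "prod-2"))

val limits: MapProperty<String, Int> = project.objects.mapProperty(String::class.java, Int::class.javaObjectType)
limits.put("maxRetries", 3)

val sources: ConfigurableFileCollection = project.objects.fileCollection()
sources.from("src/main/kotlin")
```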

Read-only Managed Properties (Providers)

You can declare a read-only managed property, also known as a provider, using a getter method of
type Provider<T>. The method implementation needs to derive the value. It can, for example, derive
the value from other properties.

Here is an example of a task type with a uri provider that is derived from a location property:

Download.java

public abstract class Download extends DefaultTask {


@Input
public abstract Property<String> getLocation();

@Internal
public Provider<URI> getUri() {
return getLocation().map(l -> URI.create("https://" + l));
}

@TaskAction
void run() {
System.out.println("Downloading " + getUri().get()); // Use the `uri` provider (read-only property)
}
}
Read-only Managed Nested Properties (Nested Providers)

You can declare a read-only managed nested property by adding an abstract getter method for the
property to a type annotated with @Nested. The property should not have any setter methods. Gradle
provides the implementation for the getter method and creates a value for the property.

This pattern is useful when a custom type has a nested complex type which has the same lifecycle.
If the lifecycle is different, consider using Property<NestedType> instead.

Here is an example of a task type with a resource property. The Resource type is also a custom
Gradle type and defines some managed properties:

Download.java

public abstract class Download extends DefaultTask {


@Nested
public abstract Resource getResource(); // Use an abstract getter method annotated
with @Nested

@TaskAction
void run() {
// Use the `resource` property
System.out.println("Downloading https://" + getResource().getHostName().get() + "/" + getResource().getPath().get());
}
}

public interface Resource {


@Input
Property<String> getHostName();
@Input
Property<String> getPath();
}

Read-only Managed "name" Property (Provider)

If the type contains an abstract property called "name" of type String, Gradle provides an
implementation for the getter method, and extends each constructor with a "name" parameter,
which comes before all other constructor parameters.

If the type is an interface, Gradle will provide a constructor with a single "name" parameter and
@Inject semantics.

You can have your type implement or extend the Named interface, which defines such a read-only
"name" property:

import org.gradle.api.Named

interface MyType : Named {
    // Other properties and methods...
}

class MyTypeImpl(override val name: String) : MyType {
    // Implement other properties and methods...
}

// Usage
val instance = MyTypeImpl("myName")
println(instance.name) // Prints: myName
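When the type is an interface, you do not implement the name yourself; Gradle generates the implementation when the instance is created. A minimal sketch (not from the manual; it assumes a build script context where an `ObjectFactory` named `objects` is in scope):

```kotlin
import org.gradle.api.Named

interface MyNamedThing : Named

// Gradle supplies the read-only `name` property and a generated
// implementation whose constructor takes the name as its first parameter
val thing = objects.newInstance(MyNamedThing::class.java, "myName")
println(thing.name)
```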

Using Gradle Managed Types

A managed type is an abstract class or interface with no fields and whose properties are all
managed. These types have their state entirely managed by Gradle.

For example, this managed type is defined as an interface:

Resource.java

public interface Resource {

    @Input
    Property<String> getHostName();

    @Input
    Property<String> getPath();
}

A named managed type is a managed type that additionally has an abstract property "name" of type
String. Named managed types are especially useful as the element type of
NamedDomainObjectContainer:

build.gradle.kts

interface MyNamedType {
    val name: String
}

class MyNamedTypeImpl(override val name: String) : MyNamedType

class MyPluginExtension(project: Project) {

    val myNamedContainer: NamedDomainObjectContainer<MyNamedType> =
        project.container(MyNamedType::class.java) { name ->
            project.objects.newInstance(MyNamedTypeImpl::class.java, name)
        }
}
build.gradle

interface MyNamedType {
    String getName()
}

class MyNamedTypeImpl implements MyNamedType {

    String name

    MyNamedTypeImpl(String name) {
        this.name = name
    }
}

class MyPluginExtension {
    NamedDomainObjectContainer<MyNamedType> myNamedContainer

    MyPluginExtension(Project project) {
        myNamedContainer = project.container(MyNamedType) { name ->
            new MyNamedTypeImpl(name)
        }
    }
}

Using Java Bean Properties

Sometimes you may see properties implemented in the Java bean property style. That is, they do
not use the Property<T> or Provider<T> types but are instead implemented with concrete setter and
getter methods (or corresponding conveniences in Groovy or Kotlin).

This style of property definition is legacy in Gradle and is discouraged:

public class MyTask extends DefaultTask {

    private String someProperty;

    public String getSomeProperty() {
        return someProperty;
    }

    public void setSomeProperty(String someProperty) {
        this.someProperty = someProperty;
    }

    @TaskAction
    public void myAction() {
        System.out.println("SomeProperty: " + someProperty);
    }
}

Understanding Collections
Gradle provides types for maintaining collections of objects, intended to work well with Gradle’s
DSLs and provide useful features such as lazy configuration.

Available collections

These collection types are used for managing collections of objects, particularly in the context of
build scripts and plugins:

1. DomainObjectSet<T>: Represents a set of objects of type T. This set does not allow duplicate
elements, and you can add, remove, and query objects in the set.

2. NamedDomainObjectSet<T>: A specialization of DomainObjectSet where each object has a unique
name associated with it. This is often used for collections where each element needs to be
uniquely identified by a name.

3. NamedDomainObjectList<T>: Similar to NamedDomainObjectSet, but represents a list of objects
where order matters. Each element has a unique name associated with it, and you can access
elements by index as well as by name.

4. NamedDomainObjectContainer<T>: A container for managing objects of type T, where each object
has a unique name. This container provides methods for adding, removing, and querying
objects by name.

5. ExtensiblePolymorphicDomainObjectContainer<T>: An extension of NamedDomainObjectContainer
that allows you to define instantiation strategies for different types of objects. This is useful
when you have a container that can hold multiple types of objects, and you want to control how
each type of object is instantiated.

These types are commonly used in Gradle plugins and build scripts to manage collections of objects,
such as tasks, configurations, or custom domain objects.

1. DomainObjectSet

A DomainObjectSet simply holds a set of configurable objects.

Compared to NamedDomainObjectContainer, a DomainObjectSet doesn’t manage the objects in the
collection. They need to be created and added manually.

You can create an instance using the ObjectFactory.domainObjectSet() method:

build.gradle.kts

abstract class MyPluginExtensionDomainObjectSet {

    // Define a domain object set to hold strings
    val myStrings: DomainObjectSet<String> =
        project.objects.domainObjectSet(String::class)

    // Add some strings to the domain object set
    fun addString(value: String) {
        myStrings.add(value)
    }
}

build.gradle

abstract class MyPluginExtensionDomainObjectSet {

    // Define a domain object set to hold strings
    DomainObjectSet<String> myStrings = project.objects.domainObjectSet(String)

    // Add some strings to the domain object set
    void addString(String value) {
        myStrings.add(value)
    }
}

2. NamedDomainObjectSet

A NamedDomainObjectSet holds a set of configurable objects, where each element has a name
associated with it.

This is similar to NamedDomainObjectContainer, however a NamedDomainObjectSet doesn’t manage
the objects in the collection. They need to be created and added manually.

You can create an instance using the ObjectFactory.namedDomainObjectSet() method.

build.gradle.kts

abstract class Person(val name: String)

abstract class MyPluginExtensionNamedDomainObjectSet {

    // Define a named domain object set to hold Person objects
    private val people: NamedDomainObjectSet<Person> =
        project.objects.namedDomainObjectSet(Person::class)

    // Add a person to the set
    fun addPerson(name: String) {
        people.add(project.objects.newInstance(Person::class.java, name))
    }
}

build.gradle

abstract class Person {
    String name
}

abstract class MyPluginExtensionNamedDomainObjectSet {

    // Define a named domain object set to hold Person objects
    NamedDomainObjectSet<Person> people = project.objects.namedDomainObjectSet(Person)

    // Add a person to the set
    void addPerson(String name) {
        def person = project.objects.newInstance(Person)
        person.name = name
        people.add(person)
    }
}

3. NamedDomainObjectList

A NamedDomainObjectList holds a list of configurable objects, where each element has a name
associated with it.

This is similar to NamedDomainObjectContainer, however a NamedDomainObjectList doesn’t manage
the objects in the collection. They need to be created and added manually.

You can create an instance using the ObjectFactory.namedDomainObjectList() method.

build.gradle.kts

abstract class Person(val name: String)

abstract class MyPluginExtensionNamedDomainObjectList {

    // Define a named domain object list to hold Person objects
    private val people: NamedDomainObjectList<Person> =
        project.objects.namedDomainObjectList(Person::class)

    // Add a person to the list
    fun addPerson(name: String) {
        people.add(project.objects.newInstance(Person::class.java, name))
    }
}
build.gradle

abstract class Person {
    String name
}

abstract class MyPluginExtensionNamedDomainObjectList {

    // Define a named domain object list to hold Person objects
    NamedDomainObjectList<Person> people = project.objects.namedDomainObjectList(Person)

    // Add a person to the list
    void addPerson(String name) {
        def person = project.objects.newInstance(Person)
        person.name = name
        people.add(person)
    }
}
}

4. NamedDomainObjectContainer

A NamedDomainObjectContainer manages a set of objects, where each element has a name associated
with it.

The container takes care of creating and configuring the elements, and provides a DSL that build
scripts can use to define and configure elements. It is intended to hold objects which are themselves
configurable, for example a set of custom Gradle objects.

Gradle uses the NamedDomainObjectContainer type extensively throughout the API. For example, the
project.tasks object used to manage the tasks of a project is a NamedDomainObjectContainer<Task>.

You can create a container instance using the ObjectFactory service, which provides the
ObjectFactory.domainObjectContainer() method. This is also available using the Project.container()
method, however in a custom Gradle type it’s generally better to use the injected ObjectFactory
service instead of passing around a Project instance.

You can also create a container instance using a read-only managed property.

build.gradle.kts

abstract class Person(val name: String)

abstract class MyPluginExtensionNamedDomainObjectContainer {

    // Define a named domain object container to hold Person objects
    private val people: NamedDomainObjectContainer<Person> =
        project.objects.domainObjectContainer(Person::class)

    // Add a person to the container
    fun addPerson(name: String) {
        people.create(name)
    }
}

build.gradle

abstract class Person {
    String name
}

abstract class MyPluginExtensionNamedDomainObjectContainer {

    // Define a named domain object container to hold Person objects
    NamedDomainObjectContainer<Person> people = project.objects.domainObjectContainer(Person)

    // Add a person to the container
    void addPerson(String name) {
        people.create(name)
    }
}
}

In order to use a type with any of the domainObjectContainer() methods, it must either

• be a named managed type; or

• expose a property named “name” as the unique, and constant, name for the object. The
domainObjectContainer(Class) variant of the method creates new instances by calling the
constructor of the class that takes a string argument, which is the desired name of the object.

Objects created this way are treated as custom Gradle types, and so can make use of the features
discussed in this chapter, for example service injection or managed properties.

See the above link for domainObjectContainer() method variants that allow custom instantiation
strategies:

public interface DownloadExtension {

    NamedDomainObjectContainer<Resource> getResources();
}

public interface Resource {

    // Type must have a read-only 'name' property
    String getName();

    Property<URI> getUri();

    Property<String> getUserName();
}
For each container property, Gradle automatically adds a block to the Groovy and Kotlin DSL that
you can use to configure the contents of the container:

build.gradle.kts

plugins {
    id("[Link]")
}

download {
    // Can use a block to configure the container contents
    resources {
        register("gradle") {
            uri = uri("[Link]")
        }
    }
}

build.gradle

plugins {
    id("[Link]")
}

download {
    // Can use a block to configure the container contents
    resources {
        register('gradle') {
            uri = uri('[Link]')
        }
    }
}

5. ExtensiblePolymorphicDomainObjectContainer

An ExtensiblePolymorphicDomainObjectContainer is a NamedDomainObjectContainer that allows you
to define instantiation strategies for different types of objects.

You can create an instance using the ObjectFactory.polymorphicDomainObjectContainer() method:

build.gradle.kts

abstract class Animal(val name: String)

class Dog(name: String, val breed: String) : Animal(name)

abstract class MyPluginExtensionExtensiblePolymorphicDomainObjectContainer(objectFactory: ObjectFactory) {

    // Define a container for animals
    private val animals: ExtensiblePolymorphicDomainObjectContainer<Animal> =
        objectFactory.polymorphicDomainObjectContainer(Animal::class)

    // Add a dog to the container
    fun addDog(name: String, breed: String) {
        val dog = Dog(name, breed)
        animals.add(dog)
    }
}

build.gradle

abstract class Animal {
    String name
}

abstract class Dog extends Animal {
    String breed
}

abstract class MyPluginExtensionExtensiblePolymorphicDomainObjectContainer {

    // Define a container for animals
    ExtensiblePolymorphicDomainObjectContainer<Animal> animals
    private ObjectFactory objectFactory

    MyPluginExtensionExtensiblePolymorphicDomainObjectContainer(ObjectFactory objectFactory) {
        // Create the container
        this.objectFactory = objectFactory
        animals = objectFactory.polymorphicDomainObjectContainer(Animal)
    }

    // Add a dog to the container
    void addDog(String name, String breed) {
        def dog = objectFactory.newInstance(Dog)
        dog.name = name
        dog.breed = breed
        animals.add(dog)
    }
}

Understanding Services and Service Injection


Gradle provides a number of useful services that can be used by custom Gradle types. For example,
the WorkerExecutor service can be used by a task to run work in parallel, as seen in the worker API
section. The services are made available through service injection.
Available services

The following services are available for injection:

1. ObjectFactory - Allows model objects to be created.

2. ProjectLayout - Provides access to key project locations.

3. BuildLayout - Provides access to important locations for a Gradle build.

4. ProviderFactory - Creates Provider instances.

5. WorkerExecutor - Allows a task to run work in parallel.

6. FileSystemOperations - Allows a task to run operations on the filesystem such as deleting files,
copying files or syncing directories.

7. ArchiveOperations - Allows a task to run operations on archive files such as ZIP or TAR files.

8. ExecOperations - Allows a task to run external processes with dedicated support for running
external Java programs.

9. ToolingModelBuilderRegistry - Allows a plugin to register a Gradle tooling API model.

Out of the above, ProjectLayout and WorkerExecutor services are only available for injection in
project plugins. BuildLayout is only available in settings plugins and settings files. ProjectLayout is
unavailable in Worker API actions.

1. ObjectFactory

ObjectFactory is a service for creating custom Gradle types, allowing you to define nested objects
and DSLs in your build logic. It provides methods for creating instances of different types, such as
properties (Property<T>), collections (ListProperty<T>, SetProperty<T>, MapProperty<K, V>), file-
related objects (RegularFileProperty, DirectoryProperty, ConfigurableFileCollection,
ConfigurableFileTree), and more.

You can obtain an instance of ObjectFactory using the project.objects property. Here’s a simple
example demonstrating how to use ObjectFactory to create a property and set its value:

build.gradle.kts

tasks.register("myObjectFactoryTask") {
    doLast {
        val objectFactory = project.objects
        val myProperty = objectFactory.property(String::class)
        myProperty.set("Hello, Gradle!")
        println(myProperty.get())
    }
}
build.gradle

tasks.register("myObjectFactoryTask") {
    doLast {
        def objectFactory = project.objects
        def myProperty = objectFactory.property(String)
        myProperty.set("Hello, Gradle!")
        println myProperty.get()
    }
}

TIP It is preferable to let Gradle create objects automatically by using managed properties.

Using ObjectFactory to create these objects ensures that they are properly managed by Gradle,
especially in terms of configuration avoidance and lazy evaluation. This means that the values of
these objects are only calculated when needed, which can improve build performance.
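As an illustrative sketch of that laziness (not from the manual; it assumes a build.gradle.kts context where `objects` and `providers` are in scope):

```kotlin
val message = objects.property(String::class.java)
message.set(providers.provider {
    // This block only runs when message.get() is first called,
    // not when the provider is created or assigned
    println("Computing value...")
    "Hello, Gradle!"
})

// Nothing has been computed yet; querying the property triggers the callable
println(message.get())
```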

In the following example, a project extension called DownloadExtension receives an ObjectFactory
instance through its constructor. The constructor uses this to create a nested Resource object (also a
custom Gradle type) and makes this object available through the resource property:

DownloadExtension.java

public class DownloadExtension {

    // A nested instance
    private final Resource resource;

    @Inject
    public DownloadExtension(ObjectFactory objectFactory) {
        // Use an injected ObjectFactory to create a Resource object
        resource = objectFactory.newInstance(Resource.class);
    }

    public Resource getResource() {
        return resource;
    }
}

public interface Resource {

    Property<URI> getUri();
}

Here is another example:


build.gradle.kts

abstract class MyObjectFactoryTask
    @Inject constructor(private var objectFactory: ObjectFactory) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        val outputDirectory = objectFactory.directoryProperty()
        outputDirectory.set(project.layout.projectDirectory)
        println(outputDirectory.get())
    }
}

tasks.register("myInjectedObjectFactoryTask", MyObjectFactoryTask::class) {}

build.gradle

abstract class MyObjectFactoryTask extends DefaultTask {

    private ObjectFactory objectFactory

    @Inject //@javax.inject.Inject
    MyObjectFactoryTask(ObjectFactory objectFactory) {
        this.objectFactory = objectFactory
    }

    @TaskAction
    void doTaskAction() {
        var outputDirectory = objectFactory.directoryProperty()
        outputDirectory.set(project.layout.projectDirectory)
        println(outputDirectory.get())
    }
}

tasks.register("myInjectedObjectFactoryTask", MyObjectFactoryTask) {}

The MyObjectFactoryTask task uses an ObjectFactory instance, which is injected into the task’s
constructor using the @Inject annotation.

2. ProjectLayout

ProjectLayout is a service that provides access to the layout of a Gradle project’s directories and
files. It’s part of the org.gradle.api.file package and allows you to query the project’s layout to get
information about source sets, build directories, and other file-related aspects of the project.

You can obtain a ProjectLayout instance from a Project object using the project.layout property.
Here’s a simple example:

build.gradle.kts

tasks.register("showLayout") {
    doLast {
        val layout = project.layout
        println("Project Directory: ${layout.projectDirectory}")
        println("Build Directory: ${layout.buildDirectory.get()}")
    }
}

build.gradle

tasks.register('showLayout') {
    doLast {
        def layout = project.layout
        println "Project Directory: ${layout.projectDirectory}"
        println "Build Directory: ${layout.buildDirectory.get()}"
    }
}

Here is an example:

build.gradle.kts

abstract class MyProjectLayoutTask
    @Inject constructor(private var projectLayout: ProjectLayout) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        val outputDirectory = projectLayout.buildDirectory.get()
        println(outputDirectory)
    }
}

tasks.register("myInjectedProjectLayoutTask", MyProjectLayoutTask::class) {}
build.gradle

abstract class MyProjectLayoutTask extends DefaultTask {

    private ProjectLayout projectLayout

    @Inject //@javax.inject.Inject
    MyProjectLayoutTask(ProjectLayout projectLayout) {
        this.projectLayout = projectLayout
    }

    @TaskAction
    void doTaskAction() {
        var outputDirectory = projectLayout.buildDirectory.get()
        println(outputDirectory)
    }
}

tasks.register("myInjectedProjectLayoutTask", MyProjectLayoutTask) {}

The MyProjectLayoutTask task uses a ProjectLayout instance, which is injected into the task’s
constructor using the @Inject annotation.

3. BuildLayout

BuildLayout is a service that provides access to the root and settings directories in a Settings plugin
or a Settings script; it is analogous to ProjectLayout. It’s part of the org.gradle.api.file package and
provides access to standard build-wide file system locations as lazily computed values.

NOTE: These APIs are currently incubating but eventually should replace the existing
accessors in Settings, which return eagerly computed locations:

Settings.rootDir → settings.layout.rootDirectory
Settings.settingsDir → settings.layout.settingsDirectory

You can obtain a BuildLayout instance from a Settings object using the settings.layout property.
Here’s a simple example:

settings.gradle.kts

println("Root Directory: ${layout.rootDirectory}")
println("Settings Directory: ${layout.settingsDirectory}")
settings.gradle

println "Root Directory: ${layout.rootDirectory}"
println "Settings Directory: ${layout.settingsDirectory}"

Here is an example:

settings.gradle.kts

abstract class MyBuildLayoutPlugin @Inject constructor(private val buildLayout: BuildLayout) : Plugin<Settings> {
    override fun apply(settings: Settings) {
        println(buildLayout.rootDirectory)
    }
}

apply<MyBuildLayoutPlugin>()

settings.gradle

abstract class MyBuildLayoutPlugin implements Plugin<Settings> {

    private BuildLayout buildLayout

    @Inject //@javax.inject.Inject
    MyBuildLayoutPlugin(BuildLayout buildLayout) {
        this.buildLayout = buildLayout
    }

    @Override void apply(Settings settings) {
        // the meat and potatoes of the plugin
        println buildLayout.rootDirectory
    }
}

apply plugin: MyBuildLayoutPlugin

This code defines a MyBuildLayoutPlugin plugin that implements the Plugin interface for the Settings
type. The plugin expects a BuildLayout instance to be injected into its constructor using the @Inject
annotation.
4. ProviderFactory

ProviderFactory is a service that provides methods for creating different types of providers.
Providers are used to model values that may be computed lazily in your build scripts.

The ProviderFactory interface provides methods for creating various types of providers, including:

• provider(Callable<T> value) to create a provider with a value that is lazily computed based on a
Callable.

• provider(Provider<T> value) to create a provider that simply wraps an existing provider.

• property(Class<T> type) to create a property provider for a specific type.

• gradleProperty(Class<T> type) to create a property provider that reads its value from a Gradle
project property.
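For example, gradleProperty() yields a provider backed by a Gradle property. A brief sketch (not from the manual; `myVersion` is a hypothetical property supplied in gradle.properties or via -PmyVersion=...):

```kotlin
// build.gradle.kts
// The property is only read when the provider is queried
val versionProvider: Provider<String> = providers.gradleProperty("myVersion")

println(versionProvider.getOrElse("unspecified"))
```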

Here’s a simple example demonstrating the use of ProviderFactory using [Link]:

build.gradle.kts

tasks.register("printMessage") {
    doLast {
        val providerFactory = project.providers
        val messageProvider = providerFactory.provider { "Hello, Gradle!" }
        println(messageProvider.get())
    }
}

build.gradle

tasks.register('printMessage') {
    doLast {
        def providerFactory = project.providers
        def messageProvider = providerFactory.provider { "Hello, Gradle!" }
        println messageProvider.get()
    }
}

The task named printMessage uses the ProviderFactory to create a provider that supplies the
message string.

Here is an example:


build.gradle.kts

abstract class MyProviderFactoryTask
    @Inject constructor(private var providerFactory: ProviderFactory) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        val outputDirectory = providerFactory.provider { "build/output" }
        println(outputDirectory.get())
    }
}

tasks.register("myInjectedProviderFactoryTask", MyProviderFactoryTask::class) {}

build.gradle

abstract class MyProviderFactoryTask extends DefaultTask {

    private ProviderFactory providerFactory

    @Inject //@javax.inject.Inject
    MyProviderFactoryTask(ProviderFactory providerFactory) {
        this.providerFactory = providerFactory
    }

    @TaskAction
    void doTaskAction() {
        var outputDirectory = providerFactory.provider { "build/output" }
        println(outputDirectory.get())
    }
}

tasks.register("myInjectedProviderFactoryTask", MyProviderFactoryTask) {}

The ProviderFactory service is injected into the MyProviderFactoryTask task’s constructor using the
@Inject annotation.

5. WorkerExecutor

WorkerExecutor is a service that allows you to perform parallel execution of tasks using worker
processes. This is particularly useful for tasks that perform CPU-intensive or long-running
operations, as it allows them to be executed in parallel, improving build performance.

Using WorkerExecutor, you can submit units of work (called actions) to be executed in separate
worker processes. This helps isolate the work from the main Gradle process, providing better
reliability and performance.

Here’s a basic example of how you might use WorkerExecutor in a build script:

build.gradle.kts

abstract class MyWorkAction : WorkAction<WorkParameters.None> {

    private val greeting: String = "Hello from a Worker!"

    override fun execute() {
        println(greeting)
    }
}

abstract class MyWorkerTask
    @Inject constructor(private var workerExecutor: WorkerExecutor) : DefaultTask() {

    @get:Input
    abstract val booleanFlag: Property<Boolean>

    @TaskAction
    fun doThings() {
        workerExecutor.noIsolation().submit(MyWorkAction::class.java) {}
    }
}

tasks.register("myWorkTask", MyWorkerTask::class) {}

build.gradle

abstract class MyWorkAction implements WorkAction<WorkParameters.None> {

    private final String greeting;

    @Inject
    public MyWorkAction() {
        this.greeting = "Hello from a Worker!";
    }

    @Override
    public void execute() {
        System.out.println(greeting);
    }
}

abstract class MyWorkerTask extends DefaultTask {
    @Input
    abstract Property<Boolean> getBooleanFlag()

    @Inject
    abstract WorkerExecutor getWorkerExecutor()

    @TaskAction
    void doThings() {
        workerExecutor.noIsolation().submit(MyWorkAction) {}
    }
}

tasks.register("myWorkTask", MyWorkerTask) {}

See the worker API for more details.

6. FileSystemOperations

FileSystemOperations is a service that provides methods for performing file system operations such
as copying, deleting, and syncing. It is part of the org.gradle.api.file package and is typically used
in custom tasks or plugins to interact with the file system.

Here is an example:

build.gradle.kts

abstract class MyFileSystemOperationsTask
    @Inject constructor(private var fileSystemOperations: FileSystemOperations) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        fileSystemOperations.copy {
            from("src")
            into("dest")
        }
    }
}

tasks.register("myInjectedFileSystemOperationsTask", MyFileSystemOperationsTask::class)
build.gradle

abstract class MyFileSystemOperationsTask extends DefaultTask {

    private FileSystemOperations fileSystemOperations

    @Inject //@javax.inject.Inject
    MyFileSystemOperationsTask(FileSystemOperations fileSystemOperations) {
        this.fileSystemOperations = fileSystemOperations
    }

    @TaskAction
    void doTaskAction() {
        fileSystemOperations.copy {
            from 'src'
            into 'dest'
        }
    }
}

tasks.register("myInjectedFileSystemOperationsTask", MyFileSystemOperationsTask)

The FileSystemOperations service is injected into the MyFileSystemOperationsTask task’s constructor
using the @Inject annotation.

With some ceremony, it is possible to use FileSystemOperations in an ad-hoc task defined in a build
script:

build.gradle.kts

interface InjectedFsOps {
    @get:Inject val fs: FileSystemOperations
}

tasks.register("myAdHocFileSystemOperationsTask") {
    val injected = project.objects.newInstance<InjectedFsOps>()
    doLast {
        injected.fs.copy {
            from("src")
            into("dest")
        }
    }
}
build.gradle

interface InjectedFsOps {
    @Inject //@javax.inject.Inject
    FileSystemOperations getFs()
}

tasks.register('myAdHocFileSystemOperationsTask') {
    def injected = project.objects.newInstance(InjectedFsOps)
    doLast {
        injected.fs.copy {
            from 'source'
            into 'destination'
        }
    }
}

First, you need to declare an interface with a property of type FileSystemOperations, here named
InjectedFsOps, to serve as an injection point. Then call the method ObjectFactory.newInstance() to
generate an implementation of the interface that holds an injected service.

TIP This is a good time to consider extracting the ad-hoc task into a proper class.

7. ArchiveOperations

ArchiveOperations is a service that provides methods for accessing the contents of archives, such as
ZIP and TAR files. It is part of the org.gradle.api.file package and is typically used in custom tasks
or plugins to unpack archive files.

Here is an example:

build.gradle.kts

abstract class MyArchiveOperationsTask
    @Inject constructor(
        private val archiveOperations: ArchiveOperations,
        private val layout: ProjectLayout,
        private val fs: FileSystemOperations
    ) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        fs.copy {
            from(archiveOperations.zipTree(layout.projectDirectory.file("sources.jar")))
            into(layout.buildDirectory.dir("unpacked-sources"))
        }
    }
}

tasks.register("myInjectedArchiveOperationsTask", MyArchiveOperationsTask::class)

build.gradle

abstract class MyArchiveOperationsTask extends DefaultTask {

    private ArchiveOperations archiveOperations
    private ProjectLayout layout
    private FileSystemOperations fs

    @Inject
    MyArchiveOperationsTask(ArchiveOperations archiveOperations,
                            ProjectLayout layout, FileSystemOperations fs) {
        this.archiveOperations = archiveOperations
        this.layout = layout
        this.fs = fs
    }

    @TaskAction
    void doTaskAction() {
        fs.copy {
            from(archiveOperations.zipTree(layout.projectDirectory.file("sources.jar")))
            into(layout.buildDirectory.dir("unpacked-sources"))
        }
    }
}

tasks.register("myInjectedArchiveOperationsTask", MyArchiveOperationsTask)

The ArchiveOperations service is injected into the MyArchiveOperationsTask task’s constructor using
the @Inject annotation.

With some ceremony, it is possible to use ArchiveOperations in an ad-hoc task defined in a build
script:

build.gradle.kts

interface InjectedArcOps {
    @get:Inject val arcOps: ArchiveOperations
}

tasks.register("myAdHocArchiveOperationsTask") {
    val injected = project.objects.newInstance<InjectedArcOps>()
    val archiveFile = "${projectDir}/sources.jar"
    doLast {
        injected.arcOps.zipTree(archiveFile)
    }
}

build.gradle

interface InjectedArcOps {
    @Inject //@javax.inject.Inject
    ArchiveOperations getArcOps()
}

tasks.register('myAdHocArchiveOperationsTask') {
    def injected = project.objects.newInstance(InjectedArcOps)
    def archiveFile = "${projectDir}/sources.jar"

    doLast {
        injected.arcOps.zipTree(archiveFile)
    }
}

First, you need to declare an interface with a property of type ArchiveOperations, here named
InjectedArcOps, to serve as an injection point. Then call the method ObjectFactory.newInstance() to
generate an implementation of the interface that holds an injected service.

TIP This is a good time to consider extracting the ad-hoc task into a proper class.

8. ExecOperations

ExecOperations is a service that provides methods for executing external processes (commands)
from within a build script. It is part of the org.gradle.process package and is typically used in
custom tasks or plugins to run command-line tools or scripts as part of the build process.

Here is an example:

build.gradle.kts

abstract class MyExecOperationsTask
    @Inject constructor(private var execOperations: ExecOperations) : DefaultTask() {

    @TaskAction
    fun doTaskAction() {
        execOperations.exec {
            commandLine("ls", "-la")
        }
    }
}

tasks.register("myInjectedExecOperationsTask", MyExecOperationsTask::class)

build.gradle

abstract class MyExecOperationsTask extends DefaultTask {

    private ExecOperations execOperations

    @Inject //@javax.inject.Inject
    MyExecOperationsTask(ExecOperations execOperations) {
        this.execOperations = execOperations
    }

    @TaskAction
    void doTaskAction() {
        execOperations.exec {
            commandLine 'ls', '-la'
        }
    }
}

tasks.register("myInjectedExecOperationsTask", MyExecOperationsTask)

The ExecOperations service is injected into the MyExecOperationsTask task’s constructor using the
@Inject annotation.

With some ceremony, it is possible to use ExecOperations in an ad-hoc task defined in a build script:

build.gradle.kts

interface InjectedExecOps {
    @get:Inject val execOps: ExecOperations
}

tasks.register("myAdHocExecOperationsTask") {
    val injected = project.objects.newInstance<InjectedExecOps>()

    doLast {
        injected.execOps.exec {
            commandLine("ls", "-la")
        }
    }
}

build.gradle

interface InjectedExecOps {
    @Inject //@javax.inject.Inject
    ExecOperations getExecOps()
}

tasks.register('myAdHocExecOperationsTask') {
    def injected = project.objects.newInstance(InjectedExecOps)

    doLast {
        injected.execOps.exec {
            commandLine 'ls', '-la'
        }
    }
}

First, you need to declare an interface with a property of type ExecOperations, here named
InjectedExecOps, to serve as an injection point. Then call the method ObjectFactory.newInstance() to
generate an implementation of the interface that holds an injected service.

TIP This is a good time to consider extracting the ad-hoc task into a proper class.

9. ToolingModelBuilderRegistry

ToolingModelBuilderRegistry is a service that allows you to register custom tooling model builders.
Tooling models are used to provide rich IDE integration for Gradle projects, allowing IDEs to
understand and work with the project’s structure, dependencies, and other aspects.

The ToolingModelBuilderRegistry interface is part of the org.gradle.tooling.provider.model package
and is typically used in custom Gradle plugins that provide enhanced IDE support.

Here’s a simplified example:

build.gradle.kts

// Implements the ToolingModelBuilder interface.
// This interface is used in Gradle to define custom tooling models that can
// be accessed by IDEs or other tools through the Gradle tooling API.
class OrtModelBuilder : ToolingModelBuilder {
    private val repositories: MutableMap<String, String> = mutableMapOf()

    private val platformCategories: Set<String> = setOf("platform", "enforced-platform")

    private val visitedDependencies: MutableSet<ModuleComponentIdentifier> = mutableSetOf()
    private val visitedProjects: MutableSet<ModuleVersionIdentifier> = mutableSetOf()

    private val logger = Logging.getLogger(OrtModelBuilder::class.java)

    private val errors: MutableList<String> = mutableListOf()
    private val warnings: MutableList<String> = mutableListOf()

    override fun canBuild(modelName: String): Boolean {
        return false
    }

    override fun buildAll(modelName: String, project: Project): Any? {
        return null
    }
}

// Plugin is responsible for registering a custom tooling model builder
// (OrtModelBuilder) with the ToolingModelBuilderRegistry, which allows
// IDEs and other tools to access the custom tooling model.
class OrtModelPlugin(private val registry: ToolingModelBuilderRegistry) : Plugin<Project> {
    override fun apply(project: Project) {
        registry.register(OrtModelBuilder())
    }
}

build.gradle

// Implements the ToolingModelBuilder interface.
// This interface is used in Gradle to define custom tooling models that can
// be accessed by IDEs or other tools through the Gradle tooling API.
class OrtModelBuilder implements ToolingModelBuilder {
    private Map<String, String> repositories = [:]

    private Set<String> platformCategories = ["platform", "enforced-platform"]

    private Set<ModuleComponentIdentifier> visitedDependencies = []
    private Set<ModuleVersionIdentifier> visitedProjects = []

    private static final logger = Logging.getLogger(OrtModelBuilder)

    private List<String> errors = []
    private List<String> warnings = []

    @Override
    boolean canBuild(String modelName) {
        return false
    }

    @Override
    Object buildAll(String modelName, Project project) {
        return null
    }
}

// Plugin is responsible for registering a custom tooling model builder
// (OrtModelBuilder) with the ToolingModelBuilderRegistry, which allows
// IDEs and other tools to access the custom tooling model.
class OrtModelPlugin implements Plugin<Project> {
    ToolingModelBuilderRegistry registry

    OrtModelPlugin(ToolingModelBuilderRegistry registry) {
        this.registry = registry
    }

    void apply(Project project) {
        registry.register(new OrtModelBuilder())
    }
}

Constructor injection

There are two ways that an object can receive the services that it needs. The first option is to add the
service as a parameter of the class constructor. The constructor must be annotated with the
javax.inject.Inject annotation. Gradle uses the declared type of each constructor parameter to
determine the services that the object requires. The order of the constructor parameters and their
names are not significant and can be whatever you like.

Here is an example that shows a task type that receives an ObjectFactory via its constructor:

Download.java

public class Download extends DefaultTask {

    private final DirectoryProperty outputDirectory;

    // Inject an ObjectFactory into the constructor
    @Inject
    public Download(ObjectFactory objectFactory) {
        // Use the factory
        outputDirectory = objectFactory.directoryProperty();
    }

    @OutputDirectory
    public DirectoryProperty getOutputDirectory() {
        return outputDirectory;
    }

    @TaskAction
    void run() {
        // ...
    }
}

Property injection

Alternatively, a service can be injected by adding a property getter method annotated with the
javax.inject.Inject annotation to the class. This can be useful, for example, when you cannot
change the constructor of the class due to backwards compatibility constraints. This pattern also
allows Gradle to defer creation of the service until the getter method is called, rather than when the
instance is created. This can help with performance. Gradle uses the declared return type of the
getter method to determine the service to make available. The name of the property is not
significant and can be whatever you like.

The property getter method must be public or protected. The method can be abstract or, in cases
where this isn’t possible, can have a dummy method body. The method body is discarded.

Here is an example that shows a task type that receives two services via property getter methods:

Download.java

public abstract class Download extends DefaultTask {
    // Use an abstract getter method
    @Inject
    protected abstract ObjectFactory getObjectFactory();

    // Alternatively, use a getter method with a dummy implementation
    @Inject
    protected WorkerExecutor getWorkerExecutor() {
        // Method body is ignored
        throw new UnsupportedOperationException();
    }

    @TaskAction
    void run() {
        WorkerExecutor workerExecutor = getWorkerExecutor();
        ObjectFactory objectFactory = getObjectFactory();
        // Use the executor and factory ...
    }
}
STRUCTURING BUILDS
Structuring Projects with Gradle
It is important to structure your Gradle project to optimize build performance. A multi-project build
is the standard in Gradle.

A multi-project build consists of one root project and one or more subprojects. Gradle can build the
root project and any number of the subprojects in a single execution.

Project locations

Multi-project builds contain a single root project in a directory that Gradle views as the root path: ..

Subprojects are located physically under the root path: ./subproject.

A subproject has a path, which denotes the position of that subproject in the multi-project build. In
most cases, the project path is consistent with its location in the file system.

The project structure is created in the settings.gradle(.kts) file. The settings file must be present
in the root directory.

A simple multi-project build

Let’s look at a basic multi-project build example that contains a root project and a single subproject.

The root project is called basic-multiproject, located somewhere on your machine. From Gradle’s
perspective, the root is the top-level directory ..

The project contains a single subproject called ./app:

.
├── app
│   ...
│   └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│   ...
│   └── build.gradle
└── settings.gradle

This is the recommended project structure for starting any Gradle project. The build init plugin also
generates skeleton projects that follow this structure: a root project with a single subproject.

The settings.gradle(.kts) file describes the project structure to Gradle:

settings.gradle.kts

rootProject.name = "basic-multiproject"
include("app")

settings.gradle

rootProject.name = 'basic-multiproject'
include 'app'

In this case, Gradle will look for a build file for the app subproject in the ./app directory.

You can view the structure of a multi-project build by running the projects command:

$ ./gradlew -q projects

Projects:

------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------

Root project 'basic-multiproject'
\--- Project ':app'

To see a list of the tasks of a project, run gradle <project-path>:tasks
For example, try running gradle :app:tasks

In this example, the app subproject is a Java application that applies the application plugin and
configures the main class. The application prints Hello World to the console:

app/build.gradle.kts

plugins {
    id("application")
}

application {
    mainClass = "com.example.Hello"
}

app/build.gradle

plugins {
    id 'application'
}

application {
    mainClass = 'com.example.Hello'
}

app/src/main/java/com/example/Hello.java

package com.example;

public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world!");
    }
}

You can run the application by executing the run task from the application plugin in the project
root:

$ ./gradlew -q run
Hello, world!
Adding a subproject

In the settings file, you can use the include method to add another subproject to the root project:

settings.gradle.kts

include("project1", "project2:child1", "project3:child1")

settings.gradle

include 'project1', 'project2:child1', 'project3:child1'

The include method takes project paths as arguments. The project path is assumed to be equal to
the relative physical file system path. For example, a path services:api is mapped by default to a
folder ./services/api (relative to the project root .).

More examples of how to work with the project path can be found in the DSL documentation of
Settings.include(java.lang.String[]).

Let’s add another subproject called lib to the previously created project.

All we need to do is add another include statement in the root settings file:

settings.gradle.kts

rootProject.name = "basic-multiproject"
include("app")
include("lib")

settings.gradle

rootProject.name = 'basic-multiproject'
include 'app'
include 'lib'

Gradle will then look for the build file of the new lib subproject in the ./lib/ directory:

.
├── app
│   ...
│   └── build.gradle.kts
├── lib
│   ...
│   └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│   ...
│   └── build.gradle
├── lib
│   ...
│   └── build.gradle
└── settings.gradle

Project Descriptors

To further describe the project architecture to Gradle, the settings file provides project descriptors.

You can modify these descriptors in the settings file at any time.

To access a descriptor, you can:

settings.gradle.kts

include("project-a")
println(rootProject.name)
println(project(":project-a").name)

settings.gradle

include('project-a')
println rootProject.name
println project(':project-a').name

Using this descriptor, you can change the name, project directory, and build file of a project:
settings.gradle.kts

rootProject.name = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "project-a.gradle.kts"

settings.gradle

rootProject.name = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = 'project-a.gradle'

Consult the ProjectDescriptor class in the API documentation for more information.

Modifying a subproject path

Let’s take a hypothetical project with the following structure:

.
├── app
│   ...
│   └── build.gradle.kts
├── subs // Gradle may see this as a subproject
│   └── web // Gradle may see this as a subproject
│       └── my-web-module // Intended subproject
│           ...
│           └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│   ...
│   └── build.gradle
├── subs // Gradle may see this as a subproject
│   └── web // Gradle may see this as a subproject
│       └── my-web-module // Intended subproject
│           ...
│           └── build.gradle
└── settings.gradle

If your settings.gradle(.kts) looks like this:

include(':subs:web:my-web-module')

Gradle sees a subproject with a logical project name of :subs:web:my-web-module and two, possibly
unintentional, other subprojects logically named :subs and :subs:web. This can lead to phantom
build directories, especially when using allprojects {} or subprojects {}.

To avoid this, you can use:

include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')

So that you only end up with a single subproject named :my-web-module.

So, while the physical project layout is the same, the logical results are different.

Naming recommendations

As your project grows, naming and consistency get increasingly more important. To keep your
builds maintainable, we recommend the following:

1. Keep default project names for subprojects: It is possible to configure custom project names
in the settings file. However, it’s an unnecessary extra effort for the developers to track which
projects belong to what folders.

2. Use lower case hyphenation for all project names: All letters are lowercase, and words are
separated with a dash (-) character.

3. Define the root project name in the settings file: The rootProject.name effectively assigns a
name to the build, used in reports like Build Scans. If the root project name is not set, the name
will be the container directory name, which can be unstable (i.e., you can check out your project
in any directory). The name will be generated randomly if the root project name is not set and
the project is checked out to a file system’s root (e.g., / or C:\).

Declaring Dependencies between Subprojects


What if one subproject depends on another subproject? What if one project needs the artifact
produced by another project?
This is a common use case for multi-project builds. Gradle offers project dependencies for this.

Depending on another project

Let’s explore a theoretical multi-project build with the following layout:

.
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts

.
├── api
│   ├── src
│   │   └──...
│   └── build.gradle
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle
└── settings.gradle

In this example, there are three subprojects called shared, api, and person-service:

1. The person-service subproject depends on the other two subprojects, shared and api.

2. The api subproject depends on the shared subproject.

We use the : separator to define a project path such as services:person-service or :shared. Consult
the DSL documentation of Settings.include(java.lang.String[]) for more information about defining
project paths.

settings.gradle.kts

rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")

shared/build.gradle.kts

plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
}

api/build.gradle.kts

plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
    implementation(project(":shared"))
}

services/person-service/build.gradle.kts

plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
    implementation(project(":shared"))
    implementation(project(":api"))
}

settings.gradle

rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'

shared/build.gradle

plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
}

api/build.gradle

plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
    implementation project(':shared')
}

services/person-service/build.gradle

plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
    implementation project(':shared')
    implementation project(':api')
}

A project dependency affects execution order. It causes the other project to be built first and adds
the output with the classes of the other project to the classpath. It also adds the dependencies of the
other project to the classpath.

If you execute ./gradlew :api:compile, first the shared project is built, and then the api project is
built.

Depending on artifacts produced by another project

Sometimes, you might want to depend on the output of a specific task within another project rather
than the entire project. However, explicitly declaring a task dependency from one project to
another is discouraged as it introduces unnecessary coupling between tasks.

The recommended way to model dependencies, where a task in one project depends on the output
of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency
management engine allows you to share arbitrary artifacts between projects and build them on
demand.
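As a rough sketch of this pattern (the project, configuration, task, and file names below are illustrative, not part of the example projects above), a producer project can expose a task output on a consumable configuration, and a consumer can resolve it by targeting that configuration, without ever naming the producer's task:

```kotlin
// producer/build.gradle.kts — expose a generated file as an outgoing artifact
val makeReport = tasks.register("makeReport") {
    val out = layout.buildDirectory.file("reports/summary.txt")
    outputs.file(out)
    doLast { out.get().asFile.writeText("summary") }
}

// A consumable-only configuration that carries the artifact
configurations.create("reportArtifacts") {
    isCanBeConsumed = true
    isCanBeResolved = false
}

artifacts {
    // Attach the file and record which task builds it
    add("reportArtifacts", layout.buildDirectory.file("reports/summary.txt")) {
        builtBy(makeReport)
    }
}

// consumer/build.gradle.kts — resolve the producer's outgoing configuration
val incomingReports = configurations.create("incomingReports") {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    incomingReports(project(":producer", configuration = "reportArtifacts"))
}
```

When incomingReports is resolved, Gradle builds the producer's makeReport task on demand; the consumer declares only a dependency on the configuration, not on the task.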

Sharing Build Logic between Subprojects


Subprojects in a multi-project build typically share some common dependencies.
Instead of copying and pasting the same Java version and libraries in each subproject build script,
Gradle provides a special directory for storing shared build logic that can be automatically applied
to subprojects.

Share logic in buildSrc

buildSrc is a Gradle-recognized and protected directory which comes with some benefits:

1. Reusable Build Logic:

buildSrc allows you to organize and centralize your custom build logic, tasks, and plugins in a
structured manner. The code written in buildSrc can be reused across your project, making it
easier to maintain and share common build functionality.

2. Isolation from the Main Build:

Code placed in buildSrc is isolated from the other build scripts of your project. This helps keep
the main build scripts cleaner and more focused on project-specific configurations.

3. Automatic Compilation and Classpath:

The contents of the buildSrc directory are automatically compiled and included in the classpath
of your main build. This means that classes and plugins defined in buildSrc can be directly used
in your project’s build scripts without any additional configuration.

4. Ease of Testing:

Since buildSrc is a separate build, it allows for easy testing of your custom build logic. You can
write tests for your build code, ensuring that it behaves as expected.
5. Gradle Plugin Development:

If you are developing custom Gradle plugins for your project, buildSrc is a convenient place to
house the plugin code. This makes the plugins easily accessible within your project.

The buildSrc directory is treated as an included build.

For multi-project builds, there can be only one buildSrc directory, which must be in the root project
directory.

NOTE The downside of using buildSrc is that any change to it will invalidate every task in
your project and require a rerun.

buildSrc uses the same source code conventions applicable to Java, Groovy, and Kotlin projects. It
also provides direct access to the Gradle API.

A typical project including buildSrc has the following layout:

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──kotlin
│   │         └──MyCustomTask.kt ①
│   ├── shared.gradle.kts ②
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts ③
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts ③
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts

① Create the MyCustomTask task.

② A shared build script.

③ Uses the MyCustomTask task and shared build script.

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──groovy
│   │         └──MyCustomTask.groovy ①
│   ├── shared.gradle ②
│   └── build.gradle
├── api
│   ├── src
│   │   └──...
│   └── build.gradle ③
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle ③
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle
└── settings.gradle

① Create the MyCustomTask task.

② A shared build script.

③ Uses the MyCustomTask task and shared build script.

In the buildSrc, the build script shared.gradle(.kts) is created. It contains dependencies and other
build information that is common to multiple subprojects:

shared.gradle.kts

repositories {
    mavenCentral()
}

dependencies {
    implementation("org.slf4j:slf4j-api:1.7.32")
}

shared.gradle

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.32'
}

In the buildSrc, the MyCustomTask is also created. It is a helper task that is used as part of the build
logic for multiple subprojects:

buildSrc/src/main/kotlin/MyCustomTask.kt

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

open class MyCustomTask : DefaultTask() {
    @TaskAction
    fun calculateSum() {
        // Custom logic to calculate the sum of two numbers
        val num1 = 5
        val num2 = 7
        val sum = num1 + num2

        // Print the result
        println("Sum: $sum")
    }
}

buildSrc/src/main/groovy/MyCustomTask.groovy

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class MyCustomTask extends DefaultTask {
    @TaskAction
    void calculateSum() {
        // Custom logic to calculate the sum of two numbers
        int num1 = 5
        int num2 = 7
        int sum = num1 + num2

        // Print the result
        println "Sum: $sum"
    }
}

The MyCustomTask task is used in the build scripts of the api and shared projects. The task is
automatically available because it’s part of buildSrc.

The shared.gradle(.kts) file is also applied:

api/build.gradle.kts

// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply(from = rootProject.file("buildSrc/shared.gradle.kts"))

// Use the custom task defined in buildSrc
tasks.register<MyCustomTask>("myCustomTask")

api/build.gradle

// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply from: rootProject.file('buildSrc/shared.gradle')

// Use the custom task defined in buildSrc
tasks.register('myCustomTask', MyCustomTask)
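The ease-of-testing benefit mentioned above can be put into practice with a plain unit test. The sketch below assumes JUnit 4 has been added as a test dependency in buildSrc's own build file; the test class and method names are illustrative. It uses Gradle's ProjectBuilder fixture to create an in-memory project and register the task:

```kotlin
// buildSrc/src/test/kotlin/MyCustomTaskTest.kt
import org.gradle.testfixtures.ProjectBuilder
import org.junit.Assert.assertTrue
import org.junit.Test

class MyCustomTaskTest {
    @Test
    fun canRegisterTask() {
        // ProjectBuilder creates a dummy Project instance for unit tests
        val project = ProjectBuilder.builder().build()
        val task = project.tasks.register("myCustomTask", MyCustomTask::class.java)
        assertTrue(task.isPresent)
    }
}
```

Because buildSrc is a standalone build, ./gradlew build runs these tests automatically before the main build scripts are compiled.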

Share logic using convention plugins

Gradle’s recommended way of organizing build logic is to use its plugin system.

We can write a plugin that encapsulates the build logic common to several subprojects in a project.
This kind of plugin is called a convention plugin.

While writing plugins is outside the scope of this section, the recommended way to build a Gradle
project is to put common build logic in a convention plugin located in the buildSrc.

Let’s take a look at an example project:

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──kotlin
│   │         └──myproject.java-conventions.gradle.kts ①
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts ②
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts ②
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts ②
└── settings.gradle.kts

① Create the myproject.java-conventions convention plugin.

② Applies the myproject.java-conventions convention plugin.

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──groovy
│   │         └──myproject.java-conventions.gradle ①
│   └── build.gradle
├── api
│   ├── src
│   │   └──...
│   └── build.gradle ②
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle ②
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle ②
└── settings.gradle

① Create the myproject.java-conventions convention plugin.

② Applies the myproject.java-conventions convention plugin.

This build contains three subprojects:


settings.gradle.kts

rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")

settings.gradle

rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'

The source code for the convention plugin created in the buildSrc directory is as follows:

buildSrc/src/main/kotlin/myproject.java-conventions.gradle.kts

plugins {
    id("java")
}

group = "com.example"
version = "1.0"

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
}

buildSrc/src/main/groovy/myproject.java-conventions.gradle

plugins {
    id 'java'
}

group = 'com.example'
version = '1.0'

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
}

For the convention plugin to compile, basic configuration needs to be applied in the build file of the
buildSrc directory:

buildSrc/build.gradle.kts

plugins {
    `kotlin-dsl`
}

repositories {
    mavenCentral()
}

buildSrc/build.gradle

plugins {
    id 'groovy-gradle-plugin'
}

The convention plugin is applied to the api, shared, and person-service subprojects:

api/build.gradle.kts

plugins {
    id("myproject.java-conventions")
}

dependencies {
    implementation(project(":shared"))
}

shared/build.gradle.kts

plugins {
    id("myproject.java-conventions")
}

services/person-service/build.gradle.kts

plugins {
    id("myproject.java-conventions")
}

dependencies {
    implementation(project(":shared"))
    implementation(project(":api"))
}

api/build.gradle

plugins {
    id 'myproject.java-conventions'
}

dependencies {
    implementation project(':shared')
}

shared/build.gradle

plugins {
    id 'myproject.java-conventions'
}

services/person-service/build.gradle

plugins {
    id 'myproject.java-conventions'
}

dependencies {
    implementation project(':shared')
    implementation project(':api')
}

Do not use cross-project configuration

An improper way to share build logic between subprojects is cross-project configuration via the
subprojects {} and allprojects {} DSL constructs.
TIP Avoid using subprojects {} and allprojects {}.

With cross-project configuration, build logic can be injected into a subproject in a way that is not
obvious when looking at its build script.

In the long run, cross-project configuration usually grows in complexity and becomes a burden.
Cross-project configuration can also introduce configuration-time coupling between projects, which
can prevent optimizations like configuration-on-demand from working properly.
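For illustration, this is the kind of root build script the tip warns against; nothing in the subprojects' own build scripts reveals that the java plugin and a repository are being injected from the root:

```kotlin
// root build.gradle.kts — discouraged: configuration injected from the root
subprojects {
    apply(plugin = "java")
    repositories {
        mavenCentral()
    }
}
```

A convention plugin applied in each subproject's plugins {} block expresses the same configuration explicitly, in the build script where it takes effect.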

Convention plugins versus cross-project configuration

The two most common uses of cross-project configuration can be better modeled using convention
plugins:

1. Applying plugins or other configurations to subprojects of a certain type.


Often, the cross-project configuration logic is if subproject is of type X, then configure Y.
This is equivalent to applying X-conventions plugin directly to a subproject.

2. Extracting information from subprojects of a certain type.


This use case can be modeled using outgoing configuration variants.

Composite Builds
A composite build is a build that includes other builds.

A composite build is similar to a Gradle multi-project build, except that instead of including
subprojects, entire builds are included.

Composite builds allow you to:

• Combine builds that are usually developed independently, for instance, when trying out a bug
fix in a library that your application uses.

• Decompose a large multi-project build into smaller, more isolated chunks that can be worked on
independently or together as needed.

A build that is included in a composite build is referred to as an included build. Included builds do
not share any configuration with the composite build or the other included builds. Each included
build is configured and executed in isolation.

Defining a composite build

The following example demonstrates how two Gradle builds, normally developed separately, can be
combined into a composite build.

my-composite
├── gradle
├── gradlew
├── gradlew.bat
├── settings.gradle.kts
├── my-app
│   ├── settings.gradle.kts
│   └── app
│       ├── build.gradle.kts
│       └── src/main/java/org/sample/my-app/Main.java
└── my-utils
    ├── settings.gradle.kts
    ├── number-utils
    │   ├── build.gradle.kts
    │   └── src/main/java/org/sample/numberutils/Numbers.java
    └── string-utils
        ├── build.gradle.kts
        └── src/main/java/org/sample/stringutils/Strings.java

The my-utils multi-project build produces two Java libraries, number-utils and string-utils. The my-
app build produces an executable using functions from those libraries.

The my-app build does not depend directly on my-utils. Instead, it declares binary dependencies on
the libraries produced by my-utils:

my-app/app/build.gradle.kts

plugins {
    id("application")
}

application {
    mainClass = "org.sample.myapp.Main"
}

dependencies {
    implementation("org.sample:number-utils:1.0")
    implementation("org.sample:string-utils:1.0")
}

my-app/app/build.gradle

plugins {
    id 'application'
}

application {
    mainClass = 'org.sample.myapp.Main'
}

dependencies {
    implementation 'org.sample:number-utils:1.0'
    implementation 'org.sample:string-utils:1.0'
}

Defining a composite build via --include-build

The --include-build command-line argument turns the executed build into a composite,
substituting dependencies from the included build into the executed build.

For example, run from my-app:

$ ./gradlew --include-build ../my-utils run

This builds the number-utils and string-utils libraries from the included my-utils build before
compiling and running my-app.

Defining a composite build via the settings file

It’s possible to make the above arrangement persistent by using
Settings.includeBuild(java.lang.Object) to declare the included build in the settings.gradle(.kts)
file.

The settings file can be used to add subprojects and included builds simultaneously.

Included builds are added by location:

settings.gradle.kts

includeBuild("my-utils")
In the example, the settings.gradle(.kts) file combines otherwise separate builds:

settings.gradle.kts

rootProject.name = "my-composite"

includeBuild("my-app")
includeBuild("my-utils")

settings.gradle

rootProject.name = 'my-composite'

includeBuild 'my-app'
includeBuild 'my-utils'

To execute the run task in the my-app build from my-composite, run ./gradlew my-app:app:run.

You can optionally define a run task in my-composite that depends on my-app:app:run so that you can
execute ./gradlew run:

build.gradle.kts

tasks.register("run") {
    dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}

build.gradle

tasks.register('run') {
    dependsOn gradle.includedBuild('my-app').task(':app:run')
}

Including builds that define Gradle plugins

A special case of included builds are builds that define Gradle plugins.

These builds should be included using the includeBuild statement inside the pluginManagement {}
block of the settings file.
Using this mechanism, the included build may also contribute a settings plugin that can be applied
in the settings file itself:

settings.gradle.kts

pluginManagement {
    includeBuild("../url-verifier-plugin")
}

settings.gradle

pluginManagement {
    includeBuild '../url-verifier-plugin'
}

Restrictions on included builds

Most builds can be included in a composite, including other composite builds. There are some
restrictions.

In a regular build, Gradle ensures that each project has a unique project path. It makes projects
identifiable and addressable without conflicts.

In a composite build, Gradle adds additional qualification to each project from an included build to
avoid project path conflicts. The full path to identify a project in a composite build is called a
build-tree path. It consists of the build path of the included build and the project path of the project.

By default, build paths and project paths are derived from directory names and structure on disk.
Since included builds can be located anywhere on disk, their build path is determined by the name
of the containing directory. This can sometimes lead to conflicts.

To summarize, the included builds must fulfill these requirements:

• Each included build must have a unique build path.

• Each included build path must not conflict with any project path of the main build.

These conditions guarantee that each project can be uniquely identified even in a composite build.

If conflicts arise, the way to resolve them is by changing the build name of an included build:

settings.gradle.kts

includeBuild("some-included-build") {
name = "other-name"
}

NOTE When a composite build is included in another composite build, both builds have
the same parent. In other words, the nested composite build structure is flattened.

Interacting with a composite build

Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be
executed, tests can be run, and builds can be imported into the IDE.

Executing tasks

Tasks from an included build can be executed from the command-line or IDE in the same way as
tasks from a regular multi-project build. Executing a task will result in task dependencies being
executed, as well as those tasks required to build dependency artifacts from other included builds.

You can call a task in an included build using a fully qualified path, for example, :included-build-
name:project-name:taskName. Project and task names can be abbreviated.

$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava

$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava

To exclude a task from the command line, you need to provide the fully qualified path to the task.
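For example, assuming the hypothetical included-build and subproject-a names used above, excluding that subproject's tests from a build would look like this; the abbreviated form that works for executing tasks does not apply here:

```shell
# The fully qualified path is required when excluding an included build's task
$ ./gradlew build -x :included-build:subproject-a:test
```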

NOTE Included build tasks are automatically executed to generate required dependency
artifacts, or the including build can declare a dependency on a task from an
included build.

Importing into the IDE

One of the most useful features of composite builds is IDE integration.

Importing a composite build permits sources from separate Gradle builds to be easily developed
together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse
Project. Source dependencies are configured, providing cross-build navigation and refactoring.

Declaring dependencies substituted by an included build

By default, Gradle will configure each included build to determine the dependencies it can provide.
The algorithm for doing this is simple. Gradle will inspect the group and name for the projects in
the included build and substitute project dependencies for any external dependency matching
${project.group}:${project.name}.

NOTE By default, substitutions are not registered for the main build.
To make the (sub)projects of the main build addressable by
${project.group}:${project.name}, you can tell Gradle to treat the main build like an
included build by self-including it: includeBuild(".").

There are cases when the default substitutions determined by Gradle are insufficient or must be
corrected for a particular composite. For these cases, explicitly declaring the substitutions for an
included build is possible.

For example, a single-project build called anonymous-library, produces a Java utility library but does
not declare a value for the group attribute:

build.gradle.kts

plugins {
    java
}

build.gradle

plugins {
    id 'java'
}

When this build is included in a composite, it will attempt to substitute for the dependency module
undefined:anonymous-library (undefined being the default value for project.group, and anonymous-
library being the root project name). Clearly, this isn’t useful in a composite build.

To use the unpublished library in a composite build, you can explicitly declare the substitutions
that it provides:

settings.gradle.kts

includeBuild("anonymous-library") {
    dependencySubstitution {
        substitute(module("org.sample:number-utils")).using(project(":"))
    }
}

settings.gradle

includeBuild('anonymous-library') {
    dependencySubstitution {
        substitute module('org.sample:number-utils') using project(':')
    }
}

With this configuration, the my-app composite build will substitute any dependency on
org.sample:number-utils with a dependency on the root project of anonymous-library.

Deactivate included build substitutions for a configuration

If you need to resolve a published version of a module that is also available as part of an included
build, you can deactivate the included build substitution rules on the ResolutionStrategy of the
Configuration that is resolved. This is necessary because the rules are globally applied in the build,
and Gradle does not consider published versions during resolution by default.

For example, we create a separate publishedRuntimeClasspath configuration that gets resolved to the
published versions of modules that also exist in one of the local builds. This is done by deactivating
global dependency substitution rules:

build.gradle.kts

configurations.create("publishedRuntimeClasspath") {
    resolutionStrategy.useGlobalDependencySubstitutionRules = false

    extendsFrom(configurations.runtimeClasspath.get())
    isCanBeConsumed = false
    attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
}

build.gradle

configurations.create('publishedRuntimeClasspath') {
    resolutionStrategy.useGlobalDependencySubstitutionRules = false

    extendsFrom(configurations.runtimeClasspath)
    canBeConsumed = false
    attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
}

A use-case would be to compare published and locally built JAR files.


Cases where included build substitutions must be declared

Many builds will function automatically as an included build, without declared substitutions. Here
are some common cases where declared substitutions are required:

• When the archivesBaseName property is used to set the name of the published artifact.

• When a configuration other than default is published.

• When the artifacts.add() method is used to publish artifacts that don’t match the project name.

• When the maven-publish or ivy-publish plugins are used for publishing and the publication
coordinates don’t match ${project.group}:${project.name}.
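In each of these cases, the fix follows the same shape as the anonymous-library example above: map the published coordinates onto the producing project explicitly. The build name, coordinates, and project path below are hypothetical:

```kotlin
// settings.gradle.kts of the composite
includeBuild("lib-build") {
    dependencySubstitution {
        // The publication uses a custom artifactId that differs from the
        // project name, so declare the mapping explicitly
        substitute(module("com.example:lib-core")).using(project(":core"))
    }
}
```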

Cases where composite build substitutions won’t work

Some builds won’t function correctly when included in a composite, even when dependency
substitutions are explicitly declared. This limitation is because a substituted project dependency
will always point to the default configuration of the target project. Any time the artifacts and
dependencies specified for the default configuration of a project don’t match what is published to a
repository, the composite build may exhibit different behavior.

Here are some cases where the published module metadata may be different from the project
default configuration:

• When a configuration other than default is published.

• When the maven-publish or ivy-publish plugins are used.

• When the POM or ivy.xml file is tweaked as part of publication.

Builds using these features function incorrectly when included in a composite build.

Depending on tasks in an included build

While included builds are isolated from one another and cannot declare direct dependencies, a
composite build can declare task dependencies on its included builds. The included builds are
accessed using Gradle.getIncludedBuilds() or Gradle.includedBuild(java.lang.String), and a task
reference is obtained via the IncludedBuild.task(java.lang.String) method.

Using these APIs, it is possible to declare a dependency on a task in a particular included build:

build.gradle.kts

tasks.register("run") {
    dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}

build.gradle

tasks.register('run') {
    dependsOn gradle.includedBuild('my-app').task(':app:run')
}

Or you can declare a dependency on tasks with a certain path in some or all of the included builds:

build.gradle.kts

tasks.register("publishDeps") {
    dependsOn(gradle.includedBuilds.map {
        it.task(":publishMavenPublicationToMavenRepository") })
}

build.gradle

tasks.register('publishDeps') {
    dependsOn gradle.includedBuilds*.task(
        ':publishMavenPublicationToMavenRepository')
}
}

Limitations of composite builds

Limitations of the current implementation include:

• No support for included builds with publications that don’t mirror the project default
configuration.
See Cases where composite build substitutions won’t work.

• Multiple composite builds may conflict when run in parallel if more than one includes the same
build.
Gradle does not share the project lock of a shared composite build between Gradle invocations
to prevent concurrent execution.

Configuration On Demand
Configuration-on-demand attempts to configure only the relevant projects for the requested tasks,
i.e., it only evaluates the build script file of projects participating in the build. This way, the
configuration time of a large multi-project build can be reduced.
The configuration-on-demand feature is incubating, so not every build is guaranteed to work
correctly. The feature works well for decoupled multi-project builds.

In configuration-on-demand mode, projects are configured as follows:

• The root project is always configured.

• The project in the directory where the build is executed is also configured, but only when
Gradle is executed without any tasks.
This way, the default tasks behave correctly when projects are configured on demand.

• The standard project dependencies are supported, and relevant projects are configured.
If project A has a compile dependency on project B, then building A causes the configuration of
both projects.

• The task dependencies declared via the task path are supported and cause relevant projects to
be configured.
Example: dependsOn(":some-other-project:someOtherTask")

• A task requested via task path from the command line (or tooling API) causes the relevant
project to be configured.
For example, building project-a:project-b:someTask causes configuration of project-b.

Enable configuration-on-demand

You can enable configuration-on-demand using the --configure-on-demand flag or by adding
org.gradle.configureondemand=true to the gradle.properties file.

To configure on demand with every build run, see Gradle properties.

To configure on demand for a given build, see command-line performance-oriented options.
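As a minimal sketch, enabling the feature persistently for a build uses the property named above in the project’s gradle.properties file:

```properties
# gradle.properties — enable configuration on demand for every build run
org.gradle.configureondemand=true
```

Alternatively, passing --configure-on-demand on the command line enables it for a single invocation only.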

Decoupled projects

Gradle allows projects to access each other’s configurations and tasks during the configuration and
execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform
optimizations such as parallel project builds and configuration on demand.

Projects are considered decoupled when they interact solely through declared dependencies and
task dependencies. Any direct modification or reading of another project’s object creates coupling
between the projects. Coupling during configuration can result in flawed build outcomes when
using 'configuration on demand', while coupling during execution can affect parallel execution.

One common source of coupling is configuration injection, such as using allprojects{} or
subprojects{} in build scripts.

To avoid coupling issues, it’s recommended to:

• Refrain from referencing other subprojects' build scripts and prefer cross-project configuration
from the root project.

• Avoid dynamically changing other projects' configurations during execution.


As Gradle evolves, it aims to provide features that leverage decoupled projects while offering
solutions for common use cases like configuration injection without introducing coupling.

Parallel projects

Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently
executing tasks from different projects.

To enable parallel execution, use the --parallel command-line argument or configure your build
environment. Gradle automatically determines the optimal number of parallel threads based on
CPU cores.

During parallel execution, each worker handles a specific project exclusively. Task dependencies
are respected, with workers prioritizing upstream tasks. However, tasks may not execute in
alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and
inputs/outputs to avoid ordering issues.
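Besides the --parallel flag, parallel execution can be enabled persistently through the build environment. A minimal sketch using the documented Gradle properties:

```properties
# gradle.properties — run tasks from independent projects in parallel
org.gradle.parallel=true
# optionally cap the number of workers (defaults to the number of CPU cores)
org.gradle.workers.max=4
```

Both properties take effect for every invocation in this build, so they are a good fit for decoupled multi-project builds.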
DEVELOPING TASKS
Understanding Tasks
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.

Before reading this chapter, it’s recommended that you first read the Learning The Basics and
complete the Tutorial.

Listing tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in a project by running the following command in the terminal:

$ ./gradlew tasks

Let’s take a very basic Gradle project as an example. The project has the following structure:

gradle-project
├── app
│   ├── build.gradle.kts // empty file - no build logic
│   └── ... // some java code
├── settings.gradle.kts // includes app subproject
├── gradle
│   └── ...
├── gradlew
└── gradlew.bat

gradle-project
├── app
│   ├── build.gradle // empty file - no build logic
│   └── ... // some java code
├── settings.gradle // includes app subproject
├── gradle
│   └── ...
├── gradlew
└── gradlew.bat

The settings file contains the following:

settings.gradle.kts

rootProject.name = "gradle-project"
include("app")

settings.gradle

rootProject.name = 'gradle-project'
include('app')

Currently, the app subproject’s build file is empty.

To see the tasks available in the app subproject, run ./gradlew :app:tasks:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

We observe that only a small number of help tasks are available at the moment. This is because the
core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build
your project or compile your code, are added by plugins.

Let’s explore this by adding the Gradle core base plugin to the app build script:

app/build.gradle.kts

plugins {
id("base")
}

app/build.gradle

plugins {
id('base')
}

The base plugin adds central lifecycle tasks. Now when we run ./gradlew app:tasks, we can see the
assemble and build tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.

Task outcomes

When Gradle executes a task, it labels the task with outcomes via the console.

These labels are based on whether a task has actions to execute and if Gradle executed them.
Actions include, but are not limited to, compiling code, zipping files, and publishing archives.

(no label) or EXECUTED


Task executed its actions.

• Task has actions and Gradle executed them.

• Task has no actions and some dependencies, and Gradle executed one or more of the
dependencies. See also Lifecycle Tasks.

UP-TO-DATE
Task’s outputs did not change.

• Task has outputs and inputs but they have not changed. See Incremental Build.

• Task has actions, but the task tells Gradle it did not change its outputs.

• Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED
or FROM-CACHE. See Lifecycle Tasks.

• Task has no actions and no dependencies.

FROM-CACHE
Task’s outputs could be found from a previous execution.

• Task has outputs restored from the build cache. See Build Cache.

SKIPPED
Task did not execute its actions.

• Task has been explicitly excluded from the command-line. See Excluding tasks from
execution.

• Task has an onlyIf predicate return false. See Using a predicate.

NO-SOURCE
Task did not need to execute its actions.

• Task has inputs and outputs, but no sources (i.e., inputs were not found).
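As a sketch of how a task ends up SKIPPED, an onlyIf predicate can gate execution (the task name and condition below are hypothetical examples, not from the manual):

```kotlin
// build.gradle.kts — the task reports SKIPPED when the predicate returns false
tasks.register("uploadRelease") {
    onlyIf { System.getenv("CI") != null } // hypothetical condition: skip outside CI
    doLast { println("uploading release artifacts") }
}
```

Running ./gradlew uploadRelease on a machine without the CI environment variable would label the task SKIPPED.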

Task group and description

Task groups and descriptions are used to organize and describe tasks.

Groups
Task groups are used to categorize tasks. When you run ./gradlew tasks, tasks are listed under
their respective groups, making it easier to understand their purpose and relationship to other
tasks. Groups are set using the group property.

Descriptions
Descriptions provide a brief explanation of what a task does. When you run ./gradlew tasks, the
descriptions are shown next to each task, helping you understand its purpose and how to use it.
Descriptions are set using the description property.

Let’s consider a basic Java application as an example. The build contains a subproject called app.

Let’s list the available tasks in app at the moment:

$ ./gradlew :app:tasks

> Task :app:tasks


------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application.

Build tasks
-----------
assemble - Assembles the outputs of this project.

Here, the :run task is part of the Application group with the description Runs this project as a JVM
application. In code, it would look something like this:

app/build.gradle.kts

tasks.register("run") {
    group = "Application"
    description = "Runs this project as a JVM application."
}

app/build.gradle

tasks.register("run") {
    group = "Application"
    description = "Runs this project as a JVM application."
}

Private and hidden tasks

Gradle doesn’t support marking a task as private.

However, tasks will only show up when running :tasks if task.group is set or no other task depends
on it.

For instance, the following task will not appear when running ./gradlew :app:tasks because it does
not have a group; it is called a hidden task:

app/build.gradle.kts

tasks.register("helloTask") {
    println("Hello")
}

app/build.gradle

tasks.register("helloTask") {
    println 'Hello'
}

Although helloTask is not listed, it can still be executed by Gradle:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Let’s add a group to the same task:

app/build.gradle.kts

tasks.register("helloTask") {
    group = "Other"
    description = "Hello task"
    println("Hello")
}

app/build.gradle

tasks.register("helloTask") {
    group = "Other"
    description = "Hello task"
    println 'Hello'
}

Now that the group is added, the task is visible:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Other tasks
-----------
helloTask - Hello task

In contrast, ./gradlew tasks --all will show all tasks; hidden and visible tasks are listed.

Grouping tasks

If you want to customize which tasks are shown to users when listed, you can group tasks and set
the visibility of each group.

Remember, even if you hide tasks, they are still available, and Gradle can still run
NOTE
them.

Let’s start with an example built by Gradle init for a Java application with multiple subprojects.
The project structure is as follows:

gradle-project
├── app
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── utilities
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── list
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── buildSrc
│   ├── build.gradle.kts
│   ├── settings.gradle.kts
│   └── src // common build logic
│       └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat

gradle-project
├── app
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── utilities
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── list
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── buildSrc
│   ├── build.gradle
│   ├── settings.gradle
│   └── src // common build logic
│       └── ...
├── settings.gradle
├── gradle
├── gradlew
└── gradlew.bat

Run app:tasks to see available tasks in the app subproject:

$ ./gradlew :app:tasks

> Task :app:tasks


------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
If we look at the list of tasks available, even for a standard Java project, it’s extensive. Many of these
tasks are rarely required directly by developers using the build.

We can configure the :tasks task and limit the tasks shown to a certain group.

Let’s create our own group so that all tasks are hidden by default by updating the app build script:

app/build.gradle.kts

val myBuildGroup = "my app build" // Create a group name

tasks.register<TaskReportTask>("tasksAll") { // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named<TaskReportTask>("tasks") { // Move all existing tasks to the group
    displayGroup = myBuildGroup
}

app/build.gradle

def myBuildGroup = "my app build" // Create a group name

tasks.register("tasksAll", TaskReportTask) { // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named("tasks", TaskReportTask) { // Move all existing tasks to the group
    displayGroup = myBuildGroup
}

Now, when we list tasks available in app, the list is shorter:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

My app build tasks
------------------
tasksAll - Show additional tasks.

Task categories

Gradle distinguishes between two categories of tasks:

1. Lifecycle tasks

2. Actionable tasks

Lifecycle tasks define targets you can call, such as :build your project. Lifecycle tasks do not
provide Gradle with actions. They must be wired to actionable tasks. The base Gradle plugin only
adds lifecycle tasks.

Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java
code of your project. Actions include creating JARs, zipping files, publishing archives, and much
more. Plugins like the java-library plugin add actionable tasks.
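The wiring between the two categories can be sketched as follows; the task names here are hypothetical, chosen only to illustrate the pattern:

```kotlin
// build.gradle.kts — actionable task wired to a lifecycle task (names are illustrative)
val generateDocs = tasks.register("generateDocs") {
    doLast { println("generating docs") } // actionable: carries an action
}

tasks.register("preRelease") { // lifecycle: defines no actions of its own
    group = "build"
    description = "Aggregates everything needed before a release."
    dependsOn(generateDocs) // wiring: running :preRelease triggers :generateDocs
}
```

Invoking ./gradlew preRelease runs generateDocs first; the lifecycle task itself only aggregates.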

Let’s update the build script of the previous example, which is currently an empty file so that our
app subproject is a Java library:

app/build.gradle.kts

plugins {
    id("java-library")
}

app/build.gradle

plugins {
    id('java-library')
}

Once again, we list the available tasks to see what new tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.

We see that many new tasks are available such as jar and testClasses.

Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks. If we call the
:build task, we can see several tasks have been executed, including the :app:compileJava task.

$ ./gradlew :app:build

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

The actionable :compileJava task is wired to the lifecycle :build task.

Incremental tasks

A key feature of Gradle tasks is their incremental nature.

Gradle can reuse results from prior builds. Therefore, if we’ve built our project before and made
only minor changes, rerunning :build will not require Gradle to perform extensive work.

For example, if we modify only the test code in our project, leaving the production code unchanged,
executing the build will solely recompile the test code. Gradle marks the tasks for the production
code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:

$ ./gradlew :app:build



> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:assemble UP-TO-DATE
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check UP-TO-DATE
> Task :app:build UP-TO-DATE

Caching tasks

Gradle can reuse results from past builds using the build cache.

To enable this feature, activate the build cache by using the --build-cache command line parameter
or by setting org.gradle.caching=true in your gradle.properties file.

This optimization has the potential to accelerate your builds significantly:

$ ./gradlew :app:clean :app:build --build-cache

> Task :app:compileJava FROM-CACHE


> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava FROM-CACHE
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses UP-TO-DATE
> Task :app:test FROM-CACHE
> Task :app:check UP-TO-DATE
> Task :app:build

When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.

The build cache is handy if you switch between branches regularly. Gradle supports both local and
remote build caches.
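As a sketch of combining the two, a remote cache can be configured alongside the local one in the settings file; the server URL below is hypothetical:

```kotlin
// settings.gradle.kts — local cache plus a hypothetical remote HTTP cache
buildCache {
    local {
        isEnabled = true // the local cache is on by default; shown for clarity
    }
    remote<HttpBuildCache> {
        url = uri("https://example.com/build-cache/") // hypothetical cache server
        isPush = false // typically only CI pushes entries to the remote cache
    }
}
```

With this in place, developer machines pull entries populated by CI instead of rebuilding them.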

Developing tasks

When developing Gradle tasks, you have two choices:

1. Use an existing Gradle task type such as Zip, Copy, or Delete

2. Create your own Gradle task type such as MyResolveTask or CustomTaskUsingToolchains.

Task types are simply subclasses of the Gradle Task class.

With Gradle tasks, there are three states to consider:

1. Registering a task - using a task (implemented by you or provided by Gradle) in your build
logic.

2. Configuring a task - defining inputs and outputs for a registered task.

3. Implementing a task - creating a custom task class (i.e., custom class type).

Registration is commonly done with the register() method.

Configuring a task is commonly done with the named() method.

Implementing a task is commonly done by extending Gradle’s DefaultTask class:

tasks.register<Copy>("myCopy") ①

tasks.named<Copy>("myCopy") { ②
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}

abstract class MyCopyTask : DefaultTask() { ③

    @TaskAction
    fun copyFiles() {
        val sourceDir = File("sourceDir")
        val destinationDir = File("destinationDir")
        sourceDir.listFiles()?.forEach { file ->
            if (file.isFile && file.extension == "txt") {
                file.copyTo(File(destinationDir, file.name))
            }
        }
    }
}

① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.

② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.

③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.

tasks.register("myCopy", Copy) ①

tasks.named("myCopy", Copy) { ②
    from 'resources'
    into 'target'
    include '**/*.txt', '**/*.xml', '**/*.properties'
}

abstract class MyCopyTask extends DefaultTask { ③

    @TaskAction
    void copyFiles() {
        fileTree('sourceDir').matching {
            include '**/*.txt'
        }.forEach { file ->
            file.copyTo(file.path.replace('sourceDir', 'destinationDir'))
        }
    }
}

① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.

② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.

③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.

1. Registering tasks

You define actions for Gradle to take by registering tasks in build scripts or plugins.
Tasks are defined using strings for task names:

build.gradle.kts

tasks.register("hello") {
    doLast {
        println("hello")
    }
}

build.gradle

tasks.register('hello') {
    doLast {
        println 'hello'
    }
}

In the example above, the task is added to the TaskCollection using the register() method in
TaskContainer.

2. Configuring tasks

Gradle tasks must be configured to complete their action(s) successfully. If a task needs to ZIP a file,
it must be configured with the file name and location. You can refer to the API for the Gradle Zip
task to learn how to configure it appropriately.
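To illustrate with the Zip case mentioned above, a sketch of such a configuration might look like this (the task name, archive name, and paths are hypothetical):

```kotlin
// build.gradle.kts — hypothetical Zip task; Zip is a Gradle-provided task type
tasks.register<Zip>("packageDocs") {
    archiveFileName.set("docs.zip")                            // name of the produced archive
    destinationDirectory.set(layout.buildDirectory.dir("dist")) // where the archive is written
    from("docs")                                                // files to zip
}
```

The Zip task API defines which properties (archive name, destination, sources) must be set before the task can do useful work.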

Let’s look at the Copy task provided by Gradle as an example. We first register a task called myCopy of
type Copy in the build script:

build.gradle.kts

tasks.register<Copy>("myCopy")

build.gradle

tasks.register('myCopy', Copy)

This registers a copy task with no default behavior. Since the task is of type Copy, a Gradle supported
task type, it can be configured using its API.

The following examples show several ways to achieve the same configuration:

1. Using the named() method:

Use named() to configure an existing task registered elsewhere:

build.gradle.kts

tasks.named<Copy>("myCopy") {
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}

build.gradle

tasks.named('myCopy') {
    from 'resources'
    into 'target'
    include('**/*.txt', '**/*.xml', '**/*.properties')
}

2. Using a configuration block:

Use a block to configure the task immediately upon registering it:

build.gradle.kts

tasks.register<Copy>("copy") {
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}

build.gradle

tasks.register('copy', Copy) {
    from 'resources'
    into 'target'
    include('**/*.txt', '**/*.xml', '**/*.properties')
}

3. Name method as call:

A popular option that is only supported in Groovy is the shorthand notation:

copy {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}

NOTE This option breaks task configuration avoidance and is not recommended!

Regardless of the method chosen, the task is configured with the name of the files to be copied and
the location of the files.

3. Implementing tasks

Gradle provides many task types including Delete, Javadoc, Copy, Exec, Tar, and Pmd. You can
implement a custom task type if Gradle does not provide a task type that meets your build logic
needs.

To create a custom task class, you extend DefaultTask and make the extending class abstract:

app/build.gradle.kts

abstract class MyCopyTask : DefaultTask() {

app/build.gradle

abstract class MyCopyTask extends DefaultTask {

Controlling Task Execution


Task dependencies allow tasks to be executed in a specific order based on their dependencies. This
ensures that tasks dependent on others are only executed after those dependencies have
completed.

Task dependencies can be categorized as either implicit or explicit:

Implicit dependencies
These dependencies are automatically inferred by Gradle based on the tasks' actions and
configuration. For example, if taskB uses the output of taskA (e.g., a file generated by taskA),
Gradle will automatically ensure that taskA is executed before taskB to fulfill this dependency.

Explicit dependencies
These dependencies are explicitly declared in the build script using the dependsOn, mustRunAfter,
or shouldRunAfter methods. For example, if you want to ensure that taskB always runs after
taskA, you can explicitly declare this dependency using taskB.dependsOn(taskA).

Both implicit and explicit dependencies play a crucial role in defining the order of task execution
and ensuring that tasks are executed in the correct sequence to produce the desired build output.
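An implicit dependency can be sketched by wiring one task's declared output as another task's input; Gradle then infers the ordering without any dependsOn call (the task and file names below are hypothetical):

```kotlin
// build.gradle.kts — implicit dependency: consumer reads producer's declared output
val producer = tasks.register("producer") {
    val outFile = layout.buildDirectory.file("producer.txt")
    outputs.file(outFile) // declare the output so Gradle can track it
    doLast { outFile.get().asFile.writeText("payload") }
}

tasks.register("consumer") {
    // Passing the producer's TaskProvider as an input wires the files
    // and makes Gradle run :producer before :consumer automatically
    inputs.files(producer)
    doLast { println("consuming producer output") }
}
```

Running ./gradlew consumer executes producer first, even though no explicit dependency was declared.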

Task dependencies

Gradle inherently understands the dependencies among tasks. Consequently, it can determine the
tasks that need execution when you target a specific task.

Let’s take an example application with an app subproject and a some-logic subproject:

settings.gradle.kts

rootProject.name = "gradle-project"
include("app")
include("some-logic")

settings.gradle

rootProject.name = 'gradle-project'
include('app')
include('some-logic')

Let’s imagine that the app subproject depends on the subproject called some-logic, which contains
some Java code. We add this dependency in the app build script:

app/build.gradle.kts

plugins {
id("application") // app is now a java application
}

application {
[Link]("[Link]") // main class name required by
the application plugin
}

dependencies {
implementation(project(":some-logic")) // dependency on some-logic
}

app/build.gradle

plugins {
id('application') // app is now a java application
}

application {
mainClass = '[Link]' // main class name required by
the application plugin
}

dependencies {
implementation(project(':some-logic')) // dependency on some-logic
}

If we run :app:build again, we see the Java code of some-logic is also compiled by Gradle
automatically:

$./gradlew :app:build

> Task :app:processResources NO-SOURCE


> Task :app:processTestResources NO-SOURCE
> Task :some-logic:compileJava UP-TO-DATE
> Task :some-logic:processResources NO-SOURCE
> Task :some-logic:classes UP-TO-DATE
> Task :some-logic:jar UP-TO-DATE
> Task :app:compileJava
> Task :app:classes
> Task :app:jar UP-TO-DATE
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava UP-TO-DATE
> Task :app:testClasses UP-TO-DATE
> Task :app:test
> Task :app:check
> Task :app:build

BUILD SUCCESSFUL in 430ms


9 actionable tasks: 5 executed, 4 up-to-date

Adding dependencies

There are several ways you can define the dependencies of a task.

Defining dependencies using task names and the dependsOn() method is simplest.

The following is an example which adds a dependency from taskX to taskY:

tasks.register("taskX") {
    dependsOn("taskY")
}

tasks.register("taskX") {
    dependsOn "taskY"
}

$ gradle -q taskX
taskY
taskX
For more information about task dependencies, see the Task API.

Ordering tasks

In some cases, it is useful to control the order in which two tasks will execute, without introducing
an explicit dependency between those tasks.

The primary difference between a task ordering and a task dependency is that an ordering rule does
not influence which tasks will be executed, only the order in which they will be executed.

Task ordering can be useful in a number of scenarios:

• Enforce sequential ordering of tasks (e.g., build never runs before clean).

• Run build validations early in the build (e.g., validate I have the correct credentials before
starting the work for a release build).

• Get feedback faster by running quick verification tasks before long verification tasks (e.g., unit
tests should run before integration tests).

• A task that aggregates the results of all tasks of a particular type (e.g., test report task combines
the outputs of all executed test tasks).

Two ordering rules are available: "must run after" and "should run after".

To specify a "must run after" or "should run after" ordering between 2 tasks, you use the
Task.mustRunAfter(java.lang.Object...) and Task.shouldRunAfter(java.lang.Object...) methods.
These methods accept a task instance, a task name, or any other input accepted by
Task.dependsOn(java.lang.Object...).

When you use "must run after", you specify that taskY must always run after taskX when the build
requires the execution of taskX and taskY. So if you only run taskY with mustRunAfter, you won’t
cause taskX to run. This is expressed as taskY.mustRunAfter(taskX).

[Link]

val taskX by tasks.registering {
    doLast {
        println("taskX")
    }
}
val taskY by tasks.registering {
    doLast {
        println("taskY")
    }
}
taskY {
    mustRunAfter(taskX)
}
build.gradle

def taskX = tasks.register('taskX') {
    doLast {
        println 'taskX'
    }
}
def taskY = tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}
taskY.configure {
    mustRunAfter taskX
}

$ gradle -q taskY taskX


taskX
taskY

The "should run after" ordering rule is similar but less strict, as it will be ignored in two situations:

1. If using that rule introduces an ordering cycle.

2. When using parallel execution and all task dependencies have been satisfied apart from the
"should run after" task, then this task will be run regardless of whether or not its "should run
after" dependencies have been run.

You should use "should run after" where the ordering is helpful but not strictly required:

build.gradle.kts

val taskX by tasks.registering {
    doLast {
        println("taskX")
    }
}
val taskY by tasks.registering {
    doLast {
        println("taskY")
    }
}
taskY {
    shouldRunAfter(taskX)
}
build.gradle

def taskX = tasks.register('taskX') {
    doLast {
        println 'taskX'
    }
}
def taskY = tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}
taskY.configure {
    shouldRunAfter taskX
}

$ gradle -q taskY taskX


taskX
taskY

In the examples above, it is still possible to execute taskY without causing taskX to run:

$ gradle -q taskY
taskY

The “should run after” ordering rule will be ignored if it introduces an ordering cycle:

build.gradle.kts

val taskX by tasks.registering {
    doLast {
        println("taskX")
    }
}
val taskY by tasks.registering {
    doLast {
        println("taskY")
    }
}
val taskZ by tasks.registering {
    doLast {
        println("taskZ")
    }
}
taskX { dependsOn(taskY) }
taskY { dependsOn(taskZ) }
taskZ { shouldRunAfter(taskX) }

build.gradle

def taskX = tasks.register('taskX') {
    doLast {
        println 'taskX'
    }
}
def taskY = tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}
def taskZ = tasks.register('taskZ') {
    doLast {
        println 'taskZ'
    }
}
taskX.configure { dependsOn(taskY) }
taskY.configure { dependsOn(taskZ) }
taskZ.configure { shouldRunAfter(taskX) }

$ gradle -q taskX
taskZ
taskY
taskX

Note that taskY.mustRunAfter(taskX) or taskY.shouldRunAfter(taskX) does not imply any execution
dependency between the tasks:

• It is possible to execute taskX and taskY independently. The ordering rule only has an effect
when both tasks are scheduled for execution.

• When run with --continue, it is possible for taskY to execute if taskX fails.

Finalizer tasks

Finalizer tasks are automatically added to the task graph when the finalized task is scheduled to
run.

To specify a finalizer task, you use the Task.finalizedBy(java.lang.Object...) method. This method
accepts a task instance, a task name, or any other input accepted by
Task.dependsOn(java.lang.Object...):

build.gradle.kts

val taskX by tasks.registering {
    doLast {
        println("taskX")
    }
}
val taskY by tasks.registering {
    doLast {
        println("taskY")
    }
}

taskX { finalizedBy(taskY) }

build.gradle

def taskX = tasks.register('taskX') {
    doLast {
        println 'taskX'
    }
}
def taskY = tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}

taskX.configure { finalizedBy taskY }

$ gradle -q taskX
taskX
taskY

Finalizer tasks are executed even if the finalized task fails or if the finalized task is considered
UP-TO-DATE:
build.gradle.kts

val taskX by tasks.registering {
    doLast {
        println("taskX")
        throw RuntimeException()
    }
}
val taskY by tasks.registering {
    doLast {
        println("taskY")
    }
}

taskX { finalizedBy(taskY) }

build.gradle

def taskX = tasks.register('taskX') {
    doLast {
        println 'taskX'
        throw new RuntimeException()
    }
}
def taskY = tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}

taskX.configure { finalizedBy taskY }

$ gradle -q taskX
taskX
taskY

FAILURE: Build failed with an exception.

* Where:
Build file '/home/user/gradle/samples/build.gradle' line: 4

* What went wrong:


Execution failed for task ':taskX'.
> java.lang.RuntimeException (no error message)
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
> Get more help at https://help.gradle.org.

BUILD FAILED in 0s

Finalizer tasks are useful when the build creates a resource that must be cleaned up, regardless of
whether the build fails or succeeds. An example of such a resource is a web container that is started
before an integration test task and must be shut down, even if some tests fail.
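The web container scenario can be sketched in the Kotlin DSL as follows. This is an illustrative fragment, not a sample from the manual: the startWebContainer, stopWebContainer, and integrationTest tasks are hypothetical placeholders for whatever your build actually uses.

```kotlin
// Hypothetical tasks sketching the web container scenario
val startWebContainer by tasks.registering {
    doLast { println("Starting web container...") }
}

val stopWebContainer by tasks.registering {
    doLast { println("Stopping web container...") }
}

tasks.register("integrationTest") {
    dependsOn(startWebContainer)  // the container must be running first
    finalizedBy(stopWebContainer) // shut it down even if the tests fail
    doLast { println("Running integration tests...") }
}
```

Running gradle integrationTest starts the container, runs the test action, and then always runs stopWebContainer, whether the test action succeeds or throws.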

Skipping tasks

Gradle offers multiple ways to skip the execution of a task.

1. Using a predicate

You can use Task.onlyIf to attach a predicate to a task. The task’s actions will only be executed if
the predicate evaluates to true.

The predicate is passed to the task as a parameter and returns true if the task will execute and
false if the task will be skipped. The predicate is evaluated just before the task is executed.

Passing an optional reason string to onlyIf() is useful for explaining why the task is skipped:

build.gradle.kts

val hello by tasks.registering {
    doLast {
        println("hello world")
    }
}

hello {
    val skipProvider = providers.gradleProperty("skipHello")
    onlyIf("there is no property skipHello") {
        !skipProvider.isPresent()
    }
}

build.gradle

def hello = tasks.register('hello') {
    doLast {
        println 'hello world'
    }
}

hello.configure {
    def skipProvider = providers.gradleProperty("skipHello")
    onlyIf("there is no property skipHello") {
        !skipProvider.present
    }
}

$ gradle hello -PskipHello


> Task :hello SKIPPED

BUILD SUCCESSFUL in 0s

To find why a task was skipped, run the build with the --info logging level.

$ gradle hello -PskipHello --info


...

> Task :hello SKIPPED


Skipping task ':hello' as task onlyIf 'there is no property skipHello' is false.
:hello (Thread[included builds,5,main]) completed. Took 0.018 secs.

BUILD SUCCESSFUL in 13s

2. Using StopExecutionException

If the logic for skipping a task can’t be expressed with a predicate, you can use the
StopExecutionException.

If this exception is thrown by an action, the task action as well as the execution of any following
action is skipped. The build continues by executing the next task:

build.gradle.kts

val compile by tasks.registering {
    doLast {
        println("We are doing the compile.")
    }
}

compile {
    doFirst {
        // Here you would put arbitrary conditions in real life.
        if (true) {
            throw StopExecutionException()
        }
    }
}
tasks.register("myTask") {
    dependsOn(compile)
    doLast {
        println("I am not affected")
    }
}

build.gradle

def compile = tasks.register('compile') {
    doLast {
        println 'We are doing the compile.'
    }
}

compile.configure {
    doFirst {
        // Here you would put arbitrary conditions in real life.
        if (true) {
            throw new StopExecutionException()
        }
    }
}
tasks.register('myTask') {
    dependsOn('compile')
    doLast {
        println 'I am not affected'
    }
}

$ gradle -q myTask
I am not affected

This feature is helpful if you work with tasks provided by Gradle. It allows you to add conditional
execution of the built-in actions of such a task.[1]
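For example, assuming the java plugin is applied (so a built-in compileJava task exists) and a hypothetical skipCompile project property, the built-in compile actions could be skipped conditionally. This is a sketch, not an example from the manual:

```kotlin
// Sketch: conditionally skip the built-in actions of the java plugin's compileJava task
tasks.named("compileJava") {
    doFirst {
        // `skipCompile` is a hypothetical property passed with -PskipCompile
        if (project.hasProperty("skipCompile")) {
            throw StopExecutionException()
        }
    }
}
```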
3. Enabling and Disabling tasks

Every task has an enabled flag, which defaults to true. Setting it to false prevents executing the
task’s actions.

A disabled task will be labeled SKIPPED:

build.gradle.kts

val disableMe by tasks.registering {
    doLast {
        println("This should not be printed if the task is disabled.")
    }
}

disableMe {
    enabled = false
}

build.gradle

def disableMe = tasks.register('disableMe') {
    doLast {
        println 'This should not be printed if the task is disabled.'
    }
}

disableMe.configure {
    enabled = false
}

$ gradle disableMe
> Task :disableMe SKIPPED

BUILD SUCCESSFUL in 0s

4. Task timeouts

Every task has a timeout property, which can be used to limit its execution time. When a task
reaches its timeout, its task execution thread is interrupted. The task will be marked as FAILED.

Finalizer tasks are executed. If --continue is used, other tasks continue running.

Tasks that don’t respond to interrupts can’t be timed out. All of Gradle’s built-in tasks respond to
timeouts.

build.gradle.kts

tasks.register("hangingTask") {
    doLast {
        Thread.sleep(100000)
    }
    timeout = Duration.ofMillis(500)
}

build.gradle

tasks.register("hangingTask") {
    doLast {
        Thread.sleep(100000)
    }
    timeout = Duration.ofMillis(500)
}

Task rules

Sometimes you want to have a task whose behavior depends on a large or infinite range of
parameter values. A very nice and expressive way to provide such tasks is task rules:

build.gradle.kts

tasks.addRule("Pattern: ping<ID>") {
    val taskName = this
    if (startsWith("ping")) {
        task(taskName) {
            doLast {
                println("Pinging: " + (taskName.replace("ping", "")))
            }
        }
    }
}

build.gradle

tasks.addRule("Pattern: ping<ID>") { String taskName ->
    if (taskName.startsWith("ping")) {
        task(taskName) {
            doLast {
                println "Pinging: " + (taskName - 'ping')
            }
        }
    }
}

$ gradle -q pingServer1
Pinging: Server1

The String parameter is used as a description for the rule, which is shown with ./gradlew tasks.

Rules are not only used when calling tasks from the command line. You can also create dependsOn
relations on rule-based tasks:

build.gradle.kts

tasks.addRule("Pattern: ping<ID>") {
    val taskName = this
    if (startsWith("ping")) {
        task(taskName) {
            doLast {
                println("Pinging: " + (taskName.replace("ping", "")))
            }
        }
    }
}

tasks.register("groupPing") {
    dependsOn("pingServer1", "pingServer2")
}

build.gradle

tasks.addRule("Pattern: ping<ID>") { String taskName ->
    if (taskName.startsWith("ping")) {
        task(taskName) {
            doLast {
                println "Pinging: " + (taskName - 'ping')
            }
        }
    }
}

tasks.register('groupPing') {
    dependsOn 'pingServer1', 'pingServer2'
}

$ gradle -q groupPing
Pinging: Server1
Pinging: Server2

If you run ./gradlew -q tasks, you won’t find a task named pingServer1 or pingServer2, but this
script is executing logic based on the request to run those tasks.

Exclude tasks from execution

You can exclude a task from execution using the -x or --exclude-task command-line option and
provide the task’s name to exclude.

$ ./gradlew build -x test

For instance, you can run the check task but exclude the test task from running. This approach can
lead to unexpected outcomes, particularly if you exclude an actionable task that produces results
needed by other tasks. Instead of relying on the -x parameter, defining a suitable lifecycle task for
the desired action is recommended.

Using -x is a practice that should be avoided, although still commonly observed.
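As a hedged sketch of the recommended alternative, a dedicated lifecycle task can bundle exactly the work you want instead of subtracting tasks with -x. The quickCheck name and the wired tasks below are illustrative assumptions, not tasks defined by Gradle itself:

```kotlin
// Illustrative lifecycle task replacing `gradle check -x test`
// Assumes the java plugin, so a `classes` task exists
tasks.register("quickCheck") {
    group = "verification"
    description = "Compiles production code without running tests."
    dependsOn("classes")
}
```

Running gradle quickCheck is then explicit about intent, and the task list documents it for other users.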

Organizing Tasks
There are two types of tasks: actionable and lifecycle tasks.

Actionable tasks in Gradle are tasks that perform actual work, such as compiling code. Lifecycle
tasks are tasks that do no work themselves. They have no actions; instead, they bundle
actionable tasks and serve as targets for the build.
A well-organized setup of lifecycle tasks enhances the accessibility of your build for new users and
simplifies integration with CI.

Lifecycle tasks

Lifecycle tasks can be particularly beneficial for separating work between users or machines (CI vs
local). For example, a developer on a local machine might not want to run an entire build on every
single change.

Let’s take a standard app as an example which applies the base plugin.

NOTE The Gradle base plugin defines several lifecycle tasks, including build, assemble, and check.

We group the build, check, and run tasks by adding the following lines to the app build script:

app/build.gradle.kts

tasks.named("build") {
    group = myBuildGroup
}

tasks.named("check") {
    group = myBuildGroup
    description = "Runs checks (including tests)."
}

tasks.named("run") {
    group = myBuildGroup
}
app/build.gradle

tasks.named('build') {
    group = myBuildGroup
}

tasks.named('check') {
    group = myBuildGroup
    description = "Runs checks (including tests)."
}

tasks.named('run') {
    group = myBuildGroup
}

If we now look at the app:tasks list, we can see the three tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

My app build tasks


------------------
build - Assembles and tests this project.
check - Runs checks (including tests).
run - Runs this project as a JVM application
tasksAll - Show additional tasks.

This is already useful if the standard lifecycle tasks are sufficient. Moving the groups around helps
clarify the tasks you expect to use in your build.

In many cases, there are more specific requirements that you want to address. One common
scenario is running quality checks without running tests. Currently, the :check task runs tests and
the code quality checks. Instead, we want to run code quality checks all the time, but not the
lengthy tests.

To add a quality check lifecycle task, we introduce an additional lifecycle task called qualityCheck
and a plugin called spotbugs.

To add a lifecycle task, use tasks.register(). The only thing you need to provide is a name. Put this
task in our group and wire the actionable tasks that belong to this new lifecycle task using the
dependsOn() method:
app/build.gradle.kts

plugins {
    id("com.github.spotbugs") version "6.0.7" // spotbugs plugin
}

tasks.register("qualityCheck") { // qualityCheck task
    group = myBuildGroup // group
    description = "Runs checks (excluding tests)." // description
    dependsOn(tasks.classes, tasks.spotbugsMain) // dependencies
    dependsOn(tasks.testClasses, tasks.spotbugsTest) // dependencies
}

app/build.gradle

plugins {
    id 'com.github.spotbugs' version '6.0.7' // spotbugs plugin
}

tasks.register('qualityCheck') { // qualityCheck task
    group = myBuildGroup // group
    description = 'Runs checks (excluding tests).' // description
    dependsOn tasks.classes, tasks.spotbugsMain // dependencies
    dependsOn tasks.testClasses, tasks.spotbugsTest // dependencies
}

Note that you don’t need to list all the tasks that Gradle will execute. Just specify the targets you
want to collect here. Gradle will determine which other tasks it needs to call to reach these goals.

In the example, we add the classes task, a lifecycle task to compile all our production code, and the
spotbugsMain task, which checks our production code.

We also add a description that will show up in the task list, helping to distinguish the two check
tasks.

Now, if we run ./gradlew :app:tasks, we can see that our new qualityCheck lifecycle task is available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
My app build tasks
------------------
build - Assembles and tests this project.
check - Runs checks (including tests).
qualityCheck - Runs checks (excluding tests).
run - Runs this project as a JVM application
tasksAll - Show additional tasks.

If we run it, we can see that it runs the code quality checks but not the tests:

$ ./gradlew :app:qualityCheck

> Task :buildSrc:checkKotlinGradlePluginConfigurationErrors


> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:processTestResources NO-SOURCE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava
> Task :app:classes
> Task :app:compileTestJava
> Task :app:testClasses
> Task :app:spotbugsTest
> Task :app:spotbugsMain
> Task :app:qualityCheck

BUILD SUCCESSFUL in 1s
16 actionable tasks: 5 executed, 11 up-to-date

So far, we have looked at tasks in individual subprojects, which is useful for local development
when you work on code in one subproject.

With this setup, developers only need to know that they can call Gradle with :subproject-name:tasks
to see which tasks are available and useful for them.
Global lifecycle tasks

Another place to invoke lifecycle tasks is within the root build; this is especially useful for
Continuous Integration (CI).

Gradle tasks play a crucial role in CI or CD systems, where activities like compiling all code, running
tests, or building and packaging the complete application are typical. To facilitate this, you can
include lifecycle tasks that span multiple subprojects.

NOTE Gradle has been around for a long time, and you will frequently observe build files in the
root directory serving various purposes. In older Gradle versions, many tasks were defined within
the root Gradle build file, resulting in various issues. Therefore, exercise caution when determining
the content of this file.

One of the few elements that should be placed in the root build file is global lifecycle tasks.

Let’s continue using the Gradle init Java application multi-project as an example.

This time, we’re incorporating a build script in the root project. We’ll establish two groups for our
global lifecycle tasks: one for tasks relevant to local development, such as running all checks, and
another exclusively for our CI system.

Once again, we narrowed down the tasks listed to our specific groups:

build.gradle.kts

val globalBuildGroup = "My global build"
val ciBuildGroup = "My CI build"

tasks.named<TaskReportTask>("tasks") {
    displayGroups = listOf<String>(globalBuildGroup, ciBuildGroup)
}

build.gradle

def globalBuildGroup = "My global build"
def ciBuildGroup = "My CI build"

tasks.named("tasks", TaskReportTask) {
    displayGroups = [globalBuildGroup, ciBuildGroup]
}

You could hide the CI tasks if you wanted to by updating displayGroups.

Currently, the root project exposes no tasks:


$ ./gradlew :tasks

> Task :tasks

------------------------------------------------------------
Tasks runnable from root project 'gradle-project'
------------------------------------------------------------

No tasks

NOTE In this file, we don’t apply a plugin!

Let’s add a qualityCheckApp task to execute all code quality checks in the app subproject. Similarly,
for CI purposes, we implement a checkAll task that runs all tests:

build.gradle.kts

tasks.register("qualityCheckApp") {
    group = globalBuildGroup
    description = "Runs checks on app (globally)"
    dependsOn(":app:qualityCheck")
}

tasks.register("checkAll") {
    group = ciBuildGroup
    description = "Runs checks for all projects (CI)"
    dependsOn(subprojects.map { ":${it.name}:check" })
    dependsOn(gradle.includedBuilds.map { it.task(":checkAll") })
}

build.gradle

tasks.register("qualityCheckApp") {
    group = globalBuildGroup
    description = "Runs checks on app (globally)"
    dependsOn(":app:qualityCheck")
}

tasks.register("checkAll") {
    group = ciBuildGroup
    description = "Runs checks for all projects (CI)"
    dependsOn subprojects.collect { ":${it.name}:check" }
    dependsOn gradle.includedBuilds.collect { it.task(":checkAll") }
}
So we can now ask Gradle to show us the tasks for the root project and, by default, it will only show
us the qualityCheckApp task (and optionally the checkAll task depending on the value of
displayGroups).

It should be clear what a user should run locally:

$ ./gradlew :tasks

> Task :tasks

------------------------------------------------------------
Tasks runnable from root project 'gradle-project'
------------------------------------------------------------

My CI build tasks
-----------------
checkAll - Runs checks for all projects (CI)

My global build tasks


---------------------
qualityCheckApp - Runs checks on app (globally)

If we run the :checkAll task, we see that it compiles all the code and runs the code quality checks
(including spotbugs):

$ ./gradlew :checkAll

> Task :buildSrc:checkKotlinGradlePluginConfigurationErrors


> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :utilities:processResources NO-SOURCE
> Task :app:processResources NO-SOURCE
> Task :utilities:processTestResources NO-SOURCE
> Task :app:processTestResources NO-SOURCE
> Task :list:compileJava
> Task :list:processResources NO-SOURCE
> Task :list:classes
> Task :list:jar
> Task :utilities:compileJava
> Task :utilities:classes
> Task :utilities:jar
> Task :utilities:compileTestJava NO-SOURCE
> Task :utilities:testClasses UP-TO-DATE
> Task :utilities:test NO-SOURCE
> Task :utilities:check UP-TO-DATE
> Task :list:compileTestJava
> Task :list:processTestResources NO-SOURCE
> Task :list:testClasses
> Task :app:compileJava
> Task :app:classes
> Task :app:compileTestJava
> Task :app:testClasses
> Task :list:test
> Task :list:check
> Task :app:test
> Task :app:spotbugsTest
> Task :app:spotbugsMain
> Task :app:check
> Task :checkAll

BUILD SUCCESSFUL in 1s
21 actionable tasks: 12 executed, 9 up-to-date

Configuring Tasks Lazily


Knowing when and where a particular value is configured is difficult to track as a build grows in
complexity. Gradle provides several ways to manage this using lazy configuration.

Understanding Lazy properties

Gradle provides lazy properties, which delay calculating a property’s value until it’s actually
required.

Lazy properties provide three main benefits:

1. Deferred Value Resolution: Allows wiring Gradle models without needing to know when a
property’s value will be known. For example, you may want to set the input source files of a
task based on the source directories property of an extension, but the extension property value
isn’t known until the build script or some other plugin configures them.

2. Automatic Task Dependency Management: Connects output of one task to input of another,
automatically determining task dependencies. Property instances carry information about
which task, if any, produces their value. Build authors do not need to worry about keeping task
dependencies in sync with configuration changes.

3. Improved Build Performance: Avoids resource-intensive work during configuration,


impacting build performance positively. For example, when a configuration value comes from
parsing a file but is only used when functional tests are run, using a property instance to
capture this means that the file is parsed only when the functional tests are run (and not when
clean is run, for example).

Gradle represents lazy properties with two interfaces:

Provider
Represents a value that can only be queried and cannot be changed.

• Properties with these types are read-only.

• The method Provider.get() returns the current value of the property.

• A Provider can be created from another Provider using Provider.map(Transformer).

• Many other types extend Provider and can be used wherever a Provider is required.

Property
Represents a value that can be queried and changed.

• Properties with these types are configurable.

• Property extends the Provider interface.

• The method Property.set(T) specifies a value for the property, overwriting whatever value
may have been present.

• The method Property.set(Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together
Provider and Property instances before the values are configured.

• A Property can be created by the factory method ObjectFactory.property(Class).

Lazy properties are intended to be passed around and only queried when required. This typically
happens during the execution phase.

The following demonstrates a task with a configurable greeting property and a read-only message
property:

build.gradle.kts

abstract class Greeting : DefaultTask() { ①
    @get:Input
    abstract val greeting: Property<String> ②

    @Internal
    val message: Provider<String> = greeting.map { it + " from Gradle" } ③

    @TaskAction
    fun printMessage() {
        logger.quiet(message.get())
    }
}

tasks.register<Greeting>("greeting") {
    greeting.set("Hi") ④
    greeting = "Hi" ⑤
}

build.gradle

abstract class Greeting extends DefaultTask { ①
    @Input
    abstract Property<String> getGreeting() ②

    @Internal
    final Provider<String> message = greeting.map { it + ' from Gradle' } ③

    @TaskAction
    void printMessage() {
        logger.quiet(message.get())
    }
}

tasks.register("greeting", Greeting) {
    greeting.set('Hi') ④
    greeting = 'Hi' ⑤
}

① A task that displays a greeting

② A configurable greeting

③ Read-only property calculated from the greeting

④ Configure the greeting

⑤ Alternative notation to calling Property.set()

$ gradle greeting

> Task :greeting


Hi from Gradle

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

The Greeting task has a property of type Property<String> to represent the configurable greeting
and a property of type Provider<String> to represent the calculated, read-only, message. The
message Provider is created from the greeting Property using the map() method; its value is kept up-
to-date as the value of the greeting property changes.

Creating a Property or Provider instance

Neither Provider nor its subtypes, such as Property, are intended to be implemented by a build
script or plugin. Gradle provides factory methods to create instances of these types instead.

In the previous example, two factory methods were presented:

• ObjectFactory.property(Class) creates a new Property instance. An instance of the ObjectFactory
can be referenced from Project.getObjects() or by injecting ObjectFactory through a constructor
or method.

• Provider.map(Transformer) creates a new Provider from an existing Provider or Property
instance.

See the Quick Reference for all of the types and factories available.

A Provider can also be created by the factory method ProviderFactory.provider(Callable).

There are no specific methods to create a provider using a groovy.lang.Closure.

NOTE When writing a plugin or build script with Groovy, you can use the map(Transformer)
method with a closure, and Groovy will convert the closure to a Transformer. Similarly, when
writing a plugin or build script with Kotlin, the Kotlin compiler will convert a Kotlin function
into a Transformer.
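The factory methods above can be combined into a small sketch. The timestamp and showLabel names here are illustrative; providers.provider {} and map() are the APIs described in this section:

```kotlin
// providers.provider { } defers the computation until get() is called
val timestamp: Provider<Long> = providers.provider { System.currentTimeMillis() }

// map() derives a new lazy value; nothing has been computed yet
val label: Provider<String> = timestamp.map { "built at $it" }

tasks.register("showLabel") {
    doLast {
        // The value is only calculated here, during the execution phase
        println(label.get())
    }
}
```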

Connecting properties together

An important feature of lazy properties is that they can be connected together so that changes to
one property are automatically reflected in other properties.

Here is an example where the property of a task is connected to a property of a project extension:

build.gradle.kts

// A project extension
interface MessageExtension {
    // A configurable greeting
    abstract val greeting: Property<String>
}

// A task that displays a greeting
abstract class Greeting : DefaultTask() {
    // Configurable by the user
    @get:Input
    abstract val greeting: Property<String>

    // Read-only property calculated from the greeting
    @Internal
    val message: Provider<String> = greeting.map { it + " from Gradle" }

    @TaskAction
    fun printMessage() {
        logger.quiet(message.get())
    }
}

// Create the project extension
val messages = project.extensions.create<MessageExtension>("messages")

// Create the greeting task
tasks.register<Greeting>("greeting") {
    // Attach the greeting from the project extension
    // Note that the values of the project extension have not been configured yet
    greeting = messages.greeting
}

messages.apply {
    // Configure the greeting on the extension
    // Note that there is no need to reconfigure the task's `greeting` property.
    // This is automatically updated as the extension property changes
    greeting = "Hi"
}

build.gradle

// A project extension
interface MessageExtension {
    // A configurable greeting
    Property<String> getGreeting()
}

// A task that displays a greeting
abstract class Greeting extends DefaultTask {
    // Configurable by the user
    @Input
    abstract Property<String> getGreeting()

    // Read-only property calculated from the greeting
    @Internal
    final Provider<String> message = greeting.map { it + ' from Gradle' }

    @TaskAction
    void printMessage() {
        logger.quiet(message.get())
    }
}

// Create the project extension
project.extensions.create('messages', MessageExtension)

// Create the greeting task
tasks.register("greeting", Greeting) {
    // Attach the greeting from the project extension
    // Note that the values of the project extension have not been configured yet
    greeting = messages.greeting
}

messages {
    // Configure the greeting on the extension
    // Note that there is no need to reconfigure the task's `greeting` property.
    // This is automatically updated as the extension property changes
    greeting = 'Hi'
}

$ gradle greeting

> Task :greeting


Hi from Gradle

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example calls the Property.set(Provider) method to attach a Provider to a Property to supply the
value of the property. In this case, the Provider happens to be a Property as well, but you can
connect any Provider implementation, for example one created using Provider.map().

Working with files

In Working with Files, we introduced four collection types for File-like objects:

Read-only Type Configurable Type

FileCollection ConfigurableFileCollection

FileTree ConfigurableFileTree
All of these types are also considered lazy types.

There are more strongly typed models used to represent elements of the file system: Directory and
RegularFile. These types shouldn’t be confused with the standard Java File type as they are used to
tell Gradle that you expect more specific values such as a directory or a non-directory, regular file.

Gradle provides two specialized Property subtypes for dealing with values of these types:
RegularFileProperty and DirectoryProperty. ObjectFactory has methods to create these:
ObjectFactory.fileProperty() and ObjectFactory.directoryProperty().

A DirectoryProperty can also be used to create a lazily evaluated Provider for a Directory and
RegularFile via DirectoryProperty.dir(String) and DirectoryProperty.file(String) respectively. These
methods create providers whose values are calculated relative to the location for the
DirectoryProperty they were created from. The values returned from these providers will reflect
changes to the DirectoryProperty.

build.gradle.kts

// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource : DefaultTask() {
    // The configuration file to use to generate the source file
    @get:InputFile
    abstract val configFile: RegularFileProperty

    // The directory to write source files to
    @get:OutputDirectory
    abstract val outputDir: DirectoryProperty

    @TaskAction
    fun compile() {
        val inFile = configFile.get().asFile
        logger.quiet("configuration file = $inFile")
        val dir = outputDir.get().asFile
        logger.quiet("output dir = $dir")
        val className = inFile.readText().trim()
        val srcFile = File(dir, "${className}.java")
        srcFile.writeText("public class ${className} { }")
    }
}

// Create the source generation task
tasks.register<GenerateSource>("generate") {
    // Configure the locations, relative to the project and build directories
    configFile = layout.projectDirectory.file("src/config.txt")
    outputDir = layout.buildDirectory.dir("generated-source")
}

// Change the build directory
// Don't need to reconfigure the task properties. These are automatically
// updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource extends DefaultTask {
    // The configuration file to use to generate the source file
    @InputFile
    abstract RegularFileProperty getConfigFile()

    // The directory to write source files to
    @OutputDirectory
    abstract DirectoryProperty getOutputDir()

    @TaskAction
    def compile() {
        def inFile = configFile.get().asFile
        logger.quiet("configuration file = $inFile")
        def dir = outputDir.get().asFile
        logger.quiet("output dir = $dir")
        def className = inFile.text.trim()
        def srcFile = new File(dir, "${className}.java")
        srcFile.text = "public class ${className} { ... }"
    }
}

// Create the source generation task
tasks.register('generate', GenerateSource) {
    // Configure the locations, relative to the project and build directories
    configFile = layout.projectDirectory.file('src/config.txt')
    outputDir = layout.buildDirectory.dir('generated-source')
}

// Change the build directory
// Don't need to reconfigure the task properties. These are automatically
// updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle generate

> Task :generate


configuration file = /home/user/gradle/samples/src/config.txt
output dir = /home/user/gradle/samples/output/generated-source
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

$ gradle generate

> Task :generate


configuration file = /home/user/gradle/samples/kotlin/src/config.txt
output dir = /home/user/gradle/samples/kotlin/output/generated-source

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example creates providers that represent locations in the project and build directories through
Project.getLayout() with ProjectLayout.getProjectDirectory() and ProjectLayout.getBuildDirectory().

To close the loop, note that a DirectoryProperty, or a simple Directory, can be turned into a FileTree
that allows the files and directories contained in the directory to be queried with
DirectoryProperty.getAsFileTree() or Directory.getAsFileTree(). From a DirectoryProperty or a
Directory, you can create FileCollection instances containing a set of the files contained in the
directory with Project.files(Object...) or ProjectLayout.files(Object...).

Working with task inputs and outputs

Many builds have several tasks connected together, where one task consumes the outputs of
another task as an input.

To make this work, we need to configure each task to know where to look for its inputs and where
to place its outputs. Ensure that the producing and consuming tasks are configured with the same
location and attach task dependencies between the tasks. This can be cumbersome and brittle if any
of these values are configurable by a user or configured by multiple plugins, as task properties need
to be configured in the correct order and locations, and task dependencies kept in sync as values
change.

The Property API makes this easier by keeping track of the value of a property and the task that
produces the value.

As an example, consider the following plugin with a producer and consumer task which are wired
together:

build.gradle.kts

abstract class Producer : DefaultTask() {

    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText(message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {

    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() {
        val input = inputFile.get().asFile
        val message = input.readText()
        logger.quiet("Read '${message}' from ${input}")
    }
}

val producer = tasks.register<Producer>("producer")
val consumer = tasks.register<Consumer>("consumer")

consumer {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is
    // automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is
    // automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are
// automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

abstract class Producer extends DefaultTask {

    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {

    @InputFile
    abstract RegularFileProperty getInputFile()

    @TaskAction
    void consume() {
        def input = inputFile.get().asFile
        def message = input.text
        logger.quiet("Read '${message}' from ${input}")
    }
}

def producer = tasks.register("producer", Producer)
def consumer = tasks.register("consumer", Consumer)

consumer.configure {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is
    // automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer.configure {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is
    // automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are
// automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/output/file.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/file.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

In the example above, the task outputs and inputs are connected before any location is defined. The
setters can be called at any time before the task is executed, and the change will automatically
affect all related input and output properties.

Another important thing to note in this example is the absence of any explicit task dependency.
Task outputs represented using Providers keep track of which task produces their value, and using
them as task inputs will implicitly add the correct task dependencies.

Implicit task dependencies also work for input properties that are not files:

build.gradle.kts

abstract class Producer : DefaultTask() {

    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText(message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {

    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun consume() {
        logger.quiet(message.get())
    }
}

val producer = tasks.register<Producer>("producer") {
    // Set values for the producer lazily
    // Don't need to update the consumer.message property. This is
    // automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}
tasks.register<Consumer>("consumer") {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is
    // automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.readText() }
}

build.gradle

abstract class Producer extends DefaultTask {

    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {

    @Input
    abstract Property<String> getMessage()

    @TaskAction
    void consume() {
        logger.quiet(message.get())
    }
}

def producer = tasks.register('producer', Producer) {
    // Set values for the producer lazily
    // Don't need to update the consumer.message property. This is
    // automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}
tasks.register('consumer', Consumer) {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is
    // automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.text }
}

$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/build/file.txt

> Task :consumer
Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/build/file.txt

> Task :consumer
Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

Working with collections

Gradle provides two lazy property types to help configure Collection properties.

These work exactly like any other Provider and, just like file providers, they have additional
modeling around them:

• For List values the interface is called ListProperty.

  You can create a new ListProperty using ObjectFactory.listProperty(Class) and specifying the
  element type.

• For Set values the interface is called SetProperty.

  You can create a new SetProperty using ObjectFactory.setProperty(Class) and specifying the
  element type.

This type of property allows you to overwrite the entire collection value with
HasMultipleValues.set(Iterable) and HasMultipleValues.set(Provider) or add new elements through
the various add methods:

• HasMultipleValues.add(T): Add a single element to the collection

• HasMultipleValues.add(Provider): Add a lazily calculated element to the collection

• HasMultipleValues.addAll(Provider): Add a lazily calculated collection of elements to the list

Just like every Provider, the collection is calculated when Provider.get() is called. The following
example shows the ListProperty in action:

build.gradle.kts

abstract class Producer : DefaultTask() {

    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText(message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {

    @get:InputFiles
    abstract val inputFiles: ListProperty<RegularFile>

    @TaskAction
    fun consume() {
        inputFiles.get().forEach { inputFile ->
            val input = inputFile.asFile
            val message = input.readText()
            logger.quiet("Read '${message}' from ${input}")
        }
    }
}

val producerOne = tasks.register<Producer>("producerOne")
val producerTwo = tasks.register<Producer>("producerTwo")
tasks.register<Consumer>("consumer") {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are
    // automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is
// automatically updated as producer.outputFile changes
producerOne { outputFile = layout.buildDirectory.file("one.txt") }
producerTwo { outputFile = layout.buildDirectory.file("two.txt") }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated
// as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

abstract class Producer extends DefaultTask {

    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {

    @InputFiles
    abstract ListProperty<RegularFile> getInputFiles()

    @TaskAction
    void consume() {
        inputFiles.get().each { inputFile ->
            def input = inputFile.asFile
            def message = input.text
            logger.quiet("Read '${message}' from ${input}")
        }
    }
}

def producerOne = tasks.register('producerOne', Producer)
def producerTwo = tasks.register('producerTwo', Producer)
tasks.register('consumer', Consumer) {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are
    // automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is
// automatically updated as producer.outputFile changes
producerOne.configure { outputFile = layout.buildDirectory.file('one.txt') }
producerTwo.configure { outputFile = layout.buildDirectory.file('two.txt') }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated
// as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle consumer

> Task :producerOne
Wrote 'Hello, World!' to /home/user/gradle/samples/output/one.txt

> Task :producerTwo
Wrote 'Hello, World!' to /home/user/gradle/samples/output/two.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed

$ gradle consumer

> Task :producerOne
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/one.txt

> Task :producerTwo
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/two.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed

Working with maps

Gradle provides a lazy MapProperty type to allow Map values to be configured. You can create a
MapProperty instance using ObjectFactory.mapProperty(Class, Class).

Similar to other property types, a MapProperty has a set() method that you can use to specify the
value for the property. Some additional methods allow entries with lazy values to be added to the
map.

build.gradle.kts

abstract class Generator: DefaultTask() {

    @get:Input
    abstract val properties: MapProperty<String, Int>

    @TaskAction
    fun generate() {
        properties.get().forEach { entry ->
            logger.quiet("${entry.key} = ${entry.value}")
        }
    }
}

// Some values to be configured later
var b = 0
var c = 0

tasks.register<Generator>("generate") {
    properties.put("a", 1)
    // Values have not been configured yet
    properties.put("b", providers.provider { b })
    properties.putAll(providers.provider { mapOf("c" to c, "d" to c + 1) })
}

// Configure the values. There is no need to reconfigure the task
b = 2
c = 3

build.gradle

abstract class Generator extends DefaultTask {

    @Input
    abstract MapProperty<String, Integer> getProperties()

    @TaskAction
    void generate() {
        properties.get().each { key, value ->
            logger.quiet("${key} = ${value}")
        }
    }
}

// Some values to be configured later
def b = 0
def c = 0

tasks.register('generate', Generator) {
    properties.put("a", 1)
    // Values have not been configured yet
    properties.put("b", providers.provider { b })
    properties.putAll(providers.provider { [c: c, d: c + 1] })
}

// Configure the values. There is no need to reconfigure the task
b = 2
c = 3

$ gradle generate

> Task :generate


a = 1
b = 2
c = 3
d = 4

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

Applying a convention to a property

Often, you want to apply some convention, or default value, to a property to be used if no value has
been configured. You can use the convention() method for this. This method accepts either a value
or a Provider, and this will be used as the value until some other value is configured.

build.gradle.kts

tasks.register("show") {
    val property = objects.property(String::class)

    // Set a convention
    property.convention("convention 1")

    println("value = " + property.get())

    // Can replace the convention
    property.convention("convention 2")
    println("value = " + property.get())

    property.set("explicit value")
    // Once a value is set, the convention is ignored
    property.convention("ignored convention")

    doLast {
        println("value = " + property.get())
    }
}

build.gradle

tasks.register("show") {
    def property = objects.property(String)

    // Set a convention
    property.convention("convention 1")

    println("value = " + property.get())

    // Can replace the convention
    property.convention("convention 2")
    println("value = " + property.get())

    property.set("explicit value")

    // Once a value is set, the convention is ignored
    property.convention("ignored convention")

    doLast {
        println("value = " + property.get())
    }
}

$ gradle show
value = convention 1
value = convention 2

> Task :show


value = explicit value

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

Where to apply conventions from?

There are several appropriate locations for setting a convention on a property at configuration time
(i.e., before execution).

build.gradle.kts

// setting convention when registering a task from plugin
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register<GreetingTask>("hello") {
            greeter.convention("Greeter")
        }
    }
}

apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

abstract class GreetingTask : DefaultTask() {

    // setting convention from constructor
    @get:Input
    abstract val guest: Property<String>

    init {
        guest.convention("person2")
    }

    // setting convention from declaration
    @Input
    val greeter = project.objects.property<String>().convention("person1")

    @TaskAction
    fun greet() {
        println("hello, ${guest.get()}, from ${greeter.get()}")
    }
}

build.gradle

// setting convention when registering a task from plugin
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.tasks.register("hello", GreetingTask) {
            greeter.convention("Greeter")
        }
    }
}

apply plugin: GreetingPlugin

tasks.withType(GreetingTask).configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

abstract class GreetingTask extends DefaultTask {

    // setting convention from constructor
    @Input
    abstract Property<String> getGuest()

    GreetingTask() {
        guest.convention("person2")
    }

    // setting convention from declaration
    @Input
    final Property<String> greeter = project.objects.property(String)
        .convention("person1")

    @TaskAction
    void greet() {
        println("hello, ${guest.get()}, from ${greeter.get()}")
    }
}

From a plugin’s apply() method

Plugin authors may configure a convention on a lazy property from a plugin’s apply() method,
while performing preliminary configuration of the task or extension defining the property. This
works well for regular plugins (meant to be distributed and used in the wild), and internal
convention plugins (which often configure properties defined by third party plugins in a uniform
way for the entire build).

build.gradle.kts

// setting convention when registering a task from plugin
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register<GreetingTask>("hello") {
            greeter.convention("Greeter")
        }
    }
}

build.gradle

// setting convention when registering a task from plugin
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.tasks.register("hello", GreetingTask) {
            greeter.convention("Greeter")
        }
    }
}

From a build script

Build engineers may configure a convention on a lazy property from shared build logic that is
configuring tasks (for instance, from third-party plugins) in a standard way for the entire build.

build.gradle.kts

apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

build.gradle

tasks.withType(GreetingTask).configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

Note that for project-specific values, instead of conventions, you should prefer setting explicit
values (using Property.set(…) or ConfigurableFileCollection.setFrom(…), for instance), as
conventions are only meant to define defaults.

From the task initialization

A task author may configure a convention on a lazy property from the task constructor or (if in
Kotlin) initializer block. This approach works for properties with trivial defaults, but it is not
appropriate if additional context (external to the task implementation) is required in order to set a
suitable default.

build.gradle.kts

// setting convention from constructor
@get:Input
abstract val guest: Property<String>

init {
    guest.convention("person2")
}

build.gradle

// setting convention from constructor
@Input
abstract Property<String> getGuest()

GreetingTask() {
    guest.convention("person2")
}

Next to the property declaration

You may configure a convention on a lazy property next to the place where the property is
declared. Note this option is not available for managed properties, and has the same caveats as
configuring a convention from the task constructor.

build.gradle.kts

// setting convention from declaration
@Input
val greeter = project.objects.property<String>().convention("person1")

build.gradle

// setting convention from declaration
@Input
final Property<String> greeter = project.objects.property(String)
    .convention("person1")

Making a property unmodifiable

Most properties of a task or project are intended to be configured by plugins or build scripts so that
they can use specific values for that build.

For example, a property that specifies the output directory for a compilation task may start with a
value specified by a plugin. Then a build script might change the value to some custom location,
then this value is used by the task when it runs. However, once the task starts to run, we want to
prevent further property changes. This way we avoid errors that result from different consumers,
such as the task action, Gradle’s up-to-date checks, build caching, or other tasks, using different
values for the property.

Lazy properties provide several methods that you can use to disallow changes to their value once
the value has been configured. The finalizeValue() method calculates the final value for the
property and prevents further changes to the property.

Property.finalizeValue()

When the property’s value comes from a Provider, the provider is queried for its current value, and
the result becomes the final value for the property. This final value replaces the provider and the
property no longer tracks the value of the provider. Calling this method also makes a property
instance unmodifiable and any further attempts to change the value of the property will fail. Gradle
automatically makes the properties of a task final when the task starts execution.

The finalizeValueOnRead() method is similar, except that the property’s final value is not calculated
until the value of the property is queried.

Property.finalizeValueOnRead()

In other words, this method calculates the final value lazily as required, whereas finalizeValue()
calculates the final value eagerly. This method can be used when the value may be expensive to
calculate or may not have been configured yet, and it ensures that all consumers of the property
see the same value when they query it.
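The difference between the two methods can be illustrated with a simplified model (plain Java, for illustration only, not the real Gradle Property implementation): finalizeValue() would snapshot the upstream provider immediately, while finalizeValueOnRead() defers the snapshot until the first query, after which the value no longer changes.

```java
import java.util.function.Supplier;

// Simplified model of a lazy property, for illustration only --
// NOT the real org.gradle.api.provider.Property implementation.
class ModelProperty<T> {
    private Supplier<T> source;      // tracks the upstream provider
    private T finalValue;            // set once the value is finalized
    private boolean finalizeOnRead;  // snapshot lazily, on the first get()

    void set(Supplier<T> provider) {
        if (finalValue != null) {
            throw new IllegalStateException("value is final");
        }
        source = provider;
    }

    void finalizeValue() {           // eager: query the provider now
        finalValue = source.get();
    }

    void finalizeValueOnRead() {     // lazy: snapshot at first query
        finalizeOnRead = true;
    }

    T get() {
        if (finalValue == null && finalizeOnRead) {
            finalValue = source.get(); // first read takes the snapshot
        }
        return finalValue != null ? finalValue : source.get();
    }
}

public class FinalizeDemo {
    public static void main(String[] args) {
        String[] backing = {"initial"};
        ModelProperty<String> p = new ModelProperty<>();
        p.set(() -> backing[0]);

        p.finalizeValueOnRead();
        backing[0] = "changed before first read"; // still visible: no snapshot yet
        System.out.println(p.get());
        backing[0] = "changed after first read";  // ignored: value is now final
        System.out.println(p.get());
    }
}
```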

Using the Provider API

Guidelines to be successful with the Provider API:


1. The Property and Provider types have all of the overloads you need to query or configure a
value. For this reason, you should follow these guidelines:

◦ For configurable properties, expose the Property directly through a single getter.

◦ For non-configurable properties, expose a Provider directly through a single getter.

2. Avoid simplifying calls like obj.getProperty().get() and obj.getProperty().set(T) in your code
by introducing additional getters and setters.

3. When migrating your plugin to use providers, follow these guidelines:

◦ If it’s a new property, expose it as a Property or Provider using a single getter.

◦ If it’s incubating, change it to use a Property or Provider using a single getter.

◦ If it’s a stable property, add a new Property or Provider and deprecate the old one. You
should wire the old getter/setters into the new property as appropriate.

Provider Files API Reference

Use these types for read-only values:

Provider<RegularFile>
File on disk

Factories
• Provider.map(Transformer)

• Provider.flatMap(Transformer)

• DirectoryProperty.file(String)

Provider<Directory>
Directory on disk

Factories
• Provider.map(Transformer)

• Provider.flatMap(Transformer)

• DirectoryProperty.dir(String)

FileCollection
Unstructured collection of files

Factories
• Project.files(Object[])

• ProjectLayout.files(Object...)

• DirectoryProperty.files(Object...)

FileTree
Hierarchy of files

Factories
• Project.fileTree(Object) will produce a ConfigurableFileTree, or you can use
Project.zipTree(Object) and Project.tarTree(Object)

• DirectoryProperty.getAsFileTree()

Property Files API Reference

Use these types for mutable values:

RegularFileProperty
File on disk

Factories
• ObjectFactory.fileProperty()

DirectoryProperty
Directory on disk

Factories
• ObjectFactory.directoryProperty()

ConfigurableFileCollection
Unstructured collection of files

Factories
• ObjectFactory.fileCollection()

ConfigurableFileTree
Hierarchy of files

Factories
• ObjectFactory.fileTree()

SourceDirectorySet
Hierarchy of source directories

Factories
• ObjectFactory.sourceDirectorySet(String, String)

Lazy Collections API Reference

Use these types for mutable values:

ListProperty<T>
a property whose value is List<T>

Factories
• ObjectFactory.listProperty(Class)

SetProperty<T>
a property whose value is Set<T>

Factories
• ObjectFactory.setProperty(Class)

Lazy Objects API Reference

Use these types for read-only values:

Provider<T>
a property whose value is an instance of T

Factories
• Provider.map(Transformer)

• Provider.flatMap(Transformer)

• ProviderFactory.provider(Callable). Always prefer one of the other factory methods over
this method.

Use these types for mutable values:

Property<T>
a property whose value is an instance of T

Factories
• ObjectFactory.property(Class)

Developing Parallel Tasks


Gradle provides an API that can split tasks into sections that can be executed in parallel.

This allows Gradle to fully utilize the resources available and complete builds faster.

The Worker API

The Worker API provides the ability to break up the execution of a task action into discrete units of
work and then execute that work concurrently and asynchronously.
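The same idea can be sketched with a plain JDK executor (an analogy only, not the Worker API itself): independent units of work are submitted to a queue and executed concurrently, with the file names below purely illustrative.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class WorkQueueAnalogy {
    public static void main(String[] args) throws InterruptedException {
        // A pool of workers plays the role of the work queue.
        ExecutorService queue = Executors.newFixedThreadPool(3);

        // Each "unit of work" is independent, so the units may run concurrently.
        for (String file : List.of("a.txt", "b.txt", "c.txt")) {
            queue.submit(() -> System.out.println("hashing " + file));
        }

        // Wait for all submitted work to finish before the "task" completes.
        queue.shutdown();
        queue.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```

As with the Worker API, the units may complete in any order.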

Worker API example

The best way to understand how to use the API is to go through the process of converting an
existing custom task to use the Worker API:

1. You’ll start by creating a custom task class that generates MD5 hashes for a configurable set of
files.

2. Then, you’ll convert this custom task to use the Worker API.

3. Then, we’ll explore running the task with different levels of isolation.

In the process, you’ll learn about the basics of the Worker API and the capabilities it provides.

Step 1. Create a custom task class

First, create a custom task that generates MD5 hashes of a configurable set of files.

In a new directory, create a buildSrc/build.gradle(.kts) file:

buildSrc/build.gradle.kts

repositories {
mavenCentral()
}

dependencies {
implementation("commons-io:commons-io:2.5")
implementation("commons-codec:commons-codec:1.9") ①
}

buildSrc/build.gradle

repositories {
mavenCentral()
}

dependencies {
implementation 'commons-io:commons-io:2.5'
implementation 'commons-codec:commons-codec:1.9' ①
}
① Your custom task class will use Apache Commons Codec to generate MD5 hashes.

Next, create a custom task class in your buildSrc/src/main/java directory. You should name this
class CreateMD5:

buildSrc/src/main/java/CreateMD5.java

import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.SourceTask;
import org.gradle.api.tasks.TaskAction;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

abstract public class CreateMD5 extends SourceTask { ①

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory(); ②

    @TaskAction
    public void createHashes() {
        for (File sourceFile : getSource().getFiles()) { ③
            try {
                InputStream stream = new FileInputStream(sourceFile);
                System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
                // Artificially make this task slower.
                Thread.sleep(3000); ④
                Provider<RegularFile> md5File = getDestinationDirectory()
                    .file(sourceFile.getName() + ".md5"); ⑤
                FileUtils.writeStringToFile(md5File.get().getAsFile(),
                    DigestUtils.md5Hex(stream), (String) null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
}

① SourceTask is a convenience type for tasks that operate on a set of source files.

② The task output will go into a configured directory.

③ The task iterates over all the files defined as "source files" and creates an MD5 hash of each.
④ Insert an artificial sleep to simulate hashing a large file (the sample files won’t be that large).

⑤ The MD5 hash of each file is written to the output directory into a file of the same name with an
"md5" extension.
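For reference, the hex digest that DigestUtils.md5Hex produces can also be computed with only the JDK. This standalone sketch is not part of the build, just an illustration of the underlying computation:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Md5Demo {
    // Compute an MD5 hex digest using only java.security.MessageDigest,
    // producing the same result as Commons Codec's DigestUtils.md5Hex.
    static String md5Hex(byte[] input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(input)) {
            sb.append(String.format("%02x", b)); // two lowercase hex chars per byte
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(md5Hex("hello".getBytes(StandardCharsets.UTF_8)));
    }
}
```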

Next, create a build.gradle(.kts) that registers your new CreateMD5 task:

build.gradle.kts

plugins { id("base") } ①

tasks.register<CreateMD5>("md5") {
    destinationDirectory = project.layout.buildDirectory.dir("md5") ②
    source(project.layout.projectDirectory.file("src")) ③
}

build.gradle

plugins { id 'base' } ①

tasks.register("md5", CreateMD5) {
    destinationDirectory = project.layout.buildDirectory.dir("md5") ②
    source(project.layout.projectDirectory.file('src')) ③
}

① Apply the base plugin so that you’ll have a clean task to use to remove the output.

② MD5 hash files will be written to build/md5.

③ This task will generate MD5 hash files for every file in the src directory.

You will need some source to generate MD5 hashes from. Create three files in the src directory:

src/einstein.txt

Intellectual growth should commence at birth and cease only at death.

src/feynman.txt

I was born not knowing and have had only a little time to change that here and there.

src/hawking.txt

Intelligence is the ability to adapt to change.


At this point, you can test your task by running it:

$ gradle md5

The output should look similar to:

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 9s
3 actionable tasks: 3 executed

In the build/md5 directory, you should now see corresponding files with an md5 extension containing
MD5 hashes of the files from the src directory. Notice that the task takes at least 9 seconds to run
because it hashes each file one at a time (i.e., three files at ~3 seconds apiece).

Step 2. Convert to the Worker API

Although this task processes each file in sequence, the processing of each file is independent of any
other file. This work can be done in parallel and take advantage of multiple processors. This is
where the Worker API can help.

To use the Worker API, you need to define an interface that represents the parameters of each unit
of work and extends org.gradle.workers.WorkParameters.

For the generation of MD5 hash files, the unit of work will require two parameters:

1. the file to be hashed and,

2. the file to write the hash to.

There is no need to create a concrete implementation because Gradle will generate one for us at
runtime.

buildSrc/src/main/java/MD5WorkParameters.java

import org.gradle.api.file.RegularFileProperty;
import org.gradle.workers.WorkParameters;

public interface MD5WorkParameters extends WorkParameters {
    RegularFileProperty getSourceFile(); ①
    RegularFileProperty getMD5File();
}

① Use Property objects to represent the source and MD5 hash files.

Then, you need to refactor the part of your custom task that does the work for each individual file
into a separate class. This class is your "unit of work" implementation, and it should be an abstract
class that implements org.gradle.workers.WorkAction:

buildSrc/src/main/java/GenerateMD5.java

import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.workers.WorkAction;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public abstract class GenerateMD5 implements WorkAction<MD5WorkParameters> { ①

    @Override
    public void execute() {
        try {
            File sourceFile = getParameters().getSourceFile().getAsFile().get();
            File md5File = getParameters().getMD5File().getAsFile().get();
            InputStream stream = new FileInputStream(sourceFile);
            System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
            // Artificially make this task slower.
            Thread.sleep(3000);
            FileUtils.writeStringToFile(md5File, DigestUtils.md5Hex(stream), (String) null);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}

① Do not implement the getParameters() method - Gradle will inject this at runtime.

Now, change your custom task class to submit work to the WorkerExecutor instead of doing the
work itself.

buildSrc/src/main/java/CreateMD5.java

import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.workers.*;

import javax.inject.Inject;
import java.io.File;

abstract public class CreateMD5 extends SourceTask {

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory();

    @Inject
    abstract public WorkerExecutor getWorkerExecutor(); ①

    @TaskAction
    public void createHashes() {
        WorkQueue workQueue = getWorkerExecutor().noIsolation(); ②

        for (File sourceFile : getSource().getFiles()) {
            Provider<RegularFile> md5File = getDestinationDirectory()
                .file(sourceFile.getName() + ".md5");
            workQueue.submit(GenerateMD5.class, parameters -> { ③
                parameters.getSourceFile().set(sourceFile);
                parameters.getMD5File().set(md5File);
            });
        }
    }
}

① The WorkerExecutor service is required in order to submit your work. Create an abstract getter
method annotated with javax.inject.Inject, and Gradle will inject the service at runtime when the
task is created.

② Before submitting work, get a WorkQueue object with the desired isolation mode (described
below).

③ When submitting the unit of work, specify the unit of work implementation, in this case
GenerateMD5, and configure its parameters.

At this point, you should be able to rerun your task:

$ gradle clean md5

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

The results should look the same as before, although the MD5 hash files may be generated in a
different order since the units of work are executed in parallel. This time, however, the task runs
much faster. This is because the Worker API executes the MD5 calculation for each file in parallel
rather than in sequence.

Step 3. Change the isolation mode

The isolation mode controls how strongly Gradle will isolate items of work from each other and the
rest of the Gradle runtime.

There are three methods on WorkerExecutor that control this:

1. noIsolation()

2. classLoaderIsolation()

3. processIsolation()

The noIsolation() mode is the lowest level of isolation and will prevent a unit of work from
changing the project state. This is the fastest isolation mode because it requires the least overhead
to set up and execute the work item. However, it will use a single shared classloader for all units of
work. This means that each unit of work can affect one another through static class state. It also
means that every unit of work uses the same version of libraries on the buildscript classpath. If you
wanted the user to be able to configure the task to run with a different (but compatible) version of
the Apache Commons Codec library, you would need to use a different isolation mode.
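The shared-classloader caveat is easy to demonstrate outside Gradle: when concurrent units of work run in the same classloader, they all see and mutate the same static state. A standalone sketch using plain threads (an analogy, not the Worker API):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

public class SharedStateDemo {
    // Static state is visible to every "work item" that shares a classloader.
    static final AtomicInteger counter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        int items = 4;
        CountDownLatch done = new CountDownLatch(items);
        for (int i = 0; i < items; i++) {
            new Thread(() -> {
                counter.incrementAndGet(); // each unit of work mutates the shared field
                done.countDown();
            }).start();
        }
        done.await();
        // All four units saw and modified the same static variable.
        System.out.println("counter = " + counter.get());
    }
}
```

With classLoaderIsolation() or processIsolation(), each work item would instead see its own copy of such static state.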

First, you must change the dependency in buildSrc/build.gradle(.kts) to be compileOnly. This tells Gradle
that it should use this dependency when building the classes, but should not put it on the build
script classpath:

buildSrc/build.gradle.kts

repositories {
mavenCentral()
}

dependencies {
implementation("commons-io:commons-io:2.5")
compileOnly("commons-codec:commons-codec:1.9")
}

buildSrc/build.gradle

repositories {
mavenCentral()
}

dependencies {
implementation 'commons-io:commons-io:2.5'
compileOnly 'commons-codec:commons-codec:1.9'
}

Next, change the CreateMD5 task to allow the user to configure the version of the codec library that
they want to use. It will resolve the appropriate version of the library at runtime and configure the
workers to use this version.

The classLoaderIsolation() method tells Gradle to run this work in a thread with an isolated
classloader:

buildSrc/src/main/java/[Link]

import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link].*;
import [Link];
import [Link].*;

import [Link];
import [Link];
import [Link];

abstract public class CreateMD5 extends SourceTask {

@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();

@Inject
abstract public WorkerExecutor getWorkerExecutor();

@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(workerSpec -> {
[Link]().from(getCodecClasspath()); ②
});

for (File sourceFile : getSource().getFiles()) {


Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile
.getName() + ".md5");
[Link]([Link], parameters -> {
[Link]().set(sourceFile);
parameters.getMD5File().set(md5File);
});
}
}
}

① Expose an input property for the codec library classpath.

② Configure the classpath on the ClassLoaderWorkerSpec when creating the work queue.
Next, you need to configure your build so that it has a repository to look up the codec version at
task execution time. We also create a dependency to resolve our codec library from this repository:

[Link]

plugins { id("base") }

repositories {
mavenCentral() ①
}

val codec = [Link]("codec") { ②


attributes {
attribute(Usage.USAGE_ATTRIBUTE, [Link](Usage.JAVA_RUNTIME))
}
isVisible = false
isCanBeConsumed = false
}

dependencies {
codec("commons-codec:commons-codec:1.10") ③
}

[Link]<CreateMD5>("md5") {
[Link](codec) ④
destinationDirectory = [Link]("md5")
source([Link]("src"))
}

[Link]

plugins { id 'base' }

repositories {
mavenCentral() ①
}

[Link]('codec') { ②
attributes {
attribute(Usage.USAGE_ATTRIBUTE, [Link](Usage, Usage
.JAVA_RUNTIME))
}
visible = false
canBeConsumed = false
}

dependencies {
codec 'commons-codec:commons-codec:1.10' ③
}

[Link]('md5', CreateMD5) {
[Link]([Link]) ④
destinationDirectory = [Link]('md5')
source([Link]('src'))
}

① Add a repository to resolve the codec library - this can be a different repository than the one
used to build the CreateMD5 task class.

② Add a configuration to resolve our codec library version.

③ Configure an alternate, compatible version of Apache Commons Codec.

④ Configure the md5 task to use the configuration as its classpath. Note that the configuration will
not be resolved until the task is executed.

Now, if you run your task, it should work as expected using the configured version of the codec
library:

$ gradle clean md5

> Task :md5


Generating MD5 for [Link]...
Generating MD5 for [Link]...
Generating MD5 for [Link]...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

Step 4. Create a Worker Daemon

Sometimes, it is desirable to utilize even greater levels of isolation when executing items of work.
For instance, external libraries may rely on certain system properties to be set, which may conflict
between work items. Or a library might not be compatible with the version of JDK that Gradle is
running with and may need to be run with a different version.

The Worker API can accommodate this using the processIsolation() method that causes the work
to execute in a separate "worker daemon". These worker processes will be session-scoped and can
be reused within the same build session, but they won’t persist across builds. However, if system
resources get low, Gradle will stop unused worker daemons.

To utilize a worker daemon, use the processIsolation() method when creating the WorkQueue. You
may also want to configure custom settings for the new process:
buildSrc/src/main/java/[Link]

import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link].*;
import [Link];
import [Link].*;

import [Link];
import [Link];
import [Link];

abstract public class CreateMD5 extends SourceTask {

@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();

@Inject
abstract public WorkerExecutor getWorkerExecutor();

@TaskAction
public void createHashes() {

WorkQueue workQueue = getWorkerExecutor().processIsolation(workerSpec -> {
[Link]().from(getCodecClasspath());
[Link](options -> {
[Link]("64m"); ②
});
});

for (File sourceFile : getSource().getFiles()) {


Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile
.getName() + ".md5");
[Link]([Link], parameters -> {
[Link]().set(sourceFile);
parameters.getMD5File().set(md5File);
});
}
}
}

① Change the isolation mode to PROCESS.

② Set up the JavaForkOptions for the new process.


Now, you should be able to run your task, and it will work as expected but using worker daemons
instead:

$ gradle clean md5

> Task :md5


Generating MD5 for [Link]...
Generating MD5 for [Link]...
Generating MD5 for [Link]...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

Note that the execution time may be high. This is because Gradle has to start a new process for each
worker daemon, which is expensive.

However, if you run your task a second time, you will see that it runs much faster. This is because
the worker daemon(s) started during the initial build have persisted and are available for use
immediately during subsequent builds:

$ gradle clean md5

> Task :md5


Generating MD5 for [Link]...
Generating MD5 for [Link]...
Generating MD5 for [Link]...

BUILD SUCCESSFUL in 1s
3 actionable tasks: 3 executed

Isolation modes

Gradle provides three isolation modes that can be configured when creating a WorkQueue and are
specified using one of the following methods on WorkerExecutor:

[Link]()
This states that the work should be run in a thread with minimal isolation.
For instance, it will share the same classloader that the task is loaded from. This is the fastest
level of isolation.

[Link]()
This states that the work should be run in a thread with an isolated classloader.
The classloader will have the classpath from the classloader that the unit of work
implementation class was loaded from as well as any additional classpath entries added through
[Link]().
[Link]()
This states that the work should be run with a maximum isolation level by executing the work in
a separate process.
The classloader of the process will use the classpath from the classloader that the unit of work
was loaded from as well as any additional classpath entries added through
[Link](). Furthermore, the process will be a worker daemon that
will stay alive and can be reused for future work items with the same requirements. This
process can be configured with different settings than the Gradle JVM using
[Link]([Link]).

Worker Daemons

When using processIsolation(), Gradle will start a long-lived worker daemon process that can be
reused for future work items.

[Link]

// Create a WorkQueue with process isolation


val workQueue = [Link]() {
// Configure the options for the forked process
forkOptions {
maxHeapSize = "512m"
systemProperty("[Link]", "true")
}
}

// Create and submit a unit of work for each file


[Link] { file ->
[Link](ReverseFile::class) {
fileToReverse = file
destinationDir = outputDir
}
}

[Link]

// Create a WorkQueue with process isolation


WorkQueue workQueue = [Link]() { ProcessWorkerSpec spec ->
// Configure the options for the forked process
forkOptions { JavaForkOptions options ->
[Link] = "512m"
[Link] "[Link]", "true"
}
}
// Create and submit a unit of work for each file
[Link] { file ->
[Link]([Link]) { ReverseParameters parameters ->
[Link] = file
[Link] = outputDir
}
}

When a unit of work for a worker daemon is submitted, Gradle will first look to see if a compatible,
idle daemon already exists. If so, it will send the unit of work to the idle daemon, marking it as
busy. If not, it will start a new daemon. When evaluating compatibility, Gradle looks at a number of
criteria, all of which can be controlled through
[Link]([Link]).

By default, a worker daemon starts with a maximum heap of 512MB. This can be changed by
adjusting the workers' fork options.

executable
A daemon is considered compatible only if it uses the same Java executable.

classpath
A daemon is considered compatible if its classpath contains all the classpath entries requested.
Note that a daemon is considered compatible only if the classpath exactly matches the requested
classpath.

heap settings
A daemon is considered compatible if it has at least the same heap size settings as requested.
In other words, a daemon that has higher heap settings than requested would be considered
compatible.

jvm arguments
A daemon is compatible if it has set all the JVM arguments requested.
Note that a daemon is compatible if it has additional JVM arguments beyond those requested
(except for those treated especially, such as heap settings, assertions, debug, etc.).

system properties
A daemon is considered compatible if it has set all the system properties requested with the
same values.
Note that a daemon is compatible if it has additional system properties beyond those requested.

environment variables
A daemon is considered compatible if it has set all the environment variables requested with the
same values.
Note that a daemon is compatible if it has more environment variables than requested.

bootstrap classpath
A daemon is considered compatible if it contains all the bootstrap classpath entries requested.
Note that a daemon is compatible if it has more bootstrap classpath entries than requested.

debug
A daemon is considered compatible only if debug is set to the same value as requested (true or
false).

enable assertions
A daemon is considered compatible only if enable assertions are set to the same value as
requested (true or false).

default character encoding


A daemon is considered compatible only if the default character encoding is set to the same
value as requested.
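These rules can be summarized in a small compatibility check. The sketch below models a subset of the documented criteria; `DaemonSpec`, `isCompatible`, and all field names are invented for illustration, and Gradle's real logic lives in its internals:

```java
import java.util.Map;
import java.util.Set;

public class DaemonCompat {

    // A simplified model of a worker daemon's settings (illustrative only).
    record DaemonSpec(String executable, int maxHeapMb, boolean debug,
                      Map<String, String> systemProperties,
                      Set<String> jvmArguments) {}

    // Apply a subset of the documented compatibility rules.
    static boolean isCompatible(DaemonSpec running, DaemonSpec requested) {
        return running.executable().equals(requested.executable())   // same Java executable
                && running.maxHeapMb() >= requested.maxHeapMb()      // at least the requested heap
                && running.debug() == requested.debug()              // debug must match exactly
                // all requested system properties present with the same values (extras allowed)
                && requested.systemProperties().entrySet().stream()
                        .allMatch(e -> e.getValue().equals(
                                running.systemProperties().get(e.getKey())))
                // all requested JVM arguments present (extras allowed)
                && running.jvmArguments().containsAll(requested.jvmArguments());
    }
}
```

Note how the criteria mix "superset is fine" checks (heap, system properties, JVM arguments) with exact-match checks (executable, debug).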

Worker daemons will remain running until the build daemon that started them is stopped or
system memory becomes scarce. When system memory is low, Gradle will stop worker daemons to
minimize memory consumption.

NOTE: A step-by-step description of converting a normal task action to use the Worker API can be found in the section on developing parallel tasks.

Cancellation and timeouts

To support cancellation (e.g., when the user stops the build with CTRL+C) and task timeouts, custom
tasks should react to interrupting their executing thread. The same is true for work items submitted
via the worker API. If a task does not respond to an interrupt within 10s, the daemon will shut
down to free up system resources.
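In practice, reacting to interruption means long-running loops in a task action or work item should poll the thread's interrupted status. A minimal sketch (the class and method names are invented for illustration):

```java
public class InterruptibleWork {

    // Process items until done, but stop promptly if the thread is interrupted,
    // e.g. when the build is cancelled or the task times out.
    static int processUntilInterrupted(int totalItems) {
        int processed = 0;
        for (int i = 0; i < totalItems; i++) {
            if (Thread.currentThread().isInterrupted()) {
                break; // react to cancellation instead of ignoring it
            }
            processed++; // placeholder for real per-item work
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(processUntilInterrupted(10)); // prints 10 when not interrupted
    }
}
```

Blocking calls that throw InterruptedException should likewise propagate the interruption rather than swallow it.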

Advanced Tasks
Incremental tasks

In Gradle, implementing a task that skips execution when its inputs and outputs are already UP-TO-DATE is simple and efficient, thanks to the Incremental Build feature.

However, there are times when only a few input files have changed since the last execution, and it
is best to avoid reprocessing all the unchanged inputs. This situation is common in tasks that
transform input files into output files on a one-to-one basis.

To optimize your build process, you can use an incremental task. This approach ensures that only
out-of-date input files are processed, improving build performance.

Implementing an incremental task

For a task to process inputs incrementally, that task must contain an incremental task action.

This is a task action method that has a single InputChanges parameter. That parameter tells Gradle
that the action only wants to process the changed inputs.
In addition, the task needs to declare at least one incremental file input property by using either
@Incremental or @SkipWhenEmpty:

[Link]

public class IncrementalReverseTask : DefaultTask() {

@get:Incremental
@get:InputDirectory
val inputDir: DirectoryProperty = [Link]()

@get:OutputDirectory
val outputDir: DirectoryProperty = [Link]()

@get:Input
val inputProperty: RegularFileProperty = [Link]()
// File input property

@TaskAction
fun execute(inputs: InputChanges) { // InputChanges parameter
val msg = if ([Link]) "CHANGED inputs are out of date"
else "ALL inputs are out of date"
println(msg)
}
}

[Link]

class IncrementalReverseTask extends DefaultTask {

@Incremental
@InputDirectory
def File inputDir

@OutputDirectory
def File outputDir

@Input
def inputProperty // File input property

@TaskAction
void execute(InputChanges inputs) { // InputChanges parameter
println [Link] ? "CHANGED inputs are out of date"
: "ALL inputs are out of date"
}
}

IMPORTANT: To query incremental changes for an input file property, that property must always return the same instance. The easiest way to accomplish this is to use one of the following property types: RegularFileProperty, DirectoryProperty, or ConfigurableFileCollection.

You can learn more about RegularFileProperty and DirectoryProperty in Lazy Configuration.

The incremental task action can use [Link]() to find out what files have
changed for a given file-based input property, be it of type RegularFileProperty, DirectoryProperty
or ConfigurableFileCollection.

The method returns an Iterable of type FileChanges, which in turn can be queried for the
following:

• the affected file

• the change type (ADDED, REMOVED or MODIFIED)

• the normalized path of the changed file

• the file type of the changed file

The following example demonstrates an incremental task that has a directory input. It assumes that
the directory contains a collection of text files and copies them to an output directory, reversing the
text within each file:

[Link]

abstract class IncrementalReverseTask : DefaultTask() {


@get:Incremental
@get:PathSensitive(PathSensitivity.NAME_ONLY)
@get:InputDirectory
abstract val inputDir: DirectoryProperty

@get:OutputDirectory
abstract val outputDir: DirectoryProperty

@get:Input
abstract val inputProperty: Property<String>

@TaskAction
fun execute(inputChanges: InputChanges) {
println(
if ([Link]) "Executing incrementally"
else "Executing non-incrementally"
)

[Link](inputDir).forEach { change ->


if ([Link] == [Link]) return@forEach
println("${[Link]}: ${[Link]}")
val targetFile =
[Link]([Link]).get().asFile
if ([Link] == [Link]) {
[Link]()
} else {
[Link]([Link]().reversed())
}
}
}
}

[Link]

abstract class IncrementalReverseTask extends DefaultTask {


@Incremental
@PathSensitive(PathSensitivity.NAME_ONLY)
@InputDirectory
abstract DirectoryProperty getInputDir()

@OutputDirectory
abstract DirectoryProperty getOutputDir()

@Input
abstract Property<String> getInputProperty()

@TaskAction
void execute(InputChanges inputChanges) {
println([Link]
? 'Executing incrementally'
: 'Executing non-incrementally'
)

[Link](inputDir).each { change ->


if ([Link] == [Link]) return

println "${[Link]}: ${[Link]}"


def targetFile = [Link]([Link]).get()
.asFile
if ([Link] == [Link]) {
[Link]()
} else {
[Link] = [Link]()
}
}
}
}

NOTE: The type of the inputDir property, its annotations, and the execute() action use getFileChanges() to process the subset of files that have changed since the last build. The action deletes a target file if the corresponding input file has been removed.

If, for some reason, the task is executed non-incrementally (by running with --rerun-tasks, for
example), all files are reported as ADDED, irrespective of the previous state. In this case, Gradle
automatically removes the previous outputs, so the incremental task must only process the given
files.

For a simple transformer task like the above example, the task action must generate output files for
any out-of-date inputs and delete output files for any removed inputs.
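The per-change bookkeeping for such a transformer can be sketched independently of Gradle. Here the change types are modeled with a plain enum mirroring Gradle's ChangeType; the class and method names are otherwise invented for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReverseTransformer {

    enum ChangeType { ADDED, MODIFIED, REMOVED } // mirrors Gradle's ChangeType

    // Apply one reported change: (re)generate the output for changed inputs,
    // delete the output when the input was removed.
    static void applyChange(ChangeType type, Path input, Path outputDir) throws IOException {
        Path target = outputDir.resolve(input.getFileName());
        if (type == ChangeType.REMOVED) {
            Files.deleteIfExists(target);
        } else {
            String reversed = new StringBuilder(Files.readString(input)).reverse().toString();
            Files.writeString(target, reversed);
        }
    }
}
```

Handling ADDED and MODIFIED identically, plus deleting on REMOVED, is exactly the contract an incremental one-to-one transformer must satisfy.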

IMPORTANT: A task may only contain a single incremental task action.

Which inputs are considered out of date?

When a task has been previously executed, and the only changes since that execution are to
incremental input file properties, Gradle can intelligently determine which input files need to be
processed, a concept known as incremental execution.

In this scenario, the [Link]() method, available in the


[Link] class, provides details for all input files associated with the given
property that have been ADDED, REMOVED or MODIFIED.

However, there are many cases where Gradle cannot determine which input files need to be
processed (i.e., non-incremental execution). Examples include:

• There is no history available from a previous execution.

• You are building with a different version of Gradle. Currently, Gradle does not use task history
from a different version.

• An upToDateWhen criterion added to the task returns false.

• An input property has changed since the previous execution.

• A non-incremental input file property has changed since the previous execution.

• One or more output files have changed since the previous execution.

In these cases, Gradle will report all input files as ADDED, and the getFileChanges() method will
return details for all the files that comprise the given input property.

You can check if the task execution is incremental or not with the [Link]()
method.

An incremental task in action

Consider an instance of IncrementalReverseTask executed against a set of inputs for the first time.

In this case, all inputs will be considered ADDED, as shown here:


[Link]

[Link]<IncrementalReverseTask>("incrementalReverse") {
inputDir = file("inputs")
outputDir = [Link]("outputs")
inputProperty = [Link]("taskInputProperty") as String? ?: "original"
}

[Link]

[Link]('incrementalReverse', IncrementalReverseTask) {
inputDir = file('inputs')
outputDir = [Link]("outputs")
inputProperty = [Link]['taskInputProperty'] ?: 'original'
}

The build layout:

.
├── [Link]
└── inputs
├── [Link]
├── [Link]
└── [Link]

$ gradle -q incrementalReverse
Executing non-incrementally
ADDED: [Link]
ADDED: [Link]
ADDED: [Link]

Naturally, when the task is executed again with no changes, then the entire task is UP-TO-DATE, and
the task action is not executed:

$ gradle incrementalReverse
> Task :incrementalReverse UP-TO-DATE

BUILD SUCCESSFUL in 0s
1 actionable task: 1 up-to-date
When an input file is modified in some way or a new input file is added, then re-executing the task
results in those files being returned by [Link]().

The following example modifies the content of one file and adds another before running the
incremental task:

[Link]

[Link]("updateInputs") {
val inputsDir = [Link]("inputs")
[Link](inputsDir)
doLast {
[Link]("[Link]").[Link]("Changed content for existing file 1.")
[Link]("[Link]").[Link]("Content for new file 4.")
}
}

[Link]

[Link]('updateInputs') {
def inputsDir = [Link]('inputs')
[Link](inputsDir)
doLast {
[Link]('[Link]').[Link] = 'Changed content for existing file 1.'
[Link]('[Link]').[Link] = 'Content for new file 4.'
}
}

$ gradle -q updateInputs incrementalReverse


Executing incrementally
MODIFIED: [Link]
ADDED: [Link]

NOTE: The various mutation tasks (updateInputs, removeInput, etc.) are only present to demonstrate the behavior of incremental tasks. They should not be viewed as the kinds of tasks or task implementations you should have in your own build scripts.

When an existing input file is removed, then re-executing the task results in that file being returned
by [Link]() as REMOVED.

The following example removes one of the existing files before executing the incremental task:
[Link]

[Link]<Delete>("removeInput") {
delete("inputs/[Link]")
}

[Link]

[Link]('removeInput', Delete) {
delete 'inputs/[Link]'
}

$ gradle -q removeInput incrementalReverse


Executing incrementally
REMOVED: [Link]

Gradle cannot determine which input files are out-of-date when an output file is deleted (or
modified). In this case, details for all the input files for the given property are returned by
[Link]().

The following example removes one of the output files from the build directory. However, all the
input files are considered to be ADDED:

[Link]

[Link]<Delete>("removeOutput") {
delete([Link]("outputs/[Link]"))
}

[Link]

[Link]('removeOutput', Delete) {
delete [Link]("outputs/[Link]")
}

$ gradle -q removeOutput incrementalReverse


Executing non-incrementally
ADDED: [Link]
ADDED: [Link]
ADDED: [Link]

The last scenario we want to cover concerns what happens when a non-file-based input property is
modified. In such cases, Gradle cannot determine how the property impacts the task outputs, so the
task is executed non-incrementally. This means that all input files for the given property are
returned by [Link]() and they are all treated as ADDED.

The following example sets the project property taskInputProperty to a new value when running
the incrementalReverse task. That project property is used to initialize the task’s inputProperty
property, as you can see in the first example of this section.

Here is the expected output in this case:

$ gradle -q -PtaskInputProperty=changed incrementalReverse


Executing non-incrementally
ADDED: [Link]
ADDED: [Link]
ADDED: [Link]

Command Line options

Sometimes, a user wants to declare the value of an exposed task property on the command line
instead of the build script. Passing property values on the command line is particularly helpful if
they change more frequently.

The task API supports a mechanism for marking a property to automatically generate a
corresponding command line parameter with a specific name at runtime.

Step 1. Declare a command-line option

To expose a new command line option for a task property, annotate the corresponding setter
method of a property with Option:

@Option(option = "flag", description = "Sets the flag")

An option requires a mandatory identifier. You can provide an optional description.

A task can expose as many command line options as properties available in the class.

Options may be declared in superinterfaces of the task class as well. If multiple interfaces declare
the same property but with different option flags, they will both work to set the property.

In the example below, the custom task UrlVerify verifies whether a URL can be resolved by making
an HTTP call and checking the response code. The URL to be verified is configurable through the
property url. The setter method for the property is annotated with @Option:

[Link]

import [Link];

public class UrlVerify extends DefaultTask {


private String url;

@Option(option = "url", description = "Configures the URL to be verified.")


public void setUrl(String url) {
[Link] = url;
}

@Input
public String getUrl() {
return url;
}

@TaskAction
public void verify() {
getLogger().quiet("Verifying URL '{}'", url);

// verify URL by making a HTTP call


}
}

All options declared for a task can be rendered as console output by running the help task with the
--task option.

Step 2. Use an option on the command line

There are a few rules for options on the command line:

• The option uses a double-dash as a prefix, e.g., --url. A single dash does not qualify as valid
syntax for a task option.

• The option argument follows directly after the task declaration, e.g., verifyUrl
--url=[Link]

• Multiple task options can be declared in any order on the command line following the task
name.

Building upon the earlier example, the build script creates a task instance of type UrlVerify and
provides a value from the command line through the exposed option:

[Link]

[Link]<UrlVerify>("verifyUrl")
[Link]

[Link]('verifyUrl', UrlVerify)

$ gradle -q verifyUrl --url=[Link]


Verifying URL '[Link]'

Supported data types for options

Gradle limits the data types that can be used for declaring command line options.

The use of the command line differs per type:

boolean, Boolean, Property<Boolean>


Describes an option with the value true or false.
Passing the option on the command line treats the value as true. For example, --foo equates to
true.
The absence of the option uses the default value of the property. For each boolean option, an
opposite option is created automatically. For example, --no-foo is created for the provided
option --foo and --bar is created for --no-bar. Options whose name starts with --no are disabled
options and set the option value to false. An opposite option is only created if no option with the
same name already exists for the task.

Double, Property<Double>
Describes an option with a double value.
Passing the option on the command line also requires a value, e.g., --factor=2.2 or --factor 2.2.

Integer, Property<Integer>
Describes an option with an integer value.
Passing the option on the command line also requires a value, e.g., --network-timeout=5000 or
--network-timeout 5000.

Long, Property<Long>
Describes an option with a long value.
Passing the option on the command line also requires a value, e.g., --threshold=2147483648 or
--threshold 2147483648.

String, Property<String>
Describes an option with an arbitrary String value.
Passing the option on the command line also requires a value, e.g., --container-id=2x94held or
--container-id 2x94held.

enum, Property<enum>
Describes an option as an enumerated type.
Passing the option on the command line also requires a value, e.g., --log-level=DEBUG or
--log-level debug.
The value is not case-sensitive.

List<T> where T is Double, Integer, Long, String, enum


Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123
--image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space
character, are currently not supported.

ListProperty<T>, SetProperty<T> where T is Double, Integer, Long, String, enum


Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123
--image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space
character, are currently not supported.

DirectoryProperty, RegularFileProperty
Describes an option with a file system element.
Passing the option on the command line also requires a value representing a path, e.g.,
--output-file=[Link] or --output-dir outputDir.
Relative paths are resolved relative to the project directory of the project that owns this property
instance. See [Link]().
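Several of these rules can be illustrated with a toy parser: boolean flags and their generated --no- opposites, case-insensitive enum values, and repeated declarations for list options. All names below are invented for illustration; Gradle generates this handling automatically from the @Option annotations:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class OptionParsing {

    enum LogLevel { DEBUG, INFO, WARN }

    // Boolean: "--foo" means true, the generated opposite "--no-foo" means false.
    static boolean parseBoolean(String arg, String name) {
        if (arg.equals("--" + name)) return true;
        if (arg.equals("--no-" + name)) return false;
        throw new IllegalArgumentException("Unknown option: " + arg);
    }

    // Enum: the value is matched case-insensitively.
    static LogLevel parseLogLevel(String value) {
        return LogLevel.valueOf(value.toUpperCase(Locale.ROOT));
    }

    // List: each "--image-id=<value>" declaration appends one element.
    static List<String> parseImageIds(List<String> args) {
        List<String> ids = new ArrayList<>();
        for (String arg : args) {
            if (arg.startsWith("--image-id=")) {
                ids.add(arg.substring("--image-id=".length()));
            }
        }
        return ids;
    }
}
```

For example, `--log-level=debug` and `--log-level=DEBUG` resolve to the same enum constant, and `--image-id=123 --image-id=456` yields a two-element list.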

Documenting available values for an option

Theoretically, an option for a property type String or List<String> can accept any arbitrary value.
Accepted values for such an option can be documented programmatically with the help of the
annotation OptionValues:

@OptionValues('file')

This annotation may be assigned to any method that returns a List of one of the supported data
types. You need to specify an option identifier to indicate the relationship between the option and
available values.

NOTE: Passing a value on the command line not supported by the option does not fail the build or throw an exception. You must implement custom logic for such behavior in the task action.

The example below demonstrates the use of multiple options for a single task. The task
implementation provides a list of available values for the option output-type:

[Link]

import [Link];
import [Link];
public abstract class UrlProcess extends DefaultTask {
private String url;
private OutputType outputType;

@Input
@Option(option = "http", description = "Configures the http protocol to be allowed.")
public abstract Property<Boolean> getHttp();

@Option(option = "url", description = "Configures the URL to send the request to.")
public void setUrl(String url) {
if (!getHttp().getOrElse(true) && [Link]("[Link]")) {
throw new IllegalArgumentException("HTTP is not allowed");
} else {
[Link] = url;
}
}

@Input
public String getUrl() {
return url;
}

@Option(option = "output-type", description = "Configures the output type.")


public void setOutputType(OutputType outputType) {
[Link] = outputType;
}

@OptionValues("output-type")
public List<OutputType> getAvailableOutputTypes() {
return new ArrayList<OutputType>([Link]([Link]()));
}

@Input
public OutputType getOutputType() {
return outputType;
}

@TaskAction
public void process() {
getLogger().quiet("Writing out the URL response from '{}' to '{}'", url,
outputType);

// retrieve content from URL and write to output


}

private static enum OutputType {


CONSOLE, FILE
}
}

Listing command line options

Command line options using the annotations Option and OptionValues are self-documenting.

You will see declared options and their available values reflected in the console output of the help
task. The output renders options alphabetically, except for boolean disable options, which appear
following the enable option:

$ gradle -q help --task processUrl


Detailed task information for processUrl

Path
:processUrl

Type
UrlProcess (UrlProcess)

Options
--http Configures the http protocol to be allowed.

--no-http Disables option --http.

--output-type Configures the output type.


Available values are:
CONSOLE
FILE

--url Configures the URL to send the request to.

--rerun Causes the task to be re-run even if up-to-date.

Description
-

Group
-

Limitations

Support for declaring command line options currently comes with a few limitations.

• Command line options can only be declared for custom tasks via annotation. There’s no
programmatic equivalent for defining options.

• Options cannot be declared globally, e.g., on a project level or as part of a plugin.

• When assigning an option on the command line, the task exposing the option needs to be
spelled out explicitly, e.g., gradle check --tests abc does not work even though the check task
depends on the test task.

• If you specify a task option name that conflicts with the name of a built-in Gradle option, use the
-- delimiter before calling your task to reference that option. For more information, see
Disambiguate Task Options from Built-in Options.

Verification failures

Normally, exceptions thrown during task execution result in a failure that immediately terminates
a build. The outcome of the task will be FAILED, the result of the build will be FAILED, and no further
tasks will be executed. When running with the --continue flag, Gradle will continue to run other
requested tasks in the build after encountering a task failure. However, any tasks that depend on a
failed task will not be executed.

There is a special type of exception that behaves differently when downstream tasks only rely on
the outputs of a failing task. A task can throw a subtype of VerificationException to indicate that it
has failed in a controlled manner such that its output is still valid for consumers. A task depends on
the outcome of another task when it directly depends on it using dependsOn. When Gradle is run
with --continue, consumer tasks that depend on a producer task’s output (via a relationship
between task inputs and outputs) can still run after the producer fails.

A failed unit test, for instance, will cause a failing outcome for the test task. However, this doesn’t
prevent another task from reading and processing the (valid) test results the task produced.
Verification failures are used in exactly this manner by the Test Report Aggregation Plugin.

Verification failures are also useful for tasks that need to report a failure even after producing
useful output consumable by other tasks.

[Link]

val process = [Link]("process") {


val outputFile = [Link]("[Link]")
[Link](outputFile) ①

doLast {
val logFile = [Link]().asFile
[Link]("Step 1 Complete.") ②
throw VerificationException("Process failed!") ③
[Link]("Step 2 Complete.") ④
}
}

[Link]("postProcess") {
[Link](process) ⑤

doLast {
println("Results: ${[Link]()}") ⑥
}
}

[Link]

[Link]("process") {
def outputFile = [Link]("[Link]")
[Link](outputFile) ①

doLast {
def logFile = [Link]().asFile
logFile << "Step 1 Complete." ②
throw new VerificationException("Process failed!") ③
logFile << "Step 2 Complete." ④
}
}

[Link]("postProcess") {
[Link]([Link]("process")) ⑤

doLast {
println("Results: ${[Link]}") ⑥
}
}

$ gradle postProcess --continue


> Task :process FAILED

> Task :postProcess


Results: Step 1 Complete.
2 actionable tasks: 2 executed

FAILURE: Build failed with an exception.

① Register Output: The process task writes its output to a log file.

② Modify Output: The task writes to its output file as it executes.

③ Task Failure: The task throws a VerificationException and fails at this point.

④ Continue to Modify Output: This line never runs due to the exception stopping the task.

⑤ Consume Output: The postProcess task depends on the output of the process task due to using
that task’s outputs as its own inputs.

⑥ Use Partial Result: With the --continue flag set, Gradle still runs the requested postProcess task
despite the process task’s failure. postProcess can read and display the partial (though still valid)
result.
Using Shared Build Services
Shared build services allow tasks to share state or resources. For example, tasks might share a
cache of pre-computed values or use a web service or database instance.

A build service is an object that holds the state for tasks to use. It provides an alternative
mechanism for hooking into a Gradle build and receiving information about task execution and
operation completion.

Build services are configuration cacheable.

Gradle manages the service lifecycle, creating the service instance only when required and
cleaning it up when no longer needed. Gradle can also coordinate access to the build service,
ensuring that no more than a specified number of tasks use the service concurrently.

Implementing a build service

To implement a build service, create an abstract class that implements BuildService. Then, define
methods you want the tasks to use on this type.

abstract class BaseCountingService implements BuildService<CountingParams>, AutoCloseable {

A build service implementation is treated as a custom Gradle type and can use any of the features
available to custom Gradle types.

A build service can optionally take parameters, which Gradle injects into the service instance when
creating it. To provide parameters, you define an abstract class (or interface) that holds the
parameters. The parameters type must implement (or extend) BuildServiceParameters. The service
implementation can access the parameters using getParameters(). The parameters type is also
a custom Gradle type.

When the build service does not require any parameters, you can use
BuildServiceParameters.None as the type of parameters.
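For illustration, here is a minimal sketch of a parameter-less service; the class name and method below are hypothetical, not part of the manual's examples:

```java
// Hypothetical sketch: a service with no parameters uses
// BuildServiceParameters.None as the type argument.
public abstract class ClockService
        implements BuildService<BuildServiceParameters.None> {

    // A method for tasks to call
    public long currentTimeMillis() {
        return System.currentTimeMillis();
    }
}
```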

interface CountingParams extends BuildServiceParameters {
    Property<Integer> getInitial()
}

A build service implementation can also optionally implement AutoCloseable, in which case Gradle
will call the build service instance’s close() method when it discards the service instance. This
happens sometime between the completion of the last task that uses the build service and the end
of the build.

Here is an example of a service that takes parameters and is closeable:


WebServer.java

import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.services.BuildService;
import org.gradle.api.services.BuildServiceParameters;

import java.net.URI;
import java.net.URISyntaxException;

public abstract class WebServer implements BuildService<WebServer.Params>, AutoCloseable {

    // Some parameters for the web server
    interface Params extends BuildServiceParameters {
        Property<Integer> getPort();

        DirectoryProperty getResources();
    }

    private final URI uri;

    public WebServer() throws URISyntaxException {
        // Use the parameters
        int port = getParameters().getPort().get();
        uri = new URI(String.format("https://localhost:%d/", port));

        // Start the server ...

        System.out.println(String.format("Server is running at %s", uri));
    }

    // A public method for tasks to use
    public URI getUri() {
        return uri;
    }

    @Override
    public void close() {
        // Stop the server ...
    }
}

Note that you should not implement the BuildService.getParameters() method, as Gradle will
provide an implementation of this.

A build service implementation must be thread-safe, as it will potentially be used by multiple tasks
concurrently.
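As a sketch of what thread safety means here, any mutable state a service holds should be guarded for concurrent access. The class below is a hypothetical stand-in for the state inside a build service implementation, not Gradle API:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for state held by a build service; in a real
// service this field would live in a class implementing
// BuildService<BuildServiceParameters.None>.
class SharedCounter {
    private final AtomicInteger count = new AtomicInteger();

    // Safe to call from many tasks at once
    int increment() {
        return count.incrementAndGet();
    }
}
```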
Registering a build service and connecting it to a task

To create a build service, you register the service instance using the
BuildServiceRegistry.registerIfAbsent() method.

Registering the service does not create the service instance. This happens on demand when a task
first uses the service. The service instance will not be created if no task uses the service during a
build.

Currently, build services are scoped to a build, rather than a project, and these services are
available to be shared by the tasks of all projects. You can access the registry of shared build
services via Project.getGradle().getSharedServices().

Registering a build service to be consumed via @ServiceReference task properties

Here is an example of a plugin that registers the previous service when the task property
consuming the service is annotated with @ServiceReference:

DownloadPlugin.java

import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class DownloadPlugin implements Plugin<Project> {

    public void apply(Project project) {
        // Register the service
        project.getGradle().getSharedServices().registerIfAbsent("web", WebServer.class, spec -> {
            // Provide some parameters
            spec.getParameters().getPort().set(5005);
        });

        project.getTasks().register("download", Download.class, task -> {
            task.getOutputFile().set(project.getLayout().getBuildDirectory().file("result.zip"));
        });
    }
}

As you can see, there is no need to assign the build service provider returned by registerIfAbsent()
to the task; the service is automatically injected into all matching properties annotated
with @ServiceReference.

Here is an example of a task that consumes the previous service via a property annotated with
@ServiceReference:
Download.java

import org.gradle.api.DefaultTask;
import org.gradle.api.file.RegularFileProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.services.ServiceReference;
import org.gradle.api.tasks.OutputFile;
import org.gradle.api.tasks.TaskAction;

import java.net.URI;

public abstract class Download extends DefaultTask {

    // This property provides access to the service instance
    @ServiceReference("web")
    abstract Property<WebServer> getServer();

    @OutputFile
    abstract RegularFileProperty getOutputFile();

    @TaskAction
    public void download() {
        // Use the server to download a file
        WebServer server = getServer().get();
        URI uri = server.getUri().resolve("somefile.zip");
        System.out.println(String.format("Downloading %s", uri));
    }
}

Automatic matching of registered build services with service reference properties is done by type
and (optionally) by name (for properties that declare the name of the service they expect). If
multiple services match the requested service type (i.e., multiple services were registered for
the same type, and a service name was not provided in the @ServiceReference annotation), you will
also need to assign the shared build service provider manually to the task property.

Read on to compare that to when the task property consuming the service is instead annotated with
@Internal.

Registering a build service to be consumed via @Internal task properties

DownloadPlugin.java

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;

public class DownloadPlugin implements Plugin<Project> {

    public void apply(Project project) {
        // Register the service
        Provider<WebServer> serviceProvider = project.getGradle()
            .getSharedServices().registerIfAbsent("web", WebServer.class, spec -> {
                // Provide some parameters
                spec.getParameters().getPort().set(5005);
            });

        project.getTasks().register("download", Download.class, task -> {
            // Connect the service provider to the task
            task.getServer().set(serviceProvider);
            // Declare the association between the task and the service
            task.usesService(serviceProvider);
            task.getOutputFile().set(project.getLayout().getBuildDirectory().file("result.zip"));
        });
    }
}

In this case, the plugin registers the service and receives a Provider<WebServer> back. This provider
can be connected to task properties to pass the service to the task. Note that for a task property
annotated with @Internal, the task property needs to (1) be explicitly assigned with the provider
obtained during registration, and (2) you must tell Gradle the task uses the service via
Task.usesService(). None of that is needed when the task property consuming the service is
annotated with @ServiceReference.

Here is an example of a task that consumes the previous service via a property annotated with
@Internal:

Download.java

import org.gradle.api.DefaultTask;
import org.gradle.api.file.RegularFileProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.tasks.Internal;
import org.gradle.api.tasks.OutputFile;
import org.gradle.api.tasks.TaskAction;

import java.net.URI;

public abstract class Download extends DefaultTask {

    // This property provides access to the service instance
    @Internal
    abstract Property<WebServer> getServer();

    @OutputFile
    abstract RegularFileProperty getOutputFile();

    @TaskAction
    public void download() {
        // Use the server to download a file
        WebServer server = getServer().get();
        URI uri = server.getUri().resolve("somefile.zip");
        System.out.println(String.format("Downloading %s", uri));
    }
}

Note that using a service with any annotation other than @ServiceReference or @Internal is currently
not supported. For example, it is currently impossible to mark a service as an input to a task.

Using shared build services from configuration actions

Generally, build services are intended to be used by tasks, and as they usually represent some
potentially expensive state to create, you should avoid using them at configuration time. However,
sometimes, using the service at configuration time can make sense. This is possible; call get() on
the provider.
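For illustration, a hedged sketch of configuration-time use, reusing the WebServer service from the earlier examples (the plugin class name is hypothetical): calling get() on the provider during apply() forces the service instance to be created immediately rather than lazily, so reserve this for genuine configuration-time needs.

```java
// Hypothetical plugin: forces service creation at configuration time.
public class EagerPlugin implements Plugin<Project> {
    public void apply(Project project) {
        Provider<WebServer> serviceProvider = project.getGradle()
            .getSharedServices().registerIfAbsent("web", WebServer.class, spec -> {
                spec.getParameters().getPort().set(5005);
            });

        // get() creates the service instance now, during configuration
        WebServer server = serviceProvider.get();
        System.out.println("Configured against " + server.getUri());
    }
}
```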

Using a build service with the Worker API

In addition to using a build service from a task, you can use a build service from a Worker API
action, an artifact transform or another build service. To do this, pass the build service Provider as a
parameter of the consuming action or service, in the same way you pass other parameters to the
action or service.

For example, to pass a MyServiceType service to a Worker API action, you might add a property of
type Property<MyServiceType> to the action’s parameters object and then connect the
Provider<MyServiceType> that you receive when registering the service to this property:

Download.java

import org.gradle.api.DefaultTask;
import org.gradle.api.provider.Property;
import org.gradle.api.services.ServiceReference;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkAction;
import org.gradle.workers.WorkParameters;
import org.gradle.workers.WorkQueue;
import org.gradle.workers.WorkerExecutor;

import javax.inject.Inject;
import java.net.URI;

public abstract class Download extends DefaultTask {

    public static abstract class DownloadWorkAction implements WorkAction<DownloadWorkAction.Parameters> {
        interface Parameters extends WorkParameters {
            // This property provides access to the service instance from the work action
            abstract Property<WebServer> getServer();
        }

        @Override
        public void execute() {
            // Use the server to download a file
            WebServer server = getParameters().getServer().get();
            URI uri = server.getUri().resolve("somefile.zip");
            System.out.println(String.format("Downloading %s", uri));
        }
    }

    @Inject
    abstract public WorkerExecutor getWorkerExecutor();

    // This property provides access to the service instance from the task
    @ServiceReference("web")
    abstract Property<WebServer> getServer();

    @TaskAction
    public void download() {
        WorkQueue workQueue = getWorkerExecutor().noIsolation();
        workQueue.submit(DownloadWorkAction.class, parameter -> {
            parameter.getServer().set(getServer());
        });
    }
}

Currently, it is impossible to use a build service with a worker API action that uses ClassLoader or
process isolation modes.

Accessing the build service concurrently

You can constrain concurrent execution when you register the service, by using the Property object
returned from BuildServiceSpec.getMaxParallelUsages(). When this property has no value, which is
the default, Gradle does not constrain access to the service. When this property has a value > 0,
Gradle will allow no more than the specified number of tasks to use the service concurrently.
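For example, a registration that limits the service to two concurrent users might look like this (a sketch based on the WebServer service from the earlier examples):

```java
// Sketch: at most 2 tasks may use the service at the same time.
project.getGradle().getSharedServices().registerIfAbsent("web", WebServer.class, spec -> {
    spec.getParameters().getPort().set(5005);
    spec.getMaxParallelUsages().set(2);
});
```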

IMPORTANT: When the consuming task property is annotated with @Internal, for the
constraint to take effect, the build service must be registered with the
consuming task via Task.usesService(). Note that, at this time, Gradle cannot
discover indirect usage of services (for instance, if an additional service is
used only by a service that the task uses directly). As a workaround, indirect
usage may be declared explicitly to Gradle by either adding a
@ServiceReference property to the task and assigning the indirectly used
service to it (making it a direct reference), or by invoking
Task.usesService().
Receiving information about task execution

A build service can be used to receive events as tasks are executed. To do this, create and register a
build service that implements OperationCompletionListener:

TaskEventsService.java

import org.gradle.api.services.BuildService;
import org.gradle.api.services.BuildServiceParameters;
import org.gradle.tooling.events.FinishEvent;
import org.gradle.tooling.events.OperationCompletionListener;
import org.gradle.tooling.events.task.TaskFinishEvent;

public abstract class TaskEventsService implements BuildService<BuildServiceParameters.None>,
        OperationCompletionListener { ①

    @Override
    public void onFinish(FinishEvent finishEvent) {
        if (finishEvent instanceof TaskFinishEvent) { ②
            // Handle task finish event...
        }
    }
}

① Implement the OperationCompletionListener interface and the BuildService interface.

② Check if the finish event is a TaskFinishEvent.

Then, in the plugin, you can use the methods on the BuildEventsListenerRegistry service to start
receiving events:

TaskEventsPlugin.java

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;
import org.gradle.build.event.BuildEventsListenerRegistry;

import javax.inject.Inject;

public abstract class TaskEventsPlugin implements Plugin<Project> {

    @Inject
    public abstract BuildEventsListenerRegistry getEventsListenerRegistry(); ①

    @Override
    public void apply(Project project) {
        Provider<TaskEventsService> serviceProvider =
            project.getGradle().getSharedServices().registerIfAbsent(
                "taskEvents", TaskEventsService.class, spec -> {}); ②

        getEventsListenerRegistry().onTaskCompletion(serviceProvider); ③
    }
}

① Use service injection to obtain an instance of the BuildEventsListenerRegistry.

② Register the build service as usual.

③ Use the service Provider to subscribe the build service to task completion events.

[1] You might be wondering why there is neither an import for the StopExecutionException nor do we access it via its fully qualified
name. The reason is that Gradle adds a set of default imports to your script (see Default imports).
DEVELOPING PLUGINS
Understanding Plugins
Gradle comes with a set of powerful core systems such as dependency management, task execution,
and project configuration. But everything else it can do is supplied by plugins.

Plugins encapsulate logic for specific tasks or integrations, such as compiling code, running tests, or
deploying artifacts. By applying plugins, users can easily add new features to their build process
without having to write complex code from scratch.

This plugin-based approach allows Gradle to be lightweight and modular. It also promotes code
reuse and maintainability, as plugins can be shared across projects or within an organization.

Before reading this chapter, it’s recommended that you first read Learning The Basics and complete
the Tutorial.

Plugins Introduction

Plugins can be sourced from Gradle or the Gradle community. But when users want to organize
their build logic or need specific build capabilities not provided by existing plugins, they can
develop their own.

As such, we distinguish between three different kinds of plugins:

1. Core Plugins - plugins that come from Gradle.

2. Community Plugins - plugins that come from Gradle Plugin Portal or a public repository.

3. Local or Custom Plugins - plugins that you develop yourself.

Core Plugins

The term core plugin refers to a plugin that is part of the Gradle distribution such as the Java
Library Plugin. They are always available.

Community Plugins

The term community plugin refers to a plugin published to the Gradle Plugin Portal (or another
public repository) such as the Spotless Plugin.

Local or Custom Plugins

The term local or custom plugin refers to a plugin you write yourself for your own build.

Custom plugins

There are three types of custom plugins:


1. Script plugins. Location: a .gradle(.kts) script file. Most likely: a local plugin. Benefit: the
plugin is automatically compiled and included in the classpath of the build script.

2. Precompiled script plugins. Location: the buildSrc folder or a composite build. Most likely: a
convention plugin. Benefit: the plugin is automatically compiled, tested, and available on the
classpath of the build script. The plugin is visible to every build script used by the build.

3. Binary plugins. Location: a standalone project. Most likely: a shared plugin. Benefit: a plugin
JAR is produced and published. The plugin can be used in multiple builds and shared with others.

Script plugins

Script plugins are typically small, local plugins written in script files for tasks specific to a single
build or project. They do not need to be reused across multiple projects. Script plugins are not
recommended but many other forms of plugins evolve from script plugins.

To create a plugin, you need to write a class that implements the Plugin interface.

The following sample creates a GreetingPlugin, which adds a hello task to a project when applied:

build.gradle.kts

class GreetingPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        project.tasks.register("hello") {
            doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}
// Apply the plugin
apply<GreetingPlugin>()

build.gradle

class GreetingPlugin implements Plugin<Project> {

    void apply(Project project) {
        project.tasks.register('hello') {
            doLast {
                println 'Hello from the GreetingPlugin'
            }
        }
    }
}

// Apply the plugin
apply plugin: GreetingPlugin

$ gradle -q hello
Hello from the GreetingPlugin

The Project object is passed as a parameter in apply(), which the plugin can use to configure the
project however it needs to (such as adding tasks, configuring dependencies, etc.). In this example,
the plugin is written directly in the build file which is not a recommended practice.

When the plugin is written in a separate script file, it can be applied using apply(from =
"file_name.gradle.kts") or apply from: 'file_name.gradle'. In the example below, the plugin is
coded in the other.gradle(.kts) script file. Then, the other.gradle(.kts) is applied to
build.gradle(.kts) using apply from:

other.gradle.kts

class GreetingScriptPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        project.tasks.register("hi") {
            doLast {
                println("Hi from the GreetingScriptPlugin")
            }
        }
    }
}
// Apply the plugin
apply<GreetingScriptPlugin>()

other.gradle

class GreetingScriptPlugin implements Plugin<Project> {

    void apply(Project project) {
        project.tasks.register('hi') {
            doLast {
                println 'Hi from the GreetingScriptPlugin'
            }
        }
    }
}

// Apply the plugin
apply plugin: GreetingScriptPlugin

build.gradle.kts

apply(from = "other.gradle.kts")

build.gradle

apply from: 'other.gradle'

$ gradle -q hi
Hi from the GreetingScriptPlugin

Script plugins should be avoided.

Precompiled script plugins

Precompiled script plugins are compiled into class files and packaged into a JAR before they are
executed. These plugins use the Groovy DSL or Kotlin DSL instead of pure Java, Kotlin, or Groovy.
They are best used as convention plugins that share build logic across projects or as a way to
neatly organize build logic.

To create a precompiled script plugin, you can:


1. Use Gradle’s Kotlin DSL - The plugin is a .gradle.kts file, and you apply the kotlin-dsl plugin.

2. Use Gradle’s Groovy DSL - The plugin is a .gradle file, and you apply the groovy-gradle-plugin plugin.

To apply a precompiled script plugin, you need to know its ID. The ID is derived from the plugin
script’s filename and its (optional) package declaration.

For example, the script src/main/*/some-java-library.gradle(.kts) has a plugin ID of
some-java-library (assuming it has no package declaration). Likewise, src/main/*/my/some-java-
library.gradle(.kts) has a plugin ID of my.some-java-library as long as it has a package declaration
of my.

Precompiled script plugin names have two important limitations:

• They cannot start with org.gradle.

• They cannot have the same name as a core plugin.

When the plugin is applied to a project, Gradle creates an instance of the plugin class and calls the
instance’s Plugin.apply() method.

NOTE A new instance of a Plugin is created within each project applying that plugin.

Let’s rewrite the GreetingPlugin script plugin as a precompiled script plugin. Since we are using the
Groovy or Kotlin DSL, the file essentially becomes the plugin. The original script plugin simply
created a hello task which printed a greeting; this is what we will do in the pre-compiled script
plugin:

buildSrc/src/main/kotlin/GreetingPlugin.gradle.kts

tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}

buildSrc/src/main/groovy/GreetingPlugin.gradle

tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}

The GreetingPlugin can now be applied in other subprojects' builds by using its ID:
app/build.gradle.kts

plugins {
application
id("GreetingPlugin")
}

app/build.gradle

plugins {
id 'application'
id('GreetingPlugin')
}

$ gradle -q hello
Hello from the convention GreetingPlugin

Convention plugins

A convention plugin is typically a precompiled script plugin that configures existing core and
community plugins with your own conventions (i.e. default values) such as setting the Java version
by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are
also used to enforce project standards and help streamline the build process. They can apply and
configure plugins, create new tasks and extensions, set dependencies, and much more.

Let’s take an example build with three subprojects: one for data-model, one for database-logic and
one for app code. The project has the following structure:

.
├── buildSrc
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── data-model
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── database-logic
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── app
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts

The build file of the database-logic subproject is as follows:

database-logic/build.gradle.kts

plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm") version "2.0.21"
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain(11)
}

// More build logic

database-logic/build.gradle

plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm' version '2.0.21'
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}

// More build logic

We apply the java-library plugin and add the org.jetbrains.kotlin.jvm plugin for Kotlin support.
We also configure Kotlin, Java, tests and more.

Our build file is beginning to grow…

The more plugins we apply and the more plugins we configure, the larger it gets. There’s also
repetition in the build files of the app and data-model subprojects, especially when configuring
common extensions like setting the Java version and Kotlin support.

To address this, we use convention plugins. This allows us to avoid repeating configuration in each
build file and keeps our build scripts more concise and maintainable. In convention plugins, we can
encapsulate arbitrary build configuration or custom build logic.

To develop a convention plugin, we recommend using buildSrc – which represents a completely
separate Gradle build. buildSrc has its own settings file to define where dependencies of this build
are located.

We add a Kotlin script called my-java-library.gradle.kts inside the buildSrc/src/main/kotlin
directory. Or conversely, a Groovy script called my-java-library.gradle inside the
buildSrc/src/main/groovy directory. We put all the plugin application and configuration from the
database-logic build file into it:

buildSrc/src/main/kotlin/my-java-library.gradle.kts

plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm")
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain(11)
}

buildSrc/src/main/groovy/my-java-library.gradle

plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm'
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}

The name of the file my-java-library is the ID of our brand-new plugin, which we can now use in all
of our subprojects.

TIP: Why is the version of id 'org.jetbrains.kotlin.jvm' missing? See Applying External
Plugins to Pre-Compiled Script Plugins.

The database-logic build file becomes much simpler by removing all the redundant build logic and
applying our convention my-java-library plugin instead:
database-logic/build.gradle.kts

plugins {
id("my-java-library")
}

database-logic/build.gradle

plugins {
id('my-java-library')
}

This convention plugin enables us to easily share common configurations across all our build files.
Any modifications can be made in one place, simplifying maintenance.

Binary plugins

Binary plugins in Gradle are plugins that are built as standalone JAR files and applied to a project
using the plugins{} block in the build script.

Let’s move our GreetingPlugin to a standalone project so that we can publish it and share it with
others. The plugin is essentially moved from the buildSrc folder to its own build called greeting-
plugin.

NOTE: You can publish the plugin from buildSrc, but this is not recommended practice.
Plugins that are ready for publication should be in their own build.

greeting-plugin is simply a Java project that produces a JAR containing the plugin classes.

The easiest way to package and publish a plugin to a repository is to use the Gradle Plugin
Development Plugin. This plugin provides the necessary tasks and configurations (including the
plugin metadata) to compile your script into a plugin that can be applied in other builds.

Here is a simple build script for the greeting-plugin project using the Gradle Plugin Development
Plugin:

build.gradle.kts

plugins {
    `java-gradle-plugin`
}

gradlePlugin {
    plugins {
        create("simplePlugin") {
            id = "org.example.greeting"
            implementationClass = "org.example.GreetingPlugin"
        }
    }
}

build.gradle

plugins {
    id 'java-gradle-plugin'
}

gradlePlugin {
    plugins {
        simplePlugin {
            id = 'org.example.greeting'
            implementationClass = 'org.example.GreetingPlugin'
        }
    }
}

For more on publishing plugins, see Publishing Plugins.

Project vs Settings vs Init plugins

In the example used through this section, the plugin accepts the Project type as a type parameter.
Alternatively, the plugin can accept a parameter of type Settings to be applied in a settings script, or
a parameter of type Gradle to be applied in an initialization script.

The difference between these types of plugins lies in the scope of their application:

Project Plugin
A project plugin is a plugin that is applied to a specific project in a build. It can customize the
build logic, add tasks, and configure the project-specific settings.

Settings Plugin
A settings plugin is a plugin that is applied in the settings.gradle or settings.gradle.kts file. It
can configure settings that apply to the entire build, such as defining which projects are
included in the build, configuring build script repositories, and applying common configurations
to all projects.

Init Plugin
An init plugin is a plugin that is applied in the init.gradle or init.gradle.kts file. It can
configure settings that apply globally to all Gradle builds on a machine, such as configuring the
Gradle version, setting up default repositories, or applying common plugins to all builds.
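As a sketch, a settings plugin differs from a project plugin only in its type parameter; the class name and the included project path below are hypothetical:

```java
// Hypothetical settings plugin: same Plugin interface,
// parameterized with Settings instead of Project.
public class MySettingsPlugin implements Plugin<Settings> {
    @Override
    public void apply(Settings settings) {
        // Configure build-wide concerns, e.g. which projects are included
        settings.include(":app");
    }
}
```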

Understanding Implementation Options for Plugins


The choice between script, precompiled script, or binary plugins depends on your specific
requirements and preferences.

Script Plugins are simple and easy to write. They are written in Kotlin DSL or Groovy DSL. They
are suitable for small, one-off tasks or for quick experimentation. However, they can become hard
to maintain as the build script grows in size and complexity.

Precompiled Script Plugins are Kotlin or Groovy DSL scripts compiled into Java class files
packaged in a library. They offer better performance and maintainability compared to script
plugins, and they can be reused across different projects. You can also write them in Groovy DSL
but that is not recommended.

Binary Plugins are full-fledged plugins written in Java, Groovy, or Kotlin, compiled into JAR files,
and published to a repository. They offer the best performance, maintainability, and reusability.
They are suitable for complex build logic that needs to be shared across projects, builds, and teams.
You can also write them in Scala or Groovy but that is not recommended.

Here is a breakdown of all options for implementing Gradle plugins:

1. Kotlin DSL, Script plugin: in a .gradle.kts file as an abstract class that implements the
apply(Project project) method of the Plugin<Project> interface. Recommended: No.[1]

2. Groovy DSL, Script plugin: in a .gradle file as an abstract class that implements the
apply(Project project) method of the Plugin<Project> interface. Recommended: No.[1]

3. Kotlin DSL, Pre-compiled script plugin: a .gradle.kts file. Recommended: Yes.

4. Groovy DSL, Pre-compiled script plugin: a .gradle file. Recommended: Ok.[2]

5. Java, Binary plugin: an abstract class that implements the apply(Project project) method of the
Plugin<Project> interface in Java. Recommended: Yes.

6. Kotlin / Kotlin DSL, Binary plugin: an abstract class that implements the apply(Project project)
method of the Plugin<Project> interface in Kotlin and/or Kotlin DSL. Recommended: Yes.

7. Groovy / Groovy DSL, Binary plugin: an abstract class that implements the apply(Project project)
method of the Plugin<Project> interface in Groovy and/or Groovy DSL. Recommended: Ok.[2]

8. Scala, Binary plugin: an abstract class that implements the apply(Project project) method of the
Plugin<Project> interface in Scala. Recommended: No.[2]
If you suspect issues with your plugin code, try creating a Build Scan to identify bottlenecks. The
Gradle profiler can help automate Build Scan generation and gather more low-level information.

Implementing Pre-compiled Script Plugins


A precompiled script plugin is typically a Kotlin script that has been compiled and distributed as
Java class files packaged in a library. These scripts are intended to be consumed as binary Gradle
plugins and are recommended for use as convention plugins.

A convention plugin is a plugin that normally configures existing core and community plugins
with your own conventions (i.e. default values) such as setting the Java version by using
java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are also used to
enforce project standards and help streamline the build process. They can apply and configure
plugins, create new tasks and extensions, set dependencies, and much more.

Setting the plugin ID

The plugin ID for a precompiled script is derived from its file name and optional package
declaration.

For example, a script named code-quality.gradle(.kts) located in src/main/groovy (or
src/main/kotlin) without a package declaration would be exposed as the code-quality plugin:

buildSrc/build.gradle.kts

plugins {
`kotlin-dsl`
}

app/build.gradle.kts

plugins {
id("code-quality")
}

buildSrc/build.gradle

plugins {
id 'groovy-gradle-plugin'
}

app/build.gradle

plugins {
id 'code-quality'
}

On the other hand, a script named code-quality.gradle.kts located in src/main/kotlin/my with the
package declaration my would be exposed as the my.code-quality plugin:

buildSrc/build.gradle.kts

plugins {
`kotlin-dsl`
}
app/build.gradle.kts

plugins {
id("my.code-quality")
}

IMPORTANT Groovy pre-compiled script plugins cannot have packages.

Making a plugin configurable using extensions

Extension objects are commonly used in plugins to expose configuration options and additional
functionality to build scripts.

When you apply a plugin that defines an extension, you can access the extension object and
configure its properties or call its methods to customize the behavior of the plugin or tasks
provided by the plugin.

A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.

Let’s update our greetings example:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Create extension object
interface GreetingPluginExtension {
    val message: Property<String>
}

// Add the 'greeting' extension object to project
val extension = project.extensions.create<GreetingPluginExtension>("greeting")

buildSrc/src/main/groovy/greetings.gradle

// Create extension object
interface GreetingPluginExtension {
    Property<String> getMessage()
}

// Add the 'greeting' extension object to project
def extension = project.extensions.create("greeting", GreetingPluginExtension)

You can set the value of the message property directly with extension.message.set("Hi from
Gradle").

However, the GreetingPluginExtension object becomes available as a project property with the same
name as the extension object. You can now access message like so:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Where the<GreetingPluginExtension>() is equivalent to
// project.extensions.getByType(GreetingPluginExtension::class.java)
the<GreetingPluginExtension>().message.set("Hi from Gradle")

buildSrc/src/main/groovy/greetings.gradle

extensions.findByType(GreetingPluginExtension).message.set("Hi from Gradle")

If you apply the greetings plugin, you can set the convention in your build script:

app/build.gradle.kts

plugins {
application
id("greetings")
}

greeting {
message = "Hello from Gradle"
}

app/[Link]

plugins {
id 'application'
id('greetings')
}

configure(greeting) {
message = "Hello from Gradle"
}
Adding default configuration as conventions

In plugins, you can define default values, also known as conventions, using the project object.

Convention properties are properties that are initialized with default values but can be overridden:

buildSrc/src/main/kotlin/[Link]

// Create extension object


interface GreetingPluginExtension {
val message: Property<String>
}

// Add the 'greeting' extension object to project


val extension =
[Link]<GreetingPluginExtension>("greeting")

// Set a default value for 'message'


[Link]("Hello from Gradle")

buildSrc/src/main/groovy/[Link]

// Create extension object


interface GreetingPluginExtension {
Property<String> getMessage()
}

// Add the 'greeting' extension object to project


def extension = [Link]("greeting",
GreetingPluginExtension)

// Set a default value for 'message'


[Link]("Hello from Gradle")

[Link](…) sets a convention for the message property of the extension. This
convention specifies that the value of message should default to "Hello from Gradle".

If the message property is not explicitly set, its value will be automatically set to "Hello from Gradle".
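The convention-then-override behavior can be sketched outside Gradle with a minimal stand-in for a lazy property. This is illustrative only: PropertySketch is an invented name, not Gradle's Property type.

```java
// Minimal stand-in for Gradle's convention behavior (hypothetical class, not
// org.gradle.api.provider.Property): an explicit set(...) wins over the
// convention default.
class PropertySketch<T> {
    private T value;
    private T convention;

    void convention(T defaultValue) { convention = defaultValue; }
    void set(T explicitValue) { value = explicitValue; }

    // Explicit value if present, otherwise the convention default.
    T get() { return value != null ? value : convention; }
}
```

As in the example above, reading the property before any explicit assignment yields the convention value, and an assignment from the build script replaces it.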

Mapping extension properties to task properties

Using an extension and mapping it to a custom task’s input/output properties is common in plugins.

In this example, the message property of the GreetingPluginExtension is mapped to the message
property of the GreetingTask as an input:
buildSrc/src/main/kotlin/[Link]

// Create extension object


interface GreetingPluginExtension {
val message: Property<String>
}

// Add the 'greeting' extension object to project


val extension =
[Link]<GreetingPluginExtension>("greeting")

// Set a default value for 'message'


[Link]("Hello from Gradle")

// Create a greeting task


abstract class GreetingTask : DefaultTask() {
@Input
val message = [Link]<String>()

@TaskAction
fun greet() {
println("Message: ${[Link]()}")
}
}

// Register the task and set the convention


[Link]<GreetingTask>("hello") {
[Link]([Link])
}

buildSrc/src/main/groovy/[Link]

// Create extension object


interface GreetingPluginExtension {
Property<String> getMessage()
}

// Add the 'greeting' extension object to project


def extension = [Link]("greeting",
GreetingPluginExtension)

// Set a default value for 'message'


[Link]("Hello from Gradle")

// Create a greeting task


abstract class GreetingTask extends DefaultTask {
@Input
abstract Property<String> getMessage()

@TaskAction
void greet() {
println("Message: ${[Link]()}")
}
}

// Register the task and set the convention


[Link]("hello", GreetingTask) {
[Link]([Link])
}

$ gradle -q hello
Message: Hello from Gradle

This means that changes to the extension’s message property will trigger the task to be considered
out-of-date, ensuring that the task is re-executed with the new message.
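Conceptually, this wiring hands the task a deferred reference rather than a snapshot of the value. A rough sketch with a plain Supplier (invented classes; Gradle's real Property/Provider types add up-to-date checking on top of this idea):

```java
import java.util.function.Supplier;

// Hypothetical sketch: the task captures a Supplier instead of a String, so it
// reads the extension's current value only when it executes.
class GreetingExtensionSketch {
    private String message = "Hello from Gradle"; // default, as in the example above
    String getMessage() { return message; }
    void setMessage(String m) { message = m; }
}

class GreetingTaskSketch {
    private final Supplier<String> message; // deferred reference, not a snapshot
    GreetingTaskSketch(Supplier<String> message) { this.message = message; }
    String greet() { return "Message: " + message.get(); }
}
```

Because the task holds `ext::getMessage` rather than a copied string, a change made to the extension after the task is registered is still visible when the task runs.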

You can find out more about types that you can use in task implementations and extensions in Lazy
Configuration.

Applying external plugins

In order to apply an external plugin in a precompiled script plugin, it has to be added to the plugin
project’s implementation classpath in the plugin’s build file:

buildSrc/[Link]

plugins {
`kotlin-dsl`
}

repositories {
mavenCentral()
}

dependencies {
implementation("[Link]:gradle-docker-plugin:6.4.0")
}
buildSrc/[Link]

plugins {
id 'groovy-gradle-plugin'
}

repositories {
mavenCentral()
}

dependencies {
implementation '[Link]:gradle-docker-plugin:6.4.0'
}

It can then be applied in the precompiled script plugin:

buildSrc/src/main/kotlin/[Link]

plugins {
id("[Link]-remote-api")
}

buildSrc/src/main/groovy/[Link]

plugins {
id '[Link]-remote-api'
}

The plugin version in this case is defined in the dependency declaration.

Implementing Binary Plugins


Binary plugins refer to plugins that are compiled and distributed as JAR files. These plugins are
usually written in Java or Kotlin and provide custom functionality or tasks to a Gradle build.

Using the Plugin Development plugin

The Gradle Plugin Development plugin can be used to assist in developing Gradle plugins.

This plugin will automatically apply the Java Plugin, add the gradleApi() dependency to the api
configuration, generate the required plugin descriptors in the resulting JAR file, and configure the
Plugin Marker Artifact to be used when publishing.

To apply and configure the plugin, add the following code to your build file:

[Link]

plugins {
`java-gradle-plugin`
}

gradlePlugin {
plugins {
create("simplePlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
}
}

[Link]

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
simplePlugin {
id = '[Link]'
implementationClass = '[Link]'
}
}
}

Writing and using custom task types is recommended when developing plugins as it automatically
benefits from incremental builds. As an added benefit of applying the plugin to your project, the
task validatePlugins automatically checks for an existing input/output annotation for every public
property defined in a custom task type implementation.

Creating a plugin ID

Plugin IDs are meant to be globally unique, similar to Java package names (i.e., a reverse domain
name). This format helps prevent naming collisions and allows grouping plugins with similar
ownership.
An explicit plugin identifier simplifies applying the plugin to a project. Your plugin ID should
combine components that reflect the namespace (a reasonable pointer to you or your organization)
and the name of the plugin it provides. For example, if your Github account is named foo and your
plugin is named bar, a suitable plugin ID might be [Link]. Similarly, if the plugin was
developed at the baz organization, the plugin ID might be [Link].

Plugin IDs should adhere to the following guidelines:

• May contain any alphanumeric character, '.', and '-'.

• Must contain at least one '.' character separating the namespace from the plugin’s name.

• Conventionally use a lowercase reverse domain name convention for the namespace.

• Conventionally use only lowercase characters in the name.

• [Link], [Link], and [Link] namespaces may not be used.

• Cannot start or end with a '.' character.

• Cannot contain consecutive '.' characters (i.e., '..').

A namespace that identifies ownership and a name is sufficient for a plugin ID.
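The guidelines above can be expressed as a small validation function. This is an illustrative sketch, not Gradle's actual ID validation, and it omits the reserved-namespace rule:

```java
// Illustrative check of the plugin ID guidelines (reserved namespaces omitted).
class PluginIdCheck {
    static boolean isValid(String id) {
        if (id.startsWith(".") || id.endsWith(".")) return false; // no leading/trailing '.'
        if (id.contains("..")) return false;                      // no consecutive '.'
        if (!id.contains(".")) return false;                      // namespace separator required
        return id.matches("[A-Za-z0-9.\\-]+");                    // alphanumerics, '.', '-' only
    }
}
```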

When bundling multiple plugins in a single JAR artifact, adhering to the same naming conventions
is recommended. This practice helps logically group related plugins.

There is no limit to the number of plugins that can be defined and registered (by different
identifiers) within a single project.

The identifiers for plugins written as a class should be defined in the project’s build script
containing the plugin classes. For this, the java-gradle-plugin needs to be applied:

buildSrc/[Link]

plugins {
id("java-gradle-plugin")
}

gradlePlugin {
plugins {
create("androidApplicationPlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
create("androidLibraryPlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
}
}
buildSrc/[Link]

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
androidApplicationPlugin {
id = '[Link]'
implementationClass = '[Link]'
}
androidLibraryPlugin {
id = '[Link]'
implementationClass = '[Link]'
}
}
}

Working with files

When developing plugins, it’s a good idea to be flexible when accepting input configuration for file
locations.

It is recommended to use Gradle’s managed properties and [Link] to select file or directory
locations. This will enable lazy configuration so that the actual location will only be resolved when
the file is needed and can be reconfigured at any time during build configuration.

This Gradle build file defines a task GreetingToFileTask that writes a greeting to a file. It also
registers two tasks: greet, which creates the file with the greeting, and sayGreeting, which prints the
file’s contents. The greetingFile property is used to specify the file path for the greeting:

[Link]

abstract class GreetingToFileTask : DefaultTask() {

@get:OutputFile
abstract val destination: RegularFileProperty

@TaskAction
fun greet() {
val file = [Link]().asFile
[Link]()
[Link]("Hello!")
}
}
val greetingFile = [Link]()

[Link]<GreetingToFileTask>("greet") {
destination = greetingFile
}

[Link]("sayGreeting") {
dependsOn("greet")
val greetingFile = greetingFile
doLast {
val file = [Link]().asFile
println("${[Link]()} (file: ${[Link]})")
}
}

greetingFile = [Link]("[Link]")

[Link]

abstract class GreetingToFileTask extends DefaultTask {

@OutputFile
abstract RegularFileProperty getDestination()

@TaskAction
def greet() {
def file = getDestination().get().asFile
[Link]()
[Link] 'Hello!'
}
}

def greetingFile = [Link]()

[Link]('greet', GreetingToFileTask) {
destination = greetingFile
}

[Link]('sayGreeting') {
dependsOn greet
doLast {
def file = [Link]().asFile
println "${[Link]} (file: ${[Link]})"
}
}

greetingFile = [Link]('[Link]')
$ gradle -q sayGreeting
Hello! (file: [Link])

In this example, we configure the greet task destination property as a closure/provider, which is
evaluated with the [Link]([Link]) method to turn the return value of the
closure/provider into a File object at the last minute. Note that we specify the greetingFile
property value after the task configuration. This lazy evaluation is a key benefit of accepting any
value when setting a file property and then resolving that value when reading the property.
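The "accept any value now, resolve it when read" idea can be sketched with a plain Supplier holding the file location. FileLocationSketch is a hypothetical stand-in, not RegularFileProperty:

```java
import java.io.File;
import java.util.function.Supplier;

// Hypothetical sketch of a lazily resolved file location: it can be
// reconfigured at any time and is only turned into a File when read.
class FileLocationSketch {
    private Supplier<File> location;
    void set(Supplier<File> s) { location = s; }
    File resolve() { return location.get(); } // resolution happens "at the last minute"
}
```

As in the build script above, setting the location after the task has been wired to it is fine, because nothing is resolved until the task reads the property.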

You can learn more about working with files lazily in Working with Files.

Making a plugin configurable using extensions

Most plugins offer configuration options for build scripts and other plugins to customize how the
plugin works. Plugins do this using extension objects.

A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.

An extension object is simply an object with Java Bean properties representing the configuration.

Let’s add a greeting extension object to the project, which allows you to configure the greeting:

[Link]

interface GreetingPluginExtension {
val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {


override fun apply(project: Project) {
// Add the 'greeting' extension object
val extension =
[Link]<GreetingPluginExtension>("greeting")
// Add a task that uses configuration from the extension object
[Link]("hello") {
doLast {
println([Link]())
}
}
}
}

apply<GreetingPlugin>()

// Configure the extension


the<GreetingPluginExtension>().message = "Hi from Gradle"

[Link]

interface GreetingPluginExtension {
Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {


void apply(Project project) {
// Add the 'greeting' extension object
def extension = [Link]('greeting',
GreetingPluginExtension)
// Add a task that uses configuration from the extension object
[Link]('hello') {
doLast {
println [Link]()
}
}
}
}

apply plugin: GreetingPlugin

// Configure the extension


[Link] = 'Hi from Gradle'

$ gradle -q hello
Hi from Gradle

In this example, GreetingPluginExtension is an object with a property called message. The extension
object is added to the project with the name greeting. This object becomes available as a project
property with the same name as the extension object. the<GreetingPluginExtension>() is equivalent
to [Link](GreetingPluginExtension::[Link]).

Often, you have several related properties you need to specify on a single plugin. Gradle adds a
configuration block for each extension object, so you can group settings:

[Link]

interface GreetingPluginExtension {
val message: Property<String>
val greeter: Property<String>
}
class GreetingPlugin : Plugin<Project> {
override fun apply(project: Project) {
val extension =
[Link]<GreetingPluginExtension>("greeting")
[Link]("hello") {
doLast {
println("${[Link]()} from
${[Link]()}")
}
}
}
}

apply<GreetingPlugin>()

// Configure the extension using a DSL block


configure<GreetingPluginExtension> {
message = "Hi"
greeter = "Gradle"
}

[Link]

interface GreetingPluginExtension {
Property<String> getMessage()
Property<String> getGreeter()
}

class GreetingPlugin implements Plugin<Project> {


void apply(Project project) {
def extension = [Link]('greeting',
GreetingPluginExtension)
[Link]('hello') {
doLast {
println "${[Link]()} from ${[Link]
.get()}"
}
}
}
}

apply plugin: GreetingPlugin

// Configure the extension using a DSL block


greeting {
message = 'Hi'
greeter = 'Gradle'
}

$ gradle -q hello
Hi from Gradle

In this example, several settings can be grouped within the configure<GreetingPluginExtension> block. The configure function is used to configure an extension object. It provides a convenient way to set properties or apply configurations to these objects. The type used in the build script’s configure function (GreetingPluginExtension) must match the extension type. Then, when the block is executed, the receiver of the block is the extension.

In this example, several settings can be grouped within the greeting closure. The name of the
closure block in the build script (greeting) must match the extension object name. Then, when the
closure is executed, the fields on the extension object will be mapped to the variables within the
closure based on the standard Groovy closure delegate feature.
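Both DSL forms boil down to the same mechanism: a block receives the extension object and mutates it. A language-neutral sketch in Java (invented class names; Gradle's real configure mechanism is based on Action<T>):

```java
import java.util.function.Consumer;

// Sketch of the configure-block mechanism: the block's receiver/delegate is
// the extension object itself.
class ConfigureSketch {
    static <T> T configure(T target, Consumer<T> block) {
        block.accept(target); // run the block against the extension
        return target;
    }
}

class GreetingExtSketch {
    String message;
    String greeter;
}
```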

Declaring a DSL configuration container

Using an extension object extends the Gradle DSL to add a project property and DSL block for the
plugin. Because an extension object is a regular object, you can provide your own DSL nested inside
the plugin block by adding properties and methods to the extension object.

Let’s consider the following build script for illustration purposes.

[Link]

plugins {
id("[Link]-env")
}

environments {
create("dev") {
url = "[Link]"
}

create("staging") {
url = "[Link]"
}

create("production") {
url = "[Link]"
}
}
[Link]

plugins {
id '[Link]-env'
}

environments {
dev {
url = '[Link]'
}

staging {
url = '[Link]'
}

production {
url = '[Link]'
}
}

The DSL exposed by the plugin exposes a container for defining a set of environments. Each
environment the user configures has an arbitrary but declarative name and is represented with its
own DSL configuration block. The example above instantiates a development, staging, and
production environment, including its respective URL.

Each environment must have a data representation in code to capture the values. The name of an
environment is immutable and can be passed in as a constructor parameter. Currently, the only
other parameter the data object stores is a URL.

The following ServerEnvironment object fulfills those requirements:

[Link]

abstract public class ServerEnvironment {

private final String name;

@[Link]
public ServerEnvironment(String name) {
[Link] = name;
}

public String getName() {
return name;
}

abstract public Property<String> getUrl();
}
Gradle exposes the factory method [Link](Class,
NamedDomainObjectFactory) to create a container of data objects. The parameter the method takes
is the class representing the data. The created instance of type NamedDomainObjectContainer can
be exposed to the end user by adding it to the extension container with a specific name.
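The container's behavior can be approximated with a name-keyed map plus a factory. This is a rough stand-in for NamedDomainObjectContainer, with invented class names and example URLs:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Consumer;
import java.util.function.Function;

// Rough stand-in for NamedDomainObjectContainer: objects are created on demand
// by a name-based factory, then configured by a block.
class NamedContainerSketch<T> {
    private final Function<String, T> factory;
    private final Map<String, T> objects = new LinkedHashMap<>();

    NamedContainerSketch(Function<String, T> factory) { this.factory = factory; }

    T create(String name, Consumer<T> configure) {
        T obj = objects.computeIfAbsent(name, factory);
        configure.accept(obj);
        return obj;
    }

    Map<String, T> all() { return objects; }
}

// Mirrors the ServerEnvironment data object: immutable name, mutable url.
class EnvSketch {
    final String name;
    String url;
    EnvSketch(String name) { this.name = name; }
}
```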

It’s common for a plugin to post-process the captured values within the plugin implementation, e.g.,
to configure tasks:

[Link]

public class ServerEnvironmentPlugin implements Plugin<Project> {


@Override
public void apply(final Project project) {
ObjectFactory objects = [Link]();

NamedDomainObjectContainer<ServerEnvironment> serverEnvironmentContainer =
[Link]([Link], name -> objects
.newInstance([Link], name));
[Link]().add("environments", serverEnvironmentContainer);

[Link](serverEnvironment -> {
String env = [Link]();
String capitalizedServerEnv = [Link](0, 1).toUpperCase() + env
.substring(1);
String taskName = "deployTo" + capitalizedServerEnv;
[Link]().register(taskName, [Link], task -> [Link]()
.set([Link]()));
});
}
}

In the example above, a deployment task is created dynamically for every user-configured
environment.

You can find out more about implementing project extensions in Developing Custom Gradle Types.

Modeling DSL-like APIs

DSLs exposed by plugins should be readable and easy to understand.

For example, let’s consider the following extension provided by a plugin. In its current form, it
offers a "flat" list of properties for configuring the creation of a website:

[Link]

plugins {
id("[Link]")
}
site {
outputDir = [Link]("mysite")
websiteUrl = "[Link]"
vcsUrl = "[Link]"
}

[Link]

plugins {
id '[Link]'
}

site {
outputDir = [Link]("mysite")
websiteUrl = '[Link]'
vcsUrl = '[Link]'
}

As the number of exposed properties grows, you should introduce a nested, more expressive
structure.

The following code snippet adds a new configuration block named siteInfo as part of the extension.
This provides a stronger indication of what those properties mean:

[Link]

plugins {
id("[Link]")
}

site {
outputDir = [Link]("mysite")

siteInfo {
websiteUrl = "[Link]"
vcsUrl = "[Link]"
}
}

[Link]

plugins {
id '[Link]'
}

site {
outputDir = [Link]("mysite")

siteInfo {
websiteUrl = '[Link]'
vcsUrl = '[Link]'
}
}

Implementing the backing objects for such an extension is simple. First, introduce a new data
object for managing the properties websiteUrl and vcsUrl:

[Link]

abstract public class SiteInfo {

abstract public Property<String> getWebsiteUrl();

abstract public Property<String> getVcsUrl();


}

In the extension, create an instance of the siteInfo class and a method to delegate the captured
values to the data instance.

To configure underlying data objects, define a parameter of type Action.

The following example demonstrates the use of Action in an extension definition:

[Link]

abstract public class SiteExtension {

abstract public RegularFileProperty getOutputDir();

@Nested
abstract public SiteInfo getSiteInfo();

public void siteInfo(Action<? super SiteInfo> action) {
[Link](getSiteInfo());
}
}

Mapping extension properties to task properties

Plugins commonly use an extension to capture user input from the build script and map it to a
custom task’s input/output properties. The build script author interacts with the extension’s DSL,
while the plugin implementation handles the underlying logic:

app/[Link]

// Extension class to capture user input


class MyExtension {
@Input
var inputParameter: String? = null
}

// Custom task that uses the input from the extension


class MyCustomTask : [Link]() {
@Input
var inputParameter: String? = null

@TaskAction
fun executeTask() {
println("Input parameter: $inputParameter")
}
}

// Plugin class that configures the extension and task


class MyPlugin : Plugin<Project> {
override fun apply(project: Project) {
// Create and configure the extension
val extension = [Link]("myExtension",
MyExtension::[Link])
// Create and configure the custom task
[Link]("myTask", MyCustomTask::[Link]) {
group = "custom"
inputParameter = [Link]
}
}
}

app/[Link]

// Extension class to capture user input


class MyExtension {
@Input
String inputParameter = null
}

// Custom task that uses the input from the extension


class MyCustomTask extends DefaultTask {
@Input
String inputParameter = null

@TaskAction
def executeTask() {
println("Input parameter: $inputParameter")
}
}

// Plugin class that configures the extension and task


class MyPlugin implements Plugin<Project> {
void apply(Project project) {
// Create and configure the extension
def extension = [Link]("myExtension", MyExtension)
// Create and configure the custom task
[Link]("myTask", MyCustomTask) {
group = "custom"
inputParameter = [Link]
}
}
}

In this example, the MyExtension class defines an inputParameter property that can be set in the build
script. The MyPlugin class configures this extension and uses its inputParameter value to configure
the MyCustomTask task. The MyCustomTask task then uses this input parameter in its logic.

You can learn more about types you can use in task implementations and extensions in Lazy
Configuration.

Adding default configuration with conventions

Plugins should provide sensible defaults and standards in a specific context, reducing the number
of decisions users need to make. Using the project object, you can define default values. These are
known as conventions.

Conventions are properties that are initialized with default values and can be overridden by the
user in their build script. For example:

[Link]

interface GreetingPluginExtension {
val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {


override fun apply(project: Project) {
// Add the 'greeting' extension object
val extension =
[Link]<GreetingPluginExtension>("greeting")
[Link]("Hello from GreetingPlugin")
// Add a task that uses configuration from the extension object
[Link]("hello") {
doLast {
println([Link]())
}
}
}
}

apply<GreetingPlugin>()

[Link]

interface GreetingPluginExtension {
Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {


void apply(Project project) {
// Add the 'greeting' extension object
def extension = [Link]('greeting',
GreetingPluginExtension)
[Link]('Hello from GreetingPlugin')
// Add a task that uses configuration from the extension object
[Link]('hello') {
doLast {
println [Link]()
}
}
}
}

apply plugin: GreetingPlugin

$ gradle -q hello
Hello from GreetingPlugin

In this example, GreetingPluginExtension is a class that represents the convention. The message
property is the convention property with a default value of 'Hello from GreetingPlugin'.

Users can override this value in their build script:


[Link]

configure<GreetingPluginExtension> {
message = "Custom message"
}

[Link]

greeting {
message = 'Custom message'
}

$ gradle -q hello
Custom message

Separating capabilities from conventions

Separating capabilities from conventions in plugins allows users to choose which tasks and
conventions to apply.

For example, the Java Base plugin provides un-opinionated (i.e., generic) functionality like SourceSets, while the Java plugin adds tasks and conventions familiar to Java developers, such as classes, jar, or javadoc.

When designing your own plugins, consider developing two plugins — one for capabilities and
another for conventions — to offer flexibility to users.

In the example below, MyPlugin contains conventions, and MyBasePlugin defines capabilities. MyPlugin then applies MyBasePlugin; this is called plugin composition. To apply a plugin from another one:

[Link]

import [Link];
import [Link];

public class MyBasePlugin implements Plugin<Project> {


public void apply(Project project) {
// define capabilities
}
}
[Link]

import [Link];
import [Link];

public class MyPlugin implements Plugin<Project> {


public void apply(Project project) {
[Link]().apply([Link]);

// define conventions
}
}

Reacting to plugins

A common pattern in Gradle plugin implementations is configuring the runtime behavior of


existing plugins and tasks in a build.

For example, a plugin could assume that it is applied to a Java-based project and automatically
reconfigure the standard source directory:

[Link]

public class InhouseStrongOpinionConventionJavaPlugin implements Plugin<Project> {


public void apply(Project project) {
// Careful! Eagerly applying plugins has downsides, and is not always recommended.
[Link]().apply([Link]);
SourceSetContainer sourceSets = [Link]().getByType
([Link]);
SourceSet main = [Link](SourceSet.MAIN_SOURCE_SET_NAME);
[Link]().setSrcDirs([Link]("src"));
}
}

The drawback to this approach is that it automatically forces the project to apply the Java plugin,
imposing a strong opinion on it (i.e., reducing flexibility and generality). In practice, the project
applying the plugin might not even deal with Java code.

Instead of automatically applying the Java plugin, the plugin could react to the fact that the
consuming project applies the Java plugin. Only if that is the case, then a certain configuration is
applied:

[Link]

public class InhouseConventionJavaPlugin implements Plugin<Project> {


public void apply(Project project) {
[Link]().withPlugin("java", javaPlugin -> {
SourceSetContainer sourceSets = [Link]().getByType
([Link]);
SourceSet main = [Link](SourceSet.MAIN_SOURCE_SET_NAME);
[Link]().setSrcDirs([Link]("src"));
});
}
}

Reacting to plugins is preferred over applying plugins if there is no good reason to assume that the
consuming project has the expected setup.
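The withPlugin mechanism is essentially a callback registry keyed by plugin ID: the action fires when the plugin is applied, or immediately if it already was. A simplified sketch (invented class, not Gradle's PluginManager implementation):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Simplified sketch of pluginManager.withPlugin(id, action): the action runs
// when the plugin is applied, regardless of registration order.
class PluginManagerSketch {
    private final Map<String, List<Runnable>> pending = new HashMap<>();
    private final Set<String> applied = new HashSet<>();

    // Register an action to run when (or if already) the plugin is applied.
    void withPlugin(String id, Runnable action) {
        if (applied.contains(id)) {
            action.run();
        } else {
            pending.computeIfAbsent(id, k -> new ArrayList<>()).add(action);
        }
    }

    // Mark the plugin as applied and fire any pending actions once.
    void apply(String id) {
        if (applied.add(id)) {
            for (Runnable r : pending.getOrDefault(id, List.of())) r.run();
        }
    }
}
```

This ordering independence is what makes reacting to plugins safe: the reacting plugin does not care whether it is applied before or after the plugin it reacts to.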

The same concept applies to task types:

[Link]

public class InhouseConventionWarPlugin implements Plugin<Project> {


public void apply(Project project) {
[Link]().withType([Link]).configureEach(war ->
[Link]([Link]("src/[Link]")));
}
}

Reacting to build features

Plugins can access the status of build features in the build. The Build Features API allows checking
whether the user requested a particular Gradle feature and if it is active in the current build. An
example of a build feature is the configuration cache.

There are two main use cases:

• Using the status of build features in reports or statistics.

• Incrementally adopting experimental Gradle features by disabling incompatible plugin


functionality.

Below is an example of a plugin that utilizes both of these use cases.

Reacting to build features

public abstract class MyPlugin implements Plugin<Project> {

@Inject
protected abstract BuildFeatures getBuildFeatures(); ①

@Override
public void apply(Project p) {
BuildFeatures buildFeatures = getBuildFeatures();

Boolean configCacheRequested = [Link]()
.getRequested() ②
.getOrNull(); // could be null if user did not opt in nor opt out
String configCacheUsage = describeFeatureUsage(configCacheRequested);
MyReport myReport = new MyReport();
[Link](configCacheUsage);

boolean isolatedProjectsActive = [Link]()
.getActive() ③
.get(); // the active state is always defined
if (!isolatedProjectsActive) {
myOptionalPluginLogicIncompatibleWithIsolatedProjects();
}
}

private String describeFeatureUsage(Boolean requested) {
return requested == null ? "no preference" : requested ? "opt-in" : "opt-out";
}

private void myOptionalPluginLogicIncompatibleWithIsolatedProjects() {


}
}

① The BuildFeatures service can be injected into plugins, tasks, and other managed types.

② Accessing the requested status of a feature for reporting.

③ Using the active status of a feature to disable incompatible functionality.

Build feature properties

A build feature’s status properties are represented with the Provider<Boolean> type.

The [Link]() status of a build feature determines if the user requested to enable
or disable the feature.

When the requested provider value is:

• true — the user opted in for using the feature

• false — the user opted out from using the feature

• undefined — the user neither opted in nor opted out from using the feature

The [Link]() status of a build feature is always defined. It represents the effective
state of the feature in the build.

When the active provider value is:

• true — the feature may affect the build behavior in a way specific to the feature

• false — the feature will not affect the build behavior

Note that the active status does not depend on the requested status. Even if the user requests a
feature, it may still not be active due to other build options being used in the build. Gradle can also
activate a feature by default, even if the user did not specify a preference.
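The tri-state interpretation of the requested status, as used by describeFeatureUsage in the example above, can be shown in isolation (FeatureStatusSketch is an invented wrapper class):

```java
// Mirrors the describeFeatureUsage logic from the example above: a Boolean of
// null/true/false maps to the three possible user preferences.
class FeatureStatusSketch {
    static String describe(Boolean requested) {
        return requested == null ? "no preference"
             : requested        ? "opt-in"
                                : "opt-out";
    }
}
```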
Using a custom dependencies block

NOTE Custom dependencies blocks are based on incubating APIs.

A plugin can provide dependency declarations in custom blocks that allow users to declare
dependencies in a type-safe and context-aware way.

For instance, instead of users needing to know and use the underlying Configuration name to add
dependencies, a custom dependencies block lets the plugin pick a meaningful name that can be used
consistently.

Adding a custom dependencies block

To add a custom dependencies block, you need to create a new type that will represent the set of
dependency scopes available to users. That new type needs to be accessible from a part of your
plugin (from a domain object or extension). Finally, the dependency scopes need to be wired back
to underlying Configuration objects that will be used during dependency resolution.

See JvmComponentDependencies and JvmTestSuite for an example of how this is used in a Gradle
core plugin.

1. Create an interface that extends Dependencies

NOTE You can also extend GradleDependencies to get access to Gradle-provided dependencies like gradleApi().

[Link]

/**
* Custom dependencies block for the example plugin.
*/
public interface ExampleDependencies extends Dependencies {

2. Add accessors for dependency scopes

For each dependency scope your plugin wants to support, add a getter method that returns a
DependencyCollector.

[Link]

/**
* Dependency scope called "implementation"
*/
DependencyCollector getImplementation();
3. Add accessors for custom dependencies block

To make the custom dependencies block configurable, the plugin needs to add a getDependencies
method that returns the new type from above and a configurable block method named
dependencies.

By convention, the accessors for your custom dependencies block should be called
getDependencies()/dependencies(Action). This method could be named something else, but users
would need to know that a different block can behave like a dependencies block.

[Link]

/**
* Custom dependencies for this extension.
*/
@Nested
ExampleDependencies getDependencies();

/**
* Configurable block
*/
default void dependencies(Action<? super ExampleDependencies> action) {
[Link](getDependencies());
}

4. Wire dependency scope to Configuration

Finally, the plugin needs to wire the custom dependencies block to some underlying Configuration
objects. If this is not done, none of the dependencies declared in the custom block will be available
to dependency resolution.

[Link]

[Link]().dependencyScope("exampleImplementation", conf -> {
[Link]([Link]().getImplementation());
});

NOTE In this example, the name users will use to add dependencies is "implementation", but the underlying Configuration is named exampleImplementation.

[Link]

example {
dependencies {
implementation("junit:junit:4.13")
}
}

[Link]

example {
dependencies {
implementation("junit:junit:4.13")
}
}

Differences between the custom dependencies and the top-level dependencies blocks

Each depe