Gradle User Manual

Version 8.9
Table of Contents
OVERVIEW
Gradle User Manual
The User Manual
RELEASES
Installing Gradle
Compatibility Matrix
The Feature Lifecycle
RUNNING GRADLE BUILDS
CORE CONCEPTS
Gradle Basics
Gradle Wrapper Basics
Command-Line Interface Basics
Settings File Basics
Build File Basics
Dependency Management Basics
Task Basics
Plugin Basics
Gradle Incremental Builds and Build Caching
Build Scans
OTHER TOPICS
Continuous Builds
AUTHORING GRADLE BUILDS
THE BASICS
Gradle Directories
Multi-Project Build Basics
Build Lifecycle
Writing Settings Files
Writing Build Scripts
Using Tasks
Writing Tasks
Using Plugins
Writing Plugins
STRUCTURING BUILDS
Structuring Projects with Gradle
Declaring Dependencies between Subprojects
Sharing Build Logic between Subprojects
Composite Builds
Configuration On Demand
DEVELOPING TASKS
Understanding Tasks
Configuring Tasks Lazily
Understanding Lazy properties
Creating a Property or Provider instance
Connecting properties together
Working with files
Working with task inputs and outputs
Working with collections
Working with maps
Applying a convention to a property
Where to apply conventions from?
Making a property unmodifiable
Using the Provider API
Provider Files API Reference
Property Files API Reference
Lazy Collections API Reference
Lazy Objects API Reference
Developing Parallel Tasks
Advanced Tasks
DEVELOPING PLUGINS
Understanding Plugins
Understanding Implementation Options for Plugins
Implementing Pre-compiled Script Plugins
Implementing Binary Plugins
Testing Gradle plugins
Publishing Plugins to the Gradle Plugin Portal
OTHER TOPICS
Gradle-managed Directories
Working With Files
Logging
Configuring the Build Environment
Initialization Scripts
Using Shared Build Services
Dataflow Actions
Testing Build Logic with TestKit
Using Ant from Gradle
AUTHORING JVM BUILDS
Building Java & JVM projects
Testing in Java & JVM projects
Managing Dependencies of JVM Projects
JAVA TOOLCHAINS
Toolchains for JVM projects
Toolchain Resolver Plugins
JVM PLUGINS
The Java Library Plugin
The Application Plugin
The Java Platform Plugin
The Groovy Plugin
The Scala Plugin
WORKING WITH DEPENDENCIES
Dependency Management Terminology
THE BASICS
Dependency Management
Declaring repositories
Declaring dependencies
Understanding the difference between libraries and applications
View and Debug Dependencies
Understanding dependency resolution
Verifying dependencies
DECLARING VERSIONS
Declaring Versions and Ranges
Declaring Rich Versions
Handling versions which change over time
Locking dependency versions
CONTROLLING TRANSITIVES
Upgrading versions of transitive dependencies
Downgrading versions and excluding dependencies
Sharing dependency versions between projects
Aligning dependency versions
Handling mutually exclusive dependencies
Fixing metadata with component metadata rules
Customizing resolution of a dependency directly
Preventing accidental dependency upgrades
PRODUCING AND CONSUMING VARIANTS OF LIBRARIES
Declaring Capabilities of a Library
Modeling library features
Understanding variant selection
Working with Variant Attributes
Sharing outputs between projects
Transforming dependency artifacts on resolution
PUBLISHING LIBRARIES
Publishing a project as module
Understanding Gradle Module Metadata
Signing artifacts
Customizing publishing
The Maven Publish Plugin
The Ivy Publish Plugin
OPTIMIZING BUILD PERFORMANCE
Improve the Performance of Gradle Builds
Gradle Daemon
File System Watching
Incremental build
Configuration cache
Inspecting Gradle Builds
USING THE BUILD CACHE
Build Cache
Use cases for the build cache
Build cache performance
Important concepts
Caching Java projects
Caching Android projects
Debugging and diagnosing cache misses
Solving common problems
REFERENCE
Command-Line Interface Reference
Gradle Wrapper Reference
Gradle Plugin Reference
Gradle & Third-party Tools
GRADLE DSLs and API
A Groovy Build Script Primer
Gradle Kotlin DSL Primer
LICENSE INFORMATION
License Information
OVERVIEW
Gradle User Manual
Gradle Build Tool

Gradle Build Tool is a fast, dependable, and adaptable open-source build automation tool with an
elegant and extensible declarative build language.

In this User Manual, Gradle Build Tool is abbreviated Gradle.

Why Gradle?

Gradle is a widely used and mature tool with an active community and a strong developer
ecosystem.

• Gradle is the most popular build system for the JVM and is the default system for Android and
Kotlin Multi-Platform projects. It has a rich community plugin ecosystem.

• Gradle can automate a wide range of software build scenarios using either its built-in
functionality, third-party plugins, or custom build logic.

• Gradle provides a high-level, declarative, and expressive build language that makes it easy to
read and write build logic.

• Gradle is fast, scalable, and can build projects of any size and complexity.

• Gradle produces dependable results while benefiting from optimizations such as incremental
builds, build caching, and parallel execution.

Gradle, Inc. provides a free service called Build Scan® that provides extensive information and
insights about your builds. You can view scans to identify problems or share them for debugging
help.

Supported Languages and Frameworks

Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.

Compatible IDEs

All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.

You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your
continuous integration (CI) server.

Education

The Gradle User Manual is the official documentation for the Gradle Build Tool.

• Getting Started Tutorial — Learn Gradle basics and the benefits of building your App with
Gradle.

• Training Courses — Head over to the courses page to sign up for free Gradle training.

Support

• Forum — The fastest way to get help is through the Gradle Forum.

• Slack — Community members and core contributors answer questions directly on our Slack
Channel.

Licenses

Gradle Build Tool source code is open and licensed under the Apache License 2.0. The Gradle User
Manual and DSL Reference Manual are licensed under the Creative Commons Attribution-
NonCommercial-ShareAlike 4.0 International License.

The User Manual


Explore our guides and examples to use Gradle.

Releases

Information on Gradle releases and how to install Gradle is found on the Installation page.

Content

The Gradle User Manual is broken down into the following sections:

Running Gradle Builds


Learn Gradle basics and how to use Gradle to build your project.

Authoring Gradle Builds


Develop tasks and plugins to customize your build.
Authoring JVM Builds
Use Gradle with your Java project.

Working with Dependencies


Add dependencies to your build.

Optimizing Builds
Use caches to optimize your build and understand the Gradle daemon, incremental builds and
file system watching.

Reference

1. Gradle’s API Javadocs

2. Gradle’s Groovy DSL

3. Gradle’s Kotlin DSL

4. Gradle’s Core Plugins


RELEASES
Installing Gradle
Gradle Installation

If all you want to do is run an existing Gradle project, then you don’t need to install Gradle if the
build uses the Gradle Wrapper. This is identifiable by the presence of the gradlew or gradlew.bat
files in the root of the project:

. ①
├── gradle
│ └── wrapper ②
├── gradlew ③
├── gradlew.bat ③
└── ⋮

① Project root directory.

② Gradle Wrapper.

③ Scripts for executing Gradle builds.

If the gradlew or gradlew.bat files are already present in your project, you do not need to install
Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.

You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version
for your project. Please use the Gradle Wrapper to upgrade Gradle.

Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle
separately when only working within that IDE.

If you do not meet the criteria above and decide to install Gradle on your machine, first check if
Gradle is already installed by running gradle -v in your terminal. If the command does not return
anything, then Gradle is not installed, and you can follow the instructions below.

You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done
manually or using a package manager like SDKMAN! or Homebrew.

You can find all Gradle releases and their checksums on the releases page.

Prerequisites

Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or
higher to run. You can check the compatibility matrix for more information.

To check, run java -version:

❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)

❯ java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)

Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your
project. In this example, the $PATH points to JDK17:

❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin

You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory.
This is especially useful when multiple JDKs are installed:

❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80

❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk-16.jdk/Contents/Home

Gradle supports Kotlin and Groovy as the main build languages. Gradle ships with its own Kotlin
and Groovy libraries; therefore, they do not need to be installed. Existing installations are ignored
by Gradle.

See the full compatibility notes for Java, Groovy, Kotlin, and Android.

Linux installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:

❯ sdk install gradle

Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle
that is incompatible or incomplete when compared to the official version.
▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)

• Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle gradle-8.9-bin.zip
❯ ls /opt/gradle/gradle-8.9
LICENSE NOTICE bin README init.d lib media

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be in your Path. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/opt/gradle/gradle-8.9/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.

export GRADLE_HOME=/opt/gradle/gradle-8.9
export PATH=${GRADLE_HOME}/bin:${PATH}

macOS installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:

❯ sdk install gradle

Using Homebrew:
❯ brew install gradle

Using MacPorts:

❯ sudo port install gradle

Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc.

▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)

• Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /usr/local/gradle
❯ unzip gradle-8.9-bin.zip -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.9
LICENSE NOTICE README bin init.d lib

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be in your Path. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/usr/local/gradle/gradle-8.9/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.

It’s a good idea to edit .bash_profile in your home directory to add the GRADLE_HOME variable:

export GRADLE_HOME=/usr/local/gradle/gradle-8.9
export PATH=$GRADLE_HOME/bin:$PATH

Windows installation
▼ Installing manually
Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

• Binary-only (bin)

• Complete (all) with docs and sources

We recommend downloading the bin file.

Step 2 - Unpack the distribution

Create a new directory C:\Gradle with File Explorer.

Open a second File Explorer window and go to the directory where the Gradle distribution was
downloaded. Double-click the ZIP archive to expose the content. Drag the content folder gradle-
8.9 to your newly created C:\Gradle folder.

Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of
your choice.

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be in your Path.

In File Explorer right-click on the This PC (or Computer) icon, then click Properties → Advanced
System Settings → Environment Variables.

Under System Variables select Path, then click Edit. Add an entry for C:\Gradle\gradle-8.9\bin.
Click OK to save.

Alternatively, you can add the environment variable GRADLE_HOME and point this to the unzipped
distribution. Instead of adding a specific version of Gradle to your Path, you can add
%GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the
GRADLE_HOME environment variable.

Verify the installation

Open a console (or a Windows command prompt) and run gradle -v to display the Gradle version,
e.g.:

❯ gradle -v

------------------------------------------------------------
Gradle 8.9
------------------------------------------------------------

Build time:    2024-06-17 18:10:00 UTC
Revision:      6028379bb5a8512d0b2c1be6403543b79825ef08

Kotlin:        1.9.23
Groovy:        3.0.21
Ant:           Apache Ant(TM) version 1.10.13 compiled on January 4 2023
Launcher JVM:  11.0.23 (Eclipse Adoptium 11.0.23+9)
Daemon JVM:    /Library/Java/JavaVirtualMachines/temurin-11.jdk/Contents/Home (no JDK specified, using current Java home)
OS:            Mac OS X 14.5 aarch64

You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available
from the releases page) and following these verification instructions.

Compatibility Matrix
The sections below describe Gradle’s compatibility with several integrations. Versions not listed
here may or may not work.

Java

A Java version between 8 and 22 is required to execute Gradle. Java 23 and later versions are not
yet supported.

Java 6 and 7 can be used for compilation but are deprecated for use with testing. Testing with Java 6
and 7 will not be supported in Gradle 9.0.

Any fully supported version of Java can be used for compilation or testing. However, the latest Java
version may only be supported for compilation or testing, not for running Gradle. Support is
achieved using toolchains and applies to all tasks supporting toolchains.
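
The following is a minimal sketch (not part of the compatibility matrix itself) of how a build can
use a toolchain: assuming the java plugin is applied, the project declares the JDK it compiles and
tests with, independently of the JDK running Gradle. The version number is illustrative.

build.gradle.kts

java {
    toolchain {
        // Compile and test with this JDK, even if Gradle itself runs on a different supported JDK.
        languageVersion = JavaLanguageVersion.of(21)
    }
}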

See the table below for the Java version supported by a specific Gradle release:

Table 1. Java Compatibility

Java version | Support for toolchains | Support for running Gradle
8            | N/A                    | 2.0
9            | N/A                    | 4.3
10           | N/A                    | 4.7
11           | N/A                    | 5.0
12           | N/A                    | 5.4
13           | N/A                    | 6.0
14           | N/A                    | 6.3
15           | 6.7                    | 6.7
16           | 7.0                    | 7.0
17           | 7.3                    | 7.3
18           | 7.5                    | 7.5
19           | 7.6                    | 7.6
20           | 8.1                    | 8.3
21           | 8.4                    | 8.5
22           | 8.7                    | 8.8
23           | N/A                    | N/A

Kotlin

Gradle is tested with Kotlin 1.6.10 through 2.0.0. Beta and RC versions may or may not work.

Table 2. Embedded Kotlin version

Embedded Kotlin version | Minimum Gradle version | Kotlin Language version
1.3.10                  | 5.0                    | 1.3
1.3.11                  | 5.1                    | 1.3
1.3.20                  | 5.2                    | 1.3
1.3.21                  | 5.3                    | 1.3
1.3.31                  | 5.5                    | 1.3
1.3.41                  | 5.6                    | 1.3
1.3.50                  | 6.0                    | 1.3
1.3.61                  | 6.1                    | 1.3
1.3.70                  | 6.3                    | 1.3
1.3.71                  | 6.4                    | 1.3
1.3.72                  | 6.5                    | 1.3
1.4.20                  | 6.8                    | 1.3
1.4.31                  | 7.0                    | 1.4
1.5.21                  | 7.2                    | 1.4
1.5.31                  | 7.3                    | 1.4
1.6.21                  | 7.5                    | 1.4
1.7.10                  | 7.6                    | 1.4
1.8.10                  | 8.0                    | 1.8
1.8.20                  | 8.2                    | 1.8
1.9.0                   | 8.3                    | 1.8
1.9.10                  | 8.4                    | 1.8
1.9.20                  | 8.5                    | 1.8
1.9.22                  | 8.7                    | 1.8
1.9.23                  | 8.9                    | 1.8

Groovy

Gradle is tested with Groovy 1.5.8 through 4.0.0.

Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy
DSL build scripts.

Android

Gradle is tested with Android Gradle Plugin 7.3 through 8.4. Alpha and beta versions may or may
not work.

The Feature Lifecycle


Gradle is under constant development. New versions are delivered on a regular and frequent basis
(approximately every six weeks) as described in the section on end-of-life support.

Continuous improvement combined with frequent delivery allows new features to be available to
users early. Early users provide invaluable feedback, which is incorporated into the development
process.

Getting new functionality into the hands of users regularly is a core value of the Gradle platform.

At the same time, API and feature stability are taken very seriously and considered a core value of
the Gradle platform. Design choices and automated testing are engineered into the development
process and formalized by the section on backward compatibility.

The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of
Gradle what the state of a feature is. The term feature typically means an API or DSL method or
property in this context, but it is not restricted to this definition. Command line arguments and
modes of execution (e.g. the Build Daemon) are two examples of other features.

Feature States

Features can be in one of four states:

1. Internal

2. Incubating

3. Public

4. Deprecated
1. Internal

Internal features are not designed for public use and are only intended to be used by Gradle itself.
They can change in any way at any point in time without any notice. Therefore, we recommend
avoiding the use of such features. Internal features are not documented. If a feature appears in this
User Manual, the DSL Reference, or the API Reference, then it is not internal.

Internal features may evolve into public features.

2. Incubating

Features are introduced in the incubating state to allow real-world feedback to be incorporated into
the feature before making it public. It also gives early access to users who are willing to test
potential future changes.

A feature in an incubating state may change in future Gradle versions until it is no longer
incubating. Changes to incubating features for a Gradle release will be highlighted in the release
notes for that release. The incubation period for new features varies depending on the feature’s
scope, complexity, and nature.

Features in incubation are indicated. In the source code, all methods/properties/classes that are
incubating are annotated with @Incubating. This results in a special mark for them in the DSL and
API references.

If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the
incubating state.

Feature Preview API

The feature preview API allows certain incubating features to be activated by adding
enableFeaturePreview('FEATURE') in your settings file. Individual preview features will be
announced in release notes.

When incubating features are either promoted to public or removed, the feature preview flags for
them become obsolete, have no effect, and should be removed from the settings file.
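
For illustration, a settings script that opts in to a preview feature looks like the sketch below. The
flag name shown (TYPESAFE_PROJECT_ACCESSORS) is one example of a previously announced preview
feature; take current flag names from the release notes of the Gradle version you use.

settings.gradle.kts

// Opt in to an incubating feature; remove this line once the feature is promoted or removed.
enableFeaturePreview("TYPESAFE_PROJECT_ACCESSORS")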

3. Public

The default state for a non-internal feature is public. Anything documented in the User Manual, DSL
Reference, or API reference that is not explicitly said to be incubating or deprecated is considered
public. Features are said to be promoted from an incubating state to public. The release notes for
each release indicate which previously incubating features are being promoted by the release.

A public feature will never be removed or intentionally changed without undergoing deprecation.
All public features are subject to the backward compatibility policy.

4. Deprecated

Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such
features will eventually be removed from Gradle after being deprecated. A deprecated feature may
become stale until it is finally removed according to the backward compatibility policy.

Deprecated features are indicated to be so. In the source code, all methods/properties/classes that
are deprecated are annotated with “@java.lang.Deprecated” which is reflected in the DSL and API
References. In most cases, there is a replacement for the deprecated element, which will be
described in the documentation. Using a deprecated feature will result in a runtime warning in
Gradle’s output.

The use of deprecated features should be avoided. The release notes for each release indicate any
features being deprecated by the release.

Backward compatibility policy

Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.). Once a public
feature is introduced in a Gradle release, it will remain indefinitely unless deprecated. Once
deprecated, it may be removed in the next major release. Deprecated features may be supported
across major releases, but this is not guaranteed.

Release end-of-life Policy

Every day, a new nightly build of Gradle is created.

This contains all of the changes made through Gradle’s extensive continuous integration tests
during that day. Nightly builds may contain new changes that may or may not be stable.

The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or
major release. When no problems are found after a short time (usually a week), the release
candidate is promoted to a general availability (GA) release. If a regression is found in the release
candidate, a new RC distribution is created, and the process repeats. Release candidates are
supported for as long as the release window is open, but they are not intended to be used for
production. Bug reports are greatly appreciated during the RC phase.

The Gradle team may create additional patch releases to replace the final release due to critical bug
fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.

Once a release candidate has been made, all feature development moves on to the next release for
the latest major version. As such, each minor Gradle release causes the previous minor releases in
the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or
feature backports.

For major versions, Gradle will backport critical fixes and security fixes to the last minor in the
previous major version. For example, when Gradle 7 was the latest major version, several releases
were made in the 6.x line, including Gradle 6.9 (and subsequent releases).

As such, each major Gradle release causes:

• The previous major version becomes maintenance only. It will only receive critical bug fixes
and security fixes.

• The major version before the previous one becomes end-of-life (EOL), and that release line
will not receive any new fixes.
RUNNING GRADLE BUILDS
CORE CONCEPTS
Gradle Basics
Gradle automates building, testing, and deployment of software from information in build
scripts.

Gradle core concepts

Projects

A Gradle project is a piece of software that can be built, such as an application or a library.

Single project builds include a single project called the root project.

Multi-project builds include one root project and any number of subprojects.

Build Scripts

Build scripts detail to Gradle what steps to take to build the project.

Each project can include one or more build scripts.

Dependency Management

Dependency management is an automated technique for declaring and resolving external
resources required by a project.

Each project typically includes a number of external dependencies that Gradle will resolve during
the build.
Tasks

Tasks are a basic unit of work such as compiling code or running your tests.

Each project contains one or more tasks defined inside a build script or a plugin.
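
As a small illustrative sketch (the task name and message are made up), an ad-hoc task can be
defined directly in a build script and executed with gradle hello:

build.gradle.kts

// Registers a task named "hello" that prints a message when it runs.
tasks.register("hello") {
    doLast {
        println("Hello from Gradle")
    }
}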

Plugins

Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.

Gradle project structure

Many developers will interact with Gradle for the first time through an existing project.

The presence of the gradlew and gradlew.bat files in the root directory of a project is a clear
indicator that Gradle is used.

A Gradle project will look similar to the following:

project
├── gradle ①
│ ├── libs.versions.toml ②
│ └── wrapper
│ ├── gradle-wrapper.jar
│ └── gradle-wrapper.properties
├── gradlew ③
├── gradlew.bat ③
├── settings.gradle(.kts) ④
├── subproject-a
│ ├── build.gradle(.kts) ⑤
│ └── src ⑥
└── subproject-b
├── build.gradle(.kts) ⑤
└── src ⑥

① Gradle directory to store wrapper files and more

② Gradle version catalog for dependency management

③ Gradle wrapper scripts

④ Gradle settings file to define a root project name and subprojects

⑤ Gradle build scripts of the two subprojects - subproject-a and subproject-b

⑥ Source code and/or additional files for the projects

Invoking Gradle

IDE

Gradle is built-in to many IDEs including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
Gradle can be automatically invoked when you build, clean, or run your app in the IDE.

It is recommended that you consult the manual for the IDE of your choice to learn more about how
Gradle can be used and configured.

Command line

Gradle can be invoked in the command line once installed. For example:

$ gradle build

NOTE Most projects do not use the installed version of Gradle.

Gradle Wrapper

The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to
execute a Gradle build. It is found in the project root directory as a gradlew or gradlew.bat file:

$ gradlew build // Linux or OSX


$ gradlew.bat build // Windows

Next Step: Learn about the Gradle Wrapper >>

Gradle Wrapper Basics


The recommended way to execute any Gradle build is with the Gradle Wrapper.
The Wrapper script invokes a declared version of Gradle, downloading it beforehand if necessary.

The Wrapper is available as a gradlew or gradlew.bat file.

The Wrapper provides the following benefits:

• Standardizes a project on a given Gradle version.

• Provisions the same Gradle version for different users.

• Provisions the Gradle version for different execution environments (IDEs, CI servers…).

Using the Gradle Wrapper

It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and
standardized execution of the build.

Depending on the operating system, you run gradlew or gradlew.bat instead of the gradle command.

Typical Gradle invocation:

$ gradle build

To run the Wrapper on a Linux or OSX machine:

$ ./gradlew build

To run the Wrapper on Windows PowerShell:

$ .\gradlew.bat build

The command is run in the same directory that the Wrapper is located in. If you want to run the
command in a different directory, you must provide the relative path to the Wrapper:
$ ../gradlew build

The following console output demonstrates the use of the Wrapper on a Windows machine, in the
command prompt (cmd), for a Java-based project:

$ gradlew.bat build

Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip to C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-al\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle

BUILD SUCCESSFUL in 12s


1 actionable task: 1 executed

Understanding the Wrapper files

The following files are part of the Gradle Wrapper:

.
├── gradle
│ └── wrapper
│ ├── gradle-wrapper.jar ①
│ └── gradle-wrapper.properties ②
├── gradlew ③
└── gradlew.bat ④

① gradle-wrapper.jar: This is a small JAR file that contains the Gradle Wrapper code. It is
responsible for downloading and installing the correct version of Gradle for a project if it’s not
already installed.

② gradle-wrapper.properties: This file contains configuration properties for the Gradle Wrapper,
such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or
TARBALL). A sample of this file is shown after this list.

③ gradlew: This is a shell script (Unix-based systems) that acts as a wrapper around gradle-
wrapper.jar. It is used to execute Gradle tasks on Unix-based systems without needing to
manually install Gradle.

④ gradlew.bat: This is a batch script (Windows) that serves the same purpose as gradlew but is used
on Windows systems.
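
For reference, a typical gradle-wrapper.properties looks something like the sketch below. The exact
distributionUrl depends on the Gradle version and distribution type your project declares, and the
file is maintained by the wrapper task rather than edited by hand:

gradle/wrapper/gradle-wrapper.properties

distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.9-bin.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists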

IMPORTANT You should never alter these files.


If you want to view or update the Gradle version of your project, use the command line. Do not edit
the wrapper files manually:

$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2

$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2

Consult the Gradle Wrapper reference to learn more.

Next Step: Learn about the Gradle CLI >>

Command-Line Interface Basics


The command-line interface is the primary method of interacting with Gradle outside the IDE.

Use of the Gradle Wrapper is highly encouraged.

Substitute ./gradlew (in macOS / Linux) or gradlew.bat (in Windows) for gradle in the following
examples.

Executing Gradle on the command line conforms to the following structure:

gradle [taskName...] [--option-name...]

Options are allowed before and after task names.


gradle [--option-name...] [taskName...]

If multiple tasks are specified, you should separate them with a space.

gradle [taskName1 taskName2...] [--option-name...]

Options that accept values can be specified with or without = between the option and argument.
The use of = is recommended.

gradle [...] --console=plain

Options that enable behavior have long-form options with inverses specified with --no-. The
following are opposites.

gradle [...] --build-cache


gradle [...] --no-build-cache

Many long-form options have short-option equivalents. The following are equivalent:

gradle --help
gradle -h

Command-line usage

The following sections describe the use of the Gradle command-line interface. Some plugins also
add their own command line options.

Executing tasks

To execute a task called taskName on the root project, type:

$ gradle :taskName

This will run the single taskName and all of its dependencies.

Specify options for tasks

To pass an option to a task, prefix the option name with -- after the task name:

$ gradle taskName --exampleOption=exampleValue

Consult the Gradle Command Line Interface reference to learn more.


Next Step: Learn about the Settings file >>

Settings File Basics


The settings file is the entry point of every Gradle project.

The primary purpose of the settings file is to add subprojects to your build.

Gradle supports single and multi-project builds.

• For single-project builds, the settings file is optional.

• For multi-project builds, the settings file is mandatory and declares all subprojects.

Settings script

The settings file is a script. It is either a settings.gradle file written in Groovy or a
settings.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

The settings file is typically found in the root directory of the project.

Let’s take a look at an example and break it down:

settings.gradle.kts

rootProject.name = "root-project" ①

include("sub-project-a") ②
include("sub-project-b")
include("sub-project-c")

① Define the project name.

② Add subprojects.

settings.gradle

rootProject.name = 'root-project' ①

include('sub-project-a') ②
include('sub-project-b')
include('sub-project-c')

① Define the project name.

② Add subprojects.

1. Define the project name

The settings file defines your project name:

rootProject.name = "root-project"

There is only one root project per build.

2. Add subprojects

The settings file defines the structure of the project by including subprojects, if there are any:

include("app")
include("business-logic")
include("data-model")

Consult the Writing Settings File page to learn more.

Next Step: Learn about the Build scripts >>

Build File Basics


Generally, a build script details build configuration, tasks, and plugins.
Every Gradle build comprises at least one build script.

In the build file, two types of dependencies can be added:

1. The libraries and/or plugins on which Gradle and the build script depend.

2. The libraries on which the project sources (i.e., source code) depend.

Build scripts

The build script is either a build.gradle file written in Groovy or a build.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

Let’s take a look at an example and break it down:

build.gradle.kts

plugins {
    id("application") ①
}

application {
    mainClass = "com.example.Main" ②
}

① Add plugins.

② Use convention properties.


build.gradle

plugins {
    id 'application' ①
}

application {
    mainClass = 'com.example.Main' ②
}

① Add plugins.

② Use convention properties.

1. Add plugins

Plugins extend Gradle’s functionality and can contribute tasks to a project.

Adding a plugin to a build is called applying a plugin and makes additional functionality available.

plugins {
    id("application")
}

The application plugin facilitates creating an executable JVM application.

Applying the Application plugin also implicitly applies the Java plugin. The java plugin adds Java
compilation along with testing and bundling capabilities to a project.

2. Use convention properties

A plugin adds tasks to a project. It also adds properties and methods to a project.

The application plugin defines tasks that package and distribute an application, such as the run
task.

The Application plugin provides a way to declare the main class of a Java application, which is
required to execute the code.

application {
    mainClass = "com.example.Main"
}

In this example, the main class (i.e., the point where the program’s execution begins) is
com.example.Main.

Consult the Writing Build Scripts page to learn more.


Next Step: Learn about Dependency Management >>

Dependency Management Basics


Gradle has built-in support for dependency management.

Dependency management is an automated technique for declaring and resolving external
resources required by a project.

Gradle build scripts define the process to build projects that may require external dependencies.
Dependencies refer to JARs, plugins, libraries, or source code that support building your project.

Version Catalog

Version catalogs provide a way to centralize your dependency declarations in a libs.versions.toml
file.

The catalog makes sharing dependencies and version configurations between subprojects simple. It
also allows teams to enforce versions of libraries and plugins in large projects.

The version catalog typically contains four sections:

1. [versions] to declare the version numbers that plugins and libraries will reference.

2. [libraries] to define the libraries used in the build files.

3. [bundles] to define a set of dependencies.

4. [plugins] to define plugins.

[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"

[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }

[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }

The file is located in the gradle directory so that it can be used by Gradle and IDEs automatically.
The version catalog should be checked into source control: gradle/libs.versions.toml.

Declaring Your Dependencies

To add a dependency to your project, specify a dependency in the dependencies block of your
build.gradle(.kts) file.

The following build.gradle.kts file adds a plugin and two dependencies to the project using the
version catalog above:

plugins {
alias(libs.plugins.androidApplication) ①
}

dependencies {
// Dependency on a remote binary to compile and run the code
implementation(libs.googleMaterial) ②

// Dependency on a remote binary to compile and run the test code
testImplementation(libs.mockitoCore) ③
}

① Applies the Android Gradle plugin to this project, which adds several features that are specific to
building Android apps.

② Adds the Material dependency to the project. Material Design provides components for creating
a user interface in an Android App. This library will be used to compile and run the Kotlin
source code in this project.

③ Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java
code. This library will be used to compile and run the test source code in this project.

Dependencies in Gradle are grouped by configurations.

• The material library is added to the implementation configuration, which is used for compiling
and running production code.

• The mockito-core library is added to the testImplementation configuration, which is used for
compiling and running test code.

NOTE There are many more configurations available.
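
For instance, with the java-library plugin applied, a build script can combine several configurations in one dependencies block. This is a minimal sketch; the com.example coordinates are placeholders:

dependencies {
    api("com.example:exposed-lib:1.0")             // visible to consumers of this library
    implementation("com.example:internal-lib:1.0") // internal implementation detail
    compileOnly("com.example:annotations:1.0")     // needed at compile time only
    runtimeOnly("com.example:driver:1.0")          // needed at runtime only
    testImplementation("org.mockito:mockito-core:2.16.0") // compiles and runs with the test code
}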

Viewing Project Dependencies

You can view your dependency tree in the terminal using the ./gradlew :app:dependencies
command:

$ ./gradlew :app:dependencies

> Task :app:dependencies

------------------------------------------------------------
Project ':app'
------------------------------------------------------------

implementation - Implementation only dependencies for source set 'main'. (n)
\--- com.google.android.material:material:1.1.0-alpha05 (n)

testImplementation - Implementation only dependencies for source set 'test'. (n)
\--- org.mockito:mockito-core:2.16.0 (n)

...

Consult the Dependency Management chapter to learn more.

Next Step: Learn about Tasks >>

Task Basics
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
You run a Gradle build task using the gradle command or by invoking the Gradle Wrapper
(./gradlew or gradlew.bat) in your project directory:

$ ./gradlew build

Available tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in the project by running the following command in the terminal:

$ ./gradlew tasks

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

...

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
...

Other tasks
-----------
compileJava - Compiles main Java source.

...

Running tasks

The run task is executed with ./gradlew run:

$ ./gradlew run

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes

> Task :app:run


Hello World!

BUILD SUCCESSFUL in 904ms


2 actionable tasks: 2 executed

In this example Java project, the output of the run task is a Hello World statement printed on the
console.

Task dependency

Many times, a task requires another task to run first.

For example, for Gradle to execute the build task, the Java code must first be compiled. Thus, the
build task depends on the compileJava task.

This means that the compileJava task will run before the build task:

$ ./gradlew build

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

BUILD SUCCESSFUL in 764ms


7 actionable tasks: 7 executed

Build scripts can optionally define task dependencies. Gradle then automatically determines the
task execution order.
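
If you write your own tasks, such a dependency can be declared explicitly with dependsOn. A minimal sketch (the task names are illustrative):

build.gradle.kts

tasks.register("hello") {
    doLast {
        println("Hello")
    }
}

tasks.register("greet") {
    dependsOn("hello") // guarantees that "hello" runs before "greet"
    doLast {
        println("Greetings")
    }
}

Running ./gradlew greet will execute hello first and then greet.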

Consult the Task development chapter to learn more.

Next Step: Learn about Plugins >>

Plugin Basics
Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a
sophisticated dependency resolution engine. The rest of its functionality comes from plugins.

A plugin is a piece of software that provides additional functionality to the Gradle build system.

Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-
related capabilities:

The Java Library Plugin - java-library
Used to define and build Java libraries. It compiles Java source code with the compileJava task, generates Javadoc with the javadoc task, and packages the compiled classes into a JAR file with the jar task.

The Google Services Gradle Plugin - com.google.gms:google-services
Enables Google APIs and Firebase services in your Android application with a configuration block called googleServices{} and a task called generateReleaseAssets.

The Gradle Bintray Plugin - com.jfrog.bintray
Allows you to publish artifacts to Bintray by configuring the plugin using the bintray{} block.

Plugin distribution

Plugins are distributed in three ways:

1. Core plugins - Gradle develops and maintains a set of Core Plugins.

2. Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.

3. Local plugins - Gradle enables users to create custom plugins using APIs.

Applying plugins

Applying a plugin to a project allows the plugin to extend the project’s capabilities.

You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a
version:

plugins {
id «plugin id» version «plugin version»
}

1. Core plugins

Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These
plugins provide essential functionality for building and managing projects.

Some examples of core plugins include:

• java: Provides support for building Java projects.

• groovy: Adds support for compiling and testing Groovy source files.

• ear: Adds support for building EAR files for enterprise applications.

Core plugins are unique in that they provide short names, such as java for the core JavaPlugin,
when applied in build scripts. They also do not require versions. To apply the java plugin to a
project:

build.gradle.kts

plugins {
id("java")
}

There are many Gradle Core Plugins users can take advantage of.

2. Community plugins

Community plugins are plugins developed by the Gradle community, rather than being part of the
core Gradle distribution. These plugins provide additional functionality that may be specific to
certain use cases or technologies.

The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java
applications.

To apply the org.springframework.boot plugin to a project:

build.gradle.kts

plugins {
id("org.springframework.boot") version "3.1.5"
}

Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can
easily discover and use them.

3. Local plugins

Custom or local plugins are developed and used within a specific project or organization. These
plugins are not shared publicly and are tailored to the specific needs of the project or organization.

Local plugins can encapsulate common build logic, provide integrations with internal systems or
tools, or abstract complex functionality into reusable components.

Gradle provides users with the ability to develop custom plugins using APIs. To create your own
plugin, you’ll typically follow these steps:

1. Define the plugin class: create a new class that implements the Plugin<Project> interface.

// Define a 'HelloPlugin' plugin
class HelloPlugin : Plugin<Project> {
override fun apply(project: Project) {
// Define the 'hello' task
val helloTask = project.tasks.register("hello") {
doLast {
println("Hello, Gradle!")
}
}
}
}

2. Build and optionally publish your plugin: generate a JAR file containing your plugin code and
optionally publish this JAR to a repository (local or remote) to be used in other projects.
// Publish the plugin
plugins {
`maven-publish`
}

publishing {
publications {
create<MavenPublication>("mavenJava") {
from(components["java"])
}
}
repositories {
mavenLocal()
}
}

3. Apply your plugin: when you want to use the plugin, include the plugin ID and version in the
plugins{} block of the build file.

// Apply the plugin
plugins {
id("com.example.hello") version "1.0"
}

Consult the Plugin development chapter to learn more.

Next Step: Learn about Incremental Builds and Build Caching >>

Gradle Incremental Builds and Build Caching


<div class="badge-wrapper">
<a class="badge" href="https://dpeuniversity.gradle.com/app/courses/ec69d0b8-9171-
4969-ac3e-82dea16f87b0/" target="_blank">
<span class="badge-type button--blue">LEARN</span>
<span class="badge-text">Incremental Builds and Build Caching with
Gradle&nbsp;&nbsp;&nbsp;&gt;</span>
</a>
</div>

Gradle uses two main features to reduce build time: incremental builds and build caching.

Incremental builds

An incremental build is a build that avoids running tasks whose inputs have not changed since the
previous build. Re-executing such tasks is unnecessary if they would only reproduce the same
output.

For incremental builds to work, tasks must define their inputs and outputs. Gradle will determine whether the inputs or outputs have changed at build time. If they have changed, Gradle will execute the task. Otherwise, it will skip execution.
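
As a minimal sketch, a custom task might declare a single input file and a single output file so Gradle can decide whether it is up to date (the task class, task name, and file names are illustrative):

build.gradle.kts

abstract class UpperCaseFile : DefaultTask() {
    @get:InputFile
    abstract val source: RegularFileProperty

    @get:OutputFile
    abstract val target: RegularFileProperty

    @TaskAction
    fun run() {
        // Re-executed only when the declared input or output changes
        target.get().asFile.writeText(source.get().asFile.readText().uppercase())
    }
}

tasks.register<UpperCaseFile>("upperCaseFile") {
    source.set(layout.projectDirectory.file("input.txt"))
    target.set(layout.buildDirectory.file("output.txt"))
}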

Incremental builds are always enabled, and the best way to see them in action is to turn on verbose
mode. With verbose mode, each task state is labeled during a build:

$ ./gradlew compileJava --console=verbose

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE


> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava UP-TO-DATE
BUILD SUCCESSFUL in 374ms
12 actionable tasks: 12 up-to-date

When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is
printed next to the task.

TIP To permanently enable verbose mode, add org.gradle.console=verbose to your gradle.properties file.

Build caching

Incremental Builds are a great optimization that helps avoid work already done. If a developer
continuously changes a single file, there is likely no need to rebuild all the other files in the project.

However, what happens when the same developer switches to a new branch created last week? The
files are rebuilt, even though the developer is building something that has been built before.

This is where a build cache is helpful.

The build cache stores previous build results and restores them when needed. It prevents the
redundant work and cost of executing time-consuming and expensive processes.
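
The build cache is enabled per invocation with the --build-cache flag (as in the example below), or persistently by setting org.gradle.caching=true in gradle.properties. The local cache can also be configured in the settings file; a minimal sketch:

settings.gradle.kts

buildCache {
    local {
        isEnabled = true // the local cache is enabled by default; shown here for illustration
    }
}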

When the build cache has been used to repopulate the local directory, the tasks are marked as FROM-
CACHE:

$ ./gradlew compileJava --build-cache

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE


> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava FROM-CACHE
> Task :utilities:compileJava FROM-CACHE
> Task :app:compileJava FROM-CACHE

BUILD SUCCESSFUL in 364ms


12 actionable tasks: 3 from cache, 9 up-to-date

Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and
not FROM-CACHE.
The build cache allows you to share and reuse unchanged build and test outputs across teams. This
speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new
code changes.

Consult the Build cache chapter to learn more.

Next Step: Learn about Build Scans >>

Build Scans
<div class="badge-wrapper">
<a class="badge" href="https://dpeuniversity.gradle.com/app/courses/b5069222-cfd0-
4393-b645-7a2c713853d5/" target="_blank">
<span class="badge-type button--blue">LEARN</span>
<span class="badge-text">How to Use Build Scans&nbsp;&nbsp;&nbsp;&gt;</span>
</a>
</div>

A build scan is a representation of metadata captured as you run your build.

Gradle captures your build metadata and sends it to the Build Scan Service. The service then
transforms the metadata into information you can analyze and share with others.
The information that scans collect can be an invaluable resource when troubleshooting,
collaborating on, or optimizing the performance of your builds.

For example, with a build scan, it’s no longer necessary to copy and paste error messages or include
all the details about your environment each time you want to ask a question on Stack Overflow,
Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.

Enable Build Scans

To enable build scans for a Gradle invocation, add the --scan option to the command line:

./gradlew build --scan

You may be prompted to agree to the terms to use Build Scans.

Visit the Build Scans page to learn more.

Next Step: Start the Tutorial >>


OTHER TOPICS
Continuous Builds
Continuous Build allows you to automatically re-execute the requested tasks when file inputs
change. You can execute the build in this mode using the -t or --continuous command-line option.

For example, you can continuously run the test task and all dependent tasks by running:

$ gradle test --continuous

Gradle will behave as if you ran gradle test after a change to sources or tests that contribute to the
requested tasks. This means unrelated changes (such as changes to build scripts) will not trigger a
rebuild. To incorporate build logic changes, the continuous build must be restarted manually.

Continuous build uses file system watching to detect changes to the inputs. If file system watching
does not work on your system, then continuous build won’t work either. In particular, continuous
build does not work when using --no-daemon.

When Gradle detects a change to the inputs, it will not trigger the build immediately. Instead, it will
wait until no additional changes are detected for a certain period of time - the quiet period. You can
configure the quiet period in milliseconds by the Gradle property
org.gradle.continuous.quietperiod.

Terminating Continuous Build

If Gradle is attached to an interactive input source, such as a terminal, the continuous build can be
exited by pressing CTRL-D (On Microsoft Windows, it is required to also press ENTER or RETURN after
CTRL-D).

If Gradle is not attached to an interactive input source (e.g. is running as part of a script), the build
process must be terminated (e.g. using the kill command or similar).

If the build is being executed via the Tooling API, the build can be cancelled using the Tooling API’s
cancellation mechanism.

Limitations

Under some circumstances, continuous build may not detect changes to inputs.

Creating input directories

Sometimes, creating an input directory that was previously missing does not trigger a build, due to
the way file system watching works. For example, creating the src/main/java directory may not
trigger a build. Similarly, if the input is a filtered file tree and no files are matching the filter, the
creation of matching files may not trigger a build.

Inputs of untracked tasks

Changes to the inputs of untracked tasks or tasks that have no outputs may not trigger a build.

Changes to files outside of project directories

Gradle only watches for changes to files inside the project directory. Changes to files outside the
project directory will go undetected and not trigger a build.

Build cycles

Gradle starts watching for changes just before a task executes. If a task modifies its own inputs
while executing, Gradle will detect the change and trigger a new build. If every time the task
executes, the inputs are modified again, the build will be triggered again. This isn’t unique to
continuous build. A task that modifies its own inputs will never be considered up-to-date when run
"normally" without continuous build.

If your build enters a build cycle like this, you can track down the task by looking at the list of files
reported changed by Gradle. After identifying the file(s) that are changed during each build, you
should look for a task that has that file as an input. In some cases, it may be obvious (e.g., a Java file
is compiled with compileJava). In other cases, you can use --info logging to find the task that is out-
of-date due to the identified files.

AUTHORING GRADLE BUILDS
THE BASICS
Gradle Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory
and the Project Root directory.

Gradle User Home directory

By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global configuration properties, initialization scripts, caches, and log files.

It can be set with the environment variable GRADLE_USER_HOME.

TIP Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.

It is roughly structured as follows:

├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧

① Global cache directory (for everything that is not project-specific).

② Version-specific caches (e.g., to support incremental builds).

③ Shared caches (e.g., for artifacts of dependencies).

④ Registry and logs of the Gradle Daemon.

⑤ Global initialization scripts.

⑥ JDKs downloaded by the toolchain support.

⑦ Distributions downloaded by the Gradle Wrapper.

⑧ Global Gradle configuration properties.

Consult the Gradle Directories reference to learn more.

Project Root directory

The project root directory contains all source files from your project.

It also contains files and directories Gradle generates, such as .gradle and build.

Neither is usually checked into source control: .gradle holds project-specific caches, while the build directory contains the output of your builds as well as transient files Gradle uses to support features like incremental builds.

The anatomy of a typical project root directory looks as follows:

├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle.kts ⑦
├── subproject-one ⑧
| └── build.gradle.kts ⑨
├── subproject-two ⑧
| └── build.gradle.kts ⑨
└── ⋮

① Project-specific cache directory generated by Gradle.

② Version-specific caches (e.g., to support incremental builds).

③ The build directory of this project into which Gradle generates all build artifacts.

④ Contains the JAR file and configuration of the Gradle Wrapper.

⑤ Project-specific Gradle configuration properties.

⑥ Scripts for executing builds using the Gradle Wrapper.

⑦ The project’s settings file where the list of subprojects is defined.

⑧ Usually, a project is organized into one or multiple subprojects.

⑨ Each subproject has its own Gradle build script.

Consult the Gradle Directories reference to learn more.

Next Step: Learn how to structure Multi-Project Builds >>

Multi-Project Build Basics


Gradle supports multi-project builds.

While some small projects and monolithic applications may contain a single build file and source tree, it is more common for a project to be split into smaller, interdependent modules. The word "interdependent" is vital, as you typically want to link the many modules together through a single build.

Gradle supports this scenario through multi-project builds. This is sometimes referred to as a multi-module project. Gradle refers to modules as subprojects.

A multi-project build consists of one root project and one or more subprojects.

Multi-Project structure

The following directory structure represents a multi-project build that contains three subprojects:

├── .gradle
│ └── ⋮
├── gradle
│ ├── libs.versions.toml
│ └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts ①
├── sub-project-1
│ └── build.gradle.kts ②
├── sub-project-2
│ └── build.gradle.kts ②
└── sub-project-3
└── build.gradle.kts ②

① The settings.gradle.kts file should include all subprojects.

② Each subproject should have its own build.gradle.kts file.

Multi-Project standards

The Gradle community has two standards for multi-project build structures:

1. Multi-Project Builds using buildSrc - where buildSrc is a subproject-like directory at the Gradle project root containing all the build logic.

2. Composite Builds - a build that includes other builds where build-logic is a build directory at
the Gradle project root containing reusable build logic.

1. Multi-Project Builds using buildSrc

Multi-project builds allow you to organize projects with many modules, wire dependencies between
those modules, and easily share common build logic amongst them.

For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation
could be structured as follows:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts

The modules will have dependencies between them such as web-app and mobile-app depending on
lib. This means that in order for Gradle to build web-app or mobile-app, it must build lib first.

In this example, the root settings file will look as follows:

settings.gradle.kts

include("mobile-app", "web-app", "api", "lib", "documentation")

NOTE The order in which the subprojects (modules) are included does not matter.

The buildSrc directory is automatically recognized by Gradle. It is a good place to define and
maintain shared configuration or imperative build logic, such as custom tasks or plugins.

buildSrc is automatically included in your build as a special subproject if a build.gradle(.kts) file is found under buildSrc.

If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java
is put in the classpath of the root build script, making it available to any subproject (web-app, mobile-
app, lib, etc…) in the build.
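
As a rough sketch, the shared-build-conventions.gradle.kts file in the tree above could be a precompiled script plugin that subprojects apply by name. This assumes buildSrc/build.gradle.kts applies the kotlin-dsl plugin:

buildSrc/src/main/kotlin/shared-build-conventions.gradle.kts

plugins {
    id("java")
}

tasks.withType<Test>().configureEach {
    useJUnitPlatform()
}

A subproject such as mobile-app can then apply it with plugins { id("shared-build-conventions") }.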

Consult how to declare dependencies between subprojects to learn more.

2. Composite Builds

Composite Builds, also referred to as included builds, are best for sharing logic between builds (not
subprojects) or isolating access to shared build logic (i.e., convention plugins).

Let’s take the previous example. The logic in buildSrc has been turned into a project that contains
plugins and can be published and worked on independently of the root project build.

The plugin is moved to its own build called build-logic with a build script and settings file:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── build-logic
│ ├── settings.gradle.kts
│ └── conventions
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts

NOTE The fact that build-logic is located in a subdirectory of the root project is irrelevant. The folder could be located outside the root project if desired.

The root settings file includes the entire build-logic build:

settings.gradle.kts

pluginManagement {
includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")

Consult how to create composite builds with includeBuild to learn more.

Multi-Project path

A project path has the following pattern: it starts with an optional colon, which denotes the root
project.

The root project, :, is the only project in a path not specified by its name.

The rest of a project path is a colon-separated sequence of project names, where the next project is
a subproject of the previous project:

:sub-project-1

You can see the project paths when running gradle projects:

------------------------------------------------------------
Root project 'project'
------------------------------------------------------------

Root project 'project'


+--- Project ':sub-project-1'
\--- Project ':sub-project-2'

Project paths usually reflect the filesystem layout, but there are exceptions, most notably for composite builds.

Identifying project structure

You can use the gradle projects command to identify the project structure.

As an example, let’s use a multi-project build with the following structure:


> gradle -q projects

Projects:

------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------

Root project 'multiproject'


+--- Project ':api'
+--- Project ':services'
| +--- Project ':services:shared'
| \--- Project ':services:webservice'
\--- Project ':shared'

To see a list of the tasks of a project, run gradle <project-path>:tasks. For example, try running gradle :api:tasks.

Like a single-project build, a multi-project build is a collection of tasks you can run. The difference is that you may want to control which project's tasks get executed.

The following sections will cover your two options for executing tasks in a multi-project build.

Executing tasks by name

The command gradle test will execute the test task in any subprojects relative to the current
working directory that has that task.

If you run the command from the root project directory, you will run test in api, shared,
services:shared and services:webservice.

If you run the command from the services project directory, you will only execute the task in
services:shared and services:webservice.

The basic rule behind Gradle's behavior is to execute all tasks down the hierarchy that have this name, and to complain if no such task is found in any of the subprojects traversed.

NOTE Some task selectors, like help or dependencies, will only run the task on the project they are invoked on and not on all the subprojects, to reduce the amount of information printed on the screen.

Executing tasks by fully qualified name

You can use a task’s fully qualified name to execute a specific task in a particular subproject. For
example: gradle :services:webservice:build will run the build task of the webservice subproject.

The fully qualified name of a task is its project path plus the task name.
This approach works for any task, so if you want to know what tasks are in a particular subproject,
use the tasks task, e.g. gradle :services:webservice:tasks.

Multi-Project building and testing

The build task is typically used to compile, test, and check a single project.

In multi-project builds, you may often want to do all of these tasks across various projects. The
buildNeeded and buildDependents tasks can help with this.

In this example, the :services:person-service project depends on both the :api and :shared
projects. The :api project also depends on the :shared project.

Assuming you are working on a single project, the :api project, you have been making changes but
have not built the entire project since performing a clean. You want to build any necessary
supporting JARs but only perform code quality and unit tests on the parts of the project you have
changed.

The build task does this:

$ gradle :api:build

> Task :shared:compileJava


> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build

BUILD SUCCESSFUL in 0s

If you have just gotten the latest version of the source from your version control system, which
included changes in other projects that :api depends on, you might want to build all the projects
you depend on AND test them too.

The buildNeeded task builds AND tests all the projects from the project dependencies of the
testRuntime configuration:
$ gradle :api:buildNeeded

> Task :shared:compileJava


> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :shared:assemble
> Task :shared:compileTestJava
> Task :shared:processTestResources
> Task :shared:testClasses
> Task :shared:test
> Task :shared:check
> Task :shared:build
> Task :shared:buildNeeded
> Task :api:buildNeeded

BUILD SUCCESSFUL in 0s

You may want to refactor some part of the :api project used in other projects. If you make these
changes, testing only the :api project is insufficient. You must test all projects that depend on the
:api project.

The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime
configuration) on the specified project:

$ gradle :api:buildDependents

> Task :shared:compileJava


> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :services:person-service:compileJava
> Task :services:person-service:processResources
> Task :services:person-service:classes
> Task :services:person-service:jar
> Task :services:person-service:assemble
> Task :services:person-service:compileTestJava
> Task :services:person-service:processTestResources
> Task :services:person-service:testClasses
> Task :services:person-service:test
> Task :services:person-service:check
> Task :services:person-service:build
> Task :services:person-service:buildDependents
> Task :api:buildDependents

BUILD SUCCESSFUL in 0s

Finally, you can build and test everything in all projects. Any task you run in the root project folder
will cause that same-named task to be run on all the children.

You can run gradle build to build and test ALL projects.

Consult the Structuring Builds chapter to learn more.

Next Step: Learn about the Gradle Build Lifecycle >>

Build Lifecycle
As a build author, you define tasks and dependencies between tasks. Gradle guarantees that these
tasks will execute in order of their dependencies.

Your build scripts and plugins configure this dependency graph.

For example, if your project tasks include build, assemble, and createDocs, your build script(s) can ensure that they are executed in the order build → assemble → createDocs.

Task Graphs

Gradle builds the task graph before executing any task.


Across all projects in the build, tasks form a Directed Acyclic Graph (DAG).

This diagram shows two example task graphs, one abstract and the other concrete, with
dependencies between tasks represented as arrows:

Both plugins and build scripts contribute to the task graph via the task dependency mechanism and
annotated inputs/outputs.

Build Phases

A Gradle build has three distinct phases.

Gradle runs these phases in order:

Phase 1. Initialization
• Detects the settings.gradle(.kts) file.

• Creates a Settings instance.

• Evaluates the settings file to determine which projects (and included builds) make up the
build.

• Creates a Project instance for every project.


Phase 2. Configuration
• Evaluates the build scripts, build.gradle(.kts), of every project participating in the build.

• Creates a task graph for requested tasks.

Phase 3. Execution
• Schedules and executes the selected tasks.

• Dependencies between tasks determine execution order.

• Execution of tasks can occur in parallel.

Example

The following example shows which parts of settings and build files correspond to various build
phases:

settings.gradle.kts

rootProject.name = "basic"
println("This is executed during the initialization phase.")

build.gradle.kts

println("This is executed during the configuration phase.")


tasks.register("configured") {
println("This is also executed during the configuration phase, because
:configured is used in the build.")
}

tasks.register("test") {
doLast {
println("This is executed during the execution phase.")
}
}

tasks.register("testBoth") {
doFirst {
println("This is executed first during the execution phase.")
}
doLast {
println("This is executed last during the execution phase.")
}
println("This is executed during the configuration phase as well, because
:testBoth is used in the build.")
}

settings.gradle

rootProject.name = 'basic'
println 'This is executed during the initialization phase.'

build.gradle

println 'This is executed during the configuration phase.'

tasks.register('configured') {
println 'This is also executed during the configuration phase, because
:configured is used in the build.'
}

tasks.register('test') {
doLast {
println 'This is executed during the execution phase.'
}
}

tasks.register('testBoth') {
doFirst {
println 'This is executed first during the execution phase.'
}
doLast {
println 'This is executed last during the execution phase.'
}
println 'This is executed during the configuration phase as well, because
:testBoth is used in the build.'
}

The following command executes the test and testBoth tasks specified above. Because Gradle only configures requested tasks and their dependencies, the configured task is never configured:

> gradle test testBoth


This is executed during the initialization phase.

> Configure project :


This is executed during the configuration phase.
This is executed during the configuration phase as well, because :testBoth is used in
the build.

> Task :test


This is executed during the execution phase.

> Task :testBoth


This is executed first during the execution phase.
This is executed last during the execution phase.

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

Phase 1. Initialization

In the initialization phase, Gradle detects the set of projects (root and subprojects) and included
builds participating in the build.

Gradle first evaluates the settings file, settings.gradle(.kts), and instantiates a Settings object.
Then, Gradle instantiates Project instances for each project.

Phase 2. Configuration

In the configuration phase, Gradle adds tasks and other properties to the projects found by the
initialization phase.

Phase 3. Execution

In the execution phase, Gradle runs tasks.

Gradle uses the task execution graphs generated by the configuration phase to determine which
tasks to execute.

Next Step: Learn how to write Settings files >>

Writing Settings Files


The settings file is the entry point of every Gradle build.

Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root
directory.

When the settings file settings.gradle(.kts) is found, Gradle instantiates a Settings object.

One of the purposes of the Settings object is to allow you to declare all the projects to be included in the build.

Settings Scripts

The settings script is either a settings.gradle file in Groovy or a settings.gradle.kts file in Kotlin.

Before Gradle assembles the projects for a build, it creates a Settings instance and executes the
settings file against it.

As the settings script executes, it configures this Settings. Therefore, the settings file defines the
Settings object.

IMPORTANT There is a one-to-one correspondence between a Settings instance and a settings.gradle(.kts) file.

The Settings Object

The Settings object is part of the Gradle API.

• In the Groovy DSL, the Settings object documentation is found here.

• In the Kotlin DSL, the Settings object documentation is found here.

Many top-level properties and blocks in a settings script are part of the Settings API.

For example, we can set the root project name in the settings script using the Settings.rootProject
property:

settings.rootProject.name = "root"

Which is usually shortened to:

rootProject.name = "root"

Standard Settings properties

The Settings object exposes a standard set of properties in your settings script.

The following table lists a few commonly used properties:

Name Description
buildCache The build cache configuration.
plugins The container of plugins that have been applied to the settings.
rootDir The root directory of the build. The root directory is the project directory of the root project.
rootProject The root project of the build.
settings Returns this settings object.

The following table lists a few commonly used methods:

Name Description
include() Adds the given projects to the build.
includeBuild() Includes a build at the specified path to the composite build.

Settings Script structure

A Settings script is a series of method calls to the Gradle API that often use { … }, a special
shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a
closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:

plugins(function() {
id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the corresponding method. The this object of the id("plugin") method invocation is of type PluginDependenciesSpec.

The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script
line by line, top to bottom.

Let’s take a look at an example and break it down:

settings.gradle.kts

pluginManagement { ①
repositories {
gradlePluginPortal()
google()
}
}

plugins { ②
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

rootProject.name = "root-project" ③

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")

① Define the location of plugins

② Apply settings plugins.

③ Define the root project name.

④ Define dependency resolution strategies.

⑤ Add subprojects to the build.

settings.gradle

pluginManagement { ①
repositories {
gradlePluginPortal()
google()
}
}

plugins { ②
id 'org.gradle.toolchains.foojay-resolver-convention' version '0.8.0'
}

rootProject.name = 'root-project' ③

dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}

include('sub-project-a') ⑤
include('sub-project-b')
include('sub-project-c')

① Define the location of plugins.

② Apply settings plugins.

③ Define the root project name.

④ Define dependency resolution strategies.

⑤ Add subprojects to the build.

1. Define the location of plugins

The settings file can optionally manage plugin versions and repositories for your build with pluginManagement. It provides a centralized way to define which plugins should be used in your project and from which repositories they should be resolved.

pluginManagement {
repositories {
gradlePluginPortal()
google()
}
}

2. Apply settings plugins

The settings file can optionally apply plugins that are required for configuring the settings of the project. Common examples are the Develocity plugin and the Toolchain Resolver plugin, the latter of which is applied in the example below.

Plugins applied in the settings file only affect the Settings object.

plugins {
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

3. Define the root project name

The settings file defines your project name using the rootProject.name property:

rootProject.name = "root-project"

There is only one root project per build.


4. Define dependency resolution strategies

The settings file can optionally define rules and configurations for dependency resolution across
your project(s). It provides a centralized way to manage and customize dependency resolution.

dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.PREFER_PROJECT)
repositories {
mavenCentral()
}
}

You can also include version catalogs in this section.
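
A catalog can also be declared directly in this block rather than in a gradle/libs.versions.toml file; a minimal sketch (the alias is illustrative):

dependencyResolutionManagement {
    versionCatalogs {
        create("libs") {
            library("guava", "com.google.guava:guava:32.1.1-jre")
        }
    }
}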

5. Add subprojects to the build

The settings file defines the structure of the project by adding all the subprojects using the include
statement:

include("app")
include("business-logic")
include("data-model")

You can also include entire builds using includeBuild.

Settings File Scripting

There are many more properties and methods on the Settings object that you can use to configure
your build.

It’s important to remember that while many Gradle scripts are typically written in short Groovy or
Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings
object in the Gradle API:

include("app")

Is actually:

settings.include("app")

Additionally, the full power of the Groovy and Kotlin languages is available to you.

For example, instead of using include many times to add subprojects, you can iterate over the list of
directories in the project root folder and include them automatically:

rootDir.listFiles()?.filter { it.isDirectory && it.resolve("build.gradle.kts").exists() }?.forEach {
    include(it.name)
}

TIP This type of logic should be developed in a plugin.

Next Step: Learn how to write Build scripts >>

Writing Build Scripts


The initialization phase in the Gradle Build lifecycle finds the root project and subprojects included
in your project root directory using the settings file.

Then, for each project included in the settings file, Gradle creates a Project instance.

Gradle then looks for a corresponding build script file, which is used in the configuration phase.

Build Scripts

Every Gradle build comprises one or more projects; a root project and subprojects.

A project typically corresponds to a software component that needs to be built, like a library or an
application. It might represent a library JAR, a web application, or a distribution ZIP assembled
from the JARs produced by other projects.

On the other hand, it might represent a thing to be done, such as deploying your application to
staging or production environments.

Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).

A build script configures a project and is associated with an object of type Project.
As the build script executes, it configures Project.

The build script is either a *.gradle file in Groovy or a *.gradle.kts file in Kotlin.

IMPORTANT Build scripts configure Project objects and their children.

The Project object

The Project object is part of the Gradle API:

• In the Groovy DSL, the Project object documentation is found here.

• In the Kotlin DSL, the Project object documentation is found here.

Many top-level properties and blocks in a build script are part of the Project API.

For example, the following build script uses the Project.name property to print the name of the
project:

build.gradle.kts

println(name)
println(project.name)

build.gradle

println name
println project.name

$ gradle -q check
project-api
project-api

Both println statements print out the same property.

The first uses the top-level reference to the name property of the Project object. The second
statement uses the project property available to any build script, which returns the associated
Project object.

Standard project properties

The Project object exposes a standard set of properties in your build script.

The following table lists a few commonly used properties:

Name Type Description
name String The name of the project directory.
path String The fully qualified name of the project.
description String A description for the project.
dependencies DependencyHandler Returns the dependency handler of the project.
repositories RepositoryHandler Returns the repository handler of the project.
layout ProjectLayout Provides access to several important locations for a project.
group Object The group of this project.
version Object The version of this project.

The following table lists a few commonly used methods:

Name Description
uri() Resolves a file path to a URI, relative to the project directory of this project.
task() Creates a Task with the given name and adds it to this project.
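
For instance, a build script can read some of the properties listed above directly; a minimal sketch:

build.gradle.kts

println(name)                    // the project directory name
println(layout.projectDirectory) // the project directory location
println(version)                 // "unspecified" unless the build sets a version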

Build Script structure

Like the settings script, the build script is a series of method calls to the Gradle API that often use { … }, a special shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:

plugins(function() {
id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the corresponding method. The this object of the id("plugin") method invocation is of type PluginDependenciesSpec.

The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes
the script line by line, top to bottom.

Let’s take a look at an example and break it down:


build.gradle.kts

plugins { ①
id("org.jetbrains.kotlin.jvm") version "1.9.0"
id("application")
}

repositories { ②
mavenCentral()
}

dependencies { ③
testImplementation("org.jetbrains.kotlin:kotlin-test-junit5")
testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
testRuntimeOnly("org.junit.platform:junit-platform-launcher")
implementation("com.google.guava:guava:32.1.1-jre")
}

application { ④
mainClass = "com.example.Main"
}

tasks.named<Test>("test") { ⑤
useJUnitPlatform()
}

① Apply plugins to the build.

② Define the locations where dependencies can be found.

③ Add dependencies.

④ Set properties.

⑤ Register and configure tasks.

build.gradle

plugins { ①
id 'org.jetbrains.kotlin.jvm' version '1.9.0'
id 'application'
}

repositories { ②
mavenCentral()
}

dependencies { ③
testImplementation 'org.jetbrains.kotlin:kotlin-test-junit5'
testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
implementation 'com.google.guava:guava:32.1.1-jre'
}

application { ④
mainClass = 'com.example.Main'
}

tasks.named('test') { ⑤
useJUnitPlatform()
}

① Apply plugins to the build.

② Define the locations where dependencies can be found.

③ Add dependencies.

④ Set properties.

⑤ Register and configure tasks.

1. Apply plugins to the build

Plugins are used to extend Gradle. They are also used to modularize and reuse project
configurations.

Plugins can be applied using the PluginDependenciesSpec plugins script block.

The plugins block is preferred:

plugins {
id("org.jetbrains.kotlin.jvm") version "1.9.0"
id("application")
}

In the example, the application plugin, which is included with Gradle, has been applied, describing
our project as a Java application.

The Kotlin Gradle plugin, version 1.9.0, has also been applied. This plugin is not included with
Gradle and, therefore, has to be described using a plugin id and a plugin version so that Gradle can
find and apply it.

2. Define the locations where dependencies can be found

A project generally has a number of dependencies it needs to do its work. Dependencies include
plugins, libraries, or components that Gradle must download for the build to succeed.

The build script lets Gradle know where to look for the binaries of the dependencies. More than one
location can be provided:
repositories {
mavenCentral()
google()
}

In the example, the guava library and the JetBrains Kotlin plugin (org.jetbrains.kotlin.jvm) will be
downloaded from the Maven Central Repository.

3. Add dependencies

A project generally has a number of dependencies it needs to do its work. These dependencies are
often libraries of precompiled classes that are imported in the project’s source code.

Dependencies are managed via configurations and are retrieved from repositories.

Use the DependencyHandler returned by the Project.getDependencies() method to manage the dependencies. Use the RepositoryHandler returned by the Project.getRepositories() method to manage the repositories.

dependencies {
implementation("com.google.guava:guava:32.1.1-jre")
}

In the example, the application code uses Google’s guava libraries. Guava provides utility methods
for collections, caching, primitives support, concurrency, common annotations, string processing,
I/O, and validations.

4. Set properties

A plugin can add properties and methods to a project using extensions.

The Project object has an associated ExtensionContainer object that contains all the settings and
properties for the plugins that have been applied to the project.

In the example, the application plugin added an application property, which is used to detail the
main class of our Java application:

application {
mainClass = "com.example.Main"
}

5. Register and configure tasks

Tasks perform some basic piece of work, such as compiling classes, or running unit tests, or zipping
up a WAR file.

While tasks are typically defined in plugins, you may need to register or configure tasks in build
scripts.

Registering a task adds the task to your project.

You can register tasks in a project using the TaskContainer.register(java.lang.String) method:

tasks.register<Zip>("zip-reports") {
from 'Reports/'
include '*'
archiveName 'Reports.zip'
destinationDir(file('/dir'))
}

You may have seen usage of the TaskContainer.create(java.lang.String) method which should be
avoided:

tasks.create<Zip>("zip-reports") {
from 'Reports/'
include '*'
archiveName 'Reports.zip'
destinationDir(file('/dir'))
}

TIP register(), which enables task configuration avoidance, is preferred over create().

You can locate a task to configure it using the TaskCollection.named(java.lang.String) method:

tasks.named<Test>("test") {
useJUnitPlatform()
}

The example below configures the javadoc task, which generates HTML documentation from Java code, to exclude some internal sources:

tasks.named("javadoc").configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
exclude 'app/internal/*'
}

Build Scripting

A build script is made up of zero or more statements and script blocks:


println(project.layout.projectDirectory);

Statements can include method calls, property assignments, and local variable definitions:

version = '1.0.0.GA'

A script block is a method call which takes a closure/lambda as a parameter:

configurations {
}

The closure/lambda configures some delegate object as it executes:

repositories {
google()
}

A build script is also a Groovy or a Kotlin script:

build.gradle.kts

tasks.register("upper") {
doLast {
val someString = "mY_nAmE"
println("Original: $someString")
println("Upper case: ${someString.toUpperCase()}")
}
}

build.gradle

tasks.register('upper') {
doLast {
String someString = 'mY_nAmE'
println "Original: $someString"
println "Upper case: ${someString.toUpperCase()}"
}
}

$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME

It can contain elements allowed in a Groovy or Kotlin script, such as method definitions and class
definitions:

build.gradle.kts

tasks.register("count") {
doLast {
repeat(4) { print("$it ") }
}
}

build.gradle

tasks.register('count') {
doLast {
4.times { print "$it " }
}
}

$ gradle -q count
0 1 2 3

Flexible task registration

Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:

build.gradle.kts

repeat(4) { counter ->
tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}
build.gradle

4.times { counter ->
tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}

$ gradle -q task1
I'm task number 1

Declare Variables

Build scripts can declare two kinds of variables: local variables and extra properties.

Local Variables

In the Kotlin DSL, declare local variables with the val keyword; in the Groovy DSL, use the def keyword. Local variables are only visible in the scope where they have been declared and are a feature of the underlying language.

build.gradle.kts

val dest = "dest"

tasks.register<Copy>("copy") {
from("source")
into(dest)
}

build.gradle

def dest = 'dest'

tasks.register('copy', Copy) {
from 'source'
into dest
}

Extra Properties

Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined
properties.

In the Kotlin DSL, add, read, and set extra properties via the owning object's extra property; alternatively, you can access them via Kotlin delegated properties using by extra. In the Groovy DSL, use the owning object's ext property; alternatively, you can use an ext block to add multiple properties simultaneously.

build.gradle.kts

plugins {
id("java-library")
}

val springVersion by extra("3.1.0.RELEASE")
val emailNotification by extra { "[email protected]" }

sourceSets.all { extra["purpose"] = null }

sourceSets {
main {
extra["purpose"] = "production"
}
test {
extra["purpose"] = "test"
}
create("plugin") {
extra["purpose"] = "production"
}
}

tasks.register("printProperties") {
val springVersion = springVersion
val emailNotification = emailNotification
val productionSourceSets = provider {
sourceSets.matching { it.extra["purpose"] == "production" }.map {
it.name }
}
doLast {
println(springVersion)
println(emailNotification)
productionSourceSets.get().forEach { println(it) }
}
}
build.gradle

plugins {
id 'java-library'
}

ext {
springVersion = "3.1.0.RELEASE"
emailNotification = "[email protected]"
}

sourceSets.all { ext.purpose = null }

sourceSets {
main {
purpose = "production"
}
test {
purpose = "test"
}
plugin {
purpose = "production"
}
}

tasks.register('printProperties') {
def springVersion = springVersion
def emailNotification = emailNotification
def productionSourceSets = provider {
sourceSets.matching { it.purpose == "production" }.collect { it.name
}
}
doLast {
println springVersion
println emailNotification
productionSourceSets.get().each { println it }
}
}

$ gradle -q printProperties
3.1.0.RELEASE
[email protected]
main
plugin

This example adds two extra properties to the project object via by extra. Additionally, this
example adds a property named purpose to each source set by setting extra["purpose"] to null. Once
added, you can read and set these properties via extra.

This example adds two extra properties to the project object via an ext block. Additionally, this
example adds a property named purpose to each source set by setting ext.purpose to null. Once
added, you can read and set all these properties just like predefined ones.

Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows
Gradle to recognize when a script attempts to set a property that does not exist. You can access
extra properties anywhere where you can access their owning object. This gives extra properties a
wider scope than local variables. Subprojects can access extra properties on their parent projects.
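
For example, a subproject can read an extra property declared on the root project. Here is a
minimal sketch (Kotlin DSL); the app subproject is hypothetical, and the property matches the
springVersion example above:

build.gradle.kts

val springVersion by extra("3.1.0.RELEASE")

app/build.gradle.kts

tasks.register("printSpringVersion") {
    // Read the extra property declared on the parent (root) project
    val springVersion = rootProject.extra["springVersion"] as String
    doLast {
        println("Spring version from the root project: $springVersion")
    }
}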

For more information about extra properties, see ExtraPropertiesExtension in the API
documentation.

Configure Arbitrary Objects

The greet() task below shows an example of arbitrary object configuration:

build.gradle.kts

class UserInfo(
var name: String? = null,
var email: String? = null
)

tasks.register("greet") {
val user = UserInfo().apply {
name = "Isaac Newton"
email = "[email protected]"
}
doLast {
println(user.name)
println(user.email)
}
}

build.gradle

class UserInfo {
String name
String email
}

tasks.register('greet') {
def user = configure(new UserInfo()) {
name = "Isaac Newton"
email = "[email protected]"
}
doLast {
println user.name
println user.email
}
}

$ gradle -q greet
Isaac Newton
[email protected]

Closure Delegates

Each closure has a delegate object. Groovy uses this delegate to look up variable and method
references to nonlocal variables and closure parameters. Gradle uses this for configuration closures,
where the delegate object refers to the object being configured.

build.gradle

dependencies {
assert delegate == project.dependencies
testImplementation('junit:junit:4.13')
delegate.testImplementation('junit:junit:4.13')
}

Default imports

To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.

As a result, instead of writing throw new org.gradle.api.tasks.StopExecutionException(), you can
simply write throw new StopExecutionException().

Gradle implicitly adds the following imports to each script:

Gradle default imports

import org.gradle.*
import org.gradle.api.*
import org.gradle.api.artifacts.*
import org.gradle.api.artifacts.component.*
import org.gradle.api.artifacts.dsl.*
import org.gradle.api.artifacts.ivy.*
import org.gradle.api.artifacts.maven.*
import org.gradle.api.artifacts.query.*
import org.gradle.api.artifacts.repositories.*
import org.gradle.api.artifacts.result.*
import org.gradle.api.artifacts.transform.*
import org.gradle.api.artifacts.type.*
import org.gradle.api.artifacts.verification.*
import org.gradle.api.attributes.*
import org.gradle.api.attributes.java.*
import org.gradle.api.attributes.plugin.*
import org.gradle.api.cache.*
import org.gradle.api.capabilities.*
import org.gradle.api.component.*
import org.gradle.api.configuration.*
import org.gradle.api.credentials.*
import org.gradle.api.distribution.*
import org.gradle.api.distribution.plugins.*
import org.gradle.api.execution.*
import org.gradle.api.file.*
import org.gradle.api.flow.*
import org.gradle.api.initialization.*
import org.gradle.api.initialization.definition.*
import org.gradle.api.initialization.dsl.*
import org.gradle.api.initialization.resolve.*
import org.gradle.api.invocation.*
import org.gradle.api.java.archives.*
import org.gradle.api.jvm.*
import org.gradle.api.launcher.cli.*
import org.gradle.api.logging.*
import org.gradle.api.logging.configuration.*
import org.gradle.api.model.*
import org.gradle.api.plugins.*
import org.gradle.api.plugins.antlr.*
import org.gradle.api.plugins.catalog.*
import org.gradle.api.plugins.jvm.*
import org.gradle.api.plugins.quality.*
import org.gradle.api.plugins.scala.*
import org.gradle.api.problems.*
import org.gradle.api.project.*
import org.gradle.api.provider.*
import org.gradle.api.publish.*
import org.gradle.api.publish.ivy.*
import org.gradle.api.publish.ivy.plugins.*
import org.gradle.api.publish.ivy.tasks.*
import org.gradle.api.publish.maven.*
import org.gradle.api.publish.maven.plugins.*
import org.gradle.api.publish.maven.tasks.*
import org.gradle.api.publish.plugins.*
import org.gradle.api.publish.tasks.*
import org.gradle.api.reflect.*
import org.gradle.api.reporting.*
import org.gradle.api.reporting.components.*
import org.gradle.api.reporting.dependencies.*
import org.gradle.api.reporting.dependents.*
import org.gradle.api.reporting.model.*
import org.gradle.api.reporting.plugins.*
import org.gradle.api.resources.*
import org.gradle.api.services.*
import org.gradle.api.specs.*
import org.gradle.api.tasks.*
import org.gradle.api.tasks.ant.*
import org.gradle.api.tasks.application.*
import org.gradle.api.tasks.bundling.*
import org.gradle.api.tasks.compile.*
import org.gradle.api.tasks.diagnostics.*
import org.gradle.api.tasks.diagnostics.configurations.*
import org.gradle.api.tasks.incremental.*
import org.gradle.api.tasks.javadoc.*
import org.gradle.api.tasks.options.*
import org.gradle.api.tasks.scala.*
import org.gradle.api.tasks.testing.*
import org.gradle.api.tasks.testing.junit.*
import org.gradle.api.tasks.testing.junitplatform.*
import org.gradle.api.tasks.testing.testng.*
import org.gradle.api.tasks.util.*
import org.gradle.api.tasks.wrapper.*
import org.gradle.api.toolchain.management.*
import org.gradle.authentication.*
import org.gradle.authentication.aws.*
import org.gradle.authentication.http.*
import org.gradle.build.event.*
import org.gradle.buildconfiguration.tasks.*
import org.gradle.buildinit.*
import org.gradle.buildinit.plugins.*
import org.gradle.buildinit.tasks.*
import org.gradle.caching.*
import org.gradle.caching.configuration.*
import org.gradle.caching.http.*
import org.gradle.caching.local.*
import org.gradle.concurrent.*
import org.gradle.external.javadoc.*
import org.gradle.ide.visualstudio.*
import org.gradle.ide.visualstudio.plugins.*
import org.gradle.ide.visualstudio.tasks.*
import org.gradle.ide.xcode.*
import org.gradle.ide.xcode.plugins.*
import org.gradle.ide.xcode.tasks.*
import org.gradle.ivy.*
import org.gradle.jvm.*
import org.gradle.jvm.application.scripts.*
import org.gradle.jvm.application.tasks.*
import org.gradle.jvm.tasks.*
import org.gradle.jvm.toolchain.*
import org.gradle.language.*
import org.gradle.language.assembler.*
import org.gradle.language.assembler.plugins.*
import org.gradle.language.assembler.tasks.*
import org.gradle.language.base.*
import org.gradle.language.base.artifact.*
import org.gradle.language.base.compile.*
import org.gradle.language.base.plugins.*
import org.gradle.language.base.sources.*
import org.gradle.language.c.*
import org.gradle.language.c.plugins.*
import org.gradle.language.c.tasks.*
import org.gradle.language.cpp.*
import org.gradle.language.cpp.plugins.*
import org.gradle.language.cpp.tasks.*
import org.gradle.language.java.artifact.*
import org.gradle.language.jvm.tasks.*
import org.gradle.language.nativeplatform.*
import org.gradle.language.nativeplatform.tasks.*
import org.gradle.language.objectivec.*
import org.gradle.language.objectivec.plugins.*
import org.gradle.language.objectivec.tasks.*
import org.gradle.language.objectivecpp.*
import org.gradle.language.objectivecpp.plugins.*
import org.gradle.language.objectivecpp.tasks.*
import org.gradle.language.plugins.*
import org.gradle.language.rc.*
import org.gradle.language.rc.plugins.*
import org.gradle.language.rc.tasks.*
import org.gradle.language.scala.tasks.*
import org.gradle.language.swift.*
import org.gradle.language.swift.plugins.*
import org.gradle.language.swift.tasks.*
import org.gradle.maven.*
import org.gradle.model.*
import org.gradle.nativeplatform.*
import org.gradle.nativeplatform.platform.*
import org.gradle.nativeplatform.plugins.*
import org.gradle.nativeplatform.tasks.*
import org.gradle.nativeplatform.test.*
import org.gradle.nativeplatform.test.cpp.*
import org.gradle.nativeplatform.test.cpp.plugins.*
import org.gradle.nativeplatform.test.cunit.*
import org.gradle.nativeplatform.test.cunit.plugins.*
import org.gradle.nativeplatform.test.cunit.tasks.*
import org.gradle.nativeplatform.test.googletest.*
import org.gradle.nativeplatform.test.googletest.plugins.*
import org.gradle.nativeplatform.test.plugins.*
import org.gradle.nativeplatform.test.tasks.*
import org.gradle.nativeplatform.test.xctest.*
import org.gradle.nativeplatform.test.xctest.plugins.*
import org.gradle.nativeplatform.test.xctest.tasks.*
import org.gradle.nativeplatform.toolchain.*
import org.gradle.nativeplatform.toolchain.plugins.*
import org.gradle.normalization.*
import org.gradle.platform.*
import org.gradle.platform.base.*
import org.gradle.platform.base.binary.*
import org.gradle.platform.base.component.*
import org.gradle.platform.base.plugins.*
import org.gradle.plugin.devel.*
import org.gradle.plugin.devel.plugins.*
import org.gradle.plugin.devel.tasks.*
import org.gradle.plugin.management.*
import org.gradle.plugin.use.*
import org.gradle.plugins.ear.*
import org.gradle.plugins.ear.descriptor.*
import org.gradle.plugins.ide.*
import org.gradle.plugins.ide.api.*
import org.gradle.plugins.ide.eclipse.*
import org.gradle.plugins.ide.idea.*
import org.gradle.plugins.signing.*
import org.gradle.plugins.signing.signatory.*
import org.gradle.plugins.signing.signatory.pgp.*
import org.gradle.plugins.signing.type.*
import org.gradle.plugins.signing.type.pgp.*
import org.gradle.process.*
import org.gradle.swiftpm.*
import org.gradle.swiftpm.plugins.*
import org.gradle.swiftpm.tasks.*
import org.gradle.testing.base.*
import org.gradle.testing.base.plugins.*
import org.gradle.testing.jacoco.plugins.*
import org.gradle.testing.jacoco.tasks.*
import org.gradle.testing.jacoco.tasks.rules.*
import org.gradle.testkit.runner.*
import org.gradle.util.*
import org.gradle.vcs.*
import org.gradle.vcs.git.*
import org.gradle.work.*
import org.gradle.workers.*

Next Step: Learn how to use Tasks >>

Using Tasks
The work that Gradle can do on a project is defined by one or more tasks.
A task represents some independent unit of work that a build performs. This might be compiling
some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.

When a user runs ./gradlew build in the command line, Gradle will execute the build task along
with any other tasks it depends on.

List available tasks

Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks:

> Task :tasks

------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------

Build Setup tasks
-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project
'myTutorial'.
...

Tasks either come from build scripts or plugins.


Once we apply a plugin to our project, such as the application plugin, additional tasks become
available:

build.gradle.kts

plugins {
id("application")
}

$ ./gradlew tasks

> Task :tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.

Other tasks
-----------
compileJava - Compiles main Java source.

...

Many of these tasks, such as assemble, build, and run, should be familiar to a developer.

Task classification

There are two classes of tasks that can be executed:

1. Actionable tasks have some action(s) attached to do work in your build: compileJava.

2. Lifecycle tasks are tasks with no actions attached: assemble, build.

Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at
once.
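
To make the distinction concrete, here is a minimal sketch (Kotlin DSL) with hypothetical task
names: two actionable tasks with attached actions, aggregated by a lifecycle task that has no
actions of its own:

build.gradle.kts

// Actionable tasks: each has an action attached via doLast
tasks.register("generateDocs") {
    doLast { println("Generating docs...") }
}
tasks.register("generateReports") {
    doLast { println("Generating reports...") }
}

// Lifecycle task: no actions, it only aggregates the actionable tasks above
tasks.register("generateAll") {
    dependsOn("generateDocs", "generateReports")
}
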
Task registration and action

Let’s take a look at a simple "Hello World" task in a build script:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello world!")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello world!'
}
}

In the example, the build script registers a single task called hello using the TaskContainer API,
and adds an action to it.

If the tasks in the project are listed, the hello task is available to Gradle:

$ ./gradlew app:tasks --all

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.

You can execute the task in the build script with ./gradlew hello:
$ ./gradlew hello
Hello world!

When Gradle executes the hello task, it executes the action provided. In this case, the action is
simply a block containing some code: println("Hello world!").

Task group and description

The hello task from the previous section can be detailed with a description and assigned to a
group with the following update:

build.gradle.kts

tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
}

Once the task is assigned to a group, it will be listed by ./gradlew tasks:

$ ./gradlew tasks

> Task :tasks

Custom tasks
------------------
hello - A lovely greeting task.

To view information about a task, use the help --task <task-name> command:

$ ./gradlew help --task hello

> Task :help


Detailed task information for hello

Path
:app:hello

Type
Task (org.gradle.api.Task)

Options
--rerun Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom

As we can see, the hello task belongs to the custom group.

Task dependencies

You can declare tasks that depend on other tasks:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello world!")
}
}
tasks.register("intro") {
dependsOn("hello")
doLast {
println("I'm Gradle")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
tasks.register('intro') {
dependsOn tasks.hello
doLast {
println "I'm Gradle"
}
}

$ gradle -q intro
Hello world!
I'm Gradle

The dependency of taskX to taskY may be declared before taskY is defined:

build.gradle.kts

tasks.register("taskX") {
dependsOn("taskY")
doLast {
println("taskX")
}
}
tasks.register("taskY") {
doLast {
println("taskY")
}
}

build.gradle

tasks.register('taskX') {
dependsOn 'taskY'
doLast {
println 'taskX'
}
}
tasks.register('taskY') {
doLast {
println 'taskY'
}
}

$ gradle -q taskX
taskY
taskX

The hello task from the previous example is updated to include a dependency:

build.gradle.kts

tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}

The hello task now depends on the assemble task, which means that Gradle must execute the
assemble task before it can execute the hello task:

$ ./gradlew :app:hello

> Task :app:compileJava UP-TO-DATE


> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:startScripts UP-TO-DATE
> Task :app:distTar UP-TO-DATE
> Task :app:distZip UP-TO-DATE
> Task :app:assemble UP-TO-DATE

> Task :app:hello


Hello world!

Task configuration

Once registered, tasks can be accessed via the TaskProvider API for further configuration.

For instance, you can use this to add dependencies to a task at runtime dynamically:

build.gradle.kts

repeat(4) { counter ->


tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}
tasks.named("task0") { dependsOn("task2", "task3") }

build.gradle

4.times { counter ->


tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}
tasks.named('task0') { dependsOn('task2', 'task3') }

$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0

Or you can add behavior to an existing task:

build.gradle.kts

tasks.register("hello") {
doLast {
println("Hello Earth")
}
}
tasks.named("hello") {
doFirst {
println("Hello Venus")
}
}
tasks.named("hello") {
doLast {
println("Hello Mars")
}
}
tasks.named("hello") {
doLast {
println("Hello Jupiter")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'Hello Earth'
}
}
tasks.named('hello') {
doFirst {
println 'Hello Venus'
}
}
tasks.named('hello') {
doLast {
println 'Hello Mars'
}
}
tasks.named('hello') {
doLast {
println 'Hello Jupiter'
}
}

$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter

TIP: The calls doFirst and doLast can be executed multiple times. They add an action to the
beginning or the end of the task’s actions list. When the task executes, the actions in
the action list are executed in order.

Here is an example of the named method being used to configure a task added by a plugin:

tasks.named("dokkaHtml") {
outputDirectory.set(buildDir.resolve("dokka"))
}

Task types

Gradle tasks are a subclass of Task.

In the build script, the HelloTask class is created by extending DefaultTask:

build.gradle.kts

// Extend the DefaultTask class to create a HelloTask class


abstract class HelloTask : DefaultTask() {
@TaskAction
fun hello() {
println("hello from HelloTask")
}
}

// Register the hello Task with type HelloTask


tasks.register<HelloTask>("hello") {
group = "Custom tasks"
description = "A lovely greeting task."
}

The hello task is registered with the type HelloTask.

Executing our new hello task:

$ ./gradlew hello

> Task :app:hello


hello from HelloTask

Now the hello task is of type HelloTask instead of type Task.

The Gradle help task reveals the change:

$ ./gradlew help --task hello

> Task :help


Detailed task information for hello

Path
:app:hello

Type
HelloTask (Build_gradle$HelloTask)

Options
--rerun Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom tasks

Built-in task types

Gradle provides many built-in task types with common and popular functionality, such as copying
or deleting files.

This example task copies *.war files from the source directory to the target directory using the Copy
built-in task:
tasks.register("copyTask",Copy) {
from("source")
into("target")
include("*.war")
}

There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar,
JacocoReport, Sign, and Delete, which are documented in the DSL reference.
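
For instance, here is a minimal sketch (Kotlin DSL) using the built-in Zip and Delete task types;
the directory and archive names are hypothetical:

build.gradle.kts

// Package the contents of a hypothetical docs directory into an archive
tasks.register<Zip>("zipDocs") {
    from("docs")
    archiveFileName.set("docs.zip")
    destinationDirectory.set(layout.buildDirectory.dir("archives"))
}

// Delete a hypothetical temporary output directory
tasks.register<Delete>("cleanTmp") {
    delete(layout.buildDirectory.dir("tmp"))
}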

Next Step: Learn how to write Tasks >>

Writing Tasks
Gradle tasks are created by extending DefaultTask.

However, the generic DefaultTask provides no action for Gradle. If users want to extend the
capabilities of Gradle and their build script, they must either use a built-in task or create a custom
task:

1. Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc…

2. Custom task - Gradle allows users to subclass DefaultTask to create their own task types.

Create a task

The simplest and quickest way to create a custom task is in a build script:

To create a task, inherit from the DefaultTask class and implement a @TaskAction handler:

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

The CreateFileTask implements a simple set of actions. First, a file called "myfile.txt" is created in
the main project. Then, some text is written to the file.

Register a task

A task is registered in the build script using the TaskContainer.register() method, which allows it
to then be used in the build logic.
build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

tasks.register<CreateFileTask>("createFileTask")

Task group and description

Setting the group and description properties on your tasks can help users understand how to use
your task:

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}

tasks.register<CreateFileTask>("createFileTask", ) {
group = "custom"
description = "Create myfile.txt in the current directory"
}

Once a task is added to a group, it is visible when listing tasks.

Task inputs and outputs

For a task to do useful work, it typically needs inputs, and it typically produces outputs.

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@Input
val fileText = "HELLO FROM MY TASK"

@Input
val fileName = "myfile.txt"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText)
}
}

tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
}

Configure a task

A task is optionally configured in a build script using the TaskCollection.named() method.

The CreateFileTask class is updated so that the text in the file is configurable:

build.gradle.kts

abstract class CreateFileTask : DefaultTask() {


@get:Input
abstract val fileText: Property<String>

@Input
val fileName = "myfile.txt"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}

tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}

tasks.named<CreateFileTask>("createFileTask") {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}

In the named() method, we find the createFileTask task and set the text that will be written to the
file.

When the task is executed:

$ ./gradlew createFileTask

> Configure project :app

> Task :app:createFileTask

BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date

A text file called myfile.txt is created in the project root folder:

myfile.txt

HELLO FROM THE NAMED METHOD

Consult the Developing Gradle Tasks chapter to learn more.

Next Step: Learn how to use Plugins >>

Using Plugins
Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with
Gradle, third-party plugins, and script plugins defined within builds.

Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g.,
locating Java source at src/main/java), and extend core or other plugin objects.

Plugins in Gradle are essential for automating common build tasks, integrating with external tools
or services, and tailoring the build process to meet specific project needs. They also serve as the
primary mechanism for organizing build logic.

Benefits of plugins

Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins
offer several advantages over adding logic directly to the build script:

• Promotes Reusability: Reduces the need to duplicate similar logic across projects.

• Enhances Modularity: Allows for a more modular and organized build script.

• Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.

Plugin distribution

You can leverage plugins from Gradle and the Gradle community or create your own.
Plugins are available in three ways:

1. Core plugins - Gradle develops and maintains a set of Core Plugins.

2. Community plugins - Gradle plugins shared in a remote repository such as Maven or the
Gradle Plugin Portal.

3. Local plugins - Gradle enables users to create custom plugins using APIs.

Types of plugins

Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:

Binary Plugins
Binary plugins are compiled plugins typically written in Java or Kotlin DSL that are packaged as
JAR files. They are applied to a project using the plugins {} block. They offer better performance
and maintainability compared to script plugins or precompiled script plugins.

Precompiled Script Plugins


Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as
Java class files packaged in a library. They are applied to a project using the plugins {} block.
They provide a way to reuse complex logic across projects and allow for better organization of
build logic.

Script Plugins
Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build
script using the apply from: syntax. They are applied inline within a build script to add
functionality or customize the build process. They are simple to use.

A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes
more valuable, it’s migrated to a binary plugin that can be easily tested and shared between
multiple projects or organizations.

Using plugins

To use the build logic encapsulated in a plugin, Gradle needs to perform two steps. First, it needs to
resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.

1. Resolving a plugin means finding the correct version of the JAR that contains a given plugin
and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build
script. Script plugins are self-resolving in that they are resolved from the specific file path or
URL provided when applying them. Core binary plugins provided as part of the Gradle
distribution are automatically resolved.

2. Applying a plugin means executing the plugin’s Plugin.apply(T) on a project.

The plugins DSL is recommended to resolve and apply plugins in one step.

Resolving plugins

Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of
its distribution, which means they are automatically resolved.

Core plugins are applied in a build script using the plugin name:

plugins {
id «plugin name»
}

For example:

build.gradle

plugins {
id("java")
}

Non-core plugins must be resolved before they can be applied. Non-core plugins are identified by a
unique ID and a version in the build file:

plugins {
id «plugin id» version «plugin version»
}

And the location of the plugin must be specified in the settings file:

settings.gradle

pluginManagement {
repositories {
gradlePluginPortal()
maven {
url 'https://maven.example.com/plugins'
}
}
}

There are additional considerations for resolving and applying plugins:

1. To apply a core, community, or local plugin to a specific project, use the plugins block in
the build file. For example:

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

2. To apply a common core, community, or local plugin to multiple subprojects, use a build
script in the buildSrc directory. For example:

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

repositories {
    mavenCentral()
}

dependencies {
    implementation(Libs.Kotlin.coroutines)
}

3. To apply a core, community, or local plugin needed for the build script itself, use the
buildscript block in the build file. For example:

buildscript {
    repositories {
        maven {
            url = uri("https://plugins.gradle.org/m2/")
        }
    }
    dependencies {
        classpath("org.barfuin.gradle.taskinfo:gradle-taskinfo:2.1.0")
    }
}

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

4. To apply a local script plugin, use the legacy apply() method in the build file. For example:

apply(plugin = "org.barfuin.gradle.taskinfo")

apply<MyPlugin>()

1. Applying plugins using the plugins{} block

The plugin DSL provides a concise and convenient way to declare plugin dependencies.

The plugins block configures an instance of PluginDependenciesSpec:

plugins {
application // by name
java // by name
id("java") // by id - recommended
id("org.jetbrains.kotlin.jvm") version "1.9.0" // by id - recommended
}

Core Gradle plugins are unique in that they provide short names, such as java for the core
JavaPlugin.

To apply a core plugin, the short name can be used:

build.gradle.kts

plugins {
java
}

build.gradle

plugins {
id 'java'
}

All other binary plugins must use the fully qualified form of the plugin id (e.g., com.github.foo.bar).

To apply a community plugin from Gradle plugin portal, the fully qualified plugin id, a globally
unique identifier, must be used:

build.gradle.kts

plugins {
id("com.jfrog.bintray") version "1.8.5"
}

build.gradle

plugins {
id 'com.jfrog.bintray' version '1.8.5'
}
See PluginDependenciesSpec for more information on using the Plugin DSL.

Limitations of the plugins DSL

The plugins DSL provides a convenient syntax for users and the ability for Gradle to determine
which plugins are used quickly. This allows Gradle to:

• Optimize the loading and reuse of plugin classes.

• Provide editors with detailed information about the potential properties and values in the build
script.

However, the DSL requires that plugins be defined statically.

There are some key differences between the plugins {} block mechanism and the "traditional"
apply() method mechanism. There are also some constraints and possible limitations.

Constrained Syntax

The plugins {} block does not support arbitrary code.

It is constrained to be idempotent (produce the same result every time) and side effect-free (safe for
Gradle to execute at any time).

The form is:

build.gradle.kts

plugins {
id(«plugin id») ①
id(«plugin id») version «plugin version» ②
}

① for core Gradle plugins or plugins already available to the build script

② for binary Gradle plugins that need to be resolved

build.gradle

plugins {
id «plugin id» ①
id «plugin id» version «plugin version» ②
}

① for core Gradle plugins or plugins already available to the build script

② for binary Gradle plugins that need to be resolved


Where «plugin id» and «plugin version» are a string.

Where «plugin id» and «plugin version» must be constant, literal strings.

The plugins{} block must also be a top-level statement in the build script. It cannot be nested inside
another construct (e.g., an if-statement or for-loop).

Only in build scripts and settings file

The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the
settings.gradle(.kts) file. It must appear before any other block. It cannot be used in script plugins
or init scripts.
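
For illustration, a plugins{} block in a settings file might look like the following; the settings
plugin id is hypothetical:

settings.gradle.kts

plugins {
    id("com.example.my-settings-plugin") version "1.0.0"
}

rootProject.name = "my-build"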

Applying plugins to all subprojects

If you have a multi-project build, you probably want to apply plugins to some or all of the
subprojects in your build, but not to the root project.

While the default behavior of the plugins{} block is to immediately resolve and apply the plugins,
you can use the apply false syntax to tell Gradle not to apply the plugin to the current project.
Then, use the plugins{} block without the version in subprojects' build scripts:

settings.gradle.kts

include("hello-a")
include("hello-b")
include("goodbye-c")

build.gradle.kts

plugins {
id("com.example.hello") version "1.0.0" apply false
id("com.example.goodbye") version "1.0.0" apply false
}

hello-a/build.gradle.kts

plugins {
id("com.example.hello")
}

hello-b/build.gradle.kts

plugins {
id("com.example.hello")
}
goodbye-c/build.gradle.kts

plugins {
id("com.example.goodbye")
}

settings.gradle

include 'hello-a'
include 'hello-b'
include 'goodbye-c'

build.gradle

plugins {
id 'com.example.hello' version '1.0.0' apply false
id 'com.example.goodbye' version '1.0.0' apply false
}

hello-a/build.gradle

plugins {
id 'com.example.hello'
}

hello-b/build.gradle

plugins {
id 'com.example.hello'
}

goodbye-c/build.gradle

plugins {
id 'com.example.goodbye'
}

You can also encapsulate the versions of external plugins by composing the build logic using your
own convention plugins.
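
A rough sketch of that approach uses a precompiled convention plugin in buildSrc. The convention
plugin id my-convention is hypothetical; the pinned version reuses the com.example.hello plugin
marker coordinates implied by the example above:

buildSrc/build.gradle.kts

plugins {
    `kotlin-dsl`
}

repositories {
    gradlePluginPortal()
}

dependencies {
    // The external plugin version is pinned once, here, via its plugin marker coordinates
    implementation("com.example.hello:com.example.hello.gradle.plugin:1.0.0")
}

buildSrc/src/main/kotlin/my-convention.gradle.kts

plugins {
    // No version here: the version comes from buildSrc's dependencies above
    id("com.example.hello")
}

hello-a/build.gradle.kts

plugins {
    id("my-convention")
}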

2. Applying plugins from the buildSrc directory

buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins)
used in building the main project. You can apply plugins that reside in a project’s buildSrc directory
as long as they have a defined ID.

The following example shows how to tie the plugin implementation class my.MyPlugin, defined in
buildSrc, to the id "my-plugin":

buildSrc/build.gradle.kts

plugins {
`java-gradle-plugin`
}

gradlePlugin {
plugins {
create("myPlugins") {
id = "my-plugin"
implementationClass = "my.MyPlugin"
}
}
}

buildSrc/build.gradle

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
myPlugins {
id = 'my-plugin'
implementationClass = 'my.MyPlugin'
}
}
}

The plugin can then be applied by ID:

build.gradle.kts

plugins {
id("my-plugin")
}
build.gradle

plugins {
id 'my-plugin'
}

3. Applying plugins using the buildscript{} block

The buildscript block is used for:

1. global dependencies and repositories required for building the project (applied in the
subprojects).

2. declaring which plugins are available for use in the build script (in the build.gradle(.kts) file
itself).

So when you want to use a library in the build script itself, you must add this library to the script
classpath using the buildscript{} block:

import org.apache.commons.codec.binary.Base64

buildscript {
repositories { // this is where the plugins are located
mavenCentral()
google()
}
dependencies { // these are the plugins that can be used in subprojects or in the
build file itself
classpath group: 'commons-codec', name: 'commons-codec', version: '1.2' //
used in the task below
classpath 'com.android.tools.build:gradle:4.1.0' // used in subproject
}
}

tasks.register('encode') {
doLast {
def byte[] encodedString = new Base64().encode('hello world\n'.getBytes())
println new String(encodedString)
}
}

And you can apply the globally declared dependencies in the subproject that needs it:

plugins {
id 'com.android.application'
}
Binary plugins published as external jar files can be added to a project by adding the plugin to the
build script classpath and then applying the plugin.

External jars can be added to the build script classpath using the buildscript{} block as described
in External dependencies for the build script:

build.gradle.kts

buildscript {
repositories {
gradlePluginPortal()
}
dependencies {
classpath("com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.5")
}
}

apply(plugin = "com.jfrog.bintray")

build.gradle

buildscript {
repositories {
gradlePluginPortal()
}
dependencies {
classpath 'com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.5'
}
}

apply plugin: 'com.jfrog.bintray'

4. Applying script plugins using the legacy apply() method

A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is
applied using the legacy application method:

class MyPlugin : Plugin<Project> {


override fun apply(project: Project) {
println("Plugin ${this.javaClass.simpleName} applied on ${project.name}")
}
}
apply<MyPlugin>()

Let’s take a rudimentary example of a plugin written in a file called other.gradle located in the
same directory as the build.gradle file:

public class Other implements Plugin<Project> {


@Override
void apply(Project project) {
// Does something
}
}

First, import the external file using:

apply from: 'other.gradle'

Then you can apply it:

apply plugin: Other

Script plugins are automatically resolved and can be applied from a script on the local filesystem or
remotely:

build.gradle.kts

apply(from = "other.gradle.kts")

build.gradle

apply from: 'other.gradle'

Filesystem locations are relative to the project directory, while remote script locations are specified
with an HTTP URL. Multiple script plugins (of either form) can be applied to a given target.
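
For instance, a remote script plugin might be applied by URL as follows; the URL is a placeholder,
not a real script:

build.gradle.kts

// Hypothetical remote script plugin applied by URL
apply(from = "https://example.com/gradle/other.gradle.kts")

// A local script plugin can be applied to the same project as well
apply(from = "other.gradle.kts")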

Plugin Management

The pluginManagement{} block is used to configure repositories for plugin resolution and to define
version constraints for plugins that are applied in the build scripts.

The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first
block in the file:

settings.gradle.kts

pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = "plugin-management"

settings.gradle

pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = 'plugin-management'

The block can also be used in Initialization Script:

init.gradle.kts

settingsEvaluated {
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}
init.gradle

settingsEvaluated { settings ->


settings.pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}

Custom Plugin Repositories

By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.

Many build authors would also like to resolve plugins from private Maven or Ivy repositories
because they contain proprietary implementation details or to have more control over what
plugins are available to their builds.

To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:

settings.gradle.kts

pluginManagement {
repositories {
maven(url = "./maven-repo")
gradlePluginPortal()
ivy(url = "./ivy-repo")
}
}

settings.gradle

pluginManagement {
repositories {
maven {
url './maven-repo'
}
gradlePluginPortal()
ivy {
url './ivy-repo'
}
}
}

This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and
then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository. If you
don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line. Finally, the
Ivy repository at ./ivy-repo will be checked.

Plugin Version Management

A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in
a single location. Plugins can then be applied by id to any build script via the plugins{} block.

One benefit of setting plugin versions this way is that the pluginManagement.plugins{} does not have
the same constrained syntax as the build script plugins{} block. This allows plugin versions to be
taken from gradle.properties, or loaded via another mechanism.

Managing plugin versions via pluginManagement:

settings.gradle.kts

pluginManagement {
val helloPluginVersion: String by settings
plugins {
id("com.example.hello") version "${helloPluginVersion}"
}
}

build.gradle.kts

plugins {
id("com.example.hello")
}

gradle.properties

helloPluginVersion=1.0.0

settings.gradle

pluginManagement {
plugins {
id 'com.example.hello' version "${helloPluginVersion}"
}
}

build.gradle

plugins {
id 'com.example.hello'
}

gradle.properties

helloPluginVersion=1.0.0

The plugin version is loaded from gradle.properties and configured in the settings script, allowing
the plugin to be added to any project without specifying the version.

Plugin Resolution Rules

Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing
the requested version or explicitly specifying the implementation artifact coordinates.

To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:

settings.gradle.kts

pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == "com.example") {
useModule("com.example:sample-plugins:1.0.0")
}
}
}
repositories {
maven {
url = uri("./maven-repo")
}
gradlePluginPortal()
ivy {
url = uri("./ivy-repo")
}
}
}
settings.gradle

pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'com.example') {
useModule('com.example:sample-plugins:1.0.0')
}
}
}
repositories {
maven {
url './maven-repo'
}
gradlePluginPortal()
ivy {
url './ivy-repo'
}
}
}

This tells Gradle to use the specified plugin implementation artifact instead of its built-in default
mapping from plugin ID to Maven/Ivy coordinates.

Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts
that implement the plugin. Read Gradle Plugin Development Plugin for more information on
publishing plugins to custom repositories.

See PluginManagementSpec for complete documentation for using the pluginManagement{} block.

Plugin Marker Artifacts

Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id
and version properties, Gradle needs a way to look up the coordinates of the plugin implementation
artifact.

To do so, Gradle will look for a Plugin Marker Artifact with the coordinates
plugin.id:plugin.id.gradle.plugin:plugin.version. This marker needs to have a dependency on the
actual plugin implementation. Publishing these markers is automated by the java-gradle-plugin.
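
To illustrate, for the com.example.hello plugin at version 1.0.0 used in the sample below, the
marker would have the coordinates:

com.example.hello:com.example.hello.gradle.plugin:1.0.0

The marker’s POM (or Ivy descriptor) in turn declares a dependency on the artifact that implements
the plugin, for example com.example:sample-plugins:1.0.0.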

For example, the following complete sample from the sample-plugins project shows how to publish
a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository
using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish
plugin.
build.gradle.kts

plugins {
`java-gradle-plugin`
`maven-publish`
`ivy-publish`
}

group = "com.example"
version = "1.0.0"

gradlePlugin {
plugins {
create("hello") {
id = "com.example.hello"
implementationClass = "com.example.hello.HelloPlugin"
}
create("goodbye") {
id = "com.example.goodbye"
implementationClass = "com.example.goodbye.GoodbyePlugin"
}
}
}

publishing {
repositories {
maven {
url = uri(layout.buildDirectory.dir("maven-repo"))
}
ivy {
url = uri(layout.buildDirectory.dir("ivy-repo"))
}
}
}

build.gradle

plugins {
id 'java-gradle-plugin'
id 'maven-publish'
id 'ivy-publish'
}

group 'com.example'
version '1.0.0'

gradlePlugin {
plugins {
hello {
id = 'com.example.hello'
implementationClass = 'com.example.hello.HelloPlugin'
}
goodbye {
id = 'com.example.goodbye'
implementationClass = 'com.example.goodbye.GoodbyePlugin'
}
}
}

publishing {
repositories {
maven {
url layout.buildDirectory.dir("maven-repo")
}
ivy {
url layout.buildDirectory.dir("ivy-repo")
}
}
}

Running gradle publish in the sample directory creates the following Maven repository layout (the
Ivy layout is similar):
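
As a rough sketch (metadata files omitted), assuming the group com.example, version 1.0.0, the
plugin ids configured above, and an implementation artifact named after the sample-plugins project,
the maven-repo directory would contain entries along these lines:

build/maven-repo
└── com
    └── example
        ├── sample-plugins
        │   └── 1.0.0
        │       ├── sample-plugins-1.0.0.jar
        │       └── sample-plugins-1.0.0.pom
        ├── hello
        │   └── com.example.hello.gradle.plugin
        │       └── 1.0.0
        │           └── com.example.hello.gradle.plugin-1.0.0.pom
        └── goodbye
            └── com.example.goodbye.gradle.plugin
                └── 1.0.0
                    └── com.example.goodbye.gradle.plugin-1.0.0.pom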

Legacy Plugin Application

With the introduction of the plugins DSL, users should have little reason to use the legacy method
of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to
restrictions in how it currently works.
build.gradle.kts

apply(plugin = "java")

build.gradle

apply plugin: 'java'

Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to
apply the JavaPlugin.

Rather than using a plugin id, plugins can also be applied by simply specifying the class of the
plugin:

build.gradle.kts

apply<JavaPlugin>()

build.gradle

apply plugin: JavaPlugin

The JavaPlugin symbol in the above sample refers to the org.gradle.api.plugins.JavaPlugin class. It
does not strictly need to be imported, as the org.gradle.api.plugins package is automatically
imported in all build scripts (see Default imports).

In Kotlin, you append the ::class suffix to identify a class literal (where Java would use .class).

In Groovy, no suffix is needed: unlike Java, a bare class name already identifies the class.

Using a Version Catalog

When a project uses a version catalog, plugins can be referenced via aliases when applied.

Let’s take a look at a simple Version Catalog:


gradle/libs.versions.toml

[versions]
intellij-plugin = "1.6"

[plugins]
jetbrains-intellij = { id = "org.jetbrains.intellij", version.ref = "intellij-plugin"
}

Then a plugin can be applied to any build script using the alias method:

build.gradle.kts

plugins {
alias(libs.plugins.jetbrains.intellij)
}

TIP: jetbrains-intellij is available as the Gradle generated safe accessor: jetbrains.intellij.

Next Step: Learn how to write Plugins >>

Writing Plugins
If Gradle or the Gradle community does not offer the specific capabilities your project needs,
creating your own plugin could be a solution.

Additionally, if you find yourself duplicating build logic across subprojects and need a better way to
organize it, custom plugins can help.

Custom plugin

A plugin is any class that implements the Plugin interface. The example below is the most
straightforward plugin, a "hello world" plugin:

build.gradle.kts

import org.gradle.api.Plugin
import org.gradle.api.Project

abstract class SamplePlugin : Plugin<Project> {


override fun apply(project: Project) {
project.tasks.create("SampleTask") {
println("Hello world!")
}
}
}
Script plugin

Many plugins start as a script plugin coded in a build script. This offers an easy way to rapidly
prototype and experiment when building a plugin. Let’s take a look at an example:

build.gradle.kts

// Define a task
abstract class CreateFileTask : DefaultTask() { ①
@get:Input
abstract val fileText: Property<String> ②

@Input
val fileName = "myfile.txt"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}

// Define a plugin
abstract class MyPlugin : Plugin<Project> { ③
override fun apply(project: Project) {
tasks {
register("createFileTask", CreateFileTask::class) {
group = "from my plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY PLUGIN")
}
}
}
}

// Apply the local plugin


apply<MyPlugin>() ④

① Subclass DefaultTask().

② Use lazy configuration in the task.

③ Extend the org.gradle.api.Plugin interface.

④ Apply the script plugin.

1. Subclass DefaultTask()

First, build a task by subclassing DefaultTask().


abstract class CreateFileTask : DefaultTask() { }

This simple task adds a file to our application’s root directory.

2. Use Lazy Configuration

Gradle has a concept called lazy configuration, which allows task inputs and outputs to be
referenced before they are actually set. This is done via the Property class type.

abstract val fileText: Property<String>

One advantage of this mechanism is that you can link the output file of one task to the input file of
another, all before the filename has even been decided. The Property class also knows which task
it’s linked to, enabling Gradle to add the required task dependency automatically.
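
A minimal sketch of this wiring (Kotlin DSL, with hypothetical producer and consumer task types)
connects the output of one task to the input of another without the consumer ever naming the file:

build.gradle.kts

abstract class ProducerTask : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        outputFile.get().asFile.writeText("payload")
    }
}

abstract class ConsumerTask : DefaultTask() {
    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() {
        println(inputFile.get().asFile.readText())
    }
}

val producer = tasks.register<ProducerTask>("producer") {
    outputFile.set(layout.buildDirectory.file("shared.txt"))
}

tasks.register<ConsumerTask>("consumer") {
    // Wiring the producer's output as this task's input also adds the task dependency
    inputFile.set(producer.flatMap { it.outputFile })
}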

3. Extend the org.gradle.api.Plugin interface

Next, create a new class that extends the org.gradle.api.Plugin interface.

abstract class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {}
}

You can add tasks and other logic in the apply() method.

4. Apply the script plugin

Finally, apply the local plugin in the build script.

apply<MyPlugin>()

When MyPlugin is applied in the build script, Gradle calls the apply() method defined in the
custom MyPlugin class.

This makes the plugin available to the application.

NOTE: Script plugins are NOT recommended. Script plugins offer an easy way to rapidly
prototype build logic, before migrating it to a more permanent solution such as
convention plugins or binary plugins.

Convention Plugins

Convention plugins are a way to encapsulate and reuse common build logic in Gradle. They allow
you to define a set of conventions for a project, and then apply those conventions to other projects
or modules.
The example above has been re-written as a convention plugin stored in buildSrc:

buildSrc/src/main/kotlin/MyConventionPlugin.kt

import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File

abstract class CreateFileTask : DefaultTask() {


@get:Input
abstract val fileText: Property<String>

@Input
val fileName = project.rootDir.toString() + "/myfile.txt"

@OutputFile
val myFile: File = File(fileName)

@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}

class MyConventionPlugin : Plugin<Project> {


override fun apply(project: Project) {
project.tasks.register("createFileTask", CreateFileTask::class.java) {
group = "from my plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY PLUGIN")
}
}
}

The plugin can be given an id using a gradlePlugin{} block so that it can be referenced in the root:

buildSrc/build.gradle.kts

gradlePlugin {
plugins {
create("my-convention-plugin") {
id = "com.gradle.plugin.my-convention-plugin"
implementationClass = "com.gradle.plugin.MyConventionPlugin"
}
}
}

The gradlePlugin{} block defines the plugins being built by the project. With the newly created id,
the plugin can be applied in other build scripts accordingly:

build.gradle.kts

plugins {
application
id("com.gradle.plugin.my-convention-plugin") // Apply the new plugin
}

Binary Plugins

A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR
file. It is resolved as a dependency rather than compiled from source.

For most use cases, convention plugins need to be updated only infrequently. Having each developer
execute the plugin build as part of their development process is wasteful, so we can instead
distribute convention plugins as binary dependencies.

There are two ways to update the convention plugin in the example above into a binary plugin.

1. Use composite builds:

settings.gradle.kts

includeBuild("my-plugin")

2. Publish the plugin to a repository:

build.gradle.kts

plugins {
id("com.gradle.plugin.myconventionplugin") version "1.0.0"
}

Consult the Developing Plugins chapter to learn more.


STRUCTURING BUILDS
Structuring Projects with Gradle
It is important to structure your Gradle project to optimize build performance. A multi-project build
is the standard in Gradle.

A multi-project build consists of one root project and one or more subprojects. Gradle can build the
root project and any number of the subprojects in a single execution.

Project locations

Multi-project builds contain a single root project in a directory that Gradle views as the root path: ..

Subprojects are located physically under the root path: ./subproject.

A subproject has a path, which denotes the position of that subproject in the multi-project build. In
most cases, the project path is consistent with its location in the file system.

The project structure is created in the settings.gradle(.kts) file. The settings file must be present
in the root directory.

A simple multi-project build

Let’s look at a basic multi-project build example that contains a root project and a single subproject.

The root project is called basic-multiproject, located somewhere on your machine. From Gradle’s
perspective, the root is the top-level directory ..

The project contains a single subproject called ./app:

.
├── app
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│ ...
│ └── build.gradle
└── settings.gradle

This is the recommended project structure for starting any Gradle project. The build init plugin also
generates skeleton projects that follow this structure - a root project with a single subproject:

The settings.gradle(.kts) file describes the project structure to Gradle:

settings.gradle.kts

rootProject.name = "basic-multiproject"
include("app")

settings.gradle

rootProject.name = 'basic-multiproject'
include 'app'

In this case, Gradle will look for a build file for the app subproject in the ./app directory.

You can view the structure of a multi-project build by running the projects command:

$ ./gradlew -q projects

Projects:

------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------

Root project 'basic-multiproject'


\--- Project ':app'
To see a list of the tasks of a project, run gradle <project-path>:tasks
For example, try running gradle :app:tasks

In this example, the app subproject is a Java application that applies the application plugin and
configures the main class. The application prints Hello World to the console:

app/build.gradle.kts

plugins {
id("application")
}

application {
mainClass = "com.example.Hello"
}

app/build.gradle

plugins {
id 'application'
}

application {
mainClass = 'com.example.Hello'
}

app/src/main/java/com/example/Hello.java

package com.example;

public class Hello {


public static void main(String[] args) {
System.out.println("Hello, world!");
}
}

You can run the application by executing the run task from the application plugin in the project
root:

$ ./gradlew -q run
Hello, world!
Adding a subproject

In the settings file, you can use the include method to add another subproject to the root project:

settings.gradle.kts

include("project1", "project2:child1", "project3:child1")

settings.gradle

include 'project1', 'project2:child1', 'project3:child1'

The include method takes project paths as arguments. The project path is assumed to be equal to
the relative physical file system path. For example, a path services:api is mapped by default to a
folder ./services/api (relative to the project root .).

More examples of how to work with the project path can be found in the DSL documentation of
Settings.include(java.lang.String[]).

Let’s add another subproject called lib to the previously created project.

All we need to do is add another include statement in the root settings file:

settings.gradle.kts

rootProject.name = "basic-multiproject"
include("app")
include("lib")

settings.gradle

rootProject.name = 'basic-multiproject'
include 'app'
include 'lib'

Gradle will then look for the build file of the new lib subproject in the ./lib/ directory:

.
├── app
│ ...
│ └── build.gradle.kts
├── lib
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│ ...
│ └── build.gradle
├── lib
│ ...
│ └── build.gradle
└── settings.gradle

Project Descriptors

To further describe the project architecture to Gradle, the settings file provides project descriptors.

You can modify these descriptors in the settings file at any time.

To access a descriptor, you can:

settings.gradle.kts

include("project-a")
println(rootProject.name)
println(project(":project-a").name)

settings.gradle

include('project-a')
println rootProject.name
println project(':project-a').name

Using this descriptor, you can change the name, project directory, and build file of a project:
settings.gradle.kts

rootProject.name = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "project-a.gradle.kts"

settings.gradle

rootProject.name = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = 'project-a.gradle'

Consult the ProjectDescriptor class in the API documentation for more information.

Modifying a subproject path

Let’s take a hypothetical project with the following structure:

.
├── app
│ ...
│ └── build.gradle.kts
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts

.
├── app
│ ...
│ └── build.gradle
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle
└── settings.gradle

If your settings.gradle(.kts) looks like this:

include(':subs:web:my-web-module')

Gradle sees a subproject with a logical project name of :subs:web:my-web-module and two, possibly
unintentional, other subprojects logically named :subs and :subs:web. This can lead to phantom
build directories, especially when using allprojects{} or subprojects{}.

To avoid this, you can use:

include(':subs:web:my-web-module')
project(':subs:web:my-web-module').projectDir = file('subs/web/my-web-module')

So that you only end up with a single subproject named :subs:web:my-web-module.

Or you can use:

include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')

So that you only end up with a single subproject named :my-web-module.

So, while the physical project layout is the same, the logical results are different.

Naming recommendations

As your project grows, naming and consistency get increasingly more important. To keep your
builds maintainable, we recommend the following:

1. Keep default project names for subprojects: It is possible to configure custom project names
in the settings file. However, it’s an unnecessary extra effort for the developers to track which
projects belong to what folders.

2. Use lower case hyphenation for all project names: All letters are lowercase, and words are
separated with a dash (-) character.

3. Define the root project name in the settings file: The rootProject.name effectively assigns a
name to the build, used in reports like Build Scans. If the root project name is not set, the name
will be the container directory name, which can be unstable (i.e., you can check out your project
in any directory). The name will be generated randomly if the root project name is not set and
checked out to a file system’s root (e.g., / or C:\).
Declaring Dependencies between Subprojects
What if one subproject depends on another subproject? What if one project needs the artifact
produced by another project?

This is a common use case for multi-project builds. Gradle offers project dependencies for this.

Depending on another project

Let’s explore a theoretical multi-project build with the following layout:

.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts

.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle
└── settings.gradle

In this example, there are three subprojects called shared, api, and person-service:

1. The person-service subproject depends on the other two subprojects, shared and api.

2. The api subproject depends on the shared subproject.

We use the : separator to define a project path such as services:person-service or :shared. Consult
the DSL documentation of Settings.include(java.lang.String[]) for more information about defining
project paths.

settings.gradle.kts

rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")

shared/build.gradle.kts

plugins {
id("java")
}

repositories {
mavenCentral()
}

dependencies {
testImplementation("junit:junit:4.13")
}

api/build.gradle.kts

plugins {
id("java")
}

repositories {
mavenCentral()
}

dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
}

services/person-service/build.gradle.kts

plugins {
id("java")
}

repositories {
mavenCentral()
}

dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
implementation(project(":api"))
}

settings.gradle

rootProject.name = 'basic-dependencies'
include 'api', 'shared', 'services:person-service'

shared/build.gradle

plugins {
id 'java'
}

repositories {
mavenCentral()
}

dependencies {
testImplementation "junit:junit:4.13"
}
api/build.gradle

plugins {
id 'java'
}

repositories {
mavenCentral()
}

dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
}

services/person-service/build.gradle

plugins {
id 'java'
}

repositories {
mavenCentral()
}

dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
implementation project(':api')
}

A project dependency affects execution order. It causes the other project to be built first and adds
the output with the classes of the other project to the classpath. It also adds the dependencies of the
other project to the classpath.

If you execute ./gradlew :api:compileJava, the shared project is built first, and then the api project
is built.

Depending on artifacts produced by another project

Sometimes, you might want to depend on the output of a specific task within another project rather
than the entire project. However, explicitly declaring a task dependency from one project to
another is discouraged as it introduces unnecessary coupling between tasks.

The recommended way to model dependencies, where a task in one project depends on the output
of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency
management engine allows you to share arbitrary artifacts between projects and build them on
demand.
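A minimal sketch of this approach, assuming a hypothetical producer subproject that exposes an
extra JAR and a consumer subproject that resolves it (all configuration and task names below are
illustrative, not part of the example above):

producer/build.gradle.kts

plugins {
    id("java-library")
}

// An "outgoing" (consumable) configuration that other projects can depend on
val instrumentedJars by configurations.creating {
    isCanBeConsumed = true
    isCanBeResolved = false
}

// The task that produces the artifact, attached to the outgoing configuration
val instrumentedJar = tasks.register<Jar>("instrumentedJar") {
    archiveClassifier.set("instrumented")
    from(sourceSets["main"].output)
}

artifacts {
    add("instrumentedJars", instrumentedJar)
}

consumer/build.gradle.kts

// A resolvable configuration that targets the producer's outgoing configuration
val instrumentedClasspath by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    instrumentedClasspath(project(path = ":producer", configuration = "instrumentedJars"))
}

Requesting the producer configuration by name keeps the sketch short; in real builds, attribute-based
variant matching is the more robust way to select the outgoing artifact.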
Sharing Build Logic between Subprojects
Subprojects in a multi-project build typically share some common dependencies.

Instead of copying and pasting the same Java version and libraries in each subproject build script,
Gradle provides a special directory for storing shared build logic that can be automatically applied
to subprojects.

Share logic in buildSrc

buildSrc is a Gradle-recognized and protected directory which comes with some benefits:

1. Reusable Build Logic:

buildSrc allows you to organize and centralize your custom build logic, tasks, and plugins in a
structured manner. The code written in buildSrc can be reused across your project, making it
easier to maintain and share common build functionality.

2. Isolation from the Main Build:

Code placed in buildSrc is isolated from the other build scripts of your project. This helps keep
the main build scripts cleaner and more focused on project-specific configurations.

3. Automatic Compilation and Classpath:

The contents of the buildSrc directory are automatically compiled and included in the classpath
of your main build. This means that classes and plugins defined in buildSrc can be directly used
in your project’s build scripts without any additional configuration.

4. Ease of Testing:
Since buildSrc is a separate build, it allows for easy testing of your custom build logic. You can
write tests for your build code, ensuring that it behaves as expected.

5. Gradle Plugin Development:

If you are developing custom Gradle plugins for your project, buildSrc is a convenient place to
house the plugin code. This makes the plugins easily accessible within your project.

The buildSrc directory is treated as an included build.

For multi-project builds, there can be only one buildSrc directory, which must be in the root project
directory.

NOTE The downside of using buildSrc is that any change to it will invalidate every task in your
project and require a rerun.

buildSrc uses the same source code conventions applicable to Java, Groovy, and Kotlin projects. It
also provides direct access to the Gradle API.

A typical project including buildSrc has the following layout:

.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──MyCustomTask.kt ①
│ ├── shared.gradle.kts ②
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts ③
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts ③
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts

① Create the MyCustomTask task.

② A shared build script.

③ Uses the MyCustomTask task and shared build script.


.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──groovy
│ │ └──MyCustomTask.groovy ①
│ ├── shared.gradle ②
│ └── build.gradle
├── api
│ ├── src
│ │ └──...
│ └── build.gradle ③
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle ③
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle
└── settings.gradle

① Create the MyCustomTask task.

② A shared build script.

③ Uses the MyCustomTask task and shared build script.

In the buildSrc, the build script shared.gradle(.kts) is created. It contains dependencies and other
build information that is common to multiple subprojects:

shared.gradle.kts

repositories {
mavenCentral()
}

dependencies {
implementation("org.slf4j:slf4j-api:1.7.32")
}

shared.gradle

repositories {
mavenCentral()
}
dependencies {
implementation 'org.slf4j:slf4j-api:1.7.32'
}

In the buildSrc, the MyCustomTask is also created. It is a helper task that is used as part of the build
logic for multiple subprojects:

MyCustomTask.kt

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

open class MyCustomTask : DefaultTask() {

    @TaskAction
    fun calculateSum() {
        // Custom logic to calculate the sum of two numbers
        val num1 = 5
        val num2 = 7
        val sum = num1 + num2

        // Print the result
        println("Sum: $sum")
    }
}

MyCustomTask.groovy

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class MyCustomTask extends DefaultTask {

    @TaskAction
    void calculateSum() {
        // Custom logic to calculate the sum of two numbers
        int num1 = 5
        int num2 = 7
        int sum = num1 + num2

        // Print the result
        println "Sum: $sum"
    }
}

The MyCustomTask task is used in the build script of the api and shared projects. The task is
automatically available because it’s part of buildSrc.

The shared.gradle(.kts) build script is also applied:

build.gradle.kts

// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply(from = rootProject.file("buildSrc/shared.gradle.kts"))

// Use the custom task defined in buildSrc
tasks.register<MyCustomTask>("myCustomTask")

build.gradle

// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply from: rootProject.file('buildSrc/shared.gradle')

// Use the custom task defined in buildSrc
tasks.register('myCustomTask', MyCustomTask)

Share logic using convention plugins

Gradle’s recommended way of organizing build logic is to use its plugin system.

We can write a plugin that encapsulates the build logic common to several subprojects in a project.
This kind of plugin is called a convention plugin.

While writing plugins is outside the scope of this section, the recommended way to build a Gradle
project is to put common build logic in a convention plugin located in the buildSrc.

Let’s take a look at an example project:

.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──myproject.java-conventions.gradle.kts ①
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
└── settings.gradle.kts

① Create the myproject.java-conventions convention plugin.

② Applies the myproject.java-conventions convention plugin.

.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──groovy
│ │ └──myproject.java-conventions.gradle ①
│ └── build.gradle
├── api
│ ├── src
│ │ └──...
│ └── build.gradle ②
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle ②
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle ②
└── settings.gradle

① Create the myproject.java-conventions convention plugin.

② Applies the myproject.java-conventions convention plugin.

This build contains three subprojects:


settings.gradle.kts

rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")

settings.gradle

rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'

The source code for the convention plugin created in the buildSrc directory is as follows:

buildSrc/src/main/kotlin/myproject.java-conventions.gradle.kts

plugins {
id("java")
}

group = "com.example"
version = "1.0"

repositories {
mavenCentral()
}

dependencies {
testImplementation("junit:junit:4.13")
}

buildSrc/src/main/groovy/myproject.java-conventions.gradle

plugins {
id 'java'
}

group = 'com.example'
version = '1.0'

repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}

For the convention plugin to compile, basic configuration needs to be applied in the build file of the
buildSrc directory:

buildSrc/build.gradle.kts

plugins {
`kotlin-dsl`
}

repositories {
mavenCentral()
}

buildSrc/build.gradle

plugins {
id 'groovy-gradle-plugin'
}

The convention plugin is applied to the api, shared, and person-service subprojects:

api/build.gradle.kts

plugins {
id("myproject.java-conventions")
}

dependencies {
implementation(project(":shared"))
}

shared/build.gradle.kts

plugins {
id("myproject.java-conventions")
}

services/person-service/build.gradle.kts

plugins {
id("myproject.java-conventions")
}

dependencies {
implementation(project(":shared"))
implementation(project(":api"))
}

api/build.gradle

plugins {
id 'myproject.java-conventions'
}

dependencies {
implementation project(':shared')
}

shared/build.gradle

plugins {
id 'myproject.java-conventions'
}

services/person-service/build.gradle

plugins {
id 'myproject.java-conventions'
}

dependencies {
implementation project(':shared')
implementation project(':api')
}

Do not use cross-project configuration

An improper way to share build logic between subprojects is cross-project configuration via the
subprojects {} and allprojects {} DSL constructs.
TIP Avoid using subprojects {} and allprojects {}.

With cross-configuration, build logic can be injected into a subproject which is not obvious when
looking at its build script.

In the long run, cross-configuration usually grows in complexity and becomes a burden. Cross-
configuration can also introduce configuration-time coupling between projects, which can prevent
optimizations like configuration-on-demand from working properly.

Convention plugins versus cross-configuration

The two most common uses of cross-configuration can be better modeled using convention plugins:

1. Applying plugins or other configurations to subprojects of a certain type.


Often, the cross-configuration logic is "if subproject is of type X, then configure Y". This is
equivalent to applying the X-conventions plugin directly to a subproject.

2. Extracting information from subprojects of a certain type.


This use case can be modeled using outgoing configuration variants.

Composite Builds
A composite build is a build that includes other builds.

A composite build is similar to a Gradle multi-project build, except that instead of including
subprojects, entire builds are included.

Composite builds allow you to:

• Combine builds that are usually developed independently, for instance, when trying out a bug
fix in a library that your application uses.
• Decompose a large multi-project build into smaller, more isolated chunks that can be worked on
independently or together as needed.

A build that is included in a composite build is referred to as an included build. Included builds do
not share any configuration with the composite build or the other included builds. Each included
build is configured and executed in isolation.

Defining a composite build

The following example demonstrates how two Gradle builds, normally developed separately, can be
combined into a composite build.

my-composite
├── gradle
├── gradlew
├── settings.gradle.kts
├── build.gradle.kts
├── my-app
│ ├── settings.gradle.kts
│ └── app
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/my-app/Main.java
└── my-utils
├── settings.gradle.kts
├── number-utils
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/numberutils/Numbers.java
└── string-utils
├── build.gradle.kts
└── src/main/java/org/sample/stringutils/Strings.java

The my-utils multi-project build produces two Java libraries, number-utils and string-utils. The my-
app build produces an executable using functions from those libraries.

The my-app build does not depend directly on my-utils. Instead, it declares binary dependencies on
the libraries produced by my-utils:

my-app/app/build.gradle.kts

plugins {
id("application")
}

application {
mainClass = "org.sample.myapp.Main"
}

dependencies {
implementation("org.sample:number-utils:1.0")
implementation("org.sample:string-utils:1.0")
}

my-app/app/build.gradle

plugins {
id 'application'
}

application {
mainClass = 'org.sample.myapp.Main'
}

dependencies {
implementation 'org.sample:number-utils:1.0'
implementation 'org.sample:string-utils:1.0'
}

Defining a composite build via --include-build

The --include-build command-line argument turns the executed build into a composite,
substituting dependencies from the included build into the executed build.

For example, the output of ./gradlew --include-build ../my-utils run, executed from my-app:

$ ./gradlew --include-build ../my-utils run



Defining a composite build via the settings file

It’s possible to make the above arrangement persistent by using
Settings.includeBuild(java.lang.Object) to declare the included build in the settings.gradle(.kts)
file.

The settings file can be used to add subprojects and included builds simultaneously.

Included builds are added by location:

settings.gradle.kts

includeBuild("my-utils")

In the example, the settings.gradle(.kts) file combines otherwise separate builds:


settings.gradle.kts

rootProject.name = "my-composite"

includeBuild("my-app")
includeBuild("my-utils")

settings.gradle

rootProject.name = 'my-composite'

includeBuild 'my-app'
includeBuild 'my-utils'

To execute the run task in the my-app build from my-composite, run ./gradlew my-app:app:run.

You can optionally define a run task in my-composite that depends on my-app:app:run so that you can
execute ./gradlew run:

build.gradle.kts

tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}

build.gradle

tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}

Including builds that define Gradle plugins

A special case of included builds are builds that define Gradle plugins.

These builds should be included using the includeBuild statement inside the pluginManagement {}
block of the settings file.

Using this mechanism, the included build may also contribute a settings plugin that can be applied
in the settings file itself:

settings.gradle.kts

pluginManagement {
includeBuild("../url-verifier-plugin")
}

settings.gradle

pluginManagement {
includeBuild '../url-verifier-plugin'
}

Restrictions on included builds

Most builds can be included in a composite, including other composite builds. There are some
restrictions.

In a regular build, Gradle ensures that each project has a unique project path. It makes projects
identifiable and addressable without conflicts.

In a composite build, Gradle adds additional qualification to each project from an included build to
avoid project path conflicts. The full path to identify a project in a composite build is called a build-
tree path. It consists of a build path of an included build and a project path of the project.

By default, build paths and project paths are derived from directory names and structure on disk.
Since included builds can be located anywhere on disk, their build path is determined by the name
of the containing directory. This can sometimes lead to conflicts.

To summarize, the included builds must fulfill these requirements:

• Each included build must have a unique build path.

• Each included build path must not conflict with any project path of the main build.

These conditions guarantee that each project can be uniquely identified even in a composite build.

If conflicts arise, the way to resolve them is by changing the build name of an included build:

settings.gradle.kts

includeBuild("some-included-build") {
name = "other-name"
}
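In the Groovy DSL, the equivalent declaration is:

settings.gradle

includeBuild('some-included-build') {
    name = 'other-name'
}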
NOTE When a composite build is included in another composite build, both builds have the same
parent. In other words, the nested composite build structure is flattened.

Interacting with a composite build

Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be
executed, tests can be run, and builds can be imported into the IDE.

Executing tasks

Tasks from an included build can be executed from the command-line or IDE in the same way as
tasks from a regular multi-project build. Executing a task will result in task dependencies being
executed, as well as those tasks required to build dependency artifacts from other included builds.

You can call a task in an included build using a fully qualified path, for example, :included-build-
name:project-name:taskName. Project and task names can be abbreviated.

$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava

$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava

To exclude a task from the command line, you need to provide the fully qualified path to the task.

NOTE Included build tasks are automatically executed to generate required dependency artifacts,
or the including build can declare a dependency on a task from an included build.

Importing into the IDE

One of the most useful features of composite builds is IDE integration.

Importing a composite build permits sources from separate Gradle builds to be easily developed
together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse
Project. Source dependencies are configured, providing cross-build navigation and refactoring.

Declaring dependencies substituted by an included build

By default, Gradle will configure each included build to determine the dependencies it can provide.
The algorithm for doing this is simple. Gradle will inspect the group and name for the projects in
the included build and substitute project dependencies for any external dependency matching
${project.group}:${project.name}.
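For instance, in the my-utils example above, the number-utils library is only substituted for the
org.sample:number-utils:1.0 dependency declared by my-app if its build sets matching coordinates.
A minimal sketch:

my-utils/number-utils/build.gradle.kts

plugins {
    id("java-library")
}

// group + name ("org.sample:number-utils") determine which external
// dependency this project is substituted for in the composite
group = "org.sample"
version = "1.0"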

By default, substitutions are not registered for the main build.

NOTE To make the (sub)projects of the main build addressable by
${project.group}:${project.name}, you can tell Gradle to treat the main build like an included
build by self-including it: includeBuild(".").

There are cases when the default substitutions determined by Gradle are insufficient or must be
corrected for a particular composite. For these cases, explicitly declaring the substitutions for an
included build is possible.

For example, a single-project build called anonymous-library, produces a Java utility library but does
not declare a value for the group attribute:

build.gradle.kts

plugins {
java
}

build.gradle

plugins {
id 'java'
}

When this build is included in a composite, it will attempt to substitute for the dependency module
undefined:anonymous-library (undefined being the default value for project.group, and anonymous-
library being the root project name). Clearly, this isn’t useful in a composite build.

To use the unpublished library in a composite build, you can explicitly declare the substitutions
that it provides:

settings.gradle.kts

includeBuild("anonymous-library") {
dependencySubstitution {
substitute(module("org.sample:number-utils")).using(project(":"))
}
}

settings.gradle

includeBuild('anonymous-library') {
dependencySubstitution {
substitute module('org.sample:number-utils') using project(':')
}
}

With this configuration, the my-app composite build will substitute any dependency on
org.sample:number-utils with a dependency on the root project of anonymous-library.

Deactivate included build substitutions for a configuration

If you need to resolve a published version of a module that is also available as part of an included
build, you can deactivate the included build substitution rules on the ResolutionStrategy of the
Configuration that is resolved. This is necessary because the rules are globally applied in the build,
and Gradle does not consider published versions during resolution by default.

For example, we create a separate publishedRuntimeClasspath configuration that gets resolved to the
published versions of modules that also exist in one of the local builds. This is done by deactivating
global dependency substitution rules:

build.gradle.kts

configurations.create("publishedRuntimeClasspath") {
resolutionStrategy.useGlobalDependencySubstitutionRules = false

extendsFrom(configurations.runtimeClasspath.get())
isCanBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE,
objects.named(Usage.JAVA_RUNTIME))
}

build.gradle

configurations.create('publishedRuntimeClasspath') {
resolutionStrategy.useGlobalDependencySubstitutionRules = false

extendsFrom(configurations.runtimeClasspath)
canBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage
.JAVA_RUNTIME))
}

A use-case would be to compare published and locally built JAR files.


Cases where included build substitutions must be declared

Many builds will function automatically as an included build, without declared substitutions. Here
are some common cases where declared substitutions are required:

• When the archivesBaseName property is used to set the name of the published artifact.

• When a configuration other than default is published.

• When the MavenPom.addFilter() is used to publish artifacts that don’t match the project name.

• When the maven-publish or ivy-publish plugins are used for publishing and the publication
coordinates don’t match ${project.group}:${project.name}.

Cases where composite build substitutions won’t work

Some builds won’t function correctly when included in a composite, even when dependency
substitutions are explicitly declared. This limitation is because a substituted project dependency
will always point to the default configuration of the target project. Any time the artifacts and
dependencies specified for the default configuration of a project don’t match what is published to a
repository, the composite build may exhibit different behavior.

Here are some cases where the published module metadata may be different from the project
default configuration:

• When a configuration other than default is published.

• When the maven-publish or ivy-publish plugins are used.

• When the POM or ivy.xml file is tweaked as part of publication.

Builds using these features function incorrectly when included in a composite build.

Depending on tasks in an included build

While included builds are isolated from one another and cannot declare direct dependencies, a
composite build can declare task dependencies on its included builds. The included builds are
accessed using Gradle.getIncludedBuilds() or Gradle.includedBuild(java.lang.String), and a task
reference is obtained via the IncludedBuild.task(java.lang.String) method.

Using these APIs, it is possible to declare a dependency on a task in a particular included build:

build.gradle.kts

tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle

tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}

Or you can declare a dependency on tasks with a certain path in some or all of the included builds:

build.gradle.kts

tasks.register("publishDeps") {
dependsOn(gradle.includedBuilds.map {
it.task(":publishMavenPublicationToMavenRepository") })
}

build.gradle

tasks.register('publishDeps') {
dependsOn gradle.includedBuilds*.task(
':publishMavenPublicationToMavenRepository')
}

Limitations of composite builds

Limitations of the current implementation include:

• No support for included builds with publications that don’t mirror the project default
configuration.
See Cases where composite build substitutions won’t work.

• Multiple composite builds may conflict when run in parallel if more than one includes the same
build.
Gradle does not share the project lock of a shared composite build between Gradle invocations
to prevent concurrent execution.

Configuration On Demand
Configuration-on-demand attempts to configure only the relevant projects for the requested tasks,
i.e., it only evaluates the build script file of projects participating in the build. This way, the
configuration time of a large multi-project build can be reduced.
The configuration-on-demand feature is incubating, so only some builds are guaranteed to work
correctly. The feature works well for decoupled multi-project builds.

In configuration-on-demand mode, projects are configured as follows:

• The root project is always configured.

• The project in the directory where the build is executed is also configured, but only when
Gradle is executed without any tasks.
This way, the default tasks behave correctly when projects are configured on demand.

• The standard project dependencies are supported, and relevant projects are configured.
If project A has a compile dependency on project B, then building A causes the configuration of
both projects.

• The task dependencies declared via the task path are supported and cause relevant projects to
be configured.
Example: someTask.dependsOn(":some-other-project:someOtherTask")

• A task requested via task path from the command line (or tooling API) causes the relevant
project to be configured.
For example, building project-a:project-b:someTask causes configuration of project-b.

Enable configuration-on-demand

You can enable configuration-on-demand using the --configure-on-demand flag or by adding
org.gradle.configureondemand=true to the gradle.properties file.

To configure on demand with every build run, see Gradle properties.

To configure on demand for a given build, see command-line performance-oriented options.

Decoupled projects

Gradle allows projects to access each other’s configurations and tasks during the configuration and
execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform
optimizations such as parallel project builds and configuration on demand.

Projects are considered decoupled when they interact solely through declared dependencies and
task dependencies. Any direct modification or reading of another project’s object creates coupling
between the projects. Coupling during configuration can result in flawed build outcomes when
using 'configuration on demand', while coupling during execution can affect parallel execution.

One common source of coupling is configuration injection, such as using allprojects {} or
subprojects {} in build scripts.

To avoid coupling issues, it’s recommended to:

• Refrain from referencing other subprojects' build scripts and prefer cross-configuration from
the root project.

• Avoid dynamically changing other projects' configurations during execution.


As Gradle evolves, it aims to provide features that leverage decoupled projects while offering
solutions for common use cases like configuration injection without introducing coupling.

Parallel projects

Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently
executing tasks from different projects.

To enable parallel execution, use the --parallel command-line argument or configure your build
environment. Gradle automatically determines the optimal number of parallel threads based on
CPU cores.
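To enable parallel execution on every invocation rather than per build, the corresponding Gradle
property can be set:

gradle.properties

org.gradle.parallel=true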

During parallel execution, each worker handles a specific project exclusively. Task dependencies
are respected, with workers prioritizing upstream tasks. However, tasks may not execute in
alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and
inputs/outputs to avoid ordering issues.
DEVELOPING TASKS
Understanding Tasks
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.

Before reading this chapter, it’s recommended that you first read the Learning The Basics and
complete the Tutorial.

Listing tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in a project by running the following command in the terminal:

$ ./gradlew tasks

Let’s take a very basic Gradle project as an example. The project has the following structure:

gradle-project
├── app
│ ├── build.gradle.kts // empty file - no build logic
│ └── ... // some java code
├── settings.gradle.kts // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat

gradle-project
├── app
│ ├── build.gradle // empty file - no build logic
│ └── ... // some java code
├── settings.gradle // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat

The settings file contains the following:

settings.gradle.kts

rootProject.name = "gradle-project"
include("app")

settings.gradle

rootProject.name = 'gradle-project'
include('app')

Currently, the app subproject’s build file is empty.

To see the tasks available in the app subproject, run ./gradlew :app:tasks:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

We observe that only a small number of help tasks are available at the moment. This is because the
core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build
your project or compile your code, are added by plugins.

Let’s explore this by adding the Gradle core base plugin to the app build script:

app/build.gradle.kts

plugins {
id("base")
}

app/build.gradle

plugins {
id('base')
}

The base plugin adds central lifecycle tasks. Now when we run ./gradlew app:tasks, we can see the
assemble and build tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.

Task outcomes

When Gradle executes a task, it labels the task with outcomes via the console.

These labels are based on whether a task has actions to execute and if Gradle executed them.
Actions include, but are not limited to, compiling code, zipping files, and publishing archives.

(no label) or EXECUTED
Task executed its actions.

• Task has actions and Gradle executed them.

• Task has no actions and some dependencies, and Gradle executed one or more of the
dependencies. See also Lifecycle Tasks.

UP-TO-DATE
Task’s outputs did not change.

• Task has outputs and inputs but they have not changed. See Incremental Build.

• Task has actions, but the task tells Gradle it did not change its outputs.

• Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED
or FROM-CACHE. See Lifecycle Tasks.

• Task has no actions and no dependencies.

FROM-CACHE
Task’s outputs could be found from a previous execution.

• Task has outputs restored from the build cache. See Build Cache.

SKIPPED
Task did not execute its actions.

• Task has been explicitly excluded from the command-line. See Excluding tasks from
execution.

• Task has an onlyIf predicate return false. See Using a predicate.

NO-SOURCE
Task did not need to execute its actions.

• Task has inputs and outputs, but no sources (i.e., inputs were not found).

Task group and description

Task groups and descriptions are used to organize and describe tasks.

Groups
Task groups are used to categorize tasks. When you run ./gradlew tasks, tasks are listed under
their respective groups, making it easier to understand their purpose and relationship to other
tasks. Groups are set using the group property.

Descriptions
Descriptions provide a brief explanation of what a task does. When you run ./gradlew tasks, the
descriptions are shown next to each task, helping you understand its purpose and how to use it.
Descriptions are set using the description property.

Let’s consider a basic Java application as an example. The build contains a subproject called app.

Let’s list the available tasks in app at the moment:

$ ./gradlew :app:tasks

> Task :app:tasks


------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application.

Build tasks
-----------
assemble - Assembles the outputs of this project.

Here, the :run task is part of the Application group with the description Runs this project as a JVM
application. In code, it would look something like this:

app/build.gradle.kts

tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}

app/build.gradle

tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}

Private and hidden tasks

Gradle doesn’t support marking a task as private.

However, tasks will only show up when running :tasks if task.group is set or no other task depends
on it.

For instance, the following task will not appear when running ./gradlew :app:tasks because it does
not have a group; it is called a hidden task:

app/build.gradle.kts

tasks.register("helloTask") {
println("Hello")
}

app/build.gradle

tasks.register("helloTask") {
println("Hello")
}

Although helloTask is not listed, it can still be executed by Gradle:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Let’s add a group to the same task:

app/build.gradle.kts

tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}

app/build.gradle

tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}

Now that the group is added, the task is visible:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Other tasks
-----------
helloTask - Hello task

In contrast, ./gradlew tasks --all will show all tasks; hidden and visible tasks are listed.

Grouping tasks

If you want to customize which tasks are shown to users when listed, you can group tasks and set
the visibility of each group.

NOTE Remember, even if you hide tasks, they are still available, and Gradle can still run them.

Let’s start with an example built by Gradle init for a Java application with multiple subprojects.
The project structure is as follows:

gradle-project
├── app
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle.kts
│ ├── settings.gradle.kts
│ └── src // common build logic
│ └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat

gradle-project
├── app
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle
│ ├── settings.gradle
│ └── src // common build logic
│ └── ...
├── settings.gradle
├── gradle
├── gradlew
└── gradlew.bat

Run app:tasks to see available tasks in the app subproject:

$ ./gradlew :app:tasks

> Task :app:tasks


------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
If we look at the list of tasks available, even for a standard Java project, it’s extensive. Many of these
tasks are rarely required directly by developers using the build.

We can configure the :tasks task and limit the tasks shown to a certain group.

Let’s create our own group so that all tasks are hidden by default by updating the app build script:

app/build.gradle.kts

val myBuildGroup = "my app build" // Create a group name

tasks.register<TaskReportTask>("tasksAll") { // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named<TaskReportTask>("tasks") { // Move all existing tasks to the group
    displayGroup = myBuildGroup
}

app/build.gradle

def myBuildGroup = "my app build" // Create a group name

tasks.register("tasksAll", TaskReportTask) { // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named("tasks", TaskReportTask) { // Move all existing tasks to the group
    displayGroup = myBuildGroup
}

Now, when we list tasks available in app, the list is shorter:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

My app build tasks
------------------
tasksAll - Show additional tasks.

Task categories

Gradle distinguishes between two categories of tasks:

1. Lifecycle tasks

2. Actionable tasks

Lifecycle tasks define targets you can call, such as :build, which builds your project. Lifecycle
tasks do not provide Gradle with actions; they must be wired to actionable tasks. The base Gradle
plugin only adds lifecycle tasks.

Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java
code of your project. Actions include creating JARs, zipping files, publishing archives, and much
more. Plugins like the java-library plugin add actionable tasks.
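As a small illustration of the difference, a lifecycle task of our own would carry no actions and only
aggregate actionable tasks. A hypothetical sketch (the task name and wiring are illustrative, and it
assumes a plugin such as java-library already provides the test task):

build.gradle.kts

tasks.register("qualityCheck") {
    group = "verification"
    description = "Runs all code quality checks."
    // A lifecycle task: no actions of its own, only wiring to actionable tasks
    dependsOn(tasks.named("test"))
}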

Let’s update the build script of the previous example, which is currently an empty file so that our
app subproject is a Java library:

app/build.gradle.kts

plugins {
id("java-library")
}

app/build.gradle

plugins {
id('java-library')
}

Once again, we list the available tasks to see what new tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.

We see that many new tasks are available such as jar and testClasses.

Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks. If we call the
:build task, we can see several tasks have been executed, including the :app:compileJava task.

$ ./gradlew :app:build

> Task :app:compileJava


> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

The actionable :compileJava task is wired to the lifecycle :build task.

Incremental tasks

A key feature of Gradle tasks is their incremental nature.

Gradle can reuse results from prior builds. Therefore, if we’ve built our project before and made
only minor changes, rerunning :build will not require Gradle to perform extensive work.

For example, if we modify only the test code in our project, leaving the production code unchanged,
executing the build will solely recompile the test code. Gradle marks the tasks for the production
code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:

$ ./gradlew :app:build

> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:assemble UP-TO-DATE
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check UP-TO-DATE
> Task :app:build UP-TO-DATE

Caching tasks

Gradle can reuse results from past builds using the build cache.

To enable this feature, activate the build cache by using the --build-cache command line parameter
or by setting org.gradle.caching=true in your gradle.properties file.

This optimization has the potential to accelerate your builds significantly:

$ ./gradlew :app:clean :app:build --build-cache

> Task :app:compileJava FROM-CACHE


> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava FROM-CACHE
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses UP-TO-DATE
> Task :app:test FROM-CACHE
> Task :app:check UP-TO-DATE
> Task :app:build

When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.

The build cache is handy if you switch between branches regularly. Gradle supports both local and
remote build caches.

Developing tasks

When developing Gradle tasks, you have two choices:

1. Use an existing Gradle task type such as Zip, Copy, or Delete

2. Create your own Gradle task type such as MyResolveTask or CustomTaskUsingToolchains.

Task types are simply subclasses of the Gradle Task class.

With Gradle tasks, there are three states to consider:

1. Registering a task - using a task (implemented by you or provided by Gradle) in your build
logic.

2. Configuring a task - defining inputs and outputs for a registered task.

3. Implementing a task - creating a custom task class (i.e., custom class type).

Registration is commonly done with the register() method.


Configuring a task is commonly done with the named() method.
Implementing a task is commonly done by extending Gradle’s DefaultTask class:

tasks.register<Copy>("myCopy") ①

tasks.named<Copy>("myCopy") { ②
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}

abstract class MyCopyTask : DefaultTask() { ③

    @TaskAction
    fun copyFiles() {
        val sourceDir = File("sourceDir")
        val destinationDir = File("destinationDir")
        sourceDir.listFiles()?.forEach { file ->
            if (file.isFile && file.extension == "txt") {
                file.copyTo(File(destinationDir, file.name))
            }
        }
    }
}

① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.

② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.

③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.

tasks.register("myCopy", Copy) ①

tasks.named("myCopy", Copy) { ②
    from "resources"
    into "target"
    include "**/*.txt", "**/*.xml", "**/*.properties"
}

abstract class MyCopyTask extends DefaultTask { ③

    @TaskAction
    void copyFiles() {
        def sourceDir = new File('sourceDir')
        def destinationDir = new File('destinationDir')
        sourceDir.listFiles()?.each { file ->
            if (file.isFile() && file.name.endsWith('.txt')) {
                new File(destinationDir, file.name).bytes = file.bytes
            }
        }
    }
}

① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.

② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.

③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.

1. Registering tasks

You define actions for Gradle to take by registering tasks in build scripts or plugins.
Tasks are defined using strings for task names:

build.gradle.kts

tasks.register("hello") {
doLast {
println("hello")
}
}

build.gradle

tasks.register('hello') {
doLast {
println 'hello'
}
}

In the example above, the task is added to the TaskContainer using the register() method.

2. Configuring tasks

Gradle tasks must be configured to complete their action(s) successfully. If a task needs to ZIP a file,
it must be configured with the file name and location. You can refer to the API for the Gradle Zip
task to learn how to configure it appropriately.
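For instance, a brief sketch of configuring the built-in Zip task (the task name and paths below are
illustrative):

build.gradle.kts

tasks.register<Zip>("packageDocs") {
    // What to archive and where to put the result
    from(layout.projectDirectory.dir("docs"))
    archiveFileName.set("docs.zip")
    destinationDirectory.set(layout.buildDirectory.dir("dist"))
}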

Let’s look at the Copy task provided by Gradle as an example. We first register a task called myCopy of
type Copy in the build script:

build.gradle.kts

tasks.register<Copy>("myCopy")

build.gradle

tasks.register('myCopy', Copy)

This registers a copy task with no default behavior. Since the task is of type Copy, a Gradle supported
task type, it can be configured using its API.

The following examples show several ways to achieve the same configuration:

1. Using the named() method:

Use named() to configure an existing task registered elsewhere:

build.gradle.kts

tasks.named<Copy>("myCopy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}

build.gradle

tasks.named('myCopy') {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}

2. Using a configuration block:

Use a block to configure the task immediately upon registering it:

build.gradle.kts

tasks.register<Copy>("copy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}

build.gradle

tasks.register('copy', Copy) {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}

3. Name method as call:

A popular option that is only supported in Groovy is the shorthand notation:

copy {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}

NOTE This option breaks task configuration avoidance and is not recommended!

Regardless of the method chosen, the task is configured with the name of the files to be copied and
the location of the files.

3. Implementing tasks

Gradle provides many task types including Delete, Javadoc, Copy, Exec, Tar, and Pmd. You can
implement a custom task type if Gradle does not provide a task type that meets your build logic
needs.

To create a custom task class, you extend DefaultTask and make the extending class abstract:

app/build.gradle.kts

abstract class MyCopyTask : DefaultTask() {

app/build.gradle

abstract class MyCopyTask extends DefaultTask {

You can learn more about developing custom task types in Implementing Tasks.

Configuring Tasks Lazily
Knowing when and where a particular value is configured is difficult to track as a build grows in
complexity. Gradle provides several ways to manage this using lazy configuration.
Understanding Lazy properties
Gradle provides lazy properties, which delay calculating a property’s value until it’s actually
required.

Lazy properties provide three main benefits:

1. Deferred Value Resolution: Allows wiring Gradle models without needing to know when a
property’s value will be known. For example, you may want to set the input source files of a
task based on the source directories property of an extension, but the extension property value
isn’t known until the build script or some other plugin configures them.

2. Automatic Task Dependency Management: Connects output of one task to input of another,
automatically determining task dependencies. Property instances carry information about
which task, if any, produces their value. Build authors do not need to worry about keeping task
dependencies in sync with configuration changes.

3. Improved Build Performance: Avoids resource-intensive work during configuration,


impacting build performance positively. For example, when a configuration value comes from
parsing a file but is only used when functional tests are run, using a property instance to
capture this means that the file is parsed only when the functional tests are run (and not when
clean is run, for example).

Gradle represents lazy properties with two interfaces:

Provider
Represents a value that can only be queried and cannot be changed.

• Properties with these types are read-only.

• The method Provider.get() returns the current value of the property.

• A Provider can be created from another Provider using Provider.map(Transformer).

• Many other types extend Provider and can be used wherever a Provider is required.

Property
Represents a value that can be queried and changed.

• Properties with these types are configurable.

• Property extends the Provider interface.

• The method Property.set(T) specifies a value for the property, overwriting whatever value
may have been present.

• The method Property.set(Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together
Provider and Property instances before the values are configured.

• A Property can be created by the factory method ObjectFactory.property(Class).

Lazy properties are intended to be passed around and only queried when required. This typically
happens during the execution phase.
The following demonstrates a task with a configurable greeting property and a read-only message
property:

build.gradle.kts

abstract class Greeting : DefaultTask() { ①


@get:Input
abstract val greeting: Property<String> ②

@Internal
val message: Provider<String> = greeting.map { it + " from Gradle" } ③

@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}

tasks.register<Greeting>("greeting") {
greeting.set("Hi") ④
greeting = "Hi" ⑤
}

build.gradle

abstract class Greeting extends DefaultTask { ①


@Input
abstract Property<String> getGreeting() ②

@Internal
final Provider<String> message = greeting.map { it + ' from Gradle' } ③

@TaskAction
void printMessage() {
logger.quiet(message.get())
}
}

tasks.register("greeting", Greeting) {
greeting.set('Hi') ④
greeting = 'Hi' ⑤
}

① A task that displays a greeting


② A configurable greeting

③ Read-only property calculated from the greeting

④ Configure the greeting

⑤ Alternative notation to calling Property.set()

$ gradle greeting

> Task :greeting


Hi from Gradle

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

The Greeting task has a property of type Property<String> to represent the configurable greeting
and a property of type Provider<String> to represent the calculated, read-only, message. The
message Provider is created from the greeting Property using the map() method; its value is kept up-
to-date as the value of the greeting property changes.
Creating a Property or Provider instance
Neither Provider nor its subtypes, such as Property, are intended to be implemented by a build
script or plugin. Gradle provides factory methods to create instances of these types instead.

In the previous example, two factory methods were presented:

• ObjectFactory.property(Class) creates a new Property instance. An instance of the ObjectFactory


can be referenced from Project.getObjects() or by injecting ObjectFactory through a constructor
or method.

• Provider.map(Transformer) creates a new Provider from an existing Provider or Property


instance.

See the Quick Reference for all of the types and factories available.

A Provider can also be created by the factory method ProviderFactory.provider(Callable).

NOTE There are no specific methods to create a provider using a groovy.lang.Closure. When writing a
plugin or build script with Groovy, you can use the map(Transformer) method with a closure, and
Groovy will convert the closure to a Transformer. Similarly, when writing a plugin or build script
with Kotlin, the Kotlin compiler will convert a Kotlin function into a Transformer.
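For illustration, here is a minimal Kotlin DSL sketch (the task name showBuildTime is hypothetical)
that creates a Provider with ProviderFactory.provider(Callable) and derives a second value from it
with map():

build.gradle.kts

// A provider whose value is computed lazily from a Callable
val buildTime = providers.provider { System.currentTimeMillis() }

// Derive a new provider with map(); the lambda is converted to a Transformer
val buildTimeMessage = buildTime.map { "Build started at $it" }

tasks.register("showBuildTime") {
    doLast {
        // The values are only calculated here, when get() is called
        println(buildTimeMessage.get())
    }
}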
Connecting properties together
An important feature of lazy properties is that they can be connected together so that changes to
one property are automatically reflected in other properties.

Here is an example where the property of a task is connected to a property of a project extension:

build.gradle.kts

// A project extension
interface MessageExtension {
// A configurable greeting
abstract val greeting: Property<String>
}

// A task that displays a greeting


abstract class Greeting : DefaultTask() {
// Configurable by the user
@get:Input
abstract val greeting: Property<String>

// Read-only property calculated from the greeting


@Internal
val message: Provider<String> = greeting.map { it + " from Gradle" }

@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}

// Create the project extension


val messages = project.extensions.create<MessageExtension>("messages")

// Create the greeting task


tasks.register<Greeting>("greeting") {
    // Attach the greeting from the project extension
    // Note that the values of the project extension have not been configured yet
    greeting = messages.greeting
}

messages.apply {
    // Configure the greeting on the extension
    // Note that there is no need to reconfigure the task's `greeting` property.
    // This is automatically updated as the extension property changes
    greeting = "Hi"
}
build.gradle

// A project extension
interface MessageExtension {
// A configurable greeting
Property<String> getGreeting()
}

// A task that displays a greeting


abstract class Greeting extends DefaultTask {
// Configurable by the user
@Input
abstract Property<String> getGreeting()

// Read-only property calculated from the greeting


@Internal
final Provider<String> message = greeting.map { it + ' from Gradle' }

@TaskAction
void printMessage() {
logger.quiet(message.get())
}
}

// Create the project extension


project.extensions.create('messages', MessageExtension)

// Create the greeting task


tasks.register("greeting", Greeting) {
// Attach the greeting from the project extension
// Note that the values of the project extension have not been configured
yet
greeting = messages.greeting
}

messages {
// Configure the greeting on the extension
// Note that there is no need to reconfigure the task's `greeting`
property. This is automatically updated as the extension property changes
greeting = 'Hi'
}

$ gradle greeting

> Task :greeting


Hi from Gradle
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example calls the Property.set(Provider) method to attach a Provider to a Property to supply the
value of the property. In this case, the Provider happens to be a Property as well, but you can
connect any Provider implementation, for example one created using Provider.map().
Working with files
In Working with Files, we introduced four collection types for File-like objects:

Read-only Type Configurable Type

FileCollection ConfigurableFileCollection

FileTree ConfigurableFileTree

All of these types are also considered lazy types.

There are more strongly typed models used to represent elements of the file system: Directory and
RegularFile. These types shouldn’t be confused with the standard Java File type as they are used to
tell Gradle that you expect more specific values such as a directory or a non-directory, regular file.

Gradle provides two specialized Property subtypes for dealing with values of these types:
RegularFileProperty and DirectoryProperty. ObjectFactory has methods to create these:
ObjectFactory.fileProperty() and ObjectFactory.directoryProperty().

A DirectoryProperty can also be used to create a lazily evaluated Provider for a Directory and
RegularFile via DirectoryProperty.dir(String) and DirectoryProperty.file(String) respectively. These
methods create providers whose values are calculated relative to the location for the
DirectoryProperty they were created from. The values returned from these providers will reflect
changes to the DirectoryProperty.

build.gradle.kts

// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource : DefaultTask() {
// The configuration file to use to generate the source file
@get:InputFile
abstract val configFile: RegularFileProperty

// The directory to write source files to


@get:OutputDirectory
abstract val outputDir: DirectoryProperty

@TaskAction
fun compile() {
val inFile = configFile.get().asFile
logger.quiet("configuration file = $inFile")
val dir = outputDir.get().asFile
logger.quiet("output dir = $dir")
val className = inFile.readText().trim()
val srcFile = File(dir, "${className}.java")
srcFile.writeText("public class ${className} { }")
}
}

// Create the source generation task


tasks.register<GenerateSource>("generate") {
// Configure the locations, relative to the project and build directories
configFile = layout.projectDirectory.file("src/config.txt")
outputDir = layout.buildDirectory.dir("generated-source")
}

// Change the build directory


// Don't need to reconfigure the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource extends DefaultTask {
// The configuration file to use to generate the source file
@InputFile
abstract RegularFileProperty getConfigFile()

// The directory to write source files to


@OutputDirectory
abstract DirectoryProperty getOutputDir()

@TaskAction
def compile() {
def inFile = configFile.get().asFile
logger.quiet("configuration file = $inFile")
def dir = outputDir.get().asFile
logger.quiet("output dir = $dir")
def className = inFile.text.trim()
def srcFile = new File(dir, "${className}.java")
srcFile.text = "public class ${className} { ... }"
}
}

// Create the source generation task


tasks.register('generate', GenerateSource) {
// Configure the locations, relative to the project and build directories
configFile = layout.projectDirectory.file('src/config.txt')
outputDir = layout.buildDirectory.dir('generated-source')
}

// Change the build directory


// Don't need to reconfigure the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle generate

> Task :generate


configuration file = /home/user/gradle/samples/src/config.txt
output dir = /home/user/gradle/samples/output/generated-source

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

$ gradle generate

> Task :generate


configuration file = /home/user/gradle/samples/kotlin/src/config.txt
output dir = /home/user/gradle/samples/kotlin/output/generated-source

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example creates providers that represent locations in the project and build directories through
Project.getLayout() with ProjectLayout.getBuildDirectory() and ProjectLayout.getProjectDirectory().

To close the loop, note that a DirectoryProperty, or a simple Directory, can be turned into a FileTree
that allows the files and directories contained in the directory to be queried with
DirectoryProperty.getAsFileTree() or Directory.getAsFileTree(). From a DirectoryProperty or a
Directory, you can create FileCollection instances containing a set of the files contained in the
directory with DirectoryProperty.files(Object...) or Directory.files(Object...).
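As a brief, hypothetical Kotlin DSL sketch of the API described above (the task name listOutputs and
the reports/summary.txt path are only illustrative):

build.gradle.kts

// Turn the build directory (a DirectoryProperty) into a live FileTree
val buildOutputs = layout.buildDirectory.asFileTree

// Or select specific files as a FileCollection, resolved relative to the directory
val reportFiles = layout.buildDirectory.files("reports/summary.txt")

tasks.register("listOutputs") {
    doLast {
        // The tree is only walked here, at execution time
        buildOutputs.forEach { println(it) }
        println(reportFiles.files)
    }
}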
Working with task inputs and outputs
Many builds have several tasks connected together, where one task consumes the outputs of
another task as an input.

To make this work, we need to configure each task to know where to look for its inputs and where
to place its outputs. Ensure that the producing and consuming tasks are configured with the same
location and attach task dependencies between the tasks. This can be cumbersome and brittle if any
of these values are configurable by a user or configured by multiple plugins, as task properties need
to be configured in the correct order and locations, and task dependencies kept in sync as values
change.

The Property API makes this easier by keeping track of the value of a property and the task that
produces the value.

As an example, consider the following plugin with a producer and consumer task which are wired
together:

build.gradle.kts

abstract class Producer : DefaultTask() {


@get:OutputFile
abstract val outputFile: RegularFileProperty

@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText( message)
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer : DefaultTask() {


@get:InputFile
abstract val inputFile: RegularFileProperty

@TaskAction
fun consume() {
val input = inputFile.get().asFile
val message = input.readText()
logger.quiet("Read '${message}' from ${input}")
}
}

val producer = tasks.register<Producer>("producer")


val consumer = tasks.register<Consumer>("consumer")
consumer {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

abstract class Producer extends DefaultTask {


@OutputFile
abstract RegularFileProperty getOutputFile()

@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer extends DefaultTask {


@InputFile
abstract RegularFileProperty getInputFile()

@TaskAction
void consume() {
def input = inputFile.get().asFile
def message = input.text
logger.quiet("Read '${message}' from ${input}")
}
}

def producer = tasks.register("producer", Producer)


def consumer = tasks.register("consumer", Consumer)
consumer.configure {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer.configure {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle consumer

> Task :producer


Wrote 'Hello, World!' to /home/user/gradle/samples/output/file.txt

> Task :consumer


Read 'Hello, World!' from /home/user/gradle/samples/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

$ gradle consumer

> Task :producer


Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/file.txt

> Task :consumer


Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

In the example above, the task outputs and inputs are connected before any location is defined. The
setters can be called at any time before the task is executed, and the change will automatically
affect all related input and output properties.

Another important thing to note in this example is the absence of any explicit task dependency.
Task outputs represented using Providers keep track of which task produces their value, and using
them as task inputs will implicitly add the correct task dependencies.

Implicit task dependencies also work for input properties that are not files:

build.gradle.kts

abstract class Producer : DefaultTask() {


@get:OutputFile
abstract val outputFile: RegularFileProperty

@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText( message)
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer : DefaultTask() {


@get:Input
abstract val message: Property<String>

@TaskAction
fun consume() {
logger.quiet(message.get())
}
}

val producer = tasks.register<Producer>("producer") {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}
tasks.register<Consumer>("consumer") {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.readText() }
}

build.gradle

abstract class Producer extends DefaultTask {


@OutputFile
abstract RegularFileProperty getOutputFile()

@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer extends DefaultTask {


@Input
abstract Property<String> getMessage()

@TaskAction
void consume() {
logger.quiet(message.get())
}
}

def producer = tasks.register('producer', Producer) {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}
tasks.register('consumer', Consumer) {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.text }
}

$ gradle consumer

> Task :producer


Wrote 'Hello, World!' to /home/user/gradle/samples/build/file.txt

> Task :consumer


Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

$ gradle consumer
> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/build/file.txt

> Task :consumer


Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Working with collections
Gradle provides two lazy property types to help configure Collection properties.

These work exactly like any other Provider and, just like file providers, they have additional
modeling around them:

• For List values the interface is called ListProperty.


You can create a new ListProperty using ObjectFactory.listProperty(Class) and specifying the
element type.

• For Set values the interface is called SetProperty.


You can create a new SetProperty using ObjectFactory.setProperty(Class) and specifying the
element type.

This type of property allows you to overwrite the entire collection value with
HasMultipleValues.set(Iterable) and HasMultipleValues.set(Provider) or add new elements through
the various add methods:

• HasMultipleValues.add(T): Add a single element to the collection

• HasMultipleValues.add(Provider): Add a lazily calculated element to the collection

• HasMultipleValues.addAll(Provider): Add a lazily calculated collection of elements to the list

Just like every Provider, the collection is calculated when Provider.get() is called. The following
example shows the ListProperty in action:

build.gradle.kts

abstract class Producer : DefaultTask() {


@get:OutputFile
abstract val outputFile: RegularFileProperty

@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText( message)
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer : DefaultTask() {


@get:InputFiles
abstract val inputFiles: ListProperty<RegularFile>

@TaskAction
fun consume() {
inputFiles.get().forEach { inputFile ->
val input = inputFile.asFile
val message = input.readText()
logger.quiet("Read '${message}' from ${input}")
}
}
}

val producerOne = tasks.register<Producer>("producerOne")


val producerTwo = tasks.register<Producer>("producerTwo")
tasks.register<Consumer>("consumer") {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is automatically updated as producer.outputFile changes
producerOne { outputFile = layout.buildDirectory.file("one.txt") }
producerTwo { outputFile = layout.buildDirectory.file("two.txt") }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")

build.gradle

abstract class Producer extends DefaultTask {


@OutputFile
abstract RegularFileProperty getOutputFile()

@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}

abstract class Consumer extends DefaultTask {


@InputFiles
abstract ListProperty<RegularFile> getInputFiles()

@TaskAction
void consume() {
inputFiles.get().each { inputFile ->
def input = inputFile.asFile
def message = input.text
logger.quiet("Read '${message}' from ${input}")
}
}
}

def producerOne = tasks.register('producerOne', Producer)


def producerTwo = tasks.register('producerTwo', Producer)
tasks.register('consumer', Consumer) {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is automatically updated as producer.outputFile changes
producerOne.configure { outputFile = layout.buildDirectory.file('one.txt') }
producerTwo.configure { outputFile = layout.buildDirectory.file('two.txt') }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')

$ gradle consumer

> Task :producerOne


Wrote 'Hello, World!' to /home/user/gradle/samples/output/one.txt

> Task :producerTwo


Wrote 'Hello, World!' to /home/user/gradle/samples/output/two.txt

> Task :consumer


Read 'Hello, World!' from /home/user/gradle/samples/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed

$ gradle consumer

> Task :producerOne


Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/one.txt

> Task :producerTwo


Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/two.txt

> Task :consumer


Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
Working with maps
Gradle provides a lazy MapProperty type to allow Map values to be configured. You can create a
MapProperty instance using ObjectFactory.mapProperty(Class, Class).

Similar to other property types, a MapProperty has a set() method that you can use to specify the
value for the property. Some additional methods allow entries with lazy values to be added to the
map.

build.gradle.kts

abstract class Generator: DefaultTask() {


@get:Input
abstract val properties: MapProperty<String, Int>

@TaskAction
fun generate() {
properties.get().forEach { entry ->
logger.quiet("${entry.key} = ${entry.value}")
}
}
}

// Some values to be configured later


var b = 0
var c = 0

tasks.register<Generator>("generate") {
properties.put("a", 1)
// Values have not been configured yet
properties.put("b", providers.provider { b })
properties.putAll(providers.provider { mapOf("c" to c, "d" to c + 1) })
}

// Configure the values. There is no need to reconfigure the task


b = 2
c = 3

build.gradle

abstract class Generator extends DefaultTask {


@Input
abstract MapProperty<String, Integer> getProperties()

@TaskAction
void generate() {
properties.get().each { key, value ->
logger.quiet("${key} = ${value}")
}
}
}

// Some values to be configured later


def b = 0
def c = 0

tasks.register('generate', Generator) {
properties.put("a", 1)
// Values have not been configured yet
properties.put("b", providers.provider { b })
properties.putAll(providers.provider { [c: c, d: c + 1] })
}

// Configure the values. There is no need to reconfigure the task


b = 2
c = 3

$ gradle generate

> Task :generate


a = 1
b = 2
c = 3
d = 4

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
Applying a convention to a property
Often, you want to apply some convention, or default value to a property to be used if no value has
been configured. You can use the convention() method for this. This method accepts either a value
or a Provider, and this will be used as the value until some other value is configured.

build.gradle.kts

tasks.register("show") {
val property = objects.property(String::class)

// Set a convention
property.convention("convention 1")

println("value = " + property.get())

// Can replace the convention


property.convention("convention 2")
println("value = " + property.get())

property.set("explicit value")

// Once a value is set, the convention is ignored


property.convention("ignored convention")

doLast {
println("value = " + property.get())
}
}

build.gradle

tasks.register("show") {
def property = objects.property(String)

// Set a convention
property.convention("convention 1")

println("value = " + property.get())

// Can replace the convention


property.convention("convention 2")
println("value = " + property.get())

property.set("explicit value")
// Once a value is set, the convention is ignored
property.convention("ignored convention")

doLast {
println("value = " + property.get())
}
}

$ gradle show
value = convention 1
value = convention 2

> Task :show


value = explicit value

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

Where to apply conventions from?


There are several appropriate locations for setting a convention on a property at configuration time
(i.e., before execution).

build.gradle.kts

// setting convention when registering a task from plugin


class GreetingPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.getTasks().register<GreetingTask>("hello") {
greeter.convention("Greeter")
}
}
}

apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
// setting convention from build script
guest.convention("Guest")
}

abstract class GreetingTask : DefaultTask() {


// setting convention from constructor
@get:Input
abstract val guest: Property<String>
init {
guest.convention("person2")
}

// setting convention from declaration


@Input
val greeter = project.objects.property<String>().convention("person1")

@TaskAction
fun greet() {
println("hello, ${guest.get()}, from ${greeter.get()}")
}
}

build.gradle

// setting convention when registering a task from plugin


class GreetingPlugin implements Plugin<Project> {
void apply(Project project) {
project.getTasks().register("hello", GreetingTask) {
greeter.convention("Greeter")
}
}
}

apply plugin: GreetingPlugin

tasks.withType(GreetingTask).configureEach {
// setting convention from build script
guest.convention("Guest")
}

abstract class GreetingTask extends DefaultTask {


// setting convention from constructor
@Input
abstract Property<String> getGuest()

GreetingTask() {
guest.convention("person2")
}

// setting convention from declaration


@Input
final Property<String> greeter = project.objects.property(String)
.convention("person1")

@TaskAction
void greet() {
println("hello, ${guest.get()}, from ${greeter.get()}")
}
}

From a plugin’s apply() method

Plugin authors may configure a convention on a lazy property from a plugin’s apply() method,
while performing preliminary configuration of the task or extension defining the property. This
works well for regular plugins (meant to be distributed and used in the wild), and internal
convention plugins (which often configure properties defined by third party plugins in a uniform
way for the entire build).

build.gradle.kts

// setting convention when registering a task from plugin


class GreetingPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.getTasks().register<GreetingTask>("hello") {
greeter.convention("Greeter")
}
}
}

build.gradle

// setting convention when registering a task from plugin


class GreetingPlugin implements Plugin<Project> {
void apply(Project project) {
project.getTasks().register("hello", GreetingTask) {
greeter.convention("Greeter")
}
}
}

From a build script

Build engineers may configure a convention on a lazy property from shared build logic that is
configuring tasks (for instance, from third-party plugins) in a standard way for the entire build.
build.gradle.kts

apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
// setting convention from build script
guest.convention("Guest")
}

build.gradle

tasks.withType(GreetingTask).configureEach {
// setting convention from build script
guest.convention("Guest")
}

Note that for project-specific values, instead of conventions, you should prefer setting explicit
values (using Property.set(…) or ConfigurableFileCollection.setFrom(…), for instance), as
conventions are only meant to define defaults.

From the task initialization

A task author may configure a convention on a lazy property from the task constructor or (if in
Kotlin) initializer block. This approach works for properties with trivial defaults, but it is not
appropriate if additional context (external to the task implementation) is required in order to set a
suitable default.

build.gradle.kts

// setting convention from constructor


@get:Input
abstract val guest: Property<String>

init {
guest.convention("person2")
}

build.gradle

// setting convention from constructor


@Input
abstract Property<String> getGuest()

GreetingTask() {
guest.convention("person2")
}

Next to the property declaration

You may configure a convention on a lazy property next to the place where the property is
declared. Note this option is not available for managed properties, and has the same caveats as
configuring a convention from the task constructor.

build.gradle.kts

// setting convention from declaration


@Input
val greeter = project.objects.property<String>().convention("person1")

build.gradle

// setting convention from declaration


@Input
final Property<String> greeter = project.objects.property(String).convention
("person1")
Making a property unmodifiable
Most properties of a task or project are intended to be configured by plugins or build scripts so that
they can use specific values for that build.

For example, a property that specifies the output directory for a compilation task may start with a
value specified by a plugin. Then a build script might change the value to some custom location,
then this value is used by the task when it runs. However, once the task starts to run, we want to
prevent further property changes. This way we avoid errors that result from different consumers,
such as the task action, Gradle’s up-to-date checks, build caching, or other tasks, using different
values for the property.

Lazy properties provide several methods that you can use to disallow changes to their value once
the value has been configured. The finalizeValue() method calculates the final value for the
property and prevents further changes to the property.

libVersioning.version.finalizeValue()

When the property’s value comes from a Provider, the provider is queried for its current value, and
the result becomes the final value for the property. This final value replaces the provider and the
property no longer tracks the value of the provider. Calling this method also makes a property
instance unmodifiable and any further attempts to change the value of the property will fail. Gradle
automatically makes the properties of a task final when the task starts execution.

The finalizeValueOnRead() method is similar, except that the property’s final value is not calculated
until the value of the property is queried.

modifiedFiles.finalizeValueOnRead()

In other words, this method calculates the final value lazily as required, whereas finalizeValue()
calculates the final value eagerly. Use this method when the value may be expensive to calculate or
may not have been configured yet, or when you want to ensure that all consumers of the property see
the same value when they query it.
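Here is a minimal sketch in the Kotlin DSL (the property names and values are purely illustrative)
showing the difference between the two methods:

build.gradle.kts

// An illustrative property backed by a provider
val releaseVersion = objects.property<String>()
releaseVersion.set(providers.provider { "1.0." + System.currentTimeMillis() })

// finalizeValue() queries the provider now; the result becomes the final value
releaseVersion.finalizeValue()
// releaseVersion.set("2.0")  // any further change would now fail

// finalizeValueOnRead() defers finalization until the property is first queried
val toolVersion = objects.property<String>()
toolVersion.set("17")
toolVersion.finalizeValueOnRead()
println(toolVersion.get()) // the value is calculated and locked in here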
Using the Provider API
Guidelines to be successful with the Provider API:

1. The Property and Provider types have all of the overloads you need to query or configure a
value. For this reason, you should follow the following guidelines:

◦ For configurable properties, expose the Property directly through a single getter.

◦ For non-configurable properties, expose a Provider directly through a single getter (see the sketch after this list).

2. Avoid simplifying calls like obj.getProperty().get() and obj.getProperty().set(T) in your code


by introducing additional getters and setters.

3. When migrating your plugin to use providers, follow these guidelines:

◦ If it’s a new property, expose it as a Property or Provider using a single getter.

◦ If it’s incubating, change it to use a Property or Provider using a single getter.

◦ If it’s a stable property, add a new Property or Provider and deprecate the old one. You
should wire the old getter/setters into the new property as appropriate.

Provider Files API Reference


Use these types for read-only values:

Provider<RegularFile>
File on disk

Factories
• Provider.map(Transformer).

• Provider.flatMap(Transformer).

• DirectoryProperty.file(String)

Provider<Directory>
Directory on disk

Factories
• Provider.map(Transformer).

• Provider.flatMap(Transformer).

• DirectoryProperty.dir(String)

FileCollection
Unstructured collection of files

Factories
• Project.files(Object[])

• ProjectLayout.files(Object...)

• DirectoryProperty.files(Object...)
FileTree
Hierarchy of files

Factories
• Project.fileTree(Object) will produce a ConfigurableFileTree, or you can use
Project.zipTree(Object) and Project.tarTree(Object)

• DirectoryProperty.getAsFileTree()

Property Files API Reference


Use these types for mutable values:

RegularFileProperty
File on disk

Factories
• ObjectFactory.fileProperty()

DirectoryProperty
Directory on disk

Factories
• ObjectFactory.directoryProperty()

ConfigurableFileCollection
Unstructured collection of files

Factories
• ObjectFactory.fileCollection()

ConfigurableFileTree
Hierarchy of files

Factories
• ObjectFactory.fileTree()

SourceDirectorySet
Hierarchy of source directories

Factories
• ObjectFactory.sourceDirectorySet(String, String)

Lazy Collections API Reference


Use these types for mutable values:

ListProperty<T>
a property whose value is List<T>
Factories
• ObjectFactory.listProperty(Class)

SetProperty<T>
a property whose value is Set<T>

Factories
• ObjectFactory.setProperty(Class)

Lazy Objects API Reference


Use these types for read only values:

Provider<T>
a property whose value is an instance of T

Factories
• Provider.map(Transformer).

• Provider.flatMap(Transformer).

• ProviderFactory.provider(Callable). Always prefer one of the other factory methods over


this method.

Use these types for mutable values:

Property<T>
a property whose value is an instance of T

Factories
• ObjectFactory.property(Class)

Developing Parallel Tasks


Gradle provides an API that can split tasks into sections that can be executed in parallel.

This allows Gradle to fully utilize the resources available and complete builds faster.
The Worker API

The Worker API provides the ability to break up the execution of a task action into discrete units of
work and then execute that work concurrently and asynchronously.

Worker API example

The best way to understand how to use the API is to go through the process of converting an
existing custom task to use the Worker API:

1. You’ll start by creating a custom task class that generates MD5 hashes for a configurable set of
files.

2. Then, you’ll convert this custom task to use the Worker API.

3. Then, we’ll explore running the task with different levels of isolation.

In the process, you’ll learn about the basics of the Worker API and the capabilities it provides.

Step 1. Create a custom task class

First, create a custom task that generates MD5 hashes of a configurable set of files.

In a new directory, create a buildSrc/build.gradle(.kts) file:

buildSrc/build.gradle.kts

repositories {
mavenCentral()
}

dependencies {
implementation("commons-io:commons-io:2.5")
implementation("commons-codec:commons-codec:1.9") ①
}

buildSrc/build.gradle

repositories {
mavenCentral()
}

dependencies {
implementation 'commons-io:commons-io:2.5'
implementation 'commons-codec:commons-codec:1.9' ①
}
① Your custom task class will use Apache Commons Codec to generate MD5 hashes.

Next, create a custom task class in your buildSrc/src/main/java directory. You should name this
class CreateMD5:

buildSrc/src/main/java/CreateMD5.java

import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.SourceTask;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkerExecutor;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

abstract public class CreateMD5 extends SourceTask { ①

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory(); ②

@TaskAction
public void createHashes() {
for (File sourceFile : getSource().getFiles()) { ③
try {
InputStream stream = new FileInputStream(sourceFile);
System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
// Artificially make this task slower.
Thread.sleep(3000); ④
Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5"); ⑤
FileUtils.writeStringToFile(md5File.get().getAsFile(), DigestUtils.md5Hex(stream), (String) null);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}
}

① SourceTask is a convenience type for tasks that operate on a set of source files.

② The task output will go into a configured directory.

③ The task iterates over all the files defined as "source files" and creates an MD5 hash of each.
④ Insert an artificial sleep to simulate hashing a large file (the sample files won’t be that large).

⑤ The MD5 hash of each file is written to the output directory into a file of the same name with an
"md5" extension.

Next, create a build.gradle(.kts) that registers your new CreateMD5 task:

build.gradle.kts

plugins { id("base") } ①

tasks.register<CreateMD5>("md5") {
destinationDirectory = project.layout.buildDirectory.dir("md5") ②
source(project.layout.projectDirectory.file("src")) ③
}

build.gradle

plugins { id 'base' } ①

tasks.register("md5", CreateMD5) {
destinationDirectory = project.layout.buildDirectory.dir("md5") ②
source(project.layout.projectDirectory.file('src')) ③
}

① Apply the base plugin so that you’ll have a clean task to use to remove the output.

② MD5 hash files will be written to build/md5.

③ This task will generate MD5 hash files for every file in the src directory.

You will need some source to generate MD5 hashes from. Create three files in the src directory:

src/einstein.txt

Intellectual growth should commence at birth and cease only at death.

src/feynman.txt

I was born not knowing and have had only a little time to change that here and there.

src/hawking.txt

Intelligence is the ability to adapt to change.


At this point, you can test your task by running ./gradlew md5:

$ gradle md5

The output should look similar to:

> Task :md5


Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 9s
3 actionable tasks: 3 executed

In the build/md5 directory, you should now see corresponding files with an md5 extension containing
MD5 hashes of the files from the src directory. Notice that the task takes at least 9 seconds to run
because it hashes each file one at a time (i.e., three files at ~3 seconds apiece).

Step 2. Convert to the Worker API

Although this task processes each file in sequence, the processing of each file is independent of any
other file. This work can be done in parallel and take advantage of multiple processors. This is
where the Worker API can help.

To use the Worker API, you need to define an interface that represents the parameters of each unit
of work and extends org.gradle.workers.WorkParameters.

For the generation of MD5 hash files, the unit of work will require two parameters:

1. the file to be hashed and,

2. the file to write the hash to.

There is no need to create a concrete implementation because Gradle will generate one for us at
runtime.

buildSrc/src/main/java/MD5WorkParameters.java

import org.gradle.api.file.RegularFileProperty;
import org.gradle.workers.WorkParameters;

public interface MD5WorkParameters extends WorkParameters {


RegularFileProperty getSourceFile(); ①
RegularFileProperty getMD5File();
}

① Use Property objects to represent the source and MD5 hash files.

Then, you need to refactor the part of your custom task that does the work for each individual file
into a separate class. This class is your "unit of work" implementation, and it should be an abstract
class that implements org.gradle.workers.WorkAction:

buildSrc/src/main/java/GenerateMD5.java

import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.workers.WorkAction;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public abstract class GenerateMD5 implements WorkAction<MD5WorkParameters> { ①


@Override
public void execute() {
try {
File sourceFile = getParameters().getSourceFile().getAsFile().get();
File md5File = getParameters().getMD5File().getAsFile().get();
InputStream stream = new FileInputStream(sourceFile);
System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
// Artificially make this task slower.
Thread.sleep(3000);
FileUtils.writeStringToFile(md5File, DigestUtils.md5Hex(stream), (String)
null);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}

① Do not implement the getParameters() method - Gradle will inject this at runtime.

Now, change your custom task class to submit work to the WorkerExecutor instead of doing the
work itself.

buildSrc/src/main/java/CreateMD5.java

import org.gradle.api.Action;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.workers.*;
import org.gradle.api.file.DirectoryProperty;

import javax.inject.Inject;
import java.io.File;

abstract public class CreateMD5 extends SourceTask {

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();

@Inject
abstract public WorkerExecutor getWorkerExecutor(); ①

@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().noIsolation(); ②

for (File sourceFile : getSource().getFiles()) {


Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile
.getName() + ".md5");
workQueue.submit(GenerateMD5.class, parameters -> { ③
parameters.getSourceFile().set(sourceFile);
parameters.getMD5File().set(md5File);
});
}
}
}

① The WorkerExecutor service is required in order to submit your work. Create an abstract getter
method annotated with javax.inject.Inject, and Gradle will inject the service at runtime when the
task is created.

② Before submitting work, get a WorkQueue object with the desired isolation mode (described
below).

③ When submitting the unit of work, specify the unit of work implementation, in this case
GenerateMD5, and configure its parameters.

At this point, you should be able to rerun your task:

$ gradle clean md5

> Task :md5


Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

The results should look the same as before, although the MD5 hash files may be generated in a
different order since the units of work are executed in parallel. This time, however, the task runs
much faster. This is because the Worker API executes the MD5 calculation for each file in parallel
rather than in sequence.

Step 3. Change the isolation mode

The isolation mode controls how strongly Gradle will isolate items of work from each other and the
rest of the Gradle runtime.

There are three methods on WorkerExecutor that control this:

1. noIsolation()

2. classLoaderIsolation()

3. processIsolation()

The noIsolation() mode is the lowest level of isolation and will prevent a unit of work from
changing the project state. This is the fastest isolation mode because it requires the least overhead
to set up and execute the work item. However, it will use a single shared classloader for all units of
work. This means that each unit of work can affect one another through static class state. It also
means that every unit of work uses the same version of libraries on the buildscript classpath. If you
wanted the user to be able to configure the task to run with a different (but compatible) version of
the Apache Commons Codec library, you would need to use a different isolation mode.

First, you must change the dependency in buildSrc/build.gradle to be compileOnly. This tells Gradle
that it should use this dependency when building the classes, but should not put it on the build
script classpath:

buildSrc/build.gradle.kts

repositories {
mavenCentral()
}

dependencies {
implementation("commons-io:commons-io:2.5")
compileOnly("commons-codec:commons-codec:1.9")
}

buildSrc/build.gradle

repositories {
mavenCentral()
}

dependencies {
implementation 'commons-io:commons-io:2.5'
compileOnly 'commons-codec:commons-codec:1.9'
}

Next, change the CreateMD5 task to allow the user to configure the version of the codec library that
they want to use. It will resolve the appropriate version of the library at runtime and configure the
workers to use this version.

The classLoaderIsolation() method tells Gradle to run this work in a thread with an isolated
classloader:

buildSrc/src/main/java/CreateMD5.java

import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;

import javax.inject.Inject;
import java.io.File;
import java.util.Set;

abstract public class CreateMD5 extends SourceTask {

@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();

@Inject
abstract public WorkerExecutor getWorkerExecutor();

@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(workerSpec -> {
workerSpec.getClasspath().from(getCodecClasspath()); ②
});

for (File sourceFile : getSource().getFiles()) {


Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile
.getName() + ".md5");
workQueue.submit(GenerateMD5.class, parameters -> {
parameters.getSourceFile().set(sourceFile);
parameters.getMD5File().set(md5File);
});
}
}
}

① Expose an input property for the codec library classpath.

② Configure the classpath on the ClassLoaderWorkerSpec when creating the work queue.
Next, you need to configure your build so that it has a repository to look up the codec version at
task execution time. We also create a dependency to resolve our codec library from this repository:

build.gradle.kts

plugins { id("base") }

repositories {
mavenCentral() ①
}

val codec = configurations.create("codec") { ②


attributes {
attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
}
isVisible = false
isCanBeConsumed = false
}

dependencies {
codec("commons-codec:commons-codec:1.10") ③
}

tasks.register<CreateMD5>("md5") {
codecClasspath.from(codec) ④
destinationDirectory = project.layout.buildDirectory.dir("md5")
source(project.layout.projectDirectory.file("src"))
}

build.gradle

plugins { id 'base' }

repositories {
mavenCentral() ①
}

configurations.create('codec') { ②
attributes {
attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
}
visible = false
canBeConsumed = false
}

dependencies {
codec 'commons-codec:commons-codec:1.10' ③
}

tasks.register('md5', CreateMD5) {
codecClasspath.from(configurations.codec) ④
destinationDirectory = project.layout.buildDirectory.dir('md5')
source(project.layout.projectDirectory.file('src'))
}

① Add a repository to resolve the codec library - this can be a different repository than the one
used to build the CreateMD5 task class.

② Add a configuration to resolve our codec library version.

③ Configure an alternate, compatible version of Apache Commons Codec.

④ Configure the md5 task to use the configuration as its classpath. Note that the configuration will
not be resolved until the task is executed.

Now, if you run your task, it should work as expected using the configured version of the codec
library:

$ gradle clean md5

> Task :md5


Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

Step 4. Create a Worker Daemon

Sometimes, it is desirable to utilize even greater levels of isolation when executing items of work.
For instance, external libraries may rely on certain system properties to be set, which may conflict
between work items. Or a library might not be compatible with the version of JDK that Gradle is
running with and may need to be run with a different version.

The Worker API can accommodate this using the processIsolation() method that causes the work
to execute in a separate "worker daemon". These worker processes will be session-scoped and can
be reused within the same build session, but they won’t persist across builds. However, if system
resources get low, Gradle will stop unused worker daemons.

To utilize a worker daemon, use the processIsolation() method when creating the WorkQueue. You
may also want to configure custom settings for the new process:
buildSrc/src/main/java/CreateMD5.java

import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;

import javax.inject.Inject;
import java.io.File;
import java.util.Set;

abstract public class CreateMD5 extends SourceTask {

@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①

@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();

@Inject
abstract public WorkerExecutor getWorkerExecutor();

@TaskAction
public void createHashes() {

WorkQueue workQueue = getWorkerExecutor().processIsolation(workerSpec -> {
workerSpec.getClasspath().from(getCodecClasspath());
workerSpec.forkOptions(options -> {
options.setMaxHeapSize("64m"); ②
});
});

for (File sourceFile : getSource().getFiles()) {


Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile
.getName() + ".md5");
workQueue.submit(GenerateMD5.class, parameters -> {
parameters.getSourceFile().set(sourceFile);
parameters.getMD5File().set(md5File);
});
}
}
}

① Change the isolation mode to PROCESS.

② Set up the JavaForkOptions for the new process.


Now, you should be able to run your task, and it will work as expected but using worker daemons
instead:

$ gradle clean md5

> Task :md5


Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

Note that the execution time may be high. This is because Gradle has to start a new process for each
worker daemon, which is expensive.

However, if you run your task a second time, you will see that it runs much faster. This is because
the worker daemon(s) started during the initial build have persisted and are available for use
immediately during subsequent builds:

$ gradle clean md5

> Task :md5


Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 1s
3 actionable tasks: 3 executed

Isolation modes

Gradle provides three isolation modes that can be configured when creating a WorkQueue and are
specified using one of the following methods on WorkerExecutor:

WorkerExecutor.noIsolation()
This states that the work should be run in a thread with minimal isolation.
For instance, it will share the same classloader that the task is loaded from. This is the fastest
level of isolation.

WorkerExecutor.classLoaderIsolation()
This states that the work should be run in a thread with an isolated classloader.
The classloader will have the classpath from the classloader that the unit of work
implementation class was loaded from as well as any additional classpath entries added through
ClassLoaderWorkerSpec.getClasspath().
WorkerExecutor.processIsolation()
This states that the work should be run with a maximum isolation level by executing the work in
a separate process.
The classloader of the process will use the classpath from the classloader that the unit of work
was loaded from as well as any additional classpath entries added through
ClassLoaderWorkerSpec.getClasspath(). Furthermore, the process will be a worker daemon that
will stay alive and can be reused for future work items with the same requirements. This
process can be configured with different settings than the Gradle JVM using
ProcessWorkerSpec.forkOptions(org.gradle.api.Action).

Worker Daemons

When using processIsolation(), Gradle will start a long-lived worker daemon process that can be
reused for future work items.

build.gradle.kts

// Create a WorkQueue with process isolation


val workQueue = workerExecutor.processIsolation() {
// Configure the options for the forked process
forkOptions {
maxHeapSize = "512m"
systemProperty("org.gradle.sample.showFileSize", "true")
}
}

// Create and submit a unit of work for each file


source.forEach { file ->
workQueue.submit(ReverseFile::class) {
fileToReverse = file
destinationDir = outputDir
}
}

build.gradle

// Create a WorkQueue with process isolation


WorkQueue workQueue = workerExecutor.processIsolation() { ProcessWorkerSpec spec ->
// Configure the options for the forked process
forkOptions { JavaForkOptions options ->
options.maxHeapSize = "512m"
options.systemProperty "org.gradle.sample.showFileSize", "true"
}
}
// Create and submit a unit of work for each file
source.each { file ->
workQueue.submit(ReverseFile.class) { ReverseParameters parameters ->
parameters.fileToReverse = file
parameters.destinationDir = outputDir
}
}

When a unit of work for a worker daemon is submitted, Gradle will first look to see if a compatible,
idle daemon already exists. If so, it will send the unit of work to the idle daemon, marking it as
busy. If not, it will start a new daemon. When evaluating compatibility, Gradle looks at a number of
criteria, all of which can be controlled through
ProcessWorkerSpec.forkOptions(org.gradle.api.Action).

By default, a worker daemon starts with a maximum heap of 512MB. This can be changed by
adjusting the workers' fork options.

executable
A daemon is considered compatible only if it uses the same Java executable.

classpath
A daemon is considered compatible if its classpath contains all the classpath entries requested.
Note that a daemon is considered compatible only if the classpath exactly matches the requested
classpath.

heap settings
A daemon is considered compatible if it has at least the same heap size settings as requested.
In other words, a daemon that has higher heap settings than requested would be considered
compatible.

jvm arguments
A daemon is compatible if it has set all the JVM arguments requested.
Note that a daemon is compatible if it has additional JVM arguments beyond those requested
(except for those treated especially, such as heap settings, assertions, debug, etc.).

system properties
A daemon is considered compatible if it has set all the system properties requested with the
same values.
Note that a daemon is compatible if it has additional system properties beyond those requested.

environment variables
A daemon is considered compatible if it has set all the environment variables requested with the
same values.
Note that a daemon is compatible if it has more environment variables than requested.

bootstrap classpath
A daemon is considered compatible if it contains all the bootstrap classpath entries requested.
Note that a daemon is compatible if it has more bootstrap classpath entries than requested.

debug
A daemon is considered compatible only if debug is set to the same value as requested (true or
false).

enable assertions
A daemon is considered compatible only if enable assertions are set to the same value as
requested (true or false).

default character encoding


A daemon is considered compatible only if the default character encoding is set to the same
value as requested.

Worker daemons will remain running until the build daemon that started them is stopped or
system memory becomes scarce. When system memory is low, Gradle will stop worker daemons to
minimize memory consumption.

NOTE A step-by-step description of converting a normal task action to use the worker API can be found in the section on developing parallel tasks.

Cancellation and timeouts

To support cancellation (e.g., when the user stops the build with CTRL+C) and task timeouts, custom
tasks should react to interrupting their executing thread. The same is true for work items submitted
via the worker API. If a task does not respond to an interrupt within 10s, the daemon will shut
down to free up system resources.
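
As a minimal, illustrative sketch (not one of the manual's samples), a work action can cooperate with cancellation by periodically checking the interrupt flag of its executing thread and returning early. The class name and loop below are invented for this example:

import org.gradle.workers.WorkAction
import org.gradle.workers.WorkParameters

// Illustrative work action that reacts to cancellation by checking the
// interrupt flag of its executing thread between chunks of work.
abstract class InterruptibleWork : WorkAction<WorkParameters.None> {
    override fun execute() {
        for (chunk in 0 until 1000) {
            if (Thread.currentThread().isInterrupted) {
                return // stop promptly so the worker can shut down cleanly
            }
            // ... process one small chunk of the actual work here ...
        }
    }
}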

Advanced Tasks
Incremental tasks

In Gradle, implementing a task that skips execution when its inputs and outputs are already UP-TO-
DATE is simple and efficient, thanks to the Incremental Build feature.

However, there are times when only a few input files have changed since the last execution, and it
is best to avoid reprocessing all the unchanged inputs. This situation is common in tasks that
transform input files into output files on a one-to-one basis.

To optimize your build process you can use an incremental task. This approach ensures that only
out-of-date input files are processed, improving build performance.

Implementing an incremental task

For a task to process inputs incrementally, that task must contain an incremental task action.

This is a task action method that has a single InputChanges parameter. That parameter tells Gradle
that the action only wants to process the changed inputs.
In addition, the task needs to declare at least one incremental file input property by using either
@Incremental or @SkipWhenEmpty:

build.gradle.kts

public class IncrementalReverseTask : DefaultTask() {

@get:Incremental
@get:InputDirectory
val inputDir: DirectoryProperty = project.objects.directoryProperty()

@get:OutputDirectory
val outputDir: DirectoryProperty = project.objects.directoryProperty()

@get:Input
val inputProperty: RegularFileProperty = project.objects.fileProperty()
// File input property

@TaskAction
fun execute(inputs: InputChanges) { // InputChanges parameter
val msg = if (inputs.isIncremental) "CHANGED inputs are out of date"
else "ALL inputs are out of date"
println(msg)
}
}

build.gradle

class IncrementalReverseTask extends DefaultTask {

@Incremental
@InputDirectory
def File inputDir

@OutputDirectory
def File outputDir

@Input
def inputProperty // File input property

@TaskAction
void execute(InputChanges inputs) { // InputChanges parameter
println inputs.incremental ? "CHANGED inputs are out of date"
: "ALL inputs are out of date"
}
}

IMPORTANT To query incremental changes for an input file property, that property must always return the same instance. The easiest way to accomplish this is to use one of the following property types: RegularFileProperty, DirectoryProperty or ConfigurableFileCollection.

You can learn more about RegularFileProperty and DirectoryProperty in Lazy Configuration.

The incremental task action can use InputChanges.getFileChanges() to find out what files have
changed for a given file-based input property, be it of type RegularFileProperty, DirectoryProperty
or ConfigurableFileCollection.

The method returns an Iterable of FileChange objects, which in turn can be queried for the following:

• the affected file

• the change type (ADDED, REMOVED or MODIFIED)

• the normalized path of the changed file

• the file type of the changed file

The following example demonstrates an incremental task that has a directory input. It assumes that
the directory contains a collection of text files and copies them to an output directory, reversing the
text within each file:

build.gradle.kts

abstract class IncrementalReverseTask : DefaultTask() {


@get:Incremental
@get:PathSensitive(PathSensitivity.NAME_ONLY)
@get:InputDirectory
abstract val inputDir: DirectoryProperty

@get:OutputDirectory
abstract val outputDir: DirectoryProperty

@get:Input
abstract val inputProperty: Property<String>

@TaskAction
fun execute(inputChanges: InputChanges) {
println(
if (inputChanges.isIncremental) "Executing incrementally"
else "Executing non-incrementally"
)

inputChanges.getFileChanges(inputDir).forEach { change ->


if (change.fileType == FileType.DIRECTORY) return@forEach
println("${change.changeType}: ${change.normalizedPath}")
val targetFile =
outputDir.file(change.normalizedPath).get().asFile
if (change.changeType == ChangeType.REMOVED) {
targetFile.delete()
} else {
targetFile.writeText(change.file.readText().reversed())
}
}
}
}

build.gradle

abstract class IncrementalReverseTask extends DefaultTask {


@Incremental
@PathSensitive(PathSensitivity.NAME_ONLY)
@InputDirectory
abstract DirectoryProperty getInputDir()

@OutputDirectory
abstract DirectoryProperty getOutputDir()

@Input
abstract Property<String> getInputProperty()

@TaskAction
void execute(InputChanges inputChanges) {
println(inputChanges.incremental
? 'Executing incrementally'
: 'Executing non-incrementally'
)

inputChanges.getFileChanges(inputDir).each { change ->


if (change.fileType == FileType.DIRECTORY) return

println "${change.changeType}: ${change.normalizedPath}"


def targetFile = outputDir.file(change.normalizedPath).get()
.asFile
if (change.changeType == ChangeType.REMOVED) {
targetFile.delete()
} else {
targetFile.text = change.file.text.reverse()
}
}
}
}

NOTE The type of the inputDir property, its annotations, and the execute() action use getFileChanges() to process the subset of files that have changed since the last build. The action deletes a target file if the corresponding input file has been removed.

If, for some reason, the task is executed non-incrementally (by running with --rerun-tasks, for
example), all files are reported as ADDED, irrespective of the previous state. In this case, Gradle
automatically removes the previous outputs, so the incremental task must only process the given
files.

For a simple transformer task like the above example, the task action must generate output files for
any out-of-date inputs and delete output files for any removed inputs.

IMPORTANT A task may only contain a single incremental task action.

Which inputs are considered out of date?

When a task has been previously executed, and the only changes since that execution are to
incremental input file properties, Gradle can intelligently determine which input files need to be
processed, a concept known as incremental execution.

In this scenario, the InputChanges.getFileChanges() method, available in the org.gradle.work.InputChanges class, provides details for all input files associated with the given property that have been ADDED, REMOVED or MODIFIED.

However, there are many cases where Gradle cannot determine which input files need to be
processed (i.e., non-incremental execution). Examples include:

• There is no history available from a previous execution.

• You are building with a different version of Gradle. Currently, Gradle does not use task history
from a different version.

• An upToDateWhen criterion added to the task returns false.

• An input property has changed since the previous execution.

• A non-incremental input file property has changed since the previous execution.

• One or more output files have changed since the previous execution.

In these cases, Gradle will report all input files as ADDED, and the getFileChanges() method will
return details for all the files that comprise the given input property.

You can check if the task execution is incremental or not with the InputChanges.isIncremental()
method.

An incremental task in action

Consider an instance of IncrementalReverseTask executed against a set of inputs for the first time.

In this case, all inputs will be considered ADDED, as shown here:


build.gradle.kts

tasks.register<IncrementalReverseTask>("incrementalReverse") {
inputDir = file("inputs")
outputDir = layout.buildDirectory.dir("outputs")
inputProperty = project.findProperty("taskInputProperty") as String? ?:
"original"
}

build.gradle

tasks.register('incrementalReverse', IncrementalReverseTask) {
inputDir = file('inputs')
outputDir = layout.buildDirectory.dir("outputs")
inputProperty = project.properties['taskInputProperty'] ?: 'original'
}

The build layout:

.
├── build.gradle
└── inputs
├── 1.txt
├── 2.txt
└── 3.txt

$ gradle -q incrementalReverse
Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

Naturally, when the task is executed again with no changes, then the entire task is UP-TO-DATE, and
the task action is not executed:

$ gradle incrementalReverse
> Task :incrementalReverse UP-TO-DATE

BUILD SUCCESSFUL in 0s
1 actionable task: 1 up-to-date
When an input file is modified in some way or a new input file is added, then re-executing the task
results in those files being returned by InputChanges.getFileChanges().

The following example modifies the content of one file and adds another before running the
incremental task:

build.gradle.kts

tasks.register("updateInputs") {
val inputsDir = layout.projectDirectory.dir("inputs")
outputs.dir(inputsDir)
doLast {
inputsDir.file("1.txt").asFile.writeText("Changed content for
existing file 1.")
inputsDir.file("4.txt").asFile.writeText("Content for new file 4.")
}
}

build.gradle

tasks.register('updateInputs') {
    def inputsDir = layout.projectDirectory.dir('inputs')
    outputs.dir(inputsDir)
    doLast {
        inputsDir.file('1.txt').asFile.text = 'Changed content for existing file 1.'
        inputsDir.file('4.txt').asFile.text = 'Content for new file 4.'
    }
}

$ gradle -q updateInputs incrementalReverse


Executing incrementally
MODIFIED: 1.txt
ADDED: 4.txt

NOTE The various mutation tasks (updateInputs, removeInput, etc.) are only present to demonstrate the behavior of incremental tasks. They should not be viewed as the kinds of tasks or task implementations you should have in your own build scripts.

When an existing input file is removed, then re-executing the task results in that file being returned
by InputChanges.getFileChanges() as REMOVED.

The following example removes one of the existing files before executing the incremental task:
build.gradle.kts

tasks.register<Delete>("removeInput") {
delete("inputs/3.txt")
}

build.gradle

tasks.register('removeInput', Delete) {
delete 'inputs/3.txt'
}

$ gradle -q removeInput incrementalReverse


Executing incrementally
REMOVED: 3.txt

Gradle cannot determine which input files are out-of-date when an output file is deleted (or
modified). In this case, details for all the input files for the given property are returned by
InputChanges.getFileChanges().

The following example removes one of the output files from the build directory. However, all the
input files are considered to be ADDED:

build.gradle.kts

tasks.register<Delete>("removeOutput") {
delete(layout.buildDirectory.file("outputs/1.txt"))
}

build.gradle

tasks.register('removeOutput', Delete) {
delete layout.buildDirectory.file("outputs/1.txt")
}

$ gradle -q removeOutput incrementalReverse


Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

The last scenario we want to cover concerns what happens when a non-file-based input property is
modified. In such cases, Gradle cannot determine how the property impacts the task outputs, so the
task is executed non-incrementally. This means that all input files for the given property are
returned by InputChanges.getFileChanges() and they are all treated as ADDED.

The following example sets the project property taskInputProperty to a new value when running
the incrementalReverse task. That project property is used to initialize the task’s inputProperty
property, as you can see in the first example of this section.

Here is the expected output in this case:

$ gradle -q -PtaskInputProperty=changed incrementalReverse


Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

Command Line options

Sometimes, a user wants to declare the value of an exposed task property on the command line
instead of the build script. Passing property values on the command line is particularly helpful if
they change more frequently.

The task API supports a mechanism for marking a property to automatically generate a
corresponding command line parameter with a specific name at runtime.

Step 1. Declare a command-line option

To expose a new command line option for a task property, annotate the corresponding setter
method of a property with Option:

@Option(option = "flag", description = "Sets the flag")

An option requires a mandatory identifier. You can provide an optional description.

A task can expose as many command line options as properties available in the class.

Options may be declared in superinterfaces of the task class as well. If multiple interfaces declare
the same property but with different option flags, they will both work to set the property.

In the example below, the custom task UrlVerify verifies whether a URL can be resolved by making
an HTTP call and checking the response code. The URL to be verified is configurable through the
property url. The setter method for the property is annotated with @Option:

UrlVerify.java

import org.gradle.api.DefaultTask;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.TaskAction;
import org.gradle.api.tasks.options.Option;

public class UrlVerify extends DefaultTask {
    private String url;

    @Option(option = "url", description = "Configures the URL to be verified.")
    public void setUrl(String url) {
        this.url = url;
    }

    @Input
    public String getUrl() {
        return url;
    }

    @TaskAction
    public void verify() {
        getLogger().quiet("Verifying URL '{}'", url);

        // verify URL by making a HTTP call
    }
}

All options declared for a task can be rendered as console output by running the help task with the --task option.

Step 2. Use an option on the command line

There are a few rules for options on the command line:

• The option uses a double-dash as a prefix, e.g., --url. A single dash does not qualify as valid
syntax for a task option.

• The option argument follows directly after the task declaration, e.g., verifyUrl
--url=http://www.google.com/.

• Multiple task options can be declared in any order on the command line following the task
name.

Building upon the earlier example, the build script creates a task instance of type UrlVerify and
provides a value from the command line through the exposed option:

build.gradle.kts

tasks.register<UrlVerify>("verifyUrl")
build.gradle

tasks.register('verifyUrl', UrlVerify)

$ gradle -q verifyUrl --url=http://www.google.com/


Verifying URL 'http://www.google.com/'

Supported data types for options

Gradle limits the data types that can be used for declaring command line options.

The use of the command line differs per type:

boolean, Boolean, Property<Boolean>


Describes an option with the value true or false.
Passing the option on the command line treats the value as true. For example, --foo equates to
true.
The absence of the option uses the default value of the property. For each boolean option, an
opposite option is created automatically. For example, --no-foo is created for the provided
option --foo and --bar is created for --no-bar. Options whose name starts with --no are disabled
options and set the option value to false. An opposite option is only created if no option with the
same name already exists for the task.

Double, Property<Double>
Describes an option with a double value.
Passing the option on the command line also requires a value, e.g., --factor=2.2 or --factor 2.2.

Integer, Property<Integer>
Describes an option with an integer value.
Passing the option on the command line also requires a value, e.g., --network-timeout=5000 or
--network-timeout 5000.

Long, Property<Long>
Describes an option with a long value.
Passing the option on the command line also requires a value, e.g., --threshold=2147483648 or
--threshold 2147483648.

String, Property<String>
Describes an option with an arbitrary String value.
Passing the option on the command line also requires a value, e.g., --container-id=2x94held or
--container-id 2x94held.

enum, Property<enum>
Describes an option as an enumerated type.
Passing the option on the command line also requires a value, e.g., --log-level=DEBUG or --log-level debug.
The value is not case-sensitive.

List<T> where T is Double, Integer, Long, String, enum
Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123 --image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space character, are currently not supported.

ListProperty<T>, SetProperty<T> where T is Double, Integer, Long, String, enum
Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123 --image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space character, are currently not supported.

DirectoryProperty, RegularFileProperty
Describes an option with a file system element.
Passing the option on the command line also requires a value representing a path, e.g., --output-file=file.txt or --output-dir outputDir.
Relative paths are resolved relative to the project directory of the project that owns this property instance. See FileSystemLocationProperty.set().
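
To illustrate several of these types together, here is a hedged sketch of a task that declares a boolean option (for which Gradle also generates the opposite --no-dry-run option) and a multi-value ListProperty option. The task name, option names, and property names are invented for this sketch:

import org.gradle.api.DefaultTask
import org.gradle.api.provider.ListProperty
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.options.Option

// Hypothetical task: --dry-run toggles a boolean, --tag may be repeated to build a list
abstract class PublishSite : DefaultTask() {

    @get:Optional
    @get:Input
    @get:Option(option = "dry-run", description = "Runs the task without publishing anything.")
    abstract val dryRun: Property<Boolean>

    @get:Optional
    @get:Input
    @get:Option(option = "tag", description = "Tags to attach to the published site.")
    abstract val tags: ListProperty<String>

    @TaskAction
    fun publish() {
        logger.quiet("dry-run={}, tags={}", dryRun.getOrElse(false), tags.getOrElse(emptyList()))
    }
}

On the command line, such a task might then be invoked as gradle publishSite --dry-run --tag=docs --tag=release.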

Documenting available values for an option

Theoretically, an option for a property type String or List<String> can accept any arbitrary value.
Accepted values for such an option can be documented programmatically with the help of the
annotation OptionValues:

@OptionValues('file')

This annotation may be assigned to any method that returns a List of one of the supported data
types. You need to specify an option identifier to indicate the relationship between the option and
available values.

NOTE Passing a value on the command line not supported by the option does not fail the build or throw an exception. You must implement custom logic for such behavior in the task action.

The example below demonstrates the use of multiple options for a single task. The task
implementation provides a list of available values for the option output-type:

UrlProcess.java

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.gradle.api.DefaultTask;
import org.gradle.api.provider.Property;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.TaskAction;
import org.gradle.api.tasks.options.Option;
import org.gradle.api.tasks.options.OptionValues;

public abstract class UrlProcess extends DefaultTask {
    private String url;
    private OutputType outputType;

    @Input
    @Option(option = "http", description = "Configures the http protocol to be allowed.")
    public abstract Property<Boolean> getHttp();

    @Option(option = "url", description = "Configures the URL to send the request to.")
    public void setUrl(String url) {
        if (!getHttp().getOrElse(true) && url.startsWith("http://")) {
            throw new IllegalArgumentException("HTTP is not allowed");
        } else {
            this.url = url;
        }
    }

    @Input
    public String getUrl() {
        return url;
    }

    @Option(option = "output-type", description = "Configures the output type.")
    public void setOutputType(OutputType outputType) {
        this.outputType = outputType;
    }

    @OptionValues("output-type")
    public List<OutputType> getAvailableOutputTypes() {
        return new ArrayList<OutputType>(Arrays.asList(OutputType.values()));
    }

    @Input
    public OutputType getOutputType() {
        return outputType;
    }

    @TaskAction
    public void process() {
        getLogger().quiet("Writing out the URL response from '{}' to '{}'", url, outputType);

        // retrieve content from URL and write to output
    }

    private static enum OutputType {
        CONSOLE, FILE
    }
}

Listing command line options

Command line options using the annotations Option and OptionValues are self-documenting.

You will see declared options and their available values reflected in the console output of the help
task. The output renders options alphabetically, except for boolean disable options, which appear
following the enable option:

$ gradle -q help --task processUrl


Detailed task information for processUrl

Path
:processUrl

Type
UrlProcess (UrlProcess)

Options
--http Configures the http protocol to be allowed.

--no-http Disables option --http.

--output-type Configures the output type.


Available values are:
CONSOLE
FILE

--url Configures the URL to send the request to.

--rerun Causes the task to be re-run even if up-to-date.

Description
-

Group
-

Limitations

Support for declaring command line options currently comes with a few limitations.

• Command line options can only be declared for custom tasks via annotation. There’s no
programmatic equivalent for defining options.

• Options cannot be declared globally, e.g., on a project level or as part of a plugin.

• When assigning an option on the command line, the task exposing the option needs to be
spelled out explicitly, e.g., gradle check --tests abc does not work even though the check task
depends on the test task.

• If you specify a task option name that conflicts with the name of a built-in Gradle option, use the
-- delimiter before calling your task to reference that option. For more information, see
Disambiguate Task Options from Built-in Options.

Verification failures

Normally, exceptions thrown during task execution result in a failure that immediately terminates
a build. The outcome of the task will be FAILED, the result of the build will be FAILED, and no further
tasks will be executed. When running with the --continue flag, Gradle will continue to run other
requested tasks in the build after encountering a task failure. However, any tasks that depend on a
failed task will not be executed.

There is a special type of exception that behaves differently when downstream tasks only rely on
the outputs of a failing task. A task can throw a subtype of VerificationException to indicate that it
has failed in a controlled manner such that its output is still valid for consumers. A task depends on
the outcome of another task when it directly depends on it using dependsOn. When Gradle is run
with --continue, consumer tasks that depend on a producer task’s output (via a relationship
between task inputs and outputs) can still run after the producer fails.

A failed unit test, for instance, will cause a failing outcome for the test task. However, this doesn’t
prevent another task from reading and processing the (valid) test results the task produced.
Verification failures are used in exactly this manner by the Test Report Aggregation Plugin.

Verification failures are also useful for tasks that need to report a failure even after producing
useful output consumable by other tasks.

build.gradle.kts

val process = tasks.register("process") {


val outputFile = layout.buildDirectory.file("processed.log")
outputs.files(outputFile) ①

doLast {
val logFile = outputFile.get().asFile
logFile.appendText("Step 1 Complete.") ②
throw VerificationException("Process failed!") ③
logFile.appendText("Step 2 Complete.") ④
}
}

tasks.register("postProcess") {
inputs.files(process) ⑤

doLast {
println("Results: ${inputs.files.singleFile.readText()}") ⑥
}
}

build.gradle

tasks.register("process") {
def outputFile = layout.buildDirectory.file("processed.log")
outputs.files(outputFile) ①

doLast {
def logFile = outputFile.get().asFile
logFile << "Step 1 Complete." ②
throw new VerificationException("Process failed!") ③
logFile << "Step 2 Complete." ④
}
}

tasks.register("postProcess") {
inputs.files(tasks.named("process")) ⑤

doLast {
println("Results: ${inputs.files.singleFile.text}") ⑥
}
}

$ gradle postProcess --continue


> Task :process FAILED

> Task :postProcess


Results: Step 1 Complete.
2 actionable tasks: 2 executed

FAILURE: Build failed with an exception.

① Register Output: The process task writes its output to a log file.

② Modify Output: The task writes to its output file as it executes.

③ Task Failure: The task throws a VerificationException and fails at this point.

④ Continue to Modify Output: This line never runs due to the exception stopping the task.

⑤ Consume Output: The postProcess task depends on the output of the process task due to using
that task’s outputs as its own inputs.

⑥ Use Partial Result: With the --continue flag set, Gradle still runs the requested postProcess task
despite the process task’s failure. postProcess can read and display the partial (though still valid)
result.
DEVELOPING PLUGINS
Understanding Plugins
Gradle comes with a set of powerful core systems such as dependency management, task execution,
and project configuration. But everything else it can do is supplied by plugins.

Plugins encapsulate logic for specific tasks or integrations, such as compiling code, running tests, or
deploying artifacts. By applying plugins, users can easily add new features to their build process
without having to write complex code from scratch.

This plugin-based approach allows Gradle to be lightweight and modular. It also promotes code
reuse and maintainability, as plugins can be shared across projects or within an organization.

Before reading this chapter, it’s recommended that you first read Learning The Basics and complete
the Tutorial.

Plugins Introduction

Plugins can be sourced from Gradle or the Gradle community. But when users want to organize
their build logic or need specific build capabilities not provided by existing plugins, they can
develop their own.

As such, we distinguish between three different kinds of plugins:

1. Core Plugins - plugins that come from Gradle.

2. Community Plugins - plugins that come from Gradle Plugin Portal or a public repository.

3. Local or Custom Plugins - plugins that you develop yourself.

Core Plugins

The term core plugin refers to a plugin that is part of the Gradle distribution such as the Java
Library Plugin. They are always available.

Community Plugins

The term community plugin refers to a plugin published to the Gradle Plugin Portal (or another
public repository) such as the Spotless Plugin.

Local or Custom Plugins

The term local or custom plugin refers to a plugin you write yourself for your own build.

Custom plugins

There are three types of custom plugins:


1. Script plugins
   Location: a .gradle(.kts) script file
   Most likely: a local plugin
   Benefit: the plugin is automatically compiled and included in the classpath of the build script.

2. Precompiled script plugins
   Location: the buildSrc folder or a composite build
   Most likely: a convention plugin
   Benefit: the plugin is automatically compiled, tested, and available on the classpath of the build script. The plugin is visible to every build script used by the build.

3. Binary plugins
   Location: a standalone project
   Most likely: a shared plugin
   Benefit: the plugin JAR is produced and published. The plugin can be used in multiple builds and shared with others.

Script plugins

Script plugins are typically small, local plugins written in script files for tasks specific to a single
build or project. They do not need to be reused across multiple projects. Script plugins are not
recommended but many other forms of plugins evolve from script plugins.

To create a plugin, you need to write a class that implements the Plugin interface.

The following sample creates a GreetingPlugin, which adds a hello task to a project when applied:

build.gradle.kts

class GreetingPlugin : Plugin<Project> {


override fun apply(project: Project) {
project.task("hello") {
doLast {
println("Hello from the GreetingPlugin")
}
}
}
}
// Apply the plugin
apply<GreetingPlugin>()

build.gradle

class GreetingPlugin implements Plugin<Project> {


void apply(Project project) {
project.task('hello') {
doLast {
println 'Hello from the GreetingPlugin'
}
}
}
}

// Apply the plugin


apply plugin: GreetingPlugin

$ gradle -q hello
Hello from the GreetingPlugin

The Project object is passed as a parameter in apply(), which the plugin can use to configure the
project however it needs to (such as adding tasks, configuring dependencies, etc.). In this example,
the plugin is written directly in the build file which is not a recommended practice.

When the plugin is written in a separate script file, it can be applied using apply(from =
"file_name.gradle.kts") or apply from: 'file_name.gradle'. In the example below, the plugin is
coded in the other.gradle(.kts) script file. Then, the other.gradle(.kts) is applied to
build.gradle(.kts) using apply from:

other.gradle.kts

class GreetingScriptPlugin : Plugin<Project> {


override fun apply(project: Project) {
project.task("hi") {
doLast {
println("Hi from the GreetingScriptPlugin")
}
}
}
}
// Apply the plugin
apply<GreetingScriptPlugin>()

other.gradle

class GreetingScriptPlugin implements Plugin<Project> {


void apply(Project project) {
project.task('hi') {
doLast {
println 'Hi from the GreetingScriptPlugin'
}
}
}
}

// Apply the plugin


apply plugin: GreetingScriptPlugin

build.gradle.kts

apply(from = "other.gradle.kts")

build.gradle

apply from: 'other.gradle'

$ gradle -q hi
Hi from the GreetingScriptPlugin

Script plugins should be avoided.

Precompiled script plugins

Precompiled script plugins are compiled into class files and packaged into a JAR before they are
executed. These plugins use the Groovy DSL or Kotlin DSL instead of pure Java, Kotlin, or Groovy.
They are best used as convention plugins that share build logic across projects or as a way to
neatly organize build logic.

To create a precompiled script plugin, you can:


1. Use Gradle’s Kotlin DSL - The plugin is a .gradle.kts file, and apply id("kotlin-dsl").

2. Use Gradle’s Groovy DSL - The plugin is a .gradle file, and apply id("groovy-gradle-plugin").

To apply a precompiled script plugin, you need to know its ID. The ID is derived from the plugin
script’s filename and its (optional) package declaration.

For example, the script src/main/*/some-java-library.gradle(.kts) has a plugin ID of some-java-library (assuming it has no package declaration). Likewise, src/main/*/my/some-java-library.gradle(.kts) has a plugin ID of my.some-java-library as long as it has a package declaration of my.

Precompiled script plugin names have two important limitations:

• They cannot start with org.gradle.

• They cannot have the same name as a core plugin.

When the plugin is applied to a project, Gradle creates an instance of the plugin class and calls the
instance’s Plugin.apply() method.

NOTE A new instance of a Plugin is created within each project applying that plugin.

Let’s rewrite the GreetingPlugin script plugin as a precompiled script plugin. Since we are using the
Groovy or Kotlin DSL, the file essentially becomes the plugin. The original script plugin simply
created a hello task which printed a greeting, this is what we will do in the pre-compiled script
plugin:

buildSrc/src/main/kotlin/GreetingPlugin.gradle.kts

tasks.register("hello") {
doLast {
println("Hello from the convention GreetingPlugin")
}
}

buildSrc/src/main/groovy/GreetingPlugin.gradle

tasks.register("hello") {
doLast {
println("Hello from the convention GreetingPlugin")
}
}

The GreetingPlugin can now be applied in other subprojects' builds by using its ID:
app/build.gradle.kts

plugins {
application
id("GreetingPlugin")
}

app/build.gradle

plugins {
id 'application'
id('GreetingPlugin')
}

$ gradle -q hello
Hello from the convention GreetingPlugin

Convention plugins

A convention plugin is typically a precompiled script plugin that configures existing core and
community plugins with your own conventions (i.e. default values) such as setting the Java version
by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are
also used to enforce project standards and help streamline the build process. They can apply and
configure plugins, create new tasks and extensions, set dependencies, and much more.

Let’s take an example build with three subprojects: one for data-model, one for database-logic and
one for app code. The project has the following structure:

.
├── buildSrc
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── data-model
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── database-logic
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── app
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts

The build file of the database-logic subproject is as follows:

database-logic/build.gradle.kts

plugins {
id("java-library")
id("org.jetbrains.kotlin.jvm") version "1.9.23"
}

repositories {
mavenCentral()
}

java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
useJUnitPlatform()
}

kotlin {
jvmToolchain(11)
}

// More build logic

database-logic/build.gradle

plugins {
id 'java-library'
id 'org.jetbrains.kotlin.jvm' version '1.9.23'
}

repositories {
mavenCentral()
}

java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
useJUnitPlatform()
}

kotlin {
jvmToolchain {
languageVersion.set(JavaLanguageVersion.of(11))
}
}

// More build logic

We apply the java-library plugin and add the org.jetbrains.kotlin.jvm plugin for Kotlin support.
We also configure Kotlin, Java, tests and more.

Our build file is beginning to grow…

The more plugins we apply and the more plugins we configure, the larger it gets. There’s also
repetition in the build files of the app and data-model subprojects, especially when configuring
common extensions like setting the Java version and Kotlin support.

To address this, we use convention plugins. This allows us to avoid repeating configuration in each
build file and keeps our build scripts more concise and maintainable. In convention plugins, we can
encapsulate arbitrary build configuration or custom build logic.

To develop a convention plugin, we recommend using buildSrc, which represents a completely separate Gradle build. buildSrc has its own settings file to define where dependencies of this build are located.

We add a Kotlin script called my-java-library.gradle.kts inside the buildSrc/src/main/kotlin directory, or alternatively a Groovy script called my-java-library.gradle inside the buildSrc/src/main/groovy directory. We put all the plugin application and configuration from the database-logic build file into it:

buildSrc/src/main/kotlin/my-java-library.gradle.kts

plugins {
id("java-library")
id("org.jetbrains.kotlin.jvm")
}

repositories {
mavenCentral()
}

java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
useJUnitPlatform()
}

kotlin {
jvmToolchain(11)
}

buildSrc/src/main/groovy/my-java-library.gradle

plugins {
id 'java-library'
id 'org.jetbrains.kotlin.jvm'
}

repositories {
mavenCentral()
}

java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
useJUnitPlatform()
}

kotlin {
jvmToolchain {
languageVersion.set(JavaLanguageVersion.of(11))
}
}

The name of the file my-java-library is the ID of our brand-new plugin, which we can now use in all
of our subprojects.

TIP Why is the version of id 'org.jetbrains.kotlin.jvm' missing? See Applying External Plugins to Pre-Compiled Script Plugins.

The database-logic build file becomes much simpler by removing all the redundant build logic and
applying our convention my-java-library plugin instead:
database-logic/build.gradle.kts

plugins {
id("my-java-library")
}

database-logic/build.gradle

plugins {
id('my-java-library')
}

This convention plugin enables us to easily share common configurations across all our build files.
Any modifications can be made in one place, simplifying maintenance.

Binary plugins

Binary plugins in Gradle are plugins that are built as standalone JAR files and applied to a project
using the plugins{} block in the build script.

Let’s move our GreetingPlugin to a standalone project so that we can publish it and share it with
others. The plugin is essentially moved from the buildSrc folder to its own build called greeting-
plugin.

NOTE You can publish the plugin from buildSrc, but this is not recommended practice. Plugins that are ready for publication should be in their own build.

greeting-plugin is simply a Java project that produces a JAR containing the plugin classes.

The easiest way to package and publish a plugin to a repository is to use the Gradle Plugin
Development Plugin. This plugin provides the necessary tasks and configurations (including the
plugin metadata) to compile your script into a plugin that can be applied in other builds.

Here is a simple build script for the greeting-plugin project using the Gradle Plugin Development
Plugin:

build.gradle.kts

plugins {
`java-gradle-plugin`
}

gradlePlugin {
plugins {
create("simplePlugin") {
id = "org.example.greeting"
implementationClass = "org.example.GreetingPlugin"
}
}
}

build.gradle

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}

For more on publishing plugins, see Publishing Plugins.
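
The build scripts above reference an implementation class, org.example.GreetingPlugin, which is not shown here. As a rough sketch, assuming the standalone project keeps the behaviour of the earlier GreetingPlugin examples (written here in Kotlin, although the class could equally be plain Java or Groovy), it might look like this:

package org.example

import org.gradle.api.Plugin
import org.gradle.api.Project

// Sketch of the plugin class referenced by implementationClass above
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("hello") { task ->
            task.doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}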

Project vs Settings vs Init plugins

In the examples used throughout this section, the plugin accepts the Project type as a type parameter.
Alternatively, the plugin can accept a parameter of type Settings to be applied in a settings script, or
a parameter of type Gradle to be applied in an initialization script.

The difference between these types of plugins lies in the scope of their application:

Project Plugin
A project plugin is a plugin that is applied to a specific project in a build. It can customize the
build logic, add tasks, and configure the project-specific settings.

Settings Plugin
A settings plugin is a plugin that is applied in the settings.gradle or settings.gradle.kts file. It
can configure settings that apply to the entire build, such as defining which projects are
included in the build, configuring build script repositories, and applying common configurations
to all projects.

Init Plugin
An init plugin is a plugin that is applied in the init.gradle or init.gradle.kts file. It can
configure settings that apply globally to all Gradle builds on a machine, such as configuring the
Gradle version, setting up default repositories, or applying common plugins to all builds.
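
To make the distinction concrete, here is a minimal sketch of a settings plugin; the class name and the repository configuration are illustrative only. It implements Plugin<Settings> rather than Plugin<Project> and is applied from a settings script:

import org.gradle.api.Plugin
import org.gradle.api.initialization.Settings

// Illustrative settings plugin: it sees the whole build rather than a single project
class CommonReposSettingsPlugin : Plugin<Settings> {
    override fun apply(settings: Settings) {
        // For example, configure a repository used for dependency resolution in every project
        settings.dependencyResolutionManagement.repositories.mavenCentral()
    }
}

An init plugin follows the same pattern with Gradle as the type parameter, i.e. Plugin<Gradle>.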

Understanding Implementation Options for Plugins


The choice between script, precompiled script, or binary plugins depends on your specific
requirements and preferences.

Script Plugins are simple and easy to write. They are written in Kotlin DSL or Groovy DSL. They
are suitable for small, one-off tasks or for quick experimentation. However, they can become hard
to maintain as the build script grows in size and complexity.

Precompiled Script Plugins are Kotlin or Groovy DSL scripts compiled into Java class files
packaged in a library. They offer better performance and maintainability compared to script
plugins, and they can be reused across different projects. You can also write them in Groovy DSL
but that is not recommended.

Binary Plugins are full-fledged plugins written in Java, Groovy, or Kotlin, compiled into JAR files,
and published to a repository. They offer the best performance, maintainability, and reusability.
They are suitable for complex build logic that needs to be shared across projects, builds, and teams.
You can also write them in Scala or Groovy but that is not recommended.

Here is a breakdown of all options for implementing Gradle plugins:

1. Kotlin DSL, script plugin: in a .gradle.kts file as an abstract class that implements the apply(Project project) method of the Plugin<Project> interface. Recommended: No. [1]

2. Groovy DSL, script plugin: in a .gradle file as an abstract class that implements the apply(Project project) method of the Plugin<Project> interface. Recommended: No. [1]

3. Kotlin DSL, pre-compiled script plugin: a .gradle.kts file. Recommended: Yes.

4. Groovy DSL, pre-compiled script plugin: a .gradle file. Recommended: Ok. [2]

5. Java, binary plugin: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Java. Recommended: Yes.

6. Kotlin / Kotlin DSL, binary plugin: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Kotlin and/or Kotlin DSL. Recommended: Yes.

7. Groovy / Groovy DSL, binary plugin: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Groovy and/or Groovy DSL. Recommended: Ok. [2]

8. Scala, binary plugin: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Scala. Recommended: No. [2]

If you suspect issues with your plugin code, try creating a Build Scan to identify bottlenecks. The
Gradle profiler can help automate Build Scan generation and gather more low-level information.

Implementing Pre-compiled Script Plugins


A precompiled script plugin is typically a Kotlin script that has been compiled and distributed as
Java class files packaged in a library. These scripts are intended to be consumed as binary Gradle
plugins and are recommended for use as convention plugins.

A convention plugin is a plugin that normally configures existing core and community plugins with
your own conventions (i.e. default values) such as setting the Java version by using
java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are also used to
enforce project standards and help streamline the build process. They can apply and configure
plugins, create new tasks and extensions, set dependencies, and much more.

Setting the plugin ID

The plugin ID for a precompiled script is derived from its file name and optional package
declaration.

For example, a script named code-quality.gradle(.kts) located in src/main/groovy (or src/main/kotlin) without a package declaration would be exposed as the code-quality plugin:

buildSrc/build.gradle.kts

plugins {
id("kotlin-dsl")
}

app/build.gradle.kts

plugins {
id("code-quality")
}

buildSrc/build.gradle

plugins {
id 'groovy-gradle-plugin'
}

app/build.gradle

plugins {
id 'code-quality'
}

On the other hand, a script named code-quality.gradle(.kts) located in src/main/groovy/my (or src/main/kotlin/my) with the package declaration my would be exposed as the my.code-quality plugin:

buildSrc/build.gradle.kts

plugins {
id("kotlin-dsl")
}
app/build.gradle.kts

plugins {
id("my.code-quality")
}

buildSrc/build.gradle

plugins {
id 'groovy-gradle-plugin'
}

app/build.gradle

plugins {
id 'my.code-quality'
}

Making a plugin configurable using extensions

Extension objects are commonly used in plugins to expose configuration options and additional
functionality to build scripts.

When you apply a plugin that defines an extension, you can access the extension object and
configure its properties or call its methods to customize the behavior of the plugin or tasks
provided by the plugin.

A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.

Let’s update our greetings example:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Create extension object


interface GreetingPluginExtension {
val message: Property<String>
}

// Add the 'greeting' extension object to project


val extension =
project.extensions.create<GreetingPluginExtension>("greeting")
buildSrc/src/main/groovy/greetings.gradle

// Create extension object


interface GreetingPluginExtension {
Property<String> getMessage()
}

// Add the 'greeting' extension object to project


def extension = project.extensions.create("greeting",
GreetingPluginExtension)

You can set the value of the message property directly with extension.message.set("Hi from Gradle").

However, the GreetingPluginExtension object becomes available as a project property with the same
name as the extension object. You can now access message like so:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Where the<GreetingPluginExtension>() is equivalent to
// project.extensions.getByType(GreetingPluginExtension::class.java)
the<GreetingPluginExtension>().message.set("Hi from Gradle")

buildSrc/src/main/groovy/greetings.gradle

extensions.findByType(GreetingPluginExtension).message.set("Hi from Gradle")

If you apply the greetings plugin, you can set the convention in your build script:

app/build.gradle.kts

plugins {
application
id("greetings")
}

greeting {
message = "Hello from Gradle"
}
app/build.gradle

plugins {
id 'application'
id('greetings')
}

configure(greeting) {
message = "Hello from Gradle"
}

Adding default configuration as conventions

In plugins, you can define default values, also known as conventions, using the project object.

Convention properties are properties that are initialized with default values but can be overridden:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Create extension object


interface GreetingPluginExtension {
val message: Property<String>
}

// Add the 'greeting' extension object to project


val extension =
project.extensions.create<GreetingPluginExtension>("greeting")

// Set a default value for 'message'


extension.message.convention("Hello from Gradle")

buildSrc/src/main/groovy/greetings.gradle

// Create extension object


interface GreetingPluginExtension {
Property<String> getMessage()
}

// Add the 'greeting' extension object to project


def extension = project.extensions.create("greeting",
GreetingPluginExtension)

// Set a default value for 'message'


extension.message.convention("Hello from Gradle")

extension.message.convention(…) sets a convention for the message property of the extension. In this example, the convention specifies that the value of message should default to the string "Hello from Gradle".

If the message property is not explicitly set, its value will automatically fall back to "Hello from Gradle".

Mapping extension properties to task properties

Using an extension and mapping it to a custom task’s input/output properties is common in plugins.

In this example, the message property of the GreetingPluginExtension is mapped to the message
property of the GreetingTask as an input:

buildSrc/src/main/kotlin/greetings.gradle.kts

// Create extension object


interface GreetingPluginExtension {
val message: Property<String>
}

// Add the 'greeting' extension object to project


val extension =
project.extensions.create<GreetingPluginExtension>("greeting")

// Set a default value for 'message'


extension.message.convention("Hello from Gradle")

// Create a greeting task


abstract class GreetingTask : DefaultTask() {
@Input
val message = project.objects.property<String>()

@TaskAction
fun greet() {
println("Message: ${message.get()}")
}
}

// Register the task and set the convention


tasks.register<GreetingTask>("hello") {
message.convention(extension.message)
}
buildSrc/src/main/groovy/greetings.gradle

// Create extension object


interface GreetingPluginExtension {
Property<String> getMessage()
}

// Add the 'greeting' extension object to project


def extension = project.extensions.create("greeting",
GreetingPluginExtension)

// Set a default value for 'message'


extension.message.convention("Hello from Gradle")

// Create a greeting task


abstract class GreetingTask extends DefaultTask {
@Input
abstract Property<String> getMessage()

@TaskAction
void greet() {
println("Message: ${message.get()}")
}
}

// Register the task and set the convention


tasks.register("hello", GreetingTask) {
message.convention(extension.message)
}

$ gradle -q hello
Message: Hello from Gradle

This means that changes to the extension’s message property will trigger the task to be considered
out-of-date, ensuring that the task is re-executed with the new message.

You can find out more about types that you can use in task implementations and extensions in Lazy
Configuration.

Applying external plugins

In order to apply an external plugin in a precompiled script plugin, it has to be added to the plugin
project’s implementation classpath in the plugin’s build file:
buildSrc/build.gradle.kts

plugins {
`kotlin-dsl`
}

repositories {
mavenCentral()
}

dependencies {
implementation("com.bmuschko:gradle-docker-plugin:6.4.0")
}

buildSrc/build.gradle

plugins {
id 'groovy-gradle-plugin'
}

repositories {
mavenCentral()
}

dependencies {
implementation 'com.bmuschko:gradle-docker-plugin:6.4.0'
}

It can then be applied in the precompiled script plugin:

buildSrc/src/main/kotlin/my-plugin.gradle.kts

plugins {
id("com.bmuschko.docker-remote-api")
}

buildSrc/src/main/groovy/my-plugin.gradle

plugins {
id 'com.bmuschko.docker-remote-api'
}

The plugin version in this case is defined in the dependency declaration.

Implementing Binary Plugins


Binary plugins refer to plugins that are compiled and distributed as JAR files. These plugins are
usually written in Java or Kotlin and provide custom functionality or tasks to a Gradle build.

Using the Plugin Development plugin

The Gradle Plugin Development plugin can be used to assist in developing Gradle plugins.

This plugin will automatically apply the Java Plugin, add the gradleApi() dependency to the api
configuration, generate the required plugin descriptors in the resulting JAR file, and configure the
Plugin Marker Artifact to be used when publishing.

To apply and configure the plugin, add the following code to your build file:

build.gradle.kts

plugins {
`java-gradle-plugin`
}

gradlePlugin {
plugins {
create("simplePlugin") {
id = "org.example.greeting"
implementationClass = "org.example.GreetingPlugin"
}
}
}

build.gradle

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}

Writing and using custom task types is recommended when developing plugins as it automatically
benefits from incremental builds. As an added benefit of applying the plugin to your project, the
task validatePlugins automatically checks for an existing input/output annotation for every public
property defined in a custom task type implementation.

Creating a plugin ID

Plugin IDs are meant to be globally unique, similar to Java package names (i.e., a reverse domain
name). This format helps prevent naming collisions and allows grouping plugins with similar
ownership.

An explicit plugin identifier simplifies applying the plugin to a project. Your plugin ID should
combine components that reflect the namespace (a reasonable pointer to you or your organization)
and the name of the plugin it provides. For example, if your Github account is named foo and your
plugin is named bar, a suitable plugin ID might be com.github.foo.bar. Similarly, if the plugin was
developed at the baz organization, the plugin ID might be org.baz.bar.

Plugin IDs should adhere to the following guidelines:

• May contain any alphanumeric character, '.', and '-'.

• Must contain at least one '.' character separating the namespace from the plugin’s name.

• Conventionally use a lowercase reverse domain name convention for the namespace.

• Conventionally use only lowercase characters in the name.

• org.gradle, com.gradle, and com.gradleware namespaces may not be used.

• Cannot start or end with a '.' character.

• Cannot contain consecutive '.' characters (i.e., '..').

A namespace that identifies ownership and a name is sufficient for a plugin ID.

When bundling multiple plugins in a single JAR artifact, adhering to the same naming conventions
is recommended. This practice helps logically group related plugins.

There is no limit to the number of plugins that can be defined and registered (by different
identifiers) within a single project.

The identifiers for plugins written as a class should be defined in the project’s build script
containing the plugin classes. For this, the java-gradle-plugin needs to be applied:
buildSrc/build.gradle.kts

plugins {
id("java-gradle-plugin")
}

gradlePlugin {
plugins {
create("androidApplicationPlugin") {
id = "com.android.application"
implementationClass = "com.android.AndroidApplicationPlugin"
}
create("androidLibraryPlugin") {
id = "com.android.library"
implementationClass = "com.android.AndroidLibraryPlugin"
}
}
}

buildSrc/build.gradle

plugins {
id 'java-gradle-plugin'
}

gradlePlugin {
plugins {
androidApplicationPlugin {
id = 'com.android.application'
implementationClass = 'com.android.AndroidApplicationPlugin'
}
androidLibraryPlugin {
id = 'com.android.library'
implementationClass = 'com.android.AndroidLibraryPlugin'
}
}
}

Working with files

When developing plugins, it’s a good idea to be flexible when accepting input configuration for file
locations.

It is recommended to use Gradle’s managed properties and project.layout to select file or directory
locations. This will enable lazy configuration so that the actual location will only be resolved when
the file is needed and can be reconfigured at any time during build configuration.

This Gradle build file defines a task GreetingToFileTask that writes a greeting to a file. It also
registers two tasks: greet, which creates the file with the greeting, and sayGreeting, which prints the
file’s contents. The greetingFile property is used to specify the file path for the greeting:

build.gradle.kts

abstract class GreetingToFileTask : DefaultTask() {

@get:OutputFile
abstract val destination: RegularFileProperty

@TaskAction
fun greet() {
val file = destination.get().asFile
file.parentFile.mkdirs()
file.writeText("Hello!")
}
}

val greetingFile = objects.fileProperty()

tasks.register<GreetingToFileTask>("greet") {
destination = greetingFile
}

tasks.register("sayGreeting") {
dependsOn("greet")
val greetingFile = greetingFile
doLast {
val file = greetingFile.get().asFile
println("${file.readText()} (file: ${file.name})")
}
}

greetingFile = layout.buildDirectory.file("hello.txt")

build.gradle

abstract class GreetingToFileTask extends DefaultTask {

@OutputFile
abstract RegularFileProperty getDestination()

@TaskAction
def greet() {
def file = getDestination().get().asFile
file.parentFile.mkdirs()
file.write 'Hello!'
}
}

def greetingFile = objects.fileProperty()

tasks.register('greet', GreetingToFileTask) {
destination = greetingFile
}

tasks.register('sayGreeting') {
dependsOn greet
doLast {
def file = greetingFile.get().asFile
println "${file.text} (file: ${file.name})"
}
}

greetingFile = layout.buildDirectory.file('hello.txt')

$ gradle -q sayGreeting
Hello! (file: hello.txt)

In this example, we configure the greet task destination property as a closure/provider, which is
evaluated with the Project.file(java.lang.Object) method to turn the return value of the
closure/provider into a File object at the last minute. Note that we specify the greetingFile
property value after the task configuration. This lazy evaluation is a key benefit of accepting any
value when setting a file property and then resolving that value when reading the property.

You can learn more about working with files lazily in Working with Files.

Making a plugin configurable using extensions

Most plugins offer configuration options for build scripts and other plugins to customize how the
plugin works. Plugins do this using extension objects.

A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.

An extension object is simply an object with Java Bean properties representing the configuration.

Let’s add a greeting extension object to the project, which allows you to configure the greeting:
build.gradle.kts

interface GreetingPluginExtension {
val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {


override fun apply(project: Project) {
// Add the 'greeting' extension object
val extension =
project.extensions.create<GreetingPluginExtension>("greeting")
// Add a task that uses configuration from the extension object
project.task("hello") {
doLast {
println(extension.message.get())
}
}
}
}

apply<GreetingPlugin>()

// Configure the extension


the<GreetingPluginExtension>().message = "Hi from Gradle"

build.gradle

interface GreetingPluginExtension {
    Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {

    void apply(Project project) {
        // Add the 'greeting' extension object
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        // Add a task that uses configuration from the extension object
        project.task('hello') {
            doLast {
                println extension.message.get()
            }
        }
    }
}

apply plugin: GreetingPlugin

// Configure the extension
greeting.message = 'Hi from Gradle'

$ gradle -q hello
Hi from Gradle

In this example, GreetingPluginExtension is an object with a property called message. The extension
object is added to the project with the name greeting. This object becomes available as a project
property with the same name as the extension object. the<GreetingPluginExtension>() is equivalent
to project.extensions.getByType(GreetingPluginExtension::class.java).

Often, you have several related properties you need to specify on a single plugin. Gradle adds a
configuration block for each extension object, so you can group settings:

build.gradle.kts

interface GreetingPluginExtension {
    val message: Property<String>
    val greeter: Property<String>
}

class GreetingPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        project.task("hello") {
            doLast {
                println("${extension.message.get()} from ${extension.greeter.get()}")
            }
        }
    }
}

apply<GreetingPlugin>()

// Configure the extension using a DSL block
configure<GreetingPluginExtension> {
    message = "Hi"
    greeter = "Gradle"
}

build.gradle

interface GreetingPluginExtension {
    Property<String> getMessage()
    Property<String> getGreeter()
}

class GreetingPlugin implements Plugin<Project> {

    void apply(Project project) {
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        project.task('hello') {
            doLast {
                println "${extension.message.get()} from ${extension.greeter.get()}"
            }
        }
    }
}

apply plugin: GreetingPlugin

// Configure the extension using a DSL block
greeting {
    message = 'Hi'
    greeter = 'Gradle'
}

$ gradle -q hello
Hi from Gradle

In the Kotlin DSL, several settings can be grouped within the configure<GreetingPluginExtension>
block. The configure function is used to configure an extension object. It provides a convenient way
to set properties or apply configurations to these objects. The type used in the build script’s
configure function (GreetingPluginExtension) must match the extension type. Then, when the block
is executed, the receiver of the block is the extension.

In the Groovy DSL, several settings can be grouped within the greeting closure. The name of the
closure block in the build script (greeting) must match the extension object name. Then, when the
closure is executed, the fields on the extension object will be mapped to the variables within the
closure based on the standard Groovy closure delegate feature.

Declaring a DSL configuration container

Using an extension object extends the Gradle DSL to add a project property and DSL block for the
plugin. Because an extension object is a regular object, you can provide your own DSL nested inside
the plugin block by adding properties and methods to the extension object.

Let’s consider the following build script for illustration purposes.

build.gradle.kts

plugins {
id("org.myorg.server-env")
}

environments {
create("dev") {
url = "http://localhost:8080"
}

create("staging") {
url = "http://staging.enterprise.com"
}

create("production") {
url = "http://prod.enterprise.com"
}
}

build.gradle

plugins {
id 'org.myorg.server-env'
}

environments {
dev {
url = 'http://localhost:8080'
}

staging {
url = 'http://staging.enterprise.com'
}

production {
url = 'http://prod.enterprise.com'
}
}

The plugin exposes a DSL container for defining a set of environments. Each environment the user
configures has an arbitrary but declarative name and is represented with its own DSL configuration
block. The example above instantiates a development, staging, and production environment,
including their respective URLs.

Each environment must have a data representation in code to capture the values. The name of an
environment is immutable and can be passed in as a constructor parameter. Currently, the only
other parameter the data object stores is a URL.

The following ServerEnvironment object fulfills those requirements:

ServerEnvironment.java

abstract public class ServerEnvironment {

    private final String name;

    @javax.inject.Inject
    public ServerEnvironment(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    abstract public Property<String> getUrl();
}

Gradle exposes the factory method ObjectFactory.domainObjectContainer(Class,
NamedDomainObjectFactory) to create a container of data objects. The parameter the method takes
is the class representing the data. The created instance of type NamedDomainObjectContainer can
be exposed to the end user by adding it to the extension container with a specific name.

It’s common for a plugin to post-process the captured values within the plugin implementation, e.g.,
to configure tasks:

ServerEnvironmentPlugin.java

public class ServerEnvironmentPlugin implements Plugin<Project> {

    @Override
    public void apply(final Project project) {
        ObjectFactory objects = project.getObjects();

        NamedDomainObjectContainer<ServerEnvironment> serverEnvironmentContainer =
            objects.domainObjectContainer(ServerEnvironment.class, name -> objects.newInstance(ServerEnvironment.class, name));
        project.getExtensions().add("environments", serverEnvironmentContainer);

        serverEnvironmentContainer.all(serverEnvironment -> {
            String env = serverEnvironment.getName();
            String capitalizedServerEnv = env.substring(0, 1).toUpperCase() + env.substring(1);
            String taskName = "deployTo" + capitalizedServerEnv;
            project.getTasks().register(taskName, Deploy.class, task -> task.getUrl().set(serverEnvironment.getUrl()));
        });
    }
}

In the example above, a deployment task is created dynamically for every user-configured
environment.
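
The Deploy task type referenced by the plugin is not shown in this section. Purely for illustration, a
minimal sketch of such a task is given below; it is written in Kotlin for brevity, and the task action is
an assumption rather than the actual implementation:

abstract class Deploy : DefaultTask() {

    // URL of the target environment, wired from ServerEnvironment.getUrl() by the plugin
    @get:Input
    abstract val url: Property<String>

    @TaskAction
    fun deploy() {
        // Placeholder action; a real task would perform the actual deployment
        println("Deploying to ${url.get()}")
    }
}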

You can find out more about implementing project extensions in Developing Custom Gradle Types.

Modeling DSL-like APIs

DSLs exposed by plugins should be readable and easy to understand.

For example, let’s consider the following extension provided by a plugin. In its current form, it
offers a "flat" list of properties for configuring the creation of a website:

build-flat.gradle.kts

plugins {
id("org.myorg.site")
}

site {
outputDir = layout.buildDirectory.file("mysite")
websiteUrl = "https://gradle.org"
vcsUrl = "https://github.com/gradle/gradle-site-plugin"
}

build-flat.gradle

plugins {
id 'org.myorg.site'
}

site {
outputDir = layout.buildDirectory.file("mysite")
websiteUrl = 'https://gradle.org'
vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
}
As the number of exposed properties grows, you should introduce a nested, more expressive
structure.

The following code snippet adds a new configuration block named siteInfo as part of the extension.
This provides a stronger indication of what those properties mean:

build.gradle.kts

plugins {
id("org.myorg.site")
}

site {
outputDir = layout.buildDirectory.file("mysite")

siteInfo {
websiteUrl = "https://gradle.org"
vcsUrl = "https://github.com/gradle/gradle-site-plugin"
}
}

build.gradle

plugins {
id 'org.myorg.site'
}

site {
outputDir = layout.buildDirectory.file("mysite")

siteInfo {
websiteUrl = 'https://gradle.org'
vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
}
}

Implementing the backing objects for such an extension is simple. First, introduce a new data
object for managing the properties websiteUrl and vcsUrl:

SiteInfo.java

abstract public class SiteInfo {

    abstract public Property<String> getWebsiteUrl();

    abstract public Property<String> getVcsUrl();
}

In the extension, create an instance of the SiteInfo class and a method to delegate the captured
values to the data instance.

To configure underlying data objects, define a parameter of type Action.

The following example demonstrates the use of Action in an extension definition:

SiteExtension.java

abstract public class SiteExtension {

    abstract public RegularFileProperty getOutputDir();

    @Nested
    abstract public SiteInfo getSiteInfo();

    public void siteInfo(Action<? super SiteInfo> action) {
        action.execute(getSiteInfo());
    }
}

Mapping extension properties to task properties

Plugins commonly use an extension to capture user input from the build script and map it to a
custom task’s input/output properties. The build script author interacts with the extension’s DSL,
while the plugin implementation handles the underlying logic:

app/build.gradle.kts

// Extension class to capture user input
class MyExtension {
    @Input
    var inputParameter: String? = null
}

// Custom task that uses the input from the extension
class MyCustomTask : org.gradle.api.DefaultTask() {
    @Input
    var inputParameter: String? = null

    @TaskAction
    fun executeTask() {
        println("Input parameter: $inputParameter")
    }
}

// Plugin class that configures the extension and task
class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Create and configure the extension
        val extension = project.extensions.create("myExtension", MyExtension::class.java)
        // Create and configure the custom task
        project.tasks.register("myTask", MyCustomTask::class.java) {
            group = "custom"
            inputParameter = extension.inputParameter
        }
    }
}

app/build.gradle

// Extension class to capture user input
class MyExtension {
    @Input
    String inputParameter = null
}

// Custom task that uses the input from the extension
class MyCustomTask extends DefaultTask {
    @Input
    String inputParameter = null

    @TaskAction
    def executeTask() {
        println("Input parameter: $inputParameter")
    }
}

// Plugin class that configures the extension and task
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Create and configure the extension
        def extension = project.extensions.create("myExtension", MyExtension)
        // Create and configure the custom task
        project.tasks.register("myTask", MyCustomTask) {
            group = "custom"
            inputParameter = extension.inputParameter
        }
    }
}

In this example, the MyExtension class defines an inputParameter property that can be set in the build
script. The MyPlugin class configures this extension and uses its inputParameter value to configure
the MyCustomTask task. The MyCustomTask task then uses this input parameter in its logic.

You can learn more about types you can use in task implementations and extensions in Lazy
Configuration.
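
Note that the wiring above copies the extension value into the task property eagerly, when the task is
configured. If the value should only be read when the task executes, both properties can be modeled
with the lazy Property type described in Lazy Configuration. The following is a minimal sketch of that
variant (it is not part of the sample above and reuses the illustrative names):

// Lazy variant of the extension and task above (illustrative names)
interface MyExtension {
    val inputParameter: Property<String>
}

abstract class MyCustomTask : DefaultTask() {
    @get:Input
    abstract val inputParameter: Property<String>

    @TaskAction
    fun executeTask() {
        println("Input parameter: ${inputParameter.get()}")
    }
}

class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val extension = project.extensions.create("myExtension", MyExtension::class.java)
        project.tasks.register("myTask", MyCustomTask::class.java) {
            group = "custom"
            // Connects the providers; the value is only resolved when the task runs
            inputParameter.set(extension.inputParameter)
        }
    }
}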

Adding default configuration with conventions

Plugins should provide sensible defaults and standards in a specific context, reducing the number
of decisions users need to make. Using the project object, you can define default values. These are
known as conventions.

Conventions are properties that are initialized with default values and can be overridden by the
user in their build script. For example:

build.gradle.kts

interface GreetingPluginExtension {
    val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {

    override fun apply(project: Project) {
        // Add the 'greeting' extension object
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        extension.message.convention("Hello from GreetingPlugin")
        // Add a task that uses configuration from the extension object
        project.task("hello") {
            doLast {
                println(extension.message.get())
            }
        }
    }
}

apply<GreetingPlugin>()

build.gradle

interface GreetingPluginExtension {
    Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {

    void apply(Project project) {
        // Add the 'greeting' extension object
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        extension.message.convention('Hello from GreetingPlugin')
        // Add a task that uses configuration from the extension object
        project.task('hello') {
            doLast {
                println extension.message.get()
            }
        }
    }
}

apply plugin: GreetingPlugin

$ gradle -q hello
Hello from GreetingPlugin

In this example, GreetingPluginExtension is a class that represents the convention. The message
property is the convention property with a default value of 'Hello from GreetingPlugin'.

Users can override this value in their build script:

build.gradle.kts

configure<GreetingPluginExtension> {
    message = "Custom message"
}

build.gradle

greeting {
    message = 'Custom message'
}

$ gradle -q hello
Custom message
Separating capabilities from conventions

Separating capabilities from conventions in plugins allows users to choose which tasks and
conventions to apply.

For example, the Java Base plugin provides un-opinionated (i.e., generic) functionality like
SourceSets, while the Java plugin adds tasks and conventions familiar to Java developers like
classes, jar or javadoc.

When designing your own plugins, consider developing two plugins — one for capabilities and
another for conventions — to offer flexibility to users.

In the example below, MyPlugin contains conventions, and MyBasePlugin defines capabilities. Then,
MyPlugin applies MyBasePlugin; this is called plugin composition. To apply a plugin from another one:

MyBasePlugin.java

import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class MyBasePlugin implements Plugin<Project> {

    public void apply(Project project) {
        // define capabilities
    }
}

MyPlugin.java

import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class MyPlugin implements Plugin<Project> {

    public void apply(Project project) {
        project.getPlugins().apply(MyBasePlugin.class);

        // define conventions
    }
}

Reacting to plugins

A common pattern in Gradle plugin implementations is configuring the runtime behavior of
existing plugins and tasks in a build.

For example, a plugin could assume that it is applied to a Java-based project and automatically
reconfigure the standard source directory:

InhouseStrongOpinionConventionJavaPlugin.java

public class InhouseStrongOpinionConventionJavaPlugin implements Plugin<Project> {

    public void apply(Project project) {
        // Careful! Eagerly applying plugins has downsides, and is not always recommended.
        project.getPlugins().apply(JavaPlugin.class);
        SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
        SourceSet main = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
        main.getJava().setSrcDirs(Arrays.asList("src"));
    }
}

The drawback to this approach is that it automatically forces the project to apply the Java plugin,
imposing a strong opinion on it (i.e., reducing flexibility and generality). In practice, the project
applying the plugin might not even deal with Java code.

Instead of automatically applying the Java plugin, the plugin could react to the fact that the
consuming project applies the Java plugin. Only if that is the case, then a certain configuration is
applied:

InhouseConventionJavaPlugin.java

public class InhouseConventionJavaPlugin implements Plugin<Project> {

    public void apply(Project project) {
        project.getPlugins().withType(JavaPlugin.class, javaPlugin -> {
            SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
            SourceSet main = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
            main.getJava().setSrcDirs(Arrays.asList("src"));
        });
    }
}

Reacting to plugins is preferred over applying plugins if there is no good reason to assume that the
consuming project has the expected setup.

The same concept applies to task types:

InhouseConventionWarPlugin.java

public class InhouseConventionWarPlugin implements Plugin<Project> {

    public void apply(Project project) {
        project.getTasks().withType(War.class).configureEach(war ->
            war.setWebXml(project.file("src/someWeb.xml")));
    }
}

Reacting to build features

Plugins can access the status of build features in the build. The Build Features API allows checking
whether the user requested a particular Gradle feature and if it is active in the current build. An
example of a build feature is the configuration cache.

There are two main use cases:

• Using the status of build features in reports or statistics.

• Incrementally adopting experimental Gradle features by disabling incompatible plugin
functionality.

Below is an example of a plugin that makes use of both cases.

Reacting to build features

public abstract class MyPlugin implements Plugin<Project> {

    @Inject
    protected abstract BuildFeatures getBuildFeatures(); ①

    @Override
    public void apply(Project p) {
        BuildFeatures buildFeatures = getBuildFeatures();

        Boolean configCacheRequested = buildFeatures.getConfigurationCache()
            .getRequested() ②
            .getOrNull(); // could be null if user did not opt in nor opt out
        String configCacheUsage = describeFeatureUsage(configCacheRequested);
        MyReport myReport = new MyReport();
        myReport.setConfigurationCacheUsage(configCacheUsage);

        boolean isolatedProjectsActive = buildFeatures.getIsolatedProjects()
            .getActive() ③
            .get(); // the active state is always defined
        if (!isolatedProjectsActive) {
            myOptionalPluginLogicIncompatibleWithIsolatedProjects();
        }
    }

    private String describeFeatureUsage(Boolean requested) {
        return requested == null ? "no preference" : requested ? "opt-in" : "opt-out";
    }

    private void myOptionalPluginLogicIncompatibleWithIsolatedProjects() {
    }
}

① The BuildFeatures service can be injected into plugins, tasks, and other managed types.

② Accessing the requested status of a feature for reporting.

③ Using the active status of a feature to disable incompatible functionality.


Build feature properties

The status properties of a BuildFeature are represented with Provider<Boolean> types.

The BuildFeature.getRequested() status of a build feature determines if the user requested to enable
or disable the feature.

When the requested provider value is:

• true — the user opted in for using the feature

• false — the user opted out from using the feature

• undefined — the user neither opted in nor opted out from using the feature

The BuildFeature.getActive() status of a build feature is always defined. It represents the effective
state of the feature in the build.

When the active provider value is:

• true — the feature may affect the build behavior in a way specific to the feature

• false — the feature will not affect the build behavior

Note that the active status does not depend on the requested status. Even if the user requests a
feature, it may still not be active due to other build options being used in the build. Gradle can also
activate a feature by default, even if the user did not specify a preference.

Using a custom dependencies block

NOTE Custom dependencies blocks are based on incubating APIs.

A plugin can provide dependency declarations in custom blocks that allow users to declare
dependencies in a type-safe and context-aware way.

For instance, instead of users needing to know and use the underlying Configuration name to add
dependencies, a custom dependencies block lets the plugin pick a meaningful name that can be used
consistently.

Adding a custom dependencies block

To add a custom dependencies block, you need to create a new type that will represent the set of
dependency scopes available to users. That new type needs to be accessible from a part of your
plugin (from a domain object or extension). Finally, the dependency scopes need to be wired back
to underlying Configuration objects that will be used during dependency resolution.

See JvmComponentDependencies and JvmTestSuite for an example of how this is used in a Gradle
core plugin.

1. Create an interface that extends Dependencies

NOTE You can also extend GradleDependencies to get access to Gradle-provided
dependencies like gradleApi().

ExampleDependencies.java

/**
* Custom dependencies block for the example plugin.
*/
public interface ExampleDependencies extends Dependencies {

2. Add accessors for dependency scopes

For each dependency scope your plugin wants to support, add a getter method that returns a
DependencyCollector.

ExampleDependencies.java

/**
* Dependency scope called "implementation"
*/
DependencyCollector getImplementation();

3. Add accessors for custom dependencies block

To make the custom dependencies block configurable, the plugin needs to add a getDependencies
method that returns the new type from above and a configurable block method named
dependencies.

By convention, the accessors for your custom dependencies block should be called
getDependencies()/dependencies(Action). This method could be named something else, but users
would need to know that a different block can behave like a dependencies block.

ExampleExtension.java

/**
* Custom dependencies for this extension.
*/
@Nested
ExampleDependencies getDependencies();

/**
* Configurable block
*/
default void dependencies(Action<? super ExampleDependencies> action) {
    action.execute(getDependencies());
}

4. Wire dependency scope to Configuration

Finally, the plugin needs to wire the custom dependencies block to some underlying Configuration
objects. If this is not done, none of the dependencies declared in the custom block will be available
to dependency resolution.

ExamplePlugin.java

project.getConfigurations().dependencyScope("exampleImplementation", conf -> {
    conf.fromDependencyCollector(example.getDependencies().getImplementation());
});

NOTE In this example, the name users will use to add dependencies is "implementation",
but the underlying Configuration is named exampleImplementation.

build.gradle.kts

example {
dependencies {
implementation("junit:junit:4.13")
}
}

build.gradle

example {
dependencies {
implementation("junit:junit:4.13")
}
}

Differences between the custom dependencies and the top-level dependencies blocks

Each dependency scope returns a DependencyCollector that provides strongly-typed methods to add
and configure dependencies.

There is also a DependencyFactory with factory methods to create new dependencies from different
notations. Dependencies can be created lazily using these factory methods, as shown below.

A custom dependencies block differs from the top-level dependencies block in the following ways:

• Dependencies must be declared using a String, an instance of Dependency, a FileCollection, a
Provider of Dependency, or a ProviderConvertible of MinimalExternalModuleDependency.

• Outside of Gradle build scripts, you must explicitly call a getter for the DependencyCollector and
then call add.

◦ dependencies.add("implementation", x) becomes getImplementation().add(x)

• You cannot declare dependencies with the Map notation from Kotlin and Java. Use multi-
argument methods instead in Kotlin and Java.

◦ Kotlin: compileOnly(mapOf("group" to "foo", "name" to "bar")) becomes
compileOnly(module(group = "foo", name = "bar"))

◦ Java: compileOnly(Map.of("group", "foo", "name", "bar")) becomes
getCompileOnly().add(module("foo", "bar", null))

• You cannot add a dependency with an instance of Project. You must turn it into a
ProjectDependency first.

• You cannot add version catalog bundles directly. Instead, use the bundle method on each
configuration.

◦ Kotlin and Groovy: implementation(libs.bundles.testing) becomes
implementation.bundle(libs.bundles.testing)

• You cannot use providers for non-Dependency types directly. Instead, map them to a Dependency
using the DependencyFactory.

◦ Kotlin and Groovy: implementation(myStringProvider) becomes
implementation(myStringProvider.map { dependencyFactory.create(it) })

◦ Java: implementation(myStringProvider) becomes
getImplementation().add(myStringProvider.map(getDependencyFactory()::create))

• Unlike the top-level dependencies block, constraints are not in a separate block.

◦ Instead, constraints are added by decorating a dependency with constraint(…) like
implementation(constraint("org:foo:1.0")).

Keep in mind that the dependencies block may not provide access to the same methods as the top-
level dependencies block.

NOTE Plugins should prefer adding dependencies via their own dependencies block.

Providing default dependencies

The implementation of a plugin sometimes requires the use of an external dependency.

You might want to automatically download an artifact using Gradle’s dependency management
mechanism and later use it in the action of a task type declared in the plugin. Ideally, the plugin
implementation does not need to ask the user for the coordinates of that dependency - it can simply
predefine a sensible default version.

Let’s look at an example of a plugin that downloads files containing data for further processing. The
plugin implementation declares a custom configuration that allows for assigning those external
dependencies with dependency coordinates:
DataProcessingPlugin.java

public class DataProcessingPlugin implements Plugin<Project> {

    public void apply(Project project) {
        Configuration dataFiles = project.getConfigurations().create("dataFiles", c -> {
            c.setVisible(false);
            c.setCanBeConsumed(false);
            c.setCanBeResolved(true);
            c.setDescription("The data artifacts to be processed for this plugin.");
            c.defaultDependencies(d -> d.add(project.getDependencies().create("org.myorg:data:1.4.6")));
        });

        project.getTasks().withType(DataProcessing.class).configureEach(
            dataProcessing -> dataProcessing.getDataFiles().from(dataFiles));
    }
}

DataProcessing.java

abstract public class DataProcessing extends DefaultTask {

    @InputFiles
    abstract public ConfigurableFileCollection getDataFiles();

    @TaskAction
    public void process() {
        System.out.println(getDataFiles().getFiles());
    }
}

This approach is convenient for the end user as there is no need to actively declare a dependency.
The plugin already provides all the details about this implementation.

But what if the user wants to redefine the default dependency?

No problem. The plugin also exposes the custom configuration that can be used to assign a different
dependency. Effectively, the default dependency is overwritten:

build.gradle.kts

plugins {
id("org.myorg.data-processing")
}

dependencies {
dataFiles("org.myorg:more-data:2.6")
}

build.gradle

plugins {
id 'org.myorg.data-processing'
}

dependencies {
dataFiles 'org.myorg:more-data:2.6'
}

You will find that this pattern works well for tasks that require an external dependency when the
task’s action is executed. You can go further and abstract the version to be used for the external
dependency by exposing an extension property (e.g. toolVersion in the JaCoCo plugin).
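
Purely as an illustration of that idea, the sketch below extends the data-processing example with a
hypothetical dataProcessing extension whose toolVersion property feeds the default dependency. It is
written as Kotlin build logic (for example, in a precompiled script plugin); the extension and property
names are assumptions made for this sketch:

// Hypothetical extension exposing the tool version used for the default dependency
interface DataProcessingExtension {
    val toolVersion: Property<String>
}

// Inside the plugin's apply(Project) method:
val dataProcessing = project.extensions.create("dataProcessing", DataProcessingExtension::class.java)
dataProcessing.toolVersion.convention("1.4.6")

project.configurations.create("dataFiles") {
    isVisible = false
    isCanBeConsumed = false
    isCanBeResolved = true
    description = "The data artifacts to be processed for this plugin."
    defaultDependencies {
        // The version is only read if the user did not declare a dependency themselves
        add(project.dependencies.create("org.myorg:data:${dataProcessing.toolVersion.get()}"))
    }
}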

Minimizing the use of external libraries

Using external libraries in your Gradle projects can bring great convenience, but be aware that they
can introduce complex dependency graphs. Gradle’s buildEnvironment task can help you visualize
these dependencies, including those of your plugins. Keep in mind that plugins share the same
classloader, so conflicts may arise with different versions of the same library.

To demonstrate let’s assume the following build script:

build.gradle.kts

plugins {
id("org.asciidoctor.jvm.convert") version "4.0.2"
}

build.gradle

plugins {
id 'org.asciidoctor.jvm.convert' version '4.0.2'
}

The output of the task clearly indicates the classpath of the classpath configuration:

$ gradle buildEnvironment
> Task :buildEnvironment

------------------------------------------------------------
Root project 'external-libraries'
------------------------------------------------------------

classpath
\--- org.asciidoctor.jvm.convert:org.asciidoctor.jvm.convert.gradle.plugin:4.0.2
\--- org.asciidoctor:asciidoctor-gradle-jvm:4.0.2
+--- org.ysb33r.gradle:grolifant-rawhide:3.0.0
| \--- org.tukaani:xz:1.6
+--- org.ysb33r.gradle:grolifant-herd:3.0.0
| +--- org.tukaani:xz:1.6
| +--- org.ysb33r.gradle:grolifant40:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.apache.commons:commons-collections4:4.4
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0
| | | +--- org.tukaani:xz:1.6
| | | +--- org.apache.commons:commons-collections4:4.4
| | | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant50:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant40-legacy-api:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.apache.commons:commons-collections4:4.4
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant60:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant70:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant80:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant70:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
+--- org.asciidoctor:asciidoctor-gradle-base:4.0.2
| \--- org.ysb33r.gradle:grolifant-herd:3.0.0 (*)
\--- org.asciidoctor:asciidoctorj-api:2.5.7

(*) - Indicates repeated occurrences of a transitive dependency subtree. Gradle
expands transitive dependency subtrees only once per project; repeat occurrences only
display the root of the subtree, followed by this annotation.

A web-based, searchable dependency report is available by adding the --scan option.

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

A Gradle plugin does not run in its own, isolated classloader, so you must consider whether you
truly need a library or if a simpler solution suffices.

For logic that is executed as part of task execution, use the Worker API that allows you to isolate
libraries.
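
A rough sketch of that approach is shown below, written as Kotlin build logic (for example, in a
precompiled script plugin). The task, work action, and parameter names are assumptions made for
this sketch; the key point is that classLoaderIsolation puts the external library on an isolated
classpath rather than on the plugin classpath:

// Illustrative parameters and work action
interface ProcessParameters : WorkParameters {
    val sources: ConfigurableFileCollection
}

abstract class ProcessAction : WorkAction<ProcessParameters> {
    override fun execute() {
        // Code running here may use the external library loaded in the isolated classloader
        println("Processing ${parameters.sources.files}")
    }
}

abstract class ProcessTask : DefaultTask() {

    @get:InputFiles
    abstract val sources: ConfigurableFileCollection

    // Classpath carrying the external library, e.g. resolved from a dedicated configuration
    @get:Classpath
    abstract val libraryClasspath: ConfigurableFileCollection

    @get:javax.inject.Inject
    abstract val workerExecutor: WorkerExecutor

    @TaskAction
    fun process() {
        // The library version used by this work cannot clash with libraries of other plugins
        val queue = workerExecutor.classLoaderIsolation {
            classpath.from(libraryClasspath)
        }
        queue.submit(ProcessAction::class.java) {
            sources.from(this@ProcessTask.sources)
        }
    }
}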

Providing multiple variants of a plugin

Variants of a plugin refer to different flavors or configurations of the plugin that are tailored to
specific needs or use cases. These variants can include different implementations, extensions, or
configurations of the base plugin.

The most convenient way to configure additional plugin variants is to use feature variants, a
concept available in all Gradle projects that apply one of the Java plugins:

dependencies {
    implementation 'com.google.guava:guava:30.1-jre'       // Regular dependency
    featureVariant 'com.google.guava:guava-gwt:30.1-jre'   // Feature variant dependency
}

In the following example, each plugin variant is developed in isolation. A separate source set is
compiled and packaged in a separate jar for each variant.

The following sample demonstrates how to add a variant that is compatible with Gradle 7.0+ while
the "main" variant is compatible with older versions:

build.gradle.kts

val gradle7 = sourceSets.create("gradle7")

java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) ①
    }
}

configurations.configureEach {
    if (isCanBeConsumed && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, ②
                objects.named("7.0"))
        }
    }
}

tasks.named<Copy>(gradle7.processResourcesTaskName) { ③
    val copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into("META-INF/gradle-plugins")
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    "gradle7CompileOnly"(gradleApi()) ④
}

build.gradle

def gradle7 = sourceSets.create('gradle7')

java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) ①
    }
}

configurations.configureEach {
    if (canBeConsumed && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, ②
                objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}

tasks.named(gradle7.processResourcesTaskName) { ③
    def copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into('META-INF/gradle-plugins')
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    gradle7CompileOnly(gradleApi()) ④
}

NOTE Only Gradle versions 7 or higher can be explicitly targeted by a variant, as support
for this was only added in Gradle 7.

First, we declare a separate source set and a feature variant for our Gradle 7 plugin variant. Then,
we do some specific wiring to turn the feature into a proper Gradle plugin variant:

① Assign the implicit capability that corresponds to the components GAV to the variant.

② Assign the Gradle API version attribute to all consumable configurations of our Gradle7 variant.
Gradle uses this information to determine which variant to select during plugin resolution.

③ Configure the processGradle7Resources task to ensure the plugin descriptor file is added to the
Gradle7 variant Jar.

④ Add a dependency to the gradleApi() for our new variant so that the API is visible during
compilation time.

Note that there is currently no convenient way to access the API of Gradle versions other than the
one you are building the plugin with. Ideally, every variant should be able to declare a dependency
on the API of the minimal Gradle version it supports. This will be improved in the future.

The above snippet assumes that all variants of your plugin have the plugin class at the same
location. That is, if your plugin class is org.example.GreetingPlugin, you need to create a second
variant of that class in src/gradle7/java/org/example.

Using version-specific variants of multi-variant plugins

Given a dependency on a multi-variant plugin, Gradle will automatically choose its variant that best
matches the current Gradle version when it resolves any of:

• plugins specified in the plugins {} block;

• buildscript classpath dependencies;

• dependencies in the root project of the build source (buildSrc) that appear on the compile or
runtime classpath;

• dependencies in a project that applies the Java Gradle Plugin Development plugin or the Kotlin
DSL plugin, appearing on the compile or runtime classpath.

The best matching variant is the variant that targets the highest Gradle API version and does not
exceed the current build’s Gradle version.
In all other cases, a plugin variant that does not specify the supported Gradle API version is
preferred if such a variant is present.

In projects that use plugins as dependencies, requesting the variants of plugin dependencies that
support a different Gradle version is possible. This allows a multi-variant plugin that depends on
other plugins to access their APIs, which are exclusively provided in their version-specific variants.

This snippet makes the plugin variant gradle7 defined above consume the matching variants of its
dependencies on other multi-variant plugins:

build.gradle.kts

configurations.configureEach {
    if (isCanBeResolved && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named("7.0"))
        }
    }
}

build.gradle

configurations.configureEach {
    if (canBeResolved && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}

Reporting problems

Plugins can report problems through Gradle’s problem-reporting APIs. The APIs report rich,
structured information about problems happening during the build. This information can be used
by different user interfaces such as Gradle’s console output, Build Scans, or IDEs to communicate
problems to the user in the most appropriate way.

The following example shows an issue reported from a plugin:


ProblemReportingPlugin.java

public class ProblemReportingPlugin implements Plugin<Project> {

    private final ProblemReporter problemReporter;

    @Inject
    public ProblemReportingPlugin(Problems problems) { ①
        this.problemReporter = problems.forNamespace("org.myorg"); ②
    }

    public void apply(Project project) {
        this.problemReporter.reporting(builder -> builder ③
            .id("adhoc-deprecation", "Plugin 'x' is deprecated")
            .details("The plugin 'x' is deprecated since version 2.5")
            .solution("Please use plugin 'y'")
            .severity(Severity.WARNING)
        );
    }
}

① The Problems service is injected into the plugin.

② A problem reporter is created for the plugin. While the namespace is up to the plugin author, it
is recommended that the plugin ID be used.

③ A problem is reported. This problem is recoverable, so the build will continue.

For a full example, see our end-to-end sample.

Problem building

When reporting a problem, a wide variety of information can be provided. The ProblemSpec
describes all the information that can be provided.

Reporting problems

When it comes to reporting problems, we support three different modes:

• Reporting a problem is used for reporting problems that are recoverable, and the build should
continue.

• Throwing a problem is used for reporting problems that are not recoverable, and the build
should fail.

• Rethrowing a problem is used to wrap an already thrown exception. Otherwise, the behavior is
the same as Throwing.

For more details, see the ProblemReporter documentation.


Problem aggregation

When reporting problems, Gradle will aggregate similar problems by sending them through the
Tooling API based on the problem’s category label.

• When a problem is reported, the first occurrence is going to be reported as a
ProblemDescriptor, containing the complete information about the problem.

• Any subsequent occurrences of the same problem will be reported as a
ProblemAggregationDescriptor. This descriptor will arrive at the end of the build and contain
the number of occurrences of the problem.

• If for any bucket (i.e., category and label pairing), the number of collected occurrences is greater
than 10,000, then it will be sent immediately instead of at the end of the build.

Testing Gradle plugins


Testing plays a crucial role in the development process by ensuring reliable and high-quality
software. This principle applies to build code, including Gradle plugins.

The sample project

This section revolves around a sample project called the "URL verifier plugin". This plugin creates a
task named verifyUrl that checks whether a given URL can be resolved via HTTP GET. The end user
can provide the URL via an extension named verification.

The following build script assumes that the plugin JAR file has been published to a binary
repository. The script demonstrates how to apply the plugin to the project and configure its exposed
extension:

build.gradle.kts

plugins {
id("org.myorg.url-verifier") ①
}

verification {
url = "https://www.google.com/" ②
}

build.gradle

plugins {
id 'org.myorg.url-verifier' ①
}

verification {
url = 'https://www.google.com/' ②
}

① Applies the plugin to the project

② Configures the URL to be verified through the exposed extension

Executing the verifyUrl task renders a success message if the HTTP GET call to the configured URL
returns with a 200 response code:

$ gradle verifyUrl

> Task :verifyUrl
Successfully resolved URL 'https://www.google.com/'

BUILD SUCCESSFUL in 0s
5 actionable tasks: 5 executed

Before diving into the code, let’s first revisit the different types of tests and the tooling that supports
implementing them.

The importance of testing

Testing is a crucial part of the software development life cycle, ensuring that software functions
correctly and meets quality standards before release. Automated testing allows developers to
refactor and improve code with confidence.

The testing pyramid

Manual Testing
While manual testing is straightforward, it is error-prone and requires human effort. For Gradle
plugins, manual testing involves using the plugin in a build script.

Automated Testing
Automated testing includes unit, integration, and functional testing. The testing pyramid
introduced by Mike Cohen in his book Succeeding with Agile: Software Development Using Scrum
describes three types of automated tests:

1. Unit Testing: Verifies the smallest units of code, typically methods, in isolation. It uses Stubs or
Mocks to isolate code from external dependencies.

2. Integration Testing: Validates that multiple units or components work together.

3. Functional Testing: Tests the system from the end user’s perspective, ensuring correct
functionality. End-to-end tests for Gradle plugins simulate a build, apply the plugin, and execute
specific tasks to verify functionality.

Tooling support

Testing Gradle plugins, both manually and automatically, is simplified with the appropriate tools.
The table below provides a summary of each testing approach. You can choose any test framework
you’re comfortable with.

For detailed explanations and code examples, refer to the specific sections below:

Test type           Tooling support
Manual tests        Gradle composite builds
Unit tests          Any JVM-based test framework
Integration tests   Any JVM-based test framework
Functional tests    Any JVM-based test framework and Gradle TestKit

Setting up manual tests

The composite builds feature of Gradle makes it easy to test a plugin manually. The standalone
plugin project and the consuming project can be combined into a single unit, making it
straightforward to try out or debug changes without re-publishing the binary file:
.
├── include-plugin-build ①
│ ├── build.gradle
│ └── settings.gradle
└── url-verifier-plugin ②
├── build.gradle
├── settings.gradle
└── src

① Consuming project that includes the plugin project

② The plugin project

There are two ways to include a plugin project in a consuming project:

1. By using the command line option --include-build.

2. By using the method includeBuild in settings.gradle.

The following code snippet demonstrates the use of the settings file:

settings.gradle.kts

pluginManagement {
includeBuild("../url-verifier-plugin")
}

settings.gradle

pluginManagement {
includeBuild '../url-verifier-plugin'
}

The command line output of the verifyUrl task from the project include-plugin-build looks exactly
the same as shown in the introduction, except that it now executes as part of a composite build.

Manual testing has its place in the development process, but it is not a replacement for automated
testing.

Setting up automated tests

Setting up a suite of tests early on is crucial to the success of your plugin. Automated tests become
an invaluable safety net when upgrading the plugin to a new Gradle version or
enhancing/refactoring the code.
Organizing test source code

We recommend implementing a good distribution of unit, integration, and functional tests to cover
the most important use cases. Separating the source code for each test type automatically results in
a project that is more maintainable and manageable.

By default, the Java project creates a convention for organizing unit tests in the directory
src/test/java. Additionally, if you apply the Groovy plugin, source code under the directory
src/test/groovy is considered for compilation (with the same standard for Kotlin under the
directory src/test/kotlin). Consequently, source code directories for other test types should follow
a similar pattern:

.
└── src
├── functionalTest
│ └── groovy ①
├── integrationTest
│ └── groovy ②
├── main
│ ├── java ③
└── test
└── groovy ④

① Source directory containing functional tests

② Source directory containing integration tests

③ Source directory containing production source code

④ Source directory containing unit tests

NOTE The directories src/integrationTest/groovy and src/functionalTest/groovy are not
based on an existing standard convention for Gradle projects. You are free to choose
any project layout that works best for you.

You can configure the source directories for compilation and test execution.

The Test Suite plugin provides a DSL and API to model multiple groups of automated tests into test
suites in JVM-based projects. You can also rely on third-party plugins for convenience, such as the
Nebula Facet plugin or the TestSets plugin.

Modeling test types

NOTE A new configuration DSL for modeling the below integrationTest suite is available
via the incubating JVM Test Suite plugin.

In Gradle, source code directories are represented using the concept of source sets. A source set is
configured to point to one or more directories containing source code. When you define a source
set, Gradle automatically sets up compilation tasks for the specified directories.

A pre-configured source set can be created with one line of build script code. The source set
automatically registers configurations to define dependencies for the sources of the source set:

// Define a source set named 'test' for test sources
sourceSets {
    test {
        java {
            srcDirs = ['src/test/java']
        }
    }
}

// Specify a test implementation dependency on JUnit
dependencies {
    testImplementation 'junit:junit:4.12'
}

We use that to define an integrationTestImplementation dependency on the project itself, which
represents the "main" variant of our project (i.e., the compiled plugin code):

build.gradle.kts

val integrationTest by sourceSets.creating

dependencies {
    "integrationTestImplementation"(project)
}

build.gradle

def integrationTest = sourceSets.create("integrationTest")

dependencies {
    integrationTestImplementation(project)
}

Source sets are responsible for compiling source code, but they do not deal with executing the
bytecode. For test execution, a corresponding task of type Test needs to be established. The
following setup shows the execution of integration tests, referencing the classes and runtime
classpath of the integration test source set:

build.gradle.kts

val integrationTestTask = tasks.register<Test>("integrationTest") {
    description = "Runs the integration tests."
    group = "verification"
    testClassesDirs = integrationTest.output.classesDirs
    classpath = integrationTest.runtimeClasspath
    mustRunAfter(tasks.test)
}

tasks.check {
    dependsOn(integrationTestTask)
}

build.gradle

def integrationTestTask = tasks.register("integrationTest", Test) {
    description = 'Runs the integration tests.'
    group = "verification"
    testClassesDirs = integrationTest.output.classesDirs
    classpath = integrationTest.runtimeClasspath
    mustRunAfter(tasks.named('test'))
}

tasks.named('check') {
    dependsOn(integrationTestTask)
}
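
As mentioned in the note above, the incubating JVM Test Suite plugin can model the same
integrationTest suite declaratively instead of wiring the source set and Test task by hand. A sketch of
what that could look like in a Kotlin build script is shown below; the chosen test framework and the
exact wiring may differ for your project:

testing {
    suites {
        val integrationTest by registering(JvmTestSuite::class) {
            useSpock() // or useJUnitJupiter(), depending on the chosen test framework
            dependencies {
                implementation(project()) // the plugin's production code
            }
            targets.all {
                testTask.configure {
                    mustRunAfter(tasks.named("test"))
                }
            }
        }
    }
}

tasks.named("check") {
    dependsOn(testing.suites.named("integrationTest"))
}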

Configuring a test framework

Gradle does not dictate the use of a specific test framework. Popular choices include JUnit, TestNG
and Spock. Once you choose an option, you have to add its dependency to the compile classpath for
your tests.

The following code snippet shows how to use Spock for implementing tests:

build.gradle.kts

repositories {
    mavenCentral()
}

dependencies {
    testImplementation(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    testImplementation("org.spockframework:spock-core")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")

    "integrationTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "integrationTestImplementation"("org.spockframework:spock-core")
    "integrationTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")

    "functionalTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "functionalTestImplementation"("org.spockframework:spock-core")
    "functionalTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")
}

tasks.withType<Test>().configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}

build.gradle

repositories {
    mavenCentral()
}

dependencies {
    testImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    testImplementation 'org.spockframework:spock-core'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    integrationTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    integrationTestImplementation 'org.spockframework:spock-core'
    integrationTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    functionalTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    functionalTestImplementation 'org.spockframework:spock-core'
    functionalTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'
}

tasks.withType(Test).configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}

Spock is a Groovy-based BDD test framework that even includes APIs for creating
NOTE Stubs and Mocks. The Gradle team prefers Spock over other options for its
expressiveness and conciseness.
Implementing automated tests

This section discusses representative implementation examples for unit, integration, and functional
tests. All test classes are based on the use of Spock, though it should be relatively easy to adapt the
code to a different test framework.

Implementing unit tests

The URL verifier plugin emits HTTP GET calls to check if a URL can be resolved successfully. The
method DefaultHttpCaller.get(String) is responsible for calling a given URL and returns an
instance of type HttpResponse. HttpResponse is a POJO containing information about the HTTP
response code and message:

HttpResponse.java

package org.myorg.http;

public class HttpResponse {

    private int code;
    private String message;

    public HttpResponse(int code, String message) {
        this.code = code;
        this.message = message;
    }

    public int getCode() {
        return code;
    }

    public String getMessage() {
        return message;
    }

    @Override
    public String toString() {
        return "HTTP " + code + ", Reason: " + message;
    }
}

The class HttpResponse represents a good candidate for a unit test. It does not reach out to any other
classes nor does it use the Gradle API.

HttpResponseTest.groovy

package org.myorg.http

import spock.lang.Specification

class HttpResponseTest extends Specification {

    private static final int OK_HTTP_CODE = 200
    private static final String OK_HTTP_MESSAGE = 'OK'

    def "can access information"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.code == OK_HTTP_CODE
        httpResponse.message == OK_HTTP_MESSAGE
    }

    def "can get String representation"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.toString() == "HTTP $OK_HTTP_CODE, Reason: $OK_HTTP_MESSAGE"
    }
}

IMPORTANT When writing unit tests, it’s important to test boundary conditions and
various forms of invalid input. Try to extract as much logic as possible from
classes that use the Gradle API to make it testable as unit tests. It will result
in maintainable code and faster test execution.

You can use the ProjectBuilder class to create Project instances to use when you test your plugin
implementation.

src/test/java/org/example/GreetingPluginTest.java

public class GreetingPluginTest {

    @Test
    public void greeterPluginAddsGreetingTaskToProject() {
        Project project = ProjectBuilder.builder().build();
        project.getPluginManager().apply("org.example.greeting");

        assertTrue(project.getTasks().getByName("hello") instanceof GreetingTask);
    }
}

Implementing integration tests

Let’s look at a class that reaches out to another system, the piece of code that emits the HTTP calls.
At the time of executing a test for the class DefaultHttpCaller, the runtime environment needs to be
able to reach out to the internet:
DefaultHttpCaller.java

package org.myorg.http;

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URI;
import java.net.URISyntaxException;

public class DefaultHttpCaller implements HttpCaller {

    @Override
    public HttpResponse get(String url) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URI(url).toURL().openConnection();
            connection.setConnectTimeout(5000);
            connection.setRequestMethod("GET");
            connection.connect();

            int code = connection.getResponseCode();
            String message = connection.getResponseMessage();
            return new HttpResponse(code, message);
        } catch (IOException e) {
            throw new HttpCallException(String.format("Failed to call URL '%s' via HTTP GET", url), e);
        } catch (URISyntaxException e) {
            throw new RuntimeException(e);
        }
    }
}

Implementing an integration test for DefaultHttpCaller doesn’t look much different from the unit
test shown in the previous section:

DefaultHttpCallerIntegrationTest.groovy

package org.myorg.http

import spock.lang.Specification
import spock.lang.Subject

class DefaultHttpCallerIntegrationTest extends Specification {

    @Subject HttpCaller httpCaller = new DefaultHttpCaller()

    def "can make successful HTTP GET call"() {
        when:
        def httpResponse = httpCaller.get('https://www.google.com/')

        then:
        httpResponse.code == 200
        httpResponse.message == 'OK'
    }

    def "throws exception when calling unknown host via HTTP GET"() {
        when:
        httpCaller.get('https://www.wedonotknowyou123.com/')

        then:
        def t = thrown(HttpCallException)
        t.message == "Failed to call URL 'https://www.wedonotknowyou123.com/' via HTTP GET"
        t.cause instanceof UnknownHostException
    }
}

Implementing functional tests

Functional tests verify the correctness of the plugin end-to-end. In practice, this means applying,
configuring, and executing the functionality of the plugin implementation. The UrlVerifierPlugin
class exposes an extension and a task instance that uses the URL value configured by the end user:

UrlVerifierPlugin.java

package org.myorg;

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.myorg.tasks.UrlVerify;

public class UrlVerifierPlugin implements Plugin<Project> {

    @Override
    public void apply(Project project) {
        UrlVerifierExtension extension = project.getExtensions().create("verification", UrlVerifierExtension.class);
        UrlVerify verifyUrlTask = project.getTasks().create("verifyUrl", UrlVerify.class);
        verifyUrlTask.getUrl().set(extension.getUrl());
    }
}

Every Gradle plugin project should apply the plugin development plugin to reduce boilerplate code.
By applying the plugin development plugin, the test source set is preconfigured for the use with
TestKit. If we want to use a custom source set for functional tests and leave the default test source
set for only unit tests, we can configure the plugin development plugin to look for TestKit tests
elsewhere.
build.gradle.kts

gradlePlugin {
testSourceSets(functionalTest)
}

build.gradle

gradlePlugin {
testSourceSets(sourceSets.functionalTest)
}

Functional tests for Gradle plugins use an instance of GradleRunner to execute the build under test.
GradleRunner is an API provided by TestKit, which internally uses the Tooling API to execute the
build.

The following example applies the plugin to the build script under test, configures the extension
and executes the build with the task verifyUrl. Please see the TestKit documentation to get more
familiar with the functionality of TestKit.

UrlVerifierPluginFunctionalTest.groovy

package org.myorg

import org.gradle.testkit.runner.GradleRunner
import spock.lang.Specification
import spock.lang.TempDir

import static org.gradle.testkit.runner.TaskOutcome.SUCCESS

class UrlVerifierPluginFunctionalTest extends Specification {

    @TempDir File testProjectDir
    File buildFile

    def setup() {
        buildFile = new File(testProjectDir, 'build.gradle')
        buildFile << """
            plugins {
                id 'org.myorg.url-verifier'
            }
        """
    }

    def "can successfully configure URL through extension and verify it"() {
        buildFile << """
            verification {
                url = 'https://www.google.com/'
            }
        """

        when:
        def result = GradleRunner.create()
            .withProjectDir(testProjectDir)
            .withArguments('verifyUrl')
            .withPluginClasspath()
            .build()

        then:
        result.output.contains("Successfully resolved URL 'https://www.google.com/'")
        result.task(":verifyUrl").outcome == SUCCESS
    }
}

IDE integration

TestKit determines the plugin classpath by running a specific Gradle task. You will need to execute
the assemble task to initially generate the plugin classpath or to reflect changes to it even when
running TestKit-based functional tests from the IDE.
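
For example, from the root of the plugin project:

$ ./gradlew assemble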

Some IDEs provide a convenience option to delegate the "test classpath generation and execution"
to the build. In IntelliJ, you can find this option under Preferences… > Build, Execution, Deployment
> Build Tools > Gradle > Runner > Delegate IDE build/run actions to Gradle.

Publishing Plugins to the Gradle Plugin Portal


Publishing a plugin is the primary way to make it available for others to use. While you can publish
to a private repository to restrict access, publishing to the Gradle Plugin Portal makes your plugin
available to anyone in the world.
This guide shows you how to use the com.gradle.plugin-publish plugin to publish plugins to the
Gradle Plugin Portal using a convenient DSL. This approach streamlines configuration steps and
provides validation checks to ensure your plugin meets the Gradle Plugin Portal’s criteria.

Prerequisites

You’ll need an existing Gradle plugin project for this tutorial. If you don’t have one, use the Greeting
plugin sample.

Attempting to publish this plugin will safely fail with a permission error, so don’t worry about
cluttering up the Gradle Plugin Portal with a trivial example plugin.

Account setup

Before publishing your plugin, you must create an account on the Gradle Plugin Portal. Follow the
instructions on the registration page to create an account and obtain an API key from your profile
page’s "API Keys" tab.
Store your API key in your Gradle configuration (gradle.publish.key and gradle.publish.secret) or
use a plugin like Seauc Credentials plugin or Gradle Credentials plugin for secure management.

It is common practice to copy and paste the text into your $HOME/.gradle/gradle.properties file, but
you can also place it in any other valid location. All the plugin requires is that the
gradle.publish.key and gradle.publish.secret are available as project properties when the
appropriate Plugin Portal tasks are executed.

If you are concerned about placing your credentials in gradle.properties, check out the Seauc
Credentials plugin or the Gradle Credentials plugin.
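
For illustration, the two entries in gradle.properties might look like the following (the values are placeholders for the key and secret obtained from your profile page):

gradle.publish.key=<your API key>
gradle.publish.secret=<your API secret>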

Adding the Plugin Publishing Plugin

To publish your plugin, add the com.gradle.plugin-publish plugin to your project’s build.gradle or
build.gradle.kts file:

build.gradle.kts

plugins {
id("com.gradle.plugin-publish") version "1.2.1"
}

build.gradle

plugins {
id 'com.gradle.plugin-publish' version '1.2.1'
}

The latest version of the Plugin Publishing Plugin can be found on the Gradle Plugin Portal.

NOTE: Since version 1.0.0, the Plugin Publish Plugin automatically applies the Java Gradle
Plugin Development Plugin (assists with developing Gradle plugins) and the Maven
Publish Plugin (generates plugin publication metadata). If using older versions of
the Plugin Publish Plugin, these helper plugins must be applied explicitly.

Configuring the Plugin Publishing Plugin

Configure the com.gradle.plugin-publish plugin in your build.gradle or build.gradle.kts file.

build.gradle.kts

group = "io.github.johndoe" ①
version = "1.0" ②

gradlePlugin { ③
website = "<substitute your project website>" ④
vcsUrl = "<uri to project source repository>" ⑤

// ... ⑥
}

build.gradle

group = 'io.github.johndoe' ①
version = '1.0' ②

gradlePlugin { ③
website = '<substitute your project website>' ④
vcsUrl = '<uri to project source repository>' ⑤

// ... ⑥
}

① Make sure your project has a group set; it is used to identify the artifacts (jar and metadata)
you publish for your plugins in the repository of the Gradle Plugin Portal and should be
descriptive of the plugin author or the organization the plugins belong to.

② Set the version of your project, which will also be used as the version of your plugins.

③ Use the gradlePlugin block provided by the Java Gradle Plugin Development Plugin to configure
further options for your plugin publication.

④ Set the website for your plugin’s project.

⑤ Provide the source repository URI so that others can find it, if they want to contribute.

⑥ Set specific properties for each plugin you want to publish; see next section.

Define the properties specific to each plugin you want to publish using the plugins {} block nested
inside gradlePlugin {}:

build.gradle.kts

gradlePlugin { ①
    // ... ②

    plugins { ③
        create("greetingsPlugin") { ④
            id = "<your plugin identifier>" ⑤
            displayName = "<short displayable name for plugin>" ⑥
            description = "<human-readable description of what your plugin is about>" ⑦
            tags = listOf("tags", "for", "your", "plugins") ⑧
            implementationClass = "<your plugin class>"
        }
    }
}

build.gradle

gradlePlugin { ①
    // ... ②

    plugins { ③
        greetingsPlugin { ④
            id = '<your plugin identifier>' ⑤
            displayName = '<short displayable name for plugin>' ⑥
            description = '<human-readable description of what your plugin is about>' ⑦
            tags.set(['tags', 'for', 'your', 'plugins']) ⑧
            implementationClass = '<your plugin class>'
        }
    }
}

① Plugin specific configuration also goes into the gradlePlugin block.

② This is where we previously added global properties.

③ Each plugin you publish will have its own block inside plugins.

④ The name of a plugin block must be unique for each plugin you publish; this is a property used
only locally by your build and will not be part of the publication.

⑤ Set the unique id of the plugin, as it will be identified in the publication.

⑥ Set the plugin name in human-readable form.

⑦ Set a description to be displayed on the portal. It provides useful information to people who
want to use your plugin.

⑧ Specifies the categories your plugin covers. It makes the plugin more likely to be discovered by
people needing its functionality.

For example, consider the configuration for the GradleTest plugin, already published to the Gradle
Plugin Portal.

build.gradle.kts

gradlePlugin {
    website = "https://github.com/ysb33r/gradleTest"
    vcsUrl = "https://github.com/ysb33r/gradleTest.git"
    plugins {
        create("gradletestPlugin") {
            id = "org.ysb33r.gradletest"
            displayName = "Plugin for compatibility testing of Gradle plugins"
            description = "A plugin that helps you test your plugin against a variety of Gradle versions"
            tags = listOf("testing", "integrationTesting", "compatibility")
            implementationClass = "org.ysb33r.gradle.gradletest.GradleTestPlugin"
        }
    }
}

build.gradle

gradlePlugin {
    website = 'https://github.com/ysb33r/gradleTest'
    vcsUrl = 'https://github.com/ysb33r/gradleTest.git'
    plugins {
        gradletestPlugin {
            id = 'org.ysb33r.gradletest'
            displayName = 'Plugin for compatibility testing of Gradle plugins'
            description = 'A plugin that helps you test your plugin against a variety of Gradle versions'
            tags.addAll('testing', 'integrationTesting', 'compatibility')
            implementationClass = 'org.ysb33r.gradle.gradletest.GradleTestPlugin'
        }
    }
}

If you browse the associated page on the Gradle Plugin Portal for the GradleTest plugin, you will see
how the specified metadata is displayed.

Sources & Javadoc

The Plugin Publish Plugin automatically generates and publishes the Javadoc and sources JARs for
your plugin publication.

Sign artifacts

Starting from version 1.0.0 of the Plugin Publish Plugin, the signing of published plugin artifacts has
been made automatic. To enable it, all that’s needed is to apply the signing plugin in your build.

Shadow dependencies

Starting from version 1.0.0 of the Plugin Publish Plugin, shadowing your plugin’s dependencies (i.e.,
publishing it as a fat jar) has been made automatic. To enable it, all that’s needed is to apply the
com.github.johnrengelman.shadow plugin in your build.
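
As an illustrative sketch (the shadow plugin version is a placeholder; the plugin-publish version matches the example above), a plugins block that opts into both behaviours might look like this:

plugins {
    id("com.gradle.plugin-publish") version "1.2.1"
    signing                                                      // enables automatic signing of published plugin artifacts
    id("com.github.johnrengelman.shadow") version "<version>"    // publishes the plugin as a fat jar
}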

Publishing the plugin

If you publish your plugin internally for use within your organization, you can publish it like any
other code artifact. See the Ivy and Maven chapters on publishing artifacts.

If you are interested in publishing your plugin to be used by the wider Gradle community, you can
publish it to Gradle Plugin Portal. This site provides the ability to search for and gather information
about plugins contributed by the Gradle community. Please refer to the corresponding section on
making your plugin available on this site.

Publish locally

To check how the artifacts of your published plugin look or to use it only locally or internally in
your company, you can publish it to any Maven repository, including a local folder. You only need
to configure repositories for publishing. Then, you can run the publish task to publish your plugin
to all repositories you have defined (but not the Gradle Plugin Portal).

build.gradle.kts

publishing {
repositories {
maven {
name = "localPluginRepository"
url = uri("../local-plugin-repository")
}
}
}

build.gradle

publishing {
repositories {
maven {
name = 'localPluginRepository'
url = '../local-plugin-repository'
}
}
}

To use the repository in another build, add it to the repositories of the pluginManagement {} block in
your settings.gradle(.kts) file.
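
For example, a consuming build’s settings.gradle.kts might declare the repository like this (a sketch; the relative path matches the publishing example above, and gradlePluginPortal() is kept so other plugins still resolve):

pluginManagement {
    repositories {
        maven {
            setUrl("../local-plugin-repository")
        }
        gradlePluginPortal()
    }
}
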
Publish to the Plugin Portal

Publish the plugin by using the publishPlugins task:

$ ./gradlew publishPlugins

You can validate your plugins before publishing using the --validate-only flag:

$ ./gradlew publishPlugins --validate-only

If you have not configured your gradle.properties for the Gradle Plugin Portal, you can specify
them on the command-line:

$ ./gradlew publishPlugins -Pgradle.publish.key=<key> -Pgradle.publish.secret=<secret>

NOTE: You will encounter a permission failure if you attempt to publish the example
Greeting Plugin with the ID used in this section. That’s expected and ensures the
portal won’t be overrun with multiple experimental and duplicate greeting-type
plugins.

After approval, your plugin will be available on the Gradle Plugin Portal for others to discover and
use.

Consume the published plugin

Once you successfully publish a plugin, it won’t immediately appear on the Portal. It also needs to
pass an approval process, which is manual and relatively slow for the initial version of your plugin,
but is fully automatic for subsequent versions. For further details, see here.

Once your plugin is approved, you can find instructions for its use at a URL of the form
https://plugins.gradle.org/plugin/<your-plugin-id>. For example, the Greeting Plugin example is
already on the portal at https://plugins.gradle.org/plugin/org.example.greeting.
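
Once approved, consuming the plugin is just a matter of the plugins DSL. As a sketch (substitute your own plugin id and released version; the id below is the Greeting Plugin example mentioned above):

plugins {
    id("org.example.greeting") version "<version>"
}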

Plugins published without Gradle Plugin Portal

If your plugin was published without using the Java Gradle Plugin Development Plugin, the
publication will be lacking the Plugin Marker Artifact, which is needed for the plugins DSL to locate
the plugin. In this case, the recommended way to resolve the plugin in another project is to add a
resolutionStrategy section to the pluginManagement {} block of the project’s settings file, as shown
below.

settings.gradle.kts

resolutionStrategy {
eachPlugin {
if (requested.id.namespace == "org.example") {
useModule("org.example:custom-plugin:${requested.version}")
}
}
}

settings.gradle

resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'org.example') {
useModule("org.example:custom-plugin:${requested.version}")
}
}
}
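
Note that the resolutionStrategy block shown above is meant to sit inside pluginManagement {} in the settings file; as a sketch in the Kotlin DSL (the coordinates are the placeholders from the example above, and a repository able to serve them must also be declared in the same block):

pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == "org.example") {
                useModule("org.example:custom-plugin:${requested.version}")
            }
        }
    }
}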

[1] Script plugins are hard to maintain. Do not use script plugins (apply from:); they are not recommended.
[2] It is recommended to use a statically-typed language like Java or Kotlin for implementing plugins to reduce the likelihood of
binary incompatibilities. If using Groovy, consider using statically compiled Groovy.
OTHER TOPICS
Gradle-managed Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory
and the Project Root directory.

Gradle User Home directory

By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global
configuration properties, initialization scripts, caches, and log files.

It can be set with the environment variable GRADLE_USER_HOME.

TIP: Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.

It is roughly structured as follows:

├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧

① Global cache directory (for everything that is not project-specific).

② Version-specific caches (e.g., to support incremental builds).

③ Shared caches (e.g., for artifacts of dependencies).

④ Registry and logs of the Gradle Daemon.

⑤ Global initialization scripts.

⑥ JDKs downloaded by the toolchain support.

⑦ Distributions downloaded by the Gradle Wrapper.

⑧ Global Gradle configuration properties.

Cleanup of caches and distributions

Gradle automatically cleans its user home directory.

By default, the cleanup runs in the background when the Gradle daemon is stopped or shut down.

If using --no-daemon, it runs in the foreground after the build session.

The following cleanup strategies are applied periodically (by default, once every 24 hours):

• Version-specific caches in all caches/<GRADLE_VERSION>/ directories are checked for whether they
are still in use.

If not, directories for release versions are deleted after 30 days of inactivity, and snapshot
versions after 7 days.

• Shared caches in caches/ (e.g., jars-*) are checked for whether they are still in use.

If no Gradle version still uses them, they are deleted.

• Files in shared caches used by the current Gradle version in caches/ (e.g., jars-3 or modules-2)
are checked for when they were last accessed.

Depending on whether the file can be recreated locally or downloaded from a remote
repository, it will be deleted after 7 or 30 days, respectively.

• Gradle distributions in wrapper/dists/ are checked for whether they are still in use, i.e., whether
there’s a corresponding version-specific cache directory.
Unused distributions are deleted.

Configuring cleanup of caches and distributions

The retention periods of the various caches can be configured.

Caches are classified into five categories:

1. Released wrapper distributions: Distributions and related version-specific caches corresponding
to released versions (e.g., 4.6.2 or 8.0).

Default retention for unused versions is 30 days.

2. Snapshot wrapper distributions: Distributions and related version-specific caches corresponding
to snapshot versions (e.g., 7.6-20221130141522+0000).

Default retention for unused versions is 7 days.

3. Downloaded resources: Shared caches downloaded from a remote repository (e.g., cached
dependencies).

Default retention for unused resources is 30 days.

4. Created resources: Shared caches that Gradle creates during a build (e.g., artifact transforms).

Default retention for unused resources is 7 days.

5. Build cache: The local build cache (e.g., build-cache-1).

Default retention for unused build-cache entries is 7 days.

The retention period for each category can be configured independently via an init script in the
Gradle User Home:

gradleUserHome/init.d/cache-settings.gradle.kts

beforeSettings {
caches {
releasedWrappers.setRemoveUnusedEntriesAfterDays(45)
snapshotWrappers.setRemoveUnusedEntriesAfterDays(10)
downloadedResources.setRemoveUnusedEntriesAfterDays(45)
createdResources.setRemoveUnusedEntriesAfterDays(10)
buildCache.setRemoveUnusedEntriesAfterDays(5)
}
}

gradleUserHome/init.d/cache-settings.gradle

beforeSettings { settings ->
    settings.caches {
        releasedWrappers.removeUnusedEntriesAfterDays = 45
        snapshotWrappers.removeUnusedEntriesAfterDays = 10
        downloadedResources.removeUnusedEntriesAfterDays = 45
        createdResources.removeUnusedEntriesAfterDays = 10
        buildCache.removeUnusedEntriesAfterDays = 5
    }
}

The frequency at which cache cleanup is invoked is also configurable.

There are three possible settings:

1. DEFAULT: Cleanup is performed periodically in the background (currently once every 24 hours).

2. DISABLED: Never cleanup Gradle User Home.

This is useful in cases where Gradle User Home is ephemeral or delaying cleanup is desirable
until an explicit point.

3. ALWAYS: Cleanup is performed at the end of each build session.

This is useful in cases where it’s desirable to ensure that cleanup has occurred before
proceeding.

However, this performs cache cleanup during the build (rather than in the background), which
can be expensive, so this option should only be used when necessary.

To disable cache cleanup:

gradleUserHome/init.d/cache-settings.gradle.kts

beforeSettings {
caches {
cleanup = Cleanup.DISABLED
}
}

gradleUserHome/init.d/cache-settings.gradle

beforeSettings { settings ->
    settings.caches {
        cleanup = Cleanup.DISABLED
    }
}

NOTE: Cache cleanup settings can only be configured via init scripts and should be placed
under the init.d directory in Gradle User Home. This effectively couples the
configuration of cache cleanup to the Gradle User Home those settings apply to and
limits the possibility of different conflicting settings from different projects being
applied to the same directory.

Multiple versions of Gradle sharing a Gradle User Home

It is common to share a single Gradle User Home between multiple versions of Gradle.

As stated above, caches in Gradle User Home are version-specific. Different versions of Gradle will
perform maintenance on only the version-specific caches associated with each version.

On the other hand, some caches are shared between versions (e.g., the dependency artifact cache or
the artifact transform cache).

Beginning with Gradle version 8.0, the cache cleanup settings can be configured with custom
retention periods. However, older versions have fixed retention periods (7 or 30 days, depending
on the cache). As a result, these shared caches may be accessed by Gradle versions with different
settings for retaining cache artifacts.

This means that:

• If the retention period is not customized, all versions that perform cleanup will have the same
retention periods. There will be no effect due to sharing a Gradle User Home with multiple
versions.

• If the retention period is customized for Gradle versions greater than or equal to version 8.0 to
use retention periods shorter than the previously fixed periods, there will also be no effect.

The versions of Gradle aware of these settings will cleanup artifacts earlier than the previously
fixed retention periods, and older versions will effectively not participate in the cleanup of
shared caches.

• If the retention period is customized for Gradle versions greater than or equal to version 8.0 to
use retention periods longer than the previously fixed periods, the older versions of Gradle may
clean the shared caches earlier than what is configured.

In this case, if it is desirable to maintain these shared cache entries for newer versions for
longer retention periods, they will not be able to share a Gradle User Home with older versions.
They will need to use a separate directory.

Another consideration when sharing the Gradle User Home with versions of Gradle before version
8.0 is that the DSL elements to configure the cache retention settings are unavailable in earlier
versions, so this must be accounted for in any init script shared between versions. This can easily
be handled by conditionally applying a version-compliant script.
NOTE: The version-compliant script should reside somewhere other than the init.d
directory (such as a sub-directory), so it is not automatically applied.

To configure cache cleanup in a version-safe manner:

gradleUserHome/init.d/cache-settings.gradle.kts

if (GradleVersion.current() >= GradleVersion.version("8.0")) {
    apply(from = "gradle8/cache-settings.gradle.kts")
}

gradleUserHome/init.d/cache-settings.gradle

if (GradleVersion.current() >= GradleVersion.version('8.0')) {
    apply from: "gradle8/cache-settings.gradle"
}

Version-compliant cache configuration script:

gradleUserHome/init.d/gradle8/cache-settings.gradle.kts

beforeSettings {
caches {
releasedWrappers { setRemoveUnusedEntriesAfterDays(45) }
snapshotWrappers { setRemoveUnusedEntriesAfterDays(10) }
downloadedResources { setRemoveUnusedEntriesAfterDays(45) }
createdResources { setRemoveUnusedEntriesAfterDays(10) }
buildCache { setRemoveUnusedEntriesAfterDays(5) }
}
}

gradleUserHome/init.d/gradle8/cache-settings.gradle

beforeSettings { settings ->
    settings.caches {
        releasedWrappers.removeUnusedEntriesAfterDays = 45
        snapshotWrappers.removeUnusedEntriesAfterDays = 10
        downloadedResources.removeUnusedEntriesAfterDays = 45
        createdResources.removeUnusedEntriesAfterDays = 10
        buildCache.removeUnusedEntriesAfterDays = 5
    }
}

Cache marking

Beginning with Gradle version 8.1, Gradle supports marking caches with a CACHEDIR.TAG file.

It follows the format described in the Cache Directory Tagging Specification. The purpose of this file
is to allow tools to identify the directories that do not need to be searched or backed up.

By default, the directories caches, wrapper/dists, daemon, and jdks in the Gradle User Home are
marked with this file.

Configuring cache marking

The cache marking feature can be configured via an init script in the Gradle User Home:

gradleUserHome/init.d/cache-settings.gradle.kts

beforeSettings {
caches {
// Disable cache marking for all caches
markingStrategy = MarkingStrategy.NONE
}
}

gradleUserHome/init.d/cache-settings.gradle

beforeSettings { settings ->
    settings.caches {
        // Disable cache marking for all caches
        markingStrategy = MarkingStrategy.NONE
    }
}

NOTE: Cache marking settings can only be configured via init scripts and should be placed
under the init.d directory in Gradle User Home. This effectively couples the
configuration of cache marking to the Gradle User Home to which those settings
apply and limits the possibility of different conflicting settings from different
projects being applied to the same directory.
Project Root directory

The project root directory contains all source files from your project.

It also contains files and directories Gradle generates, such as .gradle and build.

While the former are usually checked into source control, the latter are transient files Gradle uses
to support features like incremental builds.

The anatomy of a typical project root directory looks as follows:

├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle.kts ⑦
├── subproject-one ⑧
│ └── build.gradle.kts ⑨
├── subproject-two ⑧
│ └── build.gradle.kts ⑨
└── ⋮

① Project-specific cache directory generated by Gradle.

② Version-specific caches (e.g., to support incremental builds).

③ The build directory of this project into which Gradle generates all build artifacts.

④ Contains the JAR file and configuration of the Gradle Wrapper.

⑤ Project-specific Gradle configuration properties.

⑥ Scripts for executing builds using the Gradle Wrapper.

⑦ The project’s settings file where the list of subprojects is defined.

⑧ Usually, a project is organized into one or multiple subprojects.

⑨ Each subproject has its own Gradle build script.

Project cache cleanup

From version 4.10 onwards, Gradle automatically cleans the project-specific cache directory.

After building the project, version-specific cache directories in .gradle/8.9/ are checked
periodically (at most, every 24 hours) to determine whether they are still in use. They are deleted if
they haven’t been used for 7 days.

Next Step: Learn about the Gradle Build Lifecycle >>


Working With Files
File operations are fundamental to nearly every Gradle build. They involve handling source files,
managing file dependencies, and generating reports. Gradle provides a robust API that simplifies
these operations, enabling developers to perform necessary file tasks easily.

Hardcoded paths and laziness

It is best practice to avoid hardcoded paths in build scripts.

In addition to avoiding hardcoded paths, Gradle encourages laziness in its build scripts. This means
that tasks and operations should be deferred until they are actually needed rather than executed
eagerly.

Many examples in this chapter use hard-coded paths as string literals. This makes them easy to
understand, but it is not good practice. The problem is that paths often change, and the more places
you need to change them, the more likely you will miss one and break the build.

Where possible, you should use tasks, task properties, and project properties — in that order of
preference — to configure file paths.

For example, if you create a task that packages the compiled classes of a Java application, you
should use an implementation similar to this:

build.gradle.kts

val archivesDirPath = layout.buildDirectory.dir("archives")

tasks.register<Zip>("packageClasses") {
archiveAppendix = "classes"
destinationDirectory = archivesDirPath

from(tasks.compileJava)
}

build.gradle

def archivesDirPath = layout.buildDirectory.dir('archives')

tasks.register('packageClasses', Zip) {
archiveAppendix = "classes"
destinationDirectory = archivesDirPath

from compileJava
}
The compileJava task is the source of the files to package, and the project property archivesDirPath
stores the location of the archives, as we are likely to use it elsewhere in the build.

Using a task directly as an argument like this relies on it having defined outputs, so it won’t always
be possible. This example could be further improved by relying on the Java plugin’s convention for
destinationDirectory rather than overriding it, but it does demonstrate the use of project
properties.

Locating files

To perform some action on a file, you need to know where it is, and that’s the information provided
by file paths. Gradle builds on the standard Java File class, which represents the location of a single
file and provides APIs for dealing with collections of paths.

Using ProjectLayout

The ProjectLayout class is used to access various directories and files within a project. It provides
methods to retrieve paths to the project directory, build directory, settings file, and other important
locations within the project’s file structure. This class is particularly useful when you need to work
with files in a build script or plugin in different project paths:

build.gradle.kts

val archivesDirPath = layout.buildDirectory.dir("archives")

build.gradle

def archivesDirPath = layout.buildDirectory.dir('archives')

You can learn more about the ProjectLayout class in Services.

Using Project.file()

Gradle provides the Project.file(java.lang.Object) method for specifying the location of a single file
or directory.

Relative paths are resolved relative to the project directory, while absolute paths remain
unchanged.

CAUTION: Never use new File(relative path) unless passed to file() or files() or from()
or other methods defined in terms of file() or files(). Otherwise, this creates a
path relative to the current working directory (CWD). Gradle can make no
guarantees about the location of the CWD, which means builds that rely on it
may break at any time.
Here are some examples of using the file() method with different types of arguments:

build.gradle.kts

// Using a relative path
var configFile = file("src/config.xml")

// Using an absolute path
configFile = file(configFile.absolutePath)

// Using a File object with a relative path
configFile = file(File("src/config.xml"))

// Using a java.nio.file.Path object with a relative path
configFile = file(Paths.get("src", "config.xml"))

// Using an absolute java.nio.file.Path object
configFile = file(Paths.get(System.getProperty("user.home")).resolve("global-config.xml"))

build.gradle

// Using a relative path
File configFile = file('src/config.xml')

// Using an absolute path
configFile = file(configFile.absolutePath)

// Using a File object with a relative path
configFile = file(new File('src/config.xml'))

// Using a java.nio.file.Path object with a relative path
configFile = file(Paths.get('src', 'config.xml'))

// Using an absolute java.nio.file.Path object
configFile = file(Paths.get(System.getProperty('user.home')).resolve('global-config.xml'))

As you can see, you can pass strings, File instances and Path instances to the file() method, all of
which result in an absolute File object.

In the case of multi-project builds, the file() method will always turn relative paths into paths
relative to the current project directory, which may be a child project.
Using Project.getRootDir()

Suppose you want to use a path relative to the root project directory. In that case, you need to use
the special Project.getRootDir() property to construct an absolute path, like so:

build.gradle.kts

val configFile = file("$rootDir/shared/config.xml")

build.gradle

File configFile = file("$rootDir/shared/config.xml")

Let’s say you’re working on a multi-project build in the directory: dev/projects/AcmeHealth.

The build script above is at: AcmeHealth/subprojects/AcmePatientRecordLib/build.gradle.
The file path will resolve to the absolute path of: dev/projects/AcmeHealth/shared/config.xml.

dev
├── projects
│ ├── AcmeHealth
│ │ ├── subprojects
│ │ │ ├── AcmePatientRecordLib
│ │ │ │ └── build.gradle
│ │ │ └── ...
│ │ ├── shared
│ │ │ └── config.xml
│ │ └── settings.gradle
│ └── ...
└── ...

Note that Project also provides Project.getRootProject() for multi-project builds which, in the
example, would resolve to: dev/projects/AcmeHealth.

Using FileCollection

A file collection is simply a set of file paths represented by the FileCollection interface.

The set of paths can be any file path. The file paths don’t have to be related in any way, so they don’t
have to be in the same directory or have a shared parent directory.

The recommended way to specify a collection of files is to use the
ProjectLayout.files(java.lang.Object...) method, which returns a FileCollection instance. This
flexible method allows you to pass multiple strings, File instances, collections of strings, collections
of Files, and more. You can also pass in tasks as arguments if they have defined outputs.

CAUTION: files() properly handles relative paths and File(relative path) instances,
resolving them relative to the project directory.

As with the Project.file(java.lang.Object) method covered in the previous section, all relative paths
are evaluated relative to the current project directory. The following example demonstrates some
of the variety of argument types you can use — strings, File instances, lists, or Paths:

build.gradle.kts

val collection: FileCollection = layout.files(
    "src/file1.txt",
    File("src/file2.txt"),
    listOf("src/file3.csv", "src/file4.csv"),
    Paths.get("src", "file5.txt")
)

build.gradle

FileCollection collection = layout.files('src/file1.txt',
    new File('src/file2.txt'),
    ['src/file3.csv', 'src/file4.csv'],
    Paths.get('src', 'file5.txt'))

File collections have important attributes in Gradle. They can be:

• created lazily

• iterated over

• filtered

• combined

Lazy creation of a file collection is useful when evaluating the files that make up a collection when a
build runs. In the following example, we query the file system to find out what files exist in a
particular directory and then make those into a file collection:

build.gradle.kts

tasks.register("list") {
val projectDirectory = layout.projectDirectory
doLast {
var srcDir: File? = null

val collection = projectDirectory.files({


srcDir?.listFiles()
})

srcDir = projectDirectory.file("src").asFile
println("Contents of ${srcDir.name}")
collection.map { it.relativeTo(projectDirectory.asFile)
}.sorted().forEach { println(it) }

srcDir = projectDirectory.file("src2").asFile
println("Contents of ${srcDir.name}")
collection.map { it.relativeTo(projectDirectory.asFile)
}.sorted().forEach { println(it) }
}
}

build.gradle

tasks.register('list') {
Directory projectDirectory = layout.projectDirectory
doLast {
File srcDir

// Create a file collection using a closure


collection = projectDirectory.files { srcDir.listFiles() }

srcDir = projectDirectory.file('src').asFile
println "Contents of $srcDir.name"
collection.collect { projectDirectory.asFile.relativePath(it) }.sort
().each { println it }

srcDir = projectDirectory.file('src2').asFile
println "Contents of $srcDir.name"
collection.collect { projectDirectory.asFile.relativePath(it) }.sort
().each { println it }
}
}

$ gradle -q list
Contents of src
src/dir1
src/file1.txt
Contents of src2
src2/dir1
src2/dir2
The key to lazy creation is passing a closure (in Groovy) or a Provider (in Kotlin) to the files()
method. Your closure or provider must return a value of a type accepted by files(), such as
List<File>, String, or FileCollection.

Iterating over a file collection can be done through the each() method (in Groovy) or forEach method
(in Kotlin) on the collection or using the collection in a for loop. In both approaches, the file
collection is treated as a set of File instances, i.e., your iteration variable will be of type File.

The following example demonstrates such iteration. It also demonstrates how you can convert file
collections to other types using the as operator (or supported properties):

build.gradle.kts

// Iterate over the files in the collection
collection.forEach { file: File ->
    println(file.name)
}

// Convert the collection to various types
val set: Set<File> = collection.files
val list: List<File> = collection.toList()
val path: String = collection.asPath
val file: File = collection.singleFile

// Add and subtract collections
val union = collection + projectLayout.files("src/file2.txt")
val difference = collection - projectLayout.files("src/file2.txt")

build.gradle

// Iterate over the files in the collection
collection.each { File file ->
    println file.name
}

// Convert the collection to various types
Set set = collection.files
Set set2 = collection as Set
List list = collection as List
String path = collection.asPath
File file = collection.singleFile

// Add and subtract collections
def union = collection + projectLayout.files('src/file2.txt')
def difference = collection - projectLayout.files('src/file2.txt')
You can also see at the end of the example how to combine file collections using the + and -
operators to merge and subtract them. An important feature of the resulting file collections is that
they are live. In other words, when you combine file collections this way, the result always reflects
what’s currently in the source file collections, even if they change during the build.

For example, imagine collection in the above example gains an extra file or two after union is
created. As long as you use union after those files are added to collection, union will also contain
those additional files. The same goes for the difference file collection.
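
A minimal sketch of that live behaviour, assuming a mutable ConfigurableFileCollection created through the project’s ObjectFactory (the file names are illustrative and not part of the example above):

val collection = objects.fileCollection().from("src/file1.txt")
val union = collection + layout.files("src/file2.txt")

collection.from("src/file6.txt") // added after union was created

// union is live, so it now also includes src/file6.txt
union.forEach { println(it.name) }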

Live collections are also important when it comes to filtering. Suppose you want to use a subset of a
file collection. In that case, you can take advantage of the
FileCollection.filter(org.gradle.api.specs.Spec) method to determine which files to "keep". In the
following example, we create a new collection that consists of only the files that end with .txt in
the source collection:

build.gradle.kts

val textFiles: FileCollection = collection.filter { f: File ->
    f.name.endsWith(".txt")
}

build.gradle

FileCollection textFiles = collection.filter { File f ->
    f.name.endsWith(".txt")
}

$ gradle -q filterTextFiles
src/file1.txt
src/file2.txt
src/file5.txt

If collection changes at any time, either by adding or removing files from itself, then textFiles will
immediately reflect the change because it is also a live collection. Note that the closure you pass to
filter() takes a File as an argument and should return a boolean.

Understanding implicit conversion to file collections

Many objects in Gradle have properties which accept a set of input files. For example, the
JavaCompile task has a source property that defines the source files to compile. You can set the
value of this property using any of the types supported by the files() method, as mentioned in the
API docs. This means you can, for example, set the property to a File, String, collection,
FileCollection or even a closure or Provider.
This is a feature of specific tasks! That means implicit conversion will not happen for just any
task that has a FileCollection or FileTree property. If you want to know whether implicit
conversion happens in a particular situation, you will need to read the relevant documentation,
such as the corresponding task’s API docs. Alternatively, you can remove all doubt by explicitly
using ProjectLayout.files(java.lang.Object...) in your build.

Here are some examples of the different types of arguments that the source property can take:

build.gradle.kts

tasks.register<JavaCompile>("compile") {
// Use a File object to specify the source directory
source = fileTree(file("src/main/java"))

// Use a String path to specify the source directory


source = fileTree("src/main/java")

// Use a collection to specify multiple source directories


source = fileTree(listOf("src/main/java", "../shared/java"))

// Use a FileCollection (or FileTree in this case) to specify the source


files
source = fileTree("src/main/java").matching {
include("org/gradle/api/**") }

// Using a closure to specify the source files.


setSource({
// Use the contents of each zip file in the src dir
file("src").listFiles().filter { it.name.endsWith(".zip") }.map {
zipTree(it) }
})
}

build.gradle

tasks.register('compile', JavaCompile) {

// Use a File object to specify the source directory


source = file('src/main/java')

// Use a String path to specify the source directory


source = 'src/main/java'

// Use a collection to specify multiple source directories


source = ['src/main/java', '../shared/java']

// Use a FileCollection (or FileTree in this case) to specify the source


files
source = fileTree(dir: 'src/main/java').matching { include
'org/gradle/api/**' }

// Using a closure to specify the source files.


source = {
// Use the contents of each zip file in the src dir
file('src').listFiles().findAll {it.name.endsWith('.zip')}.collect {
zipTree(it) }
}
}

One other thing to note is that properties like source have corresponding methods in core Gradle
tasks. Those methods follow the convention of appending to collections of values rather than
replacing them. Again, this method accepts any of the types supported by the files() method, as
shown here:

build.gradle.kts

tasks.named<JavaCompile>("compile") {
// Add some source directories use String paths
source("src/main/java", "src/main/groovy")

// Add a source directory using a File object


source(file("../shared/java"))

// Add some source directories using a closure


setSource({ file("src/test/").listFiles() })
}

build.gradle

compile {
// Add some source directories use String paths
source 'src/main/java', 'src/main/groovy'

// Add a source directory using a File object


source file('../shared/java')

// Add some source directories using a closure


source { file('src/test/').listFiles() }
}
As this is a common convention, we recommend that you follow it in your own custom tasks.
Specifically, if you plan to add a method to configure a collection-based property, make sure the
method appends rather than replaces values.
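
As an illustrative sketch of that convention (the task class, task name, and paths below are made up for this example, not part of the manual’s sample project), a custom task might expose an appending sources(...) method backed by a ConfigurableFileCollection:

abstract class ProcessDocs : DefaultTask() {
    @get:InputFiles
    abstract val sources: ConfigurableFileCollection

    // Appends to the collection rather than replacing it, mirroring source(...) on core tasks
    fun sources(vararg paths: Any) {
        sources.from(*paths)
    }

    @TaskAction
    fun process() {
        sources.forEach { println("Processing ${it.name}") }
    }
}

tasks.register<ProcessDocs>("processDocs") {
    sources("src/docs/intro.md")          // appends
    sources(fileTree("src/docs/guides"))  // appends again instead of replacing
}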

Using FileTree

A file tree is a file collection that retains the directory structure of the files it contains and has the
type FileTree. This means all the paths in a file tree must have a shared parent directory. The
following diagram highlights the distinction between file trees and file collections in the typical
case of copying files:

NOTE: Although FileTree extends FileCollection (an is-a relationship), their behaviors
differ. In other words, you can use a file tree wherever a file collection is required,
but remember that a file collection is a flat list/set of files, while a file tree is a file
and directory hierarchy. To convert a file tree to a flat collection, use the
FileTree.getFiles() property.

The simplest way to create a file tree is to pass a file or directory path to the
Project.fileTree(java.lang.Object) method. This will create a tree of all the files and directories in
that base directory (but not the base directory itself). The following example demonstrates how to
use this method and how to filter the files and directories using Ant-style patterns:

build.gradle.kts

// Create a file tree with a base directory
var tree: ConfigurableFileTree = fileTree("src/main")

// Add include and exclude patterns to the tree
tree.include("**/*.java")
tree.exclude("**/Abstract*")

// Create a tree using closure
tree = fileTree("src") {
    include("**/*.java")
}

// Create a tree using a map
tree = fileTree("dir" to "src", "include" to "**/*.java")
tree = fileTree("dir" to "src", "includes" to listOf("**/*.java", "**/*.xml"))
tree = fileTree("dir" to "src", "include" to "**/*.java", "exclude" to "**/*test*/**")

build.gradle

// Create a file tree with a base directory
ConfigurableFileTree tree = fileTree(dir: 'src/main')

// Add include and exclude patterns to the tree
tree.include '**/*.java'
tree.exclude '**/Abstract*'

// Create a tree using closure
tree = fileTree('src') {
    include '**/*.java'
}

// Create a tree using a map
tree = fileTree(dir: 'src', include: '**/*.java')
tree = fileTree(dir: 'src', includes: ['**/*.java', '**/*.xml'])
tree = fileTree(dir: 'src', include: '**/*.java', exclude: '**/*test*/**')

You can see more examples of supported patterns in the API docs for PatternFilterable.

By default, fileTree() returns a FileTree instance that applies some default exclude patterns for
convenience — the same defaults as Ant. For the complete default exclude list, see the Ant manual.

If those default excludes prove problematic, you can work around the issue by changing the default
excludes in the settings script:
settings.gradle.kts

import org.apache.tools.ant.DirectoryScanner

DirectoryScanner.removeDefaultExclude("**/.git")
DirectoryScanner.removeDefaultExclude("**/.git/**")

settings.gradle

import org.apache.tools.ant.DirectoryScanner

DirectoryScanner.removeDefaultExclude('**/.git')
DirectoryScanner.removeDefaultExclude('**/.git/**')

IMPORTANT: Gradle does not support changing default excludes during the execution phase.

You can do many of the same things with file trees that you can with file collections:

• iterate over them (depth first)

• filter them (using FileTree.matching(org.gradle.api.Action) and Ant-style patterns)

• merge them

You can also traverse file trees using the FileTree.visit(org.gradle.api.Action) method. All of these
techniques are demonstrated in the following example:

build.gradle.kts

// Iterate over the contents of a tree
tree.forEach { file: File ->
    println(file)
}

// Filter a tree
val filtered: FileTree = tree.matching {
    include("org/gradle/api/**")
}

// Add trees together
val sum: FileTree = tree + fileTree("src/test")

// Visit the elements of the tree
tree.visit {
    println("${this.relativePath} => ${this.file}")
}

build.gradle

// Iterate over the contents of a tree
tree.each {File file ->
    println file
}

// Filter a tree
FileTree filtered = tree.matching {
    include 'org/gradle/api/**'
}

// Add trees together
FileTree sum = tree + fileTree(dir: 'src/test')

// Visit the elements of the tree
tree.visit {element ->
    println "$element.relativePath => $element.file"
}

Copying files

Copying files in Gradle primarily uses CopySpec, a mechanism that makes it easy to manage
resources such as source code, configuration files, and other assets in your project build process.

Understanding CopySpec

CopySpec is a copy specification that allows you to define what files to copy, where to copy them
from, and where to copy them. It provides a flexible and expressive way to specify complex file
copying operations, including filtering files based on patterns, renaming files, and
including/excluding files based on various criteria.

CopySpec instances are used in the Copy task to specify the files and directories to be copied.

CopySpec has two important attributes:

1. It is independent of tasks, allowing you to share copy specs within a build.

2. It is hierarchical, providing fine-grained control within the overall copy specification.

1. Sharing copy specs

Consider a build with several tasks that copy a project’s static website resources or add them to an
archive. One task might copy the resources to a folder for a local HTTP server, and another might
package them into a distribution. You could manually specify the file locations and appropriate
inclusions each time they are needed, but human error is more likely to creep in, resulting in
inconsistencies between tasks.

One solution is the Project.copySpec(org.gradle.api.Action) method. This allows you to create a copy
spec outside a task, which can then be attached to an appropriate task using the
CopySpec.with(org.gradle.api.file.CopySpec…) method. The following example demonstrates how
this is done:

build.gradle.kts

val webAssetsSpec: CopySpec = copySpec {
    from("src/main/webapp")
    include("**/*.html", "**/*.png", "**/*.jpg")
    rename("(.+)-staging(.+)", "$1$2")
}

tasks.register<Copy>("copyAssets") {
    into(layout.buildDirectory.dir("inPlaceApp"))
    with(webAssetsSpec)
}

tasks.register<Zip>("distApp") {
    archiveFileName = "my-app-dist.zip"
    destinationDirectory = layout.buildDirectory.dir("dists")

    from(appClasses)
    with(webAssetsSpec)
}

build.gradle

CopySpec webAssetsSpec = copySpec {
    from 'src/main/webapp'
    include '**/*.html', '**/*.png', '**/*.jpg'
    rename '(.+)-staging(.+)', '$1$2'
}

tasks.register('copyAssets', Copy) {
    into layout.buildDirectory.dir("inPlaceApp")
    with webAssetsSpec
}

tasks.register('distApp', Zip) {
    archiveFileName = 'my-app-dist.zip'
    destinationDirectory = layout.buildDirectory.dir('dists')

    from appClasses
    with webAssetsSpec
}

Both the copyAssets and distApp tasks will process the static resources under src/main/webapp, as
specified by webAssetsSpec.

NOTE: The configuration defined by webAssetsSpec will not apply to the app classes
included by the distApp task. That’s because from appClasses is its own child
specification independent of with webAssetsSpec.

This can be confusing, so it’s probably best to treat with() as an extra from()
specification in the task. Hence, it doesn’t make sense to define a standalone copy
spec without at least one from() defined.

Suppose you encounter a scenario in which you want to apply the same copy configuration to
different sets of files. In that case, you can share the configuration block directly without using
copySpec(). Here’s an example that has two independent tasks that happen to want to process
image files only:

build.gradle.kts

val webAssetPatterns = Action<CopySpec> {
    include("**/*.html", "**/*.png", "**/*.jpg")
}

tasks.register<Copy>("copyAppAssets") {
    into(layout.buildDirectory.dir("inPlaceApp"))
    from("src/main/webapp", webAssetPatterns)
}

tasks.register<Zip>("archiveDistAssets") {
    archiveFileName = "distribution-assets.zip"
    destinationDirectory = layout.buildDirectory.dir("dists")

    from("distResources", webAssetPatterns)
}

build.gradle

def webAssetPatterns = {
    include '**/*.html', '**/*.png', '**/*.jpg'
}

tasks.register('copyAppAssets', Copy) {
    into layout.buildDirectory.dir("inPlaceApp")
    from 'src/main/webapp', webAssetPatterns
}

tasks.register('archiveDistAssets', Zip) {
    archiveFileName = 'distribution-assets.zip'
    destinationDirectory = layout.buildDirectory.dir('dists')

    from 'distResources', webAssetPatterns
}

In this case, we assign the copy configuration to its own variable and apply it to whatever from()
specification we want. This doesn’t just work for inclusions but also exclusions, file renaming, and
file content filtering.

2. Using child specifications

If you only use a single copy spec, the file filtering and renaming will apply to all files copied.
Sometimes, this is what you want, but not always. Consider the following example that copies files
into a directory structure that a Java Servlet container can use to deliver a website:

This is not a straightforward copy as the WEB-INF directory and its subdirectories don’t exist within
the project, so they must be created during the copy. In addition, we only want HTML and image
files going directly into the root folder — build/explodedWar — and only JavaScript files going into
the js directory. We need separate filter patterns for those two sets of files.

The solution is to use child specifications, which can be applied to both from() and into()
declarations. The following task definition does the necessary work:

build.gradle.kts

tasks.register<Copy>("nestedSpecs") {
into(layout.buildDirectory.dir("explodedWar"))
exclude("**/*staging*")
from("src/dist") {
include("**/*.html", "**/*.png", "**/*.jpg")
}
from(sourceSets.main.get().output) {
into("WEB-INF/classes")
}
into("WEB-INF/lib") {
from(configurations.runtimeClasspath)
}
}

build.gradle

tasks.register('nestedSpecs', Copy) {
into layout.buildDirectory.dir("explodedWar")
exclude '**/*staging*'
from('src/dist') {
include '**/*.html', '**/*.png', '**/*.jpg'
}
from(sourceSets.main.output) {
into 'WEB-INF/classes'
}
into('WEB-INF/lib') {
from configurations.runtimeClasspath
}
}

Notice how the src/dist configuration has a nested inclusion specification; it is the child copy spec.
You can, of course, add content filtering and renaming here as required. A child copy spec is still a
copy spec.

The above example also demonstrates how you can copy files into a subdirectory of the destination
either by using a child into() on a from() or a child from() on an into(). Both approaches are
acceptable, but you should create and follow a convention to ensure consistency across your build
files.

NOTE: Don’t get your into() specifications mixed up. For a normal copy, one to the
filesystem rather than an archive, there should always be one "root" into() that
specifies the overall destination directory of the copy. Any other into() should have
a child spec attached, and its path will be relative to the root into().

One final thing to be aware of is that a child copy spec inherits its destination path, include
patterns, exclude patterns, copy actions, name mappings, and filters from its parent. So, be careful
where you place your configuration.
Using the Sync task

The Sync task, which extends the Copy task, copies the source files into the destination directory and
then removes any files from the destination directory which it did not copy. It synchronizes the
contents of a directory with its source.

This can be useful for doing things such as installing your application, creating an exploded copy of
your archives, or maintaining a copy of the project’s dependencies.

Here is an example that maintains a copy of the project’s runtime dependencies in the build/libs
directory:

build.gradle.kts

tasks.register<Sync>("libs") {
from(configurations["runtime"])
into(layout.buildDirectory.dir("libs"))
}

build.gradle

tasks.register('libs', Sync) {
from configurations.runtime
into layout.buildDirectory.dir('libs')
}

You can also perform the same function in your own tasks with the
Project.sync(org.gradle.api.Action) method.
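
A rough sketch of that approach inside an ad-hoc task action (the task name and directory paths are illustrative; note that accessing project at execution time is incompatible with the configuration cache):

tasks.register("syncDocs") {
    doLast {
        project.sync {
            from(project.layout.buildDirectory.dir("docs"))
            into(project.layout.buildDirectory.dir("site/docs"))
        }
    }
}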

Using the Copy task

You can copy a file by creating an instance of Gradle’s builtin Copy task and configuring it with the
location of the file and where you want to put it.

This example mimics copying a generated report into a directory that will be packed into an
archive, such as a ZIP or TAR:

build.gradle.kts

tasks.register<Copy>("copyReport") {
from(layout.buildDirectory.file("reports/my-report.pdf"))
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle

tasks.register('copyReport', Copy) {
from layout.buildDirectory.file("reports/my-report.pdf")
into layout.buildDirectory.dir("toArchive")
}

The file and directory paths are then used to specify what file to copy using
Copy.from(java.lang.Object…) and which directory to copy it to using Copy.into(java.lang.Object).

Although hard-coded paths make for simple examples, they make the build brittle. Using a reliable,
single source of truth, such as a task or shared project property, is better. In the following modified
example, we use a report task defined elsewhere that has the report’s location stored in its
outputFile property:

build.gradle.kts

tasks.register<Copy>("copyReport2") {
from(myReportTask.flatMap { it.outputFile })
into(archiveReportsTask.flatMap { it.dirToArchive })
}

build.gradle

tasks.register('copyReport2', Copy) {
from myReportTask.outputFile
into archiveReportsTask.dirToArchive
}

We have also assumed that the reports will be archived by archiveReportsTask, which provides us
with the directory that will be archived and hence where we want to put the copies of the reports.

Copying multiple files

You can extend the previous examples to multiple files very easily by providing multiple arguments
to from():

build.gradle.kts

tasks.register<Copy>("copyReportsForArchiving") {
from(layout.buildDirectory.file("reports/my-report.pdf"),
layout.projectDirectory.file("src/docs/manual.pdf"))
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyReportsForArchiving', Copy) {
    from layout.buildDirectory.file("reports/my-report.pdf"), layout.projectDirectory.file("src/docs/manual.pdf")
    into layout.buildDirectory.dir("toArchive")
}

Two files are now copied into the archive directory.

You can also use multiple from() statements to do the same thing, as shown in the first example of
the section File copying in depth.

But what if you want to copy all the PDFs in a directory without specifying each one? To do this,
attach inclusion and/or exclusion patterns to the copy specification. Here, we use a string pattern to
include PDFs only:

build.gradle.kts

tasks.register<Copy>("copyPdfReportsForArchiving") {
from(layout.buildDirectory.dir("reports"))
include("*.pdf")
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyPdfReportsForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
include "*.pdf"
into layout.buildDirectory.dir("toArchive")
}

One thing to note, as demonstrated in the following diagram, is that only the PDFs that reside
directly in the reports directory are copied:
You can include files in subdirectories by using an Ant-style glob pattern (**/*), as done in this
updated example:

build.gradle.kts

tasks.register<Copy>("copyAllPdfReportsForArchiving") {
from(layout.buildDirectory.dir("reports"))
include("**/*.pdf")
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyAllPdfReportsForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
include "**/*.pdf"
into layout.buildDirectory.dir("toArchive")
}

This task has the following effect:

Remember that a deep filter like this has the side effect of copying the directory structure below
reports and the files. If you want to copy the files without the directory structure, you must use an
explicit fileTree(dir) { includes }.files expression.
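
A sketch of that flattening approach (the task name is illustrative); note that .files resolves the tree eagerly, at the point the expression is evaluated:

val pdfReports = fileTree(layout.buildDirectory.dir("reports")) { include("**/*.pdf") }

tasks.register<Copy>("copyPdfsFlat") {
    // .files yields a flat set of files, so the directory structure under reports is not recreated
    from(pdfReports.files)
    into(layout.buildDirectory.dir("toArchive"))
}
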
Copying directory hierarchies

You may need to copy files as well as the directory structure in which they reside. This is the default
behavior when you specify a directory as the from() argument, as demonstrated by the following
example that copies everything in the reports directory, including all its subdirectories, to the
destination:

build.gradle.kts

tasks.register<Copy>("copyReportsDirForArchiving") {
from(layout.buildDirectory.dir("reports"))
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyReportsDirForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
into layout.buildDirectory.dir("toArchive")
}

The key aspect that users need help with is controlling how much of the directory structure goes to
the destination. In the above example, do you get a toArchive/reports directory, or does everything
in reports go straight into toArchive? The answer is the latter. If a directory is part of the from()
path, then it won’t appear in the destination.

So how do you ensure that reports itself is copied across, but not any other directory in
${layout.buildDirectory}? The answer is to add it as an include pattern:

build.gradle.kts

tasks.register<Copy>("copyReportsDirForArchiving2") {
from(layout.buildDirectory) {
include("reports/**")
}
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyReportsDirForArchiving2', Copy) {
from(layout.buildDirectory) {
include "reports/**"
}
into layout.buildDirectory.dir("toArchive")
}

You’ll get the same behavior as before except with one extra directory level in the destination, i.e.,
toArchive/reports.

One thing to note is how the include() directive applies only to the from(), whereas the directive in
the previous section applied to the whole task. These different levels of granularity in the copy
specification allow you to easily handle most of the requirements you will come across.

Understanding file copying

The basic process of copying files in Gradle is a simple one:

• Define a task of type Copy

• Specify which files (and potentially directories) to copy

• Specify a destination for the copied files

But this apparent simplicity hides a rich API that allows fine-grained control of which files are
copied, where they go, and what happens to them as they are copied — renaming of the files and
token substitution of file content are both possibilities, for example.

Let’s start with the last two items on the list, which involve CopySpec. The CopySpec interface, which
the Copy task implements, offers:

• A CopySpec.from(java.lang.Object…) method to define what to copy

• An CopySpec.into(java.lang.Object) method to define the destination

CopySpec has several additional methods that allow you to control the copying process, but these
two are the only required ones. into() is straightforward, requiring a directory path as its
argument in any form supported by the Project.file(java.lang.Object) method. The from()
configuration is far more flexible.

Not only does from() accept multiple arguments, it also allows several different types of argument.
For example, some of the most common types are:

• A String — treated as a file path or, if it starts with "file://", a file URI

• A File — used as a file path

• A FileCollection or FileTree — all files in the collection are included in the copy

• A task — the files or directories that form a task’s defined outputs are included

In fact, from() accepts all the same arguments as Project.files(java.lang.Object…), so see that method
for a more detailed list of acceptable types.

Something else to consider is what type of thing a file path refers to:

• A file — the file is copied as is

• A directory — this is effectively treated as a file tree: everything in it, including subdirectories,
is copied. However, the directory itself is not included in the copy.

• A non-existent file — the path is ignored

Here is an example that uses multiple from() specifications, each with a different argument type.
You will probably also notice that into() is configured lazily using a closure (in Groovy) or a
Provider (in Kotlin) — a technique that also works with from():

build.gradle.kts

tasks.register<Copy>("anotherCopyTask") {
// Copy everything under src/main/webapp
from("src/main/webapp")
// Copy a single file
from("src/staging/index.html")
// Copy the output of a task
from(copyTask)
// Copy the output of a task using Task outputs explicitly.
from(tasks["copyTaskWithPatterns"].outputs)
// Copy the contents of a Zip file
from(zipTree("src/main/assets.zip"))
// Determine the destination directory later
into({ getDestDir() })
}

build.gradle

tasks.register('anotherCopyTask', Copy) {
// Copy everything under src/main/webapp
from 'src/main/webapp'
// Copy a single file
from 'src/staging/index.html'
// Copy the output of a task
from copyTask
// Copy the output of a task using Task outputs explicitly.
from copyTaskWithPatterns.outputs
// Copy the contents of a Zip file
from zipTree('src/main/assets.zip')
// Determine the destination directory later
into { getDestDir() }
}
Note that the lazy configuration of into() is different from a child specification, even though the
syntax is similar. Keep an eye on the number of arguments to distinguish between them.
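
To illustrate the difference, here is a minimal sketch (paths and task name are hypothetical): into()
with a single argument sets the destination of the whole copy, while into() with a path and a
configuration closure creates a child specification rooted beneath that destination:

tasks.register('copyWithChildSpec', Copy) {
    from 'src/dist'
    // One argument: the destination directory for the whole task
    into layout.buildDirectory.dir('staging')

    // Path plus closure: a child spec whose files end up under staging/docs
    into('docs') {
        from 'src/docs'
    }
}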

Copying files in your own tasks

WARNING Using the Project.copy method at execution time, as described here, is not compatible
with the configuration cache. A possible solution is to implement the task as a proper class and use
the FileSystemOperations.copy method instead, as described in the configuration cache chapter.

Occasionally, you want to copy files or directories as part of a task. For example, a custom archiving
task based on an unsupported archive format might want to copy files to a temporary directory
before they are archived. You still want to take advantage of Gradle’s copy API without introducing
an extra Copy task.

The solution is to use the Project.copy(org.gradle.api.Action) method. Configuring it with a copy
spec works like the Copy task. Here’s a trivial example:

build.gradle.kts

tasks.register("copyMethod") {
doLast {
copy {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
include("**/*.html")
include("**/*.jsp")
}
}
}

build.gradle

tasks.register('copyMethod') {
doLast {
copy {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
include '**/*.html'
include '**/*.jsp'
}
}
}
The above example demonstrates the basic syntax and also highlights two major limitations of
using the copy() method:

1. The copy() method is not incremental. The example’s copyMethod task will always execute
because it has no information about what files make up the task’s inputs. You have to define the
task inputs and outputs manually.

2. Using a task as a copy source, i.e., as an argument to from(), won’t create an automatic task
dependency between your task and that copy source. As such, if you use the copy() method as
part of a task action, you must explicitly declare all inputs and outputs to get the correct
behavior.

The following example shows how to work around these limitations using the dynamic API for task
inputs and outputs:

build.gradle.kts

tasks.register("copyMethodWithExplicitDependencies") {
// up-to-date check for inputs, plus add copyTask as dependency
inputs.files(copyTask)
.withPropertyName("inputs")
.withPathSensitivity(PathSensitivity.RELATIVE)
outputs.dir("some-dir") // up-to-date check for outputs
.withPropertyName("outputDir")
doLast {
copy {
// Copy the output of copyTask
from(copyTask)
into("some-dir")
}
}
}

build.gradle

tasks.register('copyMethodWithExplicitDependencies') {
// up-to-date check for inputs, plus add copyTask as dependency
inputs.files(copyTask)
.withPropertyName("inputs")
.withPathSensitivity(PathSensitivity.RELATIVE)
outputs.dir('some-dir') // up-to-date check for outputs
.withPropertyName("outputDir")
doLast {
copy {
// Copy the output of copyTask
from copyTask
into 'some-dir'
}
}
}

These limitations make it preferable to use the Copy task wherever possible because of its built-in
support for incremental building and task dependency inference. That is why the copy() method is
intended for use by custom tasks that need to copy files as part of their function. Custom tasks that
use the copy() method should declare the necessary inputs and outputs relevant to the copy action.

Renaming files

Renaming files in Gradle can be done using the CopySpec API, which provides methods for renaming
files as they are copied.

Using Copy.rename()

If the files used and generated by your builds sometimes don’t have names that suit, you can
rename those files as you copy them. Gradle allows you to do this as part of a copy specification
using the rename() configuration.

The following example removes the "-staging" marker from the names of any files that have it:

build.gradle.kts

tasks.register<Copy>("copyFromStaging") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))

rename("(.+)-staging(.+)", "$1$2")
}

build.gradle

tasks.register('copyFromStaging', Copy) {
from "src/main/webapp"
into layout.buildDirectory.dir('explodedWar')

rename '(.+)-staging(.+)', '$1$2'
}

As in the above example, you can use regular expressions for this or closures that use more
complex logic to determine the target filename. For example, the following task truncates
filenames:

build.gradle.kts

tasks.register<Copy>("copyWithTruncate") {
from(layout.buildDirectory.dir("reports"))
rename { filename: String ->
if (filename.length > 10) {
filename.slice(0..7) + "~" + filename.length
}
else filename
}
into(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('copyWithTruncate', Copy) {
from layout.buildDirectory.dir("reports")
rename { String filename ->
if (filename.size() > 10) {
return filename[0..7] + "~" + filename.size()
}
else return filename
}
into layout.buildDirectory.dir("toArchive")
}

As with filtering, you can also rename a subset of files by configuring it as part of a child
specification on a from().

Using CopySpec.rename{}

The example of how to rename files on copy gives you most of the information you need to perform
this operation. It demonstrates the two options for renaming:

1. Using a regular expression

2. Using a closure

Regular expressions are a flexible approach to renaming, particularly as Gradle supports regex
groups that allow you to remove and replace parts of the source filename. The following example
shows how you can remove the string "-staging" from any filename that contains it using a simple
regular expression:
build.gradle.kts

tasks.register<Copy>("rename") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
// Use a regular expression to map the file name
rename("(.+)-staging(.+)", "$1$2")
rename("(.+)-staging(.+)".toRegex().pattern, "$1$2")
// Use a closure to convert all file names to upper case
rename { fileName: String ->
fileName.toUpperCase()
}
}

build.gradle

tasks.register('rename', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
// Use a regular expression to map the file name
rename '(.+)-staging(.+)', '$1$2'
rename(/(.+)-staging(.+)/, '$1$2')
// Use a closure to convert all file names to upper case
rename { String fileName ->
fileName.toUpperCase()
}
}

You can use any regular expression supported by the Java Pattern class and the substitution string.
The second argument of rename() works on the same principles as the Matcher.appendReplacement()
method.

Regular expressions in Groovy build scripts


There are two common issues people come across when using regular expressions in this context:

1. If you use a slashy string (those delimited by '/') for the first argument, you must include the
parentheses for rename() as shown in the above example.

2. It’s safest to use single quotes for the second argument, otherwise you need to escape the '$' in
group substitutions, i.e. "\$1\$2".

The first is a minor inconvenience, but slashy strings have the advantage that you don’t have to
escape backslash ('\') characters in the regular expression. The second issue stems from Groovy’s
support for embedded expressions using ${ } syntax in double-quoted and slashy strings.
The closure syntax for rename() is straightforward and can be used for any requirements that
simple regular expressions can’t handle. You’re given a file’s name, and you return a new name for
that file or null if you don’t want to change the name. Be aware that the closure will be executed for
every file copied, so try to avoid expensive operations where possible.

Filtering files

Filtering files in Gradle involves selectively including or excluding files based on certain criteria.

Using CopySpec.include() and CopySpec.exclude()

You can apply filtering in any copy specification through the CopySpec.include(java.lang.String…)
and CopySpec.exclude(java.lang.String…) methods.

These methods are typically used with Ant-style include or exclude patterns, as described in
PatternFilterable.

You can also perform more complex logic by using a closure that takes a FileTreeElement and
returns true if the file should be included or false otherwise. The following example demonstrates
both forms, ensuring that only .html and .jsp files are copied, except for those .html files with the
word "DRAFT" in their content:

build.gradle.kts

tasks.register<Copy>("copyTaskWithPatterns") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
include("**/*.html")
include("**/*.jsp")
exclude { details: FileTreeElement ->
details.file.name.endsWith(".html") &&
details.file.readText().contains("DRAFT")
}
}

build.gradle

tasks.register('copyTaskWithPatterns', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
include '**/*.html'
include '**/*.jsp'
exclude { FileTreeElement details ->
details.file.name.endsWith('.html') &&
details.file.text.contains('DRAFT')
}
}

A question you may ask yourself at this point is what happens when inclusion and exclusion
patterns overlap? Which pattern wins? Here are the basic rules:

• If there are no explicit inclusions or exclusions, everything is included

• If at least one inclusion is specified, only files and directories matching the patterns are
included

• Any exclusion pattern overrides any inclusions, so if a file or directory matches at least one
exclusion pattern, it won’t be included, regardless of the inclusion patterns

Bear these rules in mind when creating combined inclusion and exclusion specifications so that
you end up with the exact behavior you want.

Note that the inclusions and exclusions in the above example will apply to all from() configurations.
If you want to apply filtering to a subset of the copied files, you’ll need to use child specifications.
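
For instance, in the following sketch (paths and task name are hypothetical), the exclusion wins for
any report whose name ends in -draft.html, even though such files also match the inclusion pattern:

tasks.register('copyNonDraftReports', Copy) {
    from 'reports'
    include '**/*.html'        // only HTML files are candidates for copying
    exclude '**/*-draft.html'  // ...but drafts are dropped, overriding the inclusion
    into layout.buildDirectory.dir('published')
}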

Filtering file content

Filtering file content in Gradle involves replacing placeholders or tokens in files with dynamic
values.

Using CopySpec.filter()

Transforming the content of files while they are being copied involves basic templating that uses
token substitution, removal of lines of text, or even more complex filtering using a full-blown
template engine.

The following example demonstrates several forms of filtering, including token substitution using
the CopySpec.expand(java.util.Map) method and another using CopySpec.filter(java.lang.Class) with
an Ant filter:

build.gradle.kts

import org.apache.tools.ant.filters.FixCrLfFilter
import org.apache.tools.ant.filters.ReplaceTokens
tasks.register<Copy>("filter") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
// Substitute property tokens in files
expand("copyright" to "2009", "version" to "2.3.1")
// Use some of the filters provided by Ant
filter(FixCrLfFilter::class)
filter(ReplaceTokens::class, "tokens" to mapOf("copyright" to "2009",
"version" to "2.3.1"))
// Use a closure to filter each line
filter { line: String ->
"[$line]"
}
// Use a closure to remove lines
filter { line: String ->
if (line.startsWith('-')) null else line
}
filteringCharset = "UTF-8"
}

build.gradle

import org.apache.tools.ant.filters.FixCrLfFilter
import org.apache.tools.ant.filters.ReplaceTokens

tasks.register('filter', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
// Substitute property tokens in files
expand(copyright: '2009', version: '2.3.1')
// Use some of the filters provided by Ant
filter(FixCrLfFilter)
filter(ReplaceTokens, tokens: [copyright: '2009', version: '2.3.1'])
// Use a closure to filter each line
filter { String line ->
"[$line]"
}
// Use a closure to remove lines
filter { String line ->
line.startsWith('-') ? null : line
}
filteringCharset = 'UTF-8'
}

The filter() method has two variants, which behave differently:

• one takes a FilterReader and is designed to work with Ant filters, such as ReplaceTokens

• one takes a closure or Transformer that defines the transformation for each line of the source
file

Note that both variants assume the source files are text-based. When you use the ReplaceTokens
class with filter(), you create a template engine that replaces tokens of the form @tokenName@ (the
Ant-style token) with values you define.
Using CopySpec.expand()

The expand() method treats the source files as Groovy templates, evaluating and expanding
expressions of the form ${expression}.

You can pass in property names and values that are then expanded in the source files. expand()
allows for more than basic token substitution as the embedded expressions are full-blown Groovy
expressions.
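
As a sketch, assume a source file src/templates/version.txt containing the line "Version: ${version}"
(both the file and the property name are hypothetical); the following task would write "Version:
2.3.1" to the destination:

tasks.register('expandVersionTemplate', Copy) {
    from 'src/templates/version.txt'
    into layout.buildDirectory.dir('generated')
    // The map keys become variables available to the Groovy template engine
    expand(version: '2.3.1')
    filteringCharset = 'UTF-8'
}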

NOTE Specifying the character set when reading and writing the file is good practice. Otherwise, the
transformations won’t work properly for non-ASCII text. You configure the character set with the
CopySpec.setFilteringCharset(String) property. If it’s not specified, the JVM default character set is
used, which will likely differ from the one you want.

Setting file permissions

Setting file permissions in Gradle involves specifying the permissions for files or directories created
or modified during the build process.

Using CopySpec.filePermissions{}

For any CopySpec involved in copying files, be it the Copy task itself or any child specification, you
can explicitly set the permissions the destination files will have via the CopySpec.filePermissions {}
configuration block.

Using CopySpec.dirPermissions{}

You can do the same for directories too, independently of files, via the CopySpec.dirPermissions {}
configuration block.

NOTE Not setting permissions explicitly will preserve the permissions of the original files or
directories.

build.gradle.kts

tasks.register<Copy>("permissions") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
filePermissions {
user {
read = true
execute = true
}
other.execute = false
}
dirPermissions {
unix("r-xr-x---")
}
}

build.gradle

tasks.register('permissions', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
filePermissions {
user {
read = true
execute = true
}
other.execute = false
}
dirPermissions {
unix('r-xr-x---')
}
}

For a detailed description of file permissions, see FilePermissions and UserClassFilePermissions.


For details on the convenience method used in the samples, see
ConfigurableFilePermissions.unix(String).

Using empty configuration blocks for file or directory permissions still sets them explicitly, just to
fixed default values. Everything inside one of these configuration blocks is relative to the default
values. Default permissions differ for files and directories:

• file: read & write for owner, read for group, read for other (0644, rw-r--r--)

• directory: read, write & execute for owner, read & execute for group, read & execute for other
(0755, rwxr-xr-x)
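
As a sketch, these defaults could also be set explicitly using the symbolic notation shown earlier
(the task name and paths are hypothetical; per ConfigurableFilePermissions.unix(String), a numeric
string such as "644" should be accepted as well):

tasks.register('defaultPermissions', Copy) {
    from 'src/main/webapp'
    into layout.buildDirectory.dir('explodedWar')
    filePermissions {
        unix('rw-r--r--')   // 0644
    }
    dirPermissions {
        unix('rwxr-xr-x')   // 0755
    }
}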

Moving files and directories

Moving files and directories in Gradle is a straightforward process that can be accomplished using
several APIs. When implementing file-moving logic in your build scripts, it’s important to consider
file paths, conflicts, and task dependencies.

Using File.renameTo()

File.renameTo() is a method in Java (and by extension, in Gradle’s Groovy DSL) used to rename or
move a file or directory. When you call renameTo() on a File object, you provide another File object
representing the new name or location. If the operation is successful, renameTo() returns true;
otherwise, it returns false.

It’s important to note that renameTo() has some limitations and platform-specific behavior.

In this example, the moveFile task resolves the source and destination files and, inside the doLast
closure, uses File.renameTo() to move the file from the source location to the destination:

task moveFile {
doLast {
def sourceFile = file('source.txt')
def destFile = file('destination/new_name.txt')

if (sourceFile.renameTo(destFile)) {
println "File moved successfully."
}
}
}

Using the Copy task

In this example, the moveFile task copies the file source.txt to the destination directory and
renames it to new_name.txt in the process. This achieves a similar effect to moving a file.

task moveFile(type: Copy) {
from 'source.txt'
into 'destination'
rename { fileName ->
'new_name.txt'
}
}
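
Note that a Copy task leaves the original file in place. If the source should be removed as well, one
option is to delete it in a follow-up action once the copy has run; here is a minimal sketch using the
same hypothetical paths:

task moveFileAndCleanUp(type: Copy) {
    from 'source.txt'
    into 'destination'
    rename { fileName -> 'new_name.txt' }
    doLast {
        // Remove the original once the copy has completed
        delete 'source.txt'
    }
}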

Deleting files and directories

Deleting files and directories in Gradle involves removing them from the file system.

Using the Delete task

You can easily delete files and directories using the Delete task. You must specify which files and
directories to delete in a way supported by the Project.files(java.lang.Object…) method.

For example, the following task deletes the entire contents of a build’s output directory:

build.gradle.kts

tasks.register<Delete>("myClean") {
delete(buildDir)
}
build.gradle

tasks.register('myClean', Delete) {
delete buildDir
}

If you want more control over which files are deleted, you can’t use inclusions and exclusions the
same way you use them for copying files. Instead, you use the built-in filtering mechanisms of
FileCollection and FileTree. The following example does just that to clear out temporary files from
a source directory:

build.gradle.kts

tasks.register<Delete>("cleanTempFiles") {
delete(fileTree("src").matching {
include("**/*.tmp")
})
}

build.gradle

tasks.register('cleanTempFiles', Delete) {
delete fileTree("src").matching {
include "**/*.tmp"
}
}

Using Project.delete()

The Project.delete(org.gradle.api.Action) method can delete files and directories.

This method takes one or more arguments representing the files or directories to be deleted.
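
For example, here is a minimal sketch (the directory and task name are hypothetical) that removes a
temporary directory from within a task action:

tasks.register('deleteTempDir') {
    def tempDir = layout.buildDirectory.dir('tmp')
    doLast {
        delete {
            delete tempDir          // paths are resolved as per Project.files()
            followSymlinks = false  // the spec form also lets you control symlink handling
        }
    }
}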


Creating archives

From the perspective of Gradle, packing files into an archive is effectively a copy in which the
destination is the archive file rather than a directory on the file system. Creating archives looks a
lot like copying, with all the same features.

Using the Zip, Tar, or Jar task

The simplest case involves archiving the entire contents of a directory, which this example
demonstrates by creating a ZIP of the toArchive directory:
build.gradle.kts

tasks.register<Zip>("packageDistribution") {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir("dist")

from(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('packageDistribution', Zip) {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir('dist')

from layout.buildDirectory.dir("toArchive")
}

Notice how we specify the destination and name of the archive instead of an into(): both are
required. You often won’t see them explicitly set because most projects apply the Base Plugin. It
provides some conventional values for those properties.

The following example demonstrates this; you can learn more about the conventions in the archive
naming section.

Each type of archive has its own task type, the most common ones being Zip, Tar and Jar. They all
share most of the configuration options of Copy, including filtering and renaming.

One of the most common scenarios involves copying files into specified archive subdirectories. For
example, let’s say you want to package all PDFs into a docs directory in the archive’s root. This docs
directory doesn’t exist in the source location, so you must create it as part of the archive. You do
this by adding an into() declaration for just the PDFs:

build.gradle.kts

plugins {
base
}

version = "1.0.0"

tasks.register<Zip>("packageDistribution") {
from(layout.buildDirectory.dir("toArchive")) {
exclude("**/*.pdf")
}

from(layout.buildDirectory.dir("toArchive")) {
include("**/*.pdf")
into("docs")
}
}

build.gradle

plugins {
id 'base'
}

version = "1.0.0"

tasks.register('packageDistribution', Zip) {
from(layout.buildDirectory.dir("toArchive")) {
exclude "**/*.pdf"
}

from(layout.buildDirectory.dir("toArchive")) {
include "**/*.pdf"
into "docs"
}
}

As you can see, you can have multiple from() declarations in a copy specification, each with its own
configuration. See Using child copy specifications for more information on this feature.

Understanding archive creation

Archives are essentially self-contained file systems, and Gradle treats them as such. This is why
working with archives is similar to working with files and directories.

Out of the box, Gradle supports the creation of ZIP and TAR archives and, by extension, Java’s JAR,
WAR, and EAR formats—Java’s archive formats are all ZIPs. Each of these formats has a
corresponding task type to create them: Zip, Tar, Jar, War, and Ear. These all work the same way
and are based on copy specifications, just like the Copy task.

Creating an archive file is essentially a file copy in which the destination is implicit, i.e., the archive
file itself. Here is a basic example that specifies the path and name of the target archive file:
build.gradle.kts

tasks.register<Zip>("packageDistribution") {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir("dist")

from(layout.buildDirectory.dir("toArchive"))
}

build.gradle

tasks.register('packageDistribution', Zip) {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir('dist')

from layout.buildDirectory.dir("toArchive")
}

The full power of copy specifications is available to you when creating archives, which means you
can do content filtering, file renaming, or anything else covered in the previous section. A common
requirement is copying files into subdirectories of the archive that don’t exist in the source folders,
something that can be achieved with into() child specifications.

Gradle allows you to create as many archive tasks as you want, but it’s worth considering that
many convention-based plugins provide their own. For example, the Java plugin adds a jar task for
packaging a project’s compiled classes and resources in a JAR. Many of these plugins provide
sensible conventions for the names of archives and the copy specifications used. We recommend
you use these tasks wherever you can rather than overriding them with your own.
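
For example, rather than registering a second archive task, you can usually configure the one a
plugin already provides. Here is a sketch for the Java plugin’s jar task (the manifest attribute shown
is purely illustrative):

tasks.named('jar') {
    manifest {
        attributes('Implementation-Version': project.version)
    }
}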

Naming archives

Gradle has several conventions around the naming of archives and where they are created based
on the plugins your project uses. The main convention is provided by the Base Plugin, which
defaults to creating archives in the layout.buildDirectory.dir("distributions") directory and
typically uses archive names of the form [projectName]-[version].[type].

The following example comes from a project named archive-naming, hence the myZip task creates an
archive named archive-naming-1.0.zip:

build.gradle.kts

plugins {
base
}

version = "1.0"

tasks.register<Zip>("myZip") {
from("somedir")
val projectDir = layout.projectDirectory.asFile
doLast {
println(archiveFileName.get())
println(destinationDirectory.get().asFile.relativeTo(projectDir))
println(archiveFile.get().asFile.relativeTo(projectDir))
}
}

build.gradle

plugins {
id 'base'
}

version = 1.0

tasks.register('myZip', Zip) {
from 'somedir'
File projectDir = layout.projectDirectory.asFile
doLast {
println archiveFileName.get()
println projectDir.relativePath(destinationDirectory.get().asFile)
println projectDir.relativePath(archiveFile.get().asFile)
}
}

$ gradle -q myZip
archive-naming-1.0.zip
build/distributions
build/distributions/archive-naming-1.0.zip

Note that the archive name is not derived from the name of the task that creates it.

If you want to change the name and location of a generated archive file, you can provide values for
the corresponding task’s archiveFileName and destinationDirectory properties. These override any
conventions that would otherwise apply.

Alternatively, you can make use of the default archive name pattern provided by
AbstractArchiveTask.getArchiveFileName(): [archiveBaseName]-[archiveAppendix]-[archiveVersion]-
[archiveClassifier].[archiveExtension]. You can set each of these properties on the task separately.
Note that the Base Plugin uses the convention of the project name for archiveBaseName, project
version for archiveVersion, and the archive type for archiveExtension. It does not provide values for
the other properties.

This example — from the same project as the one above — configures just the archiveBaseName
property, overriding the default value of the project name:

build.gradle.kts

tasks.register<Zip>("myCustomZip") {
archiveBaseName = "customName"
from("somedir")

doLast {
println(archiveFileName.get())
}
}

build.gradle

tasks.register('myCustomZip', Zip) {
archiveBaseName = 'customName'
from 'somedir'

doLast {
println archiveFileName.get()
}
}

$ gradle -q myCustomZip
customName-1.0.zip

You can also override the default archiveBaseName value for all the archive tasks in your build by
using the project property archivesBaseName, as demonstrated by the following example:

build.gradle.kts

plugins {
base
}
version = "1.0"

base {
archivesName = "gradle"
distsDirectory = layout.buildDirectory.dir("custom-dist")
libsDirectory = layout.buildDirectory.dir("custom-libs")
}

val myZip by tasks.registering(Zip::class) {
from("somedir")
}

val myOtherZip by tasks.registering(Zip::class) {
archiveAppendix = "wrapper"
archiveClassifier = "src"
from("somedir")
}

tasks.register("echoNames") {
val projectNameString = project.name
val archiveFileName = myZip.flatMap { it.archiveFileName }
val myOtherArchiveFileName = myOtherZip.flatMap { it.archiveFileName }
doLast {
println("Project name: $projectNameString")
println(archiveFileName.get())
println(myOtherArchiveFileName.get())
}
}

build.gradle

plugins {
id 'base'
}

version = 1.0
base {
archivesName = "gradle"
distsDirectory = layout.buildDirectory.dir('custom-dist')
libsDirectory = layout.buildDirectory.dir('custom-libs')
}

def myZip = tasks.register('myZip', Zip) {
from 'somedir'
}

def myOtherZip = tasks.register('myOtherZip', Zip) {
archiveAppendix = 'wrapper'
archiveClassifier = 'src'
from 'somedir'
}

tasks.register('echoNames') {
def projectNameString = project.name
def archiveFileName = myZip.flatMap { it.archiveFileName }
def myOtherArchiveFileName = myOtherZip.flatMap { it.archiveFileName }
doLast {
println "Project name: $projectNameString"
println archiveFileName.get()
println myOtherArchiveFileName.get()
}
}

$ gradle -q echoNames
Project name: archives-changed-base-name
gradle-1.0.zip
gradle-wrapper-1.0-src.zip

You can find all the possible archive task properties in the API documentation for
AbstractArchiveTask. Still, we have also summarized the main ones here:

archiveFileName — Property<String>, default: archiveBaseName-archiveAppendix-archiveVersion-archiveClassifier.archiveExtension
The complete file name of the generated archive. If any of the properties in the default value are
empty, their '-' separator is dropped.

archiveFile — Provider<RegularFile>, read-only, default: destinationDirectory/archiveFileName
The absolute file path of the generated archive.

destinationDirectory — DirectoryProperty, default: depends on archive type
The target directory in which to put the generated archive. By default, JARs and WARs go into
layout.buildDirectory.dir("libs"). ZIPs and TARs go into layout.buildDirectory.dir("distributions").

archiveBaseName — Property<String>, default: project.name
The base name portion of the archive file name, typically a project name or some other
descriptive name for what it contains.

archiveAppendix — Property<String>, default: null
The appendix portion of the archive file name that comes immediately after the base name. It is
typically used to distinguish between different forms of content, such as code and docs, or a
minimal distribution versus a full or complete one.

archiveVersion — Property<String>, default: project.version
The version portion of the archive file name, typically in the form of a normal project or product
version.

archiveClassifier — Property<String>, default: null
The classifier portion of the archive file name. Often used to distinguish between archives that
target different platforms.

archiveExtension — Property<String>, default: depends on archive type and compression type
The filename extension for the archive. By default, this is set based on the archive task type and
the compression type (if you’re creating a TAR). Will be one of: zip, jar, war, tar, tgz or tbz2. You
can of course set this to a custom extension if you wish.

Sharing content between multiple archives

As described in the CopySpec section above, you can use the Project.copySpec(org.gradle.api.Action)
method to share content between archives.

Using archives as file trees

An archive is a directory and file hierarchy packed into a single file. In other words, it’s a special
case of a file tree, and that’s exactly how Gradle treats archives.

Instead of using the fileTree() method, which only works on normal file systems, you use the
Project.zipTree(java.lang.Object) and Project.tarTree(java.lang.Object) methods to wrap archive
files of the corresponding type (note that JAR, WAR and EAR files are ZIPs). Both methods return
FileTree instances that you can then use in the same way as normal file trees. For example, you can
extract some or all of the files of an archive by copying its contents to some directory on the file
system. Or you can merge one archive into another.

Here are some simple examples of creating archive-based file trees:

build.gradle.kts

// Create a ZIP file tree using path
val zip: FileTree = zipTree("someFile.zip")

// Create a TAR file tree using path
val tar: FileTree = tarTree("someFile.tar")

// tar tree attempts to guess the compression based on the file extension
// however if you must specify the compression explicitly you can:
val someTar: FileTree = tarTree(resources.gzip("someTar.ext"))

build.gradle

// Create a ZIP file tree using path
FileTree zip = zipTree('someFile.zip')

// Create a TAR file tree using path
FileTree tar = tarTree('someFile.tar')

// tar tree attempts to guess the compression based on the file extension
// however if you must specify the compression explicitly you can:
FileTree someTar = tarTree(resources.gzip('someTar.ext'))
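
As a sketch of the merging case mentioned above (the archive names are hypothetical), the contents
of one ZIP can be included in another by wrapping it with zipTree() inside a Zip task:

tasks.register('mergedDistribution', Zip) {
    archiveFileName = 'merged.zip'
    destinationDirectory = layout.buildDirectory.dir('dist')

    from layout.buildDirectory.dir('toArchive')  // ordinary files and directories
    from zipTree('third-party-resources.zip')    // plus the unpacked contents of another archive
}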

You can see a practical example of extracting an archive file in the unpacking archives section
below.

Using AbstractArchiveTask for reproducible builds

Sometimes it’s desirable to recreate archives exactly the same, byte for byte, on different machines.
You want to be sure that building an artifact from source code produces the same result no matter
when and where it is built. This is necessary for projects like reproducible-builds.org.

Reproducing the same byte-for-byte archive poses some challenges since the order of the files in an
archive is influenced by the underlying file system. Each time a ZIP, TAR, JAR, WAR or EAR is built
from source, the order of the files inside the archive may change. Files that only have a different
timestamp also cause differences in archives from build to build.

All AbstractArchiveTask (e.g. Jar, Zip) tasks shipped with Gradle include support for producing
reproducible archives.

For example, to make a Zip task reproducible you need to set Zip.isReproducibleFileOrder() to true
and Zip.isPreserveFileTimestamps() to false. In order to make all archive tasks in your build
reproducible, consider adding the following configuration to your build file:

build.gradle.kts

tasks.withType<AbstractArchiveTask>().configureEach {
isPreserveFileTimestamps = false
isReproducibleFileOrder = true
}

build.gradle

tasks.withType(AbstractArchiveTask).configureEach {
preserveFileTimestamps = false
reproducibleFileOrder = true
}

Often you will want to publish an archive, so that it is usable from another project. This process is
described in Cross-Project publications.

Unpacking archives

Archives are effectively self-contained file systems, so unpacking them is a case of copying the files
from that file system onto the local file system — or even into another archive. Gradle enables this
by providing some wrapper functions that make archives available as hierarchical collections of
files (file trees).

Using Project.zipTree and Project.tarTree

The two functions of interest are Project.zipTree(java.lang.Object) and
Project.tarTree(java.lang.Object), which produce a FileTree from a corresponding archive file.

That file tree can then be used in a from() specification, like so:

build.gradle.kts

tasks.register<Copy>("unpackFiles") {
from(zipTree("src/resources/thirdPartyResources.zip"))
into(layout.buildDirectory.dir("resources"))
}

build.gradle

tasks.register('unpackFiles', Copy) {
from zipTree("src/resources/thirdPartyResources.zip")
into layout.buildDirectory.dir("resources")
}

As with a normal copy, you can control which files are unpacked via filters and even rename files
as they are unpacked.

More advanced processing can be handled by the eachFile() method. For example, you might need
to extract different subtrees of the archive into different paths within the destination directory. The
following sample uses the method to extract the files within the archive’s libs directory into the
root destination directory, rather than into a libs subdirectory:

build.gradle.kts

tasks.register<Copy>("unpackLibsDirectory") {
from(zipTree("src/resources/thirdPartyResources.zip")) {
include("libs/**") ①
eachFile {
relativePath = RelativePath(true,
*relativePath.segments.drop(1).toTypedArray()) ②
}
includeEmptyDirs = false ③
}
into(layout.buildDirectory.dir("resources"))
}

build.gradle

tasks.register('unpackLibsDirectory', Copy) {
from(zipTree("src/resources/thirdPartyResources.zip")) {
include "libs/**" ①
eachFile { fcd ->
fcd.relativePath = new RelativePath(true, fcd.relativePath
.segments.drop(1)) ②
}
includeEmptyDirs = false ③
}
into layout.buildDirectory.dir("resources")
}

① Extracts only the subset of files that reside in the libs directory

② Remaps the path of the extracting files into the destination directory by dropping the libs
segment from the file path

③ Ignores the empty directories resulting from the remapping, see Caution note below

CAUTION You can not change the destination path of empty directories with this technique. You can
learn more in this issue.

If you’re a Java developer wondering why there is no jarTree() method, that’s because zipTree()
works perfectly well for JARs, WARs, and EARs.

Creating "uber" or "fat" JARs

In Java, applications and their dependencies were typically packaged as separate JARs within a
single distribution archive. That still happens, but another approach that is now common is placing
the classes and resources of the dependencies directly into the application JAR, creating what is
known as an Uber or fat JAR.

Creating "uber" or "fat" JARs in Gradle involves packaging all dependencies into a single JAR file,
making it easier to distribute and run the application.
Using the Shadow Plugin

Gradle does not have full built-in support for creating uber JARs, but you can use third-party
plugins like the Shadow plugin (com.github.johnrengelman.shadow) to achieve this. This plugin
packages your project classes and dependencies into a single JAR file.

Using Project.zipTree() and the Jar task

To copy the contents of other JAR files into the application JAR, use the
Project.zipTree(java.lang.Object) method and the Jar task. This is demonstrated by the uberJar task
in the following example:

build.gradle.kts

plugins {
java
}

version = "1.0.0"

repositories {
mavenCentral()
}

dependencies {
implementation("commons-io:commons-io:2.6")
}

tasks.register<Jar>("uberJar") {
archiveClassifier = "uber"

from(sourceSets.main.get().output)

dependsOn(configurations.runtimeClasspath)
from({
configurations.runtimeClasspath.get().filter {
it.name.endsWith("jar") }.map { zipTree(it) }
})
}

build.gradle

plugins {
id 'java'
}

version = '1.0.0'
repositories {
mavenCentral()
}

dependencies {
implementation 'commons-io:commons-io:2.6'
}

tasks.register('uberJar', Jar) {
archiveClassifier = 'uber'

from sourceSets.main.output

dependsOn configurations.runtimeClasspath
from {
configurations.runtimeClasspath.findAll { it.name.endsWith('jar') }
.collect { zipTree(it) }
}
}

In this case, we’re taking the runtime dependencies of the project —
configurations.runtimeClasspath.files — and wrapping each of the JAR files with the zipTree()
method. The result is a collection of ZIP file trees, the contents of which are copied into the uber JAR
alongside the application classes.

Creating directories

Many tasks need to create directories to store the files they generate, which is why Gradle
automatically manages this aspect of tasks when they explicitly define file and directory outputs.
All core Gradle tasks ensure that any output directories they need are created, if necessary, using
this mechanism.

Using File.mkdirs and Files.createDirectories

In cases where you need to create a directory manually, you can use the standard
Files.createDirectories or File.mkdirs methods from within your build scripts or custom task
implementations.

Here is a simple example that creates a single images directory in the project folder:

build.gradle.kts

tasks.register("ensureDirectory") {
// Store target directory into a variable to avoid project reference in
the configuration cache
val directory = file("images")
doLast {
Files.createDirectories(directory.toPath())
}
}

build.gradle

import java.nio.file.Files

tasks.register('ensureDirectory') {
// Store target directory into a variable to avoid project reference in the configuration cache
def directory = file("images")

doLast {
Files.createDirectories(directory.toPath())
}
}

As described in the Apache Ant manual, the mkdir task will automatically create all necessary
directories in the given path. It will do nothing if the directory already exists.

Using Project.mkdir

You can create directories in Gradle using the mkdir method, which is available in the Project
object. This method takes a File object or a String representing the path of the directory to be
created:

tasks.register('createDirs') {
doLast {
mkdir 'src/main/resources'
mkdir file('build/generated')

// Create multiple dirs
['src/main/resources', 'src/test/resources'].each { mkdir it }

// Check dir existence
def dir = file('src/main/resources')
if (!dir.exists()) {
mkdir dir
}
}
}

Installing executables

When you are building a standalone executable, you may want to install this file on your system, so
it ends up in your path.

Using the Copy task

You can use a Copy task to install the executable into shared directories like /usr/local/bin. The
installation directory probably contains many other executables, some of which may even be
unreadable by Gradle. To support the unreadable files in the Copy task’s destination directory and to
avoid time consuming up-to-date checks, you can use Task.doNotTrackState():

build.gradle.kts

tasks.register<Copy>("installExecutable") {
from("build/my-binary")
into("/usr/local/bin")
doNotTrackState("Installation directory contains unrelated files")
}

build.gradle

tasks.register("installExecutable", Copy) {
from "build/my-binary"
into "/usr/local/bin"
doNotTrackState("Installation directory contains unrelated files")
}

Deploying single files into application servers

Deploying a single file to an application server typically refers to the process of transferring a
packaged application artifact, such as a WAR file, to the application server’s deployment directory.

Using the Copy task

When working with application servers, you can use a Copy task to deploy the application archive
(e.g. a WAR file). Since you are deploying a single file, the destination directory of the Copy is the
whole deployment directory. The deployment directory sometimes does contain unreadable files
like named pipes, so Gradle may have problems doing up-to-date checks. In order to support this
use-case, you can use Task.doNotTrackState():
build.gradle.kts

plugins {
war
}

tasks.register<Copy>("deployToTomcat") {
from(tasks.war)
into(layout.projectDirectory.dir("tomcat/webapps"))
doNotTrackState("Deployment directory contains unreadable files")
}

build.gradle

plugins {
id 'war'
}

tasks.register("deployToTomcat", Copy) {
from war
into layout.projectDirectory.dir('tomcat/webapps')
doNotTrackState("Deployment directory contains unreadable files")
}

Logging

The log serves as the primary 'UI' of a build tool. If it becomes overly verbose, important warnings
and issues can be obscured. However, it is essential to have relevant information to determine if
something has gone wrong.

Gradle defines six log levels, detailed in Log levels. In addition to the standard log levels, Gradle
introduces two specific levels: QUIET and LIFECYCLE. LIFECYCLE is the default level used to report
build progress.

Understanding Log levels

There are 6 log levels in Gradle:

ERROR Error messages

QUIET Important information messages

WARNING Warning messages

LIFECYCLE Progress information messages

INFO Information messages

DEBUG Debug messages

NOTE The console’s rich components (build status and work-in-progress area) are displayed
regardless of the log level used.

Choosing a log level

You can choose different log levels from the command line switches shown in Log level command-
line options.

You can also configure the log level using gradle.properties.
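
For example, adding the following line to gradle.properties makes INFO the default log level for
every build of that project (valid values are quiet, warn, lifecycle, info, and debug):

org.gradle.logging.level=info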

In Stacktrace command-line options you can find the command line switches which affect
stacktrace logging.

Log level command-line options:

Option Outputs Log Levels

-q or --quiet QUIET and higher

-w or --warn WARN and higher

no logging options LIFECYCLE and higher

-i or --info INFO and higher

-d or --debug DEBUG and higher (that is, all log messages)

CAUTION The DEBUG log level can expose sensitive security information to the console.

Stacktrace command-line options

-s or --stacktrace
Truncated stacktraces are printed. We recommend this over full stacktraces. Groovy full
stacktraces are extremely verbose due to the underlying dynamic invocation mechanisms. Yet
they usually do not contain relevant information about what has gone wrong in your code. This
option renders stacktraces for deprecation warnings.

-S or --full-stacktrace
The full stacktraces are printed out. This option renders stacktraces for deprecation warnings.

<No stacktrace options>
No stacktraces are printed to the console in case of a build error (e.g., a compile error). Only in
case of internal exceptions will stacktraces be printed. If the DEBUG log level is chosen, truncated
stacktraces are always printed.
Logging Sensitive Information

Running Gradle with the DEBUG log level can potentially expose sensitive information to the console
and build log.

This information might include:

• Environment variables

• Private repository credentials

• Build cache and Develocity credentials

• Plugin Portal publishing credentials

It’s important to avoid using the DEBUG log level when running on public Continuous Integration (CI)
services. Build logs on these services are accessible to the public and can expose sensitive
information. Even on private CI services, logging sensitive credentials may pose a risk depending
on your organization’s threat model. It’s advisable to discuss this with your organization’s security
team.

Some CI providers attempt to redact sensitive credentials from logs, but this process is not foolproof
and typically only redacts exact matches of pre-configured secrets.

If you suspect that a Gradle Plugin may inadvertently expose sensitive information, please contact
[email protected] for assistance with disclosure.

Writing your own log messages

A simple option for logging in your build file is to write messages to standard output. Gradle
redirects anything written to standard output to its logging system at the QUIET log level:

build.gradle.kts

println("A message which is logged at QUIET level")

build.gradle

println 'A message which is logged at QUIET level'

Gradle also provides a logger property to a build script, which is an instance of Logger. This
interface extends the SLF4J Logger interface and adds a few Gradle-specific methods. Below is an
example of how this is used in the build script:
build.gradle.kts

logger.quiet("An info log message which is always logged.")


logger.error("An error log message.")
logger.warn("A warning log message.")
logger.lifecycle("A lifecycle info log message.")
logger.info("An info log message.")
logger.debug("A debug log message.")
logger.trace("A trace log message.") // Gradle never logs TRACE level logs

build.gradle

logger.quiet('An info log message which is always logged.')
logger.error('An error log message.')
logger.warn('A warning log message.')
logger.lifecycle('A lifecycle info log message.')
logger.info('An info log message.')
logger.debug('A debug log message.')
logger.trace('A trace log message.') // Gradle never logs TRACE level logs

Use the typical SLF4J pattern to replace a placeholder with an actual value in the log message.

build.gradle.kts

logger.info("A {} log message", "info")

build.gradle

logger.info('A {} log message', 'info')

You can also hook into Gradle’s logging system from within other classes used in the build (classes
from the buildSrc directory, for example) with an SLF4J logger. You can use this logger the same
way as you use the provided logger in the build script.

build.gradle.kts

import org.slf4j.LoggerFactory
val slf4jLogger = LoggerFactory.getLogger("some-logger")
slf4jLogger.info("An info log message logged using SLF4j")

build.gradle

import org.slf4j.LoggerFactory

def slf4jLogger = LoggerFactory.getLogger('some-logger')
slf4jLogger.info('An info log message logged using SLF4j')

Logging from external tools and libraries

Internally, Gradle uses Ant and Ivy. Both have their own logging system. Gradle redirects their
logging output into the Gradle logging system.

There is a 1:1 mapping from the Ant/Ivy log levels to the Gradle log levels, except the Ant/Ivy TRACE
log level, which is mapped to the Gradle DEBUG log level. This means the default Gradle log level will
not show any Ant/Ivy output unless it is an error or a warning.

Many tools out there still use the standard output for logging. By default, Gradle redirects standard
output to the QUIET log level and standard error to the ERROR level. This behavior is configurable.

The project object provides a LoggingManager, which allows you to change the log levels that
standard out or error are redirected to when your build script is evaluated.

build.gradle.kts

logging.captureStandardOutput(LogLevel.INFO)
println("A message which is logged at INFO level")

build.gradle

logging.captureStandardOutput LogLevel.INFO
println 'A message which is logged at INFO level'

To change the log level for standard out or error during task execution, use a LoggingManager.
build.gradle.kts

tasks.register("logInfo") {
logging.captureStandardOutput(LogLevel.INFO)
doFirst {
println("A task message which is logged at INFO level")
}
}

build.gradle

tasks.register('logInfo') {
logging.captureStandardOutput LogLevel.INFO
doFirst {
println 'A task message which is logged at INFO level'
}
}

Gradle also integrates with the Java Util Logging
(https://docs.oracle.com/javase/8/docs/api/java/util/logging/package-summary.html), Jakarta
Commons Logging and Log4j (https://logging.apache.org/log4j/2.x/) logging toolkits. Any log
messages your build classes write using these logging toolkits will be redirected to Gradle’s logging
system.

Changing what Gradle logs

WARNING The configuration cache limits the ability to customize Gradle’s logging UI. The custom
logger can only implement supported listener interfaces. These interfaces do not receive events
when the configuration cache entry is reused because the configuration phase is skipped.

You can replace much of Gradle’s logging UI with your own. You could do this if you want to
customize the UI somehow - to log more or less information or to change the formatting. Simply
replace the logging using the Gradle.useLogger(java.lang.Object) method. This is accessible from a
build script, an init script, or via the embedding API. Note that this completely disables Gradle’s
default output. Below is an example init script that changes how task execution and build
completion are logged:

customLogger.init.gradle.kts

useLogger(CustomEventLogger())

@Suppress("deprecation")
class CustomEventLogger() : BuildAdapter(), TaskExecutionListener {

override fun beforeExecute(task: Task) {
println("[${task.name}]")
}

override fun afterExecute(task: Task, state: TaskState) {
println()
}

override fun buildFinished(result: BuildResult) {
println("build completed")
if (result.failure != null) {
(result.failure as Throwable).printStackTrace()
}
}
}

customLogger.init.gradle

useLogger(new CustomEventLogger())

@SuppressWarnings("deprecation")
class CustomEventLo