User Guide
Version 8.9
Table of Contents
OVERVIEW. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Gradle User Manual. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
The User Manual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
RELEASES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Installing Gradle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Compatibility Matrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
The Feature Lifecycle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
RUNNING GRADLE BUILDS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
CORE CONCEPTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Gradle Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Gradle Wrapper Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Command-Line Interface Basics. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Settings File Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Build File Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Dependency Management Basics. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Task Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Plugin Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Gradle Incremental Builds and Build Caching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Build Scans . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
OTHER TOPICS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Continuous Builds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
AUTHORING GRADLE BUILDS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
THE BASICS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Gradle Directories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Multi-Project Build Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Build Lifecycle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Writing Settings Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Writing Build Scripts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
Using Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Writing Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Using Plugins. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Writing Plugins. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
STRUCTURING BUILDS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Structuring Projects with Gradle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Declaring Dependencies between Subprojects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
Sharing Build Logic between Subprojects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Composite Builds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
Configuration On Demand. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
DEVELOPING TASKS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Understanding Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Configuring Tasks Lazily . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
Understanding Lazy properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
Creating a Property or Provider instance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
Connecting properties together . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
Working with files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 184
Working with task inputs and outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
Working with collections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
Working with maps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
Applying a convention to a property . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
Where to apply conventions from? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Making a property unmodifiable. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205
Using the Provider API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
Provider Files API Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
Property Files API Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
Lazy Collections API Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
Lazy Objects API Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
Developing Parallel Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
Advanced Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
DEVELOPING PLUGINS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
Understanding Plugins. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
Understanding Implementation Options for Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
Implementing Pre-compiled Script Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
Implementing Binary Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259
Testing Gradle plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 290
Publishing Plugins to the Gradle Plugin Portal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
OTHER TOPICS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
Gradle-managed Directories. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
Working With Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
Logging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 374
Configuring the Build Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 382
Initialization Scripts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
Using Shared Build Services . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
Dataflow Actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 412
Testing Build Logic with TestKit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416
Using Ant from Gradle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 427
AUTHORING JVM BUILDS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 443
Building Java & JVM projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 443
Testing in Java & JVM projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
Managing Dependencies of JVM Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 501
JAVA TOOLCHAINS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 506
Toolchains for JVM projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 506
Toolchain Resolver Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 522
JVM PLUGINS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 525
The Java Library Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 525
The Application Plugin. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 537
The Java Platform Plugin. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 544
The Groovy Plugin. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 550
The Scala Plugin. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 559
WORKING WITH DEPENDENCIES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 571
Dependency Management Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 571
THE BASICS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 575
Dependency Management. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 575
Declaring repositories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 578
Declaring dependencies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 611
Understanding the difference between libraries and applications . . . . . . . . . . . . . . . . . . . . . . . . 633
View and Debug Dependencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 635
Understanding dependency resolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 641
Verifying dependencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 649
DECLARING VERSIONS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 675
Declaring Versions and Ranges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 675
Declaring Rich Versions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 679
Handling versions which change over time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 682
Locking dependency versions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 691
CONTROLLING TRANSITIVES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 701
Upgrading versions of transitive dependencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 701
Downgrading versions and excluding dependencies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 702
Sharing dependency versions between projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 709
Aligning dependency versions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 732
Handling mutually exclusive dependencies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 739
Fixing metadata with component metadata rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 743
Customizing resolution of a dependency directly. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 766
Preventing accidental dependency upgrades. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 784
PRODUCING AND CONSUMING VARIANTS OF LIBRARIES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 791
Declaring Capabilities of a Library . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 791
Modeling library features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 795
Understanding variant selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 806
Working with Variant Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 824
Sharing outputs between projects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 831
Transforming dependency artifacts on resolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 841
PUBLISHING LIBRARIES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 858
Publishing a project as module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 858
Understanding Gradle Module Metadata . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 862
Signing artifacts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 867
Customizing publishing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 868
The Maven Publish Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 879
The Ivy Publish Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 896
OPTIMIZING BUILD PERFORMANCE. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 907
Improve the Performance of Gradle Builds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 907
Gradle Daemon . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 927
File System Watching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 935
Incremental build . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 938
Configuration cache . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 975
Inspecting Gradle Builds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1015
USING THE BUILD CACHE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1027
Build Cache . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1027
Use cases for the build cache . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1040
Build cache performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1043
Important concepts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1047
Caching Java projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1052
Caching Android projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1057
Debugging and diagnosing cache misses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1060
Solving common problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1068
REFERENCE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1078
Command-Line Interface Reference. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1078
Gradle Wrapper Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1097
Gradle Plugin Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1107
Gradle & Third-party Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1110
GRADLE DSLs and API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1114
A Groovy Build Script Primer. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1114
Gradle Kotlin DSL Primer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1119
LICENSE INFORMATION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1151
License Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1151
OVERVIEW
Gradle User Manual
Gradle Build Tool
Why Gradle?
Gradle is a widely used and mature tool with an active community and a strong developer
ecosystem.
• Gradle is the most popular build system for the JVM and is the default system for Android and
Kotlin Multi-Platform projects. It has a rich community plugin ecosystem.
• Gradle can automate a wide range of software build scenarios using either its built-in
functionality, third-party plugins, or custom build logic.
• Gradle provides a high-level, declarative, and expressive build language that makes it easy to
read and write build logic.
• Gradle is fast, scalable, and can build projects of any size and complexity.
• Gradle produces dependable results while benefiting from optimizations such as incremental
builds, build caching, and parallel execution.
Gradle, Inc. provides a free service called Build Scan® that offers extensive information and
insights about your builds. You can view scans to identify problems or share them to get help
with debugging.
Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.
Compatible IDEs
All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your
continuous integration (CI) server.
Education
The Gradle User Manual is the official documentation for the Gradle Build Tool.
• Getting Started Tutorial — Learn Gradle basics and the benefits of building your App with
Gradle.
• Training Courses — Head over to the courses page to sign up for free Gradle training.
Support
• Forum — The fastest way to get help is through the Gradle Forum.
• Slack — Community members and core contributors answer questions directly on our Slack
Channel.
Licenses
Gradle Build Tool source code is open and licensed under the Apache License 2.0. Gradle user
manual and DSL reference manual are licensed under Creative Commons Attribution-
NonCommercial-ShareAlike 4.0 International License.
Releases
Information on Gradle releases and how to install Gradle is found on the Installation page.
Content
The Gradle User Manual is broken down into the following sections:
Optimizing Builds
Use caches to optimize your build and understand the Gradle daemon, incremental builds and
file system watching.
Reference
Installing Gradle
If all you want to do is run an existing Gradle project, you do not need to install Gradle as long
as the build uses the Gradle Wrapper. The wrapper is identifiable by the presence of the gradlew
or gradlew.bat files in the root of the project:
. ①
├── gradle
│   └── wrapper ②
├── gradlew ③
├── gradlew.bat ③
└── ⋮
① Project root.
② Gradle Wrapper.
③ Scripts for executing builds with the Wrapper.
If the gradlew or gradlew.bat files are already present in your project, you do not need to install
Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.
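When the wrapper scripts are present, builds can be launched without a local Gradle installation. A minimal sketch in POSIX shell (the helper names has_wrapper and run_gradle_cmd are made up for illustration; they only inspect the file layout shown above):

```shell
# has_wrapper DIR — succeeds if DIR contains a Gradle Wrapper script,
# i.e. gradlew (Unix) or gradlew.bat (Windows)
has_wrapper() {
  [ -f "$1/gradlew" ] || [ -f "$1/gradlew.bat" ]
}

# run_gradle_cmd DIR — echoes the command to invoke: prefer the wrapper so
# every contributor uses the Gradle version pinned by the project
run_gradle_cmd() {
  if has_wrapper "$1"; then
    echo "$1/gradlew"
  else
    echo "gradle"
  fi
}
```

A script could then call `"$(run_gradle_cmd .)" build` to run the project's pinned Gradle when available, falling back to a system-wide installation otherwise.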
You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version
for your project. Please use the Gradle Wrapper to upgrade Gradle.
Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle
separately when only working within that IDE.
If you do not meet the criteria above and decide to install Gradle on your machine, first check
whether Gradle is already installed by running gradle -v in your terminal. If the command is not
found, Gradle is not installed, and you can follow the instructions below.
You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done
manually or using a package manager like SDKMAN! or Homebrew.
You can find all Gradle releases and their checksums on the releases page.
Prerequisites
Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or
higher to run. You can check the compatibility matrix for more information.
❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)
Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your
project. In this example, the $PATH points to JDK17:
❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin
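The JDK prerequisite can be checked programmatically by extracting the major version from the version string. A sketch in POSIX shell (the helper name parse_java_major is made up for illustration; it handles both the legacy and modern Java numbering schemes):

```shell
# parse_java_major VERSION — prints the major Java version, handling both
# the legacy scheme (1.8.0_292 -> 8) and the modern one (11.0.18 -> 11)
parse_java_major() {
  case "$1" in
    1.*) echo "$1" | cut -d. -f2 ;;
    *)   echo "$1" | cut -d. -f1 ;;
  esac
}
```

Feeding it the version printed by `java -version` lets a setup script verify that the major version is 8 or higher before invoking Gradle.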
You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory.
This is especially useful when multiple JDKs are installed:
On Windows:
❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80
On macOS or Linux:
❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk-16.jdk/Contents/Home
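On macOS or Linux, both variables are typically set together in a shell profile. A sketch (the JDK path below is hypothetical; substitute your own installation directory):

```shell
# Hypothetical JDK location — substitute the directory of the JDK you want
# Gradle to use (commonly under /usr/lib/jvm on Linux or /Library/Java on macOS)
export JAVA_HOME=/usr/lib/jvm/temurin-17
# Prepending $JAVA_HOME/bin makes this JDK take precedence over any others
# already on the PATH
export PATH="$JAVA_HOME/bin:$PATH"
```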
Gradle supports Kotlin and Groovy as the main build languages. Gradle ships with its own Kotlin
and Groovy libraries, so they do not need to be installed; any existing installations are ignored
by Gradle.
See the full compatibility notes for Java, Groovy, Kotlin, and Android.
Linux installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:
❯ sdk install gradle
Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle
that is incompatible or incomplete when compared to the official version.
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle gradle-8.9-bin.zip
❯ ls /opt/gradle/gradle-8.9
LICENSE NOTICE bin README init.d lib media
To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:
❯ export PATH=$PATH:/opt/gradle/gradle-8.9/bin
Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.
export GRADLE_HOME=/opt/gradle/gradle-8.9
export PATH=${GRADLE_HOME}/bin:${PATH}
macOS installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:
❯ sdk install gradle
Using Homebrew:
❯ brew install gradle
Using MacPorts:
❯ sudo port install gradle
Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc.
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /usr/local/gradle
❯ unzip gradle-8.9-bin.zip -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.9
LICENSE NOTICE README bin init.d lib
To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:
❯ export PATH=$PATH:/usr/local/gradle/gradle-8.9/bin
Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.
It’s a good idea to edit .bash_profile in your home directory to add the GRADLE_HOME variable:
export GRADLE_HOME=/usr/local/gradle/gradle-8.9
export PATH=$GRADLE_HOME/bin:$PATH
Windows installation
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
Create a new directory C:\Gradle with File Explorer.
Open a second File Explorer window and go to the directory where the Gradle distribution was
downloaded. Double-click the ZIP archive to expose the content. Drag the content folder
gradle-8.9 to your newly created C:\Gradle folder.
Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of
your choice.
To install Gradle, the path to the unpacked files needs to be in your Path.
In File Explorer right-click on the This PC (or Computer) icon, then click Properties → Advanced
System Settings → Environmental Variables.
Under System Variables select Path, then click Edit. Add an entry for C:\Gradle\gradle-8.9\bin.
Click OK to save.
Alternatively, you can add the environment variable GRADLE_HOME and point this to the unzipped
distribution. Instead of adding a specific version of Gradle to your Path, you can add
%GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the
GRADLE_HOME environment variable.
Open a console (or a Windows command prompt) and run gradle -v to confirm the installation and
display the version, e.g.:
❯ gradle -v
------------------------------------------------------------
Gradle 8.9
------------------------------------------------------------
Kotlin: 1.9.23
Groovy: 3.0.21
Ant: Apache Ant(TM) version 1.10.13 compiled on January 4 2023
Launcher JVM: 11.0.23 (Eclipse Adoptium 11.0.23+9)
Daemon JVM: /Library/Java/JavaVirtualMachines/temurin-11.jdk/Contents/Home (no JDK
specified, using current Java home)
OS: Mac OS X 14.5 aarch64
You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available
from the releases page) and following these verification instructions.
Compatibility Matrix
The sections below describe Gradle’s compatibility with several integrations. Versions not listed
here may or may not work.
Java
A Java version between 8 and 22 is required to execute Gradle. Java 23 and later versions are not
yet supported.
Java 6 and 7 can be used for compilation but are deprecated for use with testing. Testing with Java 6
and 7 will not be supported in Gradle 9.0.
Any fully supported version of Java can be used for compilation or testing. However, the latest Java
version may only be supported for compilation or testing, not for running Gradle. Support is
achieved using toolchains and applies to all tasks supporting toolchains.
See the table below for the Java version supported by a specific Gradle release:

Java version   Support for toolchains   Support for running Gradle
8              N/A                      2.0
9              N/A                      4.3
10             N/A                      4.7
11             N/A                      5.0
12             N/A                      5.4
13             N/A                      6.0
14             N/A                      6.3
15             6.7                      6.7
16             7.0                      7.0
17             7.3                      7.3
18             7.5                      7.5
19             7.6                      7.6
20             8.1                      8.3
21             8.4                      8.5
22             8.7                      8.8
23             N/A                      N/A
Kotlin
Gradle is tested with Kotlin 1.6.10 through 2.0.0. Beta and RC versions may or may not work.
Groovy
Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy
DSL build scripts.
Android
Gradle is tested with Android Gradle Plugin 7.3 through 8.4. Alpha and beta versions may or may
not work.
The Feature Lifecycle
Continuous improvement combined with frequent delivery allows new features to be available to
users early. Early users provide invaluable feedback, which is incorporated into the development
process.
Getting new functionality into the hands of users regularly is a core value of the Gradle platform.
At the same time, API and feature stability are taken very seriously and considered a core value of
the Gradle platform. Design choices and automated testing are engineered into the development
process and formalized by the section on backward compatibility.
The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of
Gradle what the state of a feature is. The term feature typically means an API or DSL method or
property in this context, but it is not restricted to this definition. Command line arguments and
modes of execution (e.g. the Build Daemon) are two examples of other features.
Feature States
1. Internal
2. Incubating
3. Public
4. Deprecated
1. Internal
Internal features are not designed for public use and are intended only for use by Gradle itself.
They can change in any way at any point in time without notice. Therefore, we recommend
avoiding the use of such features. Internal features are not documented: if a feature appears in
this User Manual, the DSL Reference, or the API Reference, then it is not internal.
2. Incubating
Features are introduced in the incubating state to allow real-world feedback to be incorporated into
the feature before making it public. It also gives early access to users willing to test potential
future changes.
A feature in an incubating state may change in future Gradle versions until it is no longer
incubating. Changes to incubating features for a Gradle release will be highlighted in the release
notes for that release. The incubation period for new features varies depending on the feature’s
scope, complexity, and nature.
Features in incubation are clearly indicated. In the source code, all methods/properties/classes that
are incubating are annotated with @Incubating. This results in a special mark for them in the DSL
and API references.
If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the
incubating state.
The feature preview API allows certain incubating features to be activated by adding
enableFeaturePreview('FEATURE') in your settings file. Individual preview features will be
announced in release notes.
When incubating features are either promoted to public or removed, the feature preview flags for
them become obsolete, have no effect, and should be removed from the settings file.
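As a sketch, enabling a preview feature in a Kotlin DSL settings file looks like this (the flag name below is illustrative; consult the release notes for the preview flags available in your Gradle version):

```kotlin
// settings.gradle.kts
// The flag name is illustrative -- use a flag announced in the release notes.
enableFeaturePreview("SOME_PREVIEW_FEATURE")
```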
3. Public
The default state for a non-internal feature is public. Anything documented in the User Manual, DSL
Reference, or API reference that is not explicitly said to be incubating or deprecated is considered
public. Features are said to be promoted from an incubating state to public. The release notes for
each release indicate which previously incubating features are being promoted by the release.
A public feature will never be removed or intentionally changed without undergoing deprecation.
All public features are subject to the backward compatibility policy.
4. Deprecated
Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such
features will eventually be removed from Gradle after being deprecated. A deprecated feature may
become stale until it is finally removed according to the backward compatibility policy.
Deprecated features are indicated as such. In the source code, all methods/properties/classes that
are deprecated are annotated with @java.lang.Deprecated, which is reflected in the DSL and API
References. In most cases, there is a replacement for the deprecated element, which will be
described in the documentation. Using a deprecated feature will result in a runtime warning in
Gradle’s output.
The use of deprecated features should be avoided. The release notes for each release indicate any
features being deprecated by the release.
Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.). Once a public
feature is introduced in a Gradle release, it will remain indefinitely unless deprecated. Once
deprecated, it may be removed in the next major release. Deprecated features may be supported
across major releases, but this is not guaranteed.
A nightly build is created from the development branch every day. It contains all of the changes
validated by Gradle’s extensive continuous integration tests during that day. Nightly builds may
contain new changes that may or may not be stable.
The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or
major release. When no problems are found after a short time (usually a week), the release
candidate is promoted to a general availability (GA) release. If a regression is found in the release
candidate, a new RC distribution is created, and the process repeats. Release candidates are
supported for as long as the release window is open, but they are not intended to be used for
production. Bug reports are greatly appreciated during the RC phase.
The Gradle team may create additional patch releases to replace the final release due to critical bug
fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.
Once a release candidate has been made, all feature development moves on to the next release for
the latest major version. As such, each minor Gradle release causes the previous minor releases in
the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or
feature backports.
For major versions, Gradle will backport critical fixes and security fixes to the last minor in the
previous major version. For example, when Gradle 7 was the latest major version, several releases
were made in the 6.x line, including Gradle 6.9 (and subsequent releases).
• The previous major version becomes maintenance only. It will only receive critical bug fixes
and security fixes.
• The major version before the previous one becomes end-of-life (EOL), and that release line
will not receive any new fixes.
RUNNING GRADLE BUILDS
CORE CONCEPTS
Gradle Basics
Gradle automates building, testing, and deployment of software from information in build
scripts.
Projects
A Gradle project is a piece of software that can be built, such as an application or a library.
Single project builds include a single project called the root project.
Multi-project builds include one root project and any number of subprojects.
Build Scripts
Build scripts detail to Gradle what steps to take to build the project.
Dependency Management
Each project typically includes a number of external dependencies that Gradle will resolve during
the build.
Tasks
Tasks are a basic unit of work, such as compiling code or running your tests.
Each project contains one or more tasks defined inside a build script or a plugin.
Plugins
Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.
Many developers will interact with Gradle for the first time through an existing project.
The presence of the gradlew and gradlew.bat files in the root directory of a project is a clear
indicator that Gradle is used.
project
├── gradle ①
│ ├── libs.versions.toml ②
│ └── wrapper
│ ├── gradle-wrapper.jar
│ └── gradle-wrapper.properties
├── gradlew ③
├── gradlew.bat ③
├── settings.gradle(.kts) ④
├── subproject-a
│ ├── build.gradle(.kts) ⑤
│ └── src ⑥
└── subproject-b
├── build.gradle(.kts) ⑤
    └── src ⑥
① Gradle directory to store wrapper files and more
② Gradle version catalog for dependency management
③ Gradle wrapper scripts
④ Gradle settings file to define a root project name and subprojects
⑤ Gradle build scripts of the two subprojects
⑥ Source code and/or additional files for the projects
Invoking Gradle
IDE
Gradle is built-in to many IDEs including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
Gradle can be automatically invoked when you build, clean, or run your app in the IDE.
It is recommended that you consult the manual for the IDE of your choice to learn more about how
Gradle can be used and configured.
Command line
Gradle can be invoked in the command line once installed. For example:
$ gradle build
Gradle Wrapper
The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to
execute a Gradle build. It is found in the project root directory as a gradlew or gradlew.bat file:
• Provisions the Gradle version for different execution environments (IDEs, CI servers…).
It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and
standardized execution of the build.
Depending on the operating system, you run gradlew or gradlew.bat instead of the gradle command.
$ gradle build
$ ./gradlew build
$ .\gradlew.bat build
The command is run in the same directory that the Wrapper is located in. If you want to run the
command in a different directory, you must provide the relative path to the Wrapper:
$ ../gradlew build
The following console output demonstrates the use of the Wrapper on a Windows machine, in the
command prompt (cmd), for a Java-based project:
$ gradlew.bat build
Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip to C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-al\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and
Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-
all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle
.
├── gradle
│ └── wrapper
│ ├── gradle-wrapper.jar ①
│ └── gradle-wrapper.properties ②
├── gradlew ③
└── gradlew.bat ④
① gradle-wrapper.jar: This is a small JAR file that contains the Gradle Wrapper code. It is
responsible for downloading and installing the correct version of Gradle for a project if it’s not
already installed.
② gradle-wrapper.properties: This file contains configuration properties for the Gradle Wrapper,
such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or
TARBALL).
③ gradlew: This is a shell script (Unix-based systems) that acts as a wrapper around gradle-
wrapper.jar. It is used to execute Gradle tasks on Unix-based systems without needing to
manually install Gradle.
④ gradlew.bat: This is a batch script (Windows) that serves the same purpose as gradlew but is used
on Windows systems.
$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2
$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2
Substitute ./gradlew (in macOS / Linux) or gradlew.bat (in Windows) for gradle in the following
examples.
If multiple tasks are specified, you should separate them with a space.
Options that accept values can be specified with or without = between the option and argument.
The use of = is recommended.
Options that enable behavior have long-form options with inverses specified with --no-. For
example, --build-cache and --no-build-cache are opposites.
Many long-form options have short-option equivalents. The following are equivalent:
gradle --help
gradle -h
Command-line usage
The following sections describe the use of the Gradle command-line interface. Some plugins also
add their own command line options.
Executing tasks
$ gradle :taskName
This will run the single taskName and all of its dependencies.
To pass an option to a task, prefix the option name with -- after the task name:
The primary purpose of the settings file is to add subprojects to your build.
• For multi-project builds, the settings file is mandatory and declares all subprojects.
Settings script
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
The settings file is typically found in the root directory of the project.
settings.gradle.kts
rootProject.name = "root-project" ①
include("sub-project-a") ②
include("sub-project-b")
include("sub-project-c")
① Define the project name.
② Add subprojects.
settings.gradle
rootProject.name = 'root-project' ①
include('sub-project-a') ②
include('sub-project-b')
include('sub-project-c')
① Define the project name.
② Add subprojects.
rootProject.name = "root-project"
2. Add subprojects
The settings file defines the structure of the project by including subprojects, if there are any:
include("app")
include("business-logic")
include("data-model")
1. The libraries and/or plugins on which Gradle and the build script depend.
2. The libraries on which the project sources (i.e., source code) depend.
Build scripts
The build script is either a build.gradle file written in Groovy or a build.gradle.kts file in Kotlin.
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
build.gradle.kts
plugins {
id("application") ①
}
application {
mainClass = "com.example.Main" ②
}
① Add plugins.
② Define the main class for the application.
plugins {
id 'application' ①
}
application {
mainClass = 'com.example.Main' ②
}
① Add plugins.
② Define the main class for the application.
1. Add plugins
Adding a plugin to a build is called applying a plugin and makes additional functionality available.
plugins {
id("application")
}
Applying the Application plugin also implicitly applies the Java plugin. The java plugin adds Java
compilation along with testing and bundling capabilities to a project.
A plugin adds tasks to a project. It also adds properties and methods to a project.
The application plugin defines tasks that package and distribute an application, such as the run
task.
The Application plugin provides a way to declare the main class of a Java application, which is
required to execute the code.
application {
mainClass = "com.example.Main"
}
In this example, the main class (i.e., the point where the program’s execution begins) is
com.example.Main.
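For reference, a minimal main class matching the mainClass declaration above might look like the following sketch (the file location, message() helper, and greeting text are illustrative; the application plugin only requires a main method):

```kotlin
// src/main/kotlin/com/example/Main.kt -- hypothetical entry point for the application plugin
package com.example

object Main {
    // message() is an illustrative helper; only main() is required
    fun message(): String = "Hello World"

    @JvmStatic
    fun main(args: Array<String>) {
        // Executed by the `run` task added by the application plugin
        println(message())
    }
}
```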
Gradle build scripts define the process to build projects that may require external dependencies.
Dependencies refer to JARs, plugins, libraries, or source code that support building your project.
Version Catalog
The catalog makes sharing dependencies and version configurations between subprojects simple. It
also allows teams to enforce versions of libraries and plugins in large projects.
1. [versions] to declare the version numbers that plugins and libraries will reference.
[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"

[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }

[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }
The file is located in the gradle directory so that it can be used by Gradle and IDEs automatically.
The version catalog should be checked into source control: gradle/libs.versions.toml.
To add a dependency to your project, specify a dependency in the dependencies block of your
build.gradle(.kts) file.
The following build.gradle.kts file adds a plugin and two dependencies to the project using the
version catalog above:
plugins {
alias(libs.plugins.androidApplication) ①
}
dependencies {
// Dependency on a remote binary to compile and run the code
implementation(libs.googleMaterial) ②
// Dependency on a remote binary to compile and run the test code
testImplementation(libs.mockitoCore) ③
}
① Applies the Android Gradle plugin to this project, which adds several features that are specific to
building Android apps.
② Adds the Material dependency to the project. Material Design provides components for creating
a user interface in an Android App. This library will be used to compile and run the Kotlin
source code in this project.
③ Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java
code. This library will be used to compile and run the test source code in this project.
• The material library is added to the implementation configuration, which is used for compiling
and running production code.
• The mockito-core library is added to the testImplementation configuration, which is used for
compiling and running test code.
You can view your dependency tree in the terminal using the ./gradlew :app:dependencies
command:
$ ./gradlew :app:dependencies
------------------------------------------------------------
Project ':app'
------------------------------------------------------------
...
Task Basics
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
You run a Gradle build task using the gradle command or by invoking the Gradle Wrapper
(./gradlew or gradlew.bat) in your project directory:
$ ./gradlew build
Available tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in the project by running the following command in the terminal:
$ ./gradlew tasks
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
...
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
...
Other tasks
-----------
compileJava - Compiles main Java source.
...
Running tasks
$ ./gradlew run
In this example Java project, the output of the run task is a Hello World statement printed on the
console.
Task dependency
For example, for Gradle to execute the build task, the Java code must first be compiled. Thus, the
build task depends on the compileJava task.
This means that the compileJava task will run before the build task:
$ ./gradlew build
Build scripts can optionally define task dependencies. Gradle then automatically determines the
task execution order.
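A sketch of an explicit task dependency in a Kotlin DSL build script (the task names are illustrative):

```kotlin
// build.gradle.kts
tasks.register("hello") {
    doLast { println("Hello") }
}

tasks.register("greet") {
    dependsOn("hello")  // guarantees `hello` runs before `greet`
    doLast { println("Greetings!") }
}
```

Running gradle greet would then execute hello first, because Gradle derives the execution order from the declared dependency.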
Plugin Basics
Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a
sophisticated dependency resolution engine. The rest of its functionality comes from plugins.
A plugin is a piece of software that provides additional functionality to the Gradle build system.
Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-
related capabilities:
Plugin distribution
2. Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.
3. Local plugins - Gradle enables users to create custom plugins using APIs.
Applying plugins
Applying a plugin to a project allows the plugin to extend the project’s capabilities.
You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a
version:
plugins {
id «plugin id» version «plugin version»
}
1. Core plugins
Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These
plugins provide essential functionality for building and managing projects.
• groovy: Adds support for compiling and testing Groovy source files.
• ear: Adds support for building EAR files for enterprise applications.
Core plugins are unique in that they provide short names, such as java for the core JavaPlugin,
when applied in build scripts. They also do not require versions. To apply the java plugin to a
project:
build.gradle.kts
plugins {
id("java")
}
There are many Gradle Core Plugins users can take advantage of.
2. Community plugins
Community plugins are plugins developed by the Gradle community, rather than being part of the
core Gradle distribution. These plugins provide additional functionality that may be specific to
certain use cases or technologies.
The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java
applications.
build.gradle.kts
plugins {
id("org.springframework.boot") version "3.1.5"
}
Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can
easily discover and use them.
3. Local plugins
Custom or local plugins are developed and used within a specific project or organization. These
plugins are not shared publicly and are tailored to the specific needs of the project or organization.
Local plugins can encapsulate common build logic, provide integrations with internal systems or
tools, or abstract complex functionality into reusable components.
Gradle provides users with the ability to develop custom plugins using APIs. To create your own
plugin, you’ll typically follow these steps:
1. Define the plugin class: create a new class that implements the Plugin<Project> interface.
2. Build and optionally publish your plugin: generate a JAR file containing your plugin code and
optionally publish this JAR to a repository (local or remote) to be used in other projects.
// Publish the plugin
plugins {
`maven-publish`
}
publishing {
publications {
create<MavenPublication>("mavenJava") {
from(components["java"])
}
}
repositories {
mavenLocal()
}
}
3. Apply your plugin: when you want to use the plugin, include the plugin ID and version in the
plugins{} block of the build file.
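Step 1 above might be sketched as follows (the class and task names are illustrative; compiling this requires the Gradle API on the classpath):

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project

// A hypothetical custom plugin that registers a single task.
class HelloPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Register a lazily-created task named "hello"
        project.tasks.register("hello") { task ->
            task.doLast { println("Hello from HelloPlugin") }
        }
    }
}
```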
Next Step: Learn about Incremental Builds and Build Caching >>
Gradle uses two main features to reduce build time: incremental builds and build caching.
Incremental builds
An incremental build is a build that avoids running tasks whose inputs have not changed since the
previous build. Re-executing such tasks is unnecessary if they would only re-produce the same
output.
For incremental builds to work, tasks must define their inputs and outputs. Gradle determines
whether the inputs or outputs have changed at build time. If they have changed, Gradle executes
the task. Otherwise, it skips execution.
Incremental builds are always enabled, and the best way to see them in action is to turn on verbose
mode. With verbose mode, each task state is labeled during a build:
When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is
printed next to the task.
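As a sketch, a task that participates in incremental builds declares its inputs and outputs with annotations (the task and property names are illustrative; compiling this requires the Gradle API):

```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.file.RegularFileProperty
import org.gradle.api.tasks.InputFile
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction

abstract class CopyNoticeTask : DefaultTask() {
    @get:InputFile
    abstract val source: RegularFileProperty  // tracked input

    @get:OutputFile
    abstract val target: RegularFileProperty  // tracked output

    @TaskAction
    fun copy() {
        // Executed only when the tracked input or output changed;
        // otherwise the task is reported as UP-TO-DATE.
        target.get().asFile.writeText(source.get().asFile.readText())
    }
}
```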
Build caching
Incremental Builds are a great optimization that helps avoid work already done. If a developer
continuously changes a single file, there is likely no need to rebuild all the other files in the project.
However, what happens when the same developer switches to a new branch created last week? The
files are rebuilt, even though the developer is building something that has been built before.
The build cache stores previous build results and restores them when needed. It prevents the
redundant work and cost of executing time-consuming and expensive processes.
When the build cache has been used to repopulate the local directory, the tasks are marked as FROM-
CACHE:
Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and
not FROM-CACHE.
The build cache allows you to share and reuse unchanged build and test outputs across teams. This
speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new
code changes.
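As an illustration, the local build cache can be configured in the settings file (the directory shown is an arbitrary choice):

```kotlin
// settings.gradle.kts
buildCache {
    local {
        isEnabled = true
        directory = File(rootDir, "build-cache")  // illustrative location
    }
}
```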
Build Scans
Gradle captures your build metadata and sends it to the Build Scan Service. The service then
transforms the metadata into information you can analyze and share with others.
The information that scans collect can be an invaluable resource when troubleshooting,
collaborating on, or optimizing the performance of your builds.
For example, with a build scan, it’s no longer necessary to copy and paste error messages or include
all the details about your environment each time you want to ask a question on Stack Overflow,
Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.
Enable Build Scans
To enable a build scan on a gradle command, add --scan to the command line:
gradle build --scan
For example, you can continuously run the test task and all dependent tasks by running:
gradle test --continuous
Gradle will behave as if you ran gradle test after a change to sources or tests that contribute to the
requested tasks. This means unrelated changes (such as changes to build scripts) will not trigger a
rebuild. To incorporate build logic changes, the continuous build must be restarted manually.
Continuous build uses file system watching to detect changes to the inputs. If file system watching
does not work on your system, then continuous build won’t work either. In particular, continuous
build does not work when using --no-daemon.
When Gradle detects a change to the inputs, it will not trigger the build immediately. Instead, it will
wait until no additional changes are detected for a certain period of time - the quiet period. You can
configure the quiet period in milliseconds by the Gradle property
org.gradle.continuous.quietperiod.
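For example, a gradle.properties entry might look like this (the 500 ms value is illustrative):

```properties
# gradle.properties -- wait 500 ms without further changes before triggering a continuous build
org.gradle.continuous.quietperiod=500
```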
If Gradle is attached to an interactive input source, such as a terminal, the continuous build can be
exited by pressing CTRL-D (On Microsoft Windows, it is required to also press ENTER or RETURN after
CTRL-D).
If Gradle is not attached to an interactive input source (e.g. is running as part of a script), the build
process must be terminated (e.g. using the kill command or similar).
If the build is being executed via the Tooling API, the build can be cancelled using the Tooling API’s
cancellation mechanism.
Limitations
Under some circumstances, continuous build may not detect changes to inputs.
Sometimes, creating an input directory that was previously missing does not trigger a build, due to
the way file system watching works. For example, creating the src/main/java directory may not
trigger a build. Similarly, if the input is a filtered file tree and no files are matching the filter, the
creation of matching files may not trigger a build.
Inputs of untracked tasks
Changes to the inputs of untracked tasks or tasks that have no outputs may not trigger a build.
Gradle only watches for changes to files inside the project directory. Changes to files outside the
project directory will go undetected and not trigger a build.
Build cycles
Gradle starts watching for changes just before a task executes. If a task modifies its own inputs
while executing, Gradle will detect the change and trigger a new build. If every time the task
executes, the inputs are modified again, the build will be triggered again. This isn’t unique to
continuous build. A task that modifies its own inputs will never be considered up-to-date when run
"normally" without continuous build.
If your build enters a build cycle like this, you can track down the task by looking at the list of files
reported changed by Gradle. After identifying the file(s) that are changed during each build, you
should look for a task that has that file as an input. In some cases, it may be obvious (e.g., a Java file
is compiled with compileJava). In other cases, you can use --info logging to find the task that is out-
of-date due to the identified files.
AUTHORING GRADLE BUILDS
THE BASICS
Gradle Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory
and the Project Root directory.
TIP Not to be confused with the GRADLE_HOME, the optional installation directory for Gradle.
├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧
① Global cache directory (for everything that is not project-specific)
② Version-specific caches (e.g., to support incremental builds)
③ Shared caches (e.g., for artifacts of dependencies)
④ Registry and logs of the Gradle Daemon
⑤ Global initialization scripts
⑥ JDKs downloaded by the toolchain support
⑦ Distributions downloaded by the Gradle Wrapper
⑧ Global Gradle configuration properties
The project root directory contains all source files from your project.
It also contains files and directories Gradle generates, such as .gradle and build.
The .gradle directory holds project-specific caches, while the build directory contains the output of
your builds as well as transient files Gradle uses to support features like incremental builds.
Neither is usually checked into source control.
├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle.kts ⑦
├── subproject-one ⑧
| └── build.gradle.kts ⑨
├── subproject-two ⑧
| └── build.gradle.kts ⑨
└── ⋮
① Project-specific cache directory generated by Gradle.
② Version-specific caches (e.g., to support incremental builds).
③ The build directory of this project into which Gradle generates all build artifacts.
④ Contains the JAR file and configuration of the Gradle Wrapper.
⑤ Project-specific Gradle configuration properties.
⑥ Scripts for executing builds using the Gradle Wrapper.
⑦ The project’s settings file where the list of subprojects is defined.
⑧ Usually a project is organized into one or multiple subprojects.
⑨ Each subproject has its own Gradle build script.
While some small projects and monolithic applications may contain a single build file and source
tree, it is more common for a project to be split into smaller, interdependent modules.
The word "interdependent" is vital, as you typically want to link the many modules together
through a single build.
Gradle supports this scenario through multi-project builds. This is sometimes referred to as a multi-
module project. Gradle refers to modules as subprojects.
A multi-project build consists of one root project and one or more subprojects.
Multi-Project structure
The following represents the structure of a multi-project build that contains two subprojects:
├── .gradle
│ └── ⋮
├── gradle
│ ├── libs.versions.toml
│ └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts ①
├── sub-project-1
│ └── build.gradle.kts ②
├── sub-project-2
│ └── build.gradle.kts ②
└── sub-project-3
└── build.gradle.kts ②
Multi-Project standards
The Gradle community has two standards for multi-project build structures:
1. Multi-Project Builds using buildSrc - a subproject-like directory at the Gradle project root
containing all build logic.
2. Composite Builds - a build that includes other builds where build-logic is a build directory at
the Gradle project root containing reusable build logic.
Multi-project builds allow you to organize projects with many modules, wire dependencies between
those modules, and easily share common build logic amongst them.
For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation
could be structured as follows:
.
├── gradle
├── gradlew
├── settings.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts
The modules have dependencies between them; for example, web-app and mobile-app depend on
lib. This means that in order for Gradle to build web-app or mobile-app, it must build lib first.
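A dependency like this is declared in the consuming module's build script. The following is a minimal sketch for the example layout above (the java-library plugin and configuration names are illustrative assumptions):

```kotlin
// web-app/build.gradle.kts (illustrative sketch)
plugins {
    `java-library`
}

dependencies {
    // web-app needs lib, so Gradle must build :lib before :web-app
    implementation(project(":lib"))
}
```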
settings.gradle.kts
include("mobile-app", "web-app", "api", "lib", "documentation")
NOTE The order in which the subprojects (modules) are included does not matter.
The buildSrc directory is automatically recognized by Gradle. It is a good place to define and
maintain shared configuration or imperative build logic, such as custom tasks or plugins.
If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java
is put in the classpath of the root build script, making it available to any subproject (web-app, mobile-
app, lib, etc…) in the build.
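For illustration, a minimal buildSrc setup for precompiled script plugins might look like the following sketch (the kotlin-dsl plugin is required for *.gradle.kts convention scripts to compile):

```kotlin
// buildSrc/build.gradle.kts (illustrative sketch)
plugins {
    `kotlin-dsl` // enables precompiled script plugins
}

repositories {
    mavenCentral()
}
```

The script buildSrc/src/main/kotlin/shared-build-conventions.gradle.kts then becomes a plugin that any subproject can apply by id, e.g. plugins { id("shared-build-conventions") }.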
2. Composite Builds
Composite Builds, also referred to as included builds, are best for sharing logic between builds (not
subprojects) or isolating access to shared build logic (i.e., convention plugins).
Let’s take the previous example. The logic in buildSrc has been turned into a project that contains
plugins and can be published and worked on independently of the root project build.
The plugin is moved to its own build called build-logic with a build script and settings file:
.
├── gradle
├── gradlew
├── settings.gradle.kts
├── build-logic
│ ├── settings.gradle.kts
│ └── conventions
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts
NOTE The fact that build-logic is located in a subdirectory of the root project is irrelevant. The
folder could be located outside the root project if desired.
settings.gradle.kts
pluginManagement {
includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")
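The included build-logic build has its own settings file. A minimal sketch, matching the layout above, might be:

```kotlin
// build-logic/settings.gradle.kts (illustrative sketch)
rootProject.name = "build-logic"
include("conventions")
```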
Multi-Project path
A project path has the following pattern: it starts with an optional colon, which denotes the root
project.
The root project, :, is the only project in a path not specified by its name.
The rest of a project path is a colon-separated sequence of project names, where the next project is
a subproject of the previous project:
:sub-project-1
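For instance, including a nested subproject produces one path element per level of nesting (the names here are illustrative):

```kotlin
// settings.gradle.kts
include("services:webservice")
// creates the projects :services and :services:webservice
```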
You can see the project paths when running gradle projects:
------------------------------------------------------------
Root project 'project'
------------------------------------------------------------
Project paths usually reflect the filesystem layout, but there are exceptions, most notably for
composite builds.
You can use the gradle projects command to identify the project structure.
Projects:
------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------
Multi-project builds are collections of tasks you can run. The difference is that you may want to
control which project’s tasks get executed.
The following sections will cover your two options for executing tasks in a multi-project build.
The command gradle test will execute the test task in any subprojects, relative to the current
working directory, that have that task.
If you run the command from the root project directory, you will run test in api, shared,
services:shared and services:webservice.
If you run the command from the services project directory, you will only execute the task in
services:shared and services:webservice.
The basic rule behind Gradle’s behavior is to execute all tasks down the hierarchy with this
name, and to complain if no such task is found in any of the subprojects traversed.
NOTE Some task selectors, like help or dependencies, will only run the task on the project they
are invoked on and not on all the subprojects, to reduce the amount of information printed on
the screen.
You can use a task’s fully qualified name to execute a specific task in a particular subproject. For
example: gradle :services:webservice:build will run the build task of the webservice subproject.
The fully qualified name of a task is its project path plus the task name.
This approach works for any task, so if you want to know what tasks are in a particular subproject,
use the tasks task, e.g. gradle :services:webservice:tasks.
The build task is typically used to compile, test, and check a single project.
In multi-project builds, you may often want to do all of these tasks across various projects. The
buildNeeded and buildDependents tasks can help with this.
In this example, the :services:person-service project depends on both the :api and :shared
projects. The :api project also depends on the :shared project.
Assuming you are working on a single project, the :api project, you have been making changes but
have not built the entire project since performing a clean. You want to build any necessary
supporting JARs but only perform code quality and unit tests on the parts of the project you have
changed.
$ gradle :api:build
BUILD SUCCESSFUL in 0s
If you have just gotten the latest version of the source from your version control system, which
included changes in other projects that :api depends on, you might want to build all the projects
you depend on AND test them too.
The buildNeeded task builds AND tests all the projects from the project dependencies of the
testRuntime configuration:
$ gradle :api:buildNeeded
BUILD SUCCESSFUL in 0s
You may want to refactor some part of the :api project used in other projects. If you make these
changes, testing only the :api project is insufficient. You must test all projects that depend on the
:api project.
The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime
configuration) on the specified project:
$ gradle :api:buildDependents
BUILD SUCCESSFUL in 0s
Finally, you can build and test everything in all projects. Any task you run in the root project folder
will cause that same-named task to be run on all the children.
You can run gradle build to build and test ALL projects.
Build Lifecycle
As a build author, you define tasks and dependencies between tasks. Gradle guarantees that these
tasks will execute in order of their dependencies.
For example, if your project tasks include build, assemble, and createDocs, your build script(s) can
ensure that they are executed in the order build → assemble → createDocs.
Task Graphs
This diagram shows two example task graphs, one abstract and the other concrete, with
dependencies between tasks represented as arrows:
Both plugins and build scripts contribute to the task graph via the task dependency mechanism and
annotated inputs/outputs.
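As a sketch, a build script can add an edge to the task graph with an explicit dependsOn (the task names here are hypothetical):

```kotlin
// build.gradle.kts (illustrative sketch)
val generateSources = tasks.register("generateSources") {
    doLast { println("generating sources") }
}

tasks.register("createDocs") {
    dependsOn(generateSources) // adds an edge to the task graph
    doLast { println("creating docs") }
}
```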
Build Phases
Phase 1. Initialization
• Detects the settings.gradle(.kts) file.
• Evaluates the settings file to determine which projects (and included builds) make up the
build.
Phase 2. Configuration
• Evaluates the build scripts, build.gradle(.kts), of every project participating in the build.
• Creates a task graph for requested tasks.
Phase 3. Execution
• Schedules and executes the selected tasks.
Example
The following example shows which parts of settings and build files correspond to various build
phases:
settings.gradle.kts
rootProject.name = "basic"
println("This is executed during the initialization phase.")
build.gradle.kts
tasks.register("configured") {
    println("This is also executed during the configuration phase, because :configured is used in the build.")
}

tasks.register("test") {
    doLast {
        println("This is executed during the execution phase.")
    }
}

tasks.register("testBoth") {
    doFirst {
        println("This is executed first during the execution phase.")
    }
    doLast {
        println("This is executed last during the execution phase.")
    }
    println("This is executed during the configuration phase as well, because :testBoth is used in the build.")
}
settings.gradle
rootProject.name = 'basic'
println 'This is executed during the initialization phase.'
build.gradle
tasks.register('configured') {
    println 'This is also executed during the configuration phase, because :configured is used in the build.'
}

tasks.register('test') {
    doLast {
        println 'This is executed during the execution phase.'
    }
}

tasks.register('testBoth') {
    doFirst {
        println 'This is executed first during the execution phase.'
    }
    doLast {
        println 'This is executed last during the execution phase.'
    }
    println 'This is executed during the configuration phase as well, because :testBoth is used in the build.'
}
The following command executes the test and testBoth tasks specified above. Because Gradle only
configures requested tasks and their dependencies, the configured task never configures:
$ gradle test testBoth
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Phase 1. Initialization
In the initialization phase, Gradle detects the set of projects (root and subprojects) and included
builds participating in the build.
Gradle first evaluates the settings file, settings.gradle(.kts), and instantiates a Settings object.
Then, Gradle creates a Project instance for each project.
Phase 2. Configuration
In the configuration phase, Gradle adds tasks and other properties to the projects found by the
initialization phase.
Phase 3. Execution
Gradle uses the task execution graphs generated by the configuration phase to determine which
tasks to execute.
Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root
directory.
When the settings file settings.gradle(.kts) is found, Gradle instantiates a Settings object.
One of the purposes of the Settings object is to allow you to declare all the projects to be included in
the build.
Settings Scripts
The settings script is either a settings.gradle file in Groovy or a settings.gradle.kts file in Kotlin.
Before Gradle assembles the projects for a build, it creates a Settings instance and executes the
settings file against it.
As the settings script executes, it configures this Settings object. In this sense, the settings file
defines the Settings object.
Many top-level properties and blocks in a settings script are part of the Settings API.
For example, we can set the root project name in the settings script using the Settings.rootProject
property:
settings.rootProject.name = "root"
rootProject.name = "root"
The Settings object exposes a standard set of properties in your settings script.
Name         Description
buildCache   The build cache configuration.
plugins      The container of plugins that have been applied to the settings.
rootDir      The root directory of the build. The root directory is the project directory of the root project.
rootProject  The root project of the build.
settings     Returns this settings object.

Name            Description
include()       Adds the given projects to the build.
includeBuild()  Includes a build at the specified path to the composite build.
A Settings script is a series of method calls to the Gradle API that often use { … }, a special
shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a
closure in Groovy.
Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:
plugins(function() {
id("plugin")
})
The code inside the function is executed against a this object, called a receiver in Kotlin lambdas
and a delegate in Groovy closures. Gradle determines the correct this object and invokes the
correct corresponding method. The this object of the method invocation id("plugin") is of type
PluginDependenciesSpec.
The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script
line by line, top to bottom.
settings.gradle.kts
pluginManagement { ①
    repositories {
        gradlePluginPortal()
        google()
    }
}

plugins { ②
    id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

rootProject.name = "root-project" ③

dependencyResolutionManagement { ④
    repositories {
        mavenCentral()
    }
}

include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")
settings.gradle
pluginManagement { ①
    repositories {
        gradlePluginPortal()
        google()
    }
}

plugins { ②
    id 'org.gradle.toolchains.foojay-resolver-convention' version '0.8.0'
}

rootProject.name = 'root-project' ③

dependencyResolutionManagement { ④
    repositories {
        mavenCentral()
    }
}

include('sub-project-a') ⑤
include('sub-project-b')
include('sub-project-c')
The settings file can optionally manage plugin versions and repositories for your build with
pluginManagement. It provides a centralized way to define which plugins should be used in your
project and from which repositories they should be resolved.
pluginManagement {
    repositories {
        gradlePluginPortal()
        google()
    }
}
The settings file can optionally apply plugins that are required for configuring the settings of the
project. Common examples are the Develocity plugin and the Toolchain Resolver plugin; the latter
is used in the example below.
Plugins applied in the settings file only affect the Settings object.
plugins {
    id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}
The settings file defines your project name using the rootProject.name property:
rootProject.name = "root-project"
The settings file can optionally define rules and configurations for dependency resolution across
your project(s). It provides a centralized way to manage and customize dependency resolution.
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.PREFER_PROJECT)
    repositories {
        mavenCentral()
    }
}
The settings file defines the structure of the project by adding all the subprojects using the include
statement:
include("app")
include("business-logic")
include("data-model")
There are many more properties and methods on the Settings object that you can use to configure
your build.
It’s important to remember that while many Gradle scripts are typically written in short Groovy or
Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings
object in the Gradle API:
include("app")
Is actually:
settings.include("app")
Additionally, the full power of the Groovy and Kotlin languages is available to you.
For example, instead of using include many times to add subprojects, you can iterate over the list of
directories in the project root folder and include them automatically:
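One possible sketch of that approach in the Kotlin DSL, assuming every direct subdirectory containing a build script should become a subproject:

```kotlin
// settings.gradle.kts (illustrative sketch)
rootDir.listFiles()
    ?.filter { it.isDirectory && it.resolve("build.gradle.kts").exists() }
    ?.forEach { include(it.name) }
```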
Then, for each project included in the settings file, Gradle creates a Project instance.
Gradle then looks for a corresponding build script file, which is used in the configuration phase.
Build Scripts
Every Gradle build comprises one or more projects: a root project and subprojects.
A project typically corresponds to a software component that needs to be built, like a library or an
application. It might represent a library JAR, a web application, or a distribution ZIP assembled
from the JARs produced by other projects.
On the other hand, it might represent a thing to be done, such as deploying your application to
staging or production environments.
Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).
A build script configures a project and is associated with an object of type Project.
As the build script executes, it configures this Project object.
The build script is either a *.gradle file in Groovy or a *.gradle.kts file in Kotlin.
Many top-level properties and blocks in a build script are part of the Project API.
For example, the following build script uses the Project.name property to print the name of the
project:
build.gradle.kts
println(name)
println(project.name)
build.gradle
println name
println project.name
$ gradle -q check
project-api
project-api
The first statement uses the top-level reference to the name property of the Project object. The
second statement uses the project property available to any build script, which returns the
associated Project object.
Standard project properties
The Project object exposes a standard set of properties in your build script.
Name Description
uri() Resolves a file path to a URI, relative to the project directory of this project.
task() Creates a Task with the given name and adds it to this project.
A build script is a series of method calls to the Gradle API that often use { … }, a special shortcut
in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a closure in
Groovy.
Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:
plugins(function() {
id("plugin")
})
The code inside the function is executed against a this object, called a receiver in Kotlin lambdas
and a delegate in Groovy closures. Gradle determines the correct this object and invokes the
correct corresponding method. The this object of the method invocation id("plugin") is of type
PluginDependenciesSpec.
The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes
the script line by line, top to bottom.
build.gradle.kts
plugins { ①
    id("org.jetbrains.kotlin.jvm") version "1.9.0"
    id("application")
}

repositories { ②
    mavenCentral()
}

dependencies { ③
    testImplementation("org.jetbrains.kotlin:kotlin-test-junit5")
    testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")
    implementation("com.google.guava:guava:32.1.1-jre")
}

application { ④
    mainClass = "com.example.Main"
}

tasks.named<Test>("test") { ⑤
    useJUnitPlatform()
}
① Apply plugins.
② Define the locations where dependencies can be found.
③ Add dependencies.
④ Set properties.
⑤ Register and configure tasks.
build.gradle
plugins { ①
    id 'org.jetbrains.kotlin.jvm' version '1.9.0'
    id 'application'
}

repositories { ②
    mavenCentral()
}

dependencies { ③
    testImplementation 'org.jetbrains.kotlin:kotlin-test-junit5'
    testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
    implementation 'com.google.guava:guava:32.1.1-jre'
}

application { ④
    mainClass = 'com.example.Main'
}

tasks.named('test') { ⑤
    useJUnitPlatform()
}
① Apply plugins.
② Define the locations where dependencies can be found.
③ Add dependencies.
④ Set properties.
⑤ Register and configure tasks.
Plugins are used to extend Gradle. They are also used to modularize and reuse project
configurations.
plugins {
    id("org.jetbrains.kotlin.jvm") version "1.9.0"
    id("application")
}
In the example, the application plugin, which is included with Gradle, has been applied, describing
our project as a Java application.
The Kotlin Gradle plugin, version 1.9.0, has also been applied. This plugin is not included with
Gradle and, therefore, has to be described using a plugin id and a plugin version so that Gradle can
find and apply it.
A project generally has a number of dependencies it needs to do its work. Dependencies include
plugins, libraries, or components that Gradle must download for the build to succeed.
The build script lets Gradle know where to look for the binaries of the dependencies. More than one
location can be provided:
repositories {
    mavenCentral()
    google()
}
In the example, the guava library and the JetBrains Kotlin plugin (org.jetbrains.kotlin.jvm) will be
downloaded from the Maven Central Repository.
3. Add dependencies
A project generally has a number of dependencies it needs to do its work. These dependencies are
often libraries of precompiled classes that are imported in the project’s source code.
Dependencies are managed via configurations and are retrieved from repositories.
dependencies {
    implementation("com.google.guava:guava:32.1.1-jre")
}
In the example, the application code uses Google’s guava libraries. Guava provides utility methods
for collections, caching, primitives support, concurrency, common annotations, string processing,
I/O, and validations.
4. Set properties
The Project object has an associated ExtensionContainer object that contains all the settings and
properties for the plugins that have been applied to the project.
In the example, the application plugin added an application property, which is used to detail the
main class of our Java application:
application {
    mainClass = "com.example.Main"
}
Tasks perform some basic piece of work, such as compiling classes, running unit tests, or zipping
up a WAR file.
While tasks are typically defined in plugins, you may need to register or configure tasks in build
scripts.
tasks.register<Zip>("zip-reports") {
    from("Reports/")
    include("*")
    archiveFileName.set("Reports.zip")
    destinationDirectory.set(file("/dir"))
}
You may have seen usage of the TaskContainer.create(java.lang.String) method which should be
avoided:
tasks.create<Zip>("zip-reports") {
    from("Reports/")
    include("*")
    archiveFileName.set("Reports.zip")
    destinationDirectory.set(file("/dir"))
}
TIP register(), which enables task configuration avoidance, is preferred over create().
tasks.named<Test>("test") {
useJUnitPlatform()
}
The example below configures the Javadoc task to automatically generate HTML documentation
from Java code:
tasks.named<Javadoc>("javadoc") {
    exclude("app/Internal*.java")
    exclude("app/internal/*")
}
Build Scripting
Statements can include method calls, property assignments, and local variable definitions:
version = '1.0.0.GA'
configurations {
}
repositories {
google()
}
build.gradle.kts
tasks.register("upper") {
    doLast {
        val someString = "mY_nAmE"
        println("Original: $someString")
        println("Upper case: ${someString.toUpperCase()}")
    }
}
build.gradle
tasks.register('upper') {
    doLast {
        String someString = 'mY_nAmE'
        println "Original: $someString"
        println "Upper case: ${someString.toUpperCase()}"
    }
}
$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME
It can contain elements allowed in a Groovy or Kotlin script, such as method definitions and class
definitions:
build.gradle.kts
tasks.register("count") {
    doLast {
        repeat(4) { print("$it ") }
    }
}
build.gradle
tasks.register('count') {
    doLast {
        4.times { print "$it " }
    }
}
$ gradle -q count
0 1 2 3
Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:
build.gradle.kts
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
$ gradle -q task1
I'm task number 1
Declare Variables
Build scripts can declare two kinds of variables: local variables and extra properties.
Local Variables
Declare local variables with the val keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Kotlin language.
Declare local variables with the def keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Groovy language.
build.gradle.kts
val dest = "dest"

tasks.register<Copy>("copy") {
    from("source")
    into(dest)
}
build.gradle
def dest = 'dest'

tasks.register('copy', Copy) {
    from 'source'
    into dest
}
Extra Properties
Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined
properties.
Add, read, and set extra properties via the owning object’s extra property. Alternatively, you can
access extra properties via Kotlin delegated properties using by extra.
Add, read, and set extra properties via the owning object’s ext property. Alternatively, you can use
an ext block to add multiple properties simultaneously.
build.gradle.kts
plugins {
    id("java-library")
}

val springVersion by extra("3.1.0.RELEASE")
val emailNotification by extra("[email protected]")

sourceSets {
    main {
        extra["purpose"] = "production"
    }
    test {
        extra["purpose"] = "test"
    }
    create("plugin") {
        extra["purpose"] = "production"
    }
}

tasks.register("printProperties") {
    val springVersion = springVersion
    val emailNotification = emailNotification
    val productionSourceSets = provider {
        sourceSets.matching { it.extra["purpose"] == "production" }.map { it.name }
    }
    doLast {
        println(springVersion)
        println(emailNotification)
        productionSourceSets.get().forEach { println(it) }
    }
}
build.gradle
plugins {
    id 'java-library'
}

ext {
    springVersion = "3.1.0.RELEASE"
    emailNotification = "[email protected]"
}

sourceSets {
    main {
        ext.purpose = "production"
    }
    test {
        ext.purpose = "test"
    }
    plugin {
        ext.purpose = "production"
    }
}

tasks.register('printProperties') {
    def springVersion = springVersion
    def emailNotification = emailNotification
    def productionSourceSets = provider {
        sourceSets.matching { it.purpose == "production" }.collect { it.name }
    }
    doLast {
        println springVersion
        println emailNotification
        productionSourceSets.get().each { println it }
    }
}
$ gradle -q printProperties
3.1.0.RELEASE
[email protected]
main
plugin
This example adds two extra properties to the project object via by extra. Additionally, it adds a
property named purpose to each source set by setting extra["purpose"]. Once added, you can read
and set these properties via extra.
This example adds two extra properties to the project object via an ext block. Additionally, it adds
a property named purpose to each source set by setting ext.purpose. Once added, you can read and
set all these properties just like predefined ones.
Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows
Gradle to recognize when a script attempts to set a property that does not exist. You can access
extra properties anywhere where you can access their owning object. This gives extra properties a
wider scope than local variables. Subprojects can access extra properties on their parent projects.
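For example, a subproject can read a property set on the root project (the property name here is hypothetical):

```kotlin
// root project's build.gradle.kts
extra["sharedProperty"] = "shared-value"

// a subproject's build.gradle.kts can then read it:
//   val sharedProperty: String by rootProject.extra
```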
For more information about extra properties, see ExtraPropertiesExtension in the API
documentation.
Configure Arbitrary Objects
You can configure arbitrary objects in the following very compact way:
build.gradle.kts
class UserInfo(
    var name: String? = null,
    var email: String? = null
)

tasks.register("greet") {
    val user = UserInfo().apply {
        name = "Isaac Newton"
        email = "[email protected]"
    }
    doLast {
        println(user.name)
        println(user.email)
    }
}
build.gradle
class UserInfo {
    String name
    String email
}

tasks.register('greet') {
    def user = configure(new UserInfo()) {
        name = "Isaac Newton"
        email = "[email protected]"
    }
    doLast {
        println user.name
        println user.email
    }
}
$ gradle -q greet
Isaac Newton
[email protected]
Closure Delegates
Each closure has a delegate object. Groovy uses this delegate to look up variable and method
references to nonlocal variables and closure parameters. Gradle uses this for configuration closures,
where the delegate object refers to the object being configured.
build.gradle
dependencies {
    assert delegate == project.dependencies
    testImplementation('junit:junit:4.13')
    delegate.testImplementation('junit:junit:4.13')
}
Default imports
To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.
import org.gradle.*
import org.gradle.api.*
import org.gradle.api.artifacts.*
import org.gradle.api.artifacts.component.*
import org.gradle.api.artifacts.dsl.*
import org.gradle.api.artifacts.ivy.*
import org.gradle.api.artifacts.maven.*
import org.gradle.api.artifacts.query.*
import org.gradle.api.artifacts.repositories.*
import org.gradle.api.artifacts.result.*
import org.gradle.api.artifacts.transform.*
import org.gradle.api.artifacts.type.*
import org.gradle.api.artifacts.verification.*
import org.gradle.api.attributes.*
import org.gradle.api.attributes.java.*
import org.gradle.api.attributes.plugin.*
import org.gradle.api.cache.*
import org.gradle.api.capabilities.*
import org.gradle.api.component.*
import org.gradle.api.configuration.*
import org.gradle.api.credentials.*
import org.gradle.api.distribution.*
import org.gradle.api.distribution.plugins.*
import org.gradle.api.execution.*
import org.gradle.api.file.*
import org.gradle.api.flow.*
import org.gradle.api.initialization.*
import org.gradle.api.initialization.definition.*
import org.gradle.api.initialization.dsl.*
import org.gradle.api.initialization.resolve.*
import org.gradle.api.invocation.*
import org.gradle.api.java.archives.*
import org.gradle.api.jvm.*
import org.gradle.api.launcher.cli.*
import org.gradle.api.logging.*
import org.gradle.api.logging.configuration.*
import org.gradle.api.model.*
import org.gradle.api.plugins.*
import org.gradle.api.plugins.antlr.*
import org.gradle.api.plugins.catalog.*
import org.gradle.api.plugins.jvm.*
import org.gradle.api.plugins.quality.*
import org.gradle.api.plugins.scala.*
import org.gradle.api.problems.*
import org.gradle.api.project.*
import org.gradle.api.provider.*
import org.gradle.api.publish.*
import org.gradle.api.publish.ivy.*
import org.gradle.api.publish.ivy.plugins.*
import org.gradle.api.publish.ivy.tasks.*
import org.gradle.api.publish.maven.*
import org.gradle.api.publish.maven.plugins.*
import org.gradle.api.publish.maven.tasks.*
import org.gradle.api.publish.plugins.*
import org.gradle.api.publish.tasks.*
import org.gradle.api.reflect.*
import org.gradle.api.reporting.*
import org.gradle.api.reporting.components.*
import org.gradle.api.reporting.dependencies.*
import org.gradle.api.reporting.dependents.*
import org.gradle.api.reporting.model.*
import org.gradle.api.reporting.plugins.*
import org.gradle.api.resources.*
import org.gradle.api.services.*
import org.gradle.api.specs.*
import org.gradle.api.tasks.*
import org.gradle.api.tasks.ant.*
import org.gradle.api.tasks.application.*
import org.gradle.api.tasks.bundling.*
import org.gradle.api.tasks.compile.*
import org.gradle.api.tasks.diagnostics.*
import org.gradle.api.tasks.diagnostics.configurations.*
import org.gradle.api.tasks.incremental.*
import org.gradle.api.tasks.javadoc.*
import org.gradle.api.tasks.options.*
import org.gradle.api.tasks.scala.*
import org.gradle.api.tasks.testing.*
import org.gradle.api.tasks.testing.junit.*
import org.gradle.api.tasks.testing.junitplatform.*
import org.gradle.api.tasks.testing.testng.*
import org.gradle.api.tasks.util.*
import org.gradle.api.tasks.wrapper.*
import org.gradle.api.toolchain.management.*
import org.gradle.authentication.*
import org.gradle.authentication.aws.*
import org.gradle.authentication.http.*
import org.gradle.build.event.*
import org.gradle.buildconfiguration.tasks.*
import org.gradle.buildinit.*
import org.gradle.buildinit.plugins.*
import org.gradle.buildinit.tasks.*
import org.gradle.caching.*
import org.gradle.caching.configuration.*
import org.gradle.caching.http.*
import org.gradle.caching.local.*
import org.gradle.concurrent.*
import org.gradle.external.javadoc.*
import org.gradle.ide.visualstudio.*
import org.gradle.ide.visualstudio.plugins.*
import org.gradle.ide.visualstudio.tasks.*
import org.gradle.ide.xcode.*
import org.gradle.ide.xcode.plugins.*
import org.gradle.ide.xcode.tasks.*
import org.gradle.ivy.*
import org.gradle.jvm.*
import org.gradle.jvm.application.scripts.*
import org.gradle.jvm.application.tasks.*
import org.gradle.jvm.tasks.*
import org.gradle.jvm.toolchain.*
import org.gradle.language.*
import org.gradle.language.assembler.*
import org.gradle.language.assembler.plugins.*
import org.gradle.language.assembler.tasks.*
import org.gradle.language.base.*
import org.gradle.language.base.artifact.*
import org.gradle.language.base.compile.*
import org.gradle.language.base.plugins.*
import org.gradle.language.base.sources.*
import org.gradle.language.c.*
import org.gradle.language.c.plugins.*
import org.gradle.language.c.tasks.*
import org.gradle.language.cpp.*
import org.gradle.language.cpp.plugins.*
import org.gradle.language.cpp.tasks.*
import org.gradle.language.java.artifact.*
import org.gradle.language.jvm.tasks.*
import org.gradle.language.nativeplatform.*
import org.gradle.language.nativeplatform.tasks.*
import org.gradle.language.objectivec.*
import org.gradle.language.objectivec.plugins.*
import org.gradle.language.objectivec.tasks.*
import org.gradle.language.objectivecpp.*
import org.gradle.language.objectivecpp.plugins.*
import org.gradle.language.objectivecpp.tasks.*
import org.gradle.language.plugins.*
import org.gradle.language.rc.*
import org.gradle.language.rc.plugins.*
import org.gradle.language.rc.tasks.*
import org.gradle.language.scala.tasks.*
import org.gradle.language.swift.*
import org.gradle.language.swift.plugins.*
import org.gradle.language.swift.tasks.*
import org.gradle.maven.*
import org.gradle.model.*
import org.gradle.nativeplatform.*
import org.gradle.nativeplatform.platform.*
import org.gradle.nativeplatform.plugins.*
import org.gradle.nativeplatform.tasks.*
import org.gradle.nativeplatform.test.*
import org.gradle.nativeplatform.test.cpp.*
import org.gradle.nativeplatform.test.cpp.plugins.*
import org.gradle.nativeplatform.test.cunit.*
import org.gradle.nativeplatform.test.cunit.plugins.*
import org.gradle.nativeplatform.test.cunit.tasks.*
import org.gradle.nativeplatform.test.googletest.*
import org.gradle.nativeplatform.test.googletest.plugins.*
import org.gradle.nativeplatform.test.plugins.*
import org.gradle.nativeplatform.test.tasks.*
import org.gradle.nativeplatform.test.xctest.*
import org.gradle.nativeplatform.test.xctest.plugins.*
import org.gradle.nativeplatform.test.xctest.tasks.*
import org.gradle.nativeplatform.toolchain.*
import org.gradle.nativeplatform.toolchain.plugins.*
import org.gradle.normalization.*
import org.gradle.platform.*
import org.gradle.platform.base.*
import org.gradle.platform.base.binary.*
import org.gradle.platform.base.component.*
import org.gradle.platform.base.plugins.*
import org.gradle.plugin.devel.*
import org.gradle.plugin.devel.plugins.*
import org.gradle.plugin.devel.tasks.*
import org.gradle.plugin.management.*
import org.gradle.plugin.use.*
import org.gradle.plugins.ear.*
import org.gradle.plugins.ear.descriptor.*
import org.gradle.plugins.ide.*
import org.gradle.plugins.ide.api.*
import org.gradle.plugins.ide.eclipse.*
import org.gradle.plugins.ide.idea.*
import org.gradle.plugins.signing.*
import org.gradle.plugins.signing.signatory.*
import org.gradle.plugins.signing.signatory.pgp.*
import org.gradle.plugins.signing.type.*
import org.gradle.plugins.signing.type.pgp.*
import org.gradle.process.*
import org.gradle.swiftpm.*
import org.gradle.swiftpm.plugins.*
import org.gradle.swiftpm.tasks.*
import org.gradle.testing.base.*
import org.gradle.testing.base.plugins.*
import org.gradle.testing.jacoco.plugins.*
import org.gradle.testing.jacoco.tasks.*
import org.gradle.testing.jacoco.tasks.rules.*
import org.gradle.testkit.runner.*
import org.gradle.util.*
import org.gradle.vcs.*
import org.gradle.vcs.git.*
import org.gradle.work.*
import org.gradle.workers.*
Using Tasks
The work that Gradle can do on a project is defined by one or more tasks.
A task represents some independent unit of work that a build performs. This might be compiling
some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.
When a user runs ./gradlew build from the command line, Gradle executes the build task along
with any other tasks it depends on.
Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks:
------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project
'myTutorial'.
...
build.gradle.kts
plugins {
id("application")
}
$ ./gradlew tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
Other tasks
-----------
compileJava - Compiles main Java source.
...
Many of these tasks, such as assemble, build, and run, should be familiar to a developer.
Task classification
1. Actionable tasks have some action(s) attached to do work in your build: compileJava.
2. Lifecycle tasks have no actions attached: assemble, build.
Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at
once.
Task registration and action
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
In the example, the build script registers a single task called hello using the TaskContainer API,
and adds an action to it.
If the tasks in the project are listed, the hello task is available to Gradle:
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.
You can execute the task in the build script with ./gradlew hello:
$ ./gradlew hello
Hello world!
When Gradle executes the hello task, it executes the action provided. In this case, the action is
simply a block containing some code: println("Hello world!").
The hello task from the previous section can be detailed with a description and assigned to a
group with the following update:
build.gradle.kts
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
}
$ ./gradlew tasks
Custom tasks
------------------
hello - A lovely greeting task.
To view information about a task, use the help --task <task-name> command:
Path
:app:hello
Type
Task (org.gradle.api.Task)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom
Task dependencies
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
tasks.register("intro") {
dependsOn("hello")
doLast {
println("I'm Gradle")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
tasks.register('intro') {
dependsOn tasks.hello
doLast {
println "I'm Gradle"
}
}
$ gradle -q intro
Hello world!
I'm Gradle
build.gradle.kts
tasks.register("taskX") {
dependsOn("taskY")
doLast {
println("taskX")
}
}
tasks.register("taskY") {
doLast {
println("taskY")
}
}
build.gradle
tasks.register('taskX') {
dependsOn 'taskY'
doLast {
println 'taskX'
}
}
tasks.register('taskY') {
doLast {
println 'taskY'
}
}
$ gradle -q taskX
taskY
taskX
The hello task from the previous example is updated to include a dependency:
build.gradle.kts
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}
The hello task now depends on the assemble task, which means that Gradle must execute the
assemble task before it can execute the hello task:
$ ./gradlew :app:hello
Task configuration
Once registered, tasks can be accessed via the TaskProvider API for further configuration.
For instance, you can use this to add dependencies to a task at runtime dynamically:
build.gradle.kts
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
tasks.named("task0") { dependsOn("task2", "task3") }
build.gradle
4.times { counter ->
    tasks.register("task$counter") {
        doLast {
            println "I'm task number $counter"
        }
    }
}
tasks.named('task0') { dependsOn('task2', 'task3') }
$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello Earth")
}
}
tasks.named("hello") {
doFirst {
println("Hello Venus")
}
}
tasks.named("hello") {
doLast {
println("Hello Mars")
}
}
tasks.named("hello") {
doLast {
println("Hello Jupiter")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello Earth'
}
}
tasks.named('hello') {
doFirst {
println 'Hello Venus'
}
}
tasks.named('hello') {
doLast {
println 'Hello Mars'
}
}
tasks.named('hello') {
doLast {
println 'Hello Jupiter'
}
}
$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter
TIP: The calls doFirst and doLast can be executed multiple times. They add an action to the
beginning or the end of the task’s actions list. When the task executes, the actions in
the action list are executed in order.
Here is an example of the named method being used to configure a task added by a plugin:
tasks.named("dokkaHtml") {
outputDirectory.set(buildDir.resolve("dokka"))
}
Task types
build.gradle.kts
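A typed task consistent with the help output shown here might be defined as follows (a sketch: the class name HelloTask comes from the output below, while the task body and registration details are assumptions):

```kotlin
// A hypothetical typed task matching the `help --task` output shown:
abstract class HelloTask : DefaultTask() {
    @TaskAction
    fun greet() {
        println("Hello world!")
    }
}

// Register the task using its custom type instead of the generic DefaultTask
tasks.register<HelloTask>("hello") {
    group = "Custom tasks"
    description = "A lovely greeting task."
}
```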
$ ./gradlew help --task hello
Path
:app:hello
Type
HelloTask (Build_gradle$HelloTask)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom tasks
Gradle provides many built-in task types with common and popular functionality, such as copying
or deleting files.
This example task copies *.war files from the source directory to the target directory using the Copy
built-in task:
tasks.register<Copy>("copyTask") {
    from("source")
    into("target")
    include("*.war")
}
There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar,
JacocoReport, Sign, and Delete, all of which are documented in the DSL reference.
Writing Tasks
Gradle tasks are created by extending DefaultTask.
However, the generic DefaultTask provides no action for Gradle. If users want to extend the
capabilities of Gradle and their build script, they must either use a built-in task or create a custom
task:
1. Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc…
2. Custom task - Gradle allows users to subclass DefaultTask to create their own task types.
Create a task
The simplest and quickest way to create a custom task is in a build script:
To create a task, inherit from the DefaultTask class and implement a @TaskAction handler:
build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        val file = File("myfile.txt")
        file.createNewFile()
        file.writeText("HELLO FROM MY TASK")
    }
}
The CreateFileTask implements a simple set of actions. First, a file called "myfile.txt" is created in
the main project. Then, some text is written to the file.
Register a task
A task is registered in the build script using the TaskContainer.register() method, which allows it
to be then used in the build logic.
build.gradle.kts
tasks.register<CreateFileTask>("createFileTask")
Setting the group and description properties on your tasks can help users understand how to use
your task:
build.gradle.kts
tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
}
For a task to do useful work, it typically needs inputs, and it typically produces outputs.
build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @Input
    val fileText = "HELLO FROM MY TASK"
    @Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText)
}
}
tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
}
Configure a task
The CreateFileTask class is updated so that the text in the file is configurable:
build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>
    @Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}
tasks.named<CreateFileTask>("createFileTask") {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
In the named() method, we find the createFileTask task and set the text that will be written to the
file.
$ ./gradlew createFileTask
BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date
myfile.txt
HELLO FROM THE NAMED METHOD
Using Plugins
Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with
Gradle, third-party plugins, and script plugins defined within builds.
Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g.,
locating Java source at src/main/java), and extend core or other plugin objects.
Plugins in Gradle are essential for automating common build tasks, integrating with external tools
or services, and tailoring the build process to meet specific project needs. They also serve as the
primary mechanism for organizing build logic.
Benefits of plugins
Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins
offer several advantages over adding logic directly to the build script:
• Promotes Reusability: Reduces the need to duplicate similar logic across projects.
• Enhances Modularity: Allows for a more modular and organized build script.
• Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.
Plugin distribution
You can leverage plugins from Gradle and the Gradle community or create your own.
Plugins are available in three ways:
1. Core plugins - Gradle develops and maintains a set of core plugins that ship with the Gradle
distribution.
2. Community plugins - Gradle plugins shared in a remote repository such as Maven or the
Gradle Plugin Portal.
3. Local plugins - Gradle enables users to create custom plugins using APIs.
Types of plugins
Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:
Binary Plugins
Binary plugins are compiled plugins, typically written in Java or Kotlin, that are packaged as
JAR files. They are applied to a project using the plugins {} block. They offer better performance
and maintainability compared to script plugins or precompiled script plugins.
Precompiled Script Plugins
Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as
Java class files packaged in a library. They are applied to a project using the plugins {} block.
Script Plugins
Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build
script using the apply from: syntax. They are applied inline within a build script to add
functionality or customize the build process. They are simple to use.
A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes
more valuable, it’s migrated to a binary plugin that can be easily tested and shared between
multiple projects or organizations.
Using plugins
To use the build logic encapsulated in a plugin, Gradle needs to perform two steps. First, it needs to
resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.
1. Resolving a plugin means finding the correct version of the JAR that contains a given plugin
and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build
script. Script plugins are self-resolving in that they are resolved from the specific file path or
URL provided when applying them. Core binary plugins provided as part of the Gradle
distribution are automatically resolved.
2. Applying a plugin means executing the plugin’s Plugin.apply(T) on the project the plugin
should enhance, which extends the project’s capabilities.
The plugins DSL is recommended to resolve and apply plugins in one step.
Resolving plugins
Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of
its distribution, which means they are automatically resolved.
Core plugins are applied in a build script using the plugin name:
plugins {
id «plugin name»
}
For example:
build.gradle
plugins {
id("java")
}
Non-core plugins must be resolved before they can be applied. Non-core plugins are identified by a
unique ID and a version in the build file:
plugins {
id «plugin id» version «plugin version»
}
And the location of the plugin must be specified in the settings file:
settings.gradle
pluginManagement {
repositories {
gradlePluginPortal()
maven {
url 'https://maven.example.com/plugins'
}
}
}
plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}
The plugin DSL provides a concise and convenient way to declare plugin dependencies.
plugins {
application // by name
java // by name
id("java") // by id - recommended
id("org.jetbrains.kotlin.jvm") version "1.9.0" // by id - recommended
}
Core Gradle plugins are unique in that they provide short names, such as java for the core
JavaPlugin.
build.gradle.kts
plugins {
java
}
build.gradle
plugins {
id 'java'
}
All other binary plugins must use the fully qualified form of the plugin id (e.g., com.github.foo.bar).
To apply a community plugin from the Gradle Plugin Portal, the fully qualified plugin id, a globally
unique identifier, must be used:
build.gradle.kts
plugins {
id("com.jfrog.bintray") version "1.8.5"
}
build.gradle
plugins {
id 'com.jfrog.bintray' version '1.8.5'
}
See PluginDependenciesSpec for more information on using the Plugin DSL.
The plugins DSL provides a convenient syntax for users and the ability for Gradle to determine
which plugins are used quickly. This allows Gradle to:
• Provide editors with detailed information about the potential properties and values in the build
script.
There are some key differences between the plugins {} block mechanism and the "traditional"
apply() method mechanism. There are also some constraints and possible limitations.
Constrained Syntax
The plugins{} block does not support arbitrary code. It is constrained to be idempotent (produce
the same result every time) and side effect-free (safe for Gradle to execute at any time).
build.gradle.kts
plugins {
id(«plugin id») ①
id(«plugin id») version «plugin version» ②
}
① for core Gradle plugins or plugins already available to the build script
② for binary Gradle plugins that need to be resolved
build.gradle
plugins {
id «plugin id» ①
id «plugin id» version «plugin version» ②
}
① for core Gradle plugins or plugins already available to the build script
② for binary Gradle plugins that need to be resolved
Where «plugin id» and «plugin version» must be constant, literal strings.
The plugins{} block must also be a top-level statement in the build script. It cannot be nested inside
another construct (e.g., an if-statement or for-loop).
The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the
settings.gradle(.kts) file. It must appear before any other block. It cannot be used in script plugins
or init scripts.
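As a sketch of what the constrained syntax rules out (the property and conditional here are hypothetical), both the nesting and the non-literal version below are rejected:

```kotlin
// NOT allowed: plugins {} must be a top-level statement with literal strings
val kotlinVersion = "1.9.0"               // hypothetical variable
if (project.hasProperty("useKotlin")) {   // nesting inside an if-statement is rejected
    plugins {
        id("org.jetbrains.kotlin.jvm") version kotlinVersion // non-literal version is rejected
    }
}
```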
In a multi-project build, you probably want to apply plugins to some or all of the subprojects, but
not to the root project.
While the default behavior of the plugins{} block is to immediately resolve and apply the plugins,
you can use the apply false syntax to tell Gradle not to apply the plugin to the current project.
Then, use the plugins{} block without the version in subprojects' build scripts:
settings.gradle.kts
include("hello-a")
include("hello-b")
include("goodbye-c")
build.gradle.kts
plugins {
id("com.example.hello") version "1.0.0" apply false
id("com.example.goodbye") version "1.0.0" apply false
}
hello-a/build.gradle.kts
plugins {
id("com.example.hello")
}
hello-b/build.gradle.kts
plugins {
id("com.example.hello")
}
goodbye-c/build.gradle.kts
plugins {
id("com.example.goodbye")
}
settings.gradle
include 'hello-a'
include 'hello-b'
include 'goodbye-c'
build.gradle
plugins {
id 'com.example.hello' version '1.0.0' apply false
id 'com.example.goodbye' version '1.0.0' apply false
}
hello-a/build.gradle
plugins {
id 'com.example.hello'
}
hello-b/build.gradle
plugins {
id 'com.example.hello'
}
goodbye-c/build.gradle
plugins {
id 'com.example.goodbye'
}
You can also encapsulate the versions of external plugins by composing the build logic using your
own convention plugins.
buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins)
used in building the main project. You can apply plugins that reside in a project’s buildSrc directory
as long as they have a defined ID.
The following example shows how to tie the plugin implementation class my.MyPlugin, defined in
buildSrc, to the id "my-plugin":
buildSrc/build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("myPlugins") {
id = "my-plugin"
implementationClass = "my.MyPlugin"
}
}
}
buildSrc/build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
myPlugins {
id = 'my-plugin'
implementationClass = 'my.MyPlugin'
}
}
}
build.gradle.kts
plugins {
id("my-plugin")
}
build.gradle
plugins {
id 'my-plugin'
}
The buildscript{} block is used for two things:
1. declaring the global dependencies and repositories required for building the project (applied in
the subprojects).
2. declaring which plugins are available for use in the build script (in the build.gradle(.kts) file
itself).
So when you want to use a library in the build script itself, you must add this library to the script
classpath using buildscript{}:
import org.apache.commons.codec.binary.Base64
buildscript {
    repositories { // this is where the plugins are located
        mavenCentral()
        google()
    }
    dependencies { // these are the plugins that can be used in subprojects or in the build file itself
        classpath group: 'commons-codec', name: 'commons-codec', version: '1.2' // used in the task below
        classpath 'com.android.tools.build:gradle:4.1.0' // used in subproject
    }
}
tasks.register('encode') {
    doLast {
        byte[] encodedString = new Base64().encode('hello world\n'.getBytes())
        println new String(encodedString)
    }
}
And you can apply the globally declared dependencies in the subproject that needs it:
plugins {
id 'com.android.application'
}
Binary plugins published as external jar files can be added to a project by adding the plugin to the
build script classpath and then applying the plugin.
External jars can be added to the build script classpath using the buildscript{} block as described
in External dependencies for the build script:
build.gradle.kts
buildscript {
repositories {
gradlePluginPortal()
}
dependencies {
classpath("com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.5")
}
}
apply(plugin = "com.jfrog.bintray")
build.gradle
buildscript {
repositories {
gradlePluginPortal()
}
dependencies {
classpath 'com.jfrog.bintray.gradle:gradle-bintray-plugin:1.8.5'
}
}
A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is
applied using the legacy application method:
Let’s take a rudimentary example of a plugin written in a file called other.gradle located in the
same directory as the build.gradle file:
Script plugins are automatically resolved and can be applied from a script on the local filesystem or
remotely:
build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'
Filesystem locations are relative to the project directory, while remote script locations are specified
with an HTTP URL. Multiple script plugins (of either form) can be applied to a given target.
Plugin Management
The pluginManagement{} block is used to configure repositories for plugin resolution and to define
version constraints for plugins that are applied in the build scripts.
The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first
block in the file:
settings.gradle.kts
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = "plugin-management"
settings.gradle
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = 'plugin-management'
init.gradle.kts
settingsEvaluated {
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}
init.gradle
settingsEvaluated { settings ->
    settings.pluginManagement {
        plugins {
        }
        resolutionStrategy {
        }
        repositories {
        }
    }
}
By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.
Many build authors would also like to resolve plugins from private Maven or Ivy repositories
because they contain proprietary implementation details or to have more control over what
plugins are available to their builds.
To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:
settings.gradle.kts
pluginManagement {
repositories {
maven(url = "./maven-repo")
gradlePluginPortal()
ivy(url = "./ivy-repo")
}
}
settings.gradle
pluginManagement {
repositories {
maven {
url './maven-repo'
}
gradlePluginPortal()
ivy {
url './ivy-repo'
}
}
}
This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and
then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository. If you
don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line. Finally, the
Ivy repository at ./ivy-repo will be checked.
A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in
a single location. Plugins can then be applied by id to any build script via the plugins{} block.
One benefit of setting plugin versions this way is that pluginManagement.plugins{} does not have
the same constrained syntax as the build script plugins{} block. This allows plugin versions to be
taken from gradle.properties or loaded via another mechanism.
settings.gradle.kts
pluginManagement {
val helloPluginVersion: String by settings
plugins {
id("com.example.hello") version "${helloPluginVersion}"
}
}
build.gradle.kts
plugins {
id("com.example.hello")
}
gradle.properties
helloPluginVersion=1.0.0
settings.gradle
pluginManagement {
plugins {
id 'com.example.hello' version "${helloPluginVersion}"
}
}
build.gradle
plugins {
id 'com.example.hello'
}
gradle.properties
helloPluginVersion=1.0.0
The plugin version is loaded from gradle.properties and configured in the settings script, allowing
the plugin to be added to any project without specifying the version.
Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing
the requested version or explicitly specifying the implementation artifact coordinates.
To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:
settings.gradle.kts
pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == "com.example") {
useModule("com.example:sample-plugins:1.0.0")
}
}
}
repositories {
maven {
url = uri("./maven-repo")
}
gradlePluginPortal()
ivy {
url = uri("./ivy-repo")
}
}
}
settings.gradle
pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'com.example') {
useModule('com.example:sample-plugins:1.0.0')
}
}
}
repositories {
maven {
url './maven-repo'
}
gradlePluginPortal()
ivy {
url './ivy-repo'
}
}
}
This tells Gradle to use the specified plugin implementation artifact instead of its built-in default
mapping from plugin ID to Maven/Ivy coordinates.
Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts
that implement the plugin. Read Gradle Plugin Development Plugin for more information on
publishing plugins to custom repositories.
See PluginManagementSpec for complete documentation for using the pluginManagement{} block.
Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id
and version properties, Gradle needs a way to look up the coordinates of the plugin implementation
artifact.
To do so, Gradle will look for a Plugin Marker Artifact with the coordinates
plugin.id:plugin.id.gradle.plugin:plugin.version. This marker needs to have a dependency on the
actual plugin implementation. Publishing these markers is automated by the java-gradle-plugin.
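For example, for a plugin with id com.example.hello and version 1.0.0 (the implementation coordinates are those of the sample-plugins project shown next), the lookup works like this:

```
plugin id:       com.example.hello
marker artifact: com.example.hello:com.example.hello.gradle.plugin:1.0.0
implementation:  com.example:sample-plugins:1.0.0  (declared as a dependency of the marker)
```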
For example, the following complete sample from the sample-plugins project shows how to publish
a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository
using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish
plugin.
build.gradle.kts
plugins {
`java-gradle-plugin`
`maven-publish`
`ivy-publish`
}
group = "com.example"
version = "1.0.0"
gradlePlugin {
plugins {
create("hello") {
id = "com.example.hello"
implementationClass = "com.example.hello.HelloPlugin"
}
create("goodbye") {
id = "com.example.goodbye"
implementationClass = "com.example.goodbye.GoodbyePlugin"
}
}
}
publishing {
repositories {
maven {
url = uri(layout.buildDirectory.dir("maven-repo"))
}
ivy {
url = uri(layout.buildDirectory.dir("ivy-repo"))
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
id 'maven-publish'
id 'ivy-publish'
}
group 'com.example'
version '1.0.0'
gradlePlugin {
plugins {
hello {
id = 'com.example.hello'
implementationClass = 'com.example.hello.HelloPlugin'
}
goodbye {
id = 'com.example.goodbye'
implementationClass = 'com.example.goodbye.GoodbyePlugin'
}
}
}
publishing {
repositories {
maven {
url layout.buildDirectory.dir("maven-repo")
}
ivy {
url layout.buildDirectory.dir("ivy-repo")
}
}
}
Running gradle publish in the sample directory creates the following Maven repository layout (the
Ivy layout is similar):
With the introduction of the plugins DSL, users should have little reason to use the legacy method
of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to
restrictions in how it currently works.
build.gradle.kts
apply(plugin = "java")
build.gradle
apply plugin: 'java'
Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to
apply the JavaPlugin.
Rather than using a plugin id, plugins can also be applied by simply specifying the class of the
plugin:
build.gradle.kts
apply<JavaPlugin>()
build.gradle
apply plugin: JavaPlugin
The JavaPlugin symbol in the above sample refers to the org.gradle.api.plugins.JavaPlugin class.
This class does not strictly need to be imported, as the org.gradle.api.plugins package is
automatically imported in all build scripts (see Default imports).
Furthermore, one needs to append the ::class suffix to identify a class literal in Kotlin instead of
.class in Java.
When a project uses a version catalog, plugins can be referenced via aliases when applied.
[versions]
intellij-plugin = "1.6"
[plugins]
jetbrains-intellij = { id = "org.jetbrains.intellij", version.ref = "intellij-plugin"
}
Then a plugin can be applied to any build script using the alias method:
build.gradle.kts
plugins {
alias(libs.plugins.jetbrains.intellij)
}
Writing Plugins
If Gradle or the Gradle community does not offer the specific capabilities your project needs,
creating your own plugin could be a solution.
Additionally, if you find yourself duplicating build logic across subprojects and need a better way to
organize it, custom plugins can help.
Custom plugin
A plugin is any class that implements the Plugin interface. The example below is the most
straightforward plugin, a "hello world" plugin:
build.gradle.kts
import org.gradle.api.Plugin
import org.gradle.api.Project
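A minimal class implementing this interface might look like the following sketch (the class name SamplePlugin and its apply action are assumptions, not the manual’s original sample):

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project

// A plugin is any class implementing Plugin<Project>
class SamplePlugin : Plugin<Project> {
    override fun apply(project: Project) {
        println("SamplePlugin applied to ${project.name}")
    }
}

// Apply it directly in the build script
apply<SamplePlugin>()
```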
Many plugins start as a script plugin coded in a build script. This offers an easy way to rapidly
prototype and experiment when building a plugin. Let’s take a look at an example:
build.gradle.kts
// Define a task
abstract class CreateFileTask : DefaultTask() { ①
@get:Input
abstract val fileText: Property<String> ②
@Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
// Define a plugin
abstract class MyPlugin : Plugin<Project> { ③
override fun apply(project: Project) {
tasks {
register("createFileTask", CreateFileTask::class) {
group = "from my plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY PLUGIN")
}
}
}
}
① Subclass DefaultTask() to define the task.
② Use lazy configuration via an abstract Property.
③ Implement Plugin<Project> to define the plugin.
Gradle has a concept called lazy configuration, which allows task inputs and outputs to be
referenced before they are actually set. This is done via the Property class type.
One advantage of this mechanism is that you can link the output file of one task to the input file of
another, all before the filename has even been decided. The Property class also knows which task
it’s linked to, enabling Gradle to add the required task dependency automatically.
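As a sketch of this wiring (the task and property names here are hypothetical), a producer’s output file can be connected to a consumer’s input before any file name is set, and the task dependency follows automatically:

```kotlin
abstract class ProducerTask : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty  // value may be set later

    @TaskAction
    fun produce() = outputFile.get().asFile.writeText("payload")
}

abstract class ConsumerTask : DefaultTask() {
    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() = println(inputFile.get().asFile.readText())
}

val producer = tasks.register<ProducerTask>("producer") {
    outputFile.set(layout.buildDirectory.file("data.txt"))
}
tasks.register<ConsumerTask>("consumer") {
    // Linking the Property also adds the dependency on "producer"
    inputFile.set(producer.flatMap { it.outputFile })
}
```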
You can add tasks and other logic in the apply() method.
apply<MyPlugin>()
When MyPlugin is applied in the build script, Gradle calls the fun apply() {} method defined in the
custom MyPlugin class.
NOTE: Script plugins are NOT recommended. Script plugins offer an easy way to rapidly
prototype build logic, before migrating it to a more permanent solution such as
convention plugins or binary plugins.
Convention Plugins
Convention plugins are a way to encapsulate and reuse common build logic in Gradle. They allow
you to define a set of conventions for a project, and then apply those conventions to other projects
or modules.
The example above has been re-written as a convention plugin stored in buildSrc:
buildSrc/src/main/kotlin/MyConventionPlugin.kt
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File
abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>
    @Input
val fileName = project.rootDir.toString() + "/myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
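A plugin class matching the implementationClass com.gradle.plugin.MyConventionPlugin might look like this (a sketch; the registration details are assumed to mirror the earlier script-plugin example):

```kotlin
package com.gradle.plugin

import org.gradle.api.Plugin
import org.gradle.api.Project

abstract class MyConventionPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Register the task defined alongside this plugin
        project.tasks.register("createFileTask", CreateFileTask::class.java) {
            it.group = "from my plugin"
            it.description = "Create myfile.txt in the root directory"
            it.fileText.set("HELLO FROM MY PLUGIN")
        }
    }
}
```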
The plugin can be given an id using a gradlePlugin{} block so that it can be referenced in the root:
buildSrc/build.gradle.kts
gradlePlugin {
plugins {
create("my-convention-plugin") {
id = "com.gradle.plugin.my-convention-plugin"
implementationClass = "com.gradle.plugin.MyConventionPlugin"
}
}
}
The gradlePlugin{} block defines the plugins being built by the project. With the newly created id,
the plugin can be applied in other build scripts accordingly:
build.gradle.kts
plugins {
application
id("com.gradle.plugin.my-convention-plugin") // Apply the new plugin
}
Binary Plugins
A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR
file. It is resolved as a dependency rather than compiled from source.
For most use cases, convention plugins need to be updated only infrequently. Having each developer
execute the plugin build as part of their development process is wasteful, and we can instead
distribute them as binary dependencies.
There are two ways to convert the convention plugin in the example above into a binary plugin.
settings.gradle.kts
includeBuild("my-plugin")
build.gradle.kts
plugins {
id("com.gradle.plugin.my-convention-plugin") version "1.0.0"
}
Multi-Project Build Basics
A multi-project build consists of one root project and one or more subprojects. Gradle can build the
root project and any number of the subprojects in a single execution.
Project locations
Multi-project builds contain a single root project in a directory that Gradle views as the root path: ".".
A subproject has a path, which denotes the position of that subproject in the multi-project build. In
most cases, the project path is consistent with its location in the file system.
The project structure is created in the settings.gradle(.kts) file. The settings file must be present
in the root directory.
Let’s look at a basic multi-project build example that contains a root project and a single subproject.
The root project is called basic-multiproject, located somewhere on your machine. From Gradle’s
perspective, the root is the top-level directory ".".
.
├── app
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
└── settings.gradle
This is the recommended project structure for starting any Gradle project. The build init plugin also
generates skeleton projects that follow this structure - a root project with a single subproject:
settings.gradle.kts
rootProject.name = "basic-multiproject"
include("app")
settings.gradle
rootProject.name = 'basic-multiproject'
include 'app'
In this case, Gradle will look for a build file for the app subproject in the ./app directory.
You can view the structure of a multi-project build by running the projects command:
$ ./gradlew -q projects
Projects:
------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------

Root project 'basic-multiproject'
+--- Project ':app'
In this example, the app subproject is a Java application that applies the application plugin and
configures the main class. The application prints Hello World to the console:
app/build.gradle.kts
plugins {
id("application")
}
application {
mainClass = "com.example.Hello"
}
app/build.gradle
plugins {
id 'application'
}
application {
mainClass = 'com.example.Hello'
}
app/src/main/java/com/example/Hello.java
package com.example;

public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world!");
    }
}
You can run the application by executing the run task from the application plugin in the project
root:
$ ./gradlew -q run
Hello, world!
Adding a subproject
In the settings file, you can use the include method to add another subproject to the root project:
settings.gradle.kts
include("services:api")
settings.gradle
include 'services:api'
The include method takes project paths as arguments. The project path is assumed to be equal to
the relative physical file system path. For example, a path services:api is mapped by default to a
folder ./services/api (relative to the project root .).
More examples of how to work with the project path can be found in the DSL documentation of
Settings.include(java.lang.String[]).
Let’s add another subproject called lib to the previously created project.
All we need to do is add another include statement in the root settings file:
settings.gradle.kts
rootProject.name = "basic-multiproject"
include("app")
include("lib")
settings.gradle
rootProject.name = 'basic-multiproject'
include 'app'
include 'lib'
Gradle will then look for the build file of the new lib subproject in the ./lib/ directory:
.
├── app
│ ...
│ └── build.gradle.kts
├── lib
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
├── lib
│ ...
│ └── build.gradle
└── settings.gradle
Project Descriptors
To further describe the project architecture to Gradle, the settings file provides project descriptors.
You can modify these descriptors in the settings file at any time.
settings.gradle.kts
include("project-a")
println(rootProject.name)
println(project(":project-a").name)
settings.gradle
include('project-a')
println rootProject.name
println project(':project-a').name
Using this descriptor, you can change the name, project directory, and build file of a project:
settings.gradle.kts
rootProject.name = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "project-a.gradle.kts"
settings.gradle
rootProject.name = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = 'project-a.gradle'
Consult the ProjectDescriptor class in the API documentation for more information.
.
├── app
│ ...
│ └── build.gradle.kts
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle
└── settings.gradle
include(':subs:web:my-web-module')
Gradle sees a subproject with a logical project name of :subs:web:my-web-module and two, possibly
unintentional, other subprojects logically named :subs and :subs:web. This can lead to phantom
build directories, especially when using allprojects {} or subprojects {}.
include(':subs:web:my-web-module')
project(':subs:web:my-web-module').projectDir = file('subs/web/my-web-module')
include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')
So, while the physical project layout is the same, the logical results are different.
Naming recommendations
As your project grows, naming and consistency become increasingly important. To keep your
builds maintainable, we recommend the following:
1. Keep default project names for subprojects: It is possible to configure custom project names
in the settings file. However, it’s an unnecessary extra effort for the developers to track which
projects belong to what folders.
2. Use lower case hyphenation for all project names: All letters are lowercase, and words are
separated with a dash (-) character.
3. Define the root project name in the settings file: The rootProject.name effectively assigns a
name to the build, used in reports like Build Scans. If the root project name is not set, the name
will be the container directory name, which can be unstable (i.e., you can check out your project
in any directory). The name will be generated randomly if the root project name is not set and
checked out to a file system’s root (e.g., / or C:\).
Declaring Dependencies between Subprojects
What if one subproject depends on another subproject? What if one project needs the artifact
produced by another project?
This is a common use case for multi-project builds. Gradle offers project dependencies for this.
.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle
└── settings.gradle
In this example, there are three subprojects called shared, api, and person-service:
1. The person-service subproject depends on the other two subprojects, shared and api.
2. The api subproject depends on the shared subproject.
We use the : separator to define a project path such as services:person-service or :shared. Consult
the DSL documentation of Settings.include(java.lang.String[]) for more information about defining
project paths.
settings.gradle.kts
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
shared/build.gradle.kts
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
api/build.gradle.kts
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
}
services/person-service/build.gradle.kts
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
implementation(project(":api"))
}
settings.gradle
rootProject.name = 'basic-dependencies'
include 'api', 'shared', 'services:person-service'
shared/build.gradle
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
api/build.gradle
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
}
services/person-service/build.gradle
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
implementation project(':api')
}
A project dependency affects execution order. It causes the other project to be built first and adds
the output with the classes of the other project to the classpath. It also adds the dependencies of the
other project to the classpath.
If you execute ./gradlew :api:compileJava, first the shared project is built, and then the api project
is built.
Sometimes, you might want to depend on the output of a specific task within another project rather
than the entire project. However, explicitly declaring a task dependency from one project to
another is discouraged as it introduces unnecessary coupling between tasks.
The recommended way to model dependencies, where a task in one project depends on the output
of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency
management engine allows you to share arbitrary artifacts between projects and build them on
demand.
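A minimal sketch of that pattern, with illustrative configuration and file names (the manual's full variant-aware approach additionally matches attributes on the configurations):

```kotlin
// shared/build.gradle.kts (producer) — names are illustrative
val generateReport = tasks.register("generateReport") {
    val reportFile = layout.buildDirectory.file("report.txt")
    outputs.file(reportFile)
    doLast {
        reportFile.get().asFile.writeText("report contents")
    }
}

// A consumable configuration exposes the file as an outgoing artifact.
val reportElements by configurations.creating {
    isCanBeConsumed = true
    isCanBeResolved = false
}

artifacts {
    add("reportElements", layout.buildDirectory.file("report.txt")) {
        builtBy(generateReport)
    }
}

// consumer/build.gradle.kts — resolves the artifact; the producing task
// runs on demand, with no explicit cross-project task dependency.
val report by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    "report"(project(mapOf("path" to ":shared", "configuration" to "reportElements")))
}
```

Resolving the report configuration (for example, as a task input) causes Gradle to build the artifact in :shared first.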
Sharing Build Logic between Subprojects
Subprojects in a multi-project build typically share some common dependencies.
Instead of copying and pasting the same Java version and libraries in each subproject build script,
Gradle provides a special directory for storing shared build logic that can be automatically applied
to subprojects.
buildSrc is a Gradle-recognized and protected directory which comes with some benefits:
1. Reusability of Build Logic:
buildSrc allows you to organize and centralize your custom build logic, tasks, and plugins in a
structured manner. The code written in buildSrc can be reused across your project, making it
easier to maintain and share common build functionality.
2. Isolation of Build Logic:
Code placed in buildSrc is isolated from the other build scripts of your project. This helps keep
the main build scripts cleaner and more focused on project-specific configurations.
3. Automatic Compilation and Classpath:
The contents of the buildSrc directory are automatically compiled and included in the classpath
of your main build. This means that classes and plugins defined in buildSrc can be directly used
in your project’s build scripts without any additional configuration.
4. Ease of Testing:
Since buildSrc is a separate build, it allows for easy testing of your custom build logic. You can
write tests for your build code, ensuring that it behaves as expected.
5. Plugin Development:
If you are developing custom Gradle plugins for your project, buildSrc is a convenient place to
house the plugin code. This makes the plugins easily accessible within your project.
For multi-project builds, there can be only one buildSrc directory, which must be in the root project
directory.
NOTE: The downside of using buildSrc is that any change to it will invalidate every task in
your project and require a rerun.
buildSrc uses the same source code conventions applicable to Java, Groovy, and Kotlin projects. It
also provides direct access to the Gradle API.
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──MyCustomTask.kt ①
│ ├── shared.gradle.kts ②
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts ③
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts ③
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts
In buildSrc, the build script shared.gradle(.kts) is created. It contains dependencies and other
build information common to multiple subprojects:
shared.gradle.kts
repositories {
mavenCentral()
}
dependencies {
implementation("org.slf4j:slf4j-api:1.7.32")
}
shared.gradle
repositories {
mavenCentral()
}
dependencies {
implementation 'org.slf4j:slf4j-api:1.7.32'
}
In buildSrc, the MyCustomTask class is also created. It is a helper task used as part of the build
logic of multiple subprojects:
MyCustomTask.kt
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

// The task body was elided here; a minimal illustrative implementation:
open class MyCustomTask : DefaultTask() {
    @TaskAction
    fun doWork() {
        println("Running MyCustomTask")
    }
}
MyCustomTask.groovy
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

// The task body was elided here; a minimal illustrative implementation:
class MyCustomTask extends DefaultTask {
    @TaskAction
    void doWork() {
        println 'Running MyCustomTask'
    }
}
The MyCustomTask task is used in the build script of the api and shared projects. The task is
automatically available because it’s part of buildSrc.
build.gradle.kts
tasks.register<MyCustomTask>("myCustomTask")
build.gradle
tasks.register('myCustomTask', MyCustomTask)
Gradle’s recommended way of organizing build logic is to use its plugin system.
We can write a plugin that encapsulates the build logic common to several subprojects in a project.
This kind of plugin is called a convention plugin.
While writing plugins is outside the scope of this section, the recommended way to build a Gradle
project is to put common build logic in a convention plugin located in the buildSrc.
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──myproject.java-conventions.gradle.kts ①
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts ②
└── settings.gradle.kts
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──groovy
│ │ └──myproject.java-conventions.gradle ①
│ └── build.gradle
├── api
│ ├── src
│ │ └──...
│ └── build.gradle ②
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle ②
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle ②
└── settings.gradle
settings.gradle.kts
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
settings.gradle
rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'
The source code for the convention plugin created in the buildSrc directory is as follows:
buildSrc/src/main/kotlin/myproject.java-conventions.gradle.kts
plugins {
id("java")
}
group = "com.example"
version = "1.0"
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
buildSrc/src/main/groovy/myproject.java-conventions.gradle
plugins {
id 'java'
}
group = 'com.example'
version = '1.0'
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
For the convention plugin to compile, basic configuration needs to be applied in the build file of the
buildSrc directory:
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
The convention plugin is applied to the api, shared, and person-service subprojects:
api/build.gradle.kts
plugins {
id("myproject.java-conventions")
}
dependencies {
implementation(project(":shared"))
}
shared/build.gradle.kts
plugins {
id("myproject.java-conventions")
}
services/person-service/build.gradle.kts
plugins {
id("myproject.java-conventions")
}
dependencies {
implementation(project(":shared"))
implementation(project(":api"))
}
api/build.gradle
plugins {
id 'myproject.java-conventions'
}
dependencies {
implementation project(':shared')
}
shared/build.gradle
plugins {
id 'myproject.java-conventions'
}
services/person-service/build.gradle
plugins {
id 'myproject.java-conventions'
}
dependencies {
implementation project(':shared')
implementation project(':api')
}
An improper way to share build logic between subprojects is cross-project configuration via the
subprojects {} and allprojects {} DSL constructs.
TIP Avoid using subprojects {} and allprojects {}.
With cross-configuration, build logic can be injected into a subproject which is not obvious when
looking at its build script.
In the long run, cross-configuration usually grows in complexity and becomes a burden. Cross-
configuration can also introduce configuration-time coupling between projects, which can prevent
optimizations like configuration-on-demand from working properly.
The two most common uses of cross-configuration can be better modeled using convention plugins.
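For contrast, this is what the discouraged cross-configuration looks like — logic injected from the root build script that is invisible in the subprojects' own build scripts (the plugin and repository choices here are only illustrative):

```kotlin
// root build.gradle.kts — discouraged; shown only to illustrate the pattern
subprojects {
    apply(plugin = "java")
    repositories {
        mavenCentral()
    }
}
```

A convention plugin applied explicitly in each subproject's plugins {} block achieves the same effect while keeping every build script self-describing.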
Composite Builds
A composite build is a build that includes other builds.
A composite build is similar to a Gradle multi-project build, except that instead of including
subprojects, entire builds are included.
A composite build allows you to:
• Combine builds that are usually developed independently, for instance, when trying out a bug
fix in a library that your application uses.
• Decompose a large multi-project build into smaller, more isolated chunks that can be worked on
independently or together as needed.
A build that is included in a composite build is referred to as an included build. Included builds do
not share any configuration with the composite build or the other included builds. Each included
build is configured and executed in isolation.
The following example demonstrates how two Gradle builds, normally developed separately, can be
combined into a composite build.
my-composite
├── gradle
├── gradlew
├── settings.gradle.kts
├── build.gradle.kts
├── my-app
│ ├── settings.gradle.kts
│ └── app
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/my-app/Main.java
└── my-utils
├── settings.gradle.kts
├── number-utils
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/numberutils/Numbers.java
└── string-utils
├── build.gradle.kts
└── src/main/java/org/sample/stringutils/Strings.java
The my-utils multi-project build produces two Java libraries, number-utils and string-utils. The my-
app build produces an executable using functions from those libraries.
The my-app build does not depend directly on my-utils. Instead, it declares binary dependencies on
the libraries produced by my-utils:
my-app/app/build.gradle.kts
plugins {
id("application")
}
application {
mainClass = "org.sample.myapp.Main"
}
dependencies {
implementation("org.sample:number-utils:1.0")
implementation("org.sample:string-utils:1.0")
}
my-app/app/build.gradle
plugins {
id 'application'
}
application {
mainClass = 'org.sample.myapp.Main'
}
dependencies {
implementation 'org.sample:number-utils:1.0'
implementation 'org.sample:string-utils:1.0'
}
The --include-build command-line argument turns the executed build into a composite,
substituting dependencies from the included build into the executed build.
For example, running ./gradlew run --include-build ../my-utils from my-app executes my-app as a
composite, substituting the my-utils libraries with the included build’s projects.
The settings file can be used to add subprojects and included builds simultaneously.
settings.gradle.kts
rootProject.name = "my-composite"
includeBuild("my-app")
includeBuild("my-utils")
settings.gradle
rootProject.name = 'my-composite'
includeBuild 'my-app'
includeBuild 'my-utils'
To execute the run task in the my-app build from my-composite, run ./gradlew my-app:app:run.
You can optionally define a run task in my-composite that depends on my-app:app:run so that you can
execute ./gradlew run:
build.gradle.kts
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
A special case of included builds are builds that define Gradle plugins.
These builds should be included using the includeBuild statement inside the pluginManagement {}
block of the settings file.
Using this mechanism, the included build may also contribute a settings plugin that can be applied
in the settings file itself:
settings.gradle.kts
pluginManagement {
includeBuild("../url-verifier-plugin")
}
settings.gradle
pluginManagement {
includeBuild '../url-verifier-plugin'
}
Most builds can be included in a composite, including other composite builds. There are some
restrictions.
In a regular build, Gradle ensures that each project has a unique project path. It makes projects
identifiable and addressable without conflicts.
In a composite build, Gradle adds additional qualification to each project from an included build to
avoid project path conflicts. The full path to identify a project in a composite build is called a build-
tree path. It consists of a build path of an included build and a project path of the project.
By default, build paths and project paths are derived from directory names and structure on disk.
Since included builds can be located anywhere on disk, their build path is determined by the name
of the containing directory. This can sometimes lead to conflicts.
• Each included build path must not conflict with any project path of the main build.
These conditions guarantee that each project can be uniquely identified even in a composite build.
If conflicts arise, the way to resolve them is by changing the build name of an included build:
settings.gradle.kts
includeBuild("some-included-build") {
name = "other-name"
}
NOTE: When a composite build is included in another composite build, both builds have
the same parent. In other words, the nested composite build structure is flattened.
Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be
executed, tests can be run, and builds can be imported into the IDE.
Executing tasks
Tasks from an included build can be executed from the command-line or IDE in the same way as
tasks from a regular multi-project build. Executing a task will result in task dependencies being
executed, as well as those tasks required to build dependency artifacts from other included builds.
You can call a task in an included build using a fully qualified path, for example, :included-build-
name:project-name:taskName. Project and task names can be abbreviated.
$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava
$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava
To exclude a task from the command line, you need to provide the fully qualified path to the task.
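For example, assuming the build layout above (the task name test is illustrative), the -x flag with the fully qualified path excludes a task from an included build:

```shell
# exclude a task from an included build; task name is illustrative
$ ./gradlew build -x :included-build:subproject-a:test
```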
Importing a composite build permits sources from separate Gradle builds to be easily developed
together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse
Project. Source dependencies are configured, providing cross-build navigation and refactoring.
By default, Gradle will configure each included build to determine the dependencies it can provide.
The algorithm for doing this is simple. Gradle will inspect the group and name for the projects in
the included build and substitute project dependencies for any external dependency matching
${project.group}:${project.name}.
NOTE: To make the (sub)projects of the main build addressable by
${project.group}:${project.name}, you can tell Gradle to treat the main build like an
included build by self-including it: includeBuild(".").
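As an illustration of this matching rule (the coordinates are hypothetical), an included build whose root project declares group org.sample and has root project name my-lib replaces any matching external dependency in the composite:

```kotlin
// my-lib/build.gradle.kts (included build)
plugins {
    `java-library`
}
group = "org.sample"
version = "1.0"

// app/build.gradle.kts (the consuming build of the composite): because
// "org.sample:my-lib" matches ${project.group}:${project.name} of the
// included build, it is substituted with the local project.
dependencies {
    implementation("org.sample:my-lib:1.0")
}
```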
There are cases when the default substitutions determined by Gradle are insufficient or must be
corrected for a particular composite. For these cases, explicitly declaring the substitutions for an
included build is possible.
For example, a single-project build called anonymous-library produces a Java utility library but does
not declare a value for the group attribute:
build.gradle.kts
plugins {
java
}
build.gradle
plugins {
id 'java'
}
When this build is included in a composite, it will attempt to substitute for the dependency module
undefined:anonymous-library (undefined being the default value for project.group, and anonymous-
library being the root project name). Clearly, this isn’t useful in a composite build.
To use the unpublished library in a composite build, you can explicitly declare the substitutions
that it provides:
settings.gradle.kts
includeBuild("anonymous-library") {
dependencySubstitution {
substitute(module("org.sample:number-utils")).using(project(":"))
}
}
settings.gradle
includeBuild('anonymous-library') {
dependencySubstitution {
substitute module('org.sample:number-utils') using project(':')
}
}
With this configuration, the my-app composite build will substitute any dependency on
org.sample:number-utils with a dependency on the root project of anonymous-library.
If you need to resolve a published version of a module that is also available as part of an included
build, you can deactivate the included build substitution rules on the ResolutionStrategy of the
Configuration that is resolved. This is necessary because the rules are globally applied in the build,
and Gradle does not consider published versions during resolution by default.
For example, we create a separate publishedRuntimeClasspath configuration that gets resolved to the
published versions of modules that also exist in one of the local builds. This is done by deactivating
global dependency substitution rules:
build.gradle.kts
configurations.create("publishedRuntimeClasspath") {
resolutionStrategy.useGlobalDependencySubstitutionRules = false
extendsFrom(configurations.runtimeClasspath.get())
isCanBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE,
objects.named(Usage.JAVA_RUNTIME))
}
build.gradle
configurations.create('publishedRuntimeClasspath') {
resolutionStrategy.useGlobalDependencySubstitutionRules = false
extendsFrom(configurations.runtimeClasspath)
canBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage
.JAVA_RUNTIME))
}
Many builds will function automatically as an included build, without declared substitutions. Here
are some common cases where declared substitutions are required:
• When the archivesBaseName property is used to set the name of the published artifact.
• When the MavenPom.addFilter() is used to publish artifacts that don’t match the project name.
• When the maven-publish or ivy-publish plugins are used for publishing and the publication
coordinates don’t match ${project.group}:${project.name}.
Some builds won’t function correctly when included in a composite, even when dependency
substitutions are explicitly declared. This limitation is because a substituted project dependency
will always point to the default configuration of the target project. Any time the artifacts and
dependencies specified for the default configuration of a project don’t match what is published to a
repository, the composite build may exhibit different behavior.
In some cases, the published module metadata may differ from the project default configuration;
builds relying on such differences function incorrectly when included in a composite build.
While included builds are isolated from one another and cannot declare direct dependencies, a
composite build can declare task dependencies on its included builds. The included builds are
accessed using Gradle.getIncludedBuilds() or Gradle.includedBuild(java.lang.String), and a task
reference is obtained via the IncludedBuild.task(java.lang.String) method.
Using these APIs, it is possible to declare a dependency on a task in a particular included build:
build.gradle.kts
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
Or you can declare a dependency on tasks with a certain path in some or all of the included builds:
build.gradle.kts
tasks.register("publishDeps") {
dependsOn(gradle.includedBuilds.map {
it.task(":publishMavenPublicationToMavenRepository") })
}
build.gradle
tasks.register('publishDeps') {
dependsOn gradle.includedBuilds*.task(
':publishMavenPublicationToMavenRepository')
}
Current limitations of composite builds include:
• No support for included builds with publications that don’t mirror the project default
configuration.
See Cases where composite builds won’t work.
• Multiple composite builds may conflict when run in parallel if more than one includes the same
build.
Gradle does not share the project lock of a shared composite build between Gradle invocations
to prevent concurrent execution.
Configuration On Demand
Configuration-on-demand attempts to configure only the relevant projects for the requested tasks,
i.e., it only evaluates the build script file of projects participating in the build. This way, the
configuration time of a large multi-project build can be reduced.
The configuration-on-demand feature is incubating, so not every build is guaranteed to work
correctly. The feature works well for decoupled multi-project builds.
• The project in the directory where the build is executed is also configured, but only when
Gradle is executed without any tasks.
This way, the default tasks behave correctly when projects are configured on demand.
• The standard project dependencies are supported, and relevant projects are configured.
If project A has a compile dependency on project B, then building A causes the configuration of
both projects.
• The task dependencies declared via the task path are supported and cause relevant projects to
be configured.
Example: someTask.dependsOn(":some-other-project:someOtherTask")
• A task requested via task path from the command line (or tooling API) causes the relevant
project to be configured.
For example, building project-a:project-b:someTask causes configuration of project-b.
Enable configuration-on-demand
You can enable configuration-on-demand for a single invocation with the --configure-on-demand
command-line flag, or make it the default by setting org.gradle.configureondemand=true in the
gradle.properties file.
Decoupled projects
Gradle allows projects to access each other’s configurations and tasks during the configuration and
execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform
optimizations such as parallel project builds and configuration on demand.
Projects are considered decoupled when they interact solely through declared dependencies and
task dependencies. Any direct modification or reading of another project’s object creates coupling
between the projects. Coupling during configuration can result in flawed build outcomes when
using 'configuration on demand', while coupling during execution can affect parallel execution.
To keep projects decoupled:
• Refrain from referencing other subprojects' build scripts and prefer cross-configuration from
the root project.
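A sketch of the difference, with hypothetical project names:

```kotlin
// Coupled (avoid): reading another project's model directly at
// configuration time creates configuration-time coupling.
val otherVersion = project(":other").version

// Decoupled (prefer): interact only through declared dependencies,
// so Gradle can configure and execute the projects independently.
// (Assumes a plugin providing the "implementation" configuration.)
dependencies {
    implementation(project(":other"))
}
```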
Parallel projects
Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently
executing tasks from different projects.
To enable parallel execution, use the --parallel command-line argument or configure your build
environment. Gradle automatically determines the optimal number of parallel threads based on
CPU cores.
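To make parallel execution the default for a build, it can also be configured in gradle.properties (the worker cap shown is optional):

```properties
# gradle.properties — enable parallel execution for every invocation
org.gradle.parallel=true
# optional: cap the number of worker processes (defaults to CPU cores)
org.gradle.workers.max=4
```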
During parallel execution, each worker handles a specific project exclusively. Task dependencies
are respected, with workers prioritizing upstream tasks. However, tasks may not execute in
alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and
inputs/outputs to avoid ordering issues.
DEVELOPING TASKS
Understanding Tasks
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
Before reading this chapter, it’s recommended that you first read the Learning The Basics and
complete the Tutorial.
Listing tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in a project by running the following command in the terminal:
$ ./gradlew tasks
Let’s take a very basic Gradle project as an example. The project has the following structure:
gradle-project
├── app
│ ├── build.gradle.kts // empty file - no build logic
│ └── ... // some java code
├── settings.gradle.kts // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat
gradle-project
├── app
│ ├── build.gradle // empty file - no build logic
│ └── ... // some java code
├── settings.gradle // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat
settings.gradle.kts
rootProject.name = "gradle-project"
include("app")
settings.gradle
rootProject.name = 'gradle-project'
include('app')
To see the tasks available in the app subproject, run ./gradlew :app:tasks:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
We observe that only a small number of help tasks are available at the moment. This is because the
core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build
your project or compile your code, are added by plugins.
Let’s explore this by adding the Gradle core base plugin to the app build script:
app/build.gradle.kts
plugins {
id("base")
}
app/build.gradle
plugins {
id('base')
}
The base plugin adds central lifecycle tasks. Now when we run ./gradlew app:tasks, we can see the
assemble and build tasks are available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
Task outcomes
When Gradle executes a task, it labels the task with outcomes via the console.
These labels are based on whether a task has actions to execute and if Gradle executed them.
Actions include, but are not limited to, compiling code, zipping files, and publishing archives.
EXECUTED
Task executed its actions.
• Task has actions and Gradle determined they must be executed as part of the build.
• Task has no actions and some dependencies, and Gradle executed one or more of the
dependencies. See also Lifecycle Tasks.
UP-TO-DATE
Task’s outputs did not change.
• Task has outputs and inputs but they have not changed. See Incremental Build.
• Task has actions, but the task tells Gradle it did not change its outputs.
• Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED
or FROM-CACHE. See Lifecycle Tasks.
FROM-CACHE
Task’s outputs could be found from a previous execution.
• Task has outputs restored from the build cache. See Build Cache.
SKIPPED
Task did not execute its actions.
• Task has been explicitly excluded from the command-line. See Excluding tasks from
execution.
NO-SOURCE
Task did not need to execute its actions.
• Task has inputs and outputs, but no sources (i.e., inputs were not found).
Task groups and descriptions are used to organize and describe tasks.
Groups
Task groups are used to categorize tasks. When you run ./gradlew tasks, tasks are listed under
their respective groups, making it easier to understand their purpose and relationship to other
tasks. Groups are set using the group property.
Descriptions
Descriptions provide a brief explanation of what a task does. When you run ./gradlew tasks, the
descriptions are shown next to each task, helping you understand its purpose and how to use it.
Descriptions are set using the description property.
Let’s consider a basic Java application as an example. The build contains a subproject called app.
$ ./gradlew :app:tasks
Application tasks
-----------------
run - Runs this project as a JVM application.
Build tasks
-----------
assemble - Assembles the outputs of this project.
Here, the :run task is part of the Application group with the description Runs this project as a JVM
application. In code, it would look something like this:
app/build.gradle.kts
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
app/build.gradle
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
However, a task will only show up when running :tasks if task.group is set or no other task depends
on it.
For instance, the following task will not appear when running ./gradlew :app:tasks because it does
not have a group; it is called a hidden task:
app/build.gradle.kts
tasks.register("helloTask") {
println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
println("Hello")
}
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
app/build.gradle.kts
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
Other tasks
-----------
helloTask - Hello task
In contrast, ./gradlew tasks --all will show all tasks; hidden and visible tasks are listed.
Grouping tasks
If you want to customize which tasks are shown to users when listed, you can group tasks and set
the visibility of each group.
NOTE: Remember, even if you hide tasks, they are still available, and Gradle can still run them.
Let’s start with an example built by Gradle init for a Java application with multiple subprojects.
The project structure is as follows:
gradle-project
├── app
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle.kts
│ ├── settings.gradle.kts
│ └── src // common build logic
│ └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat
gradle-project
├── app
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle
│ ├── settings.gradle
│ └── src // common build logic
│ └── ...
├── settings.gradle
├── gradle
├── gradlew
└── gradlew.bat
$ ./gradlew :app:tasks
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
If we look at the list of tasks available, even for a standard Java project, it’s extensive. Many of these
tasks are rarely required directly by developers using the build.
We can configure the :tasks task and limit the tasks shown to a certain group.
Let’s create our own group so that all tasks are hidden by default by updating the app build script:
app/build.gradle.kts
app/build.gradle
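The omitted script could resemble the following sketch; it assumes Gradle's TaskReportTask type and its displayGroup property, so treat it as illustrative rather than the exact original listing:

```kotlin
// app/build.gradle.kts -- illustrative sketch, not the exact omitted listing
val myBuildGroup = "my app build" // a custom group name (hypothetical)

// Register a helper task that can still show every group in detail
tasks.register<TaskReportTask>("tasksAll") {
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

// Limit the regular `tasks` report to our custom group
tasks.named<TaskReportTask>("tasks") {
    displayGroup = myBuildGroup
}
```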
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Task categories
1. Lifecycle tasks
2. Actionable tasks
Lifecycle tasks define targets you can call, such as :build, which builds your project. Lifecycle
tasks do not provide Gradle with actions. They must be wired to actionable tasks. The base Gradle
plugin only adds lifecycle tasks.
Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java
code of your project. Actions include creating JARs, zipping files, publishing archives, and much
more. Plugins like the java-library plugin add actionable tasks.
Let’s update the build script of the previous example, which is currently an empty file so that our
app subproject is a Java library:
app/build.gradle.kts
plugins {
id("java-library")
}
app/build.gradle
plugins {
id('java-library')
}
Once again, we list the available tasks to see what new tasks are available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
We see that many new tasks are available, such as jar and testClasses.
Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks. If we call the
:build task, we can see several tasks have been executed, including the :app:compileJava task.
$./gradlew :app:build
Incremental tasks
Gradle can reuse results from prior builds. Therefore, if we’ve built our project before and made
only minor changes, rerunning :build will not require Gradle to perform extensive work.
For example, if we modify only the test code in our project, leaving the production code unchanged,
executing the build will solely recompile the test code. Gradle marks the tasks for the production
code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:
$./gradlew :app:build
Caching tasks
Gradle can reuse results from past builds using the build cache.
To enable this feature, activate the build cache by using the --build-cache command line parameter
or by setting org.gradle.caching=true in your gradle.properties file.
When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.
The build cache is handy if you switch between branches regularly. Gradle supports both local and
remote build caches.
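To make caching the default rather than passing --build-cache each time, the setting can be persisted; a minimal `gradle.properties` sketch:

```properties
# gradle.properties -- reuse task outputs across builds and branches
org.gradle.caching=true
```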
Developing tasks
1. Registering a task - using a task (implemented by you or provided by Gradle) in your build
logic.
2. Configuring a task - defining the inputs and outputs of a registered task.
3. Implementing a task - creating a custom task class (i.e., custom class type).
tasks.register<Copy>("myCopy") ①
tasks.named<Copy>("myCopy") { ②
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.
② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.
③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.
tasks.register('myCopy', Copy) ①
tasks.named('myCopy', Copy) { ②
from "resources"
into "target"
include "**/*.txt", "**/*.xml", "**/*.properties"
}
① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.
② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.
③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.
1. Registering tasks
You define actions for Gradle to take by registering tasks in build scripts or plugins.
Tasks are defined using strings for task names:
build.gradle.kts
tasks.register("hello") {
doLast {
println("hello")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'hello'
}
}
In the example above, the task is added to the TaskCollection using the register() method in
TaskContainer.
2. Configuring tasks
Gradle tasks must be configured to complete their action(s) successfully. If a task needs to ZIP a file,
it must be configured with the file name and location. You can refer to the API for the Gradle Zip
task to learn how to configure it appropriately.
Let’s look at the Copy task provided by Gradle as an example. We first register a task called myCopy of
type Copy in the build script:
build.gradle.kts
tasks.register<Copy>("myCopy")
build.gradle
tasks.register('myCopy', Copy)
This registers a copy task with no default behavior. Since the task is of type Copy, a Gradle supported
task type, it can be configured using its API.
The following examples show several ways to achieve the same configuration:
build.gradle.kts
tasks.named<Copy>("myCopy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.named('myCopy') {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
build.gradle.kts
tasks.register<Copy>("copy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.register('copy', Copy) {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
copy {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
NOTE This option breaks task configuration avoidance and is not recommended!
Regardless of the method chosen, the task is configured with the name of the files to be copied and
the location of the files.
3. Implementing tasks
Gradle provides many task types including Delete, Javadoc, Copy, Exec, Tar, and Pmd. You can
implement a custom task type if Gradle does not provide a task type that meets your build logic
needs.
To create a custom task class, you extend DefaultTask and make the extending class abstract:
app/build.gradle.kts
app/build.gradle
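Since the listings are omitted here, the following is a minimal illustrative sketch of such a class (the CreateFileTask name and behavior are assumptions, not from the original):

```kotlin
// app/build.gradle.kts -- illustrative sketch of a custom task type
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        // The single task action: create a file with fixed content
        val file = File("myfile.txt")
        file.createNewFile()
        file.writeText("HELLO FROM MY TASK")
    }
}

// Register the custom task so it can be called with ./gradlew createFileTask
tasks.register<CreateFileTask>("createFileTask") {
    group = "custom"
    description = "Creates myfile.txt in the current directory."
}
```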
You can learn more about developing custom task types in Implementing Tasks.
Lazy Configuration
Gradle provides lazy properties, which delay the calculation of a property’s value until it is
actually required. Lazy properties provide two main benefits:
1. Deferred Value Resolution: Allows wiring Gradle models without needing to know when a
property’s value will be known. For example, you may want to set the input source files of a
task based on the source directories property of an extension, but the extension property value
isn’t known until the build script or some other plugin configures them.
2. Automatic Task Dependency Management: Connects output of one task to input of another,
automatically determining task dependencies. Property instances carry information about
which task, if any, produces their value. Build authors do not need to worry about keeping task
dependencies in sync with configuration changes.
Provider
Represents a value that can only be queried and cannot be changed.
• Many other types extend Provider and can be used wherever a Provider is required.
Property
Represents a value that can be queried and changed.
• The method Property.set(T) specifies a value for the property, overwriting whatever value
may have been present.
• The method Property.set(Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together
Provider and Property instances before the values are configured.
Lazy properties are intended to be passed around and only queried when required. This typically
happens during the execution phase.
The following demonstrates a task with a configurable greeting property and a read-only message
property:
build.gradle.kts
@Internal
val message: Provider<String> = greeting.map { it + " from Gradle" } ③
@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}
tasks.register<Greeting>("greeting") {
greeting.set("Hi") ④
greeting = "Hi" ⑤
}
build.gradle
@Internal
final Provider<String> message = greeting.map { it + ' from Gradle' } ③
@TaskAction
void printMessage() {
logger.quiet(message.get())
}
}
tasks.register("greeting", Greeting) {
greeting.set('Hi') ④
greeting = 'Hi' ⑤
}
$ gradle greeting
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
The Greeting task has a property of type Property<String> to represent the configurable greeting
and a property of type Provider<String> to represent the calculated, read-only, message. The
message Provider is created from the greeting Property using the map() method; its value is kept up-
to-date as the value of the greeting property changes.
Creating a Property or Provider instance
Neither Provider nor its subtypes, such as Property, are intended to be implemented by a build
script or plugin. Gradle provides factory methods to create instances of these types instead.
See the Quick Reference for all of the types and factories available.
NOTE: When writing a plugin or build script with Groovy, you can use the map(Transformer)
method with a closure, and Groovy will convert the closure to a Transformer.
Similarly, when writing a plugin or build script with Kotlin, the Kotlin compiler will
convert a Kotlin function into a Transformer.
Connecting properties together
An important feature of lazy properties is that they can be connected together so that changes to
one property are automatically reflected in other properties.
Here is an example where the property of a task is connected to a property of a project extension:
build.gradle.kts
// A project extension
interface MessageExtension {
// A configurable greeting
abstract val greeting: Property<String>
}
@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}
messages.apply {
// Configure the greeting on the extension
// Note that there is no need to reconfigure the task's `greeting` property.
// This is automatically updated as the extension property changes
greeting = "Hi"
}
build.gradle
// A project extension
interface MessageExtension {
// A configurable greeting
Property<String> getGreeting()
}
@TaskAction
void printMessage() {
logger.quiet(message.get())
}
}
messages {
// Configure the greeting on the extension
// Note that there is no need to reconfigure the task's `greeting` property.
// This is automatically updated as the extension property changes
greeting = 'Hi'
}
$ gradle greeting
This example calls the Property.set(Provider) method to attach a Provider to a Property to supply the
value of the property. In this case, the Provider happens to be a Property as well, but you can
connect any Provider implementation, for example, one created using Provider.map().
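For instance (a sketch reusing the Greeting task and messages extension from the examples above, so the names are assumptions carried over from those samples), a mapped Provider can be attached just as directly:

```kotlin
// build.gradle.kts -- illustrative sketch
tasks.named<Greeting>("greeting") {
    // Attach a Provider derived with map(); the task property stays in
    // sync as the extension's greeting changes
    greeting = messages.greeting.map { it.uppercase() }
}
```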
Working with files
In Working with Files, we introduced four collection types for File-like objects:
• FileCollection and its configurable counterpart, ConfigurableFileCollection
• FileTree and its configurable counterpart, ConfigurableFileTree
There are more strongly typed models used to represent elements of the file system: Directory and
RegularFile. These types shouldn’t be confused with the standard Java File type as they are used to
tell Gradle that you expect more specific values such as a directory or a non-directory, regular file.
Gradle provides two specialized Property subtypes for dealing with values of these types:
RegularFileProperty and DirectoryProperty. ObjectFactory has methods to create these:
ObjectFactory.fileProperty() and ObjectFactory.directoryProperty().
A DirectoryProperty can also be used to create a lazily evaluated Provider for a Directory and
RegularFile via DirectoryProperty.dir(String) and DirectoryProperty.file(String) respectively. These
methods create providers whose values are calculated relative to the location for the
DirectoryProperty they were created from. The values returned from these providers will reflect
changes to the DirectoryProperty.
build.gradle.kts
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource : DefaultTask() {
// The configuration file to use to generate the source file
@get:InputFile
abstract val configFile: RegularFileProperty
// The directory to write source files to
@get:OutputDirectory
abstract val outputDir: DirectoryProperty
@TaskAction
fun compile() {
val inFile = configFile.get().asFile
logger.quiet("configuration file = $inFile")
val dir = outputDir.get().asFile
logger.quiet("output dir = $dir")
val className = inFile.readText().trim()
val srcFile = File(dir, "${className}.java")
srcFile.writeText("public class ${className} { }")
}
}
build.gradle
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource extends DefaultTask {
// The configuration file to use to generate the source file
@InputFile
abstract RegularFileProperty getConfigFile()
// The directory to write source files to
@OutputDirectory
abstract DirectoryProperty getOutputDir()
@TaskAction
def compile() {
def inFile = configFile.get().asFile
logger.quiet("configuration file = $inFile")
def dir = outputDir.get().asFile
logger.quiet("output dir = $dir")
def className = inFile.text.trim()
def srcFile = new File(dir, "${className}.java")
srcFile.text = "public class ${className} { ... }"
}
}
$ gradle generate
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
$ gradle generate
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
This example creates providers that represent locations in the project and build directories through
Project.getLayout() with ProjectLayout.getBuildDirectory() and ProjectLayout.getProjectDirectory().
To close the loop, note that a DirectoryProperty, or a simple Directory, can be turned into a FileTree
that allows the files and directories contained in the directory to be queried with
DirectoryProperty.getAsFileTree() or Directory.getAsFileTree(). From a DirectoryProperty or a
Directory, you can create FileCollection instances containing a set of the files contained in the
directory with DirectoryProperty.files(Object...) or Directory.files(Object...).
Working with task inputs and outputs
Many builds have several tasks connected together, where one task consumes the outputs of
another task as an input.
To make this work, we need to configure each task to know where to look for its inputs and where
to place its outputs. Ensure that the producing and consuming tasks are configured with the same
location and attach task dependencies between the tasks. This can be cumbersome and brittle if any
of these values are configurable by a user or configured by multiple plugins, as task properties need
to be configured in the correct order and locations, and task dependencies kept in sync as values
change.
The Property API makes this easier by keeping track of the value of a property and the task that
produces the value.
As an example, consider the following plugin with a producer and consumer task which are wired
together:
build.gradle.kts
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText(message)
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
val input = inputFile.get().asFile
val message = input.readText()
logger.quiet("Read '${message}' from ${input}")
}
}
producer {
// Set values for the producer lazily
// Don't need to update the consumer.inputFile property. This is
// automatically updated as producer.outputFile changes
outputFile = layout.buildDirectory.file("file.txt")
}
build.gradle
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
def input = inputFile.get().asFile
def message = input.text
logger.quiet("Read '${message}' from ${input}")
}
}
producer.configure {
// Set values for the producer lazily
// Don't need to update the consumer.inputFile property. This is
// automatically updated as producer.outputFile changes
outputFile = layout.buildDirectory.file('file.txt')
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
$ gradle consumer
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
In the example above, the task outputs and inputs are connected before any location is defined. The
setters can be called at any time before the task is executed, and the change will automatically
affect all related input and output properties.
Another important thing to note in this example is the absence of any explicit task dependency.
Task outputs represented using Providers keep track of which task produces their value, and using
them as task inputs will implicitly add the correct task dependencies.
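The wiring itself is elided from the listing above; a sketch of the usual pattern, assuming Producer and Consumer task types with outputFile/inputFile properties as shown:

```kotlin
// build.gradle.kts -- illustrative sketch
val producer = tasks.register<Producer>("producer")
val consumer = tasks.register<Consumer>("consumer")

consumer {
    // flatMap carries producer information with the value, so this line
    // also adds the implicit consumer -> producer task dependency
    inputFile = producer.flatMap { it.outputFile }
}
```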
Implicit task dependencies also work for input properties that are not files:
build.gradle.kts
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText(message)
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
logger.quiet(message.get())
}
}
build.gradle
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
logger.quiet(message.get())
}
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
$ gradle consumer
> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/build/file.txt
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Working with collections
Gradle provides two lazy property types to help configure Collection properties:
• ListProperty, for configuring List values, created via ObjectFactory.listProperty(Class).
• SetProperty, for configuring Set values, created via ObjectFactory.setProperty(Class).
These work exactly like any other Provider and, just like file providers, they have additional
modeling around them. Both types allow you to overwrite the entire collection value with
HasMultipleValues.set(Iterable) and HasMultipleValues.set(Provider) or add new elements through
the various add methods:
• HasMultipleValues.add(T): add a single element to the collection
• HasMultipleValues.add(Provider<T>): add a lazily calculated element to the collection
• HasMultipleValues.addAll(Provider<? extends Iterable<T>>): add a lazily calculated collection of
elements to the collection
Just like every Provider, the collection is calculated when Provider.get() is called. The following
example shows the ListProperty in action:
build.gradle.kts
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText(message)
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
inputFiles.get().forEach { inputFile ->
val input = inputFile.asFile
val message = input.readText()
logger.quiet("Read '${message}' from ${input}")
}
}
}
build.gradle
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
inputFiles.get().each { inputFile ->
def input = inputFile.asFile
def message = input.text
logger.quiet("Read '${message}' from ${input}")
}
}
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
$ gradle consumer
BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
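The registration wiring is elided from the listing above; a sketch, assuming the same Producer and Consumer task types where the consumer exposes a ListProperty of output files:

```kotlin
// build.gradle.kts -- illustrative sketch
val producerOne = tasks.register<Producer>("producerOne")
val producerTwo = tasks.register<Producer>("producerTwo")

tasks.register<Consumer>("consumer") {
    // Each added Provider carries an implicit dependency on its producer
    inputFiles.add(producerOne.flatMap { it.outputFile })
    inputFiles.add(producerTwo.flatMap { it.outputFile })
}
```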
Working with maps
Gradle provides a lazy MapProperty type to allow Map values to be configured. You can create a
MapProperty instance using ObjectFactory.mapProperty(Class, Class).
Similar to other property types, a MapProperty has a set() method that you can use to specify the
value for the property. Some additional methods allow entries with lazy values to be added to the
map.
build.gradle.kts
@TaskAction
fun generate() {
properties.get().forEach { entry ->
logger.quiet("${entry.key} = ${entry.value}")
}
}
}
tasks.register<Generator>("generate") {
properties.put("a", 1)
// Values have not been configured yet
properties.put("b", providers.provider { b })
properties.putAll(providers.provider { mapOf("c" to c, "d" to c + 1) })
}
build.gradle
@TaskAction
void generate() {
properties.get().each { key, value ->
logger.quiet("${key} = ${value}")
}
}
}
tasks.register('generate', Generator) {
properties.put("a", 1)
// Values have not been configured yet
properties.put("b", providers.provider { b })
properties.putAll(providers.provider { [c: c, d: c + 1] })
}
$ gradle generate
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
Applying a convention to a property
Often, you want to apply some convention, or default value to a property to be used if no value has
been configured. You can use the convention() method for this. This method accepts either a value
or a Provider, and this will be used as the value until some other value is configured.
build.gradle.kts
tasks.register("show") {
val property = objects.property(String::class)
// Set a convention
property.convention("convention 1")
property.set("explicit value")
// Once a value is set, the convention is ignored
property.convention("ignored convention")
doLast {
println("value = " + property.get())
}
}
build.gradle
tasks.register("show") {
def property = objects.property(String)
// Set a convention
property.convention("convention 1")
property.set("explicit value")
// Once a value is set, the convention is ignored
property.convention("ignored convention")
doLast {
println("value = " + property.get())
}
}
$ gradle show
value = explicit value
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
Plugin authors may configure a convention on a lazy property from a plugin’s apply() method,
while performing preliminary configuration of the task or extension defining the property. This
works well for regular plugins (meant to be distributed and used in the wild), and internal
convention plugins (which often configure properties defined by third party plugins in a uniform
way for the entire build).
Build engineers may configure a convention on a lazy property from shared build logic that is
configuring tasks (for instance, from third-party plugins) in a standard way for the entire build.
build.gradle.kts
apply<GreetingPlugin>()
tasks.withType<GreetingTask>().configureEach {
// setting convention from build script
guest.convention("Guest")
}
build.gradle
tasks.withType(GreetingTask).configureEach {
// setting convention from build script
guest.convention("Guest")
}
Note that for project-specific values, instead of conventions, you should prefer setting explicit
values (using Property.set(…) or ConfigurableFileCollection.setFrom(…), for instance), as
conventions are only meant to define defaults.
A task author may configure a convention on a lazy property from the task constructor or (if in
Kotlin) initializer block. This approach works for properties with trivial defaults, but it is not
appropriate if additional context (external to the task implementation) is required in order to set a
suitable default.
build.gradle.kts
init {
guest.convention("person2")
}
build.gradle
GreetingTask() {
guest.convention("person2")
}
You may configure a convention on a lazy property next to the place where the property is
declared. Note this option is not available for managed properties, and has the same caveats as
configuring a convention from the task constructor.
Making a property unmodifiable
For example, a property that specifies the output directory for a compilation task may start with a
value specified by a plugin. Then a build script might change the value to some custom location,
then this value is used by the task when it runs. However, once the task starts to run, we want to
prevent further property changes. This way we avoid errors that result from different consumers,
such as the task action, Gradle’s up-to-date checks, build caching, or other tasks, using different
values for the property.
Lazy properties provide several methods that you can use to disallow changes to their value once
the value has been configured. The finalizeValue() method calculates the final value for the
property and prevents further changes to the property.
libVersioning.version.finalizeValue()
When the property’s value comes from a Provider, the provider is queried for its current value, and
the result becomes the final value for the property. This final value replaces the provider and the
property no longer tracks the value of the provider. Calling this method also makes a property
instance unmodifiable and any further attempts to change the value of the property will fail. Gradle
automatically makes the properties of a task final when the task starts execution.
The finalizeValueOnRead() method is similar, except that the property’s final value is not calculated
until the value of the property is queried.
modifiedFiles.finalizeValueOnRead()
In other words, this method calculates the final value lazily as required, whereas finalizeValue()
calculates the final value eagerly. Use finalizeValueOnRead() when the value may be expensive to
calculate or may not have been configured yet, but you still want to ensure that all consumers of
the property see the same value when they query it.
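The difference between the two methods can be sketched with a toy model in plain Java (ToyProperty and its Supplier-based "provider" are invented for illustration; this is not how Gradle implements Property):

```java
import java.util.function.Supplier;

// Toy property backed by a provider (Supplier); shows eager vs lazy finalization.
class ToyProperty {
    private final Supplier<String> provider;
    private String finalValue;
    private boolean finalizeOnRead;

    ToyProperty(Supplier<String> provider) { this.provider = provider; }

    // finalizeValue(): query the provider now and freeze the result.
    void finalizeValue() { finalValue = provider.get(); }

    // finalizeValueOnRead(): freeze lazily, on the first get().
    void finalizeValueOnRead() { finalizeOnRead = true; }

    String get() {
        if (finalValue == null) {
            String v = provider.get();
            if (finalizeOnRead) finalValue = v;
            return v;
        }
        return finalValue;
    }
}
```

With finalizeValueOnRead(), the provider is queried exactly once, on first read; with finalizeValue(), it would be queried immediately.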
Using the Provider API
Guidelines to be successful with the Provider API:
1. The Property and Provider types have all of the overloads you need to query or configure a
value. For this reason, you should follow these guidelines:
◦ For configurable properties, expose the Property directly through a single getter.
◦ For non-configurable properties, expose a Provider directly through a single getter.
2. When migrating a plugin to the Provider API: if it’s a stable property, add a new Property or
Provider and deprecate the old one. Wire the old getters/setters into the new property as
appropriate.
The following lists common Provider API types along with the factories that create them:
Provider<RegularFile>
File on disk
Factories
• Provider.map(Transformer).
• Provider.flatMap(Transformer).
• DirectoryProperty.file(String)
Provider<Directory>
Directory on disk
Factories
• Provider.map(Transformer).
• Provider.flatMap(Transformer).
• DirectoryProperty.dir(String)
FileCollection
Unstructured collection of files
Factories
• Project.files(Object[])
• ProjectLayout.files(Object...)
• DirectoryProperty.files(Object...)
FileTree
Hierarchy of files
Factories
• Project.fileTree(Object) will produce a ConfigurableFileTree, or you can use
Project.zipTree(Object) and Project.tarTree(Object)
• DirectoryProperty.getAsFileTree()
RegularFileProperty
File on disk
Factories
• ObjectFactory.fileProperty()
DirectoryProperty
Directory on disk
Factories
• ObjectFactory.directoryProperty()
ConfigurableFileCollection
Unstructured collection of files
Factories
• ObjectFactory.fileCollection()
ConfigurableFileTree
Hierarchy of files
Factories
• ObjectFactory.fileTree()
SourceDirectorySet
Hierarchy of source directories
Factories
• ObjectFactory.sourceDirectorySet(String, String)
ListProperty<T>
a property whose value is List<T>
Factories
• ObjectFactory.listProperty(Class)
SetProperty<T>
a property whose value is Set<T>
Factories
• ObjectFactory.setProperty(Class)
Provider&lt;T&gt;
a value of type T that can only be queried and cannot be changed
Factories
• Provider.map(Transformer)
• Provider.flatMap(Transformer)
• ProviderFactory.provider(Callable)
Property&lt;T&gt;
a value of type T that can be queried and also changed
Factories
• ObjectFactory.property(Class)
Gradle provides an API that can split tasks into sections that can be executed in parallel. This
allows Gradle to fully utilize the resources available and complete builds faster.
The Worker API
The Worker API provides the ability to break up the execution of a task action into discrete units of
work and then execute that work concurrently and asynchronously.
The best way to understand how to use the API is to go through the process of converting an
existing custom task to use the Worker API:
1. You’ll start by creating a custom task class that generates MD5 hashes for a configurable set of
files.
2. Then, you’ll convert this custom task to use the Worker API.
3. Finally, you’ll explore running the task with different levels of isolation.
In the process, you’ll learn about the basics of the Worker API and the capabilities it provides.
First, create a custom task that generates MD5 hashes of a configurable set of files.
buildSrc/build.gradle.kts
repositories {
mavenCentral()
}
dependencies {
implementation("commons-io:commons-io:2.5")
implementation("commons-codec:commons-codec:1.9") ①
}
buildSrc/build.gradle
repositories {
mavenCentral()
}
dependencies {
implementation 'commons-io:commons-io:2.5'
implementation 'commons-codec:commons-codec:1.9' ①
}
① Your custom task class will use Apache Commons Codec to generate MD5 hashes.
Next, create a custom task class in your buildSrc/src/main/java directory. You should name this
class CreateMD5:
buildSrc/src/main/java/CreateMD5.java
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.SourceTask;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkerExecutor;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
abstract public class CreateMD5 extends SourceTask { ①
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory(); ②
@TaskAction
public void createHashes() {
for (File sourceFile : getSource().getFiles()) { ③
try {
                InputStream stream = new FileInputStream(sourceFile);
                System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
                // Artificially make this task slower.
                Thread.sleep(3000); ④
                Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5"); ⑤
                FileUtils.writeStringToFile(md5File.get().getAsFile(), DigestUtils.md5Hex(stream), (String) null);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
}
}
① SourceTask is a convenience type for tasks that operate on a set of source files.
② The task output will go into a configured directory.
③ The task iterates over all the files defined as "source files" and creates an MD5 hash of each.
④ Insert an artificial sleep to simulate hashing a large file (the sample files won’t be that large).
⑤ The MD5 hash of each file is written to the output directory into a file of the same name with an
"md5" extension.
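As an aside, the DigestUtils.md5Hex(...) call in the listing can be approximated with only the JDK's MessageDigest (a sketch; Md5 is an invented helper name, and the sample task keeps Commons Codec deliberately, since that dependency motivates the classpath isolation discussion below):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hex-encoded MD5 of a byte array using only the JDK.
class Md5 {
    static String md5Hex(byte[] data) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5").digest(data);
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format("%02x", b)); // two lowercase hex digits per byte
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e); // MD5 is always available on the JVM
        }
    }
}
```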
build.gradle.kts
plugins { id("base") } ①
tasks.register<CreateMD5>("md5") {
destinationDirectory = project.layout.buildDirectory.dir("md5") ②
source(project.layout.projectDirectory.file("src")) ③
}
build.gradle
plugins { id 'base' } ①
tasks.register("md5", CreateMD5) {
destinationDirectory = project.layout.buildDirectory.dir("md5") ②
source(project.layout.projectDirectory.file('src')) ③
}
① Apply the base plugin so that you’ll have a clean task to use to remove the output.
② The MD5 hash files will be written to the build/md5 directory.
③ This task will generate MD5 hash files for every file in the src directory.
You will need some source to generate MD5 hashes from. Create three text files in the src
directory, each containing a line of text (any content will do):
src/einstein.txt
src/feynman.txt
I was born not knowing and have had only a little time to change that here and there.
src/hawking.txt
$ gradle md5
BUILD SUCCESSFUL in 9s
3 actionable tasks: 3 executed
In the build/md5 directory, you should now see corresponding files with an md5 extension containing
MD5 hashes of the files from the src directory. Notice that the task takes at least 9 seconds to run
because it hashes each file one at a time (i.e., three files at ~3 seconds apiece).
Although this task processes each file in sequence, the processing of each file is independent of any
other file. This work can be done in parallel and take advantage of multiple processors. This is
where the Worker API can help.
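Outside of Gradle, the same idea — independent units of work executed concurrently — can be sketched with a plain ExecutorService (an analogy only; ParallelMd5 is an invented name, and the Worker API adds isolation, daemon reuse, and build-lifecycle integration on top of this):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Independent per-file units of work run concurrently on a thread pool.
class ParallelMd5 {
    static String md5Hex(String s) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5").digest(s.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    static Map<String, String> hashAll(Map<String, String> contentsByFile) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
        try {
            // Submit one unit of work per file; each is independent of the others.
            Map<String, Future<String>> pending = new LinkedHashMap<>();
            contentsByFile.forEach((name, text) -> pending.put(name, pool.submit(() -> md5Hex(text))));
            // Collect the results as the units of work complete.
            Map<String, String> hashes = new LinkedHashMap<>();
            for (Map.Entry<String, Future<String>> e : pending.entrySet()) {
                hashes.put(e.getKey(), e.getValue().get());
            }
            return hashes;
        } finally {
            pool.shutdown();
        }
    }
}
```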
To use the Worker API, you need to define an interface that represents the parameters of each unit
of work and extends org.gradle.workers.WorkParameters.
For the generation of MD5 hash files, the unit of work will require two parameters:
• the file to be hashed
• the file to write the hash to
There is no need to create a concrete implementation because Gradle will generate one for us at
runtime.
buildSrc/src/main/java/MD5WorkParameters.java
import org.gradle.api.file.RegularFileProperty;
import org.gradle.workers.WorkParameters;
public interface MD5WorkParameters extends WorkParameters {
    RegularFileProperty getSourceFile(); ①
    RegularFileProperty getMD5File(); ①
}
① Use Property objects to represent the source and MD5 hash files.
Then, you need to refactor the part of your custom task that does the work for each individual file
into a separate class. This class is your "unit of work" implementation, and it should be an abstract
class that extends org.gradle.workers.WorkAction:
buildSrc/src/main/java/GenerateMD5.java
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.workers.WorkAction;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
public abstract class GenerateMD5 implements WorkAction<MD5WorkParameters> { ①
    @Override
    public void execute() {
        try {
            File sourceFile = getParameters().getSourceFile().getAsFile().get();
            File md5File = getParameters().getMD5File().getAsFile().get();
            InputStream stream = new FileInputStream(sourceFile);
            FileUtils.writeStringToFile(md5File, DigestUtils.md5Hex(stream), (String) null);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
① Do not implement the getParameters() method - Gradle will inject this at runtime.
Now, change your custom task class to submit work to the WorkerExecutor instead of doing the
work itself.
buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.workers.*;
import org.gradle.api.file.DirectoryProperty;
import javax.inject.Inject;
import java.io.File;
abstract public class CreateMD5 extends SourceTask {
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor(); ①
@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().noIsolation(); ②
        for (File sourceFile : getSource().getFiles()) {
            Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5");
            workQueue.submit(GenerateMD5.class, parameters -> { ③
                parameters.getSourceFile().set(sourceFile);
                parameters.getMD5File().set(md5File);
            });
        }
    }
}
① The WorkerExecutor service is required in order to submit your work. Create an abstract getter
method annotated with javax.inject.Inject, and Gradle will inject the service at runtime when the
task is created.
② Before submitting work, get a WorkQueue object with the desired isolation mode (described
below).
③ When submitting the unit of work, specify the unit of work implementation, in this case
GenerateMD5, and configure its parameters.
$ gradle md5
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
The results should look the same as before, although the MD5 hash files may be generated in a
different order since the units of work are executed in parallel. This time, however, the task runs
much faster. This is because the Worker API executes the MD5 calculation for each file in parallel
rather than in sequence.
The isolation mode controls how strongly Gradle will isolate items of work from each other and the
rest of the Gradle runtime.
1. noIsolation()
2. classLoaderIsolation()
3. processIsolation()
The noIsolation() mode is the lowest level of isolation and will prevent a unit of work from
changing the project state. This is the fastest isolation mode because it requires the least overhead
to set up and execute the work item. However, it will use a single shared classloader for all units of
work. This means that units of work can affect one another through static class state. It also
means that every unit of work uses the same version of libraries on the buildscript classpath. If you
wanted the user to be able to configure the task to run with a different (but compatible) version of
the Apache Commons Codec library, you would need to use a different isolation mode.
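The shared-classloader caveat is easy to picture: any static field is visible to every unit of work that runs in the same classloader (SharedState is a contrived example, not Gradle API):

```java
// Units of work sharing one classloader also share static class state.
class SharedState {
    static int counter = 0; // visible to every unit of work in the same classloader

    static int runUnitOfWork() {
        return ++counter; // each unit observes the changes made by the others
    }
}
```

With classLoaderIsolation() or processIsolation(), each work queue gets its own classloader, so this kind of cross-talk cannot happen.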
First, you must change the dependency in buildSrc/build.gradle to be compileOnly. This tells Gradle
that it should use this dependency when building the classes, but should not put it on the build
script classpath:
buildSrc/build.gradle.kts
repositories {
mavenCentral()
}
dependencies {
implementation("commons-io:commons-io:2.5")
compileOnly("commons-codec:commons-codec:1.9")
}
buildSrc/build.gradle
repositories {
mavenCentral()
}
dependencies {
implementation 'commons-io:commons-io:2.5'
compileOnly 'commons-codec:commons-codec:1.9'
}
Next, change the CreateMD5 task to allow the user to configure the version of the codec library that
they want to use. It will resolve the appropriate version of the library at runtime and configure the
workers to use this version.
The classLoaderIsolation() method tells Gradle to run this work in a thread with an isolated
classloader:
buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;
import javax.inject.Inject;
import java.io.File;
import java.util.Set;
@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor();
@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(workerSpec -> {
workerSpec.getClasspath().from(getCodecClasspath()); ②
});
① Expose an input property for the codec library classpath.
② Configure the classpath on the ClassLoaderWorkerSpec when creating the work queue.
Next, you need to configure your build so that it has a repository to look up the codec version at
task execution time. We also create a dependency to resolve our codec library from this repository:
build.gradle.kts
plugins { id("base") }
repositories {
mavenCentral() ①
}
val codec by configurations.creating { ②
    attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
    }
    isVisible = false
    isCanBeConsumed = false
}
dependencies {
codec("commons-codec:commons-codec:1.10") ③
}
tasks.register<CreateMD5>("md5") {
codecClasspath.from(codec) ④
destinationDirectory = project.layout.buildDirectory.dir("md5")
source(project.layout.projectDirectory.file("src"))
}
build.gradle
plugins { id 'base' }
repositories {
mavenCentral() ①
}
configurations.create('codec') { ②
    attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
    }
visible = false
canBeConsumed = false
}
dependencies {
codec 'commons-codec:commons-codec:1.10' ③
}
tasks.register('md5', CreateMD5) {
codecClasspath.from(configurations.codec) ④
destinationDirectory = project.layout.buildDirectory.dir('md5')
source(project.layout.projectDirectory.file('src'))
}
① Add a repository to resolve the codec library - this can be a different repository than the one
used to build the CreateMD5 task class.
② Add a configuration to resolve our codec library classpath.
③ Add a dependency on the codec library version that should be used at task execution time.
④ Configure the md5 task to use the configuration as its classpath. Note that the configuration will
not be resolved until the task is executed.
Now, if you run your task, it should work as expected using the configured version of the codec
library:
$ gradle md5
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
Sometimes, it is desirable to utilize even greater levels of isolation when executing items of work.
For instance, external libraries may rely on certain system properties to be set, which may conflict
between work items. Or a library might not be compatible with the version of JDK that Gradle is
running with and may need to be run with a different version.
The Worker API can accommodate this using the processIsolation() method that causes the work
to execute in a separate "worker daemon". These worker processes persist as long as the Gradle
daemon that started them and can be reused for subsequent work items, even in later builds.
However, if system resources get low, Gradle will stop unused worker daemons.
To utilize a worker daemon, use the processIsolation() method when creating the WorkQueue. You
may also want to configure custom settings for the new process:
buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;
import javax.inject.Inject;
import java.io.File;
import java.util.Set;
@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor();
@TaskAction
public void createHashes() {
        WorkQueue workQueue = getWorkerExecutor().processIsolation(workerSpec -> { ①
workerSpec.getClasspath().from(getCodecClasspath());
workerSpec.forkOptions(options -> {
options.setMaxHeapSize("64m"); ②
});
});
① Change the isolation mode to processIsolation().
② Set up the JavaForkOptions for the new process.
$ gradle md5
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
Note that the execution time may be high. This is because Gradle has to start a new process for each
worker daemon, which is expensive.
However, if you run your task a second time, you will see that it runs much faster. This is because
the worker daemon(s) started during the initial build have persisted and are available for use
immediately during subsequent builds:
BUILD SUCCESSFUL in 1s
3 actionable tasks: 3 executed
Isolation modes
Gradle provides three isolation modes that can be configured when creating a WorkQueue and are
specified using one of the following methods on WorkerExecutor:
WorkerExecutor.noIsolation()
This states that the work should be run in a thread with minimal isolation.
For instance, it will share the same classloader that the task is loaded from. This is the fastest
level of isolation.
WorkerExecutor.classLoaderIsolation()
This states that the work should be run in a thread with an isolated classloader.
The classloader will have the classpath from the classloader that the unit of work
implementation class was loaded from as well as any additional classpath entries added through
ClassLoaderWorkerSpec.getClasspath().
WorkerExecutor.processIsolation()
This states that the work should be run with a maximum isolation level by executing the work in
a separate process.
The classloader of the process will use the classpath from the classloader that the unit of work
was loaded from as well as any additional classpath entries added through
ClassLoaderWorkerSpec.getClasspath(). Furthermore, the process will be a worker daemon that
will stay alive and can be reused for future work items with the same requirements. This
process can be configured with different settings than the Gradle JVM using
ProcessWorkerSpec.forkOptions(org.gradle.api.Action).
Worker Daemons
When using processIsolation(), Gradle will start a long-lived worker daemon process that can be
reused for future work items.
When a unit of work for a worker daemon is submitted, Gradle will first look to see if a compatible,
idle daemon already exists. If so, it will send the unit of work to the idle daemon, marking it as
busy. If not, it will start a new daemon. When evaluating compatibility, Gradle looks at a number of
criteria, all of which can be controlled through
ProcessWorkerSpec.forkOptions(org.gradle.api.Action).
By default, a worker daemon starts with a maximum heap of 512MB. This can be changed by
adjusting the workers' fork options.
executable
A daemon is considered compatible only if it uses the same Java executable.
classpath
A daemon is considered compatible only if its classpath exactly matches the requested classpath.
heap settings
A daemon is considered compatible if it has at least the same heap size settings as requested.
In other words, a daemon that has higher heap settings than requested would be considered
compatible.
jvm arguments
A daemon is compatible if it has set all the JVM arguments requested.
Note that a daemon is compatible if it has additional JVM arguments beyond those requested
(except for those treated especially, such as heap settings, assertions, debug, etc.).
system properties
A daemon is considered compatible if it has set all the system properties requested with the
same values.
Note that a daemon is compatible if it has additional system properties beyond those requested.
environment variables
A daemon is considered compatible if it has set all the environment variables requested with the
same values.
Note that a daemon is compatible if it has more environment variables than requested.
bootstrap classpath
A daemon is considered compatible if it contains all the bootstrap classpath entries requested.
Note that a daemon is compatible if it has more bootstrap classpath entries than requested.
debug
A daemon is considered compatible only if debug is set to the same value as requested (true or
false).
enable assertions
A daemon is considered compatible only if enable assertions are set to the same value as
requested (true or false).
Worker daemons will remain running until the build daemon that started them is stopped or
system memory becomes scarce. When system memory is low, Gradle will stop worker daemons to
minimize memory consumption.
NOTE A step-by-step description of converting a normal task action to use the Worker API
can be found in the section on developing parallel tasks.
To support cancellation (e.g., when the user stops the build with CTRL+C) and task timeouts, custom
tasks should react to the interruption of their executing thread. The same is true for work items
submitted via the Worker API. If a task does not respond to an interrupt within 10 seconds, the
daemon will shut down to free up system resources.
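A work item can be made interrupt-friendly by checking the thread's interrupt flag between units of work (CancellableWork is a contrived sketch, not Gradle API):

```java
import java.util.List;

// A cancellable unit of work: checks the thread's interrupt flag between items
// so the caller can abort it promptly.
class CancellableWork {
    static int process(List<String> items) {
        int processed = 0;
        for (String item : items) {
            if (Thread.currentThread().isInterrupted()) {
                break; // stop promptly when cancellation is requested
            }
            item.length(); // stand-in for real per-item work
            processed++;
        }
        return processed;
    }
}
```

For long-running single items, the same check belongs inside the item's own processing loop.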
Advanced Tasks
Incremental tasks
In Gradle, implementing a task that skips execution when its inputs and outputs are already
UP-TO-DATE is simple and efficient, thanks to the Incremental Build feature.
However, there are times when only a few input files have changed since the last execution, and it
is best to avoid reprocessing all the unchanged inputs. This situation is common in tasks that
transform input files into output files on a one-to-one basis.
To optimize your build process, you can use an incremental task. This approach ensures that only
out-of-date input files are processed, improving build performance.
For a task to process inputs incrementally, that task must contain an incremental task action.
This is a task action method that has a single InputChanges parameter. That parameter tells Gradle
that the action only wants to process the changed inputs.
In addition, the task needs to declare at least one incremental file input property by using either
@Incremental or @SkipWhenEmpty:
build.gradle.kts
@get:Incremental
@get:InputDirectory
val inputDir: DirectoryProperty = project.objects.directoryProperty()
@get:OutputDirectory
val outputDir: DirectoryProperty = project.objects.directoryProperty()
@get:Input
val inputProperty: Property<String> = project.objects.property(String::class.java) // Non-file input property
@TaskAction
fun execute(inputs: InputChanges) { // InputChanges parameter
val msg = if (inputs.isIncremental) "CHANGED inputs are out of date"
else "ALL inputs are out of date"
println(msg)
}
}
build.gradle
@Incremental
@InputDirectory
DirectoryProperty inputDir = project.objects.directoryProperty()
@OutputDirectory
DirectoryProperty outputDir = project.objects.directoryProperty()
@Input
Property<String> inputProperty = project.objects.property(String) // Non-file input property
@TaskAction
void execute(InputChanges inputs) { // InputChanges parameter
println inputs.incremental ? "CHANGED inputs are out of date"
: "ALL inputs are out of date"
}
}
IMPORTANT To query incremental changes for an input file property, that property must
always return the same instance. The easiest way to accomplish this is to use
one of the following property types: RegularFileProperty, DirectoryProperty,
or ConfigurableFileCollection.
The incremental task action can use InputChanges.getFileChanges() to find out what files have
changed for a given file-based input property, be it of type RegularFileProperty, DirectoryProperty
or ConfigurableFileCollection.
The method returns an Iterable of type FileChanges, which in turn can be queried for the
following:
• the affected file
• the change type (ADDED, MODIFIED or REMOVED)
• the normalized path of the changed file
• the file type of the changed file
The following example demonstrates an incremental task that has a directory input. It assumes that
the directory contains a collection of text files and copies them to an output directory, reversing the
text within each file:
build.gradle.kts
@get:OutputDirectory
abstract val outputDir: DirectoryProperty
@get:Input
abstract val inputProperty: Property<String>
@TaskAction
fun execute(inputChanges: InputChanges) {
println(
if (inputChanges.isIncremental) "Executing incrementally"
else "Executing non-incrementally"
)
build.gradle
@OutputDirectory
abstract DirectoryProperty getOutputDir()
@Input
abstract Property<String> getInputProperty()
@TaskAction
void execute(InputChanges inputChanges) {
println(inputChanges.incremental
? 'Executing incrementally'
: 'Executing non-incrementally'
)
If, for some reason, the task is executed non-incrementally (by running with --rerun-tasks, for
example), all files are reported as ADDED, irrespective of the previous state. In this case, Gradle
automatically removes the previous outputs, so the incremental task must only process the given
files.
For a simple transformer task like the above example, the task action must generate output files for
any out-of-date inputs and delete output files for any removed inputs.
When a task has been previously executed, and the only changes since that execution are to
incremental input file properties, Gradle can intelligently determine which input files need to be
processed, a concept known as incremental execution.
However, there are many cases where Gradle cannot determine which input files need to be
processed (i.e., non-incremental execution). Examples include:
• You are building with a different version of Gradle. Currently, Gradle does not use task history
from a different version.
• A non-incremental input file property has changed since the previous execution.
• One or more output files have changed since the previous execution.
In these cases, Gradle will report all input files as ADDED, and the getFileChanges() method will
return details for all the files that comprise the given input property.
You can check if the task execution is incremental or not with the InputChanges.isIncremental()
method.
Consider an instance of IncrementalReverseTask executed against a set of inputs for the first time.
build.gradle.kts
tasks.register<IncrementalReverseTask>("incrementalReverse") {
inputDir = file("inputs")
outputDir = layout.buildDirectory.dir("outputs")
    inputProperty = project.findProperty("taskInputProperty") as String? ?: "original"
}
build.gradle
tasks.register('incrementalReverse', IncrementalReverseTask) {
inputDir = file('inputs')
outputDir = layout.buildDirectory.dir("outputs")
inputProperty = project.properties['taskInputProperty'] ?: 'original'
}
.
├── build.gradle
└── inputs
├── 1.txt
├── 2.txt
└── 3.txt
$ gradle -q incrementalReverse
Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt
Naturally, when the task is executed again with no changes, then the entire task is UP-TO-DATE, and
the task action is not executed:
$ gradle incrementalReverse
> Task :incrementalReverse UP-TO-DATE
BUILD SUCCESSFUL in 0s
1 actionable task: 1 up-to-date
When an input file is modified in some way or a new input file is added, then re-executing the task
results in those files being returned by InputChanges.getFileChanges().
The following example modifies the content of one file and adds another before running the
incremental task:
build.gradle.kts
tasks.register("updateInputs") {
val inputsDir = layout.projectDirectory.dir("inputs")
outputs.dir(inputsDir)
doLast {
        inputsDir.file("1.txt").asFile.writeText("Changed content for existing file 1.")
inputsDir.file("4.txt").asFile.writeText("Content for new file 4.")
}
}
build.gradle
tasks.register('updateInputs') {
def inputsDir = layout.projectDirectory.dir('inputs')
outputs.dir(inputsDir)
doLast {
        inputsDir.file('1.txt').asFile.text = 'Changed content for existing file 1.'
inputsDir.file('4.txt').asFile.text = 'Content for new file 4.'
}
}
NOTE The various mutation tasks (updateInputs, removeInput, etc.) are only present to
demonstrate the behavior of incremental tasks. They should not be viewed as the
kinds of tasks or task implementations you should have in your own build scripts.
When an existing input file is removed, then re-executing the task results in that file being returned
by InputChanges.getFileChanges() as REMOVED.
The following example removes one of the existing files before executing the incremental task:
build.gradle.kts
tasks.register<Delete>("removeInput") {
delete("inputs/3.txt")
}
build.gradle
tasks.register('removeInput', Delete) {
delete 'inputs/3.txt'
}
Gradle cannot determine which input files are out-of-date when an output file is deleted (or
modified). In this case, details for all the input files for the given property are returned by
InputChanges.getFileChanges().
The following example removes one of the output files from the build directory. However, all the
input files are considered to be ADDED:
build.gradle.kts
tasks.register<Delete>("removeOutput") {
delete(layout.buildDirectory.file("outputs/1.txt"))
}
build.gradle
tasks.register('removeOutput', Delete) {
delete layout.buildDirectory.file("outputs/1.txt")
}
The last scenario we want to cover concerns what happens when a non-file-based input property is
modified. In such cases, Gradle cannot determine how the property impacts the task outputs, so the
task is executed non-incrementally. This means that all input files for the given property are
returned by InputChanges.getFileChanges() and they are all treated as ADDED.
The following example sets the project property taskInputProperty to a new value when running
the incrementalReverse task. That project property is used to initialize the task’s inputProperty
property, as you can see in the first example of this section.
Sometimes, a user wants to declare the value of an exposed task property on the command line
instead of in the build script. Passing property values on the command line is particularly helpful
if they change frequently.
The task API supports a mechanism for marking a property to automatically generate a
corresponding command line parameter with a specific name at runtime.
To expose a new command line option for a task property, annotate the corresponding setter
method of a property with Option:
A task can expose as many command line options as properties available in the class.
Options may be declared in superinterfaces of the task class as well. If multiple interfaces declare
the same property but with different option flags, they will both work to set the property.
In the example below, the custom task UrlVerify verifies whether a URL can be resolved by making
an HTTP call and checking the response code. The URL to be verified is configurable through the
property url. The setter method for the property is annotated with @Option:
UrlVerify.java
import org.gradle.api.DefaultTask;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.TaskAction;
import org.gradle.api.tasks.options.Option;
public abstract class UrlVerify extends DefaultTask {
    private String url;
    @Option(option = "url", description = "Configures the URL to be verified.")
    public void setUrl(String url) { this.url = url; }
    @Input
    public String getUrl() { return url; }
    @TaskAction
    public void verify() {
        getLogger().quiet("Verifying URL '{}'", url);
    }
}
All options declared for a task can be rendered as console output by running the help task with the --task option.
• The option uses a double-dash as a prefix, e.g., --url. A single dash does not qualify as valid
syntax for a task option.
• The option argument follows directly after the task declaration, e.g., verifyUrl
--url=http://www.google.com/.
• Multiple task options can be declared in any order on the command line following the task
name.
Building upon the earlier example, the build script creates a task instance of type UrlVerify and
provides a value from the command line through the exposed option:
build.gradle.kts
tasks.register<UrlVerify>("verifyUrl")
build.gradle
tasks.register('verifyUrl', UrlVerify)
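With the task registered, the exposed option can then be supplied on the command line. The invocation below is shown for illustration and follows the syntax rules listed above:

```shell
# The value can be attached with '=' or passed as the next argument
$ gradle -q verifyUrl --url=http://www.google.com/
```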
Gradle limits the data types that can be used for declaring command line options.
Double, Property<Double>
Describes an option with a double value.
Passing the option on the command line also requires a value, e.g., --factor=2.2 or --factor 2.2.
Integer, Property<Integer>
Describes an option with an integer value.
Passing the option on the command line also requires a value, e.g., --network-timeout=5000 or
--network-timeout 5000.
Long, Property<Long>
Describes an option with a long value.
Passing the option on the command line also requires a value, e.g., --threshold=2147483648 or
--threshold 2147483648.
String, Property<String>
Describes an option with an arbitrary String value.
Passing the option on the command line also requires a value, e.g., --container-id=2x94held or
--container-id 2x94held.
enum, Property<enum>
Describes an option as an enumerated type.
Passing the option on the command line also requires a value, e.g., --log-level=DEBUG or --log-level debug.
The value is not case-sensitive.
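As a rough illustration of that case-insensitivity (plain Kotlin, not Gradle’s actual implementation), an enum option value can be matched regardless of case by normalizing it first:

```kotlin
enum class LogLevel { DEBUG, INFO, WARN, ERROR }

// Convert a raw command-line value to the enum, ignoring case,
// similar in spirit to how enum option values are matched
fun parseLogLevel(raw: String): LogLevel = LogLevel.valueOf(raw.uppercase())

fun main() {
    println(parseLogLevel("debug")) // prints DEBUG
    println(parseLogLevel("DEBUG")) // prints DEBUG
}
```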
DirectoryProperty, RegularFileProperty
Describes an option with a file system element.
Passing the option on the command line also requires a value representing a path, e.g., --output-file=file.txt or --output-dir outputDir.
Relative paths are resolved relative to the project directory of the project that owns this property instance. See FileSystemLocationProperty.set().
Theoretically, an option for a property type String or List<String> can accept any arbitrary value. Accepted values for such an option can be documented programmatically with the help of the annotation OptionValues, e.g., @OptionValues('file').
This annotation may be assigned to any method that returns a List of one of the supported data types. You need to specify an option identifier to indicate the relationship between the option and available values.
NOTE: Passing a value on the command line not supported by the option does not fail the build or throw an exception. You must implement custom logic for such behavior in the task action.
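Such custom logic can be sketched as a plain validation helper called from the task action (names are illustrative, not part of the Gradle API):

```kotlin
// Reject unsupported option values explicitly, since Gradle will not do it for us
fun requireSupportedValue(option: String, value: String, supported: List<String>) {
    require(value in supported) {
        "Unsupported value '$value' for option --$option; expected one of $supported"
    }
}

fun main() {
    // An accepted value passes silently
    requireSupportedValue("output-type", "CONSOLE", listOf("CONSOLE", "FILE"))
    println("validated")
}
```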
The example below demonstrates the use of multiple options for a single task. The task
implementation provides a list of available values for the option output-type:
UrlProcess.java
import org.gradle.api.tasks.options.Option;
import org.gradle.api.tasks.options.OptionValues;
public abstract class UrlProcess extends DefaultTask {
private String url;
private OutputType outputType;
@Input
@Option(option = "http", description = "Configures the http protocol to be allowed.")
public abstract Property<Boolean> getHttp();
@Option(option = "url", description = "Configures the URL to send the request to.")
public void setUrl(String url) {
if (!getHttp().getOrElse(true) && url.startsWith("http://")) {
throw new IllegalArgumentException("HTTP is not allowed");
} else {
this.url = url;
}
}
@Input
public String getUrl() {
return url;
}
@Option(option = "output-type", description = "Configures the output type.")
public void setOutputType(OutputType outputType) {
this.outputType = outputType;
}
@OptionValues("output-type")
public List<OutputType> getAvailableOutputTypes() {
return new ArrayList<OutputType>(Arrays.asList(OutputType.values()));
}
@Input
public OutputType getOutputType() {
return outputType;
}
@TaskAction
public void process() {
getLogger().quiet("Writing out the URL response from '{}' to '{}'", url, outputType);
}
private static enum OutputType {
CONSOLE, FILE
}
}
Command line options using the annotations Option and OptionValues are self-documenting.
You will see declared options and their available values reflected in the console output of the help
task. The output renders options alphabetically, except for boolean disable options, which appear
following the enable option:
Path
:processUrl
Type
UrlProcess (UrlProcess)
Options
--http Configures the http protocol to be allowed.
Description
-
Group
-
Limitations
Support for declaring command line options currently comes with a few limitations.
• Command line options can only be declared for custom tasks via annotation. There’s no
programmatic equivalent for defining options.
• When assigning an option on the command line, the task exposing the option needs to be
spelled out explicitly, e.g., gradle check --tests abc does not work even though the check task
depends on the test task.
• If you specify a task option name that conflicts with the name of a built-in Gradle option, use the
-- delimiter before calling your task to reference that option. For more information, see
Disambiguate Task Options from Built-in Options.
Verification failures
Normally, exceptions thrown during task execution result in a failure that immediately terminates
a build. The outcome of the task will be FAILED, the result of the build will be FAILED, and no further
tasks will be executed. When running with the --continue flag, Gradle will continue to run other
requested tasks in the build after encountering a task failure. However, any tasks that depend on a
failed task will not be executed.
There is a special type of exception that behaves differently when downstream tasks only rely on
the outputs of a failing task. A task can throw a subtype of VerificationException to indicate that it
has failed in a controlled manner such that its output is still valid for consumers. A task depends on
the outcome of another task when it directly depends on it using dependsOn. When Gradle is run
with --continue, consumer tasks that depend on a producer task’s output (via a relationship
between task inputs and outputs) can still run after the producer fails.
A failed unit test, for instance, will cause a failing outcome for the test task. However, this doesn’t
prevent another task from reading and processing the (valid) test results the task produced.
Verification failures are used in exactly this manner by the Test Report Aggregation Plugin.
Verification failures are also useful for tasks that need to report a failure even after producing
useful output consumable by other tasks.
build.gradle.kts
tasks.register("process") {
val outputFile = layout.buildDirectory.file("processed.log")
outputs.files(outputFile) ①
doLast {
val logFile = outputFile.get().asFile
logFile.appendText("Step 1 Complete.") ②
throw VerificationException("Process failed!") ③
logFile.appendText("Step 2 Complete.") ④
}
}
tasks.register("postProcess") {
inputs.files(process) ⑤
doLast {
println("Results: ${inputs.files.singleFile.readText()}") ⑥
}
}
build.gradle
tasks.register("process") {
def outputFile = layout.buildDirectory.file("processed.log")
outputs.files(outputFile) ①
doLast {
def logFile = outputFile.get().asFile
logFile << "Step 1 Complete." ②
throw new VerificationException("Process failed!") ③
logFile << "Step 2 Complete." ④
}
}
tasks.register("postProcess") {
inputs.files(tasks.named("process")) ⑤
doLast {
println("Results: ${inputs.files.singleFile.text}") ⑥
}
}
① Register Output: The process task writes its output to a log file.
② Modify Output: The task writes a partial result to the log file before the exception is thrown.
③ Task Failure: The task throws a VerificationException and fails at this point.
④ Continue to Modify Output: This line never runs due to the exception stopping the task.
⑤ Consume Output: The postProcess task depends on the output of the process task due to using
that task’s outputs as its own inputs.
⑥ Use Partial Result: With the --continue flag set, Gradle still runs the requested postProcess task
despite the process task’s failure. postProcess can read and display the partial (though still valid)
result.
DEVELOPING PLUGINS
Understanding Plugins
Gradle comes with a set of powerful core systems such as dependency management, task execution,
and project configuration. But everything else it can do is supplied by plugins.
Plugins encapsulate logic for specific tasks or integrations, such as compiling code, running tests, or
deploying artifacts. By applying plugins, users can easily add new features to their build process
without having to write complex code from scratch.
This plugin-based approach allows Gradle to be lightweight and modular. It also promotes code
reuse and maintainability, as plugins can be shared across projects or within an organization.
Before reading this chapter, it’s recommended that you first read Learning The Basics and complete
the Tutorial.
Plugins Introduction
Plugins can be sourced from Gradle or the Gradle community. But when users want to organize their build logic or need specific build capabilities not provided by existing plugins, they can develop their own.
There are three types of plugins:
1. Core Plugins - plugins that come from Gradle.
2. Community Plugins - plugins that come from the Gradle Plugin Portal or a public repository.
3. Local or Custom Plugins - plugins that you develop yourself.
Core Plugins
The term core plugin refers to a plugin that is part of the Gradle distribution such as the Java
Library Plugin. They are always available.
Community Plugins
The term community plugin refers to a plugin published to the Gradle Plugin Portal (or another
public repository) such as the Spotless Plugin.
Local Plugins
The term local or custom plugin refers to a plugin you write yourself for your own build.
Custom plugins
Script plugins
Script plugins are typically small, local plugins written in script files for tasks specific to a single build or project. They do not need to be reused across multiple projects. Script plugins are not recommended, but many other forms of plugins evolve from script plugins.
To create a plugin, you need to write a class that implements the Plugin interface.
The following sample creates a GreetingPlugin, which adds a hello task to a project when applied:
build.gradle.kts
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("hello") {
            doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}
apply<GreetingPlugin>()
build.gradle
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.tasks.register('hello') {
            doLast {
                println 'Hello from the GreetingPlugin'
            }
        }
    }
}
apply plugin: GreetingPlugin
$ gradle -q hello
Hello from the GreetingPlugin
The Project object is passed as a parameter in apply(), which the plugin can use to configure the project however it needs to (such as adding tasks, configuring dependencies, etc.). In this example, the plugin is written directly in the build file, which is not a recommended practice.
When the plugin is written in a separate script file, it can be applied using apply(from =
"file_name.gradle.kts") or apply from: 'file_name.gradle'. In the example below, the plugin is
coded in the other.gradle(.kts) script file. Then, the other.gradle(.kts) is applied to
build.gradle(.kts) using apply from:
other.gradle.kts
tasks.register("hi") {
    doLast {
        println("Hi from the GreetingScriptPlugin")
    }
}
other.gradle
tasks.register('hi') {
    doLast {
        println 'Hi from the GreetingScriptPlugin'
    }
}
build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'
$ gradle -q hi
Hi from the GreetingScriptPlugin
Precompiled script plugins are compiled into class files and packaged into a JAR before they are
executed. These plugins use the Groovy DSL or Kotlin DSL instead of pure Java, Kotlin, or Groovy.
They are best used as convention plugins that share build logic across projects or as a way to
neatly organize build logic.
There are two ways to write precompiled script plugins:
1. Use Gradle’s Kotlin DSL - The plugin is a .gradle.kts file, and apply id("kotlin-dsl").
2. Use Gradle’s Groovy DSL - The plugin is a .gradle file, and apply id("groovy-gradle-plugin").
To apply a precompiled script plugin, you need to know its ID. The ID is derived from the plugin
script’s filename and its (optional) package declaration.
When the plugin is applied to a project, Gradle creates an instance of the plugin class and calls the
instance’s Plugin.apply() method.
NOTE A new instance of a Plugin is created within each project applying that plugin.
Let’s rewrite the GreetingPlugin script plugin as a precompiled script plugin. Since we are using the Groovy or Kotlin DSL, the file essentially becomes the plugin. The original script plugin simply created a hello task which printed a greeting; this is what we will do in the precompiled script plugin:
buildSrc/src/main/kotlin/GreetingPlugin.gradle.kts
tasks.register("hello") {
doLast {
println("Hello from the convention GreetingPlugin")
}
}
buildSrc/src/main/groovy/GreetingPlugin.gradle
tasks.register("hello") {
doLast {
println("Hello from the convention GreetingPlugin")
}
}
The GreetingPlugin can now be applied in other subprojects' builds by using its ID:
app/build.gradle.kts
plugins {
application
id("GreetingPlugin")
}
app/build.gradle
plugins {
id 'application'
id('GreetingPlugin')
}
$ gradle -q hello
Hello from the convention GreetingPlugin
Convention plugins
A convention plugin is typically a precompiled script plugin that configures existing core and
community plugins with your own conventions (i.e. default values) such as setting the Java version
by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are
also used to enforce project standards and help streamline the build process. They can apply and
configure plugins, create new tasks and extensions, set dependencies, and much more.
Let’s take an example build with three subprojects: one for data-model, one for database-logic and
one for app code. The project has the following structure:
.
├── buildSrc
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── data-model
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── database-logic
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── app
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts
database-logic/build.gradle.kts
plugins {
id("java-library")
id("org.jetbrains.kotlin.jvm") version "1.9.23"
}
repositories {
mavenCentral()
}
java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
useJUnitPlatform()
}
kotlin {
jvmToolchain(11)
}
database-logic/build.gradle
plugins {
id 'java-library'
id 'org.jetbrains.kotlin.jvm' version '1.9.23'
}
repositories {
mavenCentral()
}
java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
useJUnitPlatform()
}
kotlin {
jvmToolchain {
languageVersion.set(JavaLanguageVersion.of(11))
}
}
We apply the java-library plugin and add the org.jetbrains.kotlin.jvm plugin for Kotlin support.
We also configure Kotlin, Java, tests and more.
The more plugins we apply and configure, the larger the build file gets. There’s also repetition in the build files of the app and data-model subprojects, especially when configuring common extensions like setting the Java version and Kotlin support.
To address this, we use convention plugins. This allows us to avoid repeating configuration in each
build file and keeps our build scripts more concise and maintainable. In convention plugins, we can
encapsulate arbitrary build configuration or custom build logic.
buildSrc/src/main/kotlin/my-java-library.gradle.kts
plugins {
id("java-library")
id("org.jetbrains.kotlin.jvm")
}
repositories {
mavenCentral()
}
java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
useJUnitPlatform()
}
kotlin {
jvmToolchain(11)
}
buildSrc/src/main/groovy/my-java-library.gradle
plugins {
id 'java-library'
id 'org.jetbrains.kotlin.jvm'
}
repositories {
mavenCentral()
}
java {
toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
useJUnitPlatform()
}
kotlin {
jvmToolchain {
languageVersion.set(JavaLanguageVersion.of(11))
}
}
The name of the file my-java-library is the ID of our brand-new plugin, which we can now use in all
of our subprojects.
The database-logic build file becomes much simpler by removing all the redundant build logic and
applying our convention my-java-library plugin instead:
database-logic/build.gradle.kts
plugins {
id("my-java-library")
}
database-logic/build.gradle
plugins {
id('my-java-library')
}
This convention plugin enables us to easily share common configurations across all our build files.
Any modifications can be made in one place, simplifying maintenance.
Binary plugins
Binary plugins in Gradle are plugins that are built as standalone JAR files and applied to a project
using the plugins{} block in the build script.
Let’s move our GreetingPlugin to a standalone project so that we can publish it and share it with others. The plugin is essentially moved from the buildSrc folder to its own build called greeting-plugin.
NOTE: You can publish the plugin from buildSrc, but this is not recommended practice. Plugins that are ready for publication should be in their own build.
greeting-plugin is simply a Java project that produces a JAR containing the plugin classes.
The easiest way to package and publish a plugin to a repository is to use the Gradle Plugin
Development Plugin. This plugin provides the necessary tasks and configurations (including the
plugin metadata) to compile your script into a plugin that can be applied in other builds.
Here is a simple build script for the greeting-plugin project using the Gradle Plugin Development
Plugin:
build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("simplePlugin") {
id = "org.example.greeting"
implementationClass = "org.example.GreetingPlugin"
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}
In the example used throughout this section, the plugin accepts the Project type as a type parameter.
Alternatively, the plugin can accept a parameter of type Settings to be applied in a settings script, or
a parameter of type Gradle to be applied in an initialization script.
The difference between these types of plugins lies in the scope of their application:
Project Plugin
A project plugin is a plugin that is applied to a specific project in a build. It can customize the
build logic, add tasks, and configure the project-specific settings.
Settings Plugin
A settings plugin is a plugin that is applied in the settings.gradle or settings.gradle.kts file. It
can configure settings that apply to the entire build, such as defining which projects are
included in the build, configuring build script repositories, and applying common configurations
to all projects.
Init Plugin
An init plugin is a plugin that is applied in the init.gradle or init.gradle.kts file. It can
configure settings that apply globally to all Gradle builds on a machine, such as configuring the
Gradle version, setting up default repositories, or applying common plugins to all builds.
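To make the distinction concrete, a minimal sketch of a settings plugin is shown below. The class name and the repository it configures are illustrative; the key point is that Settings replaces Project as the type parameter:

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.initialization.Settings

// A plugin applied in settings.gradle(.kts) receives the Settings object
class SharedRepositoriesPlugin : Plugin<Settings> {
    override fun apply(settings: Settings) {
        // Configure a repository shared by all projects in the build
        settings.dependencyResolutionManagement.repositories.mavenCentral()
    }
}
```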
Script Plugins are simple and easy to write. They are written in Kotlin DSL or Groovy DSL. They
are suitable for small, one-off tasks or for quick experimentation. However, they can become hard
to maintain as the build script grows in size and complexity.
Precompiled Script Plugins are Kotlin DSL scripts compiled into Java class files packaged in a library. They offer better performance and maintainability compared to script plugins, and they can be reused across different projects. You can also write them in Groovy DSL, but that is not recommended.
Binary Plugins are full-fledged plugins written in Java or Kotlin, compiled into JAR files, and published to a repository. They offer the best performance, maintainability, and reusability. They are suitable for complex build logic that needs to be shared across projects, builds, and teams. You can also write them in Scala or Groovy, but that is not recommended.
If you suspect issues with your plugin code, try creating a Build Scan to identify bottlenecks. The
Gradle profiler can help automate Build Scan generation and gather more low-level information.
A convention plugin is a plugin that normally configures existing core and community plugins with
your own conventions (i.e. default values) such as setting the Java version by using
java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are also used to
enforce project standards and help streamline the build process. They can apply and configure
plugins, create new tasks and extensions, set dependencies, and much more.
The plugin ID for a precompiled script is derived from its file name and optional package
declaration.
buildSrc/build.gradle.kts
plugins {
id("kotlin-dsl")
}
app/build.gradle.kts
plugins {
id("code-quality")
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
app/build.gradle
plugins {
id 'code-quality'
}
buildSrc/build.gradle.kts
plugins {
id("kotlin-dsl")
}
app/build.gradle.kts
plugins {
id("my.code-quality")
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
app/build.gradle
plugins {
id 'my.code-quality'
}
Extension objects are commonly used in plugins to expose configuration options and additional
functionality to build scripts.
When you apply a plugin that defines an extension, you can access the extension object and
configure its properties or call its methods to customize the behavior of the plugin or tasks
provided by the plugin.
A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.
buildSrc/src/main/kotlin/greetings.gradle.kts
// Create the extension object
interface GreetingPluginExtension {
    val message: Property<String>
}
// Add the 'greeting' extension object to the project
val extension = project.extensions.create<GreetingPluginExtension>("greeting")
You can set the value of the message property directly with extension.message.set("Hi from
Gradle,").
However, the GreetingPluginExtension object becomes available as a project property with the same
name as the extension object. You can now access message like so:
buildSrc/src/main/kotlin/greetings.gradle.kts
buildSrc/src/main/groovy/greetings.gradle
If you apply the greetings plugin, you can set the convention in your build script:
app/build.gradle.kts
plugins {
application
id("greetings")
}
greeting {
message = "Hello from Gradle"
}
app/build.gradle
plugins {
id 'application'
id('greetings')
}
configure(greeting) {
message = "Hello from Gradle"
}
In plugins, you can define default values, also known as conventions, using the project object.
Convention properties are properties that are initialized with default values but can be overridden:
buildSrc/src/main/kotlin/greetings.gradle.kts
buildSrc/src/main/groovy/greetings.gradle
extension.message.convention(…) sets a convention for the message property of the extension. This
convention specifies that the value of message should default to the content of a file named
defaultGreeting.txt located in the build directory of the project.
If the message property is not explicitly set, its value will be automatically set to the content of
defaultGreeting.txt.
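A sketch of setting such a convention, assuming the greeting extension created earlier (the exact file wiring is illustrative):

```kotlin
// Default 'message' to the content of build/defaultGreeting.txt, resolved lazily
// only when the property is actually read
extension.message.convention(
    project.layout.buildDirectory.file("defaultGreeting.txt")
        .map { it.asFile.readText() }
)
```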
Using an extension and mapping it to a custom task’s input/output properties is common in plugins.
In this example, the message property of the GreetingPluginExtension is mapped to the message
property of the GreetingTask as an input:
buildSrc/src/main/kotlin/greetings.gradle.kts
@TaskAction
fun greet() {
println("Message: ${message.get()}")
}
}
@TaskAction
void greet() {
println("Message: ${message.get()}")
}
}
$ gradle -q hello
Message: Hello from Gradle
This means that changes to the extension’s message property will trigger the task to be considered
out-of-date, ensuring that the task is re-executed with the new message.
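Wiring of that kind can be sketched as follows, assuming a GreetingTask with a message input of type Property<String> and the greeting extension from above (a sketch, not the full plugin):

```kotlin
// Connect the extension's message provider to the task's input; the task is
// considered out-of-date whenever the extension value changes
tasks.register<GreetingTask>("hello") {
    message.set(extension.message)
}
```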
You can find out more about types that you can use in task implementations and extensions in Lazy
Configuration.
In order to apply an external plugin in a precompiled script plugin, it has to be added to the plugin
project’s implementation classpath in the plugin’s build file:
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
dependencies {
implementation("com.bmuschko:gradle-docker-plugin:6.4.0")
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
repositories {
mavenCentral()
}
dependencies {
implementation 'com.bmuschko:gradle-docker-plugin:6.4.0'
}
buildSrc/src/main/kotlin/my-plugin.gradle.kts
plugins {
id("com.bmuschko.docker-remote-api")
}
buildSrc/src/main/groovy/my-plugin.gradle
plugins {
id 'com.bmuschko.docker-remote-api'
}
The Gradle Plugin Development plugin can be used to assist in developing Gradle plugins.
This plugin will automatically apply the Java Plugin, add the gradleApi() dependency to the api
configuration, generate the required plugin descriptors in the resulting JAR file, and configure the
Plugin Marker Artifact to be used when publishing.
To apply and configure the plugin, add the following code to your build file:
build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("simplePlugin") {
id = "org.example.greeting"
implementationClass = "org.example.GreetingPlugin"
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}
Writing and using custom task types is recommended when developing plugins as it automatically
benefits from incremental builds. As an added benefit of applying the plugin to your project, the
task validatePlugins automatically checks for an existing input/output annotation for every public
property defined in a custom task type implementation.
Creating a plugin ID
Plugin IDs are meant to be globally unique, similar to Java package names (i.e., a reverse domain
name). This format helps prevent naming collisions and allows grouping plugins with similar
ownership.
An explicit plugin identifier simplifies applying the plugin to a project. Your plugin ID should
combine components that reflect the namespace (a reasonable pointer to you or your organization)
and the name of the plugin it provides. For example, if your GitHub account is named foo and your plugin is named bar, a suitable plugin ID might be com.github.foo.bar. Similarly, if the plugin was developed at the baz organization, the plugin ID might be org.baz.bar.
• Must contain at least one '.' character separating the namespace from the plugin’s name.
• Conventionally, use a lowercase reverse domain name for the namespace.
A namespace that identifies ownership and a name is sufficient for a plugin ID.
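The naming rules above can be sketched as a simplified check (plain Kotlin; real plugin ID validation has additional rules, e.g., which characters are allowed):

```kotlin
// Simplified plugin ID check: lowercase, at least one '.' separating
// a namespace from the plugin name, and no empty segments
fun isPlausiblePluginId(id: String): Boolean =
    id.contains('.') &&
    id == id.lowercase() &&
    id.split('.').none { it.isEmpty() }

fun main() {
    println(isPlausiblePluginId("com.github.foo.bar")) // prints true
    println(isPlausiblePluginId("bar"))                // prints false (no namespace)
}
```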
When bundling multiple plugins in a single JAR artifact, adhering to the same naming conventions
is recommended. This practice helps logically group related plugins.
There is no limit to the number of plugins that can be defined and registered (by different
identifiers) within a single project.
The identifiers for plugins written as a class should be defined in the project’s build script
containing the plugin classes. For this, the java-gradle-plugin needs to be applied:
buildSrc/build.gradle.kts
plugins {
id("java-gradle-plugin")
}
gradlePlugin {
plugins {
create("androidApplicationPlugin") {
id = "com.android.application"
implementationClass = "com.android.AndroidApplicationPlugin"
}
create("androidLibraryPlugin") {
id = "com.android.library"
implementationClass = "com.android.AndroidLibraryPlugin"
}
}
}
buildSrc/build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
androidApplicationPlugin {
id = 'com.android.application'
implementationClass = 'com.android.AndroidApplicationPlugin'
}
androidLibraryPlugin {
id = 'com.android.library'
implementationClass = 'com.android.AndroidLibraryPlugin'
}
}
}
When developing plugins, it’s a good idea to be flexible when accepting input configuration for file
locations.
It is recommended to use Gradle’s managed properties and project.layout to select file or directory
locations. This will enable lazy configuration so that the actual location will only be resolved when
the file is needed and can be reconfigured at any time during build configuration.
This Gradle build file defines a task GreetingToFileTask that writes a greeting to a file. It also
registers two tasks: greet, which creates the file with the greeting, and sayGreeting, which prints the
file’s contents. The greetingFile property is used to specify the file path for the greeting:
build.gradle.kts
abstract class GreetingToFileTask : DefaultTask() {
@get:OutputFile
abstract val destination: RegularFileProperty
@TaskAction
fun greet() {
val file = destination.get().asFile
file.parentFile.mkdirs()
file.writeText("Hello!")
}
}
val greetingFile = objects.fileProperty()
tasks.register<GreetingToFileTask>("greet") {
destination = greetingFile
}
tasks.register("sayGreeting") {
dependsOn("greet")
val greetingFile = greetingFile
doLast {
val file = greetingFile.get().asFile
println("${file.readText()} (file: ${file.name})")
}
}
greetingFile = layout.buildDirectory.file("hello.txt")
build.gradle
abstract class GreetingToFileTask extends DefaultTask {
@OutputFile
abstract RegularFileProperty getDestination()
@TaskAction
def greet() {
def file = getDestination().get().asFile
file.parentFile.mkdirs()
file.write 'Hello!'
}
}
def greetingFile = objects.fileProperty()
tasks.register('greet', GreetingToFileTask) {
destination = greetingFile
}
tasks.register('sayGreeting') {
dependsOn greet
doLast {
def file = greetingFile.get().asFile
println "${file.text} (file: ${file.name})"
}
}
greetingFile = layout.buildDirectory.file('hello.txt')
$ gradle -q sayGreeting
Hello! (file: hello.txt)
In this example, we configure the greet task destination property as a closure/provider, which is
evaluated with the Project.file(java.lang.Object) method to turn the return value of the
closure/provider into a File object at the last minute. Note that we specify the greetingFile
property value after the task configuration. This lazy evaluation is a key benefit of accepting any
value when setting a file property and then resolving that value when reading the property.
You can learn more about working with files lazily in Working with Files.
Most plugins offer configuration options for build scripts and other plugins to customize how the
plugin works. Plugins do this using extension objects.
A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.
An extension object is simply an object with Java Bean properties representing the configuration.
Let’s add a greeting extension object to the project, which allows you to configure the greeting:
build.gradle.kts
interface GreetingPluginExtension {
val message: Property<String>
}
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Add the 'greeting' extension object
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        // Create a greeting task that prints the configured message
        project.tasks.register("hello") {
            doLast {
                println(extension.message.get())
            }
        }
    }
}
apply<GreetingPlugin>()
the<GreetingPluginExtension>().message.set("Hi from Gradle")
build.gradle
interface GreetingPluginExtension {
Property<String> getMessage()
}
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Add the 'greeting' extension object
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        // Create a greeting task that prints the configured message
        project.tasks.register('hello') {
            doLast {
                println extension.message.get()
            }
        }
    }
}
apply plugin: GreetingPlugin
greeting.message = 'Hi from Gradle'
$ gradle -q hello
Hi from Gradle
In this example, GreetingPluginExtension is an object with a property called message. The extension object is added to the project with the name greeting. This object becomes available as a project property with the same name as the extension object. Here, the<GreetingPluginExtension>() is equivalent to project.extensions.getByType(GreetingPluginExtension::class.java).
Often, you have several related properties you need to specify on a single plugin. Gradle adds a
configuration block for each extension object, so you can group settings:
build.gradle.kts
interface GreetingPluginExtension {
val message: Property<String>
val greeter: Property<String>
}
apply<GreetingPlugin>()
interface GreetingPluginExtension {
    Property<String> getMessage()
    Property<String> getGreeter()
}
$ gradle -q hello
Hi from Gradle
In this example, several settings can be grouped within the greeting closure. The name of the
closure block in the build script (greeting) must match the extension object name. Then, when the
closure is executed, the fields on the extension object will be mapped to the variables within the
closure based on the standard Groovy closure delegate feature.
Using an extension object extends the Gradle DSL to add a project property and DSL block for the
plugin. Because an extension object is a regular object, you can provide your own DSL nested inside
the plugin block by adding properties and methods to the extension object.
build.gradle.kts
plugins {
    id("org.myorg.server-env")
}

environments {
    create("dev") {
        url = "http://localhost:8080"
    }
    create("staging") {
        url = "http://staging.enterprise.com"
    }
    create("production") {
        url = "http://prod.enterprise.com"
    }
}
build.gradle
plugins {
    id 'org.myorg.server-env'
}

environments {
    dev {
        url = 'http://localhost:8080'
    }
    staging {
        url = 'http://staging.enterprise.com'
    }
    production {
        url = 'http://prod.enterprise.com'
    }
}
The DSL exposed by the plugin provides a container for defining a set of environments. Each
environment the user configures has an arbitrary but declarative name and is represented with its
own DSL configuration block. The example above instantiates a development, staging, and
production environment, each with its respective URL.
Each environment must have a data representation in code to capture the values. The name of an
environment is immutable and can be passed in as a constructor parameter. Currently, the only
other parameter the data object stores is a URL.
ServerEnvironment.java
@javax.inject.Inject
public ServerEnvironment(String name) {
    this.name = name;
}
It’s common for a plugin to post-process the captured values within the plugin implementation, e.g.,
to configure tasks:
ServerEnvironmentPlugin.java
NamedDomainObjectContainer<ServerEnvironment> serverEnvironmentContainer =
        objects.domainObjectContainer(ServerEnvironment.class,
                name -> objects.newInstance(ServerEnvironment.class, name));
project.getExtensions().add("environments", serverEnvironmentContainer);
serverEnvironmentContainer.all(serverEnvironment -> {
    String env = serverEnvironment.getName();
    String capitalizedServerEnv = env.substring(0, 1).toUpperCase() + env.substring(1);
    String taskName = "deployTo" + capitalizedServerEnv;
    project.getTasks().register(taskName, Deploy.class,
            task -> task.getUrl().set(serverEnvironment.getUrl()));
});
}
}
In the example above, a deployment task is created dynamically for every user-configured
environment.
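The naming scheme used above can be isolated into a small helper. The following is a hypothetical function mirroring the logic shown in the plugin excerpt: an environment name such as "staging" yields a task named "deployToStaging".

```kotlin
// Hypothetical helper mirroring the plugin's task-naming logic:
// capitalize the environment name and prefix it with "deployTo".
fun deployTaskName(envName: String): String {
    val capitalized = envName.substring(0, 1).uppercase() + envName.substring(1)
    return "deployTo" + capitalized
}

fun main() {
    println(deployTaskName("dev"))        // deployToDev
    println(deployTaskName("staging"))    // deployToStaging
    println(deployTaskName("production")) // deployToProduction
}
```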
You can find out more about implementing project extensions in Developing Custom Gradle Types.
For example, let’s consider the following extension provided by a plugin. In its current form, it
offers a "flat" list of properties for configuring the creation of a website:
build-flat.gradle.kts
plugins {
    id("org.myorg.site")
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    websiteUrl = "https://gradle.org"
    vcsUrl = "https://github.com/gradle/gradle-site-plugin"
}
build-flat.gradle
plugins {
    id 'org.myorg.site'
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    websiteUrl = 'https://gradle.org'
    vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
}
As the number of exposed properties grows, you should introduce a nested, more expressive
structure.
The following code snippet adds a new configuration block named siteInfo as part of the extension.
This provides a stronger indication of what those properties mean:
build.gradle.kts
plugins {
    id("org.myorg.site")
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    siteInfo {
        websiteUrl = "https://gradle.org"
        vcsUrl = "https://github.com/gradle/gradle-site-plugin"
    }
}
build.gradle
plugins {
    id 'org.myorg.site'
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    siteInfo {
        websiteUrl = 'https://gradle.org'
        vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
    }
}
Implementing the backing objects for such an extension is simple. First, introduce a new data
object for managing the properties websiteUrl and vcsUrl:
SiteInfo.java
In the extension, create an instance of the SiteInfo class and a method to delegate the captured
values to the data instance.
SiteExtension.java
@Nested
abstract public SiteInfo getSiteInfo();
Plugins commonly use an extension to capture user input from the build script and map it to a
custom task’s input/output properties. The build script author interacts with the extension’s DSL,
while the plugin implementation handles the underlying logic:
app/build.gradle.kts
@TaskAction
fun executeTask() {
    println("Input parameter: $inputParameter")
}
}

// Plugin class that configures the extension and task
class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Create and configure the extension
        val extension = project.extensions.create("myExtension", MyExtension::class.java)

        // Create and configure the custom task
        project.tasks.register("myTask", MyCustomTask::class.java) {
            group = "custom"
            inputParameter = extension.inputParameter
        }
    }
}
app/build.gradle
@TaskAction
def executeTask() {
    println("Input parameter: $inputParameter")
}
}
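The excerpts above omit the declarations of the extension and the task's input property; they might look roughly like the following sketch (assumption: lazy Property types, matching the assignment `inputParameter = extension.inputParameter` shown above):

```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction

interface MyExtension {
    val inputParameter: Property<String>
}

abstract class MyCustomTask : DefaultTask() {
    @get:Input
    abstract val inputParameter: Property<String>

    @TaskAction
    fun executeTask() {
        println("Input parameter: ${inputParameter.get()}")
    }
}
```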
You can learn more about types you can use in task implementations and extensions in Lazy
Configuration.
Plugins should provide sensible defaults and standards in a specific context, reducing the number
of decisions users need to make. Using the project object, you can define default values. These are
known as conventions.
Conventions are properties that are initialized with default values and can be overridden by the
user in their build script. For example:
build.gradle.kts
interface GreetingPluginExtension {
    val message: Property<String>
}

apply<GreetingPlugin>()
build.gradle
interface GreetingPluginExtension {
    Property<String> getMessage()
}
$ gradle -q hello
Hello from GreetingPlugin
In this example, GreetingPluginExtension is a class that represents the convention. The message
property is the convention property with a default value of 'Hello from GreetingPlugin'.
build.gradle.kts
greeting {
    message = "Custom message"
}
build.gradle
greeting {
    message = 'Custom message'
}
$ gradle -q hello
Custom message
Separating capabilities from conventions
Separating capabilities from conventions in plugins allows users to choose which tasks and
conventions to apply.
For example, the Java Base plugin provides un-opinionated (i.e., generic) functionality like
SourceSets, while the Java plugin adds tasks and conventions familiar to Java developers like
classes, jar or javadoc.
When designing your own plugins, consider developing two plugins — one for capabilities and
another for conventions — to offer flexibility to users.
In the example below, MyPlugin contains conventions, and MyBasePlugin defines capabilities.
MyPlugin then applies MyBasePlugin; this is called plugin composition. To apply a plugin from another one:
MyBasePlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
MyPlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
// define conventions
}
}
Reacting to plugins
For example, a plugin could assume that it is applied to a Java-based project and automatically
reconfigure the standard source directory:
InhouseStrongOpinionConventionJavaPlugin.java
The drawback to this approach is that it automatically forces the project to apply the Java plugin,
imposing a strong opinion on it (i.e., reducing flexibility and generality). In practice, the project
applying the plugin might not even deal with Java code.
Instead of automatically applying the Java plugin, the plugin could react to the fact that the
consuming project applies the Java plugin. Only if that is the case, then a certain configuration is
applied:
InhouseConventionJavaPlugin.java
Reacting to plugins is preferred over applying plugins if there is no good reason to assume that the
consuming project has the expected setup.
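A reacting plugin typically defers its configuration with PluginManager.withPlugin, so the configuration runs only if, and when, the other plugin is applied. A minimal sketch of what such a plugin might contain (the source directory value is illustrative):

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.SourceSetContainer

abstract class InhouseConventionJavaPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Executed only if/when the consuming project applies the Java plugin
        project.pluginManager.withPlugin("java") {
            val sourceSets = project.extensions.getByType(SourceSetContainer::class.java)
            // Reconfigure the standard source directory (illustrative value)
            sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME)
                .java.setSrcDirs(listOf("src"))
        }
    }
}
```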
InhouseConventionWarPlugin.java
Plugins can access the status of build features in the build. The Build Features API allows checking
whether the user requested a particular Gradle feature and if it is active in the current build. An
example of a build feature is the configuration cache.
@Inject
protected abstract BuildFeatures getBuildFeatures(); ①

@Override
public void apply(Project p) {
    BuildFeatures buildFeatures = getBuildFeatures();
① The BuildFeatures service can be injected into plugins, tasks, and other managed types.
The BuildFeature.getRequested() status of a build feature determines if the user requested to enable
or disable the feature.
• true — the user opted in to using the feature
• false — the user opted out from using the feature
• undefined — the user neither opted in nor opted out from using the feature
The BuildFeature.getActive() status of a build feature is always defined. It represents the effective
state of the feature in the build.
• true — the feature may affect the build behavior in a way specific to the feature
• false — the feature will not affect the build behavior
Note that the active status does not depend on the requested status. Even if the user requests a
feature, it may still not be active due to other build options being used in the build. Gradle can also
activate a feature by default, even if the user did not specify a preference.
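For example, a plugin might skip logic that is incompatible with the configuration cache. A sketch (the plugin name is illustrative; BuildFeatures requires Gradle 8.5+):

```kotlin
import javax.inject.Inject
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.configuration.BuildFeatures

abstract class FeatureAwarePlugin : Plugin<Project> {
    @get:Inject
    protected abstract val buildFeatures: BuildFeatures

    override fun apply(project: Project) {
        // getActive() is always defined, so get() is safe here
        if (!buildFeatures.configurationCache.active.get()) {
            // register configuration-cache-incompatible logic here
        }
    }
}
```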
A plugin can provide dependency declarations in custom blocks that allow users to declare
dependencies in a type-safe and context-aware way.
For instance, instead of users needing to know and use the underlying Configuration name to add
dependencies, a custom dependencies block lets the plugin pick a meaningful name that can be used
consistently.
To add a custom dependencies block, you need to create a new type that will represent the set of
dependency scopes available to users. That new type needs to be accessible from a part of your
plugin (from a domain object or extension). Finally, the dependency scopes need to be wired back
to underlying Configuration objects that will be used during dependency resolution.
See JvmComponentDependencies and JvmTestSuite for an example of how this is used in a Gradle
core plugin.
1. Create an interface that extends Dependencies
ExampleDependencies.java
/**
 * Custom dependencies block for the example plugin.
 */
public interface ExampleDependencies extends Dependencies {
For each dependency scope your plugin wants to support, add a getter method that returns a
DependencyCollector.
2. Add accessors for dependency scopes
ExampleDependencies.java
/**
 * Dependency scope called "implementation"
 */
DependencyCollector getImplementation();
To make the custom dependencies block configurable, the plugin needs to add a getDependencies
method that returns the new type from above and a configurable block method named
dependencies.
By convention, the accessors for your custom dependencies block should be called
getDependencies()/dependencies(Action). This method could be named something else, but users
would need to know that a different block can behave like a dependencies block.
3. Add accessors for the custom dependencies block
ExampleExtension.java
/**
 * Custom dependencies for this extension.
 */
@Nested
ExampleDependencies getDependencies();

/**
 * Configurable block
 */
default void dependencies(Action<? super ExampleDependencies> action) {
    action.execute(getDependencies());
}
4. Wire dependency scope to Configuration
Finally, the plugin needs to wire the custom dependencies block to some underlying Configuration
objects. If this is not done, none of the dependencies declared in the custom block will be available
to dependency resolution.
ExamplePlugin.java
project.getConfigurations().dependencyScope("exampleImplementation", conf -> {
    conf.fromDependencyCollector(example.getDependencies().getImplementation());
});
NOTE: In this example, the name users will use to add dependencies is "implementation", but the underlying Configuration is named exampleImplementation.
build.gradle.kts
example {
    dependencies {
        implementation("junit:junit:4.13")
    }
}
build.gradle
example {
    dependencies {
        implementation("junit:junit:4.13")
    }
}
Differences between the custom dependencies and the top-level dependencies blocks
Each dependency scope returns a DependencyCollector that provides strongly-typed methods to add
and configure dependencies.
There is also a DependencyFactory with factory methods to create new dependencies from different
notations. Dependencies can be created lazily using these factory methods, as shown below.
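For example, a provider of a version string can be mapped to a lazily created Dependency. The following is a sketch; the junitVersion property name and the helper function are illustrative:

```kotlin
import org.gradle.api.Project
import org.gradle.api.artifacts.dsl.DependencyCollector

fun addLazyJunit(project: Project, collector: DependencyCollector) {
    val version = project.providers.gradleProperty("junitVersion")
    // The dependency is only created when the collector is queried
    collector.add(version.map { v ->
        project.dependencyFactory.create("junit:junit:$v")
    })
}
```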
A custom dependencies block differs from the top-level dependencies block in the following ways:
• You cannot declare dependencies with the Map notation from Kotlin and Java. Use multi-argument
methods instead.
• You cannot add a dependency with an instance of Project. You must turn it into a
ProjectDependency first.
• You cannot add version catalog bundles directly. Instead, use the bundle method on each
configuration.
• You cannot use providers for non-Dependency types directly. Instead, map them to a Dependency
using the DependencyFactory.
• Unlike the top-level dependencies block, constraints are not in a separate block.
Keep in mind that the dependencies block may not provide access to the same methods as the top-
level dependencies block.
NOTE Plugins should prefer adding dependencies via their own dependencies block.
You might want to automatically download an artifact using Gradle’s dependency management
mechanism and later use it in the action of a task type declared in the plugin. Ideally, the plugin
implementation does not need to ask the user for the coordinates of that dependency - it can simply
predefine a sensible default version.
Let’s look at an example of a plugin that downloads files containing data for further processing. The
plugin implementation declares a custom configuration that allows for assigning those external
dependencies with dependency coordinates:
DataProcessingPlugin.java
project.getTasks().withType(DataProcessing.class).configureEach(
        dataProcessing -> dataProcessing.getDataFiles().from(dataFiles));
}
}
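The excerpt above omits how the dataFiles configuration and its default dependency are declared; the setup might look roughly like this (a sketch; the helper function and the default coordinates org.myorg:data:1.4.6 are illustrative):

```kotlin
import org.gradle.api.Project

fun registerDataFilesConfiguration(project: Project) {
    project.configurations.create("dataFiles") { conf ->
        conf.isCanBeConsumed = false
        conf.description = "The data artifacts to be processed."
        // Used only if the build script does not declare its own dataFiles dependency
        conf.defaultDependencies { deps ->
            deps.add(project.dependencies.create("org.myorg:data:1.4.6"))
        }
    }
}
```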
DataProcessing.java
@InputFiles
abstract public ConfigurableFileCollection getDataFiles();

@TaskAction
public void process() {
    System.out.println(getDataFiles().getFiles());
}
}
This approach is convenient for the end user as there is no need to actively declare a dependency.
The plugin already provides all the details about this implementation.
What if the user wants to use a different version of the data files? No problem. The plugin also
exposes the custom configuration that can be used to assign a different dependency. Effectively, the
default dependency is overwritten:
build.gradle.kts
plugins {
    id("org.myorg.data-processing")
}

dependencies {
    dataFiles("org.myorg:more-data:2.6")
}
build.gradle
plugins {
    id 'org.myorg.data-processing'
}

dependencies {
    dataFiles 'org.myorg:more-data:2.6'
}
You will find that this pattern works well for tasks that require an external dependency when the
task’s action is executed. You can go further and abstract the version to be used for the external
dependency by exposing an extension property (e.g. toolVersion in the JaCoCo plugin).
Using external libraries in your Gradle projects can bring great convenience, but be aware that they
can introduce complex dependency graphs. Gradle’s buildEnvironment task can help you visualize
these dependencies, including those of your plugins. Keep in mind that plugins share the same
classloader, so conflicts may arise with different versions of the same library.
build.gradle.kts
plugins {
    id("org.asciidoctor.jvm.convert") version "4.0.2"
}
build.gradle
plugins {
    id 'org.asciidoctor.jvm.convert' version '4.0.2'
}
The output of the task clearly shows the dependencies of the classpath configuration:
$ gradle buildEnvironment
> Task :buildEnvironment
------------------------------------------------------------
Root project 'external-libraries'
------------------------------------------------------------
classpath
\--- org.asciidoctor.jvm.convert:org.asciidoctor.jvm.convert.gradle.plugin:4.0.2
\--- org.asciidoctor:asciidoctor-gradle-jvm:4.0.2
+--- org.ysb33r.gradle:grolifant-rawhide:3.0.0
| \--- org.tukaani:xz:1.6
+--- org.ysb33r.gradle:grolifant-herd:3.0.0
| +--- org.tukaani:xz:1.6
| +--- org.ysb33r.gradle:grolifant40:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.apache.commons:commons-collections4:4.4
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0
| | | +--- org.tukaani:xz:1.6
| | | +--- org.apache.commons:commons-collections4:4.4
| | | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant50:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant40-legacy-api:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.apache.commons:commons-collections4:4.4
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant60:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant70:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant80:3.0.0
| | +--- org.tukaani:xz:1.6
| | +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
| | +--- org.ysb33r.gradle:grolifant70:3.0.0 (*)
| | \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
| \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
+--- org.asciidoctor:asciidoctor-gradle-base:4.0.2
| \--- org.ysb33r.gradle:grolifant-herd:3.0.0 (*)
\--- org.asciidoctor:asciidoctorj-api:2.5.7
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
A Gradle plugin does not run in its own, isolated classloader, so you must consider whether you
truly need a library or if a simpler solution suffices.
For logic that is executed as part of task execution, use the Worker API that allows you to isolate
libraries.
Variants of a plugin refer to different flavors or configurations of the plugin that are tailored to
specific needs or use cases. These variants can include different implementations, extensions, or
configurations of the base plugin.
The most convenient way to configure additional plugin variants is to use feature variants, a
concept available in all Gradle projects that apply one of the Java plugins:
dependencies {
    implementation 'com.google.guava:guava:30.1-jre'     // Regular dependency
    featureVariant 'com.google.guava:guava-gwt:30.1-jre' // Feature variant dependency
}
In the following example, each plugin variant is developed in isolation. A separate source set is
compiled and packaged in a separate jar for each variant.
The following sample demonstrates how to add a variant that is compatible with Gradle 7.0+ while
the "main" variant is compatible with older versions:
build.gradle.kts
java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) ①
    }
}

configurations.configureEach {
    if (isCanBeConsumed && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, ②
                objects.named("7.0"))
        }
    }
}

tasks.named<Copy>(gradle7.processResourcesTaskName) { ③
    val copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into("META-INF/gradle-plugins")
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    "gradle7CompileOnly"(gradleApi()) ④
}
build.gradle
java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) ①
    }
}

configurations.configureEach {
    if (canBeConsumed && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, ②
                objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}

tasks.named(gradle7.processResourcesTaskName) { ③
    def copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into('META-INF/gradle-plugins')
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    gradle7CompileOnly(gradleApi()) ④
}
First, we declare a separate source set and a feature variant for our Gradle 7 plugin variant. Then,
we do some specific wiring to turn the feature into a proper Gradle plugin variant:
① Assign the implicit capability that corresponds to the components GAV to the variant.
② Assign the Gradle API version attribute to all consumable configurations of our Gradle7 variant.
Gradle uses this information to determine which variant to select during plugin resolution.
③ Configure the processGradle7Resources task to ensure the plugin descriptor file is added to the
Gradle7 variant Jar.
④ Add a dependency to the gradleApi() for our new variant so that the API is visible during
compilation time.
Note that there is currently no convenient way to access the API of a Gradle version other than the
one you are building the plugin with. Ideally, every variant should be able to declare a dependency
on the API of the minimal Gradle version it supports. This will be improved in the future.
The above snippet assumes that all variants of your plugin have the plugin class at the same
location. That is, if your plugin class is org.example.GreetingPlugin, you need to create a second
variant of that class in src/gradle7/java/org/example.
Given a dependency on a multi-variant plugin, Gradle will automatically choose its variant that best
matches the current Gradle version when it resolves any of:
• dependencies in the root project of the build source (buildSrc) that appear on the compile or
runtime classpath;
• dependencies in a project that applies the Java Gradle Plugin Development plugin or the Kotlin
DSL plugin, appearing on the compile or runtime classpath.
The best matching variant is the variant that targets the highest Gradle API version and does not
exceed the current build’s Gradle version.
In all other cases, a plugin variant that does not specify the supported Gradle API version is
preferred if such a variant is present.
In projects that use plugins as dependencies, requesting the variants of plugin dependencies that
support a different Gradle version is possible. This allows a multi-variant plugin that depends on
other plugins to access their APIs, which are exclusively provided in their version-specific variants.
This snippet makes the plugin variant gradle7 defined above consume the matching variants of its
dependencies on other multi-variant plugins:
build.gradle.kts
configurations.configureEach {
    if (isCanBeResolved && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named("7.0"))
        }
    }
}
build.gradle
configurations.configureEach {
    if (canBeResolved && name.startsWith(gradle7.name)) {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}
Reporting problems
Plugins can report problems through Gradle’s problem-reporting APIs. The APIs report rich,
structured information about problems happening during the build. This information can be used
by different user interfaces such as Gradle’s console output, Build Scans, or IDEs to communicate
problems to the user in the most appropriate way.
@Inject
public ProblemReportingPlugin(Problems problems) { ①
    this.problemReporter = problems.forNamespace("org.myorg"); ②
}
① The Problems service is injected into the plugin.
② A problem reporter is created for the plugin. While the namespace is up to the plugin author, it
is recommended that the plugin ID be used.
③ A problem is reported. This problem is recoverable so that the build will continue.
Problem building
When reporting a problem, a wide variety of information can be provided. The ProblemSpec
describes all the information that can be provided.
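Concretely, a report might be assembled like this (a sketch; the problems API is incubating, so the exact methods on ProblemSpec vary by Gradle version, and the label, category, and solution values are illustrative):

```kotlin
// Assumes problemReporter was obtained as shown above via
// Problems.forNamespace("org.myorg")
problemReporter.reporting { problem ->
    problem
        .category("deprecation", "plugin")
        .label("The 'legacyUrl' property is deprecated")
        .severity(Severity.WARNING)
        .solution("Use the 'url' property instead")
}
```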
Reporting problems
• Reporting a problem is used for reporting problems that are recoverable, and the build should
continue.
• Throwing a problem is used for reporting problems that are not recoverable, and the build
should fail.
• Rethrowing a problem is used to wrap an already thrown exception. Otherwise, the behavior is
the same as Throwing.
When reporting problems, Gradle will aggregate similar problems by sending them through the
Tooling API based on the problem’s category and label.
• If for any bucket (i.e., category and label pairing), the number of collected occurrences is greater
than 10,000, then it will be sent immediately instead of at the end of the build.
This section revolves around a sample project called the "URL verifier plugin". This plugin creates a
task named verifyUrl that checks whether a given URL can be resolved via HTTP GET. The end user
can provide the URL via an extension named verification.
The following build script assumes that the plugin JAR file has been published to a binary
repository. The script demonstrates how to apply the plugin to the project and configure its exposed
extension:
build.gradle.kts
plugins {
    id("org.myorg.url-verifier") ①
}

verification {
    url = "https://www.google.com/" ②
}
build.gradle
plugins {
    id 'org.myorg.url-verifier' ①
}

verification {
    url = 'https://www.google.com/' ②
}
Executing the verifyUrl task renders a success message if the HTTP GET call to the configured URL
returns with a 200 response code:
$ gradle verifyUrl
BUILD SUCCESSFUL in 0s
5 actionable tasks: 5 executed
Before diving into the code, let’s first revisit the different types of tests and the tooling that supports
implementing them.
Testing is a crucial part of the software development life cycle, ensuring that software functions
correctly and meets quality standards before release. Automated testing allows developers to
refactor and improve code with confidence.
Manual Testing
While manual testing is straightforward, it is error-prone and requires human effort. For Gradle
plugins, manual testing involves using the plugin in a build script.
Automated Testing
Automated testing includes unit, integration, and functional testing.
The testing pyramid, introduced by Mike Cohen in his book Succeeding with Agile: Software
Development Using Scrum, describes three types of automated tests:
1. Unit Testing: Verifies the smallest units of code, typically methods, in isolation. It uses Stubs or
Mocks to isolate code from external dependencies.
2. Integration Testing: Verifies that multiple units or components work together correctly, for
example, when interacting with the file system or other external systems.
3. Functional Testing: Tests the system from the end user’s perspective, ensuring correct
functionality. End-to-end tests for Gradle plugins simulate a build, apply the plugin, and execute
specific tasks to verify functionality.
Tooling support
Testing Gradle plugins, both manually and automatically, is simplified with the appropriate tools.
The table below provides a summary of each testing approach. You can choose any test framework
you’re comfortable with.
For detailed explanations and code examples, refer to the specific sections below:
The composite builds feature of Gradle makes it easy to test a plugin manually. The standalone
plugin project and the consuming project can be combined into a single unit, making it
straightforward to try out or debug changes without re-publishing the binary file:
.
├── include-plugin-build ①
│   ├── build.gradle
│   └── settings.gradle
└── url-verifier-plugin ②
    ├── build.gradle
    ├── settings.gradle
    └── src
The following code snippet demonstrates the use of the settings file:
settings.gradle.kts
pluginManagement {
    includeBuild("../url-verifier-plugin")
}
settings.gradle
pluginManagement {
    includeBuild '../url-verifier-plugin'
}
The command line output of the verifyUrl task from the project include-plugin-build looks exactly
the same as shown in the introduction, except that it now executes as part of a composite build.
Manual testing has its place in the development process, but it is not a replacement for automated
testing.
Setting up a suite of tests early on is crucial to the success of your plugin. Automated tests become
an invaluable safety net when upgrading the plugin to a new Gradle version or
enhancing/refactoring the code.
Organizing test source code
We recommend implementing a good distribution of unit, integration, and functional tests to cover
the most important use cases. Separating the source code for each test type automatically results in
a project that is more maintainable and manageable.
By default, the Java plugin establishes a convention of organizing unit tests in the directory
src/test/java. Additionally, if you apply the Groovy plugin, source code under the directory
src/test/groovy is considered for compilation (with the same standard for Kotlin under the
directory src/test/kotlin). Consequently, source code directories for other test types should follow
a similar pattern:
.
└── src
    ├── functionalTest
    │   └── groovy ①
    ├── integrationTest
    │   └── groovy ②
    ├── main
    │   └── java ③
    └── test
        └── groovy ④
You can configure the source directories for compilation and test execution.
The Test Suite plugin provides a DSL and API to model multiple groups of automated tests into test
suites in JVM-based projects. You can also rely on third-party plugins for convenience, such as the
Nebula Facet plugin or the TestSets plugin.
NOTE: A new configuration DSL for modeling the integrationTest suite below is available via the incubating JVM Test Suite plugin.
In Gradle, source code directories are represented using the concept of source sets. A source set is
configured to point to one or more directories containing source code. When you define a source
set, Gradle automatically sets up compilation tasks for the specified directories.
A pre-configured source set can be created with one line of build script code. The source set
automatically registers configurations to define dependencies for the sources of the source set:
build.gradle.kts
dependencies {
    "integrationTestImplementation"(project)
}
build.gradle
dependencies {
    integrationTestImplementation(project)
}
Source sets are responsible for compiling source code, but they do not deal with executing the
bytecode. For test execution, a corresponding task of type Test needs to be established. The
following setup shows the execution of integration tests, referencing the classes and runtime
classpath of the integration test source set:
build.gradle.kts
build.gradle
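The wiring described above might look like this in the Kotlin DSL (a sketch; the integrationTest names match the source set used in the earlier dependencies example):

```kotlin
val integrationTest by sourceSets.creating

val integrationTestTask = tasks.register<Test>("integrationTest") {
    description = "Runs the integration tests."
    group = "verification"
    // Reference the classes and runtime classpath of the source set
    testClassesDirs = integrationTest.output.classesDirs
    classpath = integrationTest.runtimeClasspath
    mustRunAfter(tasks.test)
}

tasks.check {
    dependsOn(integrationTestTask)
}
```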
Gradle does not dictate the use of a specific test framework. Popular choices include JUnit, TestNG
and Spock. Once you choose an option, you have to add its dependency to the compile classpath for
your tests.
The following code snippet shows how to use Spock for implementing tests:
build.gradle.kts
repositories {
    mavenCentral()
}

dependencies {
    testImplementation(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    testImplementation("org.spockframework:spock-core")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")

    "integrationTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "integrationTestImplementation"("org.spockframework:spock-core")
    "integrationTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")

    "functionalTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "functionalTestImplementation"("org.spockframework:spock-core")
    "functionalTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")
}

tasks.withType<Test>().configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}
build.gradle
repositories {
    mavenCentral()
}

dependencies {
    testImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    testImplementation 'org.spockframework:spock-core'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    integrationTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    integrationTestImplementation 'org.spockframework:spock-core'
    integrationTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    functionalTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    functionalTestImplementation 'org.spockframework:spock-core'
    functionalTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'
}

tasks.withType(Test).configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}
NOTE: Spock is a Groovy-based BDD test framework that even includes APIs for creating Stubs and
Mocks. The Gradle team prefers Spock over other options for its expressiveness and conciseness.
Implementing automated tests
This section discusses representative implementation examples for unit, integration, and functional
tests. All test classes are based on the use of Spock, though it should be relatively easy to adapt the
code to a different test framework.
The URL verifier plugin emits HTTP GET calls to check if a URL can be resolved successfully. The
method DefaultHttpCaller.get(String) is responsible for calling a given URL and returns an
instance of type HttpResponse. HttpResponse is a POJO containing information about the HTTP
response code and message:
HttpResponse.java
package org.myorg.http;

public class HttpResponse {
    private final int code;
    private final String message;

    // constructor and getters elided in this excerpt

    @Override
    public String toString() {
        return "HTTP " + code + ", Reason: " + message;
    }
}
The class HttpResponse represents a good candidate for a unit test. It does not reach out to any other
classes nor does it use the Gradle API.
HttpResponseTest.groovy
package org.myorg.http

import spock.lang.Specification

class HttpResponseTest extends Specification {
    private static final int OK_HTTP_CODE = 200
    private static final String OK_HTTP_MESSAGE = 'OK'

    def "provides access to response code and message"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.code == OK_HTTP_CODE
        httpResponse.message == OK_HTTP_MESSAGE
    }

    def "provides a readable String representation"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.toString() == "HTTP $OK_HTTP_CODE, Reason: $OK_HTTP_MESSAGE"
    }
}
IMPORTANT: When writing unit tests, it’s important to test boundary conditions and various forms
of invalid input. Try to extract as much logic as possible from classes that use the Gradle API to
make it testable as unit tests. It will result in maintainable code and faster test execution.
You can use the ProjectBuilder class to create Project instances to use when you test your plugin
implementation.
src/test/java/org/example/GreetingPluginTest.java
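A minimal sketch of such a test is shown below. JUnit 5 is assumed; the plugin ID and task name are hypothetical, and the test requires the Gradle API on the test classpath:

```java
// src/test/java/org/example/GreetingPluginTest.java — a sketch, not the manual's exact listing
package org.example;

import org.gradle.api.Project;
import org.gradle.testfixtures.ProjectBuilder;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertNotNull;

public class GreetingPluginTest {
    @Test
    public void pluginRegistersGreetingTask() {
        // ProjectBuilder creates a lightweight, in-memory Project instance
        Project project = ProjectBuilder.builder().build();
        project.getPlugins().apply("org.example.greeting"); // hypothetical plugin ID

        // the plugin is expected to register a "greeting" task
        assertNotNull(project.getTasks().findByName("greeting"));
    }
}
```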
Let’s look at a class that reaches out to another system, the piece of code that emits the HTTP calls.
At the time of executing a test for the class DefaultHttpCaller, the runtime environment needs to be
able to reach out to the internet:
DefaultHttpCaller.java
package org.myorg.http;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URI;
import java.net.URISyntaxException;
Implementing an integration test for DefaultHttpCaller doesn’t look much different from the unit
test shown in the previous section:
DefaultHttpCallerIntegrationTest.groovy
package org.myorg.http

import spock.lang.Specification
import spock.lang.Subject

class DefaultHttpCallerIntegrationTest extends Specification {
    @Subject DefaultHttpCaller httpCaller = new DefaultHttpCaller()

    def "can make successful HTTP GET call"() {
        when:
        def httpResponse = httpCaller.get('https://www.google.com/')

        then:
        httpResponse.code == 200
        httpResponse.message == 'OK'
    }

    def "throws exception when calling unknown host via HTTP GET"() {
        when:
        httpCaller.get('https://www.wedonotknowyou123.com/')

        then:
        def t = thrown(HttpCallException)
        t.message == "Failed to call URL 'https://www.wedonotknowyou123.com/' via HTTP GET"
        t.cause instanceof UnknownHostException
    }
}
Functional tests verify the correctness of the plugin end-to-end. In practice, this means applying,
configuring, and executing the functionality of the plugin implementation. The UrlVerifierPlugin
class exposes an extension and a task instance that uses the URL value configured by the end user:
UrlVerifierPlugin.java
package org.myorg;
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.myorg.tasks.UrlVerify;
Every Gradle plugin project should apply the plugin development plugin to reduce boilerplate code.
By applying the plugin development plugin, the test source set is preconfigured for the use with
TestKit. If we want to use a custom source set for functional tests and leave the default test source
set for only unit tests, we can configure the plugin development plugin to look for TestKit tests
elsewhere.
build.gradle.kts
gradlePlugin {
testSourceSets(functionalTest)
}
build.gradle
gradlePlugin {
testSourceSets(sourceSets.functionalTest)
}
Functional tests for Gradle plugins use an instance of GradleRunner to execute the build under test.
GradleRunner is an API provided by TestKit, which internally uses the Tooling API to execute the
build.
The following example applies the plugin to the build script under test, configures the extension
and executes the build with the task verifyUrl. Please see the TestKit documentation to get more
familiar with the functionality of TestKit.
UrlVerifierPluginFunctionalTest.groovy
package org.myorg

import org.gradle.testkit.runner.GradleRunner
import spock.lang.Specification
import spock.lang.TempDir

import static org.gradle.testkit.runner.TaskOutcome.SUCCESS

class UrlVerifierPluginFunctionalTest extends Specification {
    @TempDir File testProjectDir
    File buildFile

    def setup() {
        buildFile = new File(testProjectDir, 'build.gradle')
buildFile << """
plugins {
id 'org.myorg.url-verifier'
}
"""
}
def "can successfully configure URL through extension and verify it"() {
buildFile << """
verification {
url = 'https://www.google.com/'
}
"""
when:
def result = GradleRunner.create()
.withProjectDir(testProjectDir)
.withArguments('verifyUrl')
.withPluginClasspath()
.build()
then:
result.output.contains("Successfully resolved URL 'https://www.google.com/'")
result.task(":verifyUrl").outcome == SUCCESS
}
}
IDE integration
TestKit determines the plugin classpath by running a specific Gradle task. You will need to execute
the assemble task to initially generate the plugin classpath or to reflect changes to it even when
running TestKit-based functional tests from the IDE.
Some IDEs provide a convenience option to delegate the "test classpath generation and execution"
to the build. In IntelliJ, you can find this option under Preferences… > Build, Execution, Deployment
> Build Tools > Gradle > Runner > Delegate IDE build/run actions to Gradle.
Prerequisites
You’ll need an existing Gradle plugin project for this tutorial. If you don’t have one, use the Greeting
plugin sample.
Attempting to publish this plugin will safely fail with a permission error, so don’t worry about
cluttering up the Gradle Plugin Portal with a trivial example plugin.
Account setup
Before publishing your plugin, you must create an account on the Gradle Plugin Portal. Follow the
instructions on the registration page to create an account and obtain an API key from your profile
page’s "API Keys" tab.
Store your API key in your Gradle configuration (gradle.publish.key and gradle.publish.secret) or
use a plugin like Seauc Credentials plugin or Gradle Credentials plugin for secure management.
It is common practice to copy and paste the text into your $HOME/.gradle/gradle.properties file, but
you can also place it in any other valid location. All the plugin requires is that the
gradle.publish.key and gradle.publish.secret are available as project properties when the
appropriate Plugin Portal tasks are executed.
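For reference, the relevant entries in $HOME/.gradle/gradle.properties look like this; the values shown are placeholders for the key and secret obtained from your profile page:

```properties
# $HOME/.gradle/gradle.properties
gradle.publish.key=<your API key>
gradle.publish.secret=<your API secret>
```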
If you are concerned about placing your credentials in gradle.properties, check out the Seauc
Credentials plugin or the Gradle Credentials plugin.
To publish your plugin, add the com.gradle.plugin-publish plugin to your project’s build.gradle or
build.gradle.kts file:
build.gradle.kts
plugins {
id("com.gradle.plugin-publish") version "1.2.1"
}
build.gradle
plugins {
id 'com.gradle.plugin-publish' version '1.2.1'
}
The latest version of the Plugin Publishing Plugin can be found on the Gradle Plugin Portal.
NOTE: Since version 1.0.0, the Plugin Publish Plugin automatically applies the Java Gradle Plugin
Development Plugin (assists with developing Gradle plugins) and the Maven Publish Plugin
(generates plugin publication metadata). If using older versions of the Plugin Publish Plugin,
these helper plugins must be applied explicitly.
build.gradle.kts
group = "io.github.johndoe" ①
version = "1.0" ②
gradlePlugin { ③
website = "<substitute your project website>" ④
vcsUrl = "<uri to project source repository>" ⑤
// ... ⑥
}
build.gradle
group = 'io.github.johndoe' ①
version = '1.0' ②
gradlePlugin { ③
website = '<substitute your project website>' ④
vcsUrl = '<uri to project source repository>' ⑤
// ... ⑥
}
① Make sure your project has a group set. It is used to identify the artifacts (jar and metadata)
you publish for your plugins in the repository of the Gradle Plugin Portal, and it should be
descriptive of the plugin author or the organization the plugins belong to.
② Set the version of your project, which will also be used as the version of your plugins.
③ Use the gradlePlugin block provided by the Java Gradle Plugin Development Plugin to configure
further options for your plugin publication.
⑤ Provide the source repository URI so that others can find it, if they want to contribute.
⑥ Set specific properties for each plugin you want to publish; see next section.
Next, define the properties specific to each plugin you want to publish, using the plugins {}
block nested inside gradlePlugin {}:
build.gradle.kts
gradlePlugin { ①
    // ... ②
    plugins { ③
        create("greetingsPlugin") { ④
            id = "<your plugin identifier>" ⑤
            displayName = "<short displayable name for plugin>" ⑥
            description = "<human-readable description of what your plugin is about>" ⑦
            tags = listOf("tags", "for", "your", "plugins") ⑧
            implementationClass = "<your plugin class>"
        }
    }
}
build.gradle
gradlePlugin { ①
    // ... ②
    plugins { ③
        greetingsPlugin { ④
            id = '<your plugin identifier>' ⑤
            displayName = '<short displayable name for plugin>' ⑥
            description = '<human-readable description of what your plugin is about>' ⑦
            tags.set(['tags', 'for', 'your', 'plugins']) ⑧
            implementationClass = '<your plugin class>'
        }
    }
}
③ Each plugin you publish will have its own block inside plugins.
④ The name of a plugin block must be unique for each plugin you publish; this is a property used
only locally by your build and will not be part of the publication.
⑦ Set a description to be displayed on the portal. It provides useful information to people who
want to use your plugin.
⑧ Specifies the categories your plugin covers. It makes the plugin more likely to be discovered by
people needing its functionality.
For example, consider the configuration for the GradleTest plugin, already published to the Gradle
Plugin Portal.
build.gradle.kts
gradlePlugin {
    website = "https://github.com/ysb33r/gradleTest"
    vcsUrl = "https://github.com/ysb33r/gradleTest.git"
    plugins {
        create("gradletestPlugin") {
            id = "org.ysb33r.gradletest"
            displayName = "Plugin for compatibility testing of Gradle plugins"
            description = "A plugin that helps you test your plugin against a variety of Gradle versions"
            tags = listOf("testing", "integrationTesting", "compatibility")
            implementationClass = "org.ysb33r.gradle.gradletest.GradleTestPlugin"
        }
    }
}
build.gradle
gradlePlugin {
    website = 'https://github.com/ysb33r/gradleTest'
    vcsUrl = 'https://github.com/ysb33r/gradleTest.git'
    plugins {
        gradletestPlugin {
            id = 'org.ysb33r.gradletest'
            displayName = 'Plugin for compatibility testing of Gradle plugins'
            description = 'A plugin that helps you test your plugin against a variety of Gradle versions'
            tags.addAll('testing', 'integrationTesting', 'compatibility')
            implementationClass = 'org.ysb33r.gradle.gradletest.GradleTestPlugin'
        }
    }
}
If you browse the associated page on the Gradle Plugin Portal for the GradleTest plugin, you will see
how the specified metadata is displayed.
The Plugin Publish Plugin automatically generates and publishes the Javadoc and sources JARs for
your plugin publication.
Sign artifacts
Starting from version 1.0.0 of the Plugin Publish Plugin, the signing of published plugin artifacts
is automatic. To enable it, all that’s needed is to apply the signing plugin in your build.
Shadow dependencies
Starting from version 1.0.0 of the Plugin Publish Plugin, shadowing your plugin’s dependencies
(i.e., publishing it as a fat JAR) has been made automatic. To enable it, all that’s needed is to
apply the com.github.johnrengelman.shadow plugin in your build.
If you publish your plugin internally for use within your organization, you can publish it like any
other code artifact. See the Ivy and Maven chapters on publishing artifacts.
If you are interested in publishing your plugin to be used by the wider Gradle community, you can
publish it to Gradle Plugin Portal. This site provides the ability to search for and gather information
about plugins contributed by the Gradle community. Please refer to the corresponding section on
making your plugin available on this site.
Publish locally
To check how the artifacts of your published plugin look or to use it only locally or internally in
your company, you can publish it to any Maven repository, including a local folder. You only need
to configure repositories for publishing. Then, you can run the publish task to publish your plugin
to all repositories you have defined (but not the Gradle Plugin Portal).
build.gradle.kts
publishing {
repositories {
maven {
name = "localPluginRepository"
url = uri("../local-plugin-repository")
}
}
}
build.gradle
publishing {
repositories {
maven {
name = 'localPluginRepository'
url = '../local-plugin-repository'
}
}
}
To use the repository in another build, add it to the repositories of the pluginManagement {} block in
your settings.gradle(.kts) file.
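For example, a consuming build's settings file might declare the local repository roughly like this. This is a sketch; the relative path matches the publishing example above and is otherwise arbitrary:

```kotlin
// settings.gradle.kts — consuming a locally published plugin (sketch)
pluginManagement {
    repositories {
        maven {
            url = uri("../local-plugin-repository")
        }
        gradlePluginPortal() // keep the portal for all other plugins
    }
}
```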
Publish to the Plugin Portal
Publish the plugin by running the publishPlugins task:
$ ./gradlew publishPlugins
You can validate your plugins before publishing using the --validate-only flag:
$ ./gradlew publishPlugins --validate-only
If you have not configured your gradle.properties for the Gradle Plugin Portal, you can specify
them on the command-line:
$ ./gradlew publishPlugins -Pgradle.publish.key=<key> -Pgradle.publish.secret=<secret>
NOTE: You will encounter a permission failure if you attempt to publish the example Greeting
Plugin with the ID used in this section. That’s expected and ensures the portal won’t be overrun
with multiple experimental and duplicate greeting-type plugins.
Once you successfully publish a plugin, it won’t immediately appear on the Portal. It must first
pass an approval process, which is manual and relatively slow for the initial version of your
plugin, but is fully automatic for subsequent versions. For further details, see here. After
approval, your plugin will be available on the Gradle Plugin Portal for others to discover and
use.
Once your plugin is approved, you can find instructions for its use at a URL of the form
https://plugins.gradle.org/plugin/<your-plugin-id>. For example, the Greeting Plugin example is
already on the portal at https://plugins.gradle.org/plugin/org.example.greeting.
If your plugin was published without using the Java Gradle Plugin Development Plugin, the
publication will be lacking the Plugin Marker Artifact, which is needed for the plugins DSL to
locate the plugin. In this case, the recommended way to resolve the plugin in another project is
to add a resolutionStrategy section to the pluginManagement {} block of the project’s settings
file, as shown below.
settings.gradle.kts
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == "org.example") {
                useModule("org.example:custom-plugin:${requested.version}")
            }
        }
    }
}
settings.gradle
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == 'org.example') {
                useModule("org.example:custom-plugin:${requested.version}")
            }
        }
    }
}
[1] Script plugins (applied via apply from:) are hard to maintain and are not recommended.
[2] It is recommended to use a statically-typed language like Java or Kotlin for implementing plugins to reduce the likelihood of
binary incompatibilities. If using Groovy, consider using statically compiled Groovy.
OTHER TOPICS
Gradle-managed Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory
and the Project Root directory.
TIP: Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.
├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧
By default, the cleanup runs in the background when the Gradle daemon is stopped or shut down.
The following cleanup strategies are applied periodically (by default, once every 24 hours):
• Version-specific caches in all caches/<GRADLE_VERSION>/ directories are checked for whether they
are still in use.
If not, directories for release versions are deleted after 30 days of inactivity, and snapshot
versions after 7 days.
• Shared caches in caches/ (e.g., jars-*) are checked for whether they are still in use.
• Files in shared caches used by the current Gradle version in caches/ (e.g., jars-3 or modules-2)
are checked for when they were last accessed.
Depending on whether the file can be recreated locally or downloaded from a remote
repository, it will be deleted after 7 or 30 days, respectively.
• Gradle distributions in wrapper/dists/ are checked for whether they are still in use, i.e., whether
there’s a corresponding version-specific cache directory.
Unused distributions are deleted.
3. Downloaded resources: Shared caches downloaded from a remote repository (e.g., cached
dependencies).
4. Created resources: Shared caches that Gradle creates during a build (e.g., artifact transforms).
The retention period for each category can be configured independently via an init script in the
Gradle User Home:
gradleUserHome/init.d/cache-settings.gradle.kts
beforeSettings {
caches {
releasedWrappers.setRemoveUnusedEntriesAfterDays(45)
snapshotWrappers.setRemoveUnusedEntriesAfterDays(10)
downloadedResources.setRemoveUnusedEntriesAfterDays(45)
createdResources.setRemoveUnusedEntriesAfterDays(10)
buildCache.setRemoveUnusedEntriesAfterDays(5)
}
}
gradleUserHome/init.d/cache-settings.gradle
Cache cleanup can also be disabled or forced via the Cleanup setting:
• Cleanup.DISABLED prevents any cleanup of Gradle User Home. This is useful in cases where Gradle
User Home is ephemeral or delaying cleanup is desirable until an explicit point.
• Cleanup.ALWAYS performs cleanup at the end of every build session. This is useful in cases where
it’s desirable to ensure that cleanup has occurred before proceeding. However, this performs cache
cleanup during the build (rather than in the background), which can be expensive, so this option
should only be used when necessary.
gradleUserHome/init.d/cache-settings.gradle.kts
beforeSettings {
caches {
cleanup = Cleanup.DISABLED
}
}
gradleUserHome/init.d/cache-settings.gradle
NOTE: Cache cleanup settings can only be configured via init scripts and should be placed under
the init.d directory in Gradle User Home. This effectively couples the configuration of cache
cleanup to the Gradle User Home those settings apply to and limits the possibility of different
conflicting settings from different projects being applied to the same directory.
It is common to share a single Gradle User Home between multiple versions of Gradle.
As stated above, caches in Gradle User Home are version-specific. Different versions of Gradle will
perform maintenance on only the version-specific caches associated with each version.
On the other hand, some caches are shared between versions (e.g., the dependency artifact cache or
the artifact transform cache).
Beginning with Gradle 8.0, the cache cleanup settings can be configured with custom retention
periods. Older versions, however, use fixed retention periods (7 or 30 days, depending on the
cache). As a result, these shared caches may be accessed by Gradle versions configured with
different retention settings.
• If the retention period is not customized, all versions that perform cleanup will have the same
retention periods. There will be no effect due to sharing a Gradle User Home with multiple
versions.
• If the retention period is customized for Gradle versions greater than or equal to version 8.0 to
use retention periods shorter than the previously fixed periods, there will also be no effect.
The versions of Gradle aware of these settings will clean up artifacts earlier than the previously
fixed retention periods, and older versions will effectively not participate in the cleanup of
shared caches.
• If the retention period is customized for Gradle versions greater than or equal to version 8.0 to
use retention periods longer than the previously fixed periods, the older versions of Gradle may
clean the shared caches earlier than what is configured.
In this case, if it is desirable to maintain these shared cache entries for newer versions for
longer retention periods, they will not be able to share a Gradle User Home with older versions.
They will need to use a separate directory.
Another consideration when sharing the Gradle User Home with versions of Gradle before version
8.0 is that the DSL elements to configure the cache retention settings are unavailable in earlier
versions, so this must be accounted for in any init script shared between versions. This can easily
be handled by conditionally applying a version-compliant script.
NOTE: The version-compliant script should reside somewhere other than the init.d directory (such
as a sub-directory), so it is not automatically applied.
gradleUserHome/init.d/cache-settings.gradle.kts
gradleUserHome/init.d/cache-settings.gradle
gradleUserHome/init.d/gradle8/cache-settings.gradle.kts
beforeSettings {
caches {
releasedWrappers { setRemoveUnusedEntriesAfterDays(45) }
snapshotWrappers { setRemoveUnusedEntriesAfterDays(10) }
downloadedResources { setRemoveUnusedEntriesAfterDays(45) }
createdResources { setRemoveUnusedEntriesAfterDays(10) }
buildCache { setRemoveUnusedEntriesAfterDays(5) }
}
}
gradleUserHome/init.d/gradle8/cache-settings.gradle
Cache marking
Beginning with Gradle version 8.1, Gradle supports marking caches with a CACHEDIR.TAG file.
It follows the format described in the Cache Directory Tagging Specification. The purpose of this file
is to allow tools to identify the directories that do not need to be searched or backed up.
By default, the directories caches, wrapper/dists, daemon, and jdks in the Gradle User Home are
marked with this file.
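Per the Cache Directory Tagging Specification, a tag file is identified by its first line; the rest is free-form commentary. A marked directory therefore contains a CACHEDIR.TAG file similar to:

```
Signature: 8a477f597d28d172789f06886806bc55
# This file is a cache directory tag.
# For information about cache directory tags, see https://bford.info/cachedir/
```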
The cache marking feature can be configured via an init script in the Gradle User Home:
gradleUserHome/init.d/cache-settings.gradle.kts
beforeSettings {
caches {
// Disable cache marking for all caches
markingStrategy = MarkingStrategy.NONE
}
}
gradleUserHome/init.d/cache-settings.gradle
NOTE: Cache marking settings can only be configured via init scripts and should be placed under
the init.d directory in Gradle User Home. This effectively couples the configuration of cache
marking to the Gradle User Home to which those settings apply and limits the possibility of
different conflicting settings from different projects being applied to the same directory.
Project Root directory
The project root directory contains all source files for your project.
It also contains files and directories Gradle generates, such as .gradle and build.
While source files are usually checked into source control, the generated files are transient and
used by Gradle to support features like incremental builds.
├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle.kts ⑦
├── subproject-one ⑧
│   └── build.gradle.kts ⑨
├── subproject-two ⑧
│   └── build.gradle.kts ⑨
└── ⋮
③ The build directory of this project into which Gradle generates all build artifacts.
From version 4.10 onwards, Gradle automatically cleans the project-specific cache directory.
After building the project, version-specific cache directories in .gradle/8.9/ are checked
periodically (at most, every 24 hours) to determine whether they are still in use. They are deleted if
they haven’t been used for 7 days.
In addition to avoiding hardcoded paths, Gradle encourages laziness in its build scripts. This means
that tasks and operations should be deferred until they are actually needed rather than executed
eagerly.
Many examples in this chapter use hard-coded paths as string literals. This makes them easy to
understand, but it is not good practice. The problem is that paths often change, and the more places
you need to change them, the more likely you will miss one and break the build.
Where possible, you should use tasks, task properties, and project properties — in that order of
preference — to configure file paths.
For example, if you create a task that packages the compiled classes of a Java application, you
should use an implementation similar to this:
build.gradle.kts
tasks.register<Zip>("packageClasses") {
archiveAppendix = "classes"
destinationDirectory = archivesDirPath
from(tasks.compileJava)
}
build.gradle
tasks.register('packageClasses', Zip) {
archiveAppendix = "classes"
destinationDirectory = archivesDirPath
from compileJava
}
The compileJava task is the source of the files to package, and the project property archivesDirPath
stores the location of the archives, as we are likely to use it elsewhere in the build.
Using a task directly as an argument like this relies on it having defined outputs, so it won’t always
be possible. This example could be further improved by relying on the Java plugin’s convention for
destinationDirectory rather than overriding it, but it does demonstrate the use of project
properties.
Locating files
To perform some action on a file, you need to know where it is, and that’s the information provided
by file paths. Gradle builds on the standard Java File class, which represents the location of a
single file, and provides its own APIs for dealing with collections of paths.
Using ProjectLayout
The ProjectLayout class is used to access various directories and files within a project. It provides
methods to retrieve paths to the project directory, build directory, settings file, and other important
locations within the project’s file structure. This class is particularly useful when you need to work
with files in a build script or plugin in different project paths:
build.gradle.kts
build.gradle
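The following Kotlin DSL sketch illustrates typical ProjectLayout accessors; the report path is hypothetical:

```kotlin
// build.gradle.kts — ProjectLayout accessors (sketch)
val projectDir: Directory = layout.projectDirectory
val buildDirProvider: DirectoryProperty = layout.buildDirectory

// resolve a file inside the build directory lazily, as a Provider
val reportFile: Provider<RegularFile> = layout.buildDirectory.file("reports/my-report.txt")
```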
Using Project.file()
Gradle provides the Project.file(java.lang.Object) method for specifying the location of a single file
or directory.
Relative paths are resolved relative to the project directory, while absolute paths remain
unchanged.
Never use new File(relative path) unless passed to file() or files() or from()
or other methods defined in terms of file() or files(). Otherwise, this creates a
CAUTION path relative to the current working directory (CWD). Gradle can make no
guarantees about the location of the CWD, which means builds that rely on it
may break at any time.
Here are some examples of using the file() method with different types of arguments:
build.gradle.kts
build.gradle
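The calls below sketch the accepted argument types; the paths are illustrative:

```kotlin
// build.gradle.kts — Project.file() accepts several argument types (sketch)
import java.nio.file.Paths

// relative path, resolved against the project directory
val configFile1 = file("src/config.xml")
// absolute path, returned unchanged
val configFile2 = file("/absolute/path/to/config.xml")
// File and java.nio.file.Path instances also work
val configFile3 = file(File("src/config.xml"))
val configFile4 = file(Paths.get("src", "config.xml"))
```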
As you can see, you can pass strings, File instances and Path instances to the file() method, all of
which result in an absolute File object.
In the case of multi-project builds, the file() method will always turn relative paths into paths
relative to the current project directory, which may be a child project.
Using Project.getRootDir()
Suppose you want to use a path relative to the root project directory. In that case, you need to use
the special Project.getRootDir() property to construct an absolute path, like so:
build.gradle.kts
build.gradle
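A sketch of such a path, assuming the AcmeHealth layout that follows (adjust the relative path to the actual layout):

```kotlin
// build.gradle.kts of a subproject — resolving against the root project directory (sketch)
val sharedConfigFile = file("$rootDir/shared/config.xml")
```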
dev
├── projects
│ ├── AcmeHealth
│ │ ├── subprojects
│ │ │ ├── AcmePatientRecordLib
│ │ │ │ └── build.gradle
│ │ │ └── ...
│ │ ├── shared
│ │ │ └── config.xml
│ │ └── ...
│ └── ...
└── settings.gradle
Note that Project also provides Project.getRootProject() for multi-project builds, which resolves
to the root project, whose directory contains the settings file.
Using FileCollection
A file collection is simply a set of file paths represented by the FileCollection interface.
The set of paths can be any file path. The file paths don’t have to be related in any way, so they don’t
have to be in the same directory or have a shared parent directory.
As with the Project.file(java.lang.Object) method covered in the previous section, all relative paths
are evaluated relative to the current project directory. The following example demonstrates some
of the variety of argument types you can use — strings, File instances, lists, or Paths:
build.gradle.kts
build.gradle
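For instance (paths illustrative):

```kotlin
// build.gradle.kts — building a FileCollection from mixed argument types (sketch)
import java.nio.file.Paths

val collection: FileCollection = layout.files(
    "src/file1.txt",
    File("src/file2.txt"),
    listOf("src/file3.csv", "src/file4.csv"),
    Paths.get("src", "file5.txt")
)
```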
Among other things, a file collection can be:
• created lazily
• iterated over
• filtered
• combined
Lazy creation of a file collection is useful when evaluating the files that make up a collection when a
build runs. In the following example, we query the file system to find out what files exist in a
particular directory and then make those into a file collection:
build.gradle.kts
tasks.register("list") {
    val projectDirectory = layout.projectDirectory
    doLast {
        var srcDir: File? = null

        // The collection is created lazily: the provider is queried each time
        // the collection's contents are requested
        val collection = projectDirectory.files({
            srcDir?.listFiles()
        })

        srcDir = projectDirectory.file("src").asFile
        println("Contents of ${srcDir.name}")
        collection.map { it.relativeTo(projectDirectory.asFile) }.sorted().forEach { println(it) }

        srcDir = projectDirectory.file("src2").asFile
        println("Contents of ${srcDir.name}")
        collection.map { it.relativeTo(projectDirectory.asFile) }.sorted().forEach { println(it) }
    }
}
build.gradle
tasks.register('list') {
    Directory projectDirectory = layout.projectDirectory
    doLast {
        File srcDir

        // The collection is created lazily: the closure is queried each time
        // the collection's contents are requested
        def collection = projectDirectory.files { srcDir.listFiles() }

        srcDir = projectDirectory.file('src').asFile
        println "Contents of $srcDir.name"
        collection.collect { projectDirectory.asFile.relativePath(it) }.sort().each { println it }

        srcDir = projectDirectory.file('src2').asFile
        println "Contents of $srcDir.name"
        collection.collect { projectDirectory.asFile.relativePath(it) }.sort().each { println it }
    }
}
$ gradle -q list
Contents of src
src/dir1
src/file1.txt
Contents of src2
src2/dir1
src2/dir2
The key to lazy creation is passing a closure (in Groovy) or a Provider (in Kotlin) to the files()
method. Your closure or provider must return a value of a type accepted by files(), such as
List<File>, String, or FileCollection.
Iterating over a file collection can be done through the each() method (in Groovy) or forEach method
(in Kotlin) on the collection or using the collection in a for loop. In both approaches, the file
collection is treated as a set of File instances, i.e., your iteration variable will be of type File.
The following example demonstrates such iteration. It also demonstrates how you can convert file
collections to other types using the as operator (or supported properties):
build.gradle.kts
build.gradle
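A minimal sketch of such iteration and conversion in the Kotlin DSL, assuming a couple of hypothetical files under src (the file names and the union/different variable names are illustrative):

```kotlin
// A simple file collection over two hypothetical files
val collection: FileCollection = layout.files("src/file1.txt", "src/file2.txt")

tasks.register("iterate") {
    doLast {
        // Each element is a File, so the iteration variable is of type File
        collection.forEach { file: File ->
            println(file.name)
        }

        // Convert the collection to other types
        val set: Set<File> = collection.files
        val list: List<File> = collection.toList()
        val path: String = collection.asPath

        // union and different are live: they reflect later changes to collection
        val union = collection + layout.files("src/file3.txt")
        val different = collection - layout.files("src/file3.txt")
    }
}
```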
For example, imagine collection in the above example gains an extra file or two after union is
created. As long as you use union after those files are added to collection, union will also contain
those additional files. The same goes for the different file collection.
Live collections are also important when it comes to filtering. Suppose you want to use a subset of a
file collection. In that case, you can take advantage of the
FileCollection.filter(org.gradle.api.specs.Spec) method to determine which files to "keep". In the
following example, we create a new collection that consists of only the files that end with .txt in
the source collection:
build.gradle.kts
build.gradle
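A sketch of what such a filtering task can look like in the Kotlin DSL (the source file names are hypothetical):

```kotlin
// A source collection containing a mix of file types (hypothetical paths)
val collection: FileCollection = layout.files(
    "src/file1.txt", "src/file2.txt", "src/file3.csv", "src/file5.txt"
)

// A live collection holding only the files whose names end with .txt
val textFiles: FileCollection = collection.filter { f: File ->
    f.name.endsWith(".txt")
}

tasks.register("filterTextFiles") {
    doLast {
        textFiles.map { it.relativeTo(projectDir) }.sorted().forEach { println(it) }
    }
}
```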
$ gradle -q filterTextFiles
src/file1.txt
src/file2.txt
src/file5.txt
If collection changes at any time, either by adding or removing files from itself, then textFiles will
immediately reflect the change because it is also a live collection. Note that the closure you pass to
filter() takes a File as an argument and should return a boolean.
Many objects in Gradle have properties which accept a set of input files. For example, the
JavaCompile task has a source property that defines the source files to compile. You can set the
value of this property using any of the types supported by the files() method, as mentioned in the
API docs. This means you can, for example, set the property to a File, String, collection,
FileCollection or even a closure or Provider.
This is a feature of specific tasks! That means implicit conversion will not happen for just any
task that has a FileCollection or FileTree property. If you want to know whether implicit
conversion happens in a particular situation, you will need to read the relevant documentation,
such as the corresponding task’s API docs. Alternatively, you can remove all doubt by explicitly
using ProjectLayout.files(java.lang.Object...) in your build.
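For example, wrapping the values in files() makes the conversion explicit (a sketch with a hypothetical task name and source directories):

```kotlin
tasks.register<JavaCompile>("compileExtra") {
    // Explicitly build a file collection instead of relying on implicit conversion
    setSource(layout.files("src/main/java", "src/extra/java"))
}
```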
Here are some examples of the different types of arguments that the source property can take:
build.gradle.kts
tasks.register<JavaCompile>("compile") {
    // Use a File object to specify the source directory
    source = fileTree(file("src/main/java"))
}
build.gradle
tasks.register('compile', JavaCompile) {
    // Use a File object to specify the source directory
    source = fileTree(file('src/main/java'))
}
One other thing to note is that properties like source have corresponding methods in core Gradle
tasks. Those methods follow the convention of appending to collections of values rather than
replacing them. Again, this method accepts any of the types supported by the files() method, as
shown here:
build.gradle.kts
tasks.named<JavaCompile>("compile") {
    // Add some source directories using String paths
    source("src/main/java", "src/main/groovy")
}
build.gradle
compile {
    // Add some source directories using String paths
    source 'src/main/java', 'src/main/groovy'
}
Using FileTree
A file tree is a file collection that retains the directory structure of the files it contains and has the
type FileTree. This means all the paths in a file tree must have a shared parent directory. The
following diagram highlights the distinction between file trees and file collections in the typical
case of copying files:
The simplest way to create a file tree is to pass a file or directory path to the
Project.fileTree(java.lang.Object) method. This will create a tree of all the files and directories in
that base directory (but not the base directory itself). The following example demonstrates how to
use this method and how to filter the files and directories using Ant-style patterns:
build.gradle.kts
build.gradle
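A sketch of fileTree() usage with Ant-style patterns in the Kotlin DSL (the directory and pattern names are hypothetical):

```kotlin
// Create a file tree with a base directory
var tree: ConfigurableFileTree = fileTree("src/main")

// Add include and exclude patterns to the tree
tree.include("**/*.java")
tree.exclude("**/Abstract*")

// Create a tree using a configuration block
tree = fileTree("src") {
    include("**/*.java")
}
```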
You can see more examples of supported patterns in the API docs for PatternFilterable.
By default, fileTree() returns a FileTree instance that applies some default exclude patterns for
convenience — the same defaults as Ant. For the complete default exclude list, see the Ant manual.
If those default excludes prove problematic, you can work around the issue by changing the default
excludes in the settings script:
settings.gradle.kts
import org.apache.tools.ant.DirectoryScanner
DirectoryScanner.removeDefaultExclude("**/.git")
DirectoryScanner.removeDefaultExclude("**/.git/**")
settings.gradle
import org.apache.tools.ant.DirectoryScanner
DirectoryScanner.removeDefaultExclude('**/.git')
DirectoryScanner.removeDefaultExclude('**/.git/**')
IMPORTANT: Gradle does not support changing default excludes during the execution phase.
You can do many of the same things with file trees that you can with file collections:
• merge them
You can also traverse file trees using the FileTree.visit(org.gradle.api.Action) method. All of these
techniques are demonstrated in the following example:
build.gradle.kts
// Filter a tree
val filtered: FileTree = tree.matching {
include("org/gradle/api/**")
}
build.gradle
// Filter a tree
FileTree filtered = tree.matching {
include 'org/gradle/api/**'
}
Copying files
Copying files in Gradle primarily uses CopySpec, a mechanism that makes it easy to manage
resources such as source code, configuration files, and other assets in your project build process.
Understanding CopySpec
CopySpec is a copy specification that allows you to define what files to copy, where to copy them from, and where to copy them to. It provides a flexible and expressive way to specify complex file copying operations, including filtering files based on patterns, renaming files, and including/excluding files based on various criteria.
CopySpec instances are used in the Copy task to specify the files and directories to be copied.
Consider a build with several tasks that copy a project’s static website resources or add them to an
archive. One task might copy the resources to a folder for a local HTTP server, and another might
package them into a distribution. You could manually specify the file locations and appropriate
inclusions each time they are needed, but human error is more likely to creep in, resulting in
inconsistencies between tasks.
One solution is the Project.copySpec(org.gradle.api.Action) method. This allows you to create a copy
spec outside a task, which can then be attached to an appropriate task using the
CopySpec.with(org.gradle.api.file.CopySpec…) method. The following example demonstrates how
this is done:
build.gradle.kts
val webAssetsSpec: CopySpec = copySpec {
    from("src/main/webapp")
    include("**/*.html", "**/*.png", "**/*.jpg")
}
tasks.register<Copy>("copyAssets") {
into(layout.buildDirectory.dir("inPlaceApp"))
with(webAssetsSpec)
}
tasks.register<Zip>("distApp") {
archiveFileName = "my-app-dist.zip"
destinationDirectory = layout.buildDirectory.dir("dists")
from(appClasses)
with(webAssetsSpec)
}
build.gradle
CopySpec webAssetsSpec = copySpec {
    from 'src/main/webapp'
    include '**/*.html', '**/*.png', '**/*.jpg'
}
tasks.register('copyAssets', Copy) {
into layout.buildDirectory.dir("inPlaceApp")
with webAssetsSpec
}
tasks.register('distApp', Zip) {
archiveFileName = 'my-app-dist.zip'
destinationDirectory = layout.buildDirectory.dir('dists')
from appClasses
with webAssetsSpec
}
Both the copyAssets and distApp tasks will process the static resources under src/main/webapp, as
specified by webAssetsSpec.
NOTE: The configuration defined by webAssetsSpec will not apply to the app classes included by the distApp task. That’s because from appClasses is its own child specification independent of with webAssetsSpec.
This can be confusing, so it’s probably best to treat with() as an extra from() specification in the task. Hence, it doesn’t make sense to define a standalone copy spec without at least one from() defined.
Suppose you encounter a scenario in which you want to apply the same copy configuration to
different sets of files. In that case, you can share the configuration block directly without using
copySpec(). Here’s an example that has two independent tasks that happen to want to process
image files only:
build.gradle.kts
val webAssetPatterns = Action<CopySpec> {
    include("**/*.html", "**/*.png", "**/*.jpg")
}
tasks.register<Copy>("copyAppAssets") {
into(layout.buildDirectory.dir("inPlaceApp"))
from("src/main/webapp", webAssetPatterns)
}
tasks.register<Zip>("archiveDistAssets") {
archiveFileName = "distribution-assets.zip"
destinationDirectory = layout.buildDirectory.dir("dists")
from("distResources", webAssetPatterns)
}
build.gradle
def webAssetPatterns = {
include '**/*.html', '**/*.png', '**/*.jpg'
}
tasks.register('copyAppAssets', Copy) {
into layout.buildDirectory.dir("inPlaceApp")
from 'src/main/webapp', webAssetPatterns
}
tasks.register('archiveDistAssets', Zip) {
    archiveFileName = 'distribution-assets.zip'
    destinationDirectory = layout.buildDirectory.dir('dists')
    from 'distResources', webAssetPatterns
}
In this case, we assign the copy configuration to its own variable and apply it to whatever from()
specification we want. This doesn’t just work for inclusions but also exclusions, file renaming, and
file content filtering.
If you only use a single copy spec, the file filtering and renaming will apply to all files copied.
Sometimes, this is what you want, but not always. Consider the following example that copies files
into a directory structure that a Java Servlet container can use to deliver a website:
This is not a straightforward copy as the WEB-INF directory and its subdirectories don’t exist within
the project, so they must be created during the copy. In addition, we only want HTML and image
files going directly into the root folder — build/explodedWar — and only JavaScript files going into
the js directory. We need separate filter patterns for those two sets of files.
The solution is to use child specifications, which can be applied to both from() and into()
declarations. The following task definition does the necessary work:
build.gradle.kts
tasks.register<Copy>("nestedSpecs") {
into(layout.buildDirectory.dir("explodedWar"))
exclude("**/*staging*")
from("src/dist") {
include("**/*.html", "**/*.png", "**/*.jpg")
}
from(sourceSets.main.get().output) {
into("WEB-INF/classes")
}
into("WEB-INF/lib") {
from(configurations.runtimeClasspath)
}
}
build.gradle
tasks.register('nestedSpecs', Copy) {
into layout.buildDirectory.dir("explodedWar")
exclude '**/*staging*'
from('src/dist') {
include '**/*.html', '**/*.png', '**/*.jpg'
}
from(sourceSets.main.output) {
into 'WEB-INF/classes'
}
into('WEB-INF/lib') {
from configurations.runtimeClasspath
}
}
Notice how the src/dist configuration has a nested inclusion specification; it is the child copy spec.
You can, of course, add content filtering and renaming here as required. A child copy spec is still a
copy spec.
The above example also demonstrates how you can copy files into a subdirectory of the destination
either by using a child into() on a from() or a child from() on an into(). Both approaches are
acceptable, but you should create and follow a convention to ensure consistency across your build
files.
NOTE: Don’t get your into() specifications mixed up. For a normal copy, one to the filesystem rather than an archive, there should always be one "root" into() that specifies the overall destination directory of the copy. Any other into() should have a child spec attached, and its path will be relative to the root into().
One final thing to be aware of is that a child copy spec inherits its destination path, include
patterns, exclude patterns, copy actions, name mappings, and filters from its parent. So, be careful
where you place your configuration.
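For instance, in a hypothetical task like the following, a parent-level exclude is inherited by the child from() specification:

```kotlin
tasks.register<Copy>("inheritanceDemo") {
    into(layout.buildDirectory.dir("out"))
    exclude("**/*.tmp")        // applies to every child spec below as well
    from("src/docs") {         // hypothetical source directory
        include("**/*.md")     // child adds an include; *.tmp files are still excluded
    }
}
```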
Using the Sync task
The Sync task, which extends the Copy task, copies the source files into the destination directory and
then removes any files from the destination directory which it did not copy. It synchronizes the
contents of a directory with its source.
This can be useful for doing things such as installing your application, creating an exploded copy of
your archives, or maintaining a copy of the project’s dependencies.
Here is an example that maintains a copy of the project’s runtime dependencies in the build/libs
directory:
build.gradle.kts
tasks.register<Sync>("libs") {
from(configurations["runtime"])
into(layout.buildDirectory.dir("libs"))
}
build.gradle
tasks.register('libs', Sync) {
from configurations.runtime
into layout.buildDirectory.dir('libs')
}
You can also perform the same function in your own tasks with the
Project.sync(org.gradle.api.Action) method.
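A sketch of calling sync() from a task action, assuming a runtimeClasspath configuration exists and using a hypothetical destination directory:

```kotlin
tasks.register("syncDependencies") {
    doLast {
        // Works like the Sync task: the destination ends up containing
        // exactly the synced files and nothing else
        sync {
            from(configurations["runtimeClasspath"])
            into(layout.buildDirectory.dir("deps"))
        }
    }
}
```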
You can copy a file by creating an instance of Gradle’s built-in Copy task and configuring it with the location of the file and where you want to put it.
This example mimics copying a generated report into a directory that will be packed into an
archive, such as a ZIP or TAR:
build.gradle.kts
tasks.register<Copy>("copyReport") {
from(layout.buildDirectory.file("reports/my-report.pdf"))
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyReport', Copy) {
from layout.buildDirectory.file("reports/my-report.pdf")
into layout.buildDirectory.dir("toArchive")
}
The file and directory paths are then used to specify what file to copy using
Copy.from(java.lang.Object…) and which directory to copy it to using Copy.into(java.lang.Object).
Although hard-coded paths make for simple examples, they make the build brittle. Using a reliable,
single source of truth, such as a task or shared project property, is better. In the following modified
example, we use a report task defined elsewhere that has the report’s location stored in its
outputFile property:
build.gradle.kts
tasks.register<Copy>("copyReport2") {
from(myReportTask.flatMap { it.outputFile })
into(archiveReportsTask.flatMap { it.dirToArchive })
}
build.gradle
tasks.register('copyReport2', Copy) {
    from myReportTask.flatMap { it.outputFile }
    into archiveReportsTask.flatMap { it.dirToArchive }
}
We have also assumed that the reports will be archived by archiveReportsTask, which provides us
with the directory that will be archived and hence where we want to put the copies of the reports.
You can extend the previous examples to multiple files very easily by providing multiple arguments
to from():
build.gradle.kts
tasks.register<Copy>("copyReportsForArchiving") {
from(layout.buildDirectory.file("reports/my-report.pdf"),
layout.projectDirectory.file("src/docs/manual.pdf"))
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyReportsForArchiving', Copy) {
    from layout.buildDirectory.file("reports/my-report.pdf"),
         layout.projectDirectory.file("src/docs/manual.pdf")
into layout.buildDirectory.dir("toArchive")
}
You can also use multiple from() statements to do the same thing, as shown in the first example of
the section File copying in depth.
But what if you want to copy all the PDFs in a directory without specifying each one? To do this,
attach inclusion and/or exclusion patterns to the copy specification. Here, we use a string pattern to
include PDFs only:
build.gradle.kts
tasks.register<Copy>("copyPdfReportsForArchiving") {
from(layout.buildDirectory.dir("reports"))
include("*.pdf")
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyPdfReportsForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
include "*.pdf"
into layout.buildDirectory.dir("toArchive")
}
One thing to note, as demonstrated in the following diagram, is that only the PDFs that reside
directly in the reports directory are copied:
You can include files in subdirectories by using an Ant-style glob pattern (**/*), as done in this
updated example:
build.gradle.kts
tasks.register<Copy>("copyAllPdfReportsForArchiving") {
from(layout.buildDirectory.dir("reports"))
include("**/*.pdf")
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyAllPdfReportsForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
include "**/*.pdf"
into layout.buildDirectory.dir("toArchive")
}
Remember that a deep filter like this has the side effect of copying the directory structure below reports as well as the files. If you want to copy the files without the directory structure, you must use an explicit fileTree(dir) { includes }.files expression.
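A sketch of such a flattening copy, with a hypothetical task name:

```kotlin
tasks.register<Copy>("copyPdfsFlattened") {
    // .files converts the tree to a flat set of files, so subdirectory
    // structure under reports is not reproduced in the destination
    from(fileTree(layout.buildDirectory.dir("reports")) { include("**/*.pdf") }.files)
    into(layout.buildDirectory.dir("toArchive"))
}
```

Note that .files resolves the tree eagerly, so this sketch evaluates the directory contents at configuration time.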
Copying directory hierarchies
You may need to copy files as well as the directory structure in which they reside. This is the default
behavior when you specify a directory as the from() argument, as demonstrated by the following
example that copies everything in the reports directory, including all its subdirectories, to the
destination:
build.gradle.kts
tasks.register<Copy>("copyReportsDirForArchiving") {
from(layout.buildDirectory.dir("reports"))
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyReportsDirForArchiving', Copy) {
from layout.buildDirectory.dir("reports")
into layout.buildDirectory.dir("toArchive")
}
The aspect users most often need help with is controlling how much of the directory structure goes to the destination. In the above example, do you get a toArchive/reports directory, or does everything in reports go straight into toArchive? The answer is the latter. If a directory is part of the from() path, then it won’t appear in the destination.
So how do you ensure that reports itself is copied across, but not any other directory in
${layout.buildDirectory}? The answer is to add it as an include pattern:
build.gradle.kts
tasks.register<Copy>("copyReportsDirForArchiving2") {
from(layout.buildDirectory) {
include("reports/**")
}
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyReportsDirForArchiving2', Copy) {
from(layout.buildDirectory) {
include "reports/**"
}
into layout.buildDirectory.dir("toArchive")
}
You’ll get the same behavior as before except with one extra directory level in the destination, i.e.,
toArchive/reports.
One thing to note is how the include() directive applies only to the from(), whereas the directive in the previous section applied to the whole task. These different levels of granularity in the copy specification allow you to easily handle most requirements that you come across.
But this apparent simplicity hides a rich API that allows fine-grained control of which files are
copied, where they go, and what happens to them as they are copied — renaming of the files and
token substitution of file content are both possibilities, for example.
Let’s start with the last two items on the list, which involve CopySpec. The CopySpec interface, which the Copy task implements, offers:
• A from() method to define what to copy
• An into() method to define the destination
CopySpec has several additional methods that allow you to control the copying process, but these
two are the only required ones. into() is straightforward, requiring a directory path as its
argument in any form supported by the Project.file(java.lang.Object) method. The from()
configuration is far more flexible.
Not only does from() accept multiple arguments, it also allows several different types of argument.
For example, some of the most common types are:
• A String — treated as a file path or, if it starts with "file://", a file URI
• A FileCollection or FileTree — all files in the collection are included in the copy
• A task — the files or directories that form a task’s defined outputs are included
In fact, from() accepts all the same arguments as Project.files(java.lang.Object…) so see that method
for a more detailed list of acceptable types.
Something else to consider is what type of thing a file path refers to:
• A file — the file is copied as is
• A directory — this is effectively treated as a file tree: everything in it, including subdirectories,
is copied. However, the directory itself is not included in the copy.
Here is an example that uses multiple from() specifications, each with a different argument type.
You will probably also notice that into() is configured lazily using a closure (in Groovy) or a
Provider (in Kotlin) — a technique that also works with from():
build.gradle.kts
tasks.register<Copy>("anotherCopyTask") {
// Copy everything under src/main/webapp
from("src/main/webapp")
// Copy a single file
from("src/staging/index.html")
// Copy the output of a task
from(copyTask)
// Copy the output of a task using Task outputs explicitly.
from(tasks["copyTaskWithPatterns"].outputs)
// Copy the contents of a Zip file
from(zipTree("src/main/assets.zip"))
// Determine the destination directory later
into({ getDestDir() })
}
build.gradle
tasks.register('anotherCopyTask', Copy) {
// Copy everything under src/main/webapp
from 'src/main/webapp'
// Copy a single file
from 'src/staging/index.html'
// Copy the output of a task
from copyTask
// Copy the output of a task using Task outputs explicitly.
from copyTaskWithPatterns.outputs
// Copy the contents of a Zip file
from zipTree('src/main/assets.zip')
// Determine the destination directory later
into { getDestDir() }
}
Note that the lazy configuration of into() is different from a child specification, even though the
syntax is similar. Keep an eye on the number of arguments to distinguish between them.
Occasionally, you want to copy files or directories as part of a task. For example, a custom archiving
task based on an unsupported archive format might want to copy files to a temporary directory
before they are archived. You still want to take advantage of Gradle’s copy API without introducing
an extra Copy task.
build.gradle.kts
tasks.register("copyMethod") {
doLast {
copy {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
include("**/*.html")
include("**/*.jsp")
}
}
}
build.gradle
tasks.register('copyMethod') {
doLast {
copy {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
include '**/*.html'
include '**/*.jsp'
}
}
}
The above example demonstrates the basic syntax and also highlights two major limitations of
using the copy() method:
1. The copy() method is not incremental. The example’s copyMethod task will always execute
because it has no information about what files make up the task’s inputs. You have to define the
task inputs and outputs manually.
2. Using a task as a copy source, i.e., as an argument to from(), won’t create an automatic task
dependency between your task and that copy source. As such, if you use the copy() method as
part of a task action, you must explicitly declare all inputs and outputs to get the correct
behavior.
The following example shows how to work around these limitations using the dynamic API for task
inputs and outputs:
build.gradle.kts
tasks.register("copyMethodWithExplicitDependencies") {
// up-to-date check for inputs, plus add copyTask as dependency
inputs.files(copyTask)
.withPropertyName("inputs")
.withPathSensitivity(PathSensitivity.RELATIVE)
outputs.dir("some-dir") // up-to-date check for outputs
.withPropertyName("outputDir")
doLast {
copy {
// Copy the output of copyTask
from(copyTask)
into("some-dir")
}
}
}
build.gradle
tasks.register('copyMethodWithExplicitDependencies') {
// up-to-date check for inputs, plus add copyTask as dependency
inputs.files(copyTask)
.withPropertyName("inputs")
.withPathSensitivity(PathSensitivity.RELATIVE)
outputs.dir('some-dir') // up-to-date check for outputs
.withPropertyName("outputDir")
doLast {
copy {
// Copy the output of copyTask
from copyTask
into 'some-dir'
}
}
}
These limitations make it preferable to use the Copy task wherever possible because of its built-in
support for incremental building and task dependency inference. That is why the copy() method is
intended for use by custom tasks that need to copy files as part of their function. Custom tasks that
use the copy() method should declare the necessary inputs and outputs relevant to the copy action.
Renaming files
Renaming files in Gradle can be done using the CopySpec API, which provides methods for renaming
files as they are copied.
Using Copy.rename()
If the files used and generated by your builds sometimes don’t have names that suit, you can
rename those files as you copy them. Gradle allows you to do this as part of a copy specification
using the rename() configuration.
The following example removes the "-staging" marker from the names of any files that have it:
build.gradle.kts
tasks.register<Copy>("copyFromStaging") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
rename("(.+)-staging(.+)", "$1$2")
}
build.gradle
tasks.register('copyFromStaging', Copy) {
    from "src/main/webapp"
    into layout.buildDirectory.dir('explodedWar')
    rename '(.+)-staging(.+)', '$1$2'
}
As in the above example, you can use regular expressions for this or closures that use more
complex logic to determine the target filename. For example, the following task truncates
filenames:
build.gradle.kts
tasks.register<Copy>("copyWithTruncate") {
from(layout.buildDirectory.dir("reports"))
rename { filename: String ->
if (filename.length > 10) {
filename.slice(0..7) + "~" + filename.length
}
else filename
}
into(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('copyWithTruncate', Copy) {
from layout.buildDirectory.dir("reports")
rename { String filename ->
if (filename.size() > 10) {
return filename[0..7] + "~" + filename.size()
}
else return filename
}
into layout.buildDirectory.dir("toArchive")
}
As with filtering, you can also rename a subset of files by configuring it as part of a child
specification on a from().
Using CopySpec.rename{}
The example of how to rename files on copy gives you most of the information you need to perform this operation. It demonstrates the two options for renaming:
1. Using a regular expression
2. Using a closure
Regular expressions are a flexible approach to renaming, particularly as Gradle supports regex
groups that allow you to remove and replace parts of the source filename. The following example
shows how you can remove the string "-staging" from any filename that contains it using a simple
regular expression:
build.gradle.kts
tasks.register<Copy>("rename") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
// Use a regular expression to map the file name
rename("(.+)-staging(.+)", "$1$2")
rename("(.+)-staging(.+)".toRegex().pattern, "$1$2")
// Use a closure to convert all file names to upper case
rename { fileName: String ->
fileName.toUpperCase()
}
}
build.gradle
tasks.register('rename', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
// Use a regular expression to map the file name
rename '(.+)-staging(.+)', '$1$2'
rename(/(.+)-staging(.+)/, '$1$2')
// Use a closure to convert all file names to upper case
rename { String fileName ->
fileName.toUpperCase()
}
}
You can use any regular expression supported by the Java Pattern class. The substitution string (the second argument of rename()) works on the same principles as the Matcher.appendReplacement() method.
There are two common issues to watch for when using rename() in Groovy build scripts:
1. If you use a slashy string (those delimited by '/') for the first argument, you must include the parentheses for rename() as shown in the above example.
2. It’s safest to use single quotes for the second argument, otherwise you need to escape the '$' in group substitutions, i.e. "\$1\$2".
The first is a minor inconvenience, but slashy strings have the advantage that you don’t have to
escape backslash ('\') characters in the regular expression. The second issue stems from Groovy’s
support for embedded expressions using ${ } syntax in double-quoted and slashy strings.
The closure syntax for rename() is straightforward and can be used for any requirements that
simple regular expressions can’t handle. You’re given a file’s name, and you return a new name for
that file or null if you don’t want to change the name. Be aware that the closure will be executed for
every file copied, so try to avoid expensive operations where possible.
Filtering files
Filtering files in Gradle involves selectively including or excluding files based on certain criteria.
You can apply filtering in any copy specification through the CopySpec.include(java.lang.String…)
and CopySpec.exclude(java.lang.String…) methods.
These methods are typically used with Ant-style include or exclude patterns, as described in
PatternFilterable.
You can also perform more complex logic by using a closure that takes a FileTreeElement and
returns true if the file should be included or false otherwise. The following example demonstrates
both forms, ensuring that only .html and .jsp files are copied, except for those .html files with the
word "DRAFT" in their content:
build.gradle.kts
tasks.register<Copy>("copyTaskWithPatterns") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
include("**/*.html")
include("**/*.jsp")
exclude { details: FileTreeElement ->
details.file.name.endsWith(".html") &&
details.file.readText().contains("DRAFT")
}
}
build.gradle
tasks.register('copyTaskWithPatterns', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
include '**/*.html'
include '**/*.jsp'
exclude { FileTreeElement details ->
details.file.name.endsWith('.html') &&
details.file.text.contains('DRAFT')
}
}
A question you may ask yourself at this point is: what happens when inclusion and exclusion patterns overlap? Which pattern wins? Here are the basic rules:
• If at least one inclusion is specified, only files and directories matching the patterns are
included
• Any exclusion pattern overrides any inclusions, so if a file or directory matches at least one
exclusion pattern, it won’t be included, regardless of the inclusion patterns
Bear these rules in mind when creating combined inclusion and exclusion specifications so that
you end up with the exact behavior you want.
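A small hypothetical spec illustrating these rules: a file must match an include and no exclude in order to be copied:

```kotlin
tasks.register<Copy>("copyDocs") {
    from("src/docs")                 // hypothetical source directory
    include("**/*.txt")              // only .txt files are candidates
    exclude("**/draft-*")            // draft-* files are dropped, even though they match the include
    into(layout.buildDirectory.dir("docs"))
}
```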
Note that the inclusions and exclusions in the above example will apply to all from() configurations.
If you want to apply filtering to a subset of the copied files, you’ll need to use child specifications.
Filtering file content in Gradle involves replacing placeholders or tokens in files with dynamic
values.
Using CopySpec.filter()
Transforming the content of files while they are being copied involves basic templating that uses
token substitution, removal of lines of text, or even more complex filtering using a full-blown
template engine.
The following example demonstrates several forms of filtering, including token substitution using
the CopySpec.expand(java.util.Map) method and another using CopySpec.filter(java.lang.Class) with
an Ant filter:
build.gradle.kts
import org.apache.tools.ant.filters.FixCrLfFilter
import org.apache.tools.ant.filters.ReplaceTokens
tasks.register<Copy>("filter") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
// Substitute property tokens in files
expand("copyright" to "2009", "version" to "2.3.1")
// Use some of the filters provided by Ant
filter(FixCrLfFilter::class)
filter(ReplaceTokens::class, "tokens" to mapOf("copyright" to "2009",
"version" to "2.3.1"))
// Use a closure to filter each line
filter { line: String ->
"[$line]"
}
// Use a closure to remove lines
filter { line: String ->
if (line.startsWith('-')) null else line
}
filteringCharset = "UTF-8"
}
build.gradle
import org.apache.tools.ant.filters.FixCrLfFilter
import org.apache.tools.ant.filters.ReplaceTokens
tasks.register('filter', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
// Substitute property tokens in files
expand(copyright: '2009', version: '2.3.1')
// Use some of the filters provided by Ant
filter(FixCrLfFilter)
filter(ReplaceTokens, tokens: [copyright: '2009', version: '2.3.1'])
// Use a closure to filter each line
filter { String line ->
"[$line]"
}
// Use a closure to remove lines
filter { String line ->
line.startsWith('-') ? null : line
}
filteringCharset = 'UTF-8'
}
The filter() method has two variants, which behave differently:
• one takes a FilterReader and is designed to work with Ant filters, such as ReplaceTokens
• one takes a closure or Transformer that defines the transformation for each line of the source file
Note that both variants assume the source files are text-based. When you use the ReplaceTokens
class with filter(), you create a template engine that replaces tokens of the form @tokenName@ (the
Ant-style token) with values you define.
Using CopySpec.expand()
The expand() method treats the source files as Groovy templates, which evaluate and expand expressions of the form ${expression}.
You can pass in property names and values that are then expanded in the source files. expand()
allows for more than basic token substitution as the embedded expressions are full-blown Groovy
expressions.
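A sketch of expand() with a couple of hypothetical properties and source directory:

```kotlin
tasks.register<Copy>("expandTemplates") {
    from("src/templates")            // hypothetical directory of Groovy templates
    into(layout.buildDirectory.dir("generated"))
    // In the source files, expressions such as ${version} are evaluated
    // as Groovy expressions and replaced with these values
    expand("version" to "1.0.0", "author" to "example")
    filteringCharset = "UTF-8"
}
```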
NOTE: Specifying the character set when reading and writing the file is good practice. Otherwise, the transformations won’t work properly for non-ASCII text. You configure the character set with the CopySpec.setFilteringCharset(String) property. If it’s not specified, the JVM default character set is used, which will likely differ from the one you want.
Setting file permissions in Gradle involves specifying the permissions for files or directories created
or modified during the build process.
Using CopySpec.filePermissions{}
For any CopySpec involved in copying files, whether it is the Copy task itself or any child
specification, you can explicitly set the permissions the destination files will have via the
CopySpec.filePermissions {} configuration block.
Using CopySpec.dirPermissions{}
You can do the same for directories, independently of files, via the CopySpec.dirPermissions {}
configuration block.
NOTE
Not setting permissions explicitly will preserve the permissions of the original files
or directories.
build.gradle.kts
tasks.register<Copy>("permissions") {
from("src/main/webapp")
into(layout.buildDirectory.dir("explodedWar"))
filePermissions {
user {
read = true
execute = true
}
other.execute = false
}
dirPermissions {
unix("r-xr-x---")
}
}
build.gradle
tasks.register('permissions', Copy) {
from 'src/main/webapp'
into layout.buildDirectory.dir('explodedWar')
filePermissions {
user {
read = true
execute = true
}
other.execute = false
}
dirPermissions {
unix('r-xr-x---')
}
}
Using empty configuration blocks for file or directory permissions still sets them explicitly, just to
fixed default values. Everything inside one of these configuration blocks is relative to the default
values. Default permissions differ for files and directories:
• file: read & write for owner, read for group, read for other (0644, rw-r--r--)
• directory: read, write & execute for owner, read & execute for group, read & execute for other
(0755, rwxr-xr-x)
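To illustrate, here is a sketch of a Copy task that applies exactly those defaults by declaring empty permission blocks (the paths match the earlier examples):

```kotlin
tasks.register<Copy>("defaultPermissions") {
    from("src/main/webapp")
    into(layout.buildDirectory.dir("explodedWar"))
    // Empty blocks still set permissions explicitly, to the fixed defaults:
    // 0644 (rw-r--r--) for files, 0755 (rwxr-xr-x) for directories
    filePermissions {}
    dirPermissions {}
}
```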
Moving files and directories in Gradle is a straightforward process that can be accomplished using
several APIs. When implementing file-moving logic in your build scripts, it’s important to consider
file paths, conflicts, and task dependencies.
Using File.renameTo()
File.renameTo() is a method in Java (and by extension, in Gradle’s Groovy DSL) used to rename or
move a file or directory. When you call renameTo() on a File object, you provide another File object
representing the new name or location. If the operation is successful, renameTo() returns true;
otherwise, it returns false.
It’s important to note that renameTo() has some limitations and platform-specific behavior.
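Because renameTo() merely returns false on failure, java.nio.file.Files.move() can be a more robust alternative; here is a sketch (the file paths are assumptions for this example):

```kotlin
import java.nio.file.Files

tasks.register("moveReport") {
    // Paths are illustrative
    val source = file("build/report.txt")
    val target = file("build/archive/report.txt")
    doLast {
        Files.createDirectories(target.toPath().parent)
        // Unlike renameTo(), Files.move() throws a descriptive
        // exception if the move cannot be performed
        Files.move(source.toPath(), target.toPath())
    }
}
```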
In this example, the moveFile task uses File.renameTo() inside a doLast action to move the file from
the source directory to the destination directory:
tasks.register('moveFile') {
    doLast {
        def sourceFile = file('source.txt')
        def destFile = file('destination/new_name.txt')
        // renameTo() returns true on success, false otherwise
        if (sourceFile.renameTo(destFile)) {
            println "File moved successfully."
        }
    }
}
In this example, the moveFile task moves source.txt into the destination directory, renaming it to
new_name.txt in the process.
Deleting files and directories in Gradle involves removing them from the file system.
You can easily delete files and directories using the Delete task. You must specify which files and
directories to delete in a way supported by the Project.files(java.lang.Object…) method.
For example, the following task deletes the entire contents of a build’s output directory:
build.gradle.kts
tasks.register<Delete>("myClean") {
    delete(layout.buildDirectory)
}
build.gradle
tasks.register('myClean', Delete) {
    delete layout.buildDirectory
}
If you want more control over which files are deleted, you can’t use inclusions and exclusions the
same way you use them for copying files. Instead, you use the built-in filtering mechanisms of
FileCollection and FileTree. The following example does just that to clear out temporary files from
a source directory:
build.gradle.kts
tasks.register<Delete>("cleanTempFiles") {
delete(fileTree("src").matching {
include("**/*.tmp")
})
}
build.gradle
tasks.register('cleanTempFiles', Delete) {
delete fileTree("src").matching {
include "**/*.tmp"
}
}
Using Project.delete()
This method takes one or more arguments representing the files or directories to be deleted.
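For instance, here is a sketch of calling delete() directly from a task action (the directory name is an assumption; note that this form reaches back to the project at execution time, so it is not configuration-cache friendly):

```kotlin
tasks.register("wipeGenerated") {
    doLast {
        // Project.delete() accepts files, directories, and file collections
        delete(layout.buildDirectory.dir("generated"))
    }
}
```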
Creating archives
From the perspective of Gradle, packing files into an archive is effectively a copy in which the
destination is the archive file rather than a directory on the file system. Creating archives looks a
lot like copying, with all the same features.
The simplest case involves archiving the entire contents of a directory, which this example
demonstrates by creating a ZIP of the toArchive directory:
build.gradle.kts
tasks.register<Zip>("packageDistribution") {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir("dist")
from(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('packageDistribution', Zip) {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir('dist')
from layout.buildDirectory.dir("toArchive")
}
Notice how we specify the destination and name of the archive instead of an into(): both are
required. You often won’t see them explicitly set because most projects apply the Base Plugin. It
provides some conventional values for those properties.
The following example demonstrates this; you can learn more about the conventions in the archive
naming section.
Each type of archive has its own task type, the most common ones being Zip, Tar and Jar. They all
share most of the configuration options of Copy, including filtering and renaming.
One of the most common scenarios involves copying files into specified archive subdirectories. For
example, let’s say you want to package all PDFs into a docs directory in the archive’s root. This docs
directory doesn’t exist in the source location, so you must create it as part of the archive. You do
this by adding an into() declaration for just the PDFs:
build.gradle.kts
plugins {
base
}
version = "1.0.0"
tasks.register<Zip>("packageDistribution") {
from(layout.buildDirectory.dir("toArchive")) {
exclude("**/*.pdf")
}
from(layout.buildDirectory.dir("toArchive")) {
include("**/*.pdf")
into("docs")
}
}
build.gradle
plugins {
id 'base'
}
version = "1.0.0"
tasks.register('packageDistribution', Zip) {
from(layout.buildDirectory.dir("toArchive")) {
exclude "**/*.pdf"
}
from(layout.buildDirectory.dir("toArchive")) {
include "**/*.pdf"
into "docs"
}
}
As you can see, you can have multiple from() declarations in a copy specification, each with its own
configuration. See Using child copy specifications for more information on this feature.
Archives are essentially self-contained file systems, and Gradle treats them as such. This is why
working with archives is similar to working with files and directories.
Out of the box, Gradle supports the creation of ZIP and TAR archives and, by extension, Java’s JAR,
WAR, and EAR formats—Java’s archive formats are all ZIPs. Each of these formats has a
corresponding task type to create them: Zip, Tar, Jar, War, and Ear. These all work the same way
and are based on copy specifications, just like the Copy task.
Creating an archive file is essentially a file copy in which the destination is implicit, i.e., the archive
file itself. Here is a basic example that specifies the path and name of the target archive file:
build.gradle.kts
tasks.register<Zip>("packageDistribution") {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir("dist")
from(layout.buildDirectory.dir("toArchive"))
}
build.gradle
tasks.register('packageDistribution', Zip) {
archiveFileName = "my-distribution.zip"
destinationDirectory = layout.buildDirectory.dir('dist')
from layout.buildDirectory.dir("toArchive")
}
The full power of copy specifications is available to you when creating archives, which means you
can do content filtering, file renaming, or anything else covered in the previous section. A common
requirement is copying files into subdirectories of the archive that don’t exist in the source folders,
something that can be achieved with into() child specifications.
Gradle allows you to create as many archive tasks as you want, but it’s worth considering that
many convention-based plugins provide their own. For example, the Java plugin adds a jar task for
packaging a project’s compiled classes and resources in a JAR. Many of these plugins provide
sensible conventions for the names of archives and the copy specifications used. We recommend
you use these tasks wherever you can rather than overriding them with your own.
Naming archives
Gradle has several conventions around the naming of archives and where they are created based
on the plugins your project uses. The main convention is provided by the Base Plugin, which
defaults to creating archives in the layout.buildDirectory.dir("distributions") directory and
typically uses archive names of the form [projectName]-[version].[type].
The following example comes from a project named archive-naming, hence the myZip task creates an
archive named archive-naming-1.0.zip:
build.gradle.kts
plugins {
base
}
version = "1.0"
tasks.register<Zip>("myZip") {
from("somedir")
val projectDir = layout.projectDirectory.asFile
doLast {
println(archiveFileName.get())
println(destinationDirectory.get().asFile.relativeTo(projectDir))
println(archiveFile.get().asFile.relativeTo(projectDir))
}
}
build.gradle
plugins {
id 'base'
}
version = 1.0
tasks.register('myZip', Zip) {
from 'somedir'
File projectDir = layout.projectDirectory.asFile
doLast {
println archiveFileName.get()
println projectDir.relativePath(destinationDirectory.get().asFile)
println projectDir.relativePath(archiveFile.get().asFile)
}
}
$ gradle -q myZip
archive-naming-1.0.zip
build/distributions
build/distributions/archive-naming-1.0.zip
Note that the archive name does not derive from the name of the task that creates it.
If you want to change the name and location of a generated archive file, you can provide values for
the corresponding task’s archiveFileName and destinationDirectory properties. These override any
conventions that would otherwise apply.
Alternatively, you can make use of the default archive name pattern provided by
AbstractArchiveTask.getArchiveFileName(): [archiveBaseName]-[archiveAppendix]-[archiveVersion]-
[archiveClassifier].[archiveExtension]. You can set each of these properties on the task separately.
Note that the Base Plugin uses the convention of the project name for archiveBaseName, project
version for archiveVersion, and the archive type for archiveExtension. It does not provide values for
the other properties.
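For example, a sketch that sets every part of the pattern explicitly (the values are illustrative):

```kotlin
tasks.register<Zip>("docsZip") {
    from("docs")
    // [archiveBaseName]-[archiveAppendix]-[archiveVersion]-[archiveClassifier].[archiveExtension]
    archiveBaseName = "myapp"
    archiveAppendix = "docs"
    archiveVersion = "1.0"
    archiveClassifier = "html"
    archiveExtension = "zip"
    // Produces myapp-docs-1.0-html.zip
}
```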
This example — from the same project as the one above — configures just the archiveBaseName
property, overriding the default value of the project name:
build.gradle.kts
tasks.register<Zip>("myCustomZip") {
archiveBaseName = "customName"
from("somedir")
doLast {
println(archiveFileName.get())
}
}
build.gradle
tasks.register('myCustomZip', Zip) {
archiveBaseName = 'customName'
from 'somedir'
doLast {
println archiveFileName.get()
}
}
$ gradle -q myCustomZip
customName-1.0.zip
You can also override the default archiveBaseName value for all the archive tasks in your build by
configuring archivesName on the base extension, as demonstrated by the following example:
build.gradle.kts
plugins {
base
}
version = "1.0"
base {
archivesName = "gradle"
distsDirectory = layout.buildDirectory.dir("custom-dist")
libsDirectory = layout.buildDirectory.dir("custom-libs")
}
tasks.register("echoNames") {
val projectNameString = project.name
val archiveFileName = myZip.flatMap { it.archiveFileName }
val myOtherArchiveFileName = myOtherZip.flatMap { it.archiveFileName }
doLast {
println("Project name: $projectNameString")
println(archiveFileName.get())
println(myOtherArchiveFileName.get())
}
}
build.gradle
plugins {
id 'base'
}
version = 1.0
base {
archivesName = "gradle"
distsDirectory = layout.buildDirectory.dir('custom-dist')
libsDirectory = layout.buildDirectory.dir('custom-libs')
}
tasks.register('echoNames') {
def projectNameString = project.name
def archiveFileName = myZip.flatMap { it.archiveFileName }
def myOtherArchiveFileName = myOtherZip.flatMap { it.archiveFileName }
doLast {
println "Project name: $projectNameString"
println archiveFileName.get()
println myOtherArchiveFileName.get()
}
}
$ gradle -q echoNames
Project name: archives-changed-base-name
gradle-1.0.zip
gradle-wrapper-1.0-src.zip
You can find all the possible archive task properties in the API documentation for
AbstractArchiveTask. The main ones are archiveFileName, destinationDirectory, archiveFile,
archiveBaseName, archiveAppendix, archiveVersion, archiveClassifier, and archiveExtension.
As described in the CopySpec section above, you can use the Project.copySpec(org.gradle.api.Action)
method to share content between archives.
An archive is a directory and file hierarchy packed into a single file. In other words, it’s a special
case of a file tree, and that’s exactly how Gradle treats archives.
Instead of using the fileTree() method, which only works on normal file systems, you use the
Project.zipTree(java.lang.Object) and Project.tarTree(java.lang.Object) methods to wrap archive
files of the corresponding type (note that JAR, WAR and EAR files are ZIPs). Both methods return
FileTree instances that you can then use in the same way as normal file trees. For example, you can
extract some or all of the files of an archive by copying its contents to some directory on the file
system. Or you can merge one archive into another.
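For example, here is a sketch that merges two archives into a third (the archive names are assumptions):

```kotlin
tasks.register<Zip>("mergedZip") {
    archiveFileName = "merged.zip"
    destinationDirectory = layout.buildDirectory.dir("dist")
    // Each zipTree() exposes an archive's contents as a file tree
    from(zipTree("first.zip"))
    from(zipTree("second.zip"))
}
```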
build.gradle.kts
// tar tree attempts to guess the compression based on the file extension
// however if you must specify the compression explicitly you can:
val someTar: FileTree = tarTree(resources.gzip("someTar.ext"))
build.gradle
//tar tree attempts to guess the compression based on the file extension
//however if you must specify the compression explicitly you can:
FileTree someTar = tarTree(resources.gzip('someTar.ext'))
You can see a practical example of extracting an archive file in the unpacking archives section
below.
Sometimes it’s desirable to recreate archives exactly the same, byte for byte, on different machines.
You want to be sure that building an artifact from source code produces the same result no matter
when and where it is built. This is necessary for projects like reproducible-builds.org.
Reproducing the same byte-for-byte archive poses some challenges since the order of the files in an
archive is influenced by the underlying file system. Each time a ZIP, TAR, JAR, WAR or EAR is built
from source, the order of the files inside the archive may change. Files that differ only in their
timestamps also cause differences in archives from build to build.
All AbstractArchiveTask (e.g. Jar, Zip) tasks shipped with Gradle include support for producing
reproducible archives.
For example, to make a Zip task reproducible you need to set Zip.isReproducibleFileOrder() to true
and Zip.isPreserveFileTimestamps() to false. In order to make all archive tasks in your build
reproducible, consider adding the following configuration to your build file:
build.gradle.kts
tasks.withType<AbstractArchiveTask>().configureEach {
isPreserveFileTimestamps = false
isReproducibleFileOrder = true
}
build.gradle
tasks.withType(AbstractArchiveTask).configureEach {
preserveFileTimestamps = false
reproducibleFileOrder = true
}
Often you will want to publish an archive, so that it is usable from another project. This process is
described in Cross-Project publications.
Unpacking archives
Archives are effectively self-contained file systems, so unpacking them is a case of copying the files
from that file system onto the local file system — or even into another archive. Gradle enables this
by providing some wrapper functions that make archives available as hierarchical collections of
files (file trees).
That file tree can then be used in a from() specification, like so:
build.gradle.kts
tasks.register<Copy>("unpackFiles") {
from(zipTree("src/resources/thirdPartyResources.zip"))
into(layout.buildDirectory.dir("resources"))
}
build.gradle
tasks.register('unpackFiles', Copy) {
from zipTree("src/resources/thirdPartyResources.zip")
into layout.buildDirectory.dir("resources")
}
As with a normal copy, you can control which files are unpacked via filters and even rename files
as they are unpacked.
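For instance, a sketch that unpacks only Markdown files and renames them on the way out (the archive path matches the examples above; the include pattern and rename rule are illustrative):

```kotlin
tasks.register<Copy>("unpackDocs") {
    from(zipTree("src/resources/thirdPartyResources.zip")) {
        include("**/*.md")
        // Rename each file as it is unpacked
        rename { name -> name.removeSuffix(".md") + ".txt" }
    }
    into(layout.buildDirectory.dir("docs"))
}
```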
More advanced processing can be handled by the eachFile() method. For example, you might need
to extract different subtrees of the archive into different paths within the destination directory. The
following sample uses the method to extract the files within the archive’s libs directory into the
root destination directory, rather than into a libs subdirectory:
build.gradle.kts
tasks.register<Copy>("unpackLibsDirectory") {
from(zipTree("src/resources/thirdPartyResources.zip")) {
include("libs/**") ①
eachFile {
relativePath = RelativePath(true,
*relativePath.segments.drop(1).toTypedArray()) ②
}
includeEmptyDirs = false ③
}
into(layout.buildDirectory.dir("resources"))
}
build.gradle
tasks.register('unpackLibsDirectory', Copy) {
from(zipTree("src/resources/thirdPartyResources.zip")) {
include "libs/**" ①
eachFile { fcd ->
fcd.relativePath = new RelativePath(true, fcd.relativePath
.segments.drop(1)) ②
}
includeEmptyDirs = false ③
}
into layout.buildDirectory.dir("resources")
}
① Extracts only the subset of files that reside in the libs directory
② Remaps the path of the extracting files into the destination directory by dropping the libs
segment from the file path
③ Ignores the empty directories resulting from the remapping, see Caution note below
CAUTION
You cannot change the destination path of empty directories with this
technique. You can learn more in this issue.
If you’re a Java developer wondering why there is no jarTree() method, that’s because zipTree()
works perfectly well for JARs, WARs, and EARs.
In Java, applications and their dependencies were typically packaged as separate JARs within a
single distribution archive. That still happens, but another approach that is now common is placing
the classes and resources of the dependencies directly into the application JAR, creating what is
known as an Uber or fat JAR.
Creating "uber" or "fat" JARs in Gradle involves packaging all dependencies into a single JAR file,
making it easier to distribute and run the application.
Using the Shadow Plugin
Gradle does not have full built-in support for creating uber JARs, but you can use third-party
plugins like the Shadow plugin (com.github.johnrengelman.shadow) to achieve this. This plugin
packages your project classes and dependencies into a single JAR file.
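A minimal sketch of applying the plugin (the version shown is an assumption; check the plugin portal for the current release):

```kotlin
plugins {
    java
    // Version is an assumption for this example
    id("com.github.johnrengelman.shadow") version "8.1.1"
}

// The plugin adds a 'shadowJar' task that bundles the runtime classpath
// into a single JAR, by default classified with '-all'
```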
To copy the contents of other JAR files into the application JAR, use the
Project.zipTree(java.lang.Object) method and the Jar task. This is demonstrated by the uberJar task
in the following example:
build.gradle.kts
plugins {
java
}
version = "1.0.0"
repositories {
mavenCentral()
}
dependencies {
implementation("commons-io:commons-io:2.6")
}
tasks.register<Jar>("uberJar") {
archiveClassifier = "uber"
from(sourceSets.main.get().output)
dependsOn(configurations.runtimeClasspath)
from({
configurations.runtimeClasspath.get().filter {
it.name.endsWith("jar") }.map { zipTree(it) }
})
}
build.gradle
plugins {
id 'java'
}
version = '1.0.0'
repositories {
mavenCentral()
}
dependencies {
implementation 'commons-io:commons-io:2.6'
}
tasks.register('uberJar', Jar) {
archiveClassifier = 'uber'
from sourceSets.main.output
dependsOn configurations.runtimeClasspath
from {
configurations.runtimeClasspath.findAll { it.name.endsWith('jar') }
.collect { zipTree(it) }
}
}
Creating directories
Many tasks need to create directories to store the files they generate, which is why Gradle
automatically manages this aspect of tasks when they explicitly define file and directory outputs.
All core Gradle tasks ensure that any output directories they need are created, if necessary, using
this mechanism.
In cases where you need to create a directory manually, you can use the standard
Files.createDirectories or File.mkdirs methods from within your build scripts or custom task
implementations.
Here is a simple example that creates a single images directory in the project folder:
build.gradle.kts
import java.nio.file.Files

tasks.register("ensureDirectory") {
    // Store the target directory in a variable to avoid a project
    // reference in the configuration cache
    val directory = file("images")
    doLast {
        Files.createDirectories(directory.toPath())
    }
}
build.gradle
import java.nio.file.Files

tasks.register('ensureDirectory') {
    // Store the target directory in a variable to avoid a project
    // reference in the configuration cache
    def directory = file("images")
    doLast {
        Files.createDirectories(directory.toPath())
    }
}
As described in the Apache Ant manual, the mkdir task will automatically create all necessary
directories in the given path. It will do nothing if the directory already exists.
Using Project.mkdir
You can create directories in Gradle using the mkdir method, which is available in the Project
object. This method takes a File object or a String representing the path of the directory to be
created:
tasks.register('createDirs') {
    doLast {
        mkdir 'src/main/resources'
        mkdir file('build/generated')
    }
}
When you are building a standalone executable, you may want to install this file on your system, so
it ends up in your path.
You can use a Copy task to install the executable into shared directories like /usr/local/bin. The
installation directory probably contains many other executables, some of which may even be
unreadable by Gradle. To support unreadable files in the Copy task’s destination directory and to
avoid time-consuming up-to-date checks, you can use Task.doNotTrackState():
build.gradle.kts
tasks.register<Copy>("installExecutable") {
from("build/my-binary")
into("/usr/local/bin")
doNotTrackState("Installation directory contains unrelated files")
}
build.gradle
tasks.register("installExecutable", Copy) {
from "build/my-binary"
into "/usr/local/bin"
doNotTrackState("Installation directory contains unrelated files")
}
Deploying a single file to an application server typically refers to the process of transferring a
packaged application artifact, such as a WAR file, to the application server’s deployment directory.
When working with application servers, you can use a Copy task to deploy the application archive
(e.g. a WAR file). Since you are deploying a single file, the destination directory of the Copy is the
whole deployment directory. The deployment directory sometimes contains unreadable files
like named pipes, so Gradle may have problems doing up-to-date checks. To support this
use case, you can use Task.doNotTrackState():
build.gradle.kts
plugins {
war
}
tasks.register<Copy>("deployToTomcat") {
from(tasks.war)
into(layout.projectDirectory.dir("tomcat/webapps"))
doNotTrackState("Deployment directory contains unreadable files")
}
build.gradle
plugins {
id 'war'
}
tasks.register("deployToTomcat", Copy) {
from war
into layout.projectDirectory.dir('tomcat/webapps')
doNotTrackState("Deployment directory contains unreadable files")
}
Logging
The log serves as the primary 'UI' of a build tool. If it becomes overly verbose, important warnings
and issues can be obscured. However, it is essential to have relevant information to determine if
something has gone wrong.
Gradle defines six log levels, detailed in Log levels. In addition to the standard log levels, Gradle
introduces two specific levels: QUIET and LIFECYCLE. LIFECYCLE is the default level used to report
build progress.
NOTE
The console’s rich components (build status and work-in-progress area) are
displayed regardless of the log level used.
You can choose different log levels from the command line switches shown in Log level command-
line options.
In Stacktrace command-line options you can find the command line switches which affect
stacktrace logging.
CAUTION The DEBUG log level can expose sensitive security information to the console.
-s or --stacktrace
Truncated stacktraces are printed. We recommend this over full stacktraces. Groovy full
stacktraces are extremely verbose due to the underlying dynamic invocation mechanisms. Yet
they usually do not contain relevant information about what has gone wrong in your code. This
option renders stacktraces for deprecation warnings.
-S or --full-stacktrace
The full stacktraces are printed out. This option renders stacktraces for deprecation warnings.
Running Gradle with the DEBUG log level can potentially expose sensitive information, such as
environment variables, to the console and build log.
It’s important to avoid using the DEBUG log level when running on public Continuous Integration (CI)
services. Build logs on these services are accessible to the public and can expose sensitive
information. Even on private CI services, logging sensitive credentials may pose a risk depending
on your organization’s threat model. It’s advisable to discuss this with your organization’s security
team.
Some CI providers attempt to redact sensitive credentials from logs, but this process is not foolproof
and typically only redacts exact matches of pre-configured secrets.
If you suspect that a Gradle Plugin may inadvertently expose sensitive information, please contact
[[email protected]](mailto:[email protected]) for assistance with disclosure.
A simple option for logging in your build file is to write messages to standard output. Gradle
redirects anything written to standard output to its logging system at the QUIET log level:
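For example, a plain println in the build script surfaces in Gradle’s output at the QUIET level (the message text is illustrative):

```kotlin
println("A message which is logged at the QUIET level")
```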
Gradle also provides a logger property to a build script, which is an instance of Logger. This
interface extends the SLF4J Logger interface and adds a few Gradle-specific methods. Below is an
example of how this is used in the build script:
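A sketch of the logger methods at each level (the message texts are illustrative):

```kotlin
logger.quiet("A QUIET message, shown even with --quiet")
logger.error("An ERROR message")
logger.warn("A WARNING message")
logger.lifecycle("A LIFECYCLE message, shown by default")
logger.info("An INFO message, shown with --info")
logger.debug("A DEBUG message, shown with --debug")
```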
Use the typical SLF4J pattern to replace a placeholder with an actual value in the log message.
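For example (the message text and value are illustrative):

```kotlin
// The '{}' placeholder is replaced by the argument, and the message
// is only formatted if it is actually logged at the current level
logger.info("Copying {} files", 42)
```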
You can also hook into Gradle’s logging system from within other classes used in the build (classes
from the buildSrc directory, for example) with an SLF4J logger. You can use this logger the same
way as you use the provided logger in the build script.
build.gradle.kts
import org.slf4j.LoggerFactory
val slf4jLogger = LoggerFactory.getLogger("some-logger")
slf4jLogger.info("An info log message logged using SLF4j")
build.gradle
import org.slf4j.LoggerFactory

def slf4jLogger = LoggerFactory.getLogger('some-logger')
slf4jLogger.info('An info log message logged using SLF4j')
Internally, Gradle uses Ant and Ivy. Both have their own logging system. Gradle redirects their
logging output into the Gradle logging system.
There is a 1:1 mapping from the Ant/Ivy log levels to the Gradle log levels, except the Ant/Ivy TRACE
log level, which is mapped to the Gradle DEBUG log level. This means the default Gradle log level will
not show any Ant/Ivy output unless it is an error or a warning.
Many tools out there still use the standard output for logging. By default, Gradle redirects standard
output to the QUIET log level and standard error to the ERROR level. This behavior is configurable.
The project object provides a LoggingManager, which allows you to change the log levels that
standard out or error are redirected to when your build script is evaluated.
build.gradle.kts
logging.captureStandardOutput(LogLevel.INFO)
println("A message which is logged at INFO level")
build.gradle
logging.captureStandardOutput LogLevel.INFO
println 'A message which is logged at INFO level'
To change the log level for standard out or error during task execution, use a LoggingManager.
build.gradle.kts
tasks.register("logInfo") {
logging.captureStandardOutput(LogLevel.INFO)
doFirst {
println("A task message which is logged at INFO level")
}
}
build.gradle
tasks.register('logInfo') {
logging.captureStandardOutput LogLevel.INFO
doFirst {
println 'A task message which is logged at INFO level'
}
}
WARNING
The configuration cache limits the ability to customize Gradle’s logging UI. The
custom logger can only implement supported listener interfaces. These
interfaces do not receive events when the configuration cache entry is reused
because the configuration phase is skipped.
You can replace much of Gradle’s logging UI with your own. You could do this if you want to
customize the UI somehow - to log more or less information or to change the formatting. Simply
replace the logging using the Gradle.useLogger(java.lang.Object) method. This is accessible from a
build script, an init script, or via the embedding API. Note that this completely disables Gradle’s
default output. Below is an example init script that changes how task execution and build
completion are logged:
customLogger.init.gradle.kts
useLogger(CustomEventLogger())
@Suppress("deprecation")
class CustomEventLogger() : BuildAdapter(), TaskExecutionListener {
customLogger.init.gradle
useLogger(new CustomEventLogger())
@SuppressWarnings("deprecation")
class CustomEventLo