User Guide
Version 8.12.1
Table of Contents
OVERVIEW
  Gradle User Manual
RELEASES
  Installing Gradle
  Compatibility Matrix
  The Feature Lifecycle
UPGRADING
  Upgrading your build from Gradle 8.x to the latest
RUNNING GRADLE BUILDS
CORE CONCEPTS
  Gradle Basics
  Gradle Wrapper Basics
  Command-Line Interface Basics
  Settings File Basics
  Build File Basics
  Dependency Management Basics
  Task Basics
  Plugin Basics
  Gradle Incremental Builds and Build Caching
  Build Scans
AUTHORING GRADLE BUILDS
CORE CONCEPTS
  Gradle Directories
  Multi-Project Build Basics
  Build Lifecycle
  Writing Settings Files
  Writing Build Scripts
  Using Tasks
  Writing Tasks
  Using Plugins
  Writing Plugins
GRADLE TYPES
  Understanding Properties and Providers
  Understanding Collections
  Understanding Services and Service Injection
STRUCTURING BUILDS
  Structuring Projects with Gradle
  Declaring Dependencies between Subprojects
  Sharing Build Logic between Subprojects
  Composite Builds
  Configuration On Demand
DEVELOPING TASKS
  Understanding Tasks
  Controlling Task Execution
  Organizing Tasks
  Configuring Tasks Lazily
  Developing Parallel Tasks
  Advanced Tasks
  Using Shared Build Services
DEVELOPING PLUGINS
  Understanding Plugins
  Understanding Implementation Options for Plugins
  Implementing Pre-compiled Script Plugins
  Implementing Binary Plugins
  Testing Gradle plugins
  Publishing Plugins to the Gradle Plugin Portal
OTHER TOPICS
  Working With Files
  Initialization Scripts
  Dataflow Actions
  Testing Build Logic with TestKit
  Using Ant from Gradle
OPTIMIZING BUILD PERFORMANCE
  Configuring the Build Environment
  Gradle-managed Directories
  Logging
  Improve the Performance of Gradle Builds
  Configuration cache
  Continuous Builds
  Inspecting Gradle Builds
  Isolated Projects
  File System Watching
THE BUILD CACHE
  Build Cache
  Use cases for the build cache
  Build cache performance
  Important concepts
  Caching Java projects
  Caching Android projects
  Debugging and diagnosing cache misses
  Solving common problems
DEPENDENCY MANAGEMENT
CORE CONCEPTS
  1. Declaring dependencies
  2. Dependency Configurations
  3. Declaring repositories
  4. Centralizing dependencies
  5. Dependency Constraints and Conflict Resolution
  6. Dependency Resolution
  7. Variant Aware Dependency Resolution
DECLARING DEPENDENCIES
  Declaring Dependencies Basics
  Viewing Dependencies
  Declaring Versions and Ranges
  Declaring Dependency Constraints
  Declaring Dependency Configurations
DECLARING REPOSITORIES
  Declaring Repositories Basics
  Centralizing Repository Declarations
  Repository Types
  Metadata Formats
  Supported Protocols
  Filtering Repository Content
CENTRALIZING DEPENDENCIES
  Platforms
  Version Catalogs
  Using Catalogs with Platforms
MANAGING DEPENDENCIES
  Locking Versions
  Using Resolution Rules
  Modifying Dependency Metadata
  Dependency Caching
UNDERSTANDING DEPENDENCY RESOLUTION
  Understanding the Dependency Resolution Model
  Capabilities
  Variants and Attributes
CONTROLLING DEPENDENCY RESOLUTION
  Dependency Resolution Basics
  Dependency Graph Resolution
  Artifact Resolution
  Artifact Transforms
PUBLISHING LIBRARIES
  Publishing a project as module
  Understanding Gradle Module Metadata
  Signing artifacts
  Customizing publishing
  The Maven Publish Plugin
  The Ivy Publish Plugin
OTHER TOPICS
  Verifying dependencies
  Aligning dependency versions
  Modeling library features
PLATFORMS
JVM BUILDS
  Building Java & JVM projects
  Testing in Java & JVM projects
  Managing Dependencies of JVM Projects
JAVA TOOLCHAINS
  Toolchains for JVM projects
  Toolchain Resolver Plugins
JVM PLUGINS
  The Java Library Plugin
  The Application Plugin
  The Java Platform Plugin
  The Groovy Plugin
  The Scala Plugin
INTEGRATION
  Gradle & Third-party Tools
REFERENCE
  Gradle Wrapper Reference
  Gradle Daemon
  Command-Line Interface Reference
GRADLE DSL/API
  A Groovy Build Script Primer
  Gradle Kotlin DSL Primer
CORE PLUGINS
  Gradle Plugin Reference
HOW TO GUIDES
  How to share outputs between projects
LICENSE INFORMATION
  License Information
OVERVIEW
Gradle User Manual
Gradle Build Tool
Why Gradle?
Gradle is a widely used and mature tool with an active community and a strong developer
ecosystem.
• Gradle is the most popular build system for the JVM and is the default system for Android and
Kotlin Multi-Platform projects. It has a rich community plugin ecosystem.
• Gradle can automate a wide range of software build scenarios using its built-in
functionality, third-party plugins, or custom build logic.
• Gradle provides a high-level, declarative, and expressive build language that makes it easy to
read and write build logic.
• Gradle is fast, scalable, and can build projects of any size and complexity.
• Gradle produces dependable results while benefiting from optimizations such as incremental
builds, build caching, and parallel execution.
Gradle, Inc. provides a free service called Build Scan® that offers extensive information and
insights about your builds. You can view scans to identify problems or share them for debugging
help.
Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.
Compatible IDEs
All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your
continuous integration (CI) server.
Releases
Information on Gradle releases and how to install Gradle is found on the Installation page.
User Manual
The Gradle User Manual is the official documentation for the Gradle Build Tool:
• Running Gradle Builds — Learn how to use Gradle with your project.
• Authoring Gradle Builds — Learn how to develop tasks and plugins to customize your build.
• Authoring JVM Builds — Learn how to use Gradle with your Java project.
• Optimizing Builds — Learn how to use caches and other tools to optimize your build.
Education
• Training Courses — Head over to the courses page to sign up for free Gradle training.
Support
• Forum — The fastest way to get help is through the Gradle Forum.
• Slack — Community members and core contributors answer questions directly on our Slack
Channel.
Licenses
Gradle Build Tool source code is open and licensed under the Apache License 2.0. The Gradle User
Manual and DSL reference manual are licensed under the Creative Commons
Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright
Copyright © 2024 Gradle, Inc. All rights reserved. Gradle is a trademark of Gradle, Inc.
For inquiries related to commercial use or licensing, contact Gradle Inc. directly.
RELEASES
Installing Gradle
Gradle Installation
If all you want to do is run an existing Gradle project, then you don’t need to install Gradle if the
build uses the Gradle Wrapper. This is identifiable by the presence of the gradlew or gradlew.bat
files in the root of the project:

. ①
├── gradle
│   └── wrapper ②
├── gradlew ③
├── gradlew.bat ③
└── ⋮

① Project root.
② Gradle Wrapper.
③ Gradle Wrapper scripts.

If the gradlew or gradlew.bat files are already present in your project, you do not need to install
Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.
You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version
for your project. Please use the Gradle Wrapper to upgrade Gradle.
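The check above can be scripted. A minimal sketch, assuming a POSIX shell; the has_wrapper helper is illustrative and not part of Gradle:

```shell
# Sketch: decide whether a project needs a Gradle installation at all.
# has_wrapper is an illustrative helper, not a Gradle-provided tool.
has_wrapper() {
  # A project ships the Wrapper if either script sits in its root directory.
  [ -f "$1/gradlew" ] || [ -f "$1/gradlew.bat" ]
}

if has_wrapper "."; then
  echo "Wrapper present: run ./gradlew <task>; no Gradle installation needed"
else
  echo "No Wrapper: install Gradle, or add the Wrapper from another machine"
fi
```

On projects that ship the Wrapper, always prefer ./gradlew over a locally installed gradle so every developer and CI machine builds with the same Gradle version.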
Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle
separately when only working within that IDE.
If you do not meet the criteria above and decide to install Gradle on your machine, first check
whether Gradle is already installed by running gradle -v in your terminal. If the command is not
found, Gradle is not installed, and you can follow the instructions below.
You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done
manually or using a package manager like SDKMAN! or Homebrew.
You can find all Gradle releases and their checksums on the releases page.
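A downloaded distribution can be verified against the checksum published on the releases page before it is unpacked. A sketch, assuming a POSIX shell with sha256sum available; verify_sha256 is an illustrative helper name, and the file name and digest shown in the usage comment are placeholders to be taken from the releases page:

```shell
# Sketch: compare a download against the SHA-256 digest published on the
# releases page. verify_sha256 is an illustrative helper, not a Gradle tool.
verify_sha256() {
  # $1: downloaded file, $2: expected hex digest from the releases page
  actual=$(sha256sum "$1" | awk '{print $1}')
  [ "$actual" = "$2" ]
}

# Usage (file name and digest are placeholders):
# verify_sha256 gradle-8.12.1-bin.zip "<digest from the releases page>" \
#   && echo "Checksum OK" || echo "Checksum MISMATCH - do not unpack"
```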
Prerequisites
Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or
higher to run. You can check the compatibility matrix for more information.
❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)
Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your
project. In this example, the $PATH points to JDK 17:
❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin
You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory.
This is especially useful when multiple JDKs are installed. On Windows:
❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80
On macOS or Linux:
❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk-<version>.jdk/Contents/Home
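The lookup order described above can be sketched as a small shell helper. This is illustrative only; resolve_java is not part of Gradle, and Gradle's actual resolution also honors project-level toolchain configuration:

```shell
# Sketch of the lookup order: prefer JAVA_HOME when it points at a real JDK,
# otherwise fall back to the first java found on the PATH.
# resolve_java is an illustrative helper, not how Gradle itself is implemented.
resolve_java() {
  if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/java" ]; then
    echo "$JAVA_HOME/bin/java"
  else
    command -v java
  fi
}
```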
Gradle supports Kotlin and Groovy as the main build languages. Gradle ships with its own Kotlin
and Groovy libraries, therefore they do not need to be installed. Existing installations are ignored
by Gradle.
See the full compatibility notes for Java, Groovy, Kotlin, and Android.
Linux installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:
❯ sdk install gradle
Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle
that is incompatible or incomplete when compared to the official version.
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
• Complete (all) with docs and sources
We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle gradle-8.12.1-bin.zip
❯ ls /opt/gradle/gradle-8.12.1
LICENSE NOTICE bin README init.d lib media
To complete the installation, the path to the unpacked files needs to be on your PATH. Configure
your PATH environment variable to include the bin directory of the unzipped distribution, e.g.:
❯ export PATH=$PATH:/opt/gradle/gradle-8.12.1/bin
Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.
export GRADLE_HOME=/opt/gradle/gradle-8.12.1
export PATH=${GRADLE_HOME}/bin:${PATH}
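The GRADLE_HOME approach can be wrapped in a small function so that switching versions becomes a one-line change. A sketch, assuming the /opt/gradle layout used above; use_gradle is an illustrative name, not a standard command:

```shell
# Sketch: keep one PATH entry and switch versions by repointing GRADLE_HOME.
# use_gradle and the /opt/gradle prefix are illustrative conventions.
use_gradle() {
  export GRADLE_HOME="/opt/gradle/gradle-$1"
  export PATH="$GRADLE_HOME/bin:$PATH"
}

use_gradle 8.12.1   # puts /opt/gradle/gradle-8.12.1/bin first on the PATH
```

After updating the PATH, run gradle -v in a new terminal to confirm that the expected version is picked up.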
macOS installation
▼ Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most
Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and
maintained by SDKMAN!:
❯ sdk install gradle
Using Homebrew:
❯ brew install gradle
Using MacPorts:
❯ sudo port install gradle
Other package managers are available, but the version of Gradle distributed by them is not
controlled by Gradle, Inc.
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
We recommend downloading the bin file; it is a smaller file that is quick to download (and the
latest documentation is available online).
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /usr/local/gradle
❯ unzip [Link] -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.12.1
LICENSE NOTICE README bin init.d lib
To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH
environment variable to include the bin directory of the unzipped distribution, e.g.:
❯ export PATH=$PATH:/usr/local/gradle/gradle-8.12.1/bin
Alternatively, you could also add the environment variable GRADLE_HOME and point this to the
unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add
$GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change
the GRADLE_HOME environment variable.
It’s a good idea to edit .bash_profile in your home directory to add the GRADLE_HOME variable:
export GRADLE_HOME=/usr/local/gradle/gradle-8.12.1
export PATH=$GRADLE_HOME/bin:$PATH
Windows installation
▼ Installing manually
Step 1 - Download the latest Gradle distribution
• Binary-only (bin)
Create a new directory C:\Gradle with File Explorer. Open a second File Explorer window and go
to the directory where the Gradle distribution was downloaded. Double-click the ZIP archive to
expose the content. Drag the content folder gradle-8.12.1 to your newly created C:\Gradle folder.
Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of
your choice.
To install Gradle, the path to the unpacked files needs to be in your Path.
In File Explorer right-click on the This PC (or Computer) icon, then click Properties → Advanced
System Settings → Environmental Variables.
Under System Variables select Path, then click Edit. Add an entry for C:\Gradle\gradle-8.12.1\bin.
Click OK to save.
Alternatively, you can add the environment variable GRADLE_HOME and point this to the unzipped
distribution. Instead of adding a specific version of Gradle to your Path, you can add
%GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the
GRADLE_HOME environment variable.
Open a console (or a Windows command prompt) and run gradle -v to display the version, e.g.:
❯ gradle -v
------------------------------------------------------------
Gradle 8.12.1
------------------------------------------------------------
You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available
from the releases page) and following these verification instructions.
Compatibility Matrix
The sections below describe Gradle’s compatibility with several integrations. Versions not listed
here may or may not work.
Java Runtime
Gradle runs on the Java Virtual Machine (JVM), which is often provided by either a JDK or JRE. A
JVM version between 8 and 23 is required to execute Gradle. JVM 24 and later versions are not yet
supported.
Executing the Gradle daemon with JVM 16 or earlier has been deprecated and will become an error
in Gradle 9.0. The Gradle wrapper, Gradle client, Tooling API client, and TestKit client will remain
compatible with JVM 8.
JDK 6 and 7 can be used for compilation. Testing with JVM 6 and 7 is deprecated and will not be
supported in Gradle 9.0.
Any fully supported version of Java can be used for compilation or testing. However, the latest Java
version may only be supported for compilation or testing, not for running Gradle. Support is
achieved using toolchains and applies to all tasks supporting toolchains.
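As a sketch (the version number is illustrative, not a recommendation), a toolchain can request a
Java version for compilation and testing that differs from the JVM running Gradle itself:

```kotlin
// build.gradle.kts — hypothetical sketch; requests Java 23 for compile/test
// tasks while Gradle runs on any supported JVM
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(23)
    }
}
```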
See the table below for the Java version supported by a specific Gradle release:

Java version  Support for toolchains  Support for running Gradle
8             N/A                     2.0
9             N/A                     4.3
10            N/A                     4.7
11            N/A                     5.0
12            N/A                     5.4
13            N/A                     6.0
14            N/A                     6.3
15            6.7                     6.7
16            7.0                     7.0
17            7.3                     7.3
18            7.5                     7.5
19            7.6                     7.6
20            8.1                     8.3
21            8.4                     8.5
22            8.7                     8.8
23            8.10                    8.10
24            N/A                     N/A
Kotlin
Gradle is tested with Kotlin 1.6.10 through 2.1.0-Beta2. Beta and RC versions may or may not work.
Groovy
Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy
DSL build scripts.
Android
Gradle is tested with Android Gradle Plugin 7.3 through 8.7. Alpha and beta versions may or may
not work.
The Feature Lifecycle
Continuous improvement combined with frequent delivery allows new features to be available to
users early. Early users provide invaluable feedback, which is incorporated into the development
process.
Getting new functionality into the hands of users regularly is a core value of the Gradle platform.
At the same time, API and feature stability are taken very seriously and considered a core value of
the Gradle platform. Design choices and automated testing are engineered into the development
process and formalized by the section on backward compatibility.
The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of
Gradle what the state of a feature is. The term feature typically means an API or DSL method or
property in this context, but it is not restricted to this definition. Command line arguments and
modes of execution (e.g. the Build Daemon) are two examples of other features.
Feature States
1. Internal
2. Incubating
3. Public
4. Deprecated
1. Internal
Internal features are not designed for public use and are only intended to be used by Gradle itself.
They can change in any way at any point in time without any notice. Therefore, we recommend
avoiding the use of such features. Internal features are not documented. If a feature appears in this
User Manual, the DSL Reference, or the API Reference, then it is not internal.
2. Incubating
Features are introduced in the incubating state to allow real-world feedback to be incorporated into
the feature before making it public. It also gives early access to users willing to test potential future
changes.
A feature in an incubating state may change in future Gradle versions until it is no longer
incubating. Changes to incubating features for a Gradle release will be highlighted in the release
notes for that release. The incubation period for new features varies depending on the feature’s
scope, complexity, and nature.
Features in incubation are indicated. In the source code, all methods/properties/classes that are
incubating are annotated with @Incubating. This results in a special mark for them in the DSL and
API references.
If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the
incubating state.
The feature preview API allows certain incubating features to be activated by adding
enableFeaturePreview('FEATURE') in your settings file. Individual preview features will be
announced in release notes.
When incubating features are either promoted to public or removed, the feature preview flags for
them become obsolete, have no effect, and should be removed from the settings file.
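For example, a preview feature is enabled in the settings file like this (FEATURE stands in for an
actual preview feature name announced in the release notes):

```kotlin
// settings.gradle.kts — FEATURE is a placeholder, not a real feature name
enableFeaturePreview("FEATURE")
```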
3. Public
The default state for a non-internal feature is public. Anything documented in the User Manual, DSL
Reference, or API reference that is not explicitly said to be incubating or deprecated is considered
public. Features are said to be promoted from an incubating state to public. The release notes for
each release indicate which previously incubating features are being promoted by the release.
A public feature will never be removed or intentionally changed without undergoing deprecation.
All public features are subject to the backward compatibility policy.
4. Deprecated
Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such
features will eventually be removed from Gradle after being deprecated. A deprecated feature may
become stale until it is finally removed according to the backward compatibility policy.
Deprecated features are indicated to be so. In the source code, all methods/properties/classes that
are deprecated are annotated with “@[Link]” which is reflected in the DSL and API
References. In most cases, there is a replacement for the deprecated element, which will be
described in the documentation. Using a deprecated feature will result in a runtime warning in
Gradle’s output.
The use of deprecated features should be avoided. The release notes for each release indicate any
features being deprecated by the release.
Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.). Once a public
feature is introduced in a Gradle release, it will remain indefinitely unless deprecated. Once
deprecated, it may be removed in the next major release. Deprecated features may be supported
across major releases, but this is not guaranteed.
A nightly build is created every day and contains all of the changes made through Gradle’s extensive
continuous integration tests during that day. Nightly builds may contain new changes that may or
may not be stable.
The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or
major release. When no problems are found after a short time (usually a week), the release
candidate is promoted to a general availability (GA) release. If a regression is found in the release
candidate, a new RC distribution is created, and the process repeats. Release candidates are
supported for as long as the release window is open, but they are not intended to be used for
production. Bug reports are greatly appreciated during the RC phase.
The Gradle team may create additional patch releases to replace the final release due to critical bug
fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.
Once a release candidate has been made, all feature development moves on to the next release for
the latest major version. As such, each minor Gradle release causes the previous minor releases in
the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or
feature backports.
For major versions, Gradle will backport critical fixes and security fixes to the last minor in the
previous major version. For example, when Gradle 7 was the latest major version, several releases
were made in the 6.x line, including Gradle 6.9 (and subsequent releases).
As such, each major Gradle release causes:
• The previous major version to become maintenance only. It will only receive critical bug fixes
and security fixes.
• The major version before the previous one to become end-of-life (EOL); that release line will
not receive any new fixes.
UPGRADING
Upgrading your build from Gradle 8.x to the latest
This chapter provides the information you need to migrate your Gradle 8.x builds to the latest
Gradle release. For migrating from Gradle 4.x, 5.x, 6.x, or 7.x, see the older migration guide first.
1. Try running gradle help --scan and view the deprecations view of the generated build scan.
This lets you see any deprecation warnings that apply to your build.
Alternatively, you can run gradle help --warning-mode=all to see the deprecations in the
console, though it may not report as much detailed information.
Some plugins will break with this new version of Gradle because they use internal APIs that
have been removed or changed. The previous step will help you identify potential problems by
issuing deprecation warnings when a plugin tries to use a deprecated part of the API.
4. Try to run the project and debug any errors using the Troubleshooting Guide.
The embedded Kotlin has been updated from 2.0.20 to Kotlin 2.0.21.
Upgrade to Ant 1.10.15
To determine the location of the macOS SDK for Swift, Gradle now passes the --sdk macosx
arguments to xcrun. This is necessary because, without this argument, the SDK could be discovered
inconsistently across different environments.
Eager task creation methods on the TaskContainer interface have been marked @Deprecated and will
generate compiler and IDE warnings when used in build scripts or plugin code. There is not yet a
Gradle deprecation warning emitted for their use.
However, if the build is configured to fail on warnings during Kotlin script or plugin code
compilation, this behavior may cause the build to fail.
A standard Gradle deprecation warning will be printed upon use when these methods are fully
deprecated in a future version.
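As a sketch (the task name is hypothetical), the lazy registration API replaces the eager creation
methods:

```kotlin
// Eager creation, now marked @Deprecated on TaskContainer:
tasks.create("printGreeting") {
    doLast { println("hello") }
}

// Lazy replacement (use instead of, not alongside, the eager call —
// a task name can only be registered once); configuration is deferred
// until the task is actually required:
tasks.register("printGreeting") {
    doLast { println("hello") }
}
```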
Deprecations
Previously, when at least two equal-length chains of artifact transforms were available that would
produce compatible variants that would each satisfy a resolution request, Gradle would arbitrarily,
and silently, pick one.
There are multiple distinct artifact transformation chains of the same length that
would satisfy this request. This behavior has been deprecated. This will fail with an
error in Gradle 9.0.
Found multiple transformation chains that produce a variant of 'root project :' with
requested attributes:
- color 'red'
- texture 'smooth'
Found the following transformation chains:
- From configuration ':squareBlueSmoothElements':
- With source attributes:
- artifactType 'txt'
- color 'blue'
- shape 'square'
- texture 'smooth'
- Candidate transformation chains:
- Transformation chain: 'ColorTransform':
- 'BrokenColorTransform':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
- Transformation chain: 'ColorTransform2':
- 'BrokenColorTransform2':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
Remove one or more registered transforms, or add additional attributes to them to
ensure only a single valid transformation chain exists.
In such a scenario, Gradle has no way to know which of the two (or more) possible transformation
chains should be used. Picking an arbitrary chain can lead to inefficient performance or
unexpected behavior changes when seemingly unrelated parts of the build are modified. This is
potentially a very complex situation and the message now fully explains the situation by printing
all the registered transforms in order, along with their source (input) variants for each candidate
chain.
1. Add additional, distinguishing attributes when registering transforms present in the chain, to
ensure that only a single chain will be selectable to satisfy the request
2. Request additional attributes to disambiguate which chain is selected (if they result in non-
identical final attributes)
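As a rough sketch of option 1, using the ColorTransform from the message above (the vendor
attribute and its values are hypothetical), an extra distinguishing attribute can be added when
registering the transform:

```kotlin
// Hypothetical attributes; ColorTransform is the transform from the example above
val color = Attribute.of("color", String::class.java)
val vendor = Attribute.of("vendor", String::class.java)

dependencies {
    registerTransform(ColorTransform::class) {
        from.attribute(color, "blue")
        to.attribute(color, "red")
        // The additional attribute ensures only this chain can satisfy the request
        to.attribute(vendor, "acme")
    }
}
```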
The init task must run by itself. This task should not be combined with other tasks in a single
Gradle invocation.
Running init in the same invocation as other tasks will become an error in Gradle 9.0.
Calling [Link]() from a task action at execution time is now deprecated and will be made
an error in Gradle 10.0. This method can still be used during configuration time.
The deprecation is only issued if the configuration cache is not enabled. When the configuration
cache is enabled, calls to [Link]() are reported as configuration cache problems instead.
This deprecation was originally introduced in Gradle 7.4 but was only issued when the
STABLE_CONFIGURATION_CACHE feature flag was enabled. That feature flag no longer controls this
deprecation. This is another step towards moving users away from idioms that are incompatible
with the configuration cache, which will become the only mode supported by Gradle in a future
release.
Currently, there are multiple ways to set a property with Groovy DSL syntax:
---
propertyName = value
setPropertyName(value)
setPropertyName value
propertyName(value)
propertyName value
---
The last form, "space-assignment", is a Gradle-specific feature that is not part of the Groovy
language. In regular Groovy, this is just a method call: propertyName(value), and Gradle generates
a propertyName method at runtime if such a method is not already present. This feature may be
a source of confusion (especially for new users) and adds an extra layer of complexity for users and
the Gradle codebase without providing any significant value. Sometimes, classes declare methods
with the same name, and these may even have semantics that are different from a plain
assignment.
These generated methods are now deprecated and will be removed in Gradle 10.0, and both
propertyName value and propertyName(value) will stop working unless the explicit method
propertyName is defined. Use explicit assignment propertyName = value instead.
For explicit methods, consider using the propertyName(value) syntax instead of propertyName value
for clarity. For example, jvmArgs "some", "arg" can be replaced with jvmArgs("some", "arg") or with
jvmArgs = ["some", "arg"] for Test tasks.
If you have a big project, to replace occurrences of space-assignment syntax you can use, for
example, the following sed command:
---
find . -name '[Link]' -type f -exec sed -[Link] -E 's/([^A-Za-z]|^)(replaceme)[
\t]*([^= \t{])/\1\2 = \3/g' {} +
---
You should replace replaceme with one or more property names you want to replace, separated by |,
e.g. (url|group).
[Link]
The method was deprecated because it was not intended for public use in build scripts.
[Link]
The embedded Kotlin has been updated from 1.9.24 to Kotlin 2.0.20. Also see the Kotlin 2.0.10 and
Kotlin 2.0.0 release notes.
The default kotlin-test version in JVM test suites has been upgraded to 2.0.20 as well.
Kotlin DSL scripts are still compiled with Kotlin language version set to 1.8 for backward
compatibility.
If you configured the task in a build script, you will need to replace:
jvmVersion = JavaVersion.VERSION_17
With:
jvmVersion = [Link](17)
Using the CLI options to configure which JVM version to use for the Gradle Daemon has no impact.
The name-matching logic has been updated to treat numbers as word boundaries for camelCase
names. Previously, a request like unique would match both uniqueA and unique1. Such a request will
now fail due to ambiguity. To avoid issues, use the exact name instead of a shortened version.
• Task selection
• Project selection
Deprecations
The javaHome property of the ForkOptions type has been deprecated and will be removed in Gradle
9.0.
Starting in Gradle 9.0, mutating configurations in a script’s buildscript block will result in an error.
This applies to project, settings, init, and standalone scripts.
The buildscript configurations block is only intended to control buildscript classpath resolution.
Consider the following script that creates a new buildscript configuration in a Settings script and
resolves it:
buildscript {
    configurations {
        create("myConfig")
    }
    dependencies {
        "myConfig"("org:foo:1.0")
    }
}
This pattern is sometimes used to resolve dependencies in Settings, where there is no other way to
obtain a Configuration. Resolving dependencies in this context is not recommended. Using a
detached configuration is a possible but discouraged alternative.
Starting in Gradle 9.0, selecting variants by name from non-Ivy external components will be
forbidden.
Selecting variants by name from local components will still be permitted; however, this pattern is
discouraged. Variant aware dependency resolution should be preferred over selecting variants by
name for local components.
The following dependencies will fail to resolve when targeting a non-Ivy external component:
dependencies {
    implementation(group: "[Link]", name: "example", version: "1.0", configuration: "conf")
    implementation("[Link]:example:1.0") {
        targetConfiguration = "conf"
    }
}
Starting in Gradle 9.0, manually adding configuration instances to a configuration container will
result in an error. Configurations should only be added to the container through the eager or lazy
factory methods. Detached configurations and copied configurations should not be added to the
container.
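A sketch of the supported factory methods (configuration names are hypothetical):

```kotlin
configurations {
    create("eagerConf")    // eager factory method
    register("lazyConf")   // lazy factory method
}
// Do NOT add instances to the container manually — this becomes an error in Gradle 9.0:
// configurations.add(someDetachedConfiguration)
```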
Deprecated ProjectDependency#getDependencyProject()
To discover details about all projects that were included in a resolution, inspect the full
ResolutionResult. Project dependencies are exposed in the DependencyResult. See the user guide
section on programmatic dependency resolution for more details on this API. This is the only
reliable way to find all projects that are used in a resolution. Inspecting only the declared
`ProjectDependency`s may miss transitive or substituted project dependencies.
To get the identity of the target project, use the new Isolated Projects safe project path method:
ProjectDependency#getPath().
// Old way:
val someProject = [Link]
// New way:
val someProject = [Link]([Link])
This approach will not fetch project instances from different builds.
These deprecated methods do not track task dependencies, unlike their replacements.
Deprecated AbstractOptions
The AbstractOptions class has been deprecated and will be removed in Gradle 9.0. All classes
extending AbstractOptions will no longer extend it.
As a result, the AbstractOptions#define(Map) method will no longer be present. This method exposes
a non-type-safe API and unnecessarily relies on reflection. It can be replaced by directly setting the
properties specified in the map.
Consider the following example of the deprecated behavior and its replacement:
[Link](JavaCompile) {
    // Deprecated behavior
    [Link](encoding: 'UTF-8')
    [Link](memoryMaximumSize: '1G')
    [Link](debugLevel: 'lines')

    // Can be replaced by
    [Link] = 'UTF-8'
    [Link] = true
    [Link] = '1G'
    [Link] = true
    [Link] = 'lines'
}
Deprecated Dependency#contentEquals(Dependency)
The method was originally intended to compare dependencies based on their actual target
component, regardless of whether they were of different dependency type. The existing method
does not behave as specified by its Javadoc, and we do not plan to introduce a replacement that
does.
These methods are scheduled for removal as part of the ongoing effort to make writing
configuration-cache-compatible code easier. There is no way to use these methods without breaking
configuration cache requirements so it is recommended to migrate to a compatible alternative. The
appropriate replacement for your use case depends on the context in which the method was
previously called.
At execution time, for example in @TaskAction or doFirst/doLast callbacks, using the Project
instance is not allowed when the configuration cache is enabled. To run external processes, tasks
should use an injected ExecOperations service, which has the same API and can act as a drop-in
replacement. The standard Java/Groovy/Kotlin process APIs, like [Link], can be
used as well.
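A minimal sketch (the task class and command are hypothetical) of a task using an injected
ExecOperations service instead of Project-based process execution:

```kotlin
import javax.inject.Inject
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
import org.gradle.process.ExecOperations

abstract class RunToolTask @Inject constructor(
    private val execOps: ExecOperations
) : DefaultTask() {
    @TaskAction
    fun run() {
        // Drop-in replacement for exec at execution time; command is illustrative
        execOps.exec {
            commandLine("echo", "hello")
        }
    }
}
```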
At configuration time, only special Provider-based APIs must be used to run external processes
when the configuration cache is enabled. You can use [Link] and
[Link] to obtain the output of the process. A custom ValueSource implementation
can be used for more sophisticated scenarios. The configuration cache guide has a more elaborate
example of using these APIs.
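At configuration time, the output of an external process can be captured through the Provider-based
API, for example (the command is illustrative):

```kotlin
// providers.exec returns an ExecOutput; its standard output is exposed
// as a Provider<String>, which the configuration cache can track
val gitHash = providers.exec {
    commandLine("git", "rev-parse", "--short", "HEAD")
}.standardOutput.asText.map { it.trim() }
```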
This behavior has been deprecated and will become an error in Gradle 9.0.
To create extension relationships between configurations, switch to using non-detached
configurations created via the other factory methods present in the project’s
ConfigurationContainer.
The Gradle#useLogger(Object) method has been deprecated and will be removed in Gradle 9.0.
This method was originally intended to customize logs printed by Gradle. However, it only allows
intercepting a subset of the logs and cannot work with the configuration cache. We do not plan to
introduce a replacement for this feature.
Unnecessary options on compile options and doc tasks have been deprecated
Gradle’s API allowed some properties that represented nested groups of properties to be replaced
wholesale with a setter method. This was awkward and unusual to do and would sometimes
require the use of internal APIs. The setters for these properties will be removed in Gradle 9.0 to
simplify the API and ensure consistent behavior. Instead of using the setter method, these
properties should be configured by calling the getter and configuring the object directly or using
the convenient configuration method. For example, in CompileOptions, instead of calling the
setForkOptions setter, you can call getForkOptions() or forkOptions(Action).
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
These methods on Javadoc have been deprecated and will be removed in Gradle 9.0.
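For example, a sketch of configuring the nested fork options of CompileOptions through the
configuration method rather than the removed setter (the memory size is illustrative):

```kotlin
tasks.withType<JavaCompile>().configureEach {
    // Instead of options.setForkOptions(...), configure the existing object:
    options.forkOptions {
        memoryMaximumSize = "1g"
    }
}
```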
JavaCompile tasks may fail when using a JRE even if compilation is not necessary
The JavaCompile tasks may sometimes fail when using a JRE instead of a JDK. This is due to changes
in the toolchain resolution code, which enforces the presence of a compiler when one is requested.
The java-base plugin uses the JavaCompile tasks it creates to determine the default source and target
compatibility when sourceCompatibility/targetCompatibility or release are not set. With the new
enforcement, the absence of a compiler causes this to fail when only a JRE is provided, even if no
compilation is needed (e.g., in projects with no sources).
The embedded Kotlin has been updated from 1.9.23 to Kotlin 1.9.24.
Deprecations
Starting in Gradle 9.0, Gradle will require JVM 17 or later to run. Most Gradle APIs will be compiled
to target JVM 17 bytecode.
Gradle will still support compiling Java code to target JVM version 6 or later. The target JVM version
of the compiled code can be configured separately from the JVM version used to run Gradle.
All Gradle clients (wrapper, launcher, Tooling API and TestKit) will remain compatible with JVM 8
and will be compiled to target JVM 8 bytecode. Only the Gradle daemon will require JVM 17 or later.
These clients can be configured to run Gradle builds with a different JVM version than the one used
to run the client:
Alternatively, the JAVA_HOME environment variable can be set to a JVM 17 or newer, which will run
both the client and daemon with the same version of the JVM.
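For instance, the daemon JVM can be pointed at a newer Java installation via gradle.properties
(the path here is hypothetical):

```properties
# gradle.properties — point the build JVM at a JVM 17+ installation
org.gradle.java.home=/opt/jdk-17
```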
Running Gradle builds with --no-daemon or using ProjectBuilder in tests will require JVM version
17 or later. The worker API will remain compatible with JVM 8, and running JVM tests will require
JVM 8.
We decided to upgrade the minimum version of the Java runtime for a number of reasons:
• Dependencies are beginning to drop support for older versions and may not release security
patches.
• Significant language improvements between Java 8 and Java 17 cannot be used without
upgrading.
• Download metrics for Gradle distributions show that JVM 17 is widely used.
Deprecated consuming non-consumable configurations from Ivy
Consuming non-consumable configurations in this manner is deprecated and will result in an error
in Gradle 9.0.
Projects should also never access the mutable state of another project. Since Configurations are
mutable, extending configurations across project boundaries restricts the parallelism that Gradle
can apply.
Extending configurations in different projects is deprecated and will result in an error in Gradle
9.0.
In previous versions of Gradle, toolchain provisioning could leave a partially provisioned toolchain
in place while still writing the marker file indicating that the toolchain was fully provisioned. This
could lead to strange behavior with the toolchain. In Gradle 8.9, the toolchain is fully provisioned
before the marker file is written. However, to avoid treating potentially broken toolchains from
earlier versions as valid, a different marker file (.ready) is used. This means all your existing
toolchains will be re-provisioned the first time you use them with Gradle 8.9. Gradle 8.9 also writes
the old marker file ([Link]) to indicate that the toolchain was fully provisioned. This means that if
you return to an older version of Gradle, an 8.9-provisioned toolchain will not be re-provisioned.
The embedded Kotlin has been updated from 1.9.22 to Kotlin 1.9.23.
In previous versions of Gradle, Java projects that had no declared dependencies could implicitly
compile against Gradle’s runtime classes. This means that some projects were able to compile
without any declared dependencies even though they referenced Gradle runtime classes. This
situation is unlikely to arise in projects since IDE integration and test execution would be
compromised. However, if you need to utilize the Gradle API, declare a gradleApi dependency or
apply the java-gradle-plugin plugin.
References to Gradle types not part of the public API should be avoided, as their direct use is
unsupported. Gradle internal implementation classes may suffer breaking changes (or be renamed
or removed) from one version to another without warning.
Users need to distinguish between the API and internal parts of the Gradle codebase. This is
typically achieved by including internal in the implementation package names. However, before
this release, the configuration cache subsystem did not follow this pattern.
To address this issue, all code initially under the [Link]* packages has been
moved to new internal packages ([Link].*).
Since Gradle 8.8, file-system watching has only been supported on macOS 12 (Monterey) and later.
We added a check to automatically disable file-system watching on macOS 11 (Big Sur) and earlier
versions.
Possible change to JDK8-based compiler output when annotation processors are used
The Java compilation infrastructure has been updated to use the Problems API. This change will
supply the Tooling API clients with structured, rich information about compilation issues.
The feature should not have any visible impact on the usual build output, with JDK8 being an
exception. When annotation processors are used in the compiler, the output message differs
slightly from the previous ones.
The change mainly manifests itself in the type names printed. For example, Java standard types like
[Link] will be reported as [Link] instead of String.
Deprecations
To ensure the accuracy of dependency resolution, Gradle checks that Configurations are not
mutated after they have been used as part of a dependency graph.
• Resolvable configurations should not have their resolution strategy, dependencies, hierarchy,
etc., modified after they have been resolved.
• Consumable configurations should not have their dependencies, hierarchy, attributes, etc.
modified after they have been published or consumed as a variant.
• Dependency scope configurations should not have their dependencies, constraints, etc.,
modified after a configuration that extends from them is observed.
In prior versions of Gradle, many of these circumstances were detected and handled by failing the
build. However, some cases went undetected or did not trigger build failures. In Gradle 9.0, all
changes to a configuration, once observed, will become an error. After a configuration of any type
has been observed, it should be considered immutable. This validation covers the following
properties of a configuration:
• Resolution Strategy
• Dependencies
• Constraints
• Exclude Rules
• Artifacts
• Hierarchy (extendsFrom)
Starting in Gradle 8.8, a deprecation warning will be emitted in cases that were not already an
error. Usually, this deprecation is caused by mutating a configuration in a beforeResolve hook. This
hook is only executed after a configuration is fully resolved but not when it is partially resolved for
computing task dependencies.
[Link]
plugins {
    id("java-library")
}

[Link] {
    // `beforeResolve` is not called before the configuration is partially resolved
    // for build dependencies, but only before a full graph resolution.
    // Configurations should not be mutated in this hook
    [Link] {
        // Add a dependency on `com:foo` if not already present
        if ([Link] { [Link] == "com" && [Link] == "foo" }) {
            [Link]().[Link]([Link]ate("com:foo:1.0"))
        }
    }
}

[Link]("resolve") {
    val conf: FileCollection = configurations["runtimeClasspath"]
    // Resolve dependencies
    doLast {
        assert([Link] { [Link] } == listOf("[Link]"))
    }
}
For the following use cases, consider these alternatives when replacing a beforeResolve hook:
• Roles: Configuration roles should be set upon creation and not changed afterward.
• Hierarchy: Configuration hierarchy (extendsFrom) should be set upon creation. Mutating the
hierarchy prior to resolution is highly discouraged but permitted within a withDependencies
hook.
In an ongoing effort to simplify the Gradle API, the following methods that support filtering based
on declared dependencies have been deprecated:
On Configuration:
• files(Dependency…)
• files(Spec)
• files(Closure)
• fileCollection(Dependency…)
• fileCollection(Spec)
• fileCollection(Closure)
On ResolvedConfiguration:
• getFiles(Spec)
• getFirstLevelModuleDependencies(Spec)
On LenientConfiguration:
• getFirstLevelModuleDependencies(Spec)
• getFiles(Spec)
• getArtifacts(Spec)
To mitigate this deprecation, consider the example below that leverages the ArtifactView API along
with the componentFilter method to select a subset of a Configuration’s artifacts:
build.gradle.kts
dependencies {
    conf("com.example:foo:1.0")
    conf("com.example:bar:1.0")
}
tasks.register("filterDependencies") {
    val files: FileCollection = configurations["conf"].incoming.artifactView {
        componentFilter {
            when (it) {
                is ModuleComponentIdentifier ->
                    it.group == "com.example" && it.module == "foo"
                else -> false
            }
        }
    }.files
    doLast {
        assert(files.map { it.name } == listOf("foo-1.0.jar"))
    }
}
build.gradle
configurations {
    conf
}
dependencies {
    conf "com.example:foo:1.0"
    conf "com.example:bar:1.0"
}
tasks.register("filterDependencies") {
    FileCollection files = configurations.conf.incoming.artifactView {
        componentFilter {
            it instanceof ModuleComponentIdentifier
                && it.group == "com.example"
                && it.module == "foo"
        }
    }.files
    doLast {
        assert files*.name == ["foo-1.0.jar"]
    }
}
Contrary to the deprecated Dependency filtering methods, componentFilter does not consider the
transitive dependencies of the component being filtered. This allows for more granular control over
which artifacts are selected.
Task and Configuration each have a Namer inner class that provides a common way to retrieve
the name of a task or configuration. Now that these types implement Named, these inner classes
are no longer necessary and have been deprecated. They will be removed in Gradle 9.0. Use
Named.Namer instead.
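As an illustrative sketch (assuming the shared `Named.Namer.INSTANCE` singleton, which determines the name of any `Named` object), the replacement can look like:

```kotlin
import org.gradle.api.Named

// Named.Namer works uniformly for tasks, configurations, and any other Named type
val namer = Named.Namer.INSTANCE
configurations.all {
    logger.info("Configuration name: " + namer.determineName(this))
}
```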
A new API for defining file permissions has been added in Gradle 8.3, see:
• FilePermissions.
• ConfigurableFilePermissions.
The new API has now been promoted to stable, and the old methods have been deprecated:
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
Deprecated setting retention period directly on local build cache
In previous versions, cleanup of the local build cache entries ran every 24 hours, and this interval
could not be configured. The retention period was configured using
[Link].
In Gradle 8.0, a new mechanism was added to configure the cleanup and retention periods for
various resources in Gradle User Home. In Gradle 8.8, this mechanism was extended to permit the
retention configuration of local build cache entries, providing improved control and consistency.
• Specifying Cleanup.DISABLED or Cleanup.ALWAYS will now prevent or force the cleanup of the
local build cache
• Build cache entry retention is now configured via an init-script, in the same manner as other
caches.
If you want build cache entries to be retained for 30 days, remove any calls to the deprecated
method:
settings.gradle.kts
buildCache {
    local {
        // Remove this line
        removeUnusedEntriesAfterDays = 30
    }
}
and instead add the following to an init script in Gradle User Home:
init.gradle.kts
beforeSettings {
    caches {
        buildCache.setRemoveUnusedEntriesAfterDays(30)
    }
}
In settings.gradle.kts (Kotlin DSL), you can use gradle-enterprise in the plugins block to apply the
Gradle Enterprise plugin with the same version as gradle --scan.
plugins {
`gradle-enterprise`
}
The Develocity plugin must be applied with an explicit plugin ID and version. There is no
develocity shorthand available in the plugins block:
plugins {
id("[Link]") version "3.17.3"
}
If you want to continue using the Gradle Enterprise plugin, you can specify the deprecated plugin
ID:
plugins {
id("[Link]") version "3.17.3"
}
We encourage you to use the latest released Develocity plugin version, even when using an older
Gradle version.
We have implemented several refactorings of the Problems API, including a significant change in
how problem definitions and contextual information are handled. The complete design
specification can be found here.
In implementing this spec, we have introduced the following breaking changes to the ProblemSpec
interface:
• The label(String) and description(String) methods have been replaced with the id(String,
String) method and its overloaded variants.
• [Link]*(…)
• [Link]*(…)
Replacements that better handle conventions are under consideration for a future 8.x release.
Since the previous version was 3.0.17, the 3.0.18, 3.0.19, and 3.0.20 changes are also included.
Some changes in static type checking have resulted in source-code incompatibilities. Starting with
3.0.18, if you cast a closure to an Action without generics, the closure parameter will be Object
instead of any explicit type specified. This can be fixed by adding the appropriate type to the cast,
and the redundant parameter declaration can be removed:
// Before
tasks.register("foo", { Task it -> it.description = "Foo task" } as Action)
// Fixed
tasks.register("foo", { it.description = "Foo task" } as Action<Task>)
ASM was upgraded from 9.6 to 9.7 to ensure earlier compatibility for Java 23.
The embedded Kotlin has been updated from 1.9.10 to Kotlin 1.9.22.
JSch has been replaced by com.github.mwiede:jsch and updated from 0.1.55 to 0.2.16.
This includes reworking the way that Gradle configures JGit for SSH operations by moving from
JSch to Apache SSHD.
Apache Commons Compress has been updated from 1.21 to 1.25.0. This change may affect the
checksums of the produced jars, zips, and other archive types because the metadata of the
produced artifacts may differ.
ASM was upgraded from 9.5 to 9.6 for better support of multi-release jars.
The version catalog parser has been upgraded and is now compliant with version 1.0.0 of the TOML
spec.
This should not impact catalogs that use the recommended syntax or were generated by Gradle for
publication.
Deprecations
Using plugin conventions has been emitting warnings since Gradle 8.2. Now, registering plugin
conventions will also trigger deprecation warnings. For more information, see the section about
plugin convention deprecation.
In Kotlin DSL, it is possible to reference a task or other domain object by its name using the
"name"() notation.
tasks {
"wrapper"() // 1 - returns TaskProvider<Task>
"wrapper"(Wrapper::class) // 2 - returns TaskProvider<Wrapper>
"wrapper"(Wrapper::class) { // 3 - configures a task named wrapper of type Wrapper
}
"wrapper" { // 4 - configures a task named wrapper of type Task
}
}
The first notation is deprecated and will be removed in Gradle 9.0. Instead of using "name"() to
reference a task or domain object, use named("name") or one of the other supported notations.
tasks {
named("wrapper") // returns TaskProvider<Task>
}
The Gradle API and Groovy build scripts are not impacted by this.
Before Gradle 8.3, Gradle would decode a CharSequence given to Project.uri(Object) using an
algorithm that accepted invalid URLs and improperly decoded others. Gradle now uses the URI class
to parse and decode URLs, but with a fallback to the legacy behavior in the event of an error.
Starting in Gradle 9.0, the fallback will be removed, and an error will be thrown instead.
To fix a deprecation warning, invalid URLs that require the legacy behavior should be re-encoded
to be valid URLs, such as in the following examples:
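The original examples are not reproduced here; as a hypothetical illustration in a build script, a URL containing an unencoded space should be percent-encoded before being passed to uri():

```kotlin
// Deprecated: relies on the legacy lenient decoding (the space is invalid)
val legacy = uri("https://example.com/releases/my archive.zip")

// Fixed: percent-encode the invalid character so java.net.URI can parse it
val fixed = uri("https://example.com/releases/my%20archive.zip")
```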
Deprecated SelfResolvingDependency
The SelfResolvingDependency interface has been deprecated for removal in Gradle 9.0. This type
dates back to the first versions of Gradle, where some dependencies could be resolved
independently. Now, all dependencies should be resolved as part of a dependency graph using a
Configuration. The following methods are deprecated:
• resolve
• resolve(boolean)
• getBuildDependencies
Consider the following scripts that showcase the deprecated interface and its replacement:
build.gradle.kts
plugins {
    id("java-library")
}
dependencies {
    implementation(files("bar.txt"))
    implementation(project(":foo"))
}
tasks.register("resolveDeprecated") {
    // Wire build dependencies (calls getBuildDependencies)
    dependsOn(configurations["implementation"].dependencies.toList())
    // Resolve dependencies
    doLast {
        configurations["implementation"].dependencies.withType<FileCollectionDependency>() {
            assert(resolve().map { it.name } == listOf("bar.txt"))
            assert(resolve(true).map { it.name } == listOf("bar.txt"))
        }
        configurations["implementation"].dependencies.withType<ProjectDependency>() {
            // These methods do not even work properly.
            assert(resolve().map { it.name } == listOf<String>())
            assert(resolve(true).map { it.name } == listOf<String>())
        }
    }
}
tasks.register("resolveReplacement") {
    val conf = configurations["runtimeClasspath"]
    // Resolve dependencies
    val files = conf.incoming.files
    doLast {
        assert(files.map { it.name } == listOf("bar.txt", "foo.jar"))
    }
}
• [Link](Collection)
Deprecations
Calling registerFeature on the java extension using the main source set is deprecated and will
change behavior in Gradle 9.0.
Currently, features created while calling usingSourceSet with the main source set are initialized
differently than features created while calling usingSourceSet with any other source set. Previously,
when using the main source set, new implementation, compileOnly, runtimeOnly, api, and
compileOnlyApi configurations were created, and the compile and runtime classpaths of the main
source set were configured to extend these configurations.
Starting in Gradle 9.0, the main source set will be treated like any other source set. With the
java-library plugin applied (or any other plugin that applies the java plugin), calling usingSourceSet with
the main source set will throw an exception. This is because the java plugin already configures a
main feature. Only if the java plugin is not applied will the main source set be permitted when calling
usingSourceSet.
Code that currently registers features with the main source set, such as:
build.gradle.kts
plugins {
id("java-library")
}
java {
registerFeature("feature") {
usingSourceSet(sourceSets["main"])
}
}
build.gradle
plugins {
id("java-library")
}
java {
registerFeature("feature") {
usingSourceSet(sourceSets.main)
}
}
should instead create a separate source set for the feature and register the feature with that source
set:
build.gradle.kts
plugins {
id("java-library")
}
sourceSets {
create("feature")
}
java {
registerFeature("feature") {
usingSourceSet(sourceSets["feature"])
}
}
build.gradle
plugins {
id("java-library")
}
sourceSets {
feature
}
java {
registerFeature("feature") {
usingSourceSet(sourceSets.feature)
}
}
Publishing dependencies with an explicit artifact with a name different from the dependency’s
artifactId to Maven repositories has been deprecated. This behavior is still permitted when
publishing to Ivy repositories. It will result in an error in Gradle 9.0.
When publishing to Maven repositories, Gradle will interpret the dependency below as if it were
declared with coordinates org:notfoo:1.0:
build.gradle.kts
dependencies {
implementation("org:foo:1.0") {
artifact {
name = "notfoo"
}
}
}
build.gradle
dependencies {
implementation("org:foo:1.0") {
artifact {
name = "notfoo"
}
}
}
build.gradle.kts
dependencies {
implementation("org:notfoo:1.0")
}
build.gradle
dependencies {
implementation("org:notfoo:1.0")
}
Deprecated ArtifactIdentifier
The ArtifactIdentifier class has been deprecated for removal in Gradle 9.0.
Starting in Gradle 9.0, mutating dependencies sourced from a DependencyCollector, after those
dependencies have been observed will result in an error. The DependencyCollector interface is used
to declare dependencies within the test suites DSL.
Consider the following example where a test suite’s dependency is mutated after it is observed:
build.gradle.kts
plugins {
    id("java-library")
}
testing.suites {
    named<JvmTestSuite>("test") {
        dependencies {
            // Dependency is declared on a `DependencyCollector`
            implementation("com:foo")
        }
    }
}
configurations.testImplementation {
    // Calling `all` here realizes/observes all lazy sources, including the
    // `DependencyCollector` from the test suite block. Operations like
    // resolving a configuration similarly realize lazy sources.
    dependencies.all {
        if (this is ExternalDependency && group == "com" && name == "foo" &&
            version == null) {
            // Dependency is mutated after observation
            version {
                require("2.0")
            }
        }
    }
}
In the above example, the build logic uses iteration and mutation to try to set a default version for a
particular dependency if the version is not already set. Build logic like the above example creates
challenges in resolving declared dependencies, as reporting tools will display this dependency as if
the user declared the version as "2.0", even though they never did. Instead, the build logic can avoid
iteration and mutation by declaring a preferred version constraint on the dependency’s
coordinates. This allows the dependency management engine to use the version declared on the
constraint if no other version is declared.
Consider the following example that replaces the above iteration with an indiscriminate preferred
version constraint:
build.gradle.kts
dependencies {
constraints {
testImplementation("com:foo") {
version {
prefer("2.0")
}
}
}
}
The groovy-base plugin is now responsible for configuring source and target compatibility version
conventions on all GroovyCompile tasks.
If you are using this task without applying groovy-base, you will have to manually set compatibility
versions on these tasks. In general, the groovy-base plugin should be applied whenever working
with Groovy language tasks.
The type of argument passed to [Link] is changed from Predicate to Spec for a more
consistent API. This change should not affect anyone using [Link] with a lambda
expression. However, this might affect plugin authors if they don’t use SAM conversions to create a
lambda.
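As an illustrative sketch (the task-name check here is hypothetical): a lambda continues to work unchanged because it is SAM-converted to Spec, while plugin code that previously constructed the argument by hand should construct a Spec instead:

```kotlin
import org.gradle.api.Task
import org.gradle.api.specs.Spec

// A lambda keeps working unchanged thanks to SAM conversion to Spec
val compileTasks = tasks.matching { it.name.startsWith("compile") }

// Code that previously built a java.util.function.Predicate explicitly
// should construct an org.gradle.api.specs.Spec instead
val isCompileTask = Spec<Task> { task -> task.name.startsWith("compile") }
```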
Deprecations
• [Link](VersionNumber)
This can lead to circular dependency graphs, as the resolved configuration is used for two purposes.
To avoid this problem, plugins should mark all resolvable configurations as canBeConsumed=false or
use the resolvable(String) configuration factory method when creating configurations meant for
resolution.
In Gradle 9.0, consuming configurations in this manner will no longer be allowed and result in an
error.
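For example (the configuration name is hypothetical, and the resolvable(String) factory assumed here is available from Gradle 8.4), a resolution-only configuration can be created like this:

```kotlin
// `resolvable(...)` creates a configuration with canBeResolved = true and
// canBeConsumed = false, so it can never be selected as a variant by
// consumers and therefore cannot participate in the circular graphs
// described above
val deployClasspath = configurations.resolvable("deployClasspath")
```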
Gradle will warn if a project is added to the build where the associated projectDir does not exist or
is not writable. Starting with version 9.0, Gradle will not run builds if a project directory is missing
or read-only. If you intend to dynamically synthesize projects, make sure to create directories for
them as well:
settings.gradle.kts
include("project-without-directory")
project(":project-without-directory").projectDir.mkdirs()
settings.gradle
include 'project-without-directory'
project(":project-without-directory").projectDir.mkdirs()
Gradle 8.4 now configures XML parsers with security features enabled. If your build logic depends
on old XML parsers that don’t support secure parsing, your build may fail. If you encounter a
failure, check and update or remove any dependency on legacy XML parsers.
If you are an Android user, please upgrade your AGP version to 8.3.0 or higher to fix the issue
caused by AGP itself. See the Update XML parser used in AGP for Gradle 8.4 compatibility for more
details.
If you are unable to upgrade XML parsers coming from your build logic dependencies, you can
force the use of the XML parsers built into the JVM. In OpenJDK, for example, this can be done by
adding the following to gradle.properties:
systemProp.javax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl
systemProp.javax.xml.transform.TransformerFactory=com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl
systemProp.javax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
See the CVE-2023-42445 advisory for more details and ways to enable secure XML processing on
previous Gradle versions.
Gradle 8.4 forbids external XML entities when parsing XML documents. If you use the EAR plugin,
configure the application.xml descriptor via the EAR plugin’s DSL, and customize the descriptor
using withXml {} with asElement() in the customization block, then the build will now fail for
security reasons.
build.gradle.kts
plugins {
id("ear")
}
ear {
deploymentDescriptor {
version = "1.3"
withXml {
asElement()
}
}
}
build.gradle
plugins {
id("ear")
}
ear {
deploymentDescriptor {
version = "1.3"
withXml {
asElement()
}
}
}
If you happen to use asNode() instead of asElement(), then nothing changes, given asNode() simply
ignores external DTDs.
You can work around this by running your build with the javax.xml.accessExternalDTD system
property set to http.
-Djavax.xml.accessExternalDTD=http
To make this workaround persistent, add the following line to your gradle.properties:
systemProp.javax.xml.accessExternalDTD=http
Note that this will enable HTTP access to external DTDs for the whole build JVM. See the JAXP
documentation for more details.
Deprecations
The following methods on GenerateMavenPom are deprecated and will be removed in Gradle 9.0. They
were never intended to be public API.
• getVersionRangeMapper
• withCompileScopeAttributes
• withRuntimeScopeAttributes
Upgrading from 8.2 and earlier
With the deprecation of Project.buildDir, build scripts that are compiled with warnings as errors
could fail if the deprecated field is used.
The TestLauncher interface is part of the Tooling API, specialized for running tests. It is a logical
extension of the BuildLauncher that can only launch tasks. A discrepancy has been reported in their
behavior: if the same failing test is executed, BuildLauncher will report a build failure, but
TestLauncher won’t. Originally, this was a design decision in order to continue the execution and
run the tests in all test tasks and not stop at the first failure. At the same time, this behavior can be
confusing for users as they can experience a failing test in a successful build. To make the two APIs
more uniform, we made TestLauncher also fail the build, which is a potential breaking change.
Tooling API clients should explicitly pass --continue to the build to continue the test execution even
if a test task fails.
Consider the case where the set of attributes on a Configuration is changed after an ArtifactView is
created:
build.gradle.kts
tasks {
    myTask {
        inputFiles.from(configurations["classpath"].incoming.artifactView {
            attributes {
                // Add attributes to select a different type of artifact
            }
        }.files)
    }
}
configurations {
    classpath {
        attributes {
            // Add more attributes to the configuration
        }
    }
}
The inputFiles property of myTask uses an artifact view to select a different type of artifact from the
configuration classpath. Since the artifact view was created before the attributes were added to the
configuration, Gradle could not select the correct artifact.
Some builds may have worked around this by also putting the additional attributes into the artifact
view. This is no longer necessary.
The embedded Kotlin has been updated from 1.8.20 to Kotlin 1.9.0. The Kotlin language and API
levels for the Kotlin DSL are still set to 1.8 for backward compatibility. See the release notes for
Kotlin 1.8.22 and Kotlin 1.8.21.
Kotlin 1.9 dropped support for Kotlin language and API level 1.3. If you build Gradle plugins written
in Kotlin with this version of Gradle and need to support Gradle <7.0, you need to stick to using the
Kotlin Gradle Plugin <1.9.0 and configure the Kotlin language and API levels to 1.3. See the
Compatibility Matrix for details about other versions.
Plugins or build logic that eagerly queries the attributes of JVM configurations may now cause the
project’s Java toolchain to be finalized earlier than before. Attempting to modify the toolchain after
it has been finalized will result in error messages similar to the following:
The value for property 'implementation' is final and cannot be changed any further.
The value for property 'languageVersion' is final and cannot be changed any further.
The value for property 'vendor' is final and cannot be changed any further.
This situation may arise when plugins or build logic eagerly query an existing JVM Configuration’s
attributes to create a new Configuration with the same attributes. Previously, this logic would have
omitted the two above-noted attributes entirely, while now, the same logic will copy the attributes
and finalize the project’s Java toolchain. To avoid early toolchain finalization, attribute-copying
logic should be updated to query the source Configuration’s attributes lazily:
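The original examples are not reproduced here; as an illustrative sketch (the configuration names are hypothetical), attributes can be copied lazily with attributeProvider so that the source values are only read when the new configuration's attributes are themselves queried:

```kotlin
import org.gradle.api.attributes.Attribute

val source = configurations["runtimeClasspath"].attributes

configurations.create("newClasspath") {
    attributes {
        source.keySet().forEach { key ->
            @Suppress("UNCHECKED_CAST")
            val attribute = key as Attribute<Any>
            // The value is only read when the new configuration's attributes
            // are queried, so the Java toolchain is not finalized eagerly
            attributeProvider(attribute, provider { source.getAttribute(attribute)!! })
        }
    }
}
```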
Deprecations
The Project.buildDir property is deprecated. It uses eager APIs and has ordering issues if the value
is read in build logic and then later modified. It could result in outputs ending up in different
locations.
Note that, at this stage, Gradle will not print deprecation warnings if you still use buildDir.
We know this is a big change, and we want to give the authors of major plugins time to stop using it.
Switching from a File to a DirectoryProperty requires adaptations in build logic. The main impact is
that you cannot use the property inside a String to expand it. Instead, you should leverage the dir
and file methods to compute your desired location.
Here is an example of creating a file, where the following:
build.gradle.kts
// Returns a java.io.File
file("$buildDir/myOutput.txt")
build.gradle
// Returns a java.io.File
file("$buildDir/myOutput.txt")
should be replaced by:
build.gradle.kts
// Compatible with a number of Gradle lazy APIs that also accept java.io.File
val output: Provider<RegularFile> = layout.buildDirectory.file("myOutput.txt")
build.gradle
// Compatible with a number of Gradle lazy APIs that also accept java.io.File
Provider<RegularFile> output = layout.buildDirectory.file("myOutput.txt")
Here is an example of creating a directory, where the following:
build.gradle.kts
// Returns a java.io.File
file("$buildDir/outputLocation")
build.gradle
// Returns a java.io.File
file("$buildDir/outputLocation")
should be replaced by:
build.gradle.kts
// Compatible with a number of Gradle lazy APIs that also accept java.io.File
val output: Provider<Directory> = layout.buildDirectory.dir("outputLocation")
build.gradle
// Compatible with a number of Gradle lazy APIs that also accept java.io.File
Provider<Directory> output = layout.buildDirectory.dir("outputLocation")
Client module dependencies were originally intended to allow builds to override incorrect or
missing component metadata of external dependencies by defining the metadata locally. This
functionality has since been replaced by Component Metadata Rules.
build.gradle.kts
dependencies {
implementation(module("org:foo:1.0") {
dependency("org:bar:1.0")
module("org:baz:1.0") {
dependency("com:example:1.0")
}
})
}
build.gradle
dependencies {
implementation module("org:foo:1.0") {
dependency "org:bar:1.0"
module("org:baz:1.0") {
dependency "com:example:1.0"
}
}
}
build-logic/src/main/kotlin/AddDependenciesRule.kt
@CacheableRule
abstract class AddDependenciesRule @Inject constructor(val dependencies:
List<String>) : ComponentMetadataRule {
override fun execute(context: ComponentMetadataContext) {
listOf("compile", "runtime").forEach { base ->
[Link](base) {
withDependencies {
[Link] {
add(it)
}
}
}
}
}
}
build.gradle.kts
dependencies {
components {
withModule<AddDependenciesRule>("org:foo") {
params(listOf(
"org:bar:1.0",
"org:baz:1.0"
))
}
withModule<AddDependenciesRule>("org:baz") {
params(listOf("com:example:1.0"))
}
}
implementation("org:foo:1.0")
}
build-logic/src/main/groovy/AddDependenciesRule.groovy
@CacheableRule
abstract class AddDependenciesRule implements ComponentMetadataRule {
List<String> dependencies
@Inject
AddDependenciesRule(List<String> dependencies) {
[Link] = dependencies
}
@Override
void execute(ComponentMetadataContext context) {
["compile", "runtime"].each { base ->
[Link](base) {
withDependencies {
[Link] {
add(it)
}
}
}
}
}
}
build.gradle
dependencies {
components {
withModule("org:foo", AddDependenciesRule) {
params([
"org:bar:1.0",
"org:baz:1.0"
])
}
withModule("org:baz", AddDependenciesRule) {
params(["com:example:1.0"])
}
}
implementation "org:foo:1.0"
}
Starting in Gradle 9.0, the earliest supported Develocity plugin version is 3.13.1. The plugin versions
from 3.0 up to 3.13 will be ignored when applied.
Upgrade to version 3.13.1 or later of the Develocity plugin. You can find the latest available version
on the Gradle Plugin Portal. More information on the compatibility can be found here.
The embedded Kotlin has been updated to Kotlin 1.8.20. For more information, see What’s new in
Kotlin 1.8.20.
Note that there is a known issue with Kotlin compilation avoidance that can cause OutOfMemory
exceptions in compileKotlin tasks if the compilation classpath contains very large JAR files. This
applies to builds applying the Kotlin plugin v1.8.20 or the kotlin-dsl plugin.
You can work around it by disabling Kotlin compilation avoidance in your gradle.properties file:
[Link]=false
Since the previous version was 1.10.11, the 1.10.12 changes are also included.
Since the previous version was 6.48.0, all changes since then are included.
A plugin compiled with Gradle >= 8.2 that makes use of the Kotlin DSL functions Project.the<T>(),
Project.the(KClass) or Project.configure<T> {} cannot run on Gradle <= 6.1.
This internal representation is now created more lazily, which can change the order in which tasks
are configured. Some tasks may never be configured.
This change may cause code paths that relied on a particular order to no longer function, such as
conditionally adding attributes to a configuration based on the presence of certain attributes.
We recommend not modifying domain objects (configurations, source sets, tasks, etc.) from
configuration blocks for other domain objects that may not be configured.
configurations {
    val myConfig = create("myConfig")
}
tasks.register("myTask") {
    // This is not safe, as the execution of this block may not occur, or may
    // not occur in the order expected
    configurations["myConfig"].attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage::class.java, Usage.JAVA_RUNTIME))
    }
}
Deprecations
The following methods on CompileOptions are deprecated:
• getAnnotationProcessorGeneratedSourcesDirectory()
• setAnnotationProcessorGeneratedSourcesDirectory(File)
• setAnnotationProcessorGeneratedSourcesDirectory(Provider<File>)
Gradle will now warn at runtime when methods of Configuration are called inconsistently with the
configuration’s intended usage.
This change is part of a larger ongoing effort to make the intended behavior of configurations more
consistent and predictable and to unlock further speed and memory improvements.
Currently, the following methods should only be called with these listed allowed usages:
Intended usage is noted in the Configuration interface’s Javadoc. This list is likely to grow in future
releases.
Starting in Gradle 9.0, using a configuration inconsistently with its intended usage will be
prohibited.
Also note that although it is not currently restricted, the getDependencies() method is only intended
for use with DECLARABLE configurations. The getAllDependencies() method, which retrieves all
declared dependencies on a configuration and any superconfigurations, will not be restricted to
any particular usage.
The concept of conventions is outdated and superseded by extensions to provide custom DSLs.
To reflect this in the Gradle API, the following elements are deprecated:
• [Link]()
• [Link]
• [Link]
Gradle Core plugins still register their conventions in addition to their extensions for backwards
compatibility.
Accessing any of these conventions and their properties is deprecated and will now emit a
deprecation warning. This will become an error in Gradle 9.0. Prefer accessing the extensions and
their properties instead.
Prominent community plugins have already migrated to using extensions to provide custom DSLs.
Some of them still register conventions for backward compatibility. Registering conventions does not
yet emit a deprecation warning, to provide a migration window, but future Gradle versions will.
Also note that plugins compiled with Gradle <= 8.1 that make use of the Kotlin DSL functions
Project.the<T>(), Project.the(KClass) or Project.configure<T> {} will emit a deprecation warning
when run on Gradle >= 8.2. To fix this, these plugins should be recompiled with Gradle >= 8.2 or
changed to access extensions directly using extensions.getByType<T>() instead.
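For example (using the java extension as an assumed target), a plugin can look an extension up directly:

```kotlin
import org.gradle.api.JavaVersion
import org.gradle.api.plugins.JavaPluginExtension

// Direct extension lookup does not go through the deprecated convention API
val java = project.extensions.getByType(JavaPluginExtension::class.java)
java.sourceCompatibility = JavaVersion.VERSION_17
```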
The convention properties contributed by the base plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.
The conventions are replaced by the base { } configuration block backed by BasePluginExtension.
The old convention object defines the distsDirName, libsDirName, and archivesBaseName properties
with simple getter and setter methods. Those methods are available in the extension only to
maintain backward compatibility. Build scripts should solely use the properties of type Property:
build.gradle.kts
plugins {
    base
}
base {
    archivesName.set("gradle")
    distsDirectory.set(layout.buildDirectory.dir("custom-dist"))
    libsDirectory.set(layout.buildDirectory.dir("custom-libs"))
}
build.gradle
plugins {
id 'base'
}
base {
archivesName = "gradle"
distsDirectory = [Link]('custom-dist')
libsDirectory = [Link]('custom-libs')
}
The convention properties the application plugin contributed have been deprecated and scheduled
for removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.
build.gradle.kts
plugins {
application
}
build.gradle
plugins {
id 'application'
}
build.gradle.kts
plugins {
    application
}
application {
    applicationDefaultJvmArgs = listOf("-Dgreeting.language=en")
}
build.gradle
plugins {
    id 'application'
}
application {
    applicationDefaultJvmArgs = ['-Dgreeting.language=en']
}
The convention properties the java plugin contributed have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.
build.gradle.kts
plugins {
id("java")
}
build.gradle
plugins {
    id 'java'
}
This should be changed to use the java { } configuration block, backed by JavaPluginExtension,
instead:
build.gradle.kts
plugins {
id("java")
}
java {
sourceCompatibility = JavaVersion.VERSION_18
}
build.gradle
plugins {
id 'java'
}
java {
sourceCompatibility = JavaVersion.VERSION_18
}
The convention properties contributed by the war plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.
build.gradle.kts
plugins {
id("war")
}
build.gradle
plugins {
id 'war'
}
Clients should configure the war task directly. Also, tasks.withType(War.class).configureEach(…) can
be used to configure each task of type War.
build.gradle.kts
plugins {
    id("war")
}
tasks.war {
    webAppDirectory.set(file("src/main/webapp"))
}
build.gradle
plugins {
id 'war'
}
war {
webAppDirectory = file('src/main/webapp')
}
Deprecated ear plugin conventions
The convention properties contributed by the ear plugin have been deprecated and scheduled for
removal in Gradle 9.0. For more context, see the section about plugin convention deprecation.
build.gradle.kts
plugins {
id("ear")
}
build.gradle
plugins {
id 'ear'
}
Clients should configure the ear task directly. Also, tasks.withType(Ear.class).configureEach(…) can
be used to configure each task of type Ear.
build.gradle.kts
plugins {
    id("ear")
}
tasks.ear {
    appDirectory.set(file("src/main/app"))
}
build.gradle
plugins {
    id 'ear'
}
ear {
    appDirectory = file('src/main/app') // use application metadata found in this folder
}
The convention properties contributed by the project-reports plugin have been deprecated and
scheduled for removal in Gradle 9.0. For more context, see the section about plugin convention
deprecation.
build.gradle.kts
plugins {
`project-report`
}
configure<ProjectReportsPluginConvention> {
projectReportDirName = "custom" // Accessing a convention
}
build.gradle
plugins {
id 'project-report'
}
build.gradle.kts
plugins {
    `project-report`
}
tasks.withType<HtmlDependencyReportTask>() {
    projectReportDirectory.set(project.layout.buildDirectory.dir("reports/custom"))
}
build.gradle
plugins {
    id 'project-report'
}
tasks.withType(HtmlDependencyReportTask) {
    projectReportDirectory = project.layout.buildDirectory.dir("reports/custom")
}
• getAll()
Obtain the set of all configurations from the project’s configurations container instead.
In some cases, Gradle will load JVM test framework dependencies from the Gradle distribution to
execute tests. This existing behavior can lead to test framework dependency version conflicts on
the test classpath. To avoid these conflicts, this behavior is deprecated and will be removed in
Gradle 9.0. Tests using TestNG are unaffected.
To prepare for this change in behavior, either declare the required dependencies explicitly or
migrate to Test Suites, where these dependencies are managed automatically.
Test Suites
Builds that use test suites will not be affected by this change. Test suites manage the test framework
dependencies automatically and do not require dependencies to be explicitly declared. See the user
manual for further information on migrating to test suites.
In the absence of test suites, dependencies must be manually declared on the test runtime
classpath:
build.gradle.kts
dependencies {
    // If using JUnit Jupiter
    testImplementation("org.junit.jupiter:junit-jupiter:5.9.2")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")
    // If using JUnit 4
    testImplementation("junit:junit:4.13.2")
    // If using JUnit 3
    testCompileOnly("junit:junit:3.8.2")
    testRuntimeOnly("junit:junit:4.13.2")
}
build.gradle
dependencies {
    // If using JUnit Jupiter
    testImplementation 'org.junit.jupiter:junit-jupiter:5.9.2'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
    // If using JUnit 4
    testImplementation 'junit:junit:4.13.2'
    // If using JUnit 3
    testCompileOnly 'junit:junit:3.8.2'
    testRuntimeOnly 'junit:junit:4.13.2'
}
BuildIdentifier and ProjectComponentSelector method deprecations
• getName()
• isCurrentBuild()
You could use these methods to distinguish between different project components with the same
name but from different builds. However, for certain composite build setups, these methods do not
provide enough information to guarantee uniqueness.
Gradle now emits a CACHEDIR.TAG file in some global cache directories, as specified in Cache
marking.
This may cause these directories to no longer be searched or backed up by some tools. To disable it,
use the following code in an init script in the Gradle User Home:
init.gradle.kts
beforeSettings {
caches {
// Disable cache marking for all caches
markingStrategy.set(MarkingStrategy.NONE)
}
}
init.gradle
beforeSettings {
    caches {
        // Disable cache marking for all caches
        markingStrategy = MarkingStrategy.NONE
    }
}
In this release, the configuration cache feature was promoted from incubating to stable. As such, all
properties originally mentioned in the feature documentation (which had an unsafe part in their
names, e.g., org.gradle.unsafe.configuration-cache) were renamed, in some cases, by removing the
unsafe part of the name.
Gradle 8.0                                        Gradle 8.1
org.gradle.unsafe.configuration-cache             org.gradle.configuration-cache
org.gradle.unsafe.configuration-cache-problems    org.gradle.configuration-cache-problems
Compilation warnings from Kotlin DSL scripts are printed to the console output. For example, the
use of deprecated APIs in Kotlin DSL will emit warnings each time the script is compiled.
This is a potentially breaking change if you are consuming the console output of Gradle builds.
If you are configuring custom Kotlin compiler options on a project with the kotlin-dsl plugin
applied you might encounter a breaking change.
In previous Gradle versions, the kotlin-dsl plugin was adding required compiler arguments on
afterEvaluate {}. Now that the Kotlin Gradle Plugin provides lazy configuration properties, our
kotlin-dsl plugin switched to adding required compiler arguments to the lazy properties directly.
As a consequence, if you were setting freeCompilerArgs the kotlin-dsl plugin is now failing the
build because its required compiler arguments are overridden by your configuration.
build.gradle.kts
plugins {
`kotlin-dsl`
}
tasks.withType(KotlinCompile::class).configureEach {
kotlinOptions { // Deprecated non-lazy configuration options
freeCompilerArgs = listOf("-Xcontext-receivers")
}
}
With the configuration above, the build fails because the kotlin-dsl plugin's required compiler
arguments are overridden.
You must change this to adding your custom compiler arguments to the lazy configuration
properties of the Kotlin Gradle Plugin for them to be appended to the ones required by the
kotlin-dsl plugin:
build.gradle.kts
plugins {
`kotlin-dsl`
}
tasks.withType(KotlinCompile::class).configureEach {
compilerOptions { // New lazy configuration options
freeCompilerArgs.add("-Xcontext-receivers")
}
}
If you were already adding to freeCompilerArgs instead of setting its value, you should not
experience a build failure.
New API introduced may clash with existing Gradle DSL code
When a new property or method is added to an existing type in the Gradle DSL, it may clash with
names already used in user code.
When a name clash occurs, one solution is to rename the element in user code.
This is a non-exhaustive list of API additions in 8.1 that may cause name collisions with existing
user code.
• [Link]()
• [Link]()
Using unsupported API to start external processes at configuration time is no longer allowed with the
configuration cache enabled
Since Gradle 7.5, using [Link], [Link], and standard Java and Groovy APIs to run
external processes at configuration time has been considered an error only if the feature preview
STABLE_CONFIGURATION_CACHE was enabled. With the configuration cache promotion to a stable
feature in Gradle 8.1, this error is detected regardless of the feature preview status. The
configuration cache chapter has more details to help with the migration to the new provider-based
APIs to execute external processes at configuration time.
Builds that do not use the configuration cache, or only start external processes at execution time
are not affected by this change.
Deprecations
The allowed usage of a configuration should be immutable after creation. Mutating the allowed
usage on a configuration created by a Gradle core plugin is deprecated. This includes calling any of
the following Configuration methods:
• setCanBeConsumed(boolean)
• setCanBeResolved(boolean)
These methods now emit deprecation warnings on these configurations, except for certain special
cases which make allowances for the existing behavior of popular plugins. This rule does not yet
apply to detached configurations or configurations created in buildscripts and third-party plugins.
Calling setCanBeConsumed(false) on apiElements or runtimeElements is not yet deprecated in order to
avoid warnings that would be otherwise emitted when using select popular third-party plugins.
This change is part of a larger ongoing effort to make the intended behavior of configurations more
consistent and predictable, and to unlock further speed and memory improvements in this area of
Gradle.
The ability to change the allowed usage of a configuration after creation will be removed in Gradle
9.0.
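For illustration only, the contrast looks roughly like this in a Kotlin DSL build script (the
configuration names here are placeholders, not taken from the manual):

// Deprecated: flipping usage flags on an existing configuration after creation
configurations.named("implementation") {
    isCanBeResolved = true // emits a deprecation warning on core-plugin configurations
}

// Preferred: declare the intended usage when creating your own configuration
val customClasspath by configurations.creating {
    isCanBeResolved = true
    isCanBeConsumed = false
}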
The ability to create non-detached configurations with these names will be removed in Gradle 9.0.
Calling select methods on the JavaPluginExtension without the java component present
Starting in Gradle 8.1, calling any of the following methods on JavaPluginExtension without the
presence of the default java component is deprecated:
• withJavadocJar()
• withSourcesJar()
• consistentResolution(Action)
This java component is added by the JavaPlugin, which is applied by any of the Gradle JVM plugins
including:
• java-library
• application
• groovy
• scala
Starting in Gradle 9.0, calling any of the above listed methods without the presence of the default
java component will become an error.
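As a minimal sketch (assuming the Kotlin DSL), applying one of the JVM plugins listed above
before calling these methods keeps the java component present:

plugins {
    `java-library` // registers the default java component
}

java {
    withSourcesJar() // safe: the java component exists
    withJavadocJar()
}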
WarPlugin#configureConfiguration(ConfigurationContainer)
By default, when applying the java plugin, the testClassesDirs and classpath of all Test tasks have
the same convention. Unless otherwise changed, the default behavior is to execute the tests from
the default test TestSuite by configuring the task with the classpath and testClassesDirs from the
test suite. This behavior will be removed in Gradle 9.0.
While this existing default behavior is correct for the use case of executing the default unit test
suite under a different environment, it does not support the use case of executing an entirely
separate set of tests.
If you wish to continue including these tests, use the following code to avoid the deprecation
warning in 8.1 and prepare for the behavior change in 9.0. Alternatively, consider migrating to test
suites.
build.gradle
tasks.myTestTask {
    testClassesDirs = testing.suites.test.sources.output.classesDirs
    classpath = testing.suites.test.sources.runtimeClasspath
}
Modifying Gradle Module Metadata after a publication has been populated
Altering the GMM (e.g., changing a component's configuration variants) after a Maven or Ivy
publication has been populated from their components is now deprecated. This feature will be
removed in Gradle 9.0.
Eager population of the publication can happen if the following methods are called:
• Maven
◦ MavenPublication.getArtifacts()
• Ivy
◦ IvyPublication.getArtifacts()
◦ IvyPublication.getConfigurations()
◦ IvyPublication.configurations(Action)
Previously, the following code did not generate warnings, but it created inconsistencies between
published artifacts:
build.gradle.kts
publishing {
publications {
create<MavenPublication>("maven") {
from(components["java"])
}
create<IvyPublication>("ivy") {
from(components["java"])
}
}
}
(publishing.publications["maven"] as MavenPublication).artifacts
(publishing.publications["ivy"] as IvyPublication).artifacts
build.gradle
publishing {
publications {
maven(MavenPublication) {
from components.java
}
ivy(IvyPublication) {
from components.java
}
}
}
publishing.publications.maven.artifacts
publishing.publications.ivy.artifacts

components.java.withVariantsFromConfiguration(configurations.apiElements) { skip() }
components.java.withVariantsFromConfiguration(configurations.runtimeElements) { skip() }
In this example, the Maven and Ivy publications will contain the main JAR artifacts for the project,
whereas the GMM module file will omit them.
Running JVM tests on JVM versions older than 8 is deprecated. Testing on these versions will
become an error in Gradle 9.0.
Applying Kotlin DSL precompiled scripts published with Gradle < 6.0
Applying Kotlin DSL precompiled scripts published with Gradle < 6.0 is deprecated. Please use a
version of the plugin published with Gradle >= 6.0.
Applying the kotlin-dsl together with Kotlin Gradle Plugin < 1.8.0
Applying the kotlin-dsl together with Kotlin Gradle Plugin < 1.8.0 is deprecated. Please let Gradle
control the version of kotlin-dsl by removing any explicit kotlin-dsl version constraints from your
build logic. This will let the kotlin-dsl plugin decide which version of the Kotlin Gradle Plugin to
use. If you explicitly declare which version of the Kotlin Gradle Plugin to use for your build logic,
update it to >= 1.8.0.
Accessing libraries or bundles from dependency version catalogs in the plugins {} block of a Kotlin script
Accessing libraries or bundles from dependency version catalogs in the plugins {} block of a Kotlin
script is deprecated. Please only use versions or plugins from dependency version catalogs in the
plugins {} block.
Using ValidatePlugins task without a Java Toolchain
Using a task of type ValidatePlugins without applying the Java Toolchains plugin is deprecated, and
will become an error in Gradle 9.0.
build.gradle.kts
plugins {
id("jvm-toolchains")
}
build.gradle
plugins {
id 'jvm-toolchains'
}
The Java Toolchains plugin is applied automatically by the Java library plugin or other JVM plugins.
So you can apply any of them to your project and it will fix the warning.
• [Link](…)
• [Link](…)
• [Link](…)
• ConfigureUtil
The enum constant JvmVendorSpec.IBM_SEMERU is now deprecated and will be removed in Gradle 9.0.
Please replace it by its equivalent JvmVendorSpec.IBM to avoid warnings and potential errors in the
next major version release.
Following the related previous deprecation of the behaviour in Gradle 7.1, it is now also deprecated
to use related StartParameter and GradleBuild properties. These properties will be removed in
Gradle 9.0.
Setting custom build file using buildFile property in GradleBuild task has been deprecated.
Please use the dir property instead to specify the root of the nested build. Alternatively, consider
using one of the recommended alternatives for GradleBuild task.
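A sketch of the recommended replacement, assuming a hypothetical nested build living in a
nested-build/ directory:

tasks.register<GradleBuild>("nestedBuild") {
    dir = file("nested-build") // point at the root of the nested build instead of its build file
}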
The org.gradle.cache.cleanup property in gradle.properties under Gradle User Home has been
deprecated. Please use the cache cleanup DSL instead to disable or modify the cleanup
configuration.
Since the org.gradle.cache.cleanup property may still be needed for older versions of Gradle, this
property may still be present and no deprecation warnings will be printed as long as it is also
configured via the DSL. The DSL value will always take preference over the
org.gradle.cache.cleanup property. If the desired configuration is to disable cleanup for older
versions of Gradle (using org.gradle.cache.cleanup), but to enable cleanup with the default values
for Gradle versions at or above Gradle 8, then cleanup should be configured to use
Cleanup.DEFAULT:
gradle.properties
org.gradle.cache.cleanup=false

gradle8/cache-settings.gradle.kts
beforeSettings {
    caches {
        cleanup.set(Cleanup.DEFAULT)
    }
}

gradle8/cache-settings.gradle
beforeSettings {
    caches {
        cleanup = Cleanup.DEFAULT
    }
}
Using relative file paths to point to Java executables is now deprecated and will become an error in
Gradle 9. This is done to reduce confusion about what such relative paths should resolve against.
See the configuration cache chapter for details on how to migrate these usages to APIs that are
supported by the configuration cache.
Running the Test task successfully when no test was executed is now deprecated and will become
an error in Gradle 9. Note that it is not an error when no test sources are present, in this case the
test task is simply skipped. It is only an error when test sources are present, but no test was
selected for execution. This is changed to avoid accidental successful test runs due to erroneous
configuration.
Workaround for false positive errors shown in Kotlin DSL plugins {} block using version catalog is not
needed anymore
Version catalog accessors for plugin aliases in the plugins {} block aren’t shown as errors in IntelliJ
IDEA and Android Studio Kotlin script editor anymore.
If you were using the @Suppress("DSL_SCOPE_VIOLATION") annotation as a workaround, you can now
remove it.
If you were using the Gradle Libs Error Suppressor IntelliJ IDEA plugin, you can now uninstall it.
After upgrading Gradle to 8.1 you will need to clear the IDE caches and restart.
Also see the deprecated usages of version catalogs in the Kotlin DSL plugins {} block above.
RUNNING GRADLE BUILDS
CORE CONCEPTS
Gradle Basics
Gradle automates building, testing, and deployment of software from information in build
scripts.
Projects
A Gradle project is a piece of software that can be built, such as an application or a library.
Single project builds include a single project called the root project.
Multi-project builds include one root project and any number of subprojects.
Build Scripts
Build scripts detail to Gradle what steps to take to build the project.
Dependency Management
Each project typically includes a number of external dependencies that Gradle will resolve during
the build.
Tasks
Tasks are a basic unit of work such as compiling code or running your tests.
Each project contains one or more tasks defined inside a build script or a plugin.
Plugins
Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.
Many developers will interact with Gradle for the first time through an existing project.
The presence of the gradlew and gradlew.bat files in the root directory of a project is a clear
indicator that Gradle is used.
project
├── gradle ①
│   ├── libs.versions.toml ②
│   └── wrapper
│       ├── gradle-wrapper.jar
│       └── gradle-wrapper.properties
├── gradlew ③
├── gradlew.bat ③
├── settings.gradle(.kts) ④
├── subproject-a
│   ├── build.gradle(.kts) ⑤
│   └── src ⑥
└── subproject-b
    ├── build.gradle(.kts) ⑤
    └── src ⑥
Invoking Gradle
IDE
Gradle is built-in to many IDEs including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse,
and NetBeans.
Gradle can be automatically invoked when you build, clean, or run your app in the IDE.
It is recommended that you consult the manual for the IDE of your choice to learn more about how
Gradle can be used and configured.
Command line
Gradle can be invoked in the command line once installed. For example:
$ gradle build
Gradle Wrapper
The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to
execute a Gradle build. It is found in the project root directory as a gradlew or gradlew.bat file:
• Provisions the Gradle version for different execution environments (IDEs, CI servers…).
It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and
standardized execution of the build.
Depending on the operating system, you run gradlew or gradlew.bat instead of the gradle command.
$ gradle build
$ ./gradlew build
$ .\gradlew.bat build
The command is run in the same directory that the Wrapper is located in. If you want to run the
command in a different directory, you must provide the relative path to the Wrapper:
$ ../gradlew build
The following console output demonstrates the use of the Wrapper on a Windows machine, in the
command prompt (cmd), for a Java-based project:
$ gradlew.bat build
Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip
to C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle
.
├── gradle
│   └── wrapper
│       ├── gradle-wrapper.jar ①
│       └── gradle-wrapper.properties ②
├── gradlew ③
└── gradlew.bat ④
① gradle-wrapper.jar: This is a small JAR file that contains the Gradle Wrapper code. It is
responsible for downloading and installing the correct version of Gradle for a project if it’s not
already installed.
② gradle-wrapper.properties: This file contains configuration properties for the Gradle Wrapper,
such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or
TARBALL).
③ gradlew: This is a shell script (Unix-based systems) that acts as a wrapper around
gradle-wrapper.jar. It is used to execute Gradle tasks on Unix-based systems without needing to
manually install Gradle.
④ gradlew.bat: This is a batch script (Windows) that serves the same purpose as gradlew but is used
on Windows systems.
$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2
$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2
Substitute ./gradlew (in macOS / Linux) or gradlew.bat (in Windows) for gradle in the following
examples.
If multiple tasks are specified, you should separate them with a space.
Options that accept values can be specified with or without = between the option and argument.
The use of = is recommended.
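For example, both of the following forms are accepted, with the first being preferred:

$ gradle help --console=plain
$ gradle help --console plain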
Options that enable behavior have long-form options with inverses specified with --no-. The
following are opposites.
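For example:

$ gradle build --build-cache
$ gradle build --no-build-cache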
Many long-form options have short-option equivalents. The following are equivalent:
gradle --help
gradle -h
Command-line usage
The following sections describe the use of the Gradle command-line interface. Some plugins also
add their own command line options.
Executing tasks
$ gradle :taskName
This will run the single taskName and all of its dependencies.
To pass an option to a task, prefix the option name with -- after the task name:
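For example (taskName and exampleOption are placeholders for a real task and one of its options):

$ gradle taskName --exampleOption=exampleValue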
The primary purpose of the settings file is to add subprojects to your build.
• For multi-project builds, the settings file is mandatory and declares all subprojects.
Settings script
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
The settings file is typically found in the root directory of the project.
settings.gradle.kts
rootProject.name = "root-project" ①
include("sub-project-a") ②
include("sub-project-b")
include("sub-project-c")
① Define the project name.
② Add subprojects.
settings.gradle
rootProject.name = 'root-project' ①
include('sub-project-a') ②
include('sub-project-b')
include('sub-project-c')
① Define the project name.
② Add subprojects.
1. Define the project name
The settings file sets the name of the root project:
rootProject.name = "root-project"
2. Add subprojects
The settings file defines the structure of the project by including subprojects, if there are any:
include("app")
include("business-logic")
include("data-model")
1. The libraries and/or plugins on which Gradle and the build script depend.
2. The libraries on which the project sources (i.e., source code) depend.
Build scripts
The build script is either a build.gradle file written in Groovy or a build.gradle.kts file in Kotlin.
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
build.gradle.kts
plugins {
id("application") ①
}
application {
mainClass = "com.example.Main" ②
}
① Add plugins.
② Use convention properties.
build.gradle
plugins {
id 'application' ①
}
application {
mainClass = 'com.example.Main' ②
}
① Add plugins.
② Use convention properties.
1. Add plugins
Adding a plugin to a build is called applying a plugin and makes additional functionality available.
plugins {
id("application")
}
Applying the Application plugin also implicitly applies the Java plugin. The java plugin adds Java
compilation along with testing and bundling capabilities to a project.
A plugin adds tasks to a project. It also adds properties and methods to a project.
The application plugin defines tasks that package and distribute an application, such as the run
task.
The Application plugin provides a way to declare the main class of a Java application, which is
required to execute the code.
application {
mainClass = "com.example.Main"
}
In this example, the main class (i.e., the point where the program's execution begins) is
com.example.Main.
Gradle build scripts define the process to build projects that may require external dependencies.
Dependencies refer to JARs, plugins, libraries, or source code that support building your project.
Version Catalog
The catalog makes sharing dependencies and version configurations between subprojects simple. It
also allows teams to enforce versions of libraries and plugins in large projects.
1. [versions] to declare the version numbers that plugins and libraries will reference.
[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"
[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }
[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }
The file is located in the gradle directory so that it can be used by Gradle and IDEs automatically.
The version catalog should be checked into source control: gradle/libs.versions.toml.
To add a dependency to your project, specify a dependency in the dependencies block of your
build.gradle(.kts) file.
The following build.gradle.kts file adds a plugin and two dependencies to the project using the
version catalog above:
plugins {
alias(libs.plugins.androidApplication) ①
}
dependencies {
    // Dependency on a remote binary to compile and run the code
    implementation(libs.googleMaterial) ②
    // Dependency on a remote binary to compile and run the test code
    testImplementation(libs.mockitoCore) ③
}
① Applies the Android Gradle plugin to this project, which adds several features that are specific to
building Android apps.
② Adds the Material dependency to the project. Material Design provides components for creating
a user interface in an Android App. This library will be used to compile and run the Kotlin
source code in this project.
③ Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java
code. This library will be used to compile and run the test source code in this project.
• The material library is added to the implementation configuration, which is used for compiling
and running production code.
• The mockito-core library is added to the testImplementation configuration, which is used for
compiling and running test code.
You can view your dependency tree in the terminal using the ./gradlew :app:dependencies
command:
$ ./gradlew :app:dependencies
------------------------------------------------------------
Project ':app'
------------------------------------------------------------
...
Task Basics
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
You run a Gradle build task using the gradle command or by invoking the Gradle Wrapper
(./gradlew or gradlew.bat) in your project directory:
$ ./gradlew build
Available tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in the project by running the following command in the terminal:
$ ./gradlew tasks
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
...
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
...
Other tasks
-----------
compileJava - Compiles main Java source.
...
Running tasks
$ ./gradlew run
In this example Java project, the output of the run task is a Hello World statement printed on the
console.
Task dependency
For example, for Gradle to execute the build task, the Java code must first be compiled. Thus, the
build task depends on the compileJava task.
This means that the compileJava task will run before the build task:
$ ./gradlew build
Build scripts can optionally define task dependencies. Gradle then automatically determines the
task execution order.
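A dependency like this can be declared with dependsOn; a minimal sketch with hypothetical task
names:

tasks.register("compile") {
    doLast { println("compiling") }
}
tasks.register("assembleApp") {
    dependsOn("compile") // Gradle always runs compile before assembleApp
}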
Plugin Basics
Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a
sophisticated dependency resolution engine. The rest of its functionality comes from plugins.
A plugin is a piece of software that provides additional functionality to the Gradle build system.
Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-
related capabilities:
Plugin distribution
1. Core plugins - Gradle develops and maintains a set of Core Plugins.
2. Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.
3. Local plugins - Gradle enables users to create custom plugins using APIs.
Applying plugins
Applying a plugin to a project allows the plugin to extend the project’s capabilities.
You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a
version:
plugins {
id «plugin id» version «plugin version»
}
1. Core plugins
Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These
plugins provide essential functionality for building and managing projects.
• groovy: Adds support for compiling and testing Groovy source files.
• ear: Adds support for building EAR files for enterprise applications.
Core plugins are unique in that they provide short names, such as java for the core JavaPlugin,
when applied in build scripts. They also do not require versions. To apply the java plugin to a
project:
build.gradle.kts
plugins {
id("java")
}
There are many Gradle Core Plugins users can take advantage of.
2. Community plugins
Community plugins are plugins developed by the Gradle community, rather than being part of the
core Gradle distribution. These plugins provide additional functionality that may be specific to
certain use cases or technologies.
The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java
applications.
build.gradle.kts
plugins {
id("org.springframework.boot") version "3.1.5"
}
Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can
easily discover and use them.
3. Local plugins
Custom or local plugins are developed and used within a specific project or organization. These
plugins are not shared publicly and are tailored to the specific needs of the project or organization.
Local plugins can encapsulate common build logic, provide integrations with internal systems or
tools, or abstract complex functionality into reusable components.
Gradle provides users with the ability to develop custom plugins using APIs. To create your own
plugin, you’ll typically follow these steps:
1. Define the plugin class: create a new class that implements the Plugin<Project> interface.
2. Build and optionally publish your plugin: generate a JAR file containing your plugin code and
optionally publish this JAR to a repository (local or remote) to be used in other projects.
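Step 1 might look roughly like this (GreetingPlugin and its greet task are hypothetical names):

// Define the plugin class
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Contribute a task to every project the plugin is applied to
        project.tasks.register("greet") {
            doLast { println("Hello from GreetingPlugin") }
        }
    }
}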
// Publish the plugin
plugins {
`maven-publish`
}
publishing {
publications {
create<MavenPublication>("mavenJava") {
from(components["java"])
}
}
repositories {
mavenLocal()
}
}
3. Apply your plugin: when you want to use the plugin, include the plugin ID and version in the
plugins{} block of the build file.
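For example (the plugin id and version below are placeholders):

plugins {
    id("com.example.greeting") version "1.0"
}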
Next Step: Learn about Incremental Builds and Build Caching >>
Gradle uses two main features to reduce build time: incremental builds and build caching.
Incremental builds
An incremental build is a build that avoids running tasks whose inputs have not changed since the
previous build. Re-executing such tasks is unnecessary if they would only reproduce the same
output.
For incremental builds to work, tasks must define their inputs and outputs. Gradle will determine
whether the input or outputs have changed at build time. If they have changed, Gradle will execute
the task. Otherwise, it will skip execution.
Incremental builds are always enabled, and the best way to see them in action is to turn on verbose
mode. With verbose mode, each task state is labeled during a build:
When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is
printed next to the task.
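As a sketch of how a task declares its inputs and outputs (a hypothetical word-counting task, not
from the manual):

abstract class WordCount : DefaultTask() {
    @get:InputFile
    abstract val source: RegularFileProperty // declared input

    @get:OutputFile
    abstract val report: RegularFileProperty // declared output

    @TaskAction
    fun count() {
        val words = source.get().asFile.readText().split(Regex("\\s+")).count { it.isNotBlank() }
        report.get().asFile.writeText("$words words")
    }
}

If neither the source file nor the report changes between builds, Gradle marks the task UP-TO-DATE
and skips it.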
Build caching
Incremental Builds are a great optimization that helps avoid work already done. If a developer
continuously changes a single file, there is likely no need to rebuild all the other files in the project.
However, what happens when the same developer switches to a new branch created last week? The
files are rebuilt, even though the developer is building something that has been built before.
The build cache stores previous build results and restores them when needed. It prevents the
redundant work and cost of executing time-consuming and expensive processes.
When the build cache has been used to repopulate the local directory, the tasks are marked as FROM-
CACHE:
Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and
not FROM-CACHE.
The build cache allows you to share and reuse unchanged build and test outputs across teams. This
speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new
code changes.
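Local caching can be enabled per invocation with the --build-cache option, or persistently by
adding one line to gradle.properties:

org.gradle.caching=true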
Build Scans
Gradle captures your build metadata and sends it to the Build Scan Service. The service then
transforms the metadata into information you can analyze and share with others.
The information that scans collect can be an invaluable resource when troubleshooting,
collaborating on, or optimizing the performance of your builds.
For example, with a build scan, it’s no longer necessary to copy and paste error messages or include
all the details about your environment each time you want to ask a question on Stack Overflow,
Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.
Enable Build Scans
To enable build scans on a gradle command, add --scan to the command line option:
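For example:

$ gradle build --scan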
TIP Not to be confused with the GRADLE_HOME, the optional installation directory for Gradle.
├── caches ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ ├── ⋮
│ ├── jars-3 ③
│ └── modules-2 ③
├── daemon ④
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d ⑤
│ └── my-setup.gradle
├── jdks ⑥
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists ⑦
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties ⑧
The project root directory contains all source files from your project.
It also contains files and directories Gradle generates, such as .gradle and build, as well as the
Gradle configuration directory: gradle.
While gradle is usually checked into source control, build and .gradle directories contain the
output of your builds, caches, and other transient files Gradle uses to support features like
incremental builds.
├── .gradle ①
│ ├── 4.8 ②
│ ├── 4.9 ②
│ └── ⋮
├── build ③
├── gradle
│ └── wrapper ④
├── gradle.properties ⑤
├── gradlew ⑥
├── gradlew.bat ⑥
├── settings.gradle(.kts) ⑦
├── subproject-one ⑧
│   └── build.gradle(.kts) ⑨
├── subproject-two ⑧
│   └── build.gradle(.kts) ⑨
└── ⋮
③ The build directory of this project into which Gradle generates all build artifacts.
While some small projects and monolithic applications may contain a single build file and source
tree, it is often more common for a project to have been split into smaller, interdependent modules.
The word "interdependent" is vital, as you typically want to link the many modules together
through a single build.
Gradle supports this scenario through multi-project builds. This is sometimes referred to as a multi-
module project. Gradle refers to modules as subprojects.
A multi-project build consists of one root project and one or more subprojects.
Multi-Project structure
The following represents the structure of a multi-project build that contains three subprojects:
├── .gradle
│ └── ⋮
├── gradle
│   ├── libs.versions.toml
│ └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts ①
├── sub-project-1
│   └── build.gradle.kts ②
├── sub-project-2
│   └── build.gradle.kts ②
└── sub-project-3
    └── build.gradle.kts ②
The Gradle community has two standards for multi-project build structures:
1. Multi-Project Builds using buildSrc - a multi-project build that uses a buildSrc directory at the
Gradle project root containing reusable build logic.
2. Composite Builds - a build that includes other builds where build-logic is a build directory at
the Gradle project root containing reusable build logic.
Multi-project builds allow you to organize projects with many modules, wire dependencies between
those modules, and easily share common build logic amongst them.
For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation
could be structured as follows:
.
├── gradle
├── gradlew
├── gradlew.bat
├── settings.gradle.kts
├── buildSrc
│   ├── build.gradle.kts
│   └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│   └── build.gradle.kts
├── web-app
│   └── build.gradle.kts
├── api
│   └── build.gradle.kts
├── lib
│   └── build.gradle.kts
└── documentation
    └── build.gradle.kts
The modules will have dependencies between them such as web-app and mobile-app depending on
lib. This means that in order for Gradle to build web-app or mobile-app, it must build lib first.
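Such a link is declared as a project dependency in the consuming module's build script; a sketch
for web-app, assuming the layout above:

// web-app/build.gradle.kts
dependencies {
    implementation(project(":lib"))
}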
settings.gradle.kts
include("mobile-app", "web-app", "api", "lib", "documentation")
NOTE The order in which the subprojects (modules) are included does not matter.
1. Multi-Project Builds using buildSrc
The buildSrc directory is automatically recognized by Gradle. It is a good place to define and
maintain shared configuration or imperative build logic, such as custom tasks or plugins.
If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java
is put in the classpath of the root build script, making it available to any subproject (web-app, mobile-
app, lib, etc…) in the build.
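As a sketch, a precompiled convention plugin living in buildSrc might look like the following (the file name and toolchain version are assumptions; buildSrc itself also needs a build script that applies the `kotlin-dsl` plugin):

```kotlin
// buildSrc/src/main/kotlin/shared-conventions.gradle.kts (hypothetical name)
plugins {
    java
}

// Shared configuration that every subproject applying `shared-conventions` receives
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(17) // assumed version
    }
}

tasks.withType<Test>().configureEach {
    useJUnitPlatform()
}
```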
2. Composite Builds
Composite Builds, also referred to as included builds, are best for sharing logic between builds (not
subprojects) or isolating access to shared build logic (i.e., convention plugins).
Let’s take the previous example. The logic in buildSrc has been turned into a project that contains
plugins and can be published and worked on independently of the root project build.
The plugin is moved to its own build called build-logic with a build script and settings file:
.
├── gradle
├── gradlew
├── [Link]
├── build-logic
│   ├── [Link]
│   └── conventions
│       ├── [Link]
│       └── src/main/kotlin/[Link]
├── mobile-app
│   └── [Link]
├── web-app
│   └── [Link]
├── api
│   └── [Link]
├── lib
│   └── [Link]
└── documentation
    └── [Link]
NOTE The fact that build-logic is located in a subdirectory of the root project is irrelevant. The
folder could be located outside the root project if desired.
[Link]
pluginManagement {
includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")
Multi-Project path
A project path has the following pattern: it starts with an optional colon, which denotes the root
project.
The root project, :, is the only project in a path not specified by its name.
The rest of a project path is a colon-separated sequence of project names, where the next project is
a subproject of the previous project:
:sub-project-1
You can see the project paths when running gradle projects:
------------------------------------------------------------
Root project 'project'
------------------------------------------------------------
Root project 'project'
+--- Project ':sub-project-1'
\--- Project ':sub-project-2'
Project paths usually reflect the filesystem layout, but there are exceptions, most notably for
composite builds.
You can use the gradle projects command to identify the project structure.
$ gradle -q projects
Projects:
------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------
A multi-project build is, like any single-project build, a collection of tasks you can run. The
difference is that you may want to control which project’s tasks get executed.
The following sections will cover your two options for executing tasks in a multi-project build.
The command gradle test will execute the test task in any subprojects relative to the current
working directory that has that task.
If you run the command from the root project directory, you will run test in api, shared,
services:shared and services:webservice.
If you run the command from the services project directory, you will only execute the task in
services:shared and services:webservice.
The basic rule behind Gradle’s behavior is to execute all tasks down the hierarchy with this
name, and to complain if no task with that name is found in any of the subprojects traversed.
NOTE Some task selectors, like help or dependencies, will only run the task on the project they
are invoked on and not on all the subprojects, to reduce the amount of information printed on
the screen.
You can use a task’s fully qualified name to execute a specific task in a particular subproject. For
example: gradle :services:webservice:build will run the build task of the webservice subproject.
The fully qualified name of a task is its project path plus the task name.
This approach works for any task, so if you want to know what tasks are in a particular subproject,
use the tasks task, e.g. gradle :services:webservice:tasks.
The build task is typically used to compile, test, and check a single project.
In multi-project builds, you may often want to do all of these tasks across various projects. The
buildNeeded and buildDependents tasks can help with this.
In this example, the :services:person-service project depends on both the :api and :shared
projects. The :api project also depends on the :shared project.
Assuming you are working on a single project, the :api project, you have been making changes but
have not built the entire project since performing a clean. You want to build any necessary
supporting JARs but only perform code quality and unit tests on the parts of the project you have
changed.
$ gradle :api:build
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
BUILD SUCCESSFUL in 0s
If you have just gotten the latest version of the source from your version control system, which
included changes in other projects that :api depends on, you might want to build all the projects
you depend on AND test them too.
The buildNeeded task builds AND tests all the projects from the project dependencies of the
testRuntime configuration:
$ gradle :api:buildNeeded
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :shared:assemble
> Task :shared:compileTestJava
> Task :shared:processTestResources
> Task :shared:testClasses
> Task :shared:test
> Task :shared:check
> Task :shared:build
> Task :shared:buildNeeded
> Task :api:buildNeeded
BUILD SUCCESSFUL in 0s
You may want to refactor some part of the :api project used in other projects. If you make these
changes, testing only the :api project is insufficient. You must test all projects that depend on the
:api project.
The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime
configuration) on the specified project:
$ gradle :api:buildDependents
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :services:person-service:compileJava
> Task :services:person-service:processResources
> Task :services:person-service:classes
> Task :services:person-service:jar
> Task :services:person-service:assemble
> Task :services:person-service:compileTestJava
> Task :services:person-service:processTestResources
> Task :services:person-service:testClasses
> Task :services:person-service:test
> Task :services:person-service:check
> Task :services:person-service:build
> Task :services:person-service:buildDependents
> Task :api:buildDependents
BUILD SUCCESSFUL in 0s
Finally, you can build and test everything in all projects. Any task you run in the root project folder
will cause that same-named task to be run on all the children.
You can run gradle build to build and test ALL projects.
Build Lifecycle
As a build author, you define tasks and specify dependencies between them. Gradle guarantees that
tasks will execute in the order dictated by these dependencies.
Your build scripts and plugins configure this task dependency graph.
For example, if your project includes tasks such as build, assemble, and createDocs, you can
configure the build script so that they are executed in the order: build → assemble → createDocs.
Task Graphs
Across all projects in the build, tasks form a Directed Acyclic Graph (DAG).
This diagram shows two example task graphs, one abstract and the other concrete, with
dependencies between tasks represented as arrows:
Both plugins and build scripts contribute to the task graph via the task dependency mechanism and
annotated inputs/outputs.
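Edges in this graph come from explicit task dependencies, among other mechanisms. A minimal sketch (the createDocs task is hypothetical, echoing the earlier example):

```kotlin
// An explicit dependency makes createDocs run only after assemble has finished
tasks.register("createDocs") {
    dependsOn("assemble") // adds an edge assemble → createDocs to the task graph
    doLast {
        println("generating docs")
    }
}
```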
Build Phases
Phase 1. Initialization
• Detects the [Link](.kts) file.
• Evaluates the settings file to determine which projects (and included builds) make up the
build.
Phase 2. Configuration
• Evaluates the build scripts of every project participating in the build.
• Creates a task graph for the requested tasks.
Phase 3. Execution
• Schedules and executes the selected tasks.
Example
The following example shows which parts of settings and build files correspond to various build
phases:
[Link]
[Link] = "basic"
println("This is executed during the initialization phase.")
[Link]
[Link]("test") {
doLast {
println("This is executed during the execution phase.")
}
}
[Link]("testBoth") {
doFirst {
println("This is executed first during the execution phase.")
}
doLast {
println("This is executed last during the execution phase.")
}
println("This is executed during the configuration phase as well, because :testBoth is used in the build.")
}
[Link]
[Link] = 'basic'
println 'This is executed during the initialization phase.'
[Link]
[Link]('configured') {
println 'This is also executed during the configuration phase, because :configured is used in the build.'
}
[Link]('test') {
doLast {
println 'This is executed during the execution phase.'
}
}
[Link]('testBoth') {
doFirst {
println 'This is executed first during the execution phase.'
}
doLast {
println 'This is executed last during the execution phase.'
}
println 'This is executed during the configuration phase as well, because :testBoth is used in the build.'
}
The following command executes the test and testBoth tasks specified above. Because Gradle
only configures requested tasks and their dependencies, the configured task is never configured:
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Phase 1. Initialization
In the initialization phase, Gradle detects the set of projects (root and subprojects) and included
builds participating in the build.
Gradle first evaluates the settings file, [Link](.kts), and instantiates a Settings object.
Then, Gradle instantiates Project instances for each project.
Phase 2. Configuration
In the configuration phase, Gradle adds tasks and other properties to the projects found by the
initialization phase.
Phase 3. Execution
Gradle uses the task execution graphs generated by the configuration phase to determine which
tasks to execute.
Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root
directory.
When the settings file [Link](.kts) is found, Gradle instantiates a Settings object.
One of the purposes of the Settings object is to allow you to declare all the projects to be included in
the build.
Settings Scripts
The settings script is either a [Link] file in Groovy or a [Link] file in Kotlin.
Before Gradle assembles the projects for a build, it creates a Settings instance and executes the
settings file against it.
As the settings script executes, it configures this Settings. Therefore, the settings file defines the
Settings object.
Many top-level properties and blocks in a settings script are part of the Settings API.
For example, we can set the root project name in the settings script using the [Link]
property:
[Link]
[Link] = "application"
[Link]
[Link] = 'application'
Standard Settings properties
The Settings object exposes a standard set of properties in your settings script.
Name         Description
buildCache   The build cache configuration.
plugins      The container of plugins that have been applied to the settings.
rootDir      The root directory of the build. The root directory is the project directory of the root project.
rootProject  The root project of the build.
settings     Returns this settings object.

Name            Description
include()       Adds the given projects to the build.
includeBuild()  Includes a build at the specified path to the composite build.
A Settings script is a series of method calls to the Gradle API that often use { … }, a special
shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a
closure in Groovy.
Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:
plugins(function() {
id("plugin")
})
The code inside the function is executed against a this object called a receiver in Kotlin lambda and
a delegate in Groovy closure. Gradle determines the correct this object and invokes the correct
corresponding method. The this of the method invocation id("plugin") object is of type
PluginDependenciesSpec.
The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script
line by line, top to bottom.
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
plugins { ②
id("[Link]-resolver-convention") version "0.8.0"
}
[Link] = "simple-project" ③
dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}
include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")
[Link]
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
plugins { ②
id("[Link]-resolver-convention") version "0.8.0"
}
[Link] = 'simple-project' ③
dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}
include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")
The settings file can manage plugin versions and repositories for your build using the
pluginManagement block. It provides a way to define which plugins should be used in your project
and from which repositories they should be resolved.
[Link]
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
[Link]
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
The settings file can optionally apply plugins that are required for configuring the settings of the
project. Common examples are the Develocity plugin and the Toolchain Resolver plugin, the latter
of which is shown in the example below.
Plugins applied in the settings file only affect the Settings object.
[Link]
plugins { ②
id("[Link]-resolver-convention") version "0.8.0"
}
[Link]
plugins { ②
id("[Link]-resolver-convention") version "0.8.0"
}
The settings file defines your project name using the [Link] property:
[Link]
[Link] = "simple-project" ③
[Link]
[Link] = 'simple-project' ③
The settings file can optionally define rules and configurations for dependency resolution across
your project(s). It provides a centralized way to manage and customize dependency resolution.
[Link]
dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}
[Link]
dependencyResolutionManagement { ④
repositories {
mavenCentral()
}
}
The settings file defines the structure of the project by adding all the subprojects using the include
statement:
[Link]
include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")
[Link]
include("sub-project-a") ⑤
include("sub-project-b")
include("sub-project-c")
There are many more properties and methods on the Settings object that you can use to configure
your build.
It’s important to remember that while many Gradle scripts are typically written in short Groovy or
Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings
object in the Gradle API:
include("app")
Is actually:
[Link]("app")
Additionally, the full power of the Groovy and Kotlin languages is available to you.
For example, instead of using include many times to add subprojects, you can iterate over the list of
directories in the project root folder and include them automatically:
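The elided example might be sketched as follows in the Kotlin settings DSL (the check for the presence of a build script file is an assumption):

```kotlin
// settings.gradle.kts — include every top-level directory that contains a build script
import java.io.File

rootDir.listFiles()
    ?.filter { it.isDirectory && (File(it, "build.gradle.kts").exists() || File(it, "build.gradle").exists()) }
    ?.forEach { include(it.name) }
```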
Then, for each project included in the settings file, Gradle creates a Project instance.
Gradle then looks for a corresponding build script file, which is used in the configuration phase.
Build Scripts
Every Gradle build comprises one or more projects: a root project and, optionally, subprojects.
A project typically corresponds to a software component that needs to be built, like a library or an
application. It might represent a library JAR, a web application, or a distribution ZIP assembled
from the JARs produced by other projects.
On the other hand, it might represent a thing to be done, such as deploying your application to
staging or production environments.
Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).
A build script configures a project and is associated with an object of type Project.
The build script is either a *.gradle file in Groovy or a *.[Link] file in Kotlin.
Many top-level properties and blocks in a build script are part of the Project API.
For example, the following build script uses the [Link] property to print the name of the
project:
[Link]
println(name)
println([Link])
[Link]
println name
println [Link]
$ gradle -q check
project-api
project-api
The first uses the top-level reference to the name property of the Project object. The second
statement uses the project property available to any build script, which returns the associated
Project object.
The Project object exposes a standard set of properties in your build script.
Name    Description
uri()   Resolves a file path to a URI, relative to the project directory of this project.
task()  Creates a Task with the given name and adds it to this project.
A build script is a series of method calls to the Gradle API that often use { … }, a special
shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a
closure in Groovy.
Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy
closure object is passed as the argument. It is the short form for:
plugins(function() {
id("plugin")
})
The code inside the function is executed against a this object called a receiver in Kotlin lambda and
a delegate in Groovy closure. Gradle determines the correct this object and invokes the correct
corresponding method. The this of the method invocation id("plugin") object is of type
PluginDependenciesSpec.
The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes
the script line by line, top to bottom.
[Link]
plugins { ①
id("application")
}
repositories { ②
mavenCentral()
}
dependencies { ③
testImplementation("[Link]:junit-jupiter-engine:5.9.3")
testRuntimeOnly("[Link]:junit-platform-launcher")
implementation("[Link]:guava:32.1.1-jre")
}
application { ④
mainClass = "[Link]"
}
[Link]<Test>("test") { ⑤
useJUnitPlatform()
}
[Link]<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}
[Link]<Zip>("zip-reports") {
from("Reports/")
include("*")
[Link]("[Link]")
[Link](file("/dir"))
}
[Link]
plugins { ①
id 'application'
}
repositories { ②
mavenCentral()
}
dependencies { ③
testImplementation '[Link]:junit-jupiter-engine:5.9.3'
testRuntimeOnly '[Link]:junit-platform-launcher'
implementation '[Link]:guava:32.1.1-jre'
}
application { ④
mainClass = '[Link]'
}
[Link]('test', Test) { ⑤
useJUnitPlatform()
}
[Link]('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}
[Link]('zip-reports', Zip) {
from 'Reports/'
include '*'
archiveFileName = '[Link]'
destinationDirectory = file('/dir')
}
① Apply plugins.
② Define the locations where dependencies can be found.
③ Add dependencies.
④ Set properties.
⑤ Register and configure tasks.
Plugins are used to extend Gradle. They are also used to modularize and reuse project
configurations.
[Link]
plugins { ①
id("application")
}
[Link]
plugins { ①
id 'application'
}
In the example, the application plugin, which is included with Gradle, has been applied, describing
our project as a Java application.
A project generally has a number of dependencies it needs to do its work. Dependencies include
plugins, libraries, or components that Gradle must download for the build to succeed.
The build script lets Gradle know where to look for the binaries of the dependencies. More than one
location can be provided:
[Link]
repositories { ②
mavenCentral()
}
[Link]
repositories { ②
mavenCentral()
}
In the example, the guava library and the JetBrains Kotlin plugin ([Link]) will be
downloaded from the Maven Central Repository.
3. Add dependencies
A project generally has a number of dependencies it needs to do its work. These dependencies are
often libraries of precompiled classes that are imported in the project’s source code.
Dependencies are managed via configurations and are retrieved from repositories.
[Link]
dependencies { ③
testImplementation("[Link]:junit-jupiter-engine:5.9.3")
testRuntimeOnly("[Link]:junit-platform-launcher")
implementation("[Link]:guava:32.1.1-jre")
}
[Link]
dependencies { ③
testImplementation '[Link]:junit-jupiter-engine:5.9.3'
testRuntimeOnly '[Link]:junit-platform-launcher'
implementation '[Link]:guava:32.1.1-jre'
}
In the example, the application code uses Google’s guava libraries. Guava provides utility methods
for collections, caching, primitives support, concurrency, common annotations, string processing,
I/O, and validations.
4. Set properties
The Project object has an associated ExtensionContainer object that contains all the settings and
properties for the plugins that have been applied to the project.
In the example, the application plugin added an application property, which is used to detail the
main class of our Java application:
[Link]
application { ④
mainClass = "[Link]"
}
[Link]
application { ④
mainClass = '[Link]'
}
Tasks perform some basic piece of work, such as compiling classes, or running unit tests, or zipping
up a WAR file.
While tasks are typically defined in plugins, you may need to register or configure tasks in build
scripts.
[Link]
[Link]<Zip>("zip-reports") {
from("Reports/")
include("*")
[Link]("[Link]")
[Link](file("/dir"))
}
[Link]
[Link]('zip-reports', Zip) {
from 'Reports/'
include '*'
archiveFileName = '[Link]'
destinationDirectory = file('/dir')
}
You may have seen usage of the [Link]([Link]) method, which should be avoided:
[Link]<Zip>("zip-reports") { }
TIP register(), which enables task configuration avoidance, is preferred over create().
[Link]
[Link]<Test>("test") { ⑤
useJUnitPlatform()
}
[Link]
[Link]('test', Test) { ⑤
useJUnitPlatform()
}
The example below configures the Javadoc task to automatically generate HTML documentation
from Java code:
[Link]
[Link]<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}
[Link]
[Link]('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}
Build Scripting
println([Link]);
Statements can include method calls, property assignments, and local variable definitions:
version = '[Link]'
configurations {
}
repositories {
google()
}
[Link]
[Link]("upper") {
doLast {
val someString = "mY_nAmE"
println("Original: $someString")
println("Upper case: ${[Link]()}")
}
}
[Link]
[Link]('upper') {
doLast {
String someString = 'mY_nAmE'
println "Original: $someString"
println "Upper case: ${[Link]()}"
}
}
$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME
A build script can also contain elements allowed in a Groovy or Kotlin script, such as method
definitions and class definitions:
[Link]
[Link]("count") {
doLast {
repeat(4) { print("$it ") }
}
}
[Link]
[Link]('count') {
doLast {
[Link] { print "$it " }
}
}
$ gradle -q count
0 1 2 3
Flexible task registration
Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:
[Link]
[Link]
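The elided loop could be reconstructed as follows in the Kotlin DSL, matching the output shown for gradle -q task1:

```kotlin
// Register four tasks named task0..task3 in a loop
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
```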
$ gradle -q task1
I'm task number 1
Gradle Types
In Gradle, types, properties, and providers are foundational for managing and configuring build
logic:
• Types: Gradle defines types (like Task, Configuration, File, etc.) to represent build components.
You can extend these types to create custom tasks or domain objects.
• Properties: Gradle properties (e.g., Property<T>, ListProperty<T>, SetProperty<T>) are used for
build configuration. They allow lazy evaluation, meaning their values are calculated only when
needed, enhancing flexibility and performance.
• Providers: A Provider<T> represents a value that is computed or retrieved lazily. Providers are
often used with properties to defer value computation until necessary. This is especially useful
for integrating dynamic, runtime values into your build.
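A short sketch of a lazily evaluated property on a custom task type (the task and property names are assumptions):

```kotlin
// A custom task type with a lazily evaluated input property
abstract class StampTask : DefaultTask() {
    @get:Input
    abstract val stamp: Property<String>

    @TaskAction
    fun printStamp() {
        // The provider is only queried here, at execution time
        println("Stamp: ${stamp.get()}")
    }
}

tasks.register<StampTask>("stamp") {
    // Wiring a Provider defers the computation until the task runs
    stamp.set(providers.provider { System.currentTimeMillis().toString() })
}
```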
Build scripts can declare two variables: local variables and extra properties.
Local Variables
Declare local variables with the val keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Kotlin language.
Declare local variables with the def keyword. Local variables are only visible in the scope where
they have been declared. They are a feature of the underlying Groovy language.
[Link]
val dest = "dest"
[Link]<Copy>("copy") {
from("source")
into(dest)
}
[Link]
def dest = 'dest'
[Link]('copy', Copy) {
from 'source'
into dest
}
Extra Properties
Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined
properties.
Add, read, and set extra properties via the owning object’s extra property. Alternatively, you can
access extra properties via Kotlin delegated properties using by extra.
Add, read, and set extra properties via the owning object’s ext property. Alternatively, you can use
an ext block to add multiple properties simultaneously.
[Link]
plugins {
id("java-library")
}
val springVersion by extra("[Link]")
val emailNotification by extra("build@[Link]")
sourceSets {
main {
extra["purpose"] = "production"
}
test {
extra["purpose"] = "test"
}
create("plugin") {
extra["purpose"] = "production"
}
}
[Link]("printProperties") {
val springVersion = springVersion
val emailNotification = emailNotification
val productionSourceSets = provider {
[Link] { [Link]["purpose"] == "production" }.map { [Link] }
}
doLast {
println(springVersion)
println(emailNotification)
[Link]().forEach { println(it) }
}
}
[Link]
plugins {
id 'java-library'
}
ext {
springVersion = "[Link]"
emailNotification = "build@[Link]"
}
sourceSets {
main {
purpose = "production"
}
test {
purpose = "test"
}
plugin {
purpose = "production"
}
}
[Link]('printProperties') {
def springVersion = springVersion
def emailNotification = emailNotification
def productionSourceSets = provider {
[Link] { [Link] == "production" }.collect { [Link] }
}
doLast {
println springVersion
println emailNotification
[Link]().each { println it }
}
}
$ gradle -q printProperties
[Link]
build@[Link]
main
plugin
This example adds two extra properties to the project object via by extra. Additionally, this
example adds a property named purpose to each source set by setting extra["purpose"]. Once
added, you can read and set these properties via extra.
This example adds two extra properties to the project object via an ext block. Additionally, this
example adds a property named purpose to each source set by setting [Link]. Once added, you
can read and set all these properties just like predefined ones.
Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows
Gradle to recognize when a script attempts to set a property that does not exist. You can access
extra properties anywhere where you can access their owning object. This gives extra properties a
wider scope than local variables. Subprojects can access extra properties on their parent projects.
For more information about extra properties, see ExtraPropertiesExtension in the API
documentation.
Configure Arbitrary Objects
[Link]
class UserInfo(
var name: String? = null,
var email: String? = null
)
[Link]("greet") {
val user = UserInfo().apply {
name = "Isaac Newton"
email = "isaac@[Link]"
}
doLast {
println([Link])
println([Link])
}
}
[Link]
class UserInfo {
String name
String email
}
[Link]('greet') {
def user = configure(new UserInfo()) {
name = "Isaac Newton"
email = "isaac@[Link]"
}
doLast {
println [Link]
println [Link]
}
}
$ gradle -q greet
Isaac Newton
isaac@[Link]
Closure Delegates
Each closure has a delegate object. Groovy uses this delegate to look up variable and method
references to nonlocal variables and closure parameters. Gradle uses this for configuration closures,
where the delegate object refers to the object being configured.
[Link]
dependencies {
assert delegate == [Link]
testImplementation('junit:junit:4.13')
[Link]('junit:junit:4.13')
}
Default imports
To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.
import [Link].*
Using Tasks
The work that Gradle can do on a project is defined by one or more tasks.
A task represents some independent unit of work that a build performs. This might be compiling
some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.
When a user runs ./gradlew build in the command line, Gradle will execute the build task along
with any other tasks it depends on.
Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks:
------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project
'myTutorial'.
...
build.gradle.kts
plugins {
id("application")
}
build.gradle
plugins {
id 'application'
}
$ ./gradlew tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
Other tasks
-----------
compileJava - Compiles main Java source.
...
Many of these tasks, such as assemble, build, and run, should be familiar to a developer.
Task classification
There are two classes of tasks that can be executed:
1. Actionable tasks have some action(s) attached to do work in your build: compileJava.
2. Lifecycle tasks are tasks with no actions attached: assemble, build.
Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at
once.
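As an illustration, a lifecycle task can be sketched in the Kotlin DSL as a task with dependencies but no actions of its own (the task name allDocs here is hypothetical):

```kotlin
// A hypothetical lifecycle task: it performs no work itself and only
// wires together other (actionable) tasks via dependsOn.
tasks.register("allDocs") {
    group = "documentation"
    description = "Runs every documentation-producing task."
    dependsOn("javadoc")
}
```

Running ./gradlew allDocs would execute javadoc (an actionable task), while allDocs itself does no work.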
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
In the example, the build script registers a single task called hello using the TaskContainer API,
and adds an action to it.
If the tasks in the project are listed, the hello task is available to Gradle:
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.
You can execute the task in the build script with ./gradlew hello:
$ ./gradlew hello
Hello world!
When Gradle executes the hello task, it executes the action provided. In this case, the action is
simply a block containing some code: println("Hello world!").
The hello task from the previous section can be detailed with a description and assigned to a
group with the following update:
build.gradle.kts
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
}
build.gradle
tasks.register('hello') {
group = 'Custom'
description = 'A lovely greeting task.'
doLast {
println 'Hello world!'
}
}
Custom tasks
------------------
hello - A lovely greeting task.
To view information about a task, use the help --task <task-name> command:
Path
:app:hello
Type
Task (org.gradle.api.Task)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom
Task dependencies
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
[Link]("intro") {
dependsOn("hello")
doLast {
println("I'm Gradle")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
tasks.register('intro') {
dependsOn 'hello'
doLast {
println "I'm Gradle"
}
}
$ gradle -q intro
Hello world!
I'm Gradle
build.gradle.kts
tasks.register("taskX") {
dependsOn("taskY")
doLast {
println("taskX")
}
}
[Link]("taskY") {
doLast {
println("taskY")
}
}
build.gradle
tasks.register('taskX') {
dependsOn 'taskY'
doLast {
println 'taskX'
}
}
tasks.register('taskY') {
doLast {
println 'taskY'
}
}
$ gradle -q taskX
taskY
taskX
The hello task from the previous example is updated to include a dependency:
build.gradle.kts
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}
build.gradle
tasks.register('hello') {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}
The hello task now depends on the assemble task, which means that Gradle must execute the
assemble task before it can execute the hello task:
$ ./gradlew :app:hello
> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:startScripts UP-TO-DATE
> Task :app:distTar UP-TO-DATE
> Task :app:distZip UP-TO-DATE
> Task :app:assemble UP-TO-DATE
Task configuration
Once registered, tasks can be accessed via the TaskProvider API for further configuration.
For instance, you can use this to add dependencies to a task at runtime dynamically:
build.gradle.kts
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
tasks.named("task0") { dependsOn("task2", "task3") }
build.gradle
4.times { counter ->
    tasks.register("task$counter") {
        doLast {
            println "I'm task number $counter"
        }
    }
}
tasks.named('task0') { dependsOn('task2', 'task3') }
$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0
build.gradle.kts
tasks.register("hello") {
doLast {
println("Hello Earth")
}
}
[Link]("hello") {
doFirst {
println("Hello Venus")
}
}
[Link]("hello") {
doLast {
println("Hello Mars")
}
}
[Link]("hello") {
doLast {
println("Hello Jupiter")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'Hello Earth'
}
}
tasks.named('hello') {
doFirst {
println 'Hello Venus'
}
}
tasks.named('hello') {
doLast {
println 'Hello Mars'
}
}
tasks.named('hello') {
doLast {
println 'Hello Jupiter'
}
}
$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter
TIP: The calls doFirst and doLast can be executed multiple times. They add an action to the
beginning or the end of the task’s actions list. When the task executes, the actions in
the action list are executed in order.
Here is an example of the named method being used to configure a task added by a plugin:
build.gradle.kts
tasks.dokkaHtml {
    outputDirectory.set(buildDir)
}
build.gradle
tasks.named("dokkaHtml") {
    outputDirectory.set(buildDir)
}
Task types
build.gradle.kts
abstract class HelloTask : DefaultTask() {
    @TaskAction
    fun greet() {
        println("Hello world!")
    }
}
tasks.register<HelloTask>("hello") {
    group = "Custom tasks"
    description = "A lovely greeting task."
}
build.gradle
abstract class HelloTask extends DefaultTask {
    @TaskAction
    void greet() {
        println 'Hello world!'
    }
}
tasks.register('hello', HelloTask) {
    group = 'Custom tasks'
    description = 'A lovely greeting task.'
}
$ ./gradlew help --task hello
Type
HelloTask (Build_gradle$HelloTask)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom tasks
Gradle provides many built-in task types with common and popular functionality, such as copying
or deleting files.
This example task copies *.war files from the source directory to the target directory using the Copy
built-in task:
build.gradle.kts
tasks.register<Copy>("copyTask") {
from("source")
into("target")
include("*.war")
}
build.gradle
tasks.register('copyTask', Copy) {
from("source")
into("target")
include("*.war")
}
There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar,
JacocoReport, Sign, and Delete, which are available in the DSL reference.
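For instance, a Zip task can be configured in much the same way as the Copy task above; this is a sketch (the task name zipReports and the paths are illustrative):

```kotlin
// Hypothetical example: bundle generated reports into a single archive
// using the built-in Zip task type.
tasks.register<Zip>("zipReports") {
    from("build/reports")
    destinationDirectory.set(layout.buildDirectory.dir("archives"))
    archiveFileName.set("reports.zip")
}
```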
However, the generic DefaultTask provides no action for Gradle. If users want to extend the
capabilities of Gradle and their build script, they must either use a built-in task or create a custom
task:
1. Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc…
2. Custom task - Gradle allows users to subclass DefaultTask to create their own task types.
Create a task
The simplest and quickest way to create a custom task is in a build script. To create a task, inherit
from the DefaultTask class and implement a @TaskAction handler:
build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        val myFile = File("myfile.txt")
        myFile.createNewFile()
        myFile.writeText("HELLO FROM MY TASK")
    }
}
build.gradle
abstract class CreateFileTask extends DefaultTask {
    @TaskAction
    void action() {
        def myFile = new File("myfile.txt")
        myFile.createNewFile()
        myFile.text = "HELLO FROM MY TASK"
    }
}
The CreateFileTask implements a simple set of actions. First, a file called "myfile.txt" is created in
the main project. Then, some text is written to the file.
Register a task
A task is registered in the build script using the tasks.register() method, which allows it to then
be used in the build logic.
build.gradle.kts
tasks.register<CreateFileTask>("createFileTask")
build.gradle
tasks.register("createFileTask", CreateFileTask)
Setting the group and description properties on your tasks can help users understand how to use
your task:
build.gradle.kts
tasks.register<CreateFileTask>("createFileTask") {
    group = "custom"
    description = "Create myfile.txt in the current directory"
}
build.gradle
tasks.register("createFileTask", CreateFileTask) {
    group = "custom"
    description = "Create myfile.txt in the current directory"
}
For the task to do useful work, it typically needs some inputs. A task typically produces outputs.
build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @Input
    val fileName = "myfile.txt"
    @OutputFile
    val myFile: File = File(fileName)
    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText("HELLO FROM MY TASK")
    }
}
build.gradle
abstract class CreateFileTask extends DefaultTask {
    @Input
    final String fileName = "myfile.txt"
    @OutputFile
    final File myFile = new File(fileName)
    @TaskAction
    void action() {
        myFile.createNewFile()
        myFile.text = "HELLO FROM MY TASK"
    }
}
Configure a task
The CreateAFileTask class is updated so that the text in the file is configurable:
build.gradle.kts
abstract class CreateAFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>
    @Input
    val fileName = "myfile.txt"
    @OutputFile
    val myFile: File = File(fileName)
    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}
tasks.register<CreateAFileTask>("createAFileTask") {
    group = "custom"
    description = "Create myfile.txt in the current directory"
    fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}
tasks.named<CreateAFileTask>("createAFileTask") {
    fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
build.gradle
abstract class CreateAFileTask extends DefaultTask {
    @Input
    abstract Property<String> getFileText()
    @Input
    final String fileName = "myfile.txt"
    @OutputFile
    final File myFile = new File(fileName)
    @TaskAction
    void action() {
        myFile.createNewFile()
        myFile.text = fileText.get()
    }
}
tasks.register('createAFileTask', CreateAFileTask) {
    group = "custom"
    description = "Create myfile.txt in the current directory"
    fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}
tasks.named('createAFileTask', CreateAFileTask) {
    fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
In the named() method, we find the createAFileTask task and set the text that will be written to the
file.
$ ./gradlew createAFileTask
BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date
Using Plugins
Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with
Gradle, third-party plugins, and script plugins defined within builds.
Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g.,
locating Java source at src/main/java), and extend core or other plugin objects.
Plugins in Gradle are essential for automating common build tasks, integrating with external tools
or services, and tailoring the build process to meet specific project needs. They also serve as the
primary mechanism for organizing build logic.
Benefits of plugins
Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins
offer several advantages over adding logic directly to the build script:
• Promotes Reusability: Reduces the need to duplicate similar logic across projects.
• Enhances Modularity: Allows for a more modular and organized build script.
• Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.
Plugin distribution
You can leverage plugins from Gradle and the Gradle community or create your own:
1. Core plugins - a set of plugins that Gradle develops and maintains as part of its distribution.
2. Community plugins - Gradle plugins shared in a remote repository such as Maven or the
Gradle Plugin Portal.
3. Local plugins - custom plugins that users create and use within their own builds.
Types of plugins
Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:
1. Script Plugins
Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build script
using the apply from: syntax. They are applied inline within a build script to add functionality or
customize the build process. They are not recommended, but it’s important to understand how they
work:
build.gradle.kts
// Define a plugin
class HelloWorldPlugin : Plugin<Project> {
override fun apply(project: Project) {
[Link]("helloWorld") {
group = "Example"
description = "Prints 'Hello, World!' to the console"
doLast {
println("Hello, World!")
}
}
}
}
2. Precompiled Script Plugins
Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as Java
class files packaged in some library. They are meant to be consumed as a binary Gradle plugin, so
they are applied to a project using the plugins {} block. The plugin ID by which the precompiled
script can be referenced is derived from its name and optional package declaration.
plugin/src/main/kotlin/my-plugin.gradle.kts
consumer/build.gradle.kts
plugins {
id("my-plugin") version "1.0"
}
3. Convention Plugins
These are a hybrid of precompiled plugins and binary plugins that provide a way to reuse complex
logic across projects and allow for better organization of build logic.
buildSrc/src/main/kotlin/shared-build-conventions.gradle.kts
plugins {
java
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("org.junit.jupiter:junit-jupiter:5.8.1")
implementation("com.google.guava:guava:30.1.1-jre")
}
[Link]<Test>("test") {
useJUnitPlatform()
}
[Link]<Copy>("backupTestXml") {
from("build/test-results/test")
into("/tmp/results/")
exclude("binary/**")
}
app/build.gradle.kts
plugins {
application
id("shared-build-conventions")
}
4. Binary Plugins
Binary plugins are compiled plugins typically written in Java or Kotlin DSL that are packaged as
JAR files. They are applied to a project using the plugins {} block. They offer better performance
and maintainability compared to script plugins or precompiled script plugins.
plugin/src/main/kotlin/plugin/MyPlugin.kt
consumer/build.gradle.kts
plugins {
id("my-plugin") version "1.0"
}
The difference between a binary plugin and a script plugin lies in how they are shared and
executed:
• A script plugin is shared as source code, and it is compiled at the time of use.
• A binary plugin is shared as compiled bytecode, packaged in a JAR file, and ready to execute.
Binary plugins can be written in any language that produces JVM bytecode, such as Java, Kotlin, or
Groovy. In contrast, script plugins can only be written using Kotlin DSL or Groovy DSL.
However, there is also a middle ground: precompiled script plugins. These are written in Kotlin
DSL or Groovy DSL, like script plugins, but are compiled into bytecode and shared like binary
plugins.
A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes
more valuable, it’s migrated to a binary plugin that can be easily tested and shared between
multiple projects or organizations.
Using plugins
To use the build logic encapsulated in a plugin, Gradle needs to perform two steps. First, it needs to
resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.
1. Resolving a plugin means finding the correct version of the JAR that contains a given plugin
and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build
script. Script plugins are self-resolving in that they are resolved from the specific file path or
URL provided when applying them. Core binary plugins provided as part of the Gradle
distribution are automatically resolved.
The plugins DSL is recommended to resolve and apply plugins in one step.
Resolving plugins
Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of
its distribution, which means they are automatically resolved.
Core plugins are applied in a build script using the plugin name:
plugins {
id «plugin name»
}
For example:
plugins {
id("java")
}
Non-core plugins must be resolved before they can be applied. Non-core plugins are identified by a
unique ID and a version in the build file:
plugins {
id «plugin id» version «plugin version»
}
And the location of the plugin must be specified in the settings file:
settings.gradle.kts
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
settings.gradle
pluginManagement { ①
repositories {
gradlePluginPortal()
}
}
id("[Link]")
version "2.1.0"
}
classpath("[Link]
kinfo:gradle-taskinfo:2.1.0")
}
}
apply(plugin =
"[Link]")
The plugins DSL provides a concise and convenient way to declare plugin dependencies.
plugins {
application // by name
java // by name
id("java") // by id - recommended
id("[Link]") version "1.9.0" // by id - recommended
}
Core Gradle plugins are unique in that they provide short names, such as java for the core
JavaPlugin.
build.gradle.kts
plugins {
java
}
build.gradle
plugins {
id 'java'
}
All other binary plugins must use the fully qualified form of the plugin id (e.g.,
com.github.johnrengelman.shadow).
To apply a community plugin from the Gradle Plugin Portal, the fully qualified plugin id, a globally
unique identifier, must be used:
build.gradle.kts
plugins {
id("[Link]") version "3.3.1"
}
build.gradle
plugins {
id 'org.springframework.boot' version '3.3.1'
}
The plugins DSL provides a convenient syntax for users and the ability for Gradle to quickly
determine which plugins are used. This allows Gradle to:
• Provide editors with detailed information about the potential properties and values in the build
script.
There are some key differences between the plugins {} block mechanism and the "traditional"
apply() method mechanism. There are also some constraints and possible limitations.
The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the
settings.gradle(.kts) file. It must appear before any other block. It cannot be used in script plugins
or init scripts.
Constrained Syntax
It is constrained to be idempotent (produce the same result every time) and side effect-free (safe for
Gradle to execute at any time).
plugins {
id(«plugin id») ①
id(«plugin id») version «plugin version» ②
}
① for core Gradle plugins or plugins already available to the build script
② for binary Gradle plugins that need to be resolved
Where «plugin id» and «plugin version» must be constant, literal strings.
The plugins{} block must also be a top-level statement in the build script. It cannot be nested inside
another construct (e.g., an if-statement or for-loop).
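As a sketch of what the constraint rules out, the following (hypothetical) build script is rejected because the version is not a literal string and the block is nested inside a conditional:

```kotlin
// NOT allowed: the plugins {} block must be top-level,
// with constant, literal id and version strings.
// val kotlinVersion = "1.9.0"
// if (isCi) {
//     plugins {
//         id("org.jetbrains.kotlin.jvm") version kotlinVersion
//     }
// }

// Allowed: top-level block, literal id and version.
plugins {
    id("org.jetbrains.kotlin.jvm") version "1.9.0"
}
```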
If you have a multi-project build, you probably want to apply plugins to some or all of the
subprojects in your build but not to the root project.
While the default behavior of the plugins{} block is to immediately resolve and apply the plugins,
you can use the apply false syntax to tell Gradle not to apply the plugin to the current project.
Then, use the plugins{} block without the version in subprojects' build scripts:
settings.gradle.kts
include("hello-a")
include("hello-b")
include("goodbye-c")
build.gradle.kts
plugins {
// These plugins are not automatically applied.
// They can be applied in subprojects as needed (in their respective
build files).
id("[Link]") version "1.0.0" apply false
id("[Link]") version "1.0.0" apply false
}
allprojects {
// Apply the common 'java' plugin to all projects (including the root)
[Link]("java")
}
subprojects {
// Apply the 'java-library' plugin to all subprojects (excluding the
root)
[Link]("java-library")
}
hello-a/build.gradle.kts
plugins {
id("com.example.hello")
}
hello-b/build.gradle.kts
plugins {
id("com.example.hello")
}
goodbye-c/build.gradle.kts
plugins {
id("com.example.goodbye")
}
settings.gradle
include 'hello-a'
include 'hello-b'
include 'goodbye-c'
build.gradle
plugins {
// These plugins are not automatically applied.
// They can be applied in subprojects as needed (in their respective
build files).
id 'com.example.hello' version '1.0.0' apply false
id 'com.example.goodbye' version '1.0.0' apply false
}
allprojects {
// Apply the common 'java' plugin to all projects (including the root)
apply(plugin: 'java')
}
subprojects {
// Apply the 'java-library' plugin to all subprojects (excluding the
root)
apply(plugin: 'java-library')
}
hello-a/build.gradle
plugins {
id 'com.example.hello'
}
hello-b/build.gradle
plugins {
id 'com.example.hello'
}
goodbye-c/build.gradle
plugins {
id 'com.example.goodbye'
}
You can also encapsulate the versions of external plugins by composing the build logic using your
own convention plugins.
buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins)
used in building the main project. You can apply plugins that reside in a project’s buildSrc directory
as long as they have a defined ID.
The following example shows how to tie the plugin implementation class MyPlugin, defined in
buildSrc, to the id "my-plugin":
buildSrc/build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("myPlugins") {
id = "my-plugin"
implementationClass = "[Link]"
}
}
}
buildSrc/build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
myPlugins {
id = 'my-plugin'
implementationClass = 'MyPlugin'
}
}
}
build.gradle.kts
plugins {
id("my-plugin")
}
build.gradle
plugins {
id 'my-plugin'
}
To define libraries or plugins used in the build script itself, you can use the buildscript block. The
buildscript block is also used for specifying where to find those dependencies.
This approach is less common with newer versions of Gradle, as the plugins {} block simplifies
plugin usage. However, buildscript {} may be necessary when dealing with custom or non-standard
plugin repositories, as well as with library dependencies:
build.gradle.kts
import org.yaml.snakeyaml.Yaml
buildscript {
    repositories { // Where to find the plugin or library
        maven {
            url = uri("https://plugins.gradle.org/m2/")
        }
        mavenCentral()
    }
    dependencies {
        classpath("org.yaml:snakeyaml:1.19") // The library's classpath dependency
        classpath("com.gradleup.shadow:shadow-gradle-plugin:8.3.4") // Plugin dependency for legacy plugin application
    }
}
build.gradle
import org.yaml.snakeyaml.Yaml
buildscript {
    repositories { // Where to find the plugin or library
        maven {
            url = uri("https://plugins.gradle.org/m2/")
        }
        mavenCentral()
    }
    dependencies {
        classpath 'org.yaml:snakeyaml:1.19' // The library's classpath dependency
        classpath 'com.gradleup.shadow:shadow-gradle-plugin:8.3.4' // Plugin dependency for legacy plugin application
    }
}
A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is
applied using the legacy application method:
build.gradle.kts
apply<MyPlugin>()
build.gradle
apply plugin: MyPlugin
Plugin Management
The pluginManagement{} block is used to configure repositories for plugin resolution and to define
version constraints for plugins that are applied in the build scripts.
The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first
block in the file:
settings.gradle.kts
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
[Link] = "plugin-management"
settings.gradle
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = 'plugin-management'
init.gradle
settingsEvaluated {
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}
Custom Plugin Repositories
By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.
Many build authors would also like to resolve plugins from private Maven or Ivy repositories
because they contain proprietary implementation details or to have more control over what
plugins are available to their builds.
To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:
settings.gradle.kts
pluginManagement {
repositories {
maven(url = file("./maven-repo"))
gradlePluginPortal()
ivy(url = file("./ivy-repo"))
}
}
settings.gradle
pluginManagement {
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}
This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and
then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository. If you
don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line. Finally, the
Ivy repository at ./ivy-repo will be checked.
A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in
a single location. Plugins can then be applied by id to any build script via the plugins{} block.
One benefit of setting plugin versions this way is that the pluginManagement.plugins{} block does not
have the same constrained syntax as the build script plugins{} block. This allows plugin versions to
be taken from gradle.properties, or loaded via another mechanism.
settings.gradle.kts
pluginManagement {
val helloPluginVersion: String by settings
plugins {
id("[Link]") version "${helloPluginVersion}"
}
}
build.gradle.kts
plugins {
id("[Link]")
}
gradle.properties
helloPluginVersion=1.0.0
settings.gradle
pluginManagement {
plugins {
id 'com.example.hello' version "${helloPluginVersion}"
}
}
build.gradle
plugins {
id 'com.example.hello'
}
gradle.properties
helloPluginVersion=1.0.0
The plugin version is loaded from gradle.properties and configured in the settings script, allowing
the plugin to be added to any project without specifying the version.
Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing
the requested version or explicitly specifying the implementation artifact coordinates.
To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:
settings.gradle.kts
pluginManagement {
resolutionStrategy {
eachPlugin {
if ([Link] == "[Link]") {
useModule("[Link]:sample-plugins:1.0.0")
}
}
}
repositories {
maven {
url = uri("./maven-repo")
}
gradlePluginPortal()
ivy {
url = uri("./ivy-repo")
}
}
}
settings.gradle
pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'com.example') {
useModule('com.example:sample-plugins:1.0.0')
}
}
}
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}
This tells Gradle to use the specified plugin implementation artifact instead of its built-in default
mapping from plugin ID to Maven/Ivy coordinates.
Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts
that implement the plugin. Read Gradle Plugin Development Plugin for more information on
publishing plugins to custom repositories.
See PluginManagementSpec for complete documentation for using the pluginManagement{} block.
Plugin Marker Artifacts
Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id
and version properties, Gradle needs a way to look up the coordinates of the plugin implementation
artifact.
To do so, Gradle will look for a Plugin Marker Artifact with the coordinates
plugin.id:plugin.id.gradle.plugin:plugin.version. This marker needs to have a dependency on the
actual plugin implementation. Publishing these markers is automated by the java-gradle-plugin.
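For example, for a (hypothetical) plugin with id com.example.hello at version 1.0.0, the marker coordinates work out as follows:

```kotlin
// Marker artifact lookup for plugin id "com.example.hello", version "1.0.0":
//   group    = "com.example.hello"                (the plugin id)
//   artifact = "com.example.hello.gradle.plugin"  (plugin id + ".gradle.plugin")
//   version  = "1.0.0"                            (the plugin version)
// i.e., Gradle resolves the marker
//   com.example.hello:com.example.hello.gradle.plugin:1.0.0
// whose POM/descriptor in turn depends on the implementation artifact,
// e.g. com.example:sample-plugins:1.0.0.
```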
For example, the following complete sample from the sample-plugins project shows how to publish
a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository
using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish
plugin.
build.gradle.kts
plugins {
`java-gradle-plugin`
`maven-publish`
`ivy-publish`
}
group = "[Link]"
version = "1.0.0"
gradlePlugin {
plugins {
create("hello") {
id = "[Link]"
implementationClass = "[Link]"
}
create("goodbye") {
id = "[Link]"
implementationClass = "[Link]"
}
}
}
publishing {
repositories {
maven {
url = uri([Link]("maven-repo"))
}
ivy {
url = uri([Link]("ivy-repo"))
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
id 'maven-publish'
id 'ivy-publish'
}
group = 'com.example'
version = '1.0.0'
gradlePlugin {
plugins {
hello {
id = 'com.example.hello'
implementationClass = 'com.example.hello.HelloPlugin'
}
goodbye {
id = 'com.example.goodbye'
implementationClass = 'com.example.goodbye.GoodbyePlugin'
}
}
}
publishing {
repositories {
maven {
url = layout.buildDirectory.dir('maven-repo')
}
ivy {
url = layout.buildDirectory.dir('ivy-repo')
}
}
}
Running gradle publish in the sample directory creates the following Maven repository layout (the
Ivy layout is similar):
Legacy Plugin Application
With the introduction of the plugins DSL, users should have little reason to use the legacy method
of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to
restrictions in how it currently works.
build.gradle.kts
apply(plugin = "java")
build.gradle
apply plugin: 'java'
Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to
apply the JavaPlugin.
Rather than using a plugin id, plugins can also be applied by simply specifying the class of the
plugin:
build.gradle.kts
apply<JavaPlugin>()
build.gradle
apply plugin: JavaPlugin
The JavaPlugin symbol in the above sample refers to the org.gradle.api.plugins.JavaPlugin class.
This class does not strictly need to be imported as the org.gradle.api.plugins package is
automatically imported in all build scripts (see Default imports).
Furthermore, one needs to append the ::class suffix to identify a class literal in Kotlin instead of
.class in Java.
You may also see the apply method used to include an entire build file:
build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'
When a project uses a version catalog, plugins can be referenced via aliases when applied.
gradle/libs.versions.toml
[versions]
groovy = "3.0.5"
checkstyle = "8.37"
[libraries]
groovy-core = { module = "org.codehaus.groovy:groovy", version.ref = "groovy" }
groovy-json = { module = "org.codehaus.groovy:groovy-json", version.ref = "groovy" }
groovy-nio = { module = "org.codehaus.groovy:groovy-nio", version.ref = "groovy" }
commons-lang3 = { group = "org.apache.commons", name = "commons-lang3", version = { strictly = "[3.8, 4.0[", prefer = "3.9" } }
[bundles]
groovy = ["groovy-core", "groovy-json", "groovy-nio"]
[plugins]
versions = { id = "com.github.ben-manes.versions", version = "0.45.0" }
Then a plugin can be applied to any build script using the alias method:
build.gradle.kts
plugins {
`java-library`
alias(libs.plugins.versions)
}
build.gradle
plugins {
id 'java-library'
alias(libs.plugins.versions)
}
Writing Plugins
If Gradle or the Gradle community does not offer the specific capabilities your project needs,
creating your own custom plugin could be a solution.
Additionally, if you find yourself duplicating build logic across subprojects and need a better way to
organize it, convention plugins can help.
Script plugin
A plugin is any class that implements the Plugin interface. For example, this is a "hello world"
plugin:
build.gradle.kts
class SamplePlugin : Plugin<Project> { ①
    override fun apply(project: Project) { ②
        project.tasks.register("ScriptPlugin") {
            doLast {
                println("Hello world!")
            }
        }
    }
}
apply<SamplePlugin>() ③
When SamplePlugin is applied to your project, Gradle calls the apply() method it defines. This
adds the ScriptPlugin task to your project:
[Link]
apply<SamplePlugin>()
Note that this is a simple hello-world example and does not reflect best practices.
The best practice for developing plugins is to create convention plugins or binary plugins.
Pre-compiled script plugins offer an easy way to rapidly prototype and experiment. They let you
package build logic as *.gradle(.kts) script files using the Groovy or Kotlin DSL. These scripts
reside in specific directories, such as src/main/groovy or src/main/kotlin.
To apply one, simply use its ID derived from the script filename (without .gradle). You can think of
the file itself as the plugin, so you do not need to subclass the Plugin interface in a precompiled
script.
.
└── buildSrc
├── build.gradle.kts
└── src
└── main
└── kotlin
└── my-create-file-plugin.gradle.kts
buildSrc/src/main/kotlin/my-create-file-plugin.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>
    @Input
    val fileName = "myfile.txt"
    @OutputFile
    val myFile: File = File(fileName)
    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}
tasks.register<CreateFileTask>("createMyFileTaskInConventionPlugin") {
    group = "from my convention plugin"
    description = "Create myfile.txt in the current directory"
    fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}
buildSrc/src/main/groovy/my-create-file-plugin.gradle
abstract class CreateFileTask extends DefaultTask {
    @Input
    abstract Property<String> getFileText()
    @Input
    String fileName = "myfile.txt"
    @OutputFile
    File getMyFile() {
        return new File(fileName)
    }
    @TaskAction
    void action() {
        myFile.createNewFile()
        myFile.text = fileText.get()
    }
}
tasks.register("createMyFileTaskInConventionPlugin", CreateFileTask) {
    group = "from my convention plugin"
    description = "Create myfile.txt in the current directory"
    fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}
The pre-compiled script can now be applied in the [Link](.kts) file of any subproject:
build.gradle.kts
plugins {
id("my-create-file-plugin") // Apply the pre-compiled convention plugin
`kotlin-dsl`
}
build.gradle
plugins {
id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
id 'groovy' // Apply the Groovy DSL plugin
}
The createMyFileTaskInConventionPlugin task from the plugin is now available in your subproject.
Binary Plugins
A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR
file. It is resolved as a dependency rather than compiled from source.
For most use cases, convention plugins must be updated infrequently. Having each developer
execute the plugin build as part of their development process is wasteful, and we can instead
distribute them as binary dependencies.
There are two ways to update the convention plugin in the example above into a binary plugin:
1. Use a composite build and include the plugin build:
settings.gradle.kts
includeBuild("my-plugin")
2. Publish the plugin and apply it by id and version:
build.gradle.kts
plugins {
id("com.example.my-plugin") version "1.0.0"
}
Let’s go with the second solution. This plugin has been re-written in Kotlin and is called
MyCreateFileBinaryPlugin. It is still stored in buildSrc:
buildSrc/src/main/kotlin/MyCreateFileBinaryPlugin.kt
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File
abstract class CreateFileTask : DefaultTask() {
    @Input
    val fileName = project.projectDir.toString() + "/myfile.txt"
    @OutputFile
    val myFile: File = File(fileName)
    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText("HELLO FROM MY BINARY PLUGIN")
    }
}
class MyCreateFileBinaryPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("createFileTaskFromBinaryPlugin", CreateFileTask::class.java)
    }
}
The plugin can be published and given an id using a gradlePlugin{} block so that it can be
referenced in the root:
buildSrc/build.gradle.kts
group = "com.example"
version = "1.0.0"
gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "[Link]-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}
publishing {
repositories {
mavenLocal()
}
}
buildSrc/build.gradle
group = 'com.example'
version = '1.0.0'
gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "[Link]-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}
publishing {
repositories {
mavenLocal()
}
}
build.gradle.kts
plugins {
id("my-create-file-plugin") // Apply the pre-compiled convention plugin
id("[Link]-binary-plugin") // Apply the binary plugin
`kotlin-dsl`
}
build.gradle
plugins {
id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
id 'com.example.my-binary-plugin' // Apply the binary plugin
id 'groovy' // Apply the Groovy DSL plugin
}
1. Property - Represents a value that can be queried and also changed.
2. Provider - Represents a value that can only be queried and cannot be changed.
[Link]
@TaskAction
fun printConfiguration() {
println("Configuration value: ${[Link]()}")
}
}
[Link]("myIntroTask", MyIntroTask::class) {
[Link](configurationProvider)
}
[Link]
@TaskAction
void printConfiguration() {
println "Configuration value: ${[Link]()}"
}
}
[Link]("myIntroTask", MyIntroTask) {
[Link](configurationProvider)
}
Understanding Properties
Properties in Gradle are variables that hold values. They can be defined and accessed within the
build script to store information like file paths, version numbers, or custom values.
[Link]
// Setting a property
val simpleMessageProperty: Property<String> =
[Link](String::class)
[Link]("Hello, World from a Property!")
// Accessing a property
println([Link]())
[Link]
// Setting a property
def simpleMessageProperty = [Link](String)
[Link]("Hello, World from a Property!")
// Accessing a property
println([Link]())
Properties:
• The method [Link](T) specifies a value for the property, overwriting whatever value may
have been present.
• The method [Link](Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together Provider
and Property instances before the values are configured.
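As a sketch in a Kotlin build script, wiring a Provider into a Property defers the computation until the property is actually queried:

```kotlin
// Sketch for a build.gradle.kts file: wiring a Provider into a Property.
// `providers` and `objects` are services available in every build script.
val messageProvider: Provider<String> = providers.provider {
    println("computing...") // runs only when the value is queried
    "Hello from a wired Provider!"
}

val messageProperty: Property<String> = objects.property(String::class.java)
messageProperty.set(messageProvider) // nothing is computed yet

println(messageProperty.get()) // triggers the computation
```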
Understanding Providers
Providers are objects that represent a value that may not be immediately available. Providers are
useful for lazy evaluation and can be used to model values that may change over time or depend on
other tasks or inputs:
[Link]
// Setting a provider
val simpleMessageProvider: Provider<String> = [Link] {
"Hello, World from a Provider!" }
// Accessing a provider
println([Link]())
[Link]
// Setting a provider
def simpleMessageProvider = [Link] { "Hello, World from a
Provider!" }
// Accessing a provider
println([Link]())
Providers:
• Many other types extend Provider and can be used wherever a Provider is required.
Gradle’s managed properties allow you to declare properties as abstract getters (Java, Groovy) or
abstract properties (Kotlin).
Gradle then automatically provides the implementation for these properties, managing their state.
A property may be mutable, meaning that it has both a get() method and set() method:
[Link]
@TaskAction
fun printMessage() {
println([Link]())
}
}
[Link]<MyPropertyTask>("myPropertyTask") {
[Link]("Hello, Gradle!")
}
[Link]
@TaskAction
void printMessage() {
println([Link]())
}
}
[Link]('myPropertyTask', MyPropertyTask) {
[Link]("Hello, Gradle!")
}
Or read-only, meaning that it has only a get() method. The read-only properties are providers:
[Link]
[Link]<MyProviderTask>("MyProviderTask") {
[Link]
@TaskAction
void printMessage() {
println([Link]())
}
}
[Link]('MyProviderTask', MyProviderTask)
A mutable managed property is declared using an abstract getter method of type Property<T>,
where T can be any serializable type or a fully managed Gradle type. The property must not have
any setter methods.
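As a sketch, the same rules expressed in the Kotlin DSL would look like this (the task and property names are illustrative):

```kotlin
import java.net.URI

// Sketch: `uri` is a mutable managed property. The abstract getter has no
// backing field or setter method, so Gradle generates the implementation.
abstract class Download : DefaultTask() {

    @get:Input
    abstract val uri: Property<URI>

    @TaskAction
    fun run() {
        logger.quiet("Downloading ${uri.get()}")
    }
}
```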
[Link]
@TaskAction
void run() {
[Link]("Downloading " + getUri().get()); // Use the `uri` property
}
}
Note that for a property to be considered a mutable managed property, the property’s getter
methods must be abstract and have public or protected visibility.
You can declare a read-only managed property, also known as a provider, using a getter method of
type Provider<T>. The method implementation needs to derive the value. It can, for example, derive
the value from other properties.
Here is an example of a task type with a uri provider that is derived from a location property:
[Link]
@Internal
public Provider<URI> getUri() {
return getLocation().map(l -> [Link]("[Link] + l));
}
@TaskAction
void run() {
[Link]("Downloading " + getUri().get()); // Use the `uri`
provider (read-only property)
}
}
Read-only Managed Nested Properties (Nested Providers)
You can declare a read-only managed nested property by adding an abstract getter method for the
property to a type annotated with @Nested. The property should not have any setter methods. Gradle
provides the implementation for the getter method and creates a value for the property.
This pattern is useful when a custom type has a nested complex type which has the same lifecycle.
If the lifecycle is different, consider using Property<NestedType> instead.
Here is an example of a task type with a resource property. The Resource type is also a custom
Gradle type and defines some managed properties:
[Link]
@TaskAction
void run() {
// Use the `resource` property
[Link]("Downloading [Link] + getResource().getHostName().get()
+ "/" + getResource().getPath().get());
}
}
If the type contains an abstract property called "name" of type String, Gradle provides an
implementation for the getter method, and extends each constructor with a "name" parameter,
which comes before all other constructor parameters.
If the type is an interface, Gradle will provide a constructor with a single "name" parameter and
@Inject semantics.
You can have your type implement or extend the Named interface, which defines such a read-only
"name" property:
import [Link]
// Usage
val instance = MyTypeImpl("myName")
println([Link]) // Prints: myName
A managed type is an abstract class or interface with no fields and whose properties are all
managed. These types have their state entirely managed by Gradle.
[Link]
A named managed type is a managed type that additionally has an abstract property "name" of type
String. Named managed types are especially useful as the element type of
NamedDomainObjectContainer:
[Link]
interface MyNamedType {
val name: String
}
interface MyNamedType {
String getName()
}
MyNamedTypeImpl(String name) {
[Link] = name
}
}
class MyPluginExtension {
NamedDomainObjectContainer<MyNamedType> myNamedContainer
MyPluginExtension(Project project) {
myNamedContainer = [Link](MyNamedType) { name ->
new MyNamedTypeImpl(name)
}
}
}
Sometimes you may see properties implemented in the Java bean property style. That is, they do
not use the Property<T> or Provider<T> types but are instead implemented with concrete setter and
getter methods (or corresponding conveniences in Groovy or Kotlin).
@TaskAction
public void myAction() {
[Link]("SomeProperty: " + someProperty);
}
}
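A minimal sketch of such a bean-style property in the Kotlin DSL, with an illustrative task type; unlike a Property<T>, the value is evaluated eagerly when it is assigned:

```kotlin
// Sketch: a plain mutable field instead of a lazy Property<T>.
abstract class MyBeanStyleTask : DefaultTask() {

    @get:Input
    var someProperty: String = "" // evaluated eagerly, no lazy wiring

    @TaskAction
    fun myAction() {
        logger.quiet("SomeProperty: $someProperty")
    }
}

tasks.register("myBeanStyleTask", MyBeanStyleTask::class) {
    someProperty = "assigned eagerly"
}
```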
Understanding Collections
Gradle provides types for maintaining collections of objects, intended to work well with Gradle’s
DSLs and to provide useful features such as lazy configuration.
Available collections
These collection types are used for managing collections of objects, particularly in the context of
build scripts and plugins:
1. DomainObjectSet<T>: Represents a set of objects of type T. This set does not allow duplicate
elements, and you can add, remove, and query objects in the set.
These types are commonly used in Gradle plugins and build scripts to manage collections of objects,
such as tasks, configurations, or custom domain objects.
1. DomainObjectSet
[Link]
[Link]
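A DomainObjectSet can be obtained from the ObjectFactory service; a minimal sketch for a Kotlin build script (the element type is illustrative):

```kotlin
// Sketch: creating a DomainObjectSet via the `objects` service.
val set: DomainObjectSet<String> = objects.domainObjectSet(String::class.java)

set.add("first")
set.add("first") // duplicates are ignored; the set still has one element

// React to elements added later:
set.whenObjectAdded { element -> println("added: $element") }
set.add("second")
```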
2. NamedDomainObjectSet
A NamedDomainObjectSet holds a set of configurable objects, where each element has a name
associated with it.
[Link]
[Link]
3. NamedDomainObjectList
A NamedDomainObjectList holds a list of configurable objects, where each element has a name
associated with it.
[Link]
4. NamedDomainObjectContainer
A NamedDomainObjectContainer manages a set of objects, where each element has a name associated
with it.
The container takes care of creating and configuring the elements, and provides a DSL that build
scripts can use to define and configure elements. It is intended to hold objects which are themselves
configurable, for example a set of custom Gradle objects.
Gradle uses NamedDomainObjectContainer type extensively throughout the API. For example, the
[Link] object used to manage the tasks of a project is a NamedDomainObjectContainer<Task>.
You can create a container instance using the ObjectFactory service, which provides the
[Link]() method. This is also available using the [Link]()
method, however in a custom Gradle type it’s generally better to use the injected ObjectFactory
service instead of passing around a Project instance.
You can also create a container instance using a read-only managed property.
[Link]
[Link]
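As a sketch, a container can be created inside a custom type with an injected ObjectFactory. The DownloadExtension and Resource names are illustrative; Resource is assumed to expose a read-only name property:

```kotlin
import javax.inject.Inject

// Sketch: an extension holding a NamedDomainObjectContainer of Resource elements.
abstract class DownloadExtension @Inject constructor(objects: ObjectFactory) {
    // The container creates and configures Resource instances by name.
    val resources: NamedDomainObjectContainer<Resource> =
        objects.domainObjectContainer(Resource::class.java)
}
```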
In order to use a type with any of the domainObjectContainer() methods, it must expose a
property named “name” as the unique, and constant, name for the object. The
domainObjectContainer(Class) variant of the method creates new instances by calling the
constructor of the class that takes a string argument, which is the desired name of the object.
Objects created this way are treated as custom Gradle types, and so can make use of the features
discussed in this chapter, for example service injection or managed properties.
See the above link for domainObjectContainer() method variants that allow custom instantiation
strategies:
Property<URI> getUri();
Property<String> getUserName();
}
For each container property, Gradle automatically adds a block to the Groovy and Kotlin DSL that
you can use to configure the contents of the container:
[Link]
plugins {
id("[Link]")
}
download {
// Can use a block to configure the container contents
resources {
register("gradle") {
uri = uri("[Link]
}
}
}
[Link]
plugins {
id("[Link]")
}
download {
// Can use a block to configure the container contents
resources {
register('gradle') {
uri = uri('[Link]
}
}
}
5. ExtensiblePolymorphicDomainObjectContainer
[Link]
[Link]
MyPluginExtensionExtensiblePolymorphicDomainObjectContainer(ObjectFactory
objectFactory) {
// Create the container
animals = [Link](Animal)
}
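A sketch of creating such a polymorphic container and registering a concrete type with it (Animal and Dog are illustrative types):

```kotlin
import javax.inject.Inject

// Sketch: a polymorphic container that can hold different Animal subtypes.
interface Animal : Named

abstract class Dog @Inject constructor(private val dogName: String) : Animal {
    override fun getName() = dogName
}

val animals = objects.polymorphicDomainObjectContainer(Animal::class.java)

// Tell the container how to instantiate Dog elements by name:
animals.registerFactory(Dog::class.java) { name ->
    objects.newInstance(Dog::class.java, name)
}
animals.register("rex", Dog::class.java)
```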
6. FileSystemOperations - Allows a task to run operations on the filesystem such as deleting files,
copying files or syncing directories.
7. ArchiveOperations - Allows a task to run operations on archive files such as ZIP or TAR files.
8. ExecOperations - Allows a task to run external processes with dedicated support for running
external java programs.
Out of the above, ProjectLayout and WorkerExecutor services are only available for injection in
project plugins. BuildLayout is only available in settings plugins and settings files. ProjectLayout is
unavailable in Worker API actions.
1. ObjectFactory
ObjectFactory is a service for creating custom Gradle types, allowing you to define nested objects
and DSLs in your build logic. It provides methods for creating instances of different types, such as
properties (Property<T>), collections (ListProperty<T>, SetProperty<T>, MapProperty<K, V>), file-
related objects (RegularFileProperty, DirectoryProperty, ConfigurableFileCollection,
ConfigurableFileTree), and more.
You can obtain an instance of ObjectFactory using the [Link] property. Here’s a simple
example demonstrating how to use ObjectFactory to create a property and set its value:
[Link]
[Link]("myObjectFactoryTask") {
doLast {
val objectFactory = [Link]
val myProperty = [Link](String::class)
[Link]("Hello, Gradle!")
println([Link]())
}
}
[Link]
[Link]("myObjectFactoryTask") {
doLast {
def objectFactory = [Link]
def myProperty = [Link](String)
[Link]("Hello, Gradle!")
println [Link]()
}
}
TIP It is preferable to let Gradle create objects automatically by using managed properties.
Using ObjectFactory to create these objects ensures that they are properly managed by Gradle,
especially in terms of configuration avoidance and lazy evaluation. This means that the values of
these objects are only calculated when needed, which can improve build performance.
[Link]
@Inject
public DownloadExtension(ObjectFactory objectFactory) {
// Use an injected ObjectFactory to create a Resource object
resource = [Link]([Link]);
}
@TaskAction
fun doTaskAction() {
val outputDirectory = [Link]()
[Link]([Link])
println([Link]())
}
}
[Link]("myInjectedObjectFactoryTask", MyObjectFactoryTask::class) {}
[Link]
@Inject //@[Link]
MyObjectFactoryTask(ObjectFactory objectFactory) {
[Link] = objectFactory
}
@TaskAction
void doTaskAction() {
var outputDirectory = [Link]()
[Link]([Link])
println([Link]())
}
}
[Link]("myInjectedObjectFactoryTask",MyObjectFactoryTask) {}
The MyObjectFactoryTask task uses an ObjectFactory instance, which is injected into the task’s
constructor using the @Inject annotation.
2. ProjectLayout
ProjectLayout is a service that provides access to the layout of a Gradle project’s directories and
files. It’s part of the [Link] package and allows you to query the project’s layout to get
information about source sets, build directories, and other file-related aspects of the project.
You can obtain a ProjectLayout instance from a Project object using the [Link] property.
Here’s a simple example:
[Link]
[Link]("showLayout") {
doLast {
val layout = [Link]
println("Project Directory: ${[Link]}")
println("Build Directory: ${[Link]()}")
}
}
[Link]
[Link]('showLayout') {
doLast {
def layout = [Link]
println "Project Directory: ${[Link]}"
println "Build Directory: ${[Link]()}"
}
}
[Link]
@TaskAction
fun doTaskAction() {
val outputDirectory = [Link]
println(outputDirectory)
}
}
[Link]("myInjectedProjectLayoutTask", MyProjectLayoutTask::class) {}
[Link]
@Inject //@[Link]
MyProjectLayoutTask(ProjectLayout projectLayout) {
[Link] = projectLayout
}
@TaskAction
void doTaskAction() {
var outputDirectory = [Link]
println(outputDirectory)
}
}
[Link]("myInjectedProjectLayoutTask",MyProjectLayoutTask) {}
The MyProjectLayoutTask task uses a ProjectLayout instance, which is injected into the task’s
constructor using the @Inject annotation.
3. BuildLayout
BuildLayout is a service that provides access to the root and settings directories from a Settings
plugin or a Settings script; it is analogous to ProjectLayout. It’s part of the [Link] package
and exposes standard build-wide file system locations as lazily computed values.
NOTE: These APIs are currently incubating but eventually should replace existing
accessors in Settings, which return eagerly computed locations:
[Link] → [Link]
[Link] → [Link]
You can obtain a BuildLayout instance from a Settings object using the [Link] property.
Here’s a simple example:
[Link]
[Link]
apply<MyBuildLayoutPlugin>()
[Link]
@Inject //@[Link]
MyBuildLayoutPlugin(BuildLayout buildLayout) {
[Link] = buildLayout
}
This code defines a MyBuildLayoutPlugin plugin that implements the Plugin interface for the Settings
type. The plugin expects a BuildLayout instance to be injected into its constructor using the @Inject
annotation.
4. ProviderFactory
ProviderFactory is a service that provides methods for creating different types of providers.
Providers are used to model values that may be computed lazily in your build scripts.
The ProviderFactory interface provides methods for creating various types of providers, including:
• provider(Callable<T> value) to create a provider with a value that is lazily computed based on a
Callable.
• gradleProperty(Class<T> type) to create a property provider that reads its value from a Gradle
project property.
[Link]
[Link]("printMessage") {
doLast {
val providerFactory = [Link]
val messageProvider = [Link] { "Hello, Gradle!" }
println([Link]())
}
}
[Link]
[Link]('printMessage') {
doLast {
def providerFactory = [Link]
def messageProvider = [Link] { "Hello, Gradle!" }
println [Link]()
}
}
The task named printMessage uses the ProviderFactory to create a provider that supplies the
message string.
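ProviderFactory can also read Gradle properties lazily; a sketch using a hypothetical property name myMessage:

```kotlin
// Sketch: reads the Gradle property `myMessage` lazily,
// e.g. supplied on the command line as -PmyMessage=Hi.
val messageFromProperty: Provider<String> = providers.gradleProperty("myMessage")

tasks.register("printGradleProperty") {
    doLast {
        // orElse() supplies a fallback when the property is not set
        println(messageFromProperty.orElse("no message set").get())
    }
}
```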
@TaskAction
fun doTaskAction() {
val outputDirectory = [Link] { "build/[Link]"
}
println([Link]())
}
}
[Link]("myInjectedProviderFactoryTask", MyProviderFactoryTask::class)
{}
[Link]
@Inject //@[Link]
MyProviderFactoryTask(ProviderFactory providerFactory) {
[Link] = providerFactory
}
@TaskAction
void doTaskAction() {
var outputDirectory = [Link] { "build/[Link]"
}
println([Link]())
}
}
[Link]("myInjectedProviderFactoryTask",MyProviderFactoryTask) {}
The ProviderFactory service is injected into the MyProviderFactoryTask task’s constructor using the
@Inject annotation.
5. WorkerExecutor
WorkerExecutor is a service that allows you to perform parallel execution of tasks using worker
processes. This is particularly useful for tasks that perform CPU-intensive or long-running
operations, as it allows them to be executed in parallel, improving build performance.
Using WorkerExecutor, you can submit units of work (called actions) to be executed in separate
worker processes. This helps isolate the work from the main Gradle process, providing better
reliability and performance.
Here’s a basic example of how you might use WorkerExecutor in a build script:
[Link]
[Link]("myWorkTask", MyWorkerTask::class) {}
[Link]
@Inject
public MyWorkAction() {
[Link] = "Hello from a Worker!";
}
@Override
public void execute() {
[Link](greeting);
}
}
abstract class MyWorkerTask extends DefaultTask {
@Input
abstract Property<Boolean> getBooleanFlag()
@Inject
abstract WorkerExecutor getWorkerExecutor()
@TaskAction
void doThings() {
[Link]().submit(MyWorkAction) {}
}
}
[Link]("myWorkTask", MyWorkerTask) {}
6. FileSystemOperations
FileSystemOperations is a service that provides methods for performing file system operations such
as copying, deleting, and syncing. It is part of the [Link] package and is typically used
in custom tasks or plugins to interact with the file system.
[Link]
@TaskAction
fun doTaskAction() {
[Link] {
from("src")
into("dest")
}
}
}
[Link]("myInjectedFileSystemOperationsTask",
MyFileSystemOperationsTask::class)
[Link]
@Inject //@[Link]
MyFileSystemOperationsTask(FileSystemOperations fileSystemOperations) {
[Link] = fileSystemOperations
}
@TaskAction
void doTaskAction() {
[Link] {
from 'src'
into 'dest'
}
}
}
[Link]("myInjectedFileSystemOperationsTask",
MyFileSystemOperationsTask)
With some ceremony, it is possible to use FileSystemOperations in an ad-hoc task defined in a build
script:
[Link]
interface InjectedFsOps {
@get:Inject val fs: FileSystemOperations
}
[Link]("myAdHocFileSystemOperationsTask") {
val injected = [Link]<InjectedFsOps>()
doLast {
[Link] {
from("src")
into("dest")
}
}
}
[Link]
interface InjectedFsOps {
@Inject //@[Link]
FileSystemOperations getFs()
}
[Link]('myAdHocFileSystemOperationsTask') {
def injected = [Link](InjectedFsOps)
doLast {
[Link] {
from 'source'
into 'destination'
}
}
}
First, you need to declare an interface with a property of type FileSystemOperations, here named
InjectedFsOps, to serve as an injection point. Then call the method [Link] to
generate an implementation of the interface that holds an injected service.
TIP This is a good time to consider extracting the ad-hoc task into a proper class.
7. ArchiveOperations
ArchiveOperations is a service that provides methods for accessing the contents of archives, such as
ZIP and TAR files. It is part of the [Link] package and is typically used in custom tasks
or plugins to unpack archive files.
[Link]
from([Link]([Link]("[Link]")))
into([Link]("unpacked-sources"))
}
}
}
[Link]("myInjectedArchiveOperationsTask",
MyArchiveOperationsTask::class)
[Link]
@Inject
MyArchiveOperationsTask(ArchiveOperations archiveOperations,
ProjectLayout layout, FileSystemOperations fs) {
[Link] = archiveOperations
[Link] = layout
[Link] = fs
}
@TaskAction
void doTaskAction() {
[Link] {
from([Link]([Link](
"[Link]")))
into([Link]("unpacked-sources"))
}
}
}
[Link]("myInjectedArchiveOperationsTask", MyArchiveOperationsTask)
The ArchiveOperations service is injected into the MyArchiveOperationsTask task’s constructor using
the @Inject annotation.
With some ceremony, it is possible to use ArchiveOperations in an ad-hoc task defined in a build
script:
[Link]
interface InjectedArcOps {
@get:Inject val arcOps: ArchiveOperations
}
[Link]("myAdHocArchiveOperationsTask") {
val injected = [Link]<InjectedArcOps>()
val archiveFile = "${[Link]}/[Link]"
doLast {
[Link](archiveFile)
}
}
[Link]
interface InjectedArcOps {
@Inject //@[Link]
ArchiveOperations getArcOps()
}
[Link]('myAdHocArchiveOperationsTask') {
def injected = [Link](InjectedArcOps)
def archiveFile = "${projectDir}/[Link]"
doLast {
[Link](archiveFile)
}
}
First, you need to declare an interface with a property of type ArchiveOperations, here named
InjectedArcOps, to serve as an injection point. Then call the method [Link] to
generate an implementation of the interface that holds an injected service.
TIP This is a good time to consider extracting the ad-hoc task into a proper class.
8. ExecOperations
ExecOperations is a service that provides methods for executing external processes (commands)
from within a build script. It is part of the [Link] package and is typically used in
custom tasks or plugins to run command-line tools or scripts as part of the build process.
[Link]
[Link]("myInjectedExecOperationsTask", MyExecOperationsTask::class)
[Link]
@Inject //@[Link]
MyExecOperationsTask(ExecOperations execOperations) {
[Link] = execOperations
}
@TaskAction
void doTaskAction() {
[Link] {
commandLine 'ls', '-la'
}
}
}
[Link]("myInjectedExecOperationsTask", MyExecOperationsTask)
The ExecOperations is injected into the MyExecOperationsTask task’s constructor using the @Inject
annotation.
With some ceremony, it is possible to use ExecOperations in an ad-hoc task defined in a build script:
[Link]
interface InjectedExecOps {
@get:Inject val execOps: ExecOperations
}
[Link]("myAdHocExecOperationsTask") {
val injected = [Link]<InjectedExecOps>()
doLast {
[Link] {
commandLine("ls", "-la")
}
}
}
[Link]
interface InjectedExecOps {
@Inject //@[Link]
ExecOperations getExecOps()
}
[Link]('myAdHocExecOperationsTask') {
def injected = [Link](InjectedExecOps)
doLast {
[Link] {
commandLine 'ls', '-la'
}
}
}
First, you need to declare an interface with a property of type ExecOperations, here named
InjectedExecOps, to serve as an injection point. Then call the method [Link] to
generate an implementation of the interface that holds an injected service.
TIP This is a good time to consider extracting the ad-hoc task into a proper class.
9. ToolingModelBuilderRegistry
ToolingModelBuilderRegistry is a service that allows you to register custom tooling model builders.
Tooling models are used to provide rich IDE integration for Gradle projects, allowing IDEs to
understand and work with the project’s structure, dependencies, and other aspects.
[Link]
[Link]
@Override
boolean canBuild(String modelName) {
return false
}
@Override
Object buildAll(String modelName, Project project) {
return null
}
}
OrtModelPlugin(ToolingModelBuilderRegistry registry) {
[Link] = registry
}
Constructor injection
There are two ways that an object can receive the services that it needs. The first option is to add the
service as a parameter of the class constructor. The constructor must be annotated with the
[Link] annotation. Gradle uses the declared type of each constructor parameter to
determine the services that the object requires. The order of the constructor parameters and their
names are not significant and can be whatever you like.
Here is an example that shows a task type that receives an ObjectFactory via its constructor:
[Link]
@OutputDirectory
public DirectoryProperty getOutputDirectory() {
return outputDirectory;
}
@TaskAction
void run() {
// ...
}
}
Property injection
Alternatively, a service can be injected by adding a property getter method annotated with the
[Link] annotation to the class. This can be useful, for example, when you cannot
change the constructor of the class due to backwards compatibility constraints. This pattern also
allows Gradle to defer creation of the service until the getter method is called, rather than when the
instance is created. This can help with performance. Gradle uses the declared return type of the
getter method to determine the service to make available. The name of the property is not
significant and can be whatever you like.
The property getter method must be public or protected. The method can be abstract or, in cases
where this isn’t possible, can have a dummy method body. The method body is discarded.
Here is an example that shows a task type that receives two services via property getter methods:
[Link]
@TaskAction
void run() {
WorkerExecutor workerExecutor = getWorkerExecutor();
ObjectFactory objectFactory = getObjectFactory();
// Use the executor and factory ...
}
}
STRUCTURING BUILDS
Structuring Projects with Gradle
It is important to structure your Gradle project to optimize build performance. A multi-project build
is the standard in Gradle.
A multi-project build consists of one root project and one or more subprojects. Gradle can build the
root project and any number of the subprojects in a single execution.
Project locations
Multi-project builds contain a single root project in a directory that Gradle views as the root path,
denoted by a single dot (.).
A subproject has a path, which denotes the position of that subproject in the multi-project build. In
most cases, the project path is consistent with its location in the file system.
The project structure is created in the [Link](.kts) file. The settings file must be present
in the root directory.
Let’s look at a basic multi-project build example that contains a root project and a single subproject.
The root project is called basic-multiproject, located somewhere on your machine. From Gradle’s
perspective, the root is the top-level directory, denoted by a single dot (.).
.
├── app
│ ...
│ └── [Link]
└── [Link]
This is the recommended project structure for starting any Gradle project. The build init plugin also
generates skeleton projects that follow this structure - a root project with a single subproject:
[Link]
[Link] = "basic-multiproject"
include("app")
[Link]
[Link] = 'basic-multiproject'
include 'app'
In this case, Gradle will look for a build file for the app subproject in the ./app directory.
You can view the structure of a multi-project build by running the projects command:
$ ./gradlew -q projects
Projects:
------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------
In this example, the app subproject is a Java application that applies the application plugin and
configures the main class. The application prints Hello World to the console:
app/[Link]
plugins {
id("application")
}
application {
mainClass = "[Link]"
}
app/[Link]
plugins {
id 'application'
}
application {
mainClass = '[Link]'
}
app/src/main/java/com/example/[Link]
package [Link];
You can run the application by executing the run task from the application plugin in the project
root:
$ ./gradlew -q run
Hello, world!
Adding a subproject
In the settings file, you can use the include method to add another subproject to the root project:
[Link]
[Link]
The include method takes project paths as arguments. The project path is assumed to be equal to
the relative physical file system path. For example, a path services:api is mapped by default to a
folder ./services/api (relative to the project root .).
More examples of how to work with the project path can be found in the DSL documentation of
[Link]([Link][]).
Let’s add another subproject called lib to the previously created project.
All we need to do is add another include statement in the root settings file:
[Link]
[Link] = "basic-multiproject"
include("app")
include("lib")
[Link]
[Link] = 'basic-multiproject'
include 'app'
include 'lib'
Gradle will then look for the build file of the new lib subproject in the ./lib/ directory:
.
├── app
│ ...
│ └── [Link]
├── lib
│ ...
│ └── [Link]
└── [Link]
Project Descriptors
To further describe the project architecture to Gradle, the settings file provides project descriptors.
You can modify these descriptors in the settings file at any time.
[Link]
include("project-a")
println([Link])
println(project(":project-a").name)
[Link]
include('project-a')
println [Link]
println project(':project-a').name
Using this descriptor, you can change the name, project directory, and build file of a project:
[Link]
[Link] = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "[Link]"
[Link]
[Link] = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = '[Link]'
Consult the ProjectDescriptor class in the API documentation for more information.
.
├── app
│ ...
│ └── [Link]
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── [Link]
└── [Link]
include(':subs:web:my-web-module')
Gradle sees a subproject with a logical project name of :subs:web:my-web-module and two, possibly
unintentional, other subprojects logically named :subs and :subs:web. This can lead to phantom
build directories, especially when using allprojects{} or subprojects{}.
include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')
So, while the physical project layout is the same, the logical results are different.
Naming recommendations
As your project grows, naming and consistency get increasingly more important. To keep your
builds maintainable, we recommend the following:
1. Keep default project names for subprojects: It is possible to configure custom project names
in the settings file. However, it creates unnecessary extra effort for developers to track which
projects belong to which folders.
2. Use lower case hyphenation for all project names: All letters are lowercase, and words are
separated with a dash (-) character.
3. Define the root project name in the settings file: The [Link] effectively assigns a
name to the build, which is used in reports like Build Scans. If the root project name is not set, the
name will be the containing directory name, which can be unstable (i.e., you can check out your
project into any directory). The name will be generated randomly if the root project name is not set
and the project is checked out at a file system root (e.g., / or C:\).
.
├── api
│ ├── src
│ │ └──...
│ └── [Link]
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── [Link]
├── shared
│ ├── src
│ │ └──...
│ └── [Link]
└── [Link]
In this example, there are three subprojects called shared, api, and person-service:
1. The person-service subproject depends on the other two subprojects, shared and api.
We use the : separator to define a project path such as services:person-service or :shared. Consult
the DSL documentation of [Link]([Link][]) for more information about defining
project paths.
[Link]
[Link] = "dependencies-java"
include("api", "shared", "services:person-service")
shared/[Link]
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
api/[Link]
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
}
services/person-service/[Link]
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
implementation(project(":api"))
}
[Link]
[Link] = 'basic-dependencies'
include 'api', 'shared', 'services:person-service'
shared/[Link]
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
api/[Link]
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
}
services/person-service/[Link]
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
implementation project(':api')
}
A project dependency affects execution order. It causes the other project to be built first and adds
the other project’s compiled classes to the classpath. It also adds the dependencies of the
other project to the classpath.
If you execute ./gradlew :api:compileJava, the shared project is built first, and then the api project
is built.
Sometimes, you might want to depend on the output of a specific task within another project rather
than the entire project. However, explicitly declaring a task dependency from one project to
another is discouraged as it introduces unnecessary coupling between tasks.
The recommended way to model dependencies, where a task in one project depends on the output
of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency
management engine allows you to share arbitrary artifacts between projects and build them on
demand.
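As a sketch of that approach, a producer project can expose a task output on a consumable configuration, and a consumer can resolve it through a project dependency. The project, configuration, and task names used here (producer, instrumentedJars, instrumentedJar) are illustrative, not part of the example above:

```kotlin
// producer/build.gradle.kts (sketch): expose a task output as an outgoing artifact
val instrumentedJars by configurations.creating {
    isCanBeConsumed = true   // other projects may depend on this configuration
    isCanBeResolved = false  // it is not meant to be resolved here
}

artifacts {
    // "instrumentedJar" is a hypothetical task producing the artifact
    add("instrumentedJars", tasks.named("instrumentedJar"))
}

// consumer/build.gradle.kts (sketch): resolve the producer's artifact on demand
val instrumentedClasspath by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    instrumentedClasspath(project(mapOf("path" to ":producer", "configuration" to "instrumentedJars")))
}
```

With this wiring, resolving instrumentedClasspath triggers the producer task, so the artifact is built only when a consumer actually needs it.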
buildSrc is a Gradle-recognized and protected directory which comes with some benefits:
1. Reusable Build Logic:
buildSrc allows you to organize and centralize your custom build logic, tasks, and plugins in a
structured manner. The code written in buildSrc can be reused across your project, making it
easier to maintain and share common build functionality.
2. Isolation from the Main Build:
Code placed in buildSrc is isolated from the other build scripts of your project. This helps keep
the main build scripts cleaner and more focused on project-specific configurations.
3. Automatic Compilation and Classpath:
The contents of the buildSrc directory are automatically compiled and included in the classpath
of your main build. This means that classes and plugins defined in buildSrc can be directly used
in your project’s build scripts without any additional configuration.
4. Ease of Testing:
Since buildSrc is a separate build, it allows for easy testing of your custom build logic. You can
write tests for your build code, ensuring that it behaves as expected.
5. Gradle Plugin Development:
If you are developing custom Gradle plugins for your project, buildSrc is a convenient place to
house the plugin code. This makes the plugins easily accessible within your project.
For multi-project builds, there can be only one buildSrc directory, which must be in the root project
directory.
NOTE: The downside of using buildSrc is that any change to it will invalidate every task in your
project and require a rerun.
buildSrc uses the same source code conventions applicable to Java, Groovy, and Kotlin projects. It
also provides direct access to the Gradle API.
.
├── buildSrc
│   ├── src
│   │   └── main
│   │       └── kotlin
│   │           └── MyCustomTask.kt ①
│   ├── shared.gradle.kts ②
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └── ...
│   └── build.gradle.kts ③
├── services
│   └── person-service
│       ├── src
│       │   └── ...
│       └── build.gradle.kts ③
├── shared
│   ├── src
│   │   └── ...
│   └── build.gradle.kts
└── settings.gradle.kts
.
├── buildSrc
│   ├── src
│   │   └── main
│   │       └── groovy
│   │           └── MyCustomTask.groovy ①
│   ├── shared.gradle ②
│   └── build.gradle
├── api
│   ├── src
│   │   └── ...
│   └── build.gradle ③
├── services
│   └── person-service
│       ├── src
│       │   └── ...
│       └── build.gradle ③
├── shared
│   ├── src
│   │   └── ...
│   └── build.gradle
└── settings.gradle
In the buildSrc, the build script shared.gradle(.kts) is created. It contains dependencies and other
build information that is common to multiple subprojects:
buildSrc/shared.gradle.kts
repositories {
mavenCentral()
}
dependencies {
implementation("org.slf4j:slf4j-api:1.7.32")
}
buildSrc/shared.gradle
repositories {
mavenCentral()
}
dependencies {
implementation 'org.slf4j:slf4j-api:1.7.32'
}
In the buildSrc, the MyCustomTask is also created. It is a helper task that is used as part of the build
logic for multiple subprojects:
buildSrc/src/main/kotlin/MyCustomTask.kt
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
buildSrc/src/main/groovy/MyCustomTask.groovy
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
The MyCustomTask task is used in the build script of the api and shared projects. The task is
automatically available because it’s part of buildSrc.
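The task's body is not reproduced in this guide; a minimal sketch of what such a helper task could look like follows. The action and its message are illustrative, not the original implementation:

```kotlin
// buildSrc/src/main/kotlin/MyCustomTask.kt (sketch)
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

abstract class MyCustomTask : DefaultTask() {
    @TaskAction
    fun run() {
        // placeholder action; a real task would perform useful build work here
        println("Running MyCustomTask")
    }
}
```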
The shared.gradle(.kts) file is also applied:
api/build.gradle.kts
api/build.gradle
Gradle’s recommended way of organizing build logic is to use its plugin system.
We can write a plugin that encapsulates the build logic common to several subprojects in a project.
This kind of plugin is called a convention plugin.
While writing plugins is outside the scope of this section, the recommended way to build a Gradle
project is to put common build logic in a convention plugin located in the buildSrc.
.
├── buildSrc
│   ├── src
│   │   └── main
│   │       └── kotlin
│   │           └── buildlogic.java-conventions.gradle.kts ①
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └── ...
│   └── build.gradle.kts ②
├── services
│   └── person-service
│       ├── src
│       │   └── ...
│       └── build.gradle.kts ②
├── shared
│   ├── src
│   │   └── ...
│   └── build.gradle.kts ②
└── settings.gradle.kts
.
├── buildSrc
│   ├── src
│   │   └── main
│   │       └── groovy
│   │           └── buildlogic.java-conventions.gradle ①
│   └── build.gradle
├── api
│   ├── src
│   │   └── ...
│   └── build.gradle ②
├── services
│   └── person-service
│       ├── src
│       │   └── ...
│       └── build.gradle ②
├── shared
│   ├── src
│   │   └── ...
│   └── build.gradle ②
└── settings.gradle
settings.gradle.kts
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
settings.gradle
rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'
The source code for the convention plugin created in the buildSrc directory is as follows:
buildSrc/src/main/kotlin/buildlogic.java-conventions.gradle.kts
plugins {
id("java")
}
group = "com.example"
version = "1.0"
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
buildSrc/src/main/groovy/buildlogic.java-conventions.gradle
plugins {
id 'java'
}
group = 'com.example'
version = '1.0'
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
For the convention plugin to compile, basic configuration needs to be applied in the build file of the
buildSrc directory:
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
The convention plugin is applied to the api, shared, and person-service subprojects:
api/build.gradle.kts
plugins {
id("buildlogic.java-conventions")
}
dependencies {
implementation(project(":shared"))
}
shared/build.gradle.kts
plugins {
id("buildlogic.java-conventions")
}
services/person-service/build.gradle.kts
plugins {
id("buildlogic.java-conventions")
}
dependencies {
implementation(project(":shared"))
implementation(project(":api"))
}
api/build.gradle
plugins {
id 'buildlogic.java-conventions'
}
dependencies {
implementation project(':shared')
}
shared/build.gradle
plugins {
id 'buildlogic.java-conventions'
}
services/person-service/build.gradle
plugins {
id 'buildlogic.java-conventions'
}
dependencies {
implementation project(':shared')
implementation project(':api')
}
An improper way to share build logic between subprojects is cross-project configuration via the
subprojects {} and allprojects {} DSL constructs.
TIP Avoid using subprojects {} and allprojects {}.
With cross-project configuration, build logic can be injected into a subproject which is not obvious
when looking at its build script.
In the long run, cross-project configuration usually grows in complexity and becomes a burden.
Cross-project configuration can also introduce configuration-time coupling between projects, which
can prevent optimizations like configuration-on-demand from working properly.
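For illustration only, such injected configuration typically looks like the following root build script sketch; nothing in a subproject's own build script reveals that this logic applies to it:

```kotlin
// root build.gradle.kts (discouraged pattern, shown only for illustration)
subprojects {
    // configuration injected into every subproject from the root script
    apply(plugin = "java")
    repositories {
        mavenCentral()
    }
}
```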
The two most common uses of cross-project configuration can be better modeled using convention
plugins.
Composite Builds
A composite build is a build that includes other builds.
A composite build is similar to a Gradle multi-project build, except that instead of including
subprojects, entire builds are included.
• Combine builds that are usually developed independently, for instance, when trying out a bug
fix in a library that your application uses.
• Decompose a large multi-project build into smaller, more isolated chunks that can be worked on
independently or together as needed.
A build that is included in a composite build is referred to as an included build. Included builds do
not share any configuration with the composite build or the other included builds. Each included
build is configured and executed in isolation.
The following example demonstrates how two Gradle builds, normally developed separately, can be
combined into a composite build.
my-composite
├── gradle
├── gradlew
├── gradlew.bat
├── settings.gradle.kts
├── my-app
│   ├── settings.gradle.kts
│   └── app
│       ├── build.gradle.kts
│       └── src/main/java/org/sample/my-app/Main.java
└── my-utils
    ├── settings.gradle.kts
    ├── number-utils
    │   ├── build.gradle.kts
    │   └── src/main/java/org/sample/numberutils/Numbers.java
    └── string-utils
        ├── build.gradle.kts
        └── src/main/java/org/sample/stringutils/Strings.java
The my-utils multi-project build produces two Java libraries, number-utils and string-utils. The my-
app build produces an executable using functions from those libraries.
The my-app build does not depend directly on my-utils. Instead, it declares binary dependencies on
the libraries produced by my-utils:
my-app/app/build.gradle.kts
plugins {
id("application")
}
application {
mainClass = "org.sample.myapp.Main"
}
dependencies {
implementation("org.sample:number-utils:1.0")
implementation("org.sample:string-utils:1.0")
}
my-app/app/build.gradle
plugins {
id 'application'
}
application {
mainClass = 'org.sample.myapp.Main'
}
dependencies {
implementation 'org.sample:number-utils:1.0'
implementation 'org.sample:string-utils:1.0'
}
The --include-build command-line argument turns the executed build into a composite,
substituting dependencies from the included build into the executed build. For example, running
./gradlew run --include-build ../my-utils from my-app builds my-app against the libraries
produced by the included my-utils build and then runs the application.
The settings file can be used to add subprojects and included builds simultaneously.
settings.gradle.kts
includeBuild("my-utils")
In the example, the settings.gradle(.kts) file combines otherwise separate builds:
settings.gradle.kts
rootProject.name = "my-composite"
includeBuild("my-app")
includeBuild("my-utils")
settings.gradle
rootProject.name = 'my-composite'
includeBuild 'my-app'
includeBuild 'my-utils'
To execute the run task in the my-app build from my-composite, run ./gradlew my-app:app:run.
You can optionally define a run task in my-composite that depends on my-app:app:run so that you can
execute ./gradlew run:
build.gradle.kts
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
A special case of included builds are builds that define Gradle plugins.
These builds should be included using the includeBuild statement inside the pluginManagement {}
block of the settings file.
Using this mechanism, the included build may also contribute a settings plugin that can be applied
in the settings file itself:
settings.gradle.kts
pluginManagement {
includeBuild("../url-verifier-plugin")
}
settings.gradle
pluginManagement {
includeBuild '../url-verifier-plugin'
}
Most builds can be included in a composite, including other composite builds. There are some
restrictions.
In a regular build, Gradle ensures that each project has a unique project path. It makes projects
identifiable and addressable without conflicts.
In a composite build, Gradle adds additional qualification to each project from an included build to
avoid project path conflicts. The full path to identify a project in a composite build is called a build-
tree path. It consists of a build path of an included build and a project path of the project.
By default, build paths and project paths are derived from directory names and structure on disk.
Since included builds can be located anywhere on disk, their build path is determined by the name
of the containing directory. This can sometimes lead to conflicts.
• Each included build path must not conflict with any project path of the main build.
These conditions guarantee that each project can be uniquely identified even in a composite build.
If conflicts arise, the way to resolve them is by changing the build name of an included build:
settings.gradle.kts
includeBuild("some-included-build") {
name = "other-name"
}
NOTE: When a composite build is included in another composite build, both builds have the same
parent. In other words, the nested composite build structure is flattened.
Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be
executed, tests can be run, and builds can be imported into the IDE.
Executing tasks
Tasks from an included build can be executed from the command-line or IDE in the same way as
tasks from a regular multi-project build. Executing a task will result in task dependencies being
executed, as well as those tasks required to build dependency artifacts from other included builds.
You can call a task in an included build using a fully qualified path, for example, :included-build-
name:project-name:taskName. Project and task names can be abbreviated.
$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava
$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava
To exclude a task from the command line, you need to provide the fully qualified path to the task.
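For example, assuming the build and project names used above, a task in an included build could be excluded with the -x option (the names are illustrative):

```shell
$ ./gradlew :included-build:subproject-a:build -x :included-build:subproject-a:test
```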
Importing a composite build permits sources from separate Gradle builds to be easily developed
together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse
Project. Source dependencies are configured, providing cross-build navigation and refactoring.
By default, Gradle will configure each included build to determine the dependencies it can provide.
The algorithm for doing this is simple. Gradle will inspect the group and name of the projects in
the included build and substitute project dependencies for any external dependency matching
${project.group}:${project.name}.
NOTE: By default, substitutions are not registered for the main build. To make the (sub)projects of
the main build addressable by ${project.group}:${project.name}, you can tell Gradle to treat the
main build like an included build by self-including it: includeBuild(".").
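A minimal settings file sketch with self-inclusion; the project and build names are illustrative:

```kotlin
// settings.gradle.kts of the main build (sketch)
rootProject.name = "my-app"
include("app")

// Treat the main build like an included build so its subprojects
// are addressable by ${project.group}:${project.name}.
includeBuild(".")
```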
There are cases when the default substitutions determined by Gradle are insufficient or must be
corrected for a particular composite. For these cases, explicitly declaring the substitutions for an
included build is possible.
For example, a single-project build called anonymous-library produces a Java utility library but does
not declare a value for the group attribute:
build.gradle.kts
plugins {
java
}
build.gradle
plugins {
id 'java'
}
When this build is included in a composite, it will attempt to substitute for the dependency module
undefined:anonymous-library (undefined being the default value for project.group, and anonymous-
library being the root project name). Clearly, this isn’t useful in a composite build.
To use the unpublished library in a composite build, you can explicitly declare the substitutions
that it provides:
settings.gradle.kts
includeBuild("anonymous-library") {
dependencySubstitution {
substitute(module("[Link]:number-utils")).using(project(":"))
}
}
settings.gradle
includeBuild('anonymous-library') {
dependencySubstitution {
substitute module('[Link]:number-utils') using project(':')
}
}
With this configuration, the my-app composite build will substitute any dependency on
[Link]:number-utils with a dependency on the root project of anonymous-library.
If you need to resolve a published version of a module that is also available as part of an included
build, you can deactivate the included build substitution rules on the ResolutionStrategy of the
Configuration that is resolved. This is necessary because the rules are globally applied in the build,
and Gradle does not consider published versions during resolution by default.
For example, we create a separate publishedRuntimeClasspath configuration that gets resolved to the
published versions of modules that also exist in one of the local builds. This is done by deactivating
global dependency substitution rules:
build.gradle.kts
configurations.create("publishedRuntimeClasspath") {
resolutionStrategy.useGlobalDependencySubstitutionRules.set(false)
extendsFrom(configurations.runtimeClasspath.get())
isCanBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE,
objects.named(Usage.JAVA_RUNTIME))
}
build.gradle
configurations.create('publishedRuntimeClasspath') {
resolutionStrategy.useGlobalDependencySubstitutionRules = false
extendsFrom(configurations.runtimeClasspath)
canBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
}
Many builds will function automatically as an included build, without declared substitutions. Here
are some common cases where declared substitutions are required:
• When the archivesBaseName property is used to set the name of the published artifact.
• When the MavenPom.addFilter() is used to publish artifacts that don’t match the project name.
• When the maven-publish or ivy-publish plugins are used for publishing and the publication
coordinates don’t match ${project.group}:${project.name}.
Some builds won’t function correctly when included in a composite, even when dependency
substitutions are explicitly declared. This limitation is because a substituted project dependency
will always point to the default configuration of the target project. Any time the artifacts and
dependencies specified for the default configuration of a project don’t match what is published to a
repository, the composite build may exhibit different behavior.
In some cases, the published module metadata may differ from the project default configuration;
builds using these features function incorrectly when included in a composite build.
While included builds are isolated from one another and cannot declare direct dependencies, a
composite build can declare task dependencies on its included builds. The included builds are
accessed using Gradle.includedBuild(String) or Gradle.getIncludedBuilds(), and a task
reference is obtained via the IncludedBuild.task(String) method.
Using these APIs, it is possible to declare a dependency on a task in a particular included build:
build.gradle.kts
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
Or you can declare a dependency on tasks with a certain path in some or all of the included builds:
build.gradle.kts
tasks.register("publishDeps") {
dependsOn(gradle.includedBuilds.map {
it.task(":publishMavenPublicationToMavenRepository") })
}
build.gradle
tasks.register('publishDeps') {
dependsOn gradle.includedBuilds*.task(
':publishMavenPublicationToMavenRepository')
}
• No support for included builds with publications that don’t mirror the project default
configuration.
See Cases where composite builds won’t work.
• Multiple composite builds may conflict when run in parallel if more than one includes the same
build.
Gradle does not share the project lock of a shared composite build between Gradle invocations
to prevent concurrent execution.
Configuration On Demand
Configuration-on-demand attempts to configure only the relevant projects for the requested tasks,
i.e., it only evaluates the build script file of projects participating in the build. This way, the
configuration time of a large multi-project build can be reduced.
The configuration-on-demand feature is incubating, so not every build is guaranteed to work
correctly. The feature works well for decoupled multi-project builds.
• The project in the directory where the build is executed is also configured, but only when
Gradle is executed without any tasks.
This way, the default tasks behave correctly when projects are configured on demand.
• The standard project dependencies are supported, and relevant projects are configured.
If project A has a compile dependency on project B, then building A causes the configuration of
both projects.
• The task dependencies declared via the task path are supported and cause relevant projects to
be configured.
Example: dependsOn(":some-other-project:someOtherTask")
• A task requested via task path from the command line (or tooling API) causes the relevant
project to be configured.
For example, building project-a:project-b:someTask causes configuration of project-b.
Enable configuration-on-demand
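Configuration on demand can be enabled persistently via a Gradle property, or per invocation with the --configure-on-demand command-line option:

```properties
# gradle.properties in the project root directory
org.gradle.configureondemand=true
```

Alternatively, run a single build with it: ./gradlew <task> --configure-on-demand.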
Decoupled projects
Gradle allows projects to access each other’s configurations and tasks during the configuration and
execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform
optimizations such as parallel project builds and configuration on demand.
Projects are considered decoupled when they interact solely through declared dependencies and
task dependencies. Any direct modification or reading of another project’s object creates coupling
between the projects. Coupling during configuration can result in flawed build outcomes when
using 'configuration on demand', while coupling during execution can affect parallel execution.
• Refrain from referencing other subprojects' build scripts and prefer cross-project configuration
from the root project.
Parallel projects
Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently
executing tasks from different projects.
To enable parallel execution, use the --parallel command-line argument or configure your build
environment. Gradle automatically determines the optimal number of parallel threads based on
CPU cores.
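For example, to enable parallel execution for every build of a project, set the corresponding property in gradle.properties:

```properties
# gradle.properties in the project root directory
org.gradle.parallel=true
```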
During parallel execution, each worker handles a specific project exclusively. Task dependencies
are respected, with workers prioritizing upstream tasks. However, tasks may not execute in
alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and
inputs/outputs to avoid ordering issues.
DEVELOPING TASKS
Understanding Tasks
A task represents some independent unit of work that a build performs, such as compiling classes,
creating a JAR, generating Javadoc, or publishing archives to a repository.
Before reading this chapter, it’s recommended that you first read the Learning The Basics and
complete the Tutorial.
Listing tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in a project by running the following command in the terminal:
$ ./gradlew tasks
Let’s take a very basic Gradle project as an example. The project has the following structure:
gradle-project
├── app
│   ├── build.gradle.kts // empty file - no build logic
│   └── ... // some java code
├── settings.gradle.kts // includes app subproject
├── gradle
│   └── ...
├── gradlew
└── gradlew.bat
gradle-project
├── app
│   ├── build.gradle // empty file - no build logic
│   └── ... // some java code
├── settings.gradle // includes app subproject
├── gradle
│   └── ...
├── gradlew
└── gradlew.bat
settings.gradle.kts
rootProject.name = "gradle-project"
include("app")
settings.gradle
rootProject.name = 'gradle-project'
include('app')
To see the tasks available in the app subproject, run ./gradlew :app:tasks:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
We observe that only a small number of help tasks are available at the moment. This is because the
core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build
your project or compile your code, are added by plugins.
Let’s explore this by adding the Gradle core base plugin to the app build script:
app/build.gradle.kts
plugins {
id("base")
}
app/build.gradle
plugins {
id('base')
}
The base plugin adds central lifecycle tasks. Now when we run ./gradlew app:tasks, we can see the
assemble and build tasks are available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
Task outcomes
When Gradle executes a task, it labels the task with outcomes via the console.
These labels are based on whether a task has actions to execute and if Gradle executed them.
Actions include, but are not limited to, compiling code, zipping files, and publishing archives.
EXECUTED
Task executed its actions.
• Task has actions and Gradle executed them.
• Task has no actions and some dependencies, and Gradle executed one or more of the
dependencies. See also Lifecycle Tasks.
UP-TO-DATE
Task’s outputs did not change.
• Task has outputs and inputs but they have not changed. See Incremental Build.
• Task has actions, but the task tells Gradle it did not change its outputs.
• Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED
or FROM-CACHE. See Lifecycle Tasks.
FROM-CACHE
Task’s outputs could be found from a previous execution.
• Task has outputs restored from the build cache. See Build Cache.
SKIPPED
Task did not execute its actions.
• Task has been explicitly excluded from the command-line. See Excluding tasks from
execution.
NO-SOURCE
Task did not need to execute its actions.
• Task has inputs and outputs, but no sources (i.e., inputs were not found).
Task groups and descriptions are used to organize and describe tasks.
Groups
Task groups are used to categorize tasks. When you run ./gradlew tasks, tasks are listed under
their respective groups, making it easier to understand their purpose and relationship to other
tasks. Groups are set using the group property.
Descriptions
Descriptions provide a brief explanation of what a task does. When you run ./gradlew tasks, the
descriptions are shown next to each task, helping you understand its purpose and how to use it.
Descriptions are set using the description property.
Let’s consider a basic Java application as an example. The build contains a subproject called app.
$ ./gradlew :app:tasks
Application tasks
-----------------
run - Runs this project as a JVM application.
Build tasks
-----------
assemble - Assembles the outputs of this project.
Here, the :run task is part of the Application group with the description Runs this project as a JVM
application. In code, it would look something like this:
app/build.gradle.kts
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
app/build.gradle
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
However, tasks will only show up when running :tasks if task.group is set or no other task depends
on it.
For instance, the following task will not appear when running ./gradlew :app:tasks because it does
not have a group; it is called a hidden task:
app/build.gradle.kts
tasks.register("helloTask") {
println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
println 'Hello'
}
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
app/build.gradle.kts
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println 'Hello'
}
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
Other tasks
-----------
helloTask - Hello task
In contrast, ./gradlew tasks --all will show all tasks; hidden and visible tasks are listed.
Grouping tasks
If you want to customize which tasks are shown to users when listed, you can group tasks and set
the visibility of each group.
NOTE: Remember, even if you hide tasks, they are still available, and Gradle can still run them.
Let’s start with an example built by Gradle init for a Java application with multiple subprojects.
The project structure is as follows:
gradle-project
├── app
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── utilities
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── list
│   ├── build.gradle.kts
│   └── src // some java code
│       └── ...
├── buildSrc
│   ├── build.gradle.kts
│   ├── settings.gradle.kts
│   └── src // common build logic
│       └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat
gradle-project
├── app
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── utilities
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── list
│   ├── build.gradle
│   └── src // some java code
│       └── ...
├── buildSrc
│   ├── build.gradle
│   ├── settings.gradle
│   └── src // common build logic
│       └── ...
├── settings.gradle
├── gradle
├── gradlew
└── gradlew.bat
$ ./gradlew :app:tasks
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently
available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
If we look at the list of tasks available, even for a standard Java project, it’s extensive. Many of these
tasks are rarely required directly by developers using the build.
We can configure the :tasks task and limit the tasks shown to a certain group.
Let’s create our own group so that all tasks are hidden by default by updating the app build script:
app/build.gradle.kts
app/build.gradle
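The build script content is not reproduced above; a sketch of such a configuration, assuming the TaskReportTask type and its displayGroup property, could look like the following. The group name is illustrative:

```kotlin
// app/build.gradle.kts (sketch)
val appBuildGroup = "app build" // hypothetical custom group name

// Register an extra report task that lists everything in detail.
tasks.register<org.gradle.api.tasks.diagnostics.TaskReportTask>("tasksAll") {
    group = appBuildGroup
    description = "Shows all tasks."
    setShowDetail(true)
}

// Restrict the regular :tasks report to the custom group,
// hiding everything else by default.
tasks.named<org.gradle.api.tasks.diagnostics.TaskReportTask>("tasks") {
    displayGroup = appBuildGroup
}
```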
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Task categories
1. Lifecycle tasks
2. Actionable tasks
Lifecycle tasks define targets you can call, such as :build, which builds your project. Lifecycle
tasks do not provide Gradle with actions. They must be wired to actionable tasks. The base Gradle
plugin only adds lifecycle tasks.
Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java
code of your project. Actions include creating JARs, zipping files, publishing archives, and much
more. Plugins like the java-library plugin add actionable tasks.
Let’s update the build script of the previous example, which is currently an empty file so that our
app subproject is a Java library:
app/build.gradle.kts
plugins {
id("java-library")
}
app/build.gradle
plugins {
id('java-library')
}
Once again, we list the available tasks to see what new tasks are available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project
':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
We see that many new tasks are available, such as jar and testClasses.
Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks. If we call the
:build task, we can see several tasks have been executed, including the :app:compileJava task.
$ ./gradlew :app:build
Incremental tasks
Gradle can reuse results from prior builds. Therefore, if we’ve built our project before and made
only minor changes, rerunning :build will not require Gradle to perform extensive work.
For example, if we modify only the test code in our project, leaving the production code unchanged,
executing the build will solely recompile the test code. Gradle marks the tasks for the production
code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:
$ ./gradlew :app:build
Caching tasks
Gradle can reuse results from past builds using the build cache.
To enable this feature, activate the build cache by using the --build-cache command line parameter
or by setting org.gradle.caching=true in your gradle.properties file.
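For example, enabling the cache for every build of a project takes a single line, sketched here as a project-root gradle.properties file:

```properties
# gradle.properties (project root)
# Enable the Gradle build cache for all builds of this project
org.gradle.caching=true
```

The --build-cache flag achieves the same for a single invocation, e.g. ./gradlew build --build-cache.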
When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.
The build cache is handy if you switch between branches regularly. Gradle supports both local and
remote build caches.
Developing tasks
1. Registering a task - using a task (implemented by you or provided by Gradle) in your build
logic.
2. Configuring a task - defining the inputs and outputs of the registered task.
3. Implementing a task - creating a custom task class (i.e., custom class type).
tasks.register<Copy>("myCopy") ①
tasks.named<Copy>("myCopy") { ②
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.
② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.
③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.
tasks.register('myCopy', Copy) ①
tasks.named('myCopy', Copy) { ②
from "resources"
into "target"
include "**/*.txt", "**/*.xml", "**/*.properties"
}
① Register the myCopy task of type Copy to let Gradle know we intend to use it in our build
logic.
② Configure the registered myCopy task with the inputs and outputs it needs according to
its API.
③ Implement a custom task type called MyCopyTask which extends DefaultTask and defines
the copyFiles task action.
1. Registering tasks
You define actions for Gradle to take by registering tasks in build scripts or plugins.
Tasks are defined using strings for task names:
build.gradle.kts
tasks.register("hello") {
doLast {
println("hello")
}
}
build.gradle
tasks.register('hello') {
doLast {
println 'hello'
}
}
In the example above, the task is added to the TaskContainer using its register() method.
2. Configuring tasks
Gradle tasks must be configured to complete their action(s) successfully. If a task needs to ZIP a file,
it must be configured with the file name and location. You can refer to the API for the Gradle Zip
task to learn how to configure it appropriately.
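For instance, a Zip task might be configured as sketched below. The task name zipDocs, the docs source folder, and the archive name are illustrative, but from, destinationDirectory, and archiveFileName belong to the Zip task's API:

```kotlin
// build.gradle.kts -- a sketch of configuring Gradle's built-in Zip task
tasks.register<Zip>("zipDocs") {
    from("docs")                                              // files to compress
    destinationDirectory = layout.buildDirectory.dir("dist")  // where the archive goes
    archiveFileName = "docs.zip"                              // name of the archive
}
```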
Let’s look at the Copy task provided by Gradle as an example. We first register a task called myCopy of
type Copy in the build script:
build.gradle.kts
tasks.register<Copy>("myCopy")
build.gradle
tasks.register('myCopy', Copy)
This registers a copy task with no default behavior. Since the task is of type Copy, a Gradle supported
task type, it can be configured using its API.
The following examples show several ways to achieve the same configuration:
build.gradle.kts
tasks.named<Copy>("myCopy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.named('myCopy') {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
build.gradle.kts
tasks.register<Copy>("copy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.register('copy', Copy) {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
copy {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
NOTE This option breaks task configuration avoidance and is not recommended!
Regardless of the method chosen, the task is configured with the name of the files to be copied and
the location of the files.
3. Implementing tasks
Gradle provides many task types including Delete, Javadoc, Copy, Exec, Tar, and Pmd. You can
implement a custom task type if Gradle does not provide a task type that meets your build logic
needs.
To create a custom task class, you extend DefaultTask and make the extending class abstract:
app/build.gradle.kts
app/build.gradle
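Following the MyCopyTask described in the callouts earlier, such a class might be sketched as follows; the copying logic itself is reduced to a placeholder:

```kotlin
// build.gradle.kts -- a sketch of a custom task type
abstract class MyCopyTask : DefaultTask() {
    @TaskAction // the method Gradle invokes when the task executes
    fun copyFiles() {
        // a real implementation would copy configured inputs to outputs
        logger.quiet("Copying files...")
    }
}

tasks.register<MyCopyTask>("myCopy")
```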
Implicit dependencies
These dependencies are automatically inferred by Gradle based on the tasks' actions and
configuration. For example, if taskB uses the output of taskA (e.g., a file generated by taskA),
Gradle will automatically ensure that taskA is executed before taskB to fulfill this dependency.
Explicit dependencies
These dependencies are explicitly declared in the build script using the dependsOn, mustRunAfter,
or shouldRunAfter methods. For example, if you want to ensure that taskB always runs after
taskA, you can explicitly declare this dependency using taskB.dependsOn(taskA).
Both implicit and explicit dependencies play a crucial role in defining the order of task execution
and ensuring that tasks are executed in the correct sequence to produce the desired build output.
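As a minimal sketch of the explicit form, using the placeholder task names from above:

```kotlin
// build.gradle.kts -- explicit dependency: taskA always runs before taskB
val taskA by tasks.registering {
    doLast { println("taskA") }
}
tasks.register("taskB") {
    dependsOn(taskA)
    doLast { println("taskB") }
}
```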
Task dependencies
Gradle inherently understands the dependencies among tasks. Consequently, it can determine the
tasks that need execution when you target a specific task.
Let’s take an example application with an app subproject and a some-logic subproject:
settings.gradle.kts
rootProject.name = "gradle-project"
include("app")
include("some-logic")
settings.gradle
rootProject.name = 'gradle-project'
include('app')
include('some-logic')
Let’s imagine that the app subproject depends on the subproject called some-logic, which contains
some Java code. We add this dependency in the app build script:
app/build.gradle.kts
plugins {
id("application") // app is now a java application
}
application {
mainClass.set("org.example.app.App") // main class name required by the application plugin
}
dependencies {
implementation(project(":some-logic")) // dependency on some-logic
}
app/build.gradle
plugins {
id('application') // app is now a java application
}
application {
mainClass = 'org.example.app.App' // main class name required by the application plugin
}
dependencies {
implementation(project(':some-logic')) // dependency on some-logic
}
If we run :app:build again, we see the Java code of some-logic is also compiled by Gradle
automatically:
$ ./gradlew :app:build
Adding dependencies
There are several ways you can define the dependencies of a task.
Defining dependencies using task names and the dependsOn() method is simplest.
tasks.register("taskX") {
dependsOn("taskY")
}
tasks.register("taskX") {
dependsOn "taskY"
}
$ gradle -q taskX
taskY
taskX
For more information about task dependencies, see the Task API.
Ordering tasks
In some cases, it is useful to control the order in which two tasks will execute, without introducing
an explicit dependency between those tasks.
The primary difference between a task ordering and a task dependency is that an ordering rule does
not influence which tasks will be executed, only the order in which they will be executed.
• Enforce sequential ordering of tasks (e.g., build never runs before clean).
• Run build validations early in the build (e.g., validate I have the correct credentials before
starting the work for a release build).
• Get feedback faster by running quick verification tasks before long verification tasks (e.g., unit
tests should run before integration tests).
• A task that aggregates the results of all tasks of a particular type (e.g., test report task combines
the outputs of all executed test tasks).
Two ordering rules are available: "must run after" and "should run after".
To specify a "must run after" or "should run after" ordering between 2 tasks, you use the
Task.mustRunAfter(Object...) and Task.shouldRunAfter(Object...) methods. These
methods accept a task instance, a task name, or any other input accepted by
Task.dependsOn(Object...).
When you use "must run after", you specify that taskY must always run after taskX when the build
requires the execution of taskX and taskY. So if you only run taskY with mustRunAfter, you won’t
cause taskX to run. This is expressed as taskY.mustRunAfter(taskX).
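A sketch of this wiring with two ad-hoc tasks:

```kotlin
// build.gradle.kts -- ordering only; running taskY alone does not trigger taskX
val taskX by tasks.registering { doLast { println("taskX") } }
val taskY by tasks.registering {
    mustRunAfter(taskX)
    doLast { println("taskY") }
}
```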
build.gradle.kts
The "should run after" ordering rule is similar but less strict, as it will be ignored in two
situations:
1. When the rule would introduce an ordering cycle.
2. When using parallel execution and all task dependencies have been satisfied apart from the
"should run after" task, then this task will be run regardless of whether or not its "should run
after" dependencies have been run.
You should use "should run after" where the ordering is helpful but not strictly required:
build.gradle.kts
In the examples above, it is still possible to execute taskY without causing taskX to run:
$ gradle -q taskY
taskY
The “should run after” ordering rule will be ignored if it introduces an ordering cycle:
build.gradle.kts
build.gradle
$ gradle -q taskX
taskZ
taskY
taskX
• It is possible to execute taskX and taskY independently. The ordering rule only has an effect
when both tasks are scheduled for execution.
• When run with --continue, it is possible for taskY to execute if taskX fails.
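A wiring that exercises the cycle rule and matches the output shown above might be sketched as:

```kotlin
// build.gradle.kts -- taskZ's "should run after" is ignored: honoring it would create a cycle
val taskX by tasks.registering { doLast { println("taskX") } }
val taskY by tasks.registering { doLast { println("taskY") } }
val taskZ by tasks.registering { doLast { println("taskZ") } }
taskX { dependsOn(taskY) }
taskY { dependsOn(taskZ) }
taskZ { shouldRunAfter(taskX) }
```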
Finalizer tasks
Finalizer tasks are automatically added to the task graph when the finalized task is scheduled to
run.
To specify a finalizer task, you use the Task.finalizedBy(Object...) method. This method
accepts a task instance, a task name, or any other input accepted by
Task.dependsOn(Object...):
build.gradle.kts
taskX { finalizedBy(taskY) }
build.gradle
$ gradle -q taskX
taskX
taskY
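The full wiring behind these two tasks can be sketched as:

```kotlin
// build.gradle.kts -- taskY runs whenever taskX runs, even if taskX fails
val taskX by tasks.registering { doLast { println("taskX") } }
val taskY by tasks.registering { doLast { println("taskY") } }
taskX { finalizedBy(taskY) }
```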
Finalizer tasks are executed even if the finalized task fails or if the finalized task is considered
UP-TO-DATE:
build.gradle.kts
taskX { finalizedBy(taskY) }
build.gradle
$ gradle -q taskX
taskX
taskY
* Where:
Build file '/home/user/gradle/samples/build.gradle' line: 4
BUILD FAILED in 0s
Finalizer tasks are useful when the build creates a resource that must be cleaned up, regardless of
whether the build fails or succeeds. An example of such a resource is a web container that is started
before an integration test task and must be shut down, even if some tests fail.
Skipping tasks
1. Using a predicate
You can use Task.onlyIf to attach a predicate to a task. The task’s actions will only be executed if the
predicate is evaluated to be true.
The predicate is passed to the task as a parameter and returns true if the task will execute and
false if the task will be skipped. The predicate is evaluated just before the task is executed.
Passing an optional reason string to onlyIf() is useful for explaining why the task is skipped:
build.gradle.kts
hello {
val skipProvider = providers.gradleProperty("skipHello")
onlyIf("there is no property skipHello") {
!skipProvider.isPresent()
}
}
build.gradle
hello {
def skipProvider = [Link]("skipHello")
onlyIf("there is no property skipHello") {
!skipProvider.present
}
}
BUILD SUCCESSFUL in 0s
To find why a task was skipped, run the build with the --info logging level.
2. Using StopExecutionException
If the logic for skipping a task can’t be expressed with a predicate, you can use the
StopExecutionException.
If this exception is thrown by an action, the task action as well as the execution of any following
action is skipped. The build continues by executing the next task:
build.gradle.kts
val compile by tasks.registering {
doFirst {
// Here you would put arbitrary conditions in real life.
if (true) {
throw StopExecutionException()
}
}
}
tasks.register("myTask") {
dependsOn(compile)
doLast {
println("I am not affected")
}
}
build.gradle
tasks.register('compile') {
doFirst {
// Here you would put arbitrary conditions in real life.
if (true) {
throw new StopExecutionException()
}
}
}
tasks.register('myTask') {
dependsOn('compile')
doLast {
println 'I am not affected'
}
}
$ gradle -q myTask
I am not affected
This feature is helpful if you work with tasks provided by Gradle. It allows you to add conditional
execution of the built-in actions of such a task.[1]
3. Enabling and Disabling tasks
Every task has an enabled flag, which defaults to true. Setting it to false prevents executing the
task’s actions.
build.gradle.kts
disableMe {
enabled = false
}
build.gradle
disableMe {
enabled = false
}
$ gradle disableMe
> Task :disableMe SKIPPED
BUILD SUCCESSFUL in 0s
4. Task timeouts
Every task has a timeout property, which can be used to limit its execution time. When a task
reaches its timeout, its task execution thread is interrupted. The task will be marked as FAILED.
Finalizer tasks are executed. If --continue is used, other tasks continue running.
Tasks that don’t respond to interrupts can’t be timed out. All of Gradle’s built-in tasks respond to
timeouts.
build.gradle.kts
tasks.register("hangingTask") {
doLast {
Thread.sleep(100000)
}
timeout = Duration.ofMillis(500)
}
build.gradle
tasks.register("hangingTask") {
doLast {
Thread.sleep(100000)
}
timeout = Duration.ofMillis(500)
}
Task rules
Sometimes you want to have a task whose behavior depends on a large or infinite number of
parameter values. A very nice and expressive way to provide such tasks is task rules:
build.gradle.kts
tasks.addRule("Pattern: ping<ID>") {
val taskName = this
if (startsWith("ping")) {
task(taskName) {
doLast {
println("Pinging: " + (taskName.replace("ping", "")))
}
}
}
}
build.gradle
$ gradle -q pingServer1
Pinging: Server1
The String parameter is used as a description for the rule, which is shown with ./gradlew tasks.
Rules are not only used when calling tasks from the command line. You can also create dependsOn
relations on rule based tasks:
build.gradle.kts
tasks.addRule("Pattern: ping<ID>") {
val taskName = this
if (startsWith("ping")) {
task(taskName) {
doLast {
println("Pinging: " + (taskName.replace("ping", "")))
}
}
}
}
tasks.register("groupPing") {
dependsOn("pingServer1", "pingServer2")
}
build.gradle
tasks.addRule("Pattern: ping<ID>") { String taskName ->
if (taskName.startsWith("ping")) {
task(taskName) {
doLast {
println "Pinging: " + (taskName - 'ping')
}
}
}
}
tasks.register('groupPing') {
dependsOn 'pingServer1', 'pingServer2'
}
$ gradle -q groupPing
Pinging: Server1
Pinging: Server2
If you run ./gradlew -q tasks, you won’t find a task named pingServer1 or pingServer2, but this
script is executing logic based on the request to run those tasks.
You can exclude a task from execution using the -x or --exclude-task command-line option and
provide the task’s name to exclude.
For instance, you can run the check task but exclude the test task from running. This approach can
lead to unexpected outcomes, particularly if you exclude an actionable task that produces results
needed by other tasks. Instead of relying on the -x parameter, defining a suitable lifecycle task for
the desired action is recommended.
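For example, instead of running ./gradlew check -x test, a dedicated lifecycle task can collect only the work you want. The task name staticChecks and the wired task checkstyleMain are illustrative; checkstyleMain only exists if the checkstyle plugin is applied:

```kotlin
// build.gradle.kts -- a sketch of a lifecycle task that replaces `check -x test`
tasks.register("staticChecks") {
    group = "verification"
    description = "Runs static analysis without executing tests."
    dependsOn("checkstyleMain") // wire in whatever actionable tasks your build provides
}
```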
Organizing Tasks
There are two types of tasks, actionable and lifecycle tasks.
Actionable tasks in Gradle are tasks that perform actual work, such as compiling code. Lifecycle
tasks are tasks that do not do work themselves. These tasks have no actions, instead, they bundle
actionable tasks and serve as targets for the build.
A well-organized setup of lifecycle tasks enhances the accessibility of your build for new users and
simplifies integration with CI.
Lifecycle tasks
Lifecycle tasks can be particularly beneficial for separating work between users or machines (CI vs
local). For example, a developer on a local machine might not want to run an entire build on every
single change.
Let’s take a standard app as an example which applies the base plugin.
NOTE: The Gradle base plugin defines several lifecycle tasks, including build, assemble, and
check.
We group the build, check, and run tasks by adding the following lines to the app build script:
app/build.gradle.kts
tasks.build {
group = myBuildGroup
}
tasks.check {
group = myBuildGroup
description = "Runs checks (including tests)."
}
tasks.named("run") {
group = myBuildGroup
}
app/build.gradle
tasks.build {
group = myBuildGroup
}
tasks.check {
group = myBuildGroup
description = "Runs checks (including tests)."
}
tasks.named('run') {
group = myBuildGroup
}
If we now look at the :app:tasks list, we can see the three tasks are available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
This is already useful if the standard lifecycle tasks are sufficient. Moving the groups around helps
clarify the tasks you expect to be used in your build.
In many cases, there are more specific requirements that you want to address. One common
scenario is running quality checks without running tests. Currently, the :check task runs tests and
the code quality checks. Instead, we want to run code quality checks all the time, but not the
lengthy tests.
To add a quality check lifecycle task, we introduce an additional lifecycle task called qualityCheck
and a plugin called spotbugs.
To add a lifecycle task, use tasks.register(). The only thing you need to provide is a name. Put this
task in our group and wire the actionable tasks that belong to this new lifecycle task using the
dependsOn() method:
app/build.gradle.kts
plugins {
id("com.github.spotbugs") version "6.0.7" // spotbugs plugin
}
app/build.gradle
plugins {
id 'com.github.spotbugs' version '6.0.7' // spotbugs plugin
}
Note that you don’t need to list all the tasks that Gradle will execute. Just specify the targets you
want to collect here. Gradle will determine which other tasks it needs to call to reach these goals.
In the example, we add the classes task, a lifecycle task to compile all our production code, and the
spotbugsMain task, which checks our production code.
We also add a description that will show up in the task list that helps distinguish the two check
tasks better.
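Putting those pieces together, the registration might be sketched as follows, assuming myBuildGroup is a group name defined earlier in the script:

```kotlin
// build.gradle.kts -- a sketch; classes and spotbugsMain are the tasks named in the text above
tasks.register("qualityCheck") {
    group = myBuildGroup // assumed to be defined earlier in the build script
    description = "Runs checks (excluding tests)."
    dependsOn(tasks.classes, tasks.named("spotbugsMain"))
}
```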
Now, if we run ./gradlew :app:tasks, we can see that our new qualityCheck lifecycle task is available:
$ ./gradlew :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
My app build tasks
------------------
build - Assembles and tests this project.
check - Runs checks (including tests).
qualityCheck - Runs checks (excluding tests).
run - Runs this project as a JVM application
tasksAll - Show additional tasks.
If we run it, we can see that it runs spotbugs but not the tests:
$ ./gradlew :app:qualityCheck
BUILD SUCCESSFUL in 1s
16 actionable tasks: 5 executed, 11 up-to-date
So far, we have looked at tasks in individual subprojects, which is useful for local development
when you work on code in one subproject.
With this setup, developers only need to know that they can call Gradle with
:subproject-name:tasks to see which tasks are available and useful for them.
Global lifecycle tasks
Another place to invoke lifecycle tasks is within the root build; this is especially useful for
Continuous Integration (CI).
Gradle tasks play a crucial role in CI or CD systems, where activities like compiling all code, running
tests, or building and packaging the complete application are typical. To facilitate this, you can
include lifecycle tasks that span multiple subprojects.
NOTE: Gradle has been around for a long time, and you will frequently observe build files in the
root directory serving various purposes. In older Gradle versions, many tasks were defined
within the root Gradle build file, resulting in various issues. Therefore, exercise caution when
determining the content of this file.
One of the few elements that should be placed in the root build file is global lifecycle tasks.
Let’s continue using the Gradle init Java application multi-project as an example.
This time, we’re incorporating a build script in the root project. We’ll establish two groups for our
global lifecycle tasks: one for tasks relevant to local development, such as running all checks, and
another exclusively for our CI system.
Once again, we narrowed down the tasks listed to our specific groups:
build.gradle.kts
tasks.named<TaskReportTask>("tasks") {
displayGroups = listOf<String>(globalBuildGroup, ciBuildGroup)
}
build.gradle
tasks.named("tasks", TaskReportTask) {
displayGroups = [globalBuildGroup, ciBuildGroup]
}
------------------------------------------------------------
Tasks runnable from root project 'gradle-project'
------------------------------------------------------------
No tasks
Let’s add a qualityCheckApp task to execute all code quality checks in the app subproject. Similarly,
for CI purposes, we implement a checkAll task that runs all tests:
build.gradle.kts
tasks.register("qualityCheckApp") {
group = globalBuildGroup
description = "Runs checks on app (globally)"
dependsOn(":app:qualityCheck" )
}
tasks.register("checkAll") {
group = ciBuildGroup
description = "Runs checks for all projects (CI)"
dependsOn(subprojects.map { ":${it.name}:check" })
dependsOn(gradle.includedBuilds.map { it.task(":checkAll") })
}
build.gradle
tasks.register("qualityCheckApp") {
group = globalBuildGroup
description = "Runs checks on app (globally)"
dependsOn(":app:qualityCheck")
}
tasks.register("checkAll") {
group = ciBuildGroup
description = "Runs checks for all projects (CI)"
dependsOn subprojects.collect { ":${it.name}:check" }
dependsOn gradle.includedBuilds.collect { it.task(":checkAll") }
}
So we can now ask Gradle to show us the tasks for the root project and, by default, it will only show
us the qualityCheckApp task (and optionally the checkAll task, depending on the value of
displayGroups).
$ ./gradlew :tasks
------------------------------------------------------------
Tasks runnable from root project 'gradle-project'
------------------------------------------------------------
My CI build tasks
-----------------
checkAll - Runs checks for all projects (CI)
If we run the :checkAll task, we see that it compiles all the code and runs the code quality checks
(including spotbugs):
$ ./gradlew :checkAll
BUILD SUCCESSFUL in 1s
21 actionable tasks: 12 executed, 9 up-to-date
Gradle provides lazy properties, which delay calculating a property’s value until it’s actually
required.
1. Deferred Value Resolution: Allows wiring Gradle models without needing to know when a
property’s value will be known. For example, you may want to set the input source files of a
task based on the source directories property of an extension, but the extension property value
isn’t known until the build script or some other plugin configures them.
2. Automatic Task Dependency Management: Connects output of one task to input of another,
automatically determining task dependencies. Property instances carry information about
which task, if any, produces their value. Build authors do not need to worry about keeping task
dependencies in sync with configuration changes.
Provider
Represents a value that can only be queried and cannot be changed.
• Many other types extend Provider and can be used wherever a Provider is required.
Property
Represents a value that can be queried and changed.
• The method Property.set(T) specifies a value for the property, overwriting whatever value
may have been present.
• The method Property.set(Provider) specifies a Provider for the value for the property,
overwriting whatever value may have been present. This allows you to wire together
Provider and Property instances before the values are configured.
Lazy properties are intended to be passed around and only queried when required. This typically
happens during the execution phase.
The following demonstrates a task with a configurable greeting property and a read-only message
property:
build.gradle.kts
@Internal
val message: Provider<String> = greeting.map { it + " from Gradle" } ③
@TaskAction
fun printMessage() {
logger.quiet(message.get())
}
}
tasks.register<Greeting>("greeting") {
greeting.set("Hi") ④
greeting = "Hi" ⑤
}
build.gradle
@Internal
final Provider<String> message = greeting.map { it + ' from Gradle' } ③
@TaskAction
void printMessage() {
logger.quiet(message.get())
}
}
tasks.register("greeting", Greeting) {
greeting.set('Hi') ④
greeting = 'Hi' ⑤
}
② A configurable greeting
$ gradle greeting
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
The Greeting task has a property of type Property<String> to represent the configurable greeting
and a property of type Provider<String> to represent the calculated, read-only, message. The
message Provider is created from the greeting Property using the map() method; its value is kept up-
to-date as the value of the greeting property changes.
Neither Provider nor its subtypes, such as Property, are intended to be implemented by a build
script or plugin. Gradle provides factory methods to create instances of these types instead.
See the Quick Reference for all of the types and factories available.
NOTE: When writing a plugin or build script with Groovy, you can use the map(Transformer)
method with a closure, and Groovy will convert the closure to a Transformer. Similarly, when
writing a plugin or build script with Kotlin, the Kotlin compiler will convert a Kotlin function
into a Transformer.
An important feature of lazy properties is that they can be connected together so that changes to
one property are automatically reflected in other properties.
Here is an example where the property of a task is connected to a property of a project extension:
build.gradle.kts
// A project extension
interface MessageExtension {
// A configurable greeting
abstract val greeting: Property<String>
}
@TaskAction
fun printMessage() {
logger.quiet(greeting.get())
}
}
messages {
// Configure the greeting on the extension
// Note that there is no need to reconfigure the task's `greeting` property.
// This is automatically updated as the extension property changes
greeting = "Hi"
}
build.gradle
// A project extension
interface MessageExtension {
// A configurable greeting
Property<String> getGreeting()
}
@TaskAction
void printMessage() {
logger.quiet(greeting.get())
}
}
messages {
// Configure the greeting on the extension
// Note that there is no need to reconfigure the task's `greeting` property.
// This is automatically updated as the extension property changes
greeting = 'Hi'
}
$ gradle greeting
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
This example calls the Property.set(Provider) method to attach a Provider to a Property to supply the
value of the property. In this case, the Provider happens to be a Property as well, but you can
connect any Provider implementation, for example one created using Project.provider().
In Working with Files, we introduced four collection types for File-like objects:
• FileCollection (read-only) and ConfigurableFileCollection (mutable)
• FileTree (read-only) and ConfigurableFileTree (mutable)
All of these types are also considered lazy types.
There are more strongly typed models used to represent elements of the file system: Directory and
RegularFile. These types shouldn’t be confused with the standard Java File type as they are used to
tell Gradle that you expect more specific values such as a directory or a non-directory, regular file.
Gradle provides two specialized Property subtypes for dealing with values of these types:
RegularFileProperty and DirectoryProperty. ObjectFactory has methods to create these:
ObjectFactory.fileProperty() and ObjectFactory.directoryProperty().
A DirectoryProperty can also be used to create a lazily evaluated Provider for a Directory and
RegularFile via DirectoryProperty.dir(String) and DirectoryProperty.file(String) respectively. These
methods create providers whose values are calculated relative to the location for the
DirectoryProperty they were created from. The values returned from these providers will reflect
changes to the DirectoryProperty.
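As a sketch, inside a build script where objects and layout are in scope (the gen directory and report.txt names are illustrative):

```kotlin
// build.gradle.kts -- providers derived from a DirectoryProperty follow its location
val generated: DirectoryProperty = objects.directoryProperty()
val report: Provider<RegularFile> = generated.file("report.txt")

generated.set(layout.buildDirectory.dir("gen"))
// report now resolves to <build directory>/gen/report.txt
```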
build.gradle.kts
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource : DefaultTask() {
// The configuration file to use to generate the source file
@get:InputFile
abstract val configFile: RegularFileProperty
// The directory to write source files to
@get:OutputDirectory
abstract val outputDir: DirectoryProperty
@TaskAction
fun compile() {
val inFile = configFile.get().asFile
[Link]("configuration file = $inFile")
val dir = outputDir.get().asFile
[Link]("output dir = $dir")
val className = inFile.readText().trim()
val srcFile = File(dir, "${className}.java")
srcFile.writeText("public class ${className} { ... }")
}
}
build.gradle
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource extends DefaultTask {
// The configuration file to use to generate the source file
@InputFile
abstract RegularFileProperty getConfigFile()
// The directory to write source files to
@OutputDirectory
abstract DirectoryProperty getOutputDir()
@TaskAction
def compile() {
def inFile = configFile.get().asFile
[Link]("configuration file = $inFile")
def dir = outputDir.get().asFile
[Link]("output dir = $dir")
def className = inFile.text.trim()
def srcFile = new File(dir, "${className}.java")
srcFile.text = "public class ${className} { ... }"
}
}
$ gradle generate
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
This example creates providers that represent locations in the project and build directories through
Project.getLayout() with ProjectLayout.getBuildDirectory() and ProjectLayout.getProjectDirectory().
To close the loop, note that a DirectoryProperty, or a simple Directory, can be turned into a FileTree
that allows the files and directories contained in the directory to be queried with
DirectoryProperty.getAsFileTree() or Directory.getAsFileTree(). From a DirectoryProperty or a
Directory, you can create FileCollection instances containing a set of the files contained in the
directory with DirectoryProperty.files(Object...) or Directory.files(Object...).
Many builds have several tasks connected together, where one task consumes the outputs of
another task as an input.
To make this work, we need to configure each task to know where to look for its inputs and where
to place its outputs. Ensure that the producing and consuming tasks are configured with the same
location and attach task dependencies between the tasks. This can be cumbersome and brittle if any
of these values are configurable by a user or configured by multiple plugins, as task properties need
to be configured in the correct order and locations, and task dependencies kept in sync as values
change.
The Property API makes this easier by keeping track of the value of a property and the task that
produces the value.
As an example, consider the following plugin with a producer and consumer task which are wired
together:
build.gradle.kts
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = outputFile.get().asFile
output.writeText(message)
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
val input = inputFile.get().asFile
val message = input.readText()
logger.quiet("Read '${message}' from ${input}")
}
}
consumer {
// Connect the producer task output to the consumer task input
// Don't need to add a task dependency to the consumer task. This is
// automatically added
inputFile = producer.flatMap { it.outputFile }
}
producer {
// Set values for the producer lazily
// Don't need to update the consumer.inputFile property. This is
// automatically updated as producer.outputFile changes
outputFile = layout.buildDirectory.file("file.txt")
}
build.gradle
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = outputFile.get().asFile
output.text = message
logger.quiet("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
def input = inputFile.get().asFile
def message = input.text
logger.quiet("Read '${message}' from ${input}")
}
}
consumer {
// Connect the producer task output to the consumer task input
// Don't need to add a task dependency to the consumer task. This is
// automatically added
inputFile = producer.flatMap { it.outputFile }
}
[Link] {
// Set values for the producer lazily
// Don't need to update the [Link] property. This is
automatically updated as [Link] changes
outputFile = [Link]('[Link]')
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
In the example above, the task outputs and inputs are connected before any location is defined. The
setters can be called at any time before the task is executed, and the change will automatically
affect all related input and output properties.
Another important thing to note in this example is the absence of any explicit task dependency.
Task outputs represented using Providers keep track of which task produces their value, and using
them as task inputs will implicitly add the correct task dependencies.
Implicit task dependencies also work for input properties that are not files:
[Link]
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = [Link]().asFile
[Link]( message)
[Link]("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
[Link]([Link]())
}
}
[Link]
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = [Link]().asFile
[Link] = message
[Link]("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
[Link]([Link]())
}
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Gradle provides two lazy property types, ListProperty and SetProperty, to help configure collection properties.
These work exactly like any other Provider and, just like file providers, they have additional
modeling around them:
This type of property allows you to overwrite the entire collection value with
[Link](Iterable) and [Link](Provider) or add new elements through
the various add methods:
• [Link](T): Add a single element to the collection
Just like every Provider, the collection is calculated when [Link]() is called. The following
example shows the ListProperty in action:
[Link]
@TaskAction
fun produce() {
val message = "Hello, World!"
val output = [Link]().asFile
[Link]( message)
[Link]("Wrote '${message}' to ${output}")
}
}
@TaskAction
fun consume() {
[Link]().forEach { inputFile ->
val input = [Link]
val message = [Link]()
[Link]("Read '${message}' from ${input}")
}
}
}
[Link]
@TaskAction
void produce() {
String message = 'Hello, World!'
def output = [Link]().asFile
[Link] = message
[Link]("Wrote '${message}' to ${output}")
}
}
@TaskAction
void consume() {
[Link]().each { inputFile ->
def input = [Link]
def message = [Link]
[Link]("Read '${message}' from ${input}")
}
}
}
$ gradle consumer
BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
Gradle provides a lazy MapProperty type to allow Map values to be configured. You can create a
MapProperty instance using [Link](Class, Class).
Similar to other property types, a MapProperty has a set() method that you can use to specify the
value for the property. Some additional methods allow entries with lazy values to be added to the
map.
[Link]
@TaskAction
fun generate() {
[Link]().forEach { entry ->
[Link]("${[Link]} = ${[Link]}")
}
}
}
[Link]<Generator>("generate") {
[Link]("a", 1)
// Values have not been configured yet
[Link]("b", [Link] { b })
[Link]([Link] { mapOf("c" to c, "d" to c + 1) })
}
[Link]
@TaskAction
void generate() {
[Link]().each { key, value ->
[Link]("${key} = ${value}")
}
}
}
[Link]('generate', Generator) {
[Link]("a", 1)
// Values have not been configured yet
[Link]("b", [Link] { b })
[Link]([Link] { [c: c, d: c + 1] })
}
$ gradle generate
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
Often, you want to apply a convention, or default value, to a property, to be used if no value has
been configured. You can use the convention() method for this. It accepts either a value or a
Provider, which will be used as the value until some other value is configured.
[Link]
[Link]("show") {
val property = [Link](String::class)
// Set a convention
[Link]("convention 1")
[Link]("explicit value")
// Once a value is set, the convention is ignored
[Link]("ignored convention")
doLast {
println("value = " + [Link]())
}
}
[Link]
[Link]("show") {
def property = [Link](String)
// Set a convention
[Link]("convention 1")
[Link]("explicit value")
doLast {
println("value = " + [Link]())
}
}
$ gradle show
value = convention 1
value = convention 2
BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
There are several appropriate locations for setting a convention on a property at configuration time
(i.e., before execution).
[Link]
apply<GreetingPlugin>()
[Link]<GreetingTask>().configureEach {
// setting convention from build script
[Link]("Guest")
}
init {
[Link]("person2")
}
@TaskAction
fun greet() {
println("hello, ${[Link]()}, from ${[Link]()}")
}
}
[Link]
[Link](GreetingTask).configureEach {
// setting convention from build script
[Link]("Guest")
}
GreetingTask() {
[Link]("person2")
}
@TaskAction
void greet() {
println("hello, ${[Link]()}, from ${[Link]()}")
}
}
Plugin authors may configure a convention on a lazy property from a plugin’s apply() method,
while performing preliminary configuration of the task or extension defining the property. This
works well for regular plugins (meant to be distributed and used in the wild), and internal
convention plugins (which often configure properties defined by third party plugins in a uniform
way for the entire build).
Build engineers may configure a convention on a lazy property from shared build logic that is
configuring tasks (for instance, from third-party plugins) in a standard way for the entire build.
[Link]
apply<GreetingPlugin>()
[Link]<GreetingTask>().configureEach {
// setting convention from build script
[Link]("Guest")
}
[Link]
[Link](GreetingTask).configureEach {
// setting convention from build script
[Link]("Guest")
}
Note that for project-specific values, instead of conventions, you should prefer setting explicit
values (using [Link](…) or [Link](…), for instance), as
conventions are only meant to define defaults.
From the task initialization
A task author may configure a convention on a lazy property from the task constructor or (if in
Kotlin) initializer block. This approach works for properties with trivial defaults, but it is not
appropriate if additional context (external to the task implementation) is required in order to set a
suitable default.
[Link]
init {
[Link]("person2")
}
[Link]
GreetingTask() {
[Link]("person2")
}
You may configure a convention on a lazy property next to the place where the property is
declared. Note that this option is not available for managed properties, and it has the same caveats
as configuring a convention from the task constructor.
[Link]
Most properties of a task or project are intended to be configured by plugins or build scripts so that
they can use specific values for that build.
For example, a property that specifies the output directory for a compilation task may start with a
value specified by a plugin. A build script might then change the value to some custom location,
and the task uses this value when it runs. However, once the task starts to run, we want to
prevent further property changes. This way we avoid errors that result from different consumers,
such as the task action, Gradle’s up-to-date checks, build caching, or other tasks, using different
values for the property.
Lazy properties provide several methods that you can use to disallow changes to their value once
the value has been configured. The finalizeValue() method calculates the final value for the
property and prevents further changes to the property.
[Link]()
When the property’s value comes from a Provider, the provider is queried for its current value, and
the result becomes the final value for the property. This final value replaces the provider and the
property no longer tracks the value of the provider. Calling this method also makes a property
instance unmodifiable and any further attempts to change the value of the property will fail. Gradle
automatically makes the properties of a task final when the task starts execution.
The finalizeValueOnRead() method is similar, except that the property’s final value is not calculated
until the value of the property is queried.
[Link]()
In other words, this method calculates the final value lazily as required, whereas finalizeValue()
calculates the final value eagerly. Use it when the value may be expensive to calculate or may not
have been configured yet, and you want to ensure that all consumers of the property see the same
value when they query it.
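The difference between the two methods can be sketched in plain Java. This is a toy model under stated assumptions, not Gradle's Property implementation; the class name ToyProperty is hypothetical, and only the eager-vs-lazy snapshot semantics are being illustrated.

```java
import java.util.function.Supplier;

// Toy model -- NOT Gradle's Property implementation. Illustrates eager vs
// lazy finalization: finalizeValue() snapshots immediately, while
// finalizeValueOnRead() defers the snapshot until the first get().
class ToyProperty<T> {
    private Supplier<T> supplier;
    private boolean finalized = false;
    private boolean finalizeOnRead = false;
    private T value;

    ToyProperty(Supplier<T> initial) { this.supplier = initial; }

    void set(Supplier<T> newSupplier) {
        if (finalized) throw new IllegalStateException("property is final");
        supplier = newSupplier;
    }

    void finalizeValue() {          // eager: query the supplier now
        value = supplier.get();
        finalized = true;
    }

    void finalizeValueOnRead() {    // lazy: snapshot deferred to first read
        finalizeOnRead = true;
    }

    T get() {
        if (!finalized && finalizeOnRead) finalizeValue();
        return finalized ? value : supplier.get();
    }
}
```

With finalizeValueOnRead(), the value can still be reconfigured until the first get(); after that first read, every consumer sees the same snapshot and further changes fail.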
◦ For configurable properties, expose the Property directly through a single getter.
◦ If it’s a stable property, add a new Property or Provider and deprecate the old one. You
should wire the old getter/setters into the new property as appropriate.
Provider<RegularFile>
File on disk
Factories
• [Link](Transformer).
• [Link](Transformer).
• [Link](String)
Provider<Directory>
Directory on disk
Factories
• [Link](Transformer).
• [Link](Transformer).
• [Link](String)
FileCollection
Unstructured collection of files
Factories
• [Link](Object[])
• [Link](Object...)
• [Link](Object...)
FileTree
Hierarchy of files
Factories
• [Link](Object) will produce a ConfigurableFileTree, or you can use
[Link](Object) and [Link](Object)
• [Link]()
RegularFileProperty
File on disk
Factories
• [Link]()
DirectoryProperty
Directory on disk
Factories
• [Link]()
ConfigurableFileCollection
Unstructured collection of files
Factories
• [Link]()
ConfigurableFileTree
Hierarchy of files
Factories
• [Link]()
SourceDirectorySet
Hierarchy of source directories
Factories
• [Link](String, String)
ListProperty<T>
a property whose value is List<T>
Factories
• [Link](Class)
SetProperty<T>
a property whose value is Set<T>
Factories
• [Link](Class)
Provider<T>
a value of type T
Factories
• [Link](Transformer).
• [Link](Transformer).
Property<T>
a property whose value is an instance of T
Factories
• [Link](Class)
This allows Gradle to fully utilize the resources available and complete builds faster.
The Worker API
The Worker API provides the ability to break up the execution of a task action into discrete units of
work and then execute that work concurrently and asynchronously.
The best way to understand how to use the API is to go through the process of converting an
existing custom task to use the Worker API:
1. You’ll start by creating a custom task class that generates MD5 hashes for a configurable set of
files.
2. Then, you’ll convert this custom task to use the Worker API.
3. Finally, you’ll explore running the task with different levels of isolation.
In the process, you’ll learn about the basics of the Worker API and the capabilities it provides.
First, create a custom task that generates MD5 hashes of a configurable set of files.
buildSrc/[Link]
repositories {
mavenCentral()
}
dependencies {
implementation("commons-io:commons-io:2.5")
implementation("commons-codec:commons-codec:1.9") ①
}
buildSrc/[Link]
repositories {
mavenCentral()
}
dependencies {
implementation 'commons-io:commons-io:2.5'
implementation 'commons-codec:commons-codec:1.9' ①
}
① Your custom task class will use Apache Commons Codec to generate MD5 hashes.
Next, create a custom task class in your buildSrc/src/main/java directory. You should name this
class CreateMD5:
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory(); ②
@TaskAction
public void createHashes() {
    for (File sourceFile : getSource().getFiles()) { ③
        try {
            InputStream stream = new FileInputStream(sourceFile);
            [Link]("Generating MD5 for " + [Link]() + "...");
            // Artificially make this task slower.
            [Link](3000); ④
            Provider<RegularFile> md5File = getDestinationDirectory().file([Link]() + ".md5"); ⑤
            [Link]([Link]().getAsFile(), DigestUtils.md5Hex(stream), (String) null);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
}
① SourceTask is a convenience type for tasks that operate on a set of source files.
③ The task iterates over all the files defined as "source files" and creates an MD5 hash of each.
④ Insert an artificial sleep to simulate hashing a large file (the sample files won’t be that large).
⑤ The MD5 hash of each file is written to the output directory into a file of the same name with an
"md5" extension.
[Link]
plugins { id("base") } ①
[Link]<CreateMD5>("md5") {
destinationDirectory = [Link]("md5") ②
source([Link]("src")) ③
}
[Link]
plugins { id 'base' } ①
[Link]("md5", CreateMD5) {
destinationDirectory = [Link]("md5") ②
source([Link]('src')) ③
}
① Apply the base plugin so that you’ll have a clean task to use to remove the output.
③ This task will generate MD5 hash files for every file in the src directory.
You will need some source to generate MD5 hashes from. Create three files in the src directory:
src/[Link]
src/[Link]
I was born not knowing and have had only a little time to change that here and there.
src/[Link]
$ gradle md5
BUILD SUCCESSFUL in 9s
3 actionable tasks: 3 executed
In the build/md5 directory, you should now see corresponding files with an md5 extension containing
MD5 hashes of the files from the src directory. Notice that the task takes at least 9 seconds to run
because it hashes each file one at a time (i.e., three files at ~3 seconds apiece).
Although this task processes each file in sequence, the processing of each file is independent of any
other file. This work can be done in parallel and take advantage of multiple processors. This is
where the Worker API can help.
To use the Worker API, you need to define an interface that represents the parameters of each unit
of work and extends [Link].
For the generation of MD5 hash files, the unit of work requires two parameters: the source file to
hash and the file to write the hash to. There is no need to create a concrete implementation
because Gradle will generate one for us at runtime.
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
① Use Property objects to represent the source and MD5 hash files.
Then, you need to refactor the part of your custom task that does the work for each individual file
into a separate class. This class is your "unit of work" implementation, and it should be an abstract
class that extends [Link]:
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
① Do not implement the getParameters() method - Gradle will inject this at runtime.
Now, change your custom task class to submit work to the WorkerExecutor instead of doing the
work itself.
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
import [Link];
import [Link].*;
import [Link].*;
import [Link];
import [Link];
import [Link];
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor(); ①
@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().noIsolation(); ②
① The WorkerExecutor service is required in order to submit your work. Create an abstract getter
method annotated [Link], and Gradle will inject the service at runtime when the
task is created.
② Before submitting work, get a WorkQueue object with the desired isolation mode (described
below).
③ When submitting the unit of work, specify the unit of work implementation, in this case
GenerateMD5, and configure its parameters.
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
The results should look the same as before, although the MD5 hash files may be generated in a
different order since the units of work are executed in parallel. This time, however, the task runs
much faster. This is because the Worker API executes the MD5 calculation for each file in parallel
rather than in sequence.
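The speed-up comes from the same principle as any embarrassingly parallel workload. As a rough plain-JDK illustration (using an ExecutorService rather than the Worker API, which additionally provides isolation and task-lifecycle integration), independent hash computations can be fanned out to a thread pool and collected in any order. The class name ParallelHash is our own:

```java
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Plain-JDK sketch of the idea behind the Worker API example: each file's
// hash is an independent unit of work, so all of them can run concurrently.
public class ParallelHash {
    static String md5Hex(byte[] bytes) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5").digest(bytes);
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static Map<String, String> hashAll(Map<String, byte[]> inputs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
        try {
            // Submit one unit of work per file...
            Map<String, Future<String>> futures = new HashMap<>();
            for (Map.Entry<String, byte[]> e : inputs.entrySet()) {
                futures.put(e.getKey(), pool.submit(() -> md5Hex(e.getValue())));
            }
            // ...then collect the results; completion order does not matter.
            Map<String, String> results = new HashMap<>();
            for (Map.Entry<String, Future<String>> e : futures.entrySet()) {
                results.put(e.getKey(), e.getValue().get());
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```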
The isolation mode controls how strongly Gradle will isolate items of work from each other and the
rest of the Gradle runtime.
1. noIsolation()
2. classLoaderIsolation()
3. processIsolation()
The noIsolation() mode is the lowest level of isolation and will prevent a unit of work from
changing the project state. This is the fastest isolation mode because it requires the least overhead
to set up and execute the work item. However, it uses a single shared classloader for all units of
work. This means that units of work can affect one another through static class state. It also
means that every unit of work uses the same version of libraries on the buildscript classpath. If you
wanted the user to be able to configure the task to run with a different (but compatible) version of
the Apache Commons Codec library, you would need to use a different isolation mode.
First, you must change the dependency in buildSrc/[Link] to be compileOnly. This tells Gradle
that it should use this dependency when building the classes, but should not put it on the build
script classpath:
buildSrc/[Link]
repositories {
mavenCentral()
}
dependencies {
implementation("commons-io:commons-io:2.5")
compileOnly("commons-codec:commons-codec:1.9")
}
buildSrc/[Link]
repositories {
mavenCentral()
}
dependencies {
implementation 'commons-io:commons-io:2.5'
compileOnly 'commons-codec:commons-codec:1.9'
}
Next, change the CreateMD5 task to allow the user to configure the version of the codec library that
they want to use. It will resolve the appropriate version of the library at runtime and configure the
workers to use this version.
The classLoaderIsolation() method tells Gradle to run this work in a thread with an isolated
classloader:
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link].*;
import [Link];
import [Link].*;
import [Link];
import [Link];
import [Link];
@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor();
@TaskAction
public void createHashes() {
WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(workerSpec -> {
[Link]().from(getCodecClasspath()); ②
});
② Configure the classpath on the ClassLoaderWorkerSpec when creating the work queue.
Next, you need to configure your build so that it has a repository to look up the codec version at
task execution time. We also create a dependency to resolve our codec library from this repository:
[Link]
plugins { id("base") }
repositories {
mavenCentral() ①
}
dependencies {
codec("commons-codec:commons-codec:1.10") ③
}
[Link]<CreateMD5>("md5") {
[Link](codec) ④
destinationDirectory = [Link]("md5")
source([Link]("src"))
}
[Link]
plugins { id 'base' }
repositories {
mavenCentral() ①
}
[Link]('codec') { ②
    attributes {
        attribute(Usage.USAGE_ATTRIBUTE, [Link](Usage, Usage.JAVA_RUNTIME))
    }
    visible = false
    canBeConsumed = false
}
dependencies {
codec 'commons-codec:commons-codec:1.10' ③
}
[Link]('md5', CreateMD5) {
[Link]([Link]) ④
destinationDirectory = [Link]('md5')
source([Link]('src'))
}
① Add a repository to resolve the codec library - this can be a different repository than the one
used to build the CreateMD5 task class.
④ Configure the md5 task to use the configuration as its classpath. Note that the configuration will
not be resolved until the task is executed.
Now, if you run your task, it should work as expected using the configured version of the codec
library:
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
Sometimes, it is desirable to utilize even greater levels of isolation when executing items of work.
For instance, external libraries may rely on certain system properties to be set, which may conflict
between work items. Or a library might not be compatible with the version of JDK that Gradle is
running with and may need to be run with a different version.
The Worker API can accommodate this using the processIsolation() method, which causes the work
to execute in a separate "worker daemon". These worker processes are session-scoped: they can be
reused for future work items for as long as the Gradle daemon that started them stays alive, which
may span multiple builds. However, if system resources get low, Gradle will stop unused worker
daemons.
To utilize a worker daemon, use the processIsolation() method when creating the WorkQueue. You
may also want to configure custom settings for the new process:
buildSrc/src/main/java/[Link]
import [Link];
import [Link];
import [Link];
import [Link];
import [Link];
import [Link].*;
import [Link];
import [Link].*;
import [Link];
import [Link];
import [Link];
@InputFiles
abstract public ConfigurableFileCollection getCodecClasspath(); ①
@OutputDirectory
abstract public DirectoryProperty getDestinationDirectory();
@Inject
abstract public WorkerExecutor getWorkerExecutor();
@TaskAction
public void createHashes() {
①
WorkQueue workQueue = getWorkerExecutor().processIsolation(workerSpec -> {
[Link]().from(getCodecClasspath());
[Link](options -> {
[Link]("64m"); ②
});
});
BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
Note that the first execution may be slow. This is because Gradle has to start a new process for each
worker daemon, which is expensive.
However, if you run your task a second time, you will see that it runs much faster. This is because
the worker daemon(s) started during the initial build have persisted and are available for use
immediately during subsequent builds:
BUILD SUCCESSFUL in 1s
3 actionable tasks: 3 executed
Isolation modes
Gradle provides three isolation modes that can be configured when creating a WorkQueue and are
specified using one of the following methods on WorkerExecutor:
[Link]()
This states that the work should be run in a thread with minimal isolation.
For instance, it will share the same classloader that the task is loaded from. This is the fastest
level of isolation.
[Link]()
This states that the work should be run in a thread with an isolated classloader.
The classloader will have the classpath from the classloader that the unit of work
implementation class was loaded from as well as any additional classpath entries added through
[Link]().
[Link]()
This states that the work should be run with a maximum isolation level by executing the work in
a separate process.
The classloader of the process will use the classpath from the classloader that the unit of work
was loaded from as well as any additional classpath entries added through
[Link](). Furthermore, the process will be a worker daemon that
will stay alive and can be reused for future work items with the same requirements. This
process can be configured with different settings than the Gradle JVM using
[Link]([Link]).
Worker Daemons
When using processIsolation(), Gradle will start a long-lived worker daemon process that can be
reused for future work items.
When a unit of work for a worker daemon is submitted, Gradle will first look to see if a compatible,
idle daemon already exists. If so, it will send the unit of work to the idle daemon, marking it as
busy. If not, it will start a new daemon. When evaluating compatibility, Gradle looks at a number of
criteria, all of which can be controlled through
[Link]([Link]).
By default, a worker daemon starts with a maximum heap of 512MB. This can be changed by
adjusting the workers' fork options.
executable
A daemon is considered compatible only if it uses the same Java executable.
classpath
A daemon is considered compatible if its classpath contains all the classpath entries requested.
Note that a daemon is considered compatible only if the classpath exactly matches the requested
classpath.
heap settings
A daemon is considered compatible if it has at least the same heap size settings as requested.
In other words, a daemon that has higher heap settings than requested would be considered
compatible.
jvm arguments
A daemon is compatible if it has set all the JVM arguments requested.
Note that a daemon is compatible if it has additional JVM arguments beyond those requested
(except for those treated especially, such as heap settings, assertions, debug, etc.).
system properties
A daemon is considered compatible if it has set all the system properties requested with the
same values.
Note that a daemon is compatible if it has additional system properties beyond those requested.
environment variables
A daemon is considered compatible if it has set all the environment variables requested with the
same values.
Note that a daemon is compatible if it has more environment variables than requested.
bootstrap classpath
A daemon is considered compatible if it contains all the bootstrap classpath entries requested.
Note that a daemon is compatible if it has more bootstrap classpath entries than requested.
debug
A daemon is considered compatible only if debug is set to the same value as requested (true or
false).
enable assertions
A daemon is considered compatible only if enable assertions are set to the same value as
requested (true or false).
Worker daemons will remain running until the build daemon that started them is stopped or
system memory becomes scarce. When system memory is low, Gradle will stop worker daemons to
minimize memory consumption.
NOTE: A step-by-step description of converting a normal task action to use the Worker API can be
found in the section on developing parallel tasks.
To support cancellation (e.g., when the user stops the build with CTRL+C) and task timeouts, custom
tasks should react to interruption of their executing thread. The same is true for work items
submitted via the Worker API. If a task does not respond to an interrupt within 10 seconds, the
daemon will shut down to free up system resources.
Advanced Tasks
Incremental tasks
In Gradle, implementing a task that skips execution when its inputs and outputs are already
UP-TO-DATE is simple and efficient, thanks to the Incremental Build feature.
However, there are times when only a few input files have changed since the last execution, and it
is best to avoid reprocessing all the unchanged inputs. This situation is common in tasks that
transform input files into output files on a one-to-one basis.
To optimize your build process you can use an incremental task. This approach ensures that only
out-of-date input files are processed, improving build performance.
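The core idea can be sketched without Gradle at all: keep a fingerprint of each input from the previous run, then on the next run process only the entries whose fingerprint changed. The following is a simplified toy model of what incremental execution does for you (the class name InputDiff and the string fingerprints are our own illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Toy change detection -- a simplified model of incremental task inputs.
// Gradle tracks this state for you; here we diff two fingerprint maps by hand.
public class InputDiff {
    public enum ChangeType { ADDED, MODIFIED, REMOVED }

    public static Map<String, ChangeType> diff(Map<String, String> previous,
                                               Map<String, String> current) {
        Map<String, ChangeType> changes = new HashMap<>();
        for (Map.Entry<String, String> e : current.entrySet()) {
            if (!previous.containsKey(e.getKey())) {
                changes.put(e.getKey(), ChangeType.ADDED);
            } else if (!previous.get(e.getKey()).equals(e.getValue())) {
                changes.put(e.getKey(), ChangeType.MODIFIED);
            }
        }
        for (String path : previous.keySet()) {
            if (!current.containsKey(path)) changes.put(path, ChangeType.REMOVED);
        }
        return changes;
    }
}
```

Only the returned entries need re-processing; every input absent from the diff is up to date.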
For a task to process inputs incrementally, that task must contain an incremental task action.
This is a task action method that has a single InputChanges parameter. That parameter tells Gradle
that the action only wants to process the changed inputs.
In addition, the task needs to declare at least one incremental file input property by using either
@Incremental or @SkipWhenEmpty:
[Link]
@get:Incremental
@get:InputDirectory
val inputDir: DirectoryProperty = [Link]()
@get:OutputDirectory
val outputDir: DirectoryProperty = [Link]()
@get:Input
val inputProperty: RegularFileProperty = [Link]()
// File input property
@TaskAction
fun execute(inputs: InputChanges) { // InputChanges parameter
val msg = if ([Link]) "CHANGED inputs are out of date"
else "ALL inputs are out of date"
println(msg)
}
}
[Link]
@Incremental
@InputDirectory
def File inputDir
@OutputDirectory
def File outputDir
@Input
def inputProperty // File input property
@TaskAction
void execute(InputChanges inputs) { // InputChanges parameter
println [Link] ? "CHANGED inputs are out of date"
: "ALL inputs are out of date"
}
}
IMPORTANT: To query incremental changes for an input file property, that property must always
return the same instance. The easiest way to accomplish this is to use one of the following property
types: RegularFileProperty, DirectoryProperty, or ConfigurableFileCollection.
The incremental task action can use [Link]() to find out what files have
changed for a given file-based input property, be it of type RegularFileProperty, DirectoryProperty
or ConfigurableFileCollection.
The method returns an Iterable of type FileChanges, which in turn can be queried for the
following:
The following example demonstrates an incremental task that has a directory input. It assumes that
the directory contains a collection of text files and copies them to an output directory, reversing the
text within each file:
[Link]
@get:OutputDirectory
abstract val outputDir: DirectoryProperty
@get:Input
abstract val inputProperty: Property<String>
@TaskAction
fun execute(inputChanges: InputChanges) {
println(
if ([Link]) "Executing incrementally"
else "Executing non-incrementally"
)
[Link]
@OutputDirectory
abstract DirectoryProperty getOutputDir()
@Input
abstract Property<String> getInputProperty()
@TaskAction
void execute(InputChanges inputChanges) {
println([Link]
? 'Executing incrementally'
: 'Executing non-incrementally'
)
If, for some reason, the task is executed non-incrementally (by running with --rerun-tasks, for
example), all files are reported as ADDED, irrespective of the previous state. In this case, Gradle
automatically removes the previous outputs, so the incremental task must only process the given
files.
For a simple transformer task like the above example, the task action must generate output files for
any out-of-date inputs and delete output files for any removed inputs.
When a task has been previously executed, and the only changes since that execution are to
incremental input file properties, Gradle can intelligently determine which input files need to be
processed, a concept known as incremental execution.
However, there are many cases where Gradle cannot determine which input files need to be
processed (i.e., non-incremental execution). Examples include:
• You are building with a different version of Gradle. Currently, Gradle does not use task history
from a different version.
• A non-incremental input file property has changed since the previous execution.
• One or more output files have changed since the previous execution.
In these cases, Gradle will report all input files as ADDED, and the getFileChanges() method will
return details for all the files that comprise the given input property.
You can check if the task execution is incremental or not with the [Link]()
method.
Consider an instance of IncrementalReverseTask executed against a set of inputs for the first time.
[Link]<IncrementalReverseTask>("incrementalReverse") {
    inputDir = file("inputs")
    outputDir = [Link]("outputs")
    inputProperty = [Link]("taskInputProperty") as String? ?: "original"
}
[Link]
[Link]('incrementalReverse', IncrementalReverseTask) {
inputDir = file('inputs')
outputDir = [Link]("outputs")
inputProperty = [Link]['taskInputProperty'] ?: 'original'
}
.
├── [Link]
└── inputs
├── [Link]
├── [Link]
└── [Link]
$ gradle -q incrementalReverse
Executing non-incrementally
ADDED: [Link]
ADDED: [Link]
ADDED: [Link]
Naturally, when the task is executed again with no changes, then the entire task is UP-TO-DATE, and
the task action is not executed:
$ gradle incrementalReverse
> Task :incrementalReverse UP-TO-DATE
BUILD SUCCESSFUL in 0s
1 actionable task: 1 up-to-date
When an input file is modified in some way or a new input file is added, then re-executing the task
results in those files being returned by [Link]().
The following example modifies the content of one file and adds another before running the
incremental task:
[Link]
[Link]("updateInputs") {
val inputsDir = [Link]("inputs")
[Link](inputsDir)
doLast {
[Link]("[Link]").[Link]("Changed content for
existing file 1.")
[Link]("[Link]").[Link]("Content for new file 4.")
}
}
[Link]
[Link]('updateInputs') {
    def inputsDir = [Link]('inputs')
    [Link](inputsDir)
    doLast {
        [Link]('[Link]').[Link] = 'Changed content for existing file 1.'
        [Link]('[Link]').[Link] = 'Content for new file 4.'
    }
}
NOTE: The various mutation tasks (updateInputs, removeInput, etc.) are only present to
demonstrate the behavior of incremental tasks. They should not be viewed as the kinds of tasks or
task implementations you should have in your own build scripts.
When an existing input file is removed, then re-executing the task results in that file being returned
by InputChanges.getFileChanges() as REMOVED.
The following example removes one of the existing files before executing the incremental task:
build.gradle.kts
tasks.register<Delete>("removeInput") {
    delete("inputs/3.txt")
}
build.gradle
tasks.register('removeInput', Delete) {
    delete 'inputs/3.txt'
}
Gradle cannot determine which input files are out-of-date when an output file is deleted (or
modified). In this case, details for all the input files for the given property are returned by
InputChanges.getFileChanges().
The following example removes one of the output files from the build directory. However, all the
input files are considered to be ADDED:
build.gradle.kts
tasks.register<Delete>("removeOutput") {
    delete(layout.buildDirectory.file("outputs/1.txt"))
}
build.gradle
tasks.register('removeOutput', Delete) {
    delete layout.buildDirectory.file("outputs/1.txt")
}
The last scenario we want to cover concerns what happens when a non-file-based input property is
modified. In such cases, Gradle cannot determine how the property impacts the task outputs, so the
task is executed non-incrementally. This means that all input files for the given property are
returned by InputChanges.getFileChanges() and they are all treated as ADDED.
The following example sets the project property taskInputProperty to a new value when running
the incrementalReverse task. That project property is used to initialize the task’s inputProperty
property, as you can see in the first example of this section.
Declaring and using command-line options
Sometimes, a user wants to declare the value of an exposed task property on the command line
instead of the build script. Passing property values on the command line is particularly helpful if
they change more frequently.
The task API supports a mechanism for marking a property to automatically generate a
corresponding command line parameter with a specific name at runtime.
To expose a new command line option for a task property, annotate the corresponding setter
method of a property with Option:
A task can expose as many command line options as properties available in the class.
Options may be declared in superinterfaces of the task class as well. If multiple interfaces declare
the same property but with different option flags, they will both work to set the property.
In the example below, the custom task UrlVerify verifies whether a URL can be resolved by making
an HTTP call and checking the response code. The URL to be verified is configurable through the
property url. The setter method for the property is annotated with @Option:
UrlVerify.java
import org.gradle.api.tasks.options.Option;

@Option(option = "url", description = "Configures the URL to be verified.")
public void setUrl(String url) {
    this.url = url;
}

@Input
public String getUrl() {
    return url;
}

@TaskAction
public void verify() {
    getLogger().quiet("Verifying URL '{}'", url);
}
All options declared for a task can be rendered as console output by running the help task with the --task option.
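For instance, assuming the verifyUrl task shown below is registered, its declared options can be inspected with:

```shell
$ gradle -q help --task verifyUrl
```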
• The option uses a double-dash as a prefix, e.g., --url. A single dash does not qualify as valid
syntax for a task option.
• The option argument follows directly after the task declaration, e.g., verifyUrl
--url=[Link]
• Multiple task options can be declared in any order on the command line following the task
name.
Building upon the earlier example, the build script creates a task instance of type UrlVerify and
provides a value from the command line through the exposed option:
build.gradle.kts
tasks.register<UrlVerify>("verifyUrl")
build.gradle
tasks.register('verifyUrl', UrlVerify)
Gradle limits the data types that can be used for declaring command line options.
Double, Property<Double>
Describes an option with a double value.
Passing the option on the command line also requires a value, e.g., --factor=2.2 or --factor 2.2.
Integer, Property<Integer>
Describes an option with an integer value.
Passing the option on the command line also requires a value, e.g., --network-timeout=5000 or
--network-timeout 5000.
Long, Property<Long>
Describes an option with a long value.
Passing the option on the command line also requires a value, e.g., --threshold=2147483648 or
--threshold 2147483648.
String, Property<String>
Describes an option with an arbitrary String value.
Passing the option on the command line also requires a value, e.g., --container-id=2x94held or
--container-id 2x94held.
enum, Property<enum>
Describes an option as an enumerated type.
Passing the option on the command line also requires a value, e.g., --log-level=DEBUG or --log-level debug.
The value is not case-sensitive.
DirectoryProperty, RegularFileProperty
Describes an option with a file system element.
Passing the option on the command line also requires a value representing a path, e.g., --output-file=file.txt or --output-dir outputDir.
Relative paths are resolved relative to the project directory of the project that owns this property instance. See FileSystemLocationProperty.set().
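The case-insensitive matching of enum values noted above can be illustrated with plain Java (the OutputType enum and parser here are hypothetical, not Gradle API):

```java
import java.util.Locale;

// Hypothetical enum for a task option; not part of the Gradle API.
enum OutputType { CONSOLE, FILE }

class OutputTypeParser {
    // Normalizing before valueOf is one way to match values case-insensitively,
    // comparable to how an enum option accepts --log-level=DEBUG or debug.
    static OutputType parse(String raw) {
        return OutputType.valueOf(raw.toUpperCase(Locale.ROOT));
    }
}
```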
Theoretically, an option for a property type String or List<String> can accept any arbitrary value.
Accepted values for such an option can be documented programmatically with the help of the
annotation OptionValues:
@OptionValues('file')
This annotation may be assigned to any method that returns a List of one of the supported data
types. You need to specify an option identifier to indicate the relationship between the option and
available values.
NOTE: Passing a value on the command line that is not supported by the option does not fail the build or throw an exception. You must implement custom logic for such behavior in the task action.
The example below demonstrates the use of multiple options for a single task. The task
implementation provides a list of available values for the option output-type:
UrlProcess.java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.gradle.api.tasks.options.Option;
import org.gradle.api.tasks.options.OptionValues;

public abstract class UrlProcess extends DefaultTask {
    private String url;
    private OutputType outputType;

    @Input
    @Option(option = "http", description = "Configures the http protocol to be allowed.")
    public abstract Property<Boolean> getHttp();

    @Option(option = "url", description = "Configures the URL to send the request to.")
    public void setUrl(String url) {
        if (!getHttp().getOrElse(true) && url.startsWith("http://")) {
            throw new IllegalArgumentException("HTTP is not allowed");
        } else {
            this.url = url;
        }
    }

    @Input
    public String getUrl() {
        return url;
    }

    @Option(option = "output-type", description = "Configures the output type.")
    public void setOutputType(OutputType outputType) {
        this.outputType = outputType;
    }

    @OptionValues("output-type")
    public List<OutputType> getAvailableOutputTypes() {
        return new ArrayList<OutputType>(Arrays.asList(OutputType.values()));
    }

    @Input
    public OutputType getOutputType() {
        return outputType;
    }

    @TaskAction
    public void process() {
        getLogger().quiet("Writing out the URL response from '{}' to '{}'", url, outputType);
    }
}
Command line options using the annotations Option and OptionValues are self-documenting.
You will see declared options and their available values reflected in the console output of the help
task. The output renders options alphabetically, except for boolean disable options, which appear
following the enable option:
Path
:processUrl
Type
UrlProcess (UrlProcess)
Options
--http Configures the http protocol to be allowed.
Description
-
Group
-
Limitations
Support for declaring command line options currently comes with a few limitations.
• Command line options can only be declared for custom tasks via annotation. There’s no
programmatic equivalent for defining options.
• When assigning an option on the command line, the task exposing the option needs to be
spelled out explicitly, e.g., gradle check --tests abc does not work even though the check task
depends on the test task.
• If you specify a task option name that conflicts with the name of a built-in Gradle option, use the
-- delimiter before calling your task to reference that option. For more information, see
Disambiguate Task Options from Built-in Options.
Verification failures
Normally, exceptions thrown during task execution result in a failure that immediately terminates
a build. The outcome of the task will be FAILED, the result of the build will be FAILED, and no further
tasks will be executed. When running with the --continue flag, Gradle will continue to run other
requested tasks in the build after encountering a task failure. However, any tasks that depend on a
failed task will not be executed.
There is a special type of exception that behaves differently when downstream tasks only rely on
the outputs of a failing task. A task can throw a subtype of VerificationException to indicate that it
has failed in a controlled manner such that its output is still valid for consumers. A task depends on
the outcome of another task when it directly depends on it using dependsOn. When Gradle is run
with --continue, consumer tasks that depend on a producer task’s output (via a relationship
between task inputs and outputs) can still run after the producer fails.
A failed unit test, for instance, will cause a failing outcome for the test task. However, this doesn’t
prevent another task from reading and processing the (valid) test results the task produced.
Verification failures are used in exactly this manner by the Test Report Aggregation Plugin.
Verification failures are also useful for tasks that need to report a failure even after producing
useful output consumable by other tasks.
build.gradle.kts
val process = tasks.register("process") {
    val outputFile = layout.buildDirectory.file("output.log")
    outputs.files(outputFile) ①
    doLast {
        val logFile = outputFile.get().asFile
        logFile.appendText("Step 1 Complete.") ②
        throw VerificationException("Process failed!") ③
        logFile.appendText("Step 2 Complete.") ④
    }
}
tasks.register("postProcess") {
    inputs.files(process) ⑤
    doLast {
        println("Results: ${inputs.files.singleFile.readText()}") ⑥
    }
}
build.gradle
tasks.register("process") {
    def outputFile = layout.buildDirectory.file("output.log")
    outputs.files(outputFile) ①
    doLast {
        def logFile = outputFile.get().asFile
        logFile << "Step 1 Complete." ②
        throw new VerificationException("Process failed!") ③
        logFile << "Step 2 Complete." ④
    }
}
tasks.register("postProcess") {
    inputs.files(tasks.named("process")) ⑤
    doLast {
        println("Results: ${inputs.files.singleFile.text}") ⑥
    }
}
① Register Output: The process task writes its output to a log file.
② Modify Output: The task action appends a first line to the log file.
③ Task Failure: The task throws a VerificationException and fails at this point.
④ Continue to Modify Output: This line never runs due to the exception stopping the task.
⑤ Consume Output: The postProcess task depends on the output of the process task due to using that task’s outputs as its own inputs.
⑥ Use Partial Result: With the --continue flag set, Gradle still runs the requested postProcess task
despite the process task’s failure. postProcess can read and display the partial (though still valid)
result.
Using Shared Build Services
Shared build services allow tasks to share state or resources. For example, tasks might share a
cache of pre-computed values or use a web service or database instance.
A build service is an object that holds the state for tasks to use. It provides an alternative
mechanism for hooking into a Gradle build and receiving information about task execution and
operation completion.
Gradle manages the service lifecycle, creating the service instance only when required and
cleaning it up when no longer needed. Gradle can also coordinate access to the build service,
ensuring that no more than a specified number of tasks use the service concurrently.
To implement a build service, create an abstract class that implements BuildService. Then, define
methods you want the tasks to use on this type.
A build service implementation is treated as a custom Gradle type and can use any of the features
available to custom Gradle types.
A build service can optionally take parameters, which Gradle injects into the service instance when
creating it. To provide parameters, you define an abstract class (or interface) that holds the
parameters. The parameters type must implement (or extend) BuildServiceParameters. The service
implementation can access the parameters using this.getParameters(). The parameters type is also
a custom Gradle type.
When the build service does not require any parameters, you can use BuildServiceParameters.None as the type of parameters.
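A minimal sketch of a parameterless service (the class name and counter behavior are illustrative assumptions):

```java
import java.util.concurrent.atomic.AtomicInteger;
import org.gradle.api.services.BuildService;
import org.gradle.api.services.BuildServiceParameters;

// BuildServiceParameters.None marks the service as taking no parameters.
public abstract class CounterService
        implements BuildService<BuildServiceParameters.None> {

    private final AtomicInteger count = new AtomicInteger();

    // All tasks sharing the service see the same build-scoped counter
    public int next() {
        return count.incrementAndGet();
    }
}
```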
A build service implementation can also optionally implement AutoCloseable, in which case Gradle
will call the build service instance’s close() method when it discards the service instance. This
happens sometime between the completion of the last task that uses the build service and the end
of the build.
WebServer.java
import java.net.URI;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.services.BuildService;
import org.gradle.api.services.BuildServiceParameters;

public abstract class WebServer implements BuildService<WebServer.Params>, AutoCloseable {

    // Parameters that Gradle injects when creating the service
    interface Params extends BuildServiceParameters {
        Property<Integer> getPort();
        DirectoryProperty getResources();
    }

    // A public method for tasks to use
    public URI getUri() {
        return URI.create("https://localhost:" + getParameters().getPort().get() + "/");
    }

    @Override
    public void close() {
        // Stop the server ...
    }
}
Note that you should not implement the BuildService.getParameters() method, as Gradle will provide an implementation of this.
A build service implementation must be thread-safe, as it will potentially be used by multiple tasks
concurrently.
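The thread-safety requirement concerns the state a service holds. As a plain-Java sketch (the cache and checksum are illustrative, not Gradle API), a concurrent map lets multiple tasks call into shared state safely:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// The kind of shared state a build service might hold; ConcurrentHashMap
// makes concurrent lookups by multiple tasks safe without explicit locks.
class ChecksumCache {
    private final Map<String, Integer> cache = new ConcurrentHashMap<>();

    int checksum(String input) {
        // computeIfAbsent is atomic per key: the value is computed at most
        // once even if several tasks request the same key at the same time.
        return cache.computeIfAbsent(input, s -> s.chars().sum());
    }
}
```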
Registering a build service and connecting it to a task
To create a build service, you register the service instance using the BuildServiceRegistry.registerIfAbsent() method.
Registering the service does not create the service instance. This happens on demand when a task
first uses the service. The service instance will not be created if no task uses the service during a
build.
Currently, build services are scoped to a build, rather than a project, and these services are
available to be shared by the tasks of all projects. You can access the registry of shared build services via Project.getGradle().getSharedServices().
Here is an example of a plugin that registers the previous service when the task property
consuming the service is annotated with @ServiceReference:
DownloadPlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;
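The plugin listing here is truncated in this extract. A sketch of what such a registration can look like (the class name DownloadPlugin, the service name "web", and the port value are assumptions consistent with the surrounding examples):

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class DownloadPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        // Register (or look up) the shared service; tasks with a matching
        // @ServiceReference property get the instance injected automatically.
        project.getGradle().getSharedServices().registerIfAbsent(
            "web", WebServer.class, spec -> {
                // Port value is illustrative
                spec.getParameters().getPort().set(5005);
            });
    }
}
```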
As you can see, there is no need to assign the build service provider returned by registerIfAbsent() to the task; the service is automatically injected into all matching properties annotated with @ServiceReference.
Here is an example of a task that consumes the previous service via a property annotated with
@ServiceReference:
Download.java
import java.net.URI;
import org.gradle.api.DefaultTask;
import org.gradle.api.file.RegularFileProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.services.ServiceReference;
import org.gradle.api.tasks.OutputFile;
import org.gradle.api.tasks.TaskAction;

public abstract class Download extends DefaultTask {

    // This property provides access to the service instance
    @ServiceReference("web")
    abstract Property<WebServer> getServer();

    @OutputFile
    abstract RegularFileProperty getOutputFile();

    @TaskAction
    public void download() {
        // Use the server to download a file
        WebServer server = getServer().get();
        URI uri = server.getUri().resolve("index.html");
        System.out.println(String.format("Downloading %s", uri));
    }
}
Automatic matching of registered build services with service reference properties is done by type
and (optionally) by name (for properties that declare the name of the service they expect). In case
multiple services would match the requested service type (i.e. multiple services were registered for
the same type, and a service name was not provided in the @ServiceReference annotation), you will
need also to assign the shared build service provider manually to the task property.
Read on to compare that to when the task property consuming the service is instead annotated with
@Internal.
DownloadPlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;
In this case, the plugin registers the service and receives a Provider<WebServer> back. This provider can be connected to task properties to pass the service to the task. Note that for a task property annotated with @Internal, the task property needs to (1) be explicitly assigned with the provider obtained during registration, and (2) you must tell Gradle the task uses the service via Task.usesService. None of that is needed when the task property consuming the service is annotated with @ServiceReference.
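As a sketch of that manual wiring (names follow the surrounding examples; the task registration itself is an assumption):

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;

public class DownloadPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        Provider<WebServer> serviceProvider =
            project.getGradle().getSharedServices().registerIfAbsent(
                "web", WebServer.class, spec -> {});
        project.getTasks().register("download", Download.class, task -> {
            // (1) explicitly assign the provider to the @Internal property
            task.getServer().set(serviceProvider);
            // (2) tell Gradle that the task uses the service
            task.usesService(serviceProvider);
        });
    }
}
```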
Here is an example of a task that consumes the previous service via a property annotated with
@Internal:
Download.java
import java.net.URI;
import org.gradle.api.DefaultTask;
import org.gradle.api.file.RegularFileProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.tasks.Internal;
import org.gradle.api.tasks.OutputFile;
import org.gradle.api.tasks.TaskAction;

public abstract class Download extends DefaultTask {

    // This property provides access to the service instance
    @Internal
    abstract Property<WebServer> getServer();

    @OutputFile
    abstract RegularFileProperty getOutputFile();

    @TaskAction
    public void download() {
        // Use the server to download a file
        WebServer server = getServer().get();
        URI uri = server.getUri().resolve("index.html");
        System.out.println(String.format("Downloading %s", uri));
    }
}
Note that using a service with any annotation other than @ServiceReference or @Internal is currently
not supported. For example, it is currently impossible to mark a service as an input to a task.
Generally, build services are intended to be used by tasks, and as they usually represent some
potentially expensive state to create, you should avoid using them at configuration time. However,
sometimes, using the service at configuration time can make sense. This is possible; call get() on
the provider.
In addition to using a build service from a task, you can use a build service from a Worker API
action, an artifact transform or another build service. To do this, pass the build service Provider as a
parameter of the consuming action or service, in the same way you pass other parameters to the
action or service.
For example, to pass a MyServiceType service to a Worker API action, you might add a property of type
Property<MyServiceType> to the action’s parameters object and then connect the
Provider<MyServiceType> that you receive when registering the service to this property:
Download.java
import java.net.URI;
import javax.inject.Inject;
import org.gradle.api.DefaultTask;
import org.gradle.api.provider.Property;
import org.gradle.api.services.ServiceReference;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkAction;
import org.gradle.workers.WorkParameters;
import org.gradle.workers.WorkQueue;
import org.gradle.workers.WorkerExecutor;
@Override
public void execute() {
// Use the server to download a file
WebServer server = getParameters().getServer().get();
URI uri = server.getUri().resolve("index.html");
System.out.println(String.format("Downloading %s", uri));
}
}
@Inject
abstract public WorkerExecutor getWorkerExecutor();
// This property provides access to the service instance from the task
@ServiceReference("web")
abstract Property<WebServer> getServer();
@TaskAction
public void download() {
WorkQueue workQueue = getWorkerExecutor().noIsolation();
[Link]([Link], parameter -> {
[Link]().set(getServer());
});
}
}
Currently, it is impossible to use a build service with a worker API action that uses ClassLoader or
process isolation modes.
You can constrain concurrent execution when you register the service, by using the Property object
returned from BuildServiceSpec.getMaxParallelUsages(). When this property has no value, which is
the default, Gradle does not constrain access to the service. When this property has a value > 0,
Gradle will allow no more than the specified number of tasks to use the service concurrently.
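For example, a registration that allows at most two tasks to use the service at a time might look like this (service type and name follow the earlier examples):

```java
Provider<WebServer> serviceProvider =
    project.getGradle().getSharedServices().registerIfAbsent(
        "web", WebServer.class, spec -> {
            // No more than 2 tasks may use the service concurrently
            spec.getMaxParallelUsages().set(2);
        });
```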
IMPORTANT: When the consuming task property is annotated with @Internal, for the constraint to take effect, the build service must be registered with the consuming task via Task.usesService. At this time, Gradle cannot discover indirect usage of services (for instance, if an additional service is used only by a service that the task uses directly). As a workaround, indirect usage may be declared explicitly to Gradle by either adding a @ServiceReference property to the task and assigning the service that is only used indirectly to it (making it a direct reference), or invoking Task.usesService.
Receiving information about task execution
A build service can be used to receive events as tasks are executed. To do this, create and register a
build service that implements OperationCompletionListener:
TaskEventsService.java
import org.gradle.api.services.BuildService;
import org.gradle.api.services.BuildServiceParameters;
import org.gradle.tooling.events.FinishEvent;
import org.gradle.tooling.events.OperationCompletionListener;
import org.gradle.tooling.events.task.TaskFinishEvent;

public abstract class TaskEventsService implements
        BuildService<BuildServiceParameters.None>,
        OperationCompletionListener {

    @Override
    public void onFinish(FinishEvent finishEvent) {
        if (finishEvent instanceof TaskFinishEvent) { ②
            // Handle task finish event...
        }
    }
}
Then, in the plugin, you can use the methods on the BuildEventsListenerRegistry service to start
receiving events:
TaskEventsPlugin.java
import javax.inject.Inject;
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Provider;
import org.gradle.build.event.BuildEventsListenerRegistry;

public abstract class TaskEventsPlugin implements Plugin<Project> {

    @Inject
    public abstract BuildEventsListenerRegistry getEventsListenerRegistry();

    @Override
    public void apply(Project project) {
        Provider<TaskEventsService> serviceProvider =
            project.getGradle().getSharedServices().registerIfAbsent(
                "taskEvents", TaskEventsService.class, spec -> {}); ②
        getEventsListenerRegistry().onTaskCompletion(serviceProvider); ③
    }
}
③ Use the service Provider to subscribe the build service to build events.
[1] You might be wondering why there is neither an import for the StopExecutionException nor do we access it via its fully qualified
name. The reason is that Gradle adds a set of default imports to your script (see Default imports).
DEVELOPING PLUGINS
Understanding Plugins
Gradle comes with a set of powerful core systems such as dependency management, task execution,
and project configuration. But everything else it can do is supplied by plugins.
Plugins encapsulate logic for specific tasks or integrations, such as compiling code, running tests, or
deploying artifacts. By applying plugins, users can easily add new features to their build process
without having to write complex code from scratch.
This plugin-based approach allows Gradle to be lightweight and modular. It also promotes code
reuse and maintainability, as plugins can be shared across projects or within an organization.
Before reading this chapter, it’s recommended that you first read Learning The Basics and complete
the Tutorial.
Plugins Introduction
Plugins can be sourced from Gradle or the Gradle community. But when users want to organize
their build logic or need specific build capabilities not provided by existing plugins, they can
develop their own.
Plugins are distributed in three ways:
1. Core Plugins - plugins that come with the Gradle distribution.
2. Community Plugins - plugins that come from the Gradle Plugin Portal or a public repository.
3. Local or Custom Plugins - plugins that you develop yourself.
Core Plugins
The term core plugin refers to a plugin that is part of the Gradle distribution such as the Java
Library Plugin. They are always available.
Community Plugins
The term community plugin refers to a plugin published to the Gradle Plugin Portal (or another
public repository) such as the Spotless Plugin.
Custom plugins
The term local or custom plugin refers to a plugin you write yourself for your own build.
Script plugins
Script plugins are typically small, local plugins written in script files for tasks specific to a single
build or project. They do not need to be reused across multiple projects. Script plugins are not
recommended but many other forms of plugins evolve from script plugins.
To create a plugin, you need to write a class that implements the Plugin interface.
The following sample creates a GreetingPlugin, which adds a hello task to a project when applied:
build.gradle.kts
build.gradle
$ gradle -q hello
Hello from the GreetingPlugin
The Project object is passed as a parameter in apply(), which the plugin can use to configure the
project however it needs to (such as adding tasks, configuring dependencies, etc.). In this example,
the plugin is written directly in the build file which is not a recommended practice.
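A sketch of what such an in-build-script plugin can look like in the Kotlin DSL (the class body is an illustration consistent with the output shown above):

```kotlin
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // The plugin configures the project it is applied to,
        // here by adding a task
        project.tasks.register("hello") {
            doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}

// Apply the plugin to this project
apply<GreetingPlugin>()
```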
When the plugin is written in a separate script file, it can be applied using apply(from = "file_name.gradle.kts") or apply from: 'file_name.gradle'. In the example below, the plugin is coded in the other.gradle(.kts) script file. Then, the other.gradle(.kts) script is applied to build.gradle(.kts) using apply from:
other.gradle.kts
other.gradle
build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'
$ gradle -q hi
Hi from the GreetingScriptPlugin
Precompiled script plugins are compiled into class files and packaged into a JAR before they are
executed. These plugins use the Groovy DSL or Kotlin DSL instead of pure Java, Kotlin, or Groovy.
They are best used as convention plugins that share build logic across projects or as a way to
neatly organize build logic.
1. Use Gradle’s Kotlin DSL - The plugin is a .gradle.kts file, and apply id("kotlin-dsl").
2. Use Gradle’s Groovy DSL - The plugin is a .gradle file, and apply id("groovy-gradle-plugin").
To apply a precompiled script plugin, you need to know its ID. The ID is derived from the plugin
script’s filename and its (optional) package declaration.
When the plugin is applied to a project, Gradle creates an instance of the plugin class and calls the
instance’s [Link]() method.
NOTE: A new instance of a Plugin is created within each project applying that plugin.
Let’s rewrite the GreetingPlugin script plugin as a precompiled script plugin. Since we are using the
Groovy or Kotlin DSL, the file essentially becomes the plugin. The original script plugin simply
created a hello task which printed a greeting, this is what we will do in the pre-compiled script
plugin:
buildSrc/src/main/kotlin/GreetingPlugin.gradle.kts
tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}
buildSrc/src/main/groovy/GreetingPlugin.gradle
tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}
The GreetingPlugin can now be applied in other subprojects' builds by using its ID:
app/build.gradle.kts
plugins {
application
id("GreetingPlugin")
}
app/build.gradle
plugins {
id 'application'
id('GreetingPlugin')
}
$ gradle -q hello
Hello from the convention GreetingPlugin
Convention plugins
A convention plugin is typically a precompiled script plugin that configures existing core and
community plugins with your own conventions (i.e. default values) such as setting the Java version
by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are
also used to enforce project standards and help streamline the build process. They can apply and
configure plugins, create new tasks and extensions, set dependencies, and much more.
Let’s take an example build with three subprojects: one for data-model, one for database-logic and
one for app code. The project has the following structure:
.
├── buildSrc
│   ├── src
│   │   └── ...
│   └── build.gradle.kts
├── data-model
│   ├── src
│   │   └── ...
│   └── build.gradle.kts
├── database-logic
│   ├── src
│   │   └── ...
│   └── build.gradle.kts
├── app
│   ├── src
│   │   └── ...
│   └── build.gradle.kts
└── settings.gradle.kts
database-logic/build.gradle.kts
plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm") version "2.0.21"
}
repositories {
    mavenCentral()
}
java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
    useJUnitPlatform()
}
kotlin {
    jvmToolchain(11)
}
database-logic/build.gradle
plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm' version '2.0.21'
}
repositories {
    mavenCentral()
}
java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.named('test') {
    useJUnitPlatform()
}
kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}
We apply the java-library plugin and add the org.jetbrains.kotlin.jvm plugin for Kotlin support. We also configure Kotlin, Java, tests and more.
The more plugins we apply and configure, the larger each build file gets. There’s also repetition in the build files of the app and data-model subprojects, especially when configuring common extensions like setting the Java version and Kotlin support.
To address this, we use convention plugins. This allows us to avoid repeating configuration in each
build file and keeps our build scripts more concise and maintainable. In convention plugins, we can
encapsulate arbitrary build configuration or custom build logic.
buildSrc/src/main/kotlin/my-java-library.gradle.kts
plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm")
}
repositories {
    mavenCentral()
}
java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.test {
    useJUnitPlatform()
}
kotlin {
    jvmToolchain(11)
}
buildSrc/src/main/groovy/my-java-library.gradle
plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm'
}
repositories {
    mavenCentral()
}
java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}
tasks.named('test') {
    useJUnitPlatform()
}
kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}
The name of the file my-java-library is the ID of our brand-new plugin, which we can now use in all
of our subprojects.
The database-logic build file becomes much simpler by removing all the redundant build logic and
applying our convention my-java-library plugin instead:
database-logic/build.gradle.kts
plugins {
id("my-java-library")
}
database-logic/build.gradle
plugins {
id('my-java-library')
}
This convention plugin enables us to easily share common configurations across all our build files.
Any modifications can be made in one place, simplifying maintenance.
Binary plugins
Binary plugins in Gradle are plugins that are built as standalone JAR files and applied to a project
using the plugins{} block in the build script.
Let’s move our GreetingPlugin to a standalone project so that we can publish it and share it with
others. The plugin is essentially moved from the buildSrc folder to its own build called greeting-
plugin.
NOTE: You can publish the plugin from buildSrc, but this is not recommended practice. Plugins that are ready for publication should be in their own build.
greeting-plugin is simply a Java project that produces a JAR containing the plugin classes.
The easiest way to package and publish a plugin to a repository is to use the Gradle Plugin
Development Plugin. This plugin provides the necessary tasks and configurations (including the
plugin metadata) to compile your script into a plugin that can be applied in other builds.
Here is a simple build script for the greeting-plugin project using the Gradle Plugin Development
Plugin:
build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("simplePlugin") {
id = "org.example.greeting"
implementationClass = "org.example.GreetingPlugin"
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}
In the example used through this section, the plugin accepts the Project type as a type parameter.
Alternatively, the plugin can accept a parameter of type Settings to be applied in a settings script, or
a parameter of type Gradle to be applied in an initialization script.
The difference between these types of plugins lies in the scope of their application:
Project Plugin
A project plugin is a plugin that is applied to a specific project in a build. It can customize the
build logic, add tasks, and configure the project-specific settings.
Settings Plugin
A settings plugin is a plugin that is applied in the settings.gradle or settings.gradle.kts file. It
can configure settings that apply to the entire build, such as defining which projects are
included in the build, configuring build script repositories, and applying common configurations
to all projects.
Init Plugin
An init plugin is a plugin that is applied in the init.gradle or init.gradle.kts file. It can
configure settings that apply globally to all Gradle builds on a machine, such as configuring the
Gradle version, setting up default repositories, or applying common plugins to all builds.
Script Plugins are simple and easy to write. They are written in Kotlin DSL or Groovy DSL. They
are suitable for small, one-off tasks or for quick experimentation. However, they can become hard
to maintain as the build script grows in size and complexity.
Precompiled Script Plugins are Kotlin DSL or Groovy DSL scripts compiled into Java class files packaged in a library. They offer better performance and maintainability compared to script plugins, and they can be reused across different projects. Writing them in the Groovy DSL is possible but not recommended.
Binary Plugins are full-fledged plugins written in Java, Groovy, or Kotlin, compiled into JAR files,
and published to a repository. They offer the best performance, maintainability, and reusability.
They are suitable for complex build logic that needs to be shared across projects, builds, and teams.
You can also write them in Scala or Groovy but that is not recommended.
If you suspect issues with your plugin code, try creating a Build Scan to identify bottlenecks. The
Gradle profiler can help automate Build Scan generation and gather more low-level information.
A convention plugin is a plugin that normally configures existing core and community plugins
with your own conventions (i.e. default values) such as setting the Java version by using
[Link] = [Link](17). Convention plugins are also used to
enforce project standards and help streamline the build process. They can apply and configure
plugins, create new tasks and extensions, set dependencies, and much more.
The plugin ID for a precompiled script is derived from its file name and optional package
declaration.
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
app/build.gradle.kts
plugins {
id("code-quality")
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
app/build.gradle
plugins {
id 'code-quality'
}
On the other hand, a script named code-quality.gradle.kts located in src/main/kotlin/my with the package declaration my would be exposed as the my.code-quality plugin:
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
app/build.gradle.kts
plugins {
id("my.code-quality")
}
Extension objects are commonly used in plugins to expose configuration options and additional
functionality to build scripts.
When you apply a plugin that defines an extension, you can access the extension object and
configure its properties or call its methods to customize the behavior of the plugin or tasks
provided by the plugin.
A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.
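The extension listings in this extract are truncated, so here is a hedged sketch of adding such an extension in the Kotlin DSL (the names mirror the GreetingPluginExtension discussed next):

```kotlin
// An extension holding the plugin's configurable state
interface GreetingPluginExtension {
    val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Add the 'greeting' extension object to the project's ExtensionContainer
        project.extensions.create<GreetingPluginExtension>("greeting")
    }
}
```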
buildSrc/src/main/kotlin/[Link]
buildSrc/src/main/groovy/[Link]
However, the GreetingPluginExtension object becomes available as a project property with the same
name as the extension object. You can now access message like so:
buildSrc/src/main/kotlin/[Link]
buildSrc/src/main/groovy/[Link]
If you apply the greetings plugin, you can set the convention in your build script:
app/build.gradle.kts
plugins {
application
id("greetings")
}
greeting {
message = "Hello from Gradle"
}
app/build.gradle
plugins {
id 'application'
id('greetings')
}
configure(greeting) {
message = "Hello from Gradle"
}
Adding default configuration as conventions
In plugins, you can define default values, also known as conventions, using the project object.
Convention properties are properties that are initialized with default values but can be overridden:
buildSrc/src/main/kotlin/greetings.gradle.kts
buildSrc/src/main/groovy/greetings.gradle
message.convention(…) sets a convention for the message property of the extension. This
convention specifies that the value of message should default to "Hello from Gradle".
If the message property is not explicitly set, its value will be automatically set to "Hello from Gradle".
Using an extension and mapping it to a custom task’s input/output properties is common in plugins.
In this example, the message property of the GreetingPluginExtension is mapped to the message
property of the GreetingTask as an input:
buildSrc/src/main/kotlin/greetings.gradle.kts
@TaskAction
fun greet() {
println("Message: ${[Link]()}")
}
}
buildSrc/src/main/groovy/greetings.gradle
@TaskAction
void greet() {
println("Message: ${[Link]()}")
}
}
$ gradle -q hello
Message: Hello from Gradle
This means that changes to the extension’s message property will trigger the task to be considered
out-of-date, ensuring that the task is re-executed with the new message.
You can find out more about types that you can use in task implementations and extensions in Lazy
Configuration.
In order to apply an external plugin in a precompiled script plugin, it has to be added to the plugin
project’s implementation classpath in the plugin’s build file:
buildSrc/build.gradle.kts
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
dependencies {
implementation("[Link]:gradle-docker-plugin:6.4.0")
}
buildSrc/build.gradle
plugins {
id 'groovy-gradle-plugin'
}
repositories {
mavenCentral()
}
dependencies {
implementation 'com.bmuschko:gradle-docker-plugin:6.4.0'
}
buildSrc/src/main/kotlin/[Link]
plugins {
id("[Link]-remote-api")
}
buildSrc/src/main/groovy/[Link]
plugins {
id 'com.bmuschko.docker-remote-api'
}
The Gradle Plugin Development plugin can be used to assist in developing Gradle plugins.
This plugin will automatically apply the Java Plugin, add the gradleApi() dependency to the api
configuration, generate the required plugin descriptors in the resulting JAR file, and configure the
Plugin Marker Artifact to be used when publishing.
To apply and configure the plugin, add the following code to your build file:
build.gradle.kts
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("simplePlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
}
}
build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
simplePlugin {
id = 'org.example.greeting'
implementationClass = 'org.example.GreetingPlugin'
}
}
}
Writing and using custom task types is recommended when developing plugins as it automatically
benefits from incremental builds. As an added benefit of applying the plugin to your project, the
task validatePlugins automatically checks for an existing input/output annotation for every public
property defined in a custom task type implementation.
Creating a plugin ID
Plugin IDs are meant to be globally unique, similar to Java package names (i.e., a reverse domain
name). This format helps prevent naming collisions and allows grouping plugins with similar
ownership.
An explicit plugin identifier simplifies applying the plugin to a project. Your plugin ID should
combine components that reflect the namespace (a reasonable pointer to you or your organization)
and the name of the plugin it provides. For example, if your GitHub account is named foo and your
plugin is named bar, a suitable plugin ID might be com.github.foo.bar. Similarly, if the plugin was
developed at the baz organization, the plugin ID might be com.baz.bar.
• Must contain at least one '.' character separating the namespace from the plugin’s name.
• Conventionally use a lowercase reverse domain name convention for the namespace.
A namespace that identifies ownership and a name is sufficient for a plugin ID.
When bundling multiple plugins in a single JAR artifact, adhering to the same naming conventions
is recommended. This practice helps logically group related plugins.
There is no limit to the number of plugins that can be defined and registered (by different
identifiers) within a single project.
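The naming rules above (at least one '.' separating namespace from name, conventionally lowercase segments) can be sketched as a small check. Note that isConventionalPluginId is a hypothetical helper illustrating the convention, not Gradle's actual plugin ID validation:

```kotlin
// Illustrative check of the plugin ID conventions described above
// (not Gradle's real validation logic).
fun isConventionalPluginId(id: String): Boolean {
    // Must contain at least one '.' separating the namespace from the name,
    // and conventionally uses lowercase segments.
    val segments = id.split(".")
    return segments.size >= 2 && segments.all { seg ->
        seg.isNotEmpty() && seg.all { it.isLowerCase() || it.isDigit() || it == '-' }
    }
}

fun main() {
    println(isConventionalPluginId("com.github.foo.bar")) // true
    println(isConventionalPluginId("bar"))                // false: no namespace
}
```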
The identifiers for plugins written as a class should be defined in the project’s build script
containing the plugin classes. For this, the java-gradle-plugin needs to be applied:
buildSrc/build.gradle.kts
plugins {
id("java-gradle-plugin")
}
gradlePlugin {
plugins {
create("androidApplicationPlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
create("androidLibraryPlugin") {
id = "[Link]"
implementationClass = "[Link]"
}
}
}
buildSrc/build.gradle
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
androidApplicationPlugin {
id = 'com.android.application'
implementationClass = 'com.android.AndroidApplicationPlugin'
}
androidLibraryPlugin {
id = 'com.android.library'
implementationClass = 'com.android.AndroidLibraryPlugin'
}
}
}
When developing plugins, it’s a good idea to be flexible when accepting input configuration for file
locations.
It is recommended to use Gradle’s managed properties and project.layout to select file or directory
locations. This will enable lazy configuration so that the actual location will only be resolved when
the file is needed and can be reconfigured at any time during build configuration.
This Gradle build file defines a task GreetingToFileTask that writes a greeting to a file. It also
registers two tasks: greet, which creates the file with the greeting, and sayGreeting, which prints the
file’s contents. The greetingFile property is used to specify the file path for the greeting:
build.gradle.kts
abstract class GreetingToFileTask : DefaultTask() {
    @get:OutputFile
    abstract val destination: RegularFileProperty

    @TaskAction
    fun greet() {
        val file = destination.get().asFile
        file.parentFile.mkdirs()
        file.writeText("Hello!")
    }
}

val greetingFile = objects.fileProperty()

tasks.register<GreetingToFileTask>("greet") {
    destination = greetingFile
}

tasks.register("sayGreeting") {
    dependsOn("greet")
    val greetingFile = greetingFile
    doLast {
        val file = greetingFile.get().asFile
        println("${file.readText()} (file: ${file.name})")
    }
}

greetingFile = layout.projectDirectory.file("greeting.txt")
build.gradle
abstract class GreetingToFileTask extends DefaultTask {
    @OutputFile
    abstract RegularFileProperty getDestination()

    @TaskAction
    def greet() {
        def file = getDestination().get().asFile
        file.parentFile.mkdirs()
        file.write 'Hello!'
    }
}

def greetingFile = objects.fileProperty()

tasks.register('greet', GreetingToFileTask) {
    destination = greetingFile
}

tasks.register('sayGreeting') {
    dependsOn greet
    doLast {
        def file = greetingFile.get().asFile
        println "${file.text} (file: ${file.name})"
    }
}

greetingFile = layout.projectDirectory.file('greeting.txt')
$ gradle -q sayGreeting
Hello! (file: greeting.txt)
In this example, we configure the greet task destination property as a closure/provider, which is
evaluated with the Project.file(java.lang.Object) method to turn the return value of the
closure/provider into a File object at the last minute. Note that we specify the greetingFile
property value after the task configuration. This lazy evaluation is a key benefit of accepting any
value when setting a file property and then resolving that value when reading the property.
You can learn more about working with files lazily in Working with Files.
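The lazy evaluation just described, wiring a property first and supplying the actual value later, can be modeled with a toy class. LazyFileProperty below is a deliberately simplified stand-in, not Gradle's RegularFileProperty:

```kotlin
// Toy model of lazy property evaluation (not Gradle's Property API):
// the value is captured as a producer and only resolved when read.
class LazyFileProperty {
    private var producer: () -> String = { error("value not set") }
    fun set(producer: () -> String) { this.producer = producer }
    fun get(): String = producer()
}

fun main() {
    // Wire the task's destination to another property first...
    val greetingFile = LazyFileProperty()
    val destination = LazyFileProperty()
    destination.set { greetingFile.get() }
    // ...and supply the actual location later, during configuration.
    greetingFile.set { "greeting.txt" }
    println(destination.get()) // resolved only now -> greeting.txt
}
```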
Most plugins offer configuration options for build scripts and other plugins to customize how the
plugin works. Plugins do this using extension objects.
A Project has an associated ExtensionContainer object that contains all the settings and properties
for the plugins that have been applied to the project. You can provide configuration for your plugin
by adding an extension object to this container.
An extension object is simply an object with Java Bean properties representing the configuration.
Let’s add a greeting extension object to the project, which allows you to configure the greeting:
build.gradle.kts
interface GreetingPluginExtension {
val message: Property<String>
}
apply<GreetingPlugin>()
build.gradle
interface GreetingPluginExtension {
Property<String> getMessage()
}
$ gradle -q hello
Hi from Gradle
In this example, GreetingPluginExtension is an object with a property called message. The extension
object is added to the project with the name greeting. This object becomes available as a project
property with the same name as the extension object. the<GreetingPluginExtension>() is equivalent
to project.extensions.getByType(GreetingPluginExtension::class.java).
Often, you have several related properties you need to specify on a single plugin. Gradle adds a
configuration block for each extension object, so you can group settings:
build.gradle.kts
interface GreetingPluginExtension {
val message: Property<String>
val greeter: Property<String>
}
class GreetingPlugin : Plugin<Project> {
override fun apply(project: Project) {
val extension = project.extensions.create<GreetingPluginExtension>("greeting")
project.tasks.register("hello") {
doLast {
println("${extension.message.get()} from ${extension.greeter.get()}")
}
}
}
}
apply<GreetingPlugin>()
build.gradle
interface GreetingPluginExtension {
Property<String> getMessage()
Property<String> getGreeter()
}
$ gradle -q hello
Hi from Gradle
In this example, several settings can be grouped within the greeting closure. The name of the
closure block in the build script (greeting) must match the extension object name. Then, when the
closure is executed, the fields on the extension object will be mapped to the variables within the
closure based on the standard Groovy closure delegate feature.
Using an extension object extends the Gradle DSL to add a project property and DSL block for the
plugin. Because an extension object is a regular object, you can provide your own DSL nested inside
the plugin block by adding properties and methods to the extension object.
build.gradle.kts
plugins {
id("[Link]-env")
}
environments {
create("dev") {
url = "[Link]"
}
create("staging") {
url = "[Link]"
}
create("production") {
url = "[Link]"
}
}
build.gradle
plugins {
id '[Link]-env'
}
environments {
dev {
url = '[Link]'
}
staging {
url = '[Link]'
}
production {
url = '[Link]'
}
}
The plugin’s DSL exposes a container for defining a set of environments. Each environment the
user configures has an arbitrary but declarative name and is represented with its own DSL
configuration block. The example above instantiates a development, staging, and production
environment, each with its respective URL.
Each environment must have a data representation in code to capture the values. The name of an
environment is immutable and can be passed in as a constructor parameter. Currently, the only
other parameter the data object stores is a URL.
ServerEnvironment.java
@javax.inject.Inject
public ServerEnvironment(String name) {
this.name = name;
}
It’s common for a plugin to post-process the captured values within the plugin implementation, e.g.,
to configure tasks:
ServerEnvironmentPlugin.java
NamedDomainObjectContainer<ServerEnvironment> serverEnvironmentContainer =
    objects.domainObjectContainer(ServerEnvironment.class, name ->
        objects.newInstance(ServerEnvironment.class, name));
project.getExtensions().add("environments", serverEnvironmentContainer);

serverEnvironmentContainer.all(serverEnvironment -> {
    String env = serverEnvironment.getName();
    String capitalizedServerEnv = env.substring(0, 1).toUpperCase() + env.substring(1);
    String taskName = "deployTo" + capitalizedServerEnv;
    project.getTasks().register(taskName, Deploy.class, task ->
        task.getUrl().set(serverEnvironment.getUrl()));
});
}
}
In the example above, a deployment task is created dynamically for every user-configured
environment.
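The task-name derivation used above, "deployTo" plus the environment name with its first letter capitalized, can be isolated as a small pure function. deployTaskName is a hypothetical helper mirroring that logic, not part of the plugin code:

```kotlin
// Sketch of the task-name derivation used in the plugin above:
// "deployTo" + the environment name with its first letter capitalized.
fun deployTaskName(environmentName: String): String {
    val capitalized = environmentName.substring(0, 1).uppercase() + environmentName.substring(1)
    return "deployTo" + capitalized
}

fun main() {
    println(deployTaskName("dev"))        // deployToDev
    println(deployTaskName("production")) // deployToProduction
}
```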
You can find out more about implementing project extensions in Developing Custom Gradle Types.
For example, let’s consider the following extension provided by a plugin. In its current form, it
offers a "flat" list of properties for configuring the creation of a website:
build.gradle.kts
plugins {
id("[Link]")
}
site {
outputDir = layout.buildDirectory.dir("mysite")
websiteUrl = "[Link]
vcsUrl = "[Link]
}
build.gradle
plugins {
id 'org.myorg.site'
}
site {
outputDir = layout.buildDirectory.dir("mysite")
websiteUrl = '[Link]'
vcsUrl = '[Link]'
}
As the number of exposed properties grows, you should introduce a nested, more expressive
structure.
The following code snippet adds a new configuration block named siteInfo as part of the extension.
This provides a stronger indication of what those properties mean:
build.gradle.kts
plugins {
id("[Link]")
}
site {
outputDir = layout.buildDirectory.dir("mysite")
siteInfo {
websiteUrl = "[Link]"
vcsUrl = "[Link]"
}
}
build.gradle
plugins {
id 'org.myorg.site'
}
site {
outputDir = layout.buildDirectory.dir("mysite")
siteInfo {
websiteUrl = '[Link]'
vcsUrl = '[Link]'
}
}
Implementing the backing objects for such an extension is simple. First, introduce a new data
object for managing the properties websiteUrl and vcsUrl:
SiteInfo.java
In the extension, create an instance of the siteInfo class and a method to delegate the captured
values to the data instance.
SiteExtension.java
@Nested
abstract public SiteInfo getSiteInfo();
Plugins commonly use an extension to capture user input from the build script and map it to a
custom task’s input/output properties. The build script author interacts with the extension’s DSL,
while the plugin implementation handles the underlying logic:
app/build.gradle.kts
@TaskAction
fun executeTask() {
println("Input parameter: $inputParameter")
}
}
app/build.gradle
@TaskAction
def executeTask() {
println("Input parameter: $inputParameter")
}
}
In this example, the MyExtension class defines an inputParameter property that can be set in the build
script. The MyPlugin class configures this extension and uses its inputParameter value to configure
the MyCustomTask task. The MyCustomTask task then uses this input parameter in its logic.
You can learn more about types you can use in task implementations and extensions in Lazy
Configuration.
Plugins should provide sensible defaults and standards in a specific context, reducing the number
of decisions users need to make. Using the project object, you can define default values. These are
known as conventions.
Conventions are properties that are initialized with default values and can be overridden by the
user in their build script. For example:
build.gradle.kts
interface GreetingPluginExtension {
val message: Property<String>
}
apply<GreetingPlugin>()
build.gradle
interface GreetingPluginExtension {
Property<String> getMessage()
}
$ gradle -q hello
Hello from GreetingPlugin
In this example, GreetingPluginExtension is a class that represents the convention. The message
property is the convention property with a default value of 'Hello from GreetingPlugin'.
build.gradle.kts
greeting {
message = "Custom message"
}
build.gradle
greeting {
message = 'Custom message'
}
$ gradle -q hello
Custom message
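The convention mechanism, a default that applies only until the user sets an explicit value, can be modeled with a toy class. ConventionProperty below is a deliberately simplified stand-in for Gradle's Property with a convention, not the real API:

```kotlin
// Toy model of a convention property (not Gradle's Property API):
// the convention supplies the value until an explicit value is set.
class ConventionProperty<T : Any>(private val convention: T) {
    private var explicit: T? = null
    fun set(value: T) { explicit = value }
    fun get(): T = explicit ?: convention
}

fun main() {
    val message = ConventionProperty("Hello from GreetingPlugin")
    println(message.get()) // convention applies: Hello from GreetingPlugin
    message.set("Custom message")
    println(message.get()) // explicit value wins: Custom message
}
```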
Separating capabilities from conventions in plugins allows users to choose which tasks and
conventions to apply.
For example, the Java Base plugin provides un-opinionated (i.e., generic) functionality like
SourceSets, while the Java plugin adds tasks and conventions familiar to Java developers like
classes, jar or javadoc.
When designing your own plugins, consider developing two plugins — one for capabilities and
another for conventions — to offer flexibility to users.
In the example below, MyPlugin contains conventions, and MyBasePlugin defines capabilities. Then,
MyPlugin applies MyBasePlugin; this is called plugin composition. To apply a plugin from another one:
MyPlugin.java
import [Link];
import [Link];
import [Link];
import [Link];
// define conventions
}
}
Reacting to plugins
For example, a plugin could assume that it is applied to a Java-based project and automatically
reconfigure the standard source directory:
[Link]
The drawback to this approach is that it automatically forces the project to apply the Java plugin,
imposing a strong opinion on it (i.e., reducing flexibility and generality). In practice, the project
applying the plugin might not even deal with Java code.
Instead of automatically applying the Java plugin, the plugin could react to the fact that the
consuming project applies the Java plugin. Only if that is the case, then a certain configuration is
applied:
[Link]
Reacting to plugins is preferred over applying plugins if there is no good reason to assume that the
consuming project has the expected setup.
[Link]
Plugins can access the status of build features in the build. The Build Features API allows checking
whether the user requested a particular Gradle feature and if it is active in the current build. An
example of a build feature is the configuration cache.
@Inject
protected abstract BuildFeatures getBuildFeatures(); ①
@Override
public void apply(Project p) {
BuildFeatures buildFeatures = getBuildFeatures();
① The BuildFeatures service can be injected into plugins, tasks, and other managed types.
The getRequested() status of a build feature determines if the user requested to enable
or disable the feature.
• true — the user opted in to using the feature
• false — the user opted out of using the feature
• undefined — the user neither opted in nor opted out from using the feature
The getActive() status of a build feature is always defined. It represents the effective
state of the feature in the build.
• true — the feature may affect the build behavior in a way specific to the feature
• false — the feature will not affect the build behavior
Note that the active status does not depend on the requested status. Even if the user requests a
feature, it may still not be active due to other build options being used in the build. Gradle can also
activate a feature by default, even if the user did not specify a preference.
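The requested/active distinction can be illustrated with a toy model. This is not the actual BuildFeatures API, just a sketch of the state space: requested may be absent (null), while the effective active state is always defined:

```kotlin
// Toy illustration of the requested vs. active distinction
// (not the real BuildFeatures API).
data class BuildFeatureStatus(val requested: Boolean?, val active: Boolean)

fun describe(status: BuildFeatureStatus): String {
    val requested = when (status.requested) {
        true -> "opted in"
        false -> "opted out"
        null -> "undefined"
    }
    return "requested=$requested, active=${status.active}"
}

fun main() {
    // A feature can be inactive even though the user requested it,
    // e.g. because another build option used in the build disables it.
    println(describe(BuildFeatureStatus(requested = true, active = false)))
}
```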
Using a custom dependencies block
A plugin can provide dependency declarations in custom blocks that allow users to declare
dependencies in a type-safe and context-aware way.
For instance, instead of users needing to know and use the underlying Configuration name to add
dependencies, a custom dependencies block lets the plugin pick a meaningful name that can be used
consistently.
To add a custom dependencies block, you need to create a new type that will represent the set of
dependency scopes available to users. That new type needs to be accessible from a part of your
plugin (from a domain object or extension). Finally, the dependency scopes need to be wired back
to underlying Configuration objects that will be used during dependency resolution.
See JvmComponentDependencies and JvmTestSuite for an example of how this is used in a Gradle
core plugin.
ExampleDependencies.java
/**
* Custom dependencies block for the example plugin.
*/
public interface ExampleDependencies extends Dependencies {
For each dependency scope your plugin wants to support, add a getter method that returns a
DependencyCollector.
ExampleDependencies.java
/**
* Dependency scope called "implementation"
*/
DependencyCollector getImplementation();
3. Add accessors for custom dependencies block
To make the custom dependencies block configurable, the plugin needs to add a getDependencies
method that returns the new type from above and a configurable block method named
dependencies.
By convention, the accessors for your custom dependencies block should be called
getDependencies()/dependencies(Action). This method could be named something else, but users
would need to know that a different block can behave like a dependencies block.
ExampleExtension.java
/**
* Custom dependencies for this extension.
*/
@Nested
ExampleDependencies getDependencies();
/**
* Configurable block
*/
default void dependencies(Action<? super ExampleDependencies> action) {
[Link](getDependencies());
}
Finally, the plugin needs to wire the custom dependencies block to some underlying Configuration
objects. If this is not done, none of the dependencies declared in the custom block will be available
to dependency resolution.
ExamplePlugin.java
project.getConfigurations().dependencyScope("exampleImplementation", conf -> {
    conf.fromDependencyCollector(example.getDependencies().getImplementation());
});
NOTE: In this example, the name users will use to add dependencies is "implementation",
but the underlying Configuration is named exampleImplementation.
build.gradle.kts
example {
dependencies {
implementation("junit:junit:4.13")
}
}
build.gradle
example {
dependencies {
implementation("junit:junit:4.13")
}
}
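The three ingredients above, a type holding dependency scopes, collector getters, and wiring to a named underlying bucket, can be modeled as a toy Kotlin sketch. None of these classes are Gradle's real types; they only demonstrate the shape of the pattern:

```kotlin
// Toy model of a custom dependencies block (not Gradle's API):
// scopes collect declarations, which a plugin later wires to a
// named underlying bucket such as "exampleImplementation".
class DependencyCollector {
    val declared = mutableListOf<String>()
    fun add(notation: String) { declared += notation }
}

class ExampleDependencies {
    // Dependency scope called "implementation"
    val implementation = DependencyCollector()
}

class ExampleExtension {
    val dependencies = ExampleDependencies()
    // Same-name configurable block, mirroring the
    // getDependencies()/dependencies(Action) convention.
    fun dependencies(configure: ExampleDependencies.() -> Unit) {
        this.dependencies.configure()
    }
}

fun main() {
    val example = ExampleExtension()
    // What a user would write in the build script...
    example.dependencies {
        implementation.add("junit:junit:4.13")
    }
    // ...and what the plugin would wire to the underlying bucket.
    println(example.dependencies.implementation.declared) // [junit:junit:4.13]
}
```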
Differences between the custom dependencies and the top-level dependencies blocks
Each depe