Commit afc5066

fix!: Remove temp_bucket from VirtualClusterConfig, as its value was not used

Authored by Google APIs, committed by copybara-github
Committer: @Harwayne
PiperOrigin-RevId: 440224385
1 parent cbd3367

1 file changed: 0 additions & 15 deletions

File tree

google/cloud/dataproc/v1/clusters.proto

@@ -303,21 +303,6 @@ message VirtualClusterConfig {
   // a Cloud Storage bucket.**
   string staging_bucket = 1 [(google.api.field_behavior) = OPTIONAL];
 
-  // Optional. A Cloud Storage bucket used to store ephemeral cluster and jobs data,
-  // such as Spark and MapReduce history files.
-  // If you do not specify a temp bucket,
-  // Dataproc will determine a Cloud Storage location (US,
-  // ASIA, or EU) for your cluster's temp bucket according to the
-  // Compute Engine zone where your cluster is deployed, and then create
-  // and manage this project-level, per-location bucket. The default bucket has
-  // a TTL of 90 days, but you can use any TTL (or none) if you specify a
-  // bucket (see
-  // [Dataproc staging and temp
-  // buckets](https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/staging-bucket)).
-  // **This field requires a Cloud Storage bucket name, not a `gs://...` URI to
-  // a Cloud Storage bucket.**
-  string temp_bucket = 2 [(google.api.field_behavior) = OPTIONAL];
-
   oneof infrastructure_config {
     // Required. The configuration for running the Dataproc cluster on Kubernetes.
     KubernetesClusterConfig kubernetes_cluster_config = 6 [(google.api.field_behavior) = REQUIRED];
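For orientation, the resulting message shape after this breaking change can be sketched as follows. This is a reconstruction from the diff hunk above, not the full proto: fields and comments outside the hunk are elided, and the closing braces are assumed.

```proto
message VirtualClusterConfig {
  // Optional. A Cloud Storage bucket name (not a `gs://...` URI) used as the
  // cluster's staging bucket.
  string staging_bucket = 1 [(google.api.field_behavior) = OPTIONAL];

  // temp_bucket (formerly field number 2) was removed by this commit,
  // as its value was not used.

  oneof infrastructure_config {
    // Required. The configuration for running the Dataproc cluster on Kubernetes.
    KubernetesClusterConfig kubernetes_cluster_config = 6 [(google.api.field_behavior) = REQUIRED];
  }
}
```

Note that because `temp_bucket` carried field number 2, common proto practice would be to avoid reusing that number for a future field with different semantics (for example via a `reserved 2;` statement); the hunk shown here does not indicate whether the commit does so.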
