Conversation

@JuArce JuArce commented Aug 4, 2025

feat: upload batches to multiple storage services

Description

The Batcher now uploads batches to two data services. Operators, the explorer, and aggregation mode can fetch the batch from any of them.

The URLs (batch_data_pointer) are uploaded to the contract as a comma-separated list, for example:

http://localhost:4566/aligned.storage/17de1de6011e441e464b18310a5de5e12c83b1e49448571a5b7990ec47826a51.json,http://localhost:4567/aligned.storage/17de1de6011e441e464b18310a5de5e12c83b1e49448571a5b7990ec47826a51.json

The Batcher exposes a metric with the current number of available data services. Under normal conditions it is 2; the system can work in degraded mode with 1 data service. This metric is shown in a Grafana dashboard.

Important

All operators, the explorer, and aggregation mode must be upgraded before upgrading the Batcher.

How to Test

Verification Layer

  1. Run Ethereum Package
make ethereum_package_start
  2. Run batcher
make batcher_start_ethereum_package

This will start two local buckets.

  3. Run aggregator
make aggregator_start_ethereum_package ENVIRONMENT=devnet
  4. Run operator
make build_all_ffi
make operator_full_registration CONFIG_FILE=./config-files/config-operator-1-ethereum-package.yaml ENVIRONMENT=devnet
make operator_start CONFIG_FILE=./config-files/config-operator-1-ethereum-package.yaml ENVIRONMENT=devnet
  5. Run explorer
make explorer_clean_db
make explorer_start
  6. Run metrics
make metrics_start
  7. Send tasks
make batcher_send_proof_with_random_address

You should see the following logs in the Batcher

[2025-08-05T13:14:38Z INFO  aligned_batcher] Successfully uploaded batch to primary S3: http://localhost:4566/aligned.storage/689e0265af9107219660c25edeeb687ca05f07edeb0c6c3485104923011312e7.json
...
[2025-08-05T13:14:38Z INFO  aligned_batcher] Successfully uploaded batch to secondary S3: http://localhost:4567/aligned.storage/689e0265af9107219660c25edeeb687ca05f07edeb0c6c3485104923011312e7.json

You should see the following logs in the Operator

2025-08-04T18:08:24.075-0300	INFO	pkg/s3.go:25	Getting batch from data service with 2 URLs: [http://localhost:4566/aligned.storage/17de1de6011e441e464b18310a5de5e12c83b1e49448571a5b7990ec47826a51.json http://localhost:4567/aligned.storage/17de1de6011e441e464b18310a5de5e12c83b1e49448571a5b7990ec47826a51.json]
2025-08-04T18:08:24.075-0300	INFO	pkg/s3.go:31	Trying URL 1 of 2: http://localhost:4566/aligned.storage/17de1de6011e441e464b18310a5de5e12c83b1e49448571a5b7990ec47826a51.json

In the Grafana dashboard, you will see the number of active data services.

(Grafana dashboard screenshot showing the active data services metric)

Aggregation Mode

After sending tasks to the verification layer, you can run the aggregation mode

  1. Run aggregation mode
make proof_aggregator_start_ethereum_package AGGREGATOR=sp1

You should see the following logs in the Proof Aggregator

2025-08-04T20:38:22.491226Z  INFO proof_aggregator::backend::s3: Getting batch from data service with 2 URLs: ["http://localhost:4566/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json", "http://localhost:4567/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json"]

You can repeat the verification-layer and aggregation-mode steps with different bucket configurations; for example, you can shut down one of the buckets.

With one bucket down, you will see the following logs in Aggregation Mode:

2025-08-04T20:38:22.491226Z  INFO proof_aggregator::backend::s3: Getting batch from data service with 2 URLs: ["http://localhost:4566/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json", "http://localhost:4567/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json"]
2025-08-04T20:38:22.491245Z  INFO proof_aggregator::backend::s3: Fetching batch from S3 URL: http://localhost:4566/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json
2025-08-04T20:38:22.491942Z  WARN proof_aggregator::backend::s3: Failed to fetch batch from URL http://localhost:4566/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json: FetchingS3Batch("error sending request for url (http://localhost:4566/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json)")
2025-08-04T20:38:22.491954Z  INFO proof_aggregator::backend::s3: Fetching batch from S3 URL: http://localhost:4567/aligned.storage/6ae77c221a1d106b0831594b903d537522281579b7a11f1df9381f7a782ee206.json
2025-08-04T20:38:22.538746Z  INFO proof_aggregator::backend::fetcher: Data downloaded from S3, number of proofs 1

Type of change

Please delete options that are not relevant.

  • New feature

Checklist

  • “Hotfix” to testnet, everything else to staging
  • Linked to Github Issue
  • This change depends on code or research by an external entity
    • Acknowledgements were updated to give credit
  • Unit tests added
  • This change requires new documentation.
    • Documentation has been added/updated.
  • This change is an Optimization
    • Benchmarks added/run
  • Has a known issue
  • If your PR changes the Operator compatibility (Ex: Upgrade prover versions)
    • This PR adds compatibility for operators on both versions and does not change crates/docs/examples
    • This PR updates the batcher and docs/examples to the newer version. This requires that operators are already updated to be compatible

@JuArce JuArce self-assigned this Aug 4, 2025
@JuArce JuArce marked this pull request as ready for review August 5, 2025 14:11
@MauroToscano MauroToscano enabled auto-merge August 6, 2025 19:01
@MauroToscano MauroToscano added this pull request to the merge queue Aug 6, 2025
Merged via the queue into staging with commit e345a37 Aug 6, 2025
5 checks passed
@MauroToscano MauroToscano deleted the 2041-feat-upload-batches-to-multiple-storage-services branch August 6, 2025 19:24
@JuArce JuArce linked an issue Aug 11, 2025 that may be closed by this pull request