Closed
Labels
api: storage — Issues related to the Cloud Storage API.
type: bug — Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
Description
I'm using upload_from_file with an encryption_key to store files in a Google Cloud Storage bucket. This works very well.
Now I'm trying to download these files again using download_to_file with the same encryption_key. This crashes with the following stack trace:
```
Traceback (most recent call last):
  File "restore.py", line 40, in <module>
    main()
  File "restore.py", line 37, in main
    download_from_storage(args.file, args.bucket, args.credentials, key)
  File "restore.py", line 17, in download_from_storage
    blob.download_to_file(my_file, encryption_key=encryption_key)
  File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/storage/blob.py", line 354, in download_to_file
    download.initialize_download(request, client._connection.http)
  File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 390, in initialize_download
    self.stream_file(use_chunks=True)
  File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 614, in stream_file
    response = self._process_response(response)
  File "/Users/daniel/Projects/python/gcs-backup/env/lib/python2.7/site-packages/google/cloud/streaming/transfer.py", line 528, in _process_response
    raise TransferRetryError(response.content)
google.cloud.streaming.exceptions.TransferRetryError: The target object is encrypted by a customer-supplied encryption key.
```
I'm using google-cloud-storage==0.20.0.
This only occurs when downloading files larger than 1 MB; any file under 1 MB is downloaded and decrypted as expected. Files larger than 1 MB appear to be downloaded in chunks, and only the first 1 MB chunk is saved to disk.
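To illustrate why exactly 1 MB lands on disk: with a 1 MB chunk size (an assumption for this sketch, not a value read out of the library), a 2 MB object takes two ranged requests, and per this report only the first one succeeds before the error:

```python
def chunk_ranges(total_size, chunk_size=1024 * 1024):
    """Byte ranges a chunked download would request, one per chunk."""
    ranges = []
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

# The 2 MB demo file below needs two ranged requests; only the
# bytes from the first range ever reach the destination file.
print(chunk_ranges(2097152))
```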
Steps to reproduce
- Upload a file larger than 1 MB using a custom encryption key
- Try to download this file using the same encryption key
- Find a 1 MB chunk of your file at the destination
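Until this is fixed, the truncation can at least be detected by comparing the local file's size to the object's reported size (e.g. blob.size after blob.reload()). The helper below is illustrative, not part of the library:

```python
import os


def verify_download(expected_size, local_path):
    """Raise if a (possibly chunked) download left a truncated file.

    expected_size would typically come from blob.size; this helper
    itself is a made-up sketch, not google-cloud-storage API.
    """
    local_size = os.path.getsize(local_path)
    if local_size != expected_size:
        raise IOError('truncated download: got %d bytes, expected %d'
                      % (local_size, expected_size))
    return local_size
```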
Demo code
```python
#!/usr/bin/env python
from tempfile import NamedTemporaryFile
import os

from google.cloud import storage
from google.cloud.storage import Blob

CREDENTIALS = './credentials.json'
BUCKET_NAME = '< SET BUCKET >'
ENCRYPTION_KEY = 'v3CtoFyEJmj52RGsSqze7C8fD6mzgpnd'
FILE_NAME = 'enc-test-file'

client = storage.Client.from_service_account_json(CREDENTIALS)
bucket = client.get_bucket(BUCKET_NAME)


def upload_to_storage(filename):
    blob = Blob(FILE_NAME, bucket)
    with open(filename, 'rb') as my_file:
        blob.upload_from_file(my_file, encryption_key=ENCRYPTION_KEY)


def download_from_storage(filename):
    blob = Blob(FILE_NAME, bucket)
    with open(filename, 'wb') as my_file:
        blob.download_to_file(my_file, encryption_key=ENCRYPTION_KEY)


def main():
    size = 2097152  # 2 MB file
    f = NamedTemporaryFile(delete=False)
    f.write("\0" * size)
    f.close()

    print('Uploading {} to Google Cloud Storage...'.format(f.name))
    upload_to_storage(f.name)

    print('Downloading {} from Google Cloud Storage...'.format(f.name))
    download_from_storage(f.name)

    os.remove(f.name)


if __name__ == '__main__':
    main()
```
This might be because the encryption headers aren't added to the request for the next chunk.
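For reference, a customer-supplied key travels in three request headers, and every ranged request of a chunked download has to carry them. A sketch of building those headers from the raw 32-byte key — the header names are the documented CSEK headers, but the helper itself is made up for illustration:

```python
import base64
import hashlib


def csek_headers(raw_key):
    """Headers for a customer-supplied encryption key (CSEK).

    These must accompany every request that reads the encrypted
    object -- including each follow-up chunk request, which is
    presumably where they get dropped here.
    """
    if len(raw_key) != 32:
        raise ValueError('CSEK must be exactly 32 bytes')
    return {
        'X-Goog-Encryption-Algorithm': 'AES256',
        'X-Goog-Encryption-Key': base64.b64encode(raw_key).decode('ascii'),
        'X-Goog-Encryption-Key-Sha256': base64.b64encode(
            hashlib.sha256(raw_key).digest()).decode('ascii'),
    }
```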