Removed size parameter from blob.upload_from_file() to prevent memory error

When the size is not passed, the gcloud module uses os.fstat() to determine
the size of the file to be uploaded, and switches to resumable uploads.
This should prevent memory errors when uploading large files.
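The fallback described above can be sketched as follows. This is a minimal illustration of the size-detection idea, not the library's actual code; detect_size is a hypothetical helper name, and the fallback-to-None behaviour (triggering the chunked/resumable path) is an assumption based on the commit message.

```python
import io
import os
import tempfile

def detect_size(stream):
    """Sketch of client-side size detection: ask the OS for the file's
    size via os.fstat() on the stream's file descriptor. Streams without
    a real descriptor (e.g. io.BytesIO) raise, and we return None, which
    would correspond to taking the resumable/chunked upload path."""
    try:
        return os.fstat(stream.fileno()).st_size
    except (AttributeError, OSError):
        return None

# A real temporary file has a file descriptor, so fstat works:
with tempfile.TemporaryFile() as f:
    f.write(b'x' * 1024)
    f.flush()
    print(detect_size(f))  # 1024

# An in-memory stream has no descriptor, so the size is unknown:
print(detect_size(io.BytesIO(b'abc')))  # None
```

Because the Flask request stream passed to upload_from_file() is not guaranteed to be a real file on disk, dropping the explicit size lets the library pick the upload strategy that fits the stream it is given.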
Sybren A. Stüvel 2016-05-13 17:40:37 +02:00
parent 193d7cef5e
commit 41a278c4f0


@@ -550,8 +550,7 @@ def stream_to_gcs(project_id):
         gcs = GoogleCloudStorageBucket(project_id)
         blob = gcs.bucket.blob('_/' + internal_fname, chunk_size=256 * 1024 * 2)
         blob.upload_from_file(stream_for_gcs,
-                              content_type=uploaded_file.mimetype,
-                              size=uploaded_file.content_length)
+                              content_type=uploaded_file.mimetype)
     except Exception:
         log.exception('Error uploading file to Google Cloud Storage (GCS),'
                       ' aborting handling of uploaded file (id=%s).', file_id)