ReadTimeout errors #1184
Comments
@uatach, have you resolved this issue? I keep receiving this exception.
@shamyjun22 I've added some retry logic and it was enough; I haven't tried the new package version yet.
Hi @uatach, I encounter this exception when trying to upload a file to Google Cloud Storage: Code: ConnectionError. Does it mean that the file was uploaded successfully, but I just don't receive a response from the server? Also, if it's not too much trouble, could you provide a snippet of the code you used to resolve it? Thanks in advance.
@shamyjun22 Are you uploading many files, e.g. in parallel or very quickly? I've noticed that several GCP Python libraries seem to have trouble with simultaneous connections, but I still haven't been able to pin the problem down. I've been adding some sleep calls just to avoid these errors.
I don't think so
Sure, but as I said it's just simple retry logic:

```python
from google.cloud.storage import Client
from requests.exceptions import ConnectionError, ReadTimeout

blob = Client().bucket('bucket_name').blob('blob_name')
while True:
    try:
        with blob.open('w') as fp:
            fp.write('data')
    except (ConnectionError, ReadTimeout):  # retry only transient errors
        continue
    else:
        break
```
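A bare `while True` loop like the one above can hammer the server and never terminate on a persistent failure. Here is a minimal, generic retry helper with capped exponential backoff and jitter that could wrap the blob write instead; the retry counts, delays, and exception types are assumptions to tune for your workload:

```python
import random
import time

def retry_with_backoff(func, retries=5, base_delay=1.0, max_delay=30.0,
                       exceptions=(ConnectionError, TimeoutError)):
    """Call func(), retrying transient errors with capped exponential backoff.

    Re-raises the last exception once the retry budget is exhausted, so
    persistent failures still surface instead of looping forever.
    """
    for attempt in range(retries):
        try:
            return func()
        except exceptions:
            if attempt == retries - 1:
                raise
            delay = min(max_delay, base_delay * 2 ** attempt)
            # Add jitter so concurrent clients don't retry in lockstep.
            time.sleep(delay + random.uniform(0, delay / 2))
```

Usage would look like `retry_with_backoff(lambda: write_blob(), exceptions=(ConnectionError, ReadTimeout))`, where `write_blob` is your own upload function.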
@uatach, thank you for your response. By the way, I am not uploading many files in parallel; I am uploading one file every second because I need real-time data. This is how I structure my code in Python; maybe you can suggest improvements to the way it is structured.
Thanks.
This is not directly on topic, but can't we expect GCS to handle blob requests within one minute in the general case? We're seeing these errors much more frequently (for small blobs) than I would expect (I know there are no guarantees in life), so I'm wondering whether it's likely something on our end, or whether GCS is just unreliable?
I keep receiving

```
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Read timed out. (read timeout=60)
```

when trying to write files to storage. I've tried adding the `timeout` value when opening the file:

```python
Client().bucket('my_bucket').blob('my_blob').open(mode='wb', timeout=3600)
```
Investigating the traceback, I've found that the method `transmit_next_chunk` can receive a `timeout` param that is not being set.
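The fix being described, forwarding a caller-supplied `timeout` down to the per-chunk transmit call instead of relying on the 60-second default, can be sketched in plain Python. All names below are illustrative stand-ins, not the real google-cloud-storage internals:

```python
DEFAULT_TIMEOUT = 60  # seconds; mirrors the default seen in the error message

def transmit_next_chunk(chunk, timeout=DEFAULT_TIMEOUT):
    """Stand-in for the per-chunk transmit; records which timeout it received."""
    return {"chunk": chunk, "timeout": timeout}

def upload_chunks(chunks, timeout=DEFAULT_TIMEOUT):
    """Forward the caller's timeout to every chunk transmission.

    The bug described in the issue is the equivalent of calling
    transmit_next_chunk(chunk) here, silently dropping the user's timeout.
    """
    return [transmit_next_chunk(chunk, timeout=timeout) for chunk in chunks]
```

With this plumbing, `upload_chunks(data, timeout=3600)` would make every chunk use the longer deadline rather than the 60-second default.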