Here’s another interesting tidbit.
If you have scripts that connect to S3, and you run out of buckets (Amazon only allows 100 buckets per account), you might get a nasty surprise.
See, you may have been using create_bucket(name-of-bucket) to get your bucket object. It’s undocumented as far as I can tell, but apparently if you call create_bucket() on a bucket that already exists, it returns the existing Bucket object. That’s handy! Except it breaks once you’re unable to create more buckets, even though you aren’t really trying to create one. Sigh. So I refactored as follows:
# old and busted:
bucket = s3_conn.create_bucket(bucket_name)

# new hotness:
# iterate over Bucket objects and return the one matching string:
def find_s3_bucket(s3_conn, string):
    for i in s3_conn.get_all_buckets():
        if string in i.name:
            return i

Used as:

bucket = find_s3_bucket(s3_conn, bucket_name)
There is likely a more elegant way, but hey this works.
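One thing to watch with the substring match (`in`) above: if one bucket name is a prefix of another, you can silently get the wrong bucket back. Here is a minimal, self-contained sketch of an exact-match variant — the `Bucket` namedtuple just stands in for boto’s Bucket objects, and `find_s3_bucket_exact` is a hypothetical name, not part of boto:

```python
from collections import namedtuple

# Stand-in for boto's Bucket objects (only the .name attribute matters here).
Bucket = namedtuple("Bucket", ["name"])

def find_s3_bucket_exact(buckets, name):
    """Return the bucket whose name matches exactly, or None if absent.

    Substring matching can return the wrong bucket when names overlap,
    e.g. looking for 'logs' would also match 'logs-archive'.
    """
    for b in buckets:
        if b.name == name:
            return b
    return None

buckets = [Bucket("logs-archive"), Bucket("logs")]

# A substring match hits 'logs-archive' first when looking for 'logs':
substring_hit = next(b for b in buckets if "logs" in b.name)
print(substring_hit.name)  # logs-archive -- the wrong one

# The exact match returns the bucket actually named 'logs':
print(find_s3_bucket_exact(buckets, "logs").name)  # logs
```

If memory serves, boto also has s3_conn.lookup(bucket_name), which returns the Bucket or None without attempting a create — that may be the “more elegant way,” though I haven’t checked it against the bucket limit behavior.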