python - SSLError from concurrent S3 file download -


So I can't for the life of me figure out how to fix this bug....

Basically, I have a program that downloads files from an S3 bucket, parses through each file's contents, and adds records to a database. Doing this sequentially was slow, so I decided to try downloading and processing the files in parallel using the multiprocessing module, since the whole process doesn't need any sort of shared memory or locking. I therefore wrote code like the following:

from multiprocessing import Pool
import boto

def dowork(s3key):

    # connect to the s3 bucket and look up the s3 file key
    conn = boto.connect_s3('aws_key_id', 'aws_secret_access_key')
    bucket = conn.get_bucket('my_bucket')
    key = bucket.get_key(s3key)

    ########## error thrown by this call #####################
    key.get_contents_to_filename('my_file')

    # insert into database here....

if __name__ == '__main__':
    pool = Pool()

    # for every s3 file in the bucket, download/process the file in a parallel worker
    for s3keyname in s3keyname_list:
        pool.apply_async(dowork, (s3keyname,))

    # wait for all workers to finish before the main process exits
    pool.close()
    pool.join()
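One thing worth noting while debugging this: pool.apply_async() silently swallows any exception raised inside a worker unless you keep the returned AsyncResult and call .get() on it, so the real traceback is easy to miss. A minimal sketch of how the main block could collect results instead (assuming the same dowork and s3keyname_list as above):

from multiprocessing import Pool

if __name__ == '__main__':
    pool = Pool()

    # keep the AsyncResult handles so worker exceptions are not lost
    results = [pool.apply_async(dowork, (name,)) for name in s3keyname_list]
    pool.close()
    pool.join()

    for name, result in zip(s3keyname_list, results):
        try:
            # get() re-raises any exception that was thrown inside the worker
            result.get()
        except Exception as exc:
            print('download failed for %s: %r' % (name, exc))

With this in place, the full SSL traceback from the failing subprocess shows up in the main process instead of disappearing.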

For some reason, the call to 'get_contents_to_filename()' sometimes raises an SSLError! What's weirder is that if I download 7 files at once, only 1 file fails, because the error is thrown in just 1 of the subprocesses. The rest of the files download correctly, and then that 1 file throws the error. The error message says I'm using the wrong version of SSL, but if that were really the problem, shouldn't the attempts to download the other files throw the same error as well? I've tried googling around, and the best I can find is that it's maybe some sort of buffer overflow issue? Any help would be appreciated!
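Since the error only hits one worker at a time rather than all of them, it looks more like a transient connection failure than a genuine SSL version mismatch, so one mitigation I'm considering is wrapping the download in a small retry helper. This is just a sketch: download_with_retry is a name I made up, and the credentials/bucket are the same placeholders as in the code above:

import ssl
import time

import boto

def download_with_retry(s3key, local_path, attempts=3):
    # hypothetical helper: retry the download a few times, backing off
    # between attempts, since the SSLError only shows up occasionally
    for attempt in range(attempts):
        try:
            conn = boto.connect_s3('aws_key_id', 'aws_secret_access_key')
            bucket = conn.get_bucket('my_bucket')
            key = bucket.get_key(s3key)
            key.get_contents_to_filename(local_path)
            return
        except ssl.SSLError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep(2 ** attempt)  # wait 1s, then 2s, ...

One other thing I noticed in my original code: every worker writes to the same local path 'my_file', so passing a distinct local_path per key (e.g. derived from the key name) would also stop the workers from clobbering each other's output, though that wouldn't explain the SSLError itself.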

