How do I upload a file to a directory in an S3 bucket using boto?
I want to copy a file into an S3 bucket using Python. How do I upload a file to a directory in an S3 bucket with boto?
For example: I have a bucket named "test", and inside the bucket I have two folders named "dump" and "input". Now I want to copy a file from a local directory to the S3 "dump" folder using Python... can anyone help me?
Try this...
import boto
import boto.s3
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''

bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
conn = boto.connect_s3(AWS_ACCESS_KEY_ID,
                       AWS_SECRET_ACCESS_KEY)

bucket = conn.create_bucket(bucket_name,
                            location=boto.s3.connection.Location.DEFAULT)

testfile = "replace this with an actual filename"
print 'Uploading %s to Amazon S3 bucket %s' % (testfile, bucket_name)

def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

k = Key(bucket)
k.key = 'my test file'
k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
[UPDATE] I'm not a pythonist, so thanks for the heads-up about the import statements. Also, I wouldn't recommend placing credentials inside your own source code. If you are running this inside AWS, use IAM credentials with Instance Profiles (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), and to keep the same behaviour in your Dev/Test environment, use something like Hologram from AdRoll (https://github.com/AdRoll/hologram).
I used this and it is very simple to implement:
import tinys3
conn = tinys3.Connection('S3_ACCESS_KEY','S3_SECRET_KEY',tls=True)
f = open('some_file.zip','rb')
conn.upload('some_file.zip',f,'my_bucket')
I don't think this works for large files. I had to use this: http://docs.pythonboto.org/en/latest/s3_tut.html#storing-large-data – wordsforthewise 2016-10-12 03:10:49
That also led me to this fix: https://github.com/boto/boto/issues/2207#issuecomment-60682869 and this: http://stackoverflow.com/questions/5396932/why-are-no-amazon-s3-authentication-handlers-ready – wordsforthewise 2016-10-12 03:36:25
Since the tinys3 project has been abandoned, you shouldn't use this. https://github.com/smore-inc/tinys3/issues/45 – 2018-01-27 12:24:46
No need to make it that complicated:
s3_connection = boto.connect_s3()
bucket = s3_connection.get_bucket('your bucket name')
key = boto.s3.key.Key(bucket, 'some_file.zip')
with open('some_file.zip', 'rb') as f:
    key.send_file(f)
from boto3.s3.transfer import S3Transfer
import boto3

# have all the variables populated which are required below
client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
transfer = S3Transfer(client)
transfer.upload_file(filepath, bucket_name, folder_name + "/" + filename)
What is filepath and what is folder_name + filename? It's confusing – colintobing 2017-08-03 02:41:22
@colintobing filepath is the path of the file on the cluster, and folder_name/filename is the naming convention you want inside the S3 bucket – 2017-08-29 11:47:51
Wow, why are there 50 ways to do this... – 2018-01-24 10:33:21
This will also work:
import os
import boto
import boto.s3.connection
from boto.s3.key import Key

try:
    conn = boto.s3.connect_to_region('us-east-1',
        aws_access_key_id='AWS-Access-Key',
        aws_secret_access_key='AWS-Secrete-Key',
        # host='s3-website-us-east-1.amazonaws.com',
        # is_secure=False,  # uncomment if you are not using ssl
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
    )
    bucket = conn.get_bucket('YourBucketName')
    key_name = 'FileToUpload'
    path = 'images/holiday'  # directory under which the file should be uploaded
    full_key_name = os.path.join(path, key_name)
    k = bucket.new_key(full_key_name)
    k.set_contents_from_filename(key_name)
except Exception as e:
    print str(e)
    print "error"
import boto
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
END_POINT = ''                          # eg. us-east-1
S3_HOST = ''                            # eg. s3.us-east-1.amazonaws.com
BUCKET_NAME = 'test'
FILENAME = 'upload.txt'
UPLOADED_FILENAME = 'dumps/upload.txt'
# include folders in file path. If it doesn't exist, it will be created

s3 = boto.s3.connect_to_region(END_POINT,
                               aws_access_key_id=AWS_ACCESS_KEY_ID,
                               aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                               host=S3_HOST)
bucket = s3.get_bucket(BUCKET_NAME)
k = Key(bucket)
k.key = UPLOADED_FILENAME
k.set_contents_from_filename(FILENAME)
import boto3
s3 = boto3.resource('s3')
BUCKET = "test"
s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file")
Can you explain this line? s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file") – venkat 2018-03-06 12:44:39
@venkat "your/local/file" is a file path, e.g. "/home/file.txt", on the computer running python/boto, and "dump/file" is the key name under which the file is stored in the S3 bucket. See: http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Bucket.upload_file – 2018-03-06 22:16:56
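A small illustration of the distinction described above, deriving the key name from a local path (all paths and names here are made up for the example):

```python
import os

local_path = "/home/user/reports/file.txt"  # file on the machine running python/boto
folder = "dump"                             # "folder" (key prefix) inside the bucket

# Build the key the object will be stored under, e.g. "dump/file.txt"
key = folder + "/" + os.path.basename(local_path)

# import boto3
# boto3.resource("s3").Bucket("test").upload_file(local_path, key)
```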
I would avoid the multiple import lines, not pythonic. Move the import lines to the top, and for boto, you can use from boto.s3.connection import S3Connection; conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY); bucket = conn.create_bucket(bucketname...); bucket.new_key(keyname,...).set_contents_from_filename.... – cgseller 2015-06-29 22:51:15