libcloud.storage.drivers package¶
Submodules¶
libcloud.storage.drivers.atmos module¶
- class libcloud.storage.drivers.atmos.AtmosConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, proxy_url=None)[source]¶
Bases: libcloud.common.base.ConnectionUserAndKey
- responseCls¶
alias of AtmosResponse
- class libcloud.storage.drivers.atmos.AtmosDriver(key, secret=None, secure=True, host=None, port=None)[source]¶
Bases: libcloud.storage.base.StorageDriver
- DEFAULT_CDN_TTL = 604800¶
- api_name = 'atmos'¶
- connectionCls¶
alias of AtmosConnection
- get_object_cdn_url(obj, expiry=None, use_object=False)[source]¶
Return an object CDN URL.
Parameters: - obj (Object) – Object instance
- expiry (str) – Expiry
- use_object (bool) – Use object
Return type: str
- host = None¶
- name = 'atmos'¶
- path = None¶
- supports_chunked_encoding = True¶
- website = 'http://atmosonline.com/'¶
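As a minimal sketch of how `get_object_cdn_url` might be combined with `get_object` (the helper name, container/object names, and the access-point host below are hypothetical, and real Atmos credentials are required for the commented usage):

```python
def shareable_url(driver, container_name, object_name, expiry=None):
    """Return a CDN URL for an existing object; when expiry is given the
    driver can produce a time-limited, signed URL instead."""
    # get_object_cdn_url needs an Object instance, so look the object up first.
    obj = driver.get_object(container_name, object_name)
    return driver.get_object_cdn_url(obj, expiry=expiry)

# Usage against a real Atmos endpoint (credentials and host are placeholders):
#   from libcloud.storage.types import Provider
#   from libcloud.storage.providers import get_driver
#   driver = get_driver(Provider.ATMOS)('token-id', 'shared-secret',
#                                       host='accesspoint.example.com')
#   url = shareable_url(driver, 'my-container', 'report.pdf')
```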
- class libcloud.storage.drivers.atmos.AtmosResponse(response, connection)[source]¶
Bases: libcloud.common.base.XmlResponse
Parameters: - response (httplib.HTTPResponse) – HTTP response object. (optional)
- connection (Connection) – Parent connection object.
libcloud.storage.drivers.azure_blobs module¶
- class libcloud.storage.drivers.azure_blobs.AzureBlobLease(driver, object_path, use_lease)[source]¶
Bases: object
A class to help in leasing an Azure blob and renewing the lease
Parameters: - driver (AzureStorageDriver) – The Azure storage driver that is being used
- object_path (str) – The path of the object we need to lease
- use_lease (bool) – Indicates if we must take a lease or not
- class libcloud.storage.drivers.azure_blobs.AzureBlobsConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, proxy_url=None)[source]¶
Bases: libcloud.common.azure.AzureConnection
Represents a single connection to Azure Blobs
- class libcloud.storage.drivers.azure_blobs.AzureBlobsStorageDriver(key, secret=None, secure=True, host=None, port=None, **kwargs)[source]¶
Bases: libcloud.storage.base.StorageDriver
- connectionCls¶
alias of AzureBlobsConnection
- download_object(obj, destination_path, overwrite_existing=False, delete_on_failure=True)[source]¶
@inherits: StorageDriver.download_object
- download_object_as_stream(obj, chunk_size=None)[source]¶
@inherits: StorageDriver.download_object_as_stream
- ex_blob_type = 'BlockBlob'¶
- ex_set_object_metadata(obj, meta_data)[source]¶
Set metadata for an object
Parameters: - obj (Object) – The blob object
- meta_data (dict) – Metadata key value pairs
- hash_type = 'md5'¶
- name = 'Microsoft Azure (blobs)'¶
- supports_chunked_encoding = False¶
- upload_object(file_path, container, object_name, extra=None, verify_hash=True, ex_blob_type=None, ex_use_lease=False)[source]¶
Upload an object currently located on a disk.
@inherits: StorageDriver.upload_object
Parameters: - ex_blob_type (str) – Storage class
- ex_use_lease (bool) – Indicates if we must take a lease before upload
- upload_object_via_stream(iterator, container, object_name, verify_hash=False, extra=None, ex_use_lease=False, ex_blob_type=None, ex_page_blob_size=None)[source]¶
@inherits: StorageDriver.upload_object_via_stream
Parameters: - ex_blob_type (str) – Storage class
- ex_page_blob_size (int) – The maximum size to which the page blob can grow
- ex_use_lease (bool) – Indicates if we must take a lease before upload
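A hypothetical helper showing how the `ex_use_lease` and `ex_blob_type` extension parameters might be passed to `upload_object` (the function name and the container/file names in the usage comment are assumptions, not part of the driver API):

```python
def upload_with_lease(driver, container_name, file_path, object_name):
    """Upload a local file as a block blob while holding a lease on the
    destination blob, so a concurrent writer cannot clobber it mid-upload."""
    container = driver.get_container(container_name)
    return driver.upload_object(
        file_path=file_path,
        container=container,
        object_name=object_name,
        ex_blob_type='BlockBlob',   # the driver default; 'PageBlob' is the alternative
        ex_use_lease=True,          # take a lease for the duration of the upload
    )

# Usage (account name/key are placeholders):
#   from libcloud.storage.types import Provider
#   from libcloud.storage.providers import get_driver
#   driver = get_driver(Provider.AZURE_BLOBS)('account-name', 'account-key')
#   obj = upload_with_lease(driver, 'backups', '/tmp/db.dump', 'db.dump')
```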
- website = 'http://windows.azure.com/'¶
libcloud.storage.drivers.cloudfiles module¶
- class libcloud.storage.drivers.cloudfiles.ChunkStreamReader(file_path, start_block, end_block, chunk_size)[source]¶
Bases: object
- class libcloud.storage.drivers.cloudfiles.CloudFilesConnection(user_id, key, secure=True, use_internal_url=False, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.OpenStackSwiftConnection
Base connection class for the Cloudfiles driver.
- auth_url = 'https://auth.api.rackspacecloud.com'¶
- rawResponseCls¶
alias of CloudFilesRawResponse
- request(action, params=None, data='', headers=None, method='GET', raw=False, cdn_request=False)[source]¶
- responseCls¶
alias of CloudFilesResponse
- class libcloud.storage.drivers.cloudfiles.CloudFilesRawResponse(connection)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesResponse, libcloud.common.base.RawResponse
Parameters: connection (Connection) – Parent connection object.
- class libcloud.storage.drivers.cloudfiles.CloudFilesResponse(response, connection)[source]¶
Bases: libcloud.common.base.Response
Parameters: - response (httplib.HTTPResponse) – HTTP response object. (optional)
- connection (Connection) – Parent connection object.
- valid_response_codes = [404, 409]¶
- class libcloud.storage.drivers.cloudfiles.CloudFilesStorageDriver(key, secret=None, secure=True, host=None, port=None, region='ord', use_internal_url=False, **kwargs)[source]¶
Bases: libcloud.storage.base.StorageDriver, libcloud.common.openstack.OpenStackDriverMixin
CloudFiles driver.
@inherits: StorageDriver.__init__
Parameters: region (str) – ID of the region which should be used.
- connectionCls¶
alias of CloudFilesConnection
- enable_container_cdn(container, ex_ttl=None)[source]¶
@inherits: StorageDriver.enable_container_cdn
Parameters: ex_ttl (int) – cache time to live
- ex_enable_static_website(container, index_file='index.html')[source]¶
Enable serving a static website.
Parameters: - container (Container) – Container instance
- index_file (str) – Name of the object which becomes an index page for every sub-directory in this container.
Return type: bool
- ex_get_object_temp_url(obj, method='GET', timeout=60)[source]¶
Create a temporary URL that allows others to retrieve or update a single object in your Cloud Files account for as long or as short a time as you wish.
Parameters: - obj (Object) – The object that you wish to make temporarily public
- method (str) – Which method you would like to allow, ‘PUT’ or ‘GET’
- timeout (int) – Time (in seconds) after which you want the TempURL to expire.
Return type: str
- ex_multipart_upload_object(file_path, container, object_name, chunk_size=33554432, extra=None, verify_hash=True)[source]¶
- ex_purge_object_from_cdn(obj, email=None)[source]¶
Purge edge cache for the specified object.
Parameters: email (str) – Email where a notification will be sent when the job completes. (optional)
- ex_set_account_metadata_temp_url_key(key)[source]¶
Set the metadata header X-Account-Meta-Temp-URL-Key on your Cloud Files account.
Parameters: key (str) – X-Account-Meta-Temp-URL-Key
Return type: bool
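The two methods above work together: the account-level temp-URL key must be set once before signed URLs can be generated. A minimal sketch (the helper name, signing key, and object names are hypothetical):

```python
def temp_download_url(driver, container_name, object_name, timeout=3600):
    """Produce a time-limited GET URL for a private object.

    ex_get_object_temp_url signs the URL using the account's
    X-Account-Meta-Temp-URL-Key, which must have been set beforehand via
    ex_set_account_metadata_temp_url_key.
    """
    obj = driver.get_object(container_name, object_name)
    return driver.ex_get_object_temp_url(obj, method='GET', timeout=timeout)

# One-time setup, then usage (key and names are placeholders):
#   driver.ex_set_account_metadata_temp_url_key('my-signing-key')
#   url = temp_download_url(driver, 'private-container', 'invoice.pdf',
#                           timeout=15 * 60)
```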
- ex_set_error_page(container, file_name='error.html')[source]¶
Set a custom error page which is displayed if file is not found and serving of a static website is enabled.
Parameters: - container (Container) – Container instance
- file_name (str) – Name of the object which becomes the error page.
Return type: bool
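A sketch of turning a container into a static website with a custom error page, combining the two methods above (the helper name is an assumption; the `index.html`/`error.html` objects must already exist in the container):

```python
def publish_static_site(driver, container_name):
    """Enable static-website serving for a container and register a
    custom error page; returns True only if both calls succeed."""
    container = driver.get_container(container_name)
    ok = driver.ex_enable_static_website(container, index_file='index.html')
    return ok and driver.ex_set_error_page(container, file_name='error.html')
```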
- hash_type = 'md5'¶
- iterate_container_objects(container, ex_prefix=None)[source]¶
Return a generator of objects for the given container.
Parameters: - container (Container) – Container instance
- ex_prefix (str) – Only get objects with names starting with ex_prefix
Returns: A generator of Object instances.
Return type: generator of Object
- list_container_objects(container, ex_prefix=None)[source]¶
Return a list of objects for the given container.
Parameters: - container (Container) – Container instance.
- ex_prefix (str) – Only get objects with names starting with ex_prefix
Returns: A list of Object instances.
Return type: list of Object
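For large containers, the generator form above avoids materializing every object at once. A hypothetical helper using `ex_prefix` to narrow the listing:

```python
def object_names_under(driver, container_name, prefix):
    """Collect names of objects whose keys start with prefix, streaming
    through iterate_container_objects rather than listing everything."""
    container = driver.get_container(container_name)
    return [obj.name
            for obj in driver.iterate_container_objects(container,
                                                        ex_prefix=prefix)]
```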
- name = 'CloudFiles'¶
- supports_chunked_encoding = True¶
- upload_object(file_path, container, object_name, extra=None, verify_hash=True)[source]¶
Upload an object.
Note: This will overwrite an existing object with the same name.
- website = 'http://www.rackspace.com/'¶
- class libcloud.storage.drivers.cloudfiles.CloudFilesUKStorageDriver(*args, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesStorageDriver
Cloudfiles storage driver for the UK endpoint.
- name = 'CloudFiles (UK)'¶
- type = 'cloudfiles_uk'¶
- class libcloud.storage.drivers.cloudfiles.CloudFilesUSStorageDriver(*args, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesStorageDriver
Cloudfiles storage driver for the US endpoint.
- name = 'CloudFiles (US)'¶
- type = 'cloudfiles_us'¶
- class libcloud.storage.drivers.cloudfiles.FileChunkReader(file_path, chunk_size)[source]¶
Bases: object
- class libcloud.storage.drivers.cloudfiles.OpenStackSwiftConnection(user_id, key, secure=True, **kwargs)[source]¶
Bases: libcloud.common.openstack.OpenStackBaseConnection
Connection class for the OpenStack Swift endpoint.
- auth_url = 'https://auth.api.rackspacecloud.com'¶
- rawResponseCls¶
alias of CloudFilesRawResponse
- request(action, params=None, data='', headers=None, method='GET', raw=False, cdn_request=False)[source]¶
- responseCls¶
alias of CloudFilesResponse
- class libcloud.storage.drivers.cloudfiles.OpenStackSwiftStorageDriver(key, secret=None, secure=True, host=None, port=None, region=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesStorageDriver
Storage driver for OpenStack Swift.
- connectionCls¶
alias of OpenStackSwiftConnection
- name = 'OpenStack Swift'¶
- type = 'cloudfiles_swift'¶
libcloud.storage.drivers.dummy module¶
- class libcloud.storage.drivers.dummy.DummyFileObject(yield_count=5, chunk_len=10)[source]¶
Bases: file
- class libcloud.storage.drivers.dummy.DummyStorageDriver(api_key, api_secret)[source]¶
Bases: libcloud.storage.base.StorageDriver
Dummy Storage driver.
>>> from libcloud.storage.drivers.dummy import DummyStorageDriver
>>> driver = DummyStorageDriver('key', 'secret')
>>> container = driver.create_container(container_name='test container')
>>> container
<Container: name=test container, provider=Dummy Storage Provider>
>>> container.name
'test container'
>>> container.extra['object_count']
0
Parameters: - api_key (str) – API key or username to be used (required)
- api_secret (str) – Secret password to be used (required)
Return type: None
- create_container(container_name)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> container = driver.create_container(
...     container_name='test container 1')
Traceback (most recent call last):
ContainerAlreadyExistsError:
@inherits: StorageDriver.create_container
- delete_container(container)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container = Container(name='test container',
...     extra={'object_count': 0}, driver=driver)
>>> driver.delete_container(container=container)
Traceback (most recent call last):
ContainerDoesNotExistError:
>>> container = driver.create_container(
...     container_name='test container 1')
>>> len(driver._containers)
1
>>> driver.delete_container(container=container)
True
>>> len(driver._containers)
0
>>> container = driver.create_container(
...     container_name='test container 1')
>>> obj = container.upload_object_via_stream(
...     object_name='test object', iterator=DummyFileObject(5, 10),
...     extra={})
>>> driver.delete_container(container=container)
Traceback (most recent call last):
ContainerIsNotEmptyError:
@inherits: StorageDriver.delete_container
- delete_object(obj)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container = driver.create_container(
...     container_name='test container 1')
>>> obj = container.upload_object_via_stream(object_name='test object',
...     iterator=DummyFileObject(5, 10), extra={})
>>> obj
<Object: name=test object, size=50, ...>
>>> container.delete_object(obj=obj)
True
>>> obj = Object(name='test object 2',
...     size=1000, hash=None, extra=None,
...     meta_data=None, container=container, driver=None)
>>> container.delete_object(obj=obj)
Traceback (most recent call last):
ObjectDoesNotExistError:
@inherits: StorageDriver.delete_object
- download_object_as_stream(obj, chunk_size=None)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container = driver.create_container(
...     container_name='test container 1')
>>> obj = container.upload_object_via_stream(object_name='test object',
...     iterator=DummyFileObject(5, 10), extra={})
>>> stream = container.download_object_as_stream(obj)
>>> stream
<...closed...>
@inherits: StorageDriver.download_object_as_stream
- get_container(container_name)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> driver.get_container('unknown')
Traceback (most recent call last):
ContainerDoesNotExistError:
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> container.name
'test container 1'
>>> driver.get_container('test container 1')
<Container: name=test container 1, provider=Dummy Storage Provider>
@inherits: StorageDriver.get_container
- get_container_cdn_url(container)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> driver.get_container('unknown')
Traceback (most recent call last):
ContainerDoesNotExistError:
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> container.name
'test container 1'
>>> container.get_cdn_url()
'http://www.test.com/container/test_container_1'
@inherits: StorageDriver.get_container_cdn_url
- get_meta_data()[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> driver.get_meta_data()['object_count']
0
>>> driver.get_meta_data()['container_count']
0
>>> driver.get_meta_data()['bytes_used']
0
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container_name = 'test container 2'
>>> container = driver.create_container(container_name=container_name)
>>> obj = container.upload_object_via_stream(
...     object_name='test object', iterator=DummyFileObject(5, 10),
...     extra={})
>>> driver.get_meta_data()['object_count']
1
>>> driver.get_meta_data()['container_count']
2
>>> driver.get_meta_data()['bytes_used']
50
Return type: dict
- get_object(container_name, object_name)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> driver.get_object('unknown', 'unknown')
Traceback (most recent call last):
ContainerDoesNotExistError:
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> driver.get_object('test container 1', 'unknown')
Traceback (most recent call last):
ObjectDoesNotExistError:
>>> obj = container.upload_object_via_stream(object_name='test object',
...     iterator=DummyFileObject(5, 10), extra={})
>>> obj.name
'test object'
>>> obj.size
50
@inherits: StorageDriver.get_object
- get_object_cdn_url(obj)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> obj = container.upload_object_via_stream(
...     object_name='test object 5',
...     iterator=DummyFileObject(5, 10), extra={})
>>> obj.name
'test object 5'
>>> obj.get_cdn_url()
'http://www.test.com/object/test_object_5'
@inherits: StorageDriver.get_object_cdn_url
- iterate_containers()[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> list(driver.iterate_containers())
[]
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 1, provider=Dummy Storage Provider>
>>> container.name
'test container 1'
>>> container_name = 'test container 2'
>>> container = driver.create_container(container_name=container_name)
>>> container
<Container: name=test container 2, provider=Dummy Storage Provider>
>>> container = driver.create_container(
...     container_name='test container 2')
Traceback (most recent call last):
ContainerAlreadyExistsError:
>>> container_list = list(driver.iterate_containers())
>>> sorted([c.name for c in container_list])
['test container 1', 'test container 2']
@inherits: StorageDriver.iterate_containers
- name = 'Dummy Storage Provider'¶
- upload_object(file_path, container, object_name, extra=None, file_hash=None)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container_name = 'test container 1'
>>> container = driver.create_container(container_name=container_name)
>>> container.upload_object(file_path='/tmp/inexistent.file',
...     object_name='test')
Traceback (most recent call last):
LibcloudError:
>>> file_path = path = os.path.abspath(__file__)
>>> file_size = os.path.getsize(file_path)
>>> obj = container.upload_object(file_path=file_path,
...     object_name='test')
>>> obj
<Object: name=test, size=...>
>>> obj.size == file_size
True
@inherits: StorageDriver.upload_object
Parameters: file_hash (str) – File hash
- upload_object_via_stream(iterator, container, object_name, extra=None)[source]¶
>>> driver = DummyStorageDriver('key', 'secret')
>>> container = driver.create_container(
...     container_name='test container 1')
>>> obj = container.upload_object_via_stream(
...     object_name='test object', iterator=DummyFileObject(5, 10),
...     extra={})
>>> obj
<Object: name=test object, size=50, ...>
@inherits: StorageDriver.upload_object_via_stream
- website = 'http://example.com'¶
libcloud.storage.drivers.google_storage module¶
- class libcloud.storage.drivers.google_storage.GoogleStorageConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, proxy_url=None)[source]¶
Bases: libcloud.common.base.ConnectionUserAndKey
Represents a single connection to the Google Storage API endpoint.
- host = 'commondatastorage.googleapis.com'¶
- rawResponseCls¶
alias of S3RawResponse
- responseCls¶
alias of S3Response
- class libcloud.storage.drivers.google_storage.GoogleStorageDriver(key, secret=None, secure=True, host=None, port=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.BaseS3StorageDriver
- connectionCls¶
alias of GoogleStorageConnection
- hash_type = 'md5'¶
- http_vendor_prefix = 'x-goog'¶
- name = 'Google Storage'¶
- namespace = 'http://doc.s3.amazonaws.com/2006-03-01'¶
- supports_chunked_encoding = False¶
- supports_s3_multipart_upload = False¶
- website = 'http://cloud.google.com/'¶
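Because the Google Storage driver inherits the S3 base driver, the generic storage API works unchanged against it. A hypothetical helper (names and credentials in the usage comment are placeholders):

```python
def total_bytes(driver, container_name):
    """Sum object sizes in a container; the same code works against
    Google Storage, S3, or any other libcloud storage driver."""
    container = driver.get_container(container_name)
    return sum(obj.size for obj in driver.iterate_container_objects(container))

# Usage (credentials are placeholders):
#   from libcloud.storage.types import Provider
#   from libcloud.storage.providers import get_driver
#   driver = get_driver(Provider.GOOGLE_STORAGE)('access-key-id', 'secret')
#   print(total_bytes(driver, 'my-bucket'))
```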
libcloud.storage.drivers.ktucloud module¶
- class libcloud.storage.drivers.ktucloud.KTUCloudStorageConnection(user_id, key, secure=True, use_internal_url=False, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesConnection
Connection class for the KT UCloud Storage endpoint.
- auth_url = 'https://ssproxy.ucloudbiz.olleh.com/auth/v1.0'¶
- class libcloud.storage.drivers.ktucloud.KTUCloudStorageDriver(key, secret=None, secure=True, host=None, port=None, region='ord', use_internal_url=False, **kwargs)[source]¶
Bases: libcloud.storage.drivers.cloudfiles.CloudFilesStorageDriver
CloudFiles-compatible storage driver for the KT UCloud endpoint.
@inherits: StorageDriver.__init__
Parameters: region (str) – ID of the region which should be used.
- connectionCls¶
alias of KTUCloudStorageConnection
- name = 'KTUCloud Storage'¶
- type = 'ktucloud'¶
libcloud.storage.drivers.local module¶
libcloud.storage.drivers.nimbus module¶
- class libcloud.storage.drivers.nimbus.NimbusConnection(*args, **kwargs)[source]¶
Bases: libcloud.common.base.ConnectionUserAndKey
- host = 'nimbus.io'¶
- responseCls¶
alias of NimbusResponse
- class libcloud.storage.drivers.nimbus.NimbusResponse(response, connection)[source]¶
Bases: libcloud.common.base.JsonResponse
Parameters: - response (httplib.HTTPResponse) – HTTP response object. (optional)
- connection (Connection) – Parent connection object.
- valid_response_codes = [200, 404, 409, 400]¶
libcloud.storage.drivers.ninefold module¶
libcloud.storage.drivers.s3 module¶
- class libcloud.storage.drivers.s3.BaseS3Connection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, proxy_url=None)[source]¶
Bases: libcloud.common.base.ConnectionUserAndKey
Represents a single connection to the S3 Endpoint
- host = 's3.amazonaws.com'¶
- rawResponseCls¶
alias of S3RawResponse
- responseCls¶
alias of S3Response
- class libcloud.storage.drivers.s3.BaseS3StorageDriver(key, secret=None, secure=True, host=None, port=None, **kwargs)[source]¶
Bases: libcloud.storage.base.StorageDriver
- connectionCls¶
alias of BaseS3Connection
- ex_cleanup_all_multipart_uploads(container, prefix=None)[source]¶
Extension method for removing all partially completed S3 multipart uploads.
Parameters: - container (Container) – The container holding the uploads
- prefix (str) – Delete only uploads of objects with this prefix
- ex_iterate_multipart_uploads(container, prefix=None, delimiter=None)[source]¶
Extension method for listing all in-progress S3 multipart uploads.
Each multipart upload which has not been committed or aborted is considered in-progress.
Parameters: - container (Container) – The container holding the uploads
- prefix (str) – Print only uploads of objects with this prefix
- delimiter (str) – The object/key names are grouped based on being split by this delimiter
Returns: A generator of S3MultipartUpload instances.
Return type: generator of S3MultipartUpload
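Abandoned multipart uploads keep their parts (and storage charges) until aborted. A hypothetical helper combining the two extension methods above to report and then clean them up:

```python
def abort_stale_uploads(driver, container_name, prefix=None):
    """Collect (key, upload id, start time) for every in-progress multipart
    upload, then abort them all via ex_cleanup_all_multipart_uploads."""
    container = driver.get_container(container_name)
    stale = [(u.key, u.id, u.created_at)
             for u in driver.ex_iterate_multipart_uploads(container,
                                                          prefix=prefix)]
    driver.ex_cleanup_all_multipart_uploads(container, prefix=prefix)
    return stale
```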
- ex_location_name = ''¶
- hash_type = 'md5'¶
- http_vendor_prefix = 'x-amz'¶
- iterate_container_objects(container, ex_prefix=None)[source]¶
Return a generator of objects for the given container.
Parameters: - container (Container) – Container instance
- ex_prefix (str) – Only return objects starting with ex_prefix
Returns: A generator of Object instances.
Return type: generator of Object
- list_container_objects(container, ex_prefix=None)[source]¶
Return a list of objects for the given container.
Parameters: - container (Container) – Container instance.
- ex_prefix (str) – Only return objects starting with ex_prefix
Returns: A list of Object instances.
Return type: list of Object
- name = 'Amazon S3 (standard)'¶
- namespace = 'http://s3.amazonaws.com/doc/2006-03-01/'¶
- supports_chunked_encoding = False¶
- supports_s3_multipart_upload = True¶
- upload_object(file_path, container, object_name, extra=None, verify_hash=True, ex_storage_class=None)[source]¶
@inherits: StorageDriver.upload_object
Parameters: ex_storage_class (str) – Storage class
- upload_object_via_stream(iterator, container, object_name, extra=None, ex_storage_class=None)[source]¶
@inherits: StorageDriver.upload_object_via_stream
Parameters: ex_storage_class (str) – Storage class
- website = 'http://aws.amazon.com/s3/'¶
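A sketch of passing the `ex_storage_class` extension parameter to `upload_object` (the helper name is an assumption; `'reduced_redundancy'` is one of the storage-class values the S3 driver accepts):

```python
def upload_reduced_redundancy(driver, container_name, file_path, object_name):
    """Upload a file requesting S3's reduced-redundancy storage class
    instead of the default standard class."""
    container = driver.get_container(container_name)
    return driver.upload_object(
        file_path=file_path,
        container=container,
        object_name=object_name,
        ex_storage_class='reduced_redundancy',
    )
```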
- class libcloud.storage.drivers.s3.S3APNEConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.storage.drivers.s3.S3Connection
- host = 's3-ap-northeast-1.amazonaws.com'¶
- class libcloud.storage.drivers.s3.S3APNEStorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.S3StorageDriver
- connectionCls¶
alias of S3APNEConnection
- ex_location_name = 'ap-northeast-1'¶
- name = 'Amazon S3 (ap-northeast-1)'¶
- class libcloud.storage.drivers.s3.S3APSEConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.storage.drivers.s3.S3Connection
- host = 's3-ap-southeast-1.amazonaws.com'¶
- class libcloud.storage.drivers.s3.S3APSEStorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.S3StorageDriver
- connectionCls¶
alias of S3APSEConnection
- ex_location_name = 'ap-southeast-1'¶
- name = 'Amazon S3 (ap-southeast-1)'¶
- class libcloud.storage.drivers.s3.S3Connection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.common.aws.AWSTokenConnection, libcloud.storage.drivers.s3.BaseS3Connection
Represents a single connection to the S3 endpoint, with AWS-specific features.
- class libcloud.storage.drivers.s3.S3EUWestConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.storage.drivers.s3.S3Connection
- host = 's3-eu-west-1.amazonaws.com'¶
- class libcloud.storage.drivers.s3.S3EUWestStorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.S3StorageDriver
- connectionCls¶
alias of S3EUWestConnection
- ex_location_name = 'EU'¶
- name = 'Amazon S3 (eu-west-1)'¶
- class libcloud.storage.drivers.s3.S3MultipartUpload(key, id, created_at, initiator, owner)[source]¶
Bases: object
Class representing an Amazon S3 multipart upload
Parameters: - key (str) – The object/key that was being uploaded
- id (str) – The upload id assigned by amazon
- created_at (str) – The date/time at which the upload was started
- initiator (str) – The AWS owner/IAM user who initiated this upload
- owner (str) – The AWS owner/IAM user who will own this object
- class libcloud.storage.drivers.s3.S3RawResponse(connection)[source]¶
Bases: libcloud.storage.drivers.s3.S3Response, libcloud.common.base.RawResponse
Parameters: connection (Connection) – Parent connection object.
- class libcloud.storage.drivers.s3.S3Response(response, connection)[source]¶
Bases: libcloud.common.aws.AWSBaseResponse
Parameters: - response (httplib.HTTPResponse) – HTTP response object. (optional)
- connection (Connection) – Parent connection object.
- namespace = None¶
- valid_response_codes = [404, 409, 400]¶
- class libcloud.storage.drivers.s3.S3StorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.common.aws.AWSDriver, libcloud.storage.drivers.s3.BaseS3StorageDriver
- connectionCls¶
alias of S3Connection
- class libcloud.storage.drivers.s3.S3USWestConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.storage.drivers.s3.S3Connection
- host = 's3-us-west-1.amazonaws.com'¶
- class libcloud.storage.drivers.s3.S3USWestOregonConnection(user_id, key, secure=True, host=None, port=None, url=None, timeout=None, token=None)[source]¶
Bases: libcloud.storage.drivers.s3.S3Connection
- host = 's3-us-west-2.amazonaws.com'¶
- class libcloud.storage.drivers.s3.S3USWestOregonStorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.S3StorageDriver
- connectionCls¶
alias of S3USWestOregonConnection
- ex_location_name = 'us-west-2'¶
- name = 'Amazon S3 (us-west-2)'¶
- class libcloud.storage.drivers.s3.S3USWestStorageDriver(key, secret=None, secure=True, host=None, port=None, api_version=None, region=None, token=None, **kwargs)[source]¶
Bases: libcloud.storage.drivers.s3.S3StorageDriver
- connectionCls¶
alias of S3USWestConnection
- ex_location_name = 'us-west-1'¶
- name = 'Amazon S3 (us-west-1)'¶
Module contents¶
Drivers for working with different providers