Amazon S3 API Kits

In the following sections, you’ll look at some libraries for S3 written in PHP and Python.


The following PHP API kits are available:

  • php-aws [275]

  • a popular-looking S3 class implementation [276]

  • edoceo’s phps3tk [277]

In this section, we’ll concentrate on how to use php-aws. You can access the source using SVN, or you can download the library in your web browser from here:

You can find documentation for the S3 class here:

The following blog entry introduces php-aws:

Note the following about this library:

  • Because it uses curl, it is built to handle larger files.

  • Only the public-read and private ACLs are currently implemented.

  • There is no implementation of user metadata for objects.

To get started with php-aws, follow these steps:

  1. Download the library to your favorite local PHP directory. (In my case, this is /home/rdhyee/phplib/php-aws/class.s3.php.)

  2. Try the following sample code to get you started (this code first lists your S3 buckets and then creates a bucket by the name of mashupguidetest if it doesn’t already exist):

                      $key = "[AWSAccessKeyID]";
                      $secret = "[SecretAccessKey]";
                      $s3 = new S3($key,$secret);
                      // get list of buckets
                      $buckets = $s3->getBuckets();
                      // if the bucket "mashupguidetest" doesn't exist, create it
                      $BNAME = "mashupguidetest";
                      if (! $s3->bucketExists($BNAME)) {
                      // get list of buckets again
                      $buckets = $s3->getBuckets();

 For you to use php-aws, you need to have a command-line-invokable instance of curl installed on your system. You might also need to set the $_pathToCurl parameter in class.s3.php so that php-aws can find curl.


Some Python-based S3 libraries are as follows:

  • boto [278]

  • Hanzo Archives’ S3 tools [279]

  • BitBucket [280]

I recommend boto as a good choice of library. One of the best ways to learn how to use boto is to read the tutorial here:
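
Before turning to the longer sample, here is a minimal sketch of the Python counterpart of the earlier php-aws example: it connects to S3 with boto and creates the mashupguidetest bucket if it doesn’t already exist. The credential placeholders are assumptions for you to replace with your own keys:

            from boto.s3.connection import S3Connection

            # replace the placeholders with your own AWS credentials
            conn = S3Connection('[AWSAccessKeyId]', '[AWSSecretAccessKey]')

            # lookup() returns None if the bucket doesn't exist yet
            bucket = conn.lookup('mashupguidetest')
            if bucket is None:
                bucket = conn.create_bucket('mashupguidetest')
            print bucket.name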

You can learn the basics of using boto by studying the next code sample, which does the following:

  • It reads the list of your S3 buckets and displays the name, creation date, and XML representation of the bucket’s ACL.

  • It reads the list of objects contained in a specific bucket, along with the last modified time stamp and the object’s metadata.

  • It uploads a file to a bucket and reads back the metadata of the newly uploaded file.

            AWSAccessKeyId = '[AWSAccessKeyId]'
            AWSSecretAccessKey = '[AWSSecretAccessKey]'
            FILENAME = r'D:\Document\PersonalInfoRemixBook\858Xtoc___.pdf'
            BUCKET = 'mashupguidetest'
            from boto.s3.connection import S3Connection
            from boto.s3.key import Key
            def upload_file(fname, bucket, key, acl='public-read', metadata=None):
                # create a key in the bucket, attach any user metadata, upload
                # the file's contents, and set the ACL
                fpic = Key(bucket)
                fpic.key = key
                if metadata is not None:
                    fpic.metadata = metadata
                fpic.set_contents_from_filename(fname)
                fpic.set_acl(acl)
                return fpic
            # set up a connection to S3
            conn = S3Connection(AWSAccessKeyId, AWSSecretAccessKey)
            # retrieve all the buckets
            buckets = conn.get_all_buckets()
            print "number of buckets:", len(buckets)
            # print out the name, creation date, and the XML that represents
            # the ACL of each bucket
            for b in buckets:
                print "%s\t%s\t%s" % (b.name, b.creation_date, b.get_acl().acl.to_xml())
            # get the list of all keys in the mashupguidetest bucket
            print "keys in " + BUCKET
            mg_bucket = conn.get_bucket(BUCKET)
            keys = mg_bucket.get_all_keys()
            for key in keys:
                print "%s\t%s\t%s" % (key.name, key.last_modified, key.metadata)
            # upload the table of contents to the mashupguidetest bucket
            metadata = {'author': 'Raymond Yee'}
            upload_file(FILENAME, mg_bucket, 'samplefile', metadata=metadata)
            # read back the TOC
            toc = mg_bucket.get_key('samplefile')
            print toc.metadata
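
To round out the walkthrough, here is a short sketch of reading the uploaded object back with boto: get_contents_to_filename() downloads the object’s body to a local file, and generate_url() produces a time-limited signed URL that you can hand out without sharing your AWS credentials. The local path and the one-hour expiry are assumptions for illustration:

            # continue from the sample above: toc is the Key for 'samplefile'
            # download the object's contents to a local file
            # (this local path is an illustrative assumption)
            toc.get_contents_to_filename(r'D:\Document\samplefile_copy.pdf')
            # generate a signed URL valid for 3,600 seconds (one hour)
            url = toc.generate_url(3600)
            print url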