Amazon S3 API Kits

In the following sections, you’ll look at some libraries for S3 written in PHP and Python.
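Whatever the language, these kits all implement the same S3 REST authentication scheme: each request carries a base64-encoded HMAC-SHA1 signature that the library computes from your secret key over a canonical string built from the request. Here is a simplified Python sketch of that computation (the function name and placeholder credentials are mine, and it omits the x-amz-* header canonicalization that a full implementation performs):

```python
import base64
import hmac
from hashlib import sha1

def sign_s3_request(secret_key, verb, content_md5, content_type, date, resource):
    # S3's string-to-sign for the REST API: the verb, two entity headers, the
    # date, and the canonicalized resource, joined by newlines; the signature
    # is the base64-encoded HMAC-SHA1 of that string under the secret key
    string_to_sign = "\n".join([verb, content_md5, content_type, date, resource])
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"), sha1).digest()
    return base64.b64encode(digest).decode("ascii")

# placeholder credentials -- substitute your own
signature = sign_s3_request("[SecretAccessKey]", "GET", "", "",
                            "Tue, 27 Mar 2007 19:36:42 +0000",
                            "/mashupguidetest/samplefile")
print("Authorization: AWS [AWSAccessKeyID]:" + signature)
```

Seeing this helps explain why the kits differ mostly in convenience features (ACLs, metadata, large files) rather than in the wire protocol itself.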

PHP

The following API kits are available:

  • php-aws [275]

  • s3.class.zip at Neurofuzzy.net, which looks like a popular class implementation [276]

  • edoceo’s phps3tk [277]

In this section, we’ll concentrate on how to use php-aws. The source is hosted in SVN; you can download the library directly in your web browser from here:

http://php-aws.googlecode.com/svn/trunk/class.s3.php

You can find documentation for the S3 class here:

http://code.google.com/p/php-aws/wiki/S3Class

The following blog entry introduces php-aws:

http://sitening.com/blog/2007/01/30/introducing-php-aws/

Note the following about this library:

  • Because it hands transfers off to curl, it is built to handle larger files.

  • Only the public-read and private canned ACLs are currently implemented.

  • There is no implementation of user metadata for objects.
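For context on the ACL limitation above: in the S3 REST API, a canned ACL is just the x-amz-acl request header, so a kit that supports only public-read and private is simply restricting that header to two of its possible values. A small sketch (the helper name is mine):

```python
# Canned ACLs defined by the S3 REST API; php-aws implements only the first two
CANNED_ACLS = ("private", "public-read", "public-read-write", "authenticated-read")

def acl_header(acl="private"):
    # return the request header that asks S3 to apply a canned ACL to the
    # bucket or object being created
    if acl not in CANNED_ACLS:
        raise ValueError("not a canned ACL: %r" % (acl,))
    return {"x-amz-acl": acl}

print(acl_header("public-read"))  # {'x-amz-acl': 'public-read'}
```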

To get started with php-aws, follow these steps:

  1. Download http://php-aws.googlecode.com/svn/trunk/class.s3.php to your favorite local PHP directory. (In my case, this is /home/rdhyee/phplib/php-aws/class.s3.php.)

  2. Try the following sample code to get you started (this code first lists your S3 buckets and then creates a bucket by the name of mashupguidetest if it doesn’t already exist):

                      <?php
                      require_once("php-aws/class.s3.php");
                       
                      $key = "[AWSAccessKeyID]";
                      $secret = "[SecretAccessKey]";
                      
                      $s3 = new S3($key,$secret);
                      
                      // get list of buckets
                      $buckets = $s3->getBuckets();
                      print_r($buckets);
                      
                      // if the bucket "mashupguidetest" doesn't exist, create it
                      $BNAME = "mashupguidetest";
                      if (! $s3->bucketExists($BNAME)) {
                        $s3->createBucket($BNAME);
                      }
                      
                      // get list of buckets again
                      $buckets = $s3->getBuckets();
                      print_r($buckets);
                   
Note

 For you to use php-aws, you need to have a command-line invokable instance of curl installed on your system. You might also need to set the $_pathToCurl parameter in class.s3.php so that php-aws can find curl.
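The check php-aws needs — is a curl binary reachable? — amounts to scanning the directories on PATH, which you can do from any language; a quick Python sketch (the helper name is mine):

```python
import os

def find_on_path(program):
    # walk the PATH directories and return the first executable file with this
    # name, or None if there isn't one
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        candidate = os.path.join(directory, program)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

# prints the path to curl (e.g., /usr/bin/curl), or None if it isn't installed;
# the result is the sort of value you would put in $_pathToCurl
print(find_on_path("curl"))
```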

Python

Some Python-based S3 libraries are as follows:

  • boto [278]

  • Hanzo Archives’ S3 tools [279]

  • BitBucket [280]

I recommend boto as a good choice of library. One of the best ways to learn how to use it is to read the tutorial here:

http://boto.googlecode.com/svn/trunk/doc/s3_tut.txt

You can learn the basics of using boto by studying the next code sample, which does the following:

  • It reads the list of your S3 buckets and displays the name, creation date, and XML representation of the bucket’s ACL.

  • It reads the list of objects contained in a specific bucket, along with the last modified time stamp and the object’s metadata.

  • It uploads a file to a bucket and reads back the metadata of the newly uploaded file.

            AWSAccessKeyId = '[AWSAccessKeyId]'
            AWSSecretAccessKey = '[AWSSecretAccessKey]'
            # a raw string keeps the Windows backslashes intact
            FILENAME = r'D:\Document\PersonalInfoRemixBook\858Xtoc___.pdf'
            BUCKET = 'mashupguidetest'
                
            from boto.s3.connection import S3Connection
            
            def upload_file(fname, bucket, key, acl='public-read', metadata=None):
                from boto.s3.key import Key
                
                fpic = Key(bucket)
                fpic.key = key
                #fpic.set_metadata('source','flickr')
                # copy any user metadata onto the key before uploading
                if metadata is not None:
                    fpic.update_metadata(metadata)
                fpic.set_contents_from_filename(fname)
                fpic.set_acl(acl)
                return fpic
            
            # set up a connection to S3
            
            conn = S3Connection(AWSAccessKeyId, AWSSecretAccessKey)
            
            # retrieve all the buckets 
            buckets = conn.get_all_buckets()
            print "number of buckets:", len(buckets)
            
            # print out the name, creation date, and the XML that represents
            # the ACL of each bucket
            
            for b in buckets:
                print "%s\t%s\t%s" % (b.name, b.creation_date, b.get_acl().acl.to_xml())
            
            # get the list of all keys in the mashupguidetest bucket
            
            print "keys in " + BUCKET
            mg_bucket = conn.get_bucket(BUCKET)
            keys = mg_bucket.get_all_keys()
            for key in keys:
                print "%s\t%s\t%s" % (key.name, key.last_modified, key.metadata)
            
            # upload the table of contents to the mashupguidetest bucket
            
            metadata = {'author': 'Raymond Yee'}
            upload_file(FILENAME, mg_bucket, 'samplefile', 'public-read', metadata)
            
            # read back the TOC
            toc = mg_bucket.get_key('samplefile')
            print toc.metadata
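The metadata dict printed at the end travels over the wire as x-amz-meta-* HTTP headers — that is all S3 user metadata is. A sketch of the mapping that boto performs on your behalf (the helper names are mine):

```python
META_PREFIX = "x-amz-meta-"

def metadata_to_headers(metadata):
    # map a user-metadata dict onto the x-amz-meta-* request headers that
    # S3 expects on a PUT
    return dict((META_PREFIX + name, value) for name, value in metadata.items())

def headers_to_metadata(headers):
    # recover the user-metadata dict from a response's x-amz-meta-* headers,
    # ignoring all other headers
    return dict((name[len(META_PREFIX):], value)
                for name, value in headers.items()
                if name.lower().startswith(META_PREFIX))

headers = metadata_to_headers({'author': 'Raymond Yee'})
print(headers)                       # {'x-amz-meta-author': 'Raymond Yee'}
print(headers_to_metadata(headers))  # {'author': 'Raymond Yee'}
```

Keeping this mapping in mind explains why metadata names must be legal HTTP header tokens and why values are limited to short strings.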