
Amazon Web Services for Haskell

Introduction

The aws package provides Haskell programmers with support for Amazon Web Services such as S3 (storage), SQS (queuing), and others. The ultimate goal is to support all Amazon Web Services.

Installation

Make sure you have a recent GHC installed, as well as cabal-install, and installation should be as easy as:

$ cabal install aws

If you prefer to install from source yourself, you should first get a clone of the aws repository, and install it from inside the source directory:

$ git clone https://github.com/aristidb/aws.git
$ cd aws
$ cabal install

Using aws

Concepts and organisation

The aws package is organised into the general Aws module namespace, and subnamespaces like Aws.S3 for each Amazon Web Service. Under each service namespace in turn, there are general support modules and an Aws.<Service>.Commands.<Command> module for each command. For easier usage, there are the “bundling” modules Aws (general support) and Aws.<Service>.
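
For instance, a program will typically import the bundling modules, though individual command modules can be imported directly as well. The module names below follow the pattern described above; check the generated API documentation for the full list:

-- Bundling modules: general support plus everything for one service.
import qualified Aws
import qualified Aws.S3 as S3

-- Alternatively, a single command module can be imported on its own,
-- following the Aws.<Service>.Commands.<Command> pattern:
-- import qualified Aws.S3.Commands.GetObject as GetObject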

The primary concept in aws is the Transaction, which corresponds to a single HTTP request to the Amazon Web Services. A transaction consists of a request and a response, which are associated together via the Transaction typeclass. Requests and responses are simple Haskell records, but for some requests there are convenience functions to fill in default values for many parameters.
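
As a small sketch of what that means in practice, a request built by a convenience function such as S3.getObject is an ordinary record, so optional parameters can be adjusted afterwards with record-update syntax. The goResponseContentRange field used below is an assumption about the current S3.GetObject record (its Content-Range support is mentioned in the release notes); consult the API documentation for the exact field names in your version.

{-# LANGUAGE OverloadedStrings #-}

import qualified Aws.S3 as S3

-- The convenience constructor fills in defaults for all optional parameters.
simpleRequest :: S3.GetObject
simpleRequest = S3.getObject "some-bucket" "some-object"

-- Because requests are plain records, optional parameters can be overridden
-- afterwards.  goResponseContentRange is an assumed field name; here it
-- would request only the first 100 bytes of the object.
partialRequest :: S3.GetObject
partialRequest = (S3.getObject "some-bucket" "some-object")
                   { S3.goResponseContentRange = Just (0, 99) }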

Example usage

To be able to access AWS resources, you should put your AWS credentials into a configuration file. (You don’t have to store them in a file, but that’s how we do it in this example.) Save the following in $HOME/.aws-keys.

default AccessKeyID SecretKey

You do, of course, have to replace AccessKeyID and SecretKey with your actual Access Key ID and Secret Key.
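
If you would rather not rely on the $HOME/.aws-keys lookup, credentials can also be supplied programmatically. The sketch below uses Aws.makeCredentials (added in 0.9.3, see the release notes) and swaps the result into the configuration returned by Aws.baseConfiguration; the credentials record field of Aws.Configuration is an assumption and may differ between versions.

{-# LANGUAGE OverloadedStrings #-}

import qualified Aws

-- Build a configuration with explicitly supplied credentials instead of
-- reading $HOME/.aws-keys.  makeCredentials takes the access key ID and
-- secret key as ByteStrings; the 'credentials' field name is an assumption
-- about the Aws.Configuration record.
explicitConfiguration :: IO Aws.Configuration
explicitConfiguration = do
  creds <- Aws.makeCredentials "AccessKeyID" "SecretKey"
  cfg   <- Aws.baseConfiguration
  return cfg { Aws.credentials = creds }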

Then, copy this example into a Haskell file, and run it with runghc (after installing aws):

{-# LANGUAGE OverloadedStrings #-}

import qualified Aws
import qualified Aws.S3 as S3
import           Data.Conduit (($$+-))
import           Data.Conduit.Binary (sinkFile)
import           Network.HTTP.Conduit (withManager, responseBody)

main :: IO ()
main = do
  {- Set up AWS credentials and the default configuration. -}
  cfg <- Aws.baseConfiguration
  let s3cfg = Aws.defServiceConfig :: S3.S3Configuration Aws.NormalQuery

  {- Set up a ResourceT region with an available HTTP manager. -}
  withManager $ \mgr -> do
    {- Create a request object with S3.getObject and run the request with pureAws. -}
    S3.GetObjectResponse { S3.gorResponse = rsp } <-
      Aws.pureAws cfg s3cfg mgr $
        S3.getObject "haskell-aws" "cloud-remote.pdf"

    {- Save the response to a file. -}
    responseBody rsp $$+- sinkFile "cloud-remote.pdf"

You can also find this example in the source distribution in the Examples/ folder.
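
Uploading an object works along the same lines: build the request with a convenience constructor and hand it to one of the runners. The following sketch assumes the S3.putObject constructor taking the bucket name, the object key, and a RequestBody; adjust it if the signature differs in your version.

{-# LANGUAGE OverloadedStrings #-}

import qualified Aws
import qualified Aws.S3 as S3
import qualified Data.ByteString.Lazy.Char8 as L
import           Network.HTTP.Conduit (RequestBody (RequestBodyLBS), withManager)

main :: IO ()
main = do
  {- Same credentials and configuration setup as in the download example. -}
  cfg <- Aws.baseConfiguration
  let s3cfg = Aws.defServiceConfig :: S3.S3Configuration Aws.NormalQuery

  withManager $ \mgr -> do
    {- Upload a small in-memory body.  S3.putObject is assumed to take the
       bucket, the object key and the request body. -}
    _ <- Aws.pureAws cfg s3cfg mgr $
           S3.putObject "haskell-aws" "hello.txt" (RequestBodyLBS (L.pack "Hello from aws!"))
    return ()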

Frequently Asked Questions

S3 questions

  • I get an error when I try to access my bucket with upper-case characters / a very long name.

    Those names are not DNS-compliant. You need to use path-style requests by setting s3RequestStyle in the configuration to PathStyle, as in the sketch below. Note that such bucket names are only allowed in the US standard region, so your endpoint needs to be US standard.
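
    A minimal sketch of that configuration change, assuming the S3Configuration record and the RequestStyle constructor names mentioned above are exported from Aws.S3:

    import qualified Aws
    import qualified Aws.S3 as S3

    -- Default S3 service configuration, switched to path-style requests so
    -- that bucket names that are not DNS-compliant still work (US standard
    -- region only).
    pathStyleS3Config :: S3.S3Configuration Aws.NormalQuery
    pathStyleS3Config = Aws.defServiceConfig { S3.s3RequestStyle = S3.PathStyle }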

Release Notes

0.10 series

  • 0.10.5
    • support for conduit 1.2
  • 0.10.4
    • S3: support for multi-part uploads
    • DynamoDB: fixes for JSON serialization. WARNING: this makes some fields in TableDescription Maybe fields, which is a breaking change; DynamoDB support was and remains marked as EXPERIMENTAL.
    • DynamoDB: TCP connection reuse where possible (improving performance)
    • DynamoDB: Added test suite
    • SES: support for additional regions
  • 0.10.3
    • fix bug introduced in 0.10.2 that broke SQS and IAM connections without STS
  • 0.10.2
    • support STS / IAM temporary credentials in all services
  • 0.10
    • [EXPERIMENTAL!] DynamoDB: support for creating/updating/querying and scanning items
    • SQS: complete overhaul to support 2012-11-05 features
    • SQS: test suite
    • S3: use Maybe for 404 HEAD requests on objects instead of throwing a misleading exception
    • S3: support of poAutoMakeBucket for Internet Archive users
    • S3: implement GetBucketLocation
    • S3: add South American region
    • S3: allow specifying the Content-Type when copying objects
    • core: fix typo in NoCredentialsException accessor

0.9 series

  • 0.9.4
    • allow conduit 1.2
  • 0.9.3
    • fix performance regression for loadCredentialsDefault
    • add generic makeCredentials function
    • add S3 DeleteBucket operation
    • add S3 NukeBucket example
    • SES: use security token if enabled (should allow using it with IAM roles on EC2 instances)
  • 0.9.2
    • Support for credentials from EC2 instance metadata (only S3 for now)
    • aeson 0.8 compatibility
  • 0.9.1
    • Support for multi-page S3 GetBucket requests
    • S3 GLACIER support
    • Applicative instance for Response to conform to the Applicative-Monad Proposal
    • Compatibility with transformers 0.4
  • 0.9
    • Interface changes:
      • removed the deprecated attempt and failure dependencies
      • switch to new cryptohash interface
    • updated version bounds of conduit and xml-conduit

0.8 series

  • 0.8.6
    • move Instance metadata functions out of ResourceT to avoid a problem with exceptions-0.5 (this makes a fresh install of aws on a clean system possible again)
  • 0.8.5
    • compatibility with case-insensitive 1.2
    • support for V4 signatures
    • experimental support for DynamoDB
  • 0.8.4
    • compatibility with http-conduit 2.0
  • 0.8.3
    • compatibility with cryptohash 0.11
    • experimental IAM support
  • 0.8.2
    • compatibility with cereal 0.4.x
  • 0.8.1
    • compatibility with case-insensitive 1.1
  • 0.8.0
    • S3, SQS: support for US-West2 (#58)
    • S3: GetObject now has support for Content-Range (#22, #50)
    • S3: GetBucket now supports the “IsTruncated” flag (#39)
    • S3: PutObject now supports web page redirects (#46)
    • S3: support for (multi-object) DeleteObjects (#47, #56)
    • S3: HeadObject now uses an actual HEAD request (#53)
    • S3: fixed signing issues for GetObject call (#54)
    • SES: support for many more operations (#65, #66, #70, #71, #72, #74)
    • SES: SendRawEmail now correctly encodes destinations and allows multiple destinations (#73)
    • EC2: support for Instance metadata (#37)
    • Core: queryToHttpRequest allows overriding “Date” for the benefit of Chris Dornan’s Elastic Transcoder bindings (#77)

0.7 series

  • 0.7.6.4
    • CryptoHash update
  • 0.7.6.3
    • In addition to http-conduit 1.9, also support conduit 1.0; this had previously slipped through the cracks.
  • 0.7.6.2
    • Support for http-conduit 1.9
  • 0.7.6.1
    • Support for case-insensitive 1.0 and http-types 0.8
  • 0.7.6
    • Fixed overly strict parsing of SimpleDB error responses
    • Support for cryptohash 0.8
    • failure 0.1 does not work with aws; tightened the lower bound
  • 0.7.5
    • Support for http-conduit 1.7 and 1.8
  • 0.7.1-0.7.4
    • Support for GHC 7.6
    • Wider constraints to support newer versions of various dependencies
    • Update maintainer e-mail address and project categories in cabal file
  • 0.7.0
    • Change the ServiceConfiguration concept so that the type indicates whether it is for URI-only requests (i.e. awsUri)
    • EXPERIMENTAL: Direct support for iterated transactions, i.e. transactions where multiple HTTP requests may be necessary due to, e.g., response size limits.
    • Put aws functions in ResourceT to be able to safely return Sources and streams.
      • simpleAws* does not require ResourceT and converts streams into memory values (like ByteStrings) first.
    • Log response metadata (at level Info) rather than having all aws runners return it.
    • S3:
      • GetObject: No longer require a response consumer in the request, return the HTTP response (with the body as a stream) instead.
      • Add CopyObject (PUT Object Copy) request type.
    • Add Examples cabal flag for building code examples.
    • Many more, small improvements.

0.6 series

  • 0.6.2
    • Properly parse Last-Modified header in accordance with RFC 2616.
  • 0.6.1
    • Fix for MD5 encoding issue in S3 PutObject requests.
  • 0.6.0
    • API Cleanup
      • General: Use Crypto.Hash.MD5.MD5 when a Content-MD5 hash is required, instead of ByteString.
      • S3: Made parameter order to S3.putObject consistent with S3.getObject.
    • Updated dependencies:
      • conduit 0.5 (as well as http-conduit 1.5 and xml-conduit 1.0).
      • http-types 0.7.
    • Minor changes.
    • Internal changes (notable for people who want to add more commands):
      • http-types’ new ‘QueryLike’ interface allows creating query lists more conveniently.

0.5 series

  • 0.5.0
    • New configuration system: configuration split into general and service-specific parts.
    • Significantly improved API reference documentation.
    • Re-organised modules to make the library easier to understand.
    • Smaller improvements.

0.4 series

  • 0.4.1
    • Documentation improvements.
  • 0.4.0.1
    • Change dependency bounds to allow the transformers 0.3 package.
  • 0.4.0
    • Update conduit to 0.4.0, which is incompatible with earlier versions.

0.3 series

  • 0.3.2
    • Add awsRef / simpleAwsRef request variants for those who prefer an IORef over a Data.Attempt.Attempt value.
    • Improve README and add a simple example.

Resources

Contributors

Name               | Github       | E-Mail            | Company                | Components
Abhinav Gupta      | abhinav      | [email protected] | -                      | IAM, SES
Aristid Breitkreuz | aristidb     | [email protected] | -                      | Co-Maintainer
Bas van Dijk       | basvandijk   | [email protected] | Erudify AG             | S3
David Vollbracht   | qxjit        |                   |                        |
Felipe Lessa       | meteficha    | [email protected] | currently secret       | Core, S3, SES
Nathan Howell      | NathanHowell | [email protected] | Alpha Heavy Industries | S3
Ozgun Ataman       | ozataman     | [email protected] | Soostone Inc           | Core, S3, DynamoDb
Steve Severance    | sseveran     | [email protected] | Alpha Heavy Industries | S3, SQS
John Wiegley       | jwiegley     | [email protected] | FP Complete            | Co-Maintainer, S3
Chris Dornan       | cdornan      | [email protected] | Iris Connect           | Core
John Lenz          | wuzzeb       |                   |                        | DynamoDB, Core
