First-Time Setup
Sign up for AWS
- Create an AWS Account
- Use your firstName.lastName@sagebase.org email address for the account name
- Enter Sage Bionetworks' physical address for the address
- You will need to use your own credit card temporarily
- Send Mike Kellen an email to have your new AWS account added to the consolidated bill
- Once this is done, you will no longer be billed on your own credit card
Subscribe to services
- Sign up for EC2 http://aws.amazon.com/ec2/
- Sign up for S3 http://aws.amazon.com/s3/
- Sign up for ElasticMapReduce http://aws.amazon.com/elasticmapreduce/
- Sign up for SimpleDB http://aws.amazon.com/simpledb/
Configure EC2
- Use the AWS console to create a new SSH key named SageKeyPair
- Download it to ~/.ssh on the shared servers (ssh to sodo)
- Fix the permissions on it

~>chmod 600 ~/.ssh/SageKeyPair.pem
mode of `/home/ndeflaux/.ssh/SageKeyPair.pem' retained as 0600 (rw-------)
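If you want to double-check that the permissions took, a quick sketch (it uses a scratch file so it runs anywhere; on the shared servers you would point `stat` at `~/.ssh/SageKeyPair.pem` itself):

```shell
# Sketch: check that a private key file is readable/writable only by its
# owner. A scratch file stands in for ~/.ssh/SageKeyPair.pem so the
# snippet is self-contained.
KEY=$(mktemp)
chmod 600 "$KEY"
stat -c '%a' "$KEY"   # prints 600 when the permissions are correct
rm -f "$KEY"
```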
Configure S3
- Use the AWS console to make a new S3 bucket named sagebio-YourUnixUsername
  Note: do not put any underscores in your bucket name; use only hyphens, lowercase letters, and numbers.
- Make these five subdirectories
- scripts
- input
- output
- results
- logs
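The bucket and its folders are created in the AWS console as described above, but it can help to see the S3 URIs that later steps will reference. A minimal sketch (the bucket name is the placeholder from the step above; substitute your own):

```shell
#!/bin/bash
# Print the S3 URI of each of the five bucket subdirectories.
# "sagebio-YourUnixUsername" is the placeholder bucket name from above.
BUCKET="sagebio-YourUnixUsername"
for dir in scripts input output results logs; do
  echo "s3://${BUCKET}/${dir}/"
done
```

Note that the Elastic MapReduce credentials file refers to the same logs/ folder using the s3n:// scheme.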
Set up your config file for the AWS Elastic MapReduce command line tool installed on the shared servers
Get your credentials
Get your security credentials from your AWS Account
- Access Key ID
- Secret Access Key
Set up the Elastic MapReduce command line tools
Set up your configuration files for the Elastic MapReduce AWS tool installed on the shared servers (belltown, sodo, ballard, ...)
- ssh to sodo
- Create the configuration file for the Elastic MapReduce command line tool

~>cat ~/.ssh/$USER-credentials.json
{
  "access_id": "YourAWSAccessKeyID",
  "private_key": "YourAWSSecretAccessKey",
  "keypair": "SageKeyPair",
  "key-pair-file": "/home/ndeflaux/.ssh/SageKeyPair.pem",
  "log_uri": "s3n://sagebio-YourUnixUsername/logs/",
  "region": "us-east-1"
}
- Test that you can run it

~>/work/platform/bin/elastic-mapreduce-cli/elastic-mapreduce --credentials ~/.ssh/$USER-credentials.json --help
Usage: elastic-mapreduce [options]

  Creating Job Flows
    --create                       Create a new job flow
    --name NAME                    The name of the job flow being created
    --alive                        Create a job flow that stays running even though
                                   it has executed all its steps
    --with-termination-protection  Create a job with termination protection
                                   (default is no termination protection)
    --num-instances NUM            Number of instances in the job flow
  ...
- For less typing, you can make an alias to this command. If you use bash, you can put the following in your .bashrc:
alias emr='/work/platform/bin/elastic-mapreduce-cli/elastic-mapreduce --credentials ~/.ssh/$USER-credentials.json'
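With the alias in place, everyday job-flow operations become short commands. A sketch, with flags assumed from the EMR CLI's --help output (the job-flow id is a made-up example; check `emr --help` before relying on any flag):

```shell
# Define the alias (same text as in .bashrc above) and show typical uses.
alias emr='/work/platform/bin/elastic-mapreduce-cli/elastic-mapreduce --credentials ~/.ssh/$USER-credentials.json'

# Typical invocations (not run here; they require AWS credentials):
#   emr --list                                  # list recent job flows
#   emr --create --alive --name "my test flow"  # start a job flow that stays up
#   emr --terminate -j j-EXAMPLE123             # terminate it (id is made up)

# Confirm the alias expands to the full tool path:
alias emr
```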
Other useful tools
s3curl
You can use the AWS Console to upload and download files in S3, but it is sometimes handy to do this from the command line as well; s3curl lets you do that.
To set up your configuration file for the s3curl AWS tool installed on the shared servers (belltown, sodo, ballard, ...):
- ssh to sodo
- Create the configuration file for the s3curl command line tool

~>cat ~/.ssh/s3curl
#!/bin/perl

%awsSecretAccessKeys = (
    YourUnixUsername => {
        id  => 'YourAccessKeyID',
        key => 'YourSecretAccessKey',
    },
);
- Make a symlink to it in your home directory
~>ln -s ~/.ssh/s3curl ~/.s3curl
- Test that you can run s3curl

~>chmod 600 /home/$USER/.s3curl
~>/work/platform/bin/s3curl.pl --id $USER https://s3.amazonaws.com/sagebio-$USER/ | head -c 200
<?xml version="1.0" encoding="UTF-8"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Name>sagebio-ndeflaux</Name><Prefix></Prefix><Marker></Marker><MaxKeys>1000</MaxKeys><IsTruncated>
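Once the listing test above works, uploads and downloads follow the same pattern. A sketch (the file and key names are illustrative, and `--put` is standard s3curl usage; verify against the s3curl help if in doubt):

```shell
# Everyday s3curl use. The two commands below require AWS credentials and
# are shown as comments; myjob.R is a hypothetical file name.
#
#   # Upload a local file into the scripts/ folder of your bucket
#   /work/platform/bin/s3curl.pl --id $USER --put=myjob.R \
#       https://s3.amazonaws.com/sagebio-$USER/scripts/myjob.R
#
#   # Download it again (a plain signed GET, written to stdout)
#   /work/platform/bin/s3curl.pl --id $USER \
#       https://s3.amazonaws.com/sagebio-$USER/scripts/myjob.R > myjob.R
#
# Build the object URL the same way the commands above do:
BUCKET_URL="https://s3.amazonaws.com/sagebio-${USER:-YourUnixUsername}"
echo "${BUCKET_URL}/scripts/myjob.R"
```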
nano text editor
The nano editor is available on sodo/ballard/belltown/etc. and on the Miami cluster. It does not require X Windows. If you need a simple text editor and are not familiar with vi or emacs, nano is a good choice and is installed by default on many Linux systems.
~>ssh ndeflaux@pegasus.ccs.miami.edu
ndeflaux@pegasus.ccs.miami.edu's password:
Last login: Thu May 19 18:59:51 2011 from dhcp149019.fhcrc.org
***********************************************************************
*                                                                     *
*    Welcome to Pegasus Linux Cluster at CCS/University of Miami.     *
*                                                                     *
*    ....                                                             *
***********************************************************************
[ndeflaux@u01 ~]$ nano file.txt
[ndeflaux@u01 ~]$
Where to go next?
- Try A Simple Example of an R MapReduce Job
- Take a look at the AWS documentation for Elastic MapReduce