OmicsPipe on AWS uses a custom StarCluster image (AMI) that installs docker.io, environment-modules, and EasyBuild on an AWS EC2 cluster. All you have to do is get the Docker image, upload your data, launch the Amazon cluster, and run a single command to analyze all of your data according to published, best-practice methods.
Download docker.io following the instructions at Get-Docker
From inside the Docker environment, run the command:
docker run -i -t omicspipe/aws_readymade /bin/bash
Within the Docker environment, run the commands:
starcluster createvolume --name=data -i ami-52112317 -d -s <volume size in GB> us-west-1b
starcluster createvolume --name=results -i ami-52112317 -d -s <volume size in GB> us-west-1b
- Go to the AWS-Console
Run the command:
nano ~/.starcluster/config
(This will open the config file with the text editor, nano. Vim is also available if it is preferred.)
- enter your “AWS ACCESS KEY ID”, “AWS SECRET ACCESS KEY”, and “AWS USER ID”, along with the “VOLUME_ID”s of the EBS volumes you created above
- **NOTE:** if you do not live in the AWS us-west region, change your “AWS REGION NAME” and “AWS REGION HOST” variables as appropriate
- save the file
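The relevant entries in ~/.starcluster/config look roughly like the following sketch; all IDs are placeholders, and the region values shown are the us-west defaults, so substitute your own values from the AWS Console and the createvolume output above:

```ini
[aws info]
AWS_ACCESS_KEY_ID = <your access key id>
AWS_SECRET_ACCESS_KEY = <your secret access key>
AWS_USER_ID = <your AWS user id>
# Change these two if you are not in the us-west region:
AWS_REGION_NAME = us-west-1
AWS_REGION_HOST = ec2.us-west-1.amazonaws.com

# Volume IDs are printed by the createvolume commands above
[volume data]
VOLUME_ID = <vol-id of the "data" volume>
MOUNT_PATH = /data

[volume results]
VOLUME_ID = <vol-id of the "results" volume>
MOUNT_PATH = /results
```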
Run the commands:
starcluster createkey omicspipe -o ~/.ssh/omicspipe.rsa
starcluster start mypipe
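If the key is not already wired up in the config, StarCluster expects a [key] section pointing at the file that createkey wrote, referenced from the cluster template via KEYNAME. A minimal sketch (the template name `mypipe` here is an assumption matching the start command above):

```ini
[key omicspipe]
KEY_LOCATION = ~/.ssh/omicspipe.rsa

[cluster mypipe]
KEYNAME = omicspipe
```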
There are two options to upload your data; one is to copy files to the cluster with starcluster put:
starcluster put mypipe <myfile> /data/raw
After these steps, your StarCluster AWS EC2 cluster will be created with one slave node. Edit the ~/.starcluster/config file to further modify your EC2 cluster.
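For example, to change the number of nodes or the instance type, adjust the cluster template in ~/.starcluster/config; the values below are illustrative, not defaults:

```ini
[cluster mypipe]
CLUSTER_SIZE = 4              # one master plus three worker nodes
NODE_INSTANCE_TYPE = m3.xlarge
NODE_IMAGE_ID = ami-52112317  # the OmicsPipe StarCluster AMI used above
```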
Alternatively, the Dockerfiles provided with OmicsPipe can be used to build the dockercluster image yourself.
Download docker.io following the instructions at Get-Docker
Run the command:
docker build -t <Repository Name> https://bitbucket.org/sulab/omics_pipe/downloads/Dockerfile_AWS_custombuild
This will store the dockercluster image in the Repository Name of your choice.