What it does: AWS Batch lets you create computing jobs in the cloud. Amazon EC2 instances complete the work and you can retrieve the results.
Workflow:
http://docs.aws.amazon.com/batch/latest/userguide/Batch_GetStarted.html
https://boto3.readthedocs.io/en/latest/guide/quickstart.html
In this simple test, I will start a job using AWS Batch. The job will not depend on any input data or other parameters.
In [3]:
import boto3
# low level client that can access AWS batch methods
client = boto3.client("batch")
In [2]:
# create Compute Environments
# Compute environments contain the EC2 instances that are used to run batch jobs. You will map a compute environment
# to a job queue. The job queue has a scheduler that helps plan out which EC2 instances are ready to take a job.
response = client.create_compute_environment(
    computeEnvironmentName='Test_Env',
    type='MANAGED',  # managed means AWS will provision the computing resources you specify
    state='ENABLED',  # enable your compute environment
    computeResources={
        'type': 'EC2',  # use on-demand EC2 instances ('SPOT' is the other option)
        'minvCpus': 1,  # min number of vCPUs the environment should maintain
        'maxvCpus': 5,  # max number of vCPUs the environment should maintain
        'desiredvCpus': 3,
        'instanceTypes': ['c4.large'],  # instance types allowed to run
        'subnets': ['subnet-220c0e0a'],  # VPC subnets the instances are launched into
        'securityGroupIds': ['sg-cf5093b2'],  # security groups attached to the instances
        'instanceRole': 'ecsInstanceRole'
    },
    serviceRole='string'  # ARN of the IAM role that allows AWS Batch to call other services on your behalf
)
print(response)
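Before attaching a job queue, it can help to confirm the compute environment has finished creating. This check is not part of the original walkthrough; it is a minimal sketch using boto3's describe_compute_environments and assumes the Test_Env name used above.
In [ ]:
# optional: check that the compute environment is ready before creating a job queue
env = client.describe_compute_environments(computeEnvironments=['Test_Env'])
for ce in env['computeEnvironments']:
    print(ce['computeEnvironmentName'], ce['state'], ce['status'])  # expect ENABLED / VALID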
In [4]:
# create job queue (this is where AWS will store your jobs until an EC2 Instance is available to run them)
response = client.create_job_queue(
    jobQueueName='test_queue',
    state='ENABLED',
    priority=1,
    computeEnvironmentOrder=[
        {
            'order': 1,
            'computeEnvironment': 'Test_Env'  # the compute environment created above
        },
    ]
)
print(response)
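As an optional aside (not in the original notebook), you can confirm the queue is enabled and valid before submitting jobs; this sketch uses describe_job_queues with the test_queue name from above.
In [ ]:
# optional: confirm the job queue is ready to accept jobs
queues = client.describe_job_queues(jobQueues=['test_queue'])
for q in queues['jobQueues']:
    print(q['jobQueueName'], q['state'], q['status'])  # expect ENABLED / VALID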
In [15]:
# job definition: specifies how jobs are run (required before submitting a job)
# special attributes you can attach: the Docker image to use, a command (can be overridden at runtime),
# environment variables, data volumes, etc.
response = client.register_job_definition(
    type='container',
    containerProperties={
        'command': [
            'echo',
            'Hello World',
        ],
        'image': 'busybox',  # Docker image with basic UNIX utilities
        'memory': 128,  # MiB of memory reserved for the container
        'vcpus': 1,
    },
    jobDefinitionName='echoMsg',
)
print(response)
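If you want to double-check what was registered (an optional step I am adding, not in the original), describe_job_definitions returns the active revisions for a given name:
In [ ]:
# optional: list active revisions of the job definition registered above
defs = client.describe_job_definitions(jobDefinitionName='echoMsg', status='ACTIVE')
for jd in defs['jobDefinitions']:
    print(jd['jobDefinitionName'], jd['revision'], jd['status'])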
In [6]:
#submit a job
response = client.submit_job(
    jobDefinition='echoMsg',
    jobName='test',
    jobQueue='test_queue',
    containerOverrides={
        'command': ['echo', 'NEURODATA']  # overrides the command in the job definition
    }
)
print(response)
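submit_job returns a jobId that can be used to track the job's progress. The cell below is an addition of mine (not part of the original run); it assumes the response variable from the cell above and uses describe_jobs to poll until the job reaches a terminal state.
In [ ]:
# optional: poll the job until it reaches a terminal state (SUCCEEDED or FAILED)
import time

job_id = response['jobId']
while True:
    job = client.describe_jobs(jobs=[job_id])['jobs'][0]
    print(job['status'])
    if job['status'] in ('SUCCEEDED', 'FAILED'):
        break
    time.sleep(10)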
In [13]:
from IPython.display import Image
Image('../../Desktop/proof.png')
Out[13]: