One of the biggest trends in cloud is consuming cloud-provided services rather than creating and managing your own VM-based services. Extending CloudCenter’s services library through the new External Services functionality lets you use these services alongside any other service your application might need.
In my last post, I talked about the new types of services available and gave examples of what each can supply.
In my rant this week, the focus will be on the External Service type and connecting to Amazon Web Services to create a DynamoDB instance. DynamoDB provides a NoSQL database and is an example of what we at CliQr consider a platform service because it:
- Is hosted by a cloud platform and therefore cloud specific
- Requires no creation of virtual infrastructure
- Requires no installation of software
- Requires no ongoing service maintenance
This is a good time to note that our application profiles are generally cloud agnostic so that they can be deployed to any cloud. Because DynamoDB is a platform-specific service, however, modeling a CloudCenter application profile with this service will limit your target cloud options for deployment. If flexibility is important to your organization, I would recommend one of the following approaches:
- Automate the deployment of a similar service through CloudCenter to deliver speed and agility for users while maintaining cloud portability. An example would be to use MongoDB rather than the AWS-specific DynamoDB
- Use the multi-cloud deployment functionality (available in CloudCenter 4.4 and above), which allows users to deploy specific application tiers to different clouds. In this case, DynamoDB could be in AWS, and the rest of the services in another cloud
This walkthrough will use Python to run a script to call AWS. Along the way, we’ll install Boto3 and other tools to help make our lives easier. The end result will be a DynamoDB table called “users” with only two attributes: “username” and “last_name”. The script came from an example that AWS has published here.
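Before diving into the setup, it helps to see what the heart of the start script boils down to: a single Boto3 `create_table` call. The sketch below separates the request definition from the API call so you can inspect the table layout on its own. Note the key schema (username as partition key, last_name as sort key) and the throughput numbers are my assumptions based on the AWS example, not taken verbatim from the bundled scripts.

```python
# Sketch of the table definition behind the start script.
# Assumption: "username" is the partition key and "last_name" the sort key.

def users_table_request():
    """Build the keyword arguments for a Boto3 create_table call."""
    return {
        "TableName": "users",
        "KeySchema": [
            {"AttributeName": "username", "KeyType": "HASH"},    # partition key
            {"AttributeName": "last_name", "KeyType": "RANGE"},  # sort key
        ],
        "AttributeDefinitions": [
            {"AttributeName": "username", "AttributeType": "S"},
            {"AttributeName": "last_name", "AttributeType": "S"},
        ],
        "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
    }

def create_users_table():
    """Call DynamoDB; requires Boto3 installed and valid AWS credentials."""
    import boto3  # imported lazily so the request builder works without the SDK
    dynamodb = boto3.resource("dynamodb")
    return dynamodb.create_table(**users_table_request())
```

Keeping the request in its own function makes it easy to change the table name or keys later without touching the API call.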
- Download the service start and stop scripts, bundled together here
- Ensure that you have an AWS user account with a valid access key (both the access key ID and the secret access key)
- Have the target AWS region name ready (ex: us-east-1)
Once you have downloaded the scripts, you’ll see three pieces of information in each script that you need to fill in: **YOUR_KEY**, **YOUR_SECRET_KEY**, and **YOUR_AWS_REGION**. This information is required to connect to the AWS API via the Boto3 SDK. In later articles, we’ll show you how to let users provide credentials at deploy time rather than hard-coding them in the script.
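To see what those placeholders turn into, here is a small, hypothetical helper that renders the credentials file content the scripts write out to ~/.aws/credentials (the function name is mine, not part of the bundled scripts):

```python
# Hypothetical helper mirroring what the start/stop scripts echo into
# ~/.aws/credentials once the three placeholders are filled in.

def render_credentials(key, secret_key, region):
    """Render an AWS credentials file with a single [default] profile."""
    return (
        "[default]\n"
        f"aws_access_key_id = {key}\n"
        f"aws_secret_access_key = {secret_key}\n"
        f"region = {region}\n"
    )
```

Boto3 will also accept the same three values directly as the `aws_access_key_id`, `aws_secret_access_key`, and `region_name` arguments to `boto3.resource()`, which is one way to skip the file entirely once credentials come from deploy-time parameters.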
- Upload the start and stop scripts that you modified to a CloudCenter accessible URL or repository
- Logged in as a tenant owner or co-admin, navigate to Admin > Infrastructure > Services
- Create a new service
- Select a Service Type: External Service
- Fill in required fields like name, service ID, category (to make it easier to find later on), and a cost per hour.
- Enter the location of the service scripts
- Start field: Choose either URL or repository (based on where you uploaded the “dynamodemo.sh” script)
- Start field value: relative location on repository or URL to the script
- Stop field: Choose either URL or repository (based on where you uploaded the “dynamodemodelete.sh” script)
- Stop field value: relative location on repository or URL to the script
- Save the service
- Model a new N-Tier application profile and drag in the newly created DynamoDB service from the services library
- Fill in all required fields and save the application profile
Test your work
- Deploy the DynamoDB application profile
- Log into AWS console
- Make sure you’re in the region you specified in the scripts
- Navigate to the DynamoDB area and there should be a new table called “users”
- In CloudCenter, go into the deployments area and terminate the DynamoDB deployment
- The DynamoDB instance should be removed from AWS
Note that regardless of the cloud you choose from the dropdown list on the deployment screen, the service will be deployed in the AWS region you specified in the scripts. In later articles, we will talk more about how you can parameterize these and other variables so they aren’t static.
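To give a flavor of that parameterization, one simple pattern is for the Python payload to prefer an environment variable supplied at deploy time and fall back to the hard-coded value only when none is set. The variable name below is illustrative, not something CloudCenter defines:

```python
import os

def resolve_region(default="us-east-1"):
    """Prefer a deploy-time AWS_REGION environment variable; else the default."""
    return os.environ.get("AWS_REGION", default)
```

The same pattern works for the access key and secret key, so a single pair of scripts can serve any region or account without editing.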
As I mentioned in the previous blog article, for security, CloudCenter External Services run these scripts and commands in an ephemeral Docker container that’s destroyed after every use. That means that regardless of what you’re connecting to, some setup work has to happen each time. In the case of AWS, regardless of which AWS service you want to use, you have to install the Python SDK (Boto3) and supporting tools each time.
The scripts I’ve provided do a lot of this work for you. If you wanted to use these scripts for any other AWS service, all you would have to do is put your Python commands into the highlighted area here:
#Building AWS credentials file
echo "[default]
aws_access_key_id = **YOUR_KEY**
aws_secret_access_key = **YOUR_SECRET_KEY**
region = **YOUR_AWS_REGION**" > ~/.aws/credentials

#Building Python script
echo "import boto3
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('users')
table.delete()" > dynamodb.py
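If you were adapting these scripts for another AWS service, the replacement payload is just a different set of Boto3 calls echoed into the Python file. Here is a hypothetical example for S3 (the bucket name is made up; S3 bucket names must be globally unique):

```python
# Hypothetical replacement payload for the highlighted area: swap the
# DynamoDB commands for any other Boto3 calls, e.g. creating an S3 bucket.

S3_PAYLOAD = """\
import boto3
s3 = boto3.resource('s3')
s3.create_bucket(Bucket='cliqr-demo-bucket')  # must be globally unique
"""
```

The stop script’s payload would then delete the same bucket, keeping the start/stop pairing that External Services expect.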