tool

There are many tools available for managing AWS stacks; it is not possible to list them all, nor to keep such a list up to date.

As examples, you can find some of these tools below.

Sceptre

Sceptre is a tool to drive CloudFormation. Sceptre manages the creation, update and deletion of stacks while providing meta commands which allow users to retrieve information about their stacks. Yes, this is how the official site introduces Sceptre, and yes, it is an all-around tool that you can use to manage your stacks from the CLI or from Python scripts.

Sceptre offers many commands, and you can manage everything you need through YAML files and, if you want, some Python scripts.

Sceptre is available as a Python package or as a Docker image,

pip install sceptre # for installing Sceptre
sceptre --help # for printing its commands

If you want to install Sceptre and also its plugins for the examples below,

cd tool/sceptre
make install

If you want to install only the Sceptre plugins for the examples below,

cd tool/sceptre
make compile

By shell

Sceptre deploys one stack for each configuration file, and each configuration file uses one template file. The relation between configuration files and template files can be many to one or one to one.
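
For reference, a minimal stack configuration file might look like the sketch below; the paths and the parameter are hypothetical, and depending on the Sceptre version the template may be referenced with the template_path key or with the newer template key.

# hypothetical stack configuration file, e.g. basic/ec2.yaml
template_path: templates/ec2.yaml   # the template file used by this stack
parameters:
  InstanceType: t2.micro            # values passed to the template Parameters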

The command for validating a configuration file and its related template file,

export AWS_PROFILE=your-account
sceptre validate path/configuration-file

The command for deploying the stack described by that configuration file,

sceptre launch path/configuration-file

Once the stack has been created, you can get all the information about the stack outputs described in the Outputs section of the template. With the command below you can get the stack outputs,

sceptre list outputs path/configuration-file
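
The Outputs section of a template might look like the sketch below (the logical names are hypothetical):

Outputs:
  InstanceId:
    Description: Id of the EC2 instance created by the stack
    Value: !Ref EC2Instance
  PublicIp:
    Description: Public IP address of the instance
    Value: !GetAtt EC2Instance.PublicIp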

For updating the stack after changes to parameters or template files,

sceptre launch path/configuration-file

For deleting the stack,

sceptre delete path/configuration-file

Below you can find a simple example (*) of the deployment of a Minetest server and a Lambda function.

cd tool/sceptre
export AWS_PROFILE=your-account
sceptre validate basic/ec2 # an example for validating one configuration file
sceptre validate basic # an example for validating all configuration files of the environment named basic
sceptre launch basic # for deploying stacks
sceptre delete basic # for deleting stacks

Below you can find an example (*) of the deployment of a Minetest server, a Lambda function and more.

cd tool/sceptre
export AWS_PROFILE=your-account
sceptre launch more

Below you can find an example (*) of the deployment of a Minetest server, a Lambda function and whatever else you can manage.

cd tool/sceptre
export AWS_PROFILE=your-account
sceptre launch plus

(*) Please pay attention: in the config.yaml files there are some identifiers that you need to change before running Sceptre!
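
For reference, a group-level config.yaml usually contains at least the project code and the region, together with any account-specific identifier used by the stacks of that group; the values below are hypothetical.

# hypothetical content of a config.yaml
project_code: minetest    # prefix used to build the stack names
region: eu-west-1         # AWS region where the stacks are deployed
profile: your-account     # AWS profile used for the deployment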

By a bash script

An ad hoc script might be useful for a specific CI/CD system. Generally, it is not necessary.

Remember

Sceptre provides two powerful components:

  • hooks, for running your own scripts at particular hook points
  • resolvers, for retrieving values for your configuration files from stack outputs, AWS SSM parameters, and so on

So you can avoid hardcoding a password directly in a property or in a parameter of the configuration file, for example as in the sketch below.
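
A minimal sketch follows: the parameter names, the stack name and the SSM path are hypothetical, !stack_output and !cmd are built into Sceptre, and !ssm assumes an SSM resolver plugin (for example sceptre-ssm-resolver) is installed.

# hypothetical fragment of a stack configuration file
parameters:
  VpcId: !stack_output network.yaml::VpcId   # resolver: reads the value from another stack's outputs
  DbPassword: !ssm /myapp/db/password        # resolver: reads the value from AWS SSM Parameter Store
hooks:
  after_create:
    - !cmd "echo stack created"              # hook: runs a shell command after the stack creation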

Another fantastic feature is that Sceptre can use a Python script as a template, for example one written with the Python package troposphere.
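
Such a template is referenced from the stack configuration file exactly like a YAML one; below is a sketch with a hypothetical path (in Sceptre 2.x the Python script is expected to expose a sceptre_handler function returning the template body, typically built with troposphere).

# hypothetical stack configuration file
template_path: templates/ec2.py   # Sceptre runs the script and uses its return value as the template
parameters:
  InstanceType: t2.micro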

Serverless framework

The Serverless framework helps you develop and deploy your AWS Lambda functions, along with the AWS infrastructure resources they require. Yes, this is how the official site introduces the Serverless framework, and it shows its strength and its weakness very well: this framework revolves around AWS Lambda functions, and whatever it does not manage natively has to be configured as a resource in the CloudFormation format (with small exceptions), unless you create a plugin to manage it more easily.

The Serverless framework offers many custom properties, and you can manage everything you need through YAML and JSON files and the conditions system.
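
For reference, a minimal serverless.yaml might look like the sketch below (service, function and resource names are hypothetical): the Lambda function is a first-class citizen, while everything else is declared as plain CloudFormation under the resources section.

service: my-service
provider:
  name: aws
  runtime: python3.9
functions:
  hello:
    handler: handler.hello       # module.function of the Lambda handler
resources:
  Resources:                     # plain CloudFormation for what the framework does not manage natively
    MyBucket:
      Type: AWS::S3::Bucket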

The Serverless plugin registry is available on GitHub or from the shell with the command below,

npm install -g serverless # for installing Serverless framework
serverless plugin list # for listing its plugins

The Serverless framework does not have many commands, but they are sufficient to manage a stack starting from the configuration files.

By shell

The Serverless framework documentation is rich and clear, and the framework is simple to use: if you do not use plugins, a few commands are enough.

For deploying your stack,

cd serverless-path/
export AWS_PROFILE=your-account
serverless deploy --stage name-of-your-environment
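
The value passed with --stage can be reused inside serverless.yaml through the variables system, for example,

provider:
  name: aws
  stage: ${opt:stage, 'dev'}   # the stage passed on the command line, with dev as default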

For deleting your stack,

cd serverless-path/
export AWS_PROFILE=your-account
serverless remove --stage name-of-your-environment

Below you can find a simple example of the deployment of a Lambda function through a serverless.yaml file.

After downloading (or updating) the code of the Lambda function and installing its requirements and the Serverless plugin,

cd tool/serverless/
git clone https://github.com/bilardi/aws-saving
cd aws-saving/
pip3 install -r requirements.txt -t .
npm install serverless-python-requirements
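
Note that, besides the npm installation, the plugin also has to be listed in the plugins section of serverless.yaml; the files provided in this repository may already declare it, but the fragment looks like this,

plugins:
  - serverless-python-requirements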

You can deploy the Lambda function with the Serverless framework,

cd tool/serverless/
export AWS_PROFILE=your-account
cp *yaml aws-saving/; cd aws-saving/
serverless deploy --stage only-lambda

Below you can find an example of the deployment of a Lambda function and an EC2 instance with the Minetest server installed, through the serverless.yaml file and other configuration files.

You have to edit the serverless.yaml file to change the subnet and the security group before deploying the EC2 instance.
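
The exact structure depends on the repository files, but the properties to look for are of this kind (the logical name and the ids below are placeholders):

resources:
  Resources:
    MinetestInstance:                  # hypothetical logical name, check the actual serverless.yaml
      Type: AWS::EC2::Instance
      Properties:
        SubnetId: subnet-xxxxxxxx      # replace with your subnet id
        SecurityGroupIds:
          - sg-xxxxxxxx                # replace with your security group id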

cd tool/serverless/
export AWS_PROFILE=your-account
cp *yaml aws-saving/; cd aws-saving/
serverless deploy --stage ec2-basic

There is another stage where you can see the security group creation: ec2-more. With this configuration, you have to edit the serverless.yaml file to change the VPC id and the subnet before deploying the EC2 instance.

cd tool/serverless/
export AWS_PROFILE=your-account
cp *yaml aws-saving/; cd aws-saving/
serverless deploy --stage ec2-more

By a bash script

An ad hoc script is only necessary if you need to manage

  • several applications in the same repository, with one stack for each application, so that you can loop over their deployments
  • a case where, depending on where you deploy the stack, the instance needs special precautions

Remember

When you deploy your objects with the Serverless framework,

Other tools are coming soon.