edb-terraform

Terraform templates that provide an easy-to-use YAML configuration file for describing the target cloud infrastructure.

Supported Cloud providers

| Provider | Component                  |
|----------|----------------------------|
| EDB      | BigAnimal - AWS            |
| AWS      | EC2 - VM                   |
| AWS      | EC2 - additional EBS vol.  |
| AWS      | multi-region VPC peering   |
| AWS      | Security (ports)           |
| AWS      | RDS                        |
| AWS      | RDS Aurora                 |
| AWS      | Elastic Kubernetes Service |
| GCloud   | Compute Engine - VM        |
| GCloud   | CloudSQL                   |
| GCloud   | AlloyDB                    |
| GCloud   | Google Kubernetes Engine   |
| EDB      | BigAnimal - Azure          |
| Azure    | VM                         |
| Azure    | Database - Flexible        |
| Azure    | Cosmos DB                  |
| Azure    | Azure Kubernetes Service   |

Prerequisites and installation

The following components must be installed on the system:

  • Python3 >= 3.6
  • AWS CLI
  • GCloud CLI
  • Azure CLI
  • BigAnimal token (CLI currently optional)
  • Terraform >= 1.3.6

Infrastructure file examples

Infrastructure files describing the target cloud can be found in the infrastructure-examples directory.
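As an orientation only, an infrastructure file might look roughly like the sketch below. The key names here are illustrative assumptions inferred from the servers.yml output shown later in this document; consult the files in infrastructure-examples for the authoritative schema.

```yaml
# Hypothetical sketch of an infrastructure file -- key names are
# assumptions, not the official schema.
aws:
  tags:
    cluster_name: Demo-Infra
  machines:
    pg1:
      type: postgres
      region: us-east-1
      zone: us-east-1b
      instance_type: c5.4xlarge
      volumes:
        - mount_point: /opt/pg_data
          size_gb: 20
          type: io2
          iops: 5000
```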

Prerequisites installation on Debian 11

Python3/pip3

$ sudo apt install python3 python3-pip -y
$ sudo pip3 install pip --upgrade

BigAnimal Token by API

Getting an API token

  • access_token - expires after 24 hours
  • refresh_token - changes with every refresh (a new access_token is issued at the same time); it expires after 30 days, when it is refreshed, or when an expired refresh_token is reused
wget https://raw.githubusercontent.com/EnterpriseDB/cloud-utilities/main/api/get-token.sh
bash get-token.sh
# Visit the biganimal link to activate the device
# ex. Please login to https://auth.biganimal.com/activate?user_code=JWPL-RCXL with your BigAnimal account
#     Have you finished the login successfully. (y/n)
# Save the refresh token, if needed
export BA_BEARER_TOKEN=<access_token>

Refresh the token

bash get-token.sh --refresh <refresh_token>
# Save the new refresh token, if needed again
export BA_BEARER_TOKEN=<access_token>
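Since the access_token expires after 24 hours, it can be useful to check its expiry locally before reusing it. The sketch below assumes the token is a standard three-part JWT carrying an `exp` claim; this is an assumption about the token format, not something documented by edb-terraform itself.

```python
import base64
import json
import time

def jwt_expiry(token: str) -> int:
    """Return the 'exp' claim (Unix time) of a JWT without verifying it."""
    payload_b64 = token.split(".")[1]
    # Restore the base64 padding that JWTs strip off.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"]

# Demo with a locally built token (header.payload.signature); a real
# BA_BEARER_TOKEN would be used in its place.
payload = base64.urlsafe_b64encode(
    json.dumps({"exp": 1700000000}).encode()
).rstrip(b"=").decode()
token = "e30." + payload + ".sig"
print(jwt_expiry(token))                # 1700000000
print(jwt_expiry(token) < time.time())  # True once that time has passed
```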

BigAnimal CLI

Using the BigAnimal CLI

The CLI currently requires users to visit a link when running biganimal reset-credential. Getting the token directly from the API is preferred, as it avoids needing to revisit the link.

AWS CLI

$ sudo pip3 install awscli

AWS Access Key and Secret Access Key configuration:

$ aws configure

GCloud CLI

Install CLI

Initialize GCloud and export project id

$ gcloud init
$ export GOOGLE_PROJECT=<project_id>

Terraform

$ sudo apt install unzip -y
$ wget https://releases.hashicorp.com/terraform/1.3.6/terraform_1.3.6_linux_amd64.zip
$ unzip terraform_1.3.6_linux_amd64.zip
$ sudo install terraform /usr/bin

edb-terraform installation

$ git clone https://github.com/EnterpriseDB/edb-terraform.git


$ python3 -m pip install edb-terraform/. --upgrade


Cloud Resources Creation

Once the infrastructure file has been created, we can proceed with cloud resource creation:

  1. We can attempt to set up a compatible version of Terraform. The binary is placed inside of ~/.edb-terraform/bin, and logs can be found inside of ~/.edb-terraform/logs.

    $ edb-terraform setup
  2. A new Terraform project must be created with the help of the edb-terraform script. This script is in charge of creating a dedicated directory for the project, generating SSH keys, building the Terraform configuration based on the infrastructure file, and copying Terraform modules into the project directory.

    a. The first argument is the project path; the second is the path to the infrastructure file.

    Use option -c to specify the cloud provider: azure, aws, or gcloud.

    Defaults to aws if the option is not used.

    $ edb-terraform generate --project-name aws-terraform \
                             --cloud-service-provider aws \
                             --infra-file edb-terraform/infrastructure-examples/aws/edb-ra-3.yml \
                             --user-templates edb-terraform/infrastructure-examples/templates/inventory.yml.tftpl

    b. The Terraform initialisation step below can be skipped if option --validate is included with generate, which provides basic validations and checks through Terraform.


  3. Terraform initialisation of the project:

    $ cd aws-terraform
    $ terraform init
  4. Apply the cloud resource creation:

    $ cd aws-terraform
    $ terraform apply -auto-approve


SSH access to the machines

Once cloud resource provisioning is complete, the machines' public and private IPs are stored in the servers.yml file, located in the project's directory. These outputs can be used with a list of templates to generate files for other programs such as Ansible; the templates referenced under Customizations use the outputs shown below.

Example:

---
servers:
  machines:
    dbt2-driver:
      additional_volumes: []
      instance_type: "c5.4xlarge"
      operating_system: {"name":"debian-10-amd64","owner":"136693071363","ssh_user":"admin"}
      private_ip: "10.2.20.38"
      public_dns: "ec2-54-197-78-139.compute-1.amazonaws.com"
      public_ip: "54.197.78.139"
      region: "us-east-1"
      tags: {"Name":"dbt2-driver-Demo-Infra-d8d0a932","cluster_name":"Demo-Infra","created_by":"edb-terraform","terraform_hex":"d8d0a932","terraform_id":"2NCpMg","terraform_time":"2023-05-24T21:09:11Z","type":"dbt2-driver"}
      type: null
      zone: "us-east-1b"
    pg1:
      additional_volumes: [{"encrypted":false,"iops":5000,"mount_point":"/opt/pg_data","size_gb":20,"type":"io2"},{"encrypted":false,"iops":5000,"mount_point":"/opt/pg_wal","size_gb":20,"type":"io2"}]
      instance_type: "c5.4xlarge"
      operating_system: {"name":"Rocky-8-ec2-8.6-20220515.0.x86_64","owner":"679593333241","ssh_user":"rocky"}
      private_ip: "10.2.30.197"
      public_dns: "ec2-3-89-238-24.compute-1.amazonaws.com"
      public_ip: "3.89.238.24"
      region: "us-east-1"
      tags: {"Name":"pg1-Demo-Infra-d8d0a932","cluster_name":"Demo-Infra","created_by":"edb-terraform","terraform_hex":"d8d0a932","terraform_id":"2NCpMg","terraform_time":"2023-05-24T21:09:11Z","type":"postgres"}
      type: null
      zone: "us-east-1b"
[...]

You can also use terraform output to get a JSON object for further use:

terraform output -json servers | python3 -m json.tool
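As a sketch of consuming that JSON, the snippet below builds one SSH command per machine using the generated private key. The field names match the servers.yml example above; the parsing itself is an illustration, not part of edb-terraform.

```python
import json

# Sample shaped like `terraform output -json servers` (trimmed to the
# fields this sketch needs).
raw = json.dumps({
    "machines": {
        "dbt2-driver": {
            "operating_system": {"ssh_user": "admin"},
            "public_ip": "54.197.78.139",
        },
        "pg1": {
            "operating_system": {"ssh_user": "rocky"},
            "public_ip": "3.89.238.24",
        },
    }
})

servers = json.loads(raw)
# Build an ssh command per machine, using the project's generated key.
ssh_commands = [
    "ssh -i ssh-id_rsa {}@{}".format(
        m["operating_system"]["ssh_user"], m["public_ip"]
    )
    for m in servers["machines"].values()
]
for cmd in ssh_commands:
    print(cmd)  # e.g. ssh -i ssh-id_rsa admin@54.197.78.139
```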

SSH key files: ssh-id_rsa and ssh-id_rsa.pub.

Customizations

Users can further modify their resources after the initial provisioning. If any output files are needed based on the resources, Terraform templates can be added to the project's templates directory; they are rendered with the resource outputs once all resources are created. Examples of template files include the inventory.yml included with edb-ansible and the sample inventory.yml in this repository.
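To give a feel for what such a template might contain, here is a hypothetical sketch of an inventory template using Terraform's string-template directives. The servers variable name and machine fields are assumptions based on the servers.yml output above, not the repository's actual template.

```
# inventory.yml.tftpl -- hypothetical sketch; the `servers` variable
# and field names are assumptions.
---
all:
  children:
    machines:
      hosts:
%{ for name, machine in servers.machines ~}
        ${name}:
          ansible_host: ${machine.public_ip}
          ansible_user: ${machine.operating_system.ssh_user}
          private_ip: ${machine.private_ip}
%{ endfor ~}
```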


Cloud resources destruction

$ cd aws-terraform
$ terraform destroy -auto-approve


Contributors

bryan-bar, jt-edb, mw2q, dougortiz, rdimag, richyen
