
Maintained by Powerdatahub.com · Apache Airflow 1.10.11

Airflow AWS Module

Terraform module to deploy an Apache Airflow cluster on AWS, backed by RDS PostgreSQL for metadata, S3 for logs, and SQS as the message broker, using the CeleryExecutor.

Terraform supported versions:

Terraform version   Tag
<= 0.11             v0.7.x
>= 0.12             >= v0.8.x

Usage

You can use this module from the Terraform Registry:

module "airflow-cluster" {
  # REQUIRED
  source                   = "powerdatahub/airflow/aws"
  key_name                 = "airflow-key"
  cluster_name             = "my-airflow"
  cluster_stage            = "prod" # Default is 'dev'
  db_password              = "your-rds-master-password"
  fernet_key               = "your-fernet-key" # see https://airflow.readthedocs.io/en/stable/howto/secure-connections.html

  # OPTIONALS
  vpc_id                   = "some-vpc-id"                     # Use default if not provided
  custom_requirements      = "path/to/custom/requirements.txt" # See examples/custom_requirements for more details
  custom_env               = "path/to/custom/env"              # See examples/custom_env for more details
  ingress_cidr_blocks      = ["0.0.0.0/0"]                     # List of IPv4 CIDR ranges to use on all ingress rules
  ingress_with_cidr_blocks = [                                 # List of computed ingress rules to create where 'cidr_blocks' is used
    {
      description = "List of computed ingress rules for Airflow webserver"
      from_port   = 8080
      to_port     = 8080
      protocol    = "tcp"
      cidr_blocks = "0.0.0.0/0"
    },
    {
      description = "List of computed ingress rules for Airflow flower"
      from_port   = 5555
      to_port     = 5555
      protocol    = "tcp"
      cidr_blocks = "0.0.0.0/0"
    }
  ]
  tags                     = {
    FirstKey  = "first-value"                                  # Additional tags to use on resources
    SecondKey = "second-value"
  }
  load_example_dags        = false
  load_default_conns       = false
  rbac                     = true                              # See examples/rbac for more details
  admin_name               = "John"                            # Only if rbac is true
  admin_lastname           = "Doe"                             # Only if rbac is true
  admin_email              = "[email protected]"                 # Only if rbac is true
  admin_username           = "admin"                           # Only if rbac is true
  admin_password           = "supersecretpassword"             # Only if rbac is true
}
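
A fernet key, as referenced above, can be generated with the cryptography package (this is the approach suggested by the Airflow docs linked in the comment):

$ python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"

The output is a URL-safe base64-encoded 32-byte key; pass it as fernet_key.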

Debug and logs

The Airflow service runs under systemd, so logs are available through journalctl.

$ journalctl -u airflow -n 50
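
Since it is a regular systemd unit, the standard commands apply as well, e.g. to follow the logs live or check service state:

$ journalctl -u airflow -f
$ systemctl status airflow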

Todo

  • Run airflow as systemd service
  • Provide a way to pass a custom requirements.txt file on the provision step
  • Provide a way to pass a custom packages.txt file on the provision step
  • RBAC
  • Support for Google OAuth
  • Flower
  • Secure Flower install
  • Provide a way to inject environment variables into airflow
  • Split services into multiple files
  • Auto Scaling for workers
  • Use spot instances for workers
  • Maybe use AWS Fargate to reduce costs

Special thanks to villasv/aws-airflow-stack, an incredible project, for the inspiration.


Requirements

Name Version
terraform >= 0.12

Providers

Name Version
aws n/a
template n/a

Inputs

Name Description Type Default Required
admin_email Admin email. Only if RBAC is enabled; this user will be created in the first run only. string "[email protected]" no
admin_lastname Admin lastname. Only if RBAC is enabled; this user will be created in the first run only. string "Doe" no
admin_name Admin name. Only if RBAC is enabled; this user will be created in the first run only. string "John" no
admin_password Admin password. Only if RBAC is enabled. string false no
admin_username Admin username used to authenticate. Only if RBAC is enabled; this user will be created in the first run only. string "admin" no
ami Default is Ubuntu Server 18.04 LTS (HVM), SSD Volume Type. string "ami-0ac80df6eff0e70b5" no
aws_region AWS Region string "us-east-1" no
azs Run the EC2 Instances in these Availability Zones. map(string) { "1" = "us-east-1a", "2" = "us-east-1b", "3" = "us-east-1c", "4" = "us-east-1d" } no
cluster_name The name of the Airflow cluster (e.g. airflow-xyz). This variable is used to namespace all resources created by this module. string n/a yes
cluster_stage The stage of the Airflow cluster (e.g. prod). string "dev" no
custom_env Path to custom Airflow environment variables. string null no
custom_requirements Path to custom requirements.txt. string null no
db_allocated_storage Database disk size. string 20 no
db_dbname PostgreSQL database name. string "airflow" no
db_instance_type Instance type for the PostgreSQL database. string "db.t2.micro" no
db_password PostgreSQL password. string n/a yes
db_subnet_group_name DB subnet group. If set, the database is created in that subnet group; otherwise it is created in the default VPC. string "" no
db_username PostgreSQL username. string "airflow" no
fernet_key Key for encrypting data in the database - see Airflow docs. string n/a yes
ingress_cidr_blocks List of IPv4 CIDR ranges to use on all ingress rules. list(string) ["0.0.0.0/0"] no
ingress_with_cidr_blocks List of computed ingress rules to create where 'cidr_blocks' is used. list(object({ description = string, from_port = number, to_port = number, protocol = string, cidr_blocks = string })) [Airflow webserver 8080/tcp from 0.0.0.0/0, Airflow flower 5555/tcp from 0.0.0.0/0] no
instance_subnet_id Subnet ID used for EC2 instances running Airflow. If not defined, the first subnet in the VPC's subnet list is used. string "" no
key_name AWS KeyPair name. string null no
load_default_conns Load the default connections initialized by Airflow. Most consider these unnecessary, which is why the default is to not load them. bool false no
load_example_dags Load the example DAGs distributed with Airflow. Useful if deploying a stack for demonstrating a few topologies, operators and scheduling strategies. bool false no
private_key Enter the content of the SSH Private Key to run provisioner. string null no
private_key_path Enter the path to the SSH Private Key to run provisioner. string "~/.ssh/id_rsa" no
public_key Enter the content of the SSH Public Key to run provisioner. string null no
public_key_path Enter the path to the SSH Public Key to add to AWS. string "~/.ssh/id_rsa.pub" no
rbac Enable support for Role-Based Access Control (RBAC). string false no
root_volume_delete_on_termination Whether the volume should be destroyed on instance termination. bool true no
root_volume_ebs_optimized If true, the launched EC2 instance will be EBS-optimized. bool false no
root_volume_size The size, in GB, of the root EBS volume. string 35 no
root_volume_type The type of volume. Must be one of: standard, gp2, or io1. string "gp2" no
s3_bucket_name S3 Bucket to save airflow logs. string "" no
scheduler_instance_type Instance type for the Airflow Scheduler. string "t3.micro" no
spot_price The maximum hourly price to pay for EC2 Spot Instances. string "" no
tags Additional tags passed to the terraform-terraform-label module. map(string) {} no
vpc_id The ID of the VPC in which the nodes will be deployed. Uses default VPC if not supplied. string null no
webserver_instance_type Instance type for the Airflow Webserver. string "t3.micro" no
webserver_port The port the Airflow webserver listens on. Ports below 1024 can be opened only with root privileges, and the airflow process does not run as such. string "8080" no
worker_instance_count Number of worker instances to create. string 1 no
worker_instance_type Instance type for the Celery Worker. string "t3.small" no

Outputs

Name Description
database_endpoint Endpoint to connect to RDS metadata DB
database_username Username to connect to RDS metadata DB
this_cluster_security_group_id The ID of the cluster security group
this_database_security_group_id The ID of the database security group
webserver_admin_url URL for the Airflow Webserver Admin
webserver_public_ip Public IP address for the Airflow Webserver instance


terraform-aws-airflow's People

Contributors

dependabot-preview[bot], edbizarro, fmunteanu, ouadakarim


terraform-aws-airflow's Issues

Inherit provider settings from parent

Hello,

The module seems to define the provider explicitly rather than inheriting it from the parent/root module. This limits the user to configuring only the region and does not allow, for example, static credentials.

I've deleted provider.tf from the module to get around this; however, I wonder whether there are more elegant solutions.
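
For illustration, once the module no longer declares its own provider, the usual Terraform pattern is to configure the provider in the root module and let the child module inherit it implicitly (a sketch; the variable names are hypothetical):

provider "aws" {
  region     = "us-east-1"
  access_key = var.aws_access_key # static credentials now possible
  secret_key = var.aws_secret_key
}

module "airflow-cluster" {
  source = "powerdatahub/airflow/aws"
  # inherits the default aws provider from the root module
  # ...
}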

Thank you!

Service does not start on 0.8.11

I managed to deploy the cluster using examples/deploy; however, the service does not start:

Jul 18 17:26:47 ip-172-31-40-22 systemd[1]: Started Airflow daemon.
Jul 18 17:26:47 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:47,651] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=25465
Jul 18 17:26:47 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:47,879] {__init__.py:51} INFO - Using executor CeleryExecutor
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]:   ____________       _____________
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]:  ____    |__( )_________  __/__  /________      __
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]: ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]: ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]:  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
Jul 18 17:26:48 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:48,581] {__init__.py:305} INFO - Filling up the DagBag from /dev/null
Jul 18 17:26:49 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:49,148] {security.py:446} INFO - Start syncing user roles.
Jul 18 17:26:49 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:49,624] {security.py:357} INFO - Fetching a set of all permission, view_menu from FAB meta-table
Jul 18 17:26:49 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:49,963] {security.py:307} INFO - Cleaning faulty perms
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:50,633] {settings.py:182} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=25507
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]: /usr/local/lib/python3.6/dist-packages/psycopg2/__init__.py:144: UserWarning: The psycopg2 wheel package will be renamed from release 2.8; in order to keep installing from binary please use "pip install psycopg2-binary" instead. For details see: <http://initd.org/psycopg/docs/install.html#binary-install-from-pypi>.
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   """)
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]: [2019-07-18 17:26:50 +0000] [25507] [INFO] Starting gunicorn 19.9.0
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]: Traceback (most recent call last):
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/bin/gunicorn", line 11, in <module>
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     sys.exit(run())
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/app/wsgiapp.py", line 61, in run
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     WSGIApplication("%(prog)s [OPTIONS] [APP_MODULE]").run()
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/app/base.py", line 223, in run
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     super(Application, self).run()
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/app/base.py", line 72, in run
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     Arbiter(self).run()
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/arbiter.py", line 199, in run
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     self.start()
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/arbiter.py", line 139, in start
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     self.pidfile.create(self.pid)
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/local/lib/python3.6/dist-packages/gunicorn/pidfile.py", line 36, in create
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     fd, fname = tempfile.mkstemp(dir=fdir)
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/lib/python3.6/tempfile.py", line 483, in mkstemp
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     return _mkstemp_inner(dir, prefix, suffix, flags, output_type)
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:   File "/usr/lib/python3.6/tempfile.py", line 401, in _mkstemp_inner
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]:     fd = _os.open(file, flags, 0o600)
Jul 18 17:26:50 ip-172-31-40-22 terraform-aws-airflow[25465]: PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/tmpqk3e9n1v'

stat : no such file or directory

apply failed with:

Error: Error applying plan:

3 error(s) occurred:

* module.airflow-cluster.aws_instance.airflow_webserver: stat : no such file or directory
* module.airflow-cluster.aws_instance.airflow_worker: stat : no such file or directory
* module.airflow-cluster.aws_instance.airflow_scheduler: stat : no such file or directory

Config was as in this issue:
#9

I think the issue may be that the package does not contain requirements.txt and I did not specify a custom one. (I see the .txt in GitHub, but it may not have been included in the module; I'm not seeing it in .terraform.)

Terraform has an open issue where missing files will cause the file provisioner to fail with a cryptic message.

Can work around it by creating an empty requirements.txt.
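
For example (the file name is arbitrary):

$ touch empty-requirements.txt

and then set custom_requirements = "empty-requirements.txt" in the module call.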

Edit: still encountering this after providing requirements.txt. Unfortunately, Terraform doesn't say which resource is missing.

Scheduler number of runs

Hi,

I noticed that the scheduler's number of runs is set to 10.

What was the reason for this? To pick up new DAG files? Because for that, as far as I understood, dag_dir_list_interval already does it automatically every 300 seconds by default.
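
For context, Airflow's scheduler exposes this as the -n/--num_runs flag, so the service presumably launches something like (a guess at the unit's command, not confirmed from the module source):

$ airflow scheduler -n 10

which makes the scheduler exit (and be restarted by systemd) after 10 scheduler loops.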

Issues deploying an Airflow cluster due to obsolete ami image default and environment provisioning failure in EC2 instances.

Hi all, and congratulations on this great job.

  1. The default AMI used to deploy EC2 instances is no longer available (at least in the eu-west-1 zone), so I had to use the more recent ami-089cc16f7f08c4457 (ubuntu/images/hvm-ssd/ubuntu-bionic-18.04-amd64-server-20200611) instead, setting it in main.tf with the ami option; that worked without problems.

  2. In the EC2 instance provisioning file cloud-init.sh, setting the Airflow environment variables in /etc/environment fails because the preceding "export " produces an incorrect line format for that file; to avoid this, I propose the following patch:

--- .terraform/modules/airflow_cluster/terraform-aws-airflow-0.12.0/files/cloud-init.sh.orig    2020-06-25 11:42:51.153032110 +0000
+++ .terraform/modules/airflow_cluster/terraform-aws-airflow-0.12.0/files/cloud-init.sh 2020-06-26 06:42:23.942287388 +0000
@@ -92,11 +92,12 @@

    cat /etc/environment | sudo tee -a /tmp/airflow_environment
    cat /tmp/custom_env | sudo tee -a /tmp/airflow_environment
-   sed 's/^/export /' -- </tmp/airflow_environment | sudo tee -a /etc/environment
+    sudo cp /tmp/airflow_environment /etc/environment
+   sed 's/^/export /' -- </tmp/airflow_environment | sudo tee -a /tmp/airflow_exportenv
    sudo cat /tmp/airflow.service >> /etc/systemd/system/airflow.service
    cat /tmp/airflow_environment | sudo tee -a /etc/sysconfig/airflow

-   source /etc/environment
+   source /tmp/airflow_exportenv

    if [ "$AIRFLOW__CORE__LOAD_DEFAULTS" = false ]; then
            airflow upgradedb

Best regards,

Juan M.Victoria

[Proposal] New features

I installed your module; thank you for making everyone's life easier when deploying Airflow with Terraform. 😄

At Expedia, we need a few specific variables to be added. We need to be able to:

  • define custom subnets, AZs (#19) and security groups
  • be able to use an existing SSH key from the AWS console (instead of creating a fresh one)
  • restrict IAM role permissions (current permissions could allow a user logged into EC2 to delete all buckets, instead of being restricted to the one created by Terraform)
  • use custom tags we can define, for all Terraform resources
  • use SSL for Admin UI and Flower
  • use LDAP authentication for Admin UI (admin, user, read_only, etc.)

I would like to contribute all these features to the code, as they should be useful to other people. How do you want me to approach this? Should I clone master and create a branch to be evaluated by your contributors?

Cannot define azs

module "airflow_cluster" {
    source             = "powerdatahub/airflow/aws"
    version            = "0.8.11"
    ...
    ami                = "ami-005bdb005fb00e791" # us-west-2
    aws_region         = "us-west-2"
    azs = {
        "1"            = "us-west-2a",
        "2"            = "us-west-2b",
        "3"            = "us-west-2c"
    }
    ...
}

Error:

Error launching source instance: Unsupported: Your requested instance type (t3.micro) is not supported in your requested Availability Zone (us-west-2d). Please retry your request by not specifying an Availability Zone or choosing us-west-2a, us-west-2b, us-west-2c.
	status code: 400, request id: 7afc6751-570a-4bca-a1da-9a012b242919

Bump dependencies for Terraform v0.13.2

I downloaded the latest Terraform and tried to run terraform init, but the https://github.com/cloudposse/terraform-terraform-label.git repo dependency is pinned to ref=tags/0.4.0, which is apparently incompatible with v0.13.x.

Error: Unsupported Terraform Core version

  on .terraform/modules/airflow-cluster.airflow_labels/versions.tf line 2, in terraform:
   2:   required_version = "~> 0.12.0"

Module module.airflow-cluster.module.airflow_labels (from
git::https://github.com/cloudposse/terraform-terraform-label.git?ref=tags/0.4.0)
does not support Terraform version 0.13.2. To proceed, either choose another
supported Terraform version or update this version constraint. Version
constraints are normally set for good reason, so updating the constraint may
lead to other errors or unexpected behavior.

The same error is reported for airflow_labels_scheduler, airflow_labels_webserver, and airflow_labels_worker.

S3 bucket creation error

Error message:

Error: Error creating S3 bucket: AuthorizationHeaderMalformed: The authorization header is malformed; the region 'us-east-1' is wrong; expecting 'ap-south-1'
        status code: 400, request id: 28351BE143721736, host id: MshbxPXefuouNbELCQa/sKSsv4YSOShgBuAjBz9CiDhUI18YF7hRhPXNXmP6xrNvX6er3MWPeNM=

  on .terraform/modules/airflow/PowerDataHub-terraform-aws-airflow-a679957/main.tf line 22, in resource "aws_s3_bucket" "airflow_logs":
  22: resource "aws_s3_bucket" "airflow_logs" {


Terraform version: v0.12.9

Config file:

provider "aws" {
  profile = "default"
  region  = "us-east-1"
}

module "airflow" {
  version = "0.9.1"
  # 
}
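
For what it's worth, this S3 error usually means the bucket name already exists in another region (here ap-south-1), and the module's aws_region variable defaults to us-east-1; a sketch of deploying everything consistently in one region, assuming ap-south-1 is the intended one:

provider "aws" {
  profile = "default"
  region  = "ap-south-1"
}

module "airflow" {
  source     = "powerdatahub/airflow/aws"
  version    = "0.9.1"
  aws_region = "ap-south-1" # keep in sync with the provider region
  # ...
}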

Failed to parse key file

From some preliminary Google searches, this is possibly due to ssh-agent (not confident in that, though). I'm trying this on OS X.


* module.airflow-cluster.aws_instance.airflow_webserver: Failed to parse key file "-----BEGIN OPENSSH PRIVATE KEY-----\n*redacted*\n-----END OPENSSH PRIVATE KEY-----\n": ssh: cannot decode encrypted private keys
* module.airflow-cluster.aws_instance.airflow_scheduler: Failed to parse key file "-----BEGIN OPENSSH PRIVATE KEY-----\n*redacted*\n-----END OPENSSH PRIVATE KEY-----\n": ssh: cannot decode encrypted private keys
* module.airflow-cluster.aws_instance.airflow_worker: Failed to parse key file "-----BEGIN OPENSSH PRIVATE KEY-----\n*redacted*\n-----END OPENSSH PRIVATE KEY-----\n": ssh: cannot decode encrypted private keys

Haven't tried to work around it yet, but let me know if there's more info I can supply.
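
The "cannot decode encrypted private keys" message suggests the key is passphrase-protected, which Terraform's SSH provisioner cannot handle. A common workaround (a sketch; file names are illustrative) is to point private_key_path at an unencrypted PEM copy of the key:

$ cp ~/.ssh/id_rsa ~/.ssh/airflow_deploy_key
$ ssh-keygen -p -N "" -m PEM -f ~/.ssh/airflow_deploy_key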

Specify vpc_id failed when creating RDS instance

Module version: 0.7.2 (after an initial investigation, this issue might also occur in the latest version, 0.8.10)

Issue: when I provide the vpc_id variable (my existing VPC), creation of the RDS instance fails.

Reason: this module creates the RDS instance in the default VPC; when a VPC ID is provided, it creates the security group in that VPC and attaches it to the default-VPC database, which causes the failure.

See vpc_security_group_ids in the code below (from main.tf):

resource "aws_db_instance" "airflow_database" {
  identifier = "${module.airflow_labels.id}-db"
  allocated_storage = "${var.db_allocated_storage}"
  engine = "postgres"
  engine_version = "11.1"
  instance_class = "${var.db_instance_type}"
  name = "${var.db_dbname}"
  username = "${var.db_username}"
  password = "${var.db_password}"
  storage_type = "gp2"
  backup_retention_period = 14
  multi_az = false
  publicly_accessible = false
  apply_immediately = true
  skip_final_snapshot = true
  vpc_security_group_ids = ["${module.sg_database.this_security_group_id}"]
  port = "5432"
}

Possible solution: create a subnet group in the provided VPC and pass it to aws_db_instance's db_subnet_group_name argument.
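
A minimal sketch of that fix, matching the 0.11-style syntax above (var.db_subnet_ids is a hypothetical list of subnet IDs inside the provided VPC):

resource "aws_db_subnet_group" "airflow" {
  name       = "${module.airflow_labels.id}-db"
  subnet_ids = "${var.db_subnet_ids}" # subnets in the provided VPC
}

resource "aws_db_instance" "airflow_database" {
  # ...existing arguments as above...
  db_subnet_group_name = "${aws_db_subnet_group.airflow.name}"
}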
