

Event Function

This module configures a system which responds to events by invoking a Cloud Functions function.

The root module configures a function sourced from a directory on localhost to respond to a given event trigger. The source directory is compressed and uploaded as a Cloud Storage bucket object, from which the function is deployed.

Alternatively, the repository-function submodule configures a function sourced from a Cloud Source Repositories repository.

Compatibility

This module is meant for use with Terraform 0.13+ and tested using Terraform 1.0+. If you find incompatibilities using Terraform >=0.13, please open an issue.

If you haven't upgraded and need a Terraform 0.12.x-compatible version of this module, the last released version intended for Terraform 0.12.x is v1.6.0.
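For example, a minimal pin, assuming the registry source used throughout this page's examples:

module "event_function" {
  source  = "terraform-google-modules/event-function/google"
  version = "~> 1.6.0" # last release compatible with Terraform 0.12.x

  # ...
}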

Usage

The automatic-labelling-from-localhost example is a tested reference of how to use the root module with the event-project-log-entry submodule.
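For orientation, a minimal, untested sketch of the root module (the name, runtime, and entry point are placeholders; the event_trigger wiring assumes the event-project-log-entry submodule's function_event_trigger output):

module "event_function" {
  source  = "terraform-google-modules/event-function/google"
  version = "~> 3.0"

  name             = "log-entry-handler" # placeholder
  project_id       = var.project_id
  region           = "us-central1"
  runtime          = "nodejs16"
  entry_point      = "handler" # a method defined in the function source
  source_directory = "${path.module}/function_source"

  # Trigger emitted by the event-project-log-entry submodule.
  event_trigger = module.event_project_log_entry.function_event_trigger
}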

Terraform Created Source Files

If you have local_file Terraform resources that need to be included in the function's archive, include them in the optional source_dependent_files variable.

This tells the module to wait until those files exist before creating the archive.

An example can also be seen in examples/dynamic-files.

resource "local_file" "file" {
  content  = "some content"
  filename = "${path.module}/function_source/terraform_created_file.txt"
}

module "localhost_function" {
  ...

  source_dependent_files = [local_file.file]
}

Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| available_memory_mb | The amount of memory in megabytes allotted for the function to use. | number | 256 | no |
| bucket_force_destroy | When deleting the GCS bucket containing the cloud function, delete all objects in the bucket first. | bool | false | no |
| bucket_labels | A set of key/value label pairs to assign to the function source archive bucket. | map(string) | {} | no |
| bucket_name | The name to apply to the bucket. Defaults to a string derived from the function name. | string | "" | no |
| build_environment_variables | A set of key/value environment variable pairs available during build time. | map(string) | {} | no |
| create_bucket | Whether to create a new bucket or use an existing one. If false, bucket_name should reference the name of the alternate bucket to use. | bool | true | no |
| description | The description of the function. | string | "Processes events." | no |
| docker_registry | Docker registry to use for storing the function's Docker images. Allowed values are CONTAINER_REGISTRY (default) and ARTIFACT_REGISTRY. | string | null | no |
| docker_repository | User-managed repository created in Artifact Registry, optionally with a customer-managed encryption key. If specified, deployments will use Artifact Registry. | string | null | no |
| entry_point | The name of a method in the function source which will be invoked when the function is executed. | string | n/a | yes |
| environment_variables | A set of key/value environment variable pairs to assign to the function. | map(string) | {} | no |
| event_trigger | A source that fires events in response to a condition in another service. | map(string) | {} | no |
| event_trigger_failure_policy_retry | A toggle to determine whether the function should be retried on failure. | bool | false | no |
| files_to_exclude_in_source_dir | Files to ignore when reading the source_dir. | list(string) | [] | no |
| ingress_settings | The ingress settings for the function. Allowed values are ALLOW_ALL, ALLOW_INTERNAL_AND_GCLB and ALLOW_INTERNAL_ONLY. Changes to this field will recreate the cloud function. | string | "ALLOW_ALL" | no |
| kms_key_name | Resource name of a KMS crypto key (managed by the user) used to encrypt/decrypt function resources. | string | null | no |
| labels | A set of key/value label pairs to assign to the Cloud Function. | map(string) | {} | no |
| log_bucket | Log bucket. | string | null | no |
| log_object_prefix | Log object prefix. | string | null | no |
| max_instances | The maximum number of parallel executions of the function. | number | 0 | no |
| name | The name to apply to any nameable resources. | string | n/a | yes |
| project_id | The ID of the project to which resources will be applied. | string | n/a | yes |
| region | The region in which resources will be applied. | string | n/a | yes |
| runtime | The runtime in which the function will be executed. | string | n/a | yes |
| secret_environment_variables | A list of maps, each containing key, project_id, secret_name (not the full secret id), and version, to assign to the function as secret environment variables. | list(map(string)) | [] | no |
| service_account_email | The service account to run the function as. | string | "" | no |
| source_dependent_files | A list of any Terraform-created local_file resources that the module will wait for before creating the archive. | list(object({ filename = string, id = string })) | [] | no |
| source_directory | The pathname of the directory which contains the function source code. | string | n/a | yes |
| timeout_s | The amount of time in seconds allotted for the execution of the function. | number | 60 | no |
| trigger_http | Whether to use an HTTP trigger instead of the event trigger. | bool | null | no |
| vpc_connector | The VPC Network Connector that this cloud function can connect to, as a fully-qualified URI. The format of this field is `projects/*/locations/*/connectors/*`. | string | null | no |
| vpc_connector_egress_settings | The egress settings for the connector, controlling what traffic is diverted through it. Allowed values are ALL_TRAFFIC and PRIVATE_RANGES_ONLY. If unset, this field preserves the previously set value. | string | null | no |

Outputs

| Name | Description |
|------|-------------|
| https_trigger_url | URL which triggers function execution. |
| name | The name of the function. |

Requirements

The following sections describe the requirements which must be met in order to invoke this module.

Software Dependencies

The following software dependencies must be installed on the system from which this module will be invoked (versions per this module's versions.tf, listed later on this page):

  • Terraform >= 0.13
  • Terraform Provider for Google Cloud Platform >= 4.23, < 6
  • Terraform Provider archive >= 1.2, < 3.0
  • Terraform Provider null >= 2.1, < 4.0

IAM Roles

The Service Account which will be used to invoke this module must have the following IAM roles:

  • Cloud Functions Developer: roles/cloudfunctions.developer
  • Storage Admin: roles/storage.admin
  • Secret Manager Accessor: roles/secretmanager.secretAccessor

APIs

The project against which this module will be invoked must have the following APIs enabled:

  • Cloud Functions API: cloudfunctions.googleapis.com
  • Cloud Storage API: storage-component.googleapis.com
  • Secret Manager API: secretmanager.googleapis.com
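If you are not using a project factory, a sketch of enabling these APIs directly:

resource "google_project_service" "required" {
  for_each = toset([
    "cloudfunctions.googleapis.com",
    "storage-component.googleapis.com",
    "secretmanager.googleapis.com",
  ])

  project = var.project_id
  service = each.value
}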

The Project Factory module can be used to provision projects with specific APIs activated.

Contributing

Refer to the contribution guidelines for information on contributing to this module.


Issues

Remove deprecated `null_data_source` data sources

TL;DR

This project makes use of null_data_source data sources, which are deprecated in favor of locals (docs). This triggers warnings.

Expected behavior

No warnings are triggered when using this module

Observed behavior

Warnings are triggered when using this module

Terraform Configuration

Any configuration should trigger the warnings

Terraform Version

1.5.4

Additional information

I have stumbled upon this while using https://github.com/terraform-google-modules/terraform-google-scheduled-function

This is not a bug per se, but I think this is the most appropriate template.

There are two deprecated null_data_source usages in the module.

Support for vpc_connector

The Terraform resource supports a vpc_connector argument. It would be great to include it in the module as well. We've had to make a custom change to use this in our environment, and it would be great to avoid drift from the original module.

google_cloudfunctions_function is re-created every time despite no changes in config

TL;DR

Whenever I run apply for google_cloudfunctions_function, it always re-creates the function, even if I've made no changes to the config.

Expected behavior

The function should remain untouched

Observed behavior

function is destroyed and re-created

  # google_cloudfunctions_function.export-surveys-function is tainted, so must be replaced
-/+ resource "google_cloudfunctions_function" "export-surveys-function" {
      - environment_variables         = {} -> null
      + https_trigger_url             = (known after apply)
      ~ id                            = "projects/project/locations/europe-west1/functions/function-name" -> (known after apply)
      - labels                        = {} -> null
        name                          = "function-name"
      + vpc_connector_egress_settings = (known after apply)
        # (15 unchanged attributes hidden)

        # (5 unchanged blocks hidden)
    }

Terraform Configuration

resource "google_cloudfunctions_function" "export-surveys-function" {
  docker_registry       = "CONTAINER_REGISTRY"
  name                  = local.function_name
  description           = "Function that is triggered by a schedule (via pubsub) to export all survey data from mongo"
  region                = var.region
  entry_point           = "entryPoint"
  source_archive_bucket = google_storage_bucket.source_code_bucket.name
  source_archive_object = "function.zip"
  runtime               = "nodejs16"
  labels                = {}
  project = data.google_project.project.project_id
  service_account_email  = data.google_service_account.run-account.email
  min_instances         = 0
  environment_variables = {}
  https_trigger_security_level  = "SECURE_ALWAYS"
}

Terraform Version

Terraform v1.3.1
on darwin_arm64
+ provider registry.terraform.io/hashicorp/google v4.38.0

Additional information

No response

Add support for the excludes parameter of archive_file for the source code

Hey folks,

It would be good for the module to support the excludes parameter in the archive_file step.

From:

data "archive_file" "main" {
  type        = "zip"
  output_path = pathexpand("${var.source_directory}.zip")
  source_dir  = data.null_data_source.wait_for_files.outputs["source_dir"]
}

To:

data "archive_file" "main" {
  type        = "zip"
  output_path = pathexpand("${var.source_directory}.zip")
  source_dir  = data.null_data_source.wait_for_files.outputs["source_dir"]
  excludes    = var.files_to_exclude_in_source_dir
}

Changing the source code bucket does not update the function

TL;DR

Given a function that's deployed with a source_archive_bucket, updating the bucket contents with Terraform does not re-deploy the function.

Expected behavior

Changing the contents of the source code bucket should re-deploy the function to update the source.

Observed behavior

The source is not updated. The work-around is to run terraform plan -replace= (with the function's resource address) to force Terraform to replace the function every time.

Terraform Configuration

resource "google_storage_bucket" "source_code_bucket" {
...
}

resource "google_storage_bucket_object" "function" {
  name   = "function.zip"
  source = "../dist/function.zip"
  bucket = google_storage_bucket.source_code_bucket.name
}

resource "google_cloudfunctions_function" "export-surveys-function" {
...
  source_archive_bucket = google_storage_bucket.source_code_bucket.name
  source_archive_object = "function.zip"
 }

Terraform Version

Terraform v1.2.8
on darwin_arm64
+ provider registry.terraform.io/hashicorp/google v4.38.0

Additional information

No response

ZIP file is not recreated if running in a separate stage

When I modify a file in the source_directory, the zip file is not recreated.

It seems we fixed a part of this when fixing #32 with PR #35, by forcing an update of the object in the GCS bucket - but the archive_file (zip) itself is not being updated, i.e. in the end the update of the Cloud Function is not forced.

The upstream issue for failing to update the archive_file is here: hashicorp/terraform-provider-archive#39. It seems to be linked to running plan and apply in different contexts (think a Cloud Build pipeline composed of two configs that effectively 'forgets' about the .terraform/ folder between plan and apply).

I see 3 ways we can work-around this in this module (while waiting for an upstream fix):

  • compute the hash of all files in the source directory and force the recomputation of the archive_file resource whenever any hash in the source_directory has changed; I'm not certain how to do this for all files in the source_directory (for one file it would be straightforward; for multiple, we'd need a custom shell script to list out all the files in the directory). A sketch of this approach follows below.

  • have a random id appended to the zip filename that would be forced to regenerate when we need it, i.e. by passing e.g. a force_update variable to the module.

  • advertise that plan and apply should be run in the same context, which would make some users struggle.

Any help is appreciated here :)
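For the first option, a minimal sketch assuming Terraform's built-in fileset and filesha256 functions (0.12.8+), which avoids the custom shell script:

locals {
  source_dir = "${path.module}/function_source" # placeholder path

  # Hash every file under the source directory; any change yields a new archive name.
  source_hash = sha256(join("", [
    for f in sort(fileset(local.source_dir, "**")) : filesha256("${local.source_dir}/${f}")
  ]))
}

data "archive_file" "main" {
  type        = "zip"
  source_dir  = local.source_dir
  output_path = "${path.module}/function-${local.source_hash}.zip"
}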

Support subdirectories in bucket object name

It would be nice to somehow indicate in which subdirectory the source zip file should be placed.

For example, I have a single bucket for all cloud functions sources, and I want to keep a file layout like this:

my-bucket/
    function1/hash-build.zip
    function2/hash-build.zip

My suggestion is to add an optional variable for object directory path.
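A sketch of what that could look like inside the module (the bucket_object_path variable name is hypothetical):

variable "bucket_object_path" {
  type        = string
  description = "Optional subdirectory prefix for the source archive object (hypothetical)."
  default     = ""
}

locals {
  # With bucket_object_path = "function1" and name = "fn", yields "function1/fn.zip";
  # without a prefix, just "fn.zip".
  object_name = trimprefix("${var.bucket_object_path}/${var.name}.zip", "/")
}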

TF 13 Upgrade: Replace `source_dir` by `source`

I have noticed that using for_each with this module causes inconsistent results, because the source folder is overwritten by the successive executions in the for_each loop. In general, we should avoid creating local directories to build GCF code until Terraform supports temp directories.

To fix this, in my opinion it is better not to define any local directory and to specify the file sources directly, since we no longer manipulate local files. It will also help to get rid of the null_resource.wait_for_files resource hack.

We should switch from this:

data "archive_file" "main" {
  type        = "zip"
  output_path = pathexpand("${var.source_directory}.zip")
  source_dir  = "${data.null_data_source.wait_for_files.outputs["source_dir"]}"
}

to

resource "random_uuid" "this" {
  keepers = {
    for filename in fileset(local.function_source_directory, "**/*") :
    filename => filemd5("${local.function_source_directory}/${filename}")
  }
}

data "archive_file" "main" {
  type        = "zip"
  output_path = pathexpand("${var.source_directory}.zip")
  dynamic "source" {
    for_each = var.files
    content {
      content = each.value.content
      filename = each.value.filename
    }
  }
}

and switch inputs to

files = [
  { content = "test", filename = "test.txt" },
  { content = "test2", filename = "test2.txt" }
]

See terraform-google-modules/terraform-google-slo#38 (comment)

Add vpc_connector_egress_settings support

The Terraform resource supports vpc_connector and vpc_connector_egress_settings arguments. It would be great to include them in the module as well. We've had to make a custom change to use these in our environment, and it would be great to avoid drift from the original module.
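For reference, a sketch of the corresponding arguments on the underlying resource (abridged; the variable names are assumptions):

resource "google_cloudfunctions_function" "main" {
  # ... existing arguments ...

  # Fully-qualified connector URI: projects/*/locations/*/connectors/*
  vpc_connector                 = var.vpc_connector
  vpc_connector_egress_settings = var.vpc_connector_egress_settings # ALL_TRAFFIC or PRIVATE_RANGES_ONLY
}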

Cloud Function deployment fails with "Failed to retrieve function source code"

TL;DR

We are seeing some flakes in creating Cloud Functions with a generic "Failed to retrieve function source code" error message. Can we improve the error message to indicate what is wrong?

Expected behavior

Succeed or output a message that is useful to troubleshoot.

Observed behavior

The message is too generic. The same deployment has passed before, and we don't know what is causing this flake.

Terraform Configuration

module "batch_cf" {
  source                = "terraform-google-modules/event-function/google//modules/repository-function"
  version               = "2.1.0"
  name                  = "batch-function"
  project_id            = module.project.project_id
  description           = "Cloud Function to initiate batch pipeline"
  timeout_s             = 540
  entry_point           = "launch_batch_pipeline"
  service_account_email = "${google_service_account.function_runner.account_id}@${module.project.project_id}.iam.gserviceaccount.com"

  source_repository_url = "https://source.developers.google.com/projects/<myproject>/repos/<myrepo>/fixed-aliases/v1.0.0"
  runtime               = "python37"

  region = "us-central1"

  event_trigger = {
    event_type                         = "google.storage.object.finalize"
    resource                           = "<mybucketname>"
    event_trigger_failure_policy_retry = true
  }

  environment_variables = {
    LOCATION                   = "us-central1"
    TEMPLATE_FILE_GCS_LOCATION = "gs://<path to file>.json"
  }
}

Terraform Version

0.14.8

Additional information

No response

Setting trigger_http to false does not work

TL;DR

We're checking for null, so setting it to false will result in an error.
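A sketch of a possible fix, assuming the module gates a dynamic event_trigger block on var.trigger_http (the module's actual internals may differ):

resource "google_cloudfunctions_function" "main" {
  # ... existing arguments ...

  # Treat false the same as null: fall back to the event trigger.
  trigger_http = var.trigger_http == true ? true : null

  dynamic "event_trigger" {
    for_each = var.trigger_http == true ? [] : [var.event_trigger]
    content {
      event_type = event_trigger.value["event_type"]
      resource   = event_trigger.value["resource"]
    }
  }
}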

Expected behavior

If trigger_http = false and event_trigger is defined, the module should work.

Observed behavior

Error message:

One of event_trigger or trigger_http is required: You must specify a trigger when deploying a new function.

Terraform Configuration

module "cloud-function" {
  source              = "terraform-google-modules/event-function/google"
  version             = "2.5.0"
  name                = "name"
  project_id          = var.project_id
  region              = var.region
  runtime             = "nodejs16"
  timeout_s           = 540  # 9min, maximum for gen1 cloud functions (gen2 will have 3600)
  available_memory_mb = 512
  bucket_name         = "${var.project_id}-name"
  source_directory    = "source"
  entry_point         = "handleMessage"
  trigger_http        = false
  event_trigger = {
    event_type = "google.pubsub.topic.publish"
    resource   = "projects/project/topics/topic"
  }
}

Terraform Version

v1.3.4

Additional information

No response

Updating the source for a local event-function doesn't update the function

If I am using the local_directory event-function, I'd expect modifying the files in that directory (and running terraform apply) to redeploy the function.

module "localhost_function" {
  source  = "terraform-google-modules/event-function/google"
  version = "1.1.0"

  description = "Adds projects to VPC service permiterer."
  entry_point = "handler"

  environment_variables = {
    FOLDER_ID = var.folder_id
  }

  event_trigger    = module.event_folder_log_entry.function_event_trigger
  name             = local.watcher_name
  project_id       = var.project_id
  region           = var.region
  source_directory = abspath(path.module)
  runtime          = "python37"
  //  runtime               = "go111"
  available_memory_mb   = 2048
  timeout_s             = 540
  service_account_email = google_service_account.watcher.email
}

Currently it doesn't update the function if I do this. I have to explicitly recreate the function.

We should update this so changes to the source directory force a function reupload.
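One hedged approach, assuming storage object and archive resources like the ones quoted elsewhere on this page: name the bucket object after the archive's content hash, so a source change produces a new object and forces a redeploy.

resource "google_storage_bucket_object" "main" {
  # Embedding output_md5 in the object name creates a new object whenever the
  # archived source changes, which in turn redeploys the function.
  name   = "event-function-${data.archive_file.main.output_md5}.zip" # hypothetical naming
  bucket = google_storage_bucket.main.name
  source = data.archive_file.main.output_path
}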

Applying the module produces changes between Linux and macOS

TL;DR

The google_storage_bucket_object.main resource must be replaced if we apply first on Linux and then on macOS (or conversely). This comes from the md5 of the archive_file data differing between the two operating systems, because the generated zip file does not have the same file permissions: 0664 on Linux and 0644 on macOS.

Expected behavior

The resource google_storage_bucket_object.main should not diverge between two operating systems. The output_file_mode field on the archive_file data source should be exposed as a defaulted variable to avoid this issue.
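A sketch of that suggestion, using the archive provider's output_file_mode argument:

data "archive_file" "main" {
  type             = "zip"
  source_dir       = var.source_directory
  output_path      = pathexpand("${var.source_directory}.zip")
  output_file_mode = "0666" # normalize file permissions so the md5 matches across OSes
}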

Observed behavior

No response

Terraform Configuration

module "function" {
  source  = "terraform-google-modules/event-function/google"
  version = "~> 2.3.0"

  entry_point = "ProcessPubSub"
  event_trigger = {
    event_type = "google.pubsub.topic.publish"
    resource   = module.pubsub_topic.id
  }
  name             = "${local.name}-exporter"
  project_id       = var.project
  region           = var.region
  runtime          = "go116"
  source_directory = "${path.module}/function"

  source_dependent_files = []

  available_memory_mb                = 256
  bucket_force_destroy               = true
  bucket_name                        = "${local.name}-scheduled-exporter-function-${random_id.suffix.hex}"
  description                        = "Function to backup SQL instance."
  event_trigger_failure_policy_retry = false
  service_account_email              = google_service_account.this.email
  timeout_s                          = 540
}


Terraform Version

1.3.0

Additional information

No response

Changing the source code bucket does not update the cloud function

TL;DR

Pushing the source code to storage using an ADO (Azure DevOps) pipeline, and referring to the object generation in the cloud function.

Expected behavior

Re-deploying the cloud function should update the source code.

Observed behavior

Re-deploying the cloud function deploys the latest version, but the source code is not updated.

Terraform Configuration

data "google_storage_bucket_object" "abc" {
  name   = var.abc
  bucket = data.google_storage_bucket.bucket.name
}

resource "google_cloudfunctions_function" "abc" {
  project               = var.project
  name                  = var.abc
  region                = var.region
  description           = var.description
  runtime               = var.runtime
  service_account_email = module.service_accounts.email

  available_memory_mb   = 128
  source_archive_bucket = data.google_storage_bucket_object.abc.bucket
  source_archive_object = data.google_storage_bucket_object.abc.name
  entry_point           = var.entrypoint

  event_trigger {
    event_type = var.triggertype
    resource   = var.resource
  }

  environment_variables = {
    object_url = data.google_storage_bucket_object.abc.media_link
  }
}

Terraform Version

1.4.2

Additional information

No response

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Pending Status Checks

These updates await pending status checks. To force their creation now, click the checkbox below.

  • chore(deps): Update go modules and/or dev-tools (cloud.google.com/go, google.golang.org/api)
  • chore(deps): Update Terraform terraform-google-modules/event-function/google to v4
  • chore(deps): Update Terraform terraform-google-modules/project-factory/google to v16
  • chore(deps): Update dependency googleapis to v142

Detected dependencies

gomod
examples/delete-vms-without-cmek/function_source/go.mod
  • go 1.20
  • cloud.google.com/go v0.115.0
  • google.golang.org/api v0.191.0
test/integration/go.mod
  • go 1.22
  • go 1.22.6
  • github.com/GoogleCloudPlatform/cloud-foundation-toolkit/infra/blueprint-test v0.16.1
  • github.com/stretchr/testify v1.9.0
npm
examples/automatic-labelling-folder/function_source/package.json
  • googleapis ^140.0.0
examples/automatic-labelling-from-localhost/function_source/package.json
  • googleapis ^140.0.0
examples/automatic-labelling-from-repository/function_source/package.json
  • googleapis ^140.0.0
examples/dynamic-files/function_source/package.json
regex
Makefile
  • cft/developer-tools 1.22
build/int.cloudbuild.yaml
  • cft/developer-tools 1.22
build/lint.cloudbuild.yaml
  • cft/developer-tools 1.22
terraform
examples/automatic-labelling-folder/main.tf
  • terraform-google-modules/event-function/google ~> 3.0
  • terraform-google-modules/event-function/google ~> 3.0
examples/automatic-labelling-folder/versions.tf
  • archive ~> 2.0
  • null >= 2.1
  • random >= 3.0
  • hashicorp/terraform >= 0.13
examples/automatic-labelling-from-localhost/main.tf
  • terraform-google-modules/event-function/google ~> 3.0
  • terraform-google-modules/event-function/google ~> 3.0
examples/automatic-labelling-from-localhost/versions.tf
  • archive ~> 2.0
  • null >= 2.1
  • random >= 3.0
  • hashicorp/terraform >= 0.13
examples/automatic-labelling-from-repository/main.tf
  • terraform-google-modules/event-function/google ~> 3.0
  • terraform-google-modules/event-function/google ~> 3.0
examples/automatic-labelling-from-repository/versions.tf
  • archive ~> 2.0
  • null >= 2.1
  • random >= 3.0
  • hashicorp/terraform >= 0.13
examples/delete-vms-without-cmek/main.tf
  • hashicorp/terraform >= 0.12
  • terraform-google-modules/event-function/google ~> 3.0
  • terraform-google-modules/event-function/google ~> 3.0
examples/delete-vms-without-cmek/versions.tf
  • archive ~> 2.0
  • random >= 3.0
  • hashicorp/terraform >= 0.13
examples/dynamic-files/main.tf
  • terraform-google-modules/event-function/google ~> 3.0
examples/dynamic-files/versions.tf
  • google >= 4.11
  • hashicorp/terraform >= 0.13
modules/event-folder-log-entry/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
modules/event-folder-log-entry/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
modules/event-project-log-entry/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
modules/event-project-log-entry/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
modules/repository-function/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
test/fixtures/automatic-labelling-folder/main.tf
test/setup/main.tf
  • terraform-google-modules/network/google ~> 9.0
  • terraform-google-modules/project-factory/google ~> 15.0
test/setup/versions.tf
  • google >= 3.53.0, < 6
  • hashicorp/terraform >= 0.13
versions.tf
  • archive >= 1.2, < 3.0
  • google >= 4.23, < 6
  • null >= 2.1, < 4.0
  • hashicorp/terraform >= 0.13


Bad request error when creating cloud function using terraform

I am trying to create a Cloud Function using the terraform module. My configuration is as follows:

module "test_function" {
  source      = "terraform-google-modules/event-function/google//modules/repository-function"
  version     = "1.4.0"
  description = "Cloud Function for Testing"
  entry_point = "Process"
  event_trigger = {
    event_type = "google.storage.object.finalize"
    resource   = "projects/{{.my_test_project}}/buckets/test-bucket"
  }
  name                  = "test-function"
  project_id            = "{{.my_test_project}}"
  region                = "us-central1"
  source_repository_url = "https://source.developers.google.com/projects/{{.my_test_project}}/repos/{{.my-repo}}/main"
  environment_variables = {
    GOOGLE_CLOUD_PROJECT       = "{{.my_test_project}}"
    REGION                     = "us-central1"
  }
  timeout_s = 60
  runtime   = "go111"
}

Running terraform init and terraform plan completes successfully, but terraform apply returns

Error: googleapi: Error 400: The request has errors, badRequest

on .terraform/modules/test_function/modules/repository-function/main.tf line 17, in resource "google_cloudfunctions_function" "main":
17: resource "google_cloudfunctions_function" "main" {

I can't tell from the error message what is wrong and the configuration seems fine. A quick search showed other people running into the same problem but their solutions have not worked for me. Are there suggestions for what I'm missing or how to find a more detailed error message?

Thanks!

Deprecated resource warning because of null_data_source

TL;DR

Use of this module generates a warning about the use of the deprecated null_data_source.

Expected behavior

No warnings by Terraform.

Observed behavior

The following warning is shown when using the module:

Warning: Deprecated Resource
 with module.scheduled-function.module.main.data.null_data_source.wait_for_files,
 on .terraform/modules/scheduled-function.main/main.tf line 32, in data "null_data_source" "wait_for_files":
 32: data "null_data_source" "wait_for_files" {
The null_data_source was historically used to construct intermediate values
to re-use elsewhere in configuration, the same can now be achieved using
locals

Terraform Configuration

n/a

Terraform Version

Terraform v1.2.7
on darwin_arm64

Additional information

No response

Support `source_archive_bucket`

Support the argument source_archive_bucket to enable passing a zip that already exists in a bucket created by previous Terraform code.
This will unlock the 0.13 upgrade as well, since currently looping through this module results in the GCF source code (shared folder) being overwritten by each for_each element.

Support aggregated log exports

We are currently only supporting project-level log exports. Log exports can also be aggregated at the folder or organization level: it would be nice to support them.

I suggest using the terraform-google-log-export module, which already supports all three log export levels and will soon support billing-level log exports as well.

Additionally, users sometimes want to export logs from the GSuite Admin APIs: we recommend the terraform-google-gsuite-export module for this use case.
