Sysdig Secure for Cloud in GCP

Terraform module that deploys the Sysdig Secure for Cloud stack in Google Cloud.

Provides unified threat-detection, compliance, forensics and analysis through these major components:

  • Threat Detection: Tracks abnormal and suspicious activities in your cloud environment based on the Falco language. Managed through the cloud-connector module.

  • Compliance: Enables the evaluation of standard compliance frameworks. Requires both the cloud-connector and cloud-bench modules.

  • Image Scanning: Automatically scans all container images pushed to the registry (GCR) and the images that run on the GCP workload (currently Cloud Run). Managed through cloud-connector.
    Disabled by default; it can be enabled through the deploy_scanning input variable.

For other Cloud providers check: AWS, Azure


Usage

There are several ways to deploy Secure for Cloud in your GCP infrastructure.

Find specific overall service architecture diagrams attached to each example/use-case.

In the long term, our aim is to evaluate those use-cases and, if they're common enough, convert them into examples to make their usage easier.

If you're unsure about what or how to use this module, please fill in the questionnaire, report it as an issue, and let us know your context; we will be happy to help.
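
For orientation, a minimal single-project deployment might look like the sketch below. The module source path and the variable names (sysdig_secure_api_token, deploy_scanning) are assumptions based on the examples; check the example folders for the authoritative inputs.

variable "sysdig_secure_api_token" {
  type      = string
  sensitive = true
}

provider "google" {
  project = "your-project-id"   # placeholder
  region  = "us-central1"       # placeholder
}

module "secure-for-cloud_example_single-project" {
  # source reference is an assumption; see the examples folder for the exact path
  source = "sysdiglabs/secure-for-cloud/google//examples/single-project"

  sysdig_secure_api_token = var.sysdig_secure_api_token
  deploy_scanning         = true   # Image Scanning is disabled by default
}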

Notice

  • GCP regions
  • All Sysdig Secure for Cloud features except Image Scanning are enabled by default. You can enable it through the deploy_scanning input variable of each example.
  • This example will create resources that cost money. Run terraform destroy when you don't need them anymore.
  • For free subscription users, beware that organizational examples may not deploy properly due to the one-cloud-account limitation. Open an issue so we can help you!

Prerequisites

Your user must have the following roles in your GCP credentials:

  • Owner
  • Organization Admin (organizational usage only)

Google Cloud CLI Authentication

To authorize the gcloud CLI to be used by Terraform, check the Terraform Google Provider docs.
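
As a quick reference, a minimal provider block relying on Application Default Credentials (project and region are placeholders):

provider "google" {
  # Credentials are resolved from Application Default Credentials, e.g. after running
  # `gcloud auth application-default login`, or from the GOOGLE_APPLICATION_CREDENTIALS
  # environment variable pointing to a service account key file.
  project = "your-project-id"
  region  = "us-central1"
}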

Use a Service Account

Instead of using a user, you can also deploy the module using a Service Account (SA). In order to create an SA for the organization, go to one of your organization projects and create an SA there. This SA must be granted the Organization Admin role. Additionally, your user must be allowed to use this SA.

Screenshots: SA role (Service Account Role) and SA user permissions (Service Account User).
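
Once the SA exists and your user has the Service Account User role on it, Terraform can impersonate it through the provider's impersonate_service_account argument. A minimal sketch (the SA email and project are placeholders):

provider "google" {
  project = "your-project-id"
  region  = "us-central1"

  # placeholder SA email, following the <name>@<project-id>.iam.gserviceaccount.com format
  impersonate_service_account = "sfc-deployer@your-project-id.iam.gserviceaccount.com"
}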

APIs

In addition, the following GCP APIs must be enabled (how do I check this?), depending on the desired feature:

  • Cloud Connector
  • Cloud Scanning
  • Cloud Benchmarks
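
If you prefer to enable the APIs with Terraform instead of the console, a sketch using google_project_service could look like the following. The service list below is illustrative only; check the per-feature links above for the exact APIs each module requires.

resource "google_project_service" "sfc_apis" {
  # illustrative subset of services, not the authoritative per-feature list
  for_each = toset([
    "run.googleapis.com",
    "pubsub.googleapis.com",
    "cloudbuild.googleapis.com",
  ])

  service            = each.key
  disable_on_destroy = false
}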

Confirm the Services are Working

Check the official documentation: Secure for cloud - GCP, Confirm the Services are working.

Forcing Events - Threat Detection

Choose one of the rules contained in an activated Runtime Policy for GCP, such as the Sysdig GCP Activity Logs policy, and trigger it in your GCP account. For example: create an alert (Monitoring > Alerting > Create policy), then delete it to prompt the event.

Remember that if you add new rules to the policy, you need to give the changes time to propagate.

In the cloud-connector logs you should see entries similar to these:

An alert has been deleted (requesting user=..., requesting IP=..., resource name=projects/test/alertPolicies/3771445340801051512)

In Secure > Events you should see the event coming through, but beware you may need to activate specific levels such as Info depending on the rule you're firing.

Alternatively, a Terraform example module that triggers a GCP Update, Disable or Delete Sink event can be found in examples/trigger-events.
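
If you prefer to force the alert-policy event with Terraform rather than the console, here is a minimal sketch of a dummy policy you can apply and then destroy; the metric filter and names are illustrative assumptions, not part of the module.

resource "google_monitoring_alert_policy" "sfc_trigger_test" {
  display_name = "sfc-threat-detection-test"   # placeholder name
  combiner     = "OR"

  conditions {
    display_name = "dummy CPU condition"

    condition_threshold {
      filter          = "metric.type=\"compute.googleapis.com/instance/cpu/utilization\" AND resource.type=\"gce_instance\""
      comparison      = "COMPARISON_GT"
      duration        = "60s"
      threshold_value = 0.99

      aggregations {
        alignment_period   = "60s"
        per_series_aligner = "ALIGN_MEAN"
      }
    }
  }
}

Running terraform destroy on it afterwards removes the alert and should prompt the deletion event.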

Forcing Events - Image Scanning

  • For Repository image scanning, upload an image to a new repository in an Artifact Registry (see the Terraform sketch after this list). Follow the repository Setup Instructions provided by GCP.

    $ docker tag IMAGE:VERSION REPO_REGION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:latest
    $ docker push REPO_REGION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE:latest
  • For Cloud Run image scanning, deploy a runner.
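
The test repository itself can also be created with Terraform. A minimal sketch, where location and repository_id are placeholders (older provider versions may require google-beta for this resource):

resource "google_artifact_registry_repository" "scan_test" {
  location      = "europe-west2"   # placeholder region
  repository_id = "test-repo"      # placeholder name
  format        = "DOCKER"
}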

It may take some time, but you should see the new image being detected in the cloud-connector logs, with entries similar to these:

An image has been pushed to GCR registry (project=..., tag=europe-west2-docker.pkg.dev/test-repo/alpine/alpine:latest, digest=europe-west2-docker.pkg.dev/test-repo/alpine/alpine@sha256:be9bdc0ef8e96dbc428dc189b31e2e3b05523d96d12ed627c37aa2936653258c) Starting GCR scanning for 'europe-west2-docker.pkg.dev/test-repo/alpine/alpine:latest'

And a Cloud Build job being launched successfully.


Troubleshooting

Q: How can I check enabled API Services?

A: On your Google Cloud account, search for "APIs & Services > Enabled APIs & Services" or run the following command

$ gcloud services list --enabled

Q: Getting "googleapi: 403 ***"

A: This may happen because your permissions are insufficient, the API services were not correctly enabled, or you're not correctly authenticated for the Terraform Google provider.
S: Verify permissions, API services, and that the Terraform Google Provider authentication has been correctly set up. You can also apply the following Terraform manifest to check whether you're authenticated as the identity you expect

data "google_client_openid_userinfo" "me" {
}

output "me" {
  value = data.google_client_openid_userinfo.me.*
}

Q: In organizational setups, the Compliance trust relationship is not being deployed on our projects

A: If your organization uses folders, we currently don't support that layout.
S: A workaround is to use the benchmark_project_ids parameter so you can explicitly define the projects where the compliance role is to be deployed. Let us know if this workaround is not enough and we will work on implementing a solution.
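
A hedged sketch of how that could look on the organizational example (module source and project IDs are placeholders; other inputs required by the example are omitted):

module "secure-for-cloud_example_organization" {
  # source reference is an assumption; see the examples folder for the exact path
  source = "sysdiglabs/secure-for-cloud/google//examples/organization"

  sysdig_secure_api_token = var.sysdig_secure_api_token

  # explicitly list the projects where the compliance role should be deployed
  benchmark_project_ids = ["project-id-1", "project-id-2"]
}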

Q: Compliance is not working. How can I check that everything is properly set up?

A: On your GCP infrastructure, for each project where Compliance has been set up, check the following points:

  1. there is a Workload Identity Pool and associated Workload Identity Pool Provider configured, which must have an ID of sysdigcloud (display name doesn't matter)
  2. the pool should have a connected service account named sfcsysdigcloudbench, with an email of the form sfcsysdigcloudbench@<PROJECT_ID>.iam.gserviceaccount.com
  3. this service account should allow access to a principal set of the following format: principalSet://iam.googleapis.com/projects/<PROJECTID>/locations/global/workloadIdentityPools/sysdigcloud/attribute.aws_role/arn:aws:sts::***:assumed-role/***
  4. the service account should have the Viewer role on the target project, as well as a custom role containing the "storage.buckets.getIamPolicy", "bigquery.tables.list", "cloudasset.assets.listIamPolicy" and "cloudasset.assets.listResource" permissions
  5. the pool provider should allow access to Sysdig's trusted identity, retrieved through:
$ curl https://<SYSDIG_SECURE_URL>/api/cloud/v2/gcp/trustedIdentity \
--header 'Authorization: Bearer <SYSDIG_SECURE_API_TOKEN>'

Q: Getting "Error creating Service: googleapi: got HTTP response code 404" "The requested URL /serving.knative.dev/v1/namespaces/***/services was not found on this server"

"module.secure-for-cloud_example_organization.module.cloud_connector.goo
gle_cloud_run_service.cloud_connector" error: Error creating Service: googleapi: got HTTP response code 404 with
…
  <p><b>404.</b> <ins>That’s an error.</ins>
  <p>The requested URL <code>/apis/serving.knative.dev/v1/namespaces/****/services</code> was not found on this server.  <ins>That’s all we know.</ins>

A: This error is given by the Terraform GCP provider when an invalid region is used.
S: Use one of the available GCP regions. Do not confuse the required region with a GCP location or zone; see Identifying a region or zone.

Q: Error because the address "https://-run.googleapis.com/apis/serving.knative.dev" cannot be resolved

A: The GCP region was not provided in the provider block, which leaves the hostname prefix empty.

Q: Why do we need google-beta provider?

A: Some resources we use, such as google_iam_workload_identity_pool_provider, are only available in the beta version of the provider.
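
A typical provider requirements block declaring both providers looks like:

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
    google-beta = {
      source = "hashicorp/google-beta"
    }
  }
}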

Q: Getting "Error creating WorkloadIdentityPool: googleapi: Error 409: Requested entity already exists"

A: Currently the Sysdig backend does not support a dynamic workload identity pool, so its name is fixed to sysdigcloud.
Besides, Google only performs a soft deletion of this resource. https://cloud.google.com/iam/docs/manage-workload-identity-pools-providers#delete-pool

You can undelete a pool for up to 30 days after deletion. After 30 days, deletion is permanent. Until a pool is permanently deleted, you cannot reuse its name when creating a new workload identity pool.


S: For the moment, the federation workload identity pool and provider have fixed names. In case you want to reuse them, reactivate and import them into your Terraform state manually.

# re-activate pool and provider
$ gcloud iam workload-identity-pools undelete sysdigcloud  --location=global
$ gcloud iam workload-identity-pools providers undelete sysdigcloud --workload-identity-pool="sysdigcloud" --location=global

# import to terraform state
# for this you have to adapt the import resource to your specific usage
# ex.: for single-project, input your project-id
$ terraform import 'module.secure-for-cloud_example_single-project.module.cloud_bench[0].module.trust_relationship["<YOUR_PROJECT_ID>"].google_iam_workload_identity_pool.pool' sysdigcloud
$ terraform import 'module.secure-for-cloud_example_single-project.module.cloud_bench[0].module.trust_relationship["<YOUR_PROJECT_ID>"].google_iam_workload_identity_pool_provider.pool_provider' sysdigcloud/sysdigcloud

# ex.: for the organization example, change the module reference accordingly, per project
$ terraform import 'module.secure-for-cloud_example_organization.module.cloud_bench[0].module.trust_relationship["<YOUR_PROJECT_ID>"].google_iam_workload_identity_pool.pool' sysdigcloud
$ terraform import 'module.secure-for-cloud_example_organization.module.cloud_bench[0].module.trust_relationship["<YOUR_PROJECT_ID>"].google_iam_workload_identity_pool_provider.pool_provider' sysdigcloud/sysdigcloud

Note: if you're using terragrunt, run terragrunt import

Q: Getting "Error creating Topic: googleapi: Error 409: Resource already exists in the project (resource=gcr)"

│ Error: Error creating Topic: googleapi: Error 409: Resource already exists in the project (resource=gcr).
│
│   with module.sfc_example_single_project.module.pubsub_http_subscription.google_pubsub_topic.topic[0],
│   on ../../../modules/infrastructure/pubsub_push_http_subscription/main.tf line 10, in resource "google_pubsub_topic" "topic":
│   10: resource "google_pubsub_topic" "topic" {

A: This error happens due to a GCP limitation where only a single topic named gcr can exist per project. The name is hardcoded by GCP and is the one used to detect images pushed to the registry.
S: If the topic already exists, you can import it into your Terraform state, BUT BEWARE that once you run terraform destroy it will be removed.

$ terraform import 'module.sfc_example_single_project.module.pubsub_http_subscription.google_pubsub_topic.topic[0]' gcr

Contact us so we can develop a workaround for this where the existing topic is reused.

Note: if you're using terragrunt, run terragrunt import

Q: Getting "Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable."

A: If the cloud-connector Cloud Run module cannot start, it will give this error. The error comes from the health-check system and is not specific to the PORT variable per se.
S: Check the logs emitted before the deployment crashes. It could be a limitation of your Sysdig license (expired trial subscription, or free-tier usage where the cloud-account limit has been surpassed).

Q: Getting "message: Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable"

A: Unlike AWS, the Terraform Google deployment requires the just-started workload to come up healthy; if it does not, the deployment fails.
S: Check your workload service (Cloud Run) logs to see what actually failed. One common cause is a wrong Sysdig Secure API token.

Q: For Scanning, I get an error saying:

error starting scan runner for image ****: rpc error: code = PermissionDenied desc = Cloud Build API has not been used in project *** before or it is disabled.
Enable it by visiting https://console.developers.google.com/apis/api/cloudbuild.googleapis.com/overview?project=*** then retry.

If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry

A: Do as the error says and activate the Cloud Build API. Check the list of all the required APIs that need to be activated per feature module.

Q: Scanning does not seem to work

A: Verify that the gcr topic exists. If create_gcr_topic is set to false and the gcr topic is not found, GCR scanning is omitted and won't be deployed. For more info see GCR PubSub topic.
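
As a hedged sketch (the exact input names belong to the example; treat them as assumptions), enabling topic creation alongside scanning could look like:

module "secure-for-cloud_example_single-project" {
  source = "sysdiglabs/secure-for-cloud/google//examples/single-project"   # assumed source path

  sysdig_secure_api_token = var.sysdig_secure_api_token
  deploy_scanning         = true
  create_gcr_topic        = true   # create the gcr topic if it does not already exist
}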

Upgrading

  1. Uninstall the previous deployment resources before upgrading
$ terraform destroy
  2. Upgrade the full Terraform example with
$ terraform init -upgrade
$ terraform plan
$ terraform apply
  • If the event source is created through SFC, some events may get lost while upgrading with this approach. However, if the cloudtrail is re-used (the normal production setup), events will be recovered once ingestion resumes.

  • If required, you can upgrade the cloud-connector component by restarting its task (stop task). Because it's not pinned to a specific version, it will pull the latest one.


Authors

Module is maintained and supported by Sysdig.

License

Apache 2 Licensed. See LICENSE for full details.
