
Honeycomb.io Terraform Provider


A Terraform provider for Honeycomb.io.

📄 Check out the documentation

🏗️ Examples can be found in example/

❓ Questions? Feel free to create a new issue or find us on the Honeycomb Pollinators Slack, channel #discuss-api-and-terraform (you can find a link to request an invite here)

🔧 Want to contribute? Check out CONTRIBUTING.md

Using the provider

You can install the provider directly from the Terraform Registry. Add the following block to your Terraform config; it will download the provider from the Registry:

terraform {
  required_providers {
    honeycombio = {
      source  = "honeycombio/honeycombio"
      version = "~> 0.23.0"
    }
  }
}

Set the API key used by Terraform by setting the HONEYCOMB_API_KEY environment variable. You can override the default API endpoint (https://api.honeycomb.io) by setting the HONEYCOMB_API_ENDPOINT environment variable.
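For example, in a POSIX shell (the values below are placeholders):

```shell
# Terraform reads the Honeycomb credentials from the environment
export HONEYCOMB_API_KEY="your-api-key"
# Optional: override the default API endpoint
export HONEYCOMB_API_ENDPOINT="https://api.honeycomb.io"
```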

Configuring the provider for Honeycomb EU

If you are a Honeycomb EU customer, to use the provider you must override the default API host. This can be done with a provider block (example below) or by setting the HONEYCOMB_API_ENDPOINT environment variable.

provider "honeycombio" {
  api_url = "https://api.eu1.honeycomb.io"
}

License

This software is distributed under the terms of the MIT license, see LICENSE for details.


terraform-provider-honeycombio's Issues

Create a clean dataset before running acceptance tests to have more deterministic behaviour

As @fitzoh proposed on the Honeycomb Slack: before running the acceptance tests, we could initialize a fresh dataset with all the expected columns. Contributors would only need to supply an API key to run all the tests.

This would avoid a lot of manual setup work and make the tests more deterministic. Currently, the tests fail if the dataset is missing certain columns (e.g. duration_ms, app.tenant).

Query values converted to strings

Creating a board using this honeycombio_query:

data "honeycombio_query" "db_heatmap" {
  calculation {
    op     = "HEATMAP"
    column = "db.duration"
  }
  filter {
    column = "db.duration"
    op     = ">"
    value  = 500
  }
}

ends up yielding this query:
[screenshot of the resulting query]

(Note that the duration is quoted)

This results in no data being shown, while data is returned if the value is converted to a numeric value.

Resource: dataset

A resource to manage datasets.

From Honeycomb Slack:

Automation and managing everything as IaC when possible 🙂 When an environment is created, the appropriate dataset is created that applications would use to send their data. Granted these will most likely never change or change every ice age. New engineers on the team are able to browse our code and understand how the system is composed
[source]

honeycombio_trigger: recipient type webhook

It's possible to set trigger recipients of type webhook using the UI. This isn't possible yet with the Terraform provider, though the API most likely allows it.
It's probably not possible to create webhooks directly (that has to be done manually in the Integration Center), which would make them similar to Slack recipients.

[screenshots of the webhook recipient UI]

TODO:

  • adapt go-honeycombio to add TriggerRecipientWebhook
  • allow type = "webhook" in honeycombio_trigger (update docs as well)
  • allow type = "webhook" in honeycombio_trigger_recipient (update docs as well)
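If implemented, the trigger configuration might look like the sketch below. The webhook type and the ID-based reference are assumptions about a possible interface, not shipped behavior:

```hcl
resource "honeycombio_trigger" "example" {
  name    = "Example trigger"
  dataset = "my-dataset"
  # query, frequency, threshold omitted for brevity

  # Hypothetical: the webhook is created in the Integration Center first,
  # then referenced by its ID, similar to Slack recipients today.
  recipient {
    type = "webhook"
    id   = "example-webhook-id"
  }
}
```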

Support Graph Settings for queries

These options don't appear to be supported by the API at the moment, but it would be nice to be able to generate graphs using these features (I was trying to terraform a graph that used stacked graphs)
[screenshot of the graph settings panel]

Resource: trigger

A resource to create triggers.

More information: https://docs.honeycomb.io/working-with-your-data/triggers/
API: https://docs.honeycomb.io/api/triggers/

Possible Terraform config (this is not final + probably not legal Terraform code):

resource "honeycombio_trigger" "trigger" {
  name        = "Requests are slower than usual"
  description = "Average duration of all requests for the last 5 minutes."

  # NOTE: maybe `enabled` would be simpler to grok?
  disabled = false

  # NOTE: should we prefer using terminology from the UI? I.e. `visualize`, `where`, `group by`
  query {
    # exactly one calculation is required
    calculation {
      op     = "AVG"
      column = "duration_ms"
    }

    # zero or more filter blocks
    filter {
      column = "trace.parent_id"
      op     = "does-not-exist"
    }

    # should also be supported: filter_combination, breakdowns
  }

  frequency = 300 // in seconds, 5 minutes

  threshold {
    op    = ">"
    value = 1000
  }

  # zero or more recipients
  recipient {
    type   = "email"
    target = "[email protected]"
  }

  recipient {
    type = "pagerduty"
  }
}

Migrate away from query spec and use query_id instead

With the introduction of the new queries API, triggers and boards now link to their queries using a query_id. To ease the transition the triggers and boards API will temporarily return both values (the original query and the new query_id).

We should refactor the Terraform logic so we don't rely on query anymore and instead fetch the query specification using query_id.

Resource: board

A resource to create boards

More information: https://docs.honeycomb.io/working-with-your-data/collaborating/boards/#docs-sidebar
API: https://docs.honeycomb.io/api/boards-api/#docs-sidebar

Edit 3 Aug: possible DSL for creating a board:

data "honeycombio_query" "query_1" {
  # ...
}

data "honeycombio_query" "query_2" {
  # ...
}

resource "honeycombio_board" "example" {
  name        = "Test board"
  description = "This is an example."
  style       = "visual" # list or visual

  query {
    caption    = "My first query"
    dataset    = "dataset-1"
    query_json = data.honeycombio_query.query_1.json
  }

  query {
    caption    = "The same query but a different dataset"
    dataset    = "dataset-2"
    query_json = data.honeycombio_query.query_1.json
  }

  query {
    dataset    = "dataset-1"
    query_json = data.honeycombio_query.query_2.json
  }
}

Data source: recipient from integration center

A data source to refer to existing recipients from the integration center.

Both triggers and SLOs support specifying recipients. Specifying an email recipient is fairly straightforward, but to use Slack you have to use the recipient ID.
This data source would be a convenient way to get this ID.

Documentation

Pseudo Terraform code:

data "honeycombio_recipient" "slack" {
  type   = "slack"
  target = "#alerts"
}

data "honeycombio_recipient" "pagerduty" {
  type   = "pagerduty"
}

resource "honeycombio_slo" "example" {
  # ...
  burn_alert {
    # ...
    recipients = [
      data.honeycombio_recipient.slack.id,
      data.honeycombio_recipient.pagerduty.id,
    ]
  }
}

resource "honeycombio_trigger" "example" {
  # ...
  recipients = [
    data.honeycombio_recipient.slack.id,
  ]
}

Support for graph settings

We would like to see full support for graph and board settings in the provider.

Query related items:

  • useStackedGraphs
  • omitMissingValues

Board related items:

  • one column vs multi-column

Is your feature request related to a problem? Please describe.
Most/all of our boards are created via Terraform, and these are blockers for us at the moment.

Describe the solution you'd like
Support for the above.

Describe alternatives you've considered
We are currently creating boards by hand and planning to go back and migrate to terraform once these features (and others) land.

Additional context
None.

Support Import operations

I was attempting to import a manually created board into terraform, but that does not appear to be documented.

From looking at the docs, no resources currently support importing.

Graceful handling of 409 Conflict when creating duplicate column

Hey,

Would it be possible to have the provider not treat HTTP 409s as an error? When I run terraform apply with an empty state against a dataset whose columns already exist, I get this error:

╷
│ Error: 409 Conflict: Can't insert, column already exists with this key_name or alias.
│
│   on columns.tf line 15, in resource "honeycombio_column" "column":
│   15: resource "honeycombio_column" "column" {
│
╵

It'd be awesome if in this case it updated the existing column instead of having to manually import all of them.

Thanks!

Bug: Non string values (float, integer, etc.) in filters are not valid

We are seeing query filters fail to validate if we need to specify anything other than a string as a comparison value: float, int, etc.

Versions
Latest version of the provider.

Steps to reproduce
Use something like the following in the query definition:

filter {
  column      = honeycombio_derived_column.<derived-column-name>.alias
  op          = ">"
  value_float = 0.0
}

Additional context
None required.

RFC: Provider Breaking Changes

Overview

Honeycomb has recently taken ownership of the Terraform Provider which was originally developed and maintained by community member Koenraad Verheyden (thank you! 🙏🏻 ).

Leading up to the first Honeycomb-owned release of the Provider there are three breaking changes we feel need to be made:

  1. The honeycombio_query data source will be renamed to honeycombio_query_specification. Initially raised in #65.
  2. The honeycombio_board resource will no longer support inline Query JSON and instead require that you create a honeycombio_query resource and pass the ID in the list of queries on the Board. Initially raised in #55, and currently blocking #84 and #82.
  3. The honeycombio_trigger resource will no longer support inline Query JSON and instead require that you create a honeycombio_query resource and pass the ID to the Trigger resource. Also raised in #55.

As the move to the Honeycomb-maintained version of the provider will require that you update the namespace used in your required_providers configuration from kvrhdn to honeycombio we hope that these changes will be more easily managed.

This informal RFC was written to offer transparency around these changes and give the provider's user community a chance to comment on the changes.

honeycombio_query data source change

This data source never allowed you to fetch an existing Query object; instead, it helped you form a valid Query Specification in HCL. Renaming it to honeycombio_query_specification seems most accurate, and the change should be easily handled by a simple ‘find and replace’.

honeycombio_board resource change

The Provider currently supports providing one or more inline queries (as Query Spec JSON via query_json) to a Board resource to create a Board. This predates both the Query API and the Query Annotations API, but customers and the field team alike want to build Boards via Terraform composed of fully annotated queries managed in code.

Inline queries are problematic because providing the query JSON to the API makes a Query object for you which is then difficult to track in Terraform as it’s a “computed sometimes” attribute of the resource. The explicit creation of a Query resource also feels more declarative and ✨ Terraform-ish ✨ .

Boards with Terraform today look something like this:

resource "honeycombio_board" "myboard" {
    name  = "Test board managed by Terraform"
    style = "list"

    query {
      caption       = "test query"
      dataset       = var.dataset
      query_json    = data.honeycombio_query.test.json
    }
    query {
      caption        = "my verbose query"
      query_style    = "combo"
      dataset        = var.dataset
      // Inline query
      query_json  = <<EOH
{
    "time_range": 604800,
<truncated for brevity>
}
EOH
    }
}

The current Board resource implementation ignores the returned query_id completely.

Proposed change

The change to the Board resource’s schema will remove the ability to provide a Query Specification directly and instead require that you make use of the existing Query resource to construct Queries – with the recommendation that you do so with the assistance of the (newly renamed) Query Specification data source.

The above would turn into something like this:

resource "honeycombio_query" "helpful-query" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.helpful.json
}

resource "honeycombio_query" "other-query" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.other.json
}

resource "honeycombio_board" "myboard" {
    name  = "Test board managed by Terraform"
    style = "list"

    query {
      caption       = "A very Helpful Query"
      dataset       = var.dataset
      query_id      = honeycombio_query.helpful-query.id
    }
    query {
      caption        = "my verbose query"
      query_style    = "table"
      dataset        = var.dataset
      query_id       = honeycombio_query.other-query.id
    }
}

The above paves the way for cleanly making use of the Query Annotation resource to name and describe the queries on the Board.

Abandoned Ideas

  • Adding an "inline_query" type to the Board resource which would allow you to provide the Query Spec as JSON and would track the returned query_id as a "Calculated" schema attribute. Abandoned as it was still a breaking change (all current query blocks would need to be moved to inline_query).
  • Attempting to use the Terraform Plugin SDK's experimental GetRawConfig to extract the parsed HCL and determine if query_json was in use. This was ugly and felt like the sort of brittle hack that would break in a future version of the Provider SDK.

honeycombio_trigger resource change

Similar to the Board resource, the Provider currently supports passing an inline Query (as Query Specification JSON) to create a Trigger. The explicit creation of a Query resource also feels more declarative and ✨ Terraform-ish ✨ .

This change is ultimately intended to bring more consistency to the Provider’s interface.

Triggers in Terraform today look something like this:

resource "honeycombio_trigger" "mytrigger" {
  name       = "Trigger managed by Terraform"
  dataset    = var.dataset
  query_json = data.honeycombio_query.mytrigger.json

  frequency = 300

  threshold {
    op    = ">"
    value = 50
  }
  
  recipient {
    type   = "email"
    target = "[email protected]"
  }
}

Proposed change

The change to the Trigger resource’s schema will remove the ability to provide a Query Specification directly and instead require that you make use of the existing Query resource to construct Queries – with the recommendation that you do so with the assistance of the (newly renamed) Query Specification data source.

The above would turn into something like this:

resource "honeycombio_query" "mytrigger" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.mytrigger.json
}

resource "honeycombio_trigger" "mytrigger" {
  name     = "Trigger managed by Terraform"
  dataset  = var.dataset
  query_id = honeycombio_query.mytrigger.id

  frequency = 300

  threshold {
    op    = ">"
    value = 50
  }
  
  recipient {
    type   = "email"
    target = "[email protected]"
  }
}

Add HONEYCOMB_API_KEY envvar support

The Honeycomb standard has become HONEYCOMB_API_KEY so the provider should support it too in the name of consistency.

This will deprecate, but continue to support, the current HONEYCOMBIO_APIKEY, with support dropped altogether in the 1.0.0 release.

Refactor internals in preparation for v0.1.0

In preparation for releasing a v0.1.0, I'd like to do the following:

  • improve the use of enums, there are currently a couple of lists duplicated with go-honeycombio. For example validQueryCalculationOps
  • all error messages should start with a lowercase, this seems more in line with the SDK
  • create a more generic function to read values from the schema, there is currently a lot of duplicated boilerplate, example reading column
  • make acceptance tests more consistent across resources

Expression update in honeycombio_derived_column forces resource recreation

Hey!
Thanks for this provider!
When updating the expression value of a honeycombio_derived_column resource, the provider forces resource replacement, even though the update works fine in the UI and the API supports in-place updates: https://docs.honeycomb.io/api/derived-columns/#update-a-derived-column.
In our case, deletion of the derived column fails because it is referenced in SLOs.
https://github.com/kvrhdn/terraform-provider-honeycombio/blob/7ec7ca02cde626c5be91f5a7b8524afb29b55f3b/honeycombio/resource_derived_column.go#L29-L33

Is this a bug, or are there reasons for this behavior (recreation of the resource instead of an in-place update)?

Crashes with Error: rpc error: code = Unavailable desc = transport is closing

Running terraform plan works successfully. But running terraform apply crashes with Error: rpc error: code = Unavailable desc = transport is closing

Versions

Terraform v0.14.7
+ provider registry.terraform.io/kvrhdn/honeycombio v0.1.2

Terraform Config

Below is the config in main.tf

terraform {
  required_providers {
    honeycombio = {
      source  = "kvrhdn/honeycombio"
      version = "~> 0.1.2"
    }
  }
}

data "honeycombio_query" "query" {
  calculation {
    op     = "COUNT"
  }
  time_range = 120
}

resource "honeycombio_trigger" "trigger" {
  name        = "test"
  description = "test"
  disabled = false
  query_json = data.honeycombio_query.query.json
  dataset    = "ssherbondy-dev"
  frequency = 120 
  threshold {
    op    = ">"
    value = 300000
  }
}

Error output

Below is the output from terraform apply:

Error: rpc error: code = Unavailable desc = transport is closing


panic: runtime error: invalid memory address or nil pointer dereference
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: [signal SIGSEGV: segmentation violation code=0x1 addr=0x38 pc=0x1695547]
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2:
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: goroutine 49 [running]:
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio.resourceTriggerRead(0x1931800, 0xc0002c51a0, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0xc0001aee00, 0xc0001aee70, 0x0)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/kvrhdn/terraform-provider-honeycombio/honeycombio/resource_trigger.go:162 +0x267
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio.resourceTriggerCreate(0x1931800, 0xc0002c51a0, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0xc0002feb90, 0x12c635a, 0xc0004a9200)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/kvrhdn/terraform-provider-honeycombio/honeycombio/resource_trigger.go:143 +0x295
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).create(0xc00039c6e0, 0x1931780, 0xc0002e2500, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0x0, 0x0, 0x0)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:276 +0x1ec
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc00039c6e0, 0x1931780, 0xc0002e2500, 0xc0001ae380, 0xc0004a9200, 0x172cee0, 0xc000044ba0, 0x0, 0x0, 0x0, ...)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:387 +0x681
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/helper/plugin.(*GRPCProviderServer).ApplyResourceChange(0xc000262200, 0x1931780, 0xc0002e2500, 0xc00013df80, 0xc000262200, 0xc000262210, 0x1849f90)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/helper/plugin/grpc_provider.go:952 +0x8b2
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/tfplugin5._Provider_ApplyResourceChange_Handler.func1(0x1931780, 0xc0002e2500, 0x17dfc00, 0xc00013df80, 0xc0002e2500, 0x1763a00, 0xc0002c4c01, 0xc0004a8f40)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/tfplugin5/tfplugin5.pb.go:3312 +0x86
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/plugin.Serve.func3.1(0x1931840, 0xc0002e9050, 0x17dfc00, 0xc00013df80, 0xc0004a8f20, 0xc0004a8f40, 0xc000330ba0, 0x11b92e8, 0x17b8500, 0xc0002e9050)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/plugin/serve.go:76 +0x87
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/tfplugin5._Provider_ApplyResourceChange_Handler(0x17ee100, 0xc000262200, 0x1931840, 0xc0002e9050, 0xc0002c4cc0, 0xc000810a20, 0x1931840, 0xc0002e9050, 0xc0001f26c0, 0x223)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/tfplugin5/tfplugin5.pb.go:3314 +0x14b
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).processUnaryRPC(0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100, 0xc000712510, 0x1e30ce0, 0x0, 0x0, 0x0)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	google.golang.org/[email protected]/server.go:1171 +0x50a
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).handleStream(0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100, 0x0)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	google.golang.org/[email protected]/server.go:1494 +0xccd
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).serveStreams.func1.2(0xc0000b4310, 0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	google.golang.org/[email protected]/server.go:834 +0xa1
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: created by google.golang.org/grpc.(*Server).serveStreams.func1
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: 	google.golang.org/[email protected]/server.go:832 +0x204
2021-02-18T22:13:20.428-0500 [WARN]  plugin.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = transport is closing"
2021/02/18 22:13:20 [DEBUG] honeycombio_trigger.trigger: apply errored, but we're indicating that via the Error pointer rather than returning it: rpc error: code = Unavailable desc = transport is closing
2021/02/18 22:13:20 [TRACE] EvalMaybeTainted: honeycombio_trigger.trigger encountered an error during creation, so it is now marked as tainted
2021/02/18 22:13:20 [TRACE] EvalWriteState: removing state object for honeycombio_trigger.trigger
2021/02/18 22:13:20 [TRACE] EvalApplyProvisioners: honeycombio_trigger.trigger has no state, so skipping provisioners
2021/02/18 22:13:20 [TRACE] EvalMaybeTainted: honeycombio_trigger.trigger encountered an error during creation, so it is now marked as tainted
2021/02/18 22:13:20 [TRACE] EvalWriteState: removing state object for honeycombio_trigger.trigger
2021/02/18 22:13:20 [TRACE] vertex "honeycombio_trigger.trigger": visit complete
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "provider[\"registry.terraform.io/kvrhdn/honeycombio\"] (close)" errored, so skipping
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "meta.count-boundary (EachMode fixup)" errored, so skipping
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "root" errored, so skipping
2021-02-18T22:13:20.429-0500 [DEBUG] plugin: plugin process exited: path=.terraform/providers/registry.terraform.io/kvrhdn/honeycombio/0.1.2/darwin_amd64/terraform-provider-honeycombio_v0.1.2 pid=40062 error="exit status 2"
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: not making a backup, because the new snapshot is identical to the old
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: no state changes since last snapshot
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: writing snapshot at terraform.tfstate
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: removing lock metadata file .terraform.tfstate.lock.info
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: unlocking terraform.tfstate using fcntl flock
2021-02-18T22:13:20.446-0500 [DEBUG] plugin: plugin exited

When I review the list of triggers in Honeycomb, I can see that the trigger was successfully created as expected. It seems like there may be a failure reading the trigger back from the Honeycomb API after it is created.

Data Source: datasets

A data source that retrieves a list of all datasets present in the account.
This would be useful for creating standardized boards/triggers across all datasets.

Could potentially include the ability to filter based on the name (e.g. all datasets that end with -prod).

Requires Honeycomb API support.
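A hypothetical sketch of what such a data source could look like; the honeycombio_datasets name and its names attribute are assumptions, not an existing interface:

```hcl
data "honeycombio_datasets" "all" {}

locals {
  # Keep only the production datasets, filtered by name suffix
  prod_datasets = [
    for name in data.honeycombio_datasets.all.names : name
    if length(regexall("-prod$", name)) > 0
  ]
}
```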

Rename data source honeycombio_query to honeycombio_query_spec

With the new honeycombio_query resource, it becomes confusing to differentiate between the honeycombio_query data source and the resource.
The honeycombio_query data source is not actually a query, though; it's a query specification as described in the docs.

To avoid this confusion we should:

  • rename the honeycombio_query data source to honeycombio_query_spec
  • rename attributes that expect a query specification from query to query_spec

Title / description of query added to boards?

Every query I add to a board has "Untitled Query" as its name. If I change it manually in the web UI but re-run terraform apply, it reverts back. Is there a way to add a query_annotation to a query that gets added to a board?
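If the provider were to expose query annotations, a sketch might look like the following; the honeycombio_query_annotation resource and its attributes are assumptions here, not a confirmed interface:

```hcl
resource "honeycombio_query" "example" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.example.json
}

# Hypothetical: name and describe the query so the board
# no longer shows "Untitled Query"
resource "honeycombio_query_annotation" "example" {
  dataset     = var.dataset
  query_id    = honeycombio_query.example.id
  name        = "Latency overview"
  description = "P99 latency by service"
}
```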

Resource: derived column

A resource to manage the derived columns of a dataset.

More information: https://docs.honeycomb.io/working-with-your-data/customizing-your-query/derived-columns/
Reference: https://docs.honeycomb.io/working-with-your-data/customizing-your-query/derived-columns/reference/#docs-sidebar
API: originally there was no API for this; edit: this is okay now :)

Possible Terraform schema:

resource "honeycombio_derived_column" "example" {
  dataset     = "my-dataset"
  alias       = "duration_ms_log10"
  description = "Log10 of duration_ms"

  function = "LOG10($\"duration_ms\")"
}

Data source: column

Add a pair of column data sources: honeycombio_column and honeycombio_columns

Able to set invalid "COUNT" columns

While trying to work around #17 I tried setting a column with my COUNT column.

The terraform apply succeeded but upon trying to view it I received the following error:
[screenshot of the error]

This validation may belong at the API layer (where it is missing), but might be worth considering at the Terraform layer as well.
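For reference, COUNT operates on events and takes no column, so a client-side validator could reject any column argument paired with it. A minimal sketch using this provider's query data source:

```hcl
data "honeycombio_query" "example" {
  # Valid: COUNT counts events and must not specify a column
  calculation {
    op = "COUNT"
  }

  # Invalid and worth rejecting before the API call:
  # calculation {
  #   op     = "COUNT"
  #   column = "duration_ms"
  # }
}
```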

Resource: api key

A resource to manage API keys.

More information: no documentation available
API: there is no API yet for this

Possible Terraform config:

resource "honeycombio_api_key" "example" {
  name   = "Backend"
  enable = true

  permissions = ["send_events"]
}

# honeycombio_api_key exports the field `key`

Multiline values of `expression` break Honeycomb WebUI

My use case:
I want to use multiline strings in the expression field of a derived column.
The web UI is clearly not designed for this; I can apply a function only if I make it a one-liner.

However, Terraform accepts it, and I encountered the problem when I applied.

After terraform apply I accessed the "Derived Columns" page in the web UI
and got a blank white screen.

The problem

  • (Honeycomb) API accepts strings that could break the "derived columns" page.

Expected behaviour

  • (Honeycomb) API does not accept values that could break the UI
  • (provider) terraform provider removes newline characters (?)

How to reproduce
Terraform provider allows applying multiline values

resource "honeycombio_derived_column" "tmp_action" {
  alias       = "tmp_action"
  dataset     = var.dataset
  description = "SLI: successful action"
  expression  = <<EOT
  IF(
    REG_MATCH($request, `PATCH https://www.whatever.[a-z]+:443/gw/api/subscriptions/[0-9]+/bar?`),
    AND(
      OR(
        STARTS_WITH($backend_status_code, "2"), 
        STARTS_WITH($backend_status_code, "3"), 
        STARTS_WITH($backend_status_code, "4")
      ),
      LTE($backend_processing_time, 2)
    )
  )
  EOT
}

terraform apply succeeds and a subsequent terraform plan shows no changes. The column actually gets created. If I query the fetch_schema endpoint, I can see the column there. The newline characters are encoded as \n, and this probably breaks the web interface.

{
      "id": "abc1234",
      "alias": "tmp_action",
      "expression": "  IF(\n    REG_MATCH($request, `PATCH https://www.whatever.[a-z]+:443/gw/api/subscriptions/[0-9]+/bar?`),\n    AND(\n      OR(\n        STARTS_WITH($backend_status_code, \"2\"), \n        STARTS_WITH($backend_status_code, \"3\"), \n        STARTS_WITH($backend_status_code, \"4\")\n      ),\n      LTE($backend_processing_time, 2)\n    )\n  )\n",
      "description": "SLI: successful action",
      "creator": {
        "created_at": "0001-01-01T00:00:00Z",
        "updated_at": "0001-01-01T00:00:00Z",
        "email": "",
        "firstName": "",
        "lastName": "",
        "companyName": "",
        "role": 0,
        "receiveUsageEmails": false,
        "id": "xxxxx"
      },

When I access the "Derived Columns" page, I get a blank white screen.
In order to fix the page, I delete the column via Terraform.

Please let me know if stripping newline characters on the provider side would make sense in this case.
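Until then, a possible workaround is to collapse the whitespace on the Terraform side before the expression is sent to the API. A sketch (the external expression file and its path are illustrative):

```hcl
locals {
  # replace() treats a /.../-delimited pattern as a regular expression,
  # so this collapses all whitespace runs to single spaces
  sli_expression = trimspace(
    replace(file("${path.module}/tmp_action.expr"), "/\\s+/", " ")
  )
}

resource "honeycombio_derived_column" "tmp_action" {
  alias       = "tmp_action"
  dataset     = var.dataset
  description = "SLI: successful action"
  expression  = local.sli_expression
}
```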

Minor: suppress diff when trigger recipient id is specified

provider version = 0.0.7

Disclaimer: this might be a bad idea.

I've got the following terraform configuration that applies successfully and works as expected:

data "honeycombio_trigger_recipient" "slack" {
  dataset = "my-very-special-dataset"
  type    = "slack"
  target  = "#alerts"
}


resource "honeycombio_trigger" "trigger" {
  // query omitted

  recipient {
    id = data.honeycombio_trigger_recipient.slack.id
  }
}

However, I see the following diff every time I run a plan or apply:

 ~ recipient {
            id     = "some-id"
          - target = "#alerts" -> null
          - type   = "slack" -> null
        }

It would be nice if that diff could be suppressed without having to duplicate the recipient's target and type in the config.

Resource: SLO

A resource to manage SLOs.

More information: https://docs.honeycomb.io/working-with-your-data/slos/
API: there is no API yet for this

Possible Terraform config:

resource "honeycombio_derived_column" "sli" {
  # ...
}

resource "honeycombio_slo" "example" {
  name        = "API service"
  description = "..."

  column = honeycombio_derived_column.sli.alias

  time_period       = 28 // in days
  target_percentage = 99.9

  burn_alert {
    exhaustion_time = 4

    # use same structure as trigger recipients
    recipient {
      type = "pagerduty"
    }
  }
}

Resource: named queries

A resource to manage named queries and their description.

More information: once you've built a query in the UI, you can give the query a name and a description. This makes the query easier to share. Additionally, the name is shown on boards.

API: there is no API yet for this

Possible Terraform schema:

resource "honeycombio_query" "example" {
  name        = "My query"
  description = "A description of this query. _Supports Markdown._"

  dataset    = "my-dataset"
  query_json = data.honeycombio_query.my_query.json
}

To discuss:

  • this might be confusing next to data "honeycombio_query"? Maybe rename the data source to honeycombio_query_spec so it's clear it is only the specification of a query

cc. @fitzoh mentioned this in Slack

Add CONCURRENCY support to Query Specification

Currently, the provider client does not support the CONCURRENCY calculation operator and will return an error if you try to craft a query that uses it.

CONCURRENCY is a unary operator, similar to COUNT, and will need the same validation.

Add contributing.md

Add a contributing.md to help out people who want to contribute.

Things we should discuss:

  • How to set up a test dataset (i.e. sending dummy events with the expected column using something like honeyvent)
  • How to run the tests
  • How to set up GitHub Actions in the fork
  • Maybe something about code style? E.g. running go fmt, etc.
  • ...

Missing `RATE_SUM`

I attempted to create a query using the RATE_SUM op, but it appears it is not in the provider's list of allowed operators.

Error: expected calculation.7.op to be one of [COUNT SUM AVG COUNT_DISTINCT MAX MIN P001 P01 P05 P10 P25 P50 P75 P90 P95 P99 P999 HEATMAP], got RATE_SUM

