Comments (3)
2005-01-02 would be a good format, but maybe it's better to have it configurable. Google Cloud Storage doesn't actually have folders. It just groups files for you in a folder-like structure when you use slashes in the object name.
Right, however, even in your original request you started giving the objects a path, so I assume others would ask for the same. We could do something similar to what we do in the opensearch output, where the index name takes a Golang template.
What I am thinking of then is a config like this:
[[outputs.google_cloud_storage]]
## Bucket
## Name of Cloud Storage bucket to send metrics to.
bucket = ""
## Object name
## Target object name for metrics. This is a Golang template (see
## https://pkg.go.dev/text/template). You can also specify metric name
## (`{{.Name}}`), tag value (`{{.Tag "tag_name"}}`), field value
## (`{{.Field "field_name"}}`), or timestamp (`{{.Time.Format "xxxxxxxxx"}}`).
## If the tag does not exist, its value defaults to the empty string "".
##
## For example: "telegraf-{{.Time.Format \"2006-01-02\"}}-{{.Tag \"host\"}}"
## would set it to `telegraf-2023-07-27-HostName`
object_name = ""
## Data format to output
## Each data format has its own unique set of configuration options, read
## more about them here:
## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_OUTPUT.md
# data_format = "influx"
## Credentials file
## Optional. File path for GCP credentials JSON file to authorize calls to
## Google Cloud Storage APIs. If not set explicitly, Telegraf will attempt to use
## Application Default Credentials, which is preferred.
# credentials_file = "path/to/my/creds.json"
from telegraf.
Hi,
Some questions around the proposal:
Have you looked into how to manage credentials?
bucket = "my-bucket"
Would telegraf create the bucket or would we assume the user has created it?
metrics_per_object = 1
If you have 20 objects, would you then write 20 files at every interval? Likewise, if you have 10,000 metrics, 10,000 files? Rather than dividing metrics up, shouldn't the plugin respect the serializer's batch-format setting instead?
group_by = "day"
What are you assuming the date would look like? 2005-01-02? Are you assuming Telegraf would create and manage different folders and auto-create new ones? How does that relate to the group by?
Are you planning to submit a PR?
Have you looked into how to manage credentials?
I assumed credentials would work in the same way as they do for the google_cloud_storage input plugin.
Would telegraf create the bucket or would we assume the user has created it?
Creating the bucket is not a hard requirement for me, but it would be nice if Telegraf could take care of it.
If you have 20 objects, would you then write 20 files at every interval? Likewise, if you have 10,000 metrics, 10,000 files? Rather than dividing metrics up, shouldn't the plugin respect the serializer's batch-format setting instead?
Right, this is better handled by the serializer indeed.
What are you assuming the date would look like? 2005-01-02? Are you assuming Telegraf would create and manage different folders and auto-create new ones? How does that relate to the group by?
2005-01-02 would be a good format, but maybe it's better to have it configurable. Google Cloud Storage doesn't actually have folders. It just groups files for you in a folder-like structure when you use slashes in the object name.
Are you planning to submit a PR?
I'm afraid not.