
aqueduct-courier's People

Contributors

anyu, drich10, idoru, jamesburchell, joyvuu-dave, jtarchie, kcboyle, maliksalman, mattcampbell97, simonjjones, soup-of-the-day, sunjaybhatia


aqueduct-courier's Issues

Please configure GITBOT

Pivotal provides the Gitbot service to synchronize issues and pull requests made against public GitHub repos with Pivotal Tracker projects.

If you do not want to use Pivotal Tracker to manage this GitHub repo, you do not need to take any action.

If you are a Pivotal employee, you can configure Gitbot to sync your GitHub repo to your Pivotal Tracker project with a pull request.

Steps:

  • Add the Toolsmiths-Bots team to have admin access to your repo
  • Add the cf-gitbot ([email protected]) user to have owner access to your Pivotal Tracker project
  • Create new branch in this repo: cfgitbot-config (an ask+cf@ ticket is the fastest way to get write access if you get a 404)
  • Add your new repo and/or project to the config-production.yml file
  • Submit a PR, which will get auto-merged if you've done it right. Detailed steps here

If you are not a pivotal employee, you can request that [email protected] set up the integration for you.

You might also be interested in configuring GitHub's Service Hook for Tracker on your repo so you can link your commits to Tracker stories. You can do this yourself by following the directions at:

https://www.pivotaltracker.com/blog/guide-githubs-service-hook-tracker/

If there are any questions, please reach out to [email protected].

[Feature] configurable --target during send

In order to ingest our telemetry data for our own consumption, the telemetry send command should accept a --target flag to shuttle the data to endpoints other than the Pivotal one.

Although we could consume the produced tarball directly, being able to leverage the send command lets us easily integrate our existing CI process to send telemetry data to both Pivotal and our own endpoints.
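To make the proposal concrete, here is a minimal sketch of the flag's intended behavior in Python. The real CLI is a Go binary; the flag names, the stand-in default endpoint, and the subcommand layout below are all assumptions for illustration, not the tool's actual API.

```python
import argparse

# Hypothetical model of `telemetry send` with an optional --target override.
parser = argparse.ArgumentParser(prog="telemetry")
subcommands = parser.add_subparsers(dest="command")
send = subcommands.add_parser("send")
send.add_argument("--path", required=True, help="collected data tarball")
send.add_argument("--api-key", required=True, dest="api_key")
send.add_argument(
    "--target",
    default="https://telemetry.example.com",  # assumed stand-in for the Pivotal endpoint
    help="endpoint to receive the data; defaults to the Pivotal endpoint",
)

# Operators could then point the same send command at their own endpoint:
args = parser.parse_args(
    ["send", "--path", "data.tar", "--api-key", "secret",
     "--target", "https://telemetry.internal.example"]
)
print(args.target)  # https://telemetry.internal.example
```

The key design point is that omitting --target keeps today's behavior (data goes to Pivotal), so the flag is purely additive.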

background

With that said, I do not yet have an endpoint to receive the data, nor a system to marshal, process, or visualize it. We're currently working on some in-house tools, and as telemetry gets fleshed out, we'll start putting engineering time into it. I'm also not 100% sure what we'd do about the --api-key yet; I'd have to do a little reverse-engineering of how things work.

As a side note, if the strategy for telemetry is to have customers collect and visualize their own data, it could be beneficial to publish API contracts for how the data and the HTTP payloads are structured, and/or some sample data. It would also help to know a little about how the data is being processed, so customers can get ideas on how to use it themselves 👍
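As a sketch of what a published contract would enable: with the payload shape documented, operators could validate their own ingestion end with a few lines of code. The field names below are invented for illustration; they are not the real telemetry schema.

```python
# Hypothetical published contract: required fields and their expected types.
CONTRACT = {"foundation_id": str, "environment": str, "data": dict}

def conforms(payload: dict) -> bool:
    """Check that a payload has every contract field with the expected type."""
    return all(isinstance(payload.get(field), kind) for field, kind in CONTRACT.items())

sample = {"foundation_id": "abc123", "environment": "production", "data": {}}
print(conforms(sample))  # True
print(conforms({}))      # False: missing every field
```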

Thoughts?
Thanks for your time!

[Feature] usefulness of combined collect/send?

In order to no longer require storing the collected data tarball locally and then sending it with a separate command, it may be beneficial to combine the collect/send process into one step (held in memory, perhaps?) that writes nothing to disk.

background

I'm not sure what the UX would look like for this. I like having the separation of telemetry send/collect, but if we had something like telemetry collect-and-send (telemetry push? I'm horrible at naming), we wouldn't need to pass the tarball around as a separate Concourse task (which isn't a huge deal). More interestingly, we could run telemetry in different places, such as a scheduled task on PCF itself. Which I guess we could already do, but it may be more convenient as a single command that writes nothing to disk.
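The in-memory idea can be sketched in a few lines. This is Python for illustration only (the tool itself is written in Go), and `collect`/`send` are invented stand-ins for the real collection and HTTP logic, including the sample file name and endpoint.

```python
import io
import tarfile

def collect() -> bytes:
    """Build the tarball in an in-memory buffer instead of on disk."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        payload = b'{"product": "example-tile"}'  # invented sample content
        info = tarfile.TarInfo(name="opsmanager/products.json")
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
    return buf.getvalue()

def send(tarball: bytes, target: str) -> str:
    # A real implementation would POST `tarball` to `target`.
    return f"sent {len(tarball)} bytes to {target}"

# The combined step: tarball bytes flow straight from collect to send,
# never touching the filesystem.
print(send(collect(), "https://telemetry.example.com"))
```

Whether this is worth it depends on tarball size; buffering everything in memory is exactly the trade-off a combined command would have to accept.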

As you might deduce, this is definitely a low-priority issue. In fact, the more I think about it, the more I suspect it's not that valuable a feature and is a "meh" use case. But I've already written this issue up, so I'll leave it open and close it real soon 😉

Thanks for your time!

What is this project?

I'm very curious. Can you tell us a little bit about what this project is?

Looks like a tool to configure opsmgr + tiles via externalized git config?

Ability to configure IaaS type alongside environment

In order to provide accurate telemetry data across multi-cloud configurations, a PCF operator should be able to specify the IaaS type in their telemetry data (AWS, vSphere, Azure, etc.)

background

For example, you're able to specify the environment (qa, development, pre-prod, prod) but not able to specify the IaaS (AWS, Azure, GCP, etc.).

Perhaps I'm misunderstanding how telemetry data aggregation works (maybe this already happens automatically?), but I'm concerned that if I send telemetry from my AWS development foundation and from my Azure development foundation, you'll lose track of which IaaS each data set belongs to.
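The concern can be shown in a tiny sketch: if the metadata carries only the environment, two "development" foundations on different clouds produce identical tags. The field names and the proposed iaas_type tag below are assumptions for illustration, not the tool's real metadata format.

```python
def metadata(env_type, iaas_type=None):
    """Hypothetical per-foundation tags attached to a telemetry submission."""
    tags = {"env_type": env_type}
    if iaas_type is not None:
        tags["iaas_type"] = iaas_type
    return tags

# Today: AWS and Azure development foundations are indistinguishable.
aws_now, azure_now = metadata("development"), metadata("development")
# Proposed: the IaaS survives aggregation.
aws_new, azure_new = metadata("development", "aws"), metadata("development", "azure")
print(aws_now == azure_now, aws_new == azure_new)  # True False
```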
