
Chronicle

The Hack Club Dashboard!

Purpose

Chronicle is the “Hack Club dashboard”, and is meant to:

  1. Provide metrics around HQ programs, like “How many people have completed a ‘you ship, we ship’ project like Sprig this month?”
  2. Enable Hack Club staff to do direct outreach to Hack Clubbers who are involved in one part of Hack Club, but not others yet (ex. “Who has published a Sprig game, but not yet come to an HQ hackathon?”).

Some examples of questions that Chronicle should be able to answer:

  • Who are active Hack Clubbers within driving distance of AngelHacks?
  • Who came to Epoch and has contributed to Hack Club repos on GitHub?
  • Who is new to Slack, but hasn’t contributed a Sprig game?
  • Who has a lot of expertise with Rust, but hasn’t contributed to the Burrow project nor joined its channel?

Key to this project is the fact that Hack Club is volunteer-led, and doesn’t want to mandate usage of universal systems (ex. unified hackathon registration) across the organization.

Components

Chronicle is split into two major components:

  1. A suite of command line tools that load, transform, conflate, and ultimately sync data into a target Elasticsearch cluster (the sync step is sketched below)
  2. An Elasticsearch cluster w/ Kibana used for creating dashboards
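
To make that sync step concrete, here is a minimal sketch of bulk-indexing a batch of snapshot records into Elasticsearch using the Elasticsearch high-level REST Java client. The client choice, the cluster address, the hack-clubbers index name, and the record shape are illustrative assumptions, not Chronicle's actual implementation or schema.

import org.apache.http.HttpHost;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

import java.io.IOException;
import java.util.List;
import java.util.Map;

public class SnapshotSync {

    // Bulk-indexes a batch of snapshot records into the target cluster.
    // "hack-clubbers" is a placeholder index name, and localhost:9200 matches
    // the local docker-compose stack described further down.
    public static void sync(List<Map<String, Object>> docs) throws IOException {
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            BulkRequest bulk = new BulkRequest();
            for (Map<String, Object> doc : docs) {
                bulk.add(new IndexRequest("hack-clubbers").source(doc));
            }
            client.bulk(bulk, RequestOptions.DEFAULT);
        }
    }
}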

Usage

Access to Chronicle will be limited to a very small set of Hack Club HQ employees, and to a strictly defined set of use cases.

Data

Data sources

Chronicle uses data from a variety of sources, including but not limited to:

  • Airtable (leaders table, ops address table)
  • Scrapbook DB
  • Slack APIs
  • Google Geocoding APIs
  • GitHub APIs
  • Pirateship
  • Raw Slack data exports (ONLY public data)

Data freshness

By design, Chronicle does not sync changes in the underlying data to Elasticsearch promptly. Instead, snapshots are generated periodically from the origin data sources and are consumed the next time we perform a sync.
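
As an illustration of that snapshot-then-consume flow, a loader might read a previously generated CSV snapshot into memory before handing the records to the sync step sketched above. This is a rough sketch assuming a simple comma-separated layout with a header row; real snapshots may need proper CSV parsing.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SnapshotLoader {

    // Reads a previously generated CSV snapshot into memory so it can be fed
    // to the sync step. Parsing is deliberately naive: it assumes a header row
    // and no quoted or embedded commas.
    public static List<Map<String, Object>> load(Path snapshot) throws IOException {
        List<String> lines = Files.readAllLines(snapshot);
        String[] header = lines.get(0).split(",");
        List<Map<String, Object>> docs = new ArrayList<>();
        for (String line : lines.subList(1, lines.size())) {
            String[] values = line.split(",");
            Map<String, Object> doc = new HashMap<>();
            for (int i = 0; i < header.length && i < values.length; i++) {
                doc.put(header[i], values[i]);
            }
            docs.add(doc);
        }
        return docs;
    }
}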

Data privacy

Given the sensitive nature of the data Chronicle uses, Chronicle is not a tool made for public consumption. All data posted in this repo is mock data used for testing and development, and does not relate to any real person.

Please refer to Hack Club's official data policy guidelines here.

Local development environment

Bring up the local Elasticsearch stack

cd docker
docker-compose up

Package the CLI tool into a fat jar (with dependencies included)

mvn package
java -jar target/chronicle-1.0-SNAPSHOT-jar-with-dependencies.jar -h

Running tests locally

mvn test

Contribution guide

All contributions must...

  • ... be sent in pull request form...
  • ... have at least one reviewer approving from the 'infra' team...
  • ... not cause any test to fail...

...before merging code to the main branch.


Issues

Store kibana dashboards in git and synchronize them

For transparency, and to keep a backup of the dashboard configurations we use, we should export the current dashboard(s) from Kibana and store them in the repo, and perhaps add some simple scripts to sync those exports to a destination cluster.
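
As a hedged starting point for the export half: Kibana exposes a saved objects export API (POST /api/saved_objects/_export) that returns dashboards as NDJSON, which could be committed to this repo. The sketch below assumes the local docker-compose Kibana at localhost:5601 with no authentication.

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class DashboardExporter {

    // Calls Kibana's saved objects export API and writes the returned NDJSON
    // to a file that can be committed to the repo. localhost:5601 with no
    // authentication matches the local stack and is an assumption.
    public static void exportDashboards(Path outputFile) throws IOException, InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:5601/api/saved_objects/_export"))
                .header("kbn-xsrf", "true")
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"type\": \"dashboard\"}"))
                .build();
        client.send(request, HttpResponse.BodyHandlers.ofFile(outputFile));
    }
}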

Create utility to export postgres tables to csv

Something that would allow us to take a snapshot of the current state of a Postgres table and persist it locally to disk.

Imagined CLI experience: csvify postgres <host> <database> <username> <password> <table> outputfile.csv
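
A rough sketch of that utility using plain JDBC is below (it needs the Postgres JDBC driver on the classpath); quoting is naive and the table name is interpolated directly into the query, so treat it as a starting point rather than a finished tool.

import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class PostgresCsvExporter {

    // Dumps a single table to CSV over JDBC. CSV quoting/escaping is naive.
    public static void export(String host, String database, String user,
                              String password, String table, String outputFile) throws Exception {
        String url = "jdbc:postgresql://" + host + "/" + database;
        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM " + table);
             PrintWriter out = new PrintWriter(new FileWriter(outputFile))) {

            ResultSetMetaData meta = rs.getMetaData();
            int columns = meta.getColumnCount();

            // Header row from column names.
            StringBuilder header = new StringBuilder();
            for (int i = 1; i <= columns; i++) {
                if (i > 1) header.append(',');
                header.append(meta.getColumnName(i));
            }
            out.println(header);

            // One CSV line per row.
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= columns; i++) {
                    if (i > 1) row.append(',');
                    row.append(rs.getString(i));
                }
                out.println(row);
            }
        }
    }
}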

Coverage analysis: Geodata

Perform an analysis of expected geodata coverage across all Hack Clubbers vs. what is currently in Chronicle. Feedback from zrl is that he expected significantly more coverage in certain localities.

Create utility to export airtable tables to csv

Something that would allow us to take a snapshot of the current state of an Airtable table and persist it locally to disk.

Imagined CLI experience: csvify airtable <host> <database> <username> <password> <table> outputfile.csv
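
A sketch of the fetch half against Airtable's REST API is below. Airtable is keyed by base ID, table name, and API key rather than host/database/username/password, so the imagined CLI arguments would likely need adjusting; pagination (the "offset" field in responses) is omitted, and Jackson on the classpath is an assumption. The CSV-writing logic from the Postgres sketch above could be reused on top of this.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AirtableFetcher {

    // Fetches one page of records from Airtable's REST API and prints each
    // record's "fields" object. Pagination is omitted for brevity.
    public static void fetch(String baseId, String tableName, String apiKey) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.airtable.com/v0/" + baseId + "/" + tableName))
                .header("Authorization", "Bearer " + apiKey)
                .GET()
                .build();
        String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();

        JsonNode records = new ObjectMapper().readTree(body).path("records");
        for (JsonNode rec : records) {
            System.out.println(rec.path("fields"));
        }
    }
}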
