
DataHub

Raft's Submission for Metadata Day 2022

This repository is a mirror of the DataHub repo containing our modifications to support single-click data products in Superset. The following changes were made:

  • An additional frontend component was created: datahub-web-react/src/app/entity/shared/containers/profile/sidebar/SidebarVisualizeWithSupersetSection.tsx
  • The frontend component was added to datahub-web-react/src/app/entity/dataset/DatasetEntity.tsx
  • An additional GraphQL mutation was added to datahub-graphql-core/src/main/resources/entity.graphql and datahub-web-react/src/graphql/mutations.graphql
  • A resolver for the GraphQL mutation was added at datahub-graphql-core/src/main/java/com/linkedin/datahub/graphql/resolvers/mutate/VisualizeWithSupersetResolver.java and registered with the GraphQL engine in datahub-graphql-core (a sketch of the mutation's likely shape follows this list)
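
For reference, a minimal sketch of what the added mutation might look like in entity.graphql. The name, argument, and return type here are assumptions inferred from the resolver's name; the actual signature in the submission may differ:

    # Hypothetical sketch; the real signature may differ.
    extend type Mutation {
      """
      Returns a Superset SQL Editor URL pre-populated with a query
      against the dataset identified by the given URN.
      """
      visualizeWithSuperset(urn: String!): String
    }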

Deployment

For this hackathon, we deployed all services on Kubernetes. The following components must be present to run the hackathon submission:

  • ZooKeeper for Kafka (we used 1 node)
  • Kafka brokers (we used 3)
  • Confluent Schema Registry
  • Elasticsearch
  • Trino, with an additional catalog registered (see the Helm sketch after this list):
    additionalCatalogs:
      postgres: |
        connector.name=postgresql
        connection-url=jdbc:postgresql://postgres:5432/postgres
        connection-user=datahub
        connection-password=datahub
    
  • Apache Superset
  • Redis
  • Postgres
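
One way to register that catalog is through the Trino community Helm chart's additionalCatalogs value. A sketch, assuming the official trinodb chart and an illustrative values file named trino-values.yaml containing the block above:

    # Release and file names are illustrative; adjust to your cluster.
    helm repo add trino https://trinodb.github.io/charts
    helm upgrade --install trino trino/trino -f trino-values.yaml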

The values.yaml from acryldata/datahub-helm was modified to add the following environment variables to datahub-gms (a sketch of applying the change follows the block):

    - name: SUPERSET_PASSWORD
      value: "admin"
    - name: SUPERSET_USERNAME 
      value: "admin"
    - name: SUPERSET_ENDPOINT
      value: "http://superset:8088"
    - name: SUPERSET_EXTERNAL_ENDPOINT
      value: "http://localhost:8088"
    - name: SUPERSET_PROVIDER
      value: "db"
    - name: DATASOURCE_PASSWORD
      value: ""
    - name: DATASOURCE_USERNAME 
      value: "trino"
    - name: DATASOURCE_HOST_AND_PORT
      value: "trino:8080"

Running the demo

  1. Once the demo is deployed, download AIS_2020_01_01-first-10000.csv.
  2. Copy the file to /tmp on the Postgres pod (see the kubectl sketch after these steps), or port-forward Postgres locally and load it with psql, replacing the server-side COPY in the next step with psql's client-side \copy.
  3. Create a file with the following SQL statements:
    CREATE TABLE ais_noaa_2020 (
        mmsi varchar,
        basedatetime timestamp,
        lat numeric(10, 6),
        lon numeric(10, 6),
        sog numeric(8, 2),
        cog numeric(8, 2),
        heading numeric(6, 0),
        vesselname varchar,
        imo varchar,
        callsign varchar,
        vesseltype numeric(6, 0),
        status numeric(6, 0),
        length numeric(6, 2),
        width numeric(6, 2),
        draft numeric(6, 2),
        cargo numeric(6, 0),
        transceiverclass varchar
    );

    COPY ais_noaa_2020
    FROM '/tmp/AIS_2020_01_01-first-10000.csv'
    DELIMITER ','
    CSV HEADER;
  4. On the pod or locally, run:
    psql -U postgres -d postgres -W -f <path-to-sql-create>
    
  5. In DataHub, create the Trino ingestion source:
    source:
      type: trino
      config:
        host_port: 'trino:8080'
        username: trino
        database: postgres
        # Supported environment values: https://datahubproject.io/docs/graphql/enums/#fabrictype
        env: DEV # Change if using a different environment.  
        # Enables profiling of SQL tables (row count, column count, etc.)
        # More information about profile: https://datahubproject.io/docs/metadata-ingestion/docs/dev_guides/sql_profiles
        profiling:
          enabled: true
          include_field_median_value: false
    
    sink:
      type: datahub-rest
      config:
        server: 'http://datahub-gms:8080'
  6. Run the ingestion.
  7. Once it completes, port-forward Superset.
  8. Navigate to the ais_noaa_2020 dataset in DataHub and click the Visualize with Superset button. You should be redirected to Superset's SQL Editor page, with a pre-populated query against the dataset.
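
For reference, a sketch of the kubectl commands behind steps 2, 4, and 7. The pod and service names (postgres-0, superset) and the SQL file name are illustrative; adjust them to your deployment:

    # Step 2: copy the CSV (and the SQL file from step 3) into the Postgres pod.
    kubectl cp AIS_2020_01_01-first-10000.csv postgres-0:/tmp/AIS_2020_01_01-first-10000.csv
    kubectl cp create_ais_noaa_2020.sql postgres-0:/tmp/create_ais_noaa_2020.sql
    # Step 4: run the SQL on the pod.
    kubectl exec -it postgres-0 -- psql -U postgres -d postgres -f /tmp/create_ais_noaa_2020.sql
    # Step 7: expose Superset on the SUPERSET_EXTERNAL_ENDPOINT configured above.
    kubectl port-forward svc/superset 8088:8088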

DataHub: The Metadata Platform for the Modern Data Stack

Built with ❤️ by Acryl Data and LinkedIn


🏠 Project Homepage: datahubproject.io


Quickstart | Documentation | Features | Roadmap | Adoption | Demo | Town Hall


📣 Next DataHub town hall meeting on Feb 25th, 9am-10am PDT (convert to your local time)


Introduction

DataHub is an open-source metadata platform for the modern data stack. Read about the architectures of different metadata systems and why DataHub excels here. Also read our LinkedIn Engineering blog post, check out our Strata presentation and watch our Crunch Conference Talk. You should also visit DataHub Architecture to get a better understanding of how DataHub is implemented.

Quickstart

Please follow the DataHub Quickstart Guide to get a copy of DataHub up & running locally using Docker. The guide assumes some basic knowledge of Docker, so if Docker is completely foreign to you, we recommend going through the "Hello World" example of A Docker Tutorial for Beginners first.
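
In short, the quickstart boils down to roughly the following (a sketch; see the guide for prerequisites and the current commands):

    # Install the DataHub CLI, then launch the local Docker deployment.
    python3 -m pip install --upgrade acryl-datahub
    datahub docker quickstart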

Development

If you're looking to build and modify DataHub, please take a look at our Development Guide.
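
DataHub builds with Gradle; as a quick orientation (a sketch, with the module target as an example):

    # Build everything from the repository root, or a single module such as
    # the GraphQL core modified by this submission.
    ./gradlew build
    ./gradlew :datahub-graphql-core:build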

Demo and Screenshots

There's a hosted demo environment where you can play around with DataHub before installing.


Source Code and Repositories

  • datahub-project/datahub: This repository contains the complete source code for DataHub's metadata model, metadata services, integration connectors and the web application.

Documentation

We have documentation available at https://datahubproject.io/docs/.

Releases

See Releases page for more details. We follow the SemVer Specification when versioning the releases and adopt the Keep a Changelog convention for the changelog format.

Features & Roadmap

Check out DataHub's Features & Roadmap.

Contributing

We welcome contributions from the community. Please refer to our Contributing Guidelines for more details. We also have a contrib directory for incubating experimental features.

Extending

If you need to understand how to extend our model with custom types, please see Extending the Metadata Model.

Community

Join our Slack workspace for discussions and important announcements. You can also find out more about our upcoming town hall meetings and view past recordings.

Adoption

Here are the companies that have officially adopted DataHub. Please feel free to add yours to the list if we missed it.

Select Articles & Talks

See the full list here.

License

Apache License 2.0.
