
Backstage Plugin dbt

Backstage plugins to view dbt docs.


Features

  • List all dbt models and tests
  • Get details on dbt models and tests like:
    • Documentations
    • Stats
    • Columns
    • Dependency graph
    • Code source (raw and compiled)

Limitations

As of now, the plugin only supports the following backends:

  • Google Cloud Storage
  • AWS S3
  • Azure Blob Storage

Screenshots

Landing page:

Model details:

Test details:

Note: catalog and manifest come from https://github.com/fivetran/dbt_shopify/tree/main

Setup

  1. Install packages:
# From your Backstage root directory
yarn --cwd packages/app add @iiben_orgii/backstage-plugin-dbt
yarn --cwd packages/backend add @iiben_orgii/backstage-plugin-dbt-backend
  2. Add a new dbt tab to the entity page.

packages/app/src/components/catalog/EntityPage.tsx

// packages/app/src/components/catalog/EntityPage.tsx

import { DbtPage, isDBTAvailable } from "@iiben_orgii/backstage-plugin-dbt";

// Farther down at the serviceEntityPage declaration
const serviceEntityPage = (
  <EntityLayout>
    {/* Place the following section where you want the tab to appear */}
    <EntityLayout.Route if={isDBTAvailable} path="/dbt" title="dbt">
      <DbtPage />
    </EntityLayout.Route>
  </EntityLayout>
);

Legacy backend system

  1. Add the dbt route by creating the file packages/backend/src/plugins/dbt.ts:

packages/backend/src/plugins/dbt.ts

import {
  createRouter,
} from "@iiben_orgii/backstage-plugin-dbt-backend";

import { Router } from "express";
import { PluginEnvironment } from "../types";

export default async function createPlugin(
  env: PluginEnvironment,
): Promise<Router> {
  return await createRouter({
    logger: env.logger,
    config: env.config,
  });
}

Then add the route as follows:

packages/backend/src/index.ts

// packages/backend/src/index.ts
import dbt from "./plugins/dbt";

async function main() {
  //...
  const dbtEnv = useHotMemoize(module, () => createEnv("dbt"));
  //...
  apiRouter.use("/dbt", await dbt(dbtEnv));
  //...
}

New backend system

In the file packages/backend/src/index.ts:

// packages/backend/src/index.ts
import { createBackend } from '@backstage/backend-defaults';
import { dbtPlugin } from '@iiben_orgii/backstage-plugin-dbt-backend';

// [...]
const backend = createBackend();
// [...]

backend.add(dbtPlugin);

// [...]
backend.start();

Usage

Single bucket model

You can define one bucket with all your manifest and catalog files.

Add a file application/packages/app/config.d.ts:

export interface Config {
  dbtdoc: {
    /**
     * Name of the bucket where the dbt doc files are stored
     * @visibility frontend
     */
    bucket: string;
  };
}

Update the file application/packages/app/package.json with

  "files": [
    "dist",
    "config.d.ts"
  ],
  "configSchema": "config.d.ts"

Then you can add to your app-config.yaml:

dbtdoc:
  bucket: your-bucket-123
  backend: GoogleStorage # or S3
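How the backend might consume this config block can be sketched as follows. This is illustrative only: `ConfigReader` is a minimal stand-in for Backstage's `Config` interface, and the `readDbtDocConfig` helper and its `GoogleStorage` default are assumptions, not the plugin's actual internals.

```typescript
// Illustrative only: a minimal stand-in for Backstage's Config reader.
type ConfigReader = {
  getOptionalString(key: string): string | undefined;
};

function readDbtDocConfig(config: ConfigReader): {
  bucket?: string;
  backend: string;
} {
  return {
    // The global bucket is optional (absent in the multi-bucket setup below).
    bucket: config.getOptionalString("dbtdoc.bucket"),
    // Defaulting to GoogleStorage is an assumption for this sketch.
    backend: config.getOptionalString("dbtdoc.backend") ?? "GoogleStorage",
  };
}
```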

One bucket per application, or override the global config

Limitation: all dbt docs must be stored on the same backend type (GoogleStorage or S3).

Add dbtdoc-bucket as an annotation in catalog-info.yaml. Optionally, add the dbtdoc-path annotation if your GCS bucket structure does not conform to the expected structure.

apiVersion: backstage.io/v1alpha1
kind: Component
spec:
  type: service
  owner: user:guest
  lifecycle: experimental
metadata:
  name: "test"
  annotations:
    dbtdoc-bucket: "my-bucket"
    dbtdoc-path: "optional/override/path" # Optional

Then you can add to your app-config.yaml:

dbtdoc:
  backend: GoogleStorage # or S3
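The precedence between the entity annotation and the global config can be sketched as follows; `resolveDbtBucket` and the minimal `EntityLike` type are illustrative assumptions, not the plugin's actual API (the real entity type is `Entity` from `@backstage/catalog-model`).

```typescript
// Minimal entity shape for illustration only.
type EntityLike = {
  metadata: { annotations?: Record<string, string> };
};

// The dbtdoc-bucket annotation, when present, takes precedence over the
// global dbtdoc.bucket value from app-config.yaml.
function resolveDbtBucket(
  entity: EntityLike,
  globalBucket?: string,
): string | undefined {
  return entity.metadata.annotations?.["dbtdoc-bucket"] ?? globalBucket;
}
```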

Files path in the bucket

The following paths must be respected regardless of your bucket setup (single or multi).

If using the multi setup, you can override the {kind}/{name} portion of the path using the dbtdoc-path annotation.

Upload your manifest.json and catalog.json to a GCS bucket as follows:

  • {dbtdoc-bucket}/{kind}/{name}/manifest.json
  • {dbtdoc-bucket}/{kind}/{name}/catalog.json
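Putting the two rules together, the object key inside the bucket could be derived like this. A sketch only: the function name and parameters are assumptions for illustration, not the plugin's API.

```typescript
// Builds the object key inside the dbtdoc bucket for one of the two
// dbt artifacts. The dbtdoc-path annotation, when set, replaces the
// default {kind}/{name} prefix.
function dbtDocObjectKey(
  kind: string,
  name: string,
  file: "manifest.json" | "catalog.json",
  pathOverride?: string, // value of the optional dbtdoc-path annotation
): string {
  const prefix = pathOverride ?? `${kind}/${name}`;
  return `${prefix}/${file}`;
}
```

The full storage URI is then gs://{dbtdoc-bucket}/{key} (or the S3/Azure equivalent).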

Authentication

For authentication to the GCS bucket, the plugin uses Application Default Credentials (ADC); see https://cloud.google.com/docs/authentication/provide-credentials-adc. For local development, you can set these up with gcloud auth application-default login.

Update from v1 to v2

Update the app-config.yaml as follows:

dbtdoc:
  bucket: your-bucket-123
  backend: GoogleStorage

or

dbtdoc:
  bucket: your-bucket-123
  backend: S3

Apply the following changes only if you use the old backend system.

In v1: packages/backend/src/plugins/dbt.ts

// packages/backend/src/plugins/dbt.ts
import {
  createRouter,
  GoogleStorageProvider,
} from "@iiben_orgii/backstage-plugin-dbt-backend";
import { Router } from "express";
import { PluginEnvironment } from "../types";

const storageProvider = new GoogleStorageProvider();

export default async function createPlugin(
  env: PluginEnvironment,
): Promise<Router> {
  return await createRouter({
    logger: env.logger,
    storageProvider: storageProvider,
  });
}

In v2: packages/backend/src/plugins/dbt.ts

// packages/backend/src/plugins/dbt.ts
import {
  createRouter,
} from "@iiben_orgii/backstage-plugin-dbt-backend";

import { Router } from "express";
import { PluginEnvironment } from "../types";

export default async function createPlugin(
  env: PluginEnvironment,
): Promise<Router> {
  return await createRouter({
    logger: env.logger,
    config: env.config,
  });
}

Contributors

dependabot[bot], iibenii, mattiasmts, sqsp-rwatkins


backstage-plugin-dbt's Issues

Section "Columns" empty

Hi,

thank you for your work!

When I click the "Details" button on any row, the "Columns" section is empty:
(screenshot: empty Columns section, 2024-06-24)

But with the built-in docs (dbt docs serve) I'm able to see the columns:
(screenshot: columns shown by dbt docs serve, 2024-06-24)

This is an extract of the json:

"nodes": {
  "model.model.test_model": {
    "config": {
      "column_types": {

      },
    },
    "description": "",
    "columns": {
      "country": {
        "name": "country",
        "description": "",
        "meta": {

        },
        "data_type": "STRING",
        "constraints": [],
        "quote": true,
        "tags": []
      },
      "transactionDate": {
        "name": "transactionDate",
        "description": "",
        "meta": {

        },
        "data_type": "DATE",
        "constraints": [],
        "quote": true,
        "tags": []
      },
      "bookingId": {
        "name": "bookingId",
        "description": "",
        "meta": {

        },
        "data_type": "STRING",
        "constraints": [],
        "quote": true,
        "tags": []
      },
      "transactionStatus": {
        "name": "transactionStatus",
        "description": "",
        "meta": {

        },
        "data_type": "STRING",
        "constraints": [],
        "quote": true,
        "tags": []
      },
      "value": {
        "name": "value",
        "description": "",
        "meta": {

        },
        "data_type": "FLOAT",
        "constraints": [],
        "quote": true,
        "tags": []
      },
    },
    "docs": {
      "show": true,
      "node_color": "green"
    },
    "build_path": null,
    "deferred": false,
  },
}

Could you please help me better understand why?

Thank you in advance

Allow multiple storage provider

As @MattiasMTS noted in this comment, maybe we can add a publisher setting, like TechDocs has, to specify which storage provider to use. Right now, the storage provider is chosen by which one we import when adding the plugin to Backstage.

Should be ready for review now. I'm happy to add any "publisher" type in the metadata (similar to the "publisher type" techdocs has: azureBlobStorage, awsS3, etc.)

Let me know what you think @IIBenII !

Originally posted by @MattiasMTS in #165 (comment)

Allow custom file path

Hello,

Thank you for your quick work on the new backend system.
Do you think you could make the file path customizable?

I'm dealing with legacy docs and it could be tricky to move them.

Thanks,
Martinho

dbtdoc-path in root bucket

Hello,

Me again :)
Thanks for the last PR about custom path.

I'm still having an issue because my JSON files are in the root of my bucket, and the last PR doesn't seem to allow that.

Do you think it's possible to allow it ?

Thanks

New backend system

Hello,

Do you have plans to make this module compatible with the new backend system?

Thank you
