
dbt2looker's Introduction

The open-source Looker alternative.


Website · Watch demo · Docs · Join Slack Community

Enable everybody in your company to answer their own questions using data

Connect your dbt project → add metrics into dbt → share insights with your team

If you're a fan, star the repo ⭐️ (we plant a tree for every GitHub star we get 🌱).

Come join the team, we're hiring.

Features:

  • 🙏 Familiar interface for your users to self-serve using pre-defined metrics
  • 👩‍💻 Declare dimensions and metrics in yaml alongside your dbt project
  • 🤖 Automatically creates dimensions from your dbt models
  • 📖 dbt descriptions and metadata synced for your users
  • 🔍 Easily access the underlying records of charts and drill down into the data
  • 🧮 Table calculations make it easy to dig into your data, on the fly
  • 🕵️‍♀️ Lineage lets you see the upstream and downstream dependencies of a model
  • 📊 Comprehensive and intuitive data visualisation library for your metrics
  • 👷‍♂️ Save charts & build dashboards to share your insights with your team
  • 💻 Powerful developer experience including Preview BI Environments and automated content validation via CI/CD
  • 🔄 Explore version history of all your charts and roll-back at any point
  • 🚀 Easily share your work via URL or schedule deliveries via Slack or Email

Something missing? Check out our open issues to see if what you're looking for already exists (and give it a 👍). Otherwise, we'd love it if you'd open a new issue with your request 😊

Demo

Play with our demo app!

Quick start

Start with Lightdash Cloud

You can avoid the hassle of hosting and configuring Lightdash yourself by signing up for a free trial of Lightdash Cloud. More details on pricing are available.

1-click deploy

Deploy Lightdash with 1-click on Render for free.

Run locally

Take advantage of our installation script to easily run Lightdash locally.

git clone https://github.com/lightdash/lightdash
cd lightdash
./scripts/install.sh

Deploy to production

Follow our Kubernetes guide to deploy Lightdash to production using our community Helm charts.

Getting started

Step 1 - ⚡️ Self-host Lightdash (optional)

Step 2 - 🔌 Connect a project

Step 3 - 👩‍💻 Create your first metric

Community Support

📣 If you want something a bit more, then head on over to our Slack Community where you’ll be able to chat directly with all of us at Lightdash and all the other amazing members of our community. We’re happy to talk about anything from feature requests, implementation details or dbt quirks to memes and SQL jokes!

You can also keep up to date with Lightdash by following us here:

About Lightdash

🗂 Keep all of your business logic in one place.

We let you define your metrics and dimensions directly in your dbt project, keeping all of your business logic in one place and increasing the context around your analytics.

No more deciding which of the four different values for total revenue is the right one (you can thank us later 😉).

🤝 Build trust in your data.

We want everyone at your company to feel like they can trust the data. So, why not show them that they can?

We bring the context you want around data quality into your BI tool so people know that they can trust the data.

🧱 Give users meaningful building blocks to answer their own data questions.

With Lightdash, you can leave the SQL to the experts. We give your data team the tools they need to build metrics and dimensions that everyone else can use.

So, anybody in the business can combine, segment, and filter these metrics and dimensions to answer their own questions.

📖 Open source, now and forever

Lightdash is built with our community, for our community.

We think that a BI tool should be affordable, configurable, and secure - and being open source lets us be all three 🙂

🤑 Affordable analytics

Love Looker, but don't love Looker's price tag?

With Lightdash, we offer a free self-hosted service (it's all just open source!), or an affordable cloud-service option if you're looking for an easy analytics setup.

Docs

Have a question about a feature? Or maybe fancy some light reading? Head on over to our Lightdash documentation to check out our tutorials, reference docs, FAQs and more.

Reporting bugs and feature requests

Want to report a bug or request a feature? Open an issue.

The Lightdash Forest

We're planting trees with the help of the Lightdash community.

Tree planting is one of the simplest and most cost-effective means of mitigating climate change, by absorbing CO2 from the atmosphere. So we thought it would be pretty neat to grow a forest while we grow Lightdash.

Want to help us grow our forest?

Just star this repo! We plant a tree for every star we get on GitHub. ⭐️ ➡️ 🌱

We plant trees with TIST, you can read all about them here: https://program.tist.org/.

Developing locally & Contributing

We love contributions big or small, check out our guide on how to get started.

See our instructions on developing Lightdash locally.

Contributors ✨

Thanks goes to these wonderful people (emoji key):

  • Rahul Jain: 📖
  • Oliver Laslett: 💻 📖 🐛 🎨 🚇
  • Katie Hindson: 🐛 📖 🎨 💻 🤔
  • Hamzah Chaudhary: 📖 💻 🤔 🐛
  • Harry Grieve: 📖
  • Dominik Dorfmeister: 🎨
  • amin-nejad: 🐛
  • Mitja Potočin: 💻
  • Jose Rego: 💻 🎨 📖 🐛 ⚠️ 🚇
  • Rahul: 🐛 🎨 💻 📖
  • Jeshua Maxey: 🐛
  • Sreejith Madhavan: 🐛
  • skame: 🐛 🎨
  • sgoley: 📖
  • djayatillake: 🎨 💻 🤔
  • Mukesh: 🔣 🐛
  • Andreia Freitas: ⚠️ 📖
  • jb: 💻 🐛 🎨
  • Amy Chen: 📖
  • John Keech: 🚇
  • Dr. Ernie Prabhakar: 🐛 🤔
  • PriPatel: 🎨 🐛 🤔
  • NaomiJohnson: 🎨 🐛
  • Rich Shen: 💻 ⚠️ 🐛
  • David Gasquez: 🤔 🎨
  • xjaner: 🤔
  • Chris Bol: 🤔
  • Anil V: 🤔
  • rlebrao: 🤔 🐛
  • philcarr-tsl: 🐛 🔣
  • HashimsGitHub: 🚇
  • Nathalia Buitrago Jurado: 📖 💻 🐛 🎨
  • norbag: 🐛
  • Shrpp: 🐛
  • Cuong Vu: 🐛
  • Takaaki Yoshikawa: 🤔
  • nkotlyarov: 🐛
  • kim monzon: 🤔
  • rverheijen: ⚠️ 🐛
  • Spencer Carrucciu: 🤔
  • Mark Olliver: 🐛
  • gary-beautypie: 🐛
  • André Claudino: 💻 🚇
  • Jim Park: 🚇
  • gc-p: 🐛
  • Michał Łazowik: 💻
  • Chun Wei: 🤔
  • snyh-paulhenderson: 🤔
  • Frank Hoffsümmer: 🐛
  • Sarah Moens: 📖
  • Abhishek K M: 💻
  • Javier Rengel Jiménez: 💻 🐛 ⚠️ 🚇 📖
  • Fisa: 🐛
  • JoelAlander: 🐛
  • Chad Floyd: 🤔
  • André Claudino: 🤔
  • 12ian34: 📖 🐛 💻
  • raphaelauv: 🐛 📖
  • BA-CY: 🤔
  • John Romanski: 🐛
  • Jamie Davenport: 🐛
  • Marcus Windmark: 🤔
  • Shruti Kuber: 📖
  • Fszta: 📖
  • Mohamed Muhsin: 💻
  • magants: 🤔
  • Martin Carlsson: 🤔 🐛
  • Tomas Čerkasas: 🤔
  • TiFaBl: 🤔
  • Eric Cecchi: 💻
  • KristyMayer: 🐛
  • rahulstomar08: 🤔
  • Charles Picowski: 🐛
  • Matt Machczynski: 🐛
  • Irakli Janiashvili: 💻 🐛 🎨 ⚠️
  • Gordon Lee: 🤔 📖
  • Olly: 🤔 🐛
  • gautamdoulani: 📖
  • David Peitinho: 🐛
  • Istvan Meszaros: 🤔
  • Rif: 📖
  • Phillip W.: 🤔
  • XiaozhouWang85: 🤔
  • Rebecca Sanjabi: 🐛
  • Kailin L: 🤔
  • Metin Karakus: 💻
  • Yasmine: 💻 🐛 🤔 🎨
  • Piotr Pilis: 💻
  • Judah Rand: 🐛
  • Annebelle Olminkhof: 📖
  • Victor Apolonio: 💻
  • Rodolfo Ferreira: 💻
  • Patrick Brusven: 💻
  • Thomas Purchas: 💻 🐛
  • Adrian Letchford: 🤔
  • Collins: 💻
  • Paul van der Linden: 🐛
  • Chris: 💻
  • Mike Thoun: 💻
  • Leon Kozlowski: 💻
  • Nathan Coleman: 💻
  • Nicolas Frati: 💻
  • Fred: 📖
  • Victor Lindell: 💻
  • stellar-ahmed: 🐛
  • Cooper Williams: 💻
  • Lokeswaran Aruljothi: 💻
  • João Viana: 📖 💻 🐛
  • Muhammad Jufry: 🐛
  • Patrik Braborec: 📖 🐛
  • David Flanagan: 💻 🚇
  • Moulik Aggarwal: 💻
  • Chandaluri Vamsi Krishna: 💻
  • Elton Okawa: 💻
  • JAY ANAND: 💻
  • Yu Ishikawa: 💻 🤔
  • magnew: 💻 🐛
  • Advith Chelikani: 🐛
  • Sai Pranavdhar Reddy N: 💻
  • Ujwal Kumar: 💻
  • Nimit Haria: 💻
  • Teghan Nightengale: 💻
  • David Mattia: 📖
  • Ayush Trivedi: 💻
  • Zoltan K.: 📖
  • Ankush Banik: 💻
  • Karan Handa: 💻
  • Rohit Verma: 💻
  • David Witkowski: 📖
  • iMac: 💻
  • Andy Kish: 💻
  • Bilal Ahmad Bhat: 🤔 💻
  • Shifan Gu: 🐛 💻
  • Daniel Reis: 💻 🐛
  • Jogeshwar Singh: 💻
  • lancerael: 💻 🐛
  • Ulises Gascón: 💻
  • Gary James: 💻
  • Bruno Almeida: 💻
  • Filipe Dobreira: 📖 💻
  • Joe Milton: 💻
  • Andras Lassu: 💻
  • Chase Wu: 📖
  • agha4to: 💻
  • Jovan Sakovic: 💻

This project follows the all-contributors specification. Contributions of any kind welcome!


dbt2looker's Issues

Join models in explores

Expose config for defining explores with joined models.

Ideally this would live in a dbt exposure but it's currently missing meta information.

Add to models for now?

Enforce Looker naming convention for measures

With the new ability to add measures in the schema.yml file, it is now possible to put spaces in the names of measures. This inevitably causes issues in the LookML. If a measure name contains spaces, it might be good to replace them with underscores and enforce a lowercase naming convention.
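A sanitiser along these lines could enforce that convention before the LookML is rendered. This is a sketch of the idea, not code from dbt2looker itself:

```python
import re

def to_looker_name(raw_name: str) -> str:
    """Normalise a measure name to Looker's conventions: lowercase,
    with runs of spaces and other invalid characters collapsed into
    single underscores."""
    name = raw_name.strip().lower()
    name = re.sub(r'[^a-z0-9_]+', '_', name)  # spaces, dashes, parens -> _
    return name.strip('_')
```

For example, a measure declared as `Total Revenue (USD)` would come out as `total_revenue_usd`.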


Redshift type conversions missing

Redshift has missing type conversions:

10:07:17 WARNING Column type timestamp without time zone not supported for conversion from redshift to looker. No dimension will be created.
10:07:17 WARNING Column type boolean not supported for conversion from redshift to looker. No dimension will be created.
10:07:17 WARNING Column type double precision not supported for conversion from redshift to looker. No dimension will be created.
10:07:17 WARNING Column type character varying(108) not supported for conversion from redshift to looker. No dimension will be created.
10:07:17 DEBUG  Created view from model dim_appointment with 0 measures, 0 dimensions
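One way to close these gaps is to extend the adapter-to-Looker type map with the missing entries. The mapping and function below are a sketch; the actual table and function names inside dbt2looker may differ:

```python
import re

# Assumed additions to a redshift -> Looker dimension type map
REDSHIFT_TO_LOOKER = {
    'timestamp without time zone': 'timestamp',
    'boolean': 'yesno',
    'double precision': 'number',
    'character varying': 'string',
}

def looker_type(redshift_type: str):
    """Map a Redshift column type to a Looker dimension type, stripping
    any length suffix such as the (108) in character varying(108)."""
    base = re.sub(r'\(\d+\)$', '', redshift_type.strip()).strip()
    return REDSHIFT_TO_LOOKER.get(base)
```

Unknown types still return None, so the existing "no dimension will be created" warning path can stay in place.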

ImportError when trying to import PydanticValueError from pydantic

Description

When trying to execute dbt2looker from a virtual environment, I encountered an ImportError related to the pydantic package. The error occurs when attempting to import PydanticValueError from pydantic.

Steps to Reproduce

  1. python3.11 -m venv dbt2looker-venv
  2. source dbt2looker-venv/bin/activate
  3. pip install dbt2looker
  4. dbt2looker
  5. Observe the error during the import process.

Expected Behavior

The dbt2looker tool should start without any import errors.

Actual Behavior

The following ImportError is thrown:

Traceback (most recent call last):
File "/Users/username/Projects/dbt/dbt2looker-venv/bin/dbt2looker", line 5, in <module>
from dbt2looker.cli import run
File "/Users/username/Projects/dbt/dbt2looker-venv/lib/python3.11/site-packages/dbt2looker/cli.py", line 18, in <module>
from . import parser
File "/Users/username/Projects/dbt/dbt2looker-venv/lib/python3.11/site-packages/dbt2looker/parser.py", line 5, in <module>
from . import models
File "/Users/username/Projects/dbt/dbt2looker-venv/lib/python3.11/site-packages/dbt2looker/models.py", line 7, in <module>
from pydantic import BaseModel, Field, PydanticValueError, validator
ImportError: cannot import name 'PydanticValueError' from 'pydantic' (/Users/username/Projects/dbt/dbt2looker-venv/lib/python3.11/site-packages/pydantic/__init__.py)

Additional Information

  • Python Version: 3.11
  • dbt2looker Version: 0.11.0
  • Operating System: macOS Sonoma 14.4.1
  • Installation Method: pip
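`PydanticValueError` exists in pydantic 1.x but was removed in pydantic 2.0, so an environment that resolves `pydantic>=2` hits exactly this ImportError. Pinning the dependency (`pip install "pydantic<2"`) is the simplest workaround. A compatibility shim like the following is another option; the stand-in class is an assumption sketched from the v1 interface, not an official fix:

```python
try:
    from pydantic import PydanticValueError  # present in pydantic < 2
except ImportError:
    class PydanticValueError(ValueError):
        """Minimal stand-in mirroring the pydantic v1 error interface:
        subclasses set `code` and `msg_template`, and keyword arguments
        fill the template."""
        code = 'value_error'
        msg_template = 'value error'

        def __init__(self, **ctx):
            self.ctx = ctx
            super().__init__(self.msg_template.format(**ctx))
```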

Issue when parsing dbt models

Hey folks!

I've just run 'dbt2looker' in my local dbt repo folder, and I receive the following error:

❯ dbt2looker
12:11:54 ERROR  Cannot parse model with id: "model.smallpdf.brz_exchange_rates" - is the model file empty?
Failed

The model file itself (pictured below) is not empty, so I am not sure what issue dbt2looker is having when parsing this model. It is not materialised as a table or view; dbt treats it as ephemeral. Is that of importance when parsing files in the project? I've also tried running dbt2looker on a limited subset of dbt models via a tag, and the same error appears. Any help is greatly appreciated!

(Screenshot of the model file omitted.)

Other details:

  • on dbt version dbt 1.0.0
  • using dbt-redshift adapter [email protected]
  • let me know if anything else is of importance!

Different target environments - workaround

When trying to figure out how I'd create the LookML files for production environments as opposed to development, I realized that this process is built off the artifacts. So all you need to do is overwrite the artifacts with the correct information and then re-run. The workaround is: run dbt docs generate --target production, then run dbt2looker. It creates LookML that references the right DBs and schemas.

This still introduces issues when creating new models that are not materialized in the production environment. I wonder if there is a better way to parse through the manifest and then replace the schema and database references. The issue is that these are generated at runtime in dbt. So not sure how to best handle this one.
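Along the lines the author wonders about, the manifest could also be post-processed directly: load target/manifest.json, swap the schema/database on each model node, and write it back before invoking dbt2looker. This is a sketch using the standard dbt manifest layout; adapt the keys to your artifacts:

```python
import json

def retarget(manifest: dict, database: str, schema: str) -> dict:
    """Point every model node in a dbt manifest at a different
    database/schema (e.g. production instead of development)."""
    for node in manifest.get('nodes', {}).values():
        if node.get('resource_type') == 'model':
            node['database'] = database
            node['schema'] = schema
    return manifest

# usage sketch:
# with open('target/manifest.json') as f:
#     manifest = retarget(json.load(f), 'analytics', 'prod')
```

This does not solve the case of models that only exist at runtime, but it avoids a second dbt docs generate run.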

Adding Filters to Meta Looker Config in schema.yml

Use Case: Given that programmatic creation of all LookML files is the goal, there are a couple of features that could be added to give people more flexibility in measure creation. The first one I could think of was filters. Individuals would use filters to calculate measures like Active Users (e.g. count_distinct of user ids where some sort of flag is true).

The following code is my admitted techno-babble as I don't fully understand pydantic and my python is almost exclusively pandas based.

def lookml_dimensions_from_model(model: models.DbtModel, adapter_type: models.SupportedDbtAdapters):
    return [
        {
            'name': column.name,
            'type': map_adapter_type_to_looker(adapter_type, column.data_type),
            'sql': f'${{TABLE}}.{column.name}',
            'description': column.description,
        }
        for column in model.columns.values()
        if map_adapter_type_to_looker(adapter_type, column.data_type) in looker_scalar_types
    ]


def lookml_measures_from_model(model: models.DbtModel):
    return [
        {
            'name': measure.name,
            'type': measure.type.value,
            'sql': f'${{TABLE}}.{column.name}',
            'description': f'{measure.type.value.capitalize()} of {column.description}',
            # proposed addition: pass any filters declared in the column meta
            # through to the measure (the attribute names here are a guess)
            'filters': [
                {'field': f.field, 'value': f.value}
                for f in (column.meta.looker.filters or [])
            ],
        }
        for column in model.columns.values()
        for measure in column.meta.looker.measures
    ]

It's pretty obvious, I imagine, that my Python skills are lacking (and I have no idea if this would actually work), but this idea would add more functionality for those who want to create more dynamic measures. Here is a bare-bones idea of how it could be configured in dbt.

(Screenshot of the proposed dbt yaml configuration omitted.)

Then the output would look something like.

  measure: page_views {
    type: count
    sql: ${TABLE}.relevant_field ;;
    description: "Count of something."
    filters: [the_name_of_defined_column: "value_of_defined_column"]
  }

Non-empty models cannot be parsed and are reported as empty

As of version 0.9.2, dbt2looker will not run for us anymore. v0.7.0 does run successfully. The error returned by 0.9.2 is 'Cannot parse model with id: "%s" - is the model file empty?'. However, the model that this is returned for is not empty. Based on the code, it seems like the attribute 'name' is missing, but inspecting the manifest.json file shows that there is actually a name for this model. I have no idea why the system reports these models as empty. The manifest.json object for one of the offending models is pasted below.

Reverting to v0.9.0 (which does not yet have this error message) just leads to dbt2looker crashing without any information. Reverting to v0.7.0 fixes the problem. This issue effectively locks us (and likely others) into using an old version of dbt2looker.

"model.zivver_dwh.crm_account_became_customer_dates":
        {
            "raw_sql": "WITH sfdc_accounts AS (\r\n\r\n    SELECT * FROM {{ ref('stg_sfdc_accounts') }}\r\n\r\n), crm_opportunities AS (\r\n\r\n    SELECT * FROM {{ ref('crm_opportunities') }}\r\n\r\n), crm_account_lifecycle_stage_changes_into_customer_observed AS (\r\n\r\n    SELECT\r\n        *\r\n    FROM {{ ref('crm_account_lifecycle_stage_changes_observed') }}\r\n    WHERE\r\n        new_stage = 'CUSTOMER'\r\n\r\n), became_customer_dates_from_opportunities AS (\r\n\r\n    SELECT\r\n        crm_account_id AS sfdc_account_id,\r\n\r\n        -- An account might have multiple opportunities. The account became customer when the first one was closed won.\r\n        MIN(closed_at) AS became_customer_at\r\n    FROM crm_opportunities\r\n    WHERE\r\n        opportunity_stage = 'CLOSED_WON'\r\n    GROUP BY\r\n        1\r\n\r\n), became_customer_dates_observed AS (\r\n\r\n    -- Some accounts might not have closed won opportunities, but still be a customer. Examples would be Connect4Care\r\n    -- customers, which have a single opportunity which applies to multiple accounts. If an account is manually set\r\n    -- to customer, this should also count as a customer.\r\n    --\r\n    -- We try to get the date at which they became a customer from the property history. Since that wasn't on from\r\n    -- the beginning, we conservatively default to either the creation date of the account or the history tracking\r\n    -- start date, whichever was earlier. 
Please note that this case should be exceedingly rare.\r\n    SELECT\r\n        sfdc_accounts.sfdc_account_id,\r\n        CASE\r\n            WHEN {{ var('date:sfdc:account_history_tracking:start_date') }} <= sfdc_accounts.created_at\r\n                THEN sfdc_accounts.created_at\r\n            ELSE {{ var('date:sfdc:account_history_tracking:start_date') }}\r\n        END AS default_became_customer_date,\r\n\r\n        COALESCE(\r\n            MIN(crm_account_lifecycle_stage_changes_into_customer_observed.new_stage_entered_at),\r\n            default_became_customer_date\r\n        ) AS became_customer_at\r\n\r\n    FROM sfdc_accounts\r\n    LEFT JOIN crm_account_lifecycle_stage_changes_into_customer_observed\r\n        ON sfdc_accounts.sfdc_account_id = crm_account_lifecycle_stage_changes_into_customer_observed.sfdc_account_id\r\n    WHERE\r\n        sfdc_accounts.lifecycle_stage = 'CUSTOMER'\r\n    GROUP BY\r\n        1,\r\n        2\r\n\r\n)\r\nSELECT\r\n    COALESCE(became_customer_dates_from_opportunities.sfdc_account_id,\r\n        became_customer_dates_observed.sfdc_account_id) AS sfdc_account_id,\r\n    COALESCE(became_customer_dates_from_opportunities.became_customer_at,\r\n        became_customer_dates_observed.became_customer_at) AS became_customer_at\r\nFROM became_customer_dates_from_opportunities\r\nFULL OUTER JOIN became_customer_dates_observed\r\n    ON became_customer_dates_from_opportunities.sfdc_account_id = became_customer_dates_observed.sfdc_account_id",
            "resource_type": "model",
            "depends_on":
            {
                "macros":
                [
                    "macro.zivver_dwh.ref",
                    "macro.zivver_dwh.audit_model_deployment_started",
                    "macro.zivver_dwh.audit_model_deployment_completed",
                    "macro.zivver_dwh.grant_read_rights_to_role"
                ],
                "nodes":
                [
                    "model.zivver_dwh.stg_sfdc_accounts",
                    "model.zivver_dwh.crm_opportunities",
                    "model.zivver_dwh.crm_account_lifecycle_stage_changes_observed"
                ]
            },
            "config":
            {
                "enabled": true,
                "materialized": "ephemeral",
                "persist_docs":
                {},
                "vars":
                {},
                "quoting":
                {},
                "column_types":
                {},
                "alias": null,
                "schema": "bl",
                "database": null,
                "tags":
                [
                    "business_layer",
                    "commercial"
                ],
                "full_refresh": null,
                "crm_record_types": null,
                "post-hook":
                [
                    {
                        "sql": "{{ audit_model_deployment_completed() }}",
                        "transaction": true,
                        "index": null
                    },
                    {
                        "sql": "{{ grant_read_rights_to_role('data_engineer', ['all']) }}",
                        "transaction": true,
                        "index": null
                    },
                    {
                        "sql": "{{ grant_read_rights_to_role('analyst', ['all']) }}",
                        "transaction": true,
                        "index": null
                    }
                ],
                "pre-hook":
                [
                    {
                        "sql": "{{ audit_model_deployment_started() }}",
                        "transaction": true,
                        "index": null
                    }
                ]
            },
            "database": "analytics",
            "schema": "bl",
            "fqn":
            [
                "zivver_dwh",
                "business_layer",
                "commercial",
                "crm_account_lifecycle_stage_changes",
                "intermediates",
                "crm_account_became_customer_dates",
                "crm_account_became_customer_dates"
            ],
            "unique_id": "model.zivver_dwh.crm_account_became_customer_dates",
            "package_name": "zivver_dwh",
            "root_path": "C:\\Users\\tjebbe.bodewes\\Documents\\zivver-dwh\\dwh\\transformations",
            "path": "business_layer\\commercial\\crm_account_lifecycle_stage_changes\\intermediates\\crm_account_became_customer_dates\\crm_account_became_customer_dates.sql",
            "original_file_path": "models\\business_layer\\commercial\\crm_account_lifecycle_stage_changes\\intermediates\\crm_account_became_customer_dates\\crm_account_became_customer_dates.sql",
            "name": "crm_account_became_customer_dates",
            "alias": "crm_account_became_customer_dates",
            "checksum":
            {
                "name": "sha256",
                "checksum": "a037b5681219d90f8bf8d81641d3587f899501358664b8ec77168901b3e1808b"
            },
            "tags":
            [
                "business_layer",
                "commercial"
            ],
            "refs":
            [
                [
                    "stg_sfdc_accounts"
                ],
                [
                    "crm_opportunities"
                ],
                [
                    "crm_account_lifecycle_stage_changes_observed"
                ]
            ],
            "sources":
            [],
            "description": "",
            "columns":
            {
                "sfdc_account_id":
                {
                    "name": "sfdc_account_id",
                    "description": "",
                    "meta":
                    {},
                    "data_type": null,
                    "quote": null,
                    "tags":
                    []
                },
                "became_customer_at":
                {
                    "name": "became_customer_at",
                    "description": "",
                    "meta":
                    {},
                    "data_type": null,
                    "quote": null,
                    "tags":
                    []
                }
            },
            "meta":
            {},
            "docs":
            {
                "show": true
            },
            "patch_path": "zivver_dwh://models\\business_layer\\commercial\\crm_account_lifecycle_stage_changes\\intermediates\\crm_account_became_customer_dates\\crm_account_became_customer_dates.yml",
            "compiled_path": null,
            "build_path": null,
            "deferred": false,
            "unrendered_config":
            {
                "pre-hook":
                [
                    "{{ audit_model_deployment_started() }}"
                ],
                "post-hook":
                [
                    "{{ grant_read_rights_to_role('analyst', ['all']) }}"
                ],
                "tags":
                [
                    "commercial"
                ],
                "materialized": "ephemeral",
                "schema": "bl",
                "crm_record_types": null
            },
            "created_at": 1637233875
        }

Update manifest to v2

The manifest has been updated and includes additional properties, which causes errors when compiling LookML.

Support Bigquery BIGNUMERIC datatype

Previously, dbt2looker would not create a dimension for fields with data type BIGNUMERIC, since Looker didn't support converting BIGNUMERIC. So when we ran dbt2looker in the CLI there was a warning: WARNING Column type BIGNUMERIC not supported for conversion from bigquery to looker. No dimension will be created.
However, as of November 2021, Looker officially supports BigQuery BIGNUMERIC (link). Please help to add this.
Thank you,

Avoid duplicate naming

If a user regularly uses the column name as the metric name, the yaml spec becomes quite repetitive. Can we make this any easier?

e.g.

        - name: NewCustPromoCost_USD
          description: "Total customers Promo Type Cost in USD."
          meta:
            measures:
              NewCustPromoCost_USD:
                type: sum
            dimension:
              enabled: false
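One way to cut the repetition would be to let a measure inherit its column's name when no explicit name is given. This is a sketch of that shorthand, not current dbt2looker behaviour; the list-of-dicts shape for measure specs is assumed:

```python
def resolve_measures(column_name: str, measures: list) -> dict:
    """Expand measure specs for one column: a spec without an explicit
    'name' key inherits the column name."""
    return {
        spec.get('name', column_name): {k: v for k, v in spec.items() if k != 'name'}
        for spec in measures
    }
```

With something like this, the yaml above could shrink to a bare `- type: sum` entry under `measures`.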

Support model level measures

Motivation

Currently, a measure built from multiple columns has to be declared under one column's meta. It would be more natural to implement such measures at the model level.

models:
  - name: ubie_jp_lake__dm_medico__hourly_score_for_nps
    description: |
      {{ doc("ubie_jp_lake__dm_medico__hourly_score_for_nps") }}
    meta:
      measures:
        total_x_y_z:
          type: number
          description: 'Summation of total x, total y and total z'
          sql: '${total_x} + ${total_y} + ${total_z}'
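Supporting this would mean collecting measures from the model-level meta block as well as from each column's meta. A minimal sketch of that merge, with the dict shapes assumed from the yaml above:

```python
def collect_measures(model_meta: dict, columns: dict) -> dict:
    """Merge model-level measures with per-column measures; column-level
    definitions win on name clashes."""
    measures = dict(model_meta.get('measures', {}))
    for column in columns.values():
        measures.update(column.get('meta', {}).get('measures', {}))
    return measures
```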

Multiple manifest.json/catalog.json/dbt_project.yml files found in path ./

When running

dbt2looker --tag test

I get

$ dbt2looker --tag test
19:31:20 WARNING Multiple manifest.json files found in path ./ this can lead to unexpected behaviour
19:31:20 WARNING Multiple catalog.json files found in path ./ this can lead to unexpected behaviour
19:31:20 WARNING Multiple dbt_project.yml files found in path ./ this can lead to unexpected behaviour
19:31:20 INFO   Generated 0 lookml views in ./lookml/views
19:31:20 INFO   Generated 1 lookml model in ./lookml
19:31:20 INFO   Success

and no lookml files are generated.

I assume this is because I have multiple dbt packages installed? Is there a way to get around this? Otherwise, a feature request would be the ability to specify which files should be used - perhaps in a separate dbt2looker.yml settings file.

Parsing error

Hi there!

When trying to run dbt2looker --tag published after compiling my dbt project, I get a huge number of errors; here is an extract:

11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f: Additional properties are not allowed ('metrics' was unexpected)
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f: Additional properties are not allowed ('metrics', 'test_metadata', 'column_name', 'file_key_name' were unexpected)
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f.resource_type: 'test' is not one of ['seed']
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f: Additional properties are not allowed ('metrics', 'test_metadata', 'column_name', 'file_key_name' were unexpected)
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f.resource_type: 'test' is not one of ['snapshot']
11:24:05 ERROR  Error in manifest at nodes.test.dbt_id_verification_analytics.relationships_stg_production__configuration_organization_id__organization_id__ref_stg_production__organization_.12c6be697f.resource_type: 'test' is not one of ['snapshot']
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.not_null_int_identifications_missed_sla_technical_time_in_seconds.8f9662f569: Additional properties are not allowed ('metrics', 'test_metadata', 'column_name', 'file_key_name' were unexpected)
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.not_null_int_identifications_missed_sla_technical_time_in_seconds.8f9662f569.resource_type: 'test' is not one of ['analysis']
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.not_null_int_identifications_missed_sla_technical_time_in_seconds.8f9662f569: 'compiled' is a required property
11:24:05 ERROR    Error in manifest at nodes.test.dbt_id_verification_analytics.not_null_int_identifications_missed_sla_technical_time_in_seconds.8f9662f569: Additional properties are not allowed ('metrics', 'test_metadata', 'column_name', 'file_key_name' were unexpected)

Any idea how to fix this?

Thanks a lot!

ValueError: Failed to parse dbt manifest.json

Hey!
I'm trying to run this package and hitting errors right after installation. I pip-installed dbt2looker and ran the following in the root of my dbt project:

dbt docs generate
dbt2looker

This gives me the following error:

Traceback (most recent call last):
  File "/Users/josh/.pyenv/versions/3.10.0/bin/dbt2looker", line 8, in <module>
    sys.exit(run())
  File "/Users/josh/.pyenv/versions/3.10.0/lib/python3.10/site-packages/dbt2looker/cli.py", line 108, in run
    raw_manifest = get_manifest(prefix=args.target_dir)
  File "/Users/josh/.pyenv/versions/3.10.0/lib/python3.10/site-packages/dbt2looker/cli.py", line 33, in get_manifest
    parser.validate_manifest(raw_manifest)
  File "/Users/josh/.pyenv/versions/3.10.0/lib/python3.10/site-packages/dbt2looker/parser.py", line 20, in validate_manifest
    raise ValueError("Failed to parse dbt manifest.json")
ValueError: Failed to parse dbt manifest.json

It's preceded by a whole mess of error messages like these:

21:01:05 ERROR Error in manifest at nodes.model.jaffle_shop.stg_customers.created_at: 1639274126.771925 is not of type 'integer'
21:01:05 ERROR Error in manifest at nodes.model.jaffle_shop.stg_customers.resource_type: 'model' is not one of ['analysis']
21:01:05 ERROR Error in manifest at nodes.model.jaffle_shop.stg_customers.created_at: 1639274126.771925 is not of type 'integer'
21:01:05 ERROR Error in manifest at nodes.model.jaffle_shop.stg_customers.resource_type: 'model' is not one of ['test']

Any idea what might be going wrong here? Happy to provide more detail. Thank you!

Possible missing type conversions for BigQuery [BIGNUMERIC and BYTES]

On running dbt2looker, I get these warning messages:

WARNING Column type BYTES not supported for conversion from bigquery to looker. No dimension will be created.

And

WARNING Column type BIGNUMERIC not supported for conversion from bigquery to looker. No dimension will be created.

Support for these would be appreciated.
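For reference, handling the two types could look something like the mapping below. The dictionary name and the target LookML types are assumptions for illustration, not dbt2looker's actual internals:

```python
# Hypothetical BigQuery -> LookML dimension type map, in the spirit of the
# one dbt2looker keeps internally (names here are illustrative only).
BIGQUERY_TO_LOOKER = {
    "INT64": "number",
    "FLOAT64": "number",
    "NUMERIC": "number",
    "STRING": "string",
    "BOOL": "yesno",
    # Proposed additions from this issue:
    "BIGNUMERIC": "number",  # arbitrary-precision decimal; "number" is the closest LookML type
    "BYTES": "string",       # LookML has no binary type; "string" is the usual fallback
}

def looker_type(bq_type: str):
    # Returns None for unmapped types, which is what triggers the warning.
    return BIGQUERY_TO_LOOKER.get(bq_type.upper())

print(looker_type("bignumeric"))  # -> number
```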

Support group_labels in yml for dimensions

class Dbt2LookerDimension(BaseModel):

Measures seem to have this but dimensions don't. Probably all (or most) of the properties available in https://docs.lightdash.com/references/dimensions/ should be represented here. Is this something Lightdash is willing to maintain, or would you want a contribution? @TuringLovesDeathMetal / @owlas: I figure full support for Lightdash properties that can map to Looker is worth having, to maximize the value of this utility for helping Looker customers uncouple themselves from Looker.
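As a sketch, the change would amount to one more optional field on the dimension model. The real `Dbt2LookerDimension` is a pydantic `BaseModel`; a dataclass with assumed field names is used here to keep the example dependency-free:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative stand-in for Dbt2LookerDimension (a pydantic BaseModel in the
# project); field names besides group_label are assumptions.
@dataclass
class Dbt2LookerDimensionSketch:
    description: Optional[str] = None
    hidden: Optional[bool] = None
    group_label: Optional[str] = None  # proposed addition

dim = Dbt2LookerDimensionSketch(group_label="Timestamps")
print(dim.group_label)  # -> Timestamps
```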

Column Type None Error - Fields Not Converting To Dimensions

When running dbt2looker --tag marts on my mart models, I receive dozens of warnings about None-type conversions.

20:54:28 WARNING Column type None not supported for conversion from snowflake to looker. No dimension will be created.

Here is an example of the schema.yml file:

[screenshot of schema.yml]

The interesting thing is that it correctly recognizes the doc that corresponds to the model. The explore within the model file is correct and has the correct documentation.

Not sure if I can be of any more help but let me know if there is anything!
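For what it's worth, this warning generally means no data type could be resolved for the column from either place dbt2looker can look. A hypothetical resolution order (field names assumed, not the tool's actual code):

```python
def resolve_column_type(schema_column: dict, catalog_column: dict):
    # If schema.yml declares a data type for the column, use it; otherwise
    # fall back to the type recorded in catalog.json by `dbt docs generate`.
    # If both are missing, there is nothing to map, and the converter warns
    # "Column type None ... No dimension will be created."
    return schema_column.get("data_type") or catalog_column.get("type")

print(resolve_column_type({}, {"type": "TEXT"}))  # catalog fallback -> TEXT
print(resolve_column_type({}, {}))                # -> None: the warning case
```

If this is the cause, re-running `dbt docs generate` against the warehouse so catalog.json is populated would make the warnings disappear.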

DBT version 1.0

Hi,

Does this library support dbt version 1.0 and later? I can't get it to run at all.
There are a lot of errors when checking the schema of the manifest.json file.

/ Andrea

Discrepancy Between Manifest & Catalog - Non-Defined Columns Missing

While looking at the results of the latest version, 0.7.0, I noticed that one of my models only had a single column defined within the LookML. I dug into it and realized this is because I only defined a single column within my schema.yml for that model. I then inspected the manifest entry for that specific model using cat target/manifest.json | jq '.nodes["model.my_project_name.model_name"]' and confirmed that only a single column is returned in the manifest.

I then pulled up the catalog with this command cat target/catalog.json | jq '.nodes["model.my_project_name.model_name"]' and there were 229 columns returned.

I'm not sure if this is an opinionated stance (i.e., columns that make it into the Looker view should be defined in the corresponding schema.yml), but wanted to highlight it just in case it wasn't intentional.
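A quick way to surface the gap described above is to diff the column sets of the two node entries. This is a sketch assuming the standard manifest/catalog node shapes (a `columns` mapping keyed by column name):

```python
def undeclared_columns(manifest_node: dict, catalog_node: dict) -> set:
    # The manifest only lists columns declared in schema.yml, while the
    # catalog reflects what `dbt docs generate` found in the warehouse,
    # so the set difference is exactly the undeclared columns.
    declared = set(manifest_node.get("columns", {}))
    actual = set(catalog_node.get("columns", {}))
    return actual - declared

manifest_node = {"columns": {"id": {}}}
catalog_node = {"columns": {"id": {}, "created_at": {}, "email": {}}}
print(sorted(undeclared_columns(manifest_node, catalog_node)))  # -> ['created_at', 'email']
```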
