
metriql's Introduction

Metriql


Metriql is an open-source metrics store that allows companies to define their metrics as code and share them easily across their BI and data tools. It uses dbt for the transformation layer and integrates with dbt via its manifest.json artifact. You can learn more about metriql here.

This is the core repository of the metriql project. It includes the JDBC driver, REST API, and the CLI.

Integrations

Running Metriql in your IDE

Run the following commands to pull Metriql and build it locally:

git clone https://github.com/metriql/metriql.git
cd metriql
./mvnw clean install -DskipTests
cd ./frontend && npm run build

We suggest using IntelliJ. Here is the configuration you need to run Metriql locally:

Main class: com.metriql.ServiceStarterKt
Program arguments: serve --jdbc --manifest-json "https://metriql.github.io/metriql-public-demo/manifest.json"
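
Once the server is up, you can sanity-check the Trino interface from any compatible SQL client. This is only a hedged example; system.jdbc.tables is the metadata table the JDBC driver itself queries (the same table shows up in the DBeaver issue further down):

-- Lists the models that metriql exposes through its Trino-compatible interface
SELECT table_schem, table_name
FROM system.jdbc.tables
LIMIT 10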

Once you're done with the setup, please make sure that:


Contributors ✨

We're currently hiring a senior software engineer in the UK & France if you'd like to join our team and help us build the next big thing in the modern data stack!

metriql's People

Contributors

actions-user, allcontributors[bot], buremba, emresemercioglu, lialzm, oksouhei, xiaodongma9


metriql's Issues

Support TIMESTAMP '' in where

Thoughtspot generates queries with timestamp filters like the one below:

WHERE (
  "ta_1"."booked_date" >= TIMESTAMP '2021-01-01 00:00:00.0'
)

But this is currently not supported in metriql, I guess it should be?
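
For reference, a possible workaround sketch, assuming the target warehouse accepts an explicit cast where the ANSI literal fails to parse (an untested example, not a confirmed fix):

-- Rejected today: the ANSI timestamp literal generated by Thoughtspot
WHERE "ta_1"."booked_date" >= TIMESTAMP '2021-01-01 00:00:00.0'

-- Possible workaround on the BI side, assuming CAST is passed through to the warehouse
WHERE "ta_1"."booked_date" >= CAST('2021-01-01 00:00:00.0' AS TIMESTAMP)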

Add `--models` and `--selector` to `metriql generate`

Motivation

We want to generate partial models, not all models. We define dbt models in a single repository, but departments at our company use different BI tools. For instance, department A uses Tableau, while department B uses Looker. If we could generate partial models with conditions such as --models and --selector, that would be awesome.

Cannot generate aggregates

I am trying to add aggregates to an existing dbt model (using the jaffle_shop dataset from the dbt tutorial since it is simple and easy to set up). However, the generate command throws an exception even for simple aggregation and dimension properties in my model. The model code I modified in /models to add these meta properties is:

models:

- name: stg_customers
  description: Staged customer data from our jaffle shop app.
  meta:
    metriql:
      measures:
        total_rows:
          aggregation: count
      dimensions:
        full_name:
          sql: CONCAT({TABLE}.first_name, {TABLE}.last_name)
          type: string
  columns:
    - name: customer_id
      description: The primary key for customers.
      tests:
        - unique
        - not_null
    - name: first_name
      description: First name of customer.
    - name: last_name
      description: Last name of customer.
      meta:
        metriql.dimension:
          type: string

and the message I get when trying to generate (in debug mode) is:

Exception in thread "main" com.hubspot.jinjava.interpret.FatalTemplateErrorsException: Could not resolve function 'get_where_subquery'
	at com.hubspot.jinjava.Jinjava.render(Jinjava.java:191)
	at com.metriql.dbt.DbtJinjaRenderer.renderReference(DbtJinjaRenderer.kt:63)
	at com.metriql.dbt.DbtManifest$Node$TestMetadata$DbtModelColumnTest$Relationships.getSourceModelName(DbtManifest.kt:79)
	at com.metriql.dbt.DbtManifest$Node$TestMetadata.applyTestToModel(DbtManifest.kt:123)
	at com.metriql.dbt.DbtManifest$Node.toModel(DbtManifest.kt:193)
	at com.metriql.dbt.DbtManifestParser.parse(DbtManifestParser.kt:45)
	at com.metriql.Commands.parseRecipe(Commands.kt:181)
	at com.metriql.Commands.parseRecipe$default(Commands.kt:122)
	at com.metriql.Commands$Generate.run(Commands.kt:222)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:180)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:189)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:17)
	at com.github.ajalt.clikt.core.CliktCommand.parse(CliktCommand.kt:396)
	at com.github.ajalt.clikt.core.CliktCommand.parse$default(CliktCommand.kt:393)
	at com.github.ajalt.clikt.core.CliktCommand.main(CliktCommand.kt:411)
	at com.github.ajalt.clikt.core.CliktCommand.main(CliktCommand.kt:436)
	at com.metriql.ServiceStarterKt.main(ServiceStarter.kt:25)

Based on the documentation, I am under the impression that I need to generate the dbt models using metriql first, so that the models/metriql directory is created, and then run the serve command.

Provide an informative error message when dbt profile is missing or incorrectly configured

Please see below for an example message that just says `profile null doesn't exist`. In this example, I didn't specify --profile custom_name, hence it failed.

2021-10-11T14:25:25.509708+00:00 heroku[web.1]: Starting process with command `/docker-entrypoint.sh`
2021-10-11T14:25:26.614330+00:00 app[web.1]: Oct 11, 2021 2:25:26 PM ru.yandex.clickhouse.ClickHouseDriver <clinit>
2021-10-11T14:25:26.614347+00:00 app[web.1]: INFO: Driver registered
2021-10-11T14:25:33.089795+00:00 app[web.1]: profile null doesn't exist
2021-10-11T14:25:33.271659+00:00 heroku[web.1]: Process exited with status 1
2021-10-11T14:25:33.385970+00:00 heroku[web.1]: State changed from starting to crashed

Connect via DBeaver when using SSL

Hi,

I've exposed Metriql behind an SSL proxy and I can connect via the Python library, but with DBeaver I get an exception when trying to connect.

Error starting query at https://some-domain.com/v1/statement returned an invalid response: JsonResponse{statusCode=500, statusMessage=, headers={access-control-allow-credentials=[true], access-control-allow-headers=[content-type,token], access-control-allow-origin=[*], content-length=[129], date=[Sun, 28 Nov 2021 06:46:36 GMT], samesite=[None; Secure], server=[openresty/1.19.9.1], strict-transport-security=[max-age=63072000]}, hasValue=false} [Error: {"errors":[{"id":null,"links":null,"status":null,"code":null,"title":"An error occurred","detail":null,"meta":null}],"meta":null}]

The server side says:

SELECT TABLE_CAT, TABLE_SCHEM, TABLE_NAME, TABLE_TYPE, REMARKS,
TYPE_CAT, TYPE_SCHEM, TYPE_NAME,   SELF_REFERENCING_COL_NAME, REF_GENERATION
FROM system.jdbc.tables
WHERE TABLE_NAME LIKE '%' ESCAPE '\'
ORDER BY TABLE_TYPE, TABLE_CAT, TABLE_SCHEM, TABLE_NAME
javax.ws.rs.WebApplicationException: User must be set
io.trino.server.HttpRequestSessionContext.badRequest(HttpRequestSessionContext.java:503)
io.trino.server.HttpRequestSessionContext.assertRequest(HttpRequestSessionContext.java:452)
io.trino.server.HttpRequestSessionContext.buildSessionIdentity(HttpRequestSessionContext.java:223)
io.trino.server.HttpRequestSessionContext.<init>(HttpRequestSessionContext.java:120)
com.metriql.service.jdbc.StatementService.createSessionContext(StatementService.kt:95)
com.metriql.service.jdbc.StatementService.query$lambda-3(StatementService.kt:105)

Support for Qlik

Metriql already integrates with Qlik through the Presto connection, but there are pending issues related to this integration:

  • #35
  • Support for Qlik Master Items

Support OAuth credentials for BigQuery target

I'm trying to run metriql on a macbook.
My dbt project compiles and runs just fine.
I reinstalled docker and pulled the latest metriql image.

My dbt setup is customised a little:

  • profiles.yml is in the project root. This is configured as an env_variable.
  • I'm building to ${workspaceFolder}/.dbt/artifacts/. This is configured in dbt_project.

Both work fine except for metriql.
(.venv) daniel@dbrtly-MBP dbt_shop % pwd
/Users/daniel/git/dbrtly/dbt_shop
(.venv) daniel@dbrtly-MBP dbt_shop % export DBT_PROJECT_DIR=${PWD}
export DBT_PROFILES_DIR=${PWD}
export METRIQL_PORT=5656

docker run -it -p "${METRIQL_PORT}:5656" -v "${DBT_PROJECT_DIR}:/root/app" -v "${DBT_PROFILES_DIR}:/root/.dbt" -e METRIQL_RUN_HOST=0.0.0.0 -e DBT_PROJECT_DIR=/root/app buremba/metriql \
 run
Aug 02, 2021 7:25:13 AM ru.yandex.clickhouse.ClickHouseDriver <clinit>
INFO: Driver registered
Exception in thread "main" Unknown warehouse: bigquery

Also FWIW I'd really prefer to use docker-compose rather than docker cli.

Unable to access metriql's local server

I started metriql, referring to the documentation and #24, but I cannot connect to it.

The command used to start it (my environment is fish, so the way variables are written may differ):

docker run -it -v "$DBT_PROJECT_DIR:/root/app" -v "$DBT_PROFILES_DIR:/root/.dbt" -v "$GCLOUD_DIR:/root/.gcloud" -e DBT_PROJECT_DIR=/root/app -p "5656:5656" -e METRIQL_RUN_HOST=0.0.0.0 buremba/metriql:latest serve

This is the result of executing the command:

Sep 13, 2021 7:12:45 AM ru.yandex.clickhouse.ClickHouseDriver <clinit>
INFO: Driver registered
Sep 13, 2021 7:12:47 AM com.metriql.service.model.DiscoverService discoverDimensionFieldTypes
INFO: Discovering 6 dimensions of model `model_jaffle_shop_dim_customers`
Sep 13, 2021 7:12:51 AM io.netty.util.internal.logging.Slf4JLogger info
INFO: [id: 0x1fa72a43] REGISTERED
Sep 13, 2021 7:12:51 AM io.netty.util.internal.logging.Slf4JLogger info
INFO: [id: 0x1fa72a43] BIND(/127.0.0.1:5656)
Sep 13, 2021 7:12:51 AM io.netty.util.internal.logging.Slf4JLogger info
INFO: [id: 0x1fa72a43, L:/127.0.0.1:5656] ACTIVE

When I try to connect with DBeaver in this state, the connection itself succeeds (though it takes a very long time). However, when I try to check the table list, I get a timeout error.

I also could not connect to it from a web browser (in the YouTube video, you can connect to it at http://127.0.0.7:5656, right?).

It seems to me that metriql itself is not working correctly. What should I do?

Ability to define metrics in dbt models

In order to use metriql in a dbt project, you need to define metrics in dbt's YML files, but not all users are familiar with them. We should find a dbt-native way for SQL-heavy users to define metrics inside the SQL. One solution would be parsing column names such as metric__[MEASURE_NAME]__[OPTIONAL_AGGREGATION_TYPE] and creating measures automatically for these columns.

We need to create a macro so that the users can define them as follows:

{{config(materialized='view')}}

select c_nationkey, 
    count(*) as {{measure('total_users', aggregation='sum')}}, 
    min(c_acctbal) as {{measure('min_balance', aggregation='min')}}
from {{source('tpch', 'customer')}} 

The model above will be compiled as:

select c_nationkey, 
  count(*) as metric__total_users__count, 
  min(c_acctbal) as metric__min_balance__min
from snowflake_sample_data.tpch_sf1.customer group by 1

If there is no YML file that documents the model, we will get the table schema, create dimensions for each column, look for the columns that start with metric__, and create a measure for each of them. The dataset will look something like this:

target: source('tpch', 'customer')
dimensions:
  c_nationkey:
    column: c_nationkey
  total_users:
    column: total_users
  min_balance:
    column: min_balance
measures:
  total_users_sum:
    aggregation: sum
    column: total_users
  min_balance_min:
    aggregation: min
    column: min_balance
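
For reference, a minimal sketch of the measure macro described above might look like this (the default aggregation here is an assumption, not the final design):

{# Renders a measure alias such as metric__total_users__sum #}
{% macro measure(name, aggregation='count') %}metric__{{ name }}__{{ aggregation }}{% endmacro %}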

Support DATE '' in where

Sorry that I didn't check this before, but the same as with TIMESTAMP in where (that should be fixed now), DATE '' in where is not supported either, I think.
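
As above, this is the literal form that fails to parse:

-- ANSI date literal, same pattern as the TIMESTAMP case above
WHERE "ta_1"."booked_date" >= DATE '2021-01-01'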

tutorial does not work

I followed the instructions provided here, and I also executed a "dbt compile" before running metriql in order to generate the manifest.json file.
I'm running metriql through Docker, so my command is:

docker run -v c:\dbt:/mnt/dbt -v c:\users\frigod.dbt:/mnt/dbt_profile buremba/metriql:latest run --project-dir /mnt/dbt/metriql_test2 --profiles-dir /mnt/dbt_profile --manifest-json file:/mnt/dbt/metriql_test/target/manifest.json

I then tried connecting to the Trino engine using dbeaver. I can connect, but I cannot see any table (and opening the table list invalidates the connection).

I then tried running metriql generate to understand the issue:

docker run -v c:\dbt:/mnt/dbt -v c:\users\frigod.dbt:/mnt/dbt_profile buremba/metriql:latest generate --project-dir /mnt/dbt/metriql_test2 --profiles-dir /mnt/dbt_profile

and the result is:

Done creating 0 aggregate dbt models.

Looker Integration

We will create LookML files via CLI.

๐Ÿ‘ Upvote if you plan to use this integration.
โœ๏ธ If you have any ideas, questions or use cases you'd like to share about this issue, feel free to use comments.

Support multiple environments in a single Metriql deployment

The Metriql community version requires dbt's manifest.json file as an argument and uses that target for all users. In the enterprise version, we will introduce a system that lets Metriql support dynamic manifest.json files and dbt profiles for different users.

Support for Thoughtspot

We already have integration with Thoughtspot through the Trino interface, but for formatting/field mapping we need to generate TML files and upload them to Thoughtspot instances. Here is the Python library for the task: https://github.com/thoughtspot/ts_rest_api_and_tml_tools

Ideally, if Thoughtspot introduces integration with Git, we can output zip files similar to our Looker integration, but for now we need to upload the TML files using the Thoughtspot REST API.

Docker install does not recognize BigQuery connection

After pulling the latest version of the metriql Docker image, I try to run the command from the installation step with a BigQuery connection but get this error:

Exception in thread "main" /root/.config/gcloud/application_default_credentials.json not found connecting the BigQuery.

I have installed the gcloud CLI tool and configured it to point to the desired project, and the application_default_credentials.json file is present and readable, but metriql is not able to use it.

CTEs

Hi,

Are CTEs currently supported? We have some of those and they fail; it might be something else going wrong, but it feels like it is about the CTEs.

Cheers!
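
For illustration only, this is the kind of query shape we mean; the model and field names below are placeholders in the style of the other issues here:

-- A BI-style query that wraps a metriql model in a CTE
WITH "recent" AS (
  SELECT "ta_1"."some_field" AS "ca_1", count("ta_1"."total_orders") AS "ca_2"
  FROM "metriql"."public"."model_some" "ta_1"
  GROUP BY 1
)
SELECT "ca_1", "ca_2" FROM "recent" WHERE "ca_2" > 10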

Implement `window` measures

The traditional dimension & measure mechanism doesn't really work well with WINDOW functions because they do not work with GROUP BY and aggregation functions.

We can introduce a boolean property called window to measures and re-construct the window function in a subquery or use QUALIFY if it's supported by the underlying data warehouse.

Currently, you need to use window functions in SQL queries and add them as dimensions, but that sounds like an anti-pattern. I would love to hear potential use cases and feedback, though!
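
A rough sketch of what the subquery re-construction could look like, with placeholder model and field names; this is not how metriql generates queries today:

-- A `window` measure (running_total) computed on top of the aggregated subquery,
-- so it can sit next to a regular measure instead of being modeled as a dimension
SELECT
  "some_field" AS "ca_1",
  sum("total_orders") OVER (ORDER BY "some_field") AS "running_total"
FROM (
  SELECT "ta_1"."some_field" AS "some_field", count(1) AS "total_orders"
  FROM "metriql"."public"."model_some" "ta_1"
  GROUP BY 1
) AS "sub"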

Add `metriql test` command

The test command should execute a metadata query referencing all dimension & measure definitions with LIMIT 0 on the underlying data warehouse and let the user know if there is any error in the field definitions.
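
A hedged sketch of the kind of query the command could issue per model (field names are placeholders):

-- References every dimension and measure definition without scanning any data
SELECT
  "ta_1"."some_field" AS "dim_some_field",
  count("ta_1"."total_orders") AS "measure_total_orders"
FROM "metriql"."public"."model_some" "ta_1"
GROUP BY 1
LIMIT 0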

Metabase integration

Error running `serve`

I installed the Metriql CLI through docker: docker pull buremba/metriql:latest

Then, while in my dbt directory, I ran the instructions on the installation page:

export DBT_PROJECT_DIR=${PWD}
export DBT_PROFILES_DIR=${HOME}/.dbt
export METRIQL_PORT=5656

docker run -it -p "${METRIQL_PORT}:5656" -v "${DBT_PROJECT_DIR}:/root/app" -v "${DBT_PROFILES_DIR}:/root/.dbt" -e METRIQL_RUN_HOST=0.0.0.0 -e DBT_PROJECT_DIR=/root/app buremba/metriql \
 serve

I then get the following error:

Nov 03, 2021 11:36:01 AM ru.yandex.clickhouse.ClickHouseDriver <clinit>
INFO: Driver registered
Exception in thread "main" java.lang.IllegalAccessError: class com.metriql.Commands$Serve tried to access private method 'java.lang.String com.metriql.Commands.getProfilesContent()' (com.metriql.Commands$Serve and com.metriql.Commands are in unnamed module of loader 'app')
	at com.metriql.Commands$Serve.access$getProfilesContent$s-537891160(Commands.kt:180)
	at com.metriql.Commands$Serve.run(Commands.kt:232)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:198)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:211)
	at com.github.ajalt.clikt.parsers.Parser.parse(Parser.kt:18)
	at com.github.ajalt.clikt.core.CliktCommand.parse(CliktCommand.kt:395)
	at com.github.ajalt.clikt.core.CliktCommand.parse$default(CliktCommand.kt:392)
	at com.github.ajalt.clikt.core.CliktCommand.main(CliktCommand.kt:410)
	at com.github.ajalt.clikt.core.CliktCommand.main(CliktCommand.kt:435)
	at com.metriql.ServiceStarterKt.main(ServiceStarter.kt:25)

Any idea on how to resolve this?

Support for Qlik

Qlik uses prepared statements, so here are the steps:

  • Implement #36
  • Support for Qlik Master Items

Google Data Studio Integration

We will create a Community Connector that interacts with the REST API and executes queries over the API.

๐Ÿ‘ Upvote if you plan to use this integration.
โœ๏ธ If you have any ideas, questions, or use cases you'd like to share about this issue, feel free to use comments.

Connect via OAuth

Hi,

Is it possible to give different users different access? Or is the solution currently that if you have access to Metriql you can access all data that Metriql can access?

Cheers!

GROUP BY

Hi,

GROUP BY doesn't seem to work that well. When using any aggregation, it usually fails:

SELECT 
  "ta_1"."some_field" "ca_1", 
  count("ta_1"."total_orders") "ca_2", 
  avg("ta_1"."some_field_2") "ca_3"
FROM "metriql"."public"."model_some" "ta_1"
GROUP BY "ta_1"."some_field"
Error:

SELECT list expression references column some_field which is neither grouped nor aggregated at [1:8]

Generated query:

SELECT some_field AS `ca_1`, `total_orders` AS `ca_2`, avg(some_field_2) AS `ca_3` FROM (
SELECT 
    `model_some`.`some_field` AS `some_field`,
    `model_some`.`some_field_2` AS `some_field_2`,
    count(1) AS `total_orders`
FROM `project`.`mart`.`table` AS `model_some`

    GROUP BY
    1,  2 

) AS `ta_1` 
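
Reduced to a minimal case (t, some_field, and total_orders stand in for the generated names), the warehouse is complaining that the outer query selects an ungrouped column next to an aggregate:

-- Fails: some_field is neither grouped nor aggregated
SELECT some_field, count(total_orders) FROM t

-- Parses: the outer query carries its own GROUP BY
SELECT some_field, count(total_orders) FROM t GROUP BY 1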

ConcurrentModificationException

Not really sure where this is coming from, but when using Thoughtspot / DBeaver, these errors pop up a lot in the metriql logs.

SEVERE: An uncaught exception raised while processing request.
java.util.ConcurrentModificationException
        at java.base/java.util.LinkedHashMap$LinkedHashIterator.nextNode(Unknown Source)
        at java.base/java.util.LinkedHashMap$LinkedEntryIterator.next(Unknown Source)
        at java.base/java.util.LinkedHashMap$LinkedEntryIterator.next(Unknown Source)
        at com.metriql.service.task.TaskQueueService.currentTasks(TaskQueueService.kt:155)
        at com.metriql.service.task.TaskQueueService.currentTasks$default(TaskQueueService.kt:75)
        at com.metriql.service.task.TaskHttpService.activeCount(TaskHttpService.kt:40)
        at java.base/java.lang.invoke.MethodHandle.invokeWithArguments(Unknown Source)
        at org.rakam.server.http.HttpServer.handleJsonRequest(HttpServer.java:622)
        at org.rakam.server.http.HttpServer.lambda$createGetRequestHandler$18(HttpServer.java:506)
        at org.rakam.server.http.RouteMatcher.handle(RouteMatcher.java:47)
        at org.rakam.server.http.HttpServerHandler.channelRead(HttpServerHandler.java:86)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:366)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:352)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:345)
        at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:435)
        at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
        at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:250)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:366)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:352)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:345)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:366)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:352)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:883)
        at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:389)
        at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:305)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.base/java.lang.Thread.run(Unknown Source)
        

Cannot run program "gcloud"

Hi,

I'm running with a profile that uses BigQuery and a service account, but when I start the Docker container it says "Cannot run program "gcloud"". Is gcloud necessary even if I use a service account?

Cheers

Support LOWER/UPPER

Currently it seems LOWER and UPPER are not implemented. BI tools rely on this functionality, at least in WHERE clauses.
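
A typical BI-generated filter that hits this, again with placeholder names:

-- The LOWER call in the WHERE clause is what currently fails
SELECT "ta_1"."some_field" AS "ca_1"
FROM "metriql"."public"."model_some" "ta_1"
WHERE LOWER("ta_1"."some_field") = 'foo'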

Meta/Tag from DBT

Hi,

Any reason why tags/meta don't follow along to Metriql? :) It would be kind of nice not to have to fetch from multiple places when fetching metadata.

Cheers!

Floats / Decimals

Hi,
When I'm using DBeaver and trying to do some calculations or just retrieve decimals/floats, I get the error below.
One weird thing is that it says "type":"unknown" in the response, which makes me think it's something weird with the types.

org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error: Error executing query
	at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:133)
	at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:513)
	at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:444)
	at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
	at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:431)
	at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:816)
	at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3468)
	at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
	at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:171)
	at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
	at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4810)
	at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
	at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: java.sql.SQLException: Error executing query
	at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:287)
	at io.trino.jdbc.TrinoStatement.execute(TrinoStatement.java:240)
	at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.execute(JDBCStatementImpl.java:330)
	at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:130)
	... 12 more
Caused by: java.lang.RuntimeException: Error fetching next at http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a returned an invalid response: JsonResponse{statusCode=200, statusMessage=OK, headers={access-control-allow-origin=[*], connection=[keep-alive], content-length=[1898], content-type=[application/json]}, hasValue=false} [Error: {"id":"e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","infoUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","nextUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","columns":[{"name":"column","type":"unknown","typeSignature":{"rawType":"unknown","arguments":[]}}],"data":[["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"]],"stats":{"state":"QUEUED","queued":true,"scheduled":false,"nodes":0,"totalSplits":1,"queuedSplits":0,"runningSplits":0,"completedSplits":0,"cpuTimeMillis":0,"wallTimeMillis":0,"queuedTimeMillis":2702,"elapsedTimeMillis":2702,"processedRows":0,"processedBytes":0,"physicalInputBytes":0,"peakMemoryBytes":0,"spilledBytes":0},"warnings":[]}]
	at io.trino.jdbc.$internal.client.StatementClientV1.requestFailedException(StatementClientV1.java:447)
	at io.trino.jdbc.$internal.client.StatementClientV1.advance(StatementClientV1.java:386)
	at io.trino.jdbc.TrinoResultSet.getColumns(TrinoResultSet.java:235)
	at io.trino.jdbc.TrinoResultSet.create(TrinoResultSet.java:53)
	at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:262)
	... 15 more
Caused by: java.lang.IllegalArgumentException: Unable to create class io.trino.jdbc.$internal.client.QueryResults from JSON response:
[{"id":"e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","infoUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","nextUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","columns":[{"name":"column","type":"unknown","typeSignature":{"rawType":"unknown","arguments":[]}}],"data":[["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"]],"stats":{"state":"QUEUED","queued":true,"scheduled":false,"nodes":0,"totalSplits":1,"queuedSplits":0,"runningSplits":0,"completedSplits":0,"cpuTimeMillis":0,"wallTimeMillis":0,"queuedTimeMillis":2702,"elapsedTimeMillis":2702,"processedRows":0,"processedBytes":0,"physicalInputBytes":0,"peakMemoryBytes":0,"spilledBytes":0},"warnings":[]}]
	at io.trino.jdbc.$internal.client.JsonResponse.<init>(JsonResponse.java:69)
	at io.trino.jdbc.$internal.client.JsonResponse.execute(JsonResponse.java:143)
	at io.trino.jdbc.$internal.client.StatementClientV1.advance(StatementClientV1.java:372)
	... 18 more
Caused by: io.trino.jdbc.$internal.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of `io.trino.jdbc.$internal.client.QueryResults`, problem: Illegal base64 character 2e
 at [Source: (String)"{"id":"e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","infoUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","nextUri":"http://localhost:5656/v1/statement/queued/?id=e8a50b5b-ea33-476e-b23d-f8ebb37fce2a","columns":[{"name":"column","type":"unknown","typeSignature":{"rawType":"unknown","arguments":[]}}],"data":[["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0.8"],["0.8"],["0.8"],["0.3"],["0.8"],["0."[truncated 1398 chars]; line: 1, column: 1898]
	at io.trino.jdbc.$internal.jackson.databind.exc.ValueInstantiationException.from(ValueInstantiationException.java:47)
	at io.trino.jdbc.$internal.jackson.databind.DeserializationContext.instantiationException(DeserializationContext.java:1907)
	at io.trino.jdbc.$internal.jackson.databind.deser.std.StdValueInstantiator.wrapAsJsonMappingException(StdValueInstantiator.java:587)
	at io.trino.jdbc.$internal.jackson.databind.deser.std.StdValueInstantiator.rewrapCtorProblem(StdValueInstantiator.java:610)
	at io.trino.jdbc.$internal.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:293)
	at io.trino.jdbc.$internal.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
	at io.trino.jdbc.$internal.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
	at io.trino.jdbc.$internal.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
	at io.trino.jdbc.$internal.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1405)
	at io.trino.jdbc.$internal.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
	at io.trino.jdbc.$internal.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
	at io.trino.jdbc.$internal.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
	at io.trino.jdbc.$internal.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:2033)
	at io.trino.jdbc.$internal.jackson.databind.ObjectReader.readValue(ObjectReader.java:1491)
	at io.trino.jdbc.$internal.client.JsonCodec.fromJson(JsonCodec.java:68)
	at io.trino.jdbc.$internal.client.JsonResponse.<init>(JsonResponse.java:66)
	... 20 more
Caused by: java.lang.IllegalArgumentException: Illegal base64 character 2e
	at java.base/java.util.Base64$Decoder.decode0(Unknown Source)
	at java.base/java.util.Base64$Decoder.decode(Unknown Source)
	at java.base/java.util.Base64$Decoder.decode(Unknown Source)
	at io.trino.jdbc.$internal.client.FixJsonDataUtils.fixValue(FixJsonDataUtils.java:187)
	at io.trino.jdbc.$internal.client.FixJsonDataUtils.fixData(FixJsonDataUtils.java:74)
	at io.trino.jdbc.$internal.client.QueryResults.<init>(QueryResults.java:69)
	at jdk.internal.reflect.GeneratedConstructorAccessor24.newInstance(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
	at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
	at io.trino.jdbc.$internal.jackson.databind.introspect.AnnotatedConstructor.call(AnnotatedConstructor.java:124)
	at io.trino.jdbc.$internal.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:291)
	... 31 more

The dimension is specified like this:

- name: column
  meta:
    metriql.dimension:
      type: float # (Tested decimal as well)

PowerBI Integration

It looks like there are only two ways to support DirectQuery: via ODBC or SQL Analysis Services. OData and the alternatives don't support DirectQuery, which is a deal-breaker for us. ODBC doesn't carry the metric definitions, so the only option seems to be XMLA. Also, it looks like PowerBI connects to XMLA services without an additional driver, so we don't need to build a driver for metriql.

References:
https://powerbi.microsoft.com/en-us/blog/announcing-read-write-xmla-endpoints-in-power-bi-premium-public-preview/
https://github.com/microsoft/azure-analysis-services-http-sample

Fix BigQuery OAuth authentication

It looks like we didn't implement the OAuth method properly. I wiped the gcloud data locally, installed the latest version, and authenticated using the gcloud auth application-default login command; it produced the file at ~/.config/gcloud/application_default_credentials.json. We should use ~/.config/gcloud as the default config directory and use the application_default_credentials.json file under it.

cc: @guillaumelachaud

Auth towards trino connection

Hi,

I tried to add auth, but it seems to only apply to the API endpoints, not the actual Trino (interface) connection. Is this possible today? I would like to deploy it for testing but not expose it to the entire world.

Cheers!
