
Wundergraph

Wundergraph provides a platform to easily expose your database through a GraphQL interface.


Example

For a full example application, see the example project.

#[macro_use] extern crate diesel;
use wundergraph::prelude::*;

table! {
    heros {
        id -> Integer,
        name -> Text,
        hair_color -> Nullable<Text>,
        species -> Integer,
    }
}

table! {
    species {
        id -> Integer,
        name -> Text,
    }
}

#[derive(Clone, Debug, Identifiable, WundergraphEntity)]
#[table_name = "heros"]
pub struct Hero {
    id: i32,
    name: String,
    hair_color: Option<String>,
    species: HasOne<i32, Species>,
}

#[derive(Clone, Debug, Identifiable, WundergraphEntity)]
#[table_name = "species"]
pub struct Species {
    id: i32,
    name: String,
    heros: HasMany<Hero, heros::species>,
}

wundergraph::query_object! {
    Query {
        Hero,
        Species,
    }
}

Building

Depending on your backend choice you need to install a native library: libpq is required for the postgresql feature, libsqlite3 for the sqlite feature.

License

Licensed under either of these:

Contributing

Unless you explicitly state otherwise, any contribution you intentionally submit for inclusion in the work, as defined in the Apache-2.0 license, shall be dual-licensed as above, without any additional terms or conditions.

wundergraph's People

Contributors

killercup, mre, naufraghi, p-alik, philipp-m, weiznich


wundergraph's Issues

Write documentation

  • Add documentation to wundergraph itself
  • Document the usage of wundergraph_derive
  • Build wundergraph with `#![deny(missing_docs)]`
  • Publish docs

Context based validation or modification

I've been playing around with wundergraph for a bit and I'm really liking it, but I'm a bit stuck on devising a method of operation authorization. Ideally, I'd like to be able to inspect instantiated models prior to inserting or updating, and either allow the operation or reject it based on the context.

I've been poking around in the source, and it doesn't seem like there's any straightforward way to do this at the moment. I think(?) the easiest way to support something like this might be to allow the implementation of a trait like:

trait WundergraphShouldInsert<M, Ctx> {
    fn should_insert(model: &M, context: &Ctx) -> bool;
    // or, to allow the hook to modify/replace the model:
    // fn should_insert(model: &M, context: &Ctx) -> Option<&M>;
}

In the case of inserting, it seems like this could probably be checked in the handle_insert call prior to actually asking Diesel to insert.

I'd also like something for reading/deleting, but I think the easiest cases to start with are probably insert/update. If this seems like something that you'd be interested in having Wundergraph support, I'd be happy to take a crack at implementing this. If I can already do this somehow and I'm missing it, I'd love to hear.

Thanks!
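The proposed hook can be illustrated with a std-only sketch (the names ShouldInsert, NewHero, and RequestContext are all hypothetical and not part of wundergraph's API):

```rust
// Hypothetical authorization hook, sketched with std only; wundergraph
// does not currently expose such a trait.
trait ShouldInsert<Ctx> {
    /// Decide whether `self` may be inserted under the given context.
    fn should_insert(&self, ctx: &Ctx) -> bool;
}

struct RequestContext {
    is_admin: bool,
}

struct NewHero {
    name: String,
}

impl ShouldInsert<RequestContext> for NewHero {
    fn should_insert(&self, ctx: &RequestContext) -> bool {
        // Only admins may insert, and empty names are always rejected.
        ctx.is_admin && !self.name.is_empty()
    }
}

fn main() {
    let hero = NewHero { name: "Luke".into() };
    // An insert handler would run this check before handing the model to Diesel.
    assert!(hero.should_insert(&RequestContext { is_admin: true }));
    assert!(!hero.should_insert(&RequestContext { is_admin: false }));
}
```

As the issue suggests, such a check could run inside handle_insert, returning an authorization error whenever the hook yields false.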

Support raw identifiers

Hello, it's me again! :-)

Is there any way to support raw identifiers?

It might be a corner case, but sometimes it's necessary to use this notation if you have naming collisions.

table! {
    options (id) {
        id -> Int4,
        label -> Varchar,
        values -> Array<Text>,
        // a field called type must use raw identifier
        r#type -> crate::domain::option::model::OptionTypeEnumMapping,
    }
}

#[derive(DbEnum, Debug, Serialize, Clone, Copy, WundergraphValue, GraphQLEnum)]
#[sql_type = "OptionTypeEnumMapping"]
#[PgType = "optiontype_enum"]
pub enum OptionTypeEnum {
    Checkbox,
    // collisions, so using raw identifier
    r#Option,
    r#String,
    Number
}

// Danger, calling the whole struct Option!
#[derive(Clone, Debug, Identifiable, Queryable, Associations, WundergraphEntity)]
#[table_name = "options"]
pub struct Option {
    pub id: i32,
    pub label: String,
    pub values: Vec<String>,
    // And using type as a field
    pub r#type: OptionTypeEnum,
}

This gives the error:

error: proc-macro derive panicked
  --> src\domain\option\model.rs:9:1
   |
9  | / table! {
10 | |     options (id) {
11 | |         id -> Int4,
12 | |         label -> Varchar,
...  |
15 | |     }
16 | | }
   | |_^
   |
   = help: message: `"_impl_query_id_for_r#type"` is not a valid identifier
   = note: this warning originates in a macro outside of the current crate (in Nightly builds, run with -Z external-macro-backtrace for more info)

This example is exaggerated, but I do think there are valid uses if you want to keep a clean naming structure sometimes.

Is this supported already? Am I posting this in the right repo or should I post to juniper/diesel instead?
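The panic message suggests the derive embeds the field name verbatim into a generated identifier. A std-only sketch of the likely fix, stripping the `r#` prefix before concatenating (the helper name derived_ident is hypothetical; this is not wundergraph_derive's actual code):

```rust
// Sketch of the fix for the reported panic: a derive macro must strip
// the `r#` prefix before embedding a field name in a generated
// identifier. `derived_ident` is a hypothetical helper.
fn derived_ident(field: &str) -> String {
    // `r#type` names a field called `type`; only the unprefixed part is
    // a valid building block for a new identifier.
    let unraw = field.strip_prefix("r#").unwrap_or(field);
    format!("_impl_query_id_for_{}", unraw)
}

fn main() {
    // Verbatim concatenation would yield the invalid
    // "_impl_query_id_for_r#type" from the error message.
    assert_eq!(derived_ident("r#type"), "_impl_query_id_for_type");
    assert_eq!(derived_ident("label"), "_impl_query_id_for_label");
}
```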

wundergraph_cli doesn't catch a missing non-auto-increment primary key in a struct deriving GraphQLInputObject

As mentioned in #3, the CI bench checks run faultlessly even though the struct

#[derive(Insertable, GraphQLInputObject, Clone, Debug, Copy)]
#[table_name = "film_actor"]
pub struct NewFilmActor {
    last_update: NaiveDateTime,
}

doesn't contain the two fields which are set as the primary key for the table:

#[table_name = "film_actor"]
#[primary_key(actor_id, film_id)]
pub struct FilmActor {
    actor_id: i16,
    film_id: i16,
    last_update: NaiveDateTime,
}

Unable to rename graphql entity

According to #8 and the commit here, it should be possible to rename the graphql entity using #[wundergraph(graphql_name = "MyNewName")]. However, it does not seem to work for me.

I changed the name (trying to fix a pluralization issue) but it has no effect.

Is it me missing a switch again, or an actual bug?

My code

use wundergraph::prelude::*;

table! {
    addresses (id) {
        id -> Int4,
        address_line -> Varchar,
        postal_code -> Varchar,
        city -> Varchar,
        country -> Varchar,
    }
}

#[derive(Clone, Debug, Identifiable, Queryable, WundergraphEntity)]
#[wundergraph(graphql_name = "Addresses")] // <-- Trying to enforce this name that gets wrong in pluralization
#[table_name = "addresses"]
pub struct Address {
    pub id: i32,
    pub address_line: String,
    pub postal_code: String,
    pub city: String,
    pub country: String
}

Result

The address entity is still wrongly named Adresss, with three s's.
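The three-s name is exactly what a naive pluralizer produces. A std-only sketch (both helpers are illustrative, not wundergraph's actual inflection code):

```rust
// Unconditionally appending "s" turns "Address" into "Addresss".
fn naive_plural(name: &str) -> String {
    format!("{}s", name)
}

// A slightly smarter rule appends "es" after a trailing "s".
// (Simplified; real inflection libraries handle many more cases.)
fn better_plural(name: &str) -> String {
    if name.ends_with('s') {
        format!("{}es", name)
    } else {
        format!("{}s", name)
    }
}

fn main() {
    assert_eq!(naive_plural("Address"), "Addresss"); // the reported bug
    assert_eq!(better_plural("Address"), "Addresses");
    assert_eq!(better_plural("Hero"), "Heros"); // still wrong for "Heroes"
}
```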

wundergraph_cli's print_schema::tests::round_trip test fails

I've faced the issue while working on #48, but got it even when testing the master branch at 6c8cb7b

$ DATABASE_URL=wundergraph_cli.db cargo test --features sqlite --no-default-features  print_schema::tests::round_trip
   Compiling wundergraph_cli v0.1.1 (/opt/devel/rust/wundergraph/wundergraph_cli)
warning: use of deprecated item 'std::error::Error::description': use the Display impl or to_string()
   --> wundergraph_cli/src/infer_schema_internals/sqlite.rs:280:29
    |
280 |         Err(e) => assert!(e.description().starts_with(
    |                             ^^^^^^^^^^^
    |
    = note: `#[warn(deprecated)]` on by default

    Finished test [unoptimized + debuginfo] target(s) in 17.16s
     Running /opt/devel/rust/wundergraph/target/debug/deps/wundergraph_cli-95ce337ef33f77ec

running 1 test
     Created binary (application) `wundergraph_roundtrip_test` package
test print_schema::tests::round_trip ... test print_schema::tests::round_trip has been running for over 60 seconds
Started http server: http://127.0.0.1:8001
test print_schema::tests::round_trip ... FAILED

failures:

---- print_schema::tests::round_trip stdout ----
"[package]\nname = \"wundergraph_roundtrip_test\"\nversion = \"0.1.0\"\nauthors = [\"***\"]\nedition = \"2018\"\n\n# See more keys and their definitions at https://doc.r"
Updating crates.io index
Compiling libc v0.2.70
Compiling cfg-if v0.1.10
Compiling autocfg v1.0.0
Compiling proc-macro2 v1.0.13
Compiling unicode-xid v0.2.0
Compiling syn v1.0.22
Compiling byteorder v1.3.4
Compiling log v0.4.8
Compiling futures v0.1.29
Compiling lazy_static v1.4.0
Compiling semver-parser v0.7.0
Compiling scopeguard v1.1.0
Compiling fnv v1.0.7
Compiling maybe-uninit v2.0.0
Compiling smallvec v1.4.0
Compiling slab v0.4.2
Compiling autocfg v0.1.7
Compiling proc-macro2 v0.4.30
Compiling unicode-xid v0.1.0
Compiling rand_core v0.4.2
Compiling matches v0.1.8
Compiling syn v0.15.44
Compiling memchr v2.3.3
Compiling serde v1.0.110
Compiling getrandom v0.1.14
Compiling cc v1.0.53
Compiling regex-syntax v0.6.17
Compiling proc-macro-hack v0.5.15
Compiling failure_derive v0.1.8
Compiling itoa v0.4.5
Compiling gimli v0.21.0
Compiling object v0.19.0
Compiling rustc-demangle v0.1.16
Compiling percent-encoding v2.1.0
Compiling copyless v0.1.4
Compiling percent-encoding v1.0.1
Compiling match_cfg v0.1.0
Compiling linked-hash-map v0.5.3
Compiling crc32fast v1.2.0
Compiling ryu v1.0.4
Compiling quick-error v1.2.3
Compiling either v1.5.3
Compiling ppv-lite86 v0.2.8
Compiling pkg-config v0.3.17
Compiling httparse v1.3.4
Compiling arc-swap v0.4.6
Compiling encoding_rs v0.8.23
Compiling bitflags v1.2.1
Compiling dtoa v0.4.5
Compiling sha1 v0.6.0
Compiling language-tags v0.2.2
Compiling mime v0.3.16
Compiling uuid v0.7.4
Compiling crossbeam-utils v0.7.2
Compiling num-traits v0.2.11
Compiling num-integer v0.1.42
Compiling indexmap v1.3.2
Compiling thread_local v1.0.1
Compiling lock_api v0.3.4
Compiling semver v0.9.0
Compiling unicode-normalization v0.1.12
Compiling rand_core v0.3.1
Compiling rand_jitter v0.1.4
Compiling unicode-bidi v0.3.4
Compiling rand_chacha v0.1.1
Compiling rand_pcg v0.1.2
Compiling rand v0.6.5
Compiling hashbrown v0.6.3
Compiling tokio-sync v0.1.8
Compiling actix-service v0.4.2
Compiling miniz-sys v0.1.12
Compiling brotli-sys v0.3.2
Compiling addr2line v0.12.1
Compiling lru-cache v0.1.2
Compiling libsqlite3-sys v0.17.3
Compiling rustc_version v0.2.3
Compiling rand_xorshift v0.1.1
Compiling rand_hc v0.1.0
Compiling rand_isaac v0.1.1
Compiling idna v0.1.5
Compiling idna v0.2.0
Compiling quote v1.0.6
Compiling smallvec v0.6.13
Compiling iovec v0.1.4
Compiling net2 v0.2.34
Compiling num_cpus v1.13.0
Compiling rand_os v0.1.3
Compiling time v0.1.43
Compiling backtrace v0.3.48
Compiling hostname v0.3.1
Compiling socket2 v0.3.12
Compiling parking_lot_core v0.7.2
Compiling signal-hook-registry v1.2.0
Compiling base64 v0.10.1
Compiling parking_lot_core v0.6.2
Compiling parking_lot v0.9.0
Compiling quote v0.6.13
Compiling aho-corasick v0.7.10
Compiling url v1.7.2
Compiling url v2.1.1
Compiling tokio-executor v0.1.10
Compiling bytes v0.4.12
Compiling mio v0.6.22
Compiling threadpool v1.8.1
Compiling rand_core v0.5.1
Compiling const-random-macro v0.1.8
Compiling resolv-conf v0.6.3
Compiling parking_lot v0.10.2
Compiling flate2 v1.0.14
Compiling regex v1.3.7
Compiling tokio-timer v0.2.13
Compiling tokio-current-thread v0.1.7
Compiling chrono v0.4.11
Compiling tokio-io v0.1.13
Compiling http v0.1.21
Compiling string v0.2.1
Compiling rand_chacha v0.2.2
Compiling mio-uds v0.6.8
Compiling brotli2 v0.3.2
Compiling const-random v0.1.8
Compiling scheduled-thread-pool v0.2.4
Compiling synstructure v0.12.3
Compiling tokio-codec v0.1.2
Compiling enum-as-inner v0.2.1
Compiling derive_more v0.15.0
Compiling rand v0.7.3
Compiling ahash v0.2.18
Compiling r2d2 v0.8.8
Compiling actix-codec v0.1.2
Compiling serde_derive v1.0.110
Compiling diesel_derives v1.4.1
Compiling thiserror-impl v1.0.18
Compiling paste-impl v0.1.12
Compiling juniper_codegen v0.14.2
Compiling actix-web-codegen v0.1.3
Compiling wundergraph_derive v0.1.0 (/opt/devel/rust/wundergraph/wundergraph_derive)
Compiling tokio-reactor v0.1.12
Compiling actix-utils v0.4.7
Compiling thiserror v1.0.18
Compiling actix-threadpool v0.1.2
Compiling paste v0.1.12
Compiling tokio-tcp v0.1.4
Compiling tokio-udp v0.1.6
Compiling tokio-signal v0.2.9
Compiling actix-rt v0.2.6
Compiling diesel v1.4.4
Compiling actix-server-config v0.1.2
Compiling failure v0.1.8
Compiling actix-server v0.6.1
Compiling trust-dns-proto v0.7.4
Compiling actix-testing v0.1.0
Compiling serde_urlencoded v0.6.1
Compiling serde_json v1.0.53
Compiling actix-router v0.1.5
Compiling trust-dns-resolver v0.11.1
Compiling h2 v0.1.26
Compiling juniper v0.14.2
Compiling actix-connect v0.2.5
Compiling actix-http v0.2.11
Compiling wundergraph v0.1.2 (/opt/devel/rust/wundergraph/wundergraph)
Compiling awc v0.2.8
Compiling actix-web v1.0.9
Compiling wundergraph_roundtrip_test v0.1.0 (/tmp/roundtrip_test.knICxKiRUKHR/wundergraph_roundtrip_test)
Finished dev [unoptimized + debuginfo] target(s) in 2m 18s
Running `target/debug/wundergraph_roundtrip_test`
Started server
thread 'print_schema::tests::round_trip' panicked at 'called `Result::unwrap()` on an `Err` value: Error(Io(Custom { kind: TimedOut, error: "timed out" }), "http://127.0.0.1:8001/graphql")', wundergraph_cli/src1
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    print_schema::tests::round_trip

test result: FAILED. 0 passed; 1 failed; 0 ignored; 0 measured; 9 filtered out

error: test failed, to rerun pass '--bin wundergraph_cli'

RUSTSEC-2020-0049: Use-after-free in Framed due to lack of pinning

Use-after-free in Framed due to lack of pinning

Details
Package actix-codec
Version 0.1.2
URL actix/actix-net#91
Date 2020-01-30
Patched versions >= 0.3.0-beta.1

Affected versions of this crate did not require the buffer wrapped in Framed to be pinned,
but treated it as if it had a fixed location in memory. This may result in a use-after-free.

The flaw was corrected by making the affected functions accept Pin<&mut Self> instead of &mut self.

See advisory page for additional details.

RUSTSEC-2021-0020: Multiple Transfer-Encoding headers misinterprets request payload

Multiple Transfer-Encoding headers misinterprets request payload

Details
Package hyper
Version 0.12.35
URL GHSA-6hfq-h8hq-87mf
Date 2021-02-05
Patched versions >=0.13.10, <0.14.0; >=0.14.3
Unaffected versions <0.12.0

hyper's HTTP server code had a flaw that incorrectly interpreted some requests
with multiple Transfer-Encoding headers as having a chunked payload, when they
should have been rejected as illegal. Combined with an upstream HTTP proxy
that understands the request payload boundary differently, this can result in
"request smuggling" or "desync attacks".

See advisory page for additional details.

RUSTSEC-2021-0124: Data race when sending and receiving after closing a `oneshot` channel

Data race when sending and receiving after closing a oneshot channel

Details
Package tokio
Version 0.1.22
URL tokio-rs/tokio#4225
Date 2021-11-16
Patched versions >=1.8.4, <1.9.0; >=1.13.1
Unaffected versions <0.1.14

If a tokio::sync::oneshot channel is closed (via the
oneshot::Receiver::close method), a data race may occur if the
oneshot::Sender::send method is called while the corresponding
oneshot::Receiver is being awaited or try_recv is being called.

When these methods are called concurrently on a closed channel, the two halves
of the channel can concurrently access a shared memory location, resulting in a
data race. This has been observed to cause memory corruption.

Note that the race only occurs when both halves of the channel are used
after the Receiver half has called close. Code where close is not used, or where the
Receiver is not awaited and try_recv is not called after calling close,
is not affected.

See tokio#4225 for more details.

See advisory page for additional details.

RUSTSEC-2020-0159: Potential segfault in `localtime_r` invocations

Potential segfault in localtime_r invocations

Details
Package chrono
Version 0.4.19
URL chronotope/chrono#499
Date 2020-11-10

Impact

Unix-like operating systems may segfault due to dereferencing a dangling pointer in specific circumstances. This requires an environment variable to be set in a different thread than the affected functions. This may occur without the user's knowledge, notably in a third-party library.

Workarounds

No workarounds are known.

References

See advisory page for additional details.

RUSTSEC-2020-0048: Use-after-free in BodyStream due to lack of pinning

Use-after-free in BodyStream due to lack of pinning

Details
Package actix-http
Version 0.2.11
URL actix/actix-web#1321
Date 2020-01-24
Patched versions >= 2.0.0-alpha.1

Affected versions of this crate did not require the buffer wrapped in BodyStream to be pinned,
but treated it as if it had a fixed location in memory. This may result in a use-after-free.

The flaw was corrected by making the trait MessageBody require Unpin
and making the poll_next() function accept Pin<&mut Self> instead of &mut self.

See advisory page for additional details.

Order by not working

Is there anything special I need to do to get ordering to work?

As seen in this picture, I'm trying to order by id. I've tried different columns with different data types (strings, timestamps) but get the same result: no sorting is done at all.

Thank you for a kick ass crate by the way!


RUSTSEC-2020-0036: failure is officially deprecated/unmaintained

failure is officially deprecated/unmaintained

Details
Status unmaintained
Package failure
Version 0.1.8
URL rust-lang-deprecated/failure#347
Date 2020-05-02

The failure crate is officially end-of-life: it has been marked as deprecated
by the former maintainer, who has announced that there will be no updates or
maintenance work on it going forward.

The following are some suggested actively developed alternatives to switch to:

See advisory page for additional details.

mutation_object! macro fails when update is set to false

The following fails with error:

error: type arguments must be declared prior to const arguments
   --> src/schema.rs:129:1
    |
129 | / wundergraph::mutation_object! {
130 | |     Mutation {
131 | |         Comments(insert = NewComments, update = false,),
132 | |         Posts(insert = NewPosts, update = PostsChangeset,),
133 | |         Users(insert = NewUser, update = UsersChangeset,)
134 | |     }
135 | | }
    | |_^
    |
    = note: this error originates in a macro outside of the current crate (in Nightly builds, run with -Z external-macro-backtrace for more info)

Code:

#[derive(Insertable, GraphQLInputObject, Clone, Debug)]
#[table_name = "comments"]
pub struct NewComments {
    comment: String,
    author: i32,
    post: i32
}

#[derive(Insertable, GraphQLInputObject, Clone, Debug)]
#[table_name = "posts"]
pub struct NewPosts {
    title: String,
    content: String,
    author: i32
}

#[derive(AsChangeset, GraphQLInputObject, Identifiable, Debug)]
#[table_name = "posts"]
pub struct PostsChangeset {
    id: i32,
    title: String,
    content: String,
    author: i32
}

#[derive(Insertable, GraphQLInputObject, Clone, Debug)]
#[table_name = "users"]
pub struct NewUsers {
    name: String
}

#[derive(AsChangeset, GraphQLInputObject, Identifiable, Debug)]
#[table_name = "users"]
pub struct UsersChangeset {
    id: i32,
    name: String
}

wundergraph::mutation_object! {
    Mutation {
        Comments(insert = NewComments, update = false,),
        Posts(insert = NewPosts, update = PostsChangeset,),
        Users(insert = NewUsers, update = UsersChangeset,)
    }
}

Error on filter set to null

Example: Heros

Query:

query speciess{
  Speciess(filter: null){
    id
    name
    heros {
      id
      heroName
    }
  }
}

Expected result: Same as with not mentioning the filter at all

Result:

{
  "data": null,
  "errors": [
    {
      "message": "Could not build filter from arguments",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "Speciess"
      ]
    }
  ]
}

RUSTSEC-2021-0081: Potential request smuggling capabilities due to lack of input validation

Potential request smuggling capabilities due to lack of input validation

Details
Package actix-http
Version 0.2.11
Date 2021-06-16
Patched versions >=2.2.1, <3.0.0; >=3.0.0-beta.9

Affected versions of this crate did not properly detect invalid requests that could allow HTTP/1 request smuggling (HRS) attacks when running alongside a vulnerable front-end proxy server. This can result in leaked internal and/or user data, including credentials, when the front-end proxy is also vulnerable.

Popular front-end proxies and load balancers already mitigate HRS attacks, so it is recommended that they are also kept up to date; check your specific setup. You should upgrade even if the front-end proxy receives exclusively HTTP/2 traffic and connects to the back end using HTTP/1; several downgrade attacks are known that can also expose HRS vulnerabilities.

See advisory page for additional details.

Allow to specify the columns used to load data

Allow something like this:

table! {
  users(id) {
     id -> Integer,
     name -> Text,
     secret -> Bytea,
  }
}

#[derive(Queryable, WundergraphEntity, WundergraphFilter, Identifiable, Debug, Clone)]
#[table_name = "users"]
#[wundergraph(select(id, name))]
struct User {
   id: i32,
   name: String
}

Add support for MySql

This one should be quite easy:

Connections Never Closed

I have noticed that I reach MAX_CONNECTIONS quite quickly by just sitting at home, being one user, and querying the database.

It seems like connections don't get released. This is the same project as I uploaded in #41, with the exception that I updated diesel to the latest version (1.4.4).

Running select * from pg_stat_activity; shows that idle connections are filling up. I also found that one query gets spammed: SET CLIENT_ENCODING TO 'UTF8'

Any ideas why this is happening? It's a PostgreSQL v12 database server on Ubuntu.


RUSTSEC-2021-0079: Integer overflow in `hyper`'s parsing of the `Transfer-Encoding` header leads to data loss

Integer overflow in hyper's parsing of the Transfer-Encoding header leads to data loss

Details
Package hyper
Version 0.12.36
URL GHSA-5h46-h7hh-c6x9
Date 2021-07-07
Patched versions >=0.14.10

When decoding chunk sizes that are too large, hyper's code would encounter an integer overflow. Depending on the situation,
this could lead to data loss from an incorrect total size, or in rarer cases, a request smuggling attack.

To be vulnerable, you must be using hyper for any HTTP/1 purpose, including as a client or server, and consumers must send
requests or responses that specify a chunk size greater than 18 exabytes. For a request smuggling attack to be possible,
any upstream proxies must also accept a chunk size greater than 64 bits.

See advisory page for additional details.
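The overflow can be sketched with std only: accumulating hex digits with wrapping arithmetic silently truncates oversized chunk sizes, while checked arithmetic rejects them. (Simplified illustration; this is not hyper's actual parser.)

```rust
// Parse a hex chunk size, detecting overflow instead of wrapping.
// Returns None for invalid digits or sizes that exceed u64::MAX.
fn parse_chunk_size(hex: &str) -> Option<u64> {
    let mut size: u64 = 0;
    for c in hex.chars() {
        let digit = c.to_digit(16)? as u64;
        // checked_mul/checked_add return None on overflow rather than
        // wrapping around to a small (incorrect) total.
        size = size.checked_mul(16)?.checked_add(digit)?;
    }
    Some(size)
}

fn main() {
    assert_eq!(parse_chunk_size("1f"), Some(0x1f));
    // 16 f's is exactly u64::MAX; a 17th digit overflows and is rejected.
    assert_eq!(parse_chunk_size("ffffffffffffffff"), Some(u64::MAX));
    assert_eq!(parse_chunk_size("fffffffffffffffff"), None);
}
```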

__typename is not returned when queried

In a query like the following:

query {
   Hero {
      __typename
   }
}

The expected response would contain a key like:

"__typename": "Hero"

Instead, the __typename key is omitted from the response. I've done a bit of poking around in Juniper, and it seems that the concrete_type_name function is called to determine the value __typename should resolve to when queried. Wundergraph appears to define that function here, but to no effect. Any thoughts on what might be happening here? Thanks!

Question: How do I extend the schema?

How can I extend the schema manually when using wundergraph?

Example from the juniper book:

#[juniper::object]
impl Root {
    fn personFromCurrentUser() -> FieldResult<Option<Person>> {
        // Look up logged in user in database...
        unimplemented!()
    }
}

Currently my query root only consists of wundergraph generated entities

wundergraph::query_object! {
    Query {
        Person,
        Event,
        Address,
        // How can I add personFromCurrentUser here?
    }
}

LoadingHandler not implemented

Hi,

I'm trying to get your examples running, but for some reason it seems like the macros don't get applied to extend my types. Perhaps you see directly what I'm doing wrong or missing?

The only thing I have done differently from your example is to break the content apart into different files/modules. But they have the correct (I think) imports etc. between them.

I tried putting everything in the same file but still got the same error...

The error

error[E0277]: the trait bound `person::model::Person: wundergraph::query_builder::selection::LoadingHandler<_, _>` is not satisfied
  --> src\main.rs:45:18
   |
45 |     let schema = Schema::new(query, mutation);
   |                  ^^^^^^^^^^^ the trait `wundergraph::query_builder::selection::LoadingHandler<_, _>` is not implemented for `person::model::Person`
   |
   = note: required because of the requirements on the impl of `juniper::types::base::GraphQLType<wundergraph::scalar::WundergraphScalarValue>` for `graphql::query::Query<_>`
   = note: required by `juniper::schema::model::RootNode`

My type implementation

use diesel_derive_enum::DbEnum;
use serde::Serialize;
use wundergraph::prelude::*;
use crate::address::model::Address;
use crate::db::schema::*;

#[derive(Clone, Debug, Identifiable, Queryable, WundergraphEntity)]
pub struct Person {
    pub id: i32,
    pub firstname: String,
    pub lastname: Option<String>,
    pub year_of_birth: Option<i32>,
    pub address: HasOne<i32, Address>,
}

My diesel type

table! {
    persons (id) {
        id -> Int4,
        firstname -> Varchar,
        lastname -> Nullable<Varchar>,
        year_of_birth -> Nullable<Int4>,
        address_id -> Nullable<Int4>,
    }
}

QueryModifier

impl<T, C, DB> QueryModifier<T, DB> for MyContext<C>
where
    C: Connection<Backend = DB>,
    DB: Backend + ApplyOffset + 'static,
    T: LoadingHandler<DB, Self>,
    Self: WundergraphContext,
    Self::Connection: Connection<Backend = DB>,
{
    fn modify_query<'a>(
        &self,
        _select: &LookAheadSelection<'_, WundergraphScalarValue>,
        query: BoxedQuery<'a, T, DB, Self>,
    ) -> Result<BoxedQuery<'a, T, DB, Self>> {
        match T::TYPE_NAME {
            _ => Ok(query),
        }
    }
}

Query object

use crate::person::model::Person;
use crate::event::model::Event;
use crate::address::model::Address;

wundergraph::query_object!{
    Query {
       Person,
       Event,
       Address
   }
}

Write a tool/macro to generate the whole schema

Similar to diesel_infer_schema / diesel_cli, there should be some tool to generate all the code directly from the database.

  • Write a library to do the code generation
  • Write a cli tool to output the code
  • Write an infer_wundergraph_schema! macro to be used inside user code.

Write tests

We should at least have tests for

  • Perform a simple request
  • Perform a request with a filter
    • Perform a request with an equal filter
    • Perform a request with a non-equal filter
    • Perform a request with an equal-any filter
    • Perform a request with a like filter
    • Perform a request with an is-null filter
    • Perform a request with an is-not-null filter
    • Perform a request combining two filters with and
    • Perform a request combining two filters with or
  • Perform a request with order
    • Perform a request with column asc
    • Perform a request with column desc
    • Perform a request with an invalid column name
  • Perform a request with a limit
  • Perform a request with an offset
  • Perform several requests combining everything above
  • Perform a request containing a one-to-many relationship
  • Perform a request containing a one-to-one relationship

RUSTSEC-2021-0078: Lenient `hyper` header parsing of `Content-Length` could allow request smuggling

Lenient hyper header parsing of Content-Length could allow request smuggling

Details
Package hyper
Version 0.12.36
URL GHSA-f3pg-qwvg-p99c
Date 2021-07-07
Patched versions >=0.14.10

hyper's HTTP header parser accepted contents inside Content-Length headers that are illegal according to RFC 7230.
Because of this, an upstream HTTP proxy that ignores the error may still forward such headers along.

To be vulnerable, hyper must be used as an HTTP/1 server behind an HTTP proxy that ignores the header's contents
but still forwards it. Because all of these factors must line up, an attack exploiting this vulnerability is unlikely.

See advisory page for additional details.
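Strict handling can be sketched with std only: accept plain ASCII digits and reject everything else, rather than leniently interpreting part of the value. (Simplified illustration; this is not hyper's actual parser.)

```rust
// Parse a Content-Length value strictly: RFC 7230 allows only ASCII
// digits, so values like "+42" or "42, 42" must be rejected outright
// instead of being partially interpreted.
fn parse_content_length(value: &str) -> Option<u64> {
    if value.is_empty() || !value.bytes().all(|b| b.is_ascii_digit()) {
        return None; // anything but plain digits is illegal
    }
    value.parse().ok()
}

fn main() {
    assert_eq!(parse_content_length("42"), Some(42));
    assert_eq!(parse_content_length("+42"), None);
    assert_eq!(parse_content_length("42, 42"), None);
}
```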

Error 500 on non-assigned variables

Example: Heros

Query:

query speciess($filter: SpeciesFilter){
  Speciess(filter: $filter){
    id
    name
    heros {
      id
      heroName
    }
  }
}

Variables: None

Expected result: the same as without mentioning the filter at all.
Result: Error 500

Fix small issues with `wundergraph_cli`

Sometimes the generated file contains structs with types (for example uuid::Uuid or chrono::DateTime) that are not in the prelude, and wundergraph_cli does not generate the required imports for them. Find a way to do that.
I think the easiest way to fix this is to generate fully qualified paths instead of using only the type name. The type mapping happens here.

Fix code generation in `wundergraph_cli`

Currently the code generated by wundergraph_cli has the following issues:

  • For nullable references the #[wundergraph(is_nullable_reference = "true")] annotation is missing. This can be fixed by checking the type of the referenced field.

RUSTSEC-2020-0071: Potential segfault in the time crate

Potential segfault in the time crate

Details
Package time
Version 0.1.43
URL time-rs/time#293
Date 2020-11-18
Patched versions >=0.2.23
Unaffected versions =0.2.0,=0.2.1,=0.2.2,=0.2.3,=0.2.4,=0.2.5,=0.2.6

Impact

Unix-like operating systems may segfault due to dereferencing a dangling pointer in specific circumstances. This requires an environment variable to be set in a different thread than the affected functions. This may occur without the user's knowledge, notably in a third-party library.

The affected functions from time 0.2.7 through 0.2.22 are:

  • time::UtcOffset::local_offset_at
  • time::UtcOffset::try_local_offset_at
  • time::UtcOffset::current_local_offset
  • time::UtcOffset::try_current_local_offset
  • time::OffsetDateTime::now_local
  • time::OffsetDateTime::try_now_local

The affected functions in time 0.1 (all versions) are:

  • at
  • at_utc
  • now

Non-Unix targets (including Windows and wasm) are unaffected.

Patches

Pending a proper fix, the internal method that determines the local offset has been modified to always return None on the affected operating systems. This has the effect of returning an Err on the try_* methods and UTC on the non-try_* methods.

Users and library authors with time in their dependency tree should perform cargo update, which will pull in the updated, unaffected code.

Users of time 0.1 do not have a patch and should upgrade to an unaffected version: time 0.2.23 or greater or the 0.3 series.

Workarounds

No workarounds are known.

References

time-rs/time#293

See advisory page for additional details.

Expected Option type for id

#[derive(Queryable, Identifiable, PartialEq, Debug, Serialize,WundergraphEntity,
         WundergraphFilter, Associations)]
#[table_name = "users"]
pub struct User {
    pub id: i32,
    pub first_name: Option<String>,
    pub last_name: Option<String>,
    pub username: Option<String>,
    pub email: Option<String>,
    pub password: Option<String>,
    pub date_join: NaiveDateTime,
    pub last_login: Option<NaiveDateTime>,
    pub is_active: Option<bool>,
    pub is_staff: Option<bool>,
    pub is_admin: Option<bool>,
    pub tel: Option<String>,
    pub gender: Option<String>,
    pub img: Option<String>,
}

schema.rs:

table! {
    users (id) {
        id -> Integer,
        first_name -> Nullable<Varchar>,
        last_name -> Nullable<Varchar>,
        username -> Nullable<Varchar>,
        email -> Nullable<Varchar>,
        password -> Nullable<Varchar>,
        date_join -> Timestamp,
        last_login -> Nullable<Timestamp>,
        is_active -> Nullable<Bool>,
        is_staff -> Nullable<Bool>,
        is_admin -> Nullable<Bool>,
        tel -> Nullable<Varchar>,
        gender -> Nullable<Varchar>,
        img -> Nullable<Varchar>,
    }
}

And the compiler says:

pub id: i32,
^^ expected i32, found enum std::option::Option

Broken query generation

For the wundergraph_bench crate, the following GraphQL query produces incorrect SQL:

query artists_collaboration {
  Artists(filter: {albums: {tracks: {composer: {eq: "Ludwig van Beethoven"}}}})
  {
    id
    name
  }
}

The following SQL is generated (note that the second subquery is the first one negated, so the WHERE clause can never be satisfied):

SELECT "artists"."id", "artists"."name" FROM "artists" 
WHERE "artists"."id" IN (
   SELECT "albums"."artist_id" FROM "albums" WHERE "albums"."id" IN (
       SELECT "tracks"."album_id" FROM "tracks" WHERE "tracks"."composer" = $1 
       AND "tracks"."album_id" IS NOT NULL)) 
AND "artists"."id" NOT IN (
    SELECT "albums"."artist_id" FROM "albums" WHERE "albums"."id" IN (
        SELECT "tracks"."album_id" FROM "tracks" WHERE "tracks"."composer" = $2 
        AND "tracks"."album_id" IS NOT NULL)) 
-- binds: ["Ludwig van Beethoven", "Ludwig van Beethoven"]
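A small sketch against an in-memory SQLite database (toy schema and data invented for illustration) demonstrates that a WHERE clause of this shape filters out every row, even when a matching artist exists:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE artists (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE albums (id INTEGER PRIMARY KEY, artist_id INTEGER);
CREATE TABLE tracks (id INTEGER PRIMARY KEY, album_id INTEGER, composer TEXT);
INSERT INTO artists VALUES (1, 'Berlin Philharmonic');
INSERT INTO albums VALUES (10, 1);
INSERT INTO tracks VALUES (100, 10, 'Ludwig van Beethoven');
""")

# The WHERE clause requires artists.id to be IN a subquery AND NOT IN the
# very same subquery, so no row can ever satisfy both conditions at once.
rows = cur.execute("""
SELECT artists.id, artists.name FROM artists
WHERE artists.id IN (
    SELECT albums.artist_id FROM albums WHERE albums.id IN (
        SELECT tracks.album_id FROM tracks WHERE tracks.composer = ?
        AND tracks.album_id IS NOT NULL))
AND artists.id NOT IN (
    SELECT albums.artist_id FROM albums WHERE albums.id IN (
        SELECT tracks.album_id FROM tracks WHERE tracks.composer = ?
        AND tracks.album_id IS NOT NULL))
""", ("Ludwig van Beethoven", "Ludwig van Beethoven")).fetchall()

print(rows)  # [] although artist 1 matches the intended filter
```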

NaiveDateTime not converted correctly

As discussed on Gitter, I opened this issue. However, I couldn't manage to create a reproducible example. I uploaded my code to https://github.com/SirWindfield/wundergraph_mvp. Compiling with the wg feature enables wundergraph, and in fact it does compile there (unlike my other code, where it doesn't).
I'm not sure whether the difference lies in the fact that I only define a single model rather than the whole GraphQL schema, but the earlier compilation error didn't indicate that the problem originates from the schema-related part.

Edit:
The original model definition alongside the PostgreSQL table definition and the error:

#[derive(Clone, Debug, Identifiable, Queryable, WundergraphEntity)]
#[table_name = "pantry_items"]
pub struct PantryItem {
    pub id: i32,
    pub name: String,
    pub expires_at: chrono::naive::NaiveDateTime,
}

CREATE TABLE pantry_items(
    id INTEGER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY NOT NULL,
    name VARCHAR NOT NULL,
    expires_at TIMESTAMPTZ NOT NULL
);

type mismatch resolving `<diesel::sql_types::Timestamptz as diesel::sql_types::IntoNullable>::Nullable == diesel::sql_types::Nullable<diesel::sql_types::Timestamp>`
  --> src/db/pantry_items.rs:12:49
   |
12 | #[derive(Clone, Debug, Identifiable, Queryable, WundergraphEntity)]
   |                                                 ^^^^^^^^^^^^^^^^^ expected struct `diesel::sql_types::Timestamptz`, found struct `diesel::sql_types::Timestamp`
   |
   = note: expected struct `diesel::sql_types::Nullable<diesel::sql_types::Timestamptz>`
              found struct `diesel::sql_types::Nullable<diesel::sql_types::Timestamp>`
   = note: this error originates in a derive macro (in Nightly builds, run with -Z macro-backtrace for more info)
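The error boils down to a naive/aware mismatch: the derive appears to map chrono's NaiveDateTime to diesel's Timestamp, which carries no UTC offset, while the TIMESTAMPTZ column expects an offset-aware type (for example chrono::DateTime<Utc>). The distinction is the same one Python's stdlib draws between naive and aware datetimes (Python is used here only as an analogy, not as part of the project):

```python
from datetime import datetime, timezone

# A naive datetime has no offset attached, like diesel's Timestamp /
# chrono's NaiveDateTime.
naive = datetime(2020, 1, 1, 12, 0)

# An aware datetime carries an offset, which is what a TIMESTAMPTZ
# column requires, like chrono's DateTime<Utc>.
aware = naive.replace(tzinfo=timezone.utc)

print(naive.tzinfo)            # None
print(aware.utcoffset())       # 0:00:00
```

Under that reading, the usual fixes are to change the column to TIMESTAMP or to use an offset-aware field type in the model.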

Cargo build in workspace root complains about redundant wundergraph_example targets

I'm not sure whether this is a significant issue, but it could be.

wundergraph $ cargo build
warning: output filename collision.
The bin target `main` in package `wundergraph_example v0.1.0 (/home/auser/Documents/devel/rust/wundergraph/wundergraph_example)` has the same output filename as the bin target `main` in package `wundergraph_bench v0.1.0 (/home/auser/Documents/devel/rust/wundergraph/wundergraph_bench)`.
Colliding filename is: /home/auser/Documents/devel/rust/wundergraph/target/debug/main
The targets should have unique names.
Consider changing their names to be unique or compiling them separately.
This may become a hard error in the future; see <https://github.com/rust-lang/cargo/issues/6313>.
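Cargo names the output file after the bin target, so two workspace members that each expose a bin called `main` collide in the shared target/ directory. One possible fix (a sketch; the exact names are up to the project) is to give each binary a unique target name in its Cargo.toml:

```toml
# wundergraph_example/Cargo.toml (hypothetical snippet)
[[bin]]
name = "wundergraph_example"  # was "main"; now unique within the workspace
path = "src/main.rs"
```

The same rename would be applied to wundergraph_bench, after which `cargo build` in the workspace root produces distinctly named executables.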

build is broken

It's currently not possible to build the project using Rust 1.49. Both master and 0.1.2 are broken.

(screenshot of the compilation error omitted)
