
Lacinia


Lacinia Manual | Lacinia Tutorial | API Documentation

This library is a full implementation of Facebook's GraphQL specification.

Lacinia should be viewed as roughly analogous to the official reference JavaScript implementation. In other words, it is a backend-agnostic GraphQL query execution engine. Lacinia is not an Object Relational Mapper ... it's simply the implementation of a contract sitting between the GraphQL client and your data.

Lacinia features:

  • An EDN-based schema language, or use GraphQL's Interface Definition Language.

  • A high-performance parser for GraphQL queries, built on Antlr4.

  • Efficient and asynchronous query execution.

  • Full support for GraphQL types, interfaces, unions, enums, input objects, and custom scalars.

  • Full support for GraphQL subscriptions.

  • Full support for inline and named query fragments.

  • Full support for GraphQL Schema Introspection.

Lacinia has been developed with a set of core philosophies:

  • Prefer data over macros and other tricks: Compose your schema in whatever mix of data and code works for you.

  • Embrace Clojure: Use EDN data, keywords, functions, and persistent data structures.

  • Keep it simple: You provide the schema and a handful of functions to resolve data, and Lacinia does the rest.

  • Do the right thing: apply reasonable defaults without a lot of "magic".

This library can be plugged into any Clojure HTTP pipeline. The companion library lacinia-pedestal provides full HTTP support, including GraphQL subscriptions, for Pedestal.

An externally developed library, duct-lacinia, provides similar capability for Duct.

Getting Started

For more detailed documentation, read the manual.

GraphQL starts with a schema definition of types that can be queried.

A schema starts as an EDN file; the example below demonstrates a small subset of the available options:

{:enums
 {:Episode
  {:description "The episodes of the original Star Wars trilogy."
   :values [:NEWHOPE :EMPIRE :JEDI]}}

 :objects
 {:Droid
  {:fields {:id {:type Int}
            :primaryFunctions {:type (list String)}
            :name {:type String}
            :appearsIn {:type (list :Episode)}}}

  :Human
  {:fields {:id {:type Int}
            :name {:type String}
            :homePlanet {:type String}
            :appearsIn {:type (list :Episode)}}}
  :Query
  {:fields {:hero {:type (non-null :Human)
                   :args {:episode {:type :Episode}}}
            :droid {:type :Droid
                    :args {:id {:type String 
                                :default-value "2001"}}}}}}}

The fields of the special Query object define the query operations available; with this schema, a client can find the Human hero of an episode, or find a Droid by its id.

A schema alone describes what data is available to clients, but doesn't identify where the data comes from; that's the job of a field resolver.

A field resolver is just a function which is passed the application context, a map of argument values, and a resolved value from a parent field. The field resolver returns a value consistent with the type of the field; most field resolvers return a Clojure map or record, or a list of those. Lacinia then uses the GraphQL query to select fields of that value to return in the response.

Here's what a very opinionated get-hero field resolver might look like:

(defn get-hero 
  [context arguments value]
  (let [{:keys [episode]} arguments]
    (if (= episode :NEWHOPE)
      {:id 1000
       :name "Luke"
       :homePlanet "Tatooine"
       :appearsIn ["NEWHOPE" "EMPIRE" "JEDI"]}
      {:id 2000
       :name "Lando Calrissian"
       :homePlanet "Socorro"
       :appearsIn ["EMPIRE" "JEDI"]})))

In this greatly simplified example, the field resolver can simply return the resolved value. Field resolvers that return multiple values return a list, vector, or set of values.

In real applications, a field resolver might execute a query against a database, or send a request to another web service.

After injecting resolvers, it is necessary to compile the schema; this step performs validations, provides defaults, and organizes the schema for efficient execution of queries.

This needs only be done once, in application startup code:

(require '[clojure.edn :as edn]
         '[com.walmartlabs.lacinia.util :refer [inject-resolvers]]
         '[com.walmartlabs.lacinia.schema :as schema])

(def star-wars-schema
  (-> "schema.edn"
      slurp
      edn/read-string
      (inject-resolvers {:Query/hero get-hero
                         :Query/droid (constantly {})})
      schema/compile))

With the compiled schema available, it can be used to execute requests; this typically occurs inside a Ring handler function:

(require '[com.walmartlabs.lacinia :refer [execute]]
         '[clojure.data.json :as json])

(defn handler [request]
  {:status 200
   :headers {"Content-Type" "application/json"}
   :body (let [query (get-in request [:query-params :query])
               result (execute star-wars-schema query nil nil)]
           (json/write-str result))})

Lacinia doesn't know about the web tier at all; it just knows about parsing and executing queries against a compiled schema. A companion library, lacinia-pedestal, is one way to expose your schema on the web.

Clients will typically send a JSON POST request, with a query key containing the GraphQL query document:

{
  hero {
    id
    name
  }
}
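
Wrapped in the JSON envelope, the POST body would look roughly like this (whitespace inside the query string is insignificant):

{"query": "{ hero { id name } }"}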

The execute function returns EDN data that can be easily converted to JSON. The :data key contains the value requested for the hero query in the request.

{:data
  {:hero {:id 2000
          :name "Lando Calrissian"}}}

This example request has no errors and contains only a single query (GraphQL supports multiple queries in a single request). If there are errors executing a query, Lacinia will process as much as it can, and will report errors under the :errors key.
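
For example, a field that fails to resolve might be reported roughly as follows; the message text here is illustrative, and the exact keys vary by version:

{:errors [{:message "Exception resolving field."
           :locations [{:line 1, :column 3}]
           :query-path [:hero]}]}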

One of the benefits of GraphQL is that the client has the power to rename fields in the response:

{
  hero(episode: NEWHOPE) {
    movies: appearsIn
  }
}

{:data {:hero {:movies [:NEWHOPE :EMPIRE :JEDI]}}}

This is just an overview, far more detail is available in the manual.

Status

This library has been used in production at Walmart since 2017, going through a very long beta period as it evolved; we transitioned to a 1.0 release on 9 Oct 2021.

To use this library with Clojure 1.8, you must include a dependency on clojure-future-spec.
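
In Leiningen terms, that means something like the following (the versions shown are illustrative, not a tested combination):

:dependencies [[org.clojure/clojure "1.8.0"]
               [clojure-future-spec "1.9.0-alpha17"]
               [com.walmartlabs/lacinia "0.25.0"]]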

More details are in the manual.

License

Copyright © 2017-2023 WalmartLabs

Distributed under the Apache License, Version 2.0.


Issues

Naming an argument 'query' results in exception when parsing the GraphQL query

When a query accepts an argument with the name query, the query document cannot be parsed.

Minimal Example to reproduce the error

This test has been run on 0.19.0

Working

(def good-schema
  (com.walmartlabs.lacinia.schema/compile
    {:objects {:QueryResult {:fields {:result {:type 'String}}}}
     :queries {:someQuery {:type    :QueryResult
                           :args    {:q {:type 'String}}
                           :resolve (fn [ctx args v]
                                      {:result (:q args)})}}}))

(com.walmartlabs.lacinia.parser/parse-query good-schema "query DoQuery {someQuery(q: \"lala\") {result}}")
;; => no exception

Not Working

(def bad-schema
  (com.walmartlabs.lacinia.schema/compile
    {:objects {:QueryResult {:fields {:result {:type 'String}}}}
     :queries {:someQuery {:type    :QueryResult
                           :args    {:query {:type 'String}}
                           :resolve (fn [ctx args v]
                                      {:result (:query args)})}}}))

(com.walmartlabs.lacinia.parser/parse-query bad-schema "query DoQuery {someQuery(query: \"lala\") {result}}")

The only difference between good-schema and bad-schema is the name of the argument q versus query.

GraphiQL thinks it's OK and, as far as I understand, the spec says that the keys of Argument have to be of type Name, which doesn't forbid the specific string query.

Idea: replace named fragments with inline fragments

It occurred to me that, as part of the prepare stage, named fragments could be looked up in the parsed query and converted to inline fragments for execution. This might have some positive performance impact.

Introspection query returns empty list for Directives

When I run this introspection query, I receive an empty collection for :directives, whereas I expected the two supported directives (skip and include) to be returned, as such:

{"directives"
   [{"name" "skip",
     "description" nil,
     "locations" ["INLINE_FRAGMENT" "FIELD" "FRAGMENT_SPREAD"],
     "args"
     [{"name" "if",
       "description" nil,
       "type"
       {"kind" "NON_NULL",
        "name" nil,
        "ofType" {"kind" "SCALAR", "name" "Boolean", "ofType" nil}},
       "defaultValue" nil}]}
    {"name" "include",
     "description" nil,
     "locations" ["INLINE_FRAGMENT" "FIELD" "FRAGMENT_SPREAD"],
     "args"
     [{"name" "if",
       "description" nil,
       "type"
       {"kind" "NON_NULL",
        "name" nil,
        "ofType" {"kind" "SCALAR", "name" "Boolean", "ofType" nil}},
       "defaultValue" nil}]}]}

Dealing with namespaced keywords

I'm trying to retrofit an existing REST endpoint (which returned EDN) to GraphQL using Lacinia.
The response contains a lot of namespaced keywords. Does Lacinia have support for querying these?
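
One possible approach is sketched below, using the :default-field-resolver option to schema/compile (the same hook discussed in later issues here); the "user" namespace and my-schema are purely illustrative:

(require '[com.walmartlabs.lacinia.schema :as schema])

;; Sketch: resolve each schema field from a namespaced keyword in the
;; source data. The "user" namespace is illustrative only.
(defn namespaced-field-resolver
  [field-name]
  (fn [_context _args value]
    (get value (keyword "user" (name field-name)))))

(def compiled-schema
  (schema/compile my-schema
                  {:default-field-resolver namespaced-field-resolver}))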

Use metadata for tagging, whenever possible - use wrapper type only when metadata not supported

Problem: If the resolver returns a TaggedValue, any transformation we want to do in a decorator requires using the fns in com.walmartlabs.lacinia.internal-utils to extract the original value and its tag (to rebuild the TaggedValue later). Something like the fn below solves that use case:

(defn map-tagged-value [f result]
  (let [tag   (lacinia.i-utils/extract-type-tag result)
        value (lacinia.i-utils/extract-value result)]
    (lacinia.schema/tag-with-type (f value) tag)))

Custom scalar :parse is never called

When defining a custom scalar that will be sent as an argument, the :parse function is never called, but :serialize is. The example below returns an error, but wouldn't if :parse had been called:

(require
  '[clojure.spec :as s]
  '[com.walmartlabs.lacinia :as g]
  '[com.walmartlabs.lacinia.schema :as schema])

(import
  java.text.SimpleDateFormat
  java.util.Date)

(def date-formatter
  "Used by custom scalar :Date."
  (SimpleDateFormat. "yyyy-MM-dd"))

(defn parse-fn [input]
  (prn "PARSE NEVER CALLED!")
  (.parse date-formatter input))

(defn serialize-fn [output]
  (prn "SERIALIZE CALLED AND WILL THROW BECAUSE NEVER PARSED!")
  (.format date-formatter output))

(def schema
  (schema/compile
    {:scalars {
               :Date {:parse (s/conformer parse-fn)
                      :serialize (s/conformer serialize-fn)}}

     :queries {
               :today {:type :Date
                       :args {:asOf {:type :Date}}
                       :resolve (fn [ctx args v] (:asOf args))}}}))

(g/execute schema "query($asOf: Date!){today(asOf: $asOf)}" {:asOf "2017-04-05"} nil)

Type system is insufficient - list and not-null cannot be freely combined

The current system is quite limited: you can add a not-null qualifier, meaning that the value may not be nil. You can add a list qualifier, meaning that the resolver returns a seq of values, not a single one. You can combine the two, meaning that the values may not be nil (though the list itself may be).

This is a subset of what the GraphQL spec calls for; we should be able to say that a field's type is a non-null list of lists of non-null Strings, for example.

That means the list and not-null qualifiers need to be fully nestable, with some system of reified types to make it all work; a somewhat fundamental change to how the executor operates.

null collapsing issue

As per discussion on Slack, we sometimes get com.walmartlabs.lacinia.executor/null exposed in results; this appears to be caused by faulty null collapsing.

{
  "data": {
    "users": [
      {
        "id": "com.walmartlabs.lacinia.executor/null"
      }
    ]
  },
  "errors": [
    {
      "message": "Non-nullable field was null.",
      "locations": [
        {
          "line": 2,
          "column": 53
        }
      ],
      "query-path": [
        "users",
        "id"
      ]
    }
  ]
}

subscriptionType support

When I try to use graphql-ide it fails with this error message:

{"errors":[{"message":"Cannot query field `subscriptionType' on type `__Schema'.","query-path":["__schema"],"locations":[{"line":3,"column":13}],"field":"subscriptionType","type":"__Schema"}]}

Any ideas?

The :resolve key should be required for all queries and mutations

Because Lacinia is forgiving of missing keys in the spec, and elsewhere, it is too easy to add a key, :resolver (instead of :resolve) and have a query (or mutation) simply not get invoked.

For fields of objects, it often makes sense that the default field resolver is sufficient, but for QueryRoot and MutationRoot, each field should be required to have a resolver.

I stumbled on this working out why a simple query (in a test schema) returned nil ... my investigation would have been shorter with a schema compile error along the lines of "QueryRoot/node field does not have a :resolve key."

Fields are not merged

This query:

query { 
  viewer { 
    login
  }
  viewer {
    databaseId
  }
}

Should return something like:

{:data {:viewer {:login "someone" :databaseId "qwertyuiop"}}}

Instead, it returns only the databaseId. If we query for the login last, it returns just the login.

Field/argument documentation inheritance

It would be very nice if the documentation attached to a field in an interface would become a default for the documentation of an object implementing that interface, if the object field does not provide its own documentation. Likewise for descriptions of arguments to that field, if any.

Basically, I would like to document my fields in the interfaces, and not have to recapitulate that documentation in the objects.

Scalar :serialize conformer invoked for nil value [0.15.0]

A custom :serialize for a scalar is now being invoked when the value is nil; this appears (though I am not certain) to be a change in behavior from 0.13.0 to 0.15.0. I am not sure which behavior is correct with respect to the spec.

Add ability to decorate resolver functions

A certain useful class of problems can be solved by passing field resolver functions through a set of middleware before being employed in a schema; this would allow for such things as authentication/authorization checks, implicit logging, tracing, and other common use-cases.

I would suggest a signature as follows:

 (middleware object-name field-name f) => f'

The object-name and field-name are keywords. The function is as provided in the input schema, and the result is the same or a decorated resolver function.
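
For illustration, here is a sketch of such a middleware under the suggested signature; the use of clojure.tools.logging is an assumption for the example:

(require '[clojure.tools.logging :as log])

;; Hypothetical middleware matching the suggested signature: wraps a
;; field resolver to log the elapsed time of each invocation.
(defn timing-middleware
  [object-name field-name f]
  (fn [context args value]
    (let [start (System/nanoTime)
          result (f context args value)]
      (log/debugf "Resolved %s/%s in %.2f ms"
                  (name object-name) (name field-name)
                  (/ (- (System/nanoTime) start) 1e6))
      result)))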

Subscription support

We're getting closer to adding subscription support, which has, or is about to, hit the official spec.

Here's my initial thoughts on implementing subscriptions:

For subscription operations, there are two callbacks: the traditional field resolver, but also a subscriber function, which I'm tentatively calling stream.

The stream function is passed the application context, the operation's arguments, and an event function.

The stream function has two responsibilities:

  • Provide resolved values to the event function
  • Return a cleanup callback

The subscriber is responsible for invoking the event function when there's new data to send to the subscribed client.

In other words, the stream function is responsible for repeatedly invoking the event function with a succession of values, as they become available. Presumably, there will be a started thread, or core.async CSP, responsible for this.

The value passed to the event function becomes the resolved value passed to the field resolver (rather than the nil value normally passed to a top level operation).

After invoking the event function, the provided value will be passed to the field's resolver, much as with a traditional query operation.

There is no guarantee that the resolver will be invoked in the same thread. More likely, there will be some amount of asynchronous processing going on.

This continues until the client disconnects, or the event function is invoked with nil.

The stream function returns a cleanup function; this function is invoked after the subscription is canceled (whether it was closed by passing nil to the event function, by the client closing the connection, or by the loss of the connection to the client).
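
A sketch of what a stream function might look like under this design; the signature and the cleanup-callback convention are from this proposal, not a shipped API, and the event source is faked with a timestamp:

(defn clock-streamer
  [context args event-fn]
  (let [running? (atom true)]
    (future
      (while @running?
        (Thread/sleep 1000)
        ;; In a real streamer this value would come from an event source;
        ;; here we fabricate a timestamped value once a second.
        (event-fn {:at (System/currentTimeMillis)})))
    ;; Return the cleanup callback, invoked when the subscription is canceled.
    (fn [] (reset! running? false))))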

Input coercion rejects numeric literals for variables

When a query uses variables, some coercions do not accept numeric literal values:

{:errors [{:message "Scalar value is not parsable as type `Int'."
           :type-name :Int
           :value 42}]}

That is because the built-in Scalar types are oriented towards accepting strings derived from the parsed query document, but the same code is used to conform variable values.

The built-in scalars should be modified in accordance with the spec on this, particularly with how Int, Float, and Boolean are handled.

Documentation should be updated w.r.t. implementing custom scalars.

List types should allow a single value on input

Noticed this in the spec:

When expected as an input, list values are accepted only when each item in the list can be accepted by the list’s item type.

If the value passed as an input to a list type is not a list and not the null value, it should be coerced as though the input was a list of size one, where the value passed is the only item in the list. This is to allow inputs that accept a “var args” to declare their input type as a list; if only one argument is passed (a common case), the client can just pass that value rather than constructing the list.

Note that when a null value is provided via a runtime variable value for a list type, the value is interpreted as no list being provided, and not a list of size one with the value null.

I believe the code is currently more strict on this item, enforcing that lists passed as input values (e.g., field argument, field of input object, or variable) are in fact lists.
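
Concretely, under this rule the two queries below should be equivalent when the (hypothetical) episodes argument is declared with a list type:

{ hero(episodes: NEWHOPE) { name } }

{ hero(episodes: [NEWHOPE]) { name } }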

Fields with arguments are not checked against fields in interface

We have good checks in place that a field in an object uses the same, or compatible, types as the field in an interface from which the object inherits.

However, there is no check that the interface field and the object field have the same arguments.

The interface field and the object field should define the same set of arguments, with identical types (because argument types are limited, there's no question of unions, interfaces, or objects: just scalar types, enums, and input objects).

execute does not use the provided variables

lacinia/execute returns an error, "For argument 'foo', argument references undeclared variable 'abc'", even though the parameter map includes :abc.

  (require 
   '[com.walmartlabs.lacinia :as lacinia]
   '[com.walmartlabs.lacinia.schema :as lacinia-schema]
   '[com.walmartlabs.lacinia.util :as lacinia-util])

  (defn resolve-q-cars [app args _enc]
    [{:name "Green Baron"}])
  
  (def schema {:objects {:Car {:fields {:name {:type :String}}}}
               :queries {:cars {:type '(list :Car)
                                :args {:foo {:type :String}}
                                :resolve resolve-q-cars}}})

  (def c-schema (lacinia-schema/compile schema))

  ;; The following query, which uses a literal instead of a variable, works:
  (lacinia/execute c-schema
                   "query {cars(foo:\"eggplant\") {name}}"
                   {}
                   {})
  ;; {:data #ordered/map ([:cars (#ordered/map ([:name "Green Baron"]))])}

  ;; The following query, which uses a variable, does not work:
  (lacinia/execute c-schema
                   "query {cars(foo:$abc) {name}}"
                   {:abc "eggplant"}
                   {})
  ;; Exception applying arguments to field `cars': For argument `foo', argument references undeclared variable `abc'.
  ;; (Empty list of declared variables)

application/graphql not supported despite what README says

In the README it clearly states,

User queries are provided as the body of a request with the content type application/graphql. It looks a lot like JSON.

However, this wasn't the case, and it actually causes a validation failure at lacinia#L85 when you set up the example to use the application/graphql MIME type.

Was the README bit meant for lacinia-pedestal? Very confusing.

If I am missing something with the implementation please let me know.

Thanks in advance!

Async Processing - Design Overview

Added here to collect comments.

We already have the resolve-as function, and the related ResolverResult. These are special values that can be returned by a resolver function (in 0.13.0); they are used to allow a return value (or seq of values) along with an error map (or seq of error maps).

Asynchronous field resolvers would return a kind of deferred ResolverResult, and would later (in a different thread) provide the resolved value and/or error map(s).

My proposal is to extend the ResolverResult protocol to support asynchronous behavior.

A new function, when-ready!, will be added; its argument is a data callback function invoked when the resolved value and errors are available. The data callback is passed the tuple's value and errors, and the return value from the callback is ignored. when-ready! is stateful: it may only be invoked once.

For ResolverResultImpl, when-ready! can immediately invoke the callback.

A new protocol, DeferredResolverResult, and method, resolve-async!, will be introduced; resolve-async! is the equivalent of resolve-as, as applied to deferred ResolverResult instances. Invoking this method provides the value and/or errors, and invokes the data callback. This method is stateful and may only be invoked once. A new function, resolve-deferred, returns an instance that implements ResolverResult and DeferredResolverResult.

For a DeferredResolverResult, the data callback will typically not be invoked immediately; instead, it will be invoked from a thread, started by the resolver function, that invokes the resolve-async! method.

The resolved-value and resolved-errors methods on a DeferredResolverResult will block until data is available (via resolve-async!).

So, basically, there's going to be a whole lot of when-ready! callbacks that accept a resolved value (and/or errors) and turn around, process or combine those values, and then invoke resolve-async! on a different instance.
I can definitely see some additional functions or macros to make that easier.
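
For example, here is a sketch in terms of the proposed API (when-ready!, resolve-deferred, and resolve-async! are the tentative names above, not shipped functions) that combines two asynchronous results into a third:

(defn combine-results
  [f result-a result-b]
  (let [combined (resolve-deferred)]
    ;; Each when-ready! fires once, when its result's value and errors
    ;; become available; the innermost callback resolves the combined result.
    (when-ready! result-a
                 (fn [value-a errors-a]
                   (when-ready! result-b
                                (fn [value-b errors-b]
                                  (resolve-async! combined
                                                  (f value-a value-b)
                                                  (concat errors-a errors-b))))))
    combined))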

What I haven't quite puzzled out yet is timeouts; do we set just a global timeout? Do we just substitute nils in when there's a timeout, or fabricate a timeout error map? Do you specify timeouts with every when-ready!? Maybe an optional alternate callback for when a timeout occurs.

Or as Matt has opined: timeouts are managed by the resolver function, so do them right.

Maybe there's a default timeout time and timeout value and error, and this is in effect globally, but there's an on-timeout! method as well that lets it be overridden on an individual tuple?

Basically, you probably want a more generous timeout on the top level, and then smaller timeouts at the leaves, where field resolvers actually get invoked.

Suggestions on better names for protocols and methods are welcome.

I've also thought about a callback, provided as an option to c.w.l.schema/compile, that would be used to decorate resolver functions to adapt their return value, i.e., convert a core.async channel into a DeferredResolverResult.

Int type parses/coerces as (32 bit) int, not (64 bit) long

Large numeric values, stored as type long, are conformed to ints:

(defn ^:private coerce-to-int
  [v]
  (cond
    (number? v) (.intValue v)
    (string? v) (Integer/parseInt v)
    :else (throw (ex-info (str "Invalid Int value: " v) {:value v}))))

(defn ^:private coerce-to-float
  [v]
  (cond
    (number? v) (double v)
    (string? v) (Double/parseDouble v)
    :else (throw (ex-info (str "Invalid Float value: " v) {:value v}))))

(def default-scalar-transformers
  {:String {:parse (conformer str)
            :serialize (conformer str)}
   :Float {:parse (conformer #(Double/parseDouble %))
           :serialize (conformer coerce-to-float)}
   :Int {:parse (conformer #(Integer/parseInt %))
         :serialize (conformer coerce-to-int)}
   :Boolean {:parse (conformer #(Boolean/parseBoolean %))
             :serialize (conformer #(Boolean/valueOf %))}
   :ID {:parse (conformer str)
        :serialize (conformer str)}})

This ends up truncating the values.

(.intValue 1461381425413)
=> 1092544773
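
Until this is addressed, one possible workaround is a custom scalar that preserves 64-bit values, written in the same conformer style as the built-ins above (a sketch only; error handling omitted, and in recent Clojure the namespace is clojure.spec.alpha):

(require '[clojure.spec.alpha :as s])

{:scalars
 {:Long {:parse (s/conformer #(if (string? %) (Long/parseLong %) (long %)))
         :serialize (s/conformer long)}}}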

Define a record type and mapping to schema type, as an alternative to tagging

Especially now that tagging involves a new wrapper type, rather than metadata, it's gotten a bit intrusive.

With a little more declaration in the schema, we could relate a GraphQL object type to a Clojure record type (or even Java class).

This would allow Lacinia to intuit the object type when dealing with a union or interface type.

Documentation for `execute` `variables` parameter is unclear

The documentation for the variables parameter of com.walmartlabs.lacinia/execute says

compile-time variables that can be referenced inside the query using the $variable-name production.

which is not very helpful. My coworker and I attempted to recover the expected value type information by navigating the passage of variables through the code, but after digging all the way down to com.walmartlabs.lacinia.parser/compute-arguments, we decided to open an issue instead.

It would be nice to see :pre/:post conditions or specs (or both), not only for this particular function but for the public functions this library exposes in general.

Document a solution for batched resolvers

I'm not yet a Clojure user (just started learning), so I'm not familiar with all the patterns available, but is there any guidance on how to batch up data-fetching resolvers to avoid the common "n+1" problem?

I look at every GraphQL server solution I come across to see whether it can solve this problem. Some servers (Sangria, Absinthe) provide native solutions, whilst others (graphql-js, graphql-ruby) have the problem solved in user-land via well-known libraries.

Which would it end up being for Lacinia/Clojure?

For a basic overview on the kind of request optimizations that should be possible, I've written an article about it: https://dev-blog.apollodata.com/optimizing-your-graphql-request-waterfalls-7c3f3360b051

Change enum values to be keywords, not strings

As of the current release, 0.14.0, enums are treated as special strings. However, enums must be names, which represent a subset of what's allowed in a Clojure keyword. In general, enums are backed by keyword values in the raw data ... why not just allow enums to be keywords that serialize to strings? I think this would make server-side coding with enums more straightforward.

Spec checking wrong for enum default values

Given that we have a schema like:

{:enums
 {:animal
  {:values [:DOG :CAT]}}

 :objects
 {:thing {:fields {:name {:type String}}}}

 :queries
 {:search_animal
  {:type (list :thing)
   :args {:animal_type {:type (list :animal)
                        ;; This could be either a keyword or a symbol in the
                        ;; list, but it is passed as a symbol to the resolver
                        ;; if we put a symbol in schema.edn.
                        :default-value [:DOG]}
          :another {:type :animal
                    ;; This must be a symbol to get past the spec check; it
                    ;; is passed as a keyword to the resolver.
                    :default-value DOG}}}}}

I think the default-value of (list :animal) should accept [:DOG], and the spec should check this.
I think the default-value of :type :animal should accept :DOG, and the spec should check this.

Error Messages -- NPE on fragment w/ undefined type

Howdy!

This isn't a bug, just a quality of life improvement on error messaging :)

If you query a fragment w/ an undefined type you get an NPE on this line:

(-> condition-type :type-name q)))))

The condition-type is nil:

fragment-type (get schema (:type m))

Can reproduce using the sample schema:

(parser/parse-query schema "query { human(id: \"1001\"){ ... on foo { name }}}")
;; CompilerException java.lang.NullPointerException

I would expect something similar as you've done for undefined fields:

(parser/parse-query schema "query { human(id: \"1001\"){ foo }}")
;; CompilerException clojure.lang.ExceptionInfo: Cannot query field `foo' on type `human'. {:query-path [:human], :locations [{:line 1, :column 25}], :field :foo, :type :human}

Thanks for releasing this -- I look forward to when you can accept contributions :)

Some internal functions are showing up in the documentation

A select number of functions, such as schema/floor-selector, are semi-private; they are tagged with ^:no-doc and are considered internal, but are public for reuse across namespaces. These functions should not appear in documentation ... yet it appears they are.

file:///Users/hlship/workspaces/walmart/lacinia/target/doc/com.walmartlabs.lacinia.schema.html#var-floor-selector

Unable to resolve symbol import with clojure 1.9

I am unable to depend on lacinia with clojure 1.9.0-x.

project.clj

(defproject server "0.1.0-SNAPSHOT"
  :description "Backend api for the snap-battle system."
  :url "http://api.snapbattle.com"
  :license {:name "Eclipse Public License"
            :url "http://www.eclipse.org/legal/epl-v10.html"}
  :dependencies [[org.clojure/clojure "1.9.0-alpha15"]
                 [com.walmartlabs/lacinia "0.15.0"]])

Stacktrace

Exception in thread "main" java.lang.ExceptionInInitializerError
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at clojure.lang.RT.classForName(RT.java:2183)
	at clojure.lang.RT.classForName(RT.java:2192)
	at clojure.lang.RT.loadClassForName(RT.java:2211)
	at clojure.lang.RT.load(RT.java:445)
	at clojure.lang.RT.load(RT.java:421)
	at clojure.lang.RT.doInit(RT.java:463)
	at clojure.lang.RT.<clinit>(RT.java:333)
	at clojure.main.<clinit>(main.java:20)
Caused by: java.lang.RuntimeException: Unable to resolve symbol: import in this context, compiling:(clojure/core_instant18.clj:11:1)
	at clojure.lang.Compiler.analyze(Compiler.java:6720)
	at clojure.lang.Compiler.analyze(Compiler.java:6657)
	at clojure.lang.Compiler$InvokeExpr.parse(Compiler.java:3767)
	at clojure.lang.Compiler.analyzeSeq(Compiler.java:6921)
	at clojure.lang.Compiler.analyze(Compiler.java:6701)
	at clojure.lang.Compiler.analyze(Compiler.java:6657)
	at clojure.lang.Compiler$BodyExpr$Parser.parse(Compiler.java:6030)
	at clojure.lang.Compiler$FnMethod.parse(Compiler.java:5407)
	at clojure.lang.Compiler$FnExpr.parse(Compiler.java:3973)
	at clojure.lang.Compiler.analyzeSeq(Compiler.java:6917)
	at clojure.lang.Compiler.analyze(Compiler.java:6701)
	at clojure.lang.Compiler.eval(Compiler.java:6975)
	at clojure.lang.Compiler.load(Compiler.java:7430)
	at clojure.lang.RT.loadResourceScript(RT.java:374)
	at clojure.lang.RT.loadResourceScript(RT.java:365)
	at clojure.lang.RT.load(RT.java:455)
	at clojure.lang.RT.load(RT.java:421)
	at clojure.core$load$fn__7831.invoke(core.clj:6008)
	at clojure.core$load.invokeStatic(core.clj:6007)
	at clojure.core$load.doInvoke(core.clj:5991)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$fn__9320.invokeStatic(core.clj:6664)
	at clojure.core$fn__9320.invoke(core.clj:6662)
	at clojure.core__init.load(Unknown Source)
	at clojure.core__init.<clinit>(Unknown Source)
	... 10 more
Caused by: java.lang.RuntimeException: Unable to resolve symbol: import in this context
	at clojure.lang.Util.runtimeException(Util.java:221)
	at clojure.lang.Compiler.resolveIn(Compiler.java:7215)
	at clojure.lang.Compiler.resolve(Compiler.java:7159)
	at clojure.lang.Compiler.analyzeSymbol(Compiler.java:7120)
	at clojure.lang.Compiler.analyze(Compiler.java:6680)
	... 34 more
Subprocess failed

Can't start with Clojure 1.9

Hello, I tried to start a new project with Lacinia and Clojure 1.9; when trying to run the REPL I always get this error:

Exception in thread "main" java.lang.ExceptionInInitializerError
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at clojure.lang.RT.classForName(RT.java:2183)
	at clojure.lang.RT.classForName(RT.java:2192)
	at clojure.lang.RT.loadClassForName(RT.java:2211)
	at clojure.lang.RT.load(RT.java:445)
	at clojure.lang.RT.load(RT.java:421)
	at clojure.lang.RT.doInit(RT.java:463)
	at clojure.lang.RT.<clinit>(RT.java:333)
	at clojure.main.<clinit>(main.java:20)
Caused by: java.lang.RuntimeException: Unable to resolve symbol: import in this context, compiling:(clojure/core_instant18.clj:11:1)
	at clojure.lang.Compiler.analyze(Compiler.java:6720)
	at clojure.lang.Compiler.analyze(Compiler.java:6657)
	at clojure.lang.Compiler$InvokeExpr.parse(Compiler.java:3767)
	at clojure.lang.Compiler.analyzeSeq(Compiler.java:6921)
	at clojure.lang.Compiler.analyze(Compiler.java:6701)
	at clojure.lang.Compiler.analyze(Compiler.java:6657)
	at clojure.lang.Compiler$BodyExpr$Parser.parse(Compiler.java:6030)
	at clojure.lang.Compiler$FnMethod.parse(Compiler.java:5407)
	at clojure.lang.Compiler$FnExpr.parse(Compiler.java:3973)
	at clojure.lang.Compiler.analyzeSeq(Compiler.java:6917)
	at clojure.lang.Compiler.analyze(Compiler.java:6701)
	at clojure.lang.Compiler.eval(Compiler.java:6975)
	at clojure.lang.Compiler.load(Compiler.java:7430)
	at clojure.lang.RT.loadResourceScript(RT.java:374)
	at clojure.lang.RT.loadResourceScript(RT.java:365)
	at clojure.lang.RT.load(RT.java:455)
	at clojure.lang.RT.load(RT.java:421)
	at clojure.core$load$fn__7831.invoke(core.clj:6008)
	at clojure.core$load.invokeStatic(core.clj:6007)
	at clojure.core$load.doInvoke(core.clj:5991)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$fn__9320.invokeStatic(core.clj:6664)
	at clojure.core$fn__9320.invoke(core.clj:6662)
	at clojure.core__init.load(Unknown Source)
	at clojure.core__init.<clinit>(Unknown Source)
	... 10 more
Caused by: java.lang.RuntimeException: Unable to resolve symbol: import in this context
	at clojure.lang.Util.runtimeException(Util.java:221)
	at clojure.lang.Compiler.resolveIn(Compiler.java:7215)
	at clojure.lang.Compiler.resolve(Compiler.java:7159)
	at clojure.lang.Compiler.analyzeSymbol(Compiler.java:7120)
	at clojure.lang.Compiler.analyze(Compiler.java:6680)
	... 34 more
Exception in thread "Thread-3" clojure.lang.ExceptionInfo: Subprocess failed {:exit-code 1}
	at clojure.core$ex_info.invokeStatic(core.clj:4617)
	at clojure.core$ex_info.invoke(core.clj:4617)
	at leiningen.core.eval$fn__5732.invokeStatic(eval.clj:264)
	at leiningen.core.eval$fn__5732.invoke(eval.clj:260)
	at clojure.lang.MultiFn.invoke(MultiFn.java:233)
	at leiningen.core.eval$eval_in_project.invokeStatic(eval.clj:366)
	at leiningen.core.eval$eval_in_project.invoke(eval.clj:356)
	at leiningen.repl$server$fn__11838.invoke(repl.clj:243)
	at clojure.lang.AFn.applyToHelper(AFn.java:152)
	at clojure.lang.AFn.applyTo(AFn.java:144)
	at clojure.core$apply.invokeStatic(core.clj:646)
	at clojure.core$with_bindings_STAR_.invokeStatic(core.clj:1881)
	at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1881)
	at clojure.lang.RestFn.invoke(RestFn.java:425)
	at clojure.lang.AFn.applyToHelper(AFn.java:156)
	at clojure.lang.RestFn.applyTo(RestFn.java:132)
	at clojure.core$apply.invokeStatic(core.clj:650)
	at clojure.core$bound_fn_STAR_$fn__4671.doInvoke(core.clj:1911)
	at clojure.lang.RestFn.invoke(RestFn.java:397)
	at clojure.lang.AFn.run(AFn.java:22)
	at java.lang.Thread.run(Thread.java:745)

I tried with Clojure 1.9.0-alpha14 and 1.9.0-alpha15.

It works fine with Clojure 1.8.0.

Exception thrown if a field resolver returns anything but an IObj

If a Lacinia resolver returns anything other than a clojure.lang.IObj, an exception is thrown, because somewhere in the call chain Lacinia tries to attach metadata to the returned value.

java.lang.ClassCastException: datomic.query.EntityMap cannot be cast to clojure.lang.IObj

This, however, is a bit awkward when working with plain Java return types, e.g. EntityMap from Datomic.

We have worked around this by defining a custom resolver function (needed anyway for namespaced keyword conversion).

(defn find-object-type
  [v]
  (some->> v
           (keys)
           (filter #(= "id" (name %)))
           (first)
           (namespace)))

(defn default-field-resolver
  "This is a function that accepts a field name (a keyword) and
  returns a field resolver function for the field. This includes
  transforming to datomic namespaced keywords and wrapping/unwrapping
  DatomicEntity objects."
  [field-name]
  ^ResolverResult (fn [ctx args v]
                    (let [value (if (instance? DatomicEntity v) ;; DatomicEntity is a record with only one field, the entity
                                  (get (.entity v) (keyword (find-object-type (.entity v)) (name field-name)))
                                  (get v field-name))]
                      (resolve-as
                        (if (instance? datomic.Entity value)
                          (wrappers/->DatomicEntity value)
                          value)))))

This is, however, only part of the solution, as each root query needs to wrap its Datomic entity results in the Clojure record.

We would also like our resolvers to return Manifold deferreds instead of a Lacinia promise, for code readability. This approach suffers the same issue as with a Datomic Entity, and combining the two becomes more and more of a problem.

Is there any infrastructure available to extend Lacinia's default resolver functionality (not sure if I'm using the right terminology here)? The following would be sufficient for this use case:

  1. a function that is applied before the returned resolver result is passed down to the type-tagging functionality; here you could wrap everything.
  2. the default resolver function that can unwrap results and get the values out for a given field (i.e. what you can specify now as ':default-field-resolver').

An example:

(defn default-field-resolver
  [field-name]
  ^ResolverResult (fn [ctx args v]
                    (resolve-as
                      (if (instance? DatomicEntity v)
                        (get (.entity v) (keyword (find-object-type (.entity v)) (name field-name)))
                        (get v field-name)))))

(defn default-resolver-middleware
  [resolver]
  (fn [ctx args v]
    (let [result (resolver ctx args v)]
      (if (instance? datomic.Entity result)
        (wrappers/->DatomicEntity result)
        result))))

(lacinia.schema/compile
  {:default-field-resolver default-field-resolver
   :default-middleware     default-resolver-middleware})

Enums can be defined as symbols (or keywords) causing inscrutable failures

It is possible to define the :values of an enum as symbols; this doesn't give a compile-time or spec error, BUT at runtime, request enum values (as strings) will not compare against schema enum values (as symbols). This is further obscured when the resulting error is reported as JSON instead of EDN, as the symbols in the :enum-values key are converted to strings.

API for identifying operation(s) for request

Given a parsed query, it should be possible to identify the operations for the request.

Something like:

(query-operations parsed-query) => {:type :mutation :operations #{:add_employee :approve_vacation}}

This would be desirable as part of a web-tier pipeline, to enforce auth/auth before executing the query.

Exception catching is too aggressive

In the current implementation, any exception thrown by a field resolver is caught by execute, and only the message is passed through. I had to check out the repo and remove all the catch statements to find the issue in my code. Would it be possible to have an option to toggle the verbosity of the errors, or whether to throw them or not?

The error in my case was that the keys in the variables I was passing in were strings while Lacinia was expecting keywords, which led to NullPointerExceptions down the line.

caution when using custom scalars ... must be JSON encodable

Tracking down a problem with the latest upgrade. Because of a resolve failure, the arguments to a field are included in an error map; one value is a java.time.LocalDateTime. Cheshire throws an exception when streaming this as JSON, and the final result is a response with a body that is the empty string.

The right solution is to use cheshire.generate/add-encoder:

(require '[cheshire.generate :refer [add-encoder]])
(import 'java.time.LocalDateTime
        'com.fasterxml.jackson.core.JsonGenerator)

(add-encoder LocalDateTime
             (fn [v ^JsonGenerator json-generator]
               ;; format-local-date-time is an application-specific helper.
               (.writeString json-generator (format-local-date-time v))))

Accept clojure sets as valid graphql list type when returned from resolver

Right now it returns sets as strings.
Sets are quite common in Clojure APIs, and supporting them would be friendlier than having to walk our return data to convert sets to lists/vectors, or to define a custom scalar that reads the string back (lots of wasted cycles for something like this: set->string->set). Worth noting that this is something all (?) JSON marshaling libraries for Clojure do transparently too.

It seems mostly due to sequential? checks instead of (or (sequential? x) (set? x)); for example, here:

(defn ^:private enforce-single-result
  [tuple]
  (with-resolved-value [value tuple]
    (if-not (sequential? value)
      tuple
      (error "Field resolver returned a collection of values, expected only a single value."))))

(defn ^:private enforce-multiple-results
  [tuple]
  (with-resolved-value [value tuple]
    (if (sequential? value)
      tuple
      (error "Field resolver returned a single value, expected a collection of values."))))

Required vs optional query arguments

Is there a way to specify in the edn schema whether arguments to a query (or mutation or subscription) are required or optional? If not, could there be?

What's the currently preferred method of handling missing arguments?
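
For reference, argument types accept the same non-null qualifier used for field types elsewhere in this document, which marks the argument as required; a sketch:

{:queries
 {:droid
  {:type :Droid
   ;; non-null makes the id argument required; omit the qualifier for an
   ;; optional argument, and use :default-value to supply a fallback.
   :args {:id {:type '(non-null String)}}}}}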

Error when providing nested input object via variables

This is a minimal case to reproduce (v 0.16.0)

Schema

(def schema
  {:input-objects {:IDObject {:fields {:id {:type '(non-null String)}}}}
   :objects       {:Something  {:fields {:id         {:type '(non-null String)}
                                         :name       {:type 'String}
                                         :otherThing {:type :OtherThing}}}
                   :OtherThing {:fields {:id   {:type '(non-null String)}
                                         :name {:type 'String}}}}
   :mutations     {:CreateSomething
                   {:args {:name {:type 'String}
                           :otherThing {:type :IDObject}}
                    :type :Something}}})

Execute the query by filling in parameters directly

(com.walmartlabs.lacinia/execute (com.walmartlabs.lacinia.schema/compile schema)
                                 "mutation CreateSomething { CreateSomething(name: \"Test\", otherThing: {id: \"ID\"}) {id }}"
                                 nil
                                 nil)
=> {:data #ordered/map([:CreateSomething nil])}

Execute the query by providing variables

(com.walmartlabs.lacinia/execute (com.walmartlabs.lacinia.schema/compile schema)
                                 "mutation CreateSomething($name: String!, $otherThing: IDObject!) { CreateSomething(name: $name, otherThing: $otherThing) {id }}"
                                 {:name "Test"
                                  :otherThing {:id "ID"}}
                                 nil)
=> {:errors [{:message "Sanity check - no option in process-result."}]}

The error happens in com.walmartlabs.lacinia.parser/process-result because there is no matching clause in the cond for (= category :input-object).

Variable nil vs. missing variable for field argument

From the GraphQL spec, concerning null values

The same two methods of representing the lack of a value are possible via variables, by either providing a variable value as null or not providing a variable value at all.

I'm pretty sure that the implementation will pass a nil for the argument value if the argument is bound to a variable that is simply not provided in the query. If so, this must be changed.

Support operation-name in lacinia/execute

I would like to be able to call lacinia/execute with a query string and operation name rather than having to call parse-query explicitly. When testing, it's fairly common for me to set up a template query in GraphiQL with several named queries and mutations.

Schema compiler should enforce that field and entity names are properly formed

For example:
(execute my-schema "queryname { some-field-with-dashes }" nil nil)
returns the following error:
extraneous input '-' expecting {'(', '{', '}', '...', '@', Name}

Meanwhile, my schema allows me to have fields with dashes, i.e., this is only a query-time issue. I managed to track this down to file com\walmartlabs\lacinia\Graphql.g4, line 152:

Name
    : [_A-Za-z][_0-9A-Za-z]*
    ;

I think this rule should instead look like this:

Name
    : [_A-Za-z-][_0-9A-Za-z-]*
    ;

i.e. allow dash as part of field name.

API for identifying selections from application context

The application context passed to a field resolver function includes a key ::lacinia/selections, that identifies the field selections for the field.

We don't, and probably should not, publish the details of the structure of this value (as it is quite valuable to keep it private, opaque, and subject to change).

However, it exists so that field resolvers can preview the selections that may occur on the resolved value.

The canonical example of this is a resolver that performs a SQL database query, and may customize the request based on what is requested, for example, to optionally include a join only if the data from the join is needed.

I would suggest something like the following:

(selects-field? context :User/permissions) => true

This would scan the selections inside the context for any reference to the :permissions field of the :User object.

Another useful function might be:

(selections context) => #{:User/first_name :User/last_name :User/born_on :User/permissions :Permission/name}

I'm struggling with how this might work in the context of unions and interfaces; I would guess that the collection of these extended field names would be driven by fragments, and ultimately, the fragments would identify the specific object fields.

spec for GraphQL schema does not define :queries and :mutations

The spec definitions in com.walmartlabs.lacinia.schema do not provide any detail for the :queries and :mutations keys. These are (s/map-of keyword? :type/operation), where an operation is just about the same as a field, but :resolve should be required (not optional).
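
A sketch of the suggested additions (the spec names :type/field, :type/operation, and ::resolve mirror the description above and are illustrative, not the library's actual registry keys):

(require '[clojure.spec.alpha :as s])

;; An operation is a field whose :resolve key is required rather than optional.
(s/def :type/operation
  (s/merge :type/field
           (s/keys :req-un [::resolve])))

(s/def ::queries (s/map-of keyword? :type/operation))
(s/def ::mutations (s/map-of keyword? :type/operation))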
