
graphql-tools-fork's Introduction

student, programmer, radiologist, father

open-source GraphQL development sponsored primarily by ✨ the Guild

  • currently working 🔭 on a schema stitching executor with support for incremental delivery...
  • consider 🌱 sponsoring my work...
  • want to hear more? reach me at 📫 [email protected]

graphql-tools-fork's People

Contributors

abernix, bchensyd, benjamn, dxcx, enaqx, excitement-engineer, freiksenet, greenkeeper[bot], greenkeeperio-bot, helfer, hwillson, kamilkisiela, kbrandwijk, lucasconstantino, martijnwalraven, maticzav, nicolaslopezj, nwfortner, oricordeau, reconbot, renovate-bot, renovate[bot], rricard, sebastienbarre, slava, stubailo, tgriesser, trevorblades, vladshcherbin, yaacovcr


graphql-tools-fork's Issues

Enums on fragments are broken after schema transform

Given a union type where one of the members has an enum field, querying results in an error: GraphQLError: Expected a value of type "MyCustomEnum" but received: "EnumValue". Enums on fragments are broken after schema transforms. It probably also doesn't work for enum arrays.
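The failure mode can be illustrated without graphql-js at all: an enum's serializer maps internal values to external names, so serializing a value that is already in external form throws exactly the error above. This is a minimal sketch with a toy stand-in class, not the real GraphQLEnumType:

```typescript
// Toy model of enum serialization (ToyEnum is a simplified stand-in for
// GraphQLEnumType): serialize() maps internal values to external names and
// throws when handed a value that is already the external name.
type EnumConfig = { name: string; values: Record<string, { value: unknown }> };

class ToyEnum {
  private valueToName = new Map<unknown, string>();
  constructor(private config: EnumConfig) {
    for (const [name, { value }] of Object.entries(config.values)) {
      this.valueToName.set(value, name);
    }
  }
  serialize(value: unknown): string {
    const name = this.valueToName.get(value);
    if (name === undefined) {
      throw new Error(
        `Expected a value of type "${this.config.name}" but received: "${String(value)}"`
      );
    }
    return name;
  }
}

// Internal value 1 maps to external name "EnumValue".
const myEnum = new ToyEnum({ name: "MyCustomEnum", values: { EnumValue: { value: 1 } } });

const external = myEnum.serialize(1); // "EnumValue"

// Serializing a second time (as happens when a transform re-serializes a
// result that is already in external form) reproduces the reported error:
let doubleSerializeError = "";
try {
  myEnum.serialize(external);
} catch (e) {
  doubleSerializeError = (e as Error).message;
}
```

This suggests the transform pipeline is applying serialization to a value that has already been serialized once.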

Question: Schema Transform partial application of arguments

Hello!

We have a remote schema with a required argument lang on all root fields, we would like to set this value via a transform and remove it as an argument from the exposed API (or set the default value).

We were previously on version 7.2.3, where I achieved the above by using the following TransformRootFields transform to set the defaultValue:

import { fieldToFieldConfig } from "graphql-tools-fork/dist/stitching/schemaRecreation";

// ...

new TransformRootFields((_operation, name, field) => {
  // Set any root fields with a `lang` arg to default to the correct locale.
  const config = fieldToFieldConfig(field);

  if (config.args.lang !== undefined) {
    config.args.lang.defaultValue = locale.langtag.toLowerCase();
  }

  return { name, field: config };
})

But on upgrading to the latest version to resolve #43, fieldToFieldConfig has been removed. What's the best way to go about this without fieldToFieldConfig?
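One option, if no replacement helper is exported, is to build the field config by hand. This is a hypothetical minimal sketch: the Field/FieldConfig shapes below are simplified stand-ins, not the real graphql-js types, and fieldToFieldConfig here is a reimplementation, not the removed library function:

```typescript
// Hypothetical minimal replacement for the removed fieldToFieldConfig helper.
// Field and FieldConfig are simplified stand-ins for the graphql-js shapes.
type Arg = { name: string; type: string; defaultValue?: unknown };
type Field = { type: string; args: Arg[] };
type FieldConfig = {
  type: string;
  args: Record<string, { type: string; defaultValue?: unknown }>;
};

function fieldToFieldConfig(field: Field): FieldConfig {
  // Convert the args array into the keyed map that field configs use.
  const args: FieldConfig["args"] = {};
  for (const arg of field.args) {
    args[arg.name] = { type: arg.type, defaultValue: arg.defaultValue };
  }
  return { type: field.type, args };
}

// Usage mirroring the TransformRootFields callback above:
const field: Field = { type: "String", args: [{ name: "lang", type: "String" }] };
const config = fieldToFieldConfig(field);
if (config.args.lang !== undefined) {
  config.args.lang.defaultValue = "en-gb"; // stand-in for locale.langtag.toLowerCase()
}
```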

delegateToSchema return type

The return type of a call to delegateToSchema is somewhat inconsistent:

  1. If the operation returns a leaf type (enum or scalar) or list* of leaf types then it will correspond to the source schema internal value of that leaf type or list* of leaf types.
  2. If the result is an object type, then any descendant fields will still be in the external (serialized) form.
  3. If the result has __typename fields, even if renamed using the RenameTypes transform, they will correspond to the names within the source schema.
  4. If the WrapType, WrapFields, or HoistField transforms are used, the result will include objects with structures that match the proxying target schema, as unwrapping and dehoisting is done as part of resolution.
  5. If the result is a list of objects, and type merging is enabled for those objects and said merging is performed asynchronously, delegateToSchema will return an array of promises that will resolve to the merged objects rather than an array of merged objects.

This can be avoided if all of this return type processing is folded into the checkResultAndHandleErrors transform.

A solution might be to accept a custom checkResultAndHandleErrors transform that takes as its arguments all of the above schema-manipulation transforms (RenameTypes, WrapFields, etc.), so that only a single round of result parsing is required.
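As a stopgap for the asynchronous-merge case in point 5, a caller can normalize the result generically. This is a sketch under my own assumptions, not library code: normalizeListResult is a hypothetical helper built on Promise.all, which accepts plain values and promises alike:

```typescript
// Sketch: normalize a delegateToSchema list result that may contain promises
// (point 5 above) into a single promise of plain values. normalizeListResult
// is a hypothetical helper, not part of graphql-tools-fork.
function normalizeListResult<T>(result: Array<T | Promise<T>>): Promise<T[]> {
  // Promise.all passes through non-promise values unchanged and awaits the rest.
  return Promise.all(result);
}

// Example: a mix of an already merged object and a still-pending merge.
const mixed: Array<{ id: number } | Promise<{ id: number }>> = [
  { id: 1 },
  Promise.resolve({ id: 2 }),
];

const normalizedPromise = normalizeListResult(mixed);
normalizedPromise.then((values) => {
  // values is now a plain array of merged objects.
  console.log(values.map((v) => v.id));
});
```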

"‘Upload’ scalar serialization unsupported."

Hi,
I encounter an error when using the Upload scalar to upload files with mergeSchemas; this type is supported by Apollo Server.

"‘Upload’ scalar serialization unsupported."

I think I found the problem.
v5.2.0...v6.0.0
In the function serializeArgumentValue of the file AddArgumentsAsVariables.ts, you call

    if (nullableType instanceof graphql_1.GraphQLEnumType || nullableType instanceof graphql_1.GraphQLScalarType) {
        return nullableType.serialize(value);
    }

But here is the implementation of Upload scalar

export const GraphQLUpload = new GraphQLScalarType({
  name: 'Upload',
  description: 'The `Upload` scalar type represents a file upload.',
  parseValue: value => value,
  parseLiteral() {
    throw new Error('‘Upload’ scalar literal unsupported.')
  },
  serialize() {
    throw new Error('‘Upload’ scalar serialization unsupported.')
  }
})

So that's why the error is thrown.
It would be good if we could special-case the GraphQLUpload type.
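One possible shape for such a guard, sketched here with a toy stand-in type rather than the real GraphQLScalarType: fall back to passing the value through untouched whenever serialize() throws, which covers Upload and any other deliberately non-serializable scalar without hard-coding a type name:

```typescript
// Sketch of a guarded serializeArgumentValue. ToyScalar is a simplified
// stand-in for GraphQLScalarType; this is one possible fix, not the
// library's actual implementation.
type ToyScalar = { name: string; serialize: (value: unknown) => unknown };

function serializeArgumentValue(scalar: ToyScalar, value: unknown): unknown {
  try {
    return scalar.serialize(value);
  } catch {
    // Scalars like Upload throw "'Upload' scalar serialization unsupported."
    // on purpose -- keep the raw value instead of crashing.
    return value;
  }
}

const upload: ToyScalar = {
  name: "Upload",
  serialize() {
    throw new Error("'Upload' scalar serialization unsupported.");
  },
};

const file = { filename: "photo.png" };
const passedThrough = serializeArgumentValue(upload, file);

// Ordinary scalars still serialize normally:
const int: ToyScalar = { name: "Int", serialize: (v) => Number(v) };
const serialized = serializeArgumentValue(int, "3");
```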

Question: ApolloLink for remote schemas with cache?

Hey @yaacovCR

First of all, thanks for your fantastic work on graphql-tools-fork! This is truly impressive.

We are going to implement some sort of caching / query batching for gatsby-source-graphql. Before jumping into custom implementation I decided to ask you - maybe you already did something like Apollo Http Link for remote schemas but with cache?

Or maybe it is simply a bad idea?

I'll appreciate any thoughts/advice. Thanks!

Unable to mutate enum value through visitEnumValue

Just hit this issue while trying to implement a simple directive that would let you change the value of an enum value:

export class ValueDirective extends SchemaDirectiveVisitor {
  visitEnumValue(enumValue: GraphQLEnumValue): GraphQLEnumValue {
    return {
      ...enumValue,
      value: this.args.string,
    };
  }
}

The intent here is to have an SDL-only way to deal with "internal" values of enum values without having to utilize the resolver map.

The issue is that GraphQLEnumType utilizes several private properties that are derived from the initial set of values. I think the entire GraphQLEnumType would need to be recreated with an updated config whenever the visitor is called, or after all directives have run, but off-hand I'm not sure how feasible either approach is.
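The rebuild-from-config approach can be sketched without graphql-js. ToyEnumType below is a simplified stand-in for GraphQLEnumType, and withEnumValue is a hypothetical helper; the point is that because the lookup tables are derived at construction time, changing one value means copying the config and constructing a fresh type:

```typescript
// ToyEnumType is a simplified stand-in for GraphQLEnumType: its lookup map
// is derived from the config once, in the constructor, which is why mutating
// a returned GraphQLEnumValue has no effect on parsing.
type EnumValueConfig = { value: unknown };
type ToyEnumConfig = { name: string; values: Record<string, EnumValueConfig> };

class ToyEnumType {
  private nameToValue: Map<string, unknown>;
  constructor(public config: ToyEnumConfig) {
    this.nameToValue = new Map(
      Object.entries(config.values).map(([n, v]) => [n, v.value])
    );
  }
  parseValue(name: string): unknown {
    return this.nameToValue.get(name);
  }
}

// Hypothetical helper: copy the config, override one value, rebuild the type.
function withEnumValue(
  type: ToyEnumType,
  valueName: string,
  newValue: unknown
): ToyEnumType {
  const values = { ...type.config.values, [valueName]: { value: newValue } };
  return new ToyEnumType({ ...type.config, values });
}

const original = new ToyEnumType({ name: "Color", values: { RED: { value: "RED" } } });
const updated = withEnumValue(original, "RED", "#ff0000");
```

A visitEnumValue implementation would then need the schema machinery to swap the rebuilt type in wherever the old one was referenced, which is the part whose feasibility the issue questions.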

Transforms can lead to "empty" types breaking schema

Given:

  • ObjectType A
  • ObjectType B with a single Field of type A

When:

  • Schema transforms lead to ObjectType A being removed/hidden

Then:

  • ObjectType B becomes an empty type without any fields, making schema invalid

My current workaround is removing empty types through FilterTypes transform:

import { GraphQLObjectType } from "graphql";
import { FilterTypes } from "graphql-tools-fork";

new FilterTypes(type => {
  if (type instanceof GraphQLObjectType) {
    const fieldsMap = type.getFields();
    const fieldsCount = Object.keys(fieldsMap).length;
    return fieldsCount > 0;
  }
  return true;
})

The question is: should it be the responsibility of the FilterToSchema transform to clean up empty types as an additional step? It already hides queries and mutations that use types which were hidden, I think (or something like that).
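One subtlety with the workaround above: removing an empty type can make its parents empty in turn, so a single FilterTypes pass may not be enough. This is a sketch of a fixpoint cleanup over a toy schema representation (a map from type name to its fields' type names), not the library's data model:

```typescript
// Toy schema: type name -> (field name -> field type name). A real
// implementation would walk GraphQLSchema types instead.
type ToySchema = Record<string, Record<string, string>>;

function pruneEmptyTypes(schema: ToySchema): ToySchema {
  let current = schema;
  while (true) {
    // Find all types that currently have zero fields.
    const empty = new Set(
      Object.keys(current).filter((t) => Object.keys(current[t]).length === 0)
    );
    if (empty.size === 0) return current; // fixpoint reached
    const next: ToySchema = {};
    for (const [typeName, fields] of Object.entries(current)) {
      if (empty.has(typeName)) continue; // drop the empty type itself
      // Drop fields whose type was just removed; this may empty this type,
      // which the next iteration will then remove.
      next[typeName] = Object.fromEntries(
        Object.entries(fields).filter(([, t]) => !empty.has(t))
      );
    }
    current = next;
  }
}

// B's only field points at A; once A is hidden, B becomes empty and must go too.
const toySchema: ToySchema = { A: {}, B: { a: "A" }, Query: { b: "B", ok: "String" } };
const pruned = pruneEmptyTypes(toySchema);
```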

Multiple errors not merged correctly

I am having some problems with the error merging. Maybe this is intended behaviour, but it seems wrong.
It works fine when I query for a single type within the query, for example:
The schema:

type Me {
  basket: Basket
  order(id: ID!): Order
  customer: Customer
}
type Query {
  me: Me!
}

The query:

query {
  me {
    basket {
      id
    }
  }
}

The response looks perfect, correct error message, code, etc.

{
  "errors": [
    {
      "message": "The request has not been Authenticated",
      "locations": [
        {
          "line": 3,
          "column": 5
        }
      ],
      "path": [
        "me",
        "basket"
      ],
      "extensions": {
        "code": "UNAUTHENTICATED",
        "exception": {
          "message": "The request has not been Authenticated",
          "stacktrace": [
            "GraphQLError: The request has not been Authenticated",
            "    at Object.combineErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:101:16)",
            "    at Object.handleErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/checkResultAndHandleErrors.js:118:44)",
            "    at defaultMergedResolver (/var/task/node_modules/graphql-tools-fork/dist/stitching/defaultMergedResolver.js:23:63)",
            "    at resolveFieldValueOrError (/var/task/node_modules/graphql/execution/execute.js:467:18)",
            "    at resolveField (/var/task/node_modules/graphql/execution/execute.js:434:16)",
            "    at executeFields (/var/task/node_modules/graphql/execution/execute.js:275:18)",
            "    at collectAndExecuteSubfields (/var/task/node_modules/graphql/execution/execute.js:713:10)",
            "    at completeObjectValue (/var/task/node_modules/graphql/execution/execute.js:703:10)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:591:12)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:557:21)"
          ]
        }
      }
    }
  ],
  "data": {
    "me": {
      "basket": null
    }
  }
}

But when I add another field into the query:

query {
  me {
    basket {
      id
    }
    customer {
      id
    }
  }
}

I receive multiple errors with merged messages; the code changes to INTERNAL_SERVER_ERROR and the original errors with their codes move under exception.errors:

{
  "errors": [
    {
      "message": "The request has not been Authenticated\nThe request has not been Authenticated",
      "locations": [
        {
          "line": 3,
          "column": 5
        }
      ],
      "path": [
        "me",
        "basket"
      ],
      "extensions": {
        "code": "INTERNAL_SERVER_ERROR",
        "exception": {
          "errors": [
            {
              "message": "The request has not been Authenticated",
              "extensions": {
                "code": "UNAUTHENTICATED"
              }
            },
            {
              "message": "The request has not been Authenticated",
              "extensions": {
                "code": "UNAUTHENTICATED"
              }
            }
          ],
          "stacktrace": [
            "Error: The request has not been Authenticated",
            "The request has not been Authenticated",
            "    at new CombinedError (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:93:28)",
            "    at Object.combineErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:104:16)",
            "    at Object.handleErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/checkResultAndHandleErrors.js:118:44)",
            "    at defaultMergedResolver (/var/task/node_modules/graphql-tools-fork/dist/stitching/defaultMergedResolver.js:23:63)",
            "    at resolveFieldValueOrError (/var/task/node_modules/graphql/execution/execute.js:467:18)",
            "    at resolveField (/var/task/node_modules/graphql/execution/execute.js:434:16)",
            "    at executeFields (/var/task/node_modules/graphql/execution/execute.js:275:18)",
            "    at collectAndExecuteSubfields (/var/task/node_modules/graphql/execution/execute.js:713:10)",
            "    at completeObjectValue (/var/task/node_modules/graphql/execution/execute.js:703:10)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:591:12)"
          ]
        }
      }
    },
    {
      "message": "The request has not been Authenticated\nThe request has not been Authenticated",
      "locations": [
        {
          "line": 6,
          "column": 5
        }
      ],
      "path": [
        "me",
        "customer"
      ],
      "extensions": {
        "code": "INTERNAL_SERVER_ERROR",
        "exception": {
          "errors": [
            {
              "message": "The request has not been Authenticated",
              "extensions": {
                "code": "UNAUTHENTICATED"
              }
            },
            {
              "message": "The request has not been Authenticated",
              "extensions": {
                "code": "UNAUTHENTICATED"
              }
            }
          ],
          "stacktrace": [
            "Error: The request has not been Authenticated",
            "The request has not been Authenticated",
            "    at new CombinedError (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:93:28)",
            "    at Object.combineErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:104:16)",
            "    at Object.handleErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/checkResultAndHandleErrors.js:118:44)",
            "    at defaultMergedResolver (/var/task/node_modules/graphql-tools-fork/dist/stitching/defaultMergedResolver.js:23:63)",
            "    at resolveFieldValueOrError (/var/task/node_modules/graphql/execution/execute.js:467:18)",
            "    at resolveField (/var/task/node_modules/graphql/execution/execute.js:434:16)",
            "    at executeFields (/var/task/node_modules/graphql/execution/execute.js:275:18)",
            "    at collectAndExecuteSubfields (/var/task/node_modules/graphql/execution/execute.js:713:10)",
            "    at completeObjectValue (/var/task/node_modules/graphql/execution/execute.js:703:10)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:591:12)"
          ]
        }
      }
    }
  ],
  "data": {
    "me": {
      "basket": null,
      "customer": null
    }
  }
}

Instead of this, I would expect to receive something like:

{
  "errors": [
    {
      "message": "The request has not been Authenticated",
      "locations": [
        {
          "line": 3,
          "column": 5
        }
      ],
      "path": [
        "me",
        "basket"
      ],
      "extensions": {
        "code": "UNAUTHENTICATED",
        "exception": {
          "message": "The request has not been Authenticated",
          "stacktrace": [
            "Error: The request has not been Authenticated",
            "The request has not been Authenticated",
            "    at new CombinedError (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:93:28)",
            "    at Object.combineErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:104:16)",
            "    at Object.handleErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/checkResultAndHandleErrors.js:118:44)",
            "    at defaultMergedResolver (/var/task/node_modules/graphql-tools-fork/dist/stitching/defaultMergedResolver.js:23:63)",
            "    at resolveFieldValueOrError (/var/task/node_modules/graphql/execution/execute.js:467:18)",
            "    at resolveField (/var/task/node_modules/graphql/execution/execute.js:434:16)",
            "    at executeFields (/var/task/node_modules/graphql/execution/execute.js:275:18)",
            "    at collectAndExecuteSubfields (/var/task/node_modules/graphql/execution/execute.js:713:10)",
            "    at completeObjectValue (/var/task/node_modules/graphql/execution/execute.js:703:10)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:591:12)"
          ]
        }
      }
    },
    {
      "message": "The request has not been Authenticated",
      "locations": [
        {
          "line": 6,
          "column": 5
        }
      ],
      "path": [
        "me",
        "customer"
      ],
      "extensions": {
        "code": "UNAUTHENTICATED",
        "exception": {
          "message": "The request has not been Authenticated",
          "stacktrace": [
            "Error: The request has not been Authenticated",
            "The request has not been Authenticated",
            "    at new CombinedError (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:93:28)",
            "    at Object.combineErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/errors.js:104:16)",
            "    at Object.handleErrors (/var/task/node_modules/graphql-tools-fork/dist/stitching/checkResultAndHandleErrors.js:118:44)",
            "    at defaultMergedResolver (/var/task/node_modules/graphql-tools-fork/dist/stitching/defaultMergedResolver.js:23:63)",
            "    at resolveFieldValueOrError (/var/task/node_modules/graphql/execution/execute.js:467:18)",
            "    at resolveField (/var/task/node_modules/graphql/execution/execute.js:434:16)",
            "    at executeFields (/var/task/node_modules/graphql/execution/execute.js:275:18)",
            "    at collectAndExecuteSubfields (/var/task/node_modules/graphql/execution/execute.js:713:10)",
            "    at completeObjectValue (/var/task/node_modules/graphql/execution/execute.js:703:10)",
            "    at completeValue (/var/task/node_modules/graphql/execution/execute.js:591:12)"
          ]
        }
      }
    }
  ],
  "data": {
    "me": {
      "basket": null,
      "customer": null
    }
  }
}

Custom scalar args broken as query variables after schema transform

Given:

  • Custom ScalarType X with a class representing it internally
  • Query or Mutation with an argument of ScalarType X
  • Any schema transforms

When:

  • Required argument of ScalarType X is passed as query variable ($argScalar) instead of inline value inside the query

Then:

  • Weird stuff happens, see the reproduction and results

Minimalistic reproduction:

import { ApolloServer } from "apollo-server-express";
import { createTestClient } from "apollo-server-testing";
import { GraphQLScalarType, GraphQLSchema, Kind } from "graphql";
import { GraphQLResponse } from "graphql-extensions";
import { FilterRootFields, transformSchema } from "graphql-tools-fork";
import {
  Arg,
  buildSchemaSync,
  Query,
  Resolver
} from "type-graphql";

function createTestSchema(): GraphQLSchema {
  class MyCustomScalar {
    private readonly prefix = "Xxx-";

    constructor(private value: string) {}

    public encode(): string {
      return this.prefix + this.value;
    }

    public static parse(value: string): MyCustomScalar {
      if (typeof value === "string") return new MyCustomScalar(value.slice(4));
      console.log("-> inside .parse", typeof value, value);
      throw new Error("SOMETHING WENT WRONG! SHOULD NEVER HAPPEN!");
    }
  }

  const MyCustomScalarType = new GraphQLScalarType({
    name: "MyCustomScalar",
    parseValue: (value: string): MyCustomScalar => {
      console.log("-> inside .parseValue", typeof value, value);
      return MyCustomScalar.parse(value)
    },
    serialize: (value: MyCustomScalar): string => {
      console.log("-> inside .serialize");
      return value.encode()
    },
    parseLiteral: (ast): MyCustomScalar | undefined => {
      console.log("-> inside .parseLiteral");
      if (ast.kind === Kind.STRING) {
        return MyCustomScalar.parse(ast.value)
      }
      return undefined;
    }
  });

  @Resolver()
  class MyTestResolver {
    @Query()
    public queryWithScalarArg(@Arg("id") id: MyCustomScalar): MyCustomScalar {
      return id;
    }
  }

  return buildSchemaSync({
    resolvers: [MyTestResolver],
    scalarsMap: [
      {
        type: MyCustomScalar,
        scalar: MyCustomScalarType
      }
    ]
  });
}

function execute(schema: GraphQLSchema, query: string, variables?: { [name: string]: any }): Promise<GraphQLResponse> {
  const server = new ApolloServer({ schema });
  return createTestClient(server).query({ query, variables });
}

const schema = createTestSchema();
const transformedSchema = transformSchema(schema, [new FilterRootFields((_operation, _fieldName, _field) => true)]);

describe("schema transformation", () => {
  it("preserves custom scalars on arguments when variables are used", async () => {
    const scalarValue = "Xxx-scalarValue";
    const query = "query MyTestQuery($argScalar: MyCustomScalar!) { result: queryWithScalarArg(id: $argScalar) }";
    const variables = { argScalar: scalarValue };
    console.log("> executing query on original schema");
    const response = await execute(schema, query, variables);
    console.log("> executing query on transformed schema");
    const transformedResponse = await execute(transformedSchema, query, variables);

    expect(transformedResponse).toEqual(response);
    expect(transformedResponse.errors).toBeUndefined();
    expect(transformedResponse.data).toEqual({
      result: scalarValue
    });
  });
});

Expectation:

- Expected
+ Received

-   "data": Object {
-     "result": "Xxx-scalarValue",
-   },
-   "errors": undefined,
+   "data": null,
+   "errors": Array [
+     [GraphQLError: Variable "$argScalar" got invalid value { value: "scalarValue", prefix: "Xxx-" }; Expected type MyCustomScalar. SOMETHING WENT WRONG! SHOULD NEVER HAPPEN!],
+   ],

What is being logged:

    > executing query on original schema
    -> inside .parseValue string Xxx-scalarValue
    -> inside .serialize
    > executing query on transformed schema
    -> inside .parseValue string Xxx-scalarValue
    -> inside .parseValue object MyCustomScalar { value: 'scalarValue', prefix: 'Xxx-' }
    -> inside .parse object MyCustomScalar { value: 'scalarValue', prefix: 'Xxx-' }

Broken Import path in `transformSchema.ts`

After upgrading to the latest graphql-tools-fork version, I am getting the following error for the transformSchema import:

Ref for broken import - https://github.com/yaacovCR/graphql-tools-fork/blob/master/src/transforms/transformSchema.ts#L2

Cannot find module './utils/SchemaDirectiveVisitor' from 'makeExecutableSchema.js'
   'moduleFileExtensions', which is currently ['js', 'json', 'jsx', 'ts', 'tsx', 'node'].

graphql-tools-fork version - 6.4.2

After transform custom scalars are broken

import { createTestClient } from "apollo-server-testing";
import { GraphQLScalarType, Kind } from "graphql";
import { makeExecutableSchema } from "graphql-tools";
import { transformSchema } from "graphql-tools-fork";

import { GraphQLServer } from "@bootstrapping/apollo-server/graphql-server";

export const myCustomScalarType = new GraphQLScalarType({
  name: "MyCustomScalar",
  description: "Description of my custom scalar type",
  parseValue: (_value: string) => "parseValue: value",
  serialize: (value: string) => `serialize: ${value} ` + JSON.stringify(value),
  parseLiteral: ast => {
    if (ast.kind === Kind.STRING) {
      return ast.value;
    }
    return null;
  }
});

const schemaString = `

scalar MyCustomScalar

type Foo {
  aField: MyCustomScalar
}

type Query {
  foo: Foo
  myFoo2: MyCustomScalar
}

`;

const resolverFunctions = {
  MyCustomScalar: myCustomScalarType,
  Query: {
    foo: {
      aField: () => "myFoo"
    },
    myFoo2: () => {
      return "James";
    }
  }
};

const schema = makeExecutableSchema({
  typeDefs: schemaString,
  resolvers: resolverFunctions
});

const client = createTestClient(new GraphQLServer({ schema }));

const query = "query myFoo { foo { aField } myFoo2 }";
client.query({ query }).then(results => {
  console.log("client", results.data!.myFoo2);
});

const transformedSchema = transformSchema(schema, []);
const transformedClient = createTestClient(new GraphQLServer({ schema: transformedSchema }));

transformedClient.query({ query }).then(results => {
  console.log("transformedClient", results.data!.myFoo2);
});

After running this the output is:

client serialize: James "James"
transformedClient serialize: parseValue: value "parseValue: value"

so the value passed to serialize is different from what it was before the schema was transformed.

Schema delegation doesn't work with query variables in nested fields

When you use schema stitching combined with delegateToSchema and try to use query variables on a nested part of the delegated schema, the query breaks.

  • Intended outcome: query variables should work just fine even on a delegated schema.

  • Actual outcome: the query breaks with an error like: Variable "$foo" is not defined.

  • How to reproduce the issue:

The error can be reproduced by a simple modification of the test src/test/testStitchingFromSubschemas.ts. I'll report here a series of tests I tried to isolate the problem, just to show the specific case.

If you add a query variable in the test query, like this:

query($myId: ID!) {
  userById(id: $myId) {
    chirps {
      id
      textAlias: text
      author {
        email
      }
    }
  }
}
...
console.error(result.errors) // This is to see the error details

It breaks with the error Variable "$myId" of required type "ID!" was not provided. That is correct. If you then pass the variable value, like this:

const result = await graphql(mergedSchema, query, null, null, { myId: 5 });

It works correctly. The problem arises when the variable is used in a nested query. Add a parameter to a field, even not a required one, and then use it literally:

const chirpTypeDefs = `
  type Chirp {
    id: ID!
    text: String
    authorId: ID!
    author(foo: Boolean): User
  }
`;

...

query($myId: ID!) {
  userById(id: $myId) {
    chirps {
      id
      textAlias: text
      author(foo: true) {
        email
      }
    }
  }
}

This works as expected. But then pass the parameter value by a query variable:

query($myId: ID!, $foo: Boolean) {
  userById(id: $myId) {
    chirps {
      id
      textAlias: text
      author(foo: $foo) {
        email
      }
    }
  }
}

const result = await graphql(mergedSchema, query, null, null, { myId: 5, foo: true });

This produces the following error: Variable "$foo" is not defined. So, as shown before, this is not a problem with providing the value, since the error is different, nor with the parameter definition itself, since it works when passed literally.

In my understanding, this is caused by the mergeSchema operation mapping variableDefinitions incorrectly, thus losing the outer schema definitions.

I even managed to find the place in the code, but I'm not confident enough in this codebase to submit a pull request. Anyway, I found that the problem lies in the src/delegate/createRequest.ts file, when defining the newVariableDefinitions, which contain remapped values and also newly generated ones, but lose the provided definitions in this case. For example, if you change line 136 like this:

newVariableDefinitions = updatedVariableDefinitions.concat(variableDefinitions)

It works, and also doesn't break any existing tests. But it seems to be a naive solution, so I'd like the repository owner or someone else more familiar with the codebase to provide a more solid fix.

This is really important for my company, since we use this package to patch a GraphQL schema we don't control, and we would prefer not to fork it, since it seems to be a really trivial bug.
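One reason the plain concat may be considered naive: a variable defined in both lists would appear twice. A sketch of a slightly safer merge that deduplicates by variable name; VarDef is a simplified stand-in for graphql-js's VariableDefinitionNode, and mergeVariableDefinitions is a hypothetical helper, not code from the repository:

```typescript
// VarDef is a simplified stand-in for VariableDefinitionNode.
type VarDef = { name: string; type: string };

function mergeVariableDefinitions(
  updated: VarDef[],
  original: VarDef[]
): VarDef[] {
  // Keep every remapped/generated definition, then add back any original
  // definition (like $foo from the outer query) that would otherwise be
  // lost -- skipping names already present to avoid duplicates.
  const seen = new Set(updated.map((d) => d.name));
  return updated.concat(original.filter((d) => !seen.has(d.name)));
}

// $myId survives the remapping; $foo only exists in the outer definitions.
const updatedVariableDefinitions: VarDef[] = [{ name: "myId", type: "ID!" }];
const variableDefinitions: VarDef[] = [
  { name: "myId", type: "ID!" },
  { name: "foo", type: "Boolean" },
];
const merged = mergeVariableDefinitions(updatedVariableDefinitions, variableDefinitions);
```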

Overriding remote custom scalars/enums not working when renaming types

Currently, you can merge custom scalars/enums from a subschema and then provide new internal values such that the external values will be parsed into the internal ones prior to hitting your resolvers.

This is supported, although not necessarily a great idea, as the main purpose of internal values is for when you know you are going to be using an external API that expects these values to be of a different form -- see https://www.apollographql.com/docs/apollo-server/schema/scalars-enums/#internal-values -- and in this case the subschema is itself an external API that is expecting the values to be in the external form!

But, nevertheless, it is supportable, and supported. However, type renaming is currently not handled directly by the stitching logic, but by user-definable transforms, and there is no interaction between the transforms. The transform that performs the serialization of args prior to delegation (AddArgumentsAsVariables) and the transform that renames types (RenameTypes) therefore cannot communicate, so if the subschema has a type that is renamed, AddArgumentsAsVariables does not realize that it should use that type's serializer.

This will only hit users if they are using schemaConfig objects within mergeSchemas, since if you fully transform the schema using wrapSchema or transformSchema prior to merging, and get hit by that extra round of delegation, all types in the subschema will have the same names as in the merged schema, avoiding this problem.

Stitched schema errors are replaced with another one

I'm using graphql-tools-fork v7.2.3
NodeJS v12.8.1, tried also on v13.3.0

I have a following schema:

schema {
    mutation: Mutation
}

type Mutation {
    myOperation(input: ID!): ID!
}

When I'm resolving the mutation I get an error and the following response from the GraphQL server:

{
  "errors": [
    {
      "message": "Person does not exist",
      "locations": [],
      "path": [
        "myOperation",
        "persons",
        "0",
        "id"
      ],
      "extensions": {
        "code": "ValidationError",
        "severity": "ERROR",
      }
    }
  ],
  "data": null
}

Next, this response gets to the service that stitched the mentioned schema, and I receive the following response:

{
	"name": "GraphQLError",
	"message": "ID cannot represent value: { person: { 0: [Object] } }",
	"locations": [],
	"path": [
		"myOperation"
	],
	"originalError": {
		"stack": "TypeError: ID cannot represent value: { person: { 0: [Object] } }\n    at GraphQLScalarType.serializeID [as serialize] (<Path>\\node_modules\\graphql\\type\\scalars.js:223:9)\n    at completeLeafValue (<Path>\\node_modules\\graphql\\execution\\execute.js:635:37)\n    at completeValue (<Path>\\node_modules\\graphql\\execution\\execute.js:579:12)\n    at completeValue (<Path>\\node_modules\\graphql\\execution\\execute.js:557:21)\n    at <Path>\\node_modules\\graphql\\execution\\execute.js:492:16\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)",
		"message": "ID cannot represent value: { person: { 0: [Object] } }"
	},
	"extensions": {
		"code": "INTERNAL_SERVER_ERROR"
	}
}

The error was lost and replaced with a different error.
This might be related to the path in the error being:

[
        "myOperation",
        "persons",
        "0",
        "id"
]

But why does it try to go further if data is null in the first place?

Nesting entire root query

Hello. I need to transform a schema in the following manner:

Query { Foo: Something }

to

Query { Prefix: PrefixQuery }
PrefixQuery { Foo: PrefixSomething }

In other words to

  1. Prefix all types (with Prefix in example)
  2. Wrap/nest entire query in one object (QueryPrefixQuery)

Prefixing is easily done with the RenameTypes transform, but I can't figure out how to properly do the nesting (that is, in fact, prefixing the root types and creating a new root type with a single field).

Any suggestions on how to do this with existing tooling or how to make a transformer for this?
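The nesting step itself is mechanical, and can be sketched as pure data transformation (no graphql-js here; the field maps and nestRootQuery helper are illustrative, and a real transformer would build GraphQLObjectType instances and wire up delegating resolvers):

```typescript
// Toy representation: field name -> type name.
type Fields = Record<string, string>;

// Hypothetical helper: given the old root's fields, produce (1) a new root
// with a single field pointing at a wrapper type and (2) the wrapper type
// carrying the old fields under prefixed type names.
function nestRootQuery(rootFields: Fields, prefix: string) {
  // Step 1: prefix the types of all old root fields (RenameTypes territory).
  const prefixedFields: Fields = Object.fromEntries(
    Object.entries(rootFields).map(([f, t]) => [f, prefix + t])
  );
  // Step 2: the new root exposes one field whose type is the wrapper.
  return {
    newRoot: { [prefix]: `${prefix}Query` } as Fields,
    wrapperType: { name: `${prefix}Query`, fields: prefixedFields },
  };
}

// Query { Foo: Something }  =>  Query { Prefix: PrefixQuery },
//                               PrefixQuery { Foo: PrefixSomething }
const { newRoot, wrapperType } = nestRootQuery({ Foo: "Something" }, "Prefix");
```

Note that the fork already ships WrapType/WrapFields transforms (mentioned in the delegateToSchema return-type issue above), which perform this kind of wrapping with the unwrapping handled during resolution; they may be a better starting point than a from-scratch transformer.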

addResolversToSchema uses schema transforms that do not modify original schema

In general, graphql-tools uses two methods to generate modified/new schemas:

  1. Direct modification of existing schema: eg addResolversToSchema, addMockFunctionsToSchema, and the SchemaVisitor customizable approach.

  2. Schema recreation: eg mergeSchemas, transformSchema base method, visitSchema and the individual schema transforms.

The difference is important. Although addResolversToSchema returns the modified schema, some callers might rely on the fact that the reference to the original schema should also reflect the new resolvers.

This is fine as long as the API is clear, but two problems are noted:

A: addResolversToSchema can also be used to add definitions for enums/scalars, but then it also requires reparsing of default values. As currently implemented, this uses transforms that rely on visitSchema and do not modify the original schema. Callers that are using the returned value of addResolversToSchema are fine, but if they are using the original reference, they would not get the new default values.

B: overall, visitSchema seems to be deprecated in favor of SchemaVisitor. The individual transforms can modify the passed schema or not, because the true underlying original schema has already been wrapped at that point, so it would be ok to switch to SchemaVisitor or direct modification. It is ok to keep it this way, however. transformSchema uses a version of visitSchema that must recreate the schema and then updates resolvers. This could easily be refactored to recreate the schema with toConfig and then just use addResolversToSchema to replace resolvers. This is also not strictly necessary, but if visitSchema can be completely deprecated, this action item might be a good long-term goal.

Did not fetch typename for object, unable to resolve interface

I'm trying to use this fork because error handling is broken in graphql-tools. However, I'm experiencing different bugs with this fork.
I only use these from graphql-tools:

  • makeExecutableSchema
  • introspectSchema
  • makeRemoteExecutableSchema
  • mergeSchemas
  • ReplaceFieldWithFragment
  • WrapQuery

I'm experiencing a weird bug when trying to stitch something that results in the error Did not fetch typename for object, unable to resolve interface. I have found that handleResult in checkResultAndHandleErrors first gets a result that is a list, which is handled fine and recognised as a list. But then the same result runs through that code again for some reason, and even though the result is still a list, it gets recognised as a CompositeType, which breaks resolveFromParentTypename as it expects an object and not a list. All this is very difficult to debug, so I don't currently have a repro that I can share. Given that this wasn't a problem in the latest graphql-tools, I assume it's some bug introduced here or some weird change in behaviour somewhere.

I read through the changelog and nothing stood out to me to be the cause of this issue I'm experiencing.

Errors are getting lost for array elements

I'm using graphql-tools-fork v7.2.0
NodeJS v12.8.1, tried also on v13.2.0
I have the following schema:

type Query {
  getType1: Type1
}

type Type1 {
  type2: [Type2]!
}

type Type2 {
  mandatoryField: String!
}

In the Type1.type2 field resolver I return new Error('Custom error').
This error gets lost, instead I receive other error Cannot return null for non-nullable field Type2.mandatoryField.

Returning 0 becomes null

I have a field that's an Int, and when the value is 0 it is returned as null, which throws the error [GraphQLError: Cannot return null for non-nullable field MyObject.myNumber.] When the value is any other number, it works correctly.
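This symptom commonly comes from a truthiness check somewhere in the result-handling path where only null/undefined should be filtered out. The following is a hypothetical illustration of the bug class, not the library's actual code:

```typescript
// BUG pattern: a truthiness test swallows 0 (and '' and false).
function coerceBuggy(value: number | null): number | null {
  return value ? value : null; // 0 fails this test and becomes null
}

// FIX pattern: compare against null/undefined explicitly.
function coerceFixed(value: number | null): number | null {
  return value != null ? value : null; // 0 passes through unchanged
}
```

If the reported behaviour matches this pattern, the fix is replacing the truthiness check with an explicit `!= null` comparison.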

Prevent sync operations from becoming async

See #33 and https://github.com/apollographql/graphql-tools/pull/1057/files.

delegateToSchema should only return a promise when the configured executor/subscriber requires async work, e.g. not with execution via a local subschema.

Similarly, when not merging fields, handleObject should not convert lists of results to lists of promises of results. Even when merging fields, given the above change, conversion to lists of promises of results should only be performed when necessary, i.e. when merging must be async.
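The pattern suggested above can be sketched as a small helper that only goes async when the underlying value actually is a promise. The names (isPromise, mapMaybeAsync) are illustrative, not the library's:

```typescript
// Detect thenables without forcing everything through Promise.resolve.
function isPromise(value: unknown): value is Promise<unknown> {
  return value != null && typeof (value as Promise<unknown>).then === 'function';
}

// Apply fn to a result, staying synchronous for synchronous results.
function mapMaybeAsync<T, R>(
  result: T | Promise<T>,
  fn: (value: T) => R
): R | Promise<R> {
  return isPromise(result) ? (result as Promise<T>).then(fn) : fn(result as T);
}

// A local-subschema execution stays sync: the value is 43, not Promise<43>.
const syncValue = mapMaybeAsync(42, (n) => n + 1);
```

Threading results through a helper like this keeps sync execution paths sync while still supporting async executors transparently.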

RenameTypes + RenameObjectFields with interface resolveType not transforming correctly

Hello. We have been using this fork for a while now, and I would like to thank you for keeping Stitching going!

I posted an issue a while back (#27) about schema transforms, and I think I have run into another issue with applying multiple transforms. We're using Prismic, a headless CMS with a GraphQL API, along with schema stitching. We're renaming types and object fields to make their API consistent with ours and to avoid naming conflicts. However, I have run into an issue where the transforms are not mapped correctly.

Here is a simplified version our transforms:

const schema = transformSchema(fakeRemoteSchema, [
  new RenameTypes(
    name => (name.startsWith("_") ? name : `P${pascalize(name)}`),
    {
      renameBuiltins: false,
      renameScalars: false
    }
  ),
  new RenameObjectFields((typeName, fieldName) => {
    if (typeName === "Query") return fieldName;

    // Remote uses leading underscores for special fields. Leave them alone.
    if (fieldName[0] === "_") return fieldName;

    // This issue appears to be with renaming `a_item`
    // if (fieldName === "a_item") return fieldName;

    return camelize(fieldName);
  })
]);

When querying a renamed object field on a renamed interface implementation type, e.g. aTransformedObjectField { ... on SomeInterface { field } }, we receive null rather than the data.

Please find a repo reproducing the issue here: https://github.com/m-sanders/graphql-tools-fork-43

`FilterTypes` Transform can break an UnionType

Given:

  • ObjectTypes A, B, and C
  • UnionType with types A, B, and C

When:

  • The FilterTypes transform is used to hide one of the ObjectTypes included in the UnionType

Then:

  • The ObjectType that was hidden becomes null inside the UnionType, making the schema invalid

My current workaround is removing null members and recreating UnionTypes without them:

import { GraphQLNamedType, GraphQLSchema, GraphQLUnionType } from "graphql";
import { Transform } from "graphql-tools-fork";
import { visitSchema, VisitSchemaKind } from "graphql-tools-fork/dist/transforms/visitSchema";

export class CleanUpUnionTypes implements Transform {
  public transformSchema(originalSchema: GraphQLSchema): GraphQLSchema {
    return visitSchema(originalSchema, {
      [VisitSchemaKind.UNION_TYPE](type): GraphQLNamedType {
        const config = (<GraphQLUnionType>type).toConfig();
        config.types = config.types.filter(Boolean);
        return new GraphQLUnionType(config);
      }
    });
  }
}

The question is: should it be the responsibility of the FilterTypes transform to clean up unions as an additional step?

TransformRootFields + RenameObjectFields not working together

Thank you for this library. We've found that schema delegation is not ready for primetime, and we're continuing with stitching! I have, however, encountered a problem with the TransformRootFields + RenameObjectFields transforms in the latest version (7.1.4) and wondered if you could help.

I am using Prismic as a headless CMS and below is a snippet of a query to illustrate the bug:

{
  ingredient(uid: "sugar", lang: "en-us") {
    title
    camel_case
  }
  allIngredients(first: 1, lang: "en-us") {
    edges {
      node {
        title
        camel_case
      }
    }
  }
}

To match the rest of our application we’d like to apply some transformations:

  • Change allIngredients -> ingredients
  • Rename camel_case -> camelCase
  • Apply a default value to lang based on the context.

I am creating the schema with the below:

const link = new PrismicLink({
  uri: PRISMIC_URL
});
const schema = makeRemoteExecutableSchema({
  schema: await introspectSchema(link),
  link
});

return transformSchema(schema, [
  new TransformRootFields((_operation, fieldName, field) => {
    // `allSomethingContentTypes` -> `somethingContentTypes`.
    let name = fieldName;
    if (fieldName.match(/^all/)) {
      name = name[3].toLowerCase() + fieldName.substr(4);
    }

    // Set any root fields with a `lang` arg to default to the correct locale.
    const config = fieldToFieldConfig(field);
    if (config.args.lang !== undefined) {
      config.args.lang.defaultValue = locale.langtag.toLowerCase();
    }
    return { name, field: config };
  }),
  new RenameObjectFields((_typeName, fieldName) => {
    // Prismic uses leading underscores for special fields. Leave them alone.
    if (fieldName[0] === "_") return fieldName;
    // snake_case -> camelCase
    return fieldName.replace(/([-_][a-z])/gi, $1 => {
      return $1
        .toUpperCase()
        .replace("-", "")
        .replace("_", "");
    });
  })
]);

However, the below query returns inconsistent results: ingredient.camelCase gets correctly transformed, but ingredients.edges.node.camelCase does not and returns null.

{
  ingredient(uid: "sugar") {
    title
    camelCase
  }
  ingredients(first: 1) {
    edges {
      node {
        title
        camelCase
      }
    }
  }
}

Result:

{
  "data": {
    "ingredient": {
      "title": "Sugar",
      "camelCase": [
        {
          "type": "paragraph",
          "text": "Camel Camel Camel",
          "spans": []
        }
      ]
    },
    "ingredients": {
      "edges": [
        {
          "node": {
            "title": "Sugar",
            "camelCase": null
          }
        }
      ]
    }
  }
}

If I remove the rename on TransformRootFields, it works as expected. Thoughts?

Array of enums is broken after transformSchema

Hello, we have a following schema:

enum EnumField {
  FirstValue
  SecondValue
  ThirdValue
}

type ExampleType {
  arrayEnumField: [EnumField!]!
}

arrayEnumField has a custom field resolver returning an array of EnumField values.

When querying the field after transformSchema we get an error: [GraphQLError: Expected a value of type "EnumField" but received: "FirstValue"]

It's similar to the error for plain enums that was fixed in this fork, but arrays of enums are still broken.

Any chance it gets fixed anytime soon?
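A hedged sketch of the fix class (a simplified stand-in, not the actual transform code): when re-serializing enum results after a transform, list values have to be mapped element by element, recursively for nested lists, rather than passed to the serializer whole.

```typescript
// Recursively apply an enum serializer to a scalar-or-list result.
function serializeEnumResult(
  serialize: (value: string) => string,
  result: unknown
): unknown {
  if (result == null) return result;
  if (Array.isArray(result)) {
    // The reported bug class: handing the whole array to serialize()
    // instead of mapping over its elements.
    return result.map((item) => serializeEnumResult(serialize, item));
  }
  return serialize(result as string);
}

const upper = (v: string) => v.toUpperCase();
const out = serializeEnumResult(upper, ['FirstValue', ['SecondValue']]);
```

The recursion also covers nested list types like [[EnumField!]!]! for free.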

null error annotating children with errors at stitching

I'm using graphql-tools-fork v7.0.3
NodeJS v12.8.1, tried also on v13.0.1
I have the following schema:

type Query {
  getType1: Type1
}

type Type1 {
  type2: [Type2]!
}

type Type2 {
  mandatoryField: String!
}

If I mistakenly don't fill mandatoryField, then annotateWithChildrenErrors will fail with the following error:

{
  "stack": "TypeError: Cannot set property 'Symbol(subSchemaErrors)' of null\n    at annotateWithChildrenErrors (<path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\errors.ts:73:23)\n    at <path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\errors.ts:68:37\n    at Array.forEach (<anonymous>)\n    at Object.annotateWithChildrenErrors (<path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\errors.ts:68:12)\n    at Object.handleResult (<path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\checkResultAndHandleErrors.ts:63:5)\n    at defaultMergedResolver (<path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\defaultMergedResolver.ts:24:10)\n    at field.resolve (<path>api\\node_modules\\graphql-tools-fork\\src\\stitching\\mergeSchemas.ts:275:16)\n    at field.resolve (<path>api\\node_modules\\graphql-extensions\\src\\index.ts:274:18)\n    at resolveFieldValueOrError (<path>api\\node_modules\\graphql\\execution\\execute.js:467:18)\n    at resolveField (<path>api\\node_modules\\graphql\\execution\\execute.js:434:16)\n    at executeFields (<path>api\\node_modules\\graphql\\execution\\execute.js:275:18)\n    at collectAndExecuteSubfields (<path>api\\node_modules\\graphql\\execution\\execute.js:713:10)\n    at completeObjectValue (<path>api\\node_modules\\graphql\\execution\\execute.js:703:10)\n    at completeValue (<path>api\\node_modules\\graphql\\execution\\execute.js:591:12)\n    at completeValue (<path>api\\node_modules\\graphql\\execution\\execute.js:557:21)\n    at <path>api\\node_modules\\graphql\\execution\\execute.js:492:16",
  "message": "Cannot set property 'Symbol(subSchemaErrors)' of null"
}

After validation, annotateWithChildrenErrors receives this object:

{
  type2: [
    null
  ]
}

And childrenErrors:

[
  {
    message: 'Cannot return null for non-nullable field Type2.mandatoryField.',
    path: ['type2', 0, 'mandatoryField']
  }
]

As the error is relocated, it reaches a state where the object is null while childrenErrors is still an array, which leads to a null exception on line 73.
I've checked the code; v7.1.0 does not fix it.
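A simplified stand-in for annotateWithChildrenErrors showing the guard that would avoid this crash (the real function lives in src/stitching/errors.ts; this sketch only illustrates the null check, not the library's full logic):

```typescript
const ERROR_SYMBOL = Symbol('subSchemaErrors');

interface ChildError {
  message: string;
  path?: Array<string | number>;
}

function annotateWithChildrenErrors(
  object: unknown,
  childrenErrors: ChildError[]
): unknown {
  // Guard: a position that errored out may legitimately hold null,
  // and a symbol property cannot be set on null.
  if (object == null) return object;
  (object as any)[ERROR_SYMBOL] = childrenErrors;
  return object;
}
```

With the guard, `{ type2: [null] }` can be traversed without throwing even when childrenErrors is a non-empty array.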

Type `TypeVisitor` is incorrect

I'm trying to implement a custom Transform based on the ones in /src/transforms to hide some stuff away. I saw that I should be able to do that by returning null in visitSchema (it's how it's done in FilterTypes). Unfortunately, TypeVisitor won't allow me to do that because it is defined as returning only GraphQLNamedType. I'm not sure how it's possible that FilterTypes is working just fine with that declaration. It seems like | undefined | null is missing from the type declaration.
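The suggested widening can be sketched like this. Note that GraphQLNamedType below is a local stand-in, not the real graphql import, and the exact signature in the library may differ:

```typescript
// Local stand-in for the graphql type, to keep the sketch self-contained.
type GraphQLNamedType = { name: string };

// Widened visitor signature: returning null filters the type out,
// as FilterTypes does; returning the type keeps it.
type TypeVisitor = (
  type: GraphQLNamedType
) => GraphQLNamedType | null | undefined;

// A FilterTypes-style visitor now type-checks:
const hidePrivateTypes: TypeVisitor = (type) =>
  type.name.startsWith('_') ? null : type;
```

Without `| null | undefined` in the return type, the same visitor would be a compile error under strict null checks, even though it works at runtime.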

applySchemaTransforms not called in delegateToSchema

Hi,

Thanks for maintaining this fork.

I was working through some things and wanted to use some of the built-in transforms provided, and I came across an issue when using RenameObjectFields, which uses TransformObjectFields, which in turn requires a call to transformSchema to set up state to be used in transformRequest.

As such, using the transforms crashes and burns when calling delegateToSchema.

Am I using these correctly?

(I am passing RenameObjectFields directly to the transforms of delegateToSchema, as I currently only want to apply the object-field transform on a particular query.)

deprecate transforms' visitSchema in favor of SchemaVisitor's visitSchema

HEAVILY EDITED

See #19 and #14.

visitSchema as defined in the transforms code section seems to be deprecated in favor of SchemaVisitor. SchemaVisitor even has its own visitSchema method!

The differences are as follows:

  1. Modification of the original schema. The SchemaVisitor version modifies the original schema, while the transforms version creates a new schema. You can now easily create a new schema prior to passing it to visitSchema by using the toConfig() method; this could be done before visiting, or visiting could be changed to do it internally, perhaps controlled by a flag.
  2. Method of passing hooks. SchemaVisitor allows customization by passing a derived class that overrides particular methods. The transforms version allows passing an object similar to the graphql visit() function.
  3. Types of hooks. SchemaVisitor has more customization hooks than the transforms method, i.e. you can hook into fields, arguments, and enum values. This could be added to the transforms visitSchema method with new VisitSchemaKind values like VisitSchemaKind.FIELD and VisitSchemaKind.ARGUMENT, etc.

This begs for unification.

One path forward is just to deprecate the transforms version. This would be pretty straightforward, as the transforms version is used just to (A) initially wrap the schema with a round of delegation and (B) modify the outer schema within a few transforms.

(A) transformSchema uses the transforms visitSchema with stripResolvers set to true, so it wraps the original schema with new root fields that delegate to the original schema and changes the rest of the resolvers to defaultMergedResolver. This could easily be refactored to recreate the schema with toConfig and then just use addResolversToSchema to replace resolvers.

(B) The individual transforms use the transforms visitSchema to modify the outer schema. These transforms currently don't modify the original schema, but there is no reason why they cannot, so it would be ok to switch to SchemaVisitor or even just direct modification.

Error when handling resolver error

On graphql-tools-fork v7.2.2 I get the following error:

ReferenceError: context is not defined
    at <Path>\node_modules\graphql-tools-fork\src\stitching\checkResultAndHandleErrors.ts:110:5
    at Array.map (<anonymous>)
    at handleList (<Path>\node_modules\graphql-tools-fork\src\stitching\checkResultAndHandleErrors.ts:104:15)
    at Object.handleResult (<Path>\node_modules\graphql-tools-fork\src\stitching\checkResultAndHandleErrors.ts:73:12)
    at defaultMergedResolver (<Path>\node_modules\graphql-tools-fork\src\stitching\defaultMergedResolver.ts:33:10)
    at field.resolve (<Path>\node_modules\graphql-tools-fork\src\stitching\mergeSchemas.ts:248:16)
    at field.resolve (<Path>\node_modules\graphql-extensions\src\index.ts:274:18)
    at resolveFieldValueOrError (<Path>\node_modules\graphql\execution\execute.js:467:18)
    at resolveField (<Path>\node_modules\graphql\execution\execute.js:434:16)
    at executeFields (<Path>\node_modules\graphql\execution\execute.js:275:18)
    at collectAndExecuteSubfields (<Path>\node_modules\graphql\execution\execute.js:713:10)
    at completeObjectValue (<Path>\node_modules\graphql\execution\execute.js:703:10)
    at completeValue (<Path>\node_modules\graphql\execution\execute.js:591:12)
    at completeValue (<Path>\node_modules\graphql\execution\execute.js:557:21)
    at <Path>\node_modules\graphql\execution\execute.js:492:16
    at processTicksAndRejections (internal/process/task_queues.js:93:5)

The handleList function does not have context defined.
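The bug class can be illustrated with simplified stand-ins for handleResult/handleList (the real code is in src/stitching/checkResultAndHandleErrors.ts; these bodies are not the library's). The list helper must declare and forward every variable its map callback uses:

```typescript
// Simplified stand-ins; only the parameter threading is the point.
function handleResult(result: unknown, context: object): unknown {
  return Array.isArray(result) ? handleList(result, context) : result;
}

function handleList(list: unknown[], context: object): unknown[] {
  // `context` must be a declared parameter here; referencing it without
  // declaring it is what produces "ReferenceError: context is not defined"
  // inside the map callback.
  return list.map((item) => handleResult(item, context));
}
```

The fix is therefore to add context to handleList's signature and pass it through at the call site in handleResult.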

Docs out of date

Much of the documentation is out of date or missing:

Docs for schema cloning, healing, visiting.

Docs for ExtendSchema and MapFields transforms and helpers.

Docs for defaultMergedResolver and makeMergedType.
