This repository was archived by the owner on Mar 20, 2023. It is now read-only.

[WIP] Add rudimentary batched query support #100

Closed
wants to merge 1 commit into graphql:master from taion:batch

Conversation


@taion taion commented Jun 23, 2016

This is a WIP adding very rudimentary support for batched queries executed with the same shared execution context, when queries are specified as a JSON array.

This allows schemas using DataLoader (and sticking those DataLoaders on context) to share batching and caching across batched requests.
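
For illustration, here is a minimal standalone sketch of the idea (not the code in this diff): a handler that accepts either a single set of GraphQL params or a JSON array of them, builds one context per HTTP request, and runs every operation against that context, so DataLoaders attached to it batch and cache across the whole array. The toy schema, loader, and route are placeholders, and the positional graphql() call matches the graphql-js API of that era.

// Sketch only: shared per-request context across a batch of operations.
const express = require('express');
const bodyParser = require('body-parser');
const DataLoader = require('dataloader');
const { graphql, buildSchema } = require('graphql');

const schema = buildSchema('type Query { user(id: ID!): String }');

const rootValue = {
  // Root field resolver: goes through the per-request DataLoader.
  user: ({ id }, context) => context.userLoader.load(id),
};

function createContext() {
  // Fresh DataLoaders per HTTP request; batching and caching are shared by
  // every operation in the batch because they all receive this same context.
  return {
    userLoader: new DataLoader(ids =>
      Promise.resolve(ids.map(id => `user:${id}`))
    ),
  };
}

const app = express();
app.use(bodyParser.json());

app.post('/graphql', (req, res, next) => {
  const context = createContext();
  const params = Array.isArray(req.body) ? req.body : [req.body];

  Promise.all(
    params.map(({ query, variables, operationName }) =>
      // Positional graphql() call, as in graphql-js of that era.
      graphql(schema, query, rootValue, context, variables, operationName)
    )
  )
    .then(results => {
      // Mirror the request shape: array in, array out.
      res.json(Array.isArray(req.body) ? results : results[0]);
    })
    .catch(next);
});

app.listen(4000);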

Do you think this makes sense to have here?

@@ -53,7 +53,7 @@
"lint": "eslint src",
"check": "flow check",
"build": "rm -rf dist/* && babel src --ignore __tests__ --out-dir dist",
"watch": "babel --optional runtime resources/watch.js | node",
Author

This is a no-op with Babel 6.

@taion
Author

taion commented Jun 23, 2016

This is missing (at least) a richer set of test cases, along with proper semantics for the response error code.

@coveralls

Coverage Status

Coverage remained the same at 100.0% when pulling 216dee3 on taion:batch into 6dcf37a on graphql:master.

@coveralls

Coverage Status

Coverage remained the same at 100.0% when pulling cd720a3 on taion:batch into 6dcf37a on graphql:master.

if (!query) {
  if (showGraphiQL) {
    return null;

function executeQuery(requestData) {
@nodkz nodkz Jul 22, 2016

It would be cool if you could find another solution rather than creating a function for every request. GC says thank you ;)

Author

Well, express-graphql already creates a bunch of functions for every request to handle promise continuations. Ideally these all go away in a young gen GC anyway.

@nodkz

nodkz commented Jul 22, 2016

Awesome PR!

It would allow me to remove a bunch of hacks from my react-relay-network-layer.

@nodkz

nodkz commented Aug 1, 2016

Any movement on this PR?

@taion
Author

taion commented Aug 1, 2016

I think it's looking less likely at this point. Search for "batched query support" in the Q&A transcript here: https://github.com/reactiflux/q-and-a/blob/master/lee-byron_facebook-graphql.md.

@helfer
Contributor

helfer commented Aug 1, 2016

@nodkz We've implemented batched query support in our rewrite of apollo-server, just in case that's of interest to you. Version 0.2.0 will be published on npm on Monday or Tuesday. We're always looking for more contributors, so if you have specific requirements for the batched query support, feel free to tune in!

@stubailo

stubailo commented Aug 3, 2016

Dang, I was really hoping this could be merged because batching is a really great feature from a client PoV. @taion is there any way to use a middleware or similar to achieve this with express-graphql?

@taion
Author

taion commented Aug 4, 2016

@nodkz's middleware does that. I put up this PR because I think it's a slightly cleaner API – essentially in this form, express-graphql takes care of sharing the context for the entire request.

@stubailo

@nodkz I'm curious - what is the id for in your react-relay-network-layer? I think this is the right API and @Poincare is in the process of implementing batching for Apollo Client - would it be reasonable for us to just implement an array of requests and array of responses?

@taion
Author

taion commented Aug 13, 2016

@leebyron It'd be great if you could comment on any specific concerns with handling batching in this manner that might prove problematic down the road.

@nodkz

nodkz commented Aug 13, 2016

@stubailo I answered about id here: I've implemented mapping by Relay id rather than by array index. Today I understood (after Lee's comments 👍) that I can stream responses back as they become ready. While some long queries are still pending, the client already has the lighter responses.

But this feature would require me to fork express-graphql and reinvent the wheel 😱 I'm not ready to do that now; motivation is so-so. First I'll finish graphql-compose, and after that, when I get to subscriptions, I'll be forced to implement my own middleware for express.

BTW, please advise a good pub-sub lib 🤔


... it should instead represent the canonical usage and export tools to make it easy to implement more complex usages like this in your own apps (or shared as libraries)

@leebyron let's change the terminology a little bit 😉 Let's say that this is not batching; it is a simple query list executor. Batching comes with too many requirements and open questions, but a query list executor does not. Furthermore, it is already realized in this PR and has brilliant backward compatibility: if the server gets {}, it is one query; if it gets [], it is a list of queries. If somebody wants top performance, they can implement their own solution. This trivial implementation in express-graphql will satisfy the vast majority. A simple solution is better than nothing.

@josephsavona

josephsavona commented Aug 13, 2016

I don't think it's feasible for this library to represent all possible uses of GraphQL over HTTP - it should instead represent the canonical usage and export tools to make it easy to implement more complex usages like this in your own apps

I couldn't agree more with @leebyron about keeping this library focused. It won't be enough to just add batching: accepting this PR would inevitably lead to more PRs to flush responses as data is ready, to support interdependencies between queries, etc.

most consumers of Relay send multiple queries in parallel for a single Relay route....
Is there something obvious I'm missing, then, for how to handle this use case?

@taion Note that the Relay network layer for sending queries is plural: sendQueries(queries: Array<RelayQueryRequest>). This is intentional, as Relay ensures that all queries for a given route will be sent to the server in one JS event loop, and passed to the network layer in a single call to sendQueries. This means that whether you do one HTTP request total or one per query is up to you.
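
To make that concrete, here is a rough sketch of a custom network layer whose sendQueries sends the whole array in one POST. It follows the Relay Classic network layer interface mentioned above; the { id, query, variables } wire format is illustrative rather than any particular server's, and a global fetch is assumed.

// Sketch of a Relay Classic network layer that batches all queries for a
// route into a single HTTP request.
class BatchingNetworkLayer {
  constructor(uri) {
    this._uri = uri;
  }

  sendQueries(requests) {
    const payload = requests.map(request => ({
      id: request.getID(),
      query: request.getQueryString(),
      variables: request.getVariables(),
    }));

    return fetch(this._uri, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    })
      .then(response => response.json())
      .then(results => {
        // Resolve each RelayQueryRequest from the matching batch entry.
        requests.forEach((request, i) => {
          const result = results[i];
          if (result.errors) {
            request.reject(new Error(JSON.stringify(result.errors)));
          } else {
            request.resolve({ response: result.data });
          }
        });
      });
  }

  sendMutation(request) {
    // Mutations go out one at a time in this sketch.
    return fetch(this._uri, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        query: request.getQueryString(),
        variables: request.getVariables(),
      }),
    })
      .then(response => response.json())
      .then(result => request.resolve({ response: result.data }));
  }

  supports() {
    return false;
  }
}

// Usage: Relay.injectNetworkLayer(new BatchingNetworkLayer('/graphql'));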

Our internal network layer sends all queries via our batch API. In the simple case (no deferred queries) that means that we're able to achieve fewer HTTP requests (1 not n) and also receive results as they are ready. In the more advanced case, deferred queries are implemented by using interdependencies between batch entries.

As part of developing the new Relay core, we've implemented a module that emulates the most useful pieces of our internal batch API: resolving queries as they complete, expressing interdependencies between queries, etc. It's a standalone JS module that can be used on the client or (nodejs) server. It's also completely agnostic of Relay internals, which means it could be useful to Apollo and other clients/servers (cc @stubailo, @helfer). We'll open source this soon.

For the simple array of queries -> array of responses case, how about building a standalone npm module that reuses some helper functions from express-graphql?

@taion
Author

taion commented Aug 14, 2016

I am currently using a custom network layer that batches up the query requests into a single HTTP request.

The issue on my end is that neither express-graphql nor the default Relay network layer provides anything in the way of batching. As such, in a typical use case, I need to replace both ends of my stack to avoid sending one HTTP request per Relay query request.

I am in fact running on this code right now with a custom network layer designed to communicate with it, but it seems like there should be some reasonable way to do minimal batching. Currently, the default behavior is just very suboptimal for any "real" use case.

What I really mean is – I don't need something perfect or fully optimal, but in the context of the default Relay network layer that does a Promise.all anyway, I'd like to have something just slightly better that doesn't generate n HTTP requests, in a manner that's as out-of-the-box as possible.

Otherwise this ends up being part of the stack that users end up more or less required to customize – and on both the server and the client end.

@stubailo

accepting this PR would inevitably lead to more PRs to flush responses as data is ready, to support interdependencies between queries, etc.

If people can't add features or options here, where should they put them? I feel like there should be a repository that represents a "complete" HTTP GraphQL server for Node.js, since there seems like a pretty finite set of necessary features.

Otherwise this ends up being part of the stack that users end up more or less required to customize – and on both the server and the client end.

I agree, it would be great if a user could just install a package and have that package do the most reasonable thing, rather than having to configure many extensions.

it could be useful to Apollo and other clients/servers (cc @stubailo, @helfer). We'll open source this soon.

Any possibility that we could collaborate on its design? It's not ideal to release these complete pieces of technology only once they are ready and tested, because at that point they might not meet everyone's needs.


Perhaps the right thing to do here is to first agree on a standard way to do batching, perhaps something that identically matches Facebook's internal way of doing it, and then we can implement and merge it here? Then GraphQL implementations in other languages can add it as well, and eventually we can all just assume that batching exists instead of needing to do customization or feature detection.

@nodkz

nodkz commented Aug 29, 2016

Step by step I'm rewriting my own express implementation 😂

relay-tools/react-relay-network-layer@4e07086

@stubailo

We've already got one here, and we're looking for contributors! https://github.com/apollostack/apollo-server

@nodkz

nodkz commented Aug 29, 2016

@stubailo TypeScript is not my way (right now I'm on Flow). If @taion starts using TypeScript in his OSS, I'll change my opinion, since I learn from his code and solutions.

So right now I prefer to stick with the representation of the "canonical reference of how to use GraphQL over HTTP" and track its changes, rather than use apollo-server with its weekly changes 😉. Also, in the future react-relay-network-layer should have seamless integration with subscriptions and graphql-compose. So when I get to implementing that, maybe I'll choose apollo-server, maybe another solution. But I intuitively feel that it will be something from @taion.

@taion
Author

taion commented Aug 29, 2016

Please don't take anything I make as normative. I just haven't had a chance to look at other options for serving up a GraphQL endpoint. The intention here was to add batching support in a minimal way – that's all. It might well be the case that express-graphql as-is isn't going to work as a real production server piece anyway.

@nodkz

nodkz commented Oct 18, 2016

@leeb what do you think about this PR today?
Will execution of a list of queries be supported out of the box or not?

@facebook-github-bot

Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla - and if you have received this in error or have any questions, please drop us a line at cla@fb.com. Thanks!

If you are contributing on behalf of someone else (eg your employer): the individual CLA is not sufficient - use https://developers.facebook.com/opensource/cla?type=company instead. Contact cla@fb.com if you have any questions.

@facebook-github-bot

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!

@fzaninotto

fzaninotto commented Sep 5, 2017

Any updates on this one? Since Apollo removed merged queries to support only batched queries (cf https://github.com/apollographql/apollo-client/pull/734/files), it's no longer possible to use batched queries with express-graphql. I'm +1 for the change, especially considering that:

  • the change is small enough
  • Apollo is extremely popular on the client side

Not supporting this is a motivation for people to use apollo-server instead of this package...

@taion
Author

taion commented Sep 5, 2017

Is this API-compatible with Apollo's batched queries?

@stubailo

stubailo commented Sep 6, 2017

It's pretty simple to implement a custom network interface to send queries however you want (you just need to provide a function that can take an array of queries to send) but here's the format we support: http://dev.apollodata.com/tools/apollo-server/requests.html#batching
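
For reference, here is a minimal sketch of what that transport-level batching looks like on the wire, per the linked docs: the request body is a JSON array of standard GraphQL request objects, and the response is an array of execution results in the same order. The endpoint and queries below are placeholders.

// Transport-level batching: array of request objects in, array of results out.
const operations = [
  { query: '{ viewer { name } }' },
  {
    query: 'query Todos($limit: Int) { todos(limit: $limit) { text } }',
    variables: { limit: 10 },
  },
];

fetch('/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(operations),
})
  .then(response => response.json())
  .then(results => {
    // Expected shape: [{ data: { viewer: ... } }, { data: { todos: ... } }]
    console.log(results);
  });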

@taion
Author

taion commented Sep 6, 2017

Cool, so the same as https://github.com/graphql/express-graphql/pull/100/files#diff-500ea3fc5d54f0388f497b2c9d49a49eR403.

I do feel like this is a step in the right direction. Can we revisit moving forward on this, @leebyron?

@stubailo

stubailo commented Sep 6, 2017

Yep, looks like it's compatible! Would be a great improvement.

Is there anything preventing people from using Apollo Server? Just curious if there is some missing feature or other factor that makes express-graphql more appealing.

@taion
Author

taion commented Sep 7, 2017

No particular reason. For us, I guess it's just that this is the first-party recommended server, and feels more minimal. Privilege of place, I suppose.

@nodkz

nodkz commented Nov 6, 2017

Today most servers support batch requests ([{req1}, {req2}]) by default. Any new thoughts about supporting batch queries by default in express-graphql (as the canonical implementation)?

Right now I'm rewriting https://github.com/nodkz/react-relay-network-layer to support Relay Modern in a new package, https://github.com/nodkz/react-relay-network-modern. Both of these packages have the same wrapper for express-graphql, which emulates sub-requests when it receives an array of queries. So I need to decide whether to export this hacky wrapper as a standalone package, or whether express-graphql will finally add this feature...
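
As a rough sketch of the kind of wrapper being described (hypothetical; the real one lives in those packages): run express-graphql's handler once per array entry against a stub response and collect the results. The stub only implements the pieces the handler needs in the common case, require('./schema') is a placeholder, and graphqlHTTP is the default export as of that era.

// Hypothetical wrapper: if the body is a JSON array, run the wrapped
// express-graphql handler once per entry and reply with an array.
const express = require('express');
const bodyParser = require('body-parser');
const graphqlHTTP = require('express-graphql');

function graphqlBatchHTTP(options) {
  const handler = graphqlHTTP(options);

  return (req, res, next) => {
    if (!Array.isArray(req.body)) {
      return handler(req, res, next);
    }

    const subResults = req.body.map(params => {
      // Shallow-clone the request with a single operation as its body.
      const subReq = Object.create(req);
      subReq.body = params;

      // Minimal stub response that captures whatever the handler writes.
      return new Promise(resolve => {
        const subRes = {
          statusCode: 200,
          setHeader() {},
          end(data) {
            resolve(data ? JSON.parse(data.toString()) : null);
          },
        };
        handler(subReq, subRes);
      });
    });

    Promise.all(subResults)
      .then(results => res.json(results))
      .catch(next);
  };
}

const app = express();
app.use(bodyParser.json());
app.use('/graphql', graphqlBatchHTTP({ schema: require('./schema'), graphiql: false }));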

@taion
Author

taion commented Nov 6, 2017

@leebyron Any further thoughts here?

@taion
Author

taion commented Dec 15, 2017

I'm going to close this out.

Given that the options function now takes the GraphQL params, it's not clear how to properly handle the semantics for things like pretty, as the values across a batch of multiple operations may not agree.

If you were previously running against this code, please use Apollo Server instead, as the API is mostly compatible with the API here.

@taion taion closed this Dec 15, 2017
@taion taion deleted the batch branch December 15, 2017 17:14
9 participants