[WIP] Add rudimentary batched query support #100
Conversation
```diff
@@ -53,7 +53,7 @@
     "lint": "eslint src",
     "check": "flow check",
     "build": "rm -rf dist/* && babel src --ignore __tests__ --out-dir dist",
     "watch": "babel --optional runtime resources/watch.js | node",
```
This is a no-op with Babel 6.
This is missing (at least) a richer set of test cases, along with proper semantics for the response error code.
```js
if (!query) {
  if (showGraphiQL) {
    return null;
// …
function executeQuery(requestData) {
```
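For orientation, here is a rough sketch of how a batched path along these lines could work. Only `executeQuery` is named in the diff above; the handler shape, `schema`, and `rootValue` below are illustrative assumptions, not code from this PR.

```js
const { graphql } = require('graphql');

// Hypothetical core of the handler: if the parsed body is a JSON array,
// execute every entry against the same context value; otherwise fall back
// to the existing single-query path.
function handleGraphQLRequest(schema, rootValue, context, requestData) {
  function executeQuery(data) {
    // `data` is expected to look like { query, variables, operationName }.
    return graphql(
      schema,
      data.query,
      rootValue,
      context,
      data.variables,
      data.operationName
    );
  }

  return Array.isArray(requestData)
    ? Promise.all(requestData.map(executeQuery)) // batched: one shared context
    : executeQuery(requestData);                 // single query, as before
}
```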
It would be cool if you could find another solution, rather than creating a function for every request. The GC says thank you ;)
Well, express-graphql already creates a bunch of functions for every request to handle promise continuations. Ideally these all go away in a young gen GC anyway.
Awesome PR! It would allow me to remove a bunch of hacks from my react-relay-network-layer: …
Any movement on this PR?
I think it's looking less likely at this point. Search for "batched query support" in the Q&A transcript here: https://github.com/reactiflux/q-and-a/blob/master/lee-byron_facebook-graphql.md.
@nodkz We've implemented batched query support in our rewrite of apollo-server, just in case that's of interest to you. Version 0.2.0 will be published on npm on Monday or Tuesday. We're always looking for more contributors, so if you have specific requirements for the batched query support, feel free to chime in!
Dang, I was really hoping this could be merged because batching is a really great feature from a client PoV. @taion is there any way to use a middleware or similar to achieve this with express-graphql?
@nodkz's middleware does that. I put up this PR because I think it's a slightly cleaner API – essentially in this form, express-graphql takes care of sharing the context for the entire request.
@leebyron It'd be great if you could comment on any specific concerns with handling batching in this manner that might prove problematic down the road.
@stubailo I answered about … But this feature requires me to fork … BTW, please advise a good pub-sub lib 🤔
@leebyron let's change the terminology a little bit 😉 Let's say that it is not a …
I couldn't agree more with @leebyron about keeping this library focused. It won't be enough to just add batching: accepting this PR would inevitably lead to more PRs to flush responses as data is ready, to support interdependencies between queries, etc.
@taion Note that the Relay network layer API for sending queries is plural: … Our internal network layer sends all queries via our batch API. In the simple case (no deferred queries) that means we're able to achieve fewer HTTP requests (1, not n) and also receive results as they are ready. In the more advanced case, deferred queries are implemented using interdependencies between batch entries.

As part of developing the new Relay core, we've implemented a module that emulates the most useful pieces of our internal batch API: resolving queries as they complete, expressing interdependencies between queries, etc. It's a standalone JS module that can be used on the client or on a Node.js server. It's also completely agnostic of Relay internals, which means it could be useful to Apollo and other clients/servers (cc @stubailo, @helfer). We'll open source this soon.

For the simple array-of-queries -> array-of-responses case, how about building a standalone npm module that reuses some helper functions from …?
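A tiny sketch of the "resolve queries as they complete" idea for the simple array case; every name below is illustrative, and none of this comes from the internal module described above.

```js
// Run each batch entry and report results in completion order, rather than
// waiting for the whole batch. `executeOne` is any (request) => Promise<result>
// function; `onResult` could, for example, flush a chunk of the HTTP response.
function executeBatch(executeOne, requests, onResult) {
  return Promise.all(
    requests.map((request, index) =>
      executeOne(request).then(result => {
        onResult(index, result); // called as soon as this entry finishes
        return result;
      })
    )
  );
}
```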
I am currently using a custom network layer that batches up the query requests into a single HTTP request. The issue on my end is that neither express-graphql nor the default Relay network layer provides anything in the way of batching. As such, in a typical use case, I need to replace both ends of my stack to avoid sending one HTTP request per Relay query request.

I am in fact running on this code right now with a custom network layer designed to communicate with it, but it seems like there should be some reasonable way to do minimal batching. Currently, the default behavior is just very suboptimal for any "real" use case.

What I really mean is – I don't need something perfect or fully optimal, but in the context of the default Relay network layer that does …. Otherwise this ends up being part of the stack that users end up more or less required to customize – and on both the server and the client end.
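For reference, a sketch of the kind of client-side batching network layer being described, assuming Relay Classic's network-layer contract (`sendQueries`/`sendMutation` with request objects exposing `getQueryString`, `getVariables`, `resolve`, and `reject`) and a server that accepts a JSON array of queries like the one in this PR. The endpoint and the response handling are assumptions, not part of this PR.

```js
// Sketch: send all queries from one Relay "tick" as a single HTTP request.
class BatchingNetworkLayer {
  constructor(uri) {
    this.uri = uri;
  }

  sendQueries(queryRequests) {
    const body = queryRequests.map(request => ({
      query: request.getQueryString(),
      variables: request.getVariables(),
    }));

    return fetch(this.uri, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body),
    })
      .then(response => response.json())
      .then(results => {
        // Assumes the server returns one result per query, in request order.
        queryRequests.forEach((request, i) => {
          const result = results[i];
          if (result.errors) {
            request.reject(new Error(result.errors[0].message));
          } else {
            request.resolve({ response: result.data });
          }
        });
      });
  }

  sendMutation(mutationRequest) {
    // Mutations stay unbatched in this sketch.
    return fetch(this.uri, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        query: mutationRequest.getQueryString(),
        variables: mutationRequest.getVariables(),
      }),
    })
      .then(response => response.json())
      .then(result => mutationRequest.resolve({ response: result.data }));
  }

  supports() {
    return false;
  }
}

// Hypothetical usage: Relay.injectNetworkLayer(new BatchingNetworkLayer('/graphql'));
```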
If people can't add features or options here, where should they put them? I feel like there should be a repository that represents a "complete" HTTP GraphQL server for Node.js, since there seems to be a pretty finite set of necessary features.
I agree, it would be great if a user could just install a package and have that package do the most reasonable thing, rather than having to configure many extensions.
Any possibility that we could collaborate on its design? It's not ideal to release these complete pieces of technology only once they are ready and tested, because at that point they might not meet everyone's needs. Perhaps the right thing to do here is to first agree on a standard way to do batching, perhaps something that identically matches Facebook's internal way of doing it, and then we can implement and merge it here? Then GraphQL implementations in other languages can add it as well, and eventually we can all just assume that batching exists instead of needing to do customization or feature detection.
Step by step I'm rewriting my own express implementation 😂
We've already got one here, and we're looking for contributors! https://github.com/apollostack/apollo-server
@stubailo TypeScript is not my way (right now I'm using …). So right now I prefer to rely on the "canonical reference of how to use GraphQL over HTTP" and track its changes, rather than use …
Please don't take anything I make as normative. I just haven't had a chance to look at other options for serving up a GraphQL endpoint. The intention here was to add batching support in a minimal way – that's all. It might well be the case that express-graphql as-is isn't going to work as a real production server piece anyway.
@leeb what do you think about this PR today?
Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign up at https://code.facebook.com/cla - and if you have received this in error or have any questions, please drop us a line at cla@fb.com. Thanks!

If you are contributing on behalf of someone else (e.g. your employer): the individual CLA is not sufficient - use https://developers.facebook.com/opensource/cla?type=company instead. Contact cla@fb.com if you have any questions.
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
Any updates on this one? Since Apollo removed merged queries to support only batched queries (cf. https://github.com/apollographql/apollo-client/pull/734/files), it's no longer possible to use batched queries with express-graphql. I'm +1 for the change, especially considering that:

- Not supporting this is a motivation for people to use apollo-server instead of this package...
Is this API-compatible with Apollo's batched queries?
It's pretty simple to implement a custom network interface to send queries however you want (you just need to provide a function that can take an array of queries to send), but here's the format we support: http://dev.apollodata.com/tools/apollo-server/requests.html#batching
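For reference, the request/response shape described at that link is roughly the following; the queries and values here are made up.

```js
// Batched request body: a JSON array of standard GraphQL params objects.
const batchedRequest = [
  { query: '{ viewer { name } }' },
  {
    query: 'query Todo($id: ID!) { todo(id: $id) { text } }',
    variables: { id: '1' },
  },
];

// Batched response body: an array of ordinary GraphQL results, in the same order.
const batchedResponse = [
  { data: { viewer: { name: 'Alice' } } },
  { data: { todo: { text: 'Buy milk' } } },
];
```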
Cool, so the same as https://github.com/graphql/express-graphql/pull/100/files#diff-500ea3fc5d54f0388f497b2c9d49a49eR403. I do feel like this is a step in the right direction. Can we revisit moving forward on this, @leebyron?
Yep, looks like it's compatible! Would be a great improvement. Is there anything preventing people from using Apollo Server? Just curious if there is some missing feature or other factor that makes express-graphql more appealing.
No particular reason. For us, I guess it's just that this is the first-party recommended server, and feels more minimal. Privilege of place, I suppose.
Today most servers support batch requests. Right now I'm rewriting https://github.com/nodkz/react-relay-network-layer to support Relay Modern in a new package, https://github.com/nodkz/react-relay-network-modern. Both of these packages have the same wrapper for …
@leebyron Any further thoughts here?
I'm going to close this out. Given that … If you were previously running against this code, please use Apollo Server instead, as the API is mostly compatible with the API here.
This is a WIP adding very rudimentary support for batched queries executed with the same shared execution context, when queries are specified as a JSON array.
This allows schemas using DataLoader (and sticking those DataLoaders on context) to share batching and caching across batched requests.
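A minimal sketch of that DataLoader pattern, assuming express-graphql's function-style options; the schema and `batchLoadUsers` below are made-up examples, not part of this PR.

```js
const express = require('express');
const graphqlHTTP = require('express-graphql');
const DataLoader = require('dataloader');
const { buildSchema } = require('graphql');

const schema = buildSchema('type Query { userName(id: ID!): String }');

// Hypothetical batch fetch: one round trip for all ids the loader collects.
function batchLoadUsers(ids) {
  return Promise.resolve(ids.map(id => ({ id, name: `user-${id}` })));
}

const app = express();

// One set of loaders per HTTP request, placed on context. With this PR,
// every query in a JSON-array batch shares that context, so lookups made by
// different queries in the batch coalesce into a single batchLoadUsers call
// and share its cache.
app.use('/graphql', graphqlHTTP(() => ({
  schema,
  rootValue: {
    userName: ({ id }, context) =>
      context.userLoader.load(id).then(user => user.name),
  },
  context: {
    userLoader: new DataLoader(batchLoadUsers),
  },
})));

app.listen(3000);
```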
Do you think this makes sense to have here?