Question: Inject Request Context? #17
Sorry for the delay. The simplest thing you could do is move the schema creation code into the handler, so you can access the request directly. The downside of this approach is that the schema would be recreated on each request. With that schema you can then use a parameter to access the request.

I'm working on a more complete, production-ready example that includes authentication and uses the Guice servlet extension.

---
@siderakis Thank you so much for this. I was looking to lean into the Guice servlet extension as well, so it's good to know that I'm on the right path. I suspect we may need something similar for managing the dispatch for DataFetchers.

---
I think I have a version sussed out using Guice Servlet. Does this setup make sense? This is more or less the boilerplate from the examples/ repo, reworked as a servlet. I haven't leaned in yet at all, but this should preserve the schema across requests and allow me to start injecting request-based instances where needed (DataLoaders, authenticated users, etc.):

```java
public class RQLServer {
  private static final int HTTP_PORT = 8080;
  private static final Logger logger = Logger.getLogger(RQLServer.class.getName());

  public static void main(String[] args) throws Exception {
    Server server = new Server(HTTP_PORT);
    ServletContextHandler context =
        new ServletContextHandler(server, "/", ServletContextHandler.SESSIONS);
    context.addEventListener(
        new GuiceServletContextListener() {
          @Override
          protected Injector getInjector() {
            return Guice.createInjector(
                new SchemaProviderModule(),
                new RQLModule(),
                new RQLSchema(),
                new ServletModule() {
                  @Override
                  protected void configureServlets() {
                    serve("/graphql").with(RQLHandler.class);
                  }
                });
          }
        });
    context.addFilter(
        GuiceFilter.class,
        "/*",
        EnumSet.of(javax.servlet.DispatcherType.REQUEST, javax.servlet.DispatcherType.ASYNC));
    context.addServlet(DefaultServlet.class, "/");
    try {
      logger.info("Server running on port " + HTTP_PORT);
      server.start();
      server.join();
    } catch (Exception e) {
      logger.severe("Server failed: " + e.getMessage());
    } finally {
      server.destroy();
    }
  }
}

public class RQLModule extends AbstractModule {
  @Override
  protected void configure() {
    // ...
  }

  @Provides
  GraphQL getGraphQL(Injector injector) {
    GraphQLSchema schema = injector.getInstance(Key.get(GraphQLSchema.class, Schema.class));
    Instrumentation instrumentation =
        new ChainedInstrumentation(
            Arrays.asList(
                GuavaListenableFutureSupport.listenableFutureInstrumentation(),
                new TracingInstrumentation()));
    return GraphQL.newGraphQL(schema).instrumentation(instrumentation).build();
  }
}

@Singleton
public class RQLHandler extends HttpServlet {
  private static final TypeToken<Map<String, Object>> MAP_TYPE_TOKEN =
      new TypeToken<Map<String, Object>>() {};
  private static final Gson GSON = new GsonBuilder().serializeNulls().create();

  @Inject GraphQL graphQL;

  @Override
  protected void doPost(HttpServletRequest req, HttpServletResponse resp)
      throws ServletException, IOException {
    Map<String, Object> json = readJson(req);
    String query = (String) json.get("query");
    if (query == null) {
      resp.setStatus(HttpServletResponse.SC_BAD_REQUEST);
      return;
    }
    String operationName = (String) json.get("operationName");
    Map<String, Object> variables = getVariables(json.get("variables"));
    ExecutionInput executionInput =
        ExecutionInput.newExecutionInput()
            .query(query)
            .operationName(operationName)
            .variables(variables)
            .context(new Object())
            .build();
    ExecutionResult executionResult = graphQL.execute(executionInput);
    resp.setContentType("application/json");
    resp.setStatus(HttpServletResponse.SC_OK);
    GSON.toJson(executionResult.toSpecification(), resp.getWriter());
  }

  private static Map<String, Object> getVariables(Object variables) {
    Map<String, Object> variablesWithStringKey = new HashMap<>();
    if (variables instanceof Map) {
      ((Map<?, ?>) variables).forEach((k, v) -> variablesWithStringKey.put(String.valueOf(k), v));
    }
    return variablesWithStringKey;
  }

  private static Map<String, Object> readJson(HttpServletRequest request) {
    try {
      String json = CharStreams.toString(request.getReader());
      return jsonToMap(json);
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
  }

  private static Map<String, Object> jsonToMap(String json) {
    if (Strings.isNullOrEmpty(json)) {
      return ImmutableMap.of();
    }
    return Optional.<Map<String, Object>>ofNullable(GSON.fromJson(json, MAP_TYPE_TOKEN.getType()))
        .orElse(ImmutableMap.of());
  }
}
```

---
This looks great! I need to take a closer look, but one small suggestion might be to inject the GraphQL schema directly rather than injecting the Injector in RQLModule, something like the singleton provider shown later in this thread.
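A hedged sketch of that suggestion, reusing the `@Schema` binding annotation already used in the handler module above (this mirrors the singleton `Builder` provider that appears later in the thread):

```java
// In RQLModule: let Guice hand over the bound GraphQLSchema directly,
// rather than pulling it out of the Injector by hand.
@Provides
@Singleton
GraphQL getGraphQL(@Schema GraphQLSchema schema) {
  Instrumentation instrumentation =
      new ChainedInstrumentation(
          Arrays.asList(
              GuavaListenableFutureSupport.listenableFutureInstrumentation(),
              new TracingInstrumentation()));
  return GraphQL.newGraphQL(schema).instrumentation(instrumentation).build();
}
```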
---
That's a much better call. Thank you so much for your guidance. Rejoiner has been a joy to use so far! Let me know if you'd like me to wrap that up in an example. I could probably change the library example to do this instead?

---
@siderakis This got me pretty far until I tried using something request-scoped in a schema modification.

```java
// AbstractModule
@Provides
@RequestScoped
DataLoaderRegistry getLoaders(CoreClient client) {
  DataLoaderRegistry registry = new DataLoaderRegistry();
  BatchLoader<String, Book> batchBookLoader =
      ids -> FutureConverter.toCompletableFuture(client.getBooks(ids));
  registry.register("books", new DataLoader<>(batchBookLoader));
  return registry;
}

// Handler
@Inject Provider<DataLoaderRegistry> registryProvider;

@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
  DataLoaderRegistry registry = registryProvider.get();
  Map<String, Object> json = readJson(req);
  String query = (String) json.get("query");
  if (query == null) {
    resp.setStatus(HttpServletResponse.SC_BAD_REQUEST);
    return;
  }
  String operationName = (String) json.get("operationName");
  Map<String, Object> variables = getVariables(json.get("variables"));
  Instrumentation instrumentation =
      new ChainedInstrumentation(
          Arrays.asList(
              GuavaListenableFutureSupport.listenableFutureInstrumentation(),
              new DataLoaderDispatcherInstrumentation(registry),
              new TracingInstrumentation()));
  GraphQL graphQL = builder.instrumentation(instrumentation).build();
  ExecutionInput executionInput =
      ExecutionInput.newExecutionInput()
          .query(query)
          .operationName(operationName)
          .variables(variables)
          .build();
  ExecutionResult executionResult = graphQL.execute(executionInput);
  resp.setContentType("application/json");
  resp.setStatus(HttpServletResponse.SC_OK);
  GSON.toJson(executionResult.toSpecification(), resp.getWriter());
}

// SchemaModule
@SchemaModification(addField = "book", onType = Shelf.class)
ListenableFuture<Book> addBookToShelf(DataLoaderRegistry registry, Shelf shelf) {
  DataLoader<String, Book> loader = registry.getDataLoader("books");
  CompletableFuture<Book> res = loader.load(String.valueOf(shelf.getBookIds()));
  return FutureConverter.toListenableFuture(res);
}
```

This wires fine and the server starts up, but then Guice complains that I'm not in a request context when attempting to inject the request-scoped `DataLoaderRegistry`.

---
I figured out the issue here. The root of my query returned a future, which was resolved by a multi-threaded HTTP client (asyncHttp). When it finally resolved, it invoked the schema modification handler, but at that point I'd lost my request context and Guice refused to supply my request-scoped `DataLoaderRegistry`.

---
Interesting, where does `builder` come from? I see it referenced in the handler.

---
The builder is injected via a singleton provider:

```java
// AbstractModule
@Provides
@Singleton
Builder getGraphQL(@Schema GraphQLSchema schema) {
  return GraphQL.newGraphQL(schema);
}

// Handler
@Inject GraphQL.Builder builder;
@Inject Provider<DataLoaderRegistry> registryProvider;

@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
  // ...
}
```

---
Turns out this isn't necessarily the async client; it happens any time I jump threads (i.e. when returning a future that completes on another thread). This is very simple to reproduce. I suspect this is an issue with how I'm setting up the servlet and Guice, but I'm not sure how else to set this up. Any guidance here would be greatly appreciated, and once I have all of these moving parts worked out I'm happy to submit an example to this repo to help other folks out.

---
I was able to get past this issue, but unfortunately I'm now running into what I expect is a bug with the DataLoaders themselves. I believe it's related to this issue: graphql-java/graphql-java#979.

For posterity, I was able to transfer the servlet context, which got the DI working again, but exposed this issue with the DataLoader.

```java
public class RQLDataLoaderProvider implements Provider<DataLoaderRegistry> {
  @Inject CoreClient core;
  @Inject LPClient lp;

  public static <T> CompletableFuture<T> asFuture(
      Callable<? extends T> callable, Executor executor) {
    CompletableFuture<T> future = new CompletableFuture<>();
    executor.execute(
        () -> {
          try {
            future.complete(callable.call());
          } catch (Throwable t) {
            future.completeExceptionally(t);
          }
        });
    return future;
  }

  BatchLoader<String, Listing> batchListingsLoader =
      ids ->
          asFuture(
              ServletScopes.transferRequest(() -> core.getListings(ids).getListingsList()),
              MoreExecutors.directExecutor());

  BatchLoader<String, Release> batchReleasesLoader =
      ids ->
          asFuture(
              ServletScopes.transferRequest(() -> lp.getReleases(ids).getReleasesList()),
              MoreExecutors.directExecutor());

  public DataLoaderRegistry get() {
    DataLoaderRegistry registry = new DataLoaderRegistry();
    registry.register("listings", new DataLoader<>(batchListingsLoader));
    registry.register("releases", new DataLoader<>(batchReleasesLoader));
    return registry;
  }
}
```

This results in `listings` being resolved in batch, correctly. Unfortunately when the batch loader is invoked for `releases`, which joins to `listings`, the batch loader immediately executes, creating an N+1 problem.

```java
@SchemaModification(addField = "release", onType = Listing.class)
ListenableFuture<Release> listingToRelease(DataLoaderRegistry registry, Listing listing) {
  DataLoader<String, Release> loader = registry.getDataLoader("releases");
  return FutureConverter.toListenableFuture(loader.load(listing.getMerchandisingUuid()));
}

@SchemaModification(addField = "listing", onType = FeedEntry.class)
ListenableFuture<Listing> feedEntryToListing(DataLoaderRegistry registry, FeedEntry entry) {
  if (entry.getType() == EntryType.LISTING) {
    DataLoader<String, Listing> loader = registry.getDataLoader("listings");
    CompletableFuture<Listing> res = loader.load(String.valueOf(entry.getId()));
    return FutureConverter.toListenableFuture(res);
  }
  return null;
}

@SchemaModification(addField = "lpFeed", onType = AuthUser.class)
ImmutableList<FeedEntry> userToLPFeedEntries(
    @LPFeed FeedServiceGrpc.FeedServiceBlockingStub client, AuthUser user) {
  EntriesRequest.Builder req = EntriesRequest.newBuilder();
  req.setUserId(user.getResourceOwnerId());
  FeedResponse resp = client.getFeedEntries(req.build());
  return ImmutableList.copyOf(resp.getEntriesList());
}
```

The query:

```graphql
{
  user(input: { token: "token" }) {
    lpFeed {
      listing {
        title
        release {
          title
        }
      }
    }
  }
}
```

This project is so close to what we need, but if I can't resolve this issue, I'll likely need to move on. Thanks again for your help and the fantastic project.

---
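To make the failure mode concrete, the batching contract a DataLoader depends on can be sketched in plain Java (a hypothetical `MiniLoader`, not the real org.dataloader implementation): `load()` only enqueues a key, and nothing hits the backend until `dispatch()` runs the batch function once over every queued key. The N+1 symptom above corresponds to a dispatch effectively firing after each individual `load()` for `releases` (see graphql-java/graphql-java#979 for the chained-loader timing discussion).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

// Hypothetical, minimal stand-in for a DataLoader to illustrate batching.
class MiniLoader<K, V> {
  private final Function<List<K>, List<V>> batchFn;
  private final List<K> queuedKeys = new ArrayList<>();
  private final List<CompletableFuture<V>> queuedFutures = new ArrayList<>();
  int batchCalls = 0; // counts how many times the batch function actually ran

  MiniLoader(Function<List<K>, List<V>> batchFn) {
    this.batchFn = batchFn;
  }

  // Enqueue a key; the returned future completes only on dispatch().
  CompletableFuture<V> load(K key) {
    CompletableFuture<V> future = new CompletableFuture<>();
    queuedKeys.add(key);
    queuedFutures.add(future);
    return future;
  }

  // Resolve every queued key with ONE call to the batch function.
  void dispatch() {
    if (queuedKeys.isEmpty()) {
      return;
    }
    batchCalls++;
    List<V> values = batchFn.apply(new ArrayList<>(queuedKeys));
    for (int i = 0; i < values.size(); i++) {
      queuedFutures.get(i).complete(values.get(i));
    }
    queuedKeys.clear();
    queuedFutures.clear();
  }
}

public class BatchingDemo {
  public static void main(String[] args) {
    MiniLoader<String, String> releases =
        new MiniLoader<>(
            ids -> {
              // One simulated backend round trip per dispatch.
              List<String> out = new ArrayList<>();
              for (String id : ids) {
                out.add("release-" + id);
              }
              return out;
            });
    CompletableFuture<String> a = releases.load("1");
    CompletableFuture<String> b = releases.load("2");
    releases.dispatch(); // one batch for both keys
    System.out.println(releases.batchCalls); // 1
    System.out.println(a.join() + ", " + b.join()); // release-1, release-2
  }
}
```

If something instead triggers a dispatch after every `load()`, `batchCalls` equals the number of keys, which is exactly the N+1 behavior described above.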
Nice work! Can you push your changes to a branch so I can try to debug it? I have some time tonight and can take a look.

---
Where are you calling it from? Can you confirm there is only one instance of DataLoaderRegistry per request?

---
I created a branch named …

---
Please ignore the previous questions, I figured it out. I got the library example working with a data loader for simple queries but ended up getting the same `com.google.inject.OutOfScopeException`.

I'm going to try to pass the `DataLoaderRegistry` as part of the context when executing the query. This may allow us to avoid the need for the request scope altogether.

---
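A sketch of that idea, assuming graphql-java's execution-context API as used in the handler earlier in the thread (how Rejoiner would then surface the context to a `@SchemaModification` is exactly the open question); this is a fragment of `doPost`, not a complete handler:

```java
// In the handler: carry the per-request DataLoaderRegistry in the GraphQL
// execution context instead of relying on Guice request scope surviving
// thread hops.
DataLoaderRegistry registry = registryProvider.get();
ExecutionInput executionInput =
    ExecutionInput.newExecutionInput()
        .query(query)
        .operationName(operationName)
        .variables(variables)
        .context(registry) // later retrievable via DataFetchingEnvironment.getContext()
        .build();
ExecutionResult executionResult = graphQL.execute(executionInput);
```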
Got it. I took a look at your branch and it's a similar approach to the one I took. I was able to work through the OutOfScopeException. I really like the approach here.

I'm currently building out a protoc to GraphQL schema generator to attempt to replicate some of this library with apollo-server, but I'd be eager to come back to this if the underlying issue gets worked out.

---
How would one go about implementing the `@AuthenticatedUser` argument annotation shown in the examples? Can a Provider also have the Jetty request context injected to make headers/session values available?

This project is incredibly exciting and I'd be happy to submit more documentation if someone can point me in the right general direction.
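Not an authoritative answer, but one plausible shape, sketched under the assumption that the Guice servlet extension is installed (it binds the current `HttpServletRequest` in request scope); `AuthUser` and `AuthService` here are hypothetical names:

```java
// Request-scoped provider deriving the authenticated user from a header.
public class AuthUserProvider implements Provider<AuthUser> {
  @Inject Provider<HttpServletRequest> requestProvider;

  @Override
  public AuthUser get() {
    HttpServletRequest req = requestProvider.get();
    String token = req.getHeader("Authorization"); // or read session values here
    return AuthService.verify(token); // hypothetical token verification
  }
}

// In a module:
// bind(AuthUser.class).toProvider(AuthUserProvider.class).in(RequestScoped.class);
```

The thread-hopping caveats discussed above would apply equally to any request-scoped binding like this.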