support queries over HTTP GET #574

Open
ontofractal opened this issue Sep 28, 2018 · 13 comments

Comments

@ontofractal ontofractal commented Sep 28, 2018

GraphQL Engine returns {"path":"$","error":"resource does not exist","code":"not-found"} when the {useGETForQueries: true} Apollo Link option is used.

Is it possible to enable GET request endpoints for the Hasura engine?
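
For reference, this is roughly the client setup that triggers it; a sketch, with a placeholder endpoint URL:

import { HttpLink } from 'apollo-link-http'

const link = new HttpLink({
  uri: 'https://my-app.example.com/v1alpha1/graphql', // placeholder Hasura endpoint
  useGETForQueries: true // queries are sent as GET requests; mutations remain POST
})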

@shahidhk shahidhk (Member) commented Sep 28, 2018

@ontofractal Hasura does not support GET for GraphQL queries as of now. Any particular use case where you need to use GET, not POST?

@0x777 0x777 changed the title from ‘"Server response was missing for query 'null'" error for GET requests’ to ‘support queries over HTTP GET’ on Sep 29, 2018

@ecthiender ecthiender (Member) commented Sep 29, 2018

In a GET request, how would the payload be sent? Through the query string? That does not seem clean.

Also, we should probably return 405 Method Not Allowed instead of a 404 on GET /v1alpha1/graphql.
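
For context, graphql.org's "serving over HTTP" guide puts the GET payload in the query string. A rough sketch, with an illustrative query and the usual query/variables parameter names:

const query = 'query ($limit: Int) { users(limit: $limit) { name } }' // illustrative query
const variables = { limit: 10 }

// the payload travels URL-encoded in the query string
const url = '/v1alpha1/graphql'
  + '?query=' + encodeURIComponent(query)
  + '&variables=' + encodeURIComponent(JSON.stringify(variables))

fetch(url).then(res => res.json())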

@shahidhk shahidhk (Member) commented Sep 29, 2018

AFAIK the GraphQL spec does not talk about GET or POST.

But GraphQL.org has a best practices section that talks about the GET and POST methods.

@ontofractal ontofractal (Author) commented Sep 29, 2018

@shahidhk I mostly used GET queries to cache server responses

@ontofractal ontofractal (Author) commented Nov 22, 2018

Is this on the roadmap? It would be massively useful to be able to use commodity CDN services like Cloudflare with Hasura Engine.

@rhaksw rhaksw commented May 17, 2019

Seconding @ontofractal's point. Cloudflare will not cache POST requests; it only caches based on the URL. An option to process GET requests would be helpful. Apollo Client, for example, has the useGETForQueries option for exactly this reason: setting up caching is easier with GET requests.

I was about to set up Hasura in production when I noticed this is not currently possible. If anyone knows a workaround, I'd love to hear it. Thanks!

@Jerry-Hong Jerry-Hong commented Jul 18, 2019

Apollo Server has Automatic Persisted Queries: the client switches queries from POST to GET, while mutations are still sent over POST.
https://www.apollographql.com/docs/apollo-server/features/apq/#cdn-integration
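
A rough sketch of the client side (written against @apollo/client v3; the endpoint URL is a placeholder, and the server must also implement the APQ protocol, as Apollo Server does):

import { ApolloClient, InMemoryCache, HttpLink } from '@apollo/client'
import { createPersistedQueryLink } from '@apollo/client/link/persisted-queries'
import { sha256 } from 'crypto-hash'

// Hashed queries go out as GET (CDN-cacheable); mutations still use POST.
const link = createPersistedQueryLink({ sha256, useGETForHashedQueries: true })
  .concat(new HttpLink({ uri: 'https://api.example.com/graphql' })) // placeholder endpoint

const client = new ApolloClient({ cache: new InMemoryCache(), link })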

@abn abn (Contributor) commented Jul 23, 2019

@rhaksw a simple/quick solution might be to put nginx or something similar in front, as in the sketch below. There could be a few edge cases around the transformation of the request payload that need to be handled with care.

    location /v1/graphql {
        if ($request_method = GET ) {
            proxy_method POST;
            # transform the request payload
            proxy_set_header Content-Type "application/json";
            proxy_set_body '{"some_key":"$arg_some_key"}';
        }
        proxy_pass http://hasura:8080;
    }
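
With a rule like that, a GET such as the sketch below reaches Hasura as a POST (some_key mirrors the hypothetical $arg_some_key above):

// sent by the client
fetch('https://example.com/v1/graphql?some_key=value')

// what nginx forwards to Hasura:
//   POST /v1/graphql
//   Content-Type: application/json
//   {"some_key":"value"}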

@rhaksw rhaksw commented Jul 24, 2019

Hey @abn, great point! I'm using Hasura's DigitalOcean droplet, which has Caddy. My GraphQL config is currently the original,

api.revddit.com {
    proxy / graphql-engine:8080 {
        websocket
    }
}

I'll look into how to configure your suggestion for Caddy.

@loganpowell loganpowell commented Aug 16, 2019

Service workers can also cache intercepted GET fetches, but not POST (the Cache API only stores GET requests), so for the new hotness, GET is the new POST.
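
A minimal sketch of that pattern (the endpoint path and cache name are assumptions):

// sw.js
self.addEventListener('fetch', event => {
  const { request } = event
  // The Cache API can only store GET requests, so POSTed queries can't be cached this way.
  if (request.method !== 'GET' || !request.url.includes('/v1/graphql')) return
  event.respondWith(
    caches.open('graphql-get').then(cache =>
      cache.match(request).then(cached =>
        cached || fetch(request).then(response => {
          cache.put(request, response.clone())
          return response
        })
      )
    )
  )
})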

@Toub Toub commented Nov 29, 2019

Once the GET method is supported, the ETag header can be implemented: #2792

Supporting the ETag header avoids transferring data when it matches the version already in the browser cache.

Use case:

When the application starts, initial state data is loaded.

If you refresh the page, or if the data has not changed since your last visit, the application loads immediately, without waiting for the network transfer.

Very useful for static data (or data that changes rarely), or when data is usually modified by you (your profile, your last messages sent...).
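
Roughly the flow that would enable (the ETag value and query here are made up; browsers attach If-None-Match automatically once they hold a cached response):

const url = '/v1/graphql?query=' + encodeURIComponent('{ profile { name } }') // illustrative query

fetch(url, {
  headers: { 'If-None-Match': '"made-up-etag"' } // ETag from a previous response
}).then(res => {
  if (res.status === 304) {
    // Nothing changed since the last visit: no body is transferred,
    // so the application can reuse the cached result immediately.
  }
})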

@rhaksw rhaksw commented Mar 25, 2020

I figured out how to convert GET requests to POST in order to cache idempotent GraphQL queries via CDN!

I'm writing this out because I've received a lot of help from Hasura devs and want to give something back. There may be a better way. This is what I did,

Basic

Send a GET to /graphql-get/, which proxies to a customized nginx distribution called OpenResty. OpenResty converts the GET to a POST and proxies it back to Hasura's /v1/graphql/. Then set up Cloudflare as you would for any API.

Details

Using Hasura's Digital Ocean droplet that comes with Caddy, I modified the Caddyfile:

# Write out the prefixes http://, https:// rather than just the domain
# This disables auto redirect from http -> https which would mistakenly convert a POST to a GET
http://api.revddit.com, https://api.revddit.com {
    # 172.17.0.1 is an IP for accessing localhost outside the docker container
    # 8282 is the port used below in the openresty configuration
    proxy /graphql-get/ 172.17.0.1:8282 {
        websocket
    }
    proxy / graphql-engine:8080 {
        websocket
    }
}

Then,

sudo docker-compose -f /etc/hasura/docker-compose.yaml restart caddy
sudo ufw allow 8282/tcp

Then install openresty. It will fail to start on its own because it listens on port 8080 by default, which is already used by Hasura's Caddy installation.

Move the default configuration and create a new one,

sudo mv /etc/openresty/nginx.conf /etc/openresty/nginx.conf.old
touch ~/resty.conf
sudo ln -s ~/resty.conf /etc/openresty/nginx.conf
vim ~/resty.conf

# ~/resty.conf

worker_processes  1;
events {
    worker_connections 1024;
}
http {
   server {
        listen 8282;
        server_name api.revddit.com;
        location /graphql-get/ {
            set $query $arg_query;
            set $variables $arg_variables;
            proxy_method POST;
            rewrite_by_lua_block {
                ngx.var.query = ngx.unescape_uri(ngx.var.query)
                ngx.var.variables = ngx.unescape_uri(ngx.var.variables)
            }
            proxy_set_body '{"query":"$query","variables":$variables}';
            add_header Access-Control-Allow-Headers content-type;
            proxy_pass http://api.revddit.com/v1/graphql/;
            set $args '';
        }
    }
}

Start openresty,

sudo systemctl restart openresty

Note that above I am proxying to http://..., not https. This is fine for me, but you may want to look into it if that concerns you; configuring SSL on nginx to use Caddy's certificates seemed unnecessary. When I first did this, requests were received by Hasura as GET requests despite having been rewritten as POST. That confused me until I found this issue, which describes how to configure Caddy to disable the automatic redirect from http to https.

Note that Caddy v2 is coming soon, and after that, support may be added so that the proxy via openresty/nginx is not necessary. I discussed that with Caddy developers in this thread.

Then point DNS to Cloudflare and set up a page rule. Setting Cache Level to "Cache Everything" enables caching JSON, and "Edge Cache TTL" tells Cloudflare how long to keep a result before requesting it again. By default, Browser Cache lasts 4 hours, and you can set it as low as 30 minutes.

My client-side javascript looks like this,

// disable preflight request, which can't be cached by cloudflare, by using customFetch
const customFetch = (uri, options) => {
  const fetchOptions = {
    credentials: 'same-origin',
    method: "GET",
    headers: {
      accept: '*/*'
    },
    signal: options.signal
  }
  return fetch(uri, fetchOptions)
}

const apolloClient = new ApolloClient({
  cache: new InMemoryCache(),
  link: new HttpLink({
    uri: 'https://api.revddit.com/graphql-get/',
    fetch: customFetch,
    useGETForQueries: true
  }),
})

The above sends minimal headers, which keeps the request a CORS "simple request" (a GET with no custom headers), so the browser does not send an OPTIONS (preflight) request.

By the way, I first tried to convert GET requests to POST via a Cloudflare Worker, since they have a template for just this task. That did not work for me, perhaps because I had not set "Cache Level -> Cache Everything" and "Edge Cache TTL" in the Cloudflare page rule, which I later discovered was necessary. I may try it again later to satisfy my curiosity. Anyway, Workers have some limits, and you can avoid these by implementing it yourself.
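
For reference, the rewrite such a Worker performs looks roughly like this (a sketch, not Cloudflare's template; the parameter handling and endpoint are assumptions based on the setup above):

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const url = new URL(request.url)
  // rebuild the POST body from the GET query string
  const body = JSON.stringify({
    query: url.searchParams.get('query'),
    variables: JSON.parse(url.searchParams.get('variables') || 'null')
  })
  return fetch('https://api.revddit.com/v1/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body
  })
}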

An easier way to cache GraphQL requests would be to hash the body, as Akamai suggests; however, it looks like Cloudflare does not support caching of POST requests.

That's it! Now watch Hasura unveil support for GET requests and make all this pointless =)

@ecthiender ecthiender (Member) commented Mar 25, 2020

@rhaksw you should consider writing a blog post about this 😄
