fetch fails for OpenAPI data connector connecting to localhost #10634

Open
durkino opened this issue Dec 19, 2024 · 1 comment
Labels
k/v3-bug Bug affecting Hasura v3 (DDN)

Comments

durkino commented Dec 19, 2024

Component

c/v3-ndc-open-api-lambda

What is the current behaviour?

I have been trying to use the OpenAPI data connector to connect to an internal server, but I get the following error:

{
  "level": 50,
  "time": 1734463674663,
  "pid": 20,
  "hostname": "439b9e41b4ca",
  "err": {
    "type": "InternalServerError",
    "message": "Error encountered when invoking function 'getApiV2ReadAsync'",
    "stack": "Error: Error encountered when invoking function 'getApiV2ReadAsync'
               at invokeFunction (/functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:184:13)
               at processTicksAndRejections (node:internal/process/task_queues:95:5)
               at async /functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:50:22",
    "statusCode": 500,
    "details": {
      "type": "Object",
      "message": "terminated",
      "stack": "TypeError: terminated
                at Fetch.onAborted (node:internal/deps/undici/undici:11190:53)
                at Fetch.emit (node:events:518:28)
                at Fetch.emit (node:domain:488:12)
                at Fetch.terminate (node:internal/deps/undici/undici:10375:14)
                at Fetch.fetchParams.controller.resume (node:internal/deps/undici/undici:11167:36)
                at processTicksAndRejections (node:internal/process/task_queues:95:5)
                caused by Error: incorrect header check
                at genericNodeError (node:internal/errors:984:15)
                at wrappedFn (node:internal/errors:538:14)
                at Zlib.zlibOnError [as onerror] (node:zlib:189:17)
                at Zlib.callbackTrampoline (node:internal/async_hooks:130:17)"
    }
  },
  "msg": "Error encountered when invoking function 'getApiV2ReadAsync'"
}

Other endpoints on this server have worked successfully. My hypothesis is that the error is triggered when the response content encoding is compressed (gzip or deflate). The error appears to be caused by an out-of-date version of the undici npm module (per this defect) and should be fixed as of undici v6.21.0. Checking the installed packages on the data connector container shows that undici v5.28.4 is being used. Since the package requirement originates in the ndc-lambda-sdk, that project might need to be updated first.
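
As a possible stopgap while the SDK still bundles undici v5, one untested idea (assuming the generated functions.ts calls the global fetch, and that the upstream server honours Accept-Encoding) would be to ask the server not to compress the response at all, so zlib is never involved. The function name below comes from the error log; the URL and path are purely illustrative, not the connector's actual generated code:

```typescript
// Illustrative sketch only - NOT the connector's actual generated code.
export async function getApiV2ReadAsync(): Promise<unknown> {
  const response = await fetch("http://internal-server/api/v2/read", {
    headers: {
      // Request an uncompressed body so the bundled undici v5 never has to
      // run the response through zlib (where "incorrect header check" is raised).
      "Accept-Encoding": "identity",
    },
  });
  if (!response.ok) {
    throw new Error(`Upstream returned HTTP ${response.status}`);
  }
  return response.json();
}
```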

However, when I tried to reproduce this error against a modified sample OpenAPI server in Rust, I got a different internal error, regardless of whether the response was delivered compressed. This was the error:

{
    "level": 50,
    "time": 1734636339477,
    "pid": 21,
    "hostname": "d9b5deb670d0",
    "err": {
        "type": "InternalServerError",
        "message": "Error encountered when invoking function 'getHelloworldHelloworld'",
        "stack": "Error: Error encountered when invoking function 'getHelloworldHelloworld'
                    at invokeFunction (/functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:184:13)
                    at processTicksAndRejections (node:internal/process/task_queues:95:5)
                    at async /functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:50:22",
        "statusCode": 500,
        "details": {
            "type": "Object",
            "message": "fetch failed",
            "stack": "TypeError: fetch failed
                        at node:internal/deps/undici/undici:12618:11
                        at processTicksAndRejections (node:internal/process/task_queues:95:5)
                        at async getHelloworldHelloworld (/functions/functions.ts:35:18)
                        at async /functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:176:16
                        at async invokeFunction (/functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:172:12)
                        at async /functions/node_modules/@hasura/ndc-lambda-sdk/src/execution.ts:50:22
                        caused by AggregateError
                        at internalConnectMultiple (node:net:1116:18)
                        at afterConnectMultiple (node:net:1683:7)
                        at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17)"
        }
    },
    "msg": "Error encountered when invoking function 'getHelloworldHelloworld'"
}

Side notes:

  1. ddn connector introspect myopenapi fails when NDC_OAS_DOCUMENT_URI is set to http://localhost:55555/api-docs/openapi.json (presumably because fetch fails for retrieving the swagger file too)
  2. ddn connector introspect myopenapi yields the following: "Error: overwriting is disabled and api.ts file already exists at /etc/connector/api.ts. To enable file overwrite, please set 'NDC_OAS_FILE_OVERWRITE' environment variable to true, or use the --overwrite flag". However, ddn connector introspect myopenapi --overwrite gives "ERR unknown flag: --overwrite". Setting the environment variable does work, but the first error message should not suggest a command-line flag that does not exist.

What is the expected behaviour?

The query should complete without error (regardless of whether compression is being used). In the sample steps below, to run with compression, replace step 3 with cargo run gzip &

How to reproduce the issue?

I modified a sample Rust OpenAPI server for testing purposes. The steps below build and run that server, build a supergraph that connects to it, and query one endpoint. (A small standalone fetch check is sketched after the steps.)

  1. Clone the sample repo: git clone https://github.com/durkino/hasura-fetch-example.git
  2. cd hasura-fetch-example
  3. Build and run the server in the background. cargo run &
  4. Confirm the server is working. curl http://localhost:55555/helloworld and curl http://localhost:55555/api-docs/openapi.json
  5. mkdir ../supergraph
  6. cd ../supergraph
  7. ddn supergraph init .
  8. ddn connector init -i
  9. Pick the hasura/openapi connector.
  10. Leave everything at its default, except set NDC_OAS_BASE_URL to http://localhost:55555
  11. Copy down the swagger file. curl http://localhost:55555/api-docs/openapi.json > app/connector/myopenapi/swagger.json
  12. ddn connector introspect myopenapi
  13. ddn command add myopenapi "*"
  14. ddn supergraph build local
  15. ddn console --local
  16. ddn run docker-start
  17. In the GraphiQL console in your browser try the getHelloworldHelloworld query and observe the internal error.
  18. Clean up by killing the background Rust server process started in step 3.
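
Not part of the issue itself, but a hedged way to narrow things down: a small hypothetical script (it assumes the global fetch available in Node 18+, which the connector's Node runtime provides) that hits the same endpoint from the steps above with and without compression and prints whatever cause fetch reports. Running it from the same environment as the connector should separate connection failures from decompression failures.

```typescript
// fetch-check.ts (hypothetical helper, not part of the repro above).
const BASE_URL = "http://localhost:55555"; // same server as step 3

async function check(path: string, encoding: string): Promise<void> {
  try {
    const res = await fetch(`${BASE_URL}${path}`, {
      headers: { "Accept-Encoding": encoding },
    });
    const body = await res.text();
    console.log(`${path} [${encoding}] -> HTTP ${res.status}, ${body.length} bytes`);
  } catch (err) {
    // "fetch failed" caused by an AggregateError usually means the TCP
    // connection itself could not be established from where the code runs;
    // "incorrect header check" points at zlib failing to inflate the body.
    console.error(`${path} [${encoding}] ->`, err, (err as { cause?: unknown }).cause);
  }
}

async function main(): Promise<void> {
  await check("/helloworld", "identity"); // uncompressed
  await check("/helloworld", "gzip");     // compressed, if the server honours it
}

void main();
```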

Keywords

fetch
undici
openapi data connector

@durkino durkino added the k/v3-bug Bug affecting Hasura v3 (DDN) label Dec 19, 2024
@durkino durkino changed the title fetch fails for OpenAPI data connector fetch fails for OpenAPI data connector connecting to localhost Dec 19, 2024
@robertjdominguez
Contributor

Thanks for a very thorough description and repro steps, @durkino 🙏

@m-Bilal — can you please take a look above and see what can be done with the OpenAPI connector?
