Hasura GraphQL Engine V3
Hasura V3 is the API execution engine, built on the Open Data Domain Specification (OpenDD spec) and the Native Data Connector Specification (NDC spec), that powers the Hasura Data Delivery Network (DDN). The v3-engine expects to run against an OpenDD metadata file and exposes a GraphQL endpoint according to the specified metadata. The v3-engine needs a data connector running alongside it to execute data-source-specific queries.
Data connectors
The Hasura v3-engine does not execute queries directly. Instead, it sends IR (an abstracted, intermediate representation of the query) to NDC agents (also known as data connectors). To run queries against a database, you need to run a data connector that supports that database.
Available data connectors are listed at the Connector Hub
For local development, we use the reference agent implementation that is a part of the NDC spec.
To start the reference agent only, you can do:
docker compose up reference_agent
Run v3-engine (with Postgres)
Building with Docker
You can also start v3-engine, along with a Postgres data connector and Jaeger for tracing, using Docker:
docker compose up
Open http://localhost:3000 for GraphiQL, or http://localhost:4002 to view traces in Jaeger.
Note: you'll need to add {"x-hasura-role": "admin"} to the Headers section to run queries from GraphiQL.
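You can also query the engine directly instead of through GraphiQL. The sketch below assumes the GraphQL API is served at /graphql on port 3000; the endpoint path and the query itself are illustrative, but the x-hasura-role header matches the note above.

```shell
# Hypothetical request: /graphql and the introspection query are assumptions.
curl http://localhost:3000/graphql \
  -H 'Content-Type: application/json' \
  -H 'x-hasura-role: admin' \
  -d '{"query": "{ __typename }"}'
```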
NDC Postgres is the official Hasura connector for Postgres databases. To run the V3 engine with a GraphQL API on Postgres, you need to run the NDC Postgres connector and provide a metadata.json file authored specifically for your Postgres database and its models (tables, views, functions). The recommended way to author metadata.json for Postgres is via Hasura DDN.
Follow the Hasura DDN Guide to create a Hasura DDN project, connect your cloud or local Postgres Database (Hasura DDN provides a secure tunnel mechanism to connect your local database easily), and model your GraphQL API. You can then download the authored metadata.json and use the following steps to run GraphQL API on your local Hasura V3 engine.
Running tests
To run the test suite, you need to docker login to ghcr.io first:
docker login -u <username> -p <token> ghcr.io
where username is your GitHub username, and token is your GitHub PAT. The PAT needs to have the read:packages scope and Hasura SSO configured. See this for more details.
Running just watch will start the Docker dependencies, build the engine, and run all the tests. Alternatively, run the tests once with just test.
Updating goldenfiles
There are some tests where we compare the output of the test against an expected golden file. If you make some changes which expectedly change the goldenfile, you can regenerate them like this:
just update-golden-files
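The golden-file pattern these tests use can be sketched in plain shell (the UPDATE_GOLDEN_FILES variable and file names below are illustrative, not the engine's actual mechanism):

```shell
# Sketch of a golden-file comparison; all names here are hypothetical.
golden=example.golden
actual="query result"

# An "update" run (what `just update-golden-files` corresponds to)
# regenerates the expected output...
printf '%s\n' "$actual" > "$golden"

# ...and ordinary test runs compare the current output against it.
if printf '%s\n' "$actual" | diff -q - "$golden" > /dev/null; then
  echo "matches golden file"
else
  echo "differs from golden file"
fi
rm "$golden"
```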
Some other tests use insta, and these can be reviewed with cargo insta review. If the cargo insta command cannot be found, install it with cargo install cargo-insta.
Run benchmarks
The benchmarks operate against the reference agent using the same test cases as the test suite, and need a similar setup.
To run benchmarks for the lexer, parser and validation:
cargo bench -p lang-graphql "lexer"
cargo bench -p lang-graphql "parser"
cargo bench -p lang-graphql "validation/.*"