## Description

This change adds support for nested object fields in HGE IR and Schema Cache, the Data Connectors backend and API, and the MongoDB agent.

### Data Connector API changes

- The `/schema` endpoint response now includes an optional set of GraphQL type definitions. Table column types can refer to these definitions by name.
- Queries can now include a new field type `object`, which contains a column name and a nested query. This allows querying into a nested object within a field.

### MongoDB agent changes

- Add support for querying into nested documents using the new `object` field type.

### HGE changes

- The `Backend` type class has a new type family `XNestedObjects b`, which controls whether or not a backend supports querying into nested objects. This is currently enabled only for the `DataConnector` backend.
- For backends that support nested objects, the `FieldInfo` type gets a new constructor `FINestedObject`, and the `AnnFieldG` type gets a new constructor `AFNestedObject`.
- If the DC `/schema` endpoint returns any custom GraphQL type definitions, they are stored in the `TableInfo` for each table in the source.
- During schema cache building, the function `addNonColumnFields` checks whether any column types match custom GraphQL object types stored in the `TableInfo`. If so, they are converted into `FINestedObject` instead of `FIColumn` in the `FieldInfoMap`.
- When building the `FieldParser`s from `FieldInfo` (function `fieldSelection`), any `FINestedObject` fields are converted into nested object parsers returning `AFNestedObject`.
- The `DataConnector` query planner converts `AFNestedObject` fields into `object` field types in the query sent to the agent.
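To illustrate the new `object` field type described above, here is a hypothetical sketch of how it might appear inside the `fields` of an agent query request. The table and column names (`address`, `city`) are illustrative only and not taken from the actual API schema:

```json
{
  "fields": {
    "name": { "type": "column", "column": "name" },
    "address": {
      "type": "object",
      "column": "address",
      "query": {
        "fields": {
          "city": { "type": "column", "column": "city" }
        }
      }
    }
  }
}
```

The nested `query` allows the agent to recurse into the document structure, selecting fields from the nested object just as it would from a top-level table.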
## Limitations

### HGE not yet implemented:

- Support for nested arrays
- Support for nested objects/arrays in mutations
- Support for nested objects/arrays in order-by
- Support for filters (`where`) in nested objects/arrays
- Support for adding custom GraphQL types via track table metadata API
- Support for interface and union types
- Tests for nested objects

### Mongo agent not yet implemented:

- Generate nested object types from validation schema
- Support for aggregates
- Support for order-by
- Configure agent port
- Build agent in CI
- Agent tests for nested objects and MongoDB agent

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/7844
GitOrigin-RevId: aec9ec1e4216293286a68f9b1af6f3f5317db423
# Data Connector Agent Tests
This test suite can be run against any Data Connector agent that contains the Chinook data set, to ensure the agent is behaving as expected. The test executable is designed to be distributable to customers building Data Connector agents, but it is also useful for ensuring Hasura's own agents are working correctly.
Not all tests will be appropriate for all agents. Agents self-describe their capabilities and only the tests appropriate for those capabilities will be run.
The executable also has the ability to export the OpenAPI spec of the Data Connector agent API so that customers can use that to ensure their agent complies with the API format. In addition, the Chinook data set can be exported to files on disk in various formats.
## How to Use

### Running Tests
First, start your Data Connector agent and ensure it is populated with the Chinook data set. For example, you could start the Reference Agent by following the instructions in its README.
To run the tests against the agent, you must specify the agent's URL on the command line (`--agent-base-url`), as well as the agent's configuration JSON (`--agent-config`, sent in the `X-Hasura-DataConnector-Config` header):

```
cabal run test:tests-dc-api -- test --agent-base-url "http://localhost:8100" --agent-config '{}'
```
The test suite will discover what capabilities the agent has by querying it. It will then tailor the tests that it will run to match only those capabilities that the agent has said it supports.
If your agent supports the `datasets` capability, you can omit the `--agent-config` argument and the test suite will clone the Chinook dataset template on the agent to run its tests against. If you need to specify some additional configuration to be added to the cloned dataset's configuration, you can specify it using `--merge-agent-config`.
The test suite is implemented using the Sandwich test framework. The standard Sandwich command line arguments can be passed by suffixing your command line with `sandwich`; all following arguments will then be passed to Sandwich.
For example, to run the Terminal UI mode of Sandwich, you could run:

```
cabal run test:tests-dc-api -- test --agent-base-url "http://localhost:8100" --agent-config '{}' sandwich --tui
```
By default Sandwich will write test results into a `test_runs` folder. Every test has a folder that will contain debug information, for example:

- All the HTTP requests that the test made to the agent (`agent-request-[n].http`). These files can be used with a client such as REST Client (VSCode) or HTTP Client (IntelliJ).
- All the HTTP responses from the agent that matched those requests (`agent-response-[n].http`).
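The recorded `.http` files follow the plain-text request format understood by those editor REST clients. A hypothetical sketch of what an `agent-request-[n].http` file might contain (the URL, header value, and body are illustrative, not captured output):

```
POST http://localhost:8100/query
Content-Type: application/json
X-Hasura-DataConnector-Config: {}

{ "table": ["Artist"], "query": { "fields": { "Name": { "type": "column", "column": "Name" } } } }
```

Replaying one of these files from your editor is a convenient way to reproduce a failing request against the agent outside the test suite.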
### Exporting Data
To export the Data Connector Agent OpenAPI spec, you can run this command, and the spec will be written to stdout:

```
> cabal run test:tests-dc-api -- export-openapi-spec
```
To export the Chinook data set, you can run this command:

```
> cabal run test:tests-dc-api -- export-data -d /tmp/chinook-data -f JSONLines
```

This will export the data into the directory specified by `-d`, in the `JSONLines` format (`-f`), which writes each row as a JSON object on its own line, newline separated. Each table's data will be exported into a separate file.
If you need to customize the format of any DateTime columns, you can use the `--datetime-format` option and specify a format string using the syntax specified here. By default, DateTime columns are exported in ISO 8601 format.
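As an illustration only (the exact format-string directives accepted by `--datetime-format` are defined by the linked syntax reference, and this particular string is an assumption), an invocation exporting dates as `YYYY-MM-DD` might look like:

```
> cabal run test:tests-dc-api -- export-data -d /tmp/chinook-data -f JSONLines --datetime-format "%Y-%m-%d"
```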
The other format supported by `-f` is `JSON`, which results in each file being a single JSON array of table rows as JSON objects.
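The difference between the two export formats can be illustrated with a short Python sketch; both snippets decode to the same rows (the example rows are from Chinook's `Artist` table):

```python
import json

# JSONLines export: one JSON object per row, newline separated
jsonlines_export = '{"ArtistId": 1, "Name": "AC/DC"}\n{"ArtistId": 2, "Name": "Accept"}\n'
rows_jsonlines = [json.loads(line) for line in jsonlines_export.splitlines() if line]

# JSON export: the whole file is a single JSON array of row objects
json_export = '[{"ArtistId": 1, "Name": "AC/DC"}, {"ArtistId": 2, "Name": "Accept"}]'
rows_json = json.loads(json_export)

# Both decodings yield the same list of row objects
assert rows_jsonlines == rows_json
```

`JSONLines` is convenient for streaming or line-by-line processing of large tables, while `JSON` gives you a single document that any JSON parser can load in one call.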