
Data Connector Agent Tests

This test suite can test any Data Connector agent that contains the Chinook data set, ensuring the agent behaves as expected. The test executable is designed to be distributable to customers building Data Connector agents, but it is also useful for verifying that Hasura's own agents work correctly.

Not all tests will be appropriate for all agents. Agents self-describe their capabilities and only the tests appropriate for those capabilities will be run.
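At its core, this capability-based selection is a subset check: a test runs only if the agent reports every capability the test requires. The sketch below illustrates the idea; the capability and test names are hypothetical, not the agent API's actual vocabulary.

```python
# Hypothetical sketch of capability-based test selection: a test runs
# only if the agent reports every capability the test requires.
agent_capabilities = {"queries", "relationships"}

# (test name, capabilities the test requires) -- illustrative entries only
tests = [
    ("basic queries", {"queries"}),
    ("object relationships", {"queries", "relationships"}),
    ("mutations", {"mutations"}),
]

# Subset check: required capabilities must all be present.
runnable = [name for name, required in tests if required <= agent_capabilities]
print(runnable)
```

Here the "mutations" test is skipped because the agent does not report that capability.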

The executable can also export the OpenAPI spec of the Data Connector agent API, so that customers can use it to ensure their agent complies with the API format. In addition, the Chinook data set can be exported to files on disk in various formats.

How to Use

First, start your Data Connector agent and ensure it is populated with the Chinook data set. For example, you could start the Reference Agent by following the instructions in its README.

To run the tests against the agent, you must specify the agent's URL on the command line (-u), as well as the agent's configuration JSON (-s, which is sent to the agent in the X-Hasura-DataConnector-Config header). For example:

> cabal run test:tests-dc-api -- test -u "http://localhost:8100" -s '{}'

By default, the test suite discovers the agent's capabilities by querying it. Alternatively, you can specify the expected capabilities on the command line; the test suite will then verify that the agent exposes those capabilities, and only the tests corresponding to them will run.

To set the agent's available capabilities, use -c and comma-separate them:

> cabal run test:tests-dc-api -- test -u "http://localhost:8100" -s '{}' -c relationships

If -c is omitted, the default value is autodetect. If your agent has no capabilities, you can specify none.
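The -c flag's interpretation can be sketched as follows. The parse_capabilities helper is hypothetical; only the autodetect and none values, and the comma-separated list form, come from the actual CLI.

```python
def parse_capabilities(flag: str):
    # "autodetect" (the default): query the agent for its capabilities;
    # modelled here as None.
    if flag == "autodetect":
        return None
    # "none": the agent has no capabilities.
    if flag == "none":
        return set()
    # Otherwise: a comma-separated list, e.g. "relationships" or
    # "relationships,mutations".
    return {c.strip() for c in flag.split(",")}

print(parse_capabilities("relationships,mutations"))
```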

To export the OpenAPI spec, run this command; the spec will be written to stdout:

> cabal run test:tests-dc-api -- export-openapi-spec

To export the Chinook data set, you can run this command:

> cabal run test:tests-dc-api -- export-data -d /tmp/chinook-data -f JSONLines

This will export the data into the directory specified by -d, in the JSONLines format (-f): one JSON object per row, newline-separated. Each table's data is exported into a separate file.
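A consumer can read a JSONLines export line by line; each non-empty line is an independent JSON object. The sample rows below mirror the Chinook Artist table, and the string stands in for a file's contents.

```python
import json

# Two newline-separated JSON objects, as produced by -f JSONLines.
jsonlines_export = '{"ArtistId": 1, "Name": "AC/DC"}\n{"ArtistId": 2, "Name": "Accept"}\n'

# Parse each non-empty line as its own JSON object.
rows = [json.loads(line) for line in jsonlines_export.splitlines() if line]
print(rows[0]["Name"])
```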

If you need to customize the format of any DateTime columns, you can use the --datetime-format option and specify a format string using the syntax specified here. By default, DateTime columns are exported in ISO8601 format.
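As an analogy only (the test executable is Haskell, and its format-string syntax follows the linked documentation rather than Python's strftime), the difference between default ISO8601 output and a custom format looks like this:

```python
from datetime import datetime

dt = datetime(2009, 1, 1, 0, 0, 0)

iso8601 = dt.isoformat()                   # default export style (ISO8601)
custom = dt.strftime("%Y-%m-%d %H:%M:%S")  # one possible custom format
print(iso8601, custom)
```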

The other format supported by -f is JSON, which results in each file containing a single JSON array of table rows as JSON objects.
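In the JSON format, each file contains one array of row objects, so reading it back is a single parse. The sample rows below mirror the Chinook Artist table, and the string stands in for a file's contents.

```python
import json

# One JSON array of row objects, as produced by -f JSON.
json_export = '[{"ArtistId": 1, "Name": "AC/DC"}, {"ArtistId": 2, "Name": "Accept"}]'

# The whole file parses in one call, unlike JSONLines.
rows = json.loads(json_export)
print(len(rows))
```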