Data Connectors Native Queries Support

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/9499
Co-authored-by: gneeri <10553562+gneeri@users.noreply.github.com>
GitOrigin-RevId: 1e351556c43e643aa973f87adc0306f076cd227e
This commit is contained in:
Lyndon Maydwell 2023-08-17 12:03:45 +10:00 committed by hasura-bot
parent f54034ddf7
commit 41054de113
115 changed files with 2905 additions and 929 deletions


@ -81,7 +81,10 @@ package *
haddock-internal: true
-- -----------------------------------------------------------
-- Allow for dead-code elimination at link-time, to reduce binary size
ghc-options: -split-sections
if !os(darwin)
ghc-options:
-split-sections
if(os(linux))
package *
-- ld on M1 OSX does not recognise this:


@ -138,6 +138,7 @@ The `GET /capabilities` endpoint is used by `graphql-engine` to discover the cap
"column_nullability": "nullable_and_non_nullable"
},
"relationships": {},
"interpolated_queries": {},
"scalar_types": {
"DateTime": {
"comparison_operators": {
@ -175,6 +176,7 @@ The `capabilities` section describes the _capabilities_ of the service. This inc
- `queries`: The query capabilities that the agent supports
- `data_schema`: What sorts of features the agent supports when describing its data schema
- `relationships`: whether or not the agent supports relationships
- `interpolated_queries`: whether or not the agent supports interpolated queries
- `scalar_types`: scalar types and the operations they support. See [Scalar types capabilities](#scalar-type-capabilities).
The `config_schema` property contains an [OpenAPI 3 Schema](https://swagger.io/specification/#schema-object) object that represents the schema of the configuration object. It can use references (`$ref`) to refer to other schemas defined in the `other_schemas` object by name.
@ -182,15 +184,27 @@ The `config_schema` property contains an [OpenAPI 3 Schema](https://swagger.io/s
`graphql-engine` will use the `config_schema` OpenAPI 3 Schema to validate the user's configuration JSON before putting it into the `X-Hasura-DataConnector-Config` header.
#### Query capabilities
The agent can declare whether or not it supports ["foreach queries"](#foreach-queries) by including a `foreach` property with an empty object assigned to it. Foreach query support is optional, but is required if the agent is to be used as the target of remote relationships in HGE.
The agent can also declare whether or not it supports ["data redaction"](#data-redaction) by including a `redaction` property with an empty object assigned to it. Data redaction support is optional, but is required if a user configures HGE with inherited roles with different column selection permissions for the same table in the inherited role's role set.
#### Data schema capabilities
The agent can declare whether or not it supports primary keys or foreign keys by setting the `supports_primary_keys` and `supports_foreign_keys` properties under the `data_schema` object on capabilities. If it does not declare support, it is expected that it will not return any such primary/foreign keys in the schema it exposes on the `/schema` endpoint.
If the agent only supports table columns that are always nullable, then it should set `column_nullability` to `"only_nullable"`. However, if it supports both nullable and non-nullable columns, then it should set `"nullable_and_non_nullable"`.
### Interpolated Queries
Interpolated queries are lists of strings and scalars that represent applied templates, analogous in form to [`select * from users where id = `, 5].
By declaring support for the `interpolated_queries` capability, the agent indicates to the Hasura admin that they will be able to define native queries that leverage this capability through the agent.
For a well-understood backend with a well-defined query language - Postgres, for example - the admin can assume that native queries are written in the most common query language for that backend (for Postgres, PG-flavoured SQL).
For a more niche or custom backend, the native query format should be well documented so that administrators know how to define queries.
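As an illustration (a TypeScript sketch with hypothetical helper names, not part of the agent API), an agent might render such a list into a parameterised SQL string by concatenating the text parts and binding the scalar parts:

```typescript
// One element of an interpolated query: literal text or a typed scalar value.
type InterpolatedItem =
  | { type: "text"; value: string }
  | { type: "scalar"; value: unknown; value_type: string };

// Render the items to a SQL string plus a positional parameter list,
// so the scalar values can be bound safely rather than spliced in.
function renderInterpolated(items: InterpolatedItem[]): { sql: string; params: unknown[] } {
  const params: unknown[] = [];
  const sql = items
    .map(item => {
      if (item.type === "text") return item.value;
      params.push(item.value);
      return "?"; // placeholder syntax is backend-specific
    })
    .join("");
  return { sql, params };
}

// ["select * from users where id = ", 5] becomes one parameterised query.
const { sql, params } = renderInterpolated([
  { type: "text", value: "select * from users where id = " },
  { type: "scalar", value: 5, value_type: "number" },
]);
```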
#### Scalar type capabilities
Agents should declare the scalar types they support, along with the comparison operators and aggregate functions on those types.
@ -424,8 +438,11 @@ and here is the resulting query request payload:
```json
{
"table": ["Artist"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [],
"query": {
"where": {
"expressions": [],
@ -455,7 +472,7 @@ The implementation of the service is responsible for intepreting this data struc
Let's break down the request:
- The `table` field tells us which table to fetch the data from, namely the `Artist` table. The table name (ie. the array of strings) must be one that was returned previously by the `/schema` endpoint.
- The `table_relationships` field that lists any relationships used to join between tables in the query. This query does not use any relationships, so this is just an empty list here.
- The `relationships` field that lists any relationships used to join between entities in the query. This query does not use any relationships, so this is just an empty list here.
- The `query` field contains further information about how to query the specified table:
- The `where` field tells us that there is currently no (interesting) predicate being applied to the rows of the data set (just an empty conjunction, which ought to return every row).
- The `order_by` field tells us that there is no particular ordering to use, and that we can return data in its natural order.
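The vacuous truth of an empty conjunction can be sketched in TypeScript (a simplified expression shape with illustrative names, not the full expression tree from this spec):

```typescript
// Simplified: only conjunctions, each with zero or more sub-expressions.
type Expression = { type: "and"; expressions: Expression[] };

// Evaluate a conjunction against a row; with no sub-expressions,
// Array.prototype.every is vacuously true, so every row is kept.
function evaluate(expr: Expression, row: Record<string, unknown>): boolean {
  return expr.expressions.every(e => evaluate(e, row));
}

const keepAll = evaluate({ type: "and", expressions: [] }, { ArtistId: 1 });
```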
@ -516,8 +533,11 @@ This would produce the following agent query request JSON:
```json
{
"table": ["Artist"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [],
"query": {
"aggregates_limit": null,
"limit": 2,
@ -570,8 +590,11 @@ This would produce the following agent query request JSON:
```json
{
"table": ["Artist"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [],
"query": {
"aggregates_limit": 5,
"limit": 2,
@ -839,11 +862,11 @@ If the call to `GET /capabilities` returns a `capabilities` record with a `relat
_Note_ : if the `relationships` capability is not present then `graphql-engine` will not send queries to this agent involving relationships.
Relationship fields are indicated by a `type` field containing the string `relationship`. Such fields will also include the name of the relationship in a field called `relationship`. This name refers to a relationship that is specified on the top-level query request object in the `table_relationships` field.
Relationship fields are indicated by a `type` field containing the string `relationship`. Such fields will also include the name of the relationship in a field called `relationship`. This name refers to a relationship that is specified on the top-level query request object in the `relationships` field.
This `table_relationships` is a list of tables, and for each table, a map of relationship name to relationship information. The information is an object that has a field `target_table` that specifies the name of the related table. It has a field called `relationship_type` that specified either an `object` (many to one) or an `array` (one to many) relationship. There is also a `column_mapping` field that indicates the mapping from columns in the source table to columns in the related table.
The `relationships` field is a list of relationships, each keyed by a source target - a table, function, or interpolated query. For each source, it provides a map from relationship name to relationship information. The information is an object that has a `target` field specifying the related target, a `relationship_type` field specifying either an `object` (many to one) or an `array` (one to many) relationship, and a `column_mapping` field indicating the mapping from columns in the source to columns in the related target.
It is intended that the backend should execute the `query` contained in the relationship field and return the resulting query response as the value of this field, with the additional record-level predicate that any mapped columns should be equal in the context of the current record of the current table.
It is intended that the backend should execute the `query` contained in the relationship field and return the resulting query response as the value of this field, with the additional record-level predicate that any mapped columns should be equal in the context of the current record of the current target.
An example will illustrate this. Consider the following GraphQL query:
@ -862,13 +885,20 @@ This will generate the following JSON query if the agent supports relationships:
```json
{
"table": ["Artist"],
"table_relationships": [
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [
{
"type": "table",
"source_table": ["Artist"],
"relationships": {
"ArtistAlbums": {
"target_table": ["Album"],
"target": {
"type": "table",
"name": ["Album"]
},
"relationship_type": "array",
"column_mapping": {
"ArtistId": "ArtistId"
@ -941,7 +971,7 @@ Note the `Albums` field in particular, which traverses the `Artists` -> `Albums`
}
```
The top-level `table_relationships` can be looked up by starting from the source table (in this case `Artist`), locating the `ArtistAlbums` relationship under that table, then extracting the relationship information. This information includes the `target_table` field which indicates the table to be queried when following this relationship is the `Album` table. The `relationship_type` field indicates that this relationship is an `array` relationship (ie. that it will return zero to many Album rows per Artist row). The `column_mapping` field indicates the column mapping for this relationship, namely that the Artist's `ArtistId` must equal the Album's `ArtistId`.
The top-level `relationships` can be looked up by starting from the source (in this case the `Artist` table), locating the `ArtistAlbums` relationship under it, then extracting the relationship information. This information includes the `target` field, which indicates that the target to be queried when following this relationship is (in this case) the `Album` table. The `relationship_type` field indicates that this relationship is an `array` relationship (ie. that it will return zero to many Album rows per Artist row). The `column_mapping` field indicates the column mapping for this relationship, namely that the Artist's `ArtistId` must equal the Album's `ArtistId`.
Back on the relationship field inside the query, there is another `query` field. This indicates the query that should be executed against the `Album` table, but we must remember to enforce the additional constraint between Artist's `ArtistId` and Album's `ArtistId`. That is, in the context of any single outer `Artist` record, we should populate the `Albums` field with the query response containing the array of Album records for which the `ArtistId` field is equal to the outer record's `ArtistId` field.
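That record-level predicate can be sketched in TypeScript (a hypothetical helper, not part of the API): for each outer Artist row, keep only the Album rows whose mapped columns are equal.

```typescript
type Row = Record<string, unknown>;

// For one outer row, select the target rows whose mapped columns match.
// columnMapping maps source columns to target columns,
// e.g. { ArtistId: "ArtistId" } as in the request above.
function relatedRows(outer: Row, target: Row[], columnMapping: Record<string, string>): Row[] {
  return target.filter(inner =>
    Object.entries(columnMapping).every(([src, tgt]) => outer[src] === inner[tgt])
  );
}

const artist = { ArtistId: 1, Name: "AC/DC" };
const albums = [
  { AlbumId: 1, Title: "For Those About To Rock", ArtistId: 1 },
  { AlbumId: 4, Title: "Let There Be Rock", ArtistId: 1 },
  { AlbumId: 2, Title: "Balls to the Wall", ArtistId: 2 },
];
const acdcAlbums = relatedRows(artist, albums, { ArtistId: "ArtistId" });
```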
@ -982,7 +1012,8 @@ Here's an example (truncated) response:
```
#### Cross-Table Filtering
It is possible to form queries that filter their results by comparing columns across tables via relationships. One way this can happen in Hasura GraphQL Engine is when configuring permissions on a table. It is possible to configure a filter on a table such that it joins to another table in order to compare some data in the filter expression.
It is possible to form queries that filter their results by comparing columns across tables (and other targets) via relationships. One way this can happen in Hasura GraphQL Engine is when configuring permissions on a table. It is possible to configure a filter on a table such that it joins to another table in order to compare some data in the filter expression.
The following metadata when used with HGE configures a `Customer` and `Employee` table, and sets up a select permission rule on `Customer` such that only customers that live in the same country as their SupportRep Employee would be visible to users in the `user` role:
@ -1073,13 +1104,20 @@ We would get the following query request JSON:
```json
{
"table": ["Customer"],
"table_relationships": [
"target": {
"type": "table",
"name": ["Customer"]
},
"relationships": [
{
"type": "table",
"source_table": ["Customer"],
"relationships": {
"SupportRep": {
"target_table": ["Employee"],
"target": {
"type": "table",
"name": ["Employee"]
},
"relationship_type": "object",
"column_mapping": {
"SupportRepId": "EmployeeId"
@ -1151,6 +1189,7 @@ We would get the following query request JSON:
The key point of interest here is in the `where` field where we are comparing between columns. Our first expression is an `exists` expression that specifies a row must exist in the table related to the `Customer` table by the `SupportRep` relationship (ie. the `Employee` table). These rows must match a subexpression that compares the related `Employee`'s `Country` column with `equal` to `Customer`'s `Country` column (as indicated by the `["$"]` path). So, in order to evaluate this condition, we'd need to join the `Employee` table using the `column_mapping` specified in the `SupportRep` relationship. Then if any of the related rows (in this case, only one because it is an `object` relation) contain a `Country` that is equal to Customer row's `Country` the `binary_op` would evaluate to True. This would mean a row exists, so the `exists` evaluates to true, and we don't filter out the Customer row.
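The evaluation described above can be sketched in TypeScript (hypothetical helper names; the real expression tree is more general than this single hard-coded predicate):

```typescript
type Row = Record<string, unknown>;

// Evaluate the permission filter for one Customer row: does a related Employee
// (joined via the SupportRep column mapping, SupportRepId -> EmployeeId) exist
// whose Country equals the Customer's own Country (the `["$"]` path refers
// back to the outer row)?
function supportRepInSameCountry(customer: Row, employees: Row[]): boolean {
  return employees.some(
    emp => emp.EmployeeId === customer.SupportRepId && emp.Country === customer.Country
  );
}

const employees = [{ EmployeeId: 3, Country: "Canada" }];
const visible = supportRepInSameCountry(
  { CustomerId: 1, SupportRepId: 3, Country: "Canada" }, employees);
const hidden = supportRepInSameCountry(
  { CustomerId: 2, SupportRepId: 3, Country: "Brazil" }, employees);
```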
#### Filtering by Unrelated Tables
It is possible to filter a table by a predicate evaluated against a completely unrelated table. This can happen in Hasura GraphQL Engine when configuring permissions on a table.
In the following example, we are configuring HGE's metadata such that when the Customer table is queried by the employee role, the employee currently doing the query (as specified by the `X-Hasura-EmployeeId` session variable) must be an employee from the city of Calgary, otherwise no rows are returned.
@ -1233,8 +1272,11 @@ We would get the following query request JSON:
```json
{
"table": ["Customer"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Customer"]
},
"relationships": [],
"query": {
"fields": {
"Country": {
@ -1328,8 +1370,11 @@ This would cause the following query request to be performed:
```json
{
"table": ["Artist"],
"table_relationships": [],
"target": {
"type": "table",
"table": ["Artist"]
},
"relationships": [],
"query": {
"aggregates": {
"aggregate_max_ArtistId": {
@ -1366,8 +1411,11 @@ query {
```json
{
"table": ["Album"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Album"]
},
"relationships": [],
"query": {
"aggregates": {
"aggregate_distinct_count": {
@ -1418,8 +1466,11 @@ The `nodes` part of the query ends up as standard `fields` in the `Query`, and t
```json
{
"table": ["Artist"],
"table_relationships": [],
"target": {
"type": "table",
"table": ["Artist"]
},
"relationships": [],
"query": {
"aggregates": {
"aggregate_count": {
@ -1490,13 +1541,20 @@ This would generate the following `QueryRequest`:
```json
{
"table": ["Artist"],
"table_relationships": [
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [
{
"type": "table",
"source_table": ["Artist"],
"relationships": {
"Albums": {
"target_table": ["Album"],
"target": {
"type": "table",
"name": ["Album"]
},
"relationship_type": "array",
"column_mapping": {
"ArtistId": "ArtistId"
@ -1599,13 +1657,20 @@ Here's an example of applying an ordering by a related table; the Album table is
```json
{
"table": ["Album"],
"table_relationships": [
"target": {
"type": "table",
"name": ["Album"]
},
"relationships": [
{
"type": "table",
"source_table": ["Album"],
"relationships": {
"Artist": {
"target_table": ["Artist"],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationship_type": "object",
"column_mapping": {
"ArtistId": "ArtistId"
@ -1640,7 +1705,7 @@ Here's an example of applying an ordering by a related table; the Album table is
}
```
Note that the `target_path` specifies the relationship path of `["Artist"]`, and that this relationship is defined in the top-level `table_relationships`. The ordering element target column `Name` would therefore be found on the `Artist` table after joining to it from each `Album`. (See the [Relationships](#Relationships) section for more information about relationships.)
Note that the `target_path` specifies the relationship path of `["Artist"]`, and that this relationship is defined in the top-level `relationships`. The ordering element target column `Name` would therefore be found on the `Artist` table after joining to it from each `Album`. (See the [Relationships](#Relationships) section for more information about relationships.)
The `relations` property of `order_by` will contain all the relations used in the order by, for the purpose of specifying filters that must be applied to the joined tables before using them for sorting. The `relations` property captures all `target_path`s used in the `order_by` in a recursive fashion, so for example, if the following `target_path`s were used in the `order_by`'s `elements`:
@ -1674,13 +1739,20 @@ For example, here's a query that retrieves artists ordered descending by the cou
```json
{
"table": ["Artist"],
"table_relationships": [
"target": {
"type": "table",
"name": ["Artist"]
},
"relationships": [
{
"type": "table",
"source_table": ["Artist"],
"relationships": {
"Albums": {
"target_table": ["Album"],
"target": {
"type": "table",
"name": ["Album"]
},
"relationship_type": "array",
"column_mapping": {
"ArtistId": "ArtistId"
@ -1744,8 +1816,11 @@ Foreach queries are very similar to standard queries, except they include an add
```json
{
"table": ["Album"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Album"]
},
"relationships": [],
"query": {
"fields": {
"AlbumId": {
@ -1878,8 +1953,11 @@ Here's an example query, querying all the columns of the above example `Test` ta
```jsonc
{
"table": ["Test"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Test"]
},
"relationships": [],
// Redaction expressions are defined per table/function
"redaction_expressions": [
{
@ -1984,8 +2062,11 @@ For example, here's an aggregation query:
```jsonc
{
"table": ["Test"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Test"]
},
"relationships": [],
// Redaction expressions are defined per table/function
"redaction_expressions": [
{
@ -2059,8 +2140,11 @@ For example, here's a query that uses redaction inside the filter expression in
```jsonc
{
"table": ["Test"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Test"]
},
"relationships": [],
// Redaction expressions are defined per table/function
"redaction_expressions": [
{
@ -2127,8 +2211,11 @@ For example, here's a query that uses redaction inside the `order_by`:
```jsonc
{
"table": ["Test"],
"table_relationships": [],
"target": {
"type": "table",
"name": ["Test"]
},
"relationships": [],
// Redaction expressions are defined per table/function
"redaction_expressions": [
{
@ -2220,7 +2307,7 @@ The `POST /mutation` endpoint is invoked when the user issues a mutation GraphQL
```jsonc
{
"table_relationships": [], // Any relationships between tables are described in here in the same manner as in queries
"relationships": [], // Any relationships between tables are described in here in the same manner as in queries
"operations": [ // A mutation request can contain multiple mutation operations
{
"type": "insert", // Also: "update" and "delete"
@ -2295,7 +2382,7 @@ This would result in a mutation request like this:
```json
{
"table_relationships": [],
"relationships": [],
"insert_schema": [
{
"table": ["Artist"],
@ -2431,19 +2518,26 @@ This would result in the following request:
```json
{
"table_relationships": [
"relationships": [
{
"type": "table",
"source_table": ["Album"],
"relationships": {
"Artist": {
"target_table": ["Artist"],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationship_type": "object",
"column_mapping": {
"ArtistId": "ArtistId"
}
},
"Tracks": {
"target_table": ["Track"],
"target": {
"type": "table",
"name": ["Track"]
},
"relationship_type": "array",
"column_mapping": {
"AlbumId": "AlbumId"
@ -2593,10 +2687,10 @@ This would result in the following request:
Note that there are two new types of fields in the `insert_schema` in this query to capture the nested inserts:
* `object_relation`: This captures a nested insert across an object relationship. In this case, we're inserting the related Artist row.
* `relationship`: The name of the relationship across which to insert the related row. The information about this relationship can be looked up in `table_relationships`.
* `relationship`: The name of the relationship across which to insert the related row. The information about this relationship can be looked up in `relationships`.
* `insert_order`: This can be either `before_parent` or `after_parent` and indicates whether or not the related row needs to be inserted before the parent row or after it.
* `array_relation`: This captures a nested insert across an array relationship. In this case, we're inserting the related Tracks rows.
* `relationship`: The name of the relationship across which to insert the related rows. The information about this relationship can be looked up in `table_relationships`.
* `relationship`: The name of the relationship across which to insert the related rows. The information about this relationship can be looked up in `relationships`.
The agent is expected to set the necessary values of foreign key columns itself when inserting all the rows. In this example, the agent would:
* First insert the Artist.
@ -2670,7 +2764,7 @@ This would get translated into a mutation request like so:
```json
{
"table_relationships": [],
"relationships": [],
"operations": [
{
"type": "update",
@ -2757,7 +2851,7 @@ This would cause a mutation request to be send that looks like this:
```json
{
"table_relationships": [],
"relationships": [],
"operations": [
{
"type": "delete",
@ -2911,9 +3005,10 @@ Query Sent to Agent:
],
"relationships": {
"myself": {
"target_table": [
"Artist"
],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationship_type": "object",
"column_mapping": {
"ArtistId": "ArtistId"
@ -2928,9 +3023,10 @@ Query Sent to Agent:
],
"relationships": {
"myself": {
"target_table": [
"Artist"
],
"target": {
"type": "table",
"name": ["Artist"]
},
"relationship_type": "object",
"column_mapping": {
"ArtistId": "ArtistId"
@ -2969,6 +3065,204 @@ Query Sent to Agent:
}
```
### Interpolated Queries (Native Queries)
Interpolated queries are "integrated raw queries" in the sense that the interpretation is up to the agent, and knowledge of their format is intended to be propagated out-of-band. They are integrated in that they can have relationships, and their responses are exposed via fields.
If this is too abstract, then an example may help illuminate:
* An HGE administrator has an application that has basic search functionality that leverages `like` parameters.
* They wish to use advanced full-text-search capabilities that their DB backend provides.
* They want to avoid having to define a function in their database schema, whether due to a lack of DB permissions or for expedience.
* Via the `raw` query interface, the administrator develops an SQL query that leverages the full-text-search functions to deliver the results they require.
* They add a new logical-model to represent the results format.
* They add a new native query and use `{{variable}}` references to user parameters.
* Application developers reference the new search functions.
Agents may implement "interpolated query" support that powers HGE native queries by doing the following:
* Including the `interpolated_queries: {}` capability
* Handling the `"type": "interpolated"` query-request target
* (Optionally) handling `"type": "interpolated_query"` relationship targets
Requests that include interpolated queries define them (unsurprisingly) under the `interpolated_queries` field, keyed by id, but also including the id in the definition for convenience. References to them from `target`s will also be via this id.
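A minimal TypeScript sketch of dispatching on the request target (shapes simplified from the examples in this document; names are illustrative):

```typescript
// Simplified target shapes, assumed from the request examples in this spec.
type Target =
  | { type: "table"; name: string[] }
  | { type: "function"; name: string[]; arguments: unknown[] }
  | { type: "interpolated"; query_id: string };

// An agent's query handler can dispatch on the target's type tag, looking
// interpolated queries up by id in the request's `interpolated_queries` map.
function describeTarget(target: Target, interpolatedQueries: Record<string, unknown[]>): string {
  switch (target.type) {
    case "table":
      return `table ${target.name.join(".")}`;
    case "function":
      return `function ${target.name.join(".")}`;
    case "interpolated": {
      const items = interpolatedQueries[target.query_id];
      if (!items) throw new Error(`unknown interpolated query: ${target.query_id}`);
      return `interpolated query ${target.query_id} (${items.length} item(s))`;
    }
  }
}

const desc = describeTarget(
  { type: "interpolated", query_id: "native_baz_1" },
  { native_baz_1: [{ type: "text", value: "select 1" }] }
);
```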
While it is not required, good `/explain` integration for interpolated queries is very useful, as they are far more opaque to application developers than the built-in query translations of HGE and Hasura's published data connectors.
#### Example
Metadata Schema:
```json
{
"resource_version": 9,
"metadata": {
"version": 3,
"sources": [
{
"name": "chinook",
"kind": "sqlite",
"tables": [
{
"table": [
"Artist"
]
}
],
"native_queries": [
{
"arguments": {},
"code": "select 'db0d9bd6-ca4e-4eb4-8798-944b9536eb3d' as a",
"object_relationships": [
{
"name": "quux",
"using": {
"column_mapping": {
"a": "x"
},
"insertion_order": null,
"remote_native_query": "native_quux"
}
}
],
"returns": "logical_baz",
"root_field_name": "native_baz"
},
{
"arguments": {
"y": {
"nullable": false,
"type": "Boolean"
}
},
"code": "select 'db0d9bd6-ca4e-4eb4-8798-944b9536eb3d' as x /* {{y}} */",
"returns": "logical_quux",
"root_field_name": "native_quux"
}
],
"logical_models": [
{
"fields": [
{
"name": "a",
"type": {
"nullable": false,
"scalar": "String"
}
},
{
"name": "quux",
"type": {
"logical_model": "logical_quux",
"nullable": true
}
}
],
"name": "logical_baz"
},
{
"fields": [
{
"name": "x",
"type": {
"nullable": false,
"scalar": "String"
}
}
],
"name": "logical_quux"
}
],
"configuration": {
"template": null,
"timeout": null,
"value": {
"db": "./dataset_clones/db.chinook.sqlite"
}
}
}
],
"backend_configs": {
"dataconnector": {
"sqlite": {
"uri": "http://localhost:8100"
}
}
}
}
}
```
GraphQL:
```graphql
{
native_baz {
a
quux(args: {y: true}) {
x
}
}
}
```
Agent API Request:
```json
{
"target": {
"type": "interpolated",
"query_id": "native_baz_1",
"arguments": null
},
"interpolated_queries": {
"native_baz_1": ["select 'db0d9bd6-ca4e-4eb4-8798-944b9536eb3d' as a"],
"native_baz_2": ["select 'db0d9bd6-ca4e-4eb4-8798-944b9536eb3d' as x /* ", {"type": "bool", "value": true}," */"]
},
"relationships": [
{
"type": "interpolated",
"source_interpolated_query": "native_baz_1",
"relationships": {
"quux": {
"target": {
"type": "interpolated",
"query_id": "native_baz_2",
"arguments": null
},
"relationship_type": "object",
"column_mapping": {
"a": "x"
}
}
}
}
],
"query": {
"fields": {
"quux": {
"type": "relationship",
"relationship": "quux",
"query": {
"fields": {
"x": {
"type": "column",
"column": "x",
"column_type": "String"
}
}
}
},
"a": {
"type": "column",
"column": "a",
"column_type": "String"
}
}
}
}
```
### Datasets
The `/datasets` resource is available to use in order to create new databases/schemas from templates.
@ -3006,4 +3300,4 @@ flowchart TD;
CONFIG --> SCHEMA["POST /schema"];
CONFIG --> QUERY["POST /query"];
CONFIG --> MUTATION["POST /mutation"];
```
```


@ -1,6 +1,6 @@
{
"name": "@hasura/dc-api-types",
"version": "0.37.0",
"version": "0.40.0",
"description": "Hasura GraphQL Engine Data Connector Agent API types",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",


@ -507,6 +507,9 @@
"explain": {
"$ref": "#/components/schemas/ExplainCapabilities"
},
"interpolated_queries": {
"$ref": "#/components/schemas/InterpolatedQueryCapabilities"
},
"licensing": {
"$ref": "#/components/schemas/Licensing"
},
@ -695,6 +698,7 @@
"type": "object"
},
"RelationshipCapabilities": {},
"InterpolatedQueryCapabilities": {},
"SubqueryComparisonCapabilities": {
"nullable": true,
"properties": {
@ -1528,21 +1532,68 @@
"type": "null"
},
"QueryRequest": {
"discriminator": {
"mapping": {
"function": "FunctionRequest",
"table": "TableRequest"
"properties": {
"foreach": {
"description": "If present, a list of columns and values for the columns that the query must be repeated for, applying the column values as a filter for each query.",
"items": {
"additionalProperties": {
"$ref": "#/components/schemas/ScalarValue"
},
"type": "object"
},
"nullable": true,
"type": "array"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/FunctionRequest"
"interpolated_queries": {
"$ref": "#/components/schemas/InterpolatedQueries"
},
{
"$ref": "#/components/schemas/TableRequest"
"query": {
"$ref": "#/components/schemas/Query"
},
"redaction_expressions": {
"default": [],
"description": "Expressions that can be referenced by the query to redact fields/columns",
"items": {
"$ref": "#/components/schemas/TargetRedactionExpressions"
},
"type": "array"
},
"relationships": {
"description": "The relationships between tables involved in the entire query request",
"items": {
"$ref": "#/components/schemas/Relationships"
},
"type": "array"
},
"target": {
"$ref": "#/components/schemas/Target"
}
]
},
"required": [
"target",
"relationships",
"query"
],
"type": "object"
},
"TInterpolated": {
"properties": {
"id": {
"description": "The id for the query interpolation template",
"type": "string"
},
"type": {
"enum": [
"interpolated"
],
"type": "string"
}
},
"required": [
"id",
"type"
],
"type": "object"
},
"ScalarArgumentValue": {
"properties": {
@ -1615,6 +1666,71 @@
}
]
},
"TFunction": {
"properties": {
"arguments": {
"description": "The arguments of the function",
"items": {
"$ref": "#/components/schemas/FunctionRequestArgument"
},
"type": "array"
},
"name": {
"$ref": "#/components/schemas/FunctionName"
},
"type": {
"enum": [
"function"
],
"type": "string"
}
},
"required": [
"name",
"arguments",
"type"
],
"type": "object"
},
"TTable": {
"properties": {
"name": {
"$ref": "#/components/schemas/TableName"
},
"type": {
"enum": [
"table"
],
"type": "string"
}
},
"required": [
"name",
"type"
],
"type": "object"
},
"Target": {
"discriminator": {
"mapping": {
"function": "TFunction",
"interpolated": "TInterpolated",
"table": "TTable"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/TInterpolated"
},
{
"$ref": "#/components/schemas/TFunction"
},
{
"$ref": "#/components/schemas/TTable"
}
]
},
"RelationshipType": {
"enum": [
"object",
@ -1634,24 +1750,51 @@
"relationship_type": {
"$ref": "#/components/schemas/RelationshipType"
},
"target_table": {
"$ref": "#/components/schemas/TableName"
"target": {
"$ref": "#/components/schemas/Target"
}
},
"required": [
"target_table",
"target",
"relationship_type",
"column_mapping"
],
"type": "object"
},
"InterpolatedRelationships": {
"properties": {
"relationships": {
"additionalProperties": {
"$ref": "#/components/schemas/Relationship"
},
"description": "A map of relationships from the interpolated table to targets. The key of the map is the relationship name",
"type": "object"
},
"source_interpolated_query": {
"description": "The source interpolated query involved in the relationship",
"type": "string"
},
"type": {
"enum": [
"interpolated"
],
"type": "string"
}
},
"required": [
"source_interpolated_query",
"relationships",
"type"
],
"type": "object"
},
"FunctionRelationships": {
"properties": {
"relationships": {
"additionalProperties": {
"$ref": "#/components/schemas/Relationship"
},
"description": "A map of relationships from the source table to target tables. The key of the map is the relationship name",
"description": "A map of relationships from the source function to targets. The key of the map is the relationship name",
"type": "object"
},
"source_function": {
@ -1677,7 +1820,7 @@
"additionalProperties": {
"$ref": "#/components/schemas/Relationship"
},
"description": "A map of relationships from the source table to target tables. The key of the map is the relationship name",
"description": "A map of relationships from the source table to targets. The key of the map is the relationship name",
"type": "object"
},
"source_table": {
@ -1691,9 +1834,9 @@
}
},
"required": [
"type",
"source_table",
"relationships"
"relationships",
"type"
],
"type": "object"
},
@ -1701,11 +1844,15 @@
"discriminator": {
"mapping": {
"function": "FunctionRelationships",
"interpolated": "InterpolatedRelationships",
"table": "TableRelationships"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/InterpolatedRelationships"
},
{
"$ref": "#/components/schemas/FunctionRelationships"
},
@ -1714,6 +1861,25 @@
}
]
},
"TNInterpolatedQuery": {
"properties": {
"interpolated": {
"description": "The id of the interpolated query",
"type": "string"
},
"type": {
"enum": [
"interpolated"
],
"type": "string"
}
},
"required": [
"interpolated",
"type"
],
"type": "object"
},
"TNFunction": {
"properties": {
"function": {
@ -1754,11 +1920,15 @@
"discriminator": {
"mapping": {
"function": "TNFunction",
"interpolated": "TNInterpolatedQuery",
"table": "TNTable"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/TNInterpolatedQuery"
},
{
"$ref": "#/components/schemas/TNFunction"
},
@ -2183,6 +2353,89 @@
],
"type": "object"
},
"InterpolatedText": {
"properties": {
"type": {
"enum": [
"text"
],
"type": "string"
},
"value": {
"type": "string"
}
},
"required": [
"value",
"type"
],
"type": "object"
},
"InterpolatedScalar": {
"properties": {
"type": {
"enum": [
"scalar"
],
"type": "string"
},
"value": {
"additionalProperties": true
},
"value_type": {
"$ref": "#/components/schemas/ScalarType"
}
},
"required": [
"value",
"value_type",
"type"
],
"type": "object"
},
"InterpolatedItem": {
"discriminator": {
"mapping": {
"scalar": "InterpolatedScalar",
"text": "InterpolatedText"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/InterpolatedText"
},
{
"$ref": "#/components/schemas/InterpolatedScalar"
}
]
},
"InterpolatedQuery": {
"properties": {
"id": {
"description": "An id associated with the interpolated query; it should be unique across the request",
"type": "string"
},
"items": {
"description": "Interpolated items in the query",
"items": {
"$ref": "#/components/schemas/InterpolatedItem"
},
"type": "array"
}
},
"required": [
"id",
"items"
],
"type": "object"
},
"InterpolatedQueries": {
"additionalProperties": {
"$ref": "#/components/schemas/InterpolatedQuery"
},
"type": "object"
},
"Field": {
"discriminator": {
"mapping": {
@ -2629,52 +2882,6 @@
}
]
},
"FunctionRequest": {
"properties": {
"function": {
"$ref": "#/components/schemas/FunctionName"
},
"function_arguments": {
"default": [],
"description": "Function Arguments. TODO. Improve this.",
"items": {
"$ref": "#/components/schemas/FunctionRequestArgument"
},
"type": "array"
},
"query": {
"$ref": "#/components/schemas/Query"
},
"redaction_expressions": {
"default": [],
"description": "Expressions that can be referenced by the query to redact fields/columns",
"items": {
"$ref": "#/components/schemas/TargetRedactionExpressions"
},
"type": "array"
},
"relationships": {
"description": "The relationships between entities involved in the entire query request",
"items": {
"$ref": "#/components/schemas/Relationships"
},
"type": "array"
},
"type": {
"enum": [
"function"
],
"type": "string"
}
},
"required": [
"function",
"relationships",
"query",
"type"
],
"type": "object"
},
"ScalarValue": {
"properties": {
"value": {
@ -2690,55 +2897,6 @@
],
"type": "object"
},
"TableRequest": {
"properties": {
"foreach": {
"description": "If present, a list of columns and values for the columns that the query must be repeated for, applying the column values as a filter for each query.",
"items": {
"additionalProperties": {
"$ref": "#/components/schemas/ScalarValue"
},
"type": "object"
},
"nullable": true,
"type": "array"
},
"query": {
"$ref": "#/components/schemas/Query"
},
"redaction_expressions": {
"default": [],
"description": "Expressions that can be referenced by the query to redact fields/columns",
"items": {
"$ref": "#/components/schemas/TargetRedactionExpressions"
},
"type": "array"
},
"table": {
"$ref": "#/components/schemas/TableName"
},
"table_relationships": {
"description": "The relationships between tables involved in the entire query request",
"items": {
"$ref": "#/components/schemas/Relationships"
},
"type": "array"
},
"type": {
"enum": [
"table"
],
"type": "string"
}
},
"required": [
"table",
"table_relationships",
"query",
"type"
],
"type": "object"
},
"ExplainResponse": {
"properties": {
"lines": {
@ -2834,16 +2992,16 @@
},
"type": "array"
},
"table_relationships": {
"description": "The relationships between tables involved in the entire mutation request",
"relationships": {
"description": "The relationships involved in the entire mutation request",
"items": {
"$ref": "#/components/schemas/TableRelationships"
"$ref": "#/components/schemas/Relationships"
},
"type": "array"
}
},
"required": [
"table_relationships",
"relationships",
"insert_schema",
"operations"
],

View File

@ -61,7 +61,6 @@ export type { FunctionInfo } from './models/FunctionInfo';
export type { FunctionInformationArgument } from './models/FunctionInformationArgument';
export type { FunctionName } from './models/FunctionName';
export type { FunctionRelationships } from './models/FunctionRelationships';
export type { FunctionRequest } from './models/FunctionRequest';
export type { FunctionRequestArgument } from './models/FunctionRequestArgument';
export type { FunctionReturnsTable } from './models/FunctionReturnsTable';
export type { FunctionReturnsUnknown } from './models/FunctionReturnsUnknown';
@ -71,6 +70,13 @@ export type { GraphQLType } from './models/GraphQLType';
export type { InsertCapabilities } from './models/InsertCapabilities';
export type { InsertFieldSchema } from './models/InsertFieldSchema';
export type { InsertMutationOperation } from './models/InsertMutationOperation';
export type { InterpolatedItem } from './models/InterpolatedItem';
export type { InterpolatedQueries } from './models/InterpolatedQueries';
export type { InterpolatedQuery } from './models/InterpolatedQuery';
export type { InterpolatedQueryCapabilities } from './models/InterpolatedQueryCapabilities';
export type { InterpolatedRelationships } from './models/InterpolatedRelationships';
export type { InterpolatedScalar } from './models/InterpolatedScalar';
export type { InterpolatedText } from './models/InterpolatedText';
export type { Licensing } from './models/Licensing';
export type { MetricsCapabilities } from './models/MetricsCapabilities';
export type { MutationCapabilities } from './models/MutationCapabilities';
@ -137,12 +143,16 @@ export type { TableInfo } from './models/TableInfo';
export type { TableInsertSchema } from './models/TableInsertSchema';
export type { TableName } from './models/TableName';
export type { TableRelationships } from './models/TableRelationships';
export type { TableRequest } from './models/TableRequest';
export type { TableType } from './models/TableType';
export type { Target } from './models/Target';
export type { TargetName } from './models/TargetName';
export type { TargetRedactionExpressions } from './models/TargetRedactionExpressions';
export type { TFunction } from './models/TFunction';
export type { TInterpolated } from './models/TInterpolated';
export type { TNFunction } from './models/TNFunction';
export type { TNInterpolatedQuery } from './models/TNInterpolatedQuery';
export type { TNTable } from './models/TNTable';
export type { TTable } from './models/TTable';
export type { UnaryComparisonOperator } from './models/UnaryComparisonOperator';
export type { UniqueIdentifierGenerationStrategy } from './models/UniqueIdentifierGenerationStrategy';
export type { UnrelatedTable } from './models/UnrelatedTable';

View File

@ -6,6 +6,7 @@ import type { ComparisonCapabilities } from './ComparisonCapabilities';
import type { DataSchemaCapabilities } from './DataSchemaCapabilities';
import type { DatasetCapabilities } from './DatasetCapabilities';
import type { ExplainCapabilities } from './ExplainCapabilities';
import type { InterpolatedQueryCapabilities } from './InterpolatedQueryCapabilities';
import type { Licensing } from './Licensing';
import type { MetricsCapabilities } from './MetricsCapabilities';
import type { MutationCapabilities } from './MutationCapabilities';
@ -21,6 +22,7 @@ export type Capabilities = {
data_schema?: DataSchemaCapabilities;
datasets?: DatasetCapabilities;
explain?: ExplainCapabilities;
interpolated_queries?: InterpolatedQueryCapabilities;
licensing?: Licensing;
metrics?: MetricsCapabilities;
mutations?: MutationCapabilities;

View File

@ -7,7 +7,7 @@ import type { Relationship } from './Relationship';
export type FunctionRelationships = {
/**
* A map of relationships from the source table to target tables. The key of the map is the relationship name
* A map of relationships from the source function to targets. The key of the map is the relationship name
*/
relationships: Record<string, Relationship>;
source_function: FunctionName;

View File

@ -1,28 +0,0 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { FunctionName } from './FunctionName';
import type { FunctionRequestArgument } from './FunctionRequestArgument';
import type { Query } from './Query';
import type { Relationships } from './Relationships';
import type { TargetRedactionExpressions } from './TargetRedactionExpressions';
export type FunctionRequest = {
function: FunctionName;
/**
* Function Arguments. TODO. Improve this.
*/
function_arguments?: Array<FunctionRequestArgument>;
query: Query;
/**
* Expressions that can be referenced by the query to redact fields/columns
*/
redaction_expressions?: Array<TargetRedactionExpressions>;
/**
* The relationships between entities involved in the entire query request
*/
relationships: Array<Relationships>;
type: 'function';
};

View File

@ -0,0 +1,9 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { InterpolatedScalar } from './InterpolatedScalar';
import type { InterpolatedText } from './InterpolatedText';
export type InterpolatedItem = (InterpolatedText | InterpolatedScalar);

View File

@ -0,0 +1,7 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { InterpolatedQuery } from './InterpolatedQuery';
export type InterpolatedQueries = Record<string, InterpolatedQuery>;

View File

@ -0,0 +1,17 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { InterpolatedItem } from './InterpolatedItem';
export type InterpolatedQuery = {
/**
* An id associated with the interpolated query; it should be unique across the request
*/
id: string;
/**
* Interpolated items in the query
*/
items: Array<InterpolatedItem>;
};
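For illustration, here is a sketch of an `InterpolatedQueries` map as it might appear in a request body, where each map key duplicates its query's `id`. The types are inlined (and `value_type` is simplified to a plain string in place of `ScalarType`), and the query id, table name, and SQL are hypothetical:

```typescript
// Minimal inlined copies of the generated shapes, so the sketch is self-contained.
type InterpolatedItem =
  | { type: 'text'; value: string }
  | { type: 'scalar'; value: any; value_type: string };
type InterpolatedQuery = { id: string; items: InterpolatedItem[] };
type InterpolatedQueries = Record<string, InterpolatedQuery>;

// Hypothetical native query: text fragments carry the SQL verbatim, while
// scalar items carry typed values to be bound as parameters.
const interpolatedQueries: InterpolatedQueries = {
  top_artists: {
    id: 'top_artists',
    items: [
      { type: 'text', value: 'SELECT * FROM Artist LIMIT ' },
      { type: 'scalar', value: 5, value_type: 'number' },
    ],
  },
};
```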

View File

@ -0,0 +1,7 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type InterpolatedQueryCapabilities = {
};

View File

@ -0,0 +1,18 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { Relationship } from './Relationship';
export type InterpolatedRelationships = {
/**
* A map of relationships from the interpolated table to targets. The key of the map is the relationship name
*/
relationships: Record<string, Relationship>;
/**
* The source interpolated query involved in the relationship
*/
source_interpolated_query: string;
type: 'interpolated';
};

View File

@ -0,0 +1,12 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { ScalarType } from './ScalarType';
export type InterpolatedScalar = {
type: 'scalar';
value: any;
value_type: ScalarType;
};

View File

@ -0,0 +1,9 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type InterpolatedText = {
type: 'text';
value: string;
};

View File

@ -3,8 +3,8 @@
/* eslint-disable */
import type { MutationOperation } from './MutationOperation';
import type { Relationships } from './Relationships';
import type { TableInsertSchema } from './TableInsertSchema';
import type { TableRelationships } from './TableRelationships';
import type { TargetRedactionExpressions } from './TargetRedactionExpressions';
export type MutationRequest = {
@ -21,8 +21,8 @@ export type MutationRequest = {
*/
redaction_expressions?: Array<TargetRedactionExpressions>;
/**
* The relationships between tables involved in the entire mutation request
* The relationships involved in the entire mutation request
*/
table_relationships: Array<TableRelationships>;
relationships: Array<Relationships>;
};

View File

@ -2,8 +2,28 @@
/* tslint:disable */
/* eslint-disable */
import type { FunctionRequest } from './FunctionRequest';
import type { TableRequest } from './TableRequest';
import type { InterpolatedQueries } from './InterpolatedQueries';
import type { Query } from './Query';
import type { Relationships } from './Relationships';
import type { ScalarValue } from './ScalarValue';
import type { Target } from './Target';
import type { TargetRedactionExpressions } from './TargetRedactionExpressions';
export type QueryRequest = (FunctionRequest | TableRequest);
export type QueryRequest = {
/**
* If present, a list of columns and values for the columns that the query must be repeated for, applying the column values as a filter for each query.
*/
foreach?: Array<Record<string, ScalarValue>> | null;
interpolated_queries?: InterpolatedQueries;
query: Query;
/**
* Expressions that can be referenced by the query to redact fields/columns
*/
redaction_expressions?: Array<TargetRedactionExpressions>;
/**
* The relationships between tables involved in the entire query request
*/
relationships: Array<Relationships>;
target: Target;
};
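To illustrate the consolidated request shape, here is a hypothetical `QueryRequest` whose target refers to a native query by id, with that id keying into the top-level `interpolated_queries` map. The local types are simplified stand-ins for the generated ones (`TableName`/`FunctionName` reduced to string arrays, `query` reduced to the fields this sketch needs), and all names are illustrative:

```typescript
// Simplified local copies of the generated shapes.
type Target =
  | { type: 'table'; name: string[] }
  | { type: 'function'; name: string[]; arguments: unknown[] }
  | { type: 'interpolated'; id: string };
type QueryRequest = {
  target: Target;
  relationships: unknown[];
  query: { fields?: Record<string, unknown> };
  interpolated_queries?: Record<string, { id: string; items: unknown[] }>;
};

// Hypothetical request: the interpolated target's id resolves against
// the interpolated_queries map carried alongside it.
const request: QueryRequest = {
  target: { type: 'interpolated', id: 'top_artists' },
  relationships: [],
  query: { fields: { Name: { type: 'column', column: 'Name', column_type: 'string' } } },
  interpolated_queries: {
    top_artists: {
      id: 'top_artists',
      items: [{ type: 'text', value: 'SELECT * FROM Artist LIMIT 5' }],
    },
  },
};
```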

View File

@ -3,7 +3,7 @@
/* eslint-disable */
import type { RelationshipType } from './RelationshipType';
import type { TableName } from './TableName';
import type { Target } from './Target';
export type Relationship = {
/**
@ -11,6 +11,6 @@ export type Relationship = {
*/
column_mapping: Record<string, string>;
relationship_type: RelationshipType;
target_table: TableName;
target: Target;
};

View File

@ -3,7 +3,8 @@
/* eslint-disable */
import type { FunctionRelationships } from './FunctionRelationships';
import type { InterpolatedRelationships } from './InterpolatedRelationships';
import type { TableRelationships } from './TableRelationships';
export type Relationships = (FunctionRelationships | TableRelationships);
export type Relationships = (InterpolatedRelationships | FunctionRelationships | TableRelationships);

View File

@ -0,0 +1,16 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { FunctionName } from './FunctionName';
import type { FunctionRequestArgument } from './FunctionRequestArgument';
export type TFunction = {
/**
* The arguments of the function
*/
arguments: Array<FunctionRequestArgument>;
name: FunctionName;
type: 'function';
};

View File

@ -0,0 +1,12 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type TInterpolated = {
/**
* The id for the query interpolation template
*/
id: string;
type: 'interpolated';
};

View File

@ -0,0 +1,12 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type TNInterpolatedQuery = {
/**
* The id of the interpolated query
*/
interpolated: string;
type: 'interpolated';
};

View File

@ -0,0 +1,11 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { TableName } from './TableName';
export type TTable = {
name: TableName;
type: 'table';
};

View File

@ -7,7 +7,7 @@ import type { TableName } from './TableName';
export type TableRelationships = {
/**
* A map of relationships from the source table to target tables. The key of the map is the relationship name
* A map of relationships from the source table to targets. The key of the map is the relationship name
*/
relationships: Record<string, Relationship>;
source_table: TableName;

View File

@ -1,28 +0,0 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { Query } from './Query';
import type { Relationships } from './Relationships';
import type { ScalarValue } from './ScalarValue';
import type { TableName } from './TableName';
import type { TargetRedactionExpressions } from './TargetRedactionExpressions';
export type TableRequest = {
/**
* If present, a list of columns and values for the columns that the query must be repeated for, applying the column values as a filter for each query.
*/
foreach?: Array<Record<string, ScalarValue>> | null;
query: Query;
/**
* Expressions that can be referenced by the query to redact fields/columns
*/
redaction_expressions?: Array<TargetRedactionExpressions>;
table: TableName;
/**
* The relationships between tables involved in the entire query request
*/
table_relationships: Array<Relationships>;
type: 'table';
};

View File

@ -0,0 +1,10 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { TFunction } from './TFunction';
import type { TInterpolated } from './TInterpolated';
import type { TTable } from './TTable';
export type Target = (TInterpolated | TFunction | TTable);
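Dispatching on the `Target` discriminator can be sketched as follows. The helper names are hypothetical (not part of the generated package), `TableName`/`FunctionName` are simplified to string arrays, and the `never`-typed fallback mirrors the `unreachable` pattern used by the reference agent:

```typescript
// Simplified local copy of the Target union.
type Target =
  | { type: 'table'; name: string[] }
  | { type: 'function'; name: string[]; arguments: unknown[] }
  | { type: 'interpolated'; id: string };

// The `never` parameter makes the compiler flag any unhandled variant.
function unreachable(x: never): never { throw new Error(`unreachable: ${x}`); }

function describeTarget(target: Target): string {
  switch (target.type) {
    case 'table': return `table ${target.name.join('.')}`;
    case 'function': return `function ${target.name.join('.')}`;
    case 'interpolated': return `interpolated query ${target.id}`;
    default: return unreachable(target);
  }
}
```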

View File

@ -3,7 +3,8 @@
/* eslint-disable */
import type { TNFunction } from './TNFunction';
import type { TNInterpolatedQuery } from './TNInterpolatedQuery';
import type { TNTable } from './TNTable';
export type TargetName = (TNFunction | TNTable);
export type TargetName = (TNInterpolatedQuery | TNFunction | TNTable);

View File

@ -24,7 +24,7 @@
},
"dc-api-types": {
"name": "@hasura/dc-api-types",
"version": "0.37.0",
"version": "0.40.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
@ -2227,7 +2227,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -2547,7 +2547,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify": "^4.13.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -2868,7 +2868,7 @@
"version": "file:reference",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/xml2js": "^0.4.11",
@ -3080,7 +3080,7 @@
"version": "file:sqlite",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/sqlite3": "^3.1.8",

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -52,7 +52,7 @@
"integrity": "sha512-lgHwxlxV1qIg1Eap7LgIeoBWIMFibOjbrYPIPJZcI1mmGAI2m3lNYpK12Y+GBdPQ0U1hRwSord7GIaawz962qQ=="
},
"node_modules/@hasura/dc-api-types": {
"version": "0.37.0",
"version": "0.40.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",

View File

@ -122,10 +122,12 @@ export const prettyPrintName = (name: TableName | FunctionName): string => {
export const prettyPrintTargetName = (name: TargetName): string => {
switch (name.type) {
case "table":
case 'table':
return prettyPrintName(name.table);
case "function":
case 'function':
return prettyPrintName(name.function);
case 'interpolated':
return name.interpolated;
default:
return unreachable(name["type"]);
}
@ -221,7 +223,13 @@ const makePerformExistsSubquery = (
case "related":
const relationship = findRelationship(exists.in_table.relationship);
const joinExpression = createFilterExpressionForRelationshipJoin(row, relationship);
return [relationship.target_table, joinExpression];
const relationshipTarget = relationship.target;
switch(relationshipTarget.type) {
case 'table':
return [relationshipTarget.name, joinExpression];
default:
throw new Error('makePerformExistsSubquery: only table relationships currently supported');
}
case "unrelated":
return [exists.in_table.table, undefined];
default:
@ -361,8 +369,14 @@ const makeGetOrderByElementValue = (
if (subquery === null) {
return null;
} else {
const queryResponse = performQuery({type: "table", table: relationship.target_table}, subquery);
return extractResultFromOrderByElementQueryResponse(innerOrderByElement, queryResponse);
const relationshipTarget = relationship.target;
switch(relationshipTarget.type) {
case 'table':
const queryResponse = performQuery({type: "table", table: relationshipTarget.name}, subquery);
return extractResultFromOrderByElementQueryResponse(innerOrderByElement, queryResponse);
default:
throw new Error("makeGetOrderByElementValue: Only table relationships currently supported");
}
}
}
};
@ -405,17 +419,7 @@ const paginateRows = (rows: Iterable<Record<string, RawScalarValue>>, offset: nu
};
const makeFindRelationship = (request: QueryRequest, targetName: TargetName) => (relationshipName: RelationshipName): Relationship => {
const relationships = (() => {
switch(request.type) {
case 'table':
return request.table_relationships;
case 'function':
return request.relationships;
default:
return unreachable(request["type"]);
}})();
for (var r of relationships) {
for (var r of request.relationships) {
switch(targetName.type) {
case 'table':
if(r.type === 'table') {
@ -437,6 +441,8 @@ const makeFindRelationship = (request: QueryRequest, targetName: TargetName) =>
}
}
break;
case 'interpolated':
throw new Error('makeFindRelationship: interpolatedQuery targets not supported');
default:
return unreachable(targetName["type"]);
}
@ -551,7 +557,14 @@ const projectRow = (
case "relationship":
const relationship = findRelationship(field.relationship);
const subquery = addRelationshipFilterToQuery(row, relationship, field.query);
projectedRow[fieldName] = subquery ? performQuery({type: "table", table: relationship.target_table}, subquery) : { aggregates: null, rows: null };
const relationshipTarget = relationship.target;
switch(relationshipTarget.type) {
case 'table':
projectedRow[fieldName] = subquery ? performQuery({type: "table", table: relationshipTarget.name}, subquery) : { aggregates: null, rows: null };
break;
default:
throw new Error(`projectRow: relationships currently only work for tables - Target: ${JSON.stringify(relationshipTarget)}`);
}
break;
case "object":
@ -675,10 +688,12 @@ export type Rows = Record<string, RawScalarValue>[]; // Record<string, ScalarVal
export const queryData = (getTable: (tableName: TableName) => Rows | undefined, queryRequest: QueryRequest): QueryResponse => {
const getTableRows = (targetName: TargetName): Rows | undefined => {
switch (targetName.type) {
case "table":
case 'table':
return getTable(targetName.table);
case "function":
case 'function':
throw new Error("Can't perform a subquery using a function");
case 'interpolated':
throw new Error("Can't perform a subquery using an interpolated query");
default:
return unreachable(targetName["type"]);
}
@ -726,23 +741,27 @@ export const queryData = (getTable: (tableName: TableName) => Rows | undefined,
const performNewQuery = (targetName: TargetName, query: Query): QueryResponse => performQuery([], targetName, query, getTableRows);
switch(queryRequest.type) {
const rootTarget = queryRequest.target;
switch(rootTarget.type) {
case 'function':
const getRows = (targetName: TargetName): Record<string, RawScalarValue>[] | undefined => {
switch (targetName.type) {
case "table":
return getTable(targetName.table);
case "function":
return respondToFunction(queryRequest.function, queryRequest.function_arguments ?? [], getTable);
return respondToFunction(rootTarget.name, rootTarget.arguments ?? [], getTable);
case 'interpolated':
throw new Error("Can't perform a subquery using an interpolated query");
default:
return unreachable(targetName["type"]);
return unreachable(targetName['type']);
}
}
const result = performQuery([], {type: "function", function: queryRequest.function}, queryRequest.query, getRows);
const result = performQuery([], {type: "function", function: rootTarget.name}, queryRequest.query, getRows);
return result;
case 'table':
const targetTable: TargetName = {type: "table", table: queryRequest.table};
const targetTable: TargetName = {type: "table", table: rootTarget.name};
if (queryRequest.foreach) {
return {
rows: queryRequest.foreach.map(foreachFilterIds => {
@ -764,6 +783,8 @@ export const queryData = (getTable: (tableName: TableName) => Rows | undefined,
} else {
return performNewQuery(targetTable, queryRequest.query);
}
case 'interpolated':
throw new Error("Can't perform a query using an interpolated query");
}
};

View File

@ -77,10 +77,12 @@ export const nameEquals = (name1: TableName | FunctionName) => (name2: TableName
export const targetNameEquals = (name1: TargetName) => (name2: TargetName): boolean => {
switch (name1.type) {
case "table":
case 'table':
return name2.type === "table" && nameEquals(name1.table)(name2.table);
case "function":
case 'function':
return name2.type === "function" && nameEquals(name1.function)(name2.function);
case 'interpolated':
return name2.type === "interpolated" && name1.interpolated === name2.interpolated;
default:
return unreachable(name1["type"]);
}

View File

@ -14,6 +14,7 @@ The SQLite agent currently supports the following capabilities:
* [x] Prometheus Metrics
* [x] Exposing Foreign-Key Information
* [x] Mutations
* [x] Native (Interpolated) Queries
* [ ] Subscriptions
* [ ] Streaming Subscriptions

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify": "^4.13.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -57,7 +57,7 @@
"integrity": "sha512-lgHwxlxV1qIg1Eap7LgIeoBWIMFibOjbrYPIPJZcI1mmGAI2m3lNYpK12Y+GBdPQ0U1hRwSord7GIaawz962qQ=="
},
"node_modules/@hasura/dc-api-types": {
"version": "0.37.0",
"version": "0.40.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.37.0",
"@hasura/dc-api-types": "0.40.0",
"fastify-metrics": "^9.2.1",
"fastify": "^4.13.0",
"nanoid": "^3.3.4",

View File

@ -99,6 +99,7 @@ export const capabilitiesResponse: CapabilitiesResponse = {
foreach: {}
},
relationships: {},
interpolated_queries: {},
comparisons: {
subquery: {
supports_relations: true

View File

@ -12,7 +12,7 @@ import { runRawOperation } from './raw';
import { DATASETS, DATASET_DELETE, LOG_LEVEL, METRICS, MUTATIONS, PERMISSIVE_CORS, PRETTY_PRINT_LOGS } from './environment';
import { cloneDataset, deleteDataset, getDataset } from './datasets';
import { runMutation } from './mutation';
import { ErrorWithStatusCode } from './util';
import { ErrorWithStatusCode, unreachable } from './util';
const port = Number(process.env.PORT) || 8100;
@ -132,13 +132,14 @@ server.post<{ Body: QueryRequest, Reply: QueryResponse }>("/query", async (reque
const end = queryHistogram.startTimer()
const config = getConfig(request);
const body = request.body;
switch(body.type) {
switch(body.target.type) {
case 'function':
throw new ErrorWithStatusCode(
"User defined functions not supported in queries",
500,
{function: { name: body.function }}
{function: { name: body.target.name }}
);
case 'interpolated': // interpolated should actually work identically to tables when using the CTE pattern
case 'table':
try {
const result : QueryResponse = await queryData(config, sqlLogger, body);
@ -160,15 +161,19 @@ server.post<{ Body: QueryRequest, Reply: ExplainResponse}>("/explain", async (re
server.log.info({ headers: request.headers, query: request.body, }, "query.request");
const config = getConfig(request);
const body = request.body;
switch(body.type) {
switch(body.target.type) {
case 'function':
throw new ErrorWithStatusCode(
"User defined functions not supported in queries",
500,
{function: { name: body.function }}
{function: { name: body.target.name }}
);
case 'table':
case 'interpolated':
return explain(config, sqlLogger, body);
default:
return unreachable(body.target["type"]);
}
});
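The CTE pattern mentioned in the handler comment above can be sketched like this. This is a hypothetical helper, not the agent's actual implementation: text items pass through verbatim, scalar items become positional placeholders whose values are collected for binding, and the rendered SQL is wrapped in a common table expression so downstream query generation can select from it as if it were a table:

```typescript
// Simplified local copy of the interpolated item shape.
type InterpolatedItem =
  | { type: 'text'; value: string }
  | { type: 'scalar'; value: any; value_type: string };

// Render items into SQL text plus bind parameters (scalars become '?').
function renderItems(items: InterpolatedItem[]): { sql: string; params: any[] } {
  const params: any[] = [];
  const sql = items
    .map(item => (item.type === 'text' ? item.value : (params.push(item.value), '?')))
    .join('');
  return { sql, params };
}

// Wrap the rendered SQL in a CTE so the rest of the pipeline can treat
// `alias` exactly like an ordinary table name.
function wrapAsCte(alias: string, items: InterpolatedItem[]): { sql: string; params: any[] } {
  const { sql, params } = renderItems(items);
  return { sql: `WITH ${alias} AS (${sql}) SELECT * FROM ${alias}`, params };
}
```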

View File

@ -1,8 +1,8 @@
import { ArrayRelationInsertFieldValue, ColumnInsertFieldValue, DeleteMutationOperation, Expression, Field, InsertFieldSchema, InsertMutationOperation, MutationOperation, MutationOperationResults, MutationRequest, MutationResponse, ObjectRelationInsertFieldValue, RowUpdate, TableInsertSchema, TableName, TableRelationships, UpdateMutationOperation } from "@hasura/dc-api-types";
import { ArrayRelationInsertFieldValue, ColumnInsertFieldValue, DeleteMutationOperation, Expression, Field, InsertFieldSchema, InsertMutationOperation, MutationOperation, MutationOperationResults, MutationRequest, MutationResponse, ObjectRelationInsertFieldValue, Relationships, RowUpdate, TableInsertSchema, TableName, TableRelationships, UpdateMutationOperation } from "@hasura/dc-api-types";
import { Config } from "./config";
import { Connection, defaultMode, SqlLogger, withConnection } from "./db";
import { escapeIdentifier, escapeTableName, escapeTableNameSansSchema, json_object, where_clause, } from "./query";
import { asyncSequenceFromInputs, ErrorWithStatusCode, mapObjectToArray, tableNameEquals, unreachable } from "./util";
import { asyncSequenceFromInputs, ErrorWithStatusCode, mapObjectToArray, tableNameEquals, tableToTarget, unreachable } from "./util";
// Types
@ -86,7 +86,7 @@ function columnsString(infos: Array<RowInfo>): string {
function getTableInsertSchema(schemas: Array<TableInsertSchema>, table: TableName): TableInsertSchema | null {
for(var i = 0; i < schemas.length; i++) {
const schema = schemas[i];
if(tableNameEquals(schema.table)(table)) {
if(tableNameEquals(schema.table)(tableToTarget(table))) {
return schema;
}
}
@ -100,8 +100,8 @@ function getTableInsertSchema(schemas: Array<TableInsertSchema>, table: TableNam
*
* Note: The heavy lifting is performed by `where_clause` from query.ts
*/
function whereString(relationships: Array<TableRelationships>, e: Expression, table: TableName): string {
const w = where_clause(relationships, e, table, escapeTableNameSansSchema(table));
function whereString(relationships: Array<Relationships>, e: Expression, table: TableName): string {
const w = where_clause(relationships, e, tableToTarget(table), escapeTableNameSansSchema(table));
return w;
}
@ -111,7 +111,7 @@ function whereString(relationships: Array<TableRelationships>, e: Expression, ta
*
* The `json_object` function from query.ts performs the heavy lifting here.
*/
function returningString(relationships: Array<TableRelationships>, fields: Record<string, Field>, table: TableName): string {
function returningString(relationships: Array<Relationships>, fields: Record<string, Field>, table: TableName): string {
/* Example of fields:
{
"ArtistId": {
@ -121,7 +121,7 @@ function returningString(relationships: Array<TableRelationships>, fields: Recor
}
}
*/
const r = json_object(relationships, fields, table, escapeTableNameSansSchema(table));
const r = json_object(relationships, fields, tableToTarget(table), escapeTableNameSansSchema(table));
return r;
}
@ -131,7 +131,7 @@ function queryValues(info: Array<Info>): Record<string, unknown> {
const EMPTY_AND: Expression = { type: 'and', expressions: [] };
function insertString(relationships: Array<TableRelationships>, op: InsertMutationOperation, info: Array<RowInfo>): string {
function insertString(relationships: Array<Relationships>, op: InsertMutationOperation, info: Array<RowInfo>): string {
const columnValues =
info.length > 0
? `(${columnsString(info)}) VALUES (${valuesString(info)})`
@ -145,7 +145,7 @@ function insertString(relationships: Array<TableRelationships>, op: InsertMutati
`;
}
function deleteString(relationships: Array<TableRelationships>, op: DeleteMutationOperation): string {
function deleteString(relationships: Array<Relationships>, op: DeleteMutationOperation): string {
return `
DELETE FROM ${escapeTableName(op.table)}
WHERE ${whereString(relationships, op.where || EMPTY_AND, op.table)}
@ -155,7 +155,7 @@ function deleteString(relationships: Array<TableRelationships>, op: DeleteMutati
`;
}
function updateString(relationships: Array<TableRelationships>, op: UpdateMutationOperation, info: Array<UpdateInfo>): string {
function updateString(relationships: Array<Relationships>, op: UpdateMutationOperation, info: Array<UpdateInfo>): string {
const result = `
UPDATE ${escapeTableName(op.table)}
SET ${setString(info)}
@ -208,7 +208,7 @@ function getUpdateRowInfos(op: UpdateMutationOperation): Array<UpdateInfo> {
});
}
async function insertRow(db: Connection, relationships: Array<TableRelationships>, op: InsertMutationOperation, info: Array<RowInfo>): Promise<Array<Row>> {
async function insertRow(db: Connection, relationships: Array<Relationships>, op: InsertMutationOperation, info: Array<RowInfo>): Promise<Array<Row>> {
const q = insertString(relationships, op, info);
const v = queryValues(info);
const results = await db.query(q,v);
@ -221,7 +221,7 @@ async function insertRow(db: Connection, relationships: Array<TableRelationships
return results;
}
async function updateRow(db: Connection, relationships: Array<TableRelationships>, op: UpdateMutationOperation, info: Array<UpdateInfo>): Promise<Array<Row>> {
async function updateRow(db: Connection, relationships: Array<Relationships>, op: UpdateMutationOperation, info: Array<UpdateInfo>): Promise<Array<Row>> {
const q = updateString(relationships, op, info);
const v = queryValues(info);
const results = await db.query(q,v);
@ -234,7 +234,7 @@ async function updateRow(db: Connection, relationships: Array<TableRelationships
return results;
}
async function deleteRows(db: Connection, relationships: Array<TableRelationships>, op: DeleteMutationOperation): Promise<Array<Row>> {
async function deleteRows(db: Connection, relationships: Array<Relationships>, op: DeleteMutationOperation): Promise<Array<Row>> {
const q = deleteString(relationships, op);
const results = await db.query(q);
return results;
@ -247,7 +247,7 @@ function postMutationCheckError(op: MutationOperation, failed: Array<Row>): Erro
);
}
async function mutationOperation(db: Connection, relationships: Array<TableRelationships>, schema: Array<TableInsertSchema>, op: MutationOperation): Promise<MutationOperationResults> {
async function mutationOperation(db: Connection, relationships: Array<Relationships>, schema: Array<TableInsertSchema>, op: MutationOperation): Promise<MutationOperationResults> {
switch(op.type) {
case 'insert':
const infos = getInsertRowInfos(schema, op);
@ -322,7 +322,7 @@ async function mutationOperation(db: Connection, relationships: Array<TableRelat
*/
export async function runMutation(config: Config, sqlLogger: SqlLogger, request: MutationRequest): Promise<MutationResponse> {
return await withConnection(config, defaultMode, sqlLogger, async db => {
const resultSet = await asyncSequenceFromInputs(request.operations, (op) => mutationOperation(db, request.table_relationships, request.insert_schema, op));
const resultSet = await asyncSequenceFromInputs(request.operations, (op) => mutationOperation(db, request.relationships, request.insert_schema, op));
return {
operation_results: resultSet
};

View File

@ -1,6 +1,6 @@
import { Config } from "./config";
import { defaultMode, SqlLogger, withConnection } from "./db";
import { coerceUndefinedToNull, coerceUndefinedOrNullToEmptyRecord, isEmptyObject, tableNameEquals, unreachable, stringArrayEquals, ErrorWithStatusCode, mapObject } from "./util";
import { coerceUndefinedToNull, tableNameEquals, unreachable, stringArrayEquals, ErrorWithStatusCode, mapObject } from "./util";
import {
Expression,
BinaryComparisonOperator,
@ -24,8 +24,12 @@ import {
OrderByElement,
OrderByTarget,
ScalarValue,
TableRequest,
FunctionRelationships,
InterpolatedRelationships,
Target,
InterpolatedQueries,
Relationships,
InterpolatedQuery,
InterpolatedItem,
} from "@hasura/dc-api-types";
import { customAlphabet } from "nanoid";
import { DEBUGGING_TAGS, QUERY_LENGTH_LIMIT } from "./environment";
@ -84,6 +88,17 @@ export function escapeTableName(tableName: TableName): string {
return validateTableName(tableName).map(escapeIdentifier).join(".");
}
export function escapeTargetName(target: Target): string {
switch(target.type) {
case 'table':
return escapeTableName(target.name);
case 'interpolated':
return escapeTableName([target.id]); // Interpret as CTE reference
default:
throw(new ErrorWithStatusCode('`escapeTargetName` only implemented for tables and interpolated queries', 500, {target}));
}
}
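For reference, the `escapeTargetName` addition can be sketched standalone. The `Target` type and escaping helper below are simplified stand-ins for the real `@hasura/dc-api-types` definitions and the file's `escapeIdentifier`/`validateTableName` helpers (validation is omitted):

```typescript
// Minimal sketch: how a Target resolves to an escaped SQL name.
// A table target escapes each schema-qualified part; an interpolated
// target is interpreted as a reference to its CTE id.
type Target =
  | { type: "table"; name: string[] }
  | { type: "interpolated"; id: string }
  | { type: "function"; name: string[] };

const escapeIdentifier = (s: string): string => `"${s.replace(/"/g, '""')}"`;

function escapeTargetName(target: Target): string {
  switch (target.type) {
    case "table":
      // Schema-qualified names are joined with "."
      return target.name.map(escapeIdentifier).join(".");
    case "interpolated":
      // Interpret as a CTE reference by id
      return escapeIdentifier(target.id);
    default:
      throw new Error(`escapeTargetName only implemented for tables and interpolated queries`);
  }
}
```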
/**
* @param tableName
* @returns escaped tableName string with schema qualification removed
@ -94,21 +109,18 @@ export function escapeTableNameSansSchema(tableName: TableName): string {
return escapeTableName(getTableNameSansSchema(tableName));
}
export function json_object(relationships: TableRelationships[], fields: Fields, table: TableName, tableAlias: string): string {
export function json_object(all_relationships: Relationships[], fields: Fields, target: Target, tableAlias: string): string {
const result = Object.entries(fields).map(([fieldName, field]) => {
switch(field.type) {
case "column":
return `${escapeString(fieldName)}, ${escapeIdentifier(field.column)}`;
case "relationship":
const tableRelationships = relationships.find(tr => tableNameEquals(tr.source_table)(table));
if (tableRelationships === undefined) {
throw new Error(`Couldn't find table relationships for table ${table}`);
}
const rel = tableRelationships.relationships[field.relationship];
const relationships = find_relationships(all_relationships, target);
const rel = relationships.relationships[field.relationship];
if(rel === undefined) {
throw new Error(`Couldn't find relationship ${field.relationship} for field ${fieldName} on table ${table}`);
throw new Error(`Couldn't find relationship ${field.relationship} for field ${fieldName} on target ${JSON.stringify(target)}`);
}
return `'${fieldName}', ${relationship(relationships, rel, field, tableAlias)}`;
return `'${fieldName}', ${relationship(all_relationships, rel, field, tableAlias)}`;
case "object":
throw new Error('Unsupported field type "object"');
case "array":
@ -121,30 +133,31 @@ export function json_object(relationships: TableRelationships[], fields: Fields,
return tag('json_object', `JSON_OBJECT(${result})`);
}
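The column branch of `json_object` can be illustrated in isolation. This sketch uses simplified stand-in types and escaping helpers (relationship, object, and array fields are omitted) to show how column fields become name/column pairs inside SQLite's `JSON_OBJECT`:

```typescript
// Each column field renders as `'fieldName', "columnName"` inside JSON_OBJECT.
type ColumnField = { type: "column"; column: string };
type Fields = Record<string, ColumnField>;

const escapeIdentifier = (s: string): string => `"${s.replace(/"/g, '""')}"`;
const escapeString = (s: string): string => `'${s.replace(/'/g, "''")}'`;

function jsonObjectFragment(fields: Fields): string {
  const pairs = Object.entries(fields).map(
    ([name, f]) => `${escapeString(name)}, ${escapeIdentifier(f.column)}`
  );
  return `JSON_OBJECT(${pairs.join(", ")})`;
}
```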
export function where_clause(relationships: TableRelationships[], expression: Expression, queryTableName: TableName, queryTableAlias: string): string {
const generateWhere = (expression: Expression, currentTableName: TableName, currentTableAlias: string): string => {
export function where_clause(relationships: Relationships[], expression: Expression, queryTarget: Target, queryTableAlias: string): string {
const generateWhere = (expression: Expression, currentTarget: Target, currentTableAlias: string): string => {
switch(expression.type) {
case "not":
const aNot = generateWhere(expression.expression, currentTableName, currentTableAlias);
const aNot = generateWhere(expression.expression, currentTarget, currentTableAlias);
return `(NOT ${aNot})`;
case "and":
const aAnd = expression.expressions.flatMap(x => generateWhere(x, currentTableName, currentTableAlias));
const aAnd = expression.expressions.flatMap(x => generateWhere(x, currentTarget, currentTableAlias));
return aAnd.length > 0
? `(${aAnd.join(" AND ")})`
: "(1 = 1)" // true
case "or":
const aOr = expression.expressions.flatMap(x => generateWhere(x, currentTableName, currentTableAlias));
const aOr = expression.expressions.flatMap(x => generateWhere(x, currentTarget, currentTableAlias));
return aOr.length > 0
? `(${aOr.join(" OR ")})`
: "(1 = 0)" // false
case "exists":
const joinInfo = calculateExistsJoinInfo(relationships, expression, currentTableName, currentTableAlias);
const subqueryWhere = generateWhere(expression.where, joinInfo.joinTableName, joinInfo.joinTableAlias);
const joinInfo = calculateExistsJoinInfo(relationships, expression, currentTarget, currentTableAlias);
const subqueryWhere = generateWhere(expression.where, joinInfo.joinTarget, joinInfo.joinTableAlias);
const whereComparisons = [...joinInfo.joinComparisonFragments, subqueryWhere].join(" AND ");
return tag('exists',`EXISTS (SELECT 1 FROM ${escapeTableName(joinInfo.joinTableName)} AS ${joinInfo.joinTableAlias} WHERE ${whereComparisons})`);
return tag('exists',`EXISTS (SELECT 1 FROM ${escapeTargetName(joinInfo.joinTarget)} AS ${joinInfo.joinTableAlias} WHERE ${whereComparisons})`);
case "unary_op":
const uop = uop_op(expression.operator);
@ -180,33 +193,33 @@ export function where_clause(relationships: TableRelationships[], expression: Ex
}
};
return generateWhere(expression, queryTableName, queryTableAlias);
return generateWhere(expression, queryTarget, queryTableAlias);
}
type ExistsJoinInfo = {
joinTableName: TableName,
joinTarget: Target,
joinTableAlias: string,
joinComparisonFragments: string[]
}
function calculateExistsJoinInfo(allTableRelationships: TableRelationships[], exists: ExistsExpression, sourceTableName: TableName, sourceTableAlias: string): ExistsJoinInfo {
function calculateExistsJoinInfo(allRelationships: Relationships[], exists: ExistsExpression, sourceTarget: Target, sourceTableAlias: string): ExistsJoinInfo {
switch (exists.in_table.type) {
case "related":
const tableRelationships = find_table_relationship(allTableRelationships, sourceTableName);
const tableRelationships = find_relationships(allRelationships, sourceTarget);
const relationship = tableRelationships.relationships[exists.in_table.relationship];
const joinTableAlias = generateTableAlias(relationship.target_table);
const joinTableAlias = generateTargetAlias(relationship.target);
const joinComparisonFragments = generateRelationshipJoinComparisonFragments(relationship, sourceTableAlias, joinTableAlias);
return {
joinTableName: relationship.target_table,
joinTarget: relationship.target,
joinTableAlias,
joinComparisonFragments,
};
case "unrelated":
return {
joinTableName: exists.in_table.table,
joinTarget: {type: 'table', name: exists.in_table.table},
joinTableAlias: generateTableAlias(exists.in_table.table),
joinComparisonFragments: []
};
@ -249,6 +262,17 @@ function generateComparisonValueFragment(comparisonValue: ComparisonValue, query
}
}
export function generateTargetAlias(target: Target): string {
switch(target.type) {
case 'function':
throw new ErrorWithStatusCode("Can't create alias for functions", 500, {target});
case 'interpolated':
return generateTableAlias([target.id]);
case 'table':
return generateTableAlias(target.name);
}
}
function generateTableAlias(tableName: TableName): string {
return generateIdentifierAlias(validateTableName(tableName).join("_"))
}
@ -264,14 +288,26 @@ function generateIdentifierAlias(identifier: string): string {
* @param target Query target
* @returns Relationships matching the target
*/
function find_table_relationship(allTableRelationships: TableRelationships[], tableName: TableName): TableRelationships {
for(var i = 0; i < allTableRelationships.length; i++) {
const r = allTableRelationships[i];
if(tableNameEquals(r.source_table)(tableName)) {
return r;
}
function find_relationships(allRelationships: Relationships[], target: Target): Relationships {
switch(target.type) {
case 'table':
for(let i = 0; i < allRelationships.length; i++) {
const r = allRelationships[i] as TableRelationships;
if(r.source_table != undefined && tableNameEquals(r.source_table)(target)) {
return r;
}
}
break
case 'interpolated':
for(let i = 0; i < allRelationships.length; i++) {
const r = allRelationships[i] as InterpolatedRelationships;
if(r.source_interpolated_query != undefined && r.source_interpolated_query == target.id) {
return r;
}
}
break;
}
throw new Error(`Couldn't find table relationships for table ${tableName} - This shouldn't happen.`);
throw new Error(`Couldn't find table relationships for target ${JSON.stringify(target)} - This shouldn't happen.`);
}
function cast_aggregate_function(f: string): string {
@ -291,8 +327,8 @@ function cast_aggregate_function(f: string): string {
* Builds an Aggregate query expression.
*/
function aggregates_query(
ts: TableRelationships[],
tableName: TableName,
allRelationships: Relationships[],
target: Target,
joinInfo: RelationshipJoinInfo | null,
aggregates: Aggregates,
wWhere: Expression | null,
@ -300,14 +336,13 @@ function aggregates_query(
wOffset: number | null,
wOrder: OrderBy | null,
): string {
const tableAlias = generateTableAlias(tableName);
const orderByInfo = orderBy(ts, wOrder, tableName, tableAlias);
const tableAlias = generateTargetAlias(target);
const orderByInfo = orderBy(allRelationships, wOrder, target, tableAlias);
const orderByJoinClauses = orderByInfo?.joinClauses.join(" ") ?? "";
const orderByClause = orderByInfo?.orderByClause ?? "";
const whereClause = where(ts, wWhere, joinInfo, tableName, tableAlias);
const sourceSubquery = `SELECT ${tableAlias}.* FROM ${escapeTableName(tableName)} AS ${tableAlias} ${orderByJoinClauses} ${whereClause} ${orderByClause} ${limit(wLimit)} ${offset(wOffset)}`
const whereClause = where(allRelationships, wWhere, joinInfo, target, tableAlias);
const sourceSubquery = `SELECT ${tableAlias}.* FROM ${escapeTargetName(target)} AS ${tableAlias} ${orderByJoinClauses} ${whereClause} ${orderByClause} ${limit(wLimit)} ${offset(wOffset)}`
const aggregate_pairs = Object.entries(aggregates).map(([k,v]) => {
switch(v.type) {
@ -332,9 +367,9 @@ type RelationshipJoinInfo = {
columnMapping: Record<string, string> // Mapping from source table column name to target table column name
}
function table_query(
ts: TableRelationships[],
tableName: TableName,
function target_query(
allRelationships: Relationships[],
target: Target,
joinInfo: RelationshipJoinInfo | null,
fields: Fields | null,
aggregates: Aggregates | null,
@ -344,36 +379,46 @@ function table_query(
wOffset: number | null,
wOrder: OrderBy | null,
): string {
const tableAlias = generateTableAlias(tableName);
const aggregateSelect = aggregates ? [aggregates_query(ts, tableName, joinInfo, aggregates, wWhere, aggregatesLimit, wOffset, wOrder)] : [];
let tableName: TableName;
// Table names are resolved from interpolated-query IDs when using native queries.
switch(target.type) {
case 'table': tableName = target.name; break;
case 'interpolated': tableName = [target.id]; break;
case 'function':
throw new ErrorWithStatusCode(`Can't execute target_query for UDFs`, 500, {target});
}
const tableAlias = generateTargetAlias(target);
const aggregateSelect = aggregates ? [aggregates_query(allRelationships, target, joinInfo, aggregates, wWhere, aggregatesLimit, wOffset, wOrder)] : [];
// The use of the JSON function inside JSON_GROUP_ARRAY is necessary from SQLite 3.39.0 due to breaking changes in
// SQLite. See https://sqlite.org/forum/forumpost/e3b101fb3234272b for more details. This approach still works fine
// for older versions too.
const fieldSelect = fields === null ? [] : [`'rows', JSON_GROUP_ARRAY(JSON(j))`];
const fieldFrom = fields === null ? '' : (() => {
const whereClause = where(ts, wWhere, joinInfo, tableName, tableAlias);
const whereClause = where(allRelationships, wWhere, joinInfo, target, tableAlias);
// NOTE: The reuse of the 'j' identifier should be safe due to scoping. This is confirmed in testing.
if(wOrder === null || wOrder.elements.length < 1) {
return `FROM ( SELECT ${json_object(ts, fields, tableName, tableAlias)} AS j FROM ${escapeTableName(tableName)} AS ${tableAlias} ${whereClause} ${limit(wLimit)} ${offset(wOffset)})`;
return `FROM ( SELECT ${json_object(allRelationships, fields, target, tableAlias)} AS j FROM ${escapeTableName(tableName)} AS ${tableAlias} ${whereClause} ${limit(wLimit)} ${offset(wOffset)})`;
} else {
const orderByInfo = orderBy(ts, wOrder, tableName, tableAlias);
const orderByInfo = orderBy(allRelationships, wOrder, target, tableAlias);
const orderByJoinClauses = orderByInfo?.joinClauses.join(" ") ?? "";
const orderByClause = orderByInfo?.orderByClause ?? "";
const innerSelect = `SELECT ${tableAlias}.* FROM ${escapeTableName(tableName)} AS ${tableAlias} ${orderByJoinClauses} ${whereClause} ${orderByClause} ${limit(wLimit)} ${offset(wOffset)}`;
const wrappedQueryTableAlias = generateTableAlias(tableName);
return `FROM (SELECT ${json_object(ts, fields, tableName, wrappedQueryTableAlias)} AS j FROM (${innerSelect}) AS ${wrappedQueryTableAlias})`;
return `FROM (SELECT ${json_object(allRelationships, fields, target, wrappedQueryTableAlias)} AS j FROM (${innerSelect}) AS ${wrappedQueryTableAlias})`;
}
})()
return tag('table_query',`(SELECT JSON_OBJECT(${[...fieldSelect, ...aggregateSelect].join(', ')}) ${fieldFrom})`);
}
function relationship(ts: TableRelationships[], r: Relationship, field: RelationshipField, sourceTableAlias: string): string {
function relationship(ts: Relationships[], r: Relationship, field: RelationshipField, sourceTableAlias: string): string {
const relationshipJoinInfo = {
sourceTableAlias,
targetTable: r.target_table,
columnMapping: r.column_mapping,
};
@ -384,9 +429,9 @@ function relationship(ts: TableRelationships[], r: Relationship, field: Relation
? [1, 1]
: [coerceUndefinedToNull(field.query.limit), coerceUndefinedToNull(field.query.aggregates_limit)];
return tag("relationship", table_query(
return tag("relationship", target_query(
ts,
r.target_table,
r.target,
relationshipJoinInfo,
coerceUndefinedToNull(field.query.fields),
coerceUndefinedToNull(field.query.aggregates),
@ -446,7 +491,7 @@ type OrderByInfo = {
orderByClause: string,
}
function orderBy(allTableRelationships: TableRelationships[], orderBy: OrderBy | null, queryTableName: TableName, queryTableAlias: string): OrderByInfo | null {
function orderBy(allRelationships: Relationships[], orderBy: OrderBy | null, queryTarget: Target, queryTableAlias: string): OrderByInfo | null {
if (orderBy === null || orderBy.elements.length < 1) {
return null;
}
@ -454,7 +499,7 @@ function orderBy(allTableRelationships: TableRelationships[], orderBy: OrderBy |
const joinInfos = Object
.entries(orderBy.relations)
.flatMap(([subrelationshipName, subrelation]) =>
generateOrderByJoinClause(allTableRelationships, orderBy.elements, [], subrelationshipName, subrelation, queryTableName, queryTableAlias)
generateOrderByJoinClause(allRelationships, orderBy.elements, [], subrelationshipName, subrelation, queryTarget, queryTableAlias)
);
const orderByFragments =
@ -503,17 +548,17 @@ type OrderByJoinInfo = {
}
function generateOrderByJoinClause(
allTableRelationships: TableRelationships[],
allRelationships: Relationships[],
allOrderByElements: OrderByElement[],
parentRelationshipNames: string[],
relationshipName: string,
orderByRelation: OrderByRelation,
sourceTableName: TableName,
sourceTarget: Target,
sourceTableAlias: string
): OrderByJoinInfo[] {
const relationshipPath = [...parentRelationshipNames, relationshipName];
const tableRelationships = find_table_relationship(allTableRelationships, sourceTableName);
const relationship = tableRelationships.relationships[relationshipName];
const relationships = find_relationships(allRelationships, sourceTarget);
const relationship = relationships.relationships[relationshipName];
const orderByElements = allOrderByElements.filter(byTargetPath(relationshipPath));
const columnTargetsExist = orderByElements.some(element => getJoinTableTypeForTarget(element.target) === "column");
@ -522,12 +567,12 @@ function generateOrderByJoinClause(
const [columnTargetJoin, subrelationJoinInfo] = (() => {
const subrelationsExist = Object.keys(orderByRelation.subrelations).length > 0;
if (columnTargetsExist || subrelationsExist) {
const columnTargetJoin = generateOrderByColumnTargetJoinInfo(allTableRelationships, relationshipPath, relationship, sourceTableAlias, orderByRelation.where);
const columnTargetJoin = generateOrderByColumnTargetJoinInfo(allRelationships, relationshipPath, relationship, sourceTableAlias, orderByRelation.where);
const subrelationJoinInfo = Object
.entries(orderByRelation.subrelations)
.flatMap(([subrelationshipName, subrelation]) =>
generateOrderByJoinClause(allTableRelationships, allOrderByElements, relationshipPath, subrelationshipName, subrelation, relationship.target_table, columnTargetJoin.tableAlias)
generateOrderByJoinClause(allRelationships, allOrderByElements, relationshipPath, subrelationshipName, subrelation, relationship.target, columnTargetJoin.tableAlias)
);
return [[columnTargetJoin], subrelationJoinInfo]
@ -538,7 +583,7 @@ function generateOrderByJoinClause(
})();
const aggregateTargetJoin = aggregateElements.length > 0
? [generateOrderByAggregateTargetJoinInfo(allTableRelationships, relationshipPath, relationship, sourceTableAlias, orderByRelation.where, aggregateElements)]
? [generateOrderByAggregateTargetJoinInfo(allRelationships, relationshipPath, relationship, sourceTableAlias, orderByRelation.where, aggregateElements)]
: [];
@ -552,19 +597,19 @@ function generateOrderByJoinClause(
const byTargetPath = (relationshipPath: string[]) => (orderByElement: OrderByElement): boolean => stringArrayEquals(orderByElement.target_path)(relationshipPath);
function generateOrderByColumnTargetJoinInfo(
allTableRelationships: TableRelationships[],
allRelationships: Relationships[],
relationshipPath: string[],
relationship: Relationship,
sourceTableAlias: string,
whereExpression: Expression | undefined
): OrderByJoinInfo {
const targetTableAlias = generateTableAlias(relationship.target_table);
const targetTableAlias = generateTargetAlias(relationship.target);
const joinComparisonFragments = generateRelationshipJoinComparisonFragments(relationship, sourceTableAlias, targetTableAlias);
const whereComparisons = whereExpression ? [where_clause(allTableRelationships, whereExpression, relationship.target_table, targetTableAlias)] : [];
const whereComparisons = whereExpression ? [where_clause(allRelationships, whereExpression, relationship.target, targetTableAlias)] : [];
const joinOnFragment = [...joinComparisonFragments, ...whereComparisons].join(" AND ");
const joinClause = tag("columnTargetJoin", `LEFT JOIN ${escapeTableName(relationship.target_table)} AS ${targetTableAlias} ON ${joinOnFragment}`);
const joinClause = tag("columnTargetJoin", `LEFT JOIN ${escapeTargetName(relationship.target)} AS ${targetTableAlias} ON ${joinOnFragment}`);
return {
joinTableType: "column",
relationshipPath: relationshipPath,
@ -574,15 +619,16 @@ function generateOrderByColumnTargetJoinInfo(
}
function generateOrderByAggregateTargetJoinInfo(
allTableRelationships: TableRelationships[],
allTableRelationships: Relationships[],
relationshipPath: string[],
relationship: Relationship,
sourceTableAlias: string,
whereExpression: Expression | undefined,
aggregateElements: OrderByElement[],
): OrderByJoinInfo {
const targetTableAlias = generateTableAlias(relationship.target_table);
const subqueryTableAlias = generateTableAlias(relationship.target_table);
const targetTableAlias = generateTargetAlias(relationship.target);
const subqueryTableAlias = generateTargetAlias(relationship.target);
const aggregateColumnsFragments = aggregateElements.flatMap(element => {
switch (element.target.type) {
@ -594,8 +640,8 @@ function generateOrderByAggregateTargetJoinInfo(
});
const joinColumns = Object.values(relationship.column_mapping).map(escapeIdentifier);
const selectColumns = [...joinColumns, aggregateColumnsFragments];
const whereClause = whereExpression ? `WHERE ${where_clause(allTableRelationships, whereExpression, relationship.target_table, subqueryTableAlias)}` : "";
const aggregateSubquery = `SELECT ${selectColumns.join(", ")} FROM ${escapeTableName(relationship.target_table)} AS ${subqueryTableAlias} ${whereClause} GROUP BY ${joinColumns.join(", ")}`
const whereClause = whereExpression ? `WHERE ${where_clause(allTableRelationships, whereExpression, relationship.target, subqueryTableAlias)}` : "";
const aggregateSubquery = `SELECT ${selectColumns.join(", ")} FROM ${escapeTargetName(relationship.target)} AS ${subqueryTableAlias} ${whereClause} GROUP BY ${joinColumns.join(", ")}`
const joinComparisonFragments = generateRelationshipJoinComparisonFragments(relationship, sourceTableAlias, targetTableAlias);
const joinOnFragment = [ ...joinComparisonFragments ].join(" AND ");
@ -623,8 +669,8 @@ function getOrderByTargetAlias(orderByTarget: OrderByTarget): string {
* @param joinInfo Information about a possible join from a source table to the query table that needs to be generated into the where clause
* @returns string representing the combined where clause
*/
function where(ts: TableRelationships[], whereExpression: Expression | null, joinInfo: RelationshipJoinInfo | null, queryTableName: TableName, queryTableAlias: string): string {
const whereClause = whereExpression !== null ? [where_clause(ts, whereExpression, queryTableName, queryTableAlias)] : [];
function where(allRelationships: Relationships[], whereExpression: Expression | null, joinInfo: RelationshipJoinInfo | null, queryTarget: Target, queryTableAlias: string): string {
const whereClause = whereExpression !== null ? [where_clause(allRelationships, whereExpression, queryTarget, queryTableAlias)] : [];
const joinArray = joinInfo
? Object
.entries(joinInfo.columnMapping)
@ -655,11 +701,52 @@ function offset(o: number | null): string {
}
}
function query(request: TableRequest): string {
const tableRelationships = only_table_relationships(request.table_relationships);
const result = table_query(
tableRelationships,
request.table,
// NOTE: InterpolatedItem could be extended to support arrays of values.
// NOTE: `value.value_type` appears to come back as `Boolean` even when only items from `ScalarTypeKey` are advertised.
function cte_item(value: InterpolatedItem): string {
switch(value.type) {
case 'text':
return value.value;
case 'scalar':
switch(value.value_type.toLowerCase()) {
// Check this list against the types listed in capabilities
case 'string':
return escapeString(value.value);
case 'number':
case 'int':
case 'integer':
case 'float':
return `${value.value}`;
case 'bool':
case 'boolean':
return `${value.value ? 1 : 0}`;
// Assume that everything else is a JSON value
case 'json':
default:
return `json( ${ escapeString(JSON.stringify(value.value)) } )`;
}
default:
return unreachable(value["type"]);
}
}
function cte_items(iq: InterpolatedQuery): string {
const items = iq.items.map(cte_item);
const joined = items.join(' ');
return `${iq.id} AS ( ${joined} )`;
}
function cte_block(qs: InterpolatedQueries): string {
const ctes = Object.values(qs).map(cte_items);
const cte_string = ctes.join(', ');
return `WITH ${cte_string}`;
}
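Taken together, the three CTE helpers render each interpolated query as `<id> AS ( <rendered items> )` and prefix the result with `WITH`. A self-contained sketch (the types are simplified stand-ins for the `@hasura/dc-api-types` definitions; only the scalar cases shown above are handled):

```typescript
// Interpolated queries become named CTEs: text items pass through verbatim,
// scalar items are rendered according to their advertised value_type.
type InterpolatedItem =
  | { type: "text"; value: string }
  | { type: "scalar"; value: unknown; value_type: string };

type InterpolatedQuery = { id: string; items: InterpolatedItem[] };
type InterpolatedQueries = Record<string, InterpolatedQuery>;

const escapeString = (s: string): string => `'${s.replace(/'/g, "''")}'`;

function renderItem(item: InterpolatedItem): string {
  if (item.type === "text") return item.value;
  switch (item.value_type.toLowerCase()) {
    case "string": return escapeString(String(item.value));
    case "number": case "int": case "integer": case "float": return `${item.value}`;
    case "bool": case "boolean": return item.value ? "1" : "0";
    default: return `json( ${escapeString(JSON.stringify(item.value))} )`;
  }
}

function cteBlock(qs: InterpolatedQueries): string {
  const ctes = Object.values(qs).map(
    iq => `${iq.id} AS ( ${iq.items.map(renderItem).join(" ")} )`
  );
  return `WITH ${ctes.join(", ")}`;
}
```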
function query(request: QueryRequest): string {
const cte = request.interpolated_queries == null ? '' : cte_block(request.interpolated_queries);
const result = target_query(
request.relationships,
request.target,
null,
coerceUndefinedToNull(request.query.fields),
coerceUndefinedToNull(request.query.aggregates),
@ -669,7 +756,7 @@ function query(request: TableRequest): string {
coerceUndefinedToNull(request.query.offset),
coerceUndefinedToNull(request.query.order_by),
);
return tag('query', `SELECT ${result} as data`);
return tag('query', `${cte} SELECT ${result} as data`);
}
/**
@ -715,7 +802,8 @@ function foreach_ids_table_value(foreachIds: Record<string, ScalarValue>[]): str
* SELECT table_subquery AS data
* ```
*/
function foreach_query(foreachIds: Record<string, ScalarValue>[], request: TableRequest): string {
function foreach_query(foreachIds: Record<string, ScalarValue>[], request: QueryRequest): string {
const randomSuffix = nanoid();
const foreachTableName: TableName = [`foreach_ids_${randomSuffix}`];
const foreachRelationshipName = "Foreach";
@ -725,7 +813,7 @@ function foreach_query(foreachIds: Record<string, ScalarValue>[], request: Table
relationships: {
[foreachRelationshipName]: {
relationship_type: "array",
target_table: request.table,
target: request.target,
column_mapping: mapObject(foreachIds[0], ([columnName, _scalarValue]) => [columnName, columnName])
}
}
@ -739,10 +827,9 @@ function foreach_query(foreachIds: Record<string, ScalarValue>[], request: Table
};
const foreachIdsTableValue = foreach_ids_table_value(foreachIds);
const tableRelationships = only_table_relationships(request.table_relationships);
const tableSubquery = table_query(
[foreachTableRelationship, ...(tableRelationships)],
foreachTableName,
const tableSubquery = target_query(
[foreachTableRelationship, ...(request.relationships)],
{type: 'table', name: foreachTableName}, // Note: expand to other target types
null,
foreachQueryFields,
null,
@ -755,14 +842,6 @@ function foreach_query(foreachIds: Record<string, ScalarValue>[], request: Table
return tag('foreach_query', `WITH ${escapeTableName(foreachTableName)} AS (${foreachIdsTableValue}) SELECT ${tableSubquery} AS data`);
}
function only_table_relationships(all: Array<TableRelationships | FunctionRelationships>): Array<TableRelationships> {
return all.filter(isTableRelationship);
}
function isTableRelationship(relationships: TableRelationships | FunctionRelationships,): relationships is TableRelationships {
return (relationships as TableRelationships).source_table !== undefined;
}
/** Function to add SQL comments to the generated SQL to tag which procedures generated what text.
*
* comment('a','b') => '/*\<a>\*\/ b /*\</a>*\/'
@ -821,7 +900,7 @@ function tag(t: string, s: string): string {
* ```
*
*/
export async function queryData(config: Config, sqlLogger: SqlLogger, request: TableRequest): Promise<QueryResponse> {
export async function queryData(config: Config, sqlLogger: SqlLogger, request: QueryRequest): Promise<QueryResponse> {
return await withConnection(config, defaultMode, sqlLogger, async db => {
const q =
request.foreach
@ -855,7 +934,7 @@ export async function queryData(config: Config, sqlLogger: SqlLogger, request: T
* @param queryRequest
* @returns
*/
export async function explain(config: Config, sqlLogger: SqlLogger, request: TableRequest): Promise<ExplainResponse> {
export async function explain(config: Config, sqlLogger: SqlLogger, request: QueryRequest): Promise<ExplainResponse> {
return await withConnection(config, defaultMode, sqlLogger, async db => {
const q = query(request);
const result = await db.query(`EXPLAIN QUERY PLAN ${q}`);

View File

@ -1,4 +1,4 @@
import { ErrorResponseType, TableName } from "@hasura/dc-api-types";
import { ErrorResponseType, TableName, Target } from "@hasura/dc-api-types";
export const coerceUndefinedToNull = <T>(v: T | undefined): T | null => v === undefined ? null : v;
@ -52,8 +52,18 @@ export function delay(ms: number): Promise<void> {
return new Promise( resolve => setTimeout(resolve, ms) );
}
export const tableNameEquals = (tableName1: TableName) => (tableName2: TableName): boolean => {
return stringArrayEquals(tableName1)(tableName2);
export const tableNameEquals = (tableName1: TableName) => (target: Target): boolean => {
if(target.type != 'table') {
return false;
}
return stringArrayEquals(tableName1)(target.name);
}
export const tableToTarget = (tableName: TableName): Target => {
return {
type: 'table',
name: tableName
}
}
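The util changes can be sketched standalone: `tableNameEquals` now matches a `TableName` against a `Target` (returning false for non-table targets), and `tableToTarget` wraps a `TableName` in a table target. Types below are simplified stand-ins for the `@hasura/dc-api-types` definitions:

```typescript
// A TableName only ever equals a Target of type 'table' with the same parts.
type TableName = string[];
type Target =
  | { type: "table"; name: TableName }
  | { type: "interpolated"; id: string };

const stringArrayEquals = (a: string[]) => (b: string[]): boolean =>
  a.length === b.length && a.every((x, i) => x === b[i]);

const tableNameEquals = (tableName: TableName) => (target: Target): boolean =>
  target.type === "table" && stringArrayEquals(tableName)(target.name);

const tableToTarget = (tableName: TableName): Target =>
  ({ type: "table", name: tableName });
```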
export const stringArrayEquals = (arr1: string[]) => (arr2: string[]): boolean => {

View File

@ -193,7 +193,8 @@ library
Test.Queries.Errors.NoQueriesAvailableSpec
Test.Queries.FilterSearch.AggregationPredicatesSpec
Test.Queries.FilterSearch.FilterSearchSpec
Test.Queries.NativeQueriesSpec
Test.Queries.NativeQueries.NativeQueriesSpec
Test.Queries.NativeQueries.ProOnlySpec
Test.Queries.NestedObjectSpec
Test.Queries.Paginate.LimitSpec
Test.Queries.Paginate.OffsetSpec

View File

@ -229,6 +229,7 @@ schemaInspectionTests = describe "Schema and Source Inspection" $ do
<&> Lens.set (key "config_schema_response" . key "config_schema") J.Null
<&> Lens.set (key "capabilities" . _Object . Lens.at "datasets") Nothing
<&> Lens.set (key "capabilities" . _Object . Lens.at "licensing") Nothing
<&> Lens.set (key "capabilities" . _Object . Lens.at "interpolated_queries") Nothing
<&> Lens.set (key "capabilities" . key "queries" . _Object . Lens.at "redaction") Nothing
<&> Lens.set (key "options" . key "uri") J.Null
<&> Lens.set (_Object . Lens.at "display_name") Nothing

View File

@ -175,7 +175,7 @@ tests = describe "Aggregate Query Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "Albums",
API.Relationship
{ _rTargetTable = mkTableName "Album",
{ _rTarget = mkTableTarget "Album",
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
@ -298,7 +298,7 @@ tests = describe "Aggregate Query Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "InvoiceLines",
API.Relationship
{ _rTargetTable = mkTableName "InvoiceLine",
{ _rTarget = mkTableTarget "InvoiceLine",
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "InvoiceId", API.ColumnName "InvoiceId")]
}

View File

@ -178,21 +178,22 @@ tests = do
let expectedRequest =
emptyMutationRequest
& API.mrTableRelationships
& API.mrRelationships
.~ Set.fromList
[ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTargetTable = mkTableName "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
[ API.RTable
$ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTarget = mkTableTarget "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
]
& API.mrOperations
.~ [ API.DeleteOperation
@ -286,21 +287,22 @@ tests = do
let expectedRequest =
emptyMutationRequest
& API.mrTableRelationships
& API.mrRelationships
.~ Set.fromList
[ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTargetTable = mkTableName "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
[ API.RTable
$ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTarget = mkTableTarget "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
]
& API.mrOperations
.~ [ API.DeleteOperation

View File

@ -170,21 +170,22 @@ tests = do
let expectedRequest =
emptyMutationRequest
& API.mrTableRelationships
& API.mrRelationships
.~ Set.fromList
[ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTargetTable = mkTableName "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
[ API.RTable
$ API.TableRelationships
{ API._trelSourceTable = mkTableName "Album",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ API._rTarget = mkTableTarget "Artist",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
)
]
}
]
& API.mrInsertSchema
.~ Set.fromList

View File

@ -198,7 +198,7 @@ tests = describe "Order By Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "Albums",
API.Relationship
{ _rTargetTable = mkTableName "Album",
{ _rTarget = mkTableTarget "Album",
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}

View File

@ -226,14 +226,14 @@ tests = describe "Object Relationships Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "Genre",
API.Relationship
{ _rTargetTable = mkTableName "Genre",
{ _rTarget = mkTableTarget "Genre",
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "GenreId", API.ColumnName "GenreId")]
}
),
( API.RelationshipName "MediaType",
API.Relationship
{ _rTargetTable = mkTableName "MediaType",
{ _rTarget = mkTableTarget "MediaType",
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping =
HashMap.fromList
@ -352,7 +352,7 @@ tests = describe "Object Relationships Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "Album",
API.Relationship
{ _rTargetTable = mkTableName "Album",
{ _rTarget = mkTableTarget "Album",
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "AlbumId", API.ColumnName "AlbumId")]
}
@ -366,7 +366,7 @@ tests = describe "Object Relationships Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "Artist",
API.Relationship
{ _rTargetTable = mkTableName "Artist",
{ _rTarget = mkTableTarget "Artist",
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "ArtistId", API.ColumnName "ArtistId")]
}
@ -447,7 +447,7 @@ tests = describe "Object Relationships Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "SupportRep",
API.Relationship
{ _rTargetTable = mkTableName "Employee",
{ _rTarget = mkTableTarget "Employee",
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "SupportRepId", API.ColumnName "EmployeeId")]
}
@ -461,7 +461,7 @@ tests = describe "Object Relationships Tests" $ do
HashMap.fromList
[ ( API.RelationshipName "SupportRepForCustomers",
API.Relationship
{ _rTargetTable = mkTableName "Customer",
{ _rTarget = mkTableTarget "Customer",
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList [(API.ColumnName "EmployeeId", API.ColumnName "SupportRepId")]
}

View File

@ -274,8 +274,7 @@ tests = do
("Title", API.ColumnField (API.ColumnName "Title") (API.ScalarType "string") Nothing)
]
)
& API._QRTable
. API.trForeach
& API.qrForeach
?~ NonEmpty.fromList
[ HashMap.fromList [(API.ColumnName "ArtistId", API.ScalarValue (J.Number 1) (API.ScalarType "number"))],
HashMap.fromList [(API.ColumnName "ArtistId", API.ScalarValue (J.Number 2) (API.ScalarType "number"))]
@ -365,8 +364,7 @@ tests = do
("Title", API.ColumnField (API.ColumnName "Title") (API.ScalarType "string") Nothing)
]
)
& API._QRTable
. API.trForeach
& API.qrForeach
?~ NonEmpty.fromList
[ HashMap.fromList [(API.ColumnName "AlbumId", API.ScalarValue (J.Number 3) (API.ScalarType "number"))],
HashMap.fromList [(API.ColumnName "AlbumId", API.ScalarValue (J.Number 1) (API.ScalarType "number"))],
@ -470,8 +468,7 @@ tests = do
& API.qAggregates
?~ mkFieldsMap [("aggregate_count", API.StarCount)]
)
& API._QRTable
. API.trForeach
& API.qrForeach
?~ NonEmpty.fromList
[ HashMap.fromList [(API.ColumnName "ArtistId", API.ScalarValue (J.Number 1) (API.ScalarType "number"))],
HashMap.fromList [(API.ColumnName "ArtistId", API.ScalarValue (J.Number 2) (API.ScalarType "number"))]

View File

@ -1,5 +1,6 @@
module Test.DataConnector.MockAgent.TestHelpers
( mkTableName,
mkTableTarget,
mkTableRequest,
emptyQuery,
emptyMutationRequest,
@ -15,11 +16,14 @@ import Data.HashMap.Strict qualified as HashMap
import Hasura.Backends.DataConnector.API qualified as API
import Hasura.Prelude
mkTableTarget :: Text -> API.Target
mkTableTarget name = API.TTable (API.TargetTable (mkTableName name))
mkTableName :: Text -> API.TableName
mkTableName name = API.TableName (name :| [])
mkTableRequest :: API.TableName -> API.Query -> API.QueryRequest
mkTableRequest tableName query = API.QRTable $ API.TableRequest tableName mempty mempty query Nothing
mkTableRequest tableName query = API.QueryRequest (API.TTable (API.TargetTable tableName)) mempty mempty mempty query Nothing
emptyQuery :: API.Query
emptyQuery = API.Query Nothing Nothing Nothing Nothing Nothing Nothing Nothing
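The `mkTableRequest` change above reflects the request-shape change on the wire: the old table-specific request constructor is replaced by a single query request carrying a tagged target. A hedged TypeScript sketch, with field names assumed from the Haskell constructor's argument order rather than taken from the actual API types:

```typescript
// Sketch only: QueryRequest now carries a tagged Target; the relationship and
// foreach fields (previously on the table-specific request) live alongside it.
type TableName = string[];
type Target = { type: 'table'; name: TableName };

interface QueryRequest {
  target: Target;
  relationships: unknown[]; // assumed field name
  query: Record<string, unknown>;
  foreach: Record<string, unknown>[] | null; // moved off the table request onto QueryRequest
}

// Analogue of the test helper: build a query request for a plain table target.
const mkTableRequest = (tableName: TableName): QueryRequest => ({
  target: { type: 'table', name: tableName },
  relationships: [],
  query: {},
  foreach: null,
});
```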

View File

@ -192,21 +192,22 @@ tests = do
let expectedRequest =
emptyMutationRequest
& API.mrTableRelationships
& API.mrRelationships
.~ Set.fromList
[ API.TableRelationships
{ API._trelSourceTable = mkTableName "Track",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Genre",
API.Relationship
{ API._rTargetTable = mkTableName "Genre",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "GenreId", API.ColumnName "GenreId")]
}
)
]
}
[ API.RTable
$ API.TableRelationships
{ API._trelSourceTable = mkTableName "Track",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Genre",
API.Relationship
{ API._rTarget = mkTableTarget "Genre",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "GenreId", API.ColumnName "GenreId")]
}
)
]
}
]
& API.mrOperations
.~ [ API.UpdateOperation
@ -382,21 +383,22 @@ tests = do
]
let expectedRequest =
emptyMutationRequest
& API.mrTableRelationships
& API.mrRelationships
.~ Set.fromList
[ API.TableRelationships
{ API._trelSourceTable = mkTableName "Track",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Genre",
API.Relationship
{ API._rTargetTable = mkTableName "Genre",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "GenreId", API.ColumnName "GenreId")]
}
)
]
}
[ API.RTable
$ API.TableRelationships
{ API._trelSourceTable = mkTableName "Track",
API._trelRelationships =
HashMap.fromList
[ ( API.RelationshipName "Genre",
API.Relationship
{ API._rTarget = mkTableTarget "Genre",
API._rRelationshipType = API.ObjectRelationship,
API._rColumnMapping = HashMap.fromList [(API.ColumnName "GenreId", API.ColumnName "GenreId")]
}
)
]
}
]
& API.mrOperations
.~ [ API.UpdateOperation

View File

@ -0,0 +1,844 @@
{-# LANGUAGE QuasiQuotes #-}
{-# OPTIONS_GHC -Wno-incomplete-record-updates #-}
-- | Access to the SQL layer via Native Queries.
module Test.Queries.NativeQueries.NativeQueriesSpec (spec, trackLogicalModels, postgresishTrackNativeQuery, schema, tests, distinctOnTests, basicNativeQuery) where
import Data.Aeson (Value)
import Data.List.NonEmpty qualified as NE
import Database.PG.Query qualified as PG
import Harness.Backend.DataConnector.Sqlite qualified as Sqlite
import Harness.Backend.Postgres qualified as Postgres
import Harness.GraphqlEngine qualified as GraphqlEngine
import Harness.Quoter.Graphql
import Harness.Quoter.Yaml (interpolateYaml, yaml)
import Harness.Schema (Table (..), table)
import Harness.Schema qualified as Schema
import Harness.Test.BackendType qualified as BackendType
import Harness.Test.Fixture qualified as Fixture
import Harness.Test.SetupAction qualified as SetupAction
import Harness.TestEnvironment (GlobalTestEnvironment, TestEnvironment, getBackendTypeConfig, graphQLTypeToText)
import Harness.Yaml (shouldAtLeastBe, shouldBeYaml, shouldReturnYaml)
import Hasura.Prelude
import Test.Hspec (SpecWith, describe, it)
-- ** Main Spec
spec :: SpecWith GlobalTestEnvironment
spec = do
Fixture.hgeWithEnv [] do
Fixture.runClean -- re-run fixture setup on every test
( NE.fromList
[ (Fixture.fixture $ Fixture.Backend Postgres.backendTypeMetadata)
{ Fixture.setupTeardown = \(testEnvironment, _) ->
[ Postgres.setupTablesAction schema testEnvironment,
trackLogicalModels testEnvironment,
postgresishTrackNativeQuery testEnvironment
]
}
]
)
(tests >> distinctOnTests)
Fixture.runClean -- re-run fixture setup on every test
( NE.fromList
[ (Fixture.fixture $ Fixture.Backend Sqlite.backendTypeMetadata)
{ Fixture.setupTeardown = \(testEnvironment, _) ->
[ Sqlite.setupTablesAction schema testEnvironment,
trackLogicalModels testEnvironment,
sqliteTrackNativeQuery testEnvironment
]
}
]
)
tests
-- ** Fixtures
-- Includes Cockroach/Citus
postgresishTrackNativeQuery :: TestEnvironment -> SetupAction.SetupAction
postgresishTrackNativeQuery testEnvironment = SetupAction.noTeardown do
let backendTypeMetadata = fromMaybe (error "Unknown backend") $ getBackendTypeConfig testEnvironment
source = BackendType.backendSourceName backendTypeMetadata
helloNQ = "SELECT * FROM (VALUES ('hello', 'world'), ('welcome', 'friend')) as t(\"one\", \"two\")"
helloNQDups = "SELECT * FROM (VALUES ('hello', 'world'), ('hello', 'friend')) as t(\"one\", \"two\")"
queryWithParam = "SELECT * FROM (VALUES ('hello', 'world'), ('welcome', 'friend')) as t(\"one\", \"two\") where 1={{param}}"
articleNQ name =
mkArticleWithExcerptNativeQuery
name
[PG.sql|
select
id,
"Title",
(substring(content, 1, {{length}}) || (case when length(content) < {{length}} then '' else '...' end)) as excerpt
from article
|]
-- Hello
Schema.trackNativeQuery source (helloNQBasic helloNQ) testEnvironment
Schema.trackNativeQuery source (helloNQWithDuplicates helloNQDups) testEnvironment
Schema.trackNativeQuery source (uppercaseNativeQuery helloNQ) testEnvironment
Schema.trackNativeQuery source (inlineNativeQuery helloNQ) testEnvironment
Schema.trackNativeQuery source (helloSemicolonNQ helloNQ) testEnvironment
Schema.trackNativeQuery source (helloCommentNQ helloNQ) testEnvironment
-- Articles
Schema.trackNativeQuery source (articleNQ "article_with_excerpt") testEnvironment
Schema.trackNativeQuery source (articleNQ "article_with_excerpt_1") testEnvironment
Schema.trackNativeQuery source (articleNQ "article_with_excerpt_2") testEnvironment
-- Nullability
Schema.trackNativeQuery source descriptionsAndNullableNativeQuery testEnvironment
Schema.trackNativeQuery source allowedNullabilityNativeQuery testEnvironment
Schema.trackNativeQuery source disallowedNullabilityNativeQuery testEnvironment
-- Params
Schema.trackNativeQuery source (helloNQWithParam queryWithParam) testEnvironment
sqliteTrackNativeQuery :: TestEnvironment -> SetupAction.SetupAction
sqliteTrackNativeQuery testEnvironment = SetupAction.noTeardown do
let backendTypeMetadata = fromMaybe (error "Unknown backend") $ getBackendTypeConfig testEnvironment
source = BackendType.backendSourceName backendTypeMetadata
-- Is there a nicer way to do this without UNION in SQLite?
helloNQ = "SELECT 'hello' as one, 'world' as two UNION SELECT 'welcome' as one, 'friend' as two"
helloNQDups = "SELECT 'hello' as one, 'world' as two UNION SELECT 'hello' as one, 'friend' as two"
queryWithParam = "SELECT * FROM (SELECT 'hello' as one, 'world' as two UNION SELECT 'welcome' as one, 'friend' as two) WHERE 1={{param}}"
articleNQ name =
mkArticleWithExcerptNativeQuery
name
[PG.sql|
select
id,
"Title",
(substring(content, 1, {{length}}) || (CASE WHEN length(content) < {{length}} THEN '' ELSE '...' END)) as excerpt
from article
|]
-- Hello
Schema.trackNativeQuery source (helloNQBasic helloNQ) testEnvironment
Schema.trackNativeQuery source (helloNQWithDuplicates helloNQDups) testEnvironment
Schema.trackNativeQuery source (uppercaseNativeQuery helloNQ) testEnvironment
Schema.trackNativeQuery source (inlineNativeQuery helloNQ) testEnvironment
Schema.trackNativeQuery source (helloSemicolonNQ helloNQ) testEnvironment
Schema.trackNativeQuery source (helloCommentNQ helloNQ) testEnvironment
-- Articles
Schema.trackNativeQuery source (articleNQ "article_with_excerpt") testEnvironment
Schema.trackNativeQuery source (articleNQ "article_with_excerpt_1") testEnvironment
Schema.trackNativeQuery source (articleNQ "article_with_excerpt_2") testEnvironment
-- Nullability
Schema.trackNativeQuery source descriptionsAndNullableNativeQuery testEnvironment
Schema.trackNativeQuery source allowedNullabilityNativeQuery testEnvironment
Schema.trackNativeQuery source disallowedNullabilityNativeQuery testEnvironment
-- Params
Schema.trackNativeQuery source (helloNQWithParam queryWithParam) testEnvironment
-- ** Logical Models
trackLogicalModels :: TestEnvironment -> Fixture.SetupAction
trackLogicalModels testEnvironment = SetupAction.noTeardown do
let backendTypeMetadata = fromMaybe (error "Unknown backend") $ getBackendTypeConfig testEnvironment
source = BackendType.backendSourceName backendTypeMetadata
Schema.trackLogicalModel source helloWorldLogicalModel testEnvironment
Schema.trackLogicalModel source articleWithExcerptLogicalModel testEnvironment
Schema.trackLogicalModel source nullabilityLogicalModel testEnvironment
Schema.trackLogicalModel source descriptionsAndNullableLogicalModel testEnvironment
where
nullabilityLogicalModel :: Schema.LogicalModel
nullabilityLogicalModel =
(Schema.logicalModel "nullability_model")
{ Schema.logicalModelColumns =
[ Schema.logicalModelScalar "arbitrary_number" Schema.TInt
]
}
helloWorldLogicalModel :: Schema.LogicalModel
helloWorldLogicalModel =
(Schema.logicalModel "hello_world_return_type")
{ Schema.logicalModelColumns =
[ Schema.logicalModelScalar "one" Schema.TStr,
Schema.logicalModelScalar "two" Schema.TStr
]
}
articleWithExcerptLogicalModel :: Schema.LogicalModel
articleWithExcerptLogicalModel =
(Schema.logicalModel "article_with_excerpt")
{ Schema.logicalModelColumns =
[ Schema.logicalModelScalar "id" Schema.TInt,
Schema.logicalModelScalar "Title" Schema.TStr,
Schema.logicalModelScalar "excerpt" Schema.TStr
]
}
descriptionsAndNullableLogicalModel :: Schema.LogicalModel
descriptionsAndNullableLogicalModel =
(Schema.logicalModel "divided_stuff")
{ Schema.logicalModelColumns =
[ (Schema.logicalModelScalar "divided" Schema.TInt)
{ Schema.logicalModelColumnDescription = Just "A divided thing"
},
(Schema.logicalModelScalar "something_nullable" Schema.TStr)
{ Schema.logicalModelColumnDescription = Just "Something nullable",
Schema.logicalModelColumnNullable = True
}
],
Schema.logicalModelDescription = Just "Return type description"
}
-- ** Helpers
basicNativeQuery :: Text -> Text -> Text -> Schema.NativeQuery
basicNativeQuery name query returns = Schema.nativeQuery name (const query) returns
-- TODO: A lot of these "hello" function helpers are the same thing and can be generic
helloNQBasic :: Text -> Schema.NativeQuery
helloNQBasic sql = basicNativeQuery "hello_world_function" sql "hello_world_return_type"
helloSemicolonNQ :: Text -> Schema.NativeQuery
helloSemicolonNQ sql = basicNativeQuery "hello_semicolon_function" (sql <> "; \n") "hello_world_return_type"
helloCommentNQ :: Text -> Schema.NativeQuery
helloCommentNQ sql = basicNativeQuery "hello_comment_function" (sql <> " -- my query") "hello_world_return_type"
-- NOTE: This NQ isn't currently executed in any of the tests; it checks that missing parameters give useful errors
helloNQWithParam :: Text -> Schema.NativeQuery
helloNQWithParam sql =
(Schema.nativeQuery "hello_world_function_with_arg" (const sql) "hello_world_return_type")
{ Schema.nativeQueryArguments =
[Schema.nativeQueryColumn "param" Schema.TInt]
}
inlineNativeQuery :: Text -> Schema.NativeQuery
inlineNativeQuery sql =
( Schema.inlineNativeQuery
"hello_world_function_inline"
(const sql)
[ Schema.logicalModelScalar "one" Schema.TStr,
Schema.logicalModelScalar "two" Schema.TStr
]
)
helloNQWithDuplicates :: Text -> Schema.NativeQuery
helloNQWithDuplicates queryWithDuplicates =
(Schema.nativeQuery "hello_world_function_duplicates" (const queryWithDuplicates) "hello_world_return_type")
-- The NQ Type `TInt` should be mapped to `INTEGER` via `backendScalarType :: ScalarType -> Text` field.
mkArticleWithExcerptNativeQuery :: Text -> Text -> Schema.NativeQuery
mkArticleWithExcerptNativeQuery name sql =
(Schema.nativeQuery name (const sql) "article_with_excerpt")
{ Schema.nativeQueryArguments =
[ Schema.nativeQueryColumn "length" Schema.TInt
]
}
-- NOTE: May want to extend to other backends with SQL text param
descriptionsAndNullableNativeQuery :: Schema.NativeQuery
descriptionsAndNullableNativeQuery =
Schema.nativeQuery "divided_stuff" (const nullableQuery) "divided_stuff"
where
nullableQuery = "SELECT (thing / 2)::integer AS divided, null::text as something_nullable FROM stuff"
disallowedNullabilityNativeQuery :: Schema.NativeQuery
disallowedNullabilityNativeQuery =
(Schema.nativeQuery "non_nullability" (const nullabilityQuery) "nullability_model")
{ Schema.nativeQueryArguments =
[ (Schema.nativeQueryColumn "index" Schema.TInt)
{ Schema.nativeQueryColumnNullable = False
}
]
}
-- NOTE: May want to extend to other backends with SQL text param
allowedNullabilityNativeQuery :: Schema.NativeQuery
allowedNullabilityNativeQuery =
(Schema.nativeQuery "nullability" (const nullabilityQuery) "nullability_model")
{ Schema.nativeQueryArguments =
[ (Schema.nativeQueryColumn "index" Schema.TInt)
{ Schema.nativeQueryColumnNullable = True
}
]
}
nullabilityQuery :: Text
nullabilityQuery = "SELECT coalesce({{index}}, 100) as arbitrary_number"
uppercaseNativeQuery :: Text -> Schema.NativeQuery
uppercaseNativeQuery sql = (Schema.nativeQuery "UppercaseNativeQuery" (const sql) "hello_world_return_type")
-- ** Setup and teardown
-- we add and track a table here as it's the only way we can currently define a
-- return type
schema :: [Schema.Table]
schema =
[ (table "stuff")
{ tableColumns =
[ Schema.column "thing" Schema.TInt,
Schema.column "date" Schema.TUTCTime
]
},
(table "article")
{ tableColumns =
[ Schema.column "id" Schema.TInt,
Schema.column "Title" Schema.TStr,
Schema.column "content" Schema.TStr
],
tableData =
[ [ Schema.VInt 1,
Schema.VStr "Dogs",
Schema.VStr "I like to eat dog food I am a dogs I like to eat dog food I am a dogs I like to eat dog food I am a dogs"
]
]
},
(Schema.table "articles")
{ Schema.tableColumns =
[ Schema.column "id" Schema.TInt,
Schema.column "author_id" Schema.TInt,
Schema.column "title" Schema.TStr,
Schema.column "content" Schema.TStr
],
Schema.tableData =
[ [Schema.VInt 1, Schema.VInt 1, Schema.VStr "Fright Knight", Schema.VStr "Well, well, well"],
[Schema.VInt 2, Schema.VInt 2, Schema.VStr "Man to Man", Schema.VStr "Well2, well2, well2"]
]
},
(Schema.table "authors")
{ Schema.tableColumns =
[ Schema.column "id" Schema.TInt,
Schema.column "name" Schema.TStr
],
Schema.tableData =
[ [Schema.VInt 1, Schema.VStr "Marenghi"],
[Schema.VInt 2, Schema.VStr "Learner"]
]
}
]
-- ** Tests
-- | These should be defined separately since some backends have issues with distinct-on.
-- We run these on everything except SQLServer and SQLite, because they don't have distinct_on implemented.
distinctOnTests :: SpecWith TestEnvironment
distinctOnTests = do
describe "Distinct_on tests"
$ it "Runs a simple query using distinct_on and order_by"
$ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function_duplicates:
- one: "hello"
two: "world"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function_duplicates (
distinct_on: [one]
order_by: [{one:asc}]
){
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
tests :: SpecWith TestEnvironment
tests = do
describe "Testing Native Queries" $ do
it "Runs a simple query that takes one parameter and uses it multiple times" $ \testEnvironment -> do
let actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
article_with_excerpt(args: { length: 34 }) {
id
Title
excerpt
}
}
|]
expected =
[yaml|
data:
article_with_excerpt:
- id: 1
Title: "Dogs"
excerpt: "I like to eat dog food I am a dogs..."
|]
shouldReturnYaml testEnvironment actual expected
it "Uses two queries with the same argument names and ensures they don't interfere with one another" $ \testEnvironment -> do
let actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
article_with_excerpt_1(args: { length: 34 }) {
excerpt
}
article_with_excerpt_2(args: { length: 13 }) {
excerpt
}
}
|]
expected =
[yaml|
data:
article_with_excerpt_1:
- excerpt: "I like to eat dog food I am a dogs..."
article_with_excerpt_2:
- excerpt: "I like to eat..."
|]
shouldReturnYaml testEnvironment actual expected
it "Uses a one-parameter query and calls it multiple times" $ \testEnvironment -> do
let actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
first: article_with_excerpt(args: { length: 34 }) {
excerpt
}
second: article_with_excerpt(args: { length: 13 }) {
excerpt
}
}
|]
expected =
[yaml|
data:
first:
- excerpt: "I like to eat dog food I am a dogs..."
second:
- excerpt: "I like to eat..."
|]
shouldReturnYaml testEnvironment actual expected
it "Uses a one-parameter query, passing it a GraphQL variable" $ \testEnvironment -> do
let lengthType = graphQLTypeToText testEnvironment Schema.TInt
let variables =
[yaml|
length: 34
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphqlWithVariables
testEnvironment
[graphql|
query MyQuery($length: #{lengthType}!) {
article_with_excerpt(args: { length: $length }) {
excerpt
}
}
|]
variables
expected =
[yaml|
data:
article_with_excerpt:
- excerpt: "I like to eat dog food I am a dogs..."
|]
shouldReturnYaml testEnvironment actual expected
describe "Parameter nullability" $ do
it "Query with non-nullable parameter does not accept null" $ \testEnvironment -> do
actual <-
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
non_nullability(args: { index: null }) {
arbitrary_number
}
}
|]
let expected =
[yaml|
errors:
- extensions:
code: validation-failed
path: $.selectionSet.non_nullability.args.args.index
|]
actual `shouldAtLeastBe` expected
it "Query with nullable parameter accepts a null value" $ \testEnvironment -> do
let actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
notnull: nullability(args: { index: 42 }) {
arbitrary_number
}
withnull: nullability(args: { index: null }) {
arbitrary_number
}
}
|]
expected =
[yaml|
data:
notnull:
- arbitrary_number: 42
withnull:
- arbitrary_number: 100
|]
shouldReturnYaml testEnvironment actual expected
it "Explain works" $ \testEnvironment -> do
let explain :: Value
explain =
[graphql|
query {
hello_world_function (where: { two: { _eq: "world" } }){
one
two
}
}
|]
expected =
[interpolateYaml|
[{
"field": "hello_world_function"
}]
|]
actual <- GraphqlEngine.postExplain testEnvironment explain
actual `shouldAtLeastBe` expected
it "Descriptions and nullability appear in the schema" $ \testEnvironment -> do
let intType = graphQLTypeToText testEnvironment Schema.TInt
let stringType = graphQLTypeToText testEnvironment Schema.TStr
let queryTypesIntrospection :: Value
queryTypesIntrospection =
[graphql|
query {
__type(name: "divided_stuff") {
name
description
fields {
name
description
type {
name
kind
ofType {
name
}
}
}
}
}
|]
expected =
[interpolateYaml|
{
"data": {
"__type": {
"description": "Return type description",
"fields": [
{
"description": "A divided thing",
"name": "divided",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"name": "#{intType}"
}
}
},
{
"description": "Something nullable",
"name": "something_nullable",
"type": {
"kind": "SCALAR",
"name": "#{stringType}",
"ofType": null
}
}
],
"name": "divided_stuff"
}
}
}
|]
actual <- GraphqlEngine.postGraphql testEnvironment queryTypesIntrospection
actual `shouldBeYaml` expected
it "Runs the absolute simplest query that takes no parameters" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function:
- one: "hello"
two: "world"
- one: "welcome"
two: "friend"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query with uppercase letters in the name" $ \testEnvironment -> do
let expected =
[yaml|
data:
UppercaseNativeQuery:
- one: "hello"
two: "world"
- one: "welcome"
two: "friend"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
UppercaseNativeQuery {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs simple query with a basic where clause" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function:
- one: "hello"
two: "world"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function (where: { two: { _eq: "world" } }){
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query defined with an inline Logical Model" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function_inline:
- one: "hello"
two: "world"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function_inline(where: {one: {_eq: "hello"}}) {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query that takes no parameters" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function:
- one: "hello"
two: "world"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function(where: {one: {_eq: "hello"}}) {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query that has an order_by" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_world_function:
- two: "world"
- two: "friend"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function(order_by: {one: asc}) {
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query with a parameter that returns a nice error when we do not pass that parameter" $ \testEnvironment -> do
let expected =
[yaml|
errors:
- extensions:
code: validation-failed
path: $.selectionSet.hello_world_function_with_arg.args.args.param
message: missing required field 'param'
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function_with_arg(args: {}, order_by: {one: asc}) {
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query that takes one dummy parameter and returns a nice error when we do not pass any args field" $ \testEnvironment -> do
let expected =
[yaml|
errors:
- extensions:
code: validation-failed
path: $.selectionSet.hello_world_function_with_arg.args.args
message: missing required field 'args'
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_world_function_with_arg(order_by: {one: asc}) {
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query that takes no parameters but ends with a comment" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_comment_function:
- one: "hello"
two: "world"
- one: "welcome"
two: "friend"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_comment_function {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected
it "Runs a simple query that takes no parameters but ends with a semicolon" $ \testEnvironment -> do
let expected =
[yaml|
data:
hello_semicolon_function:
- one: "hello"
two: "world"
- one: "welcome"
two: "friend"
|]
actual :: IO Value
actual =
GraphqlEngine.postGraphql
testEnvironment
[graphql|
query {
hello_semicolon_function {
one
two
}
}
|]
shouldReturnYaml testEnvironment actual expected

View File

@ -2,7 +2,7 @@
-- Native Queries is a pro-only feature now for anything but Postgres.
-- This test ensures that this continues to be the case.
module Test.Queries.NativeQueriesSpec (spec) where
module Test.Queries.NativeQueries.ProOnlySpec (spec) where
import Data.List.NonEmpty qualified as NE
import Harness.Backend.Postgres qualified as Postgres

View File

@ -101,6 +101,7 @@ library
Hasura.Backends.DataConnector.API.V0.Schema
Hasura.Backends.DataConnector.API.V0.Table
Hasura.Backends.DataConnector.API.V0.Target
Hasura.Backends.DataConnector.API.V0.InterpolatedQuery
Hasura.Backends.DataConnector.API.V0.Dataset
other-modules:
Hasura.Backends.DataConnector.API.V0.Name

View File

@ -6,6 +6,7 @@ module Hasura.Backends.DataConnector.API.V0
module Expression,
module ErrorResponse,
module Function,
module InterpolatedQuery,
module Mutations,
module OrderBy,
module Query,
@ -29,6 +30,7 @@ import Hasura.Backends.DataConnector.API.V0.ErrorResponse as ErrorResponse
import Hasura.Backends.DataConnector.API.V0.Explain as Explain
import Hasura.Backends.DataConnector.API.V0.Expression as Expression
import Hasura.Backends.DataConnector.API.V0.Function as Function
import Hasura.Backends.DataConnector.API.V0.InterpolatedQuery as InterpolatedQuery
import Hasura.Backends.DataConnector.API.V0.Mutations as Mutations
import Hasura.Backends.DataConnector.API.V0.OrderBy as OrderBy
import Hasura.Backends.DataConnector.API.V0.Query as Query


@ -16,6 +16,7 @@ module Hasura.Backends.DataConnector.API.V0.Capabilities
cSubscriptions,
cScalarTypes,
cRelationships,
cInterpolatedQueries,
cComparisons,
cMetrics,
cExplain,
@ -52,6 +53,7 @@ module Hasura.Backends.DataConnector.API.V0.Capabilities
ScalarTypeCapabilities (..),
ScalarTypesCapabilities (..),
RelationshipCapabilities (..),
InterpolatedQueryCapabilities (..),
ComparisonCapabilities (..),
SubqueryComparisonCapabilities (..),
MetricsCapabilities (..),
@ -101,6 +103,7 @@ data Capabilities = Capabilities
_cSubscriptions :: Maybe SubscriptionCapabilities,
_cScalarTypes :: ScalarTypesCapabilities,
_cRelationships :: Maybe RelationshipCapabilities,
_cInterpolatedQueries :: Maybe InterpolatedQueryCapabilities,
_cComparisons :: Maybe ComparisonCapabilities,
_cMetrics :: Maybe MetricsCapabilities,
_cExplain :: Maybe ExplainCapabilities,
@ -114,7 +117,7 @@ data Capabilities = Capabilities
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Capabilities
defaultCapabilities :: Capabilities
defaultCapabilities = Capabilities defaultDataSchemaCapabilities Nothing Nothing Nothing mempty Nothing Nothing Nothing Nothing Nothing Nothing Nothing Nothing
defaultCapabilities = Capabilities defaultDataSchemaCapabilities Nothing Nothing Nothing mempty Nothing Nothing Nothing Nothing Nothing Nothing Nothing Nothing Nothing
instance HasCodec Capabilities where
codec =
@ -126,6 +129,7 @@ instance HasCodec Capabilities where
<*> optionalField "subscriptions" "The agent's subscription capabilities" .= _cSubscriptions
<*> optionalFieldWithOmittedDefault "scalar_types" mempty "The agent's scalar types and their capabilities" .= _cScalarTypes
<*> optionalField "relationships" "The agent's relationship capabilities" .= _cRelationships
<*> optionalField "interpolated_queries" "The agent's interpolated (native) query capabilities" .= _cInterpolatedQueries
<*> optionalField "comparisons" "The agent's comparison capabilities" .= _cComparisons
<*> optionalField "metrics" "The agent's metrics capabilities" .= _cMetrics
<*> optionalField "explain" "The agent's explain capabilities" .= _cExplain
@ -309,6 +313,14 @@ data RelationshipCapabilities = RelationshipCapabilities {}
instance HasCodec RelationshipCapabilities where
codec = object "RelationshipCapabilities" $ pure RelationshipCapabilities
data InterpolatedQueryCapabilities = InterpolatedQueryCapabilities {}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec InterpolatedQueryCapabilities
instance HasCodec InterpolatedQueryCapabilities where
codec = object "InterpolatedQueryCapabilities" $ pure InterpolatedQueryCapabilities
newtype ComparisonOperators = ComparisonOperators
{ unComparisonOperators :: HashMap GQL.Syntax.Name ScalarType
}


@ -10,6 +10,8 @@ module Hasura.Backends.DataConnector.API.V0.Function
FunctionInfo (..),
FunctionType (..),
FunctionArg (..),
FunctionArgument (..),
ArgumentValue (..),
FunctionReturnType (..),
FunctionArity (..),
functionNameToText,
@ -22,6 +24,9 @@ module Hasura.Backends.DataConnector.API.V0.Function
faInputArgOptional,
faInputArgName,
faInputArgType,
faName,
faValue,
savValue,
_FunctionReturnsTable,
_FunctionReturnsUnknown,
)
@ -80,16 +85,17 @@ data FunctionType = FRead | FWrite
instance HasCodec FunctionType where
codec = named "FunctionType" $ stringConstCodec [(FRead, "read"), (FWrite, "write")]
-- | FunctionArg represents the args exposed in the agent schema
--
-- TODO: This should be extended to support positional args, etc. in future
-- Example: `data FunctionArgIdentifier = NamedArg Text | PositionalArg Int`
-- Serialized: `{ "type": "name", "name": "arg1" }` or `{ "type": "positional", "index": 0 }`
--
data FunctionArg = FunctionArg
{ _faInputArgName :: Text,
_faInputArgType :: API.ScalarType,
_faInputArgOptional :: Bool
}
deriving stock (Eq, Ord, Show, Generic)
deriving stock (Eq, Ord, Show, Generic, Data)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec FunctionArg
@ -127,6 +133,49 @@ instance HasCodec FunctionReturnType where
("unknown", ("FunctionReturnsUnknown", pure FunctionReturnsUnknown))
]
-- | FunctionArgument represents arguments sent via the request IR to the agent.
data FunctionArgument = NamedArgument
{ _faName :: Text,
_faValue :: ArgumentValue
}
deriving stock (Eq, Ord, Show, Generic, Data)
-- | This type wrapper exists to allow easy extension to different types in the future.
newtype ArgumentValue = ScalarArgumentValue
{ _savValue :: API.ScalarValue
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasCodec ArgumentValue where
codec =
object "ArgumentValue" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
(ScalarArgumentValue n) -> ("scalar", mapToEncoder n objectCodec)
dec =
HashMap.fromList
[ ("scalar", ("ScalarArgumentValue", mapToDecoder ScalarArgumentValue objectCodec))
]
namedArgumentObjectCodec :: JSONObjectCodec FunctionArgument
namedArgumentObjectCodec =
NamedArgument
<$> requiredField "name" "The name of the named argument" .= _faName
<*> requiredField "value" "The value of the named argument" .= _faValue
instance HasCodec FunctionArgument where
codec =
object "FunctionRequestArgument" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
n -> ("named", mapToEncoder n namedArgumentObjectCodec)
dec =
HashMap.fromList
[ ("named", ("NamedArgument", mapToDecoder id namedArgumentObjectCodec))
]
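
Putting the two codecs above together, a named argument carrying a scalar value would serialize roughly as follows. This is a sketch: the `value`/`value_type` field names are assumed from `ScalarValue`'s codec, which is defined in another module, and the `number` scalar type name is illustrative.

```json
{
  "type": "named",
  "name": "arg1",
  "value": {
    "type": "scalar",
    "value": 42,
    "value_type": "number"
  }
}
```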
data FunctionArity = FunctionArityOne | FunctionArityMany
deriving stock (Eq, Show, Ord, Generic)
deriving anyclass (NFData, Hashable)
@ -169,6 +218,8 @@ instance HasCodec FunctionInfo where
$(makeLenses ''FunctionInfo)
$(makeLenses ''FunctionArg)
$(makeLenses ''FunctionArgument)
$(makeLenses ''ArgumentValue)
$(makePrisms ''FunctionReturnType)
$(makePrisms ''FunctionArity)
$(makePrisms ''FunctionType)


@ -0,0 +1,87 @@
{-# LANGUAGE TemplateHaskell #-}
module Hasura.Backends.DataConnector.API.V0.InterpolatedQuery
( InterpolatedQueries (..),
InterpolatedQuery (..),
iqId,
iqItems,
InterpolatedQueryId (..),
interpolatedQueryId,
InterpolatedItem (..),
_InterpolatedText,
_InterpolatedScalar,
)
where
import Autodocodec (discriminatedUnionCodec, mapToDecoder, mapToEncoder, objectCodec, requiredField')
import Autodocodec.Extended (HasCodec (codec), dimapCodec, named, object, requiredField, (.=))
import Autodocodec.OpenAPI ()
import Control.DeepSeq (NFData)
import Control.Lens.TH (makeLenses, makePrisms)
import Data.Aeson qualified as J
import Data.Data (Data)
import Data.HashMap.Strict qualified as HashMap
import Data.Hashable (Hashable)
import Data.Text (Text)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.Scalar qualified as API.V0
import Prelude
-- | Convenience wrapper for set of queries - Useful for Has tuple instances.
newtype InterpolatedQueries = InterpolatedQueries
{unInterpolatedQueries :: HashMap.HashMap InterpolatedQueryId InterpolatedQuery}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving newtype (Semigroup, Monoid)
instance HasCodec InterpolatedQueries where
codec = named "InterpolatedQueries" $ dimapCodec InterpolatedQueries unInterpolatedQueries codec
data InterpolatedQuery = InterpolatedQuery
{ -- | NOTE: We may not need this in the query itself, since we could just use the map key, but it might be handy to have in situ for convenience
_iqId :: InterpolatedQueryId,
_iqItems :: [InterpolatedItem]
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasCodec InterpolatedQuery where
codec =
object "InterpolatedQuery" $
InterpolatedQuery
<$> requiredField "id" "An id associated with the interpolated query - Should be unique across the request" .= _iqId
<*> requiredField "items" "Interpolated items in the query" .= _iqItems
-- | Newtype to help keep interpolated IDs distinct from other text
newtype InterpolatedQueryId = InterpolatedQueryId
{_interpolatedQueryId :: Text}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving newtype (NFData, Hashable, J.FromJSONKey, J.ToJSONKey)
instance HasCodec InterpolatedQueryId where
codec = dimapCodec InterpolatedQueryId _interpolatedQueryId codec
data InterpolatedItem
= InterpolatedText Text
| InterpolatedScalar API.V0.ScalarValue
-- TODO: Implement scalar arrays
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasCodec InterpolatedItem where
codec =
named "InterpolatedItem" $
object "InterpolatedItem" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
InterpolatedText t -> ("text", mapToEncoder t interpolatedTextObjectCodec)
InterpolatedScalar s -> ("scalar", mapToEncoder s objectCodec)
dec =
HashMap.fromList
[ ("text", ("InterpolatedText", mapToDecoder InterpolatedText interpolatedTextObjectCodec)),
("scalar", ("InterpolatedScalar", mapToDecoder InterpolatedScalar objectCodec))
]
interpolatedTextObjectCodec = requiredField' "value"
$(makeLenses ''InterpolatedQueries)
$(makeLenses ''InterpolatedQuery)
$(makeLenses ''InterpolatedQueryId)
$(makePrisms ''InterpolatedItem)
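
Taken together, these codecs would render a native query template such as `SELECT * FROM artists WHERE id = <parameter>` roughly as below. This is a sketch: the scalar item's `value`/`value_type` shape is assumed from `ScalarValue`'s codec in another module, and the `number` type name is illustrative.

```json
{
  "id": "native_query_1",
  "items": [
    { "type": "text", "value": "SELECT * FROM artists WHERE id = " },
    { "type": "scalar", "value": 5, "value_type": "number" }
  ]
}
```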


@ -4,7 +4,7 @@
module Hasura.Backends.DataConnector.API.V0.Mutations
( MutationRequest (..),
mrTableRelationships,
mrRelationships,
mrRedactionExpressions,
mrInsertSchema,
mrOperations,
@ -82,7 +82,7 @@ import Prelude
--
-- TODO: Does this need to be enhanced à la QueryRequest to support FunctionRequests?
data MutationRequest = MutationRequest
{ _mrTableRelationships :: Set API.V0.TableRelationships,
{ _mrRelationships :: Set API.V0.Relationships,
_mrRedactionExpressions :: Set API.V0.TargetRedactionExpressions,
_mrInsertSchema :: Set TableInsertSchema,
_mrOperations :: [MutationOperation]
@ -95,8 +95,8 @@ instance HasCodec MutationRequest where
codec =
object "MutationRequest" $
MutationRequest
<$> requiredField "table_relationships" "The relationships between tables involved in the entire mutation request"
.= _mrTableRelationships
<$> requiredField "relationships" "The relationships involved in the entire mutation request"
.= _mrRelationships
<*> optionalFieldWithOmittedDefault "redaction_expressions" mempty "Expressions that can be referenced by the query to redact fields/columns"
.= _mrRedactionExpressions
<*> requiredField "insert_schema" "The schema by which to interpret row data specified in any insert operations in this request"


@ -4,28 +4,14 @@
module Hasura.Backends.DataConnector.API.V0.Query
( QueryRequest (..),
_QRTable,
_QRFunction,
TableRequest (..),
pattern TableQueryRequest,
FunctionRequest (..),
pattern FunctionQueryRequest,
FunctionArgument (..),
ArgumentValue (..),
pattern NativeQueryRequest,
qrRelationships,
qrRedactionExpressions,
qrQuery,
qrForeach,
trTable,
trRelationships,
trRedactionExpressions,
trQuery,
trForeach,
frFunction,
frFunctionArguments,
frQuery,
frRelationships,
frRedactionExpressions,
qrTarget,
FieldName (..),
Query (..),
qFields,
@ -45,6 +31,7 @@ module Hasura.Backends.DataConnector.API.V0.Query
QueryResponse (..),
qrRows,
qrAggregates,
qrInterpolatedQueries,
FieldValue,
mkColumnFieldValue,
mkRelationshipFieldValue,
@ -65,7 +52,7 @@ where
import Autodocodec.Extended
import Autodocodec.OpenAPI ()
import Control.Arrow (left)
import Control.Lens (Lens', Prism', Traversal', lens, prism')
import Control.Lens (Lens', Prism', lens, prism')
import Control.Lens.TH (makeLenses, makePrisms)
import Data.Aeson (FromJSON, ToJSON, Value)
import Data.Aeson qualified as J
@ -86,161 +73,56 @@ import Hasura.Backends.DataConnector.API.V0.Aggregate qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Column qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Expression qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Function qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.InterpolatedQuery qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.OrderBy qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Relationships qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Scalar qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Table qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Target qualified as API.V0
import Servant.API (HasStatus (..))
import Prelude
-- | A serializable request to retrieve structured data from some
-- source.
data QueryRequest
= QRTable TableRequest
| QRFunction FunctionRequest
-- | A serializable request to retrieve structured data from tables.
data QueryRequest = QueryRequest
{ _qrTarget :: API.V0.Target,
_qrRelationships :: Set API.V0.Relationships,
_qrRedactionExpressions :: Set API.V0.TargetRedactionExpressions,
_qrInterpolatedQueries :: API.V0.InterpolatedQueries,
_qrQuery :: Query,
_qrForeach :: Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue))
}
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec QueryRequest
-- Can we build this with existing traversals without a case?
qrRelationships :: Lens' QueryRequest (Set API.V0.Relationships)
qrRelationships = lens get set
where
get (QRTable (TableRequest {_trRelationships})) = _trRelationships
get (QRFunction (FunctionRequest {_frRelationships})) = _frRelationships
set (QRTable qrt) r = QRTable (qrt {_trRelationships = r})
set (QRFunction qrf) r = QRFunction (qrf {_frRelationships = r})
qrQuery :: Lens' QueryRequest Query
qrQuery = lens get set
where
get (QRTable (TableRequest {_trQuery})) = _trQuery
get (QRFunction (FunctionRequest {_frQuery})) = _frQuery
set (QRTable qrt) x = QRTable (qrt {_trQuery = x})
set (QRFunction qrf) x = QRFunction (qrf {_frQuery = x})
qrRedactionExpressions :: Lens' QueryRequest (Set API.V0.TargetRedactionExpressions)
qrRedactionExpressions = lens get set
where
get (QRTable (TableRequest {_trRedactionExpressions})) = _trRedactionExpressions
get (QRFunction (FunctionRequest {_frRedactionExpressions})) = _frRedactionExpressions
set (QRTable qrt) x = QRTable (qrt {_trRedactionExpressions = x})
set (QRFunction qrf) x = QRFunction (qrf {_frRedactionExpressions = x})
instance HasCodec QueryRequest where
codec =
named "QueryRequest" $
object "QueryRequest" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
QRTable qrt -> ("table", mapToEncoder qrt objectCodec)
QRFunction qrf -> ("function", mapToEncoder qrf objectCodec)
dec =
HashMap.fromList
[ ("table", ("TableRequest", mapToDecoder QRTable objectCodec)),
("function", ("FunctionRequest", mapToDecoder QRFunction objectCodec))
]
object "QueryRequest" objectCodec
pattern TableQueryRequest :: API.V0.TableName -> Set API.V0.Relationships -> Set API.V0.TargetRedactionExpressions -> Query -> Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue)) -> QueryRequest
pattern TableQueryRequest table relationships redactionExps query foreach = QRTable (TableRequest table relationships redactionExps query foreach)
-- | A serializable request to retrieve structured data from tables.
data TableRequest = TableRequest
{ _trTable :: API.V0.TableName,
_trRelationships :: Set API.V0.Relationships,
_trRedactionExpressions :: Set API.V0.TargetRedactionExpressions,
_trQuery :: Query,
_trForeach :: Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue))
}
deriving stock (Eq, Ord, Show, Generic)
instance HasObjectCodec TableRequest where
instance HasObjectCodec QueryRequest where
objectCodec =
TableRequest
<$> requiredField "table" "The name of the table to query"
.= _trTable
-- TODO: Rename this field to "relationships" at some point in the future ala FunctionRequest.
-- NOTE: This can't be done immediately as it would break compatibility in agents.
<*> requiredField "table_relationships" "The relationships between tables involved in the entire query request"
.= _trRelationships
QueryRequest
<$> requiredField "target" "The target of the query"
.= _qrTarget
<*> requiredField "relationships" "The relationships between tables involved in the entire query request"
.= _qrRelationships
<*> optionalFieldWithOmittedDefault "redaction_expressions" mempty "Expressions that can be referenced by the query to redact fields/columns"
.= _trRedactionExpressions
.= _qrRedactionExpressions
<*> optionalFieldWithOmittedDefault "interpolated_queries" mempty "The interpolated queries referenced in the request"
.= _qrInterpolatedQueries
<*> requiredField "query" "The details of the query against the table"
.= _trQuery
.= _qrQuery
<*> optionalFieldOrNull "foreach" "If present, a list of columns and values for the columns that the query must be repeated for, applying the column values as a filter for each query."
.= _trForeach
.= _qrForeach
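
With the old discriminated union replaced by an inline `target`, a request against an interpolated query sketches out as below. Only the top-level field names come from the codec above; the `query` body, the column field shape, and the query id are illustrative assumptions.

```json
{
  "target": { "type": "interpolated", "id": "native_query_1" },
  "relationships": [],
  "interpolated_queries": {
    "native_query_1": {
      "id": "native_query_1",
      "items": [
        { "type": "text", "value": "SELECT one, two FROM hello_world" }
      ]
    }
  },
  "query": {
    "fields": {
      "one": { "type": "column", "column": "one", "column_type": "string" }
    }
  }
}
```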
pattern FunctionQueryRequest :: API.V0.FunctionName -> [FunctionArgument] -> Set API.V0.Relationships -> Set API.V0.TargetRedactionExpressions -> Query -> QueryRequest
pattern FunctionQueryRequest function args relationships redactionExps query = QRFunction (FunctionRequest function args relationships redactionExps query)
pattern FunctionQueryRequest :: API.V0.FunctionName -> [API.V0.FunctionArgument] -> Set API.V0.Relationships -> Set API.V0.TargetRedactionExpressions -> API.V0.InterpolatedQueries -> Query -> Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue)) -> QueryRequest
pattern FunctionQueryRequest function args relationships redactionExps interpolated query foreach = QueryRequest (API.V0.TFunction (API.V0.TargetFunction function args)) relationships redactionExps interpolated query foreach
-- | A serializable request to compute structured data from a function.
data FunctionRequest = FunctionRequest
{ _frFunction :: API.V0.FunctionName,
_frFunctionArguments :: [FunctionArgument],
_frRelationships :: Set API.V0.Relationships,
_frRedactionExpressions :: Set API.V0.TargetRedactionExpressions,
_frQuery :: Query
}
deriving stock (Eq, Ord, Show, Generic)
pattern TableQueryRequest :: API.V0.TableName -> Set API.V0.Relationships -> Set API.V0.TargetRedactionExpressions -> API.V0.InterpolatedQueries -> Query -> Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue)) -> QueryRequest
pattern TableQueryRequest table relationships redactionExps interpolated query foreach = QueryRequest (API.V0.TTable (API.V0.TargetTable table)) relationships redactionExps interpolated query foreach
-- | Note: Only named arguments are currently supported,
-- however this is reified explicitly so that it can be extended to ordinal or other types in the future.
-- We reuse the same type for the Codec and ObjectCodec since we only have one constructor but still
-- wish to make the type explicit.
data FunctionArgument = NamedArgument
{ _faName :: Text,
_faValue :: ArgumentValue
}
deriving stock (Eq, Ord, Show, Generic)
newtype ArgumentValue = ScalarArgumentValue
{ _savValue :: API.V0.ScalarValue
}
deriving stock (Eq, Ord, Show, Generic)
instance HasCodec ArgumentValue where
codec =
object "ArgumentValue" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
(ScalarArgumentValue n) -> ("scalar", mapToEncoder n objectCodec)
dec =
HashMap.fromList
[ ("scalar", ("ScalarArgumentValue", mapToDecoder ScalarArgumentValue objectCodec))
]
namedArgumentObjectCodec :: JSONObjectCodec FunctionArgument
namedArgumentObjectCodec =
NamedArgument
<$> requiredField "name" "The name of the named argument" .= _faName
<*> requiredField "value" "The value of the named argument" .= _faValue
instance HasCodec FunctionArgument where
codec =
object "FunctionRequestArgument" $
discriminatedUnionCodec "type" enc dec
where
enc = \case
n -> ("named", mapToEncoder n namedArgumentObjectCodec)
dec =
HashMap.fromList
[ ("named", ("NamedArgument", mapToDecoder id namedArgumentObjectCodec))
]
instance HasObjectCodec FunctionRequest where
objectCodec =
FunctionRequest
<$> requiredField "function" "The name of the function to query"
.= _frFunction
<*> optionalFieldWithDefault "function_arguments" mempty "Function Arguments. TODO. Improve this."
.= _frFunctionArguments
<*> requiredField "relationships" "The relationships between entities involved in the entire query request"
.= _frRelationships
<*> optionalFieldWithOmittedDefault "redaction_expressions" mempty "Expressions that can be referenced by the query to redact fields/columns"
.= _frRedactionExpressions
<*> requiredField "query" "The details of the query against the table"
.= _frQuery
pattern NativeQueryRequest :: API.V0.InterpolatedQueryId -> Set API.V0.Relationships -> Set API.V0.TargetRedactionExpressions -> API.V0.InterpolatedQueries -> Query -> Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue)) -> QueryRequest
pattern NativeQueryRequest interpolatedId relationships redactionExps interpolatedQueries query foreach = QueryRequest (API.V0.TInterpolated (API.V0.TargetInterpolatedQuery interpolatedId)) relationships redactionExps interpolatedQueries query foreach
newtype FieldName = FieldName {unFieldName :: Text}
deriving stock (Eq, Ord, Show, Generic, Data)
@ -520,13 +402,8 @@ _NestedArrayFieldValue :: Prism' FieldValue [FieldValue]
_NestedArrayFieldValue = prism' mkNestedArrayFieldValue (either (const Nothing) Just . deserializeAsNestedArrayFieldValue)
$(makePrisms ''QueryRequest)
$(makeLenses ''TableRequest)
$(makeLenses ''FunctionRequest)
$(makeLenses ''QueryRequest)
$(makeLenses ''Query)
$(makePrisms ''Field)
$(makeLenses ''QueryResponse)
$(makePrisms ''FieldValue)
qrForeach :: Traversal' QueryRequest (Maybe (NonEmpty (HashMap API.V0.ColumnName API.V0.ScalarValue)))
qrForeach = _QRTable . trForeach


@ -8,12 +8,13 @@ module Hasura.Backends.DataConnector.API.V0.Relationships
pattern RFunctionRelationships,
FunctionRelationships (..),
TableRelationships (..),
InterpolatedRelationships (..),
trelSourceTable,
trelRelationships,
frelRelationships,
frelSourceFunction,
Relationship (..),
rTargetTable,
rTarget,
rRelationshipType,
rColumnMapping,
RelationshipName (..),
@ -36,10 +37,12 @@ import Data.Text (Text)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.Column qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Function qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.InterpolatedQuery qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Table qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Target qualified as API.V0
import Prelude
data Relationships = RTable TableRelationships | RFunction FunctionRelationships
data Relationships = RTable TableRelationships | RFunction FunctionRelationships | RInterpolated InterpolatedRelationships
deriving stock (Eq, Ord, Show, Generic, Data)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Relationships
@ -58,35 +61,38 @@ instance HasCodec Relationships where
enc = \case
RTable rt -> ("table", mapToEncoder rt objectCodec)
RFunction rf -> ("function", mapToEncoder rf objectCodec)
RInterpolated ri -> ("interpolated", mapToEncoder ri objectCodec)
dec =
HashMap.fromList
[ ("table", ("TableRelationships", mapToDecoder RTable objectCodec)),
("function", ("FunctionRelationships", mapToDecoder RFunction objectCodec))
("function", ("FunctionRelationships", mapToDecoder RFunction objectCodec)),
("interpolated", ("InterpolatedRelationships", mapToDecoder RInterpolated objectCodec))
]
data InterpolatedRelationships = InterpolatedRelationships
{ _irSource :: API.V0.InterpolatedQueryId,
_irRelationships :: HashMap.HashMap RelationshipName Relationship
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasObjectCodec InterpolatedRelationships where
objectCodec =
InterpolatedRelationships
<$> requiredField "source_interpolated_query" "The source interpolated query involved in the relationship" .= _irSource
<*> requiredField "relationships" "A map of relationships from the interpolated table to targets. The key of the map is the relationship name" .= _irRelationships
-- NOTE: Prefix is `trel` due to TableRequest conflicting with `tr` prefix.
data TableRelationships = TableRelationships
{ _trelSourceTable :: API.V0.TableName,
_trelRelationships :: HashMap.HashMap RelationshipName Relationship
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec TableRelationships
instance HasObjectCodec TableRelationships where
objectCodec =
TableRelationships
<$> requiredField "source_table" "The name of the source table in the relationship" .= _trelSourceTable
<*> requiredField "relationships" "A map of relationships from the source table to target tables. The key of the map is the relationship name" .= _trelRelationships
-- Note: This instance is defined because MutationRequest uses TableRelationships directly without wrapping it in RTable.
instance HasCodec TableRelationships where
codec = object "TableRelationships" $ typeTag *> objectFields
where
typeTag = requiredFieldWith' "type" (literalTextCodec "table") .= const "table"
objectFields =
TableRelationships
<$> requiredField "source_table" "The name of the source table in the relationship" .= _trelSourceTable
<*> requiredField "relationships" "A map of relationships from the source table to target tables. The key of the map is the relationship name" .= _trelRelationships
<*> requiredField "relationships" "A map of relationships from the source table to targets. The key of the map is the relationship name" .= _trelRelationships
data FunctionRelationships = FunctionRelationships
{ _frelSourceFunction :: API.V0.FunctionName,
@ -98,11 +104,11 @@ instance HasObjectCodec FunctionRelationships where
objectCodec =
FunctionRelationships
<$> requiredField "source_function" "The name of the source function in the relationship" .= _frelSourceFunction
<*> requiredField "relationships" "A map of relationships from the source table to target tables. The key of the map is the relationship name" .= _frelRelationships
<*> requiredField "relationships" "A map of relationships from the source function to targets. The key of the map is the relationship name" .= _frelRelationships
-- Top-level separation of tables and functions should be adopted here too.
data Relationship = Relationship
{ _rTargetTable :: API.V0.TableName,
{ _rTarget :: API.V0.Target,
_rRelationshipType :: RelationshipType,
_rColumnMapping :: HashMap.HashMap SourceColumnName TargetColumnName
}
@ -113,7 +119,7 @@ instance HasCodec Relationship where
codec =
object "Relationship" $
Relationship
<$> requiredField "target_table" "The name of the target table in the relationship" .= _rTargetTable
<$> requiredField "target" "The name of the target table in the relationship" .= _rTarget
<*> requiredField "relationship_type" "The type of the relationship" .= _rRelationshipType
<*> requiredField "column_mapping" "A mapping between columns on the source table to columns on the target table" .= _rColumnMapping
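
Under the new codec, a relationship points at a full `Target` rather than a bare table name. An array relationship from `Artist` to `Album` might serialize roughly as below; the array encoding of the table name is assumed from `TableName`'s codec elsewhere.

```json
{
  "target": { "type": "table", "name": ["Album"] },
  "relationship_type": "array",
  "column_mapping": { "ArtistId": "ArtistId" }
}
```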


@ -13,6 +13,7 @@ import Autodocodec
import Autodocodec.OpenAPI ()
import Control.DeepSeq (NFData)
import Data.Aeson (FromJSON, FromJSONKey (..), ToJSON, ToJSONKey (..), Value)
import Data.Data (Data)
import Data.Hashable (Hashable)
import Data.OpenApi (ToSchema)
import Data.Text (Text)
@ -22,7 +23,7 @@ import Prelude
--------------------------------------------------------------------------------
newtype ScalarType = ScalarType {getScalarType :: Text}
deriving stock (Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show, Data)
deriving anyclass (Hashable, NFData)
deriving newtype (FromJSONKey, ToJSONKey)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ScalarType
@ -36,7 +37,7 @@ data ScalarValue = ScalarValue
{ _svValue :: Value,
_svValueType :: ScalarType
}
deriving stock (Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show, Data)
deriving anyclass (Hashable, NFData)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ScalarValue


@ -4,14 +4,27 @@
{-# LANGUAGE TemplateHaskell #-}
module Hasura.Backends.DataConnector.API.V0.Target
( TargetName (..),
( Target (..),
TargetTable (..),
TargetFunction (..),
TargetInterpolatedQuery (..),
TargetName (..),
pattern TTargetTable,
pattern TTargetFunction,
_TTable,
_TFunction,
_TInterpolated,
ttName,
tfName,
tfArguments,
tiQueryId,
)
where
import Autodocodec.Extended
import Autodocodec.OpenAPI ()
import Control.DeepSeq (NFData)
import Control.Lens.TH (makePrisms)
import Control.Lens.TH (makeLenses, makePrisms)
import Data.Aeson (FromJSON, ToJSON)
import Data.Data (Data)
import Data.HashMap.Strict qualified as HashMap
@ -19,12 +32,73 @@ import Data.Hashable (Hashable)
import Data.OpenApi (ToSchema)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.Function qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.InterpolatedQuery qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Table qualified as API.V0
import Prelude
data Target
= TTable TargetTable
| TFunction TargetFunction
| TInterpolated TargetInterpolatedQuery
deriving stock (Eq, Ord, Show, Generic, Data)
pattern TTargetTable :: API.V0.TableName -> Target
pattern TTargetTable n = TTable (TargetTable n)
pattern TTargetFunction :: API.V0.FunctionName -> [API.V0.FunctionArgument] -> Target
pattern TTargetFunction n a = TFunction (TargetFunction n a)
instance HasCodec Target where
codec = object "Target" $ discriminatedUnionCodec "type" enc dec
where
tableKey = "table"
functionKey = "function"
interpolatedKey = "interpolated"
enc = \case
(TTable t) -> (tableKey, mapToEncoder t objectCodec)
(TFunction f) -> (functionKey, mapToEncoder f objectCodec)
(TInterpolated i) -> (interpolatedKey, mapToEncoder i objectCodec)
dec =
HashMap.fromList
[ (tableKey, ("TTable", mapToDecoder TTable objectCodec)),
(functionKey, ("TFunction", mapToDecoder TFunction objectCodec)),
(interpolatedKey, ("TInterpolated", mapToDecoder TInterpolated objectCodec))
]
newtype TargetTable = TargetTable
{ _ttName :: API.V0.TableName
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasObjectCodec TargetTable where
objectCodec = TargetTable <$> requiredField "name" "The name of the table to query" .= _ttName
data TargetFunction = TargetFunction
{ _tfName :: API.V0.FunctionName,
_tfArguments :: [API.V0.FunctionArgument]
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasObjectCodec TargetFunction where
objectCodec =
TargetFunction
<$> requiredField "name" "The name of the function to invoke" .= _tfName
<*> requiredField "arguments" "The arguments of the function" .= _tfArguments
data TargetInterpolatedQuery = TargetInterpolatedQuery
{ _tiQueryId :: API.V0.InterpolatedQueryId
}
deriving stock (Eq, Ord, Show, Generic, Data)
instance HasObjectCodec TargetInterpolatedQuery where
objectCodec =
TargetInterpolatedQuery
<$> requiredField "id" "The id for the query interpolation template" .= _tiQueryId
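
The three `Target` variants therefore serialize under a `type` discriminator, one sketch per variant below. The array encodings of table and function names are assumed from their codecs elsewhere, and the names themselves are illustrative.

```json
{ "type": "table", "name": ["Artist"] }
{ "type": "function", "name": ["fib"], "arguments": [] }
{ "type": "interpolated", "id": "native_query_1" }
```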
data TargetName
= TNTable API.V0.TableName
| TNFunction API.V0.FunctionName
| TNInterpolatedQuery API.V0.InterpolatedQueryId
deriving stock (Eq, Ord, Show, Generic, Data)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec TargetName
@ -34,15 +108,23 @@ instance HasCodec TargetName where
where
tableKey = "table"
functionKey = "function"
interpolatedQueryKey = "interpolated"
tnTableObjectCodec = requiredField "table" "The name of the table to query"
tnFunctionObjectCodec = requiredField "function" "The name of the function"
tnInterpolatedQueryObjectCodec = requiredField "interpolated" "The id of the interpolated query"
enc = \case
(TNTable t) -> (tableKey, mapToEncoder t tnTableObjectCodec)
(TNFunction f) -> (functionKey, mapToEncoder f tnFunctionObjectCodec)
(TNInterpolatedQuery q) -> (interpolatedQueryKey, mapToEncoder q tnInterpolatedQueryObjectCodec)
dec =
HashMap.fromList
[ (tableKey, ("TNTable", mapToDecoder TNTable tnTableObjectCodec)),
(functionKey, ("TNFunction", mapToDecoder TNFunction tnFunctionObjectCodec))
(functionKey, ("TNFunction", mapToDecoder TNFunction tnFunctionObjectCodec)),
(interpolatedQueryKey, ("TNInterpolatedQuery", mapToDecoder TNInterpolatedQuery tnInterpolatedQueryObjectCodec))
]
$(makePrisms ''Target)
$(makeLenses ''TargetTable)
$(makeLenses ''TargetFunction)
$(makeLenses ''TargetInterpolatedQuery)
$(makePrisms ''TargetName)
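The `TargetName` codec above uses a discriminator map keyed by `"table"`, `"function"`, and `"interpolated"`, which suggests a tagged-union JSON encoding where a `"type"` field selects which payload key is present. As a minimal sketch (not the real autodocodec instance — the exact wire shape and the list-of-components table name are assumptions inferred from the codec's `requiredField` keys), the three variants would serialize roughly like this:

```haskell
module Main where

import Data.List (intercalate)

-- Simplified stand-in for API.V0's TargetName; table/function names are
-- modeled as path components, the interpolated-query id as a plain string.
data TargetName
  = TNTable [String]
  | TNFunction [String]
  | TNInterpolatedQuery String

-- Hand-rolled rendering of the assumed discriminated-union encoding:
-- {"type": <tag>, <tag>: <payload>}.
encodeTargetName :: TargetName -> String
encodeTargetName tn = case tn of
  TNTable t -> obj "table" (arr t)
  TNFunction f -> obj "function" (arr f)
  TNInterpolatedQuery q -> obj "interpolated" (str q)
  where
    obj key payload = "{\"type\":" ++ str key ++ ",\"" ++ key ++ "\":" ++ payload ++ "}"
    str s = "\"" ++ s ++ "\""
    arr xs = "[" ++ intercalate "," (map str xs) ++ "]"

main :: IO ()
main = do
  putStrLn (encodeTargetName (TNTable ["Artist"]))
  putStrLn (encodeTargetName (TNInterpolatedQuery "native_query_1"))
```

The point of the discriminator is that decoding can dispatch on `"type"` alone, which is what the `dec` `HashMap` in the real codec does.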

View File

@ -159,7 +159,7 @@ artistsTableRelationships =
in API.TableRelationships
artistsTableName
( HashMap.fromList
[ (albumsRelationshipName, API.Relationship albumsTableName API.ArrayRelationship joinFieldMapping)
[ (albumsRelationshipName, API.Relationship (API.TTargetTable albumsTableName) API.ArrayRelationship joinFieldMapping)
]
)
@ -180,8 +180,8 @@ albumsTableRelationships =
in API.TableRelationships
albumsTableName
( HashMap.fromList
[ (artistRelationshipName, API.Relationship artistsTableName API.ObjectRelationship artistsJoinFieldMapping),
(tracksRelationshipName, API.Relationship tracksTableName API.ArrayRelationship tracksJoinFieldMapping)
[ (artistRelationshipName, API.Relationship (API.TTargetTable artistsTableName) API.ObjectRelationship artistsJoinFieldMapping),
(tracksRelationshipName, API.Relationship (API.TTargetTable tracksTableName) API.ArrayRelationship tracksJoinFieldMapping)
]
)
@ -208,8 +208,8 @@ customersTableRelationships =
in API.TableRelationships
customersTableName
( HashMap.fromList
[ (supportRepRelationshipName, API.Relationship employeesTableName API.ObjectRelationship supportRepJoinFieldMapping),
(invoicesRelationshipName, API.Relationship invoicesTableName API.ArrayRelationship invoicesJoinFieldMapping)
[ (supportRepRelationshipName, API.Relationship (API.TTargetTable employeesTableName) API.ObjectRelationship supportRepJoinFieldMapping),
(invoicesRelationshipName, API.Relationship (API.TTargetTable invoicesTableName) API.ArrayRelationship invoicesJoinFieldMapping)
]
)
@ -236,8 +236,8 @@ employeesTableRelationships =
in API.TableRelationships
employeesTableName
( HashMap.fromList
[ (supportRepForCustomersRelationshipName, API.Relationship customersTableName API.ArrayRelationship supportRepJoinFieldMapping),
(reportsToEmployeeRelationshipName, API.Relationship employeesTableName API.ObjectRelationship reportsToEmployeeJoinFieldMapping)
[ (supportRepForCustomersRelationshipName, API.Relationship (API.TTargetTable customersTableName) API.ArrayRelationship supportRepJoinFieldMapping),
(reportsToEmployeeRelationshipName, API.Relationship (API.TTargetTable employeesTableName) API.ObjectRelationship reportsToEmployeeJoinFieldMapping)
]
)
@ -264,8 +264,8 @@ invoicesTableRelationships =
in API.TableRelationships
invoicesTableName
( HashMap.fromList
[ (invoiceLinesRelationshipName, API.Relationship invoiceLinesTableName API.ArrayRelationship invoiceLinesJoinFieldMapping),
(customerRelationshipName, API.Relationship customersTableName API.ObjectRelationship customersJoinFieldMapping)
[ (invoiceLinesRelationshipName, API.Relationship (API.TTargetTable invoiceLinesTableName) API.ArrayRelationship invoiceLinesJoinFieldMapping),
(customerRelationshipName, API.Relationship (API.TTargetTable customersTableName) API.ObjectRelationship customersJoinFieldMapping)
]
)
@ -285,8 +285,8 @@ invoiceLinesTableRelationships =
in API.TableRelationships
invoiceLinesTableName
( HashMap.fromList
[ (invoiceRelationshipName, API.Relationship invoicesTableName API.ObjectRelationship invoiceJoinFieldMapping),
(trackRelationshipName, API.Relationship tracksTableName API.ObjectRelationship tracksJoinFieldMapping)
[ (invoiceRelationshipName, API.Relationship (API.TTargetTable invoicesTableName) API.ObjectRelationship invoiceJoinFieldMapping),
(trackRelationshipName, API.Relationship (API.TTargetTable tracksTableName) API.ObjectRelationship tracksJoinFieldMapping)
]
)
@ -322,11 +322,11 @@ tracksTableRelationships =
in API.TableRelationships
tracksTableName
( HashMap.fromList
[ (invoiceLinesRelationshipName, API.Relationship invoiceLinesTableName API.ArrayRelationship invoiceLinesJoinFieldMapping),
(mediaTypeRelationshipName, API.Relationship mediaTypesTableName API.ObjectRelationship mediaTypeJoinFieldMapping),
(albumRelationshipName, API.Relationship albumsTableName API.ObjectRelationship albumJoinFieldMapping),
(genreRelationshipName, API.Relationship genresTableName API.ObjectRelationship genreJoinFieldMapping),
(playlistTracksRelationshipName, API.Relationship playlistTracksTableName API.ArrayRelationship playlistTracksJoinFieldMapping)
[ (invoiceLinesRelationshipName, API.Relationship (API.TTargetTable invoiceLinesTableName) API.ArrayRelationship invoiceLinesJoinFieldMapping),
(mediaTypeRelationshipName, API.Relationship (API.TTargetTable mediaTypesTableName) API.ObjectRelationship mediaTypeJoinFieldMapping),
(albumRelationshipName, API.Relationship (API.TTargetTable albumsTableName) API.ObjectRelationship albumJoinFieldMapping),
(genreRelationshipName, API.Relationship (API.TTargetTable genresTableName) API.ObjectRelationship genreJoinFieldMapping),
(playlistTracksRelationshipName, API.Relationship (API.TTargetTable playlistTracksTableName) API.ArrayRelationship playlistTracksJoinFieldMapping)
]
)
@ -363,7 +363,7 @@ genresTableRelationships =
in API.TableRelationships
genresTableName
( HashMap.fromList
[ (tracksRelationshipName, API.Relationship tracksTableName API.ArrayRelationship joinFieldMapping)
[ (tracksRelationshipName, API.Relationship (API.TTargetTable tracksTableName) API.ArrayRelationship joinFieldMapping)
]
)
@ -543,7 +543,7 @@ mkTestData schemaResponse testConfig =
prefixTableRelationships :: API.TableRelationships -> API.TableRelationships
prefixTableRelationships =
API.trelSourceTable %~ formatTableName testConfig
>>> API.trelRelationships . traverse . API.rTargetTable %~ formatTableName testConfig
>>> API.trelRelationships . traverse . API.rTarget . API._TTable . API.ttName %~ (formatTableName testConfig)
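The reworked traversal composes through the `API._TTable` prism before reaching the name, so the table-name formatter fires only when a relationship's target really is a table. A lens-free sketch of that behavior on a single target (constructor and field names here are simplified stand-ins, not the real API types):

```haskell
module Main where

-- Simplified stand-in for the Target sum type introduced by this change.
data Target
  = TTable String
  | TFunction String
  | TInterpolatedQuery String
  deriving (Eq, Show)

-- Plays the role of (rTarget . _TTable . ttName %~ f): rewrite the name
-- inside a table target, and act as the identity on every other variant.
overTableName :: (String -> String) -> Target -> Target
overTableName f (TTable name) = TTable (f name)
overTableName _ other = other

main :: IO ()
main = do
  print (overTableName ("prefix_" ++) (TTable "Album"))
  print (overTableName ("prefix_" ++) (TInterpolatedQuery "native_query_1"))
```

This is why the prefixing logic needs no change for function or interpolated-query targets: the prism makes the rewrite a no-op for them.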
-- | Test data from the TestingEdgeCases dataset template
data EdgeCasesTestData = EdgeCasesTestData

View File

@ -27,6 +27,6 @@ spec TestData {..} = describe "Error Protocol" do
(CustomBinaryComparisonOperator "FOOBAR")
(_tdCurrentComparisonColumn "ArtistId" artistIdScalarType)
(Data.scalarValueComparison (Number 1) $ artistIdScalarType)
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in QueryRequest (TTargetTable _tdArtistsTableName) mempty mempty mempty query Nothing
artistIdScalarType = _tdFindColumnScalarType _tdArtistsTableName "ArtistId"

View File

@ -28,4 +28,4 @@ spec TestData {..} _ = do
artistsQueryRequest =
let fields = Data.mkFieldsMap [("ArtistId", _tdColumnField _tdArtistsTableName "ArtistId"), ("Name", _tdColumnField _tdArtistsTableName "Name")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing

View File

@ -12,7 +12,6 @@ import Data.List.NonEmpty qualified as NonEmpty
import Data.Maybe (fromMaybe, maybeToList)
import Data.Set qualified as Set
import Hasura.Backends.DataConnector.API
import Hasura.Backends.DataConnector.API.V0.Relationships as API
import Test.AgentAPI (mutationGuarded, queryGuarded)
import Test.AgentDatasets (chinookTemplate, usesDataset)
import Test.Data (EdgeCasesTestData (..), TestData (..))
@ -119,11 +118,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
let deleteOperation =
mkDeleteOperation _tdInvoiceLinesTableName
& dmoWhere ?~ whereExp
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships]
let tableRelationships = Set.singleton $ RTable $ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [DeleteOperation deleteOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -139,7 +138,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
pure $ track ^? Data.field "Composer" . Data._ColumnFieldString /= Just "Eric Clapton"
)
receivedInvoiceLines <- Data.sortResponseRowsBy "InvoiceLineId" <$> queryGuarded (invoiceLinesQueryRequest & qrRelationships .~ Set.map API.RTable tableRelationships)
receivedInvoiceLines <- Data.sortResponseRowsBy "InvoiceLineId" <$> queryGuarded (invoiceLinesQueryRequest & qrRelationships .~ tableRelationships)
Data.responseRows receivedInvoiceLines `rowsShouldBe` expectedRemainingRows
for_ (_cMutations >>= _mcReturningCapabilities) $ \_returningCapabilities -> describe "returning" $ do
@ -182,11 +181,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
)
)
]
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships]
let tableRelationships = Set.singleton $ RTable $ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [DeleteOperation deleteOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -246,13 +245,13 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
]
let tableRelationships =
Set.fromList
[ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships,
Data.onlyKeepRelationships [_tdPlaylistTracksRelationshipName] _tdTracksTableRelationships
[ RTable $ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships,
RTable $ Data.onlyKeepRelationships [_tdPlaylistTracksRelationshipName] _tdTracksTableRelationships
]
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [DeleteOperation deleteOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -307,13 +306,13 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
]
let tableRelationships =
Set.fromList
[ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships,
Data.onlyKeepRelationships [_tdPlaylistTracksRelationshipName] _tdTracksTableRelationships
[ RTable $ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships,
RTable $ Data.onlyKeepRelationships [_tdPlaylistTracksRelationshipName] _tdTracksTableRelationships
]
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [DeleteOperation deleteOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -371,13 +370,13 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
]
let tableRelationships =
Set.fromList
[ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships,
Data.onlyKeepRelationships [_tdInvoiceLinesRelationshipName] _tdInvoicesTableRelationships
[ RTable $ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships,
RTable $ Data.onlyKeepRelationships [_tdInvoiceLinesRelationshipName] _tdInvoicesTableRelationships
]
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [DeleteOperation deleteOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -453,7 +452,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Delete Mutati
invoiceLinesQueryRequest :: QueryRequest
invoiceLinesQueryRequest =
let query = Data.emptyQuery & qFields ?~ invoiceLinesFields & qOrderBy ?~ OrderBy mempty (_tdOrderByColumn [] "InvoiceId" Ascending :| [])
in TableQueryRequest _tdInvoiceLinesTableName mempty mempty query Nothing
in TableQueryRequest _tdInvoiceLinesTableName mempty mempty mempty query Nothing
invoiceIdScalarType = _tdFindColumnScalarType _tdInvoiceLinesTableName "InvoiceId"
invoiceLineIdScalarType = _tdFindColumnScalarType _tdInvoiceLinesTableName "InvoiceLineId"

View File

@ -181,8 +181,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships .~ Set.fromList [Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]
& mrInsertSchema .~ Set.singleton albumsInsertSchema
& mrRelationships .~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships)
response <- mutationGuarded mutationRequest
@ -210,8 +210,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships .~ Set.fromList [Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]
& mrInsertSchema .~ Set.singleton albumsInsertSchema
& mrRelationships .~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships)
response <- mutationExpectError mutationRequest
_crType response `shouldBe` MutationPermissionCheckFailure
@ -236,8 +236,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships .~ Set.fromList [Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]
& mrInsertSchema .~ Set.singleton albumsInsertSchema
& mrRelationships .~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships)
response <- mutationGuarded mutationRequest
@ -264,8 +264,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships .~ Set.fromList [Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]
& mrInsertSchema .~ Set.singleton albumsInsertSchema
& mrRelationships .~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships)
response <- mutationExpectError mutationRequest
_crType response `shouldBe` MutationPermissionCheckFailure
@ -346,8 +346,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships .~ Set.fromList [Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]
& mrInsertSchema .~ Set.singleton albumsInsertSchema
& mrRelationships .~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships)
response <- mutationGuarded mutationRequest
@ -410,8 +410,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [employeesInsertSchema]
& mrTableRelationships
.~ Set.fromList [Data.onlyKeepRelationships [_tdReportsToEmployeeRelationshipName, _tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships]
& mrRelationships
.~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdReportsToEmployeeRelationshipName, _tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships)
response <- mutationGuarded mutationRequest
@ -475,8 +475,8 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [employeesInsertSchema]
& mrTableRelationships
.~ Set.fromList [Data.onlyKeepRelationships [_tdReportsToEmployeeRelationshipName, _tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships]
& mrRelationships
.~ Set.singleton (RTable $ Data.onlyKeepRelationships [_tdReportsToEmployeeRelationshipName, _tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships)
response <- mutationGuarded mutationRequest
@ -543,10 +543,10 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyMutationRequest
& mrOperations .~ [InsertOperation insertOperation]
& mrInsertSchema .~ Set.fromList [albumsInsertSchema]
& mrTableRelationships
& mrRelationships
.~ Set.fromList
[ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships,
Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships
[ RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships,
RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships
]
response <- mutationGuarded mutationRequest
@ -717,7 +717,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyQuery
& qFields ?~ mkFieldsFromExpectedData _tdArtistsTableName (expectedInsertedArtists artistsStartingId)
& qWhere ?~ ApplyBinaryArrayComparisonOperator In (_tdCurrentComparisonColumn "ArtistId" artistIdScalarType) (J.Number . fromInteger <$> artistIds) artistIdScalarType
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
albumsQueryRequest :: [Integer] -> QueryRequest
albumsQueryRequest albumIds =
@ -725,7 +725,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyQuery
& qFields ?~ mkFieldsFromExpectedData _tdAlbumsTableName (expectedInsertedAcdcAlbums albumsStartingId)
& qWhere ?~ ApplyBinaryArrayComparisonOperator In (_tdCurrentComparisonColumn "AlbumId" albumIdScalarType) (J.Number . fromInteger <$> albumIds) albumIdScalarType
in TableQueryRequest _tdAlbumsTableName mempty mempty query Nothing
in TableQueryRequest _tdAlbumsTableName mempty mempty mempty query Nothing
employeesQueryRequest :: [Integer] -> QueryRequest
employeesQueryRequest employeeIds =
@ -733,7 +733,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Insert Mutati
Data.emptyQuery
& qFields ?~ mkFieldsFromExpectedData _tdEmployeesTableName (expectedInsertedEmployees employeesStartingId)
& qWhere ?~ ApplyBinaryArrayComparisonOperator In (_tdCurrentComparisonColumn "EmployeeId" albumIdScalarType) (J.Number . fromInteger <$> employeeIds) employeeIdScalarType
in TableQueryRequest _tdEmployeesTableName mempty mempty query Nothing
in TableQueryRequest _tdEmployeesTableName mempty mempty mempty query Nothing
artistsInsertSchema :: TableInsertSchema
artistsInsertSchema = _tdMkDefaultTableInsertSchema _tdArtistsTableName

View File

@ -13,7 +13,6 @@ import Data.List.NonEmpty (NonEmpty (..))
import Data.Maybe (catMaybes, mapMaybe, maybeToList)
import Data.Set qualified as Set
import Hasura.Backends.DataConnector.API
import Hasura.Backends.DataConnector.API.V0.Relationships as API
import Language.GraphQL.Draft.Syntax.QQ qualified as G
import Test.AgentAPI (mutationExpectError, mutationGuarded, queryGuarded)
import Test.AgentDatasets (chinookTemplate, usesDataset)
@ -100,11 +99,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
mkUpdateOperation _tdArtistsTableName
& umoUpdates .~ Set.fromList [SetColumn $ _tdRowColumnOperatorValue _tdArtistsTableName "Name" (J.String "Metalika")]
& umoWhere ?~ whereExp
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships]
let tableRelationships = Set.singleton $ RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -125,7 +124,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
)
& fmap (\artist -> artist & Data.field "Name" . Data._ColumnFieldString .~ "Metalika")
receivedArtists <- Data.sortResponseRowsBy "ArtistId" <$> queryGuarded (artistsQueryRequest whereExp & qrRelationships .~ Set.map API.RTable tableRelationships)
receivedArtists <- Data.sortResponseRowsBy "ArtistId" <$> queryGuarded (artistsQueryRequest whereExp & qrRelationships .~ tableRelationships)
Data.responseRows receivedArtists `rowsShouldBe` expectedModifiedRows
usesDataset chinookTemplate $ it "can set the value of a column differently using multiple operations" $ do
@ -365,11 +364,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
& umoUpdates .~ Set.fromList [SetColumn $ _tdRowColumnOperatorValue _tdInvoiceLinesTableName "InvoiceId" (J.Number 298)]
& umoWhere ?~ whereExp
& umoPostUpdateCheck ?~ postUpdateExp
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships]
let tableRelationships = Set.singleton (RTable $ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships)
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -386,7 +385,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
let invoiceLineIds = expectedModifiedRows & mapMaybe (^? Data.field "InvoiceLineId" . Data._ColumnFieldNumber) & fmap J.Number
let alternateWhereExp = ApplyBinaryArrayComparisonOperator In (_tdCurrentComparisonColumn "InvoiceLineId" invoiceLineIdScalarType) invoiceLineIds invoiceLineIdScalarType
receivedInvoiceLines <- Data.sortResponseRowsBy "InvoiceLineId" <$> queryGuarded (invoiceLinesQueryRequest alternateWhereExp & qrRelationships .~ Set.map API.RTable tableRelationships)
receivedInvoiceLines <- Data.sortResponseRowsBy "InvoiceLineId" <$> queryGuarded (invoiceLinesQueryRequest alternateWhereExp & qrRelationships .~ tableRelationships)
Data.responseRows receivedInvoiceLines `rowsShouldBe` expectedModifiedRows
usesDataset chinookTemplate $ it "fails to update when post update check against related table fails" $ do
@ -399,11 +398,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
& umoUpdates .~ Set.fromList [SetColumn $ _tdRowColumnOperatorValue _tdInvoiceLinesTableName "InvoiceId" (J.Number 298)]
& umoWhere ?~ whereExp
& umoPostUpdateCheck ?~ postUpdateExp
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships]
let tableRelationships = Set.singleton (RTable $ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships)
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationExpectError mutationRequest
@ -474,11 +473,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
)
)
]
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships]
let tableRelationships = Set.singleton (RTable $ Data.onlyKeepRelationships [_tdTrackRelationshipName] _tdInvoiceLinesTableRelationships)
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -525,11 +524,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
)
)
]
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships]
let tableRelationships = Set.singleton (RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships)
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -568,11 +567,11 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
)
)
]
let tableRelationships = Set.fromList [Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships]
let tableRelationships = Set.singleton (RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships)
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -635,13 +634,13 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
]
let tableRelationships =
Set.fromList
[ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships,
Data.onlyKeepRelationships [_tdInvoiceLinesRelationshipName] _tdInvoicesTableRelationships
[ RTable $ Data.onlyKeepRelationships [_tdInvoiceRelationshipName] _tdInvoiceLinesTableRelationships,
RTable $ Data.onlyKeepRelationships [_tdInvoiceLinesRelationshipName] _tdInvoicesTableRelationships
]
let mutationRequest =
Data.emptyMutationRequest
& mrOperations .~ [UpdateOperation updateOperation]
& mrTableRelationships .~ tableRelationships
& mrRelationships .~ tableRelationships
response <- mutationGuarded mutationRequest
@ -719,7 +718,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
artistsQueryRequest :: Expression -> QueryRequest
artistsQueryRequest whereExp =
let query = Data.emptyQuery & qFields ?~ artistsFields & qWhere ?~ whereExp
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
invoiceLinesFields :: HashMap FieldName Field
invoiceLinesFields =
@ -734,7 +733,7 @@ spec TestData {..} edgeCasesTestData Capabilities {..} = describe "Update Mutati
invoiceLinesQueryRequest :: Expression -> QueryRequest
invoiceLinesQueryRequest whereExp =
let query = Data.emptyQuery & qFields ?~ invoiceLinesFields & qWhere ?~ whereExp
in TableQueryRequest _tdInvoiceLinesTableName mempty mempty query Nothing
in TableQueryRequest _tdInvoiceLinesTableName mempty mempty mempty query Nothing
incOperator :: UpdateColumnOperatorName
incOperator = UpdateColumnOperatorName $ [G.name|inc|]

View File

@ -597,7 +597,7 @@ spec TestData {..} relationshipCapabilities = describe "Aggregate Queries" $ do
artistOrderBy = OrderBy mempty $ _tdOrderByColumn [] "ArtistId" Ascending :| []
artistQuery = Data.emptyQuery & qFields ?~ artistFields & qOrderBy ?~ artistOrderBy
artistsTableRelationships = Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships
in QRTable $ TableRequest _tdArtistsTableName (Set.fromList [API.RTable artistsTableRelationships]) mempty artistQuery Nothing
in TableQueryRequest _tdArtistsTableName (Set.fromList [API.RTable artistsTableRelationships]) mempty mempty artistQuery Nothing
-- This query is basically what would be generated by this complex HGE GraphQL query
-- @
@ -674,22 +674,23 @@ spec TestData {..} relationshipCapabilities = describe "Aggregate Queries" $ do
]
)
mempty
mempty
artistQuery
Nothing
artistsQueryRequest :: HashMap FieldName Aggregate -> QueryRequest
artistsQueryRequest aggregates =
let query = Data.emptyQuery & qAggregates ?~ aggregates
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
invoicesQueryRequest :: HashMap FieldName Aggregate -> QueryRequest
invoicesQueryRequest aggregates =
let query = Data.emptyQuery & qAggregates ?~ aggregates
in TableQueryRequest _tdInvoicesTableName mempty mempty query Nothing
in TableQueryRequest _tdInvoicesTableName mempty mempty mempty query Nothing
albumsQueryRequest :: QueryRequest
albumsQueryRequest =
TableQueryRequest _tdAlbumsTableName mempty mempty Data.emptyQuery Nothing
TableQueryRequest _tdAlbumsTableName mempty mempty mempty Data.emptyQuery Nothing
aggregate :: (NonEmpty a -> Value) -> [a] -> Value
aggregate aggFn values =

View File

@ -74,10 +74,10 @@ spec TestData {..} = describe "Basic Queries" $ do
artistsQueryRequest =
let fields = Data.mkFieldsMap [("ArtistId", _tdColumnField _tdArtistsTableName "ArtistId"), ("Name", _tdColumnField _tdArtistsTableName "Name")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
albumsQueryRequest :: QueryRequest
albumsQueryRequest =
let fields = Data.mkFieldsMap [("AlbumId", _tdColumnField _tdAlbumsTableName "AlbumId"), ("ArtistId", _tdColumnField _tdAlbumsTableName "ArtistId"), ("Title", _tdColumnField _tdAlbumsTableName "Title")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdAlbumsTableName mempty mempty query Nothing
in TableQueryRequest _tdAlbumsTableName mempty mempty mempty query Nothing

View File

@ -42,7 +42,7 @@ spec TestData {..} (ScalarTypesCapabilities scalarTypesCapabilities) = describe
let queryRequest =
let fields = Data.mkFieldsMap [(unColumnName columnName, _tdColumnField tableName (unColumnName columnName))]
query' = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest tableName mempty mempty query' Nothing
in TableQueryRequest tableName mempty mempty mempty query' Nothing
where' =
ApplyBinaryComparisonOperator
(CustomBinaryComparisonOperator (unName operatorName))

View File

@ -313,13 +313,13 @@ spec TestData {..} comparisonCapabilities = describe "Filtering in Queries" $ do
artistsQueryRequest =
let fields = Data.mkFieldsMap [("ArtistId", _tdColumnField _tdArtistsTableName "ArtistId"), ("Name", _tdColumnField _tdArtistsTableName "Name")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
albumsQueryRequest :: QueryRequest
albumsQueryRequest =
let fields = Data.mkFieldsMap [("AlbumId", _tdColumnField _tdAlbumsTableName "AlbumId"), ("ArtistId", _tdColumnField _tdAlbumsTableName "ArtistId"), ("Title", _tdColumnField _tdAlbumsTableName "Title")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdAlbumsTableName mempty mempty query Nothing
in TableQueryRequest _tdAlbumsTableName mempty mempty mempty query Nothing
albumIdScalarType = _tdFindColumnScalarType _tdAlbumsTableName "AlbumId"
albumTitleScalarType = _tdFindColumnScalarType _tdAlbumsTableName "Title"

View File

@ -261,13 +261,13 @@ spec TestData {..} Capabilities {..} = describe "Foreach Queries" $ do
albumsQueryRequest =
let fields = Data.mkFieldsMap [("AlbumId", _tdColumnField _tdAlbumsTableName "AlbumId"), ("ArtistId", _tdColumnField _tdAlbumsTableName "ArtistId"), ("Title", _tdColumnField _tdAlbumsTableName "Title")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdAlbumsTableName mempty mempty query Nothing
in TableQueryRequest _tdAlbumsTableName mempty mempty mempty query Nothing
playlistTracksQueryRequest :: QueryRequest
playlistTracksQueryRequest =
let fields = Data.mkFieldsMap [("PlaylistId", _tdColumnField _tdPlaylistTracksTableName "PlaylistId"), ("TrackId", _tdColumnField _tdPlaylistTracksTableName "TrackId")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdPlaylistTracksTableName mempty mempty query Nothing
in TableQueryRequest _tdPlaylistTracksTableName mempty mempty mempty query Nothing
mkForeachIds :: TableName -> [(Text, J.Value)] -> HashMap ColumnName ScalarValue
mkForeachIds tableName =


@@ -409,13 +409,13 @@ spec TestData {..} Capabilities {..} = describe "Order By in Queries" $ do
albumsQueryRequest :: QueryRequest
albumsQueryRequest =
TableQueryRequest _tdAlbumsTableName mempty mempty albumsQuery Nothing
TableQueryRequest _tdAlbumsTableName mempty mempty mempty albumsQuery Nothing
artistsQueryRequest :: QueryRequest
artistsQueryRequest =
let fields = Data.mkFieldsMap [("ArtistId", _tdColumnField _tdArtistsTableName "ArtistId"), ("Name", _tdColumnField _tdArtistsTableName "Name")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdArtistsTableName mempty mempty query Nothing
in TableQueryRequest _tdArtistsTableName mempty mempty mempty query Nothing
tracksQuery :: Query
tracksQuery =
@@ -424,13 +424,13 @@ spec TestData {..} Capabilities {..} = describe "Order By in Queries" $ do
tracksQueryRequest :: QueryRequest
tracksQueryRequest =
TableQueryRequest _tdTracksTableName mempty mempty tracksQuery Nothing
TableQueryRequest _tdTracksTableName mempty mempty mempty tracksQuery Nothing
invoicesQueryRequest :: QueryRequest
invoicesQueryRequest =
let fields = Data.mkFieldsMap [("InvoiceId", _tdColumnField _tdInvoicesTableName "InvoiceId"), ("BillingState", _tdColumnField _tdInvoicesTableName "BillingState")]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdInvoicesTableName mempty mempty query Nothing
in TableQueryRequest _tdInvoicesTableName mempty mempty mempty query Nothing
orderBySingleColumnAggregateMax :: ColumnName -> ScalarType -> OrderByTarget
orderBySingleColumnAggregateMax columnName resultType = OrderBySingleColumnAggregate $ SingleColumnAggregate (SingleColumnAggregateFunction [G.name|max|]) columnName Nothing resultType


@@ -368,7 +368,7 @@ spec TestData {..} Capabilities {..} = describe "Data Redaction in Queries" $ do
("BillingCity", _tdColumnField _tdInvoicesTableName "BillingCity")
]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdInvoicesTableName mempty mempty query Nothing
in TableQueryRequest _tdInvoicesTableName mempty mempty mempty query Nothing
customerQueryRequest :: QueryRequest
customerQueryRequest =
@@ -379,7 +379,7 @@ spec TestData {..} Capabilities {..} = describe "Data Redaction in Queries" $ do
("LastName", _tdColumnField _tdCustomersTableName "LastName")
]
query = Data.emptyQuery & qFields ?~ fields
in TableQueryRequest _tdCustomersTableName mempty mempty query Nothing
in TableQueryRequest _tdCustomersTableName mempty mempty mempty query Nothing
singleColumnAggregateMin :: ColumnName -> ScalarType -> RedactionExpressionName -> Aggregate
singleColumnAggregateMin columnName resultType redactionExpName = SingleColumn $ SingleColumnAggregate (SingleColumnAggregateFunction [G.name|min|]) columnName (Just redactionExpName) resultType


@@ -247,7 +247,7 @@ spec TestData {..} subqueryComparisonCapabilities = describe "Relationship Queri
("Artist", RelField $ RelationshipField _tdArtistRelationshipName artistsSubquery)
]
query = albumsQuery & qFields ?~ fields
in TableQueryRequest _tdAlbumsTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]) mempty query Nothing
in TableQueryRequest _tdAlbumsTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdArtistRelationshipName] _tdAlbumsTableRelationships]) mempty mempty query Nothing
artistsWithAlbumsQuery :: (Query -> Query) -> QueryRequest
artistsWithAlbumsQuery modifySubquery =
@@ -261,7 +261,7 @@ spec TestData {..} subqueryComparisonCapabilities = describe "Relationship Queri
("Albums", RelField $ RelationshipField _tdAlbumsRelationshipName albumsSubquery)
]
query = artistsQuery & qFields ?~ fields
in TableQueryRequest _tdArtistsTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships]) mempty query Nothing
in TableQueryRequest _tdArtistsTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdAlbumsRelationshipName] _tdArtistsTableRelationships]) mempty mempty query Nothing
employeesWithCustomersQuery :: (Query -> Query) -> QueryRequest
employeesWithCustomersQuery modifySubquery =
@@ -273,7 +273,7 @@ spec TestData {..} subqueryComparisonCapabilities = describe "Relationship Queri
[ ("SupportRepForCustomers", RelField $ RelationshipField _tdSupportRepForCustomersRelationshipName customersSubquery)
]
query = employeesQuery & qFields ?~ fields
in TableQueryRequest _tdEmployeesTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships]) mempty query Nothing
in TableQueryRequest _tdEmployeesTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdSupportRepForCustomersRelationshipName] _tdEmployeesTableRelationships]) mempty mempty query Nothing
customersWithSupportRepQuery :: (Query -> Query) -> QueryRequest
customersWithSupportRepQuery modifySubquery =
@@ -284,7 +284,7 @@ spec TestData {..} subqueryComparisonCapabilities = describe "Relationship Queri
[ ("SupportRep", RelField $ RelationshipField _tdSupportRepRelationshipName supportRepSubquery)
]
query = customersQuery & qFields ?~ fields
in TableQueryRequest _tdCustomersTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdSupportRepRelationshipName] _tdCustomersTableRelationships]) mempty query Nothing
in TableQueryRequest _tdCustomersTableName (Set.fromList [API.RTable $ Data.onlyKeepRelationships [_tdSupportRepRelationshipName] _tdCustomersTableRelationships]) mempty mempty query Nothing
artistsQuery :: Query
artistsQuery =


@@ -60,7 +60,7 @@ spec testConfig API.Capabilities {} = describe "supports functions" $ preloadAge
k = "take" :: Text.Text
v = API.ScalarValue (Number (fromIntegral fibonacciRows)) (API.ScalarType "number")
args = [NamedArgument k (API.ScalarArgumentValue v)]
in FunctionQueryRequest _ftdFibonacciFunctionName args mempty mempty query'
in FunctionQueryRequest _ftdFibonacciFunctionName args mempty mempty mempty query' mempty
testData@FunctionsTestData {..} = mkFunctionsTestData preloadedSchema testConfig
query = fibonacciRequest testData
@@ -90,13 +90,13 @@ spec testConfig API.Capabilities {} = describe "supports functions" $ preloadAge
in API.FunctionRelationships
_ftdSearchArticlesFunctionName
( HashMap.fromList
[ (_ftdAuthorRelationshipName, API.Relationship _ftdAuthorsTableName API.ObjectRelationship authorsJoinFieldMapping)
[ (_ftdAuthorRelationshipName, API.Relationship (API.TTargetTable _ftdAuthorsTableName) API.ObjectRelationship authorsJoinFieldMapping)
]
)
relationships = Set.singleton (API.RFunction authorRelationship)
v = API.ScalarValue (String "x") (API.ScalarType "string")
args = [NamedArgument "query" (API.ScalarArgumentValue v)]
in FunctionQueryRequest _ftdSearchArticlesFunctionName args relationships mempty query'
in FunctionQueryRequest _ftdSearchArticlesFunctionName args relationships mempty mempty query' mempty
testData = mkFunctionsTestData preloadedSchema testConfig
query = articlesRequest testData
@@ -138,13 +138,13 @@ spec testConfig API.Capabilities {} = describe "supports functions" $ preloadAge
in API.FunctionRelationships
_ftdSearchArticlesFunctionName
( HashMap.fromList
[ (_ftdAuthorRelationshipName, API.Relationship _ftdAuthorsTableName API.ObjectRelationship authorsJoinFieldMapping)
[ (_ftdAuthorRelationshipName, API.Relationship (API.TTargetTable _ftdAuthorsTableName) API.ObjectRelationship authorsJoinFieldMapping)
]
)
relationships = Set.singleton (API.RFunction authorRelationship)
v = API.ScalarValue (String "y") (API.ScalarType "string")
args = [NamedArgument "query" (API.ScalarArgumentValue v)]
in FunctionQueryRequest _ftdSearchArticlesFunctionName args relationships mempty query'
in FunctionQueryRequest _ftdSearchArticlesFunctionName args relationships mempty mempty query' mempty
testData = mkFunctionsTestData preloadedSchema testConfig
query = articlesRequest testData


@@ -43,7 +43,7 @@ import Harness.Schema
Table (..),
)
import Harness.Schema qualified as Schema
import Harness.Test.BackendType (BackendTypeConfig)
import Harness.Test.BackendType (BackendTypeConfig, postgresishGraphQLType)
import Harness.Test.BackendType qualified as BackendType
import Harness.Test.Fixture (SetupAction (..))
import Harness.TestEnvironment (TestEnvironment (..))
@@ -65,7 +65,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Nothing,
backendSchemaKeyword = "dataset",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = postgresishGraphQLType
}
--------------------------------------------------------------------------------


@@ -47,7 +47,7 @@ import Harness.Logging
import Harness.Quoter.Yaml (interpolateYaml)
import Harness.Schema (BackendScalarType (..), BackendScalarValue (..), ScalarValue (..), SchemaName (..))
import Harness.Schema qualified as Schema
import Harness.Test.BackendType (BackendTypeConfig)
import Harness.Test.BackendType (BackendTypeConfig, postgresishGraphQLType)
import Harness.Test.BackendType qualified as BackendType
import Harness.Test.SetupAction (SetupAction (..))
import Harness.TestEnvironment (TestEnvironment (..))
@@ -67,7 +67,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Nothing,
backendSchemaKeyword = "schema",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = postgresishGraphQLType
}
--------------------------------------------------------------------------------


@@ -46,7 +46,7 @@ import Harness.Logging
import Harness.Quoter.Yaml (interpolateYaml)
import Harness.Schema (BackendScalarType (..), BackendScalarValue (..), ScalarValue (..), SchemaName (..))
import Harness.Schema qualified as Schema
import Harness.Test.BackendType (BackendTypeConfig)
import Harness.Test.BackendType (BackendTypeConfig, postgresishGraphQLType)
import Harness.Test.BackendType qualified as BackendType
import Harness.Test.SetupAction (SetupAction (..))
import Harness.TestEnvironment (TestEnvironment (..))
@@ -66,7 +66,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Nothing,
backendSchemaKeyword = "schema",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = postgresishGraphQLType
}
--------------------------------------------------------------------------------


@@ -79,7 +79,8 @@ backendTypeConfig =
backendReleaseNameString = Nothing,
backendServerUrl = Just "http://localhost:65005",
backendSchemaKeyword = "schema",
backendScalarType = const ""
backendScalarType = const "",
backendGraphQLType = const ""
}
--------------------------------------------------------------------------------


@@ -102,7 +102,8 @@ backendTypeConfig =
backendReleaseNameString = Nothing,
backendServerUrl = Just "http://localhost:65007",
backendSchemaKeyword = "schema",
backendScalarType = const ""
backendScalarType = const "",
backendGraphQLType = const ""
}
--------------------------------------------------------------------------------


@@ -59,7 +59,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Just "http://localhost:65006",
backendSchemaKeyword = "schema",
backendScalarType = const ""
backendScalarType = const "",
backendGraphQLType = const ""
}
--------------------------------------------------------------------------------


@@ -75,6 +75,7 @@ capabilities =
API._cSubscriptions = Nothing,
API._cScalarTypes = scalarTypesCapabilities,
API._cRelationships = Just API.RelationshipCapabilities {},
API._cInterpolatedQueries = Just API.InterpolatedQueryCapabilities {},
API._cComparisons =
Just
API.ComparisonCapabilities


@@ -65,7 +65,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Just "http://localhost:65007",
backendSchemaKeyword = "schema",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = scalarType
}
--------------------------------------------------------------------------------
@@ -229,7 +230,7 @@ mkColumn :: Schema.Column -> Text
mkColumn Schema.Column {columnName, columnType, columnNullable, columnDefault} =
Text.unwords
[ wrapIdentifier columnName,
scalarType columnType,
toColumnType columnType,
bool "NOT NULL" "DEFAULT NULL" columnNullable,
maybe "" ("DEFAULT " <>) columnDefault
]
@@ -263,6 +264,16 @@ mkReference _schemaName Schema.Reference {referenceLocalColumn, referenceTargetT
scalarType :: Schema.ScalarType -> Text
scalarType = \case
Schema.TInt -> "number"
Schema.TDouble -> "number"
Schema.TStr -> "string"
Schema.TUTCTime -> "DateTime"
Schema.TBool -> "bool"
Schema.TGeography -> "string"
Schema.TCustomType txt -> Schema.getBackendScalarType txt Schema.bstSqlite
toColumnType :: Schema.ScalarType -> Text
toColumnType = \case
Schema.TInt -> "INTEGER"
Schema.TDouble -> "REAL"
Schema.TStr -> "TEXT"


@@ -65,7 +65,7 @@ import Harness.Schema
)
import Harness.Schema qualified as Schema
import Harness.Services.Database.Postgres qualified as Postgres
import Harness.Test.BackendType (BackendTypeConfig)
import Harness.Test.BackendType (BackendTypeConfig, postgresishGraphQLType)
import Harness.Test.BackendType qualified as BackendType
import Harness.Test.SetupAction (SetupAction (..))
import Harness.TestEnvironment (GlobalTestEnvironment (..), TestEnvironment (..), TestingMode (..))
@@ -86,7 +86,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Nothing,
backendSchemaKeyword = "schema",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = postgresishGraphQLType
}
--------------------------------------------------------------------------------


@@ -38,7 +38,7 @@ import Harness.Logging
import Harness.Quoter.Yaml (yaml)
import Harness.Schema (BackendScalarType (..), BackendScalarValue (..), ScalarValue (..))
import Harness.Schema qualified as Schema
import Harness.Test.BackendType (BackendType (SQLServer), BackendTypeConfig (..))
import Harness.Test.BackendType (BackendType (SQLServer), BackendTypeConfig (..), postgresishGraphQLType)
import Harness.Test.SetupAction (SetupAction (..))
import Harness.TestEnvironment (TestEnvironment (..))
import Hasura.Prelude
@@ -57,7 +57,8 @@ backendTypeMetadata =
backendReleaseNameString = Nothing,
backendServerUrl = Nothing,
backendSchemaKeyword = "schema",
backendScalarType = scalarType
backendScalarType = scalarType,
backendGraphQLType = postgresishGraphQLType
}
--------------------------------------------------------------------------------


@@ -6,6 +6,7 @@ module Harness.Quoter.Graphql (graphql, ToGraphqlString (..)) where
import Data.Bifunctor qualified as Bifunctor
import Data.String (fromString)
import Data.Text (unpack)
import Hasura.Prelude
import Language.Haskell.Meta (parseExp)
import Language.Haskell.TH
@@ -24,6 +25,9 @@ instance ToGraphqlString Bool where
instance ToGraphqlString String where
showGql = id
instance ToGraphqlString Text where
showGql = unpack
-- | Transforms GraphQL to its JSON representation. Does string interpolation.
-- For every expression enclosed as #{expression}, this Quasi Quoter will
-- evaluate 'expression' in the context it was written. For example:

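The quasi-quoter comment above is cut off by the hunk boundary right after "For example:". As a purely hypothetical, string-level illustration of the `#{expression}` substitution it describes — the real quasi-quoter uses Template Haskell and splices evaluated Haskell expressions; `interpolate` and its association-list environment are invented for this sketch:

```haskell
import Data.Maybe (fromMaybe)

-- Hypothetical sketch only: substitutes #{name} markers from a lookup table,
-- unlike the real quasi-quoter, which evaluates arbitrary expressions in scope.
interpolate :: [(String, String)] -> String -> String
interpolate env = go
  where
    go ('#' : '{' : rest) =
      let (name, rest') = break (== '}') rest
       in fromMaybe ("#{" <> name <> "}") (lookup name env) <> go (drop 1 rest')
    go (c : cs) = c : go cs
    go [] = []

main :: IO ()
main = putStrLn (interpolate [("tableName", "Album")] "query { #{tableName} { id } }")
```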

@@ -6,6 +6,7 @@ module Harness.Test.BackendType
BackendTypeConfig (..),
isDataConnector,
parseCapabilities,
postgresishGraphQLType,
pattern DataConnectorMock,
pattern DataConnectorReference,
pattern DataConnectorSqlite,
@@ -17,6 +18,7 @@ where
import Data.Aeson (Value)
import Data.Aeson.Key (Key)
import Data.Aeson.Types qualified as Aeson
import GHC.Stack (HasCallStack)
import Harness.Test.ScalarType
import Hasura.Backends.DataConnector.API.V0 qualified as API
import Hasura.Prelude
@@ -41,12 +43,25 @@ data BackendTypeConfig = BackendTypeConfig
backendServerUrl :: Maybe String,
backendSchemaKeyword :: Key,
-- | How should we render scalar types for this backend?
backendScalarType :: ScalarType -> Text
backendScalarType :: ScalarType -> Text,
-- | How to map scalar types to GraphQL types
backendGraphQLType :: ScalarType -> Text
}
parseCapabilities :: BackendTypeConfig -> Maybe API.Capabilities
parseCapabilities = backendCapabilities >=> Aeson.parseMaybe Aeson.parseJSON
-- | Convenience Helper for "Postgresish" GraphQL Types.
postgresishGraphQLType :: (HasCallStack) => ScalarType -> Text
postgresishGraphQLType = \case
TInt -> "Int"
TStr -> "String"
TDouble -> "float8"
TUTCTime -> "timestamp"
TBool -> "Boolean"
TGeography -> "geography"
TCustomType txt -> getBackendScalarType txt bstPostgres -- Maybe?
-- | A supported backend type.
-- NOTE: Different data-connector agents are represented by separate constructors
-- If we want to be able to test these generatively we may want to have a


@@ -18,6 +18,7 @@ module Harness.TestEnvironment
focusFixtureLeft,
focusFixtureRight,
scalarTypeToText,
graphQLTypeToText,
serverUrl,
stopServer,
testLogTrace,
@@ -78,6 +79,11 @@ scalarTypeToText TestEnvironment {fixtureName} = case fixtureName of
Backend BackendTypeConfig {backendScalarType} -> backendScalarType
_ -> error "scalarTypeToText only currently defined for the `Backend` `FixtureName`"
graphQLTypeToText :: TestEnvironment -> ScalarType -> Text
graphQLTypeToText TestEnvironment {fixtureName} = case fixtureName of
Backend BackendTypeConfig {backendGraphQLType} -> backendGraphQLType
_ -> error "graphQLTypeToText only currently defined for the `Backend` `FixtureName`"
-- | The role we're going to use for testing. Either we're an admin, in which
-- case all permissions are implied, /or/ we're a regular user, in which case
-- the given permissions will be applied.


@@ -331,6 +331,7 @@ buildCacheStaticConfig AppEnv {..} =
-- Native Queries are always enabled for Postgres in the OSS edition.
_cscAreNativeQueriesEnabled = \case
Postgres Vanilla -> True
DataConnector -> True
_ -> False,
_cscAreStoredProceduresEnabled = False
}


@@ -34,6 +34,6 @@ instance BackendMetadata 'BigQuery where
listAllTables = BigQuery.listAllTables
listAllTrackables _ = throw400 UnexpectedPayload "listAllTrackables not supported by BigQuery!"
getTableInfo _ _ = throw400 UnexpectedPayload "get_table_info not yet supported in BigQuery!"
validateNativeQuery _ _ _ _ nq = do
validateNativeQuery _ _ _ _ _ nq = do
validateArgumentDeclaration nq
pure (trimQueryEnd (_nqmCode nq)) -- for now, all queries are valid


@@ -6,7 +6,7 @@ module Hasura.Backends.DataConnector.Adapter.API () where
import Hasura.Prelude
import Hasura.RQL.Types.BackendType (BackendType (DataConnector))
import Hasura.Server.API.Backend (BackendAPI (..), functionCommands, functionPermissionsCommands, logicalModelsCommands, relationshipCommands, remoteRelationshipCommands, sourceCommands, tableCommands, tablePermissionsCommands, trackableCommands)
import Hasura.Server.API.Backend (BackendAPI (..), functionCommands, functionPermissionsCommands, logicalModelsCommands, nativeQueriesCommands, relationshipCommands, remoteRelationshipCommands, sourceCommands, tableCommands, tablePermissionsCommands, trackableCommands)
--------------------------------------------------------------------------------
@@ -21,5 +21,6 @@ instance BackendAPI 'DataConnector where
functionPermissionsCommands @'DataConnector,
relationshipCommands @'DataConnector,
remoteRelationshipCommands @'DataConnector,
logicalModelsCommands @'DataConnector
logicalModelsCommands @'DataConnector,
nativeQueriesCommands @'DataConnector
]


@@ -56,6 +56,9 @@ import Hasura.Logging (Hasura, Logger)
import Hasura.LogicalModel.Cache
import Hasura.LogicalModel.Metadata (LogicalModelMetadata (..))
import Hasura.LogicalModel.Types
import Hasura.NativeQuery.InterpolatedQuery (trimQueryEnd)
import Hasura.NativeQuery.Metadata (NativeQueryMetadata (..))
import Hasura.NativeQuery.Validation
import Hasura.Prelude
import Hasura.RQL.DDL.Relationship (defaultBuildArrayRelationshipInfo, defaultBuildObjectRelationshipInfo)
import Hasura.RQL.IR.BoolExp (ComparisonNullability (..), OpExpG (..), PartialSQLExp (..), RootOrCurrent (..), RootOrCurrentColumn (..))
@@ -115,6 +118,14 @@ instance BackendMetadata 'DataConnector where
getTableInfo = getTableInfo'
supportsBeingRemoteRelationshipTarget = supportsBeingRemoteRelationshipTarget'
validateNativeQuery _ _ _ sc _ nq = do
unless (isJust (API._cInterpolatedQueries (DC._scCapabilities sc))) do
let nqName = _nqmRootFieldName nq
throw400 NotSupported $ "validateNativeQuery: " <> toTxt nqName <> " - Native Queries not implemented for this Data Connector backend."
-- Adapted from server/src-lib/Hasura/Backends/BigQuery/Instances/Metadata.hs
validateArgumentDeclaration nq
pure (trimQueryEnd (_nqmCode nq)) -- for now, all queries are valid
arityJsonAggSelect :: API.FunctionArity -> JsonAggSelect
arityJsonAggSelect = \case
API.FunctionArityOne -> JASSingleObject


@@ -23,7 +23,7 @@ import Hasura.Backends.DataConnector.Adapter.Types.Mutations qualified as DC
import Hasura.Base.Error
import Hasura.Function.Cache qualified as RQL
import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Schema.Backend (BackendSchema (..), BackendTableSelectSchema (..), BackendUpdateOperatorsSchema (..), ComparisonExp, MonadBuildSchema)
import Hasura.GraphQL.Schema.Backend (BackendLogicalModelSelectSchema (..), BackendNativeQuerySelectSchema (..), BackendSchema (..), BackendTableSelectSchema (..), BackendUpdateOperatorsSchema (..), ComparisonExp, MonadBuildSchema)
import Hasura.GraphQL.Schema.BoolExp qualified as GS.BE
import Hasura.GraphQL.Schema.Build qualified as GS.B
import Hasura.GraphQL.Schema.Common qualified as GS.C
@@ -33,7 +33,10 @@ import Hasura.GraphQL.Schema.Table qualified as GS.T
import Hasura.GraphQL.Schema.Typename qualified as GS.N
import Hasura.GraphQL.Schema.Update qualified as GS.U
import Hasura.GraphQL.Schema.Update.Batch qualified as GS.U.B
import Hasura.LogicalModel.Cache qualified as Cache
import Hasura.LogicalModel.Schema qualified as Schema
import Hasura.Name qualified as Name
import Hasura.NativeQuery.Schema qualified as NativeQueries
import Hasura.Prelude
import Hasura.RQL.IR.BoolExp qualified as IR
import Hasura.RQL.IR.Delete qualified as IR
@@ -86,6 +89,8 @@ instance BackendSchema 'DataConnector where
countTypeInput = countTypeInput'
buildNativeQueryRootFields = NativeQueries.defaultBuildNativeQueryRootFields
-- aggregateOrderByCountType is only used when generating Relay schemas, and Data Connector backends do not yet support Relay
-- If/when we want to support this we would need to add something to Capabilities to tell HGE what (integer-like) scalar
-- type should be used to represent the result of a count aggregate in relay order-by queries.
@@ -95,6 +100,14 @@ instance BackendSchema 'DataConnector where
computedField =
error "computedField: not implemented for the Data Connector backend."
instance BackendLogicalModelSelectSchema 'DataConnector where
logicalModelArguments = dataConnectorLogicalModelArgs
logicalModelSelectionSet = Schema.defaultLogicalModelSelectionSet
instance BackendNativeQuerySelectSchema 'DataConnector where
selectNativeQuery = NativeQueries.defaultSelectNativeQuery
selectNativeQueryObject = NativeQueries.defaultSelectNativeQueryObject
instance BackendTableSelectSchema 'DataConnector where
tableArguments = tableArgs'
selectTable = GS.S.defaultSelectTable
@@ -106,6 +119,21 @@ instance BackendUpdateOperatorsSchema 'DataConnector where
parseUpdateOperators = parseUpdateOperators'
-- | Based on `defaultLogicalModelArgs` from `Hasura.LogicalModel.Schema`
-- Omits distinct as this is not implemented for Data Connectors.
dataConnectorLogicalModelArgs ::
forall b r m n.
( MonadBuildSchema b r m n,
GS.BE.AggregationPredicatesSchema b
) =>
Cache.LogicalModelInfo b ->
GS.C.SchemaT r m (P.InputFieldsParser n (GS.C.SelectArgs b))
dataConnectorLogicalModelArgs logicalModel = do
whereParser <- Schema.logicalModelWhereArg logicalModel
orderByParser <- Schema.logicalModelOrderByArg logicalModel
distinctParser <- pure $ pure Nothing
GS.S.defaultArgsParser whereParser orderByParser distinctParser
--------------------------------------------------------------------------------
buildFunctionQueryFields' ::


@@ -13,7 +13,7 @@ module Hasura.Backends.DataConnector.Plan.Common
writeOutput,
replaceOutput,
TableRelationships (..),
recordTableRelationship,
recordRelationship,
recordTableRelationshipFromRelInfo,
FieldPrefix,
noPrefix,
@@ -28,6 +28,8 @@ module Hasura.Backends.DataConnector.Plan.Common
mkRelationshipName,
mapFieldNameHashMap,
encodeAssocListAsObject,
targetToTargetName,
recordNativeQuery,
ColumnStack,
emptyColumnStack,
pushColumn,
@@ -40,8 +42,10 @@ import Data.Aeson.Encoding qualified as JE
import Data.Aeson.Types qualified as J
import Data.Bifunctor (Bifunctor (bimap))
import Data.ByteString qualified as BS
import Data.Char (intToDigit)
import Data.Has
import Data.HashMap.Strict qualified as HashMap
import Data.Hashable (hash)
import Data.List.NonEmpty qualified as NonEmpty
import Data.Set (Set)
import Data.Set qualified as Set
@@ -53,6 +57,8 @@ import Hasura.Backends.DataConnector.API qualified as API
import Hasura.Backends.DataConnector.Adapter.Backend
import Hasura.Backends.DataConnector.Adapter.Types
import Hasura.Base.Error
import Hasura.NativeQuery.IR
import Hasura.NativeQuery.InterpolatedQuery as IQ
import Hasura.Prelude
import Hasura.RQL.IR.BoolExp
import Hasura.RQL.IR.Value
@@ -63,6 +69,7 @@ import Hasura.RQL.Types.Common
import Hasura.RQL.Types.Relationships.Local (RelInfo (..), RelTarget (..))
import Hasura.SQL.Types (CollectableType (..))
import Hasura.Session
import Numeric (showIntAtBase)
import Witch qualified
--------------------------------------------------------------------------------
@@ -109,9 +116,9 @@ instance Semigroup TableRelationships where
instance Monoid TableRelationships where
mempty = TableRelationships mempty
-- | Records a table relationship encountered during request translation into the output of the current
-- | Records a relationship encountered during request translation into the output of the current
-- 'CPS.WriterT'
recordTableRelationship ::
recordRelationship ::
( MonadState state m,
Has TableRelationships state
) =>
@@ -119,7 +126,7 @@ recordTableRelationship ::
API.RelationshipName ->
API.Relationship ->
m ()
recordTableRelationship sourceName relationshipName relationship =
recordRelationship sourceName relationshipName relationship =
writeOutput $ TableRelationships $ HashMap.singleton sourceName (HashMap.singleton relationshipName relationship)
recordTableRelationshipFromRelInfo ::
@@ -139,16 +146,50 @@ recordTableRelationshipFromRelInfo sourceTableName RelInfo {..} = do
RelTargetTable targetTableName -> do
let relationship =
API.Relationship
{ _rTargetTable = Witch.from targetTableName,
{ _rTarget = API.TTable (API.TargetTable (Witch.from targetTableName)),
_rRelationshipType = relationshipType,
_rColumnMapping = HashMap.fromList $ bimap Witch.from Witch.from <$> HashMap.toList riMapping
}
recordTableRelationship
recordRelationship
sourceTableName
relationshipName
relationship
pure (relationshipName, relationship)
-- | Records a Native Query encountered during request translation into the output of the current
-- 'CPS.WriterT'
recordNativeQuery ::
( Has API.InterpolatedQueries state,
Has API.ScalarTypesCapabilities r,
MonadReader r m,
MonadState state m,
MonadError QErr m,
Has SessionVariables r
) =>
NativeQuery 'DataConnector (UnpreparedValue 'DataConnector) ->
m API.InterpolatedQueryId
recordNativeQuery nq = do
nqL <- traverse prepareLiteral nq
let iq@(API.InterpolatedQuery i _) = apiInterpolateQuery nqL
interpolatedQueries = API.InterpolatedQueries $ HashMap.singleton i iq
writeOutput interpolatedQueries
pure i
apiInterpolateQuery :: NativeQuery 'DataConnector Literal -> API.InterpolatedQuery
apiInterpolateQuery NativeQuery {nqRootFieldName, nqInterpolatedQuery} = API.InterpolatedQuery i interpolatedItems
where
-- NOTE: An alternative hash mechanism could be explored if any issues are found with this implementation.
i = API.InterpolatedQueryId (toTxt nqRootFieldName <> "_" <> qh)
qh = T.pack $ showIntAtBase 16 intToDigit (abs $ hash nqInterpolatedQuery) ""
interpolatedItems = interpolateItem <$> IQ.getInterpolatedQuery nqInterpolatedQuery
interpolateItem :: IQ.InterpolatedItem Literal -> API.InterpolatedItem
interpolateItem = \case
IQ.IIText t -> API.InterpolatedText t
IQ.IIVariable l -> case l of
ValueLiteral st v -> API.InterpolatedScalar (API.ScalarValue v (Witch.from st)) -- TODO: Witchify?
_ -> error "array literals not yet implemented"
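The `qh` binding above renders a hash as a hexadecimal suffix for the interpolated-query id via `showIntAtBase`. A minimal standalone sketch of just that rendering step — `toHex` is a name invented here; the real code feeds it `abs (hash nqInterpolatedQuery)` from the `hashable` package:

```haskell
import Data.Char (intToDigit)
import Numeric (showIntAtBase)

-- Renders a non-negative Int in base 16, as the query-id suffix above does.
toHex :: Int -> String
toHex n = showIntAtBase 16 intToDigit (abs n) ""

main :: IO ()
main = putStrLn (toHex 4096)
```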
--------------------------------------------------------------------------------
-- | Collects encountered redaction expressions on a per table/function basis.
@@ -162,6 +203,7 @@ recordRedactionExpression ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -308,6 +350,7 @@ toColumnSelector (ColumnStack stack) columnName =
translateBoolExpToExpression ::
( MonadState state m,
Has TableRelationships state,
Has API.InterpolatedQueries state,
Has RedactionExpressionState state,
MonadError QErr m,
MonadReader r m,
@@ -323,6 +366,7 @@ translateBoolExpToExpression ::
translateBoolExp ::
( MonadState state m,
Has TableRelationships state,
Has API.InterpolatedQueries state,
Has RedactionExpressionState state,
MonadError QErr m,
MonadReader r m,
@@ -349,7 +393,7 @@ translateBoolExp sourceName columnStack = \case
BoolField (AVRelationship relationshipInfo (RelationshipFilters {rfTargetTablePermissions, rfFilter})) -> do
(relationshipName, API.Relationship {..}) <- recordTableRelationshipFromRelInfo sourceName relationshipInfo
-- TODO: How does this function keep track of the root table?
API.Exists (API.RelatedTable relationshipName) <$> translateBoolExp (API.TNTable _rTargetTable) emptyColumnStack (BoolAnd [rfTargetTablePermissions, rfFilter])
API.Exists (API.RelatedTable relationshipName) <$> translateBoolExp (targetToTargetName _rTarget) emptyColumnStack (BoolAnd [rfTargetTablePermissions, rfFilter])
BoolExists GExists {..} ->
let tableName = Witch.from _geTable
in API.Exists (API.UnrelatedTable tableName) <$> translateBoolExp (API.TNTable tableName) emptyColumnStack _geWhere
@@ -361,6 +405,13 @@ translateBoolExp sourceName columnStack = \case
[singleExp] -> singleExp
zeroOrManyExps -> mk $ Set.fromList zeroOrManyExps
-- | Helper function to convert targets into Keys
targetToTargetName :: API.Target -> API.TargetName
targetToTargetName = \case
(API.TTable (API.TargetTable tn)) -> API.TNTable tn
(API.TFunction (API.TargetFunction n _)) -> API.TNFunction n
(API.TInterpolated (API.TargetInterpolatedQuery qId)) -> API.TNInterpolatedQuery qId
removeAlwaysTrueExpression :: API.Expression -> Maybe API.Expression
removeAlwaysTrueExpression = \case
API.And exprs | exprs == mempty -> Nothing


@@ -94,44 +94,59 @@ translateMutationDB ::
m API.MutationRequest
translateMutationDB = \case
MDBInsert insert -> do
(insertOperation, (tableRelationships, redactionExpressionState, tableInsertSchemas)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) $ translateInsert insert
(insertOperation, (tableRelationships, redactionExpressionState, API.InterpolatedQueries interpolatedQueries, tableInsertSchemas)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty, mempty) $ translateInsert insert
unless (null interpolatedQueries) do
-- TODO: See if we can allow this in mutations
throw400 NotSupported "translateMutationDB: Native Queries not supported in insert operations."
let apiTableInsertSchema =
unTableInsertSchemas tableInsertSchemas
& HashMap.toList
& fmap (\(tableName, TableInsertSchema {..}) -> API.TableInsertSchema tableName _tisPrimaryKey _tisFields)
let apiTableRelationships = Set.fromList $ uncurry API.TableRelationships <$> rights (map eitherKey (HashMap.toList (unTableRelationships tableRelationships)))
let apiTableRelationships = Set.fromList $ API.RTable . uncurry API.TableRelationships <$> mapMaybe tableKey (HashMap.toList (unTableRelationships tableRelationships))
pure
$ API.MutationRequest
{ _mrTableRelationships = apiTableRelationships,
{ _mrRelationships = apiTableRelationships,
_mrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_mrInsertSchema = Set.fromList apiTableInsertSchema,
_mrOperations = [API.InsertOperation insertOperation]
}
MDBUpdate update -> do
(updateOperations, (tableRelationships, redactionExpressionState)) <-
flip runStateT (mempty, RedactionExpressionState mempty) $ translateUpdate update
(updateOperations, (tableRelationships, redactionExpressionState, API.InterpolatedQueries interpolatedQueries)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) $ translateUpdate update
unless (null interpolatedQueries) do
-- TODO: See if we can allow this in mutations
throw400 NotSupported "translateMutationDB: Native Queries not supported in update operations."
let apiTableRelationships =
Set.fromList
$ uncurry API.TableRelationships
<$> rights (map eitherKey (HashMap.toList (unTableRelationships tableRelationships)))
$ API.RTable
. uncurry API.TableRelationships
<$> mapMaybe tableKey (HashMap.toList (unTableRelationships tableRelationships))
pure
$ API.MutationRequest
{ _mrTableRelationships = apiTableRelationships,
{ _mrRelationships = apiTableRelationships,
_mrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_mrInsertSchema = mempty,
_mrOperations = API.UpdateOperation <$> updateOperations
}
MDBDelete delete -> do
(deleteOperation, (tableRelationships, redactionExpressionState)) <-
flip runStateT (mempty, RedactionExpressionState mempty) $ translateDelete delete
(deleteOperation, (tableRelationships, redactionExpressionState, API.InterpolatedQueries interpolatedQueries)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) $ translateDelete delete
unless (null interpolatedQueries) do
-- TODO: See if we can allow this in mutations
throw400 NotSupported "translateMutationDB: Native Queries not supported in delete operations."
let apiTableRelationships =
Set.fromList
$ uncurry API.TableRelationships
<$> rights (map eitherKey (HashMap.toList (unTableRelationships tableRelationships)))
$ API.RTable
. uncurry API.TableRelationships
<$> mapMaybe tableKey (HashMap.toList (unTableRelationships tableRelationships))
pure
$ API.MutationRequest
{ _mrTableRelationships = apiTableRelationships,
{ _mrRelationships = apiTableRelationships,
_mrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_mrInsertSchema = mempty,
_mrOperations = [API.DeleteOperation deleteOperation]
@@ -139,15 +154,16 @@ translateMutationDB = \case
MDBFunction _returnsSet _select ->
throw400 NotSupported "translateMutationDB: function mutations not implemented for the Data Connector backend."
eitherKey :: (API.TargetName, c) -> Either (API.FunctionName, c) (API.TableName, c)
eitherKey (API.TNFunction f, x) = Left (f, x)
eitherKey (API.TNTable t, x) = Right (t, x)
tableKey :: (API.TargetName, c) -> Maybe (API.TableName, c)
tableKey (API.TNTable t, x) = Just (t, x)
tableKey _ = Nothing
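The switch from an Either-based split to `mapMaybe tableKey` can be sketched standalone (simplified stand-in types, hypothetical values):

```haskell
import Data.Maybe (mapMaybe)

-- Stand-in for API.TargetName, now with a third constructor
-- for interpolated (native) queries.
data TargetName
  = TNTable String
  | TNFunction String
  | TNInterpolatedQuery String
  deriving (Show, Eq)

-- Mirrors tableKey: keep only entries keyed by a table name.
tableKey :: (TargetName, c) -> Maybe (String, c)
tableKey (TNTable t, x) = Just (t, x)
tableKey _ = Nothing

onlyTables :: [(TargetName, c)] -> [(String, c)]
onlyTables = mapMaybe tableKey
```

With three `TargetName` constructors a two-way `Either` no longer fits, so `Maybe` filters out everything that is not a table.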
translateInsert ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has TableInsertSchemas state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -244,6 +260,7 @@ translateUpdate ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -260,6 +277,7 @@ translateUpdateBatch ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -315,6 +333,7 @@ translateDelete ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -339,6 +358,7 @@ translateMutationOutputToReturningFields ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -357,6 +377,7 @@ translateMutField ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,


@@ -22,12 +22,14 @@ import Data.List.NonEmpty qualified as NE
import Data.Semigroup (Min (..))
import Data.Set qualified as Set
import Data.Text.Extended (toTxt)
import Hasura.Backends.DataConnector.API (Target (TInterpolated))
import Hasura.Backends.DataConnector.API qualified as API
import Hasura.Backends.DataConnector.Adapter.Backend
import Hasura.Backends.DataConnector.Adapter.Types
import Hasura.Backends.DataConnector.Plan.Common
import Hasura.Base.Error
import Hasura.Function.Cache qualified as Function
import Hasura.NativeQuery.IR (NativeQuery (..))
import Hasura.Prelude
import Hasura.RQL.IR.BoolExp
import Hasura.RQL.IR.OrderBy
@@ -119,6 +121,7 @@ translateAnnSelectToQueryRequest ::
Has RedactionExpressionState state,
MonadError QErr m2,
MonadReader r m2,
Has API.InterpolatedQueries state,
Has API.ScalarTypesCapabilities r,
Has SessionVariables r
) =>
@@ -131,39 +134,139 @@ translateAnnSelectToQueryRequest ::
translateAnnSelectToQueryRequest translateFieldsAndAggregates selectG = do
case _asnFrom selectG of
FromIdentifier _ -> throw400 NotSupported "AnnSelectG: FromIdentifier not supported"
FromNativeQuery {} -> throw400 NotSupported "AnnSelectG: FromNativeQuery not supported"
FromStoredProcedure {} -> throw400 NotSupported "AnnSelectG: FromStoredProcedure not supported"
FromTable tableName -> do
(query, (TableRelationships tableRelationships, redactionExpressionState)) <-
flip runStateT (mempty, RedactionExpressionState mempty) $ translateAnnSelect translateFieldsAndAggregates (API.TNTable (Witch.into tableName)) selectG
FromNativeQuery nativeQuery -> do
((query, nqid), (TableRelationships tableRelationships, redactionExpressionState, nativeQueries)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) do
nqid <- recordNativeQuery nativeQuery -- TODO: Duplicate work here? Shouldn't cause issues due to monoid
query' <- (translateAnnSelect translateFieldsAndAggregates (API.TNInterpolatedQuery nqid) selectG)
pure (query', nqid)
let relationships = mkRelationships <$> HashMap.toList tableRelationships
let target = API.TInterpolated $ API.TargetInterpolatedQuery nqid
pure
$ API.QRTable
API.TableRequest
{ _trTable = Witch.into tableName,
_trRelationships = Set.fromList relationships,
_trRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_trQuery = query,
_trForeach = Nothing
}
$ API.QueryRequest
{ _qrTarget = target,
_qrRelationships = Set.fromList relationships,
_qrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_qrInterpolatedQueries = nativeQueries,
_qrQuery = query,
_qrForeach = Nothing
}
FromTable tableName -> do
(query, (TableRelationships tableRelationships, redactionExpressionState, nativeQueries)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) $ translateAnnSelect translateFieldsAndAggregates (API.TNTable (Witch.into tableName)) selectG
let relationships = mkRelationships <$> HashMap.toList tableRelationships
let target = API.TTable (API.TargetTable (Witch.into tableName))
pure
$ API.QueryRequest
{ _qrTarget = target,
_qrRelationships = Set.fromList relationships,
_qrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_qrInterpolatedQueries = nativeQueries,
_qrQuery = query,
_qrForeach = Nothing
}
FromFunction fn@(FunctionName functionName) argsExp _dListM -> do
args <- mkArgs argsExp fn
(query, (redactionExpressionState, TableRelationships tableRelationships)) <-
flip runStateT (RedactionExpressionState mempty, mempty) $ translateAnnSelect translateFieldsAndAggregates (API.TNFunction (Witch.into functionName)) selectG
(query, (redactionExpressionState, TableRelationships tableRelationships, nativeQueries)) <-
flip runStateT (RedactionExpressionState mempty, mempty, mempty) $ translateAnnSelect translateFieldsAndAggregates (API.TNFunction (Witch.into functionName)) selectG
let relationships = mkRelationships <$> HashMap.toList tableRelationships
let target = API.TFunction (API.TargetFunction (Witch.into functionName) args)
pure
$ API.QRFunction
API.FunctionRequest
{ _frFunction = Witch.into functionName,
_frRelationships = Set.fromList relationships,
_frRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_frQuery = query,
_frFunctionArguments = args
}
$ API.QueryRequest
{ _qrTarget = target,
_qrRelationships = Set.fromList relationships,
_qrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_qrInterpolatedQueries = nativeQueries,
_qrQuery = query,
_qrForeach = Nothing
}
fromNativeQueryArray ::
( MonadState state m,
MonadError QErr m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has SessionVariables r,
Has API.InterpolatedQueries state,
MonadReader r m,
Has API.ScalarTypesCapabilities r
) =>
AnnRelationSelectG 'DataConnector (AnnSelectG 'DataConnector fieldType (UnpreparedValue 'DataConnector)) ->
(API.TargetName -> Fields (fieldType (UnpreparedValue 'DataConnector)) -> m FieldsAndAggregates) ->
API.TargetName ->
NativeQuery 'DataConnector (UnpreparedValue 'DataConnector) ->
m API.Field
fromNativeQueryArray arrRel translateFieldsAndAggregates sourceTargetName nativeQuery = do
nqid <- recordNativeQuery nativeQuery
query <- translateAnnSelect translateFieldsAndAggregates (API.TNInterpolatedQuery nqid) (_aarAnnSelect arrRel)
let relationshipName = mkRelationshipName $ _aarRelationshipName arrRel
recordRelationship
sourceTargetName
relationshipName
API.Relationship
{ _rTarget = TInterpolated (API.TargetInterpolatedQuery nqid),
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList $ bimap Witch.from Witch.from <$> HashMap.toList (_aarColumnMapping arrRel)
}
pure
. API.RelField
$ API.RelationshipField
relationshipName
query
-- | fromNativeQueryObject implements the (FromNativeQuery nq) branch of `translateAnnField`.
-- Uses a variant of the `FromTable tableName` branch from within `AFObjectRelation objRel`.
fromNativeQueryObject ::
( MonadState state m,
MonadError QErr m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has SessionVariables r,
Has API.InterpolatedQueries state,
MonadReader r m,
Has API.ScalarTypesCapabilities r
) =>
ObjectRelationSelectG 'DataConnector Void (UnpreparedValue 'DataConnector) ->
API.TargetName ->
NativeQuery 'DataConnector (UnpreparedValue 'DataConnector) ->
m API.Field
fromNativeQueryObject objRel sourceTargetName nativeQuery = do
let relationshipName = mkRelationshipName $ _aarRelationshipName objRel
nqid <- recordNativeQuery nativeQuery
fields <- translateAnnFields noPrefix (API.TNInterpolatedQuery nqid) (_aosFields (_aarAnnSelect objRel)) -- TODO: coerce?
whereClause <- translateBoolExpToExpression (API.TNInterpolatedQuery nqid) (_aosTargetFilter (_aarAnnSelect objRel))
recordRelationship
sourceTargetName
relationshipName
API.Relationship
{ _rTarget = TInterpolated (API.TargetInterpolatedQuery nqid),
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList $ bimap Witch.from Witch.from <$> HashMap.toList (_aarColumnMapping objRel)
}
pure
$ API.RelField
$ API.RelationshipField
relationshipName
( API.Query
{ _qFields = Just $ mapFieldNameHashMap fields,
_qAggregates = mempty,
_qWhere = whereClause,
_qAggregatesLimit = Nothing,
_qLimit = Nothing,
_qOffset = Nothing,
_qOrderBy = Nothing
}
)
mkRelationships :: (API.TargetName, (HashMap API.RelationshipName API.Relationship)) -> API.Relationships
mkRelationships (API.TNFunction functionName, relationships) = API.RFunction (API.FunctionRelationships functionName relationships)
mkRelationships (API.TNTable tableName, relationships) = API.RTable (API.TableRelationships tableName relationships)
mkRelationships (API.TNInterpolatedQuery interpolatedName, relationships) = API.RInterpolated (API.InterpolatedRelationships interpolatedName relationships)
mkArgs ::
forall r m.
@@ -193,6 +296,7 @@ translateAnnSelect ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -231,6 +335,7 @@ translateOrderBy ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -254,6 +359,7 @@ translateOrderByElement ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -280,9 +386,9 @@ translateOrderByElement sourceName orderDirection targetReversePath columnStack
translateOrderByElement sourceName orderDirection targetReversePath (pushColumn columnStack _noiColumn) nestedOrderBy
AOCObjectRelation relationshipInfo filterExp orderByElement -> do
(relationshipName, API.Relationship {..}) <- recordTableRelationshipFromRelInfo sourceName relationshipInfo
(translatedOrderByElement, subOrderByRelations) <- translateOrderByElement (API.TNTable _rTargetTable) orderDirection (relationshipName : targetReversePath) columnStack orderByElement
(translatedOrderByElement, subOrderByRelations) <- translateOrderByElement (targetToTargetName _rTarget) orderDirection (relationshipName : targetReversePath) columnStack orderByElement
targetTableWhereExp <- translateBoolExpToExpression (API.TNTable _rTargetTable) filterExp
targetTableWhereExp <- translateBoolExpToExpression (targetToTargetName _rTarget) filterExp
let orderByRelations = HashMap.fromList [(relationshipName, API.OrderByRelation targetTableWhereExp subOrderByRelations)]
pure (translatedOrderByElement, orderByRelations)
@@ -304,7 +410,7 @@ translateOrderByElement sourceName orderDirection targetReversePath columnStack
_obeOrderDirection = orderDirection
}
targetTableWhereExp <- translateBoolExpToExpression (API.TNTable _rTargetTable) filterExp
targetTableWhereExp <- translateBoolExpToExpression (targetToTargetName _rTarget) filterExp
let orderByRelations = HashMap.fromList [(relationshipName, API.OrderByRelation targetTableWhereExp mempty)]
pure (translatedOrderByElement, orderByRelations)
@@ -331,6 +437,7 @@ translateAnnFieldsWithNoAggregates ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -347,6 +454,7 @@ translateAnnFields ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -364,6 +472,7 @@ translateAnnField ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -372,13 +481,13 @@ translateAnnField ::
API.TargetName ->
AnnFieldG 'DataConnector Void (UnpreparedValue 'DataConnector) ->
m (Maybe API.Field)
translateAnnField sourceTableName = \case
translateAnnField targetName = \case
AFNestedObject nestedObj ->
Just
. API.NestedObjField (Witch.from $ _anosColumn nestedObj)
<$> translateNestedObjectSelect sourceTableName nestedObj
<$> translateNestedObjectSelect targetName nestedObj
AFNestedArray _ (ANASSimple field) ->
fmap mkArrayField <$> translateAnnField sourceTableName field
fmap mkArrayField <$> translateAnnField targetName field
where
mkArrayField nestedField =
API.NestedArrayField (API.ArrayField nestedField Nothing Nothing Nothing Nothing)
@@ -386,21 +495,23 @@ translateAnnField sourceTableName = \case
AFNestedArray _ (ANASAggregate _) ->
pure Nothing -- TODO(dmoverton): support nested array aggregates
AFColumn AnnColumnField {..} -> do
redactionExpName <- recordRedactionExpression sourceTableName _acfRedactionExpression
redactionExpName <- recordRedactionExpression targetName _acfRedactionExpression
pure . Just $ API.ColumnField (Witch.from _acfColumn) (Witch.from $ columnTypeToScalarType _acfType) redactionExpName
AFObjectRelation objRel ->
case _aosTarget (_aarAnnSelect objRel) of
FromFunction {} -> error "translateAnnField: AFObjectRelation: Functions not supported as targets of relationships"
FromNativeQuery nq -> Just <$> fromNativeQueryObject objRel targetName nq -- TODO: Implementation WIP
FromTable tableName -> do
let targetTable = Witch.from tableName
let relationshipName = mkRelationshipName $ _aarRelationshipName objRel
fields <- translateAnnFields noPrefix (API.TNTable targetTable) (_aosFields (_aarAnnSelect objRel))
whereClause <- translateBoolExpToExpression (API.TNTable targetTable) (_aosTargetFilter (_aarAnnSelect objRel))
recordTableRelationship
sourceTableName
recordRelationship
targetName
relationshipName
API.Relationship
{ _rTargetTable = targetTable,
{ _rTarget = API.TTable (API.TargetTable targetTable),
_rRelationshipType = API.ObjectRelationship,
_rColumnMapping = HashMap.fromList $ bimap Witch.from Witch.from <$> HashMap.toList (_aarColumnMapping objRel)
}
@@ -422,9 +533,9 @@ translateAnnField sourceTableName = \case
)
other -> error $ "translateAnnField: " <> show other
AFArrayRelation (ASSimple arrayRelationSelect) -> do
Just <$> translateArrayRelationSelect sourceTableName (translateAnnFieldsWithNoAggregates noPrefix) arrayRelationSelect
Just <$> translateArrayRelationSelect targetName (translateAnnFieldsWithNoAggregates noPrefix) arrayRelationSelect
AFArrayRelation (ASAggregate arrayRelationSelect) ->
Just <$> translateArrayRelationSelect sourceTableName translateTableAggregateFields arrayRelationSelect
Just <$> translateArrayRelationSelect targetName translateTableAggregateFields arrayRelationSelect
AFExpression _literal ->
-- We ignore literal text fields (we don't send them to the data connector agent)
-- and add them back to the response JSON when we reshape what the agent returns
@@ -435,6 +546,7 @@ translateArrayRelationSelect ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -444,21 +556,21 @@ translateArrayRelationSelect ::
(API.TargetName -> Fields (fieldType (UnpreparedValue 'DataConnector)) -> m FieldsAndAggregates) ->
AnnRelationSelectG 'DataConnector (AnnSelectG 'DataConnector fieldType (UnpreparedValue 'DataConnector)) ->
m API.Field
translateArrayRelationSelect sourceName translateFieldsAndAggregates arrRel = do
translateArrayRelationSelect targetName translateFieldsAndAggregates arrRel = do
case _asnFrom (_aarAnnSelect arrRel) of
FromIdentifier _ -> throw400 NotSupported "AnnSelectG: FromIdentifier not supported"
FromNativeQuery {} -> throw400 NotSupported "AnnSelectG: FromNativeQuery not supported"
FromNativeQuery nq -> fromNativeQueryArray arrRel translateFieldsAndAggregates targetName nq
FromStoredProcedure {} -> throw400 NotSupported "AnnSelectG: FromStoredProcedure not supported"
FromFunction {} -> throw400 NotSupported "translateArrayRelationSelect: FromFunction not currently supported"
FromTable targetTable -> do
query <- translateAnnSelect translateFieldsAndAggregates (API.TNTable (Witch.into targetTable)) (_aarAnnSelect arrRel)
let relationshipName = mkRelationshipName $ _aarRelationshipName arrRel
recordTableRelationship
sourceName
recordRelationship
targetName
relationshipName
API.Relationship
{ _rTargetTable = Witch.into targetTable,
{ _rTarget = API.TTable (API.TargetTable (Witch.into targetTable)),
_rRelationshipType = API.ArrayRelationship,
_rColumnMapping = HashMap.fromList $ bimap Witch.from Witch.from <$> HashMap.toList (_aarColumnMapping arrRel)
}
@@ -473,6 +585,7 @@ translateTableAggregateFields ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -488,6 +601,7 @@ translateTableAggregateField ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -517,6 +631,7 @@ translateAggregateField ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,
@@ -571,6 +686,7 @@ translateNestedObjectSelect ::
( MonadState state m,
Has TableRelationships state,
Has RedactionExpressionState state,
Has API.InterpolatedQueries state,
MonadError QErr m,
MonadReader r m,
Has API.ScalarTypesCapabilities r,


@@ -82,19 +82,19 @@ mkRemoteRelationshipPlan _sourceConfig joinIds joinIdsSchema argumentIdFieldName
let tableName = case _aosTarget of
FromTable table -> Witch.from table
other -> error $ "translateAnnObjectSelectToQueryRequest: " <> show other
((fields, whereClause), (TableRelationships tableRelationships, redactionExpressionState)) <-
flip runStateT (mempty, RedactionExpressionState mempty) do
((fields, whereClause), (TableRelationships tableRelationships, redactionExpressionState, interpolatedQueries)) <-
flip runStateT (mempty, RedactionExpressionState mempty, mempty) do
fields <- QueryPlan.translateAnnFields noPrefix (API.TNTable tableName) _aosFields
whereClause <- translateBoolExpToExpression (API.TNTable tableName) _aosTargetFilter
pure (fields, whereClause)
let apiTableRelationships = Set.fromList $ tableRelationshipsToList tableRelationships
pure
$ API.QRTable
$ API.TableRequest
{ _trTable = tableName,
_trRelationships = apiTableRelationships,
_trRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_trQuery =
$ API.QueryRequest
{ _qrTarget = API.TTable (API.TargetTable tableName),
_qrRelationships = apiTableRelationships,
_qrRedactionExpressions = translateRedactionExpressions redactionExpressionState,
_qrInterpolatedQueries = interpolatedQueries,
_qrQuery =
API.Query
{ _qFields = Just $ mapFieldNameHashMap fields,
_qAggregates = Nothing,
@@ -104,15 +104,15 @@ mkRemoteRelationshipPlan _sourceConfig joinIds joinIdsSchema argumentIdFieldName
_qWhere = whereClause,
_qOrderBy = Nothing
},
_trForeach = Just foreachRowFilter
_qrForeach = Just foreachRowFilter
}
tableRelationshipsToList :: HashMap API.TargetName (HashMap API.RelationshipName API.Relationship) -> [API.Relationships]
tableRelationshipsToList m = map (either (API.RFunction . uncurry API.FunctionRelationships) (API.RTable . uncurry API.TableRelationships) . tableRelationshipsKeyToEither) (HashMap.toList m)
tableRelationshipsKeyToEither :: (API.TargetName, c) -> Either (API.FunctionName, c) (API.TableName, c)
tableRelationshipsKeyToEither (API.TNFunction f, x) = Left (f, x)
tableRelationshipsKeyToEither (API.TNTable t, x) = Right (t, x)
tableRelationshipsToList m = map relationshipDispatch (HashMap.toList m)
where
relationshipDispatch (API.TNFunction f, x) = API.RFunction (API.FunctionRelationships f x)
relationshipDispatch (API.TNTable t, x) = API.RTable (API.TableRelationships t x)
relationshipDispatch (API.TNInterpolatedQuery q, x) = API.RInterpolated (API.InterpolatedRelationships q x)
translateForeachRowFilter :: (MonadError QErr m) => FieldName -> HashMap FieldName (ColumnName, ScalarType) -> J.Object -> m (HashMap API.ColumnName API.ScalarValue)
translateForeachRowFilter argumentIdFieldName joinIdsSchema joinIds =


@@ -37,7 +37,7 @@ instance BackendMetadata 'MSSQL where
listAllTrackables _ =
throw500 "Computed fields are not yet defined for MSSQL backends"
getTableInfo _ _ = throw400 UnexpectedPayload "get_table_info not yet supported in MSSQL!"
validateNativeQuery _ _ _ _ nq = do
validateNativeQuery _ _ _ _ _ nq = do
validateArgumentDeclaration nq
pure (trimQueryEnd (_nqmCode nq)) -- for now, all queries are valid
validateStoredProcedure _ _ _ _ = pure () -- for now, all stored procedures are valid
