Nested array support for Data Connectors Backend and MongoDB

## Description

This change adds support for querying into nested arrays in Data Connector agents that support them (currently only MongoDB).

### DC API changes

- A new API type `ColumnType` represents the type of a "column" as either a scalar type, an object reference or an array of `ColumnType`s. This recursive definition allows arbitrary nesting of arrays of types (a sketch of the wire format follows this list).
- The `type` fields in the API types `ColumnInfo` and `ColumnInsertSchema` now take a `ColumnType` instead of a `ScalarType`.
- To ensure backwards compatibility, a `ColumnType` representing a scalar serialises and deserialises to the same representation as `ScalarType`.
- In queries, the `Field` type now has a new constructor `NestedArrayField`. This contains a nested `Field` along with optional `limit`, `offset`, `where` and `order_by` arguments. (These optional arguments are not yet used by either HGE or the MongoDB agent.)
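
To make the recursive shape concrete, here is a minimal, self-contained sketch of the wire format using plain aeson. It mirrors the `ColumnType` schema added in this PR but is not the production `autodocodec`-based codec; the standalone type below is only illustrative.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Illustrative sketch only: mirrors the wire format of the new ColumnType
-- (scalar | object | array) using plain aeson, rather than the agent's
-- autodocodec-based codec.
import Data.Aeson (ToJSON (..), object, (.=))
import qualified Data.Aeson as J
import Data.Text (Text)

data ColumnType
  = ColumnTypeScalar Text           -- e.g. "string", "number"
  | ColumnTypeObject Text           -- name of an object type from the schema response
  | ColumnTypeArray ColumnType Bool -- element type and whether elements are nullable

instance ToJSON ColumnType where
  -- A scalar serialises as a bare string, exactly like the old ScalarType,
  -- which is what keeps the change backwards compatible.
  toJSON (ColumnTypeScalar scalar) = toJSON scalar
  toJSON (ColumnTypeObject name) =
    object ["type" .= ("object" :: Text), "name" .= name]
  toJSON (ColumnTypeArray elementType nullable) =
    object
      [ "type" .= ("array" :: Text),
        "element_type" .= elementType,
        "nullable" .= nullable
      ]

-- An array of nullable strings encodes (modulo key order) as:
--   {"type":"array","element_type":"string","nullable":true}
main :: IO ()
main = print (J.encode (ColumnTypeArray (ColumnTypeScalar "string") True))
```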

### MongoDB Haskell agent changes

- The `/schema` endpoint will now recognise arrays within the JSON validation schema and generate corresponding arrays in the DC schema.
- The `/query` endpoint will now handle `NestedArrayField`s within queries (although it does not yet handle `limit`, `offset`, `where` and `order_by`); an example request fragment follows this list.
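
As a concrete (and hypothetical) illustration of what the agent now receives, the sketch below builds a field fragment that selects the `city` sub-field of each element of a nested `addresses` array column. The column and field names are invented; the JSON shape follows the `NestedArrayField`, `NestedObjField` and `ColumnField` schemas in this PR.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Hypothetical illustration of the request shape: select the "city"
-- sub-field of each element of a nested "addresses" array column.
-- The column and field names are invented; the JSON shape follows the
-- NestedArrayField / NestedObjField / ColumnField schemas in this PR.
import Data.Aeson (Value, object, (.=))
import qualified Data.Aeson as J
import qualified Data.ByteString.Lazy.Char8 as BL
import Data.Text (Text)

addressesField :: Value
addressesField =
  object
    [ "type" .= ("array" :: Text),
      -- The wrapped field names the column and describes the per-element
      -- selection; limit/offset/where/order_by are omitted since they are
      -- not yet used by HGE or the MongoDB agent.
      "field"
        .= object
          [ "type" .= ("object" :: Text),
            "column" .= ("addresses" :: Text),
            "query"
              .= object
                [ "fields"
                    .= object
                      [ "city"
                          .= object
                            [ "type" .= ("column" :: Text),
                              "column" .= ("city" :: Text),
                              "column_type" .= ("string" :: Text)
                            ]
                      ]
                ]
          ]
    ]

main :: IO ()
main = BL.putStrLn (J.encode addressesField)
```

For an array of scalars, the wrapped field would presumably be a plain `column` field rather than the `object` field shown here.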

### HGE server changes

- The `Backend` type class gains a new type family `XNestedArrays b` to enable nested arrays on a per-backend basis (currently enabled only for the `DataConnector` backend); a sketch of this gating pattern follows this list.
- Within `RawColumnInfo` the column type is now represented by a new type `RawColumnType b`, which mirrors the shape of the DC API `ColumnType` but uses the `XNestedObjects b` and `XNestedArrays b` type families so that nested object and array support can be turned on or off for a particular backend. In the `DataConnector` backend, `API.ColumnType` is converted into `RawColumnType 'DataConnector` while building the schema.
- In the next stage of schema building, the `RawColumnInfo` is converted into a `StructuredColumnInfo`, which can represent the three different kinds of column: scalar, object and array. TODO: `StructuredColumnInfo` looks very similar to the Logical Model types; the main difference is that it uses the `XNestedObjects` and `XNestedArrays` type families. We should be able to combine these two representations.
- The `StructuredColumnInfo` is then placed into a `FIColumn` `FieldInfo`. This involved some refactoring of `FieldInfo`, as I had previously split `FINestedObject` out into a separate constructor. However, it works out better to represent all "column" fields (i.e. scalar, object and array) using `FIColumn`, as this makes it easier to implement permission checking correctly. This is the reason the `StructuredColumnInfo` was needed.
- Next, the `FieldInfo`s are used to generate `FieldParser`s. We add a new `AFNestedArray` constructor to `AnnFieldG`. An `AFNestedArray` field parser can contain either a simple array selection or an array aggregate. Simple array `FieldParser`s are currently limited to subfield selection; support for limit, offset, where and order_by will be added in a future PR. We also don't yet generate array aggregate `FieldParser`s.
- The new `AFNestedArray` field is handled by the `QueryPlan` module in the `DataConnector` backend. There we generate an `API.NestedArrayField` from the `AFNestedArray`. We also handle nested arrays when reshaping the response from the DC agent.
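
As referenced above, here is a minimal, self-contained sketch of the type-family gating pattern. The names mirror the description, but this is not the actual HGE code; in particular the real `XEnable`/`XDisable` markers and `ScalarType b` are simplified to `()`, `Void` and `String`.

```haskell
{-# LANGUAGE DataKinds, KindSignatures, TypeFamilies #-}

-- Minimal sketch of the per-backend gating pattern: the "support" type
-- families are () for a backend that enables the feature and Void for one
-- that does not. Names mirror the description above, but this is not the
-- real HGE code (XEnable/XDisable and ScalarType b are simplified).
import Data.Void (Void)

data BackendTag = DataConnector | SomeOtherBackend

type family XNestedObjects (b :: BackendTag)
type family XNestedArrays (b :: BackendTag)

type instance XNestedObjects 'DataConnector = ()
type instance XNestedArrays 'DataConnector = ()

type instance XNestedObjects 'SomeOtherBackend = Void
type instance XNestedArrays 'SomeOtherBackend = Void

-- Mirrors the shape of the DC API ColumnType, but the object and array
-- constructors carry the backend's support witness.
data RawColumnType b
  = RawColumnTypeScalar String
  | RawColumnTypeObject (XNestedObjects b) String
  | RawColumnTypeArray (XNestedArrays b) (RawColumnType b) Bool

-- Accepted for 'DataConnector, where XNestedArrays 'DataConnector ~ ():
nestedTags :: RawColumnType 'DataConnector
nestedTags = RawColumnTypeArray () (RawColumnTypeScalar "string") False

main :: IO ()
main = putStrLn "RawColumnType gating sketch compiles"
```

Because the object and array constructors carry the backend's witness, a backend whose family instance is `Void` simply cannot construct those cases.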

## Limitations

- Support for limit, offset, filter (where) and order_by on nested arrays is not yet fully implemented, although it should not be hard to add
- Support for aggregations on nested arrays is not yet fully implemented
- Permissions involving nested arrays (and objects) are not yet implemented
- This should be integrated with Logical Model types, but that will happen in a separate PR

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/9149
GitOrigin-RevId: 0e7b71a994fc1d2ca1ef73bfe7b96e95b5328531
David Overton 2023-05-24 18:00:59 +10:00 committed by hasura-bot
parent f77b6aaa1d
commit e5f88d8039
68 changed files with 992 additions and 469 deletions

View File

@ -1,6 +1,6 @@
{
"name": "@hasura/dc-api-types",
"version": "0.31.0",
"version": "0.32.0",
"description": "Hasura GraphQL Engine Data Connector Agent API types",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",

View File

@ -1081,6 +1081,74 @@
],
"type": "string"
},
"ColumnTypeObject": {
"properties": {
"name": {
"type": "string"
},
"type": {
"enum": [
"object"
],
"type": "string"
}
},
"required": [
"name",
"type"
],
"type": "object"
},
"ColumnType": {
"additionalProperties": true,
"anyOf": [
{
"$ref": "#/components/schemas/ScalarType"
},
{
"$ref": "#/components/schemas/ColumnTypeNonScalar"
}
]
},
"ColumnTypeArray": {
"properties": {
"element_type": {
"$ref": "#/components/schemas/ColumnType"
},
"nullable": {
"type": "boolean"
},
"type": {
"enum": [
"array"
],
"type": "string"
}
},
"required": [
"element_type",
"nullable",
"type"
],
"type": "object"
},
"ColumnTypeNonScalar": {
"discriminator": {
"mapping": {
"array": "ColumnTypeArray",
"object": "ColumnTypeObject"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/ColumnTypeObject"
},
{
"$ref": "#/components/schemas/ColumnTypeArray"
}
]
},
"DefaultValueGenerationStrategy": {
"properties": {
"type": {
@ -1165,7 +1233,7 @@
"type": "boolean"
},
"type": {
"$ref": "#/components/schemas/ScalarType"
"$ref": "#/components/schemas/ColumnType"
},
"updatable": {
"default": false,
@ -1685,7 +1753,7 @@
},
"type": "object"
},
"NestedObjectField": {
"NestedObjField": {
"properties": {
"column": {
"type": "string"
@ -1730,139 +1798,31 @@
],
"type": "object"
},
"ColumnField": {
"properties": {
"column": {
"type": "string"
},
"column_type": {
"$ref": "#/components/schemas/ScalarType"
},
"type": {
"enum": [
"column"
],
"type": "string"
}
},
"required": [
"column",
"column_type",
"type"
],
"type": "object"
},
"Field": {
"discriminator": {
"mapping": {
"array": "NestedArrayField",
"column": "ColumnField",
"object": "NestedObjectField",
"object": "NestedObjField",
"relationship": "RelationshipField"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/NestedObjectField"
"$ref": "#/components/schemas/NestedObjField"
},
{
"$ref": "#/components/schemas/RelationshipField"
},
{
"$ref": "#/components/schemas/NestedArrayField"
},
{
"$ref": "#/components/schemas/ColumnField"
}
]
},
"ColumnCountAggregate": {
"properties": {
"column": {
"description": "The column to apply the count aggregate function to",
"type": "string"
},
"distinct": {
"description": "Whether or not only distinct items should be counted",
"type": "boolean"
},
"type": {
"enum": [
"column_count"
],
"type": "string"
}
},
"required": [
"column",
"distinct",
"type"
],
"type": "object"
},
"SingleColumnAggregateFunction": {
"description": "Single column aggregate function name.\nA valid GraphQL name",
"type": "string"
},
"SingleColumnAggregate": {
"properties": {
"column": {
"description": "The column to apply the aggregation function to",
"type": "string"
},
"function": {
"$ref": "#/components/schemas/SingleColumnAggregateFunction"
},
"result_type": {
"$ref": "#/components/schemas/ScalarType"
},
"type": {
"enum": [
"single_column"
],
"type": "string"
}
},
"required": [
"function",
"column",
"result_type",
"type"
],
"type": "object"
},
"StarCountAggregate": {
"properties": {
"type": {
"enum": [
"star_count"
],
"type": "string"
}
},
"required": [
"type"
],
"type": "object"
},
"Aggregate": {
"discriminator": {
"mapping": {
"column_count": "ColumnCountAggregate",
"single_column": "SingleColumnAggregate",
"star_count": "StarCountAggregate"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/ColumnCountAggregate"
},
{
"$ref": "#/components/schemas/SingleColumnAggregate"
},
{
"$ref": "#/components/schemas/StarCountAggregate"
}
]
},
"UnrelatedTable": {
"properties": {
"table": {
@ -2275,6 +2235,10 @@
],
"type": "object"
},
"SingleColumnAggregateFunction": {
"description": "Single column aggregate function name.\nA valid GraphQL name",
"type": "string"
},
"OrderBySingleColumnAggregate": {
"properties": {
"column": {
@ -2395,6 +2359,152 @@
],
"type": "object"
},
"NestedArrayField": {
"properties": {
"field": {
"$ref": "#/components/schemas/Field"
},
"limit": {
"description": "Optionally limit the maximum number of returned elements of the array",
"maximum": 9223372036854776000,
"minimum": -9223372036854776000,
"nullable": true,
"type": "number"
},
"offset": {
"description": "Optionally skip the first n elements of the array",
"maximum": 9223372036854776000,
"minimum": -9223372036854776000,
"nullable": true,
"type": "number"
},
"order_by": {
"$ref": "#/components/schemas/OrderBy"
},
"type": {
"enum": [
"array"
],
"type": "string"
},
"where": {
"$ref": "#/components/schemas/Expression"
}
},
"required": [
"field",
"type"
],
"type": "object"
},
"ColumnField": {
"properties": {
"column": {
"type": "string"
},
"column_type": {
"$ref": "#/components/schemas/ScalarType"
},
"type": {
"enum": [
"column"
],
"type": "string"
}
},
"required": [
"column",
"column_type",
"type"
],
"type": "object"
},
"ColumnCountAggregate": {
"properties": {
"column": {
"description": "The column to apply the count aggregate function to",
"type": "string"
},
"distinct": {
"description": "Whether or not only distinct items should be counted",
"type": "boolean"
},
"type": {
"enum": [
"column_count"
],
"type": "string"
}
},
"required": [
"column",
"distinct",
"type"
],
"type": "object"
},
"SingleColumnAggregate": {
"properties": {
"column": {
"description": "The column to apply the aggregation function to",
"type": "string"
},
"function": {
"$ref": "#/components/schemas/SingleColumnAggregateFunction"
},
"result_type": {
"$ref": "#/components/schemas/ScalarType"
},
"type": {
"enum": [
"single_column"
],
"type": "string"
}
},
"required": [
"function",
"column",
"result_type",
"type"
],
"type": "object"
},
"StarCountAggregate": {
"properties": {
"type": {
"enum": [
"star_count"
],
"type": "string"
}
},
"required": [
"type"
],
"type": "object"
},
"Aggregate": {
"discriminator": {
"mapping": {
"column_count": "ColumnCountAggregate",
"single_column": "SingleColumnAggregate",
"star_count": "StarCountAggregate"
},
"propertyName": "type"
},
"oneOf": [
{
"$ref": "#/components/schemas/ColumnCountAggregate"
},
{
"$ref": "#/components/schemas/SingleColumnAggregate"
},
{
"$ref": "#/components/schemas/StarCountAggregate"
}
]
},
"FunctionRequest": {
"properties": {
"function": {
@ -2647,7 +2757,7 @@
"type": "string"
},
"column_type": {
"$ref": "#/components/schemas/ScalarType"
"$ref": "#/components/schemas/ColumnType"
},
"nullable": {
"description": "Is the column nullable",

View File

@ -25,6 +25,10 @@ export type { ColumnInfo } from './models/ColumnInfo';
export type { ColumnInsertFieldValue } from './models/ColumnInsertFieldValue';
export type { ColumnInsertSchema } from './models/ColumnInsertSchema';
export type { ColumnNullability } from './models/ColumnNullability';
export type { ColumnType } from './models/ColumnType';
export type { ColumnTypeArray } from './models/ColumnTypeArray';
export type { ColumnTypeNonScalar } from './models/ColumnTypeNonScalar';
export type { ColumnTypeObject } from './models/ColumnTypeObject';
export type { ColumnValueGenerationStrategy } from './models/ColumnValueGenerationStrategy';
export type { ComparisonCapabilities } from './models/ComparisonCapabilities';
export type { ComparisonColumn } from './models/ComparisonColumn';
@ -75,7 +79,8 @@ export type { MutationOperationResults } from './models/MutationOperationResults
export type { MutationRequest } from './models/MutationRequest';
export type { MutationResponse } from './models/MutationResponse';
export type { NamedArgument } from './models/NamedArgument';
export type { NestedObjectField } from './models/NestedObjectField';
export type { NestedArrayField } from './models/NestedArrayField';
export type { NestedObjField } from './models/NestedObjField';
export type { NotExpression } from './models/NotExpression';
export type { NullColumnFieldValue } from './models/NullColumnFieldValue';
export type { NullColumnInsertFieldValue } from './models/NullColumnInsertFieldValue';

View File

@ -2,8 +2,8 @@
/* tslint:disable */
/* eslint-disable */
import type { ColumnType } from './ColumnType';
import type { ColumnValueGenerationStrategy } from './ColumnValueGenerationStrategy';
import type { ScalarType } from './ScalarType';
export type ColumnInfo = {
/**
@ -22,7 +22,7 @@ export type ColumnInfo = {
* Is column nullable
*/
nullable: boolean;
type: ScalarType;
type: ColumnType;
/**
* Whether or not the column can be updated
*/

View File

@ -2,15 +2,15 @@
/* tslint:disable */
/* eslint-disable */
import type { ColumnType } from './ColumnType';
import type { ColumnValueGenerationStrategy } from './ColumnValueGenerationStrategy';
import type { ScalarType } from './ScalarType';
export type ColumnInsertSchema = {
/**
* The name of the column that this field should be inserted into
*/
column: string;
column_type: ScalarType;
column_type: ColumnType;
/**
* Is the column nullable
*/

View File

@ -0,0 +1,9 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { ColumnTypeNonScalar } from './ColumnTypeNonScalar';
import type { ScalarType } from './ScalarType';
export type ColumnType = (ScalarType | ColumnTypeNonScalar);

View File

@ -0,0 +1,12 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { ColumnType } from './ColumnType';
export type ColumnTypeArray = {
element_type: ColumnType;
nullable: boolean;
type: 'array';
};

View File

@ -0,0 +1,9 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { ColumnTypeArray } from './ColumnTypeArray';
import type { ColumnTypeObject } from './ColumnTypeObject';
export type ColumnTypeNonScalar = (ColumnTypeObject | ColumnTypeArray);

View File

@ -0,0 +1,9 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type ColumnTypeObject = {
name: string;
type: 'object';
};

View File

@ -3,8 +3,9 @@
/* eslint-disable */
import type { ColumnField } from './ColumnField';
import type { NestedObjectField } from './NestedObjectField';
import type { NestedArrayField } from './NestedArrayField';
import type { NestedObjField } from './NestedObjField';
import type { RelationshipField } from './RelationshipField';
export type Field = (NestedObjectField | RelationshipField | ColumnField);
export type Field = (NestedObjField | RelationshipField | NestedArrayField | ColumnField);

View File

@ -0,0 +1,23 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { Expression } from './Expression';
import type { Field } from './Field';
import type { OrderBy } from './OrderBy';
export type NestedArrayField = {
field: Field;
/**
* Optionally limit the maximum number of returned elements of the array
*/
limit?: number | null;
/**
* Optionally skip the first n elements of the array
*/
offset?: number | null;
order_by?: OrderBy;
type: 'array';
where?: Expression;
};

View File

@ -4,7 +4,7 @@
import type { Query } from './Query';
export type NestedObjectField = {
export type NestedObjField = {
column: string;
query: Query;
type: 'object';

View File

@ -24,7 +24,7 @@
},
"dc-api-types": {
"name": "@hasura/dc-api-types",
"version": "0.31.0",
"version": "0.32.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
@ -2227,7 +2227,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -2547,7 +2547,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify": "^4.13.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -2868,7 +2868,7 @@
"version": "file:reference",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/xml2js": "^0.4.11",
@ -3080,7 +3080,7 @@
"version": "file:sqlite",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/sqlite3": "^3.1.8",

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -52,7 +52,7 @@
"integrity": "sha512-lgHwxlxV1qIg1Eap7LgIeoBWIMFibOjbrYPIPJZcI1mmGAI2m3lNYpK12Y+GBdPQ0U1hRwSord7GIaawz962qQ=="
},
"node_modules/@hasura/dc-api-types": {
"version": "0.31.0",
"version": "0.32.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify": "^4.13.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",

View File

@ -482,6 +482,9 @@ const projectRow = (fields: Record<string, Field>, findRelationship: (relationsh
case "object":
throw new Error('Unsupported field type "object"');
case "array":
throw new Error('Unsupported field type "array"');
default:
return unreachable(field["type"]);
}

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify": "^4.13.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -57,7 +57,7 @@
"integrity": "sha512-lgHwxlxV1qIg1Eap7LgIeoBWIMFibOjbrYPIPJZcI1mmGAI2m3lNYpK12Y+GBdPQ0U1hRwSord7GIaawz962qQ=="
},
"node_modules/@hasura/dc-api-types": {
"version": "0.31.0",
"version": "0.32.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.31.0",
"@hasura/dc-api-types": "0.32.0",
"fastify-metrics": "^9.2.1",
"fastify": "^4.13.0",
"nanoid": "^3.3.4",

View File

@ -111,6 +111,8 @@ export function json_object(relationships: TableRelationships[], fields: Fields,
return `'${fieldName}', ${relationship(relationships, rel, field, tableAlias)}`;
case "object":
throw new Error('Unsupported field type "object"');
case "array":
throw new Error('Unsupported field type "array"');
default:
return unreachable(field["type"]);
}

View File

@ -193,9 +193,9 @@ tests = do
API._tisPrimaryKey = Just $ API.ColumnName "AlbumId" :| [],
API._tisFields =
mkFieldsMap
[ ("AlbumId", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "AlbumId") (API.ScalarType "number") False (Just API.AutoIncrement)),
("ArtistId", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "ArtistId") (API.ScalarType "number") False Nothing),
("Title", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "Title") (API.ScalarType "string") False Nothing)
[ ("AlbumId", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "AlbumId") (API.ColumnTypeScalar $ API.ScalarType "number") False (Just API.AutoIncrement)),
("ArtistId", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "ArtistId") (API.ColumnTypeScalar $ API.ScalarType "number") False Nothing),
("Title", API.ColumnInsert $ API.ColumnInsertSchema (API.ColumnName "Title") (API.ColumnTypeScalar $ API.ScalarType "string") False Nothing)
]
}
]

View File

@ -3,6 +3,7 @@
module Hasura.Backends.DataConnector.API.V0.Column
( ColumnName (..),
ColumnType (..),
ColumnInfo (..),
ciName,
ciType,
@ -29,6 +30,7 @@ import Data.OpenApi (ToSchema)
import Data.Text (Text)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.Scalar qualified as API.V0.Scalar
import Language.GraphQL.Draft.Syntax qualified as G
import Prelude
--------------------------------------------------------------------------------
@ -44,9 +46,41 @@ instance HasCodec ColumnName where
--------------------------------------------------------------------------------
data ColumnType
= ColumnTypeScalar API.V0.Scalar.ScalarType
| ColumnTypeObject G.Name
| ColumnTypeArray ColumnType Bool
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ColumnType
instance HasCodec ColumnType where
codec =
named "ColumnType" $
matchChoiceCodec
(dimapCodec ColumnTypeScalar id codec)
(object "ColumnTypeNonScalar" $ discriminatedUnionCodec "type" enc dec)
\case
ColumnTypeScalar scalar -> Left scalar
ct -> Right ct
where
enc = \case
ColumnTypeScalar _ -> error "unexpected ColumnTypeScalar"
ColumnTypeObject objectName -> ("object", mapToEncoder objectName columnTypeObjectCodec)
ColumnTypeArray columnType isNullable -> ("array", mapToEncoder (columnType, isNullable) columnTypeArrayCodec)
dec =
HashMap.fromList
[ ("object", ("ColumnTypeObject", mapToDecoder ColumnTypeObject columnTypeObjectCodec)),
("array", ("ColumnTypeArray", mapToDecoder (uncurry ColumnTypeArray) columnTypeArrayCodec))
]
columnTypeObjectCodec = requiredField' "name"
columnTypeArrayCodec = (,) <$> requiredField' "element_type" .= fst <*> requiredField' "nullable" .= snd
--------------------------------------------------------------------------------
data ColumnInfo = ColumnInfo
{ _ciName :: ColumnName,
_ciType :: API.V0.Scalar.ScalarType,
_ciType :: ColumnType,
_ciNullable :: Bool,
_ciDescription :: Maybe Text,
_ciInsertable :: Bool,

View File

@ -186,7 +186,7 @@ instance HasCodec InsertFieldSchema where
-- | Describes a field in a row to be inserted that represents a column in the table
data ColumnInsertSchema = ColumnInsertSchema
{ _cisColumn :: API.V0.ColumnName,
_cisColumnType :: API.V0.ScalarType,
_cisColumnType :: API.V0.ColumnType,
_cisNullable :: Bool,
_cisValueGenerated :: Maybe API.V0.ColumnValueGenerationStrategy
}

View File

@ -34,6 +34,7 @@ module Hasura.Backends.DataConnector.API.V0.Query
qOrderBy,
Field (..),
RelationshipField (..),
ArrayField (..),
QueryResponse (..),
qrRows,
qrAggregates,
@ -42,6 +43,7 @@ module Hasura.Backends.DataConnector.API.V0.Query
mkRelationshipFieldValue,
mkNestedObjFieldValue,
mkNestedArrayFieldValue,
isNullFieldValue,
deserializeAsColumnFieldValue,
deserializeAsRelationshipFieldValue,
deserializeAsNestedObjFieldValue,
@ -283,16 +285,37 @@ relationshipFieldObjectCodec =
<*> requiredField "query" "Relationship query"
.= _rfQuery
data ArrayField = ArrayField
{ _afField :: Field,
_afLimit :: Maybe Int,
_afOffset :: Maybe Int,
_afWhere :: Maybe API.V0.Expression,
_afOrderBy :: Maybe API.V0.OrderBy
}
deriving stock (Eq, Ord, Show, Generic)
arrayFieldObjectCodec :: JSONObjectCodec ArrayField
arrayFieldObjectCodec =
ArrayField
<$> requiredField "field" "The nested field for array elements" .= _afField
<*> optionalFieldOrNull "limit" "Optionally limit the maximum number of returned elements of the array" .= _afLimit
<*> optionalFieldOrNull "offset" "Optionally skip the first n elements of the array" .= _afOffset
<*> optionalFieldOrNull "where" "Optionally constrain the returned elements of the array to satisfy some predicate" .= _afWhere
<*> optionalFieldOrNull "order_by" "Optionally order the returned elements of the array" .= _afOrderBy
-- | The specific fields that are targeted by a 'Query'.
--
-- A field conceptually falls under one of the two following categories:
-- A field conceptually falls under one of the following categories:
-- 1. a "column" within the data store that the query is being issued against
-- 2. a "relationship", which indicates that the field is the result of
-- a subquery
-- 3. an "object", which indicates that the field contains a nested object
-- 4. an "array", which indicates that the field contains a nested array
data Field
= ColumnField API.V0.ColumnName API.V0.ScalarType
| RelField RelationshipField
| NestedObjField API.V0.ColumnName Query
| NestedArrayField ArrayField
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Field
@ -318,11 +341,13 @@ instance HasCodec Field where
ColumnField columnName scalarType -> ("column", mapToEncoder (columnName, scalarType) columnCodec)
RelField relField -> ("relationship", mapToEncoder relField relationshipFieldObjectCodec)
NestedObjField columnName nestedObjQuery -> ("object", mapToEncoder (columnName, nestedObjQuery) nestedObjCodec)
NestedArrayField arrayField -> ("array", mapToEncoder arrayField arrayFieldObjectCodec)
dec =
HashMap.fromList
[ ("column", ("ColumnField", mapToDecoder (uncurry ColumnField) columnCodec)),
("relationship", ("RelationshipField", mapToDecoder RelField relationshipFieldObjectCodec)),
("object", ("NestedObjectField", mapToDecoder (uncurry NestedObjField) nestedObjCodec))
("object", ("NestedObjField", mapToDecoder (uncurry NestedObjField) nestedObjCodec)),
("array", ("NestedArrayField", mapToDecoder NestedArrayField arrayFieldObjectCodec))
]
-- | The resolved query response provided by the 'POST /query'
@ -396,6 +421,10 @@ mkNestedObjFieldValue = FieldValue . J.toJSON
mkNestedArrayFieldValue :: [FieldValue] -> FieldValue
mkNestedArrayFieldValue = FieldValue . J.toJSON
isNullFieldValue :: FieldValue -> Bool
isNullFieldValue (FieldValue J.Null) = True
isNullFieldValue _ = False
deserializeAsColumnFieldValue :: FieldValue -> J.Value
deserializeAsColumnFieldValue (FieldValue value) = value

View File

@ -89,7 +89,7 @@ numericColumns =
>>= ( API._tiColumns
>>> mapMaybe
( \API.ColumnInfo {..} ->
if _ciType == API.ScalarType "number"
if _ciType == API.ColumnTypeScalar (API.ScalarType "number")
then Just _ciName
else Nothing
)
@ -690,7 +690,9 @@ findColumnInfo API.SchemaResponse {..} tableName columnName =
findColumnScalarType :: API.SchemaResponse -> API.TableName -> API.ColumnName -> API.ScalarType
findColumnScalarType schemaResponse tableName columnName =
API._ciType $ findColumnInfo schemaResponse tableName columnName
case API._ciType $ findColumnInfo schemaResponse tableName columnName of
API.ColumnTypeScalar scalarType -> scalarType
_ -> error $ "Column " <> show columnName <> " in table " <> show tableName <> " does not have a scalar type"
emptyQuery :: API.Query
emptyQuery = API.Query Nothing Nothing Nothing Nothing Nothing Nothing Nothing

View File

@ -69,7 +69,7 @@ formatDateColumnsInRow dateTimeFormatString TableInfo {..} row =
else fieldValue
)
where
dateFields = fmap (\ColumnInfo {..} -> FieldName $ unColumnName _ciName) $ filter (\ColumnInfo {..} -> _ciType == dateTimeScalarType) _tiColumns
dateFields = fmap (\ColumnInfo {..} -> FieldName $ unColumnName _ciName) $ filter (\ColumnInfo {..} -> _ciType == ColumnTypeScalar dateTimeScalarType) _tiColumns
dateTimeScalarType = ScalarType "DateTime"
tryFormatDate fieldValue = case deserializeAsColumnFieldValue fieldValue of
J.String value -> do

View File

@ -16,6 +16,11 @@ import Test.Sandwich (describe, shouldBe)
import Test.TestHelpers (AgentDatasetTestSpec, it)
import Prelude
toScalarType :: ColumnType -> Maybe ScalarType
toScalarType = \case
ColumnTypeScalar scalarType -> Just scalarType
_ -> Nothing
spec :: TestData -> ScalarTypesCapabilities -> AgentDatasetTestSpec
spec TestData {..} (ScalarTypesCapabilities scalarTypesCapabilities) = describe "Custom Operators in Queries" do
describe "Top-level application of custom operators" do
@ -25,11 +30,12 @@ spec TestData {..} (ScalarTypesCapabilities scalarTypesCapabilities) = describe
HashMap.fromList do
API.TableInfo {_tiName, _tiColumns} <- _tdSchemaTables
ColumnInfo {_ciName, _ciType} <- _tiColumns
ScalarTypeCapabilities {_stcComparisonOperators} <- maybeToList $ HashMap.lookup _ciType scalarTypesCapabilities
scalarType <- maybeToList $ toScalarType _ciType
ScalarTypeCapabilities {_stcComparisonOperators} <- maybeToList $ HashMap.lookup scalarType scalarTypesCapabilities
(operatorName, argType) <- HashMap.toList $ unComparisonOperators _stcComparisonOperators
ColumnInfo {_ciName = anotherColumnName, _ciType = anotherColumnType} <- _tiColumns
guard $ anotherColumnType == argType
pure ((operatorName, _ciType), (_ciName, _tiName, anotherColumnName, argType))
guard $ anotherColumnType == ColumnTypeScalar argType
pure ((operatorName, scalarType), (_ciName, _tiName, anotherColumnName, argType))
forM_ (HashMap.toList items) \((operatorName, columnType), (columnName, tableName, argColumnName, argType)) -> do
-- Perform a select using the operator in a where clause

View File

@ -47,6 +47,9 @@ data MockRequestConfig = MockRequestConfig
mkTableName :: Text -> API.TableName
mkTableName = API.TableName . (:| [])
mkScalar :: Text -> API.ColumnType
mkScalar = API.ColumnTypeScalar . API.ScalarType
-- | Stock Capabilities for a Chinook Agent
capabilities :: API.CapabilitiesResponse
capabilities =
@ -154,7 +157,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "ArtistId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Artist primary key identifier",
API._ciInsertable = True,
@ -163,7 +166,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Name",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The name of the artist",
API._ciInsertable = True,
@ -184,7 +187,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "AlbumId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Album primary key identifier",
API._ciInsertable = True,
@ -193,7 +196,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Title",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The title of the album",
API._ciInsertable = True,
@ -202,7 +205,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "ArtistId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The ID of the artist that created the album",
API._ciInsertable = True,
@ -225,7 +228,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "CustomerId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Customer primary key identifier",
API._ciInsertable = True,
@ -234,7 +237,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "FirstName",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The customer's first name",
API._ciInsertable = True,
@ -243,7 +246,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "LastName",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The customer's last name",
API._ciInsertable = True,
@ -252,7 +255,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Company",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's company name",
API._ciInsertable = True,
@ -261,7 +264,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Address",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's address line (street number, street)",
API._ciInsertable = True,
@ -270,7 +273,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "City",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's address city",
API._ciInsertable = True,
@ -279,7 +282,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "State",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's address state",
API._ciInsertable = True,
@ -288,7 +291,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Country",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's address country",
API._ciInsertable = True,
@ -297,7 +300,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "PostalCode",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's address postal code",
API._ciInsertable = True,
@ -306,7 +309,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Phone",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's phone number",
API._ciInsertable = True,
@ -315,7 +318,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Fax",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The customer's fax number",
API._ciInsertable = True,
@ -324,7 +327,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Email",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The customer's email address",
API._ciInsertable = True,
@ -333,7 +336,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "SupportRepId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = True,
API._ciDescription = Just "The ID of the Employee who is this customer's support representative",
API._ciInsertable = True,
@ -356,7 +359,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "EmployeeId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Employee primary key identifier",
API._ciInsertable = True,
@ -365,7 +368,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "LastName",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The employee's last name",
API._ciInsertable = True,
@ -374,7 +377,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "FirstName",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The employee's first name",
API._ciInsertable = True,
@ -383,7 +386,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Title",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's job title",
API._ciInsertable = True,
@ -392,7 +395,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "ReportsTo",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = True,
API._ciDescription = Just "The employee's report",
API._ciInsertable = True,
@ -401,7 +404,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BirthDate",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's birth date",
API._ciInsertable = True,
@ -410,7 +413,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "HireDate",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's hire date",
API._ciInsertable = True,
@ -419,7 +422,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Address",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's address line (street number, street)",
API._ciInsertable = True,
@ -428,7 +431,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "City",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's address city",
API._ciInsertable = True,
@ -437,7 +440,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "State",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's address state",
API._ciInsertable = True,
@ -446,7 +449,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Country",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's address country",
API._ciInsertable = True,
@ -455,7 +458,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "PostalCode",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's address postal code",
API._ciInsertable = True,
@ -464,7 +467,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Phone",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's phone number",
API._ciInsertable = True,
@ -473,7 +476,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Fax",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's fax number",
API._ciInsertable = True,
@ -482,7 +485,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Email",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The employee's email address",
API._ciInsertable = True,
@ -505,7 +508,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "GenreId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Genre primary key identifier",
API._ciInsertable = True,
@ -514,7 +517,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Name",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The name of the genre",
API._ciInsertable = True,
@ -535,7 +538,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "InvoiceId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Invoice primary key identifier",
API._ciInsertable = True,
@ -544,7 +547,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "CustomerId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "ID of the customer who bought the music",
API._ciInsertable = True,
@ -553,7 +556,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "InvoiceDate",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "Date of the invoice",
API._ciInsertable = True,
@ -562,7 +565,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BillingAddress",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The invoice's billing address line (street number, street)",
API._ciInsertable = True,
@ -571,7 +574,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BillingCity",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The invoice's billing address city",
API._ciInsertable = True,
@ -580,7 +583,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BillingState",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The invoice's billing address state",
API._ciInsertable = True,
@ -589,7 +592,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BillingCountry",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The invoice's billing address country",
API._ciInsertable = True,
@ -598,7 +601,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "BillingPostalCode",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The invoice's billing address postal code",
API._ciInsertable = True,
@ -607,7 +610,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Total",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The total amount due on the invoice",
API._ciInsertable = True,
@ -631,7 +634,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "InvoiceLineId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Invoice Line primary key identifier",
API._ciInsertable = True,
@ -640,7 +643,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "InvoiceId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "ID of the invoice the line belongs to",
API._ciInsertable = True,
@ -649,7 +652,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "TrackId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "ID of the music track being purchased",
API._ciInsertable = True,
@ -658,7 +661,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "UnitPrice",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Price of each individual track unit",
API._ciInsertable = True,
@ -667,7 +670,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Quantity",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Quantity of the track purchased",
API._ciInsertable = True,
@ -693,7 +696,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "MediaTypeId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "Media Type primary key identifier",
API._ciInsertable = True,
@ -702,7 +705,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Name",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The name of the media type format",
API._ciInsertable = True,
@ -723,7 +726,7 @@ schema =
API._tiColumns =
[ API.ColumnInfo
{ API._ciName = API.ColumnName "TrackId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The ID of the track",
API._ciInsertable = True,
@ -732,7 +735,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Name",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = False,
API._ciDescription = Just "The name of the track",
API._ciInsertable = True,
@ -741,7 +744,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "AlbumId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = True,
API._ciDescription = Just "The ID of the album the track belongs to",
API._ciInsertable = True,
@ -750,7 +753,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "MediaTypeId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The ID of the media type the track is encoded with",
API._ciInsertable = True,
@ -759,7 +762,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "GenreId",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = True,
API._ciDescription = Just "The ID of the genre of the track",
API._ciInsertable = True,
@ -768,7 +771,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Composer",
API._ciType = API.ScalarType "string",
API._ciType = mkScalar "string",
API._ciNullable = True,
API._ciDescription = Just "The name of the composer of the track",
API._ciInsertable = True,
@ -777,7 +780,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Milliseconds",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The length of the track in milliseconds",
API._ciInsertable = True,
@ -786,7 +789,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "Bytes",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = True,
API._ciDescription = Just "The size of the track in bytes",
API._ciInsertable = True,
@ -795,7 +798,7 @@ schema =
},
API.ColumnInfo
{ API._ciName = API.ColumnName "UnitPrice",
API._ciType = API.ScalarType "number",
API._ciType = mkScalar "number",
API._ciNullable = False,
API._ciDescription = Just "The price of the track",
API._ciInsertable = True,
@ -820,12 +823,12 @@ schema =
{ API._tiName = mkTableName "MyCustomScalarsTable",
API._tiType = API.Table,
API._tiColumns =
[ API.ColumnInfo (API.ColumnName "MyIntColumn") (API.ScalarType "MyInt") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyFloatColumn") (API.ScalarType "MyFloat") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyStringColumn") (API.ScalarType "MyString") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyBooleanColumn") (API.ScalarType "MyBoolean") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyIDColumn") (API.ScalarType "MyID") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyAnythingColumn") (API.ScalarType "MyAnything") False Nothing True True Nothing
[ API.ColumnInfo (API.ColumnName "MyIntColumn") (mkScalar "MyInt") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyFloatColumn") (mkScalar "MyFloat") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyStringColumn") (mkScalar "MyString") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyBooleanColumn") (mkScalar "MyBoolean") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyIDColumn") (mkScalar "MyID") False Nothing True True Nothing,
API.ColumnInfo (API.ColumnName "MyAnythingColumn") (mkScalar "MyAnything") False Nothing True True Nothing
],
API._tiPrimaryKey = Nothing,
API._tiDescription = Nothing,

View File

@ -114,7 +114,7 @@ resolveSource sourceConfig =
[ RawColumnInfo
{ rciName = ColumnName name,
rciPosition = position,
rciType = restTypeToScalarType type',
rciType = RawColumnTypeScalar $ restTypeToScalarType type',
rciIsNullable =
case mode of
Nullable -> True

View File

@ -79,6 +79,7 @@ instance Backend 'DataConnector where
type XNestedInserts 'DataConnector = XDisable
type XStreamingSubscription 'DataConnector = XDisable
type XNestedObjects 'DataConnector = XEnable
type XNestedArrays 'DataConnector = XEnable
type HealthCheckTest 'DataConnector = Void

View File

@ -107,7 +107,6 @@ instance BackendMetadata 'DataConnector where
postDropSourceHook _sourceConfig _tableTriggerMap = pure ()
buildComputedFieldBooleanExp _ _ _ _ _ _ =
error "buildComputedFieldBooleanExp: not implemented for the Data Connector backend."
columnInfoToFieldInfo = columnInfoToFieldInfo'
listAllTables = listAllTables'
listAllTrackables = listAllTrackables'
getTableInfo = getTableInfo'
@ -293,6 +292,12 @@ getDataConnectorInfo dataConnectorName backendInfo =
onNothing (HashMap.lookup dataConnectorName backendInfo) $
throw400 DataConnectorError ("Data connector named " <> toTxt dataConnectorName <<> " was not found in the data connector backend info")
mkRawColumnType :: API.Capabilities -> API.ColumnType -> RQL.T.C.RawColumnType 'DataConnector
mkRawColumnType capabilities = \case
API.ColumnTypeScalar scalarType -> RQL.T.C.RawColumnTypeScalar $ DC.mkScalarType capabilities scalarType
API.ColumnTypeObject name -> RQL.T.C.RawColumnTypeObject () name
API.ColumnTypeArray columnType isNullable -> RQL.T.C.RawColumnTypeArray () (mkRawColumnType capabilities columnType) isNullable
resolveDatabaseMetadata' ::
( MonadIO m,
MonadBaseControl IO m
@ -303,9 +308,8 @@ resolveDatabaseMetadata' ::
m (Either QErr (DBObjectsIntrospection 'DataConnector))
resolveDatabaseMetadata' logger SourceMetadata {_smName} sourceConfig@DC.SourceConfig {_scCapabilities} = runExceptT do
API.SchemaResponse {..} <- requestDatabaseSchema logger _smName sourceConfig
let typeNames = maybe mempty (HashSet.fromList . toList . fmap API._otdName) _srObjectTypes
customObjectTypes =
maybe mempty (HashMap.fromList . mapMaybe (toTableObjectType _scCapabilities typeNames) . toList) _srObjectTypes
let customObjectTypes =
maybe mempty (HashMap.fromList . mapMaybe (toTableObjectType _scCapabilities) . toList) _srObjectTypes
tables = HashMap.fromList $ do
API.TableInfo {..} <- _srTables
let primaryKeyColumns = fmap Witch.from . NESeq.fromList <$> _tiPrimaryKey
@ -318,7 +322,7 @@ resolveDatabaseMetadata' logger SourceMetadata {_smName} sourceConfig@DC.SourceC
RQL.T.C.RawColumnInfo
{ rciName = Witch.from _ciName,
rciPosition = 1, -- TODO: This is very wrong and needs to be fixed. It is used for diffing tables and seeing what's new/deleted/altered, so reusing 1 for all columns is problematic.
rciType = DC.mkScalarType _scCapabilities _ciType,
rciType = mkRawColumnType _scCapabilities _ciType,
rciIsNullable = _ciNullable,
rciDescription = fmap GQL.Description _ciDescription,
rciMutability = RQL.T.C.ColumnMutability _ciInsertable _ciUpdatable
@ -368,23 +372,33 @@ requestDatabaseSchema logger sourceName sourceConfig = do
. flip runAgentClientT (AgentClientContext logger (DC._scEndpoint transformedSourceConfig) (DC._scManager transformedSourceConfig) (DC._scTimeoutMicroseconds transformedSourceConfig) Nothing)
$ Client.schema sourceName (DC._scConfig transformedSourceConfig)
toTableObjectType :: API.Capabilities -> HashSet G.Name -> API.ObjectTypeDefinition -> Maybe (G.Name, RQL.T.T.TableObjectType 'DataConnector)
toTableObjectType capabilities typeNames API.ObjectTypeDefinition {..} =
getFieldType :: API.Capabilities -> API.ColumnType -> Maybe (RQL.T.T.TableObjectFieldType 'DataConnector)
getFieldType capabilities = \case
API.ColumnTypeScalar scalarType -> RQL.T.T.TOFTScalar <$> G.mkName (API.getScalarType scalarType) <*> pure (DC.mkScalarType capabilities scalarType)
API.ColumnTypeObject objectTypeName -> pure $ RQL.T.T.TOFTObject objectTypeName
API.ColumnTypeArray columnType isNullable -> RQL.T.T.TOFTArray () <$> getFieldType capabilities columnType <*> pure isNullable
getGraphQLType :: Bool -> RQL.T.T.TableObjectFieldType 'DataConnector -> G.GType
getGraphQLType isNullable = \case
RQL.T.T.TOFTScalar name _ -> G.TypeNamed (G.Nullability isNullable) name
RQL.T.T.TOFTObject name -> G.TypeNamed (G.Nullability isNullable) name
RQL.T.T.TOFTArray _ fieldType isNullable' ->
G.TypeList (G.Nullability isNullable) $ getGraphQLType isNullable' fieldType
toTableObjectType :: API.Capabilities -> API.ObjectTypeDefinition -> Maybe (G.Name, RQL.T.T.TableObjectType 'DataConnector)
toTableObjectType capabilities API.ObjectTypeDefinition {..} =
(_otdName,) . RQL.T.T.TableObjectType _otdName (G.Description <$> _otdDescription) <$> traverse toTableObjectFieldDefinition _otdColumns
where
toTableObjectFieldDefinition API.ColumnInfo {..} = do
fieldTypeName <- G.mkName $ API.getScalarType _ciType
fieldType <- getFieldType capabilities _ciType
fieldName <- G.mkName $ API.unColumnName _ciName
pure $
RQL.T.T.TableObjectFieldDefinition
{ _tofdColumn = Witch.from _ciName,
_tofdName = fieldName,
_tofdDescription = G.Description <$> _ciDescription,
_tofdGType = GraphQLType $ G.TypeNamed (G.Nullability _ciNullable) fieldTypeName,
_tofdFieldType =
if HashSet.member fieldTypeName typeNames
then RQL.T.T.TOFTObject fieldTypeName
else RQL.T.T.TOFTScalar fieldTypeName $ DC.mkScalarType capabilities _ciType
_tofdGType = GraphQLType $ getGraphQLType _ciNullable fieldType,
_tofdFieldType = fieldType
}
-- | Construct a 'HashSet' 'RQL.T.T.ForeignKeyMetadata'
@ -541,30 +555,6 @@ mkTypedSessionVar ::
mkTypedSessionVar columnType =
PSESessVar (columnTypeToScalarType <$> columnType)
-- | This function assumes that if a type name is present in the custom object types for the table then it
-- refers to a nested object of that type.
-- Otherwise it is a normal (scalar) column.
columnInfoToFieldInfo' :: HashMap G.Name (RQL.T.T.TableObjectType 'DataConnector) -> RQL.T.C.ColumnInfo 'DataConnector -> RQL.T.T.FieldInfo 'DataConnector
columnInfoToFieldInfo' gqlTypes columnInfo@RQL.T.C.ColumnInfo {..} =
maybe (RQL.T.T.FIColumn columnInfo) RQL.T.T.FINestedObject getNestedObjectInfo
where
getNestedObjectInfo =
case ciType of
RQL.T.C.ColumnScalar (DC.ScalarType scalarTypeName _) -> do
gqlName <- GQL.mkName scalarTypeName
guard $ HashMap.member gqlName gqlTypes
pure $
RQL.T.C.NestedObjectInfo
{ RQL.T.C._noiSupportsNestedObjects = (),
RQL.T.C._noiColumn = ciColumn,
RQL.T.C._noiName = ciName,
RQL.T.C._noiType = gqlName,
RQL.T.C._noiIsNullable = ciIsNullable,
RQL.T.C._noiDescription = ciDescription,
RQL.T.C._noiMutability = ciMutability
}
RQL.T.C.ColumnEnumReference {} -> Nothing
buildObjectRelationshipInfo' ::
(MonadError QErr m) =>
DC.SourceConfig ->
@ -667,7 +657,7 @@ convertTableMetadataToTableInfo tableName RQL.T.T.DBTableMetadata {..} =
_sciValueGenerated =
extraColumnMetadata
>>= DC._ecmValueGenerated
>>= pure . \case
<&> \case
API.AutoIncrement -> AutoIncrement
API.UniqueIdentifier -> UniqueIdentifier
API.DefaultValue -> DefaultValue

View File

@ -173,7 +173,7 @@ captureTableInsertSchema tableName tableColumns primaryKey ExtraTableMetadata {.
scalarType = columnTypeToScalarType ciType
valueGenerated = extraColumnMetadata >>= _ecmValueGenerated
fieldName = API.FieldName $ G.unName ciName
columnInsertSchema = API.ColumnInsert $ API.ColumnInsertSchema (Witch.from ciColumn) (Witch.from scalarType) ciIsNullable valueGenerated
columnInsertSchema = API.ColumnInsert $ API.ColumnInsertSchema (Witch.from ciColumn) (API.ColumnTypeScalar $ Witch.from scalarType) ciIsNullable valueGenerated
in (fieldName, columnInsertSchema)
)
& HashMap.fromList

View File

@ -325,6 +325,14 @@ translateAnnField sessionVariables sourceTableName = \case
AFNestedObject nestedObj ->
Just . API.NestedObjField (Witch.from $ _anosColumn nestedObj)
<$> translateNestedObjectSelect sessionVariables sourceTableName nestedObj
AFNestedArray _ (ANASSimple field) ->
fmap mkArrayField <$> translateAnnField sessionVariables sourceTableName field
where
mkArrayField nestedField =
API.NestedArrayField (API.ArrayField nestedField Nothing Nothing Nothing Nothing)
-- TODO(dmoverton): support limit, offset, where and order_by in ArrayField
AFNestedArray _ (ANASAggregate _) ->
pure Nothing -- TODO(dmoverton): support nested array aggregates
AFColumn colField ->
-- TODO: make sure certain fields in colField are not in use, since we don't support them
pure . Just $ API.ColumnField (Witch.from $ _acfColumn colField) (Witch.from . columnTypeToScalarType $ _acfType colField)
@ -623,6 +631,18 @@ reshapeField field responseFieldValue =
Left err -> throw500 $ "Expected object in field returned by Data Connector agent: " <> err -- TODO(dmoverton): Add pathing information for error clarity
Right nestedResponse ->
reshapeAnnFields noPrefix (_anosFields nestedObj) nestedResponse
AFNestedArray _ (ANASSimple arrayField) -> do
fv <- responseFieldValue
if API.isNullFieldValue fv
then pure JE.null_
else do
let nestedArrayFieldValue = API.deserializeAsNestedArrayFieldValue fv
case nestedArrayFieldValue of
Left err -> throw500 $ "Expected array in field returned by Data Connector agent: " <> err -- TODO(dmoverton): Add pathing information for error clarity
Right arrayResponse ->
JE.list id <$> traverse (reshapeField arrayField) (pure <$> arrayResponse)
AFNestedArray _ (ANASAggregate _) ->
throw400 NotSupported "Nested array aggregate not supported"
AFColumn _columnField -> do
columnFieldValue <- API.deserializeAsColumnFieldValue <$> responseFieldValue
pure $ J.toEncoding columnFieldValue

View File

@ -167,7 +167,7 @@ transformColumn sysCol =
rciIsNullable = scIsNullable sysCol
rciDescription = Nothing
rciType = parseScalarType $ styName $ scJoinedSysType sysCol
rciType = RawColumnTypeScalar $ parseScalarType $ styName $ scJoinedSysType sysCol
foreignKeys =
scJoinedForeignKeyColumns sysCol <&> \foreignKeyColumn ->
let _fkConstraint = Constraint (ConstraintName "fk_mssql") $ OID $ sfkcConstraintObjectId foreignKeyColumn

View File

@ -139,7 +139,7 @@ tableInsertMatchColumnsEnum tableInfo = do
[ ( define $ ciName column,
ciColumn column
)
| column <- columns,
| SCIScalarColumn column <- columns,
isMatchColumnValid column
]
where

View File

@ -45,6 +45,7 @@ import Hasura.RQL.DDL.Schema
import Hasura.RQL.DDL.Schema.Diff qualified as Diff
import Hasura.RQL.Types.Backend
import Hasura.RQL.Types.BackendType
import Hasura.RQL.Types.Column (StructuredColumnInfo (..))
import Hasura.RQL.Types.Common
import Hasura.RQL.Types.ComputedField
import Hasura.RQL.Types.EventTrigger
@ -281,7 +282,7 @@ withMetadataCheck sqlGen source cascade txAccess runSQLQuery = do
runPgSourceWriteTx sourceConfig RunSQLQuery $
forM_ (HashMap.elems tables) $ \(TableInfo coreInfo _ eventTriggers _) -> do
let table = _tciName coreInfo
columns = getCols $ _tciFieldInfoMap coreInfo
columns = fmap (\(SCIScalarColumn col) -> col) $ getCols $ _tciFieldInfoMap coreInfo
forM_ (HashMap.toList eventTriggers) $ \(triggerName, EventTriggerInfo {etiOpsDef, etiTriggerOnReplication}) -> do
flip runReaderT sqlGen $
mkAllTriggersQ triggerName table etiTriggerOnReplication columns etiOpsDef

View File

@ -74,7 +74,7 @@ fetchAndValidateEnumValues pgSourceConfig tableName maybePrimaryKey columnInfos
Nothing -> refute [EnumTableMissingPrimaryKey]
Just primaryKey -> case _pkColumns primaryKey of
column NESeq.:<|| Seq.Empty -> case rciType column of
PGText -> pure column
RawColumnTypeScalar PGText -> pure column
_ -> refute [EnumTableNonTextualPrimaryKey column]
columns -> refute [EnumTableMultiColumnPrimaryKey $ map rciName (toList columns)]
@ -83,7 +83,7 @@ fetchAndValidateEnumValues pgSourceConfig tableName maybePrimaryKey columnInfos
case nonPrimaryKeyColumns of
[] -> pure Nothing
[column] -> case rciType column of
PGText -> pure $ Just column
RawColumnTypeScalar PGText -> pure $ Just column
_ -> dispute [EnumTableNonTextualCommentColumn column] $> Nothing
columns -> dispute [EnumTableTooManyColumns $ map rciName columns] $> Nothing
@ -123,11 +123,12 @@ fetchAndValidateEnumValues pgSourceConfig tableName maybePrimaryKey columnInfos
<> ")"
where
typeMismatch description colInfo expected =
"the tables "
<> description
<> " ("
<> rciName colInfo <<> ") must have type "
<> expected <<> ", not type " <>> rciType colInfo
let RawColumnTypeScalar scalarType = rciType @('Postgres pgKind) colInfo
in "the tables "
<> description
<> " ("
<> rciName colInfo <<> ") must have type "
<> expected <<> ", not type " <>> scalarType
fetchEnumValuesFromDb ::
forall pgKind m.

View File

@ -290,50 +290,58 @@ transformAnnFields ::
AnnFieldsG src (RemoteRelationshipField UnpreparedValue) (UnpreparedValue src) ->
Collector (AnnFieldsG src Void (UnpreparedValue src))
transformAnnFields fields = do
let transformAnnField :: AnnFieldG src (RemoteRelationshipField UnpreparedValue) (UnpreparedValue src) -> Collector (AnnFieldG src Void (UnpreparedValue src), Maybe RemoteJoin)
transformAnnField = \case
-- AnnFields which do not need to be transformed.
AFNodeId x sn qt pkeys ->
pure (AFNodeId x sn qt pkeys, Nothing)
AFColumn c ->
pure (AFColumn c, Nothing)
AFExpression t ->
pure (AFExpression t, Nothing)
-- AnnFields with no associated remote joins and whose transformations are
-- relatively straightforward.
AFObjectRelation annRel -> do
transformed <- traverseOf aarAnnSelect transformObjectSelect annRel
pure (AFObjectRelation transformed, Nothing)
AFArrayRelation (ASSimple annRel) -> do
transformed <- traverseOf aarAnnSelect transformSelect annRel
pure (AFArrayRelation . ASSimple $ transformed, Nothing)
AFArrayRelation (ASAggregate aggRel) -> do
transformed <- traverseOf aarAnnSelect transformAggregateSelect aggRel
pure (AFArrayRelation . ASAggregate $ transformed, Nothing)
AFArrayRelation (ASConnection annRel) -> do
transformed <- traverseOf aarAnnSelect transformConnectionSelect annRel
pure (AFArrayRelation . ASConnection $ transformed, Nothing)
AFComputedField computedField computedFieldName computedFieldSelect -> do
transformed <- case computedFieldSelect of
CFSScalar cfss cbe -> pure $ CFSScalar cfss cbe
CFSTable jsonAggSel annSel -> do
transformed <- transformSelect annSel
pure $ CFSTable jsonAggSel transformed
pure (AFComputedField computedField computedFieldName transformed, Nothing)
-- Remote AnnFields, whose elements require annotation so that they can be
-- used to construct a remote join.
AFRemote RemoteRelationshipSelect {..} ->
pure
( -- We generate this so that the response has a key with the relationship,
-- without which preserving the order of fields in the final response
-- would require a lot of bookkeeping.
remoteAnnPlaceholder,
Just $ createRemoteJoin (HashMap.intersection joinColumnAliases _rrsLHSJoinFields) _rrsRelationship
)
AFNestedObject nestedObj ->
(,Nothing) . AFNestedObject <$> transformNestedObjectSelect nestedObj
AFNestedArray supportsNestedArray (ANASSimple nestedArrayField) -> do
(,Nothing) . AFNestedArray supportsNestedArray . ANASSimple . fst <$> transformAnnField nestedArrayField
AFNestedArray supportsNestedArray (ANASAggregate agg) -> do
transformed <- transformAggregateSelect agg
pure (AFNestedArray supportsNestedArray (ANASAggregate transformed), Nothing)
-- Produces a list of transformed fields that may or may not have an
-- associated remote join.
annotatedFields <-
fields & traverseFields \case
-- AnnFields which do not need to be transformed.
AFNodeId x sn qt pkeys ->
pure (AFNodeId x sn qt pkeys, Nothing)
AFColumn c ->
pure (AFColumn c, Nothing)
AFExpression t ->
pure (AFExpression t, Nothing)
-- AnnFields with no associated remote joins and whose transformations are
-- relatively straightforward.
AFObjectRelation annRel -> do
transformed <- traverseOf aarAnnSelect transformObjectSelect annRel
pure (AFObjectRelation transformed, Nothing)
AFArrayRelation (ASSimple annRel) -> do
transformed <- traverseOf aarAnnSelect transformSelect annRel
pure (AFArrayRelation . ASSimple $ transformed, Nothing)
AFArrayRelation (ASAggregate aggRel) -> do
transformed <- traverseOf aarAnnSelect transformAggregateSelect aggRel
pure (AFArrayRelation . ASAggregate $ transformed, Nothing)
AFArrayRelation (ASConnection annRel) -> do
transformed <- traverseOf aarAnnSelect transformConnectionSelect annRel
pure (AFArrayRelation . ASConnection $ transformed, Nothing)
AFComputedField computedField computedFieldName computedFieldSelect -> do
transformed <- case computedFieldSelect of
CFSScalar cfss cbe -> pure $ CFSScalar cfss cbe
CFSTable jsonAggSel annSel -> do
transformed <- transformSelect annSel
pure $ CFSTable jsonAggSel transformed
pure (AFComputedField computedField computedFieldName transformed, Nothing)
-- Remote AnnFields, whose elements require annotation so that they can be
-- used to construct a remote join.
AFRemote RemoteRelationshipSelect {..} ->
pure
( -- We generate this so that the response has a key with the relationship,
-- without which preserving the order of fields in the final response
-- would require a lot of bookkeeping.
remoteAnnPlaceholder,
Just $ createRemoteJoin (HashMap.intersection joinColumnAliases _rrsLHSJoinFields) _rrsRelationship
)
AFNestedObject nestedObj ->
(,Nothing) . AFNestedObject <$> transformNestedObjectSelect nestedObj
fields & traverseFields transformAnnField
let transformedFields = (fmap . fmap) fst annotatedFields
remoteJoins =

View File

@ -131,8 +131,10 @@ boolExpInternal gqlName fieldInfos description memoizeKey mkAggPredParser = do
fieldName <- hoistMaybe $ fieldInfoGraphQLName fieldInfo
P.fieldOptional fieldName Nothing <$> case fieldInfo of
-- field_name: field_type_comparison_exp
FIColumn columnInfo ->
FIColumn (SCIScalarColumn columnInfo) ->
lift $ fmap (AVColumn columnInfo) <$> comparisonExps @b (ciType columnInfo)
FIColumn (SCIObjectColumn _) -> empty -- TODO(dmoverton)
FIColumn (SCIArrayColumn _) -> empty -- TODO(dmoverton)
-- field_name: field_type_bool_exp
FIRelationship relationshipInfo -> do
case riTarget relationshipInfo of
@ -165,7 +167,6 @@ boolExpInternal gqlName fieldInfos description memoizeKey mkAggPredParser = do
-- Using remote relationship fields in boolean expressions is not supported.
FIRemoteRelationship _ -> empty
FINestedObject _ -> empty -- TODO(dmoverton)
-- |
-- > input type_bool_exp {

View File

@ -24,6 +24,7 @@ module Hasura.GraphQL.Schema.Common
AnnotatedActionField,
AnnotatedActionFields,
AnnotatedNestedObjectSelect,
AnnotatedNestedArraySelect,
EdgeFields,
Scenario (..),
SelectArgs,
@ -320,6 +321,8 @@ type AnnotatedActionField = IR.ActionFieldG (IR.RemoteRelationshipField IR.Unpre
type AnnotatedNestedObjectSelect b = IR.AnnNestedObjectSelectG b (IR.RemoteRelationshipField IR.UnpreparedValue) (IR.UnpreparedValue b)
type AnnotatedNestedArraySelect b = IR.AnnNestedArraySelectG b (IR.RemoteRelationshipField IR.UnpreparedValue) (IR.UnpreparedValue b)
-------------------------------------------------------------------------------
data RemoteSchemaParser n = RemoteSchemaParser

View File

@ -211,11 +211,12 @@ tableFieldsInput tableInfo = do
mkFieldParser = \case
FIComputedField _ -> pure Nothing
FIRemoteRelationship _ -> pure Nothing
FINestedObject _ -> pure Nothing -- TODO(dmoverton)
FIColumn columnInfo -> do
FIColumn (SCIScalarColumn columnInfo) -> do
if (_cmIsInsertable $ ciMutability columnInfo)
then mkColumnParser columnInfo
else pure Nothing
FIColumn (SCIObjectColumn _) -> pure Nothing -- TODO(dmoverton)
FIColumn (SCIArrayColumn _) -> pure Nothing -- TODO(dmoverton)
FIRelationship relInfo -> mkRelationshipParser relInfo
mkColumnParser ::

View File

@ -7,6 +7,7 @@ module Hasura.GraphQL.Schema.OrderBy
)
where
import Control.Lens ((^?))
import Data.Has
import Data.HashMap.Strict.Extended qualified as HashMap
import Data.Text.Casing qualified as C
@ -119,7 +120,7 @@ orderByExpInternal gqlName description tableFields memoizeKey = do
mkField sourceInfo tCase fieldInfo = runMaybeT $ do
roleName <- retrieve scRole
case fieldInfo of
FIColumn columnInfo -> do
FIColumn (SCIScalarColumn columnInfo) -> do
let !fieldName = ciName columnInfo
pure $
P.fieldOptional
@ -127,6 +128,8 @@ orderByExpInternal gqlName description tableFields memoizeKey = do
Nothing
(orderByOperator @b tCase sourceInfo)
<&> fmap (pure . mkOrderByItemG @b (IR.AOCColumn columnInfo)) . join
FIColumn (SCIObjectColumn _) -> empty -- TODO(dmoverton)
FIColumn (SCIArrayColumn _) -> empty -- TODO(dmoverton)
FIRelationship relationshipInfo -> do
case riTarget relationshipInfo of
RelTargetNativeQuery _ -> error "mkField RelTargetNativeQuery"
@ -184,7 +187,6 @@ orderByExpInternal gqlName description tableFields memoizeKey = do
aggregationOrderBy
ReturnsOthers -> empty
FIRemoteRelationship _ -> empty
FINestedObject _ -> empty -- TODO(dmoverton)
-- | Corresponds to an object type for an order by.
--
@ -230,7 +232,7 @@ orderByAggregation sourceInfo tableInfo = P.memoizeOn 'orderByAggregation (_siNa
tCase = _rscNamingConvention customization
mkTypename = _rscTypeNames customization
tableIdentifierName <- getTableIdentifierName @b tableInfo
allColumns <- tableSelectColumns tableInfo
allColumns <- mapMaybe (^? _SCIScalarColumn) <$> tableSelectColumns tableInfo
let numColumns = stdAggOpColumns tCase $ onlyNumCols allColumns
compColumns = stdAggOpColumns tCase $ onlyComparableCols allColumns
numOperatorsAndColumns = HashMap.fromList $ (,numColumns) <$> numericAggOperators

View File

@ -1124,7 +1124,7 @@ tableAggregationFields tableInfo = do
mkTypename = _rscTypeNames customization
P.memoizeOn 'tableAggregationFields (sourceName, tableName) do
tableGQLName <- getTableIdentifierName tableInfo
allColumns <- tableSelectColumns tableInfo
allColumns <- mapMaybe (^? _SCIScalarColumn) <$> tableSelectColumns tableInfo
allComputedFields <-
if supportsAggregateComputedFields @b -- See 'supportsAggregateComputedFields' for an explanation
then tableSelectComputedFields tableInfo
@ -1362,7 +1362,7 @@ fieldSelection ::
FieldInfo b ->
SchemaT r m [FieldParser n (AnnotatedField b)]
fieldSelection table tableInfo = \case
FIColumn columnInfo ->
FIColumn (SCIScalarColumn columnInfo) ->
maybeToList <$> runMaybeT do
roleName <- retrieve scRole
schemaKind <- retrieve scSchemaKind
@ -1402,8 +1402,10 @@ fieldSelection table tableInfo = \case
pure $!
P.selection fieldName (ciDescription columnInfo) pathArg field
<&> IR.mkAnnColumnField (ciColumn columnInfo) (ciType columnInfo) caseBoolExpUnpreparedValue
FINestedObject nestedObjectInfo ->
FIColumn (SCIObjectColumn nestedObjectInfo) ->
pure . fmap IR.AFNestedObject <$> nestedObjectFieldParser tableInfo nestedObjectInfo
FIColumn (SCIArrayColumn NestedArrayInfo {..}) ->
fmap (nestedArrayFieldParser _naiSupportsNestedArrays _naiIsNullable) <$> fieldSelection table tableInfo (FIColumn _naiColumnInfo)
FIRelationship relationshipInfo ->
concat . maybeToList <$> relationshipField table relationshipInfo
FIComputedField computedFieldInfo ->
@ -1433,6 +1435,16 @@ fieldSelection table tableInfo = \case
pure $ P.subselection_ _noiName _noiDescription parser
_ -> throw500 $ "fieldSelection: object type " <> _noiType <<> " not found"
outputParserModifier :: Bool -> IP.Parser origin 'Output m a -> IP.Parser origin 'Output m a
outputParserModifier True = P.nullableParser
outputParserModifier False = P.nonNullableParser
nestedArrayFieldParser :: forall origin m b r v. Functor m => XNestedArrays b -> Bool -> IP.FieldParser origin m (IR.AnnFieldG b r v) -> IP.FieldParser origin m (IR.AnnFieldG b r v)
nestedArrayFieldParser supportsNestedArrays isNullable =
wrapNullable . IP.multipleField . fmap (IR.AFNestedArray @b supportsNestedArrays . IR.ANASSimple)
where
wrapNullable = if isNullable then IP.nullableField else IP.nonNullableField
nestedObjectParser ::
forall b r m n.
(MonadBuildSchema b r m n) =>
@ -1449,21 +1461,22 @@ nestedObjectParser supportsNestedObjects objectTypes objectType column isNullabl
P.selectionSet (_totName objectType) (_totDescription objectType) allFieldParsers
<&> IR.AnnNestedObjectSelectG supportsNestedObjects column . parsedSelectionsToFields IR.AFExpression
where
outputParserModifier True = P.nullableParser
outputParserModifier False = P.nonNullableParser
outputFieldParser ::
TableObjectFieldDefinition b ->
SchemaT r m (IP.FieldParser MetadataObjId n (IR.AnnFieldG b (IR.RemoteRelationshipField IR.UnpreparedValue) (IR.UnpreparedValue b)))
outputFieldParser (TableObjectFieldDefinition column' name description (GraphQLType gType) objectFieldType) =
P.memoizeOn 'nestedObjectParser (_totName objectType, name) do
case objectFieldType of
TOFTScalar fieldTypeName scalarType ->
wrapScalar scalarType $ customScalarParser fieldTypeName
TOFTObject objectName -> do
objectType' <- HashMap.lookup objectName objectTypes `onNothing` throw500 ("Custom type " <> objectName <<> " not found")
parser <- fmap (IR.AFNestedObject @b) <$> nestedObjectParser supportsNestedObjects objectTypes objectType' column' (G.isNullable gType)
pure $ P.subselection_ name description parser
let go objectFieldType' =
case objectFieldType' of
TOFTScalar fieldTypeName scalarType ->
wrapScalar scalarType $ customScalarParser fieldTypeName
TOFTObject objectName -> do
objectType' <- HashMap.lookup objectName objectTypes `onNothing` throw500 ("Custom type " <> objectName <<> " not found")
parser <- fmap (IR.AFNestedObject @b) <$> nestedObjectParser supportsNestedObjects objectTypes objectType' column' (G.isNullable gType)
pure $ P.subselection_ name description parser
TOFTArray supportsNestedArrays nestedFieldType isNullable' -> do
nestedArrayFieldParser supportsNestedArrays isNullable' <$> go nestedFieldType
go objectFieldType
where
wrapScalar scalarType parser =
pure $
@ -1657,7 +1670,7 @@ relationshipField table ri = runMaybeT do
(False, BeforeParent) -> do
let columns = HashMap.keys $ riMapping ri
fieldInfoMap = _tciFieldInfoMap $ _tiCoreInfo tableInfo
findColumn col = HashMap.lookup (fromCol @b col) fieldInfoMap ^? _Just . _FIColumn
findColumn col = HashMap.lookup (fromCol @b col) fieldInfoMap ^? _Just . _FIColumn . _SCIScalarColumn
-- Fetch information about the referencing columns of the foreign key
-- constraint
colInfo <-

View File

@ -7,6 +7,7 @@ module Hasura.GraphQL.Schema.SubscriptionStream
)
where
import Control.Lens ((^?))
import Control.Monad.Memoize
import Data.Has
import Data.List.NonEmpty qualified as NE
@ -191,7 +192,7 @@ tableStreamCursorExp tableInfo = do
memoizeOn 'tableStreamCursorExp (sourceName, tableName) $ do
tableGQLName <- getTableGQLName tableInfo
tableGQLIdentifier <- getTableIdentifierName tableInfo
columnInfos <- tableSelectColumns tableInfo
columnInfos <- mapMaybe (^? _SCIScalarColumn) <$> tableSelectColumns tableInfo
let objName = mkTypename $ applyTypeNameCaseIdentifier tCase $ mkStreamCursorInputTypeName tableGQLIdentifier
description = G.Description $ "Streaming cursor of the table " <>> tableGQLName
columnParsers <- tableStreamColumnArg tableGQLIdentifier columnInfos

View File

@ -17,6 +17,7 @@ module Hasura.GraphQL.Schema.Table
)
where
import Control.Lens ((^?))
import Data.Has
import Data.HashMap.Strict qualified as HashMap
import Data.HashSet qualified as Set
@ -104,7 +105,7 @@ tableSelectColumnsEnum tableInfo = do
"select columns of table " <>> tableInfoName tableInfo
-- We noticed many 'Definition's allocated, from 'define' below, so memoize
-- to gain more sharing and lower memory residency.
case nonEmpty $ map (define . ciName &&& ciColumn) columns of
case nonEmpty $ map (define . structuredColumnInfoName &&& structuredColumnInfoColumn) columns of
Nothing -> pure Nothing
Just columnDefinitions ->
Just
@ -139,7 +140,7 @@ tableSelectColumnsPredEnum columnPredicate predName tableInfo = do
mkTypename = runMkTypename $ _rscTypeNames customization
predName' = applyFieldNameCaseIdentifier tCase predName
tableGQLName <- getTableIdentifierName @b tableInfo
columns <- filter (columnPredicate . ciType) <$> tableSelectColumns tableInfo
columns <- filter (columnPredicate . ciType) . mapMaybe (^? _SCIScalarColumn) <$> tableSelectColumns tableInfo
let enumName = mkTypename $ applyTypeNameCaseIdentifier tCase $ mkSelectColumnPredTypeName tableGQLName predName
description =
Just $
@ -228,10 +229,12 @@ tableSelectFields tableInfo = do
filterM (canBeSelected roleName permissions) $ HashMap.elems tableFields
where
canBeSelected _ Nothing _ = pure False
canBeSelected _ (Just permissions) (FIColumn columnInfo) =
canBeSelected _ (Just permissions) (FIColumn (SCIScalarColumn (columnInfo))) =
pure $! HashMap.member (ciColumn columnInfo) (spiCols permissions)
canBeSelected _ (Just permissions) (FINestedObject NestedObjectInfo {..}) =
canBeSelected _ (Just permissions) (FIColumn (SCIObjectColumn NestedObjectInfo {..})) =
pure $! HashMap.member _noiColumn (spiCols permissions)
canBeSelected role permissions (FIColumn (SCIArrayColumn NestedArrayInfo {..})) =
canBeSelected role permissions (FIColumn _naiColumnInfo)
canBeSelected role _ (FIRelationship relationshipInfo) = do
case riTarget relationshipInfo of
RelTargetNativeQuery _ -> error "tableSelectFields RelTargetNativeQuery"
@ -253,7 +256,7 @@ tableColumns ::
tableColumns tableInfo =
mapMaybe columnInfo . HashMap.elems . _tciFieldInfoMap . _tiCoreInfo $ tableInfo
where
columnInfo (FIColumn ci) = Just ci
columnInfo (FIColumn (SCIScalarColumn ci)) = Just ci
columnInfo _ = Nothing
-- | Get the columns of a table that may be selected under the given select
@ -267,7 +270,7 @@ tableSelectColumns ::
Has (SourceInfo b) r
) =>
TableInfo b ->
m [ColumnInfo b]
m [StructuredColumnInfo b]
tableSelectColumns tableInfo =
mapMaybe columnInfo <$> tableSelectFields tableInfo
where

View File

@ -13,7 +13,7 @@ import Hasura.LogicalModel.NullableScalarType (NullableScalarType (..))
import Hasura.LogicalModel.Types (LogicalModelField (..), LogicalModelType (..), LogicalModelTypeScalar (..))
import Hasura.Prelude
import Hasura.RQL.Types.Backend (Backend (..))
import Hasura.RQL.Types.Column (ColumnInfo (..), ColumnMutability (..), ColumnType (..), fromCol)
import Hasura.RQL.Types.Column (ColumnInfo (..), ColumnMutability (..), ColumnType (..), StructuredColumnInfo (..), fromCol)
import Hasura.Table.Cache (FieldInfo (..), FieldInfoMap)
import Language.GraphQL.Draft.Syntax qualified as G
@ -46,19 +46,21 @@ toFieldInfo fields =
traverseWithIndex :: (Applicative m) => (Int -> aa -> m bb) -> [aa] -> m [bb]
traverseWithIndex f = zipWithM f [0 ..]
logicalModelToColumnInfo :: forall b. (Backend b) => Int -> (Column b, NullableScalarType b) -> Maybe (ColumnInfo b)
logicalModelToColumnInfo :: forall b. (Backend b) => Int -> (Column b, NullableScalarType b) -> Maybe (StructuredColumnInfo b)
logicalModelToColumnInfo i (column, NullableScalarType {..}) = do
name <- G.mkName (toTxt column)
pure $
ColumnInfo
{ ciColumn = column,
ciName = name,
ciPosition = i,
ciType = ColumnScalar nstType,
ciIsNullable = nstNullable,
ciDescription = G.Description <$> nstDescription,
ciMutability = ColumnMutability {_cmIsInsertable = False, _cmIsUpdatable = False}
}
-- TODO(dmoverton): handle object and array columns
SCIScalarColumn $
ColumnInfo
{ ciColumn = column,
ciName = name,
ciPosition = i,
ciType = ColumnScalar nstType,
ciIsNullable = nstNullable,
ciDescription = G.Description <$> nstDescription,
ciMutability = ColumnMutability {_cmIsInsertable = False, _cmIsUpdatable = False}
}
logicalModelFieldsToFieldInfo ::
forall b.

View File

@ -364,7 +364,7 @@ buildInsPermInfo source tn fieldInfoMap (InsPerm checkCond set mCols backendOnly
return (InsPermInfo insColsWithoutPresets be setColsSQL backendOnly reqHdrs, deps)
where
allInsCols = map ciColumn $ filter (_cmIsInsertable . ciMutability) $ getCols fieldInfoMap
allInsCols = map structuredColumnInfoColumn $ filter (_cmIsInsertable . structuredColumnInfoMutability) $ getCols fieldInfoMap
insCols = interpColSpec allInsCols (fromMaybe PCStar mCols)
relInInsErr = "Only table columns can have insert permissions defined, not relationships or other field types"
@ -517,7 +517,7 @@ buildSelPermInfo ::
SelPerm b ->
m (WithDeps (SelPermInfo b))
buildSelPermInfo source tableName fieldInfoMap roleName sp = withPathK "permission" $ do
let pgCols = interpColSpec (ciColumn <$> getCols fieldInfoMap) $ spColumns sp
let pgCols = interpColSpec (structuredColumnInfoColumn <$> getCols fieldInfoMap) $ spColumns sp
(spiFilter, boolExpDeps) <-
withPathK "filter" $
@ -614,7 +614,7 @@ buildUpdPermInfo source tn fieldInfoMap (UpdPerm colSpec set fltr check backendO
return (UpdPermInfo updColsWithoutPreSets tn be (fst <$> checkExpr) setColsSQL backendOnly reqHeaders, deps)
where
allUpdCols = map ciColumn $ filter (_cmIsUpdatable . ciMutability) $ getCols fieldInfoMap
allUpdCols = map structuredColumnInfoColumn $ filter (_cmIsUpdatable . structuredColumnInfoMutability) $ getCols fieldInfoMap
updCols = interpColSpec allUpdCols colSpec
relInUpdErr = "Only table columns can have update permissions defined, not relationships or other field types"

View File

@ -28,7 +28,7 @@ import Hasura.Prelude
import Hasura.RQL.IR.BoolExp
import Hasura.RQL.Types.Backend
import Hasura.RQL.Types.BoolExp
import Hasura.RQL.Types.Column (ColumnReference (ColumnReferenceColumn))
import Hasura.RQL.Types.Column (ColumnReference (ColumnReferenceColumn), StructuredColumnInfo (..))
import Hasura.RQL.Types.Common
import Hasura.RQL.Types.Metadata.Backend
import Hasura.RQL.Types.Permission
@ -177,9 +177,11 @@ annColExp ::
annColExp rhsParser rootFieldInfoMap colInfoMap (ColExp fieldName colVal) = do
colInfo <- askFieldInfo colInfoMap fieldName
case colInfo of
FIColumn pgi -> AVColumn pgi <$> parseBoolExpOperations (_berpValueParser rhsParser) rootFieldInfoMap colInfoMap (ColumnReferenceColumn pgi) colVal
FINestedObject {} ->
FIColumn (SCIScalarColumn pgi) -> AVColumn pgi <$> parseBoolExpOperations (_berpValueParser rhsParser) rootFieldInfoMap colInfoMap (ColumnReferenceColumn pgi) colVal
FIColumn (SCIObjectColumn {}) ->
throw400 NotSupported "nested object not supported"
FIColumn (SCIArrayColumn {}) ->
throw400 NotSupported "nested array not supported"
FIRelationship relInfo -> do
case riTarget relInfo of
RelTargetNativeQuery _ -> error "annColExp RelTargetNativeQuery"

View File

@ -32,6 +32,7 @@ import Hasura.Base.Error
QErrM,
runAesonParser,
throw400,
throw500,
)
import Hasura.EncJSON (EncJSON)
import Hasura.Prelude
@ -301,7 +302,7 @@ data PartiallyResolvedSource b = PartiallyResolvedSource
{ _prsSourceMetadata :: SourceMetadata b,
_prsConfig :: SourceConfig b,
_prsIntrospection :: DBObjectsIntrospection b,
_tableCoreInfoMap :: HashMap (TableName b) (TableCoreInfoG b (ColumnInfo b) (ColumnInfo b)),
_tableCoreInfoMap :: HashMap (TableName b) (TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b)),
_eventTriggerInfoMap :: HashMap (TableName b) (EventTriggerInfoMap b)
}
deriving (Eq)
@ -346,8 +347,9 @@ buildRemoteFieldInfo lhsIdentifier lhsJoinFields RemoteRelationship {..} allSour
-- TODO: rhs fields should also ideally be DBJoinFields
columnPairs <- for (HashMap.toList _tsrdFieldMapping) \(srcFieldName, tgtFieldName) -> do
lhsJoinField <- askFieldInfo lhsJoinFields srcFieldName
tgtField <- askFieldInfo targetColumns tgtFieldName
pure (srcFieldName, lhsJoinField, tgtField)
tgtField <- toScalarColumnInfo <$> askFieldInfo targetColumns tgtFieldName
tgtFieldScalarColumn <- tgtField `onNothing` throw500 ("Target field " <> tgtFieldName <<> " is not a scalar column")
pure (srcFieldName, lhsJoinField, tgtFieldScalarColumn)
columnMapping <- for columnPairs \(srcFieldName, srcColumn, tgtColumn) -> do
tgtScalar <- case ciType tgtColumn of
ColumnScalar scalarType -> pure scalarType

View File

@ -679,7 +679,7 @@ buildSchemaCacheRule logger env = proc (MetadataWithResourceVersion metadataNoDe
HashMap SourceName (AB.AnyBackend PartiallyResolvedSource),
SourceMetadata b,
SourceConfig b,
HashMap (TableName b) (TableCoreInfoG b (ColumnInfo b) (ColumnInfo b)),
HashMap (TableName b) (TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b)),
HashMap (TableName b) (EventTriggerInfoMap b),
DBObjectsIntrospection b,
PartiallyResolvedRemoteSchemaMap,
@ -700,8 +700,7 @@ buildSchemaCacheRule logger env = proc (MetadataWithResourceVersion metadataNoDe
interpretWriter
-< for (tablesRawInfo `alignTableMap` nonColumnsByTable) \(tableRawInfo, nonColumnInput) -> do
let columns = _tciFieldInfoMap tableRawInfo
customObjectTypes = _tciCustomObjectTypes tableRawInfo
allFields :: FieldInfoMap (FieldInfo b) <- addNonColumnFields allSources sourceName sourceConfig customObjectTypes tablesRawInfo columns remoteSchemaMap dbFunctions nonColumnInput
allFields :: FieldInfoMap (FieldInfo b) <- addNonColumnFields allSources sourceName sourceConfig tablesRawInfo columns remoteSchemaMap dbFunctions nonColumnInput
pure $ tableRawInfo {_tciFieldInfoMap = allFields}
-- permissions
@ -1274,7 +1273,7 @@ buildSchemaCacheRule logger env = proc (MetadataWithResourceVersion metadataNoDe
( CacheDynamicConfig,
SourceName,
SourceConfig b,
TableCoreInfoG b (ColumnInfo b) (ColumnInfo b),
TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b),
[EventTriggerConf b],
Inc.Dependency Inc.InvalidationKey,
RecreateEventTriggers
@ -1283,7 +1282,7 @@ buildSchemaCacheRule logger env = proc (MetadataWithResourceVersion metadataNoDe
buildTableEventTriggers = proc (dynamicConfig, sourceName, sourceConfig, tableInfo, eventTriggerConfs, metadataInvalidationKey, migrationRecreateEventTriggers) ->
buildInfoMap (etcName . (^. _7)) (mkEventTriggerMetadataObject @b) buildEventTrigger
-<
(tableInfo, map (dynamicConfig,metadataInvalidationKey,sourceName,sourceConfig,_tciName tableInfo,migrationRecreateEventTriggers,) eventTriggerConfs)
(tableInfo & tciFieldInfoMap %~ HashMap.mapMaybe toScalarColumnInfo, map (dynamicConfig,metadataInvalidationKey,sourceName,sourceConfig,_tciName tableInfo,migrationRecreateEventTriggers,) eventTriggerConfs)
where
buildEventTrigger = proc (tableInfo, (dynamicConfig, _metadataInvalidationKey, source, sourceConfig, table, migrationRecreateEventTriggers, eventTriggerConf)) -> do
let triggerName = etcName eventTriggerConf

View File

@ -38,14 +38,13 @@ addNonColumnFields ::
HashMap SourceName (AB.AnyBackend PartiallyResolvedSource) ->
SourceName ->
SourceConfig b ->
HashMap G.Name (TableObjectType b) ->
HashMap (TableName b) (TableCoreInfoG b (ColumnInfo b) (ColumnInfo b)) ->
FieldInfoMap (ColumnInfo b) ->
HashMap (TableName b) (TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b)) ->
FieldInfoMap (StructuredColumnInfo b) ->
PartiallyResolvedRemoteSchemaMap ->
DBFunctionsMetadata b ->
NonColumnTableInputs b ->
m (FieldInfoMap (FieldInfo b))
addNonColumnFields allSources sourceName sourceConfig customObjectTypes rawTableInfos columns remoteSchemaMap pgFunctions NonColumnTableInputs {..} = do
addNonColumnFields allSources sourceName sourceConfig rawTableInfos columns remoteSchemaMap pgFunctions NonColumnTableInputs {..} = do
objectRelationshipInfos <-
buildInfoMapPreservingMetadataM
_rdName
@ -61,18 +60,19 @@ addNonColumnFields allSources sourceName sourceConfig customObjectTypes rawTable
_nctiArrayRelationships
let relationshipInfos = objectRelationshipInfos <> arrayRelationshipInfos
scalarColumns = HashMap.mapMaybe toScalarColumnInfo columns
computedFieldInfos <-
buildInfoMapPreservingMetadataM
_cfmName
(mkComputedFieldMetadataObject sourceName _nctiTable)
(buildComputedField (HS.fromList $ HashMap.keys rawTableInfos) (HS.fromList $ map ciColumn $ HashMap.elems columns) sourceName pgFunctions _nctiTable)
(buildComputedField (HS.fromList $ HashMap.keys rawTableInfos) (HS.fromList $ map ciColumn $ HashMap.elems scalarColumns) sourceName pgFunctions _nctiTable)
_nctiComputedFields
-- the fields that can be used for defining join conditions to other sources/remote schemas:
-- 1. all columns
-- 2. computed fields which don't expect arguments other than the table row and user session
let lhsJoinFields =
let columnFields = columns <&> \columnInfo -> JoinColumn (ciColumn columnInfo) (ciType columnInfo)
let columnFields = scalarColumns <&> \columnInfo -> JoinColumn (ciColumn columnInfo) (ciType columnInfo)
computedFields = HashMap.fromList $
flip mapMaybe (HashMap.toList computedFieldInfos) $
\(cfName, (ComputedFieldInfo {..}, _)) -> do
@ -136,7 +136,7 @@ addNonColumnFields allSources sourceName sourceConfig customObjectTypes rawTable
pure Nothing
noCustomFieldConflicts nonColumnFields = do
let columnsByGQLName = mapFromL ciName $ HashMap.elems columns
let columnsByGQLName = mapFromL structuredColumnInfoName $ HashMap.elems columns
for nonColumnFields \(fieldInfo, metadata) -> withRecordInconsistencyM metadata do
for_ (fieldInfoGraphQLNames fieldInfo) \fieldGQLName ->
case HashMap.lookup fieldGQLName columnsByGQLName of
@ -144,15 +144,15 @@ addNonColumnFields allSources sourceName sourceConfig customObjectTypes rawTable
-- If they are the same, `noColumnConflicts` will catch it, and it will produce a
-- more useful error message.
Just columnInfo
| toTxt (ciColumn columnInfo) /= G.unName fieldGQLName ->
| toTxt (structuredColumnInfoColumn columnInfo) /= G.unName fieldGQLName ->
throw400 AlreadyExists $
"field definition conflicts with custom field name for postgres column "
<>> ciColumn columnInfo
<>> structuredColumnInfoColumn columnInfo
_ -> return ()
return (fieldInfo, metadata)
noColumnConflicts = \case
This columnInfo -> pure $ columnInfoToFieldInfo customObjectTypes columnInfo
This columnInfo -> pure $ FIColumn columnInfo
That (fieldInfo, _) -> pure $ fieldInfo
These columnInfo (_, fieldMetadata) -> do
recordInconsistencyM Nothing fieldMetadata "field definition conflicts with postgres column"

View File

@ -496,7 +496,6 @@ updateColExp qt rf (ColExp fld val) =
Nothing -> pure val
Just fi -> case fi of
FIColumn _ -> pure val
FINestedObject _ -> pure val
FIComputedField _ -> pure val
FIRelationship ri -> do
case riTarget ri of

View File

@ -63,7 +63,7 @@ import Hasura.RQL.Types.Backend
import Hasura.RQL.Types.Backend qualified as RQL.Types
import Hasura.RQL.Types.BackendType
import Hasura.RQL.Types.BackendType qualified as Backend
import Hasura.RQL.Types.Column (ColumnMutability (..), RawColumnInfo (..))
import Hasura.RQL.Types.Column (ColumnMutability (..), RawColumnInfo (..), RawColumnType (..))
import Hasura.RQL.Types.Common
import Hasura.RQL.Types.Common qualified as Common
import Hasura.RQL.Types.HealthCheck (HealthCheckConfig)
@ -482,11 +482,17 @@ convertTableMetadataToTableInfo tableName DBTableMetadata {..} =
_tiDeletable = all viIsDeletable _ptmiViewInfo
}
where
convertRawColumnType :: RawColumnType 'DataConnector -> API.ColumnType
convertRawColumnType = \case
RawColumnTypeScalar scalarType -> API.ColumnTypeScalar $ Witch.from scalarType
RawColumnTypeObject _ name -> API.ColumnTypeObject name
RawColumnTypeArray _ columnType isNullable -> API.ColumnTypeArray (convertRawColumnType columnType) isNullable
convertColumn :: RawColumnInfo 'DataConnector -> API.ColumnInfo
convertColumn RawColumnInfo {..} =
API.ColumnInfo
{ _ciName = Witch.from rciName,
_ciType = Witch.from rciType,
_ciType = convertRawColumnType rciType,
_ciNullable = rciIsNullable,
_ciDescription = G.unDescription <$> rciDescription,
_ciInsertable = _cmIsInsertable rciMutability,

View File

@ -8,6 +8,7 @@ module Hasura.RQL.DML.Delete
)
where
import Control.Lens ((^?))
import Control.Monad.Trans.Control (MonadBaseControl)
import Data.Aeson
import Data.Sequence qualified as DS
@ -64,7 +65,7 @@ validateDeleteQWith
askSelPermInfo tableInfo
let fieldInfoMap = _tciFieldInfoMap coreInfo
allCols = getCols fieldInfoMap
allCols = mapMaybe (^? _SCIScalarColumn) $ getCols fieldInfoMap
-- convert the returning cols into sql returing exp
mAnnRetCols <- forM mRetCols $ \retCols ->

View File

@ -3,6 +3,7 @@ module Hasura.RQL.DML.Insert
)
where
import Control.Lens ((^?))
import Control.Monad.Trans.Control (MonadBaseControl)
import Data.Aeson.Types
import Data.HashMap.Strict qualified as HashMap
@ -181,11 +182,11 @@ convInsertQuery objsParser sessVarBldr prepFn (InsertQuery tableName _ val oC mR
let defInsVals =
HashMap.fromList
[ (ciColumn column, S.columnDefaultValue)
[ (structuredColumnInfoColumn column, S.columnDefaultValue)
| column <- getCols fieldInfoMap,
_cmIsInsertable (ciMutability column)
_cmIsInsertable (structuredColumnInfoMutability column)
]
allCols = getCols fieldInfoMap
allCols = mapMaybe (^? _SCIScalarColumn) $ getCols fieldInfoMap
insCols = HashMap.keys defInsVals
resolvedPreSet <- mapM (convPartialSQLExp sessVarBldr) setInsVals

View File

@ -80,7 +80,7 @@ convWildcard fieldInfoMap selPermInfo wildcard =
(StarDot wc) -> (simpleCols ++) <$> (catMaybes <$> relExtCols wc)
where
cols = spiCols selPermInfo
pgCols = map ciColumn $ getCols fieldInfoMap
pgCols = map structuredColumnInfoColumn $ getCols fieldInfoMap
relColInfos = getRels fieldInfoMap
simpleCols = map ECSimple $ filter (`HashMap.member` cols) pgCols
@ -136,7 +136,7 @@ convOrderByElem sessVarBldr (flds, spi) = \case
OCPG fldName -> do
fldInfo <- askFieldInfo flds fldName
case fldInfo of
FIColumn colInfo -> do
FIColumn (SCIScalarColumn colInfo) -> do
checkSelOnCol spi (ciColumn colInfo)
let ty = ciType colInfo
if isScalarColumnWhere isGeoType ty

View File

@ -3,6 +3,7 @@ module Hasura.RQL.DML.Update
)
where
import Control.Lens ((^?))
import Control.Monad.Trans.Control (MonadBaseControl)
import Data.Aeson.Types
import Data.HashMap.Strict qualified as HashMap
@ -130,7 +131,7 @@ validateUpdateQueryWith sessVarBldr prepValBldr uq = do
askSelPermInfo tableInfo
let fieldInfoMap = _tciFieldInfoMap coreInfo
allCols = getCols fieldInfoMap
allCols = mapMaybe (^? _SCIScalarColumn) $ getCols fieldInfoMap
preSetObj = upiSet updPerm
preSetCols = HashMap.keys preSetObj

View File

@ -37,6 +37,8 @@ module Hasura.RQL.IR.Select
AnnFieldsG,
AnnNestedObjectSelectG (..),
AnnNestedObjectSelect,
AnnNestedArraySelectG (..),
AnnNestedArraySelect,
AnnObjectSelect,
AnnObjectSelectG (..),
AnnSimpleSelect,
@ -243,8 +245,9 @@ data AnnFieldG (b :: BackendType) (r :: Type) v
| AFNodeId (XRelay b) SourceName (TableName b) (PrimaryKeyColumns b)
| AFExpression Text
| -- | Nested object.
AFNestedObject (AnnNestedObjectSelectG b r v)
-- TODO (dmoverton): add AFNestedArray
AFNestedObject (AnnNestedObjectSelectG b r v) -- TODO(dmoverton): move XNestedObject to a field in AFNestedObject constructor for consistency with AFNestedArray
| -- | Nested array
AFNestedArray (XNestedArrays b) (AnnNestedArraySelectG b r v)
deriving stock (Functor, Foldable, Traversable)
deriving stock instance
@ -254,7 +257,8 @@ deriving stock instance
Eq (ComputedFieldSelect b r v),
Eq (ObjectRelationSelectG b r v),
Eq (RemoteRelationshipSelect b r),
Eq (AnnNestedObjectSelectG b r v)
Eq (AnnNestedObjectSelectG b r v),
Eq (AnnNestedArraySelectG b r v)
) =>
Eq (AnnFieldG b r v)
@ -265,7 +269,8 @@ deriving stock instance
Show (ComputedFieldSelect b r v),
Show (ObjectRelationSelectG b r v),
Show (RemoteRelationshipSelect b r),
Show (AnnNestedObjectSelectG b r v)
Show (AnnNestedObjectSelectG b r v),
Show (AnnNestedArraySelectG b r v)
) =>
Show (AnnFieldG b r v)
@ -279,6 +284,7 @@ instance Backend b => Bifoldable (AnnFieldG b) where
AFNodeId {} -> mempty
AFExpression {} -> mempty
AFNestedObject no -> bifoldMap f g no
AFNestedArray _ na -> bifoldMap f g na
type AnnField b = AnnFieldG b Void (SQLExpression b)
@ -700,6 +706,26 @@ instance Backend b => Bifoldable (AnnNestedObjectSelectG b) where
type AnnNestedObjectSelect b r = AnnNestedObjectSelectG b r (SQLExpression b)
-- Nested arrays
data AnnNestedArraySelectG (b :: BackendType) (r :: Type) v
= ANASSimple (AnnFieldG b r v)
| ANASAggregate (AnnAggregateSelectG b r v)
deriving stock (Functor, Foldable, Traversable)
deriving stock instance
(Backend b, Eq (AnnFieldG b r v), Eq (AnnAggregateSelectG b r v)) => Eq (AnnNestedArraySelectG b r v)
deriving stock instance
(Backend b, Show (AnnFieldG b r v), Show (AnnAggregateSelectG b r v)) => Show (AnnNestedArraySelectG b r v)
instance Backend b => Bifoldable (AnnNestedArraySelectG b) where
bifoldMap f g = \case
ANASSimple field -> bifoldMap f g field
ANASAggregate agg -> bifoldMapAnnSelectG f g agg
type AnnNestedArraySelect b r = AnnNestedArraySelectG b r (SQLExpression b)
-- | If argument positional index is less than or equal to length of
-- 'positional' arguments then insert the value in 'positional' arguments else
-- insert the value with argument name in 'named' arguments

View File

@ -182,7 +182,16 @@ class
NFData (XNestedObjects b),
Hashable (XNestedObjects b),
ToJSON (XNestedObjects b),
FromJSON (XNestedObjects b),
ToTxt (XNestedObjects b),
Eq (XNestedArrays b),
Ord (XNestedArrays b),
Show (XNestedArrays b),
NFData (XNestedArrays b),
Hashable (XNestedArrays b),
ToJSON (XNestedArrays b),
FromJSON (XNestedArrays b),
ToTxt (XNestedArrays b),
-- Intermediate Representations
Traversable (BooleanOperators b),
Traversable (UpdateVariant b),
@ -330,6 +339,9 @@ class
type XNestedObjects b :: Type
type XNestedObjects b = XDisable
type XNestedArrays b :: Type
type XNestedArrays b = XDisable
-- The result of dynamic connection template resolution
type ResolvedConnectionTemplate b :: Type
type ResolvedConnectionTemplate b = () -- Unimplemented value
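The `XDisable` default keeps nested arrays switched off for every backend unless its `Backend` instance overrides the family (the `DataConnector` instance that does so is not part of this excerpt). A minimal sketch of the disabled side, assuming `XDisable` is the uninhabited `Void` synonym used for the other `X*` feature switches:

```haskell
import Data.Void (Void, absurd)

-- Assumption: XDisable ~ Void. With the default
-- 'type XNestedArrays b = XDisable' no witness value can ever be built, so
-- backend code can dismiss the AFNestedArray case exhaustively instead of
-- implementing it.
dismissNestedArrays :: Void -> a
dismissNestedArrays = absurd
```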

View File

@ -1,3 +1,4 @@
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE UndecidableInstances #-}
@ -17,6 +18,7 @@ module Hasura.RQL.Types.Column
ColumnMutability (..),
ColumnInfo (..),
NestedObjectInfo (..),
RawColumnType (..),
RawColumnInfo (..),
PrimaryKeyColumns,
getColInfos,
@ -28,11 +30,21 @@ module Hasura.RQL.Types.Column
ColumnValues,
ColumnReference (..),
columnReferenceType,
NestedArrayInfo (..),
StructuredColumnInfo (..),
_SCIScalarColumn,
_SCIObjectColumn,
_SCIArrayColumn,
structuredColumnInfoName,
structuredColumnInfoColumn,
structuredColumnInfoMutability,
toScalarColumnInfo,
)
where
import Autodocodec
import Control.Lens.TH
import Data.Aeson
import Data.Aeson hiding ((.=))
import Data.Aeson.TH
import Data.HashMap.Strict qualified as HashMap
import Data.Text.Extended
@ -46,12 +58,14 @@ import Hasura.SQL.Types
import Language.GraphQL.Draft.Syntax qualified as G
newtype EnumValue = EnumValue {getEnumValue :: G.Name}
deriving (Show, Eq, Ord, NFData, Hashable, ToJSON, ToJSONKey, FromJSON, FromJSONKey)
deriving stock (Show, Eq, Ord, Generic)
deriving newtype (NFData, Hashable, ToJSON, ToJSONKey, FromJSON, FromJSONKey)
newtype EnumValueInfo = EnumValueInfo
{ evComment :: Maybe Text
}
deriving (Show, Eq, Ord, NFData, Hashable)
deriving stock (Show, Eq, Ord, Generic)
deriving newtype (NFData, Hashable)
$(deriveJSON hasuraJSON ''EnumValueInfo)
@ -167,6 +181,38 @@ parseScalarValuesColumnType ::
parseScalarValuesColumnType columnType =
indexedMapM (parseScalarValueColumnType columnType)
data RawColumnType (b :: BackendType)
= RawColumnTypeScalar (ScalarType b)
| RawColumnTypeObject (XNestedObjects b) G.Name
| RawColumnTypeArray (XNestedArrays b) (RawColumnType b) Bool
deriving stock (Generic)
deriving instance Backend b => Eq (RawColumnType b)
deriving instance Backend b => Ord (RawColumnType b)
deriving anyclass instance Backend b => Hashable (RawColumnType b)
deriving instance Backend b => Show (RawColumnType b)
instance Backend b => NFData (RawColumnType b)
-- For backwards compatibility we want to serialize and deserialize
-- RawColumnTypeScalar as a ScalarType
instance Backend b => ToJSON (RawColumnType b) where
toJSON = \case
RawColumnTypeScalar scalar -> toJSON scalar
other -> genericToJSON hasuraJSON other
instance Backend b => FromJSON (RawColumnType b) where
parseJSON v = (RawColumnTypeScalar <$> parseJSON v) <|> genericParseJSON hasuraJSON v
-- Ideally we'd derive ToJSON and FromJSON instances from the HasCodec instance, rather than the other way around.
-- Unfortunately, I'm not sure if it's possible to write a proper HasCodec instance in the presence
-- of the (XNestedObjects b) and (XNestedArrays b) type families, which may be Void.
instance Backend b => HasCodec (RawColumnType b) where
codec = codecViaAeson "RawColumnType"
-- | “Raw” column info, as stored in the catalog (but not in the schema cache). Instead of
-- containing a 'PGColumnType', it only contains a 'PGScalarType', which is combined with the
-- 'pcirReferences' field and other table data to eventually resolve the type to a 'PGColumnType'.
@ -176,7 +222,7 @@ data RawColumnInfo (b :: BackendType) = RawColumnInfo
-- increases. Dropping a column does /not/ cause the columns to be renumbered, so a column can be
-- consistently identified by its position.
rciPosition :: Int,
rciType :: ScalarType b,
rciType :: RawColumnType b,
rciIsNullable :: Bool,
rciDescription :: Maybe G.Description,
rciMutability :: ColumnMutability
@ -275,6 +321,72 @@ instance (Backend b) => ToJSON (NestedObjectInfo b) where
toJSON = genericToJSON hasuraJSON
toEncoding = genericToEncoding hasuraJSON
data NestedArrayInfo b = NestedArrayInfo
{ _naiSupportsNestedArrays :: XNestedArrays b,
_naiIsNullable :: Bool,
_naiColumnInfo :: StructuredColumnInfo b
}
deriving (Generic)
deriving instance (Backend b) => Eq (NestedArrayInfo b)
deriving instance (Backend b) => Ord (NestedArrayInfo b)
deriving instance (Backend b) => Show (NestedArrayInfo b)
instance (Backend b) => NFData (NestedArrayInfo b)
instance (Backend b) => Hashable (NestedArrayInfo b)
instance (Backend b) => ToJSON (NestedArrayInfo b) where
toJSON = genericToJSON hasuraJSON
toEncoding = genericToEncoding hasuraJSON
data StructuredColumnInfo b
= SCIScalarColumn (ColumnInfo b)
| SCIObjectColumn (NestedObjectInfo b)
| SCIArrayColumn (NestedArrayInfo b)
deriving (Generic)
deriving instance (Backend b) => Eq (StructuredColumnInfo b)
deriving instance (Backend b) => Ord (StructuredColumnInfo b)
deriving instance (Backend b) => Show (StructuredColumnInfo b)
instance (Backend b) => NFData (StructuredColumnInfo b)
instance (Backend b) => Hashable (StructuredColumnInfo b)
instance (Backend b) => ToJSON (StructuredColumnInfo b) where
toJSON = genericToJSON hasuraJSON
toEncoding = genericToEncoding hasuraJSON
structuredColumnInfoName :: StructuredColumnInfo b -> G.Name
structuredColumnInfoName = \case
SCIScalarColumn ColumnInfo {..} -> ciName
SCIObjectColumn NestedObjectInfo {..} -> _noiName
SCIArrayColumn NestedArrayInfo {..} -> structuredColumnInfoName _naiColumnInfo
structuredColumnInfoColumn :: StructuredColumnInfo b -> Column b
structuredColumnInfoColumn = \case
SCIScalarColumn ColumnInfo {..} -> ciColumn
SCIObjectColumn NestedObjectInfo {..} -> _noiColumn
SCIArrayColumn NestedArrayInfo {..} -> structuredColumnInfoColumn _naiColumnInfo
structuredColumnInfoMutability :: StructuredColumnInfo b -> ColumnMutability
structuredColumnInfoMutability = \case
SCIScalarColumn ColumnInfo {..} -> ciMutability
SCIObjectColumn NestedObjectInfo {..} -> _noiMutability
SCIArrayColumn NestedArrayInfo {..} -> structuredColumnInfoMutability _naiColumnInfo
toScalarColumnInfo :: StructuredColumnInfo b -> Maybe (ColumnInfo b)
toScalarColumnInfo = \case
SCIScalarColumn ci -> Just ci
_ -> Nothing
$(makePrisms ''StructuredColumnInfo)
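As an orientation aid, an illustrative sketch (not from this diff) of how a nullable array of scalars is represented, and how the selectors above drill through the wrapper to the leaf column. The leaf `ColumnInfo` is a placeholder and `()` stands in for an enabled `XNestedArrays`:

```haskell
-- Hypothetical leaf column; in practice this is built by processColumnInfo.
leafColumn :: ColumnInfo b
leafColumn = undefined -- placeholder for the example

-- A nullable array whose element type is the leaf scalar column.
arrayOfScalars :: (XNestedArrays b ~ ()) => StructuredColumnInfo b
arrayOfScalars =
  SCIArrayColumn
    NestedArrayInfo
      { _naiSupportsNestedArrays = (),
        _naiIsNullable = True,
        _naiColumnInfo = SCIScalarColumn leafColumn
      }

-- The selectors recurse through the wrapper, so, for example:
--   structuredColumnInfoColumn arrayOfScalars == ciColumn leafColumn
--   toScalarColumnInfo arrayOfScalars == Nothing   -- it is not a scalar
```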
type PrimaryKeyColumns b = NESeq (ColumnInfo b)
onlyNumCols :: forall b. Backend b => [ColumnInfo b] -> [ColumnInfo b]

View File

@ -262,15 +262,6 @@ class
getStoredProcedureGraphqlName _ _ =
throw500 "getStoredProcedureGraphqlName: not implemented for this backend."
-- | How to convert a column to a field.
-- For backends that don't support nested objects or arrays the default implementation
-- (i.e. wrapping the ColumnInfo in FIColumn) is what you want.
columnInfoToFieldInfo ::
HashMap G.Name (TableObjectType b) ->
ColumnInfo b ->
FieldInfo b
columnInfoToFieldInfo _ = FIColumn
-- | Allows the backend to control whether or not a particular source supports being
-- the target of remote relationships or not
supportsBeingRemoteRelationshipTarget :: SourceConfig b -> Bool

View File

@ -27,13 +27,14 @@ import Data.OpenApi (ToSchema)
import Data.Text (Text)
import GHC.Generics (Generic)
import Hasura.RQL.Types.Backend (Backend (..))
import Hasura.RQL.Types.Column (RawColumnType (..))
import Prelude
--------------------------------------------------------------------------------
data SourceColumnInfo b = SourceColumnInfo
{ _sciName :: Column b,
_sciType :: ScalarType b,
_sciType :: RawColumnType b,
_sciNullable :: Bool,
_sciDescription :: Maybe Text,
_sciInsertable :: Bool,

View File

@ -91,10 +91,16 @@ instance (Backend b) => FromJSON (TrackTable b) where
where
withOptions = withObject "TrackTable" \o ->
TrackTable
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .:? "is_enum" .!= False
<*> o .:? "apollo_federation_config"
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.:? "is_enum"
.!= False
<*> o
.:? "apollo_federation_config"
withoutOptions = TrackTable defaultSource <$> parseJSON v <*> pure False <*> pure Nothing
data SetTableIsEnum b = SetTableIsEnum
@ -110,9 +116,13 @@ deriving instance (Show (TableName b)) => Show (SetTableIsEnum b)
instance (Backend b) => FromJSON (SetTableIsEnum b) where
parseJSON = withObject "SetTableIsEnum" $ \o ->
SetTableIsEnum
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .: "is_enum"
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.: "is_enum"
data UntrackTable b = UntrackTable
{ utSource :: SourceName,
@ -127,9 +137,14 @@ deriving instance (Backend b) => Eq (UntrackTable b)
instance (Backend b) => FromJSON (UntrackTable b) where
parseJSON = withObject "UntrackTable" $ \o ->
UntrackTable
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .:? "cascade" .!= False
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.:? "cascade"
.!= False
isTableTracked :: forall b. (Backend b) => SourceInfo b -> TableName b -> Bool
isTableTracked sourceInfo tableName =
@ -512,9 +527,13 @@ data SetTableCustomization b = SetTableCustomization
instance (Backend b) => FromJSON (SetTableCustomization b) where
parseJSON = withObject "SetTableCustomization" $ \o ->
SetTableCustomization
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .: "configuration"
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.: "configuration"
data SetTableCustomFields = SetTableCustomFields
{ _stcfSource :: SourceName,
@ -527,10 +546,17 @@ data SetTableCustomFields = SetTableCustomFields
instance FromJSON SetTableCustomFields where
parseJSON = withObject "SetTableCustomFields" $ \o ->
SetTableCustomFields
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .:? "custom_root_fields" .!= emptyCustomRootFields
<*> o .:? "custom_column_names" .!= HashMap.empty
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.:? "custom_root_fields"
.!= emptyCustomRootFields
<*> o
.:? "custom_column_names"
.!= HashMap.empty
runSetTableCustomFieldsQV2 ::
(QErrM m, CacheRWM m, MetadataM m) => SetTableCustomFields -> m EncJSON
@ -662,7 +688,7 @@ buildTableCache ::
Inc.Dependency Inc.InvalidationKey,
NamingCase
)
`arr` HashMap.HashMap (TableName b) (TableCoreInfoG b (ColumnInfo b) (ColumnInfo b))
`arr` HashMap.HashMap (TableName b) (TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b))
buildTableCache = Inc.cache proc (source, sourceConfig, dbTablesMeta, tableBuildInputs, reloadMetadataInvalidationKey, tCase) -> do
rawTableInfos <-
(|
@ -756,7 +782,7 @@ buildTableCache = Inc.cache proc (source, sourceConfig, dbTablesMeta, tableBuild
HashMap.HashMap (TableName b) (PrimaryKey b (Column b), TableConfig b, EnumValues) ->
TableCoreInfoG b (RawColumnInfo b) (Column b) ->
NamingCase ->
n (TableCoreInfoG b (ColumnInfo b) (ColumnInfo b))
n (TableCoreInfoG b (StructuredColumnInfo b) (ColumnInfo b))
processTableInfo enumTables rawInfo tCase = do
let columns = _tciFieldInfoMap rawInfo
enumReferences = resolveEnumReferences enumTables (_tciForeignKeys rawInfo)
@ -765,7 +791,7 @@ buildTableCache = Inc.cache proc (source, sourceConfig, dbTablesMeta, tableBuild
>>= traverse (processColumnInfo tCase enumReferences (_tciName rawInfo))
assertNoDuplicateFieldNames (HashMap.elems columnInfoMap)
primaryKey <- traverse (resolvePrimaryKeyColumns columnInfoMap) (_tciPrimaryKey rawInfo)
primaryKey <- traverse (resolvePrimaryKeyColumns $ HashMap.mapMaybe toScalarColumnInfo columnInfoMap) (_tciPrimaryKey rawInfo)
pure
rawInfo
{ _tciFieldInfoMap = columnInfoMap,
@ -821,25 +847,50 @@ buildTableCache = Inc.cache proc (source, sourceConfig, dbTablesMeta, tableBuild
HashMap.HashMap (Column b) (NonEmpty (EnumReference b)) ->
TableName b ->
(RawColumnInfo b, GQLNameIdentifier, Maybe G.Description) ->
n (ColumnInfo b)
processColumnInfo tCase tableEnumReferences tableName (rawInfo, name, description) = do
resolvedType <- resolveColumnType
pure
ColumnInfo
{ ciColumn = pgCol,
ciName = (applyFieldNameCaseIdentifier tCase name),
ciPosition = rciPosition rawInfo,
ciType = resolvedType,
ciIsNullable = rciIsNullable rawInfo,
ciDescription = description,
ciMutability = rciMutability rawInfo
}
n (StructuredColumnInfo b)
processColumnInfo tCase tableEnumReferences tableName (rawInfo, name, description) =
processRawColumnType (rciIsNullable rawInfo) $ rciType rawInfo
where
processRawColumnType isNullable = \case
RawColumnTypeScalar scalarType -> do
resolvedType <- resolveColumnType scalarType
pure $
SCIScalarColumn
ColumnInfo
{ ciColumn = pgCol,
ciName = applyFieldNameCaseIdentifier tCase name,
ciPosition = rciPosition rawInfo,
ciType = resolvedType,
ciIsNullable = isNullable,
ciDescription = description,
ciMutability = rciMutability rawInfo
}
RawColumnTypeObject supportsNestedObjects objectTypeName ->
pure $
SCIObjectColumn @b
NestedObjectInfo
{ _noiSupportsNestedObjects = supportsNestedObjects,
_noiColumn = pgCol,
_noiName = applyFieldNameCaseIdentifier tCase name,
_noiType = objectTypeName,
_noiIsNullable = isNullable,
_noiDescription = description,
_noiMutability = rciMutability rawInfo
}
RawColumnTypeArray supportsNestedArrays rawColumnType isNullable' -> do
nestedColumnInfo <- processRawColumnType isNullable' rawColumnType
pure $
SCIArrayColumn @b
NestedArrayInfo
{ _naiSupportsNestedArrays = supportsNestedArrays,
_naiIsNullable = isNullable,
_naiColumnInfo = nestedColumnInfo
}
pgCol = rciName rawInfo
resolveColumnType =
resolveColumnType scalarType =
case HashMap.lookup pgCol tableEnumReferences of
-- no references? not an enum
Nothing -> pure $ ColumnScalar (rciType rawInfo)
Nothing -> pure $ ColumnScalar scalarType
-- one reference? is an enum
Just (enumReference :| []) -> pure $ ColumnEnumReference enumReference
-- multiple referenced enums? the schema is strange, so lets reject it
@ -853,12 +904,12 @@ buildTableCache = Inc.cache proc (source, sourceConfig, dbTablesMeta, tableBuild
<> ")"
assertNoDuplicateFieldNames columns =
void $ flip HashMap.traverseWithKey (HashMap.groupOn ciName columns) \name columnsWithName ->
void $ flip HashMap.traverseWithKey (HashMap.groupOn structuredColumnInfoName columns) \name columnsWithName ->
case columnsWithName of
one : two : more ->
throw400 AlreadyExists $
"the definitions of columns "
<> englishList "and" (dquote . ciColumn <$> (one :| two : more))
<> englishList "and" (dquote . structuredColumnInfoColumn <$> (one :| two : more))
<> " are in conflict: they are mapped to the same field name, " <>> name
_ -> pure ()
@ -882,9 +933,13 @@ data SetApolloFederationConfig b = SetApolloFederationConfig
instance (Backend b) => FromJSON (SetApolloFederationConfig b) where
parseJSON = withObject "SetApolloFederationConfig" $ \o ->
SetApolloFederationConfig
<$> o .:? "source" .!= defaultSource
<*> o .: "table"
<*> o .:? "apollo_federation_config"
<$> o
.:? "source"
.!= defaultSource
<*> o
.: "table"
<*> o
.:? "apollo_federation_config"
runSetApolloFederationConfig ::
forall b m.

View File

@ -349,8 +349,7 @@ getAllCustomRootFields TableCustomRootFields {..} =
]
data FieldInfo (b :: BackendType)
= FIColumn (ColumnInfo b)
| FINestedObject (NestedObjectInfo b)
= FIColumn (StructuredColumnInfo b)
| FIRelationship (RelInfo b)
| FIComputedField (ComputedFieldInfo b)
| FIRemoteRelationship (RemoteFieldInfo (DBJoinField b))
@ -372,16 +371,14 @@ type FieldInfoMap = HashMap.HashMap FieldName
fieldInfoName :: forall b. (Backend b) => FieldInfo b -> FieldName
fieldInfoName = \case
FIColumn info -> fromCol @b $ ciColumn info
FINestedObject info -> fromCol @b $ _noiColumn info
FIColumn info -> fromCol @b $ structuredColumnInfoColumn info
FIRelationship info -> fromRel $ riName info
FIComputedField info -> fromComputedField $ _cfiName info
FIRemoteRelationship info -> fromRemoteRelationship $ getRemoteFieldInfoName info
fieldInfoGraphQLName :: FieldInfo b -> Maybe G.Name
fieldInfoGraphQLName = \case
FIColumn info -> Just $ ciName info
FINestedObject info -> Just $ _noiName info
FIColumn info -> Just $ structuredColumnInfoName info
FIRelationship info -> G.mkName $ relNameToTxt $ riName info
FIComputedField info -> G.mkName $ computedFieldNameToText $ _cfiName info
FIRemoteRelationship info -> G.mkName $ relNameToTxt $ getRemoteFieldInfoName info
@ -397,16 +394,20 @@ getRemoteFieldInfoName RemoteFieldInfo {_rfiRHS} = case _rfiRHS of
fieldInfoGraphQLNames :: FieldInfo b -> [G.Name]
fieldInfoGraphQLNames info = case info of
FIColumn _ -> maybeToList $ fieldInfoGraphQLName info
FINestedObject _ -> maybeToList $ fieldInfoGraphQLName info
FIRelationship relationshipInfo -> fold do
name <- fieldInfoGraphQLName info
pure $ case riType relationshipInfo of
ObjRel -> [name]
ArrRel -> [name, name <> Name.__aggregate]
ArrRel -> addAggregateFields [name]
FIComputedField _ -> maybeToList $ fieldInfoGraphQLName info
FIRemoteRelationship _ -> maybeToList $ fieldInfoGraphQLName info
where
addAggregateFields :: [G.Name] -> [G.Name]
addAggregateFields names = do
name <- names
[name, name <> Name.__aggregate]
getCols :: FieldInfoMap (FieldInfo backend) -> [ColumnInfo backend]
getCols :: FieldInfoMap (FieldInfo backend) -> [StructuredColumnInfo backend]
getCols = mapMaybe (^? _FIColumn) . HashMap.elems
-- | Sort columns based on their ordinal position
@ -1010,6 +1011,7 @@ instance (Backend b) => FromJSON (TableObjectFieldDefinition b) where
data TableObjectFieldType (b :: BackendType)
= TOFTScalar G.Name (ScalarType b)
| TOFTObject G.Name
| TOFTArray (XNestedArrays b) (TableObjectFieldType b) Bool -- isNullable
deriving stock (Generic)
deriving stock instance (Backend b) => Eq (TableObjectFieldType b)
@ -1180,7 +1182,7 @@ getFieldInfoM tableInfo fieldName =
getColumnInfoM ::
TableInfo b -> FieldName -> Maybe (ColumnInfo b)
getColumnInfoM tableInfo fieldName =
(^? _FIColumn) =<< getFieldInfoM tableInfo fieldName
(^? _FIColumn . _SCIScalarColumn) =<< getFieldInfoM tableInfo fieldName
askFieldInfo ::
(MonadError QErr m) =>
@ -1211,8 +1213,9 @@ askColInfo m c msg = do
modifyErr ("column " <>) $
askFieldInfo m (fromCol @backend c)
case fieldInfo of
(FIColumn colInfo) -> pure colInfo
(FINestedObject _) -> throwErr "nested object"
(FIColumn (SCIScalarColumn colInfo)) -> pure colInfo
(FIColumn (SCIObjectColumn _)) -> throwErr "object"
(FIColumn (SCIArrayColumn _)) -> throwErr "array"
(FIRelationship _) -> throwErr "relationship"
(FIComputedField _) -> throwErr "computed field"
(FIRemoteRelationship _) -> throwErr "remote relationship"
@@ -1238,7 +1241,6 @@ askComputedFieldInfo fields computedField = do
fromComputedField computedField
case fieldInfo of
(FIColumn _) -> throwErr "column"
(FINestedObject _) -> throwErr "nested object"
(FIRelationship _) -> throwErr "relationship"
(FIRemoteRelationship _) -> throwErr "remote relationship"
(FIComputedField cci) -> pure cci
@@ -1296,7 +1298,7 @@ mkAdminRolePermInfo tableInfo =
RolePermInfo (Just i) (Just s) (Just u) (Just d)
where
fields = _tciFieldInfoMap tableInfo
pgCols = map ciColumn $ getCols fields
pgCols = map structuredColumnInfoColumn $ getCols fields
pgColsWithFilter = HashMap.fromList $ map (,Nothing) pgCols
computedFields =
-- Fetch the list of computed fields not returning rows of existing table.

View File

@@ -1,7 +1,7 @@
{-# LANGUAGE OverloadedLists #-}
{-# LANGUAGE QuasiQuotes #-}
module Hasura.Backends.DataConnector.API.V0.ColumnSpec (spec, genColumnName, genColumnInfo, genColumnValueGenerationStrategy) where
module Hasura.Backends.DataConnector.API.V0.ColumnSpec (spec, genColumnName, genColumnType, genColumnInfo, genColumnValueGenerationStrategy) where
import Data.Aeson.QQ.Simple (aesonQQ)
import Hasura.Backends.DataConnector.API.V0
@@ -10,6 +10,7 @@ import Hasura.Generator.Common (defaultRange, genArbitraryAlphaNumText)
import Hasura.Prelude
import Hedgehog
import Hedgehog.Gen qualified as Gen
import Language.GraphQL.Draft.Generator (genName)
import Test.Aeson.Utils
import Test.Hspec
@@ -21,7 +22,7 @@ spec = do
describe "ColumnInfo" $ do
describe "minimal" $
testFromJSON
(ColumnInfo (ColumnName "my_column_name") (ScalarType "string") False Nothing False False Nothing)
(ColumnInfo (ColumnName "my_column_name") (ColumnTypeScalar $ ScalarType "string") False Nothing False False Nothing)
[aesonQQ|
{ "name": "my_column_name",
"type": "string",
@@ -30,7 +31,7 @@
|]
describe "non-minimal" $
testToFromJSONToSchema
(ColumnInfo (ColumnName "my_column_name") (ScalarType "number") True (Just "My column description") True True (Just AutoIncrement))
(ColumnInfo (ColumnName "my_column_name") (ColumnTypeScalar $ ScalarType "number") True (Just "My column description") True True (Just AutoIncrement))
[aesonQQ|
{ "name": "my_column_name",
"type": "number",
@@ -55,11 +56,19 @@
genColumnName :: MonadGen m => m ColumnName
genColumnName = ColumnName <$> genArbitraryAlphaNumText defaultRange
genColumnInfo :: (MonadGen m, GenBase m ~ Identity) => m ColumnInfo
genColumnType :: Gen ColumnType
genColumnType =
Gen.choice
[ ColumnTypeScalar <$> genScalarType,
ColumnTypeObject <$> genName,
ColumnTypeArray <$> genColumnType <*> Gen.bool
]
genColumnInfo :: Gen ColumnInfo
genColumnInfo =
ColumnInfo
<$> genColumnName
<*> genScalarType
<*> genColumnType
<*> Gen.bool
<*> Gen.maybe (genArbitraryAlphaNumText defaultRange)
<*> Gen.bool
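
Since `ColumnTypeArray` wraps another `ColumnType` plus a nullability flag, `genColumnType` above can produce arbitrarily nested array types (`Gen.choice` takes the recursive branch with probability 1/3, so generation terminates with probability 1). For orientation, two representative values built from the same constructors; they assume the spec module's existing `Hasura.Backends.DataConnector.API.V0` import and string-literal extensions:

```haskell
-- An array of strings; the flag's precise meaning (array vs. element
-- nullability) isn't shown in this hunk, so it is left uninterpreted here.
stringArray :: ColumnType
stringArray = ColumnTypeArray (ColumnTypeScalar (ScalarType "string")) False

-- One level deeper: an array of arrays of numbers.
nestedNumberArray :: ColumnType
nestedNumberArray =
  ColumnTypeArray (ColumnTypeArray (ColumnTypeScalar (ScalarType "number")) True) False
```
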

View File

@@ -10,7 +10,7 @@ import Data.Aeson
import Data.Aeson.QQ.Simple (aesonQQ)
import Hasura.Backends.DataConnector.API.V0
import Hasura.Backends.DataConnector.API.V0.CapabilitiesSpec (genUpdateColumnOperatorName)
import Hasura.Backends.DataConnector.API.V0.ColumnSpec (genColumnName, genColumnValueGenerationStrategy)
import Hasura.Backends.DataConnector.API.V0.ColumnSpec (genColumnName, genColumnType, genColumnValueGenerationStrategy)
import Hasura.Backends.DataConnector.API.V0.ExpressionSpec (genExpression)
import Hasura.Backends.DataConnector.API.V0.QuerySpec (genField, genFieldMap, genFieldValue)
import Hasura.Backends.DataConnector.API.V0.RelationshipsSpec (genRelationshipName, genTableRelationships)
@@ -58,7 +58,7 @@ spec = do
describe "ColumnInsert" $ do
describe "minimal" $ do
testToFromJSONToSchema
(ColumnInsert (ColumnInsertSchema (ColumnName "my_column") (ScalarType "number") True Nothing))
(ColumnInsert (ColumnInsertSchema (ColumnName "my_column") (ColumnTypeScalar $ ScalarType "number") True Nothing))
[aesonQQ|
{ "type": "column",
"column": "my_column",
@@ -67,7 +67,7 @@ spec = do
|]
describe "non-minimal" $ do
testToFromJSONToSchema
(ColumnInsert (ColumnInsertSchema (ColumnName "my_column") (ScalarType "number") True (Just UniqueIdentifier)))
(ColumnInsert (ColumnInsertSchema (ColumnName "my_column") (ColumnTypeScalar $ ScalarType "number") True (Just UniqueIdentifier)))
[aesonQQ|
{ "type": "column",
"column": "my_column",
@@ -260,7 +260,7 @@ genTableInsertSchema =
<*> Gen.maybe (Gen.nonEmpty defaultRange genColumnName)
<*> genFieldMap genInsertFieldSchema
genInsertFieldSchema :: (MonadGen m, GenBase m ~ Identity) => m InsertFieldSchema
genInsertFieldSchema :: Gen InsertFieldSchema
genInsertFieldSchema =
Gen.choice
[ ColumnInsert <$> genColumnInsertSchema,
@@ -268,11 +268,11 @@ genInsertFieldSchema =
ArrayRelationInsert <$> genArrayRelationInsertSchema
]
genColumnInsertSchema :: (MonadGen m, GenBase m ~ Identity) => m ColumnInsertSchema
genColumnInsertSchema :: Gen ColumnInsertSchema
genColumnInsertSchema =
ColumnInsertSchema
<$> genColumnName
<*> genScalarType
<*> genColumnType
<*> Gen.bool
<*> Gen.maybe genColumnValueGenerationStrategy

View File

@@ -19,7 +19,7 @@ spec = do
testToFromJSONToSchema (SchemaResponse [] [] Nothing) [aesonQQ|{"tables": []}|]
jsonOpenApiProperties genSchemaResponse
genSchemaResponse :: (MonadGen m, GenBase m ~ Identity) => m SchemaResponse
genSchemaResponse :: Gen SchemaResponse
genSchemaResponse = do
tables <- Gen.list defaultRange genTableInfo
pure $ SchemaResponse tables [] Nothing

View File

@@ -34,7 +34,7 @@ spec = do
( TableInfo
(TableName ["my_table_name"])
View
[ColumnInfo (ColumnName "id") (ScalarType "string") False Nothing False False Nothing]
[ColumnInfo (ColumnName "id") (ColumnTypeScalar $ ScalarType "string") False Nothing False False Nothing]
(Just $ ColumnName "id" :| [])
(ForeignKeys mempty)
(Just "my description")
@@ -58,7 +58,7 @@ spec = do
( TableInfo
(TableName ["my_table_name"])
Table
[ColumnInfo (ColumnName "id") (ScalarType "string") False Nothing False False Nothing]
[ColumnInfo (ColumnName "id") (ColumnTypeScalar $ ScalarType "string") False Nothing False False Nothing]
(Just $ ColumnName "id" :| [])
(ForeignKeys $ HashMap.singleton (ConstraintName "Artist") (Constraint (TableName ["artist_table"]) (HashMap.singleton (ColumnName "ArtistId") (ColumnName "ArtistId"))))
(Just "my description")
@@ -105,7 +105,7 @@ genTableType :: MonadGen m => m TableType
genTableType = Gen.enumBounded
-- | Note: this generator is intended for serialization tests only and does not ensure valid Foreign Key Constraints.
genTableInfo :: (MonadGen m, GenBase m ~ Identity) => m TableInfo
genTableInfo :: Gen TableInfo
genTableInfo =
TableInfo
<$> genTableName

View File

@@ -27,7 +27,7 @@ import Hasura.RQL.IR.Root (RemoteRelationshipField)
import Hasura.RQL.IR.Update (AnnotatedUpdateG (..))
import Hasura.RQL.IR.Value (UnpreparedValue (..))
import Hasura.RQL.Types.BackendType (BackendType (Postgres), PostgresKind (Vanilla))
import Hasura.RQL.Types.Column (ColumnInfo (..), ColumnMutability (..), ColumnType (..))
import Hasura.RQL.Types.Column (ColumnInfo (..), ColumnMutability (..), ColumnType (..), StructuredColumnInfo (..))
import Hasura.RQL.Types.Common (Comment (..), FieldName (..), OID (..))
import Hasura.RQL.Types.Instances ()
import Hasura.RQL.Types.Permission (AllowedRootFields (..))
@@ -181,7 +182,7 @@ buildTableInfo TableInfoBuilder {..} = tableInfo
$ columns
toCIHashPair :: ColumnInfoBuilder -> (FieldName, FieldInfo PG)
toCIHashPair cib = (coerce $ cibName cib, FIColumn $ mkColumnInfo cib)
toCIHashPair cib = (coerce $ cibName cib, FIColumn $ SCIScalarColumn $ mkColumnInfo cib)
toRelHashPair :: RelInfo PG -> (FieldName, FieldInfo PG)
toRelHashPair ri = (fromRel $ riName ri, FIRelationship ri)