GDC-189 custom aggregations


## Description

This PR allows DC agents to define custom aggregate functions for their scalar types.

### Related Issues

GDC-189

### Solution and Design

We added a new `aggregate_functions` property to the scalar type capabilities. This lets an agent author specify the set of aggregate functions supported by each scalar type, along with each function's result type.

During GraphQL schema generation, the custom aggregate functions are exposed via a new method, `getCustomAggregateOperators`, on the `Backend` type class.
Custom functions are merged with the built-in aggregate functions when building the GraphQL schema for table aggregate fields and for `order_by` operators on array relations.
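For illustration, a minimal sketch of how an agent might declare these capabilities. The type shapes are inlined stand-ins for the generated `@hasura/dc-api-types` definitions so the snippet stands alone; the `DateTime` scalar and its `max`/`min` functions mirror the reference agent changes in this PR:

```typescript
// Inlined stand-ins for the @hasura/dc-api-types shapes (assumption:
// AggregateFunctions maps aggregate function names to result type names).
type ScalarType = string;
type AggregateFunctions = Record<string, ScalarType>;
type ScalarTypeCapabilities = {
  comparison_type?: string;
  aggregate_functions?: AggregateFunctions;
};

// A DateTime scalar supporting custom max/min aggregates,
// both of which return DateTime values.
const dateTimeCapabilities: ScalarTypeCapabilities = {
  comparison_type: 'DateTimeComparisons',
  aggregate_functions: {
    max: 'DateTime',
    min: 'DateTime',
  },
};

console.log(dateTimeCapabilities.aggregate_functions?.max); // → DateTime
```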

### Steps to test and verify

- Codec tests for the aggregate function capabilities have been added to the unit tests.
- Some custom aggregate operators have been added to the reference agent and are exercised by a new test in `api-tests`.

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/6199
GitOrigin-RevId: e9c0d1617af93847c1493671fdbb794f573bde0c
This commit is contained in:
David Overton 2022-10-27 11:42:49 +11:00 committed by hasura-bot
parent 78e0480bb2
commit 9921823915
43 changed files with 461 additions and 202 deletions


@ -187,10 +187,14 @@ It is the agent's responsibility to validate the values provided as GraphQL inpu
Custom scalar types are declared by adding a property to the `scalar_types` section of the [capabilities](#capabilities-and-configuration-schema) and
by adding scalar type declaration with the same name in the `graphql_schema` capabilities property.
Custom comparison types can be defined by adding a `comparisonType` property to the scalar type capabilities object.
The `comparisonType` property gives the name of a GraphQL input object type, which must be defined in the `graphql_schema` capabilities property.
The input object type will be spliced into the `where` argument for any columns of the scalar type in the GraphQL schema.
Custom aggregate functions can be defined by adding an `aggregate_functions` property to the scalar type capabilities object.
The `aggregate_functions` property must be an object mapping aggregate function names to their result types.
Example:
```yaml
@ -204,12 +208,15 @@ capabilities:
scalar_types:
DateTime:
comparisonType: DateTimeComparisons
aggregate_functions:
max: 'DateTime'
min: 'DateTime'
```
This example declares a custom scalar type `DateTime`, with comparison operators defined by the GraphQL input object type `DateTimeComparisons`.
The input type `DateTimeComparisons` defines one comparison operator `in_year`, which takes a `Number` argument.
An example GraphQL query using this custom operator might look like below:
An example GraphQL query using the custom comparison operator might look like below:
```graphql
query MyQuery {
Employee(where: {BirthDate: {in_year: 1962}}) {
@ -221,6 +228,8 @@ query MyQuery {
In this query we have an `Employee` field with a `BirthDate` property of type `DateTime`.
The `in_year` custom comparison operator is being used to request all employees with a birth date in the year 1962.
The example also defines two aggregate functions `min` and `max`, both of which have a result type of `DateTime`.
### Schema
The `GET /schema` endpoint is called whenever the metadata is (re)loaded by `graphql-engine`. It returns the following JSON object:


@ -1,6 +1,6 @@
{
"name": "@hasura/dc-api-types",
"version": "0.12.0",
"version": "0.13.0",
"description": "Hasura GraphQL Engine Data Connector Agent API types",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",


@ -371,9 +371,35 @@
"description": "A valid GraphQL name",
"type": "string"
},
"ScalarType": {
"additionalProperties": true,
"anyOf": [
{
"enum": [
"string",
"number",
"bool"
],
"type": "string"
},
{
"type": "string"
}
]
},
"AggregateFunctions": {
"additionalProperties": {
"$ref": "#/components/schemas/ScalarType"
},
"description": "A map from aggregate function names to their result types.\nFunction and result type names must be valid GraphQL names.\nResult type names must be defined scalar types - either builtin or declared in ScalarTypesCapabilities.\n",
"type": "object"
},
"ScalarTypeCapabilities": {
"description": "Capabilities of a scalar type.\ncomparison_type: Name of the GraphQL input object to be used for comparison operations on the scalar type. The input object type must be defined in the `graphql_schema`.\n",
"description": "Capabilities of a scalar type.\ncomparison_type: Name of the GraphQL input object to be used for comparison operations on the scalar type. The input object type must be defined in the `graphql_schema`.\naggregate_functions: The aggregate functions supported by the scalar type.\n",
"properties": {
"aggregate_functions": {
"$ref": "#/components/schemas/AggregateFunctions"
},
"comparison_type": {
"$ref": "#/components/schemas/GraphQLName"
}
@ -760,22 +786,6 @@
},
"type": "array"
},
"ScalarType": {
"additionalProperties": true,
"anyOf": [
{
"enum": [
"string",
"number",
"bool"
],
"type": "string"
},
{
"type": "string"
}
]
},
"ColumnInfo": {
"properties": {
"description": {
@ -1102,16 +1112,7 @@
"type": "object"
},
"SingleColumnAggregateFunction": {
"enum": [
"avg",
"max",
"min",
"stddev_pop",
"stddev_samp",
"sum",
"var_pop",
"var_samp"
],
"description": "Single column aggregate function name.\nA valid GraphQL name",
"type": "string"
},
"SingleColumnAggregate": {


@ -3,6 +3,7 @@
/* eslint-disable */
export type { Aggregate } from './models/Aggregate';
export type { AggregateFunctions } from './models/AggregateFunctions';
export type { AndExpression } from './models/AndExpression';
export type { AnotherColumnComparison } from './models/AnotherColumnComparison';
export type { ApplyBinaryArrayComparisonOperator } from './models/ApplyBinaryArrayComparisonOperator';


@ -0,0 +1,13 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { ScalarType } from './ScalarType';
/**
* A map from aggregate function names to their result types.
* Function and result type names must be valid GraphQL names.
* Result type names must be defined scalar types - either builtin or declared in ScalarTypesCapabilities.
*
*/
export type AggregateFunctions = Record<string, ScalarType>;
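As a usage sketch of this generated type (with `ScalarType` inlined as a plain string alias for self-containment), a record of this shape pairs each custom function name with its result type. The `longest`/`shortest` names below mirror the reference agent's string aggregates added in this PR:

```typescript
// Stand-in for the generated ScalarType alias (assumption: builtin names
// like 'string'/'number'/'bool' plus arbitrary custom scalar type names).
type ScalarType = string;
type AggregateFunctions = Record<string, ScalarType>;

// Custom aggregates over string columns; both return a 'string' result.
const stringAggregates: AggregateFunctions = {
  longest: 'string',
  shortest: 'string',
};

console.log(Object.keys(stringAggregates)); // → [ 'longest', 'shortest' ]
```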


@ -2,14 +2,17 @@
/* tslint:disable */
/* eslint-disable */
import type { AggregateFunctions } from './AggregateFunctions';
import type { GraphQLName } from './GraphQLName';
/**
* Capabilities of a scalar type.
* comparison_type: Name of the GraphQL input object to be used for comparison operations on the scalar type. The input object type must be defined in the `graphql_schema`.
* aggregate_functions: The aggregate functions supported by the scalar type.
*
*/
export type ScalarTypeCapabilities = {
aggregate_functions?: AggregateFunctions;
comparison_type?: GraphQLName;
};


@ -2,4 +2,8 @@
/* tslint:disable */
/* eslint-disable */
export type SingleColumnAggregateFunction = 'avg' | 'max' | 'min' | 'stddev_pop' | 'stddev_samp' | 'sum' | 'var_pop' | 'var_samp';
/**
* Single column aggregate function name.
* A valid GraphQL name
*/
export type SingleColumnAggregateFunction = string;


@ -24,7 +24,7 @@
},
"dc-api-types": {
"name": "@hasura/dc-api-types",
"version": "0.12.0",
"version": "0.13.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
@ -631,7 +631,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -1389,7 +1389,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify": "^4.4.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -3122,7 +3122,7 @@
"version": "file:reference",
"requires": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/xml2js": "^0.4.11",
@ -3613,7 +3613,7 @@
"version": "file:sqlite",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/sqlite3": "^3.1.8",


@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -44,7 +44,7 @@
}
},
"node_modules/@hasura/dc-api-types": {
"version": "0.12.0",
"version": "0.13.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",


@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",


@ -7,14 +7,27 @@ const schemaDoc: string =
input DateTimeComparisons {
same_day_as: DateTime
in_year: Int
}`
}
`
const dateTimeCapabilities: ScalarTypeCapabilities = {
comparison_type: 'DateTimeComparisons'
comparison_type: 'DateTimeComparisons',
aggregate_functions: {
max: 'DateTime',
min: 'DateTime'
}
}
const stringCapabilities: ScalarTypeCapabilities = {
aggregate_functions: {
longest: 'string',
shortest: 'string'
}
}
const scalarTypes: ScalarTypesCapabilities = {
DateTime: dateTimeCapabilities
DateTime: dateTimeCapabilities,
string: stringCapabilities
}
const capabilities: Capabilities = {


@ -479,6 +479,10 @@ const isComparableArray = (values: ScalarValue[]): values is (number | string)[]
return values.every(v => typeof v === "number" || typeof v === "string");
};
const isStringArray = (values: ScalarValue[]): values is string[] => {
return values.every(v => typeof v === "string");
};
const singleColumnAggregateFunction = (aggregate: SingleColumnAggregate) => (rows: Record<string, ScalarValue>[]): ScalarValue => {
const values = rows.map(row => row[aggregate.column]).filter((v): v is Exclude<ScalarValue, null> => v !== null);
if (values.length === 0)
@ -492,6 +496,13 @@ const singleColumnAggregateFunction = (aggregate: SingleColumnAggregate) => (row
case "min": return values.reduce((prev, curr) => prev < curr ? prev : curr);
}
if (isStringArray(values)) {
switch (aggregate.function) {
case "longest": return values.reduce((prev, curr) => prev.length > curr.length ? prev : curr);
case "shortest": return values.reduce((prev, curr) => prev.length < curr.length ? prev : curr);
}
}
if (!isNumberArray(values)) {
throw new Error(`Found non-numeric scalar values when computing ${aggregate.function}`);
}
@ -500,11 +511,13 @@ const singleColumnAggregateFunction = (aggregate: SingleColumnAggregate) => (row
return math.mean(values);
case "stddev_pop": return math.std(values, "uncorrected");
case "stddev_samp": return math.std(values, "unbiased");
case "stddev": return math.std(values, "unbiased");
case "sum": return math.sum(values);
case "var_pop": return math.variance(values, "uncorrected");
case "var_samp": return math.variance(values, "unbiased");
case "variance": return math.variance(values, "unbiased");
default:
return unreachable(aggregate.function);
return unknownAggregateFunction(aggregate.function);
}
};
@ -561,6 +574,8 @@ export const queryData = (getTable: (tableName: TableName) => Record<string, Sca
const unknownOperator = (x: string): never => { throw new Error(`Unknown operator: ${x}`) };
const unknownAggregateFunction = (x: string): never => { throw new Error(`Unknown aggregate function: ${x}`) };
const expectedString = (x: string): never => { throw new Error(`Expected string value but got ${x}`) };
const expectedNumber = (x: string): never => { throw new Error(`Expected number value but got ${x}`) };


@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify": "^4.4.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -54,7 +54,7 @@
"license": "MIT"
},
"node_modules/@hasura/dc-api-types": {
"version": "0.12.0",
"version": "0.13.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",


@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.12.0",
"@hasura/dc-api-types": "0.13.0",
"fastify-metrics": "^9.2.1",
"fastify": "^4.4.0",
"nanoid": "^3.3.4",


@ -12,6 +12,7 @@ executable api-tests
, bytestring
, dc-api
, fast-logger
, graphql-parser
, hasura-prelude
, hspec
, http-conduit


@ -19,7 +19,7 @@ import Harness.TestEnvironment (TestEnvironment)
import Harness.TestEnvironment qualified as TE
import Harness.Yaml (shouldReturnYaml)
import Hasura.Prelude
import Test.Hspec (SpecWith, describe, it)
import Test.Hspec (SpecWith, describe, it, pendingWith)
spec :: SpecWith TestEnvironment
spec =
@ -447,3 +447,44 @@ tests opts = describe "Aggregate Query Tests" $ do
aggregate:
count: 14
|]
it "works for custom aggregate functions" $ \(testEnvironment, _) -> do
when (TE.backendType testEnvironment == Just Fixture.DataConnectorSqlite) do
pendingWith "SQLite DataConnector does not support 'longest' and 'shortest' custom aggregate functions"
shouldReturnYaml
opts
( GraphqlEngine.postGraphql
testEnvironment
[graphql|
query MyQuery {
Album_aggregate {
aggregate {
longest {
Title
}
shortest {
Title
}
max {
Title
}
min {
Title
}
}
}
}
|]
)
[yaml|
"data":
"Album_aggregate":
"aggregate":
"longest":
"Title": "Tchaikovsky: 1812 Festival Overture, Op.49, Capriccio Italien & Beethoven: Wellington's Victory"
"shortest":
"Title": "IV"
"max":
"Title": "[1997] Black Light Syndrome"
"min":
"Title": "...And Justice For All"
|]


@ -18,6 +18,7 @@ import Harness.Test.Fixture qualified as Fixture
import Harness.TestEnvironment (TestEnvironment)
import Hasura.Backends.DataConnector.API qualified as API
import Hasura.Prelude
import Language.GraphQL.Draft.Syntax.QQ qualified as G
import Test.Hspec (SpecWith, describe, it)
--------------------------------------------------------------------------------
@ -317,8 +318,8 @@ tests opts = describe "Aggregate Query Tests" $ do
HashMap.fromList
[ (API.FieldName "counts_count", API.StarCount),
(API.FieldName "counts_uniqueBillingCountries", API.ColumnCount (API.ColumnCountAggregate (API.ColumnName "BillingCountry") True)),
(API.FieldName "ids_minimum_Id", API.SingleColumn (API.SingleColumnAggregate API.Min (API.ColumnName "InvoiceId"))),
(API.FieldName "ids_max_InvoiceId", API.SingleColumn (API.SingleColumnAggregate API.Max (API.ColumnName "InvoiceId")))
(API.FieldName "ids_minimum_Id", API.SingleColumn (singleColumnAggregateMin (API.ColumnName "InvoiceId"))),
(API.FieldName "ids_max_InvoiceId", API.SingleColumn (singleColumnAggregateMax (API.ColumnName "InvoiceId")))
],
_qLimit = Just 2,
_qOffset = Nothing,
@ -337,3 +338,9 @@ aggregatesResponse aggregates = API.QueryResponse Nothing (Just $ HashMap.fromLi
aggregatesAndRowsResponse :: [(API.FieldName, Aeson.Value)] -> [[(API.FieldName, API.FieldValue)]] -> API.QueryResponse
aggregatesAndRowsResponse aggregates rows = API.QueryResponse (Just $ HashMap.fromList <$> rows) (Just $ HashMap.fromList aggregates)
singleColumnAggregateMax :: API.ColumnName -> API.SingleColumnAggregate
singleColumnAggregateMax = API.SingleColumnAggregate $ API.SingleColumnAggregateFunction [G.name|max|]
singleColumnAggregateMin :: API.ColumnName -> API.SingleColumnAggregate
singleColumnAggregateMin = API.SingleColumnAggregate $ API.SingleColumnAggregateFunction [G.name|min|]


@ -86,6 +86,8 @@ library
Hasura.Backends.DataConnector.API.V0.Scalar
Hasura.Backends.DataConnector.API.V0.Schema
Hasura.Backends.DataConnector.API.V0.Table
other-modules:
Hasura.Backends.DataConnector.API.V0.Name
test-suite tests-dc-api
import: common-all


@ -16,13 +16,15 @@ import Data.HashMap.Strict qualified as HashMap
import Data.OpenApi (ToSchema)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.Column qualified as API.V0
import Hasura.Backends.DataConnector.API.V0.Name (nameCodec)
import Language.GraphQL.Draft.Syntax qualified as GQL
import Prelude
data SingleColumnAggregate = SingleColumnAggregate
{ _scaFunction :: SingleColumnAggregateFunction,
_scaColumn :: API.V0.ColumnName
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
singleColumnAggregateObjectCodec :: JSONObjectCodec SingleColumnAggregate
singleColumnAggregateObjectCodec =
@ -30,31 +32,15 @@ singleColumnAggregateObjectCodec =
<$> requiredField "function" "The aggregation function" .= _scaFunction
<*> requiredField "column" "The column to apply the aggregation function to" .= _scaColumn
data SingleColumnAggregateFunction
= Average
| Max
| Min
| StandardDeviationPopulation
| StandardDeviationSample
| Sum
| VariancePopulation
| VarianceSample
deriving stock (Eq, Ord, Show, Generic, Data, Enum, Bounded)
newtype SingleColumnAggregateFunction = SingleColumnAggregateFunction {unSingleColumnAggregateFunction :: GQL.Name}
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec SingleColumnAggregateFunction
instance HasCodec SingleColumnAggregateFunction where
codec =
named "SingleColumnAggregateFunction" $
stringConstCodec
[ (Average, "avg"),
(Max, "max"),
(Min, "min"),
(StandardDeviationPopulation, "stddev_pop"),
(StandardDeviationSample, "stddev_samp"),
(Sum, "sum"),
(VariancePopulation, "var_pop"),
(VarianceSample, "var_samp")
]
dimapCodec SingleColumnAggregateFunction unSingleColumnAggregateFunction nameCodec
<?> "Single column aggregate function name."
data ColumnCountAggregate = ColumnCountAggregate
{ _ccaColumn :: API.V0.ColumnName,
@ -72,7 +58,7 @@ data Aggregate
= SingleColumn SingleColumnAggregate
| ColumnCount ColumnCountAggregate
| StarCount
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Aggregate
instance HasCodec Aggregate where


@ -13,6 +13,7 @@ module Hasura.Backends.DataConnector.API.V0.Capabilities
QueryCapabilities (..),
MutationCapabilities (..),
SubscriptionCapabilities (..),
AggregateFunctions (..),
ScalarTypeCapabilities (..),
ScalarTypesCapabilities (..),
GraphQLTypeDefinitions,
@ -52,6 +53,8 @@ import Data.Text.Lazy (toStrict)
import Data.Text.Lazy.Builder qualified as Builder
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.ConfigSchema (ConfigSchemaResponse)
import Hasura.Backends.DataConnector.API.V0.Name (nameCodec)
import Hasura.Backends.DataConnector.API.V0.Scalar (ScalarType (..))
import Language.GraphQL.Draft.Parser qualified as GQL.Parser
import Language.GraphQL.Draft.Printer qualified as GQL.Printer
import Language.GraphQL.Draft.Syntax qualified as GQL.Syntax
@ -167,18 +170,25 @@ data RelationshipCapabilities = RelationshipCapabilities {}
instance HasCodec RelationshipCapabilities where
codec = object "RelationshipCapabilities" $ pure RelationshipCapabilities
nameCodec :: JSONCodec GQL.Syntax.Name
nameCodec =
bimapCodec
parseName
GQL.Syntax.unName
(StringCodec (Just "GraphQLName"))
<?> "A valid GraphQL name"
where
parseName text = maybe (Left $ Text.unpack text <> " is not a valid GraphQL name") pure $ GQL.Syntax.mkName text
newtype AggregateFunctions = AggregateFunctions
{ unAggregateFunctions :: HashMap GQL.Syntax.Name ScalarType
}
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec AggregateFunctions
instance HasCodec AggregateFunctions where
codec =
named "AggregateFunctions" $
dimapCodec AggregateFunctions unAggregateFunctions (hashMapCodec codec)
<??> [ "A map from aggregate function names to their result types.",
"Function and result type names must be valid GraphQL names.",
"Result type names must be defined scalar types - either builtin or declared in ScalarTypesCapabilities."
]
data ScalarTypeCapabilities = ScalarTypeCapabilities
{ _stcComparisonInputObject :: Maybe GQL.Syntax.Name
{ _stcComparisonInputObject :: Maybe GQL.Syntax.Name,
_stcAggregateFunctions :: Maybe AggregateFunctions
}
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
@ -190,16 +200,19 @@ instance HasCodec ScalarTypeCapabilities where
"ScalarTypeCapabilities"
( ScalarTypeCapabilities
<$> optionalFieldWith' "comparison_type" nameCodec .= _stcComparisonInputObject
<*> optionalField' "aggregate_functions" .= _stcAggregateFunctions
)
<??> [ "Capabilities of a scalar type.",
"comparison_type: Name of the GraphQL input object to be used for comparison operations on the scalar type. The input object type must be defined in the `graphql_schema`."
"comparison_type: Name of the GraphQL input object to be used for comparison operations on the scalar type. The input object type must be defined in the `graphql_schema`.",
"aggregate_functions: The aggregate functions supported by the scalar type."
]
newtype ScalarTypesCapabilities = ScalarTypesCapabilities
{ unScalarTypesCapabilities :: HashMap GQL.Syntax.Name ScalarTypeCapabilities
{ unScalarTypesCapabilities :: HashMap ScalarType ScalarTypeCapabilities
}
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving newtype (Semigroup, Monoid)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ScalarTypesCapabilities
instance HasCodec ScalarTypesCapabilities where
@ -353,7 +366,7 @@ instance ToSchema CapabilitiesResponse where
pure $ NamedSchema (Just "CapabilitiesResponse") schema
lookupComparisonInputObjectDefinition :: Capabilities -> GQL.Syntax.Name -> Maybe (GQL.Syntax.InputObjectTypeDefinition GQL.Syntax.InputValueDefinition)
lookupComparisonInputObjectDefinition :: Capabilities -> ScalarType -> Maybe (GQL.Syntax.InputObjectTypeDefinition GQL.Syntax.InputValueDefinition)
lookupComparisonInputObjectDefinition Capabilities {..} typeName = do
scalarTypesMap <- _cScalarTypes
ScalarTypeCapabilities {..} <- HashMap.lookup typeName $ unScalarTypesCapabilities scalarTypesMap


@ -46,7 +46,7 @@ data ColumnInfo = ColumnInfo
_ciNullable :: Bool,
_ciDescription :: Maybe Text
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ColumnInfo


@ -128,7 +128,7 @@ data Expression
-- 'UnaryComparisonOperator'; the result of this application will return "true" or
-- "false" depending on the 'UnaryComparisonOperator' that's being applied.
ApplyUnaryComparisonOperator UnaryComparisonOperator ComparisonColumn
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving anyclass (Hashable, NFData)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Expression
@ -235,7 +235,7 @@ data ComparisonColumn = ComparisonColumn
-- | The scalar type of the column
_ccColumnType :: API.V0.ScalarType
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ComparisonColumn
deriving anyclass (Hashable, NFData)
@ -283,7 +283,7 @@ data ComparisonValue
= -- | Allows a comparison to a column on the current table or another table
AnotherColumn ComparisonColumn
| ScalarValue Value API.V0.ScalarType
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving anyclass (Hashable, NFData)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ComparisonValue


@ -0,0 +1,19 @@
{-# OPTIONS_GHC -Wno-unrecognised-pragmas #-}
{-# HLINT ignore "Use onNothing" #-}
module Hasura.Backends.DataConnector.API.V0.Name (nameCodec) where
import Autodocodec
import Data.Text qualified as Text
import Language.GraphQL.Draft.Syntax qualified as GQL
nameCodec :: JSONCodec GQL.Name
nameCodec =
bimapCodec
parseName
GQL.unName
(StringCodec (Just "GraphQLName"))
<?> "A valid GraphQL name"
where
parseName text = maybe (Left $ Text.unpack text <> " is not a valid GraphQL name") pure $ GQL.mkName text


@ -33,7 +33,7 @@ data OrderBy = OrderBy
{ _obRelations :: HashMap API.V0.RelationshipName OrderByRelation,
_obElements :: NonEmpty OrderByElement
}
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec OrderBy
instance HasCodec OrderBy where
@ -47,7 +47,7 @@ data OrderByRelation = OrderByRelation
{ _obrWhere :: Maybe API.V0.Expression,
_obrSubrelations :: HashMap API.V0.RelationshipName OrderByRelation
}
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving anyclass (Hashable, NFData)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec OrderByRelation
@ -64,7 +64,7 @@ data OrderByElement = OrderByElement
_obeTarget :: OrderByTarget,
_obeOrderDirection :: OrderDirection
}
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec OrderByElement
instance HasCodec OrderByElement where
@ -79,7 +79,7 @@ data OrderByTarget
= OrderByColumn API.V0.ColumnName
| OrderByStarCountAggregate
| OrderBySingleColumnAggregate API.V0.SingleColumnAggregate
deriving stock (Data, Eq, Generic, Ord, Show)
deriving stock (Eq, Generic, Ord, Show)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec OrderByTarget
instance HasCodec OrderByTarget where


@ -63,7 +63,7 @@ data QueryRequest = QueryRequest
_qrTableRelationships :: [API.V0.TableRelationships],
_qrQuery :: Query
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec QueryRequest
instance HasCodec QueryRequest where
@ -96,7 +96,7 @@ data Query = Query
-- | Optionally order the results by the value of one or more fields.
_qOrderBy :: Maybe API.V0.OrderBy
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Query
instance HasCodec Query where
@ -126,7 +126,7 @@ data RelationshipField = RelationshipField
{ _rfRelationship :: API.V0.RelationshipName,
_rfQuery :: Query
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
relationshipFieldObjectCodec :: JSONObjectCodec RelationshipField
relationshipFieldObjectCodec =
@ -145,7 +145,7 @@ relationshipFieldObjectCodec =
data Field
= ColumnField API.V0.ColumnName API.V0.ScalarType
| RelField RelationshipField
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Field
instance HasCodec Field where


@ -12,8 +12,9 @@ where
import Autodocodec
import Autodocodec.OpenAPI ()
import Control.DeepSeq (NFData)
import Data.Aeson (FromJSON, ToJSON, ToJSONKey)
import Data.Data (Data)
import Data.Aeson (FromJSON, FromJSONKey (..), FromJSONKeyFunction (..), ToJSON, ToJSONKey (..), ToJSONKeyFunction (..))
import Data.Aeson.Encoding qualified as Encoding
import Data.Aeson.Key qualified as Key
import Data.Hashable (Hashable)
import Data.OpenApi (ToSchema)
import Data.Text (Text)
@ -27,10 +28,30 @@ data ScalarType
| NumberTy
| BoolTy
| CustomTy {getCustomTy :: Text}
deriving stock (Data, Eq, Generic, Ord, Show)
deriving anyclass (Hashable, NFData, ToJSONKey)
deriving stock (Eq, Generic, Ord, Show)
deriving anyclass (Hashable, NFData)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec ScalarType
instance ToJSONKey ScalarType where
toJSONKey = ToJSONKeyText (Key.fromText . scalarTypeToText) (Encoding.text . scalarTypeToText)
instance FromJSONKey ScalarType where
fromJSONKey = FromJSONKeyText textToScalarType
scalarTypeToText :: ScalarType -> Text
scalarTypeToText = \case
StringTy -> "string"
NumberTy -> "number"
BoolTy -> "bool"
CustomTy t -> t
textToScalarType :: Text -> ScalarType
textToScalarType t
| t == "string" = StringTy
| t == "number" = NumberTy
| t == "bool" = BoolTy
| otherwise = CustomTy t
instance HasCodec ScalarType where
codec =
named "ScalarType" $


@ -8,7 +8,6 @@ where
import Autodocodec
import Control.DeepSeq (NFData)
import Data.Aeson (FromJSON, ToJSON)
import Data.Data (Data)
import Data.Hashable (Hashable)
import Data.OpenApi (ToSchema)
import GHC.Generics (Generic)
@ -24,7 +23,7 @@ import Prelude
newtype SchemaResponse = SchemaResponse
{ _srTables :: [API.V0.TableInfo]
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec SchemaResponse


@ -61,7 +61,7 @@ data TableInfo = TableInfo
_tiForeignKeys :: ForeignKeys,
_tiDescription :: Maybe Text
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving stock (Eq, Ord, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec TableInfo


@ -1,3 +1,5 @@
{-# LANGUAGE QuasiQuotes #-}
module Test.QuerySpec.AggregatesSpec (spec) where
import Control.Arrow ((>>>))
@ -12,6 +14,7 @@ import Data.List.NonEmpty qualified as NonEmpty
import Data.Maybe (fromMaybe, isJust, mapMaybe)
import Data.Ord (Down (..))
import Hasura.Backends.DataConnector.API
import Language.GraphQL.Draft.Syntax.QQ qualified as G
import Servant.API (NamedRoutes)
import Servant.Client (Client)
import Test.Data (TestData (..), guardedQuery)
@ -119,7 +122,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
describe "Single Column Function" $ do
it "can get the max total from all rows" $ do
let aggregates = Data.mkFieldsMap [("max", SingleColumn $ SingleColumnAggregate Max (_tdColumnName "Total"))]
let aggregates = Data.mkFieldsMap [("max", singleColumnAggregateMax (_tdColumnName "Total"))]
let queryRequest = invoicesQueryRequest aggregates
response <- guardedQuery api sourceName config queryRequest
@ -132,7 +135,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
it "can get the max total from all rows, after applying pagination, filtering and ordering" $ do
let where' = ApplyBinaryComparisonOperator Equal (_tdCurrentComparisonColumn "BillingCountry" _tdStringType) (ScalarValue (String "USA") _tdStringType)
let orderBy = OrderBy mempty $ _tdOrderByColumn [] "BillingPostalCode" Descending :| [_tdOrderByColumn [] "InvoiceId" Ascending]
let aggregates = Data.mkFieldsMap [("max", SingleColumn $ SingleColumnAggregate Max (_tdColumnName "Total"))]
let aggregates = Data.mkFieldsMap [("max", singleColumnAggregateMax (_tdColumnName "Total"))]
let queryRequest = invoicesQueryRequest aggregates & qrQuery %~ (qLimit ?~ 20 >>> qWhere ?~ where' >>> qOrderBy ?~ orderBy)
response <- guardedQuery api sourceName config queryRequest
@ -152,8 +155,8 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
it "can get the min and max of a non-numeric comparable type such as a string" $ do
let aggregates =
Data.mkFieldsMap
[ ("min", SingleColumn $ SingleColumnAggregate Min (_tdColumnName "Name")),
("max", SingleColumn $ SingleColumnAggregate Max (_tdColumnName "Name"))
[ ("min", singleColumnAggregateMin (_tdColumnName "Name")),
("max", singleColumnAggregateMax (_tdColumnName "Name"))
]
let queryRequest = artistsQueryRequest aggregates
response <- guardedQuery api sourceName config queryRequest
@ -170,7 +173,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
it "aggregates over empty row lists results in nulls" $ do
let where' = ApplyBinaryComparisonOperator LessThan (_tdCurrentComparisonColumn "ArtistId" _tdIntType) (ScalarValue (Number 0) _tdIntType)
let aggregates = Data.mkFieldsMap [("min", SingleColumn $ SingleColumnAggregate Min (_tdColumnName "Name"))]
let aggregates = Data.mkFieldsMap [("min", singleColumnAggregateMin (_tdColumnName "Name"))]
let queryRequest = artistsQueryRequest aggregates & qrQuery . qWhere ?~ where'
response <- guardedQuery api sourceName config queryRequest
@ -185,7 +188,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
Data.mkFieldsMap
[ ("count", StarCount),
("distinctBillingStates", ColumnCount $ ColumnCountAggregate (_tdColumnName "BillingState") True),
("maxTotal", SingleColumn $ SingleColumnAggregate Max (_tdColumnName "Total"))
("maxTotal", singleColumnAggregateMax (_tdColumnName "Total"))
]
let queryRequest = invoicesQueryRequest aggregates
response <- guardedQuery api sourceName config queryRequest
@ -207,8 +210,8 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
it "can reuse the same aggregate twice" $ do
let aggregates =
Data.mkFieldsMap
[ ("minInvoiceId", SingleColumn $ SingleColumnAggregate Min (_tdColumnName "InvoiceId")),
("minTotal", SingleColumn $ SingleColumnAggregate Min (_tdColumnName "Total"))
[ ("minInvoiceId", singleColumnAggregateMin (_tdColumnName "InvoiceId")),
("minTotal", singleColumnAggregateMin (_tdColumnName "Total"))
]
let queryRequest = invoicesQueryRequest aggregates
response <- guardedQuery api sourceName config queryRequest
@ -233,7 +236,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
]
let where' = ApplyBinaryComparisonOperator Equal (_tdCurrentComparisonColumn "BillingCountry" _tdStringType) (ScalarValue (String "Canada") _tdStringType)
let orderBy = OrderBy mempty $ _tdOrderByColumn [] "BillingAddress" Ascending :| [_tdOrderByColumn [] "InvoiceId" Ascending]
let aggregates = Data.mkFieldsMap [("min", SingleColumn $ SingleColumnAggregate Min (_tdColumnName "Total"))]
let aggregates = Data.mkFieldsMap [("min", singleColumnAggregateMin (_tdColumnName "Total"))]
let queryRequest = invoicesQueryRequest aggregates & qrQuery %~ (qFields ?~ fields >>> qLimit ?~ 30 >>> qWhere ?~ where' >>> qOrderBy ?~ orderBy)
response <- guardedQuery api sourceName config queryRequest
@ -408,7 +411,7 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
-- @
deeplyNestedArtistsQuery :: QueryRequest
deeplyNestedArtistsQuery =
let invoiceLinesAggregates = Data.mkFieldsMap [("aggregate_sum_Quantity", SingleColumn $ SingleColumnAggregate Sum (_tdColumnName "Quantity"))]
let invoiceLinesAggregates = Data.mkFieldsMap [("aggregate_sum_Quantity", singleColumnAggregateSum (_tdColumnName "Quantity"))]
invoiceLinesSubquery = Data.emptyQuery & qAggregates ?~ invoiceLinesAggregates
mediaTypeFields = Data.mkFieldsMap [("Name", _tdColumnField "Name" _tdStringType)]
mediaTypeSubquery = Data.emptyQuery & qFields ?~ mediaTypeFields
@ -465,3 +468,12 @@ spec TestData {..} api sourceName config relationshipCapabilities = describe "Ag
aggregate :: (NonEmpty a -> Value) -> [a] -> Value
aggregate aggFn values =
maybe Null aggFn $ NonEmpty.nonEmpty values
singleColumnAggregateMax :: ColumnName -> Aggregate
singleColumnAggregateMax = SingleColumn . SingleColumnAggregate (SingleColumnAggregateFunction [G.name|max|])
singleColumnAggregateMin :: ColumnName -> Aggregate
singleColumnAggregateMin = SingleColumn . SingleColumnAggregate (SingleColumnAggregateFunction [G.name|min|])
singleColumnAggregateSum :: ColumnName -> Aggregate
singleColumnAggregateSum = SingleColumn . SingleColumnAggregate (SingleColumnAggregateFunction [G.name|sum|])


@ -1,3 +1,4 @@
{-# LANGUAGE DerivingVia #-}
{-# LANGUAGE ViewPatterns #-}
-- | Defines the 'Parser' type and its primitive combinators.
@ -28,6 +29,7 @@ import Data.HashMap.Strict qualified as M
import Data.HashSet qualified as S
import Data.List.NonEmpty (NonEmpty)
import Data.List.NonEmpty qualified as NonEmpty
import Data.Monoid (Ap (..))
import Data.Traversable (for)
import Data.Type.Equality
import Data.Vector qualified as V
@ -64,7 +66,8 @@ data InputFieldsParser origin m a = InputFieldsParser
{ ifDefinitions :: [Definition origin (InputFieldInfo origin)],
ifParser :: HashMap Name (InputValue Variable) -> m a
}
deriving (Functor)
deriving stock (Functor)
deriving (Semigroup, Monoid) via Ap (InputFieldsParser origin m) a
instance Applicative m => Applicative (InputFieldsParser origin m) where
pure v = InputFieldsParser [] (const $ pure v)


@ -87,6 +87,13 @@ defaultBackendCapabilities = \case
scalar_types:
DateTime:
comparison_type: DateTimeComparisons
aggregate_functions:
max: DateTime
min: DateTime
string:
aggregate_functions:
longest: string
shortest: string
|]
_ -> Nothing


@ -8,6 +8,7 @@ module Data.HashMap.Strict.Extended
isInverseOf,
unionWithM,
unionsAll,
unionsWith,
homogenise,
catMaybes,
)
@ -105,6 +106,11 @@ unionsAll ::
(Eq k, Hashable k, Foldable t) => t (HashMap k v) -> HashMap k (NonEmpty v)
unionsAll = F.foldl' (\a b -> M.unionWith (<>) a (fmap (:| []) b)) M.empty
-- | Like 'M.unions', but combining values at duplicate keys with the given function.
unionsWith ::
(Eq k, Hashable k, Foldable t) => (v -> v -> v) -> t (HashMap k v) -> HashMap k v
unionsWith f = F.foldl' (M.unionWith f) M.empty
-- | Homogenise maps, such that all maps range over the full set of
-- keys, inserting a default value as needed.
homogenise :: (Hashable a, Eq a) => b -> [HashMap a b] -> (HashSet a, [HashMap a b])
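
The new `unionsWith` combinator is a left fold of `unionWith` over a collection of maps. As a standalone sketch of its behaviour (using `Data.Map` from `containers` in place of `Data.HashMap.Strict`, purely for illustration):

```haskell
import qualified Data.Foldable as F
import qualified Data.Map.Strict as M

-- Left fold of 'unionWith' over a collection of maps, combining
-- values at duplicate keys with the supplied function.
unionsWith :: (Ord k, Foldable t) => (v -> v -> v) -> t (M.Map k v) -> M.Map k v
unionsWith f = F.foldl' (M.unionWith f) M.empty

main :: IO ()
main =
  print (unionsWith (+) [M.fromList [("x", 1), ("y", 2)], M.fromList [("y", 10), ("z", 3)]])
-- fromList [("x",1),("y",12),("z",3)]
```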


@ -10,10 +10,13 @@ import Data.Aeson qualified as J
import Data.Aeson.Extended (ToJSONKeyValue (..))
import Data.Aeson.Key (fromText)
import Data.Aeson.Types qualified as J
import Data.HashMap.Strict qualified as HashMap
import Data.List.NonEmpty qualified as NonEmpty
import Data.Text qualified as Text
import Data.Text.Casing qualified as C
import Data.Text.Extended ((<<>))
import Hasura.Backends.DataConnector.API qualified as API
import Hasura.Backends.DataConnector.Adapter.Types qualified as Adapter
import Hasura.Backends.DataConnector.Adapter.Types qualified as DC
import Hasura.Base.Error (Code (ValidationFailed), QErr, runAesonParser, throw400)
import Hasura.Incremental
@ -24,6 +27,7 @@ import Hasura.RQL.Types.Column (ColumnType (..))
import Hasura.RQL.Types.ResizePool (ServerReplicas)
import Hasura.SQL.Backend (BackendType (DataConnector))
import Language.GraphQL.Draft.Syntax qualified as G
import Witch qualified
-- | An alias for '()' indicating that a particular associated type has not yet
-- been implemented for the 'DataConnector' backend.
@ -78,12 +82,25 @@ instance Backend 'DataConnector where
DC.NumberTy -> True
DC.StringTy -> True
DC.BoolTy -> False
DC.CustomTy _ -> False -- TODO: extend Capabilities for custom types
DC.CustomTy _ -> False
isNumType :: ScalarType 'DataConnector -> Bool
isNumType DC.NumberTy = True
isNumType _ = False
getCustomAggregateOperators :: Adapter.SourceConfig -> HashMap G.Name (HashMap DC.ScalarType DC.ScalarType)
getCustomAggregateOperators Adapter.SourceConfig {..} =
HashMap.foldrWithKey insertOps mempty scalarTypesCapabilities
where
scalarTypesCapabilities = maybe mempty API.unScalarTypesCapabilities $ API._cScalarTypes _scCapabilities
insertOps typeName API.ScalarTypeCapabilities {..} m =
HashMap.foldrWithKey insertOp m $
maybe mempty API.unAggregateFunctions _stcAggregateFunctions
where
insertOp functionName resultTypeName =
HashMap.insertWith HashMap.union functionName $
HashMap.singleton (Witch.from typeName) (Witch.from resultTypeName)
textToScalarValue :: Maybe Text -> ScalarValue 'DataConnector
textToScalarValue = error "textToScalarValue: not implemented for the Data Connector backend."


@ -37,6 +37,7 @@ import Hasura.RQL.Types.SourceCustomization qualified as RQL
import Hasura.RQL.Types.Table qualified as RQL
import Hasura.SQL.Backend (BackendType (..))
import Language.GraphQL.Draft.Syntax qualified as GQL
import Witch qualified
--------------------------------------------------------------------------------
@ -203,7 +204,7 @@ comparisonExps' sourceInfo columnType = P.memoizeOn 'comparisonExps' (dataConnec
GS.C.SchemaT r m [P.InputFieldsParser n (Maybe (CustomBooleanOperator (IR.UnpreparedValue 'DataConnector)))]
mkCustomOperators tCase collapseIfNull typeName = do
let capabilities = sourceInfo ^. RQL.siConfiguration . DC.scCapabilities
case lookupComparisonInputObjectDefinition capabilities typeName of
case lookupComparisonInputObjectDefinition capabilities (Witch.from $ DC.fromGQLType typeName) of
Nothing -> pure []
Just GQL.InputObjectTypeDefinition {..} -> do
traverse (mkCustomOperator tCase collapseIfNull) _iotdValueDefinitions


@ -36,6 +36,7 @@ import Hasura.RQL.Types.Relationships.Local (RelInfo (..))
import Hasura.SQL.Backend
import Hasura.SQL.Types (CollectableType (..))
import Hasura.Session
import Language.GraphQL.Draft.Syntax qualified as G
import Witch qualified
--------------------------------------------------------------------------------
@ -415,18 +416,9 @@ mkPlan session (SourceConfig {}) ir = do
pure mempty
translateSingleColumnAggregateFunction :: Text -> m API.SingleColumnAggregateFunction
translateSingleColumnAggregateFunction = \case
"avg" -> pure API.Average
"max" -> pure API.Max
"min" -> pure API.Min
"stddev_pop" -> pure API.StandardDeviationPopulation
"stddev_samp" -> pure API.StandardDeviationSample
"stddev" -> pure API.StandardDeviationSample
"sum" -> pure API.Sum
"var_pop" -> pure API.VariancePopulation
"var_samp" -> pure API.VarianceSample
"variance" -> pure API.VarianceSample
unknownFunc -> throw500 $ "translateSingleColumnAggregateFunction: Unknown aggregate function encountered: " <> unknownFunc
translateSingleColumnAggregateFunction functionName =
fmap API.SingleColumnAggregateFunction (G.mkName functionName)
`onNothing` throw500 ("translateSingleColumnAggregateFunction: Invalid aggregate function encountered: " <> functionName)
prepareLiterals ::
UnpreparedValue 'DataConnector ->


@ -7,6 +7,7 @@ module Hasura.GraphQL.Schema.OrderBy
where
import Data.Has
import Data.HashMap.Strict.Extended qualified as HashMap
import Data.Text.Casing qualified as C
import Data.Text.Extended
import Hasura.GraphQL.Parser.Class
@ -166,10 +167,15 @@ orderByAggregation sourceInfo tableInfo = P.memoizeOn 'orderByAggregation (_siNa
tableIdentifierName <- getTableIdentifierName @b tableInfo
allColumns <- tableSelectColumns sourceInfo tableInfo
makeTypename <- asks getter
let numColumns = onlyNumCols allColumns
compColumns = onlyComparableCols allColumns
numFields = catMaybes <$> traverse (mkField tCase) numColumns
compFields = catMaybes <$> traverse (mkField tCase) compColumns
let numColumns = mkAgOpsFields tCase $ onlyNumCols allColumns
compColumns = mkAgOpsFields tCase $ onlyComparableCols allColumns
numOperatorsAndColumns = HashMap.fromList $ (,numColumns) <$> numericAggOperators
compOperatorsAndColumns = HashMap.fromList $ (,compColumns) <$> comparisonAggOperators
customOperatorsAndColumns =
getCustomAggOpsColumns tCase allColumns <$> getCustomAggregateOperators @b (_siConfiguration sourceInfo)
allOperatorsAndColumns =
HashMap.catMaybes $
HashMap.unionsWith (<>) [numOperatorsAndColumns, compOperatorsAndColumns, customOperatorsAndColumns]
aggFields =
fmap (concat . catMaybes . concat) $
sequenceA $
@ -181,18 +187,12 @@ orderByAggregation sourceInfo tableInfo = P.memoizeOn 'orderByAggregation (_siNa
Nothing
(orderByOperator @b tCase sourceInfo)
<&> pure . fmap (pure . mkOrderByItemG @b IR.AAOCount) . join,
-- operators on numeric columns
if null numColumns
-- other operators
if null allOperatorsAndColumns
then Nothing
else Just $
for numericAggOperators \operator ->
parseOperator makeTypename operator tableGQLName numFields,
-- operators on comparable columns
if null compColumns
then Nothing
else Just $
for comparisonAggOperators \operator ->
parseOperator makeTypename operator tableGQLName compFields
for (HashMap.toList allOperatorsAndColumns) \(operator, fields) -> do
parseOperator makeTypename operator tableGQLName fields
]
objectName <- mkTypename $ applyTypeNameCaseIdentifier tCase $ mkTableAggregateOrderByTypeName tableIdentifierName
let description = G.Description $ "order by aggregate values of table " <>> tableName
@ -200,6 +200,30 @@ orderByAggregation sourceInfo tableInfo = P.memoizeOn 'orderByAggregation (_siNa
where
tableName = tableInfoName tableInfo
-- Build an InputFieldsParser only if the column list is non-empty
mkAgOpsFields ::
NamingCase ->
[ColumnInfo b] ->
Maybe (InputFieldsParser n [(ColumnInfo b, (BasicOrderType b, NullsOrderType b))])
mkAgOpsFields tCase =
fmap (fmap (catMaybes . toList) . traverse (mkField tCase)) . nonEmpty
getCustomAggOpsColumns ::
NamingCase ->
[ColumnInfo b] ->
HashMap (ScalarType b) v ->
Maybe (InputFieldsParser n [(ColumnInfo b, (BasicOrderType b, NullsOrderType b))])
getCustomAggOpsColumns tCase columnInfos typeMap =
columnInfos
& filter
( \ColumnInfo {..} ->
case ciType of
ColumnEnumReference _ -> False
ColumnScalar scalarType ->
HashMap.member scalarType typeMap
)
& mkAgOpsFields tCase
mkField :: NamingCase -> ColumnInfo b -> InputFieldsParser n (Maybe (ColumnInfo b, (BasicOrderType b, NullsOrderType b)))
mkField tCase columnInfo =
P.fieldOptional


@ -668,7 +668,9 @@ tableOrderByArg sourceInfo tableInfo = do
pure $ do
maybeOrderByExps <-
fmap join $
P.fieldOptional orderByName orderByDesc $ P.nullable $ P.list orderByParser
P.fieldOptional orderByName orderByDesc $
P.nullable $
P.list orderByParser
pure $ maybeOrderByExps >>= NE.nonEmpty . concat
-- | Argument to distinct select on columns returned from table selection
@ -890,12 +892,14 @@ tableAggregationFields sourceInfo tableInfo =
allColumns <- tableSelectColumns sourceInfo tableInfo
let numericColumns = onlyNumCols allColumns
comparableColumns = onlyComparableCols allColumns
customOperatorsAndColumns =
Map.toList $ Map.mapMaybe (getCustomAggOpsColumns allColumns) $ getCustomAggregateOperators @b (_siConfiguration sourceInfo)
description = G.Description $ "aggregate fields of " <>> tableInfoName tableInfo
selectName <- mkTypename $ applyTypeNameCaseIdentifier tCase $ mkTableAggregateFieldTypeName tableGQLName
count <- countField
makeTypename <- asks getter
numericAndComparable <-
fmap concat $
nonCountFieldsMap <-
fmap (Map.unionsWith (++) . concat) $
sequenceA $
catMaybes
[ -- operators on numeric columns
@ -903,9 +907,8 @@ tableAggregationFields sourceInfo tableInfo =
then Nothing
else Just $
for numericAggOperators $ \operator -> do
let fieldNameCase = applyFieldNameCaseCust tCase operator
numFields <- mkNumericAggFields operator numericColumns
pure $ parseAggOperator makeTypename operator fieldNameCase tCase tableGQLName numFields,
pure $ Map.singleton operator numFields,
-- operators on comparable columns
if null comparableColumns
then Nothing
@ -913,14 +916,39 @@ tableAggregationFields sourceInfo tableInfo =
comparableFields <- traverse mkColumnAggField comparableColumns
pure $
comparisonAggOperators & map \operator ->
let fieldNameCase = applyFieldNameCaseCust tCase operator
in parseAggOperator makeTypename operator fieldNameCase tCase tableGQLName comparableFields
Map.singleton operator comparableFields,
-- custom operators
if null customOperatorsAndColumns
then Nothing
else Just $
for customOperatorsAndColumns \(operator, columnTypes) -> do
customFields <- traverse (uncurry mkCustomColumnAggField) (toList columnTypes)
pure $ Map.singleton operator customFields
]
let aggregateFields = count : numericAndComparable
let nonCountFields =
Map.mapWithKey
( \operator fields ->
let fieldNameCase = applyFieldNameCaseCust tCase operator
in parseAggOperator makeTypename operator fieldNameCase tCase tableGQLName fields
)
nonCountFieldsMap
aggregateFields = count : Map.elems nonCountFields
pure $
P.selectionSet selectName (Just description) aggregateFields
<&> parsedSelectionsToFields IR.AFExp
where
getCustomAggOpsColumns :: [ColumnInfo b] -> HashMap (ScalarType b) (ScalarType b) -> Maybe (NonEmpty (ColumnInfo b, ScalarType b))
getCustomAggOpsColumns columnInfos typeMap =
columnInfos
& mapMaybe
( \ci@ColumnInfo {..} ->
case ciType of
ColumnEnumReference _ -> Nothing
ColumnScalar scalarType ->
(ci,) <$> Map.lookup scalarType typeMap
)
& nonEmpty
mkNumericAggFields :: G.Name -> [ColumnInfo b] -> SchemaT r m [FieldParser n (IR.ColFld b)]
mkNumericAggFields name
| name == Name._sum = traverse mkColumnAggField
@ -933,8 +961,12 @@ tableAggregationFields sourceInfo tableInfo =
$> IR.CFCol (ciColumn columnInfo) (ciType columnInfo)
mkColumnAggField :: ColumnInfo b -> SchemaT r m (FieldParser n (IR.ColFld b))
mkColumnAggField columnInfo = do
field <- columnParser (ciType columnInfo) (G.Nullability True)
mkColumnAggField columnInfo =
mkColumnAggField' columnInfo (ciType columnInfo)
mkColumnAggField' :: ColumnInfo b -> ColumnType b -> SchemaT r m (FieldParser n (IR.ColFld b))
mkColumnAggField' columnInfo resultType = do
field <- columnParser resultType (G.Nullability True)
pure $
P.selection_
(ciName columnInfo)
@ -942,6 +974,10 @@ tableAggregationFields sourceInfo tableInfo =
field
$> IR.CFCol (ciColumn columnInfo) (ciType columnInfo)
mkCustomColumnAggField :: ColumnInfo b -> ScalarType b -> SchemaT r m (FieldParser n (IR.ColFld b))
mkCustomColumnAggField columnInfo resultType =
mkColumnAggField' columnInfo (ColumnScalar resultType)
countField :: SchemaT r m (FieldParser n (IR.AggregateField b))
countField = do
columnsEnum <- tableSelectColumnsEnum sourceInfo tableInfo
@ -1164,9 +1200,9 @@ relationshipField sourceInfo table ri = runMaybeT do
-- "backwards" joining condition from the related table back to
-- `table`. If it is, then we can optimize the row-level permission
-- filters by dropping them here.
if riRTable remoteRI == table
&& riMapping remoteRI `Map.isInverseOf` riMapping ri
&& thisTablePerm == remoteTablePerm
if (riRTable remoteRI == table)
&& (riMapping remoteRI `Map.isInverseOf` riMapping ri)
&& (thisTablePerm == remoteTablePerm)
then BoolAnd []
else x
_ -> x
@ -1235,7 +1271,8 @@ relationshipField sourceInfo table ri = runMaybeT do
IR.AnnRelationSelectG (riName ri) (riMapping ri) $
IR.AnnObjectSelectG fields (riRTable ri) $
deduplicatePermissions $
IR._tpFilter $ tablePermissionsInfo remotePerms
IR._tpFilter $
tablePermissionsInfo remotePerms
ArrRel -> do
let arrayRelDesc = Just $ G.Description "An array relationship"
otherTableParser <- MaybeT $ selectTable sourceInfo otherTableInfo relFieldName arrayRelDesc


@ -307,6 +307,17 @@ class
-- functions on types
isComparableType :: ScalarType b -> Bool
isNumType :: ScalarType b -> Bool
-- | Custom aggregate operators supported by the backend.
-- Backends that support custom aggregate operators should
-- return a HashMap from operator name to a scalar type mapping.
-- In the scalar type mapping, the key is the operator's input type
-- and the value is its result type.
-- Backends that do not support custom aggregate operators can use
-- the default implementation, which returns an empty map.
getCustomAggregateOperators :: SourceConfig b -> HashMap G.Name (HashMap (ScalarType b) (ScalarType b))
getCustomAggregateOperators = const mempty
textToScalarValue :: Maybe Text -> ScalarValue b
parseScalarValue :: ScalarType b -> Value -> Either QErr (ScalarValue b)
scalarValueToJSON :: ScalarValue b -> Value
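
To make the shape of this mapping concrete, here is a hypothetical sketch of the value `getCustomAggregateOperators` might produce for the reference agent's capabilities declared earlier in this changeset; `String` stands in for `G.Name`/`ScalarType`, and `Data.Map` for `HashMap`, purely for illustration:

```haskell
import qualified Data.Map.Strict as M

-- operator name -> (input scalar type -> result scalar type)
customOps :: M.Map String (M.Map String String)
customOps =
  M.fromList
    [ ("max",      M.fromList [("DateTime", "DateTime")])
    , ("min",      M.fromList [("DateTime", "DateTime")])
    , ("longest",  M.fromList [("string", "string")])
    , ("shortest", M.fromList [("string", "string")])
    ]

main :: IO ()
main = print (M.lookup "string" =<< M.lookup "longest" customOps)
-- Just "string"
```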


@ -14,6 +14,8 @@ import Hasura.Backends.DataConnector.API.V0.ColumnSpec (genColumnName)
import Hasura.Prelude
import Hedgehog
import Hedgehog.Gen qualified as Gen
import Language.GraphQL.Draft.Generator (genName)
import Language.GraphQL.Draft.Syntax.QQ qualified as G
import Test.Aeson.Utils (jsonOpenApiProperties, testToFromJSONToSchema)
import Test.Hspec
@ -22,7 +24,7 @@ spec = do
describe "Aggregate" $ do
describe "SingleColumn" $ do
testToFromJSONToSchema
(SingleColumn $ SingleColumnAggregate Average (ColumnName "my_column_name"))
(SingleColumn $ SingleColumnAggregate (SingleColumnAggregateFunction [G.name|avg|]) (ColumnName "my_column_name"))
[aesonQQ|
{ "type": "single_column",
"function": "avg",
@ -48,25 +50,10 @@ spec = do
jsonOpenApiProperties genAggregate
describe "SingleColumnAggregateFunction" $ do
describe "Average" $
testToFromJSONToSchema Average [aesonQQ|"avg"|]
describe "Max" $
testToFromJSONToSchema Max [aesonQQ|"max"|]
describe "Min" $
testToFromJSONToSchema Min [aesonQQ|"min"|]
describe "StandardDeviationPopulation" $
testToFromJSONToSchema StandardDeviationPopulation [aesonQQ|"stddev_pop"|]
describe "StandardDeviationSample" $
testToFromJSONToSchema StandardDeviationSample [aesonQQ|"stddev_samp"|]
describe "Sum" $
testToFromJSONToSchema Sum [aesonQQ|"sum"|]
describe "VariancePopulation" $
testToFromJSONToSchema VariancePopulation [aesonQQ|"var_pop"|]
describe "VarianceSample" $
testToFromJSONToSchema VarianceSample [aesonQQ|"var_samp"|]
testToFromJSONToSchema (SingleColumnAggregateFunction [G.name|avg|]) [aesonQQ|"avg"|]
jsonOpenApiProperties genSingleColumnAggregateFunction
genAggregate :: MonadGen m => m Aggregate
genAggregate :: Gen Aggregate
genAggregate =
Gen.choice
[ SingleColumn <$> genSingleColumnAggregate,
@ -74,17 +61,17 @@ genAggregate =
pure StarCount
]
genSingleColumnAggregate :: MonadGen m => m SingleColumnAggregate
genSingleColumnAggregate :: Gen SingleColumnAggregate
genSingleColumnAggregate =
SingleColumnAggregate
<$> genSingleColumnAggregateFunction
<*> genColumnName
genColumnCountAggregate :: MonadGen m => m ColumnCountAggregate
genColumnCountAggregate :: Gen ColumnCountAggregate
genColumnCountAggregate =
ColumnCountAggregate
<$> genColumnName
<*> Gen.bool
genSingleColumnAggregateFunction :: MonadGen m => m SingleColumnAggregateFunction
genSingleColumnAggregateFunction = Gen.enumBounded
genSingleColumnAggregateFunction :: Gen SingleColumnAggregateFunction
genSingleColumnAggregateFunction = SingleColumnAggregateFunction <$> genName


@ -5,9 +5,12 @@ module Hasura.Backends.DataConnector.API.V0.CapabilitiesSpec (spec) where
import Data.Aeson (Value (..))
import Data.Aeson.QQ.Simple (aesonQQ)
import Data.HashMap.Strict qualified as HashMap
import Data.Text.RawString (raw)
import Hasura.Backends.DataConnector.API.V0.Capabilities
import Hasura.Backends.DataConnector.API.V0.ConfigSchema
import Hasura.Backends.DataConnector.API.V0.Scalar (ScalarType (..))
import Hasura.Backends.DataConnector.API.V0.ScalarSpec (genScalarType)
import Hasura.Generator.Common
import Hasura.Prelude
import Hedgehog
@ -27,8 +30,11 @@ spec = do
testToFromJSON
(CapabilitiesResponse (defaultCapabilities {_cRelationships = Just RelationshipCapabilities {}}) emptyConfigSchemaResponse)
[aesonQQ|{"capabilities": {"relationships": {}}, "config_schemas": {"config_schema": {}, "other_schemas": {}}}|]
describe "ScalarTypesCapabilities" $ do
testToFromJSONToSchema (ScalarTypesCapabilities (HashMap.singleton StringTy (ScalarTypeCapabilities Nothing Nothing))) [aesonQQ|{"string": {}}|]
jsonOpenApiProperties genScalarTypesCapabilities
describe "ScalarTypeCapabilities" $ do
testToFromJSONToSchema (ScalarTypeCapabilities $ Just [G.name|DateTimeComparisons|]) [aesonQQ|{"comparison_type": "DateTimeComparisons"}|]
testToFromJSONToSchema (ScalarTypeCapabilities (Just [G.name|DateTimeComparisons|]) Nothing) [aesonQQ|{"comparison_type": "DateTimeComparisons"}|]
describe "GraphQLTypeDefinitions" $ do
testToFromJSONToSchema sampleGraphQLTypeDefinitions sampleGraphQLTypeDefinitionsJSON
@ -76,12 +82,19 @@ genMutationCapabilities = pure MutationCapabilities {}
genSubscriptionCapabilities :: MonadGen m => m SubscriptionCapabilities
genSubscriptionCapabilities = pure SubscriptionCapabilities {}
genAggregateFunctions :: MonadGen m => m AggregateFunctions
genAggregateFunctions =
AggregateFunctions <$> genHashMap (genGName defaultRange) genScalarType defaultRange
genScalarTypeCapabilities :: MonadGen m => m ScalarTypeCapabilities
genScalarTypeCapabilities = ScalarTypeCapabilities <$> Gen.maybe (genGName defaultRange)
genScalarTypeCapabilities =
ScalarTypeCapabilities
<$> Gen.maybe (genGName defaultRange)
<*> Gen.maybe genAggregateFunctions
genScalarTypesCapabilities :: MonadGen m => m ScalarTypesCapabilities
genScalarTypesCapabilities =
ScalarTypesCapabilities <$> genHashMap (genGName defaultRange) genScalarTypeCapabilities defaultRange
ScalarTypesCapabilities <$> genHashMap genScalarType genScalarTypeCapabilities defaultRange
-- | 'genTypeDefinition' generates invalid type definitions so we need to filter them out.
-- The printers also sort various lists upon printing, so we need to pre-sort them for round-tripping to work.


@ -19,6 +19,7 @@ import Hasura.Generator.Common (defaultRange)
import Hasura.Prelude
import Hedgehog
import Hedgehog.Gen qualified as Gen
import Language.GraphQL.Draft.Syntax.QQ qualified as G
import Test.Aeson.Utils (jsonOpenApiProperties, testToFromJSONToSchema)
import Test.Hspec
@ -41,7 +42,7 @@ spec = do
|]
describe "OrderBySingleColumnAggregate" $
testToFromJSONToSchema
(OrderBySingleColumnAggregate (SingleColumnAggregate Sum (ColumnName "test_column")))
(OrderBySingleColumnAggregate (SingleColumnAggregate (SingleColumnAggregateFunction [G.name|sum|]) (ColumnName "test_column")))
[aesonQQ|
{ "type": "single_column_aggregate",
"function": "sum",
@ -120,27 +121,27 @@ spec = do
testToFromJSONToSchema Descending [aesonQQ|"desc"|]
jsonOpenApiProperties genOrderDirection
genOrderBy :: MonadGen m => m OrderBy
genOrderBy :: Gen OrderBy
genOrderBy =
OrderBy
<$> (HashMap.fromList <$> Gen.list defaultRange ((,) <$> genRelationshipName <*> genOrderByRelation))
<*> Gen.nonEmpty defaultRange genOrderByElement
genOrderByRelation :: MonadGen m => m OrderByRelation
genOrderByRelation :: Gen OrderByRelation
genOrderByRelation =
OrderByRelation
<$> Gen.maybe genExpression
-- Gen.small ensures the recursion will terminate as the size will shrink with each recursion
<*> Gen.small (HashMap.fromList <$> Gen.list defaultRange ((,) <$> genRelationshipName <*> genOrderByRelation))
genOrderByElement :: MonadGen m => m OrderByElement
genOrderByElement :: Gen OrderByElement
genOrderByElement =
OrderByElement
<$> Gen.list defaultRange genRelationshipName
<*> genOrderByTarget
<*> genOrderDirection
genOrderByTarget :: MonadGen m => m OrderByTarget
genOrderByTarget :: Gen OrderByTarget
genOrderByTarget =
Gen.choice
[ OrderByColumn <$> genColumnName,
@ -148,5 +149,5 @@ genOrderByTarget =
OrderBySingleColumnAggregate <$> genSingleColumnAggregate
]
genOrderDirection :: MonadGen m => m OrderDirection
genOrderDirection :: Gen OrderDirection
genOrderDirection = Gen.enumBounded


@ -143,27 +143,27 @@ spec = do
"aggregates": {} }
|]
genField :: MonadGen m => m Field
genField :: Gen Field
genField =
Gen.recursive
Gen.choice
[ColumnField <$> genColumnName <*> genScalarType]
[RelField <$> genRelationshipField]
genFieldName :: MonadGen m => m FieldName
genFieldName :: Gen FieldName
genFieldName = FieldName <$> genArbitraryAlphaNumText defaultRange
genFieldMap :: MonadGen m => m value -> m (HashMap FieldName value)
genFieldMap :: Gen value -> Gen (HashMap FieldName value)
genFieldMap genValue' =
HashMap.fromList <$> Gen.list defaultRange ((,) <$> genFieldName <*> genValue')
genRelationshipField :: MonadGen m => m RelationshipField
genRelationshipField :: Gen RelationshipField
genRelationshipField =
RelationshipField
<$> genRelationshipName
<*> genQuery
genQuery :: MonadGen m => m Query
genQuery :: Gen Query
genQuery =
Query
<$> Gen.maybe (genFieldMap genField)
@ -173,21 +173,21 @@ genQuery =
<*> Gen.maybe genExpression
<*> Gen.maybe genOrderBy
genQueryRequest :: MonadGen m => m QueryRequest
genQueryRequest :: Gen QueryRequest
genQueryRequest =
QueryRequest
<$> genTableName
<*> Gen.list defaultRange genTableRelationships
<*> genQuery
genFieldValue :: MonadGen m => m FieldValue
genFieldValue :: Gen FieldValue
genFieldValue =
Gen.recursive
Gen.choice
[mkColumnFieldValue <$> genValue]
[mkRelationshipFieldValue <$> genQueryResponse]
genQueryResponse :: MonadGen m => m QueryResponse
genQueryResponse :: Gen QueryResponse
genQueryResponse =
QueryResponse
<$> Gen.maybe (Gen.list defaultRange (genFieldMap genFieldValue))


@ -26,31 +26,31 @@ import Hedgehog.Internal.Range
import Test.Hspec
import Test.Hspec.Hedgehog
testToFromJSON :: (Eq a, Show a, FromJSON a, ToJSON a) => a -> Value -> Spec
testToFromJSON :: (HasCallStack, Eq a, Show a, FromJSON a, ToJSON a) => a -> Value -> Spec
testToFromJSON a v = do
it "parses from JSON" $
parseEither parseJSON v `shouldBe` Right a
it "encodes to JSON" $
toJSON a `shouldBe` v
validateToJSONOpenApi :: (ToJSON a, ToSchema a) => a -> Spec
validateToJSONOpenApi :: (HasCallStack, ToJSON a, ToSchema a) => a -> Spec
validateToJSONOpenApi a = do
it "value validates against OpenAPI schema" $
validatePrettyToJSON a `shouldBe` Nothing
testToFromJSONToSchema :: (Eq a, Show a, FromJSON a, ToJSON a, ToSchema a) => a -> Value -> Spec
testToFromJSONToSchema :: (HasCallStack, Eq a, Show a, FromJSON a, ToJSON a, ToSchema a) => a -> Value -> Spec
testToFromJSONToSchema a v = do
testToFromJSON a v
validateToJSONOpenApi a
jsonRoundTrip :: (Eq a, Show a, FromJSON a, ToJSON a) => Gen a -> Spec
jsonRoundTrip :: (HasCallStack, Eq a, Show a, FromJSON a, ToJSON a) => Gen a -> Spec
jsonRoundTrip gen =
it "JSON roundtrips" $
hedgehog $ do
a <- forAll gen
tripping a toJSON (parseEither parseJSON)
jsonEncodingEqualsValue :: (Show a, ToJSON a) => Gen a -> Spec
jsonEncodingEqualsValue :: (HasCallStack, Show a, ToJSON a) => Gen a -> Spec
jsonEncodingEqualsValue gen =
it "JSON encoding equals value" $
hedgehog $ do
@ -59,19 +59,19 @@ jsonEncodingEqualsValue gen =
decoded = decode encoded :: Maybe Value
decoded === Just (toJSON a)
jsonProperties :: (Eq a, Show a, FromJSON a, ToJSON a) => Gen a -> Spec
jsonProperties :: (HasCallStack, Eq a, Show a, FromJSON a, ToJSON a) => Gen a -> Spec
jsonProperties gen = do
jsonRoundTrip gen
jsonEncodingEqualsValue gen
validateAgainstOpenApiSchema :: (Show a, ToJSON a, ToSchema a) => Gen a -> Spec
validateAgainstOpenApiSchema :: (HasCallStack, Show a, ToJSON a, ToSchema a) => Gen a -> Spec
validateAgainstOpenApiSchema gen = do
it "ToJSON validates against OpenAPI schema" $
hedgehog $ do
a <- forAll gen
validatePrettyToJSON a === Nothing
jsonOpenApiProperties :: (Eq a, Show a, FromJSON a, ToJSON a, ToSchema a) => Gen a -> Spec
jsonOpenApiProperties :: (HasCallStack, Eq a, Show a, FromJSON a, ToJSON a, ToSchema a) => Gen a -> Spec
jsonOpenApiProperties gen = do
jsonProperties gen
validateAgainstOpenApiSchema gen