graphql-engine/server/src-test/Test/Parser/Internal.hs
David Overton e5f88d8039 Nested array support for Data Connectors Backend and MongoDB
## Description

This change adds support for querying into nested arrays in Data Connector agents that support such a concept (currently MongoDB).

### DC API changes

- New API type `ColumnType` which allows representing the type of a "column" as either a scalar type, an object reference or an array of `ColumnType`s. This recursive definition allows arbitrary nesting of arrays of types.
- The `type` fields in the API types `ColumnInfo` and `ColumnInsertSchema` now take a `ColumnType` instead of a `ScalarType`.
- To ensure backwards compatibility, a `ColumnType` representing a scalar serialises and deserialises to the same representation as `ScalarType`.
- In queries, the `Field` type now has a new constructor `NestedArrayField`. This contains a nested `Field` along with optional `limit`, `offset`, `where` and `order_by` arguments. (These optional arguments are not yet used by either HGE or the MongoDB agent.) A rough sketch of these shapes is given below.
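
For orientation, here is a rough sketch of the shapes described above. The constructor and field names are approximations for illustration only, not the exact DC API definitions:

```haskell
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson
import Data.Text (Text)

newtype ScalarType = ScalarType Text

-- A column's type is either a scalar, a reference to a named object type,
-- or an array of another ColumnType (allowing arbitrary nesting).
data ColumnType
  = ColumnTypeScalar ScalarType
  | ColumnTypeObject Text
  | ColumnTypeArray ColumnType Bool -- element type and element nullability

-- Backwards compatibility: a scalar column type serialises as a bare JSON
-- value, exactly as the old ScalarType field did, so existing agents and
-- metadata keep working unchanged.
instance ToJSON ColumnType where
  toJSON = \case
    ColumnTypeScalar (ScalarType t) -> String t
    ColumnTypeObject name ->
      object ["type" .= ("object" :: Text), "name" .= name]
    ColumnTypeArray elemTy nullable ->
      object
        [ "type" .= ("array" :: Text),
          "element_type" .= elemTy,
          "nullable" .= nullable
        ]

-- Query side: a nested-array field wraps an inner field selection together
-- with the (currently unused) limit/offset arguments; where/order_by elided.
data Field
  = ColumnField Text ColumnType
  | NestedArrayField Field (Maybe Int) (Maybe Int)
```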

### MongoDB Haskell agent changes

- The `/schema` endpoint will now recognise arrays within the JSON validation schema and generate corresponding arrays in the DC schema.
- The `/query` endpoint will now handle `NestedArrayField`s within queries (although it does not yet handle `limit`, `offset`, `where` and `order_by`).
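
As a sketch only (the helper name is hypothetical and the real agent handles many more cases, such as nullability and objects with properties), the array handling amounts to recursing into the `items` sub-schema of a `bsonType: "array"` entry, reusing the `ColumnType` shape sketched above:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Aeson (Object, Value (..))
import qualified Data.Aeson.KeyMap as KM

columnTypeFromValidationSchema :: Object -> Maybe ColumnType
columnTypeFromValidationSchema schema =
  case KM.lookup "bsonType" schema of
    -- arrays: recurse into the "items" sub-schema to find the element type
    Just (String "array") ->
      case KM.lookup "items" schema of
        Just (Object items) ->
          (\elemTy -> ColumnTypeArray elemTy False)
            <$> columnTypeFromValidationSchema items
        _ -> Nothing
    -- objects and scalars handled as before (greatly simplified here)
    Just (String "object") -> Just (ColumnTypeObject "object")
    Just (String scalar) -> Just (ColumnTypeScalar (ScalarType scalar))
    _ -> Nothing
```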

### HGE server changes

- The `Backend` type class adds a new type family `XNestedArrays b` to enable nested arrays on a per-backend basis (currently enabled only for the `DataConnector` backend).
- Within `RawColumnInfo` the column type is now represented by a new type `RawColumnType b` which mirrors the shape of the DC API `ColumnType`, but uses the `XNestedObjects b` and `XNestedArrays b` type families to allow nested object and array support to be turned on or off for a particular backend. In the `DataConnector` backend, `API.ColumnType` is converted into `RawColumnInfo 'DataConnector` while building the schema.
- In the next stage of schema building, the `RawColumnInfo` is converted into a `StructuredColumnInfo` which allows us to represent the three different types of columns: scalar, object and array. TODO: the `StructuredColumnInfo` looks very similar to the Logical Model types. The main difference is that it uses the `XNestedObjects` and `XNestedArrays` type families. We should be able to combine these two representations.
- The `StructuredColumnInfo` is then placed into a `FIColumn` `FieldInfo`. This involved some refactoring of `FieldInfo`, as I had previously split out `FINestedObject` into a separate constructor. However, it works out better to represent all "column" fields (i.e. scalar, object and array) using `FIColumn`, as this makes it easier to implement permission checking correctly. This is the reason the `StructuredColumnInfo` was needed.
- Next, the `FieldInfo`s are used to generate `FieldParser`s. We add a new constructor to `AnnFieldG` for `AFNestedArray`. An `AFNestedArray` field parser can contain either a simple array selection or an array aggregate. Simple array `FieldParser`s are currently limited to subfield selection. We will add support for limit, offset, where and order_by in a future PR. We also don't yet generate array aggregate `FieldParser`s.
- The new `AFNestedArray` field is handled by the `QueryPlan` module in the `DataConnector` backend. There we generate an `API.NestedArrayField` from the `AFNestedArray`. We also handle nested arrays when reshaping the response from the DC agent. (A condensed sketch of the per-backend gating follows below.)
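
A condensed, illustrative picture of the per-backend gating (the family and constructor names follow the description above, but the definitions are simplified, not the actual ones):

```haskell
{-# LANGUAGE TypeFamilies #-}

import Data.Kind (Type)
import Data.Text (Text)
import Data.Void (Void)

-- A backend enables nested arrays by mapping the family to a trivially
-- constructible type such as (), and disables them by mapping it to Void,
-- which has no values at all.
type family XNestedArrays (b :: Type) :: Type

data DataConnector
data SomeOtherBackend

type instance XNestedArrays DataConnector = ()
type instance XNestedArrays SomeOtherBackend = Void

-- Mirrors the DC API ColumnType: the XNestedArrays witness in the array
-- constructor makes that case unconstructible for backends that disable it.
data RawColumnType b
  = RawColumnTypeScalar Text
  | RawColumnTypeObject Text
  | RawColumnTypeArray (XNestedArrays b) (RawColumnType b)
```

`StructuredColumnInfo` and the new `AFNestedArray` constructor of `AnnFieldG` follow the same pattern, carrying an `XNestedArrays b` witness so the array cases only exist where the backend opts in.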

## Limitations

- Support for limit, offset, filter (where) and order_by is not yet fully implemented, although it should not be hard to add
- Support for aggregations on nested arrays is not yet fully implemented
- Permissions involving nested arrays (and objects) are not yet implemented
- This should be integrated with Logical Model types, but that will happen in a separate PR

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/9149
GitOrigin-RevId: 0e7b71a994fc1d2ca1ef73bfe7b96e95b5328531
2023-05-24 08:02:43 +00:00


-- | Internal helper module. Some things re-exported by
-- 'Test.Parser.Expectation'.
module Test.Parser.Internal
( mkTable,
ColumnInfoBuilder (..),
mkColumnInfo,
mkParser,
Parser,
TableInfoBuilder (..),
tableInfoBuilder,
buildTableInfo,
)
where
import Data.HashMap.Strict qualified as HashMap
import Data.HashSet qualified as HS
import Data.Sequence.NonEmpty qualified as NESeq
import Data.Text.Casing qualified as C
import Hasura.Backends.Postgres.Instances.Schema ()
import Hasura.Backends.Postgres.SQL.Types (ConstraintName (..), QualifiedObject (..), QualifiedTable, TableName (..), unsafePGCol)
import Hasura.GraphQL.Schema.Backend
import Hasura.GraphQL.Schema.Common (Scenario (Frontend))
import Hasura.GraphQL.Schema.Parser (FieldParser)
import Hasura.Prelude
import Hasura.RQL.IR.BoolExp (AnnBoolExpFld (..), GBoolExp (..), PartialSQLExp (..))
import Hasura.RQL.IR.Root (RemoteRelationshipField)
import Hasura.RQL.IR.Update (AnnotatedUpdateG (..))
import Hasura.RQL.IR.Value (UnpreparedValue (..))
import Hasura.RQL.Types.BackendType (BackendType (Postgres), PostgresKind (Vanilla))
import Hasura.RQL.Types.Column (ColumnInfo (..), ColumnMutability (..), ColumnType (..), StructuredColumnInfo (..))
import Hasura.RQL.Types.Common (Comment (..), FieldName (..), OID (..))
import Hasura.RQL.Types.Instances ()
import Hasura.RQL.Types.Permission (AllowedRootFields (..))
import Hasura.RQL.Types.Relationships.Local (RelInfo (..), fromRel)
import Hasura.Table.Cache (Constraint (..), CustomRootField (..), FieldInfo (..), PrimaryKey (..), RolePermInfo (..), SelPermInfo (..), TableConfig (..), TableCoreInfoG (..), TableCustomRootFields (..), TableInfo (..), UpdPermInfo (..))
import Language.GraphQL.Draft.Syntax (unsafeMkName)
import Test.Parser.Monad
{-# ANN module ("HLint: ignore Use mkName" :: String) #-}
type PG = 'Postgres 'Vanilla
type Parser = FieldParser ParserTest (AnnotatedUpdateG PG (RemoteRelationshipField UnpreparedValue) (UnpreparedValue PG))
-- | Create a table by its name, using the public schema.
mkTable :: Text -> QualifiedTable
mkTable name =
QualifiedObject
{ qSchema = "public",
qName = TableName name
}
-- | Build a column, see 'mkColumnInfo'.
data ColumnInfoBuilder = ColumnInfoBuilder
{ -- | name of the column
cibName :: Text,
-- | Column type, e.g.
--
-- > ColumnScalar PGText
cibType :: ColumnType PG,
-- | whether the column is nullable or not
cibNullable :: Bool,
-- | is it a primary key?
cibIsPrimaryKey :: Bool
}
-- | Create a column using the provided 'ColumnInfoBuilder' and defaults.
--
-- Note that all permissions are enabled by default.
mkColumnInfo :: ColumnInfoBuilder -> ColumnInfo PG
mkColumnInfo ColumnInfoBuilder {..} =
ColumnInfo
{ ciColumn = unsafePGCol cibName,
ciName = unsafeMkName cibName,
ciPosition = 0,
ciType = cibType,
ciIsNullable = cibNullable,
ciDescription = Nothing,
ciMutability = columnMutability
}
where
columnMutability :: ColumnMutability
columnMutability =
ColumnMutability
{ _cmIsInsertable = True,
_cmIsUpdatable = True
}
-- | Create a parser for the provided table and columns.
--
-- No special permissions, required headers, filters, etc., are set.
--
-- This will not work for inserts and deletes (see @rolePermInfo@ below).
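--
-- A usage sketch (illustrative only; the table and column names are made up):
--
-- > mkParser $
-- >   (tableInfoBuilder (mkTable "artist"))
-- >     { columns =
-- >         [ ColumnInfoBuilder "id" (ColumnScalar PGInteger) False True,
-- >           ColumnInfoBuilder "name" (ColumnScalar PGText) True False
-- >         ]
-- >     }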
mkParser :: TableInfoBuilder -> SchemaTest [Parser]
mkParser tib =
buildTableUpdateMutationFields
Frontend
(buildTableInfo tib)
name
where
name :: C.GQLNameIdentifier
name = C.fromAutogeneratedName (unsafeMkName $ getTableTxt $ qName (table tib))
-- | Inputs for building 'TableInfo's.
-- The expectation is that this will be extended freely as new tests need more
-- elaborate setup.
data TableInfoBuilder = TableInfoBuilder
{ table :: QualifiedTable,
columns :: [ColumnInfoBuilder],
relations :: [RelInfo PG]
}
-- | A smart constructor for an empty 'TableInfoBuilder'.
-- This should make it easier to maintain existing test code when new fields are
-- added.
tableInfoBuilder :: QualifiedTable -> TableInfoBuilder
tableInfoBuilder table = TableInfoBuilder {columns = [], relations = [], ..}
-- | Build a 'TableInfo' from a 'TableInfoBuilder'.
-- The expectation is that this will be extended freely as new tests need more
-- elaborate setup.
buildTableInfo :: TableInfoBuilder -> TableInfo PG
buildTableInfo TableInfoBuilder {..} = tableInfo
where
tableInfo :: TableInfo PG
tableInfo =
TableInfo
{ _tiCoreInfo = tableCoreInfo,
_tiRolePermInfoMap = mempty,
_tiEventTriggerInfoMap = mempty,
_tiAdminRolePermInfo = rolePermInfo
}
tableCoreInfo :: TableCoreInfoG PG (FieldInfo PG) (ColumnInfo PG)
tableCoreInfo =
TableCoreInfo
{ _tciName = table,
_tciDescription = Nothing,
_tciFieldInfoMap = fieldInfoMap,
_tciPrimaryKey = pk,
_tciUniqueConstraints = mempty,
_tciForeignKeys = mempty,
_tciViewInfo = Nothing,
_tciEnumValues = Nothing,
_tciCustomConfig = tableConfig,
_tciExtraTableMetadata = (),
_tciApolloFederationConfig = Nothing,
_tciCustomObjectTypes = mempty
}
pk :: Maybe (PrimaryKey PG (ColumnInfo PG))
pk = case pks of
Nothing -> Nothing
Just primaryColumns ->
Just
PrimaryKey
{ _pkConstraint =
Constraint
{ _cName = ConstraintName "",
_cOid = OID 0
},
_pkColumns = primaryColumns
}
rolePermInfo :: RolePermInfo PG
rolePermInfo =
RolePermInfo
{ _permIns = Nothing,
_permSel = Just selPermInfo,
_permUpd = Just updPermInfo,
_permDel = Nothing
}
fieldInfoMap :: HashMap.HashMap FieldName (FieldInfo PG)
fieldInfoMap = HashMap.unions [columnFields, relationFields]
columnFields :: HashMap.HashMap FieldName (FieldInfo PG)
columnFields =
HashMap.fromList
. fmap toCIHashPair
$ columns
toCIHashPair :: ColumnInfoBuilder -> (FieldName, FieldInfo PG)
toCIHashPair cib = (coerce $ cibName cib, FIColumn $ SCIScalarColumn $ mkColumnInfo cib)
toRelHashPair :: RelInfo PG -> (FieldName, FieldInfo PG)
toRelHashPair ri = (fromRel $ riName ri, FIRelationship ri)
relationFields :: HashMap.HashMap FieldName (FieldInfo PG)
relationFields = HashMap.fromList . fmap toRelHashPair $ relations
tableConfig :: TableConfig PG
tableConfig =
TableConfig
{ _tcCustomRootFields = tableCustomRootFields,
_tcColumnConfig = mempty,
_tcCustomName = Nothing,
_tcComment = Automatic
}
selPermInfo :: SelPermInfo PG
selPermInfo =
SelPermInfo
{ spiCols = HashMap.fromList . fmap ((,Nothing) . unsafePGCol . cibName) $ columns,
spiComputedFields = mempty,
spiFilter = upiFilter,
spiLimit = Nothing,
spiAllowAgg = True,
spiRequiredHeaders = mempty,
spiAllowedQueryRootFields = ARFAllowAllRootFields,
spiAllowedSubscriptionRootFields = ARFAllowAllRootFields
}
tableCustomRootFields :: TableCustomRootFields
tableCustomRootFields =
TableCustomRootFields
{ _tcrfSelect = customRootField,
_tcrfSelectByPk = customRootField,
_tcrfSelectAggregate = customRootField,
_tcrfSelectStream = customRootField,
_tcrfInsert = customRootField,
_tcrfInsertOne = customRootField,
_tcrfUpdate = customRootField,
_tcrfUpdateByPk = customRootField,
_tcrfUpdateMany = customRootField,
_tcrfDelete = customRootField,
_tcrfDeleteByPk = customRootField
}
customRootField :: CustomRootField
customRootField =
CustomRootField
{ _crfName = Nothing,
_crfComment = Automatic
}
updPermInfo :: UpdPermInfo PG
updPermInfo =
UpdPermInfo
{ upiCols = HS.fromList . fmap (unsafePGCol . cibName) $ columns,
upiTable = table,
upiFilter = upiFilter,
upiCheck = Nothing,
upiSet = mempty,
upiBackendOnly = False,
upiRequiredHeaders = mempty
}
columnInfos :: [ColumnInfo PG]
columnInfos = mkColumnInfo <$> columns
pks :: Maybe (NESeq.NESeq (ColumnInfo PG))
pks = case mkColumnInfo <$> filter cibIsPrimaryKey columns of
[] -> Nothing
(x : xs) -> Just $ foldl (<>) (NESeq.singleton x) $ fmap NESeq.singleton xs
upiFilter :: GBoolExp PG (AnnBoolExpFld PG (PartialSQLExp PG))
upiFilter = BoolAnd $ fmap (\ci -> BoolField $ AVColumn ci []) columnInfos