Mirror of https://github.com/hasura/graphql-engine.git (synced 2024-12-15 09:22:43 +03:00)

server: customize tracking tables with a custom name (#5719)
https://github.com/hasura/graphql-engine/pull/5719

Parent: 562b6ac43a
Commit: 2cb08a89cb
@ -85,6 +85,7 @@ This release contains the [PDV refactor (#4111)](https://github.com/hasura/graph

  **NOTE:** If you have event triggers with names greater than 42 chars, then you should update their names to avoid running into the Postgres identifier limit bug (#5786)

  - server: validate remote schema queries (fixes #4143)
  - server: fix issue with tracking custom functions that return `SETOF` materialized view (close #5294) (#5945)
+ - server: introduce optional custom table name in table configuration to track the table according to the custom name. The `set_table_custom_fields` API has been deprecated; a new API `set_table_customization` has been added to set the configuration. (#3811)
  - server: allow remote relationships with union, interface and enum type fields as well (fixes #5875) (#6080)
  - console: allow user to cascade Postgres dependencies when dropping Postgres objects (close #5109) (#5248)
  - console: mark inconsistent remote schemas in the UI (close #5093) (#5181)
@ -104,10 +104,15 @@ The various types of queries are listed in the following table:

      - 2
      - Add a table/view with configuration

-   * - :ref:`set_table_custom_fields <set_table_custom_fields>`
+   * - :ref:`set_table_customization <set_table_customization>`
+     - :ref:`set_table_customization_args <set_table_customization_syntax>`
+     - 1
+     - Set table customization of an already tracked table
+
+   * - :ref:`set_table_custom_fields <set_table_custom_fields>` (deprecated)
      - :ref:`set_table_custom_fields_args <set_table_custom_fields_args_syntax>`
      - 2
-     - Set custom fields to an already tracked table
+     - Set custom fields to an already tracked table (deprecated)

    * - :ref:`untrack_table`
      - :ref:`untrack_table_args <untrack_table_syntax>`
@ -149,6 +149,45 @@ Add a table/view ``author``:

        }
     }

+ A table can be tracked with a ``custom name``. This is useful when a table
+ name is not GraphQL compliant, e.g. ``Users Address``. A ``custom name`` such
+ as ``users_address`` can be set for the ``"Users Address"`` table so that it
+ can be added to the GraphQL schema.
+
+ .. code-block:: http
+
+    POST /v1/query HTTP/1.1
+    Content-Type: application/json
+    X-Hasura-Role: admin
+
+    {
+       "type": "track_table",
+       "version": 2,
+       "args": {
+         "table": "Author Details",
+         "configuration": {
+            "custom_name": "author_details"
+         }
+       }
+    }
+
+ The GraphQL nodes and typenames that are generated will be according to the
+ custom name. For example, for the ``users_address`` custom name above, the
+ generated nodes will be:
+
+ - ``users_address``
+ - ``users_address_one``
+ - ``users_address_aggregate``
+ - ``insert_users_address``
+ - ``insert_users_address_one``
+ - ``update_users_address``
+ - ``update_users_address_by_pk``
+ - ``delete_users_address``
+ - ``delete_users_address_by_pk``
+
+ .. note::
+
+    graphql-engine requires the constraint names (if any) of a table to be GraphQL `compliant <https://spec.graphql.org/June2018/#sec-Names>`__ in order to be able to track it.
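The naming pattern behind the list of generated nodes above can be sketched as follows (plain Python for illustration; this helper is not part of the graphql-engine codebase):

```python
# Illustrative sketch (not Hasura source): given the GraphQL name of a tracked
# table (its custom name, if one is configured), derive the names of the
# query and mutation root fields listed in the docs above.
def generated_root_fields(table_gql_name):
    return [
        table_gql_name,                                # select
        table_gql_name + "_one",
        table_gql_name + "_aggregate",
        "insert_" + table_gql_name,
        "insert_" + table_gql_name + "_one",
        "update_" + table_gql_name,
        "update_" + table_gql_name + "_by_pk",
        "delete_" + table_gql_name,
        "delete_" + table_gql_name + "_by_pk",
    ]

print(generated_root_fields("users_address"))
```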
.. _track_table_args_syntax_v2:

Args syntax
@ -182,6 +221,11 @@ Table Config

      - Required
      - Schema
      - Description
+   * - custom_name
+     - false
+     - ``String``
+     - Customise the ``<table-name>`` with the provided custom name value. The
+       GraphQL nodes for the table will be generated according to the custom name.
    * - custom_root_fields
      - false
      - :ref:`Custom Root Fields <custom_root_fields>`
@ -242,13 +286,19 @@ Custom Root Fields

  .. _set_table_custom_fields:

- set_table_custom_fields
- -----------------------
+ set_table_custom_fields (deprecated)
+ ------------------------------------
+
+ ``set_table_custom_fields`` has been deprecated. Use the
+ :ref:`set_table_customization <set_table_customization>` API to set the
+ custom table fields.

  ``set_table_custom_fields`` in version ``2`` sets the custom root fields and
  custom column names of an already tracked table. This will **replace** the
  already present custom fields configuration.

  Set custom fields for table/view ``author``:

  .. code-block:: http
@ -304,6 +354,70 @@ Args syntax

      - :ref:`CustomColumnNames`
      - Customise the column fields

+ .. _set_table_customization:
+
+ set_table_customization
+ -----------------------
+
+ ``set_table_customization`` allows customizing an already tracked table with a
+ custom name, custom root fields and custom column names. This will
+ **replace** the already present customization.
+
+ :ref:`set_table_custom_fields <set_table_custom_fields>` has been deprecated in
+ favour of this API.
+
+ Set the configuration for the table/view ``author_details``:
+
+ .. code-block:: http
+
+    POST /v1/query HTTP/1.1
+    Content-Type: application/json
+    X-Hasura-Role: admin
+
+    {
+      "type": "set_table_customization",
+      "args": {
+        "table": "author_details",
+        "configuration": {
+          "custom_name": "author",
+          "custom_root_fields": {
+            "select": "Authors",
+            "select_by_pk": "Author",
+            "select_aggregate": "AuthorAggregate",
+            "insert": "AddAuthors",
+            "insert_one": "AddAuthor",
+            "update": "UpdateAuthors",
+            "update_by_pk": "UpdateAuthor",
+            "delete": "DeleteAuthors",
+            "delete_by_pk": "DeleteAuthor"
+          },
+          "custom_column_names": {
+            "id": "authorId"
+          }
+        }
+      }
+    }
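The request body above can also be assembled programmatically. A minimal sketch in Python (a hypothetical helper, not part of Hasura; it assumes the ``custom_name`` configuration key documented in the Table Config section):

```python
import json

# Hypothetical helper (not Hasura code): build the JSON body for a
# set_table_customization call. All configuration keys are optional,
# mirroring the Table Config schema documented above.
def set_table_customization_payload(table, custom_name=None,
                                    custom_root_fields=None,
                                    custom_column_names=None):
    configuration = {}
    if custom_name is not None:
        configuration["custom_name"] = custom_name
    if custom_root_fields is not None:
        configuration["custom_root_fields"] = custom_root_fields
    if custom_column_names is not None:
        configuration["custom_column_names"] = custom_column_names
    return {
        "type": "set_table_customization",
        "args": {"table": table, "configuration": configuration},
    }

payload = set_table_customization_payload(
    "author_details",
    custom_name="author",
    custom_column_names={"id": "authorId"},
)
print(json.dumps(payload, indent=2))
```

The resulting dictionary can then be POSTed to the ``/v1/query`` endpoint with an admin role, as in the HTTP example above.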
.. _set_table_customization_syntax:

Args syntax
^^^^^^^^^^^
.. list-table::
   :header-rows: 1

   * - Key
     - Required
     - Schema
     - Description
   * - table
     - true
     - :ref:`TableName <TableName>`
     - Name of the table
   * - configuration
     - false
     - :ref:`TableConfig <table_config>`
     - Configuration for the table/view

.. _untrack_table:
@ -38,6 +38,7 @@ module Hasura.Backends.Postgres.SQL.Types
   , qualifiedObjectToText
   , snakeCaseQualifiedObject
   , qualifiedObjectToName
+  , isGraphQLCompliantTableName

   , PGScalarType(..)
   , WithScalarType(..)
@ -241,6 +242,9 @@ qualifiedObjectToName objectName = do
       "cannot include " <> objectName <<> " in the GraphQL schema because " <> textName
       <<> " is not a valid GraphQL identifier"

+isGraphQLCompliantTableName :: ToTxt a => QualifiedObject a -> Bool
+isGraphQLCompliantTableName = isJust . G.mkName . snakeCaseQualifiedObject
+
 type QualifiedTable = QualifiedObject TableName

 type QualifiedFunction = QualifiedObject FunctionName
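`isGraphQLCompliantTableName` checks whether the snake-cased qualified table name parses as a GraphQL `Name`. A rough Python equivalent, using the `Name` production from the GraphQL June 2018 spec (`/[_A-Za-z][_0-9A-Za-z]*/`):

```python
import re

# A name is GraphQL compliant when the whole string matches the Name
# production from the GraphQL spec: /[_A-Za-z][_0-9A-Za-z]*/.
# This sketches the check done by isGraphQLCompliantTableName above.
GRAPHQL_NAME = re.compile(r"[_A-Za-z][_0-9A-Za-z]*\Z")

def is_graphql_compliant(snake_cased_name):
    return GRAPHQL_NAME.match(snake_cased_name) is not None

print(is_graphql_compliant("users_address"))   # True
print(is_graphql_compliant("Users Address"))   # False: contains a space
```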
@ -5,6 +5,7 @@ import Hasura.Prelude

 import qualified Data.HashMap.Strict           as Map
 import qualified Language.Haskell.TH           as TH
+import qualified Language.GraphQL.Draft.Syntax as G

 import Data.Has
 import Data.Parser.JSONPath
@ -16,7 +17,7 @@ import Type.Reflection (Typeable)
 import Hasura.Backends.Postgres.SQL.Types
 import {-# SOURCE #-} Hasura.GraphQL.Parser.Internal.Parser
 import Hasura.RQL.Types.Error
-import Hasura.RQL.Types.Table (TableCache, TableInfo)
+import Hasura.RQL.Types.Table
 import Hasura.SQL.Backend
 import Hasura.Session (RoleName)

@ -124,6 +125,23 @@ askTableInfo tableName = do
   -- supposed to ensure that all dependencies are resolved.
   tableInfo `onNothing` throw500 ("askTableInfo: no info for " <>> tableName)

+-- | Helper function to get a table's GraphQL name. A table may have a custom
+-- name configured for it; when it does, the GraphQL nodes are generated
+-- according to that name. For example, given a table called `users address`,
+-- whose name is not GraphQL compliant, we can configure the table with a
+-- GraphQL compliant custom name such as `users_address`. The generated
+-- top-level nodes of this table will then be `users_address`,
+-- `insert_users_address`, etc.
+getTableGQLName
+  :: MonadTableInfo r m
+  => QualifiedTable
+  -> m G.Name
+getTableGQLName table = do
+  tableInfo <- askTableInfo table
+  let tableCustomName = _tcCustomName . _tciCustomConfig . _tiCoreInfo $ tableInfo
+  maybe (qualifiedObjectToName table) pure tableCustomName
+
 -- | A wrapper around 'memoizeOn' that memoizes a function by using its argument
 -- as the key.
 memoize
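The fallback behaviour of `getTableGQLName` — use the configured custom name when present, otherwise the table's own name — can be sketched as (illustrative Python, not Hasura code):

```python
# Sketch of getTableGQLName's fallback logic: prefer the configured custom
# name; otherwise fall back to the table's own (snake-cased) name, which is
# what qualifiedObjectToName produces in the Haskell code above.
def table_gql_name(table_name, custom_name=None):
    return custom_name if custom_name is not None else table_name

print(table_gql_name("author"))                          # author
print(table_gql_name("users address", "users_address"))  # users_address
```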
@ -73,7 +73,17 @@ buildGQLContext =
     tableFilter    = not . isSystemDefined . _tciSystemDefined
     functionFilter = not . isSystemDefined . fiSystemDefined

+    graphQLTableFilter tableName tableInfo =
+      -- either the table name should be GraphQL compliant
+      -- or it should have a GraphQL custom name set with it
+      isGraphQLCompliantTableName tableName
+      || (isJust . _tcCustomName . _tciCustomConfig . _tiCoreInfo $ tableInfo)

     validTables = Map.filter (tableFilter . _tiCoreInfo) allTables
+    -- Only tables that have GraphQL compliant names will be added to the schema.
+    -- We allow tables which don't have GraphQL compliant names, so that RQL CRUD
+    -- operations can be performed on them.
+    graphQLTables = Map.filterWithKey graphQLTableFilter validTables
     validFunctions = Map.elems $ Map.filter functionFilter allFunctions

     allActionInfos = Map.elems allActions
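The `graphQLTableFilter` logic above — expose a table in the GraphQL schema if its name is GraphQL compliant, or if a custom name is configured for it — can be sketched as (illustrative Python, not Hasura code):

```python
import re

# GraphQL Name production from the spec: /[_A-Za-z][_0-9A-Za-z]*/.
GRAPHQL_NAME = re.compile(r"[_A-Za-z][_0-9A-Za-z]*\Z")

# Sketch of graphQLTableFilter: a table is added to the GraphQL schema if
# its name is GraphQL compliant OR a custom name is configured for it.
# Non-compliant tables remain tracked so RQL CRUD still works on them.
def include_in_graphql_schema(table_name, custom_name=None):
    return GRAPHQL_NAME.match(table_name) is not None or custom_name is not None

tables = {"authors": None, "Users Address": "users_address", "my table": None}
graphql_tables = {t: c for t, c in tables.items()
                  if include_in_graphql_schema(t, c)}
print(sorted(graphql_tables))  # ['Users Address', 'authors']
```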
@ -87,12 +97,12 @@ buildGQLContext =
       SQLGenCtx{ stringifyNum } <- askSQLGenCtx
       let gqlContext =
             (,)
-            <$> queryWithIntrospection (Set.fromMap $ validTables $> ())
+            <$> queryWithIntrospection (Set.fromMap $ graphQLTables $> ())
                   validFunctions mempty mempty
                   allActionInfos nonObjectCustomTypes
-            <*> mutation (Set.fromMap $ validTables $> ()) mempty
+            <*> mutation (Set.fromMap $ graphQLTables $> ()) mempty
                   allActionInfos nonObjectCustomTypes
-      flip runReaderT (adminRoleName, validTables, Frontend, QueryContext stringifyNum queryType queryRemotesMap) $
+      flip runReaderT (adminRoleName, graphQLTables, Frontend, QueryContext stringifyNum queryType queryRemotesMap) $
         P.runSchemaT gqlContext

     -- build the admin context so that we can check against name clashes with remotes
@ -164,19 +174,19 @@ buildGQLContext =
     queryRemotes = concatMap (piQuery . snd) remotes
     mutationRemotes = concatMap (concat . piMutation . snd) remotes
     queryHasuraOrRelay = case queryType of
-      QueryHasura -> queryWithIntrospection (Set.fromMap $ validTables $> ())
+      QueryHasura -> queryWithIntrospection (Set.fromMap $ graphQLTables $> ())
                        validFunctions queryRemotes mutationRemotes
                        allActionInfos nonObjectCustomTypes
-      QueryRelay  -> relayWithIntrospection (Set.fromMap $ validTables $> ()) validFunctions
+      QueryRelay  -> relayWithIntrospection (Set.fromMap $ graphQLTables $> ()) validFunctions

   buildContextForRoleAndScenario :: RoleName -> Scenario -> m GQLContext
   buildContextForRoleAndScenario roleName scenario = do
     SQLGenCtx{ stringifyNum } <- askSQLGenCtx
     let gqlContext = GQLContext
           <$> (finalizeParser <$> queryHasuraOrRelay)
-          <*> (fmap finalizeParser <$> mutation (Set.fromList $ Map.keys validTables) mutationRemotes
+          <*> (fmap finalizeParser <$> mutation (Set.fromList $ Map.keys graphQLTables) mutationRemotes
                 allActionInfos nonObjectCustomTypes)
-    flip runReaderT (roleName, validTables, scenario, QueryContext stringifyNum queryType queryRemotesMap) $
+    flip runReaderT (roleName, graphQLTables, scenario, QueryContext stringifyNum queryType queryRemotesMap) $
       P.runSchemaT gqlContext

   buildContextForRole :: RoleName -> m (RoleContext GQLContext)
@ -216,14 +226,14 @@ query' allTables allFunctions allRemotes allActions nonObjectCustomTypes = do
     selectPerms <- tableSelectPermissions table
     customRootFields <- _tcCustomRootFields . _tciCustomConfig . _tiCoreInfo <$> askTableInfo table
     for selectPerms \perms -> do
-      displayName <- qualifiedObjectToName table
+      tableGQLName <- getTableGQLName table
       let fieldsDesc = G.Description $ "fetch data from the table: " <>> table
-          aggName = displayName <> $$(G.litName "_aggregate")
+          aggName = tableGQLName <> $$(G.litName "_aggregate")
           aggDesc = G.Description $ "fetch aggregated fields from the table: " <>> table
-          pkName = displayName <> $$(G.litName "_by_pk")
+          pkName = tableGQLName <> $$(G.litName "_by_pk")
           pkDesc = G.Description $ "fetch data from the table: " <> table <<> " using primary key columns"
       catMaybes <$> sequenceA
-        [ requiredFieldParser (RFDB . QDBSimple) $ selectTable table (fromMaybe displayName $ _tcrfSelect customRootFields) (Just fieldsDesc) perms
+        [ requiredFieldParser (RFDB . QDBSimple) $ selectTable table (fromMaybe tableGQLName $ _tcrfSelect customRootFields) (Just fieldsDesc) perms
         , mapMaybeFieldParser (RFDB . QDBPrimaryKey) $ selectTableByPk table (fromMaybe pkName $ _tcrfSelectByPk customRootFields) (Just pkDesc) perms
         , mapMaybeFieldParser (RFDB . QDBAggregation) $ selectTableAggregate table (fromMaybe aggName $ _tcrfSelectAggregate customRootFields) (Just aggDesc) perms
         ]
@ -232,12 +242,12 @@ query' allTables allFunctions allRemotes allActions nonObjectCustomTypes = do
         functionName = fiName function
     selectPerms <- tableSelectPermissions targetTable
     for selectPerms \perms -> do
-      displayName <- qualifiedObjectToName functionName
+      tableGQLName <- qualifiedObjectToName functionName
       let functionDesc = G.Description $ "execute function " <> functionName <<> " which returns " <>> targetTable
-          aggName = displayName <> $$(G.litName "_aggregate")
+          aggName = tableGQLName <> $$(G.litName "_aggregate")
           aggDesc = G.Description $ "execute function " <> functionName <<> " and query aggregates on result of table type " <>> targetTable
       catMaybes <$> sequenceA
-        [ requiredFieldParser (RFDB . QDBSimple) $ selectFunction function displayName (Just functionDesc) perms
+        [ requiredFieldParser (RFDB . QDBSimple) $ selectFunction function tableGQLName (Just functionDesc) perms
         , mapMaybeFieldParser (RFDB . QDBAggregation) $ selectFunctionAggregate function aggName (Just aggDesc) perms
         ]
   actionParsers <- for allActions $ \actionInfo ->
@ -275,8 +285,8 @@ relayQuery' allTables allFunctions = do
     pkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns)
                   <$> askTableInfo table
     selectPerms <- MaybeT $ tableSelectPermissions table
-    displayName <- qualifiedObjectToName table
-    let fieldName = displayName <> $$(G.litName "_connection")
+    tableGQLName <- getTableGQLName table
+    let fieldName = tableGQLName <> $$(G.litName "_connection")
         fieldDesc = Just $ G.Description $ "fetch data from the table: " <>> table
     lift $ selectTableConnection table fieldName fieldDesc pkeyColumns selectPerms

@ -287,8 +297,8 @@ relayQuery' allTables allFunctions = do
     pkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns)
                   <$> askTableInfo returnTable
     selectPerms <- MaybeT $ tableSelectPermissions returnTable
-    displayName <- qualifiedObjectToName functionName
-    let fieldName = displayName <> $$(G.litName "_connection")
+    tableGQLName <- qualifiedObjectToName functionName
+    let fieldName = tableGQLName <> $$(G.litName "_connection")
         fieldDesc = Just $ G.Description $ "execute function " <> functionName
                     <<> " which returns " <>> returnTable
     lift $ selectFunctionConnection function fieldName fieldDesc pkeyColumns selectPerms
@ -485,7 +495,7 @@ mutation
 mutation allTables allRemotes allActions nonObjectCustomTypes = do
   mutationParsers <- for (toList allTables) \table -> do
     tableCoreInfo <- _tiCoreInfo <$> askTableInfo table
-    displayName   <- qualifiedObjectToName table
+    tableGQLName  <- getTableGQLName table
     tablePerms    <- tablePermissions table
     for tablePerms \permissions -> do
       let customRootFields = _tcCustomRootFields $ _tciCustomConfig tableCoreInfo
@ -500,9 +510,9 @@ mutation allTables allRemotes allActions nonObjectCustomTypes = do
               then Nothing
               else return insertPermission
       inserts <- fmap join $ whenMaybe (isMutable viIsInsertable viewInfo) $ for scenarioInsertPermissionM \insertPerms -> do
-        let insertName = $$(G.litName "insert_") <> displayName
+        let insertName = $$(G.litName "insert_") <> tableGQLName
             insertDesc = G.Description $ "insert data into the table: " <>> table
-            insertOneName = $$(G.litName "insert_") <> displayName <> $$(G.litName "_one")
+            insertOneName = $$(G.litName "insert_") <> tableGQLName <> $$(G.litName "_one")
             insertOneDesc = G.Description $ "insert a single row into the table: " <>> table
         insert <- insertIntoTable table (fromMaybe insertName $ _tcrfInsert customRootFields) (Just insertDesc) insertPerms selectPerms (_permUpd permissions)
         -- select permissions are required for InsertOne: the
@ -514,9 +524,9 @@ mutation allTables allRemotes allActions nonObjectCustomTypes = do
         pure $ fmap (RFDB . MDBInsert) insert : maybe [] (pure . fmap (RFDB . MDBInsert)) insertOne

       updates <- fmap join $ whenMaybe (isMutable viIsUpdatable viewInfo) $ for (_permUpd permissions) \updatePerms -> do
-        let updateName = $$(G.litName "update_") <> displayName
+        let updateName = $$(G.litName "update_") <> tableGQLName
             updateDesc = G.Description $ "update data of the table: " <>> table
-            updateByPkName = $$(G.litName "update_") <> displayName <> $$(G.litName "_by_pk")
+            updateByPkName = $$(G.litName "update_") <> tableGQLName <> $$(G.litName "_by_pk")
             updateByPkDesc = G.Description $ "update single row of the table: " <>> table
         update <- updateTable table (fromMaybe updateName $ _tcrfUpdate customRootFields) (Just updateDesc) updatePerms selectPerms
         -- likewise; furthermore, primary keys can only be tested in
@ -527,9 +537,9 @@ mutation allTables allRemotes allActions nonObjectCustomTypes = do
         pure $ fmap (RFDB . MDBUpdate) <$> catMaybes [update, updateByPk]

       deletes <- fmap join $ whenMaybe (isMutable viIsDeletable viewInfo) $ for (_permDel permissions) \deletePerms -> do
-        let deleteName = $$(G.litName "delete_") <> displayName
+        let deleteName = $$(G.litName "delete_") <> tableGQLName
             deleteDesc = G.Description $ "delete data from the table: " <>> table
-            deleteByPkName = $$(G.litName "delete_") <> displayName <> $$(G.litName "_by_pk")
+            deleteByPkName = $$(G.litName "delete_") <> tableGQLName <> $$(G.litName "_by_pk")
             deleteByPkDesc = G.Description $ "delete single row from the table: " <>> table
         delete <- deleteFromTable table (fromMaybe deleteName $ _tcrfDelete customRootFields) (Just deleteDesc) deletePerms selectPerms
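Throughout this hunk, each generated default name can be overridden by a matching entry in `custom_root_fields` — the `fromMaybe insertName $ _tcrfInsert customRootFields` pattern. That precedence can be sketched as (illustrative Python, not Hasura code):

```python
# Sketch of how custom root fields take precedence over the generated
# default names (mirroring `fromMaybe defaultName $ _tcrfInsert ...`).
def root_field_name(custom_root_fields, key, default):
    # use the configured custom root field if present, else the default
    return custom_root_fields.get(key, default)

custom = {"insert": "AddAuthors"}
print(root_field_name(custom, "insert", "insert_author"))  # AddAuthors
print(root_field_name(custom, "update", "update_author"))  # update_author
```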
@ -36,7 +36,8 @@ boolExp
   -> Maybe (SelPermInfo 'Postgres)
   -> m (Parser 'Input n (AnnBoolExp 'Postgres UnpreparedValue))
 boolExp table selectPermissions = memoizeOn 'boolExp table $ do
-  name <- qualifiedObjectToName table <&> (<> $$(G.litName "_bool_exp"))
+  tableGQLName <- getTableGQLName table
+  let name = tableGQLName <> $$(G.litName "_bool_exp")
   let description = G.Description $
         "Boolean expression to filter rows from the table " <> table <<>
         ". All fields are combined with a logical 'AND'."
@ -114,7 +114,7 @@ tableFieldsInput
   -> InsPermInfo 'Postgres -- ^ insert permissions of the table
   -> m (Parser 'Input n (AnnInsObj 'Postgres UnpreparedValue))
 tableFieldsInput table insertPerms = memoizeOn 'tableFieldsInput table do
-  tableName <- qualifiedObjectToName table
+  tableGQLName <- getTableGQLName table
   allFields <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table
   objectFields <- catMaybes <$> for (Map.elems allFields) \case
     FIComputedField _ -> pure Nothing
@ -144,7 +144,7 @@ tableFieldsInput table insertPerms = memoizeOn 'tableFieldsInput table do
       pure $ P.fieldOptional relFieldName Nothing parser <&> \arrRelIns -> do
         rel <- join arrRelIns
         Just $ AnnInsObj [] [] [RelIns rel relationshipInfo | not $ null $ _aiInsObj rel]
-  let objectName = tableName <> $$(G.litName "_insert_input")
+  let objectName = tableGQLName <> $$(G.litName "_insert_input")
       objectDesc = G.Description $ "input type for inserting data into table " <>> table
   pure $ P.object objectName (Just objectDesc) $ catMaybes <$> sequenceA objectFields
     <&> mconcat
@ -159,12 +159,12 @@ objectRelationshipInput
   -> m (Parser 'Input n (SingleObjIns 'Postgres UnpreparedValue))
 objectRelationshipInput table insertPerms selectPerms updatePerms =
   memoizeOn 'objectRelationshipInput table do
-    tableName <- qualifiedObjectToName table
+    tableGQLName <- getTableGQLName table
     columns <- tableColumns table
     objectParser <- tableFieldsInput table insertPerms
     conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms
     let objectName = $$(G.litName "data")
-        inputName = tableName <> $$(G.litName "_obj_rel_insert_input")
+        inputName = tableGQLName <> $$(G.litName "_obj_rel_insert_input")
         inputDesc = G.Description $ "input type for inserting object relation for remote table " <>> table
         inputParser = do
           conflictClause <- mkConflictClause conflictParser
@ -182,12 +182,12 @@ arrayRelationshipInput
   -> m (Parser 'Input n (MultiObjIns 'Postgres UnpreparedValue))
 arrayRelationshipInput table insertPerms selectPerms updatePerms =
   memoizeOn 'arrayRelationshipInput table do
-    tableName <- qualifiedObjectToName table
+    tableGQLName <- getTableGQLName table
     columns <- tableColumns table
     objectParser <- tableFieldsInput table insertPerms
     conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms
     let objectsName = $$(G.litName "data")
-        inputName = tableName <> $$(G.litName "_arr_rel_insert_input")
+        inputName = tableGQLName <> $$(G.litName "_arr_rel_insert_input")
         inputDesc = G.Description $ "input type for inserting array relation for remote table " <>> table
         inputParser = do
           conflictClause <- mkConflictClause conflictParser
@ -224,12 +224,12 @@ conflictObject
   -> UpdPermInfo 'Postgres
   -> m (Maybe (Parser 'Input n (RQL.ConflictClauseP1 'Postgres UnpreparedValue)))
 conflictObject table selectPerms updatePerms = runMaybeT $ do
-  tableName <- lift $ qualifiedObjectToName table
+  tableGQLName <- getTableGQLName table
   columnsEnum <- MaybeT $ tableUpdateColumnsEnum table updatePerms
   constraints <- MaybeT $ tciUniqueOrPrimaryKeyConstraints . _tiCoreInfo <$> askTableInfo table
   constraintParser <- lift $ conflictConstraint constraints table
   whereExpParser <- lift $ boolExp table selectPerms
-  let objectName = tableName <> $$(G.litName "_on_conflict")
+  let objectName = tableGQLName <> $$(G.litName "_on_conflict")
       objectDesc = G.Description $ "on conflict condition type for table " <>> table
       constraintName = $$(G.litName "constraint")
       columnsName = $$(G.litName "update_columns")
@ -251,18 +251,17 @@ conflictConstraint
   -> QualifiedTable
   -> m (Parser 'Both n ConstraintName)
 conflictConstraint constraints table = memoizeOn 'conflictConstraint table $ do
-  tableName <- qualifiedObjectToName table
+  tableGQLName <- getTableGQLName table
   constraintEnumValues <- for constraints \constraint -> do
     name <- textToName $ getConstraintTxt $ _cName constraint
     pure ( P.mkDefinition name (Just "unique or primary key constraint") P.EnumValueInfo
          , _cName constraint
          )
-  let enumName = tableName <> $$(G.litName "_constraint")
+  let enumName = tableGQLName <> $$(G.litName "_constraint")
       enumDesc = G.Description $ "unique or primary key constraints on table " <>> table
   pure $ P.enum enumName (Just enumDesc) constraintEnumValues

-

 -- update

 -- | Construct a root field, normally called update_tablename, that can be used
@ -300,14 +299,14 @@ updateTableByPk
   -> SelPermInfo 'Postgres -- ^ select permissions of the table
   -> m (Maybe (FieldParser n (RQL.AnnUpdG 'Postgres UnpreparedValue)))
 updateTableByPk table fieldName description updatePerms selectPerms = runMaybeT $ do
-  tableName <- qualifiedObjectToName table
+  tableGQLName <- getTableGQLName table
   columns <- lift $ tableSelectColumns table selectPerms
   pkArgs <- MaybeT $ primaryKeysArguments table selectPerms
   opArgs <- MaybeT $ updateOperators table updatePerms
   selection <- lift $ tableSelectionSet table selectPerms
   let pkFieldName = $$(G.litName "pk_columns")
-      pkObjectName = tableName <> $$(G.litName "_pk_columns_input")
-      pkObjectDesc = G.Description $ "primary key columns input for table: " <> G.unName tableName
+      pkObjectName = tableGQLName <> $$(G.litName "_pk_columns_input")
+      pkObjectDesc = G.Description $ "primary key columns input for table: " <> G.unName tableGQLName
       argsParser = do
         operators <- opArgs
         primaryKeys <- P.field pkFieldName Nothing $ P.object pkObjectName (Just pkObjectDesc) pkArgs
@ -344,40 +343,40 @@ updateOperators
   -> UpdPermInfo 'Postgres -- ^ update permissions of the table
   -> m (Maybe (InputFieldsParser n [(PGCol, RQL.UpdOpExpG UnpreparedValue)]))
 updateOperators table updatePermissions = do
-  tableName <- qualifiedObjectToName table
+  tableGQLName <- getTableGQLName table
   columns <- tableUpdateColumns table updatePermissions
   let numericCols = onlyNumCols columns
       jsonCols = onlyJSONBCols columns
   parsers <- catMaybes <$> sequenceA
-    [ updateOperator tableName $$(G.litName "_set")
+    [ updateOperator tableGQLName $$(G.litName "_set")
         columnParser RQL.UpdSet columns
         "sets the columns of the filtered rows to the given values"
         (G.Description $ "input type for updating data in table " <>> table)

-    , updateOperator tableName $$(G.litName "_inc")
+    , updateOperator tableGQLName $$(G.litName "_inc")
        columnParser RQL.UpdInc numericCols
        "increments the numeric columns with given value of the filtered values"
        (G.Description $ "input type for incrementing numeric columns in table " <>> table)

    , let desc = "prepend existing jsonb value of filtered columns with new jsonb value"
-      in updateOperator tableName $$(G.litName "_prepend")
+      in updateOperator tableGQLName $$(G.litName "_prepend")
           columnParser RQL.UpdPrepend jsonCols desc desc

    , let desc = "append existing jsonb value of filtered columns with new jsonb value"
-      in updateOperator tableName $$(G.litName "_append")
+      in updateOperator tableGQLName $$(G.litName "_append")
           columnParser RQL.UpdAppend jsonCols desc desc

    , let desc = "delete key/value pair or string element. key/value pairs are matched based on their key value"
-      in updateOperator tableName $$(G.litName "_delete_key")
+      in updateOperator tableGQLName $$(G.litName "_delete_key")
           nullableTextParser RQL.UpdDeleteKey jsonCols desc desc

    , let desc = "delete the array element with specified index (negative integers count from the end). "
             <> "throws an error if top level container is not an array"
-      in updateOperator tableName $$(G.litName "_delete_elem")
+      in updateOperator tableGQLName $$(G.litName "_delete_elem")
           nonNullableIntParser RQL.UpdDeleteElem jsonCols desc desc

    , let desc = "delete the field or element with specified path (for JSON arrays, negative integers count from the end)"
-      in updateOperator tableName $$(G.litName "_delete_at_path")
+      in updateOperator tableGQLName $$(G.litName "_delete_at_path")
           (fmap P.list . nonNullableTextParser) RQL.UpdDeleteAtPath jsonCols desc desc
    ]
   whenMaybe (not $ null parsers) do
@ -414,7 +413,7 @@ updateOperators table updatePermissions = do
      -> G.Description
      -> G.Description
      -> m (Maybe (Text, InputFieldsParser n (Maybe [(PGCol, RQL.UpdOpExpG UnpreparedValue)])))
-   updateOperator tableName opName mkParser updOpExp columns opDesc objDesc =
+   updateOperator tableGQLName opName mkParser updOpExp columns opDesc objDesc =
      whenMaybe (not $ null columns) do
        fields <- for columns \columnInfo -> do
          let fieldName = pgiName columnInfo
@ -422,7 +421,7 @@ updateOperators table updatePermissions = do
          fieldParser <- mkParser columnInfo
          pure $ P.fieldOptional fieldName fieldDesc fieldParser
            `mapField` \value -> (pgiColumn columnInfo, updOpExp value)
-       let objName = tableName <> opName <> $$(G.litName "_input")
+       let objName = tableGQLName <> opName <> $$(G.litName "_input")
        pure $ (G.unName opName,)
          $ P.fieldOptional opName (Just opDesc)
          $ P.object objName (Just objDesc)
@ -496,7 +495,7 @@ mutationSelectionSet
   -> m (Parser 'Output n (RQL.MutFldsG 'Postgres UnpreparedValue))
 mutationSelectionSet table selectPerms =
   memoizeOn 'mutationSelectionSet table do
-    tableName <- qualifiedObjectToName table
+    tableGQLName <- getTableGQLName table
     returning <- runMaybeT do
       permissions <- MaybeT $ pure selectPerms
       tableSet <- lift $ tableSelectionList table permissions
@ -505,7 +504,7 @@ mutationSelectionSet table selectPerms =
       pure $ RQL.MRet <$> P.subselection_ returningName (Just returningDesc) tableSet
     let affectedRowsName = $$(G.litName "affected_rows")
         affectedRowsDesc = "number of rows affected by the mutation"
-        selectionName = tableName <> $$(G.litName "_mutation_response")
+        selectionName = tableGQLName <> $$(G.litName "_mutation_response")
         selectionDesc = G.Description $ "response of any mutation on the table " <>> table

         selectionFields = catMaybes
@@ -37,7 +37,8 @@ orderByExp
 -> SelPermInfo 'Postgres
 -> m (Parser 'Input n [RQL.AnnOrderByItemG 'Postgres UnpreparedValue])
 orderByExp table selectPermissions = memoizeOn 'orderByExp table $ do
-name <- qualifiedObjectToName table <&> (<> $$(G.litName "_order_by"))
+tableGQLName <- getTableGQLName table
+let name = tableGQLName <> $$(G.litName "_order_by")
 let description = G.Description $
 "Ordering options when selecting data from " <> table <<> "."
 tableFields <- tableSelectFields table selectPermissions
@@ -90,8 +91,8 @@ orderByAggregation table selectPermissions = do
 -- there is heavy duplication between this and Select.tableAggregationFields
 -- it might be worth putting some of it in common, just to avoid issues when
 -- we change one but not the other?
-tableName <- qualifiedObjectToName table
-allColumns <- tableSelectColumns table selectPermissions
+tableGQLName <- getTableGQLName table
+allColumns <- tableSelectColumns table selectPermissions
 let numColumns = onlyNumCols allColumns
 compColumns = onlyComparableCols allColumns
 numFields = catMaybes <$> traverse mkField numColumns
@@ -103,13 +104,13 @@ orderByAggregation table selectPermissions = do
 , -- operators on numeric columns
 if null numColumns then Nothing else Just $
 for numericAggOperators \operator ->
-parseOperator operator tableName numFields
+parseOperator operator tableGQLName numFields
 , -- operators on comparable columns
 if null compColumns then Nothing else Just $
 for comparisonAggOperators \operator ->
-parseOperator operator tableName compFields
+parseOperator operator tableGQLName compFields
 ]
-let objectName = tableName <> $$(G.litName "_aggregate_order_by")
+let objectName = tableGQLName <> $$(G.litName "_aggregate_order_by")
 description = G.Description $ "order by aggregate values of table " <>> table
 pure $ P.object objectName (Just description) aggFields
 where
@@ -123,15 +124,13 @@ orderByAggregation table selectPermissions = do
 -> G.Name
 -> InputFieldsParser n [(ColumnInfo 'Postgres, OrderInfo)]
 -> InputFieldsParser n (Maybe [OrderByItemG (RQL.AnnAggregateOrderBy 'Postgres)])
-parseOperator operator tableName columns =
+parseOperator operator tableGQLName columns =
 let opText = G.unName operator
-objectName = tableName <> $$(G.litName "_") <> operator <> $$(G.litName "_order_by")
+objectName = tableGQLName <> $$(G.litName "_") <> operator <> $$(G.litName "_order_by")
 objectDesc = Just $ G.Description $ "order by " <> opText <> "() on columns of table " <>> table
 in P.fieldOptional operator Nothing (P.object objectName objectDesc columns)
 `mapField` map (\(col, info) -> mkOrderByItemG (RQL.AAOOp opText col) info)


 orderByOperator :: MonadParse m => Parser 'Both m (Maybe OrderInfo)
 orderByOperator =
 P.nullable $ P.enum $$(G.litName "order_by") (Just "column ordering options") $ NE.fromList
@@ -207,11 +207,11 @@ selectTableAggregate
 selectTableAggregate table fieldName description selectPermissions = runMaybeT do
 guard $ spiAllowAgg selectPermissions
 stringifyNum <- asks $ qcStringifyNum . getter
-tableName <- lift $ qualifiedObjectToName table
+tableGQLName <- lift $ getTableGQLName table
 tableArgsParser <- lift $ tableArgs table selectPermissions
 aggregateParser <- lift $ tableAggregationFields table selectPermissions
 nodesParser <- lift $ tableSelectionList table selectPermissions
-let selectionName = tableName <> $$(G.litName "_aggregate")
+let selectionName = tableGQLName <> $$(G.litName "_aggregate")
 aggregationParser = P.nonNullableParser $
 parsedSelectionsToFields RQL.TAFExp <$>
 P.selectionSet selectionName (Just $ G.Description $ "aggregated selection of " <>> table)
@@ -300,7 +300,7 @@ tableSelectionSet
 -> m (Parser 'Output n (AnnotatedFields 'Postgres))
 tableSelectionSet table selectPermissions = memoizeOn 'tableSelectionSet table do
 tableInfo <- _tiCoreInfo <$> askTableInfo table
-tableName <- qualifiedObjectToName table
+tableGQLName <- getTableGQLName table
 let tableFields = Map.elems $ _tciFieldInfoMap tableInfo
 tablePkeyColumns = _pkColumns <$> _tciPrimaryKey tableInfo
 description = Just $ mkDescriptionWith (_tciDescription tableInfo) $
@@ -324,10 +324,10 @@ tableSelectionSet table selectPermissions = memoizeOn 'tableSelectionSet table d
 P.selection_ $$(G.litName "id") Nothing P.identifier $> RQL.AFNodeId table pkeyColumns
 allFieldParsers = fieldParsers <> [nodeIdFieldParser]
 nodeInterface <- node
-pure $ P.selectionSetObject tableName description allFieldParsers [nodeInterface]
+pure $ P.selectionSetObject tableGQLName description allFieldParsers [nodeInterface]
 <&> parsedSelectionsToFields RQL.AFExpression
 _ ->
-pure $ P.selectionSetObject tableName description fieldParsers []
+pure $ P.selectionSetObject tableGQLName description fieldParsers []
 <&> parsedSelectionsToFields RQL.AFExpression

 -- | List of table fields object.
@@ -378,9 +378,9 @@ tableConnectionSelectionSet
 -> SelPermInfo 'Postgres
 -> m (Parser 'Output n (RQL.ConnectionFields 'Postgres UnpreparedValue))
 tableConnectionSelectionSet table selectPermissions = do
-tableName <- qualifiedObjectToName table
-edgesParser <- tableEdgesSelectionSet
-let connectionTypeName = tableName <> $$(G.litName "Connection")
+edgesParser <- tableEdgesSelectionSet
+tableGQLName <- getTableGQLName table
+let connectionTypeName = tableGQLName <> $$(G.litName "Connection")
 pageInfo = P.subselection_ $$(G.litName "pageInfo") Nothing
 pageInfoSelectionSet <&> RQL.ConnectionPageInfo
 edges = P.subselection_ $$(G.litName "edges") Nothing
@@ -407,12 +407,13 @@ tableConnectionSelectionSet table selectPermissions = do
 in P.nonNullableParser $ P.selectionSet $$(G.litName "PageInfo") Nothing allFields
 <&> parsedSelectionsToFields RQL.PageInfoTypename


 tableEdgesSelectionSet
 :: m (Parser 'Output n (RQL.EdgeFields 'Postgres UnpreparedValue))
 tableEdgesSelectionSet = do
-tableName <- qualifiedObjectToName table
+tableGQLName <- getTableGQLName table
 edgeNodeParser <- P.nonNullableParser <$> tableSelectionSet table selectPermissions
-let edgesType = tableName <> $$(G.litName "Edge")
+let edgesType = tableGQLName <> $$(G.litName "Edge")
 cursor = P.selection_ $$(G.litName "cursor") Nothing
 P.string $> RQL.EdgeCursor
 edgeNode = P.subselection_ $$(G.litName "node") Nothing
@@ -455,10 +456,11 @@ selectFunctionAggregate function fieldName description selectPermissions = runMa
 let table = fiReturnType function
 stringifyNum <- asks $ qcStringifyNum . getter
 guard $ spiAllowAgg selectPermissions
+tableGQLName <- getTableGQLName table
 tableArgsParser <- lift $ tableArgs table selectPermissions
 functionArgsParser <- lift $ customSQLFunctionArgs function
 aggregateParser <- lift $ tableAggregationFields table selectPermissions
-selectionName <- lift $ qualifiedObjectToName table <&> (<> $$(G.litName "_aggregate"))
+selectionName <- lift $ pure tableGQLName <&> (<> $$(G.litName "_aggregate"))
 nodesParser <- lift $ tableSelectionList table selectPermissions
 let argsParser = liftA2 (,) functionArgsParser tableArgsParser
 aggregationParser = fmap (parsedSelectionsToFields RQL.TAFExp) $
@@ -785,11 +787,11 @@ tableAggregationFields
 -> SelPermInfo 'Postgres
 -> m (Parser 'Output n (RQL.AggregateFields 'Postgres))
 tableAggregationFields table selectPermissions = do
-tableName <- qualifiedObjectToName table
+tableGQLName <- getTableGQLName table
 allColumns <- tableSelectColumns table selectPermissions
 let numericColumns = onlyNumCols allColumns
 comparableColumns = onlyComparableCols allColumns
-selectName = tableName <> $$(G.litName "_aggregate_fields")
+selectName = tableGQLName <> $$(G.litName "_aggregate_fields")
 description = G.Description $ "aggregate fields of " <>> table
 count <- countField
 numericAndComparable <- fmap concat $ sequenceA $ catMaybes
@@ -797,12 +799,12 @@ tableAggregationFields table selectPermissions = do
 if null numericColumns then Nothing else Just $
 for numericAggOperators $ \operator -> do
 numFields <- mkNumericAggFields operator numericColumns
-pure $ parseAggOperator operator tableName numFields
+pure $ parseAggOperator operator tableGQLName numFields
 , -- operators on comparable columns
 if null comparableColumns then Nothing else Just $ do
 comparableFields <- traverse mkColumnAggField comparableColumns
 pure $ comparisonAggOperators & map \operator ->
-parseAggOperator operator tableName comparableFields
+parseAggOperator operator tableGQLName comparableFields
 ]
 let aggregateFields = count : numericAndComparable
 pure $ P.selectionSet selectName (Just description) aggregateFields
@@ -841,9 +843,9 @@ tableAggregationFields table selectPermissions = do
 -> G.Name
 -> [FieldParser n (RQL.ColFld 'Postgres)]
 -> FieldParser n (RQL.AggregateField 'Postgres)
-parseAggOperator operator tableName columns =
+parseAggOperator operator tableGQLName columns =
 let opText = G.unName operator
-setName = tableName <> $$(G.litName "_") <> operator <> $$(G.litName "_fields")
+setName = tableGQLName <> $$(G.litName "_") <> operator <> $$(G.litName "_fields")
 setDesc = Just $ G.Description $ "aggregate " <> opText <> " on columns"
 subselectionParser = P.selectionSet setName setDesc columns
 <&> parsedSelectionsToFields RQL.CFExp
@@ -42,9 +42,9 @@ tableSelectColumnsEnum
 -> SelPermInfo 'Postgres
 -> m (Maybe (Parser 'Both n PGCol))
 tableSelectColumnsEnum table selectPermissions = do
-tableName <- qualifiedObjectToName table
-columns <- tableSelectColumns table selectPermissions
-let enumName = tableName <> $$(G.litName "_select_column")
+tableGQLName <- getTableGQLName table
+columns <- tableSelectColumns table selectPermissions
+let enumName = tableGQLName <> $$(G.litName "_select_column")
 description = Just $ G.Description $
 "select columns of table " <>> table
 pure $ P.enum enumName description <$> nonEmpty
@@ -71,9 +71,9 @@ tableUpdateColumnsEnum
 -> UpdPermInfo 'Postgres
 -> m (Maybe (Parser 'Both n PGCol))
 tableUpdateColumnsEnum table updatePermissions = do
-tableName <- qualifiedObjectToName table
-columns <- tableUpdateColumns table updatePermissions
-let enumName = tableName <> $$(G.litName "_update_column")
+tableGQLName <- getTableGQLName table
+columns <- tableUpdateColumns table updatePermissions
+let enumName = tableGQLName <> $$(G.litName "_update_column")
 description = Just $ G.Description $
 "update columns of table " <>> table
 pure $ P.enum enumName description <$> nonEmpty
@@ -476,12 +476,12 @@ updateColMap fromQT toQT rnCol =
 possiblyUpdateCustomColumnNames
 :: MonadTx m => QualifiedTable -> PGCol -> PGCol -> m ()
 possiblyUpdateCustomColumnNames qt oCol nCol = do
-TableConfig customRootFields customColumns <- getTableConfig qt
+TableConfig customRootFields customColumns identifier <- getTableConfig qt
 let updatedCustomColumns =
 M.fromList $ flip map (M.toList customColumns) $
 \(dbCol, val) -> (, val) $ if dbCol == oCol then nCol else dbCol
 when (updatedCustomColumns /= customColumns) $
-updateTableConfig qt $ TableConfig customRootFields updatedCustomColumns
+updateTableConfig qt $ TableConfig customRootFields updatedCustomColumns identifier

 -- database functions for relationships
 getRelDef :: QualifiedTable -> RelName -> Q.TxE QErr Value
@@ -17,6 +17,9 @@ module Hasura.RQL.DDL.Schema.Table
 , SetTableCustomFields(..)
 , runSetTableCustomFieldsQV2
+
+, SetTableCustomization(..)
+, runSetTableCustomization

 , buildTableCache
 , delTableAndDirectDeps
 , processTableChanges
@@ -196,6 +199,13 @@ runSetExistingTableIsEnumQ (SetTableIsEnum tableName isEnum) = do
 buildSchemaCacheFor (MOTable tableName)
 return successMsg

+data SetTableCustomization
+= SetTableCustomization
+{ _stcTable :: !QualifiedTable
+, _stcConfiguration :: !TableConfig
+} deriving (Show, Eq, Lift)
+$(deriveJSON (aesonDrop 4 snakeCase) ''SetTableCustomization)
+
 data SetTableCustomFields
 = SetTableCustomFields
 { _stcfTable :: !QualifiedTable
@@ -215,10 +225,19 @@ runSetTableCustomFieldsQV2
 :: (MonadTx m, CacheRWM m) => SetTableCustomFields -> m EncJSON
 runSetTableCustomFieldsQV2 (SetTableCustomFields tableName rootFields columnNames) = do
 void $ askTabInfo tableName -- assert that table is tracked
-updateTableConfig tableName (TableConfig rootFields columnNames)
+-- `Identifier` is set to `Nothing` below because this API doesn't accept it
+updateTableConfig tableName (TableConfig rootFields columnNames Nothing)
 buildSchemaCacheFor (MOTable tableName)
 return successMsg

+runSetTableCustomization
+:: (MonadTx m, CacheRWM m) => SetTableCustomization -> m EncJSON
+runSetTableCustomization (SetTableCustomization table config) = do
+void $ askTabInfo table
+updateTableConfig table config
+buildSchemaCacheFor (MOTable table)
+return successMsg
+
 unTrackExistingTableOrViewP1
 :: (CacheRM m, QErrM m) => UntrackTable -> m ()
 unTrackExistingTableOrViewP1 (UntrackTable vn _) = do
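`runSetTableCustomization` above backs a new metadata API. A minimal invocation against `/v1/query` might look like the following sketch (the table and all custom names here are illustrative, not taken from the commit):

```yaml
type: set_table_customization
args:
  table: author details
  configuration:
    custom_name: author_details
    custom_root_fields:
      select: authors
    custom_column_names:
      id: author_id
```

Unlike the deprecated `set_table_custom_fields` (v2), this call accepts the whole `TableConfig`, including `custom_name`.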
@@ -271,7 +290,8 @@ processTableChanges ti tableDiff = do
 procAlteredCols sc tn

 withNewTabName newTN = do
-let tnGQL = snakeCaseQualifiedObject newTN
+let customTableNameText = G.unName <$> (_tcCustomName . _tciCustomConfig $ ti)
+tnGQL = fromMaybe (snakeCaseQualifiedObject newTN) customTableNameText
 -- check for GraphQL schema conflicts on new name
 checkConflictingNode sc tnGQL
 procAlteredCols sc tn
@@ -287,10 +307,10 @@ processTableChanges ti tableDiff = do
 TableDiff mNewName droppedCols _ alteredCols _ computedFieldDiff _ _ = tableDiff

 possiblyDropCustomColumnNames tn = do
-let TableConfig customFields customColumnNames = _tciCustomConfig ti
+let TableConfig customFields customColumnNames identifier = _tciCustomConfig ti
 modifiedCustomColumnNames = foldl' (flip Map.delete) customColumnNames droppedCols
 when (modifiedCustomColumnNames /= customColumnNames) $
-liftTx $ updateTableConfig tn $ TableConfig customFields modifiedCustomColumnNames
+liftTx $ updateTableConfig tn $ TableConfig customFields modifiedCustomColumnNames identifier

 procAlteredCols sc tn = for_ alteredCols $
 \( RawColumnInfo oldName _ oldType _ _
@@ -392,20 +392,22 @@ data TableConfig
 = TableConfig
 { _tcCustomRootFields :: !TableCustomRootFields
 , _tcCustomColumnNames :: !CustomColumnNames
+, _tcCustomName :: !(Maybe G.Name)
 } deriving (Show, Eq, Lift, Generic)
 instance NFData TableConfig
 instance Cacheable TableConfig
-$(deriveToJSON (aesonDrop 3 snakeCase) ''TableConfig)
+$(deriveToJSON (aesonDrop 3 snakeCase){omitNothingFields=True} ''TableConfig)

 emptyTableConfig :: TableConfig
 emptyTableConfig =
-TableConfig emptyCustomRootFields M.empty
+TableConfig emptyCustomRootFields M.empty Nothing

 instance FromJSON TableConfig where
 parseJSON = withObject "TableConfig" $ \obj ->
 TableConfig
 <$> obj .:? "custom_root_fields" .!= emptyCustomRootFields
 <*> obj .:? "custom_column_names" .!= M.empty
+<*> obj .:? "custom_name"

 -- | The @field@ and @primaryKeyColumn@ type parameters vary as the schema cache is built and more
 -- information is accumulated. See 'TableRawInfo' and 'TableCoreInfo'.
@@ -435,7 +437,8 @@ type TableCoreInfo b = TableCoreInfoG (FieldInfo b) (ColumnInfo b)

 tciUniqueOrPrimaryKeyConstraints :: TableCoreInfoG a b -> Maybe (NonEmpty Constraint)
 tciUniqueOrPrimaryKeyConstraints info = NE.nonEmpty $
-maybeToList (_pkConstraint <$> _tciPrimaryKey info)
+maybeToList (_pkConstraint <$> _tciPrimaryKey info)
+<> toList (_tciUniqueConstraints info)

 data TableInfo (b :: Backend)
 = TableInfo
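Per the `FromJSON` instance above, every key of the configuration object is optional and defaults to empty. Tracking a table with a custom name from the start (the pattern the test fixtures below use; the names are illustrative) looks like:

```yaml
type: track_table
version: 2
args:
  table: author details
  configuration:
    custom_name: author_details
```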
@@ -129,11 +129,12 @@ data RQLQueryV1
 | RQDumpInternalState !DumpInternalState

 | RQSetCustomTypes !CustomTypes
+| RQSetTableCustomization !SetTableCustomization
 deriving (Show, Eq)

 data RQLQueryV2
 = RQV2TrackTable !TrackTableV2
-| RQV2SetTableCustomFields !SetTableCustomFields
+| RQV2SetTableCustomFields !SetTableCustomFields -- deprecated
 | RQV2TrackFunction !TrackFunctionV2
 deriving (Show, Eq)

@@ -304,6 +305,7 @@ queryModifiesSchemaCache (RQV1 qi) = case qi of

 RQDumpInternalState _ -> False
 RQSetCustomTypes _ -> True
+RQSetTableCustomization _ -> True

 RQBulk qs -> any queryModifiesSchemaCache qs

@@ -440,6 +442,7 @@ runQueryM env rq = withPathK "args" $ case rq of
 RQRunSql q -> runRunSQL q

 RQSetCustomTypes q -> runSetCustomTypes q
+RQSetTableCustomization q -> runSetTableCustomization q

 RQBulk qs -> encJFromList <$> indexedMapM (runQueryM env) qs

@@ -528,6 +531,7 @@ requiresAdmin = \case

 RQDumpInternalState _ -> True
 RQSetCustomTypes _ -> True
+RQSetTableCustomization _ -> True

 RQRunSql _ -> True

@@ -0,0 +1,104 @@
description: GraphQL introspection query
url: /v1/graphql
status: 200
query:
  query: |
    query IntrospectionQuery {
      __schema {
        queryType {
          name
        }
        mutationType {
          name
        }
        subscriptionType {
          name
        }
        types {
          ...FullType
        }
        directives {
          name
          description
          locations
          args {
            ...InputValue
          }
        }
      }
    }

    fragment FullType on __Type {
      kind
      name
      description
      fields(includeDeprecated: true) {
        name
        description
        args {
          ...InputValue
        }
        type {
          ...TypeRef
        }
        isDeprecated
        deprecationReason
      }
      inputFields {
        ...InputValue
      }
      interfaces {
        ...TypeRef
      }
      enumValues(includeDeprecated: true) {
        name
        description
        isDeprecated
        deprecationReason
      }
      possibleTypes {
        ...TypeRef
      }
    }

    fragment InputValue on __InputValue {
      name
      description
      type {
        ...TypeRef
      }
      defaultValue
    }

    fragment TypeRef on __Type {
      kind
      name
      ofType {
        kind
        name
        ofType {
          kind
          name
          ofType {
            kind
            name
            ofType {
              kind
              name
              ofType {
                kind
                name
                ofType {
                  kind
                  name
                  ofType {
                    kind
                    name
                  }
                }
              }
            }
          }
        }
      }
    }
@@ -0,0 +1,20 @@
type: bulk
args:

- type: run_sql
  args:
    sql: |
      CREATE TABLE "user address" (
          id serial primary key,
          name text unique,
          config jsonb
      );
      ALTER INDEX "user address_pkey" RENAME TO user_address_pkey;
      ALTER INDEX "user address_name_key" RENAME TO user_address_name_key;

- type: track_table
  version: 2
  args:
    table: user address
    configuration:
      custom_name: user_address
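With a fixture like the one above, the GraphQL schema exposes the table only under its custom name, so a query stanza in the same test format might look like this sketch (illustrative, not part of the commit):

```yaml
url: /v1/graphql
status: 200
query:
  query: |
    query {
      user_address {
        id
        name
      }
    }
```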
@@ -0,0 +1,4 @@
type: run_sql
args:
  sql: |
    DROP TABLE "user address" cascade;
@@ -0,0 +1,34 @@
description: deletes from table "author details" with custom name "author_details"
status: 200
url: /v1/graphql
response:
  data:
    delete_author_details:
      affected_rows: 1
      returning:
      - id: 1
        name: Author 1
    delete_author_details_by_pk:
      id: 2
      name: Author 2
query:
  query: |
    mutation {
      delete_author_details(
        where: {
          id: {_eq: 1}
        }
      ) {
        affected_rows
        returning {
          id
          name
        }
      }
      delete_author_details_by_pk(
        id: 2
      ) {
        id
        name
      }
    }
@@ -0,0 +1,41 @@
description: Insert into article table with a new author
url: /v1/graphql
status: 200
response:
  data:
    insert_article:
      affected_rows: 2
      returning:
      - title: Article 4 by new author
        content: Content for Article 4
        author:
          id: 3
          name: Author 3
query:
  query: |
    mutation {
      insert_article(
        objects: [
          {
            title: "Article 4 by new author",
            content: "Content for Article 4"
            author: {
              data: {
                id: 3
                name: "Author 3"
              }
            }
          }
        ]
      ){
        affected_rows
        returning{
          title
          content
          author{
            id
            name
          }
        }
      }
    }
@@ -0,0 +1,37 @@
description: Insert into "author details" table
url: /v1/graphql
status: 200
response:
  data:
    insert_author_details:
      affected_rows: 1
      returning:
      - id: 3
        name: Author 3
        phone: "1234567890"
    insert_author_details_one:
      id: 3
      name: Author 3 updated

query:
  query: |
    mutation {
      insert_author_details(
        objects: [{name: "Author 3", phone: "1234567890"}]
      ){
        affected_rows
        returning{
          id
          name
          phone
        }
      }

      insert_author_details_one(
        object: {name: "Author 3 updated", phone:"1234567890"},
        on_conflict: { constraint: author_details_phone_key, update_columns: [name]}
      ){
        id
        name
      }
    }
@@ -0,0 +1,48 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      -- create tables
      CREATE TABLE "author details" (
          id SERIAL PRIMARY KEY,
          name TEXT,
          phone TEXT UNIQUE
      );
      CREATE TABLE article (
          id SERIAL PRIMARY KEY,
          title TEXT,
          content TEXT,
          author_id INTEGER REFERENCES "author details"(id) ON DELETE CASCADE
      );
      ALTER INDEX "author details_pkey" RENAME TO author_details_pkey;
      ALTER INDEX "author details_phone_key" RENAME TO author_details_phone_key;

- type: track_table
  version: 2
  args:
    table: author details
    configuration:
      custom_name: author_details

- type: track_table
  version: 2
  args:
    table: article
    configuration: {}

- type: create_array_relationship
  args:
    table: author details
    name: articles
    using:
      foreign_key_constraint_on:
        table: article
        column: author_id

- type: create_object_relationship
  args:
    table: article
    name: author
    using:
      foreign_key_constraint_on: author_id
@@ -0,0 +1,8 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      DROP TABLE article;
      DROP TABLE "author details";
    cascade: true
@@ -0,0 +1,42 @@
description: updates in table "author details" with custom name "author_details"
status: 200
url: /v1/graphql
response:
  data:
    update_author_details:
      affected_rows: 1
      returning:
      - id: 1
        name: Author 1 Updated
    update_author_details_by_pk:
      id: 1
      name: Author 1 Updated Again
query:
  query: |
    mutation {
      update_author_details(
        _set: {
          name: "Author 1 Updated"
        },
        where: {
          id: {_eq: 1}
        }
      ) {
        affected_rows
        returning {
          id
          name
        }
      }
      update_author_details_by_pk(
        _set: {
          name: "Author 1 Updated Again"
        },
        pk_columns: {
          id: 1
        }
      ) {
        id
        name
      }
    }
@@ -0,0 +1,24 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      -- insert data
      INSERT INTO "author details" (name)
      VALUES ('Author 1'), ('Author 2')
      ;
      INSERT INTO article (title, content, author_id)
      VALUES
        ( 'Article 1'
        , 'Content for Article 1'
        , 1
        ),
        ( 'Article 2'
        , 'Content for Article 2'
        , 1
        ),
        ( 'Article 3'
        , 'Content for Article 3'
        , 2
        )
      ;
@@ -0,0 +1,9 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      DELETE FROM article;
      SELECT setval('article_id_seq', 1, FALSE);
      DELETE FROM "author details";
      SELECT setval('"author details_id_seq"', 1, FALSE);
@@ -0,0 +1,72 @@
description: Query author table with customised root field names
url: /v1/graphql
status: 200
response:
  data:
    author_details_by_pk:
      id: 1
      name: Author 1
    author_details:
    - id: 1
      name: Author 1
      articles:
      - id: 1
        title: Article 1
        content: Content for Article 1
        author_id: 1
      - id: 2
        title: Article 2
        content: Content for Article 2
        author_id: 1
    - id: 2
      name: Author 2
      articles:
      - id: 3
        title: Article 3
        content: Content for Article 3
        author_id: 2
    author_details_with_bool_exp:
    - id: 2
      name: Author 2
    author_details_aggregate:
      aggregate:
        count: 2
      nodes:
      - id: 1
        name: Author 1
      - id: 2
        name: Author 2
query:
  query: |
    query {
      author_details_by_pk(id: 1){
        id
        name
      }

      author_details{
        id
        name
        articles {
          id
          title
          content
          author_id
        }
      }

      author_details_with_bool_exp: author_details(where: {id: {_eq: 2}}){
        id
        name
      }

      author_details_aggregate{
        aggregate{
          count
        }
        nodes{
          id
          name
        }
      }
    }
@@ -0,0 +1,62 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      -- create tables
      CREATE TABLE "author details" (
          id SERIAL PRIMARY KEY,
          name TEXT UNIQUE
      );

      ALTER INDEX "author details_pkey" RENAME TO author_details_pkey;
      ALTER INDEX "author details_name_key" RENAME TO author_details_name_key;

      CREATE TABLE article (
          id SERIAL PRIMARY KEY,
          title TEXT,
          content TEXT,
          author_id INTEGER REFERENCES "author details"(id)
      );

      -- insert data
      INSERT INTO "author details" (name)
      VALUES ('Author 1'), ('Author 2')
      ;
      INSERT INTO article (title, content, author_id)
      VALUES
        ( 'Article 1'
        , 'Content for Article 1'
        , 1
        ),
        ( 'Article 2'
        , 'Content for Article 2'
        , 1
        ),
        ( 'Article 3'
        , 'Content for Article 3'
        , 2
        )
      ;

- type: track_table
  version: 2
  args:
    table: author details
    configuration:
      custom_name: author_details

- type: track_table
  version: 2
  args:
    table: article
    configuration: {}

- type: create_array_relationship
  args:
    table: author details
    name: articles
    using:
      foreign_key_constraint_on:
        table: article
        column: author_id
@@ -0,0 +1,8 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      DROP TABLE article;
      DROP TABLE "author details";
    cascade: true
@@ -112,6 +112,14 @@ response:
     name: test2
   column_mapping:
     id: id
+- table:
+    schema: public
+    name: user address
+  configuration:
+    custom_name: user_address
+    custom_root_fields: {}
+    custom_column_names: {}
+
 functions:
 - function:
     schema: public
@@ -191,6 +191,24 @@ args:
 - type: add_collection_to_allowlist
   args:
     collection: collection_1

+- type: add_collection_to_allowlist
+  args:
+    collection: collection_2
+
+- type: run_sql
+  args:
+    sql: |
+      CREATE TABLE "user address" (
+          id serial primary key,
+          name text,
+          address text
+      );
+      ALTER INDEX "user address_pkey" RENAME TO user_address_pkey;
+
+- type: track_table
+  version: 2
+  args:
+    table: user address
+    configuration:
+      custom_name: user_address
@@ -5,6 +5,7 @@ args:
     sql: |
       DROP TABLE test1 cascade;
       DROP TABLE test2 cascade;
+      DROP TABLE "user address" cascade;
     cascade: true
 - type: clear_metadata
   args: {}
@@ -0,0 +1,47 @@
- description: Set custom column names
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: author
      configuration:
        custom_root_fields:
          select: Authors
        custom_column_names:
          id: AuthorId
          name: AuthorName

- description: "Rename column 'id' and drop column 'name'"
  url: /v1/query
  status: 200
  response:
    result_type: CommandOk
    result: null
  query:
    type: run_sql
    args:
      sql: |
        ALTER TABLE author DROP COLUMN name;
        ALTER TABLE author RENAME COLUMN id to author_id;

- description: Test if custom column names are updated
  url: /v1/graphql
  status: 200
  response:
    data:
      Authors:
      - AuthorId: 1
        age: 23
      - AuthorId: 2
        age: null
  query:
    query: |
      query {
        Authors{
          AuthorId
          age
        }
      }
@@ -0,0 +1,71 @@
- description: Set custom column names for article table by swapping the column names
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: article
      configuration:
        custom_root_fields: {}
        custom_column_names:
          title: content
          content: title

- description: Perform graphql query
  url: /v1/graphql
  status: 200
  response:
    data:
      article:
      - id: 1
        title: Article 1 content
        content: Article 1 title
      - id: 2
        title: Article 2 content
        content: Article 2 title
  query:
    query: |
      query {
        article{
          id
          title
          content
        }
      }

- description: Unset the custom column names
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: article
      configuration:
        custom_root_fields: {}
        custom_column_names: {}

- description: Perform graphql query
  url: /v1/graphql
  status: 200
  response:
    data:
      article:
      - id: 1
        title: Article 1 title
        content: Article 1 content
      - id: 2
        title: Article 2 title
        content: Article 2 content
  query:
    query: |
      query {
        article{
          id
          title
          content
        }
      }
@@ -0,0 +1,32 @@
description: Set custom column names conflicting with existing relationship
url: /v1/query
status: 400
response:
  internal:
  - definition:
      using:
        foreign_key_constraint_on:
          column: author_id
          table:
            schema: public
            name: article
      name: articles
      comment:
      table:
        schema: public
        name: author
    reason: field definition conflicts with custom field name for postgres column "name"
    type: array_relation
  path: $.args
  error: cannot continue due to new inconsistent metadata
  code: unexpected
query:
  type: set_table_customization
  args:
    table: author
    configuration:
      custom_root_fields: {}
      custom_column_names:
        name: articles
        age: articles_aggregate
@@ -0,0 +1,13 @@
description: set custom table name which conflicts with other nodes
status: 500
url: /v1/query
response:
  error: "found duplicate fields in selection set: insert_article, insert_article_one, update_article, delete_article, update_article_by_pk, delete_article_by_pk"
  code: unexpected
  path: $.args
query:
  type: set_table_customization
  args:
    table: author
    configuration:
      custom_name: article # article table has already been tracked
@@ -0,0 +1,32 @@
description: Try to define a relationship with custom column name
url: /v1/query
status: 400
response:
  internal:
  - definition:
      using:
        foreign_key_constraint_on:
          column: author_id
          table:
            schema: public
            name: article
      name: AuthorId
      comment:
      table:
        schema: public
        name: author
    reason: field definition conflicts with custom field name for postgres column "id"
    type: array_relation
  path: $.args
  error: field definition conflicts with custom field name for postgres column "id"
  code: constraint-violation
query:
  type: create_array_relationship
  args:
    name: AuthorId
    table: author
    using:
      foreign_key_constraint_on:
        table: article
        column: author_id
@@ -0,0 +1,55 @@
- description: set table customization with custom name
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: author
      configuration:
        custom_name: author_alias
        custom_root_fields: {}
        custom_column_names: {}

- description: Check that above query has changed the schema
  url: /v1/graphql
  status: 200
  response:
    data:
      author_alias_by_pk:
        name: Clarke
  query:
    query: |
      query {
        author_alias_by_pk(id: 1){
          name
        }
      }

- description: rename the underlying table
  url: /v1/query
  status: 200
  response:
    result_type: CommandOk
    result: null
  query:
    type: run_sql
    args:
      sql: |
        ALTER TABLE author RENAME TO authors;

- description: Check if the earlier query works after renaming the table
  url: /v1/graphql
  status: 200
  response:
    data:
      author_alias_by_pk:
        name: Clarke
  query:
    query: |
      query {
        author_alias_by_pk(id: 1){
          name
        }
      }
@@ -0,0 +1,55 @@
- description: set table configuration with custom name
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_configuration
    args:
      table: author
      configuration:
        custom_name: author_alias
        custom_root_fields: {}
        custom_column_names: {}

- description: Check that above query has changed the schema
  url: /v1/graphql
  status: 200
  response:
    data:
      author_alias_by_pk:
        name: Clarke
  query:
    query: |
      query {
        author_alias_by_pk(id: 1){
          name
        }
      }

- description: rename the underlying table
  url: /v1/query
  status: 200
  response:
    result_type: CommandOk
    result: null
  query:
    type: run_sql
    args:
      sql: |
        ALTER TABLE author RENAME TO authors;

- description: Check if the earlier query works after renaming the table
  url: /v1/graphql
  status: 200
  response:
    data:
      author_alias_by_pk:
        name: Clarke
  query:
    query: |
      query {
        author_alias_by_pk(id: 1){
          name
        }
      }
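The two fixture files above are identical except for the request `type`: this test suite exercises the same table/configuration arguments through both `set_table_customization` and `set_table_configuration`. A sketch (payload shape taken from the fixtures; the helper function is illustrative, not part of the commit):

```python
def customization_request(request_type, table, configuration):
    """Build a /v1/query body for either customization request type used above."""
    assert request_type in ("set_table_customization", "set_table_configuration")
    return {"type": request_type,
            "args": {"table": table, "configuration": configuration}}

cfg = {"custom_name": "author_alias",
       "custom_root_fields": {},
       "custom_column_names": {}}
a = customization_request("set_table_customization", "author", cfg)
b = customization_request("set_table_configuration", "author", cfg)
# Only the outer type differs; the args are identical in these tests.
```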
@@ -0,0 +1,60 @@
- description: Set select_by_pk root field and customise name column
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: author
      configuration:
        custom_root_fields:
          select: Authors
          select_by_pk: Author
        custom_column_names:
          id: AuthorId
          name: AuthorName
          age: age

- description: Check that above query has changed the schema
  url: /v1/graphql
  status: 200
  response:
    data:
      Author:
        AuthorName: Clarke
  query:
    query: |
      query {
        Author(AuthorId: 1){
          AuthorName
        }
      }

- description: Unset select_by_pk root field and remove custom column names
  url: /v1/query
  status: 200
  response:
    message: success
  query:
    type: set_table_customization
    args:
      table: author
      configuration:
        custom_root_fields:
          select: Authors

- description: Check that above query has changed the schema
  url: /v1/graphql
  status: 200
  response:
    data:
      author_by_pk:
        name: Clarke
  query:
    query: |
      query {
        author_by_pk(id: 1){
          name
        }
      }
@@ -0,0 +1,18 @@
description: Set custom fields of table which does not exist
url: /v1/query
status: 400
response:
  path: "$.args"
  error: table "author1" does not exist
  code: not-exists
query:
  type: set_table_customization
  args:
    table: author1
    configuration:
      custom_root_fields:
        select: Authors
        select_by_pk: Author
      custom_column_names:
        id: AuthorId
        name: AuthorName
@@ -0,0 +1,51 @@
type: bulk
args:

- type: run_sql
  args:
    sql: |
      CREATE TABLE author (
        id SERIAL PRIMARY KEY,
        name TEXT NOT NULL,
        age INTEGER
      );

      INSERT INTO author (name, age) VALUES
        ('Clarke', 23),
        ('Bellamy', NULL);

      CREATE TABLE article (
        id SERIAL PRIMARY KEY,
        title TEXT NOT NULL,
        content TEXT,
        author_id INTEGER NOT NULL REFERENCES author(id)
      );

      INSERT INTO article (title, content, author_id) VALUES
        ('Article 1 title', 'Article 1 content', 1),
        ('Article 2 title', 'Article 2 content', 2);

- type: track_table
  version: 2
  args:
    table: author
    configuration:
      custom_root_fields:
        select: Authors
      custom_column_names:
        id: AuthorId

- type: track_table
  version: 2
  args:
    table: article
    configuration: {}

- type: create_array_relationship
  args:
    name: articles
    table: author
    using:
      foreign_key_constraint_on:
        table: article
        column: author_id
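The setup above tracks tables with `track_table` version 2, which accepts a `configuration` object at track time. A minimal sketch of building that request body in Python; the field names come from the fixture above, while the helper itself is hypothetical:

```python
def track_table_v2(table, configuration=None):
    """Build a version-2 track_table payload, optionally with a configuration."""
    return {
        "type": "track_table",
        "version": 2,
        "args": {"table": table, "configuration": configuration or {}},
    }

# Mirrors the two track_table steps in the setup fixture:
author_req = track_table_v2("author", {
    "custom_root_fields": {"select": "Authors"},
    "custom_column_names": {"id": "AuthorId"},
})
article_req = track_table_v2("article")  # tracked with an empty configuration
```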
@@ -0,0 +1,9 @@
type: bulk
args:
- type: run_sql
  args:
    sql: |
      DROP TABLE article;
      DROP TABLE IF EXISTS author;
      DROP TABLE IF EXISTS authors;
    cascade: true
@@ -32,3 +32,69 @@ class TestGraphqlIntrospection:
    @classmethod
    def dir(cls):
        return "queries/graphql_introspection"

def getTypeNameFromType(typeObject):
    if typeObject['name'] != None:
        return typeObject['name']
    elif isinstance(typeObject['ofType'], dict):
        return getTypeNameFromType(typeObject['ofType'])
    else:
        raise Exception("typeObject doesn't have name and ofType is not an object")

@pytest.mark.usefixtures('per_class_tests_db_state')
class TestGraphqlIntrospectionWithCustomTableName:

    # test to check some of the type names that are generated
    # while tracking a table with a custom name
    def test_introspection(self, hge_ctx):
        with open(self.dir() + "/introspection.yaml") as c:
            conf = yaml.safe_load(c)
        resp, _ = check_query(hge_ctx, conf)
        hasMultiSelect = False
        hasAggregate = False
        hasSelectByPk = False
        hasQueryRoot = False
        for t in resp['data']['__schema']['types']:
            if t['name'] == 'query_root':
                hasQueryRoot = True
                for field in t['fields']:
                    if field['name'] == 'user_address':
                        hasMultiSelect = True
                        assert 'args' in field
                        for args in field['args']:
                            if args['name'] == 'distinct_on':
                                assert "user_address_select_column" == getTypeNameFromType(args['type'])
                            elif args['name'] == 'order_by':
                                assert "user_address_order_by" == getTypeNameFromType(args['type'])
                            elif args['name'] == 'where':
                                assert 'user_address_bool_exp' == getTypeNameFromType(args['type'])
                    elif field['name'] == 'user_address_aggregate':
                        hasAggregate = True
                        assert "user_address_aggregate" == getTypeNameFromType(field['type'])
                    elif field['name'] == 'user_address_by_pk':
                        assert "user_address" == getTypeNameFromType(field['type'])
                        hasSelectByPk = True
            elif t['name'] == 'mutation_root':
                for field in t['fields']:
                    if field['name'] == 'insert_user_address':
                        hasMultiInsert = True
                        assert "user_address_mutation_response" == getTypeNameFromType(field['type'])
                        for args in field['args']:
                            if args['name'] == 'object':
                                assert "user_address_insert_input" == getTypeNameFromType(args['type'])
                    elif field['name'] == 'update_user_address_by_pk':
                        hasUpdateByPk = True
                        assert "user_address" == getTypeNameFromType(field['type'])
                        for args in field['args']:
                            if args['name'] == 'object':
                                assert "user_address" == getTypeNameFromType(args['type'])
        assert hasQueryRoot
        assert hasMultiSelect
        assert hasAggregate
        assert hasSelectByPk
        assert hasMultiInsert
        assert hasUpdateByPk

    @classmethod
    def dir(cls):
        return "queries/graphql_introspection/custom_table_name"
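`getTypeNameFromType` above unwraps GraphQL introspection type wrappers (non-null and list types have `name: null` and a nested `ofType`) by recursing until it reaches a named type. A self-contained copy with a hypothetical sample input, for illustration:

```python
def get_type_name(type_object):
    """Return the innermost named type of a GraphQL introspection type object."""
    if type_object['name'] is not None:
        return type_object['name']
    elif isinstance(type_object.get('ofType'), dict):
        return get_type_name(type_object['ofType'])
    else:
        raise Exception("typeObject doesn't have name and ofType is not an object")

# e.g. an argument introspected as a doubly-wrapped user_address_select_column:
wrapped = {
    'name': None,
    'ofType': {'name': None,
               'ofType': {'name': 'user_address_select_column', 'ofType': None}},
}
print(get_type_name(wrapped))  # user_address_select_column
```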
@@ -543,6 +543,26 @@ class TestGraphqlMutationCustomSchema:
    def dir(cls):
        return "queries/graphql_mutation/custom_schema"

@pytest.mark.parametrize("transport", ['http', 'websocket'])
@use_mutation_fixtures
class TestGraphqlMutationCustomGraphQLTableName:

    def test_insert_author(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/insert_author_details.yaml', transport)

    def test_insert_article_author(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/insert_article_author.yaml', transport)

    def test_update_author(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/update_author_details.yaml', transport)

    def test_delete_author(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/delete_author_details.yaml', transport)

    @classmethod
    def dir(cls):
        return "queries/graphql_mutation/custom_schema/custom_table_name"

@pytest.mark.parametrize('transport', ['http', 'websocket'])
@use_mutation_fixtures
class TestGraphQLMutateEnums:
@@ -544,7 +544,7 @@ class TestGraphQLQueryFunctions:
    @pytest.mark.parametrize("transport", ['http', 'websocket'])
    def test_query_get_test_session_id(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/query_get_test_session_id.yaml')

    @pytest.mark.parametrize("transport", ['http', 'websocket'])
    def test_query_search_author_mview(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + '/query_search_author_mview.yaml')
@@ -567,6 +567,17 @@ class TestGraphQLQueryCustomSchema:
    def dir(cls):
        return 'queries/graphql_query/custom_schema'

@pytest.mark.parametrize("transport", ['http', 'websocket'])
@usefixtures('per_class_tests_db_state')
class TestGraphQLQueryCustomTableName:

    def test_author(self, hge_ctx, transport):
        check_query_f(hge_ctx, self.dir() + 'author.yaml', transport)

    @classmethod
    def dir(cls):
        return 'queries/graphql_query/custom_schema/custom_table_name/'

@pytest.mark.parametrize('transport', ['http', 'websocket'])
@usefixtures('per_class_tests_db_state')
class TestGraphQLQueryEnums:
@@ -764,6 +764,37 @@ class TestSetTableCustomFields:
    def test_relationship_conflict_with_custom_column(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + "/relationship_conflict_with_custom_column.yaml")

@usefixtures('per_method_tests_db_state')
class TestSetTableCustomization:

    @classmethod
    def dir(cls):
        return 'queries/v1/set_table_configuration'

    def test_set_and_unset(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + '/set_and_unset.yaml')

    def test_set_invalid_table(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + '/set_invalid_table.yaml')

    def test_alter_column(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + '/alter_column.yaml')

    def test_conflict_with_relationship(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + '/conflict_with_relationship.yaml')

    def test_column_field_swap(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + "/column_field_swap.yaml")

    def test_relationship_conflict_with_custom_column(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + "/relationship_conflict_with_custom_column.yaml")

    def test_alter_table_name_with_custom_name(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + "/rename_original_table_with_custom_name.yaml")

    def test_conflicting_custom_table_name(self, hge_ctx):
        check_query_f(hge_ctx, self.dir() + "/fail_conflicting_custom_table_name.yaml")

@usefixtures('per_method_tests_db_state')
class TestComputedFields:
    @classmethod