mirror of
https://github.com/hasura/graphql-engine.git
synced 2024-08-15 21:50:40 +03:00
Aka “the PDV refactor.” History is preserved on the branch 2801-graphql-schema-parser-refactor.

* [skip ci] remove stale benchmark commit from commit_diff
* [skip ci] Check for root field name conflicts between remotes
* [skip ci] Additionally check for conflicts between remotes and DB
* [skip ci] Check for conflicts in schema when tracking a table
* [skip ci] Fix equality checking in GraphQL AST
* server: fix mishandling of GeoJSON inputs in subscriptions (fix #3239) (#4551)
  * Add support for multiple top-level fields in a subscription to improve testability of subscriptions
  * Add an internal flag to enable multiple subscriptions
  * Add missing call to withConstructorFn in live queries (fix #3239)
  Co-authored-by: Alexis King <lexi.lambda@gmail.com>
* Scheduled triggers (close #1914) (#3553)
  server: add scheduled triggers
  Co-authored-by: Alexis King <lexi.lambda@gmail.com>
  Co-authored-by: Marion Schleifer <marion@hasura.io>
  Co-authored-by: Karthikeyan Chinnakonda <karthikeyan@hasura.io>
  Co-authored-by: Aleksandra Sikora <ola.zxcvbnm@gmail.com>
* dev.sh: bump version due to addition of croniter python dependency
* server: fix an introspection query caching issue (fix #4547) (#4661)
  Introspection queries accept variables, but we need to make sure to also touch the variables that we ignore, so that an introspection query is marked not reusable if we are not able to build a correct query plan for it. A better solution here would be to deal with such unused variables correctly, so that more introspection queries become reusable. An even better solution would be to type-safely track *how* to reuse which variables, rather than to split the reuse marking from the planning.
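The unused-variable problem described above generalizes: a plan cached under a key that ignores some supplied variables can be wrongly reused for a request that differs only in one of those "ignored" variables. A minimal sketch of the conservative rule (hypothetical helper name, not Hasura's Haskell code):

```python
# Hypothetical sketch: a query plan is only marked reusable if planning
# consumed every variable the client supplied. Any variable that went
# untouched means the plan might silently depend on it, so reuse is unsafe.
def plan_reusability(supplied_vars: set, consumed_vars: set) -> bool:
    return supplied_vars == consumed_vars
```

This is exactly the conservative marking described above: correct handling of unused variables would allow more plans to be reusable.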
  Co-authored-by: Tirumarai Selvan <tiru@hasura.io>
* flush log buffer on exception in mkWaiApp (fix #4772) (#4801)
  * flush log buffer on exception in mkWaiApp
  * add comment to explain the introduced change
  * add changelog
* allow logging details of a live query polling thread (#4959)
  * changes for poller-log: add various multiplexed query info in poller-log
  * minor cleanup, also fixes a bug which will return duplicate data
  * Live query poller stats can now be logged
    This also removes in-memory stats that are collected about batched query execution, as the log lines, when piped into a monitoring tool, will give us better insights.
  * allow poller-log to be configurable
  * log minimal information in the livequery-poller-log
    Other information can be retrieved from /dev/subscriptions/extended
  * fix a few review comments
  * avoid marshalling and unmarshalling from ByteString to EncJSON
  * separate out SubscriberId and SubscriberMetadata
  Co-authored-by: Anon Ray <rayanon004@gmail.com>
* Don't compile in developer APIs by default
* Tighten up handling of admin secret, more docs
  Store the admin secret only as a hash to prevent leaking the secret inadvertently, and to prevent timing attacks on the secret.
  NOTE: best practice for stored user passwords is a function with a tunable cost like bcrypt, but our threat model is quite different (even if we thought we could reasonably protect the secret from an attacker who could read arbitrary regions of memory), and bcrypt is far too slow (by design) to perform on each request. We'd have to rely on our (technically savvy) users to choose high-entropy passwords in any case.
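The hashed-admin-secret scheme above can be sketched as follows (illustrative only; Hasura's actual implementation is in Haskell and differs in detail). A fast hash is acceptable here precisely because the threat model assumes a high-entropy secret rather than a human-chosen password, and a constant-time digest comparison closes the timing side channel:

```python
import hashlib
import hmac

def hash_secret(secret: str) -> bytes:
    # Store only the digest; a fast hash like SHA-256 is fine for
    # high-entropy secrets, unlike user passwords, which would need a
    # tunable-cost function such as bcrypt or argon2.
    return hashlib.sha256(secret.encode()).digest()

def check_secret(stored_digest: bytes, presented: str) -> bool:
    # hmac.compare_digest compares in constant time, so an attacker
    # cannot learn a prefix of the secret from response latencies.
    return hmac.compare_digest(stored_digest, hash_secret(presented))

stored = hash_secret("correct horse battery staple")
```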
  Referencing #4736
* server/docs: add instructions to fix loss of float precision in PostgreSQL <= 11 (#5187)
  This adds a server flag, --pg-connection-options, that can be used to set a PostgreSQL connection parameter, extra_float_digits, that needs to be used to avoid loss of data on older versions of PostgreSQL, which have odd default behavior when returning float values. (fixes #5092)
* [skip ci] Add new commits from master to the commit diff
* [skip ci] serve default directives (skip & include) over introspection
* [skip ci] Update non-Haskell assets with the version on master
* server: refactor GQL execution check and config API (#5094)
  Co-authored-by: Vamshi Surabhi <vamshi@hasura.io>
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] fix js issues in tests by pinning dependencies version
* [skip ci] bump graphql version
* [skip ci] Add note about memory usage
* generalize query execution logic on Postgres (#5110)
  * generalize PGExecCtx to support specialized functions for various operations
  * fix tests compilation
  * allow customising PGExecCtx when starting the web server
* server: changes catalog initialization and logging for pro customization (#5139)
  * new typeclass to abstract the logic of QueryLog-ing
  * abstract the logic of logging websocket-server logs; introduce a MonadWSLog typeclass
  * move catalog initialization to init step; expose a helper function to migrate catalog; create schema cache in initialiseCtx
  * expose various modules and functions for pro
* [skip ci] cosmetic change
* [skip ci] fix test calling a mutation that does not exist
* [skip ci] minor text change
* [skip ci] refactored input values
* [skip ci] remove VString Origin
* server: fix updating of headers behaviour in the update cron trigger API and create future events immediately (#5151)
  * server: fix bug to update headers in an existing cron trigger and create future events
  Co-authored-by: Tirumarai Selvan <tiru@hasura.io>
* Lower stack chunk size in RTS to reduce thread STACK memory (closes #5190)
  This reduces memory consumption for new idle subscriptions significantly (see linked ticket). The hypothesis is: we fork a lot of threads per websocket, and some of these use slightly more than the initial 1K stack size, so the first overflow balloons to 32K, when significantly less is required. However: running with `+RTS -K1K -xc` did not seem to show evidence of any overflows! So it's a mystery why this improves things. GHC should probably also be doubling the stack buffer at each overflow or doing something even smarter; the knobs we have aren't so helpful.
* [skip ci] fix todo and schema generation for aggregate fields
* 5087 libpq pool leak (#5089)
  Shrink libpq buffers to 1MB before returning connection to pool. Closes #5087
  See: https://github.com/hasura/pg-client-hs/pull/19
  Also related: #3388 #4077
* bump pg-client-hs version (fixes a build issue on some environments) (#5267)
* do not use prepared statements for mutations
* server: unlock scheduled events on graceful shutdown (#4928)
* Fix buggy parsing of new --conn-lifetime flag in 2b0e3774
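The float-precision issue behind the --pg-connection-options / extra_float_digits item above is easy to reproduce in any language with IEEE-754 doubles: with only 15 significant digits of text output (the default on PostgreSQL <= 11), a double does not always survive a text round trip, while 17 digits always round-trips. A small demonstration, unrelated to Hasura's code itself:

```python
# 0.1 + 0.2 is the classic double that has no short decimal form.
x = 0.1 + 0.2  # 0.30000000000000004

# 15 significant digits (old PostgreSQL default output) loses the value:
lossy = float("%.15g" % x)

# 17 significant digits (what extra_float_digits buys you) round-trips:
exact = float("%.17g" % x)
```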
* [skip ci] remove cherry-picked commit from commit_diff.txt
* server: include additional fields in scheduled trigger webhook payload (#5262)
  * include scheduled triggers metadata in the webhook body
  Co-authored-by: Tirumarai Selvan <tiru@hasura.io>
* server: call the webhook asynchronously in event triggers (#5352)
  * server: call the webhook asynchronously in event triggers
* Expose all modules in Cabal file (#5371)
* [skip ci] update commit_diff.txt
* [skip ci] fix cast exp parser & few TODOs
* [skip ci] fix remote fields arguments
* [skip ci] fix a few more TODOs, no-op refactor, move resolve/action.hs to execute/action.hs
* Pass environment variables around as a data structure, via @sordina (#5374)
  * Pass environment variables around as a data structure, via @sordina
  * Resolving build error
  * Adding Environment passing note to changelog
  * Removing references to ILTPollerLog as this seems to have been reintroduced from a bad merge
  * removing commented-out imports
  * Language pragmas already set by project
  * Linking async thread
  * Apply suggestions from code review
    Use `runQueryTx` instead of `runLazyTx` for queries.
  * remove the non-user-facing entry in the changelog
  Co-authored-by: Phil Freeman <paf31@cantab.net>
  Co-authored-by: Phil Freeman <phil@hasura.io>
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] fix: restrict remote relationship field generation for hasura queries
* [skip ci] no-op refactor; move insert execution code from schema parser module
* server: call the webhook asynchronously in event triggers (#5352)
  * server: call the webhook asynchronously in event triggers
* Expose all modules in Cabal file (#5371)
* [skip ci] update commit_diff.txt
* Pass environment variables around as a data structure, via @sordina (#5374)
  * Pass environment variables around as a data structure, via @sordina
  * Resolving build error
  * Adding Environment passing note to changelog
  * Removing references to ILTPollerLog as this seems to have been reintroduced from a bad merge
  * removing commented-out imports
  * Language pragmas already set by project
  * Linking async thread
  * Apply suggestions from code review
    Use `runQueryTx` instead of `runLazyTx` for queries.
  * remove the non-user-facing entry in the changelog
  Co-authored-by: Phil Freeman <paf31@cantab.net>
  Co-authored-by: Phil Freeman <phil@hasura.io>
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] implement header checking
  Probably closes #14 and #3659.
* server: refactor 'pollQuery' to have a hook to process 'PollDetails' (#5391)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* update pg-client (#5421)
* [skip ci] update commit_diff
* Fix latency buckets for telemetry data
  These must have gotten messed up during a refactor. As a consequence almost all samples received so far fall into the single erroneous 0 to 1K seconds (originally supposed to be 1ms?) bucket. I also re-thought what the numbers should be, but these are still arbitrary and might want adjusting in the future.
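A bucketing bug like the latency-telemetry one above usually comes down to mapping a sample to the first bucket boundary that can hold it, with a unit mix-up (seconds vs. milliseconds) silently funnelling nearly every sample into one bucket. A sketch using bisect; the boundaries here are made up, not the telemetry's actual buckets:

```python
import bisect

# Hypothetical histogram bucket upper bounds, in milliseconds. If these
# were accidentally interpreted as seconds, almost every real latency
# would land in the first bucket -- the bug class described above.
BOUNDS_MS = [1, 10, 100, 1000, 10000]

def bucket_for(latency_ms: float) -> int:
    # Index of the first bucket whose upper bound is >= the sample;
    # len(BOUNDS_MS) means "overflow" (slower than every bound).
    return bisect.bisect_left(BOUNDS_MS, latency_ms)
```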
* [skip ci] include the latest commit compared against master in commit_diff
* [skip ci] include new commits from master in commit_diff
* [skip ci] improve description generation
* [skip ci] sort all introspect arrays
* [skip ci] allow parsers to specify error codes
* [skip ci] fix integer and float parsing error code
* [skip ci] scalar from json errors are now parse errors
* [skip ci] fixed negative integer error message and code
* [skip ci] Re-fix nullability in relationships
* [skip ci] no-op refactor and removed a couple of FIXMEs
* [skip ci] uncomment code in 'deleteMetadataObject'
* [skip ci] Fix re-fix of nullability for relationships
* [skip ci] fix default arguments error code
* [skip ci] updated test error message
  !!! WARNING !!!
  Since all fields accept `null`, they all are technically optional in the new schema. Meaning there's no such thing as a missing mandatory field anymore: a field that doesn't have a default value, and which therefore isn't labelled as "optional" in the schema, will be assumed to be null if it's missing, meaning it isn't possible anymore to have an error for a missing mandatory field. The only possible error is now when an optional positional argument is omitted but is not the last positional argument.
* [skip ci] cleanup of int scalar parser
* [skip ci] retro-compatibility of offset as string
* [skip ci] Remove commit from commit_diff.txt
  Although strictly speaking we don't know if this will work correctly in PDV if we would implement query plan caching, the fact is that in the theoretical case that we would have the same issue in PDV, it would probably apply not just to introspection, and the fix would be written completely differently. So this old commit is of no value to us other than the heads-up "make sure query plan caching works correctly even in the presence of unused variables", which is already part of the test suite.
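The WARNING above means input resolution in the new schema collapses "absent" into null for fields without a default. The resolution rule can be sketched with a sentinel distinguishing "not provided" from an explicit null (hypothetical helper, not the parser's real API):

```python
_ABSENT = object()  # sentinel: "the client did not provide this field"

def resolve_field(provided=_ABSENT, default=_ABSENT):
    # An explicit value wins, including an explicit None (null).
    if provided is not _ABSENT:
        return provided
    # Otherwise fall back to the schema default, if there is one.
    if default is not _ABSENT:
        return default
    # Otherwise the missing field is simply assumed to be null -- so a
    # "missing mandatory field" error can no longer occur.
    return None
```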
* Add MonadTrace and MonadExecuteQuery abstractions (#5383)
* [skip ci] Fix accumulation of input object types
  Just like object types, interface types, and union types, we have to avoid circularities when collecting input types from the GraphQL AST. Additionally, this fixes equality checks for input object types (whose fields are unordered, and hence should be compared as sets) and enum types (ditto).
* [skip ci] fix fragment error path
* [skip ci] fix node error code
* [skip ci] fix paths in insert queries
* [skip ci] fix path in objects
* [skip ci] manually alter node id path for consistency
* [skip ci] more node error fixups
* [skip ci] one last relay error message fix
* [skip ci] update commit_diff
* Propagate the trace context to event triggers (#5409)
  * Propagate the trace context to event triggers
  * Handle missing trace and span IDs
  * Store trace context as one LOCAL
  * Add migrations
  * Documentation
  * changelog
  * Fix warnings
  * Respond to code review suggestions
  * Respond to code review
  * Undo changelog
  * Update CHANGELOG.md
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* server: log request/response sizes for event triggers (#5463)
  * server: log request/response sizes for event triggers
    event triggers (and scheduled triggers) now have request/response size in their logs.
  * add changelog entry
* Tracing: Simplify HTTP traced request (#5451)
  Remove the Inversion of Control (SuspendRequest) and simplify the tracing of HTTP Requests.
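Collecting input types without the circularity fix described above amounts to a graph walk that must carry a visited set, because input object types can reference each other. A toy sketch of the cycle-safe traversal (plain dicts, not the Haskell GraphQL AST):

```python
def collect_input_types(name, fields_of, seen=None):
    # fields_of maps a type name to the type names its fields refer to.
    # Tracking 'seen' breaks cycles such as A -> B -> A, which would
    # otherwise recurse forever -- the circularity mentioned above.
    if seen is None:
        seen = set()
    if name in seen:
        return seen
    seen.add(name)
    for child in fields_of.get(name, ()):
        collect_input_types(child, fields_of, seen)
    return seen

# A mutually recursive pair of input types, plus a leaf:
graph = {"A": ["B"], "B": ["A", "C"], "C": []}
```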
  Co-authored-by: Phil Freeman <phil@hasura.io>
* Attach request ID as tracing metadata (#5456)
  * Propagate the trace context to event triggers
  * Handle missing trace and span IDs
  * Store trace context as one LOCAL
  * Add migrations
  * Documentation
  * Include the request ID as trace metadata
  * changelog
  * Fix warnings
  * Respond to code review suggestions
  * Respond to code review
  * Undo changelog
  * Update CHANGELOG.md
  * Typo
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* server: add logging for action handlers (#5471)
  * server: add logging for action handlers
  * add changelog entry
  * change action-handler log type from internal to non-internal
  * fix action-handler-log name
* server: pass http and websocket request to logging context (#5470)
  * pass request body to logging context in all cases
  * add message size logging on the websocket API
    this is required by graphql-engine-pro/#416
  * message size logging on websocket API
    As we need to log all messages received/sent by the websocket server, it makes sense to log them as part of the websocket server event logs.
    Previously, messages received were logged inside the onMessage handler, and messages sent were logged only for "data" messages (as a server event log)
  * fix review comments
  Co-authored-by: Phil Freeman <phil@hasura.io>
* server: stop eventing subsystem threads when shutting down (#5479)
  * server: stop eventing subsystem threads when shutting down
  * Apply suggestions from code review
  Co-authored-by: Karthikeyan Chinnakonda <chkarthikeyan95@gmail.com>
  Co-authored-by: Phil Freeman <phil@hasura.io>
  Co-authored-by: Phil Freeman <paf31@cantab.net>
  Co-authored-by: Karthikeyan Chinnakonda <chkarthikeyan95@gmail.com>
* [skip ci] update commit_diff with new commits added in master
* Bugfix to support 0-size HASURA_GRAPHQL_QUERY_PLAN_CACHE_SIZE
  Also some minor refactoring of the bounded cache module:
  - the maxBound check in `trim` was confusing and unnecessary
  - consequently trim was unnecessary for lookupPure
  Also add some basic tests
* Support only the bounded cache, with default HASURA_GRAPHQL_QUERY_PLAN_CACHE_SIZE of 4000. Closes #5363
* [skip ci] remove merge commit from commit_diff
* server: Fix compiler warning caused by GHC upgrade (#5489)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] update all non-server code from master
* [skip ci] aligned object field error message with master
* [skip ci] fix remaining undefined?
* [skip ci] remove unused import
* [skip ci] revert to previous error message, fix tests
* Move nullableType/nonNullableType to Schema.hs
  These are functions on Types, not on Parsers.
* [skip ci] fix setup to fix backend-only test
  the order in which permission checks are performed on the branch is slightly different than on master, resulting in a slightly different error if there are no other mutations the user has access to. By adding update permissions, we go back to the expected case.
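The bounded plan cache above (default HASURA_GRAPHQL_QUERY_PLAN_CACHE_SIZE of 4000, with 0 now supported as "disabled") can be sketched as a small LRU over an OrderedDict. This is illustrative only; the server's cache is a Haskell bounded-cache module, and the 0-size behavior is the bug being fixed:

```python
from collections import OrderedDict

class BoundedCache:
    def __init__(self, capacity: int):
        self.capacity = capacity  # 0 means caching is disabled
        self._data = OrderedDict()

    def insert(self, key, value):
        if self.capacity == 0:
            return  # a 0-size cache must be a no-op, not a crash
        self._data[key] = value
        self._data.move_to_end(key)  # mark as most recently used
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

    def lookup(self, key):
        if key in self._data:
            self._data.move_to_end(key)
            return self._data[key]
        return None
```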
* [skip ci] fix insert geojson tests to reflect new paths
* [skip ci] fix enum test for better error message
* [skip ci] fix header test for better error message
* [skip ci] fix fragment cycle test for better error message
* [skip ci] fix error message for type mismatch
* [skip ci] fix variable path in test
* [skip ci] adjust tests after bug fix
* [skip ci] more tests fixing
* Add hdb_catalog.current_setting abstraction for reading Hasura settings
  As the comment in the function’s definition explains, this is needed to work around an awkward Postgres behavior.
* [skip ci] Update CONTRIBUTING.md to mention Node setup for Python tests
* [skip ci] Add missing Python tests env var to CONTRIBUTING.md
* [skip ci] fix order of result when subscription is run with multiple nodes
* [skip ci] no-op refactor: fix a warning in Internal/Parser.hs
* [skip ci] throw error when a subscription contains remote joins
* [skip ci] Enable easier profiling by hiding AssertNF behind a flag
  In order to compile a profiling build, run:
  $ cabal new-build -f profiling --enable-profiling
* [skip ci] Fix two warnings
  We used to lookup the objects that implement a given interface by filtering all objects in the schema document. However, one of the tests expects us to generate a warning if the provided `implements` field of an introspection query specifies an object not implementing some interface. So we use that field instead.
* [skip ci] Fix warnings by commenting out query plan caching
* [skip ci] improve masking/commenting query caching related code & few warning fixes
* [skip ci] Fixed compiler warnings in graphql-parser-hs
* Sync non-Haskell assets with master
* [skip ci] add a test inserting invalid GraphQL but valid JSON value in a jsonb column
* [skip ci] Avoid converting to/from Map
* [skip ci] Apply some hlint suggestions
* [skip ci] remove redundant constraints from buildLiveQueryPlan and explainGQLQuery
* [skip ci] add NOTEs about missing Tracing constraints in PDV from master
* Remove -fdefer-typed-holes, fix warnings
* Update cabal.project.freeze
* Limit GHC’s heap size to 8GB in CI to avoid the OOM killer
* Commit package-lock.json for Python tests’ remote schema server
* restrict env variables starting with HASURA_GRAPHQL_ for headers configuration in actions, event triggers & remote schemas (#5519)
  * restrict env variables starting with HASURA_GRAPHQL_ for headers definition in actions & event triggers
  * update CHANGELOG.md
  * Apply suggestions from code review
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* add test for table_by_pk node when role doesn't have permission to PK
* [skip ci] fix introspection query if any enum column present in primary key (fix #5200) (#5522)
* [skip ci] test case fix for a6450e126b
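The HASURA_GRAPHQL_ restriction above boils down to a prefix check when validating the env vars that header values for actions, event triggers, and remote schemas may be sourced from. A sketch (hypothetical validator name, not the server's real function):

```python
RESERVED_PREFIX = "HASURA_GRAPHQL_"

def validate_header_env_var(env_var_name: str) -> None:
    # Headers for actions/event triggers/remote schemas may not be sourced
    # from engine-internal configuration variables, which would otherwise
    # let a metadata author exfiltrate e.g. the admin secret via a header.
    if env_var_name.startswith(RESERVED_PREFIX):
        raise ValueError(
            f"cannot use {env_var_name}: the {RESERVED_PREFIX} prefix "
            "is reserved for graphql-engine configuration"
        )
```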
* [skip ci] add tests to agg queries when role doesn't have access to any cols
* fix backend test
* Simplify subscription execution
* [skip ci] add test to check if required headers are present while querying
  Suppose table B is related to table A, and to query B certain headers are necessary; then the test checks that we are throwing an error when the header is not set when B is queried through A
* fix mutations not checking for view mutability
* [skip ci] add variable type checking and corresponding tests
* [skip ci] add test to check if update headers are present while doing an upsert
* [skip ci] add positive counterparts to some of the negative permission tests
* fix args missing their description in introspect
* [skip ci] Remove unused function; insert missing markNotReusable call
* [skip ci] Add a Note about InputValue
* [skip ci] Delete LegacySchema/ 🎉
* [skip ci] Delete GraphQL/{Resolve,Validate}/ 🎉
* [skip ci] Delete top-level Resolve/Validate modules; tidy .cabal file
* [skip ci] Delete LegacySchema top-level module
  Somehow I missed this one.
* fix input value to json
* [skip ci] elaborate on JSON objects in GraphQL
* [skip ci] add missing file
* [skip ci] add a test with subscription containing remote joins
* add a test with remote joins in mutation output
* [skip ci] Add some comments to Schema/Mutation.hs
* [skip ci] Remove no longer needed code from RemoteServer.hs
* [skip ci] Use a helper function to generate conflict clause parsers
* [skip ci] fix type checker error in fields with default value
* capitalize the header keys in select_articles_without_required_headers
  Somehow, this was the reason the tests were failing. I have no idea why!
* [skip ci] Add a long Note about optional fields and nullability
* Improve comments a bit; simplify Schema/Common.hs a bit
* [skip ci] full implementation of 5.8.5 type checking.
* [skip ci] fix validation test teardown
* [skip ci] fix schema stitching test
* fix remote schema ignoring enum nullability
* [skip ci] fix fieldOptional to not discard nullability
* revert nullability of use_spheroid
* fix comment
* add required remote fields with arguments for tests
* [skip ci] add missing docstrings
* [skip ci] fixed description of remote fields
* [skip ci] change docstring for consistency
* fix several schema inconsistencies
* revert behaviour change in function arguments parsing
* fix remaining nullability issues in new schema
* minor no-op refactor; use isListType from graphql-parser-hs
* use nullability of remote schema node, while creating a Remote reln
* fix 'ID' input coercing & action 'ID' type relationship mapping
* include ASTs in MonadExecuteQuery
  needed for PRO code-base
* Delete code for "interfaces implementing ifaces" (draft GraphQL spec)
  Previously I started writing some code that adds support for a future GraphQL feature where interfaces may themselves be sub-types of other interfaces. However, this code was incomplete, and partially incorrect. So this commit deletes support for that entirely.
* Ignore a remote schema test during the upgrade/downgrade test
  The PDV refactor does a better job at exposing a minimal set of types through introspection. In particular, not every type that is present in a remote schema is re-exposed by Hasura. The test test_schema_stitching.py::TestRemoteSchemaBasic::test_introspection assumed that all types were re-exposed, which is not required for GraphQL compatibility, in order to test some aspect of our support for remote schemas. So while this particular test has been updated on PDV, the PDV branch now does not pass the old test, which we argue to be incorrect. Hence this test is disabled while we await a release, after which we can re-enable it. This also re-enables a test that was previously disabled for similar, though unrelated, reasons.
* add haddock documentation to the action's field parsers
* Deselecting some tests in server-upgrade
  Some tests are failing on server upgrade with the current build, which they should not: the response is more accurate than what it was. Also, the upgrade tests were not throwing errors when a test expected to return an error instead succeeds. The test framework is patched to catch this case.
* [skip ci] Add a long Note about interfaces and object types
* send the response headers back to client after running a query
* Deselect a few more tests during upgrade/downgrade test
* Update commit_diff.txt
* change log kind from db_migrate to catalog_migrate (#5531)
* Show method and complete URI in traced HTTP calls (#5525)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* restrict env variables starting with HASURA_GRAPHQL_ for headers configuration in actions, event triggers & remote schemas (#5519)
  * restrict env variables starting with HASURA_GRAPHQL_ for headers definition in actions & event triggers
  * update CHANGELOG.md
  * Apply suggestions from code review
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* fix introspection query if any enum column present in primary key (fix #5200) (#5522)
* Fix telemetry reporting of transport (websocket was reported as http)
* add log kinds in cli-migrations image (#5529)
  * add log kinds in cli-migrations image
  * give hint to resolve timeout error
  * minor changes and CHANGELOG
* server: set hasura.tracecontext in RQL mutations [#5542] (#5555)
  * server: set hasura.tracecontext in RQL mutations [#5542]
  * Update test suite
  Co-authored-by: Tirumarai Selvan <tiru@hasura.io>
* Add bulldozer auto-merge and -update configuration
  We still need to add the github app (as of time of opening this PR).
  Afterwards devs should be able to allow bulldozer to automatically "update" the branch, merging in parent when it changes, as well as automatically merge when all checks pass.
  This is opt-in by adding the `auto-update-auto-merge` label to the PR.
* Remove 'bulldozer' config, try 'kodiak' for auto-merge
  see: https://github.com/chdsbd/kodiak
  The main issue that bit us was not being able to auto-update forked branches; also:
  https://github.com/palantir/bulldozer/issues/66
  https://github.com/palantir/bulldozer/issues/145
* Cherry-picked all commits
* [skip ci] Slightly improve formatting
* Revert "fix introspection query if any enum column present in primary key (fix #5200) (#5522)"
  This reverts commit 0f9a5afa59. This undoes a cherry-pick of 34288e1eb5 that was already done previously in a6450e126b, and subsequently fixed for PDV in 70e89dc250
* Do a small bit of tidying in Hasura.GraphQL.Parser.Collect
* Fix cherry-picking work
  Some previous cherry-picks ended up modifying code that is commented out
* [skip ci] clarified comment regarding insert representation
* [skip ci] removed obsolete todos
* cosmetic change
* fix action error message
* [skip ci] remove obsolete comment
* [skip ci] synchronize stylish haskell extensions list
* use previously defined scalar names in parsers rather than ad-hoc literals
* Apply most syntax hlint hints.
* Clarify comment on update mutation.
* [skip ci] Clarify what fields should be specified for objects
* Update "_inc" description.
* Use record types rather than tuples for IntrospectionResult and ParsedIntrospection
* Get rid of checkFieldNamesUnique (use Data.List.Extended.duplicates)
* Throw more errors when collecting query root names
* [skip ci] clean column parser comment
* Remove dead code inserted in ab65b39
* avoid converting to non-empty list where not needed
* add note and TODO about the disabled checks in PDV
* minor refactor in remoteField' function
* Unify two getObject methods
* Nitpicks in Remote.hs
* Update CHANGELOG.md
* Revert "Unify two getObject methods"
  This reverts commit bd6bb40355. We do need two different getObject functions, as the corresponding error message is different
* Fix error message in Remote.hs
* Update CHANGELOG.md
  Co-authored-by: Auke Booij <auke@tulcod.com>
* Apply suggested Changelog fix.
  Co-authored-by: Auke Booij <auke@tulcod.com>
* Fix typo in Changelog.
* [skip ci] Update changelog.
* reuse type names to avoid duplication
* Fix Hashable instance for Definition
  The presence of `Maybe Unique`, and an optional description, as part of `Definition`s, means that `Definition`s that are considered `Eq`ual may get different hashes. This can happen, for instance, when one object is memoized but another is not.
* [skip ci] Update commit_diff.txt
* Bump parser version.
* Bump freeze file after changes in parser.
* [skip ci] Incorporate commits from master
* Fix developer flag in server/cabal.project.freeze
  Co-authored-by: Auke Booij <auke@tulcod.com>
* Deselect a changed ENUM test for upgrade/downgrade CI
* Deselect test here as well
* [skip ci] remove dead code
* Disable more tests for upgrade/downgrade
* Fix which test gets deselected
* Revert "Add hdb_catalog.current_setting abstraction for reading Hasura settings"
  This reverts commit 66e85ab9fb.
* Remove circular reference in cabal.project.freeze

Co-authored-by: Karthikeyan Chinnakonda <karthikeyan@hasura.io>
Co-authored-by: Auke Booij <auke@hasura.io>
Co-authored-by: Tirumarai Selvan <tiru@hasura.io>
Co-authored-by: Marion Schleifer <marion@hasura.io>
Co-authored-by: Aleksandra Sikora <ola.zxcvbnm@gmail.com>
Co-authored-by: Brandon Simmons <brandon.m.simmons@gmail.com>
Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
Co-authored-by: Anon Ray <rayanon004@gmail.com>
Co-authored-by: rakeshkky <12475069+rakeshkky@users.noreply.github.com>
Co-authored-by: Anon Ray <ecthiender@users.noreply.github.com>
Co-authored-by: Vamshi Surabhi <vamshi@hasura.io>
Co-authored-by: Antoine Leblanc <antoine@hasura.io>
Co-authored-by: Brandon Simmons <brandon@hasura.io>
Co-authored-by: Phil Freeman <phil@hasura.io>
Co-authored-by: Lyndon Maydwell <lyndon@sordina.net>
Co-authored-by: Phil Freeman <paf31@cantab.net>
Co-authored-by: Naveen Naidu <naveennaidu479@gmail.com>
Co-authored-by: Karthikeyan Chinnakonda <chkarthikeyan95@gmail.com>
Co-authored-by: Nizar Malangadan <nizar-m@users.noreply.github.com>
Co-authored-by: Antoine Leblanc <crucuny@gmail.com>
Co-authored-by: Auke Booij <auke@tulcod.com>
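The Definition bug described above ("Fix Hashable instance for Definition") is the classic hash/equality contract violation: if equality ignores a field that the hash includes, two values that compare equal can hash differently, and a hash-keyed memo table misses. A Python rendering of the same bug class (toy class, not the Haskell `Definition` type), shown in its corrected form:

```python
class Definition:
    def __init__(self, name, description=None):
        self.name = name
        self.description = description  # optional; ignored by equality

    def __eq__(self, other):
        return isinstance(other, Definition) and self.name == other.name

    # Correct: hash only what equality inspects. Hashing the optional
    # description too was the bug -- e.g. a memoized Definition and a
    # freshly built one would be Eq-ual yet land in different buckets.
    def __hash__(self):
        return hash(self.name)
```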
This commit is contained in:
parent bcda0cc927
commit 7e970177c1
13  .circleci/server-upgrade-downgrade/err_msg.patch (new file)
@ -0,0 +1,13 @@
|
||||
diff --git a/server/tests-py/validate.py b/server/tests-py/validate.py
|
||||
index 3eecd52a..a18b3113 100644
|
||||
--- a/server/tests-py/validate.py
|
||||
+++ b/server/tests-py/validate.py
|
||||
@@ -318,7 +318,7 @@ def assert_graphql_resp_expected(resp_orig, exp_response_orig, query, resp_hdrs=
|
||||
# If it is a batch GraphQL query, compare each individual response separately
|
||||
for (exp, out) in zip(as_list(exp_response), as_list(resp)):
|
||||
matched_ = equal_CommentedMap(exp, out)
|
||||
- if is_err_msg(exp):
|
||||
+ if is_err_msg(exp) and is_err_msg(out):
|
||||
if not matched_:
|
||||
warnings.warn("Response does not have the expected error message\n" + dump_str.getvalue())
|
||||
return resp, matched
|
@ -6,7 +6,9 @@
|
||||
# and sets some of the required variables that run.sh needs,
|
||||
# before executing run.sh
|
||||
set -euo pipefail
|
||||
ROOT="${BASH_SOURCE[0]%/*}"
|
||||
cd "${BASH_SOURCE[0]%/*}"
|
||||
ROOT="${PWD}"
|
||||
cd - > /dev/null
|
||||
|
||||
SERVER_DIR="$ROOT/../../server"
|
||||
|
||||
@ -18,8 +20,8 @@ echo "server binary: $SERVER_BINARY"
|
||||
cd -
|
||||
set +x
|
||||
|
||||
export SERVER_OUTPUT_DIR="server-output"
|
||||
export LATEST_SERVER_BINARY="./graphql-engine-latest"
|
||||
export SERVER_OUTPUT_DIR="$ROOT/server-output"
|
||||
export LATEST_SERVER_BINARY="$ROOT/graphql-engine-latest"
|
||||
|
||||
# Create Python virtualenv
|
||||
if ! [ -f ".venv/bin/activate" ] ; then
|
||||
@ -40,7 +42,8 @@ log_duration=on
|
||||
port=$PG_PORT
|
||||
EOF
|
||||
)
|
||||
export HASURA_GRAPHQL_DATABASE_URL="postgres://postgres:$PGPASSWORD@127.0.0.1:$PG_PORT/postgres"
|
||||
# Pytest is giving out deprecated warnings when postgres:// is used
|
||||
export HASURA_GRAPHQL_DATABASE_URL="postgresql://postgres:$PGPASSWORD@127.0.0.1:$PG_PORT/postgres"
|
||||
|
||||
DOCKER_PSQL="docker exec -u postgres -it $PG_CONTAINER_NAME psql -p $PG_PORT"
|
||||
|
||||
|
@ -12,7 +12,9 @@ set -euo pipefail
|
||||
# # echo an error message before exiting
|
||||
# trap 'echo "\"${last_command}\" command filed with exit code $?."' EXIT
|
||||
|
||||
ROOT="${BASH_SOURCE[0]%/*}"
|
||||
cd "${BASH_SOURCE[0]%/*}"
|
||||
ROOT="${PWD}"
|
||||
cd - > /dev/null
|
||||
|
||||
download_with_etag_check() {
|
||||
URL="$1"
|
||||
@@ -119,6 +121,17 @@ trap rm_worktree ERR

make_latest_release_worktree() {
  git worktree add --detach "$WORKTREE_DIR" "$RELEASE_VERSION"
  cd "$WORKTREE_DIR"
  # FIX ME: Remove the patch below after the next stable release
  # The --avoid-error-message-checks in pytest was implemented as a more relaxed check than
  # what we intended to have. In versions <= v1.3.0,
  # this check allows a response to be treated as a success even if the expected response is a failure.
  # The patch below fixes that issue.
  # The `git apply` should give errors from the next release onwards,
  # since this change is going to be included in the next release version.
  git apply "${ROOT}/err_msg.patch" || \
    (log "Remove the git apply in make_latest_release_worktree function" && false)
  cd - > /dev/null
}

cleanup_hasura_metadata_if_present() {
@@ -148,7 +161,18 @@ get_server_upgrade_tests() {
  cd $RELEASE_PYTEST_DIR
  tmpfile="$(mktemp --dry-run)"
  set -x
  python3 -m pytest -q --collect-only --collect-upgrade-tests-to-file "$tmpfile" -m 'allow_server_upgrade_test and not skip_server_upgrade_test' "${args[@]}" 1>/dev/null 2>/dev/null
  # FIX ME: Deselecting some introspection tests from the previous test suite
  # which throw errors on the latest build, even though the output of the current build is more accurate.
  # Remove these deselects after the next stable release
  python3 -m pytest -q --collect-only --collect-upgrade-tests-to-file "$tmpfile" \
    -m 'allow_server_upgrade_test and not skip_server_upgrade_test' \
    --deselect test_schema_stitching.py::TestRemoteSchemaBasic::test_introspection \
    --deselect test_schema_stitching.py::TestAddRemoteSchemaCompareRootQueryFields::test_schema_check_arg_default_values_and_field_and_arg_types \
    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_user_with_no_backend_privilege \
    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_backend_user_no_admin_secret_fail \
    --deselect test_graphql_mutations.py::TestGraphqlMutationCustomSchema::test_update_article \
    --deselect test_graphql_queries.py::TestGraphQLQueryEnums::test_introspect_user_role \
    "${args[@]}" 1>/dev/null 2>/dev/null
  set +x
  cat "$tmpfile"
  cd - >/dev/null
@@ -174,11 +198,12 @@ run_server_upgrade_pytest() {
  set -x

  # With --avoid-error-message-checks, we are only going to emit warnings if an error message has changed between releases
  # FIX ME: Remove the deselects below after the next stable release
  pytest --hge-urls "${HGE_URL}" --pg-urls "$HASURA_GRAPHQL_DATABASE_URL" \
    --avoid-error-message-checks "$@" \
    -m 'allow_server_upgrade_test and not skip_server_upgrade_test' \
    --deselect test_graphql_mutations.py::TestGraphqlUpdateBasic::test_numerics_inc \
    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_user_with_no_backend_privilege \
    --deselect test_graphql_mutations.py::TestGraphqlMutationCustomSchema::test_update_article \
    --deselect test_graphql_queries.py::TestGraphQLQueryEnums::test_introspect_user_role \
    -v $tests_to_run
  set +x
  cd -

@@ -191,9 +191,7 @@ pip3 install -r requirements.txt

# node js deps
curl -sL https://deb.nodesource.com/setup_8.x | bash -
apt-get install -y nodejs
npm_config_loglevel=error npm install $PYTEST_ROOT/remote_schemas/nodejs/

npm install apollo-server graphql
(cd remote_schemas/nodejs && npm_config_loglevel=error npm ci)

mkdir -p "$OUTPUT_FOLDER/hpc"

@@ -1,7 +1,7 @@
###############################################################################
## Configuration for auto-merge / auto-update bot
##
## See: https://kodiakhq.com/
###############################################################################

# Kodiak's configuration file should be placed at `.kodiak.toml` (repository
CHANGELOG.md
@@ -2,19 +2,31 @@

## Next release

### Breaking changes

This release contains the [PDV refactor (#4111)](https://github.com/hasura/graphql-engine/pull/4111), a significant rewrite of the internals of the server, which did include some breaking changes:

- The semantics of explicit `null` values in `where` filters have changed according to the discussion in [issue 704](https://github.com/hasura/graphql-engine/issues/704#issuecomment-635571407): an explicit `null` value in a comparison input object will be treated as an error rather than resulting in the expression being evaluated to `True`. For instance: `delete_users(where: {id: {_eq: $userId}}) { name }` will yield an error if `$userId` is `null`, instead of deleting all users.
- The validation of required headers has been fixed (closing #14 and #3659):
  - if a query selects table `bar` through table `foo` via a relationship, the required permissions headers will be the union of the required headers of table `foo` and table `bar` (we used to only check the headers of the root table);
  - if an insert does not have an `on_conflict` clause, it will not require the update permissions headers.

### Bug fixes and improvements

(Add entries here in the order of: server, console, cli, docs, others)

- server: some mutations that cannot be performed will no longer be in the schema (for instance, `delete_by_pk` mutations won't be shown to users that do not have select permissions on all primary keys) (#4111)
- server: miscellaneous description changes (#4111)
- server: treat the absence of `backend_only` configuration and `backend_only: false` equally (closing #5059) (#4111)
- cli: add missing global flags for seeds command (#5565)
- docs: add docs page on networking with docker (close #4346) (#4811)

## `v1.3.1`, `v1.3.1-beta.1`

### Breaking change

Headers from environment variables starting with `HASURA_GRAPHQL_` are not allowed
in event triggers, actions & remote schemas.

If you do have such headers configured, then you must update the header configuration before upgrading.

@@ -229,6 +229,7 @@ language_extensions:
  - BangPatterns
  - BlockArguments
  - ConstraintKinds
  - DataKinds
  - DefaultSignatures
  - DeriveDataTypeable
  - DeriveFoldable
@@ -238,9 +239,11 @@ language_extensions:
  - DeriveTraversable
  - DerivingVia
  - EmptyCase
  - ExistentialQuantification
  - FlexibleContexts
  - FlexibleInstances
  - FunctionalDependencies
  - GADTs
  - GeneralizedNewtypeDeriving
  - InstanceSigs
  - LambdaCase
@@ -249,12 +252,16 @@ language_extensions:
  - NamedFieldPuns
  - NoImplicitPrelude
  - OverloadedStrings
  - QuantifiedConstraints
  - QuasiQuotes
  - RankNTypes
  - RecordWildCards
  - RoleAnnotations
  - ScopedTypeVariables
  - StandaloneDeriving
  - TemplateHaskell
  - TupleSections
  - TypeApplications
  - TypeFamilies
  - TypeFamilyDependencies
  - TypeOperators

@@ -24,6 +24,11 @@ Additionally, you will need a way to run a Postgres database server. The `dev.sh
- [PostgreSQL](https://www.postgresql.org) >= 9.5
- [postgis](https://postgis.net)

Additionally, you will need a way to run a Postgres database server. The `dev.sh` script (described below) can set up a Postgres instance for you via [Docker](https://www.docker.com), but if you want to run it yourself, you’ll need:

- [PostgreSQL](https://www.postgresql.org) >= 9.5
- [postgis](https://postgis.net)

### Upgrading npm

If your npm is too old (>= 5.7 required):
@@ -116,16 +121,13 @@ cabal new-run -- test:graphql-engine-tests \

##### Running the Python test suite

1. To run the Python tests, you’ll need to install the necessary Python dependencies first. It is
   recommended that you do this in a self-contained Python venv, which is supported by Python 3.3+
   out of the box. To create one, run:
1. To run the Python tests, you’ll need to install the necessary Python dependencies first. It is recommended that you do this in a self-contained Python venv, which is supported by Python 3.3+ out of the box. To create one, run:

   ```
   python3 -m venv .python-venv
   ```

   (The second argument names a directory where the venv sandbox will be created; it can be anything
   you like, but `.python-venv` is `.gitignore`d.)
   (The second argument names a directory where the venv sandbox will be created; it can be anything you like, but `.python-venv` is `.gitignore`d.)

   With the venv created, you can enter into it in your current shell session by running:

@@ -141,11 +143,18 @@ cabal new-run -- test:graphql-engine-tests \
   pip3 install -r tests-py/requirements.txt
   ```

3. Start an instance of `graphql-engine` for the test suite to use:
3. Install the dependencies for the Node server used by the remote schema tests:

   ```
   (cd tests-py/remote_schemas/nodejs && npm ci)
   ```

4. Start an instance of `graphql-engine` for the test suite to use:

   ```
   env EVENT_WEBHOOK_HEADER=MyEnvValue \
     WEBHOOK_FROM_ENV=http://localhost:5592/ \
     SCHEDULED_TRIGGERS_WEBHOOK_DOMAIN=http://127.0.0.1:5594 \
     cabal new-run -- exe:graphql-engine \
       --database-url='postgres://<user>:<password>@<host>:<port>/<dbname>' \
       serve --stringify-numeric-types
@@ -153,7 +162,7 @@ cabal new-run -- test:graphql-engine-tests \

   The environment variables are needed for a couple tests, and the `--stringify-numeric-types` option is used to avoid the need to do floating-point comparisons.

4. With the server running, run the test suite:
5. With the server running, run the test suite:

   ```
   cd tests-py

@@ -15,10 +15,17 @@
packages: .

constraints:
  -- ensure we don’t end up with a freeze file that forces an incompatible
  -- ensure we don't end up with a freeze file that forces an incompatible
  -- version in CI for Setup.hs scripts.
  setup.Cabal <3.4

allow-newer:
  -- dependent-map depends on constraints-extras, but its bounds have not yet
  -- been relaxed for GHC 8.10.
  constraints-extras-0.3.0.2:base,
  constraints-extras-0.3.0.2:constraints,
  constraints-extras-0.3.0.2:template-haskell

package *
  optimization: 2
-- For tooling, e.g. 'weeder', and IDE-like stuff:
@@ -41,7 +48,7 @@ source-repository-package
source-repository-package
  type: git
  location: https://github.com/hasura/graphql-parser-hs.git
  tag: f4a093981ca5626982a17c2bfaad047cc0834a81
  tag: 8f1cd3a9bf6ec91f1ba1d83f704ab078113e035b

source-repository-package
  type: git

@@ -3,6 +3,12 @@
reject-unconstrained-dependencies: all

package graphql-engine
  ghc-options: -j3 -Werror
  ghc-options:
    -j3 -Werror
    -- Limit heap size to 8GB, which is the amount of available memory on a
    -- CircleCI `large` instance. (GHC interprets G as GB, i.e. 1,000^3 bytes,
    -- but instances seem to have 8GiB, i.e. 1,024^3 bytes, so that leaves us
    -- a little breathing room.)
    +RTS -M8G -RTS
  tests: true
  benchmarks: true

@@ -6,13 +6,15 @@
-- Or, if you want to customize the configuration:
--  $ cp cabal.project.dev cabal.project.local

with-compiler: ghc-8.10.1

package *
  documentation: true

package graphql-engine
  -- NOTE: this seems to work so long as there is no 'ghc-options: -O2' in the cabal file,
  -- but new-build will report 'Build profile: -O1' for some reason.
  -- See: https://github.com/haskell/cabal/issues/6221
  optimization: 0
  documentation: false
  flags: +developer

@@ -79,6 +79,8 @@ constraints: any.Cabal ==3.2.0.0,
  any.conduit ==1.3.2,
  any.connection ==0.3.1,
  any.constraints ==0.12,
  any.constraints-extras ==0.3.0.2,
  constraints-extras +build-readme,
  any.containers ==0.6.2.1,
  any.contravariant ==1.5.2,
  contravariant +semigroups +statevar +tagged,
@@ -112,8 +114,8 @@ constraints: any.Cabal ==3.2.0.0,
  any.deepseq ==1.4.4.0,
  any.deferred-folds ==0.9.10.1,
  any.dense-linear-algebra ==0.1.0.0,
  any.dependent-map ==0.2.4.0,
  any.dependent-sum ==0.4,
  any.dependent-map ==0.4.0.0,
  any.dependent-sum ==0.7.1.0,
  any.directory ==1.3.6.1,
  any.distributive ==0.6.2,
  distributive +semigroups +tagged,
@@ -127,7 +129,6 @@ constraints: any.Cabal ==3.2.0.0,
  any.erf ==2.0.0.0,
  any.errors ==2.3.0,
  any.exceptions ==0.10.4,
  exceptions +transformers-0-4,
  any.fail ==4.9.0.0,
  any.fast-logger ==3.0.1,
  any.file-embed ==0.0.11.2,
@@ -242,8 +243,6 @@ constraints: any.Cabal ==3.2.0.0,
  any.primitive-unlifted ==0.1.3.0,
  any.process ==1.6.8.2,
  any.profunctors ==5.5.2,
  any.protolude ==0.3.0,
  protolude -dev,
  any.psqueues ==0.2.7.2,
  any.quickcheck-instances ==0.3.22,
  quickcheck-instances -bytestring-builder,
@@ -277,6 +276,8 @@ constraints: any.Cabal ==3.2.0.0,
  any.simple-sendfile ==0.2.30,
  simple-sendfile +allow-bsd,
  any.socks ==0.6.1,
  any.some ==1.0.1,
  some +newtype-unsafe,
  any.split ==0.2.3.4,
  any.splitmix ==0.0.4,
  splitmix -optimised-mixer +random,

server/commit_diff.txt (new file)
@@ -0,0 +1 @@
**** Latest commit compared against master - fd7fb580831fe9054164a285441c99562f34c815
@@ -20,6 +20,11 @@ flag developer
  default: False
  manual: True

flag profiling
  description: Configures the project to be profiling-compatible
  default: False
  manual: True

common common-all
  ghc-options:
    -fmax-simplifier-iterations=20 -foptimal-applicative-do
@@ -27,16 +32,51 @@ common common-all

  if flag(developer)
    cpp-options: -DDeveloperAPIs
  if flag(profiling)
    cpp-options: -DPROFILING

  default-language: Haskell2010
  default-extensions:
    ApplicativeDo BangPatterns BlockArguments ConstraintKinds DataKinds DefaultSignatures DeriveDataTypeable
    DeriveFoldable DeriveFunctor DeriveGeneric DeriveLift DeriveTraversable DerivingVia EmptyCase
    FlexibleContexts FlexibleInstances FunctionalDependencies GeneralizedNewtypeDeriving
    InstanceSigs LambdaCase MultiParamTypeClasses MultiWayIf NamedFieldPuns NoImplicitPrelude
    OverloadedStrings QuantifiedConstraints QuasiQuotes RankNTypes RecordWildCards ScopedTypeVariables
    StandaloneDeriving TemplateHaskell TupleSections TypeApplications TypeFamilies TypeOperators
    ApplicativeDo
    BangPatterns
    BlockArguments
    ConstraintKinds
    DataKinds
    DefaultSignatures
    DeriveDataTypeable
    DeriveFoldable
    DeriveFunctor
    DeriveGeneric
    DeriveLift
    DeriveTraversable
    DerivingVia
    EmptyCase
    ExistentialQuantification
    FlexibleContexts
    FlexibleInstances
    FunctionalDependencies
    GADTs
    GeneralizedNewtypeDeriving
    InstanceSigs
    LambdaCase
    MultiParamTypeClasses
    MultiWayIf
    NamedFieldPuns
    NoImplicitPrelude
    OverloadedStrings
    QuantifiedConstraints
    QuasiQuotes
    RankNTypes
    RecordWildCards
    RoleAnnotations
    ScopedTypeVariables
    StandaloneDeriving
    TemplateHaskell
    TupleSections
    TypeApplications
    TypeFamilies
    TypeFamilyDependencies
    TypeOperators

common common-exe
  ghc-options:
@@ -50,12 +90,12 @@ common common-exe
    -- than that is highly unlikely to ever be helpful. More benchmarking would be useful to know if
    -- this is the right decision. It’s possible it would better to just turn it off completely.
    --
    -- `-kc8K` helps limit memory consumption in websockets (perhaps elsewhere) by making the
    -- cost of a thread's first (and probably only) stack overflow less severe.
    -- See: https://github.com/hasura/graphql-engine/issues/5190
    --
    -- `--disable-delayed-os-memory-return` seems to help lower reported residency, in particular
    -- in situations where we seem to be dealing with haskell heap fragmentation. This is more a
    -- workaround for limitations in monitoring tools than anything...
    "-with-rtsopts=-N -I0 -T -qn2 -kc8K --disable-delayed-os-memory-return"

@@ -68,6 +108,7 @@ library
    , validation
    , lifted-base
    , pg-client
    , validation
    , text
    , text-builder >= 0.6
    , vector-builder
@@ -100,8 +141,8 @@ library
    , http-client-tls
    , profunctors
    , deepseq
    , dependent-map >=0.2.4 && <0.4
    , dependent-sum >=0.4 && <0.5
    , dependent-map >=0.4 && <0.5
    , dependent-sum >=0.7.1 && <0.8
    , exceptions
    , these
    , semialign
@@ -133,10 +174,10 @@ library
    , lens

    -- GraphQL related
    , graphql-parser
    , graphql-parser >=0.2 && <0.3

    -- URL parser related
    , network-uri
    , network-uri >=2.6.3.0 && <2.7
    , uri-encode

    -- String related
@@ -208,23 +249,35 @@ library
    , generic-arbitrary
    , quickcheck-instances

    -- 0.6.1 is supposedly not okay for ghc 8.6:
    -- https://github.com/nomeata/ghc-heap-view/issues/27
    , ghc-heap-view

    , directory

    , random
    , mmorph
    , http-api-data
    , lens-aeson
    , safe

    , semigroups >= 0.19.1

    -- scheduled triggers
    , cron >= 0.6.2

  if !flag(profiling)
    build-depends:
      -- 0.6.1 is supposedly not okay for ghc 8.6:
      -- https://github.com/nomeata/ghc-heap-view/issues/27
      ghc-heap-view

  exposed-modules: Control.Arrow.Extended
                 , Control.Arrow.Trans
                 , Control.Concurrent.Extended
@@ -246,76 +299,85 @@ library
                 , Data.Time.Clock.Units
                 , Data.URL.Template
                 , Hasura.App
                 , Hasura.Cache.Bounded
                 , Hasura.Db
                 -- Exposed for benchmark:
                 , Hasura.Cache.Bounded
                 , Hasura.Logging
                 , Hasura.HTTP
                 , Hasura.Incremental
                 , Hasura.Server.App
                 , Hasura.Server.Auth
                 , Hasura.Server.Init
                 , Hasura.Server.Init.Config
                 , Hasura.Server.API.Query
                 , Hasura.Server.Utils
                 , Hasura.Server.Version
                 , Hasura.Server.Logging
                 , Hasura.Server.Migrate
                 , Hasura.Server.Compression
                 , Hasura.Server.API.PGDump
                 , Hasura.Prelude

                 , Hasura.EncJSON
                 , Hasura.Eventing.Common
                 , Hasura.Eventing.EventTrigger
                 , Hasura.Eventing.HTTP
                 , Hasura.Eventing.ScheduledTrigger
                 , Hasura.GraphQL.Context
                 , Hasura.GraphQL.Execute.Query
                 , Hasura.GraphQL.Logging
                 , Hasura.Incremental.Select
                 , Hasura.RQL.DML.Select
                 , Hasura.RQL.Types.Run
                 , Hasura.Session

                 -- exposed for Pro
                 , Hasura.Server.API.Config
                 , Hasura.Server.Telemetry
                 -- Exposed for testing:
                 , Hasura.Server.Telemetry.Counters
                 , Hasura.Server.Auth.JWT
                 , Hasura.GraphQL.Execute
                 , Hasura.GraphQL.Execute.LiveQuery
                 , Hasura.GraphQL.Execute.LiveQuery.Options
                 , Hasura.GraphQL.Execute.LiveQuery.Plan
                 , Hasura.GraphQL.Execute.LiveQuery.Poll
                 , Hasura.GraphQL.Execute.LiveQuery.State
                 , Hasura.GraphQL.Execute.LiveQuery.TMap
                 , Hasura.GraphQL.Execute.Plan
                 , Hasura.GraphQL.Execute.Query
                 , Hasura.GraphQL.Explain
                 , Hasura.GraphQL.Logging
                 , Hasura.GraphQL.NormalForm
                 , Hasura.GraphQL.RelaySchema
                 , Hasura.GraphQL.RemoteServer
                 , Hasura.GraphQL.Resolve
                 , Hasura.GraphQL.Resolve.Action
                 , Hasura.GraphQL.Resolve.BoolExp
                 , Hasura.GraphQL.Resolve.Context
                 , Hasura.GraphQL.Resolve.InputValue
                 , Hasura.GraphQL.Resolve.Insert
                 , Hasura.GraphQL.Resolve.Introspect
                 , Hasura.GraphQL.Resolve.Mutation
                 , Hasura.GraphQL.Resolve.Select
                 , Hasura.GraphQL.Resolve.Types
                 , Hasura.GraphQL.Schema
                 , Hasura.GraphQL.Schema.Action
                 , Hasura.GraphQL.Schema.BoolExp
                 , Hasura.GraphQL.Schema.Builder
                 , Hasura.GraphQL.Schema.Common
                 , Hasura.GraphQL.Schema.CustomTypes
                 , Hasura.GraphQL.Schema.Function
                 , Hasura.GraphQL.Schema.Merge
                 , Hasura.GraphQL.Schema.Mutation.Common
                 , Hasura.GraphQL.Schema.Mutation.Delete
                 , Hasura.GraphQL.Schema.Mutation.Insert
                 , Hasura.GraphQL.Schema.Mutation.Update
                 , Hasura.GraphQL.Schema.OrderBy
                 , Hasura.GraphQL.Schema.Select
                 , Hasura.GraphQL.Transport.HTTP
                 , Hasura.GraphQL.Transport.HTTP.Protocol
                 , Hasura.GraphQL.Transport.WebSocket
                 , Hasura.GraphQL.Transport.WebSocket.Protocol
                 , Hasura.GraphQL.Transport.WebSocket.Server
                 , Hasura.GraphQL.Utils
                 , Hasura.GraphQL.Validate
                 , Hasura.GraphQL.Validate.Context
                 , Hasura.GraphQL.Validate.InputValue
                 , Hasura.GraphQL.Validate.SelectionSet
                 , Hasura.GraphQL.Validate.Types
                 , Hasura.HTTP
                 , Hasura.Incremental
                 , Hasura.Incremental.Internal.Cache
                 , Hasura.Incremental.Internal.Dependency
                 , Hasura.Incremental.Internal.Rule
                 , Hasura.Incremental.Select
                 , Hasura.Logging
                 , Hasura.Prelude
                 , Hasura.Server.Auth.WebHook
                 , Hasura.Server.Middleware
                 , Hasura.Server.Cors
                 , Hasura.Server.CheckUpdates
                 , Hasura.Server.SchemaUpdate
                 , Hasura.Server.Migrate.Version
                 , Hasura.Server.Auth.JWT.Internal
                 , Hasura.Server.Auth.JWT.Logging
                 , Hasura.RQL.Instances
                 , Hasura.RQL.Types
                 , Hasura.RQL.Types.SchemaCache
                 , Hasura.RQL.Types.Table
                 , Hasura.RQL.Types.SchemaCache.Build
                 , Hasura.RQL.Types.SchemaCacheTypes
                 , Hasura.RQL.Types.BoolExp
                 , Hasura.RQL.Types.Function
                 , Hasura.RQL.Types.Catalog
                 , Hasura.RQL.Types.Column
                 , Hasura.RQL.Types.Common
                 , Hasura.RQL.Types.ComputedField
                 , Hasura.RQL.Types.DML
                 , Hasura.RQL.Types.Error
                 , Hasura.RQL.Types.EventTrigger
                 , Hasura.RQL.Types.Metadata
                 , Hasura.RQL.Types.Permission
                 , Hasura.RQL.Types.QueryCollection
                 , Hasura.RQL.Types.Action
                 , Hasura.RQL.Types.RemoteSchema
                 , Hasura.RQL.Types.RemoteRelationship
                 , Hasura.RQL.Types.ScheduledTrigger
                 , Hasura.RQL.DDL.Action
                 , Hasura.RQL.DDL.ComputedField
                 , Hasura.RQL.DDL.CustomTypes
                 , Hasura.RQL.Types.CustomTypes
                 , Hasura.RQL.DDL.Deps
                 , Hasura.RQL.DDL.EventTrigger
                 , Hasura.RQL.DDL.Headers
                 , Hasura.RQL.DDL.Metadata
                 , Hasura.RQL.DDL.Metadata.Generator
@@ -329,7 +391,6 @@ library
                 , Hasura.RQL.DDL.RemoteRelationship
                 , Hasura.RQL.DDL.RemoteRelationship.Validate
                 , Hasura.RQL.DDL.RemoteSchema
                 , Hasura.RQL.DDL.ScheduledTrigger
                 , Hasura.RQL.DDL.Schema
                 , Hasura.RQL.DDL.Schema.Cache
                 , Hasura.RQL.DDL.Schema.Cache.Common
@@ -343,66 +404,63 @@ library
                 , Hasura.RQL.DDL.Schema.Rename
                 , Hasura.RQL.DDL.Schema.Table
                 , Hasura.RQL.DDL.Utils
                 , Hasura.RQL.DML.Count
                 , Hasura.RQL.DDL.EventTrigger
                 , Hasura.RQL.DDL.ScheduledTrigger
                 , Hasura.RQL.DML.Delete
                 , Hasura.RQL.DML.Insert
                 , Hasura.RQL.DML.Delete.Types
                 , Hasura.RQL.DML.Internal
                 , Hasura.RQL.DML.Insert
                 , Hasura.RQL.DML.Insert.Types
                 , Hasura.RQL.DML.Mutation
                 , Hasura.RQL.DML.RemoteJoin
                 , Hasura.RQL.DML.Returning
                 , Hasura.RQL.DML.Select
                 , Hasura.RQL.DML.Returning.Types
                 , Hasura.RQL.DML.Select.Internal
                 , Hasura.RQL.DML.Select.Types
                 , Hasura.RQL.DML.Update
                 , Hasura.RQL.DML.Update.Types
                 , Hasura.RQL.DML.Count
                 , Hasura.RQL.GBoolExp
                 , Hasura.RQL.Instances
                 , Hasura.RQL.Types
                 , Hasura.RQL.Types.Action
                 , Hasura.RQL.Types.BoolExp
                 , Hasura.RQL.Types.Catalog
                 , Hasura.RQL.Types.Column
                 , Hasura.RQL.Types.Common
                 , Hasura.RQL.Types.ComputedField
                 , Hasura.RQL.Types.CustomTypes
                 , Hasura.RQL.Types.DML
                 , Hasura.RQL.Types.Error
                 , Hasura.RQL.Types.EventTrigger
                 , Hasura.RQL.Types.Function
                 , Hasura.RQL.Types.Metadata
                 , Hasura.RQL.Types.Permission
                 , Hasura.RQL.Types.QueryCollection
                 , Hasura.RQL.Types.RemoteRelationship
                 , Hasura.RQL.Types.RemoteSchema
                 , Hasura.RQL.Types.Run
                 , Hasura.RQL.Types.ScheduledTrigger
                 , Hasura.RQL.Types.SchemaCache
                 , Hasura.RQL.Types.SchemaCache.Build
                 , Hasura.RQL.Types.SchemaCacheTypes
                 , Hasura.RQL.Types.Table
                 , Hasura.Server.API.Config
                 , Hasura.Server.API.PGDump
                 , Hasura.Server.API.Query
                 , Hasura.Server.App
                 , Hasura.Server.Auth
                 , Hasura.Server.Auth.JWT
                 , Hasura.Server.Auth.JWT.Internal
                 , Hasura.Server.Auth.JWT.Logging
                 , Hasura.Server.Auth.WebHook
                 , Hasura.Server.CheckUpdates
                 , Hasura.Server.Compression
                 , Hasura.Server.Cors
                 , Hasura.Server.Init
                 , Hasura.Server.Init.Config
                 , Hasura.Server.Logging
                 , Hasura.Server.Middleware
                 , Hasura.Server.Migrate
                 , Hasura.Server.Migrate.Version
                 , Hasura.Server.SchemaUpdate
                 , Hasura.Server.Telemetry
                 , Hasura.Server.Telemetry.Counters
                 , Hasura.Server.Utils
                 , Hasura.Server.Version
                 , Hasura.Session
                 , Hasura.GraphQL.Explain
                 , Hasura.GraphQL.Execute.Action
                 , Hasura.GraphQL.Execute.Inline
                 , Hasura.GraphQL.Execute.Insert
                 , Hasura.GraphQL.Execute.Plan
                 , Hasura.GraphQL.Execute.Types
                 , Hasura.GraphQL.Execute.Mutation
                 , Hasura.GraphQL.Execute.Resolve
                 , Hasura.GraphQL.Execute.Prepare
                 , Hasura.GraphQL.Execute.LiveQuery.Options
                 , Hasura.GraphQL.Execute.LiveQuery.Plan
                 , Hasura.GraphQL.Execute.LiveQuery.Poll
                 , Hasura.GraphQL.Execute.LiveQuery.State
                 , Hasura.GraphQL.Execute.LiveQuery.TMap
                 , Hasura.GraphQL.RemoteServer
                 , Hasura.GraphQL.Context
                 , Hasura.GraphQL.Parser
                 , Hasura.GraphQL.Parser.Class
                 , Hasura.GraphQL.Parser.Collect
                 , Hasura.GraphQL.Parser.Column
                 , Hasura.GraphQL.Parser.Internal.Parser
                 , Hasura.GraphQL.Parser.Monad
                 , Hasura.GraphQL.Parser.Schema
                 , Hasura.GraphQL.Schema
                 , Hasura.GraphQL.Schema.Action
                 , Hasura.GraphQL.Schema.BoolExp
                 , Hasura.GraphQL.Schema.Common
                 , Hasura.GraphQL.Schema.Insert
                 , Hasura.GraphQL.Schema.Introspect
                 , Hasura.GraphQL.Schema.Mutation
                 , Hasura.GraphQL.Schema.OrderBy
                 , Hasura.GraphQL.Schema.Remote
                 , Hasura.GraphQL.Schema.Select
                 , Hasura.GraphQL.Schema.Table
                 , Hasura.Eventing.HTTP
                 , Hasura.Eventing.EventTrigger
                 , Hasura.Eventing.ScheduledTrigger
                 , Hasura.Eventing.Common
                 , Data.GADT.Compare.Extended
                 , Data.Tuple.Extended
                 , Hasura.SQL.DML
                 , Hasura.SQL.Error
                 , Hasura.SQL.GeoJSON
@@ -414,6 +472,7 @@ library
                 , Network.URI.Extended
                 , Network.Wai.Extended
                 , Network.Wai.Handler.WebSockets.Custom

executable graphql-engine
  import: common-all, common-exe
  hs-source-dirs: src-exec
@@ -82,7 +82,7 @@ main = defaultMain [
  -- correct, or might be incorrect for some users. Or it might be that many
  -- users interact with hasura ONLY with parameterized queries with variables,
  -- where all of these fit into a fairly small cache (but where occurrences of
  -- these are zipf-distributed). (TODO It should be simple to adapt this to the latter
  -- these are zipf-distributed). (TODO (from master) It should be simple to adapt this to the latter
  -- case (just test on zipf Word8 domain), but these benchmarks don't seem very
  -- useful if we assume we effectively get only cache hits).
  --
@@ -141,7 +141,7 @@ realisticBenches name wrk =
  _hitsMisses <- forConcurrently localPayloads $ \payloadL -> do
    foldM lookupInsertLoop (0,0) payloadL
  aft <- getMonotonicTimeNSec
  -- TODO we need to decide whether to rewrite these benchmarks or fix
  -- TODO (from master) we need to decide whether to rewrite these benchmarks or fix
  -- criterion so it can support what I want here (to run a slow benchmark
  -- perhaps one time, with an actual time limit).
  -- We should also look into just generating a report by hand that takes
@@ -16,8 +16,8 @@ module Data.Aeson.Ordered
  , decode
  , Data.Aeson.Ordered.toList
  , fromList
  , object
  , asObject
  , object
  , array
  , insert
  , delete
@@ -41,7 +41,6 @@ import qualified Data.ByteString.Lazy as L
import Data.Data
import Data.Functor
import qualified Data.HashMap.Strict as Map
import Data.HashMap.Strict.InsOrd (InsOrdHashMap)
import qualified Data.HashMap.Strict.InsOrd as OMap
import Data.Scientific
import qualified Data.Text as T
server/src-lib/Data/GADT/Compare/Extended.hs (new file)
@@ -0,0 +1,29 @@
{-# OPTIONS_GHC -fno-warn-orphans #-}
{-# LANGUAGE PolyKinds #-}

module Data.GADT.Compare.Extended
  ( module Data.GADT.Compare
  , strengthenOrdering
  , extendGOrdering
  ) where

import Prelude

import Data.GADT.Compare
import Type.Reflection

instance GEq ((:~~:) a) where
  geq HRefl HRefl = Just Refl
instance GCompare ((:~~:) a) where
  gcompare HRefl HRefl = GEQ

strengthenOrdering :: Ordering -> GOrdering a a
strengthenOrdering LT = GLT
strengthenOrdering EQ = GEQ
strengthenOrdering GT = GGT

infixr 6 `extendGOrdering`
extendGOrdering :: GOrdering a b -> (a ~ b => GOrdering c d) -> GOrdering c d
extendGOrdering GLT _ = GLT
extendGOrdering GEQ x = x
extendGOrdering GGT _ = GGT
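These helpers exist to build `GCompare` instances lexicographically: compare a type-indexed component first, and only fall through to an ordinary `Ordering` once the type indices are proven equal. A minimal, self-contained sketch of that usage, assuming hypothetical `Tag`/`Key` types; `GOrdering` is re-declared locally so the snippet builds with `base` alone, whereas the real type comes from `Data.GADT.Compare` in the dependent-sum package:

```haskell
{-# LANGUAGE GADTs, RankNTypes #-}

-- Local stand-ins for Data.GADT.Compare; the shapes match the diff above.
data GOrdering a b where
  GLT :: GOrdering a b
  GEQ :: GOrdering a a
  GGT :: GOrdering a b

-- Promote a plain Ordering once the indices are already known equal.
strengthenOrdering :: Ordering -> GOrdering a a
strengthenOrdering LT = GLT
strengthenOrdering EQ = GEQ
strengthenOrdering GT = GGT

infixr 6 `extendGOrdering`
-- Lexicographic composition: the second comparison is only demanded (and its
-- `a ~ b` assumption only discharged) when the first comes back GEQ.
extendGOrdering :: GOrdering a b -> (a ~ b => GOrdering c d) -> GOrdering c d
extendGOrdering GLT _ = GLT
extendGOrdering GEQ x = x
extendGOrdering GGT _ = GGT

-- A hypothetical type-indexed key: a tag fixing the index, plus a plain Int.
data Tag a where
  TInt  :: Tag Int
  TBool :: Tag Bool

data Key a = Key (Tag a) Int

gcompareTag :: Tag a -> Tag b -> GOrdering a b
gcompareTag TInt  TInt  = GEQ
gcompareTag TBool TBool = GEQ
gcompareTag TInt  TBool = GLT
gcompareTag TBool TInt  = GGT

-- Compare the tags first; fall back to the Int only when the indices match.
gcompareKey :: Key a -> Key b -> GOrdering a b
gcompareKey (Key t1 n1) (Key t2 n2) =
  gcompareTag t1 t2 `extendGOrdering` strengthenOrdering (compare n1 n2)

main :: IO ()
main = case gcompareKey (Key TInt 1) (Key TInt 2) of
  GLT -> putStrLn "LT"  -- prints "LT": tags agree, so 1 < 2 decides
  GEQ -> putStrLn "EQ"
  GGT -> putStrLn "GT"
```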
@@ -10,7 +10,7 @@ import qualified Data.List as L

import Data.Hashable (Hashable)

import Prelude (Eq, Foldable, Functor, fmap, ($))
import Prelude (Eq, Foldable, Functor, flip, fmap, ($), (<>))

groupTuples
  :: (Eq k, Hashable k, Foldable t)
@@ -19,7 +19,7 @@ groupTuples =
  L.foldl' groupFlds OMap.empty
  where
    groupFlds m (k, v) =
      OMap.insertWith (\_ c -> c NE.|> v) k (NE.init v) m
      OMap.insertWith (flip (<>)) k (NE.singleton v) m

groupListWith
  :: (Eq k, Hashable k, Foldable t, Functor t)
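The rewritten `groupFlds` relies on `insertWith` applying its combining function as `f newValue oldValue`, so `flip (<>)` appends later occurrences after earlier ones. A list-based sketch of the same grouping semantics (the real code groups into a non-empty sequence rather than a list; this reimplementation is for illustration only):

```haskell
import           Data.Hashable              (Hashable)
import           Data.List                  (foldl')
import qualified Data.HashMap.Strict.InsOrd as OMap

-- List-based sketch of groupTuples' behaviour.
groupTuplesList :: (Eq k, Hashable k) => [(k, v)] -> OMap.InsOrdHashMap k [v]
groupTuplesList = foldl' step OMap.empty
  where
    -- insertWith calls the function as (f newValue oldValue); flipping
    -- (<>) therefore appends the new element, preserving encounter order.
    step m (k, v) = OMap.insertWith (flip (<>)) k [v] m

-- OMap.toList (groupTuplesList [("a",1),("b",2),("a",3)])
--   == [("a",[1,3]),("b",[2])]
```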
@@ -1,39 +1,49 @@
{-# LANGUAGE PatternSynonyms #-}
{-# LANGUAGE ViewPatterns #-}

module Data.Sequence.NonEmpty
  ( NESeq(..)
  , (<|)
  , (|>)
  , init
  ( NESeq
  , pattern (:<||)
  , pattern (:||>)
  , singleton
  , head
  , tail
  , toSeq
  , fromSeq
  , toNonEmpty
  ) where

import qualified Data.Foldable as Foldable
import qualified Data.Functor as Functor
import Prelude hiding (head, tail)

import qualified Data.List.NonEmpty as NE
import qualified Data.Sequence as Seq

import Control.DeepSeq (NFData)
import Data.Aeson
import Hasura.Incremental (Cacheable)
import Hasura.Prelude hiding (head, tail)
import Data.Foldable
import GHC.Generics (Generic)

infixr 5 <|
infixl 5 |>

newtype NESeq a
  = NESeq { unNESeq :: (a, Seq.Seq a)}
  deriving (Show, Eq, Generic, Traversable)
data NESeq a = NESeq
  { head :: a
  , tail :: Seq.Seq a
  } deriving (Show, Eq, Functor, Traversable, Generic)
instance (NFData a) => NFData (NESeq a)
instance (Cacheable a) => Cacheable (NESeq a)

instance Functor.Functor NESeq where
  fmap f (NESeq (a, rest))
    = NESeq (f a, Functor.fmap f rest)
instance Semigroup (NESeq a) where
  NESeq x xs <> NESeq y ys = NESeq x (xs Seq.>< y Seq.<| ys)

instance Foldable.Foldable NESeq where
  foldr f v = Foldable.foldr f v . toSeq
instance Foldable NESeq where
  null _ = False
  toList (NESeq x xs) = x : toList xs
  length (NESeq _ xs) = 1 + length xs
  foldl1 f (NESeq x xs) = foldl f x xs

  fold = fold . toSeq
  foldMap f = foldMap f . toSeq
  foldl f v = foldl f v . toSeq
  foldl' f v = foldl' f v . toSeq
  foldr f v = foldr f v . toSeq
  foldr' f v = foldr' f v . toSeq
  foldr1 f = foldr1 f . toSeq

instance FromJSON a => FromJSON (NESeq a) where
  parseJSON v = do
@@ -43,32 +53,33 @@ instance FromJSON a => FromJSON (NESeq a) where
instance ToJSON a => ToJSON (NESeq a) where
  toJSON = toJSON . toSeq

init :: a -> NESeq a
init a = NESeq (a, Seq.empty)

head :: NESeq a -> a
head = fst . unNESeq

tail :: NESeq a -> Seq.Seq a
tail = snd . unNESeq

(|>) :: NESeq a -> a -> NESeq a
(NESeq (h, l)) |> v = NESeq (h, l Seq.|> v)

(<|) :: a -> NESeq a -> NESeq a
v <| (NESeq (h, l)) = NESeq (v, h Seq.<| l)
singleton :: a -> NESeq a
singleton a = NESeq a Seq.empty

toSeq :: NESeq a -> Seq.Seq a
toSeq (NESeq (v, l)) = v Seq.<| l
toSeq (NESeq v l) = v Seq.<| l

fromSeq :: Seq.Seq a -> Maybe (NESeq a)
fromSeq = \case
  Seq.Empty -> Nothing
  h Seq.:<| l -> Just $ NESeq (h, l)
  h Seq.:<| l -> Just $ NESeq h l

toNonEmpty :: NESeq a -> NonEmpty a
toNonEmpty (NESeq (v, l)) = v NE.:| toList l
pattern (:<||) :: a -> Seq.Seq a -> NESeq a
pattern x :<|| xs = NESeq x xs
{-# COMPLETE (:<||) #-}

instance Semigroup (NESeq a) where
  (NESeq (h, l)) <> r =
    NESeq (h, l <> toSeq r)
unsnoc :: NESeq a -> (Seq.Seq a, a)
unsnoc (x :<|| (xs Seq.:|> y)) = (x Seq.:<| xs, y)
unsnoc (x :<|| Seq.Empty     ) = (Seq.Empty   , x)
{-# INLINE unsnoc #-}

pattern (:||>) :: Seq.Seq a -> a -> NESeq a
pattern xs :||> x <- (unsnoc->(!xs, x))
  where
    (x Seq.:<| xs) :||> y = x :<|| (xs Seq.:|> y)
    Seq.Empty      :||> y = y :<|| Seq.Empty
{-# COMPLETE (:||>) #-}

toNonEmpty :: NESeq a -> NE.NonEmpty a
toNonEmpty (NESeq head tail) =
  head NE.:| toList tail
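The new pattern synonyms let callers match either end of a non-empty sequence without partial functions. A small usage sketch (the function name is illustrative, not part of the patch):

```haskell
{-# LANGUAGE PatternSynonyms #-}

import qualified Data.Sequence          as Seq
import           Data.Sequence.NonEmpty

-- Both patterns carry COMPLETE pragmas, so each one alone is an
-- exhaustive match on NESeq.
firstAndLast :: NESeq a -> (a, a)
firstAndLast s@(x :<|| _) =
  case s of
    _ :||> y -> (x, y)

-- e.g. firstAndLast (1 :<|| Seq.fromList [2,3]) evaluates to (1,3)
```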
server/src-lib/Data/Tuple/Extended.hs (new file, 23 lines)
@@ -0,0 +1,23 @@
module Data.Tuple.Extended
  ( module Data.Tuple
  , curry3
  , curry4
  , uncurry3
  , uncurry4
  ) where

import Prelude

import Data.Tuple

curry3 :: ((a, b, c) -> d) -> a -> b -> c -> d
curry3 f a b c = f (a, b, c)

curry4 :: ((a, b, c, d) -> e) -> a -> b -> c -> d -> e
curry4 f a b c d = f (a, b, c, d)

uncurry3 :: (a -> b -> c -> d) -> (a, b, c) -> d
uncurry3 f (a, b, c) = f a b c

uncurry4 :: (a -> b -> c -> d -> e) -> (a, b, c, d) -> e
uncurry4 f (a, b, c, d) = f a b c d
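Usage is symmetric with the Prelude's `curry`/`uncurry`; a quick sketch with invented function names:

```haskell
import Data.Tuple.Extended (curry3, uncurry3)

add3 :: Int -> Int -> Int -> Int
add3 a b c = a + b + c

-- uncurry3 turns a 3-ary function into one taking a triple;
-- curry3 inverts that.
sumTriple :: (Int, Int, Int) -> Int
sumTriple = uncurry3 add3

-- uncurry3 add3 (1, 2, 3) == 6
-- curry3 sumTriple 1 2 3  == 6
```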
@@ -1,4 +1,5 @@
{-# LANGUAGE UndecidableInstances #-}
{-# LANGUAGE CPP #-}

module Hasura.App where

@@ -12,9 +13,12 @@ import Control.Monad.Morph (hoist)
import Control.Monad.Stateless
import Control.Monad.STM (atomically)
import Control.Monad.Trans.Control (MonadBaseControl (..))
import Control.Monad.Unique
import Data.Aeson ((.=))
import Data.Time.Clock (UTCTime)
#ifndef PROFILING
import GHC.AssertNF
#endif
import GHC.Stats
import Options.Applicative
import System.Environment (getEnvironment)
@@ -45,8 +49,8 @@ import Hasura.Eventing.EventTrigger
import Hasura.Eventing.ScheduledTrigger
import Hasura.GraphQL.Execute (MonadGQLExecutionCheck (..),
                               checkQueryInAllowlist)
import Hasura.GraphQL.Execute.Action (asyncActionsProcessor)
import Hasura.GraphQL.Logging (MonadQueryLog (..), QueryLog (..))
import Hasura.GraphQL.Resolve.Action (asyncActionsProcessor)
import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery (..))
import Hasura.GraphQL.Transport.HTTP.Protocol (toParsed)
import Hasura.Logging
@@ -165,7 +169,7 @@ data InitCtx
  }

-- | Collection of the LoggerCtx, the regular Logger and the PGLogger
-- TODO: better naming?
-- TODO (from master): better naming?
data Loggers
  = Loggers
  { _lsLoggerCtx :: !(LoggerCtx Hasura)
@@ -319,7 +323,9 @@ runHGEServer env ServeOptions{..} InitCtx{..} pgExecCtx initTime shutdownApp pos
  -- tool.
  --
  -- NOTE: be sure to compile WITHOUT code coverage, for this to work properly.
#ifndef PROFILING
  liftIO disableAssertNF
#endif

  let sqlGenCtx = SQLGenCtx soStringifyNum
      Loggers loggerCtx logger _ = _icLoggers
@@ -597,6 +603,7 @@ execQuery
  , CacheRWM m
  , MonadTx m
  , MonadIO m
  , MonadUnique m
  , HasHttpManager m
  , HasSQLGenCtx m
  , UserInfoM m
@@ -625,7 +632,7 @@ instance HttpLog AppM where
  mkHttpAccessLogContext userInfoM reqId waiReq compressedResponse qTime cType headers

instance MonadExecuteQuery AppM where
  executeQuery _ _ _ pgCtx _txAccess tx =
    ([],) <$> hoist (runQueryTx pgCtx) tx

instance UserAuthentication (Tracing.TraceT AppM) where
@@ -660,7 +667,6 @@ instance MonadQueryLog AppM where
instance WS.MonadWSLog AppM where
  logWSLog = unLogger


--- helper functions ---

mkConsoleHTML :: HasVersion => Text -> AuthMode -> Bool -> Maybe Text -> Either String Text
@@ -19,6 +19,7 @@ module Hasura.Db
  , LazyRespTx
  , defaultTxErrorHandler
  , mkTxErrorHandler
  , lazyTxToQTx
  ) where

import Control.Lens
@@ -25,7 +25,7 @@ import qualified Data.Text.Encoding as TE
import qualified Database.PG.Query as Q

-- encoded json
-- TODO: can be improved with gadts capturing bytestring, lazybytestring
-- TODO (from master): can be improved with gadts capturing bytestring, lazybytestring
-- and builder
newtype EncJSON
  = EncJSON { unEncJSON :: BB.Builder }
@@ -1,54 +1,43 @@
module Hasura.GraphQL.Context where
{-# LANGUAGE StrictData #-}

module Hasura.GraphQL.Context
  ( RoleContext(..)
  , GQLContext(..)
  , ParserFn
  , RootField(..)
  , traverseDB
  , traverseAction
  , RemoteField
  , QueryDB(..)
  , ActionQuery(..)
  , QueryRootField
  , MutationDB(..)
  , ActionMutation(..)
  , MutationRootField
  , SubscriptionRootField
  , SubscriptionRootFieldResolved
  ) where

import Hasura.Prelude

import Data.Aeson
import qualified Data.Aeson as J
import Data.Aeson.Casing
import Data.Aeson.TH
import Data.Has

import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Validate.Types
import Hasura.Session
import qualified Hasura.RQL.DML.Delete.Types as RQL
import qualified Hasura.RQL.DML.Select.Types as RQL
import qualified Hasura.RQL.DML.Update.Types as RQL
import qualified Hasura.RQL.Types.Action as RQL
import qualified Hasura.RQL.Types.RemoteSchema as RQL
import qualified Hasura.SQL.DML as S

-- | A /GraphQL context/, aka the final output of GraphQL schema generation. Used to both validate
-- incoming queries and respond to introspection queries.
--
-- Combines information from 'TyAgg', 'RootFields', and 'InsCtxMap' datatypes and adds a bit more on
-- top. Constructed via the 'mkGCtx' smart constructor.
data GCtx
  = GCtx
  -- GraphQL type information
  { _gTypes :: !TypeMap
  , _gFields :: !FieldMap
  , _gQueryRoot :: !ObjTyInfo
  , _gMutRoot :: !(Maybe ObjTyInfo)
  , _gSubRoot :: !(Maybe ObjTyInfo)
  -- Postgres type information
  , _gOrdByCtx :: !OrdByCtx
  , _gQueryCtxMap :: !QueryCtxMap
  , _gMutationCtxMap :: !MutationCtxMap
  , _gInsCtxMap :: !InsCtxMap
  } deriving (Show, Eq)
import Hasura.GraphQL.Parser
import Hasura.GraphQL.Schema.Insert (AnnInsert)

data RemoteGCtx
  = RemoteGCtx
  { _rgTypes :: !TypeMap
  , _rgQueryRoot :: !ObjTyInfo
  , _rgMutationRoot :: !(Maybe ObjTyInfo)
  , _rgSubscriptionRoot :: !(Maybe ObjTyInfo)
  } deriving (Show, Eq)

instance Has TypeMap GCtx where
  getter = _gTypes
  modifier f ctx = ctx { _gTypes = f $ _gTypes ctx }

instance ToJSON GCtx where
  toJSON _ = String "ToJSON for GCtx is not implemented"
-- | For storing both a normal GQLContext and one for the backend variant.
-- Currently, this is to enable the backend variant to have certain insert
-- permissions which the frontend variant does not.

data RoleContext a
  = RoleContext
@@ -57,37 +46,71 @@ data RoleContext a
  } deriving (Show, Eq, Functor, Foldable, Traversable)
$(deriveToJSON (aesonDrop 5 snakeCase) ''RoleContext)

type GCtxMap = Map.HashMap RoleName (RoleContext GCtx)
type RelayGCtxMap = Map.HashMap RoleName GCtx
data GQLContext = GQLContext
  { gqlQueryParser :: ParserFn (InsOrdHashMap G.Name (QueryRootField UnpreparedValue))
  , gqlMutationParser :: Maybe (ParserFn (InsOrdHashMap G.Name (MutationRootField UnpreparedValue)))
  }

queryRootNamedType :: G.NamedType
queryRootNamedType = G.NamedType "query_root"
instance J.ToJSON GQLContext where
  toJSON GQLContext{} = J.String "The GraphQL schema parsers"

mutationRootNamedType :: G.NamedType
mutationRootNamedType = G.NamedType "mutation_root"
type ParserFn a
  = G.SelectionSet G.NoFragments Variable
  -> Either (NESeq ParseError) (a, QueryReusability)

subscriptionRootNamedType :: G.NamedType
subscriptionRootNamedType = G.NamedType "subscription_root"
data RootField db remote action raw
  = RFDB db
  | RFRemote remote
  | RFAction action
  | RFRaw raw

mkQueryRootTyInfo :: [ObjFldInfo] -> ObjTyInfo
mkQueryRootTyInfo flds =
  mkHsraObjTyInfo (Just "query root") queryRootNamedType Set.empty $
    mapFromL _fiName $ schemaFld:typeFld:flds
  where
    schemaFld = mkHsraObjFldInfo Nothing "__schema" Map.empty $
                G.toGT $ G.toNT $ G.NamedType "__Schema"
    typeFld = mkHsraObjFldInfo Nothing "__type" typeFldArgs $
              G.toGT $ G.NamedType "__Type"
    typeFldArgs = mapFromL _iviName $ pure $
      InpValInfo (Just "name of the type") "name" Nothing
      $ G.toGT $ G.toNT $ G.NamedType "String"
traverseDB :: forall db db' remote action raw f
            . Applicative f
           => (db -> f db')
           -> RootField db remote action raw
           -> f (RootField db' remote action raw)
traverseDB f = \case
  RFDB x -> RFDB <$> f x
  RFRemote x -> pure $ RFRemote x
  RFAction x -> pure $ RFAction x
  RFRaw x -> pure $ RFRaw x

defaultTypes :: [TypeInfo]
defaultTypes = $(fromSchemaDocQ defaultSchema TLHasuraType)
traverseAction :: forall db remote action action' raw f
                . Applicative f
               => (action -> f action')
               -> RootField db remote action raw
               -> f (RootField db remote action' raw)
traverseAction f = \case
  RFDB x -> pure $ RFDB x
  RFRemote x -> pure $ RFRemote x
  RFAction x -> RFAction <$> f x
  RFRaw x -> pure $ RFRaw x
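Since `RootField` only ships traversals for the db and action positions, mapping an effectful resolver over every root field of a parsed operation composes `traverse` with `traverseDB`. A sketch; the resolver `f` and the wrapper function are illustrative, not part of the patch:

```haskell
import           Data.HashMap.Strict.InsOrd    (InsOrdHashMap)
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Context

-- Resolve every database root field in an operation, leaving remote,
-- action, and raw fields untouched. InsOrdHashMap is Traversable, so
-- field order is preserved.
resolveAllDB
  :: Applicative f
  => (db -> f db')
  -> InsOrdHashMap G.Name (RootField db remote action raw)
  -> f (InsOrdHashMap G.Name (RootField db' remote action raw))
resolveAllDB f = traverse (traverseDB f)
```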

emptyGCtx :: GCtx
emptyGCtx =
  let queryRoot = mkQueryRootTyInfo []
      allTys = mkTyInfoMap $ TIObj queryRoot:defaultTypes
  -- for now subscription root is query root
  in GCtx allTys mempty queryRoot Nothing Nothing mempty mempty mempty mempty
data QueryDB v
  = QDBSimple (RQL.AnnSimpleSelG v)
  | QDBPrimaryKey (RQL.AnnSimpleSelG v)
  | QDBAggregation (RQL.AnnAggregateSelectG v)
  | QDBConnection (RQL.ConnectionSelect v)

data ActionQuery v
  = AQQuery !(RQL.AnnActionExecution v)
  | AQAsync !(RQL.AnnActionAsyncQuery v)

type RemoteField = (RQL.RemoteSchemaInfo, G.Field G.NoFragments Variable)

type QueryRootField v = RootField (QueryDB v) RemoteField (ActionQuery v) J.Value

data MutationDB v
  = MDBInsert (AnnInsert v)
  | MDBUpdate (RQL.AnnUpdG v)
  | MDBDelete (RQL.AnnDelG v)

data ActionMutation v
  = AMSync !(RQL.AnnActionExecution v)
  | AMAsync !RQL.AnnActionMutationAsync

type MutationRootField v =
  RootField (MutationDB v) RemoteField (ActionMutation v) J.Value

type SubscriptionRootField v = RootField (QueryDB v) Void (RQL.AnnActionAsyncQuery v) Void
type SubscriptionRootFieldResolved = RootField (QueryDB S.SQLExp) Void RQL.AnnSimpleSel Void
@@ -1,21 +1,19 @@
module Hasura.GraphQL.Execute
  ( GQExecPlan(..)
  , EQ.GraphQLQueryType(..)

  , ExecPlanPartial
  , getExecPlanPartial

  , ExecOp(..)
  , GQExecPlanResolved
  ( EPr.ExecutionStep(..)
  , ResolvedExecutionPlan(..)
  , ET.GraphQLQueryType(..)
  , getResolvedExecPlan
  , getExecPlanPartial
  , execRemoteGQ
  , getSubsOp
  , validateSubscriptionRootField
  -- , getSubsOp

  , EP.PlanCache
  , EP.PlanCacheOptions(..)
  , EP.initPlanCache
  , EP.clearPlanCache
  , EP.dumpPlanCache
  -- , EP.PlanCache
  -- , EP.mkPlanCacheOptions
  -- , EP.PlanCacheOptions(..)
  -- , EP.initPlanCache
  -- , EP.clearPlanCache
  -- , EP.dumpPlanCache
  , EQ.PreparedSql(..)
  , ExecutionCtx(..)

@@ -23,55 +21,46 @@ module Hasura.GraphQL.Execute
  , checkQueryInAllowlist
  ) where

import Control.Lens
import Data.Has
import Hasura.Prelude

import qualified Data.Aeson as J
import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set

import qualified Data.HashSet as HS
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Types as HTTP
import qualified Network.Wai.Extended as Wai

import Hasura.EncJSON
import Hasura.GraphQL.Context
import Hasura.GraphQL.Logging
import Hasura.GraphQL.Parser.Column (UnpreparedValue)
import Hasura.GraphQL.RemoteServer (execRemoteGQ')
import Hasura.GraphQL.Resolve.Action
import Hasura.GraphQL.Resolve.Context
import Hasura.GraphQL.Schema
import Hasura.GraphQL.Transport.HTTP.Protocol
import Hasura.GraphQL.Validate.Types
import Hasura.GraphQL.Utils (showName)
import Hasura.HTTP
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.Server.Utils (RequestId)
import Hasura.Server.Version (HasVersion)
import Hasura.Session

import qualified Hasura.GraphQL.Context as GC
import qualified Hasura.GraphQL.Context as C
import qualified Hasura.GraphQL.Execute.Inline as EI

import qualified Hasura.GraphQL.Execute.LiveQuery as EL
import qualified Hasura.GraphQL.Execute.Plan as EP
import qualified Hasura.GraphQL.Execute.Mutation as EM
-- import qualified Hasura.GraphQL.Execute.Plan as EP
import qualified Hasura.GraphQL.Execute.Prepare as EPr
import qualified Hasura.GraphQL.Execute.Query as EQ
import qualified Hasura.GraphQL.Resolve as GR
import qualified Hasura.GraphQL.Validate as VQ
import qualified Hasura.GraphQL.Validate.SelectionSet as VQ
import qualified Hasura.GraphQL.Validate.Types as VT
import qualified Hasura.GraphQL.Execute.Types as ET

import qualified Hasura.Logging as L
import qualified Hasura.Server.Telemetry.Counters as Telem
import qualified Hasura.Tracing as Tracing
-- The current execution plan of a graphql operation, it is
-- currently, either local pg execution or a remote execution
--
-- The 'a' is parameterised so this AST can represent
-- intermediate passes
data GQExecPlan a
  = GExPHasura !a
  | GExPRemote !RemoteSchemaInfo !G.TypedOperationDefinition
  deriving (Functor, Foldable, Traversable)

type QueryParts = G.TypedOperationDefinition G.FragmentSpread G.Name

-- | Execution context
data ExecutionCtx
@@ -79,7 +68,7 @@ data ExecutionCtx
  { _ecxLogger :: !(L.Logger L.Hasura)
  , _ecxSqlGenCtx :: !SQLGenCtx
  , _ecxPgExecCtx :: !PGExecCtx
  , _ecxPlanCache :: !EP.PlanCache
  -- , _ecxPlanCache :: !EP.PlanCache
  , _ecxSchemaCache :: !SchemaCache
  , _ecxSchemaCacheVer :: !SchemaCacheVer
  , _ecxHttpManager :: !HTTP.Manager
@@ -90,7 +79,7 @@ data ExecutionCtx
-- before a GraphQL query should be allowed to be executed. In OSS, the safety
-- check is to check in the query is in the allow list.

-- | TODO: Limitation: This parses the query, which is not ideal if we already
-- | TODO (from master): Limitation: This parses the query, which is not ideal if we already
-- have the query cached. The parsing happens unnecessary. But getting this to
-- either return a plan or parse was tricky and complicated.
class Monad m => MonadGQLExecutionCheck m where
@@ -117,75 +106,83 @@ instance MonadGQLExecutionCheck m => MonadGQLExecutionCheck (Tracing.TraceT m) w
  checkGQLExecution ui det enableAL sc req =
    lift $ checkGQLExecution ui det enableAL sc req

-- Enforces the current limitation
assertSameLocationNodes
  :: (MonadError QErr m) => [VT.TypeLoc] -> m VT.TypeLoc
assertSameLocationNodes typeLocs =
  case Set.toList (Set.fromList typeLocs) of
    -- this shouldn't happen
    [] -> return VT.TLHasuraType
    [loc] -> return loc
    _ -> throw400 NotSupported msg
  where
    msg = "cannot mix top level fields from two different graphql servers"

-- TODO: we should fix this function asap
-- as this will fail when there is a fragment at the top level
getTopLevelNodes :: G.TypedOperationDefinition -> [G.Name]
getTopLevelNodes opDef =
  mapMaybe f $ G._todSelectionSet opDef
  where
    f = \case
      G.SelectionField fld -> Just $ G._fName fld
      G.SelectionFragmentSpread _ -> Nothing
      G.SelectionInlineFragment _ -> Nothing

gatherTypeLocs :: GCtx -> [G.Name] -> [VT.TypeLoc]
gatherTypeLocs gCtx nodes =
  catMaybes $ flip map nodes $ \node ->
    VT._fiLoc <$> Map.lookup node schemaNodes
  where
    schemaNodes =
      let qr = VT._otiFields $ _gQueryRoot gCtx
          mr = VT._otiFields <$> _gMutRoot gCtx
      in maybe qr (Map.union qr) mr

-- This is for when the graphql query is validated
type ExecPlanPartial = GQExecPlan (GCtx, VQ.RootSelectionSet)

getExecPlanPartial
  :: (MonadReusability m, MonadError QErr m)
  :: (MonadError QErr m)
  => UserInfo
  -> SchemaCache
  -> EQ.GraphQLQueryType
  -> ET.GraphQLQueryType
  -> GQLReqParsed
  -> m ExecPlanPartial
getExecPlanPartial userInfo sc queryType req = do
  let gCtx = case queryType of
        EQ.QueryHasura -> getGCtx (_uiBackendOnlyFieldAccess userInfo) sc roleName
        EQ.QueryRelay -> fromMaybe GC.emptyGCtx $ Map.lookup roleName $ scRelayGCtxMap sc

  queryParts <- flip runReaderT gCtx $ VQ.getQueryParts req

  let opDef = VQ.qpOpDef queryParts
      topLevelNodes = getTopLevelNodes opDef
      -- gather TypeLoc of topLevelNodes
      typeLocs = gatherTypeLocs gCtx topLevelNodes

  -- see if they are all the same
  typeLoc <- assertSameLocationNodes typeLocs

  case typeLoc of
    VT.TLHasuraType -> do
      rootSelSet <- runReaderT (VQ.validateGQ queryParts) gCtx
      pure $ GExPHasura (gCtx, rootSelSet)
    VT.TLRemoteType _ rsi ->
      pure $ GExPRemote rsi opDef
    VT.TLCustom ->
      throw500 "unexpected custom type for top level field"
  -> m (C.GQLContext, QueryParts)
getExecPlanPartial userInfo sc queryType req =
  (getGCtx ,) <$> getQueryParts req
  where
    roleName = _uiRole userInfo

    contextMap =
      case queryType of
        ET.QueryHasura -> scGQLContext sc
        ET.QueryRelay -> scRelayContext sc

    defaultContext =
      case queryType of
        ET.QueryHasura -> scUnauthenticatedGQLContext sc
        ET.QueryRelay -> scUnauthenticatedRelayContext sc

    getGCtx :: C.GQLContext
    getGCtx =
      case Map.lookup roleName contextMap of
        Nothing -> defaultContext
        Just (C.RoleContext frontend backend) ->
          case _uiBackendOnlyFieldAccess userInfo of
            BOFAAllowed -> fromMaybe frontend backend
            BOFADisallowed -> frontend

-- | Depending on the request parameters, fetch the correct typed operation
-- definition from the GraphQL query
getQueryParts
  :: MonadError QErr m
  => GQLReqParsed
  -> m QueryParts
getQueryParts (GQLReq opNameM q _varValsM) = do
  let (selSets, opDefs, _fragDefsL) = G.partitionExDefs $ unGQLExecDoc q
  case (opNameM, selSets, opDefs) of
    (Just opName, [], _) -> do
      let n = _unOperationName opName
          opDefM = find (\opDef -> G._todName opDef == Just n) opDefs
      onNothing opDefM $ throw400 ValidationFailed $
        "no such operation found in the document: " <> showName n
    (Just _, _, _) ->
      throw400 ValidationFailed $ "operationName cannot be used when " <>
        "an anonymous operation exists in the document"
    (Nothing, [selSet], []) ->
      return $ G.TypedOperationDefinition G.OperationTypeQuery Nothing [] [] selSet
    (Nothing, [], [opDef]) ->
      return opDef
    (Nothing, _, _) ->
      throw400 ValidationFailed $ "exactly one operation has to be present " <>
        "in the document when operationName is not specified"

-- The graphql query is resolved into a sequence of execution operations
data ResolvedExecutionPlan m
  = QueryExecutionPlan
      (EPr.ExecutionPlan (m EncJSON, EQ.GeneratedSqlMap) EPr.RemoteCall (G.Name, J.Value)) [C.QueryRootField UnpreparedValue]
      -- ^ query execution; remote schemas and introspection possible
  | MutationExecutionPlan (EPr.ExecutionPlan (m EncJSON, HTTP.ResponseHeaders) EPr.RemoteCall (G.Name, J.Value))
      -- ^ mutation execution; only __typename introspection supported
  | SubscriptionExecutionPlan EL.LiveQueryPlan
      -- ^ live query execution; remote schemas and introspection not supported

validateSubscriptionRootField
  :: MonadError QErr m
  => C.QueryRootField v -> m (C.SubscriptionRootField v)
validateSubscriptionRootField = \case
  C.RFDB x -> pure $ C.RFDB x
  C.RFAction (C.AQAsync s) -> pure $ C.RFAction s
  C.RFAction (C.AQQuery _) -> throw400 NotSupported "query actions cannot be run as a subscription"
  C.RFRemote _ -> throw400 NotSupported "subscription to remote server is not supported"
  C.RFRaw _ -> throw400 NotSupported "Introspection not supported over subscriptions"


checkQueryInAllowlist
  :: (MonadError QErr m) => Bool -> UserInfo -> GQLReqParsed -> SchemaCache -> m ()
checkQueryInAllowlist enableAL userInfo req sc =
@@ -193,24 +190,18 @@ checkQueryInAllowlist enableAL userInfo req sc =
  -- check if query is in allowlist
  when (enableAL && (_uiRole userInfo /= adminRoleName)) $ do
    let notInAllowlist =
          not $ VQ.isQueryInAllowlist (_grQuery req) (scAllowlist sc)
    when notInAllowlist $ modifyQErr modErr $ throwVE "query is not allowed"
          not $ isQueryInAllowlist (_grQuery req) (scAllowlist sc)
    when notInAllowlist $ modifyQErr modErr $ throw400 ValidationFailed "query is not allowed"

  where
    modErr e =
      let msg = "query is not in any of the allowlists"
      in e{qeInternal = Just $ J.object [ "message" J..= J.String msg]}


-- An execution operation, in case of queries and mutations it is just a
-- transaction to be executed
data ExecOp m
  = ExOpQuery !(m EncJSON) !(Maybe EQ.GeneratedSqlMap) ![GR.QueryRootFldUnresolved]
  | ExOpMutation !HTTP.ResponseHeaders !(m EncJSON)
  | ExOpSubs !EL.LiveQueryPlan

-- The graphql query is resolved into an execution operation
type GQExecPlanResolved m = GQExecPlan (ExecOp m)
isQueryInAllowlist q = HS.member gqlQuery
  where
    gqlQuery = GQLQuery $ G.ExecutableDocument $ stripTypenames $
               unGQLExecDoc q

getResolvedExecPlan
  :: forall m tx
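A consumer of the new plan type dispatches on its three constructors. A minimal sketch (the function is hypothetical; the real dispatch lives in the transport layer):

```haskell
import Hasura.GraphQL.Execute

-- Name the kind of operation a resolved plan will execute.
describePlan :: ResolvedExecutionPlan m -> String
describePlan (QueryExecutionPlan _ _)      = "query"
describePlan (MutationExecutionPlan _)     = "mutation"
describePlan (SubscriptionExecutionPlan _) = "subscription"
```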
@@ -225,207 +216,104 @@ getResolvedExecPlan
  => Env.Environment
  -> L.Logger L.Hasura
  -> PGExecCtx
  -> EP.PlanCache
  -- -> EP.PlanCache
  -> UserInfo
  -> SQLGenCtx
  -> SchemaCache
  -> SchemaCacheVer
  -> EQ.GraphQLQueryType
  -> ET.GraphQLQueryType
  -> HTTP.Manager
  -> [HTTP.Header]
  -> (GQLReqUnparsed, GQLReqParsed)
  -> m (Telem.CacheHit, GQExecPlanResolved tx)
getResolvedExecPlan env logger pgExecCtx planCache userInfo sqlGenCtx
  sc scVer queryType httpManager reqHeaders (reqUnparsed, reqParsed) = do
  -> m (Telem.CacheHit, ResolvedExecutionPlan tx)
getResolvedExecPlan env logger pgExecCtx {- planCache-} userInfo sqlGenCtx
  sc scVer queryType httpManager reqHeaders (reqUnparsed, reqParsed) = -- do

  planM <- liftIO $ EP.getPlan scVer (_uiRole userInfo) operationNameM queryStr
           queryType planCache
  let usrVars = _uiSession userInfo
  case planM of
    -- plans are only for queries and subscriptions
    Just plan -> (Telem.Hit,) . GExPHasura <$> case plan of
      EP.RPQuery queryPlan asts -> do
        (tx, genSql) <- EQ.queryOpFromPlan env httpManager reqHeaders userInfo queryVars queryPlan
        pure $ ExOpQuery tx (Just genSql) asts
      EP.RPSubs subsPlan ->
        ExOpSubs <$> EL.reuseLiveQueryPlan pgExecCtx usrVars queryVars subsPlan
    Nothing -> (Telem.Miss,) <$> noExistingPlan
  -- See Note [Temporarily disabling query plan caching]
  -- planM <- liftIO $ EP.getPlan scVer (_uiRole userInfo) opNameM queryStr
  --          queryType planCache
  -- case planM of
  --   -- plans are only for queries and subscriptions
  --   Just plan -> (Telem.Hit,) <$> case plan of
  --     EP.RPQuery queryPlan -> do
  --       -- (tx, genSql) <- EQ.queryOpFromPlan env httpManager reqHeaders userInfo queryVars queryPlan
  --       return $ QueryExecutionPlan _ -- tx (Just genSql)
  --     EP.RPSubs subsPlan ->
  --       return $ SubscriptionExecutionPlan _ -- <$> EL.reuseLiveQueryPlan pgExecCtx usrVars queryVars subsPlan
  --   Nothing -> (Telem.Miss,) <$> noExistingPlan
  (Telem.Miss,) <$> noExistingPlan
  where
    GQLReq operationNameM queryStr queryVars = reqUnparsed
    addPlanToCache plan =
      liftIO $ EP.addPlan scVer (_uiRole userInfo)
        operationNameM queryStr plan queryType planCache

    noExistingPlan :: m (GQExecPlanResolved tx)
    GQLReq opNameM queryStr queryVars = reqUnparsed
    -- addPlanToCache plan =
    --   liftIO $ EP.addPlan scVer (userRole userInfo)
    --   opNameM queryStr plan planCache
    noExistingPlan :: m (ResolvedExecutionPlan tx)
    noExistingPlan = do
      -- GraphQL requests may incorporate fragments which insert a pre-defined
      -- part of a GraphQL query. Here we make sure to remember those
      -- pre-defined sections, so that when we encounter a fragment spread
      -- later, we can inline it instead.
      -- req <- toParsed reqUnparsed
      (partialExecPlan, queryReusability) <- runReusabilityT $
        getExecPlanPartial userInfo sc queryType reqParsed
      forM partialExecPlan $ \(gCtx, rootSelSet) ->
        case rootSelSet of
          VQ.RMutation selSet -> do
            (tx, respHeaders) <- getMutOp env logger gCtx sqlGenCtx userInfo httpManager reqHeaders selSet
            pure $ ExOpMutation respHeaders tx
          VQ.RQuery selSet -> do
            (queryTx, plan, genSql, asts) <- getQueryOp env logger gCtx sqlGenCtx httpManager reqHeaders userInfo
                                             queryReusability (allowQueryActionExecuter httpManager reqHeaders) selSet
            traverse_ (addPlanToCache . flip EP.RPQuery asts) plan
            return $ ExOpQuery queryTx (Just genSql) asts
          VQ.RSubscription fields -> do
            (lqOp, plan) <- getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo queryReusability
                            (restrictActionExecuter "query actions cannot be run as a subscription") fields
            traverse_ (addPlanToCache . EP.RPSubs) plan
            return $ ExOpSubs lqOp
      let takeFragment = \case G.ExecutableDefinitionFragment f -> Just f; _ -> Nothing
          fragments =
            mapMaybe takeFragment $ unGQLExecDoc $ _grQuery reqParsed
      (gCtx, queryParts) <- getExecPlanPartial userInfo sc queryType reqParsed
      case queryParts of
        G.TypedOperationDefinition G.OperationTypeQuery _ varDefs _ selSet -> do
          -- (Here the above fragment inlining is actually executed.)
          inlinedSelSet <- EI.inlineSelectionSet fragments selSet
          -- (unpreparedQueries, _) <-
          --   E.parseGraphQLQuery gCtx varDefs
          (execPlan,asts) {-, plan-} <-
            EQ.convertQuerySelSet env logger gCtx userInfo httpManager reqHeaders inlinedSelSet varDefs (_grVariables reqUnparsed)
          -- See Note [Temporarily disabling query plan caching]
          -- traverse_ (addPlanToCache . EP.RPQuery) plan
          return $ QueryExecutionPlan execPlan asts
        G.TypedOperationDefinition G.OperationTypeMutation _ varDefs _ selSet -> do
          -- (Here the above fragment inlining is actually executed.)
          inlinedSelSet <- EI.inlineSelectionSet fragments selSet
          queryTx <- EM.convertMutationSelectionSet env logger gCtx sqlGenCtx userInfo httpManager reqHeaders
                     inlinedSelSet varDefs (_grVariables reqUnparsed)
          -- See Note [Temporarily disabling query plan caching]
          -- traverse_ (addPlanToCache . EP.RPQuery) plan
          return $ MutationExecutionPlan queryTx
        G.TypedOperationDefinition G.OperationTypeSubscription _ varDefs directives selSet -> do
          -- (Here the above fragment inlining is actually executed.)
          inlinedSelSet <- EI.inlineSelectionSet fragments selSet
          -- Parse as query to check correctness
          (unpreparedAST, _reusability) <-
            EQ.parseGraphQLQuery gCtx varDefs (_grVariables reqUnparsed) inlinedSelSet
          -- A subscription should have exactly one root field
          -- As an internal testing feature, we support subscribing to multiple
          -- root fields in a subcription. First, we check if the corresponding directive
          -- (@_multiple_top_level_fields) is set.
          case inlinedSelSet of
            [] -> throw500 "empty selset for subscription"
            [_] -> pure ()
            (_:rst) ->
              let multipleAllowed =
                    G.Directive $$(G.litName "_multiple_top_level_fields") mempty `elem` directives
              in
              unless (multipleAllowed || null rst) $
|
||||
throw400 ValidationFailed "subscriptions must select one top level field"
|
||||
validSubscriptionAST <- for unpreparedAST validateSubscriptionRootField
|
||||
(lqOp, _plan) <- EL.buildLiveQueryPlan pgExecCtx userInfo validSubscriptionAST
|
||||
-- getSubsOpM pgExecCtx userInfo inlinedSelSet
|
||||
return $ SubscriptionExecutionPlan lqOp
|
||||
|
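The subscription branch above rejects selection sets with more than one root field unless the internal `@_multiple_top_level_fields` directive is set. The rule can be sketched as a small pure helper; the name `validateSubscriptionRoots` and the `String`-based root fields are illustrative, not the real AST:

```haskell
-- Hedged sketch of the subscription root-field rule enforced above.
-- First argument: whether the multiple-top-level-fields directive is set.
validateSubscriptionRoots :: Bool -> [String] -> Either String ()
validateSubscriptionRoots _ []  = Left "empty selset for subscription"
validateSubscriptionRoots _ [_] = Right ()
validateSubscriptionRoots multipleAllowed _
  | multipleAllowed = Right ()
  | otherwise       = Left "subscriptions must select one top level field"
```

With the directive unset, only a single root field passes; with it set, any non-empty selection set is accepted.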
-- Monad for resolving a hasura query/mutation
type E m =
  ReaderT ( UserInfo
          , QueryCtxMap
          , MutationCtxMap
          , TypeMap
          , FieldMap
          , OrdByCtx
          , InsCtxMap
          , SQLGenCtx
          , L.Logger L.Hasura
          ) (ExceptT QErr m)

runE
  :: (MonadError QErr m)
  => L.Logger L.Hasura
  -> GCtx
  -> SQLGenCtx
  -> UserInfo
  -> E m a
  -> m a
runE logger ctx sqlGenCtx userInfo action = do
  res <- runExceptT $ runReaderT action
    (userInfo, queryCtxMap, mutationCtxMap, typeMap, fldMap, ordByCtx, insCtxMap, sqlGenCtx, logger)
  either throwError return res
  where
    queryCtxMap = _gQueryCtxMap ctx
    mutationCtxMap = _gMutationCtxMap ctx
    typeMap = _gTypes ctx
    fldMap = _gFields ctx
    ordByCtx = _gOrdByCtx ctx
    insCtxMap = _gInsCtxMap ctx
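`runE` runs a `ReaderT` of context values over `ExceptT` and rethrows the resulting error into the caller's monad. A minimal self-contained sketch of the same shape (the `App`/`runApp` names are illustrative):

```haskell
import Control.Monad.Except (ExceptT, MonadError, runExceptT, throwError)
import Control.Monad.Reader (ReaderT, asks, runReaderT)

-- Illustrative miniature of the E monad: a read-only environment over
-- ExceptT, peeled exactly like runE (run both layers, then rethrow).
type App env err m a = ReaderT env (ExceptT err m) a

runApp :: MonadError err m => env -> App env err m a -> m a
runApp env action =
  runExceptT (runReaderT action env) >>= either throwError pure
```

With `m ~ Either String`, errors thrown inside the stack surface as `Left` values in the base monad, mirroring how `runE` surfaces `QErr`.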
getQueryOp
  :: ( HasVersion
     , MonadError QErr m
     , MonadIO m
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> L.Logger L.Hasura
  -> GCtx
  -> SQLGenCtx
  -> HTTP.Manager
  -> [HTTP.Header]
  -> UserInfo
  -> QueryReusability
  -> QueryActionExecuter
  -> VQ.ObjectSelectionSet
  -> m (tx EncJSON, Maybe EQ.ReusableQueryPlan, EQ.GeneratedSqlMap, [GR.QueryRootFldUnresolved])
getQueryOp env logger gCtx sqlGenCtx manager reqHdrs userInfo queryReusability actionExecuter selSet =
  runE logger gCtx sqlGenCtx userInfo $ EQ.convertQuerySelSet env manager reqHdrs queryReusability selSet actionExecuter

resolveMutSelSet
  :: ( HasVersion
     , MonadError QErr m
     , MonadReader r m
     , Has UserInfo r
     , Has MutationCtxMap r
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has InsCtxMap r
     , Has HTTP.Manager r
     , Has [HTTP.Header] r
     , Has (L.Logger L.Hasura) r
     , MonadIO m
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> VQ.ObjectSelectionSet
  -> m (tx EncJSON, HTTP.ResponseHeaders)
resolveMutSelSet env fields = do
  aliasedTxs <- traverseObjectSelectionSet fields $ \fld ->
    case VQ._fName fld of
      "__typename" -> return (return $ encJFromJValue mutationRootNamedType, [])
      _ -> evalReusabilityT $ GR.mutFldToTx env fld

  -- combines all transactions into a single transaction
  return (toSingleTx aliasedTxs, concatMap (snd . snd) aliasedTxs)
  where
    -- A list of aliased transactions, e.g.
    --   [("f1", Tx r1), ("f2", Tx r2)]
    -- is converted into a single transaction as follows:
    --   Tx {"f1": r1, "f2": r2}
    -- toSingleTx :: [(Text, LazyRespTx)] -> LazyRespTx
    toSingleTx aliasedTxs =
      fmap encJFromAssocList $ forM aliasedTxs $ \(al, (tx, _)) -> (,) al <$> tx
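The `toSingleTx` helper above sequences a list of aliased transactions into one transaction producing a single response object. A simplified pure sketch of that combination step, using plain pairs instead of `EncJSON` (the `combineAliased` name is illustrative):

```haskell
-- Sketch of the toSingleTx idea: [("f1", m r1), ("f2", m r2)] is
-- sequenced into one action producing [("f1", r1), ("f2", r2)].
-- The real code additionally encodes the result as a JSON object.
combineAliased :: Monad m => [(String, m r)] -> m [(String, r)]
combineAliased aliasedTxs =
  mapM (\(alias, tx) -> (,) alias <$> tx) aliasedTxs
```

Running it in `Identity` makes the shape of the result obvious; in the real code `m` is the transaction monad.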
getMutOp
  :: ( HasVersion
     , MonadError QErr m
     , MonadIO m
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> L.Logger L.Hasura
  -> GCtx
  -> SQLGenCtx
  -> UserInfo
  -> HTTP.Manager
  -> [HTTP.Header]
  -> VQ.ObjectSelectionSet
  -> m (tx EncJSON, HTTP.ResponseHeaders)
getMutOp env logger ctx sqlGenCtx userInfo manager reqHeaders selSet =
  peelReaderT $ resolveMutSelSet env selSet
  where
    peelReaderT action =
      runReaderT action
        ( userInfo, queryCtxMap, mutationCtxMap
        , typeMap, fldMap, ordByCtx, insCtxMap, sqlGenCtx
        , manager, reqHeaders, logger
        )
      where
        queryCtxMap = _gQueryCtxMap ctx
        mutationCtxMap = _gMutationCtxMap ctx
        typeMap = _gTypes ctx
        fldMap = _gFields ctx
        ordByCtx = _gOrdByCtx ctx
        insCtxMap = _gInsCtxMap ctx

getSubsOp
  :: ( MonadError QErr m
     , MonadIO m
     , HasVersion
     , Tracing.MonadTrace m
     )
  => Env.Environment
  -> L.Logger L.Hasura
  -> PGExecCtx
  -> GCtx
  -> SQLGenCtx
  -> UserInfo
  -> QueryReusability
  -> QueryActionExecuter
  -> VQ.ObjectSelectionSet
  -> m (EL.LiveQueryPlan, Maybe EL.ReusableLiveQueryPlan)
getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo queryReusability actionExecuter =
  runE logger gCtx sqlGenCtx userInfo .
    EL.buildLiveQueryPlan env pgExecCtx queryReusability actionExecuter
-- forM partialExecPlan $ \(gCtx, rootSelSet) ->
--   case rootSelSet of
--     VQ.RMutation selSet -> do
--       (tx, respHeaders) <- getMutOp gCtx sqlGenCtx userInfo httpManager reqHeaders selSet
--       pure $ ExOpMutation respHeaders tx
--     VQ.RQuery selSet -> do
--       (queryTx, plan, genSql) <- getQueryOp gCtx sqlGenCtx userInfo queryReusability (allowQueryActionExecuter httpManager reqHeaders) selSet
--       traverse_ (addPlanToCache . EP.RPQuery) plan
--       return $ ExOpQuery queryTx (Just genSql)
--     VQ.RSubscription fld -> do
--       (lqOp, plan) <- getSubsOp pgExecCtx gCtx sqlGenCtx userInfo queryReusability (restrictActionExecuter "query actions cannot be run as a subscription") fld
--       traverse_ (addPlanToCache . EP.RPSubs) plan
--       return $ ExOpSubs lqOp

execRemoteGQ
  :: ( HasVersion
@@ -441,13 +329,14 @@ execRemoteGQ
  -> [HTTP.Header]
  -> GQLReqUnparsed
  -> RemoteSchemaInfo
  -> G.OperationType
  -> G.TypedOperationDefinition G.NoFragments G.Name
  -> m (DiffTime, HttpResponse EncJSON)
  -- ^ Also returns time spent in http request, for telemetry.
execRemoteGQ env reqId userInfo reqHdrs q rsi opType = do
execRemoteGQ env reqId userInfo reqHdrs q rsi opDef = do
  execCtx <- ask
  let logger = _ecxLogger execCtx
      manager = _ecxHttpManager execCtx
      opType = G._todType opDef
  logQueryLog logger q Nothing reqId
  (time, respHdrs, resp) <- execRemoteGQ' env manager userInfo reqHdrs q rsi opType
  let !httpResp = HttpResponse (encJFromLBS resp) respHdrs
@@ -1,35 +1,21 @@
-- This pragma is needed for allowQueryActionExecuter
{-# OPTIONS_GHC -fno-warn-redundant-constraints #-}

module Hasura.GraphQL.Resolve.Action
  ( resolveActionMutation
module Hasura.GraphQL.Execute.Action
  ( ActionExecuteTx
  , ActionExecuteResult(..)
  , resolveAsyncActionQuery
  , asyncActionsProcessor
  , resolveActionQuery
  , mkJsonAggSelect
  , QueryActionExecuter
  , allowQueryActionExecuter
  , restrictActionExecuter
  , resolveActionExecution
  , resolveActionMutationAsync
  ) where

import Hasura.Prelude

import Control.Concurrent (threadDelay)
import Control.Exception (try)
import Control.Lens
import Control.Monad.Trans.Control (MonadBaseControl)
import Data.Has
import Data.Int (Int64)
import Data.IORef

import qualified Control.Concurrent.Async.Lifted.Safe as LA
import qualified Data.Environment as Env
import qualified Data.Aeson as J
import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.TH as J
import qualified Data.ByteString.Lazy as BL
import qualified Data.CaseInsensitive as CI
import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Data.Text as T
import qualified Data.UUID as UUID
import qualified Database.PG.Query as Q
@@ -38,19 +24,29 @@ import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Types as HTTP
import qualified Network.Wreq as Wreq

import qualified Hasura.GraphQL.Resolve.Select as GRS

import qualified Hasura.RQL.DML.RemoteJoin as RJ
import qualified Hasura.RQL.DML.Select as RS
import qualified Hasura.Tracing as Tracing
-- import qualified Hasura.GraphQL.Resolve.Select as GRS

import qualified Hasura.Logging as L

import Hasura.EncJSON
import Hasura.GraphQL.Resolve.Context
import Hasura.GraphQL.Resolve.InputValue
import Hasura.GraphQL.Resolve.Select (processTableSelectionSet)
import Hasura.GraphQL.Validate.SelectionSet
import Hasura.GraphQL.Execute.Prepare
import Hasura.GraphQL.Parser hiding (column)
import Hasura.GraphQL.Utils (showNames)
import Hasura.HTTP
import Hasura.RQL.DDL.Headers (makeHeadersFromConf, toHeadersConf)
import Hasura.RQL.DDL.Headers
import Hasura.RQL.DDL.Schema.Cache
import Hasura.RQL.DML.Select (asSingleRowJsonResp)
import Hasura.RQL.Types
@@ -61,6 +57,9 @@ import Hasura.Session
import Hasura.SQL.Types
import Hasura.SQL.Value (PGScalarValue (..), toTxtValue)

type ActionExecuteTx =
  forall tx. (MonadIO tx, MonadTx tx, Tracing.MonadTrace tx) => tx EncJSON

newtype ActionContext
  = ActionContext {_acName :: ActionName}
  deriving (Show, Eq)
@@ -82,19 +81,19 @@ data ActionWebhookErrorResponse
$(J.deriveJSON (J.aesonDrop 5 J.snakeCase) ''ActionWebhookErrorResponse)

data ActionWebhookResponse
  = AWRArray ![J.Object]
  | AWRObject !J.Object
  = AWRArray ![Map.HashMap G.Name J.Value]
  | AWRObject !(Map.HashMap G.Name J.Value)
  deriving (Show, Eq)

instance J.FromJSON ActionWebhookResponse where
  parseJSON v = case v of
    J.Array{} -> AWRArray <$> J.parseJSON v
    J.Object o -> pure $ AWRObject o
    J.Object{} -> AWRObject <$> J.parseJSON v
    _ -> fail $ "expecting object or array of objects for action webhook response"

instance J.ToJSON ActionWebhookResponse where
  toJSON (AWRArray objects) = J.toJSON objects
  toJSON (AWRObject object) = J.toJSON object
  toJSON (AWRObject obj) = J.toJSON obj
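The `ActionWebhookResponse` parser above accepts either a single object or an array of objects and fails on anything else. A hedged sketch of that classification over a minimal JSON-like ADT (the `Json`, `WebhookResponse`, and `classify` names are illustrative; the real code uses aeson):

```haskell
-- Minimal JSON-like value, for illustration only.
data Json = JObject [(String, Json)] | JArray [Json] | JString String
  deriving (Show, Eq)

-- Mirrors the FromJSON instance above: an object becomes the single-object
-- case, an array the multi-object case, anything else the same failure.
data WebhookResponse = ManyObjects [Json] | OneObject [(String, Json)]
  deriving (Show, Eq)

classify :: Json -> Either String WebhookResponse
classify (JObject fields) = Right (OneObject fields)
classify (JArray items)   = Right (ManyObjects items)
classify _ = Left "expecting object or array of objects for action webhook response"
```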
data ActionRequestInfo
  = ActionRequestInfo
@@ -131,142 +130,55 @@ $(J.deriveJSON (J.aesonDrop 4 J.snakeCase){J.omitNothingFields=True} ''ActionHan
instance L.ToEngineLog ActionHandlerLog L.Hasura where
  toEngineLog ahl = (L.LevelInfo, L.ELTActionHandler, J.toJSON ahl)

resolveActionMutation
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , MonadIO m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has HTTP.Manager r
     , Has [HTTP.Header] r
     , Has (L.Logger L.Hasura) r
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> Field
  -> ActionMutationExecutionContext
  -> UserInfo
  -> m (tx EncJSON, HTTP.ResponseHeaders)
resolveActionMutation env field executionContext userInfo =
  case executionContext of
    ActionMutationSyncWebhook executionContextSync ->
      resolveActionMutationSync env field executionContextSync userInfo
    ActionMutationAsync ->
      (,[]) <$> resolveActionMutationAsync field userInfo

data ActionExecuteResult
  = ActionExecuteResult
  { _aerTransaction :: !ActionExecuteTx
  , _aerHeaders     :: !HTTP.ResponseHeaders
  }

-- | Synchronously execute webhook handler and resolve response to action "output"
resolveActionMutationSync
resolveActionExecution
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , MonadIO m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has HTTP.Manager r
     , Has [HTTP.Header] r
     , Has (L.Logger L.Hasura) r
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> Field
  -> ActionExecutionContext
  -> L.Logger L.Hasura
  -> UserInfo
  -> m (tx EncJSON, HTTP.ResponseHeaders)
resolveActionMutationSync env field executionContext userInfo = do
  let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field
      actionContext = ActionContext actionName
      sessionVariables = _uiSession userInfo
      handlerPayload = ActionWebhookPayload actionContext sessionVariables inputArgs
  manager <- asks getter
  reqHeaders <- asks getter
  (webhookRes, respHeaders) <- callWebhook env manager outputType outputFields reqHeaders confHeaders
  -> AnnActionExecution UnpreparedValue
  -> ActionExecContext
  -> m ActionExecuteResult
resolveActionExecution env logger userInfo annAction execContext = do
  let actionContext = ActionContext actionName
      handlerPayload = ActionWebhookPayload actionContext sessionVariables inputPayload
  (webhookRes, respHeaders) <- flip runReaderT logger $ callWebhook env manager outputType outputFields reqHeaders confHeaders
    forwardClientHeaders resolvedWebhook handlerPayload
  let webhookResponseExpression = RS.AEInput $ UVSQL $
  let webhookResponseExpression = RS.AEInput $ UVLiteral $
        toTxtValue $ WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON webhookRes
  selSet <- asObjectSelectionSet $ _fSelSet field
  selectAstUnresolved <-
    processOutputSelectionSet webhookResponseExpression outputType definitionList
      (_fType field) selSet
  astResolved <- RS.traverseAnnSimpleSelect resolveValTxt selectAstUnresolved
  let (astResolvedWithoutRemoteJoins, maybeRemoteJoins) = RJ.getRemoteJoins astResolved
      jsonAggType = mkJsonAggSelect outputType
  return $ (,respHeaders) $
    case maybeRemoteJoins of
      Just remoteJoins ->
        let query = Q.fromBuilder $ toSQL $
              RS.mkSQLSelect jsonAggType astResolvedWithoutRemoteJoins
        in RJ.executeQueryWithRemoteJoins env manager reqHeaders userInfo query [] remoteJoins
      Nothing ->
        liftTx $ asSingleRowJsonResp (Q.fromBuilder $ toSQL $ RS.mkSQLSelect jsonAggType astResolved) []

      selectAstUnresolved = processOutputSelectionSet webhookResponseExpression
        outputType definitionList annFields stringifyNum
  (astResolved, _expectedVariables) <- flip runStateT Set.empty $ RS.traverseAnnSimpleSelect prepareWithoutPlan selectAstUnresolved
  return $ ActionExecuteResult (executeAction astResolved) respHeaders
  where
    ActionExecutionContext actionName outputType outputFields definitionList resolvedWebhook confHeaders
      forwardClientHeaders = executionContext
    AnnActionExecution actionName outputType annFields inputPayload
      outputFields definitionList resolvedWebhook confHeaders
      forwardClientHeaders stringifyNum = annAction
    ActionExecContext manager reqHeaders sessionVariables = execContext

-- QueryActionExecuter is a type for a higher-order function that is used
-- to allow or disallow the contexts in which a query action can be executed.
-- We would like to explicitly control where a query action can be run.
-- Example: We do not explain a query action, so we use `restrictActionExecuter`
-- to prevent resolving the action query.
type QueryActionExecuter =
  forall m a. (MonadError QErr m)
  => (HTTP.Manager -> [HTTP.Header] -> m a)
  -> m a

allowQueryActionExecuter :: HTTP.Manager -> [HTTP.Header] -> QueryActionExecuter
allowQueryActionExecuter manager reqHeaders actionResolver =
  actionResolver manager reqHeaders

restrictActionExecuter :: Text -> QueryActionExecuter
restrictActionExecuter errMsg _ =
  throw400 NotSupported errMsg
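`QueryActionExecuter` is a rank-2 continuation: the caller hands the actual resolver to the executer, and the executer decides whether to run it or fail uniformly. A simplified standalone sketch, with `Either String` standing in for the `MonadError QErr` constraint (the `Executer`, `allowExecuter`, and `restrictExecuter` names are illustrative):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Simplified model of the allow/restrict pair above: the gate either
-- hands the environment to the resolver, or fails with a fixed message
-- without ever invoking the resolver.
type Executer env = forall a. (env -> Either String a) -> Either String a

allowExecuter :: env -> Executer env
allowExecuter environment resolver = resolver environment

restrictExecuter :: String -> Executer env
restrictExecuter errMsg _ = Left errMsg
```

Because the quantified `a` belongs to the resolver, one executer value can gate resolvers with different result types, which is exactly what makes the pattern useful for threading through the execution pipeline.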
resolveActionQuery
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , MonadIO m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has (L.Logger L.Hasura) r
     , Tracing.MonadTrace m
     )
  => Env.Environment
  -> Field
  -> ActionExecutionContext
  -> SessionVariables
  -> HTTP.Manager
  -> [HTTP.Header]
  -> m (RS.AnnSimpleSelG UnresolvedVal)
resolveActionQuery env field executionContext sessionVariables httpManager reqHeaders = do
  let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field
      actionContext = ActionContext actionName
      handlerPayload = ActionWebhookPayload actionContext sessionVariables inputArgs
  (webhookRes, _) <- callWebhook env httpManager outputType outputFields reqHeaders confHeaders
    forwardClientHeaders resolvedWebhook handlerPayload
  let webhookResponseExpression = RS.AEInput $ UVSQL $
        toTxtValue $ WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON webhookRes
  selSet <- asObjectSelectionSet $ _fSelSet field
  selectAstUnresolved <-
    processOutputSelectionSet webhookResponseExpression outputType definitionList
      (_fType field) selSet
  return selectAstUnresolved
  where
    ActionExecutionContext actionName outputType outputFields definitionList resolvedWebhook confHeaders
      forwardClientHeaders = executionContext
    executeAction :: RS.AnnSimpleSel -> ActionExecuteTx
    executeAction astResolved = do
      let (astResolvedWithoutRemoteJoins, maybeRemoteJoins) = RJ.getRemoteJoins astResolved
          jsonAggType = mkJsonAggSelect outputType
      case maybeRemoteJoins of
        Just remoteJoins ->
          let query = Q.fromBuilder $ toSQL $
                RS.mkSQLSelect jsonAggType astResolvedWithoutRemoteJoins
          in RJ.executeQueryWithRemoteJoins env manager reqHeaders userInfo query [] remoteJoins
        Nothing ->
          liftTx $ asSingleRowJsonResp (Q.fromBuilder $ toSQL $ RS.mkSQLSelect jsonAggType astResolved) []

{- Note: [Async action architecture]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -284,17 +196,13 @@ table provides the action response. See Note [Resolving async action query/subsc
-- | Resolve asynchronous action mutation which returns only the action uuid
resolveActionMutationAsync
  :: ( MonadError QErr m
     , MonadReader r m
     , Has [HTTP.Header] r
     , MonadTx tx
     )
  => Field
  -> UserInfo
  => AnnActionMutationAsync
  -> [HTTP.Header]
  -> SessionVariables
  -> m (tx EncJSON)
resolveActionMutationAsync field userInfo = do
  let sessionVariables = _uiSession userInfo
  reqHeaders <- asks getter
  let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field
resolveActionMutationAsync annAction reqHeaders sessionVariables = do
  pure $ liftTx do
    actionId <- runIdentity . Q.getRow <$> Q.withQE defaultTxErrorHandler [Q.sql|
      INSERT INTO
@@ -308,7 +216,7 @@ resolveActionMutationAsync field userInfo = do

    pure $ encJFromJValue $ UUID.toText actionId
  where
    actionName = G.unName $ _fName field
    AnnActionMutationAsync actionName inputArgs = annAction
    toHeadersMap = Map.fromList . map ((bsToTxt . CI.original) *** bsToTxt)

{- Note: [Resolving async action query/subscription]
@@ -322,67 +230,51 @@ action's type. Here, we treat the "output" field as a computed field to hdb_acti
`jsonb_to_record` as custom SQL function.
-}
-- TODO: Add tracing here? Avoided now because currently the function is pure
resolveAsyncActionQuery
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => UserInfo
  -> ActionSelectOpContext
  -> Field
  -> m GRS.AnnSimpleSelect
resolveAsyncActionQuery userInfo selectOpCtx field = do
  actionId <- withArg (_fArguments field) "id" parseActionId
  stringifyNumerics <- stringifyNum <$> asks getter
  :: UserInfo
  -> AnnActionAsyncQuery UnpreparedValue
  -> RS.AnnSimpleSelG UnpreparedValue
resolveAsyncActionQuery userInfo annAction =
  let annotatedFields = asyncFields <&> second \case
        AsyncTypename t -> RS.AFExpression t
        AsyncOutput annFields ->
          -- See Note [Resolving async action query/subscription]
          let inputTableArgument = RS.AETableRow $ Just $ Iden "response_payload"
              jsonAggSelect = mkJsonAggSelect outputType
          in RS.AFComputedField $ RS.CFSTable jsonAggSelect $
             processOutputSelectionSet inputTableArgument outputType
               definitionList annFields stringifyNumerics

  selSet <- asObjectSelectionSet $ _fSelSet field
        AsyncId -> mkAnnFldFromPGCol "id" PGUUID
        AsyncCreatedAt -> mkAnnFldFromPGCol "created_at" PGTimeStampTZ
        AsyncErrors -> mkAnnFldFromPGCol "errors" PGJSONB

  annotatedFields <- fmap (map (first FieldName)) $ traverseObjectSelectionSet selSet $ \fld ->
    case _fName fld of
      "__typename" -> return $ RS.AFExpression $ G.unName $ G.unNamedType $ _fType field
      "output" -> do
        -- See Note [Resolving async action query/subscription]
        let inputTableArgument = RS.AETableRow $ Just $ Iden "response_payload"
            ActionSelectOpContext outputType definitionList = selectOpCtx
            jsonAggSelect = mkJsonAggSelect outputType
        fldSelSet <- asObjectSelectionSet $ _fSelSet fld
        (RS.AFComputedField . RS.CFSTable jsonAggSelect)
          <$> processOutputSelectionSet inputTableArgument outputType
                definitionList (_fType fld) fldSelSet

      -- The metadata columns
      "id" -> return $ mkAnnFieldFromPGCol "id" PGUUID
      "created_at" -> return $ mkAnnFieldFromPGCol "created_at" PGTimeStampTZ
      "errors" -> return $ mkAnnFieldFromPGCol "errors" PGJSONB
      G.Name t -> throw500 $ "unexpected field in actions' httpResponse : " <> t

  let tableFromExp = RS.FromTable actionLogTable
      tableFromExp = RS.FromTable actionLogTable
      tableArguments = RS.noSelectArgs
        { RS._saWhere = Just $ mkTableBoolExpression actionId }
        { RS._saWhere = Just tableBoolExpression }
      tablePermissions = RS.TablePerm annBoolExpTrue Nothing
      selectAstUnresolved = RS.AnnSelectG annotatedFields tableFromExp tablePermissions
        tableArguments stringifyNumerics
  return selectAstUnresolved

  in RS.AnnSelectG annotatedFields tableFromExp tablePermissions
       tableArguments stringifyNumerics
  where
    AnnActionAsyncQuery actionName actionId outputType asyncFields definitionList stringifyNumerics = annAction
    actionLogTable = QualifiedObject (SchemaName "hdb_catalog") (TableName "hdb_action_log")

    -- TODO:- Avoid using PGColumnInfo
    mkAnnFieldFromPGCol column columnType =
    -- TODO (from master):- Avoid using PGColumnInfo
    mkAnnFldFromPGCol column' columnType =
      flip RS.mkAnnColumnField Nothing $
        PGColumnInfo (unsafePGCol column) (G.Name column) 0 (PGColumnScalar columnType) True Nothing
        PGColumnInfo (unsafePGCol column') (G.unsafeMkName column') 0 (PGColumnScalar columnType) True Nothing

    parseActionId annInpValue = mkParameterizablePGValue <$> asPGColumnValue annInpValue

    mkTableBoolExpression actionId =
      let actionIdColumnInfo = PGColumnInfo (unsafePGCol "id") "id" 0 (PGColumnScalar PGUUID) False Nothing
    tableBoolExpression =
      let actionIdColumnInfo = PGColumnInfo (unsafePGCol "id") $$(G.litName "id")
                               0 (PGColumnScalar PGUUID) False Nothing
          actionIdColumnEq = BoolFld $ AVCol actionIdColumnInfo [AEQ True actionId]
          sessionVarsColumnInfo = PGColumnInfo (unsafePGCol "session_variables") "session_variables"
          sessionVarsColumnInfo = PGColumnInfo (unsafePGCol "session_variables") $$(G.litName "session_variables")
                                  0 (PGColumnScalar PGJSONB) False Nothing
          sessionVarValue = UVPG $ AnnPGVal Nothing False $ WithScalarType PGJSONB
            $ PGValJSONB $ Q.JSONB $ J.toJSON $ _uiSession userInfo
          sessionVarValue = flip UVParameter Nothing $ PGColumnValue (PGColumnScalar PGJSONB) $
            WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON $ _uiSession userInfo
          sessionVarsColumnEq = BoolFld $ AVCol sessionVarsColumnInfo [AEQ True sessionVarValue]
||||
-- For non-admin roles, accessing an async action's response should be allowed only for the user
|
||||
@ -403,13 +295,13 @@ data ActionLogItem
|
||||
-- | Process async actions from hdb_catalog.hdb_action_log table. This functions is executed in a background thread.
|
||||
-- See Note [Async action architecture] above
|
||||
asyncActionsProcessor
|
||||
:: forall m void .
|
||||
( HasVersion
|
||||
, MonadIO m
|
||||
, MonadBaseControl IO m
|
||||
, LA.Forall (LA.Pure m)
|
||||
, Tracing.HasReporter m
|
||||
)
|
||||
:: forall m void
|
||||
. ( HasVersion
|
||||
, MonadIO m
|
||||
, MonadBaseControl IO m
|
||||
, LA.Forall (LA.Pure m)
|
||||
, Tracing.HasReporter m
|
||||
)
|
||||
=> Env.Environment
|
||||
-> L.Logger L.Hasura
|
||||
-> IORef (RebuildableSchemaCache Run, SchemaCacheVer)
|
||||
@ -443,8 +335,8 @@ asyncActionsProcessor env logger cacheRef pgPool httpManager = forever $ do
|
||||
actionContext = ActionContext actionName
|
||||
eitherRes <- runExceptT $ flip runReaderT logger $
|
||||
callWebhook env httpManager outputType outputFields reqHeaders confHeaders
|
||||
forwardClientHeaders webhookUrl $
|
||||
ActionWebhookPayload actionContext sessionVariables inputPayload
|
||||
forwardClientHeaders webhookUrl $
|
||||
ActionWebhookPayload actionContext sessionVariables inputPayload
|
||||
liftIO $ case eitherRes of
|
||||
Left e -> setError actionId e
|
||||
Right (responsePayload, _) -> setCompleted actionId $ J.toJSON responsePayload
|
||||
@ -529,7 +421,7 @@ callWebhook env manager outputType outputFields reqHeaders confHeaders
|
||||
requestBody = J.encode postPayload
|
||||
requestBodySize = BL.length requestBody
|
||||
url = unResolvedWebhook resolvedWebhook
|
||||
httpResponse <- do
|
||||
httpResponse <- do
|
||||
initReq <- liftIO $ HTTP.parseRequest (T.unpack url)
|
||||
let req = initReq { HTTP.method = "POST"
|
||||
, HTTP.requestHeaders = addDefaultHeaders hdrs
|
||||
@ -602,13 +494,13 @@ callWebhook env manager outputType outputFields reqHeaders confHeaders
|
||||
-- Webhook response object should conform to action output fields
|
||||
validateResponseObject obj = do
|
||||
-- Fields not specified in the output type shouldn't be present in the response
|
||||
let extraFields = filter (not . flip Map.member outputFields) $ map G.Name $ Map.keys obj
|
||||
let extraFields = filter (not . flip Map.member outputFields) $ Map.keys obj
|
||||
when (not $ null extraFields) $ throwUnexpected $
  "unexpected fields in webhook response: " <> showNames extraFields

void $ flip Map.traverseWithKey outputFields $ \fieldName fieldTy ->
  -- When a field is non-nullable, it has to be present in the response with a non-null value
  when (not $ G.isNullable fieldTy) $ case Map.lookup (G.unName fieldName) obj of
  when (not $ G.isNullable fieldTy) $ case Map.lookup fieldName obj of
    Nothing -> throwUnexpected $
      "field " <> fieldName <<> " expected in webhook response, but not found"
    Just v -> when (v == J.Null) $ throwUnexpected $
@ -619,23 +511,14 @@ mkJsonAggSelect =
  bool RS.JASSingleObject RS.JASMultipleRows . isListType

processOutputSelectionSet
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => RS.ArgumentExp UnresolvedVal
  :: RS.ArgumentExp v
  -> GraphQLType
  -> [(PGCol, PGScalarType)]
  -> G.NamedType -> ObjectSelectionSet -> m GRS.AnnSimpleSelect
processOutputSelectionSet tableRowInput actionOutputType definitionList fldTy flds = do
  stringifyNumerics <- stringifyNum <$> asks getter
  annotatedFields <- processTableSelectionSet fldTy flds
  let annSel = RS.AnnSelectG annotatedFields selectFrom
        RS.noTablePermissions RS.noSelectArgs stringifyNumerics
  pure annSel
  -> RS.AnnFieldsG v
  -> Bool
  -> RS.AnnSimpleSelG v
processOutputSelectionSet tableRowInput actionOutputType definitionList annotatedFields =
  RS.AnnSelectG annotatedFields selectFrom RS.noTablePermissions RS.noSelectArgs
  where
    jsonbToPostgresRecordFunction =
      QualifiedObject "pg_catalog" $ FunctionName $
@ -645,4 +528,3 @@ processOutputSelectionSet tableRowInput actionOutputType definitionList fldTy fl

    functionArgs = RS.FunctionArgsExp [tableRowInput] mempty
    selectFrom = RS.FromFunction jsonbToPostgresRecordFunction functionArgs $ Just definitionList
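The hunk above validates a webhook response against an action's declared output type: unexpected fields are rejected, and every non-nullable output field must be present with a non-null value. A minimal standalone sketch of that validation over a plain key-value response (the `FieldType`/`validateResponse` names are invented for illustration; the real code works over the GraphQL AST types):

```haskell
import qualified Data.Map.Strict as Map

data FieldType = Nullable | NonNullable deriving (Eq, Show)

-- Validate a decoded webhook response against the declared output fields:
-- reject unexpected keys, and require every non-nullable field to be
-- present with a non-null value (Nothing models JSON null here).
validateResponse
  :: Map.Map String FieldType        -- declared output fields
  -> Map.Map String (Maybe String)   -- response object (Nothing = null)
  -> Either String ()
validateResponse outputFields obj = do
  let extraFields = Map.keys (obj `Map.difference` outputFields)
  if not (null extraFields)
    then Left ("unexpected fields in webhook response: " ++ unwords extraFields)
    else mapM_ checkField (Map.toList outputFields)
  where
    checkField (_, Nullable) = Right ()
    checkField (name, NonNullable) = case Map.lookup name obj of
      Nothing       -> Left ("field " ++ name ++ " expected in webhook response, but not found")
      Just Nothing  -> Left ("field " ++ name ++ " must not be null")
      Just (Just _) -> Right ()
```

Running in `Either` makes the check short-circuit on the first violation, mirroring how `throwUnexpected` aborts validation above.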
180
server/src-lib/Hasura/GraphQL/Execute/Inline.hs
Normal file
@ -0,0 +1,180 @@
{-# LANGUAGE StrictData #-}

{-| This module implements /fragment inlining/, which converts all fragment
spreads in a GraphQL query to inline fragments. For example, given a query like

> query {
>   users {
>     id
>     ...userFields
>   }
> }
>
> fragment userFields on User {
>   name
>   favoriteColor
> }

the fragment inliner will convert it to this:

> query {
>   users {
>     id
>     ... on User {
>       name
>       favoriteColor
>     }
>   }
> }

This is a straightforward and mechanical transformation, but it simplifies
further processing, since we catch unbound fragments and recursive fragment
definitions early in the pipeline, so parsing does not have to worry about it.
In that sense, fragment inlining is similar to the variable resolution pass
performed by "Hasura.GraphQL.Execute.Resolve", but for fragment definitions
rather than variables. -}
module Hasura.GraphQL.Execute.Inline
  ( inlineSelectionSet
  ) where

import           Hasura.Prelude

import qualified Data.HashMap.Strict.Extended as Map
import qualified Data.HashSet                 as Set
import qualified Data.List                    as L
import qualified Data.Text                    as T

import           Control.Lens
import           Language.GraphQL.Draft.Syntax

import           Hasura.RQL.Types.Error
import           Hasura.Server.Utils
import           Hasura.SQL.Types

-- | Internal bookkeeping used during inlining.
data InlineEnv = InlineEnv
  { _ieFragmentDefinitions :: HashMap Name FragmentDefinition
  -- ^ All known fragment definitions.
  , _ieFragmentStack       :: [Name]
  -- ^ Fragments we’re currently inlining higher up in the call stack, used to
  -- detect fragment cycles.
  }

-- | Internal bookkeeping used during inlining.
newtype InlineState var = InlineState
  { _isFragmentCache :: HashMap Name (InlineFragment NoFragments var)
  -- ^ A cache of fragment definitions we’ve already inlined, so we don’t need
  -- to inline them again.
  }

$(makeLensesFor [("_ieFragmentStack", "ieFragmentStack")] ''InlineEnv)
$(makeLenses ''InlineState)

type MonadInline var m =
  ( MonadError QErr m
  , MonadReader InlineEnv m
  , MonadState (InlineState var) m
  )

-- | Inlines all fragment spreads in a 'SelectionSet'; see the module
-- documentation for "Hasura.GraphQL.Execute.Inline" for details.
inlineSelectionSet
  :: (MonadError QErr m, Foldable t)
  => t FragmentDefinition
  -> SelectionSet FragmentSpread var
  -> m (SelectionSet NoFragments var)
inlineSelectionSet fragmentDefinitions selectionSet = do
  let fragmentDefinitionMap = Map.groupOnNE _fdName fragmentDefinitions
  uniqueFragmentDefinitions <- flip Map.traverseWithKey fragmentDefinitionMap
    \fragmentName fragmentDefinitions' ->
      case fragmentDefinitions' of
        a :| [] -> return a
        _       -> throw400 ParseFailed $ "multiple definitions for fragment " <>> fragmentName
  let usedFragmentNames    = Set.fromList $ fragmentsInSelectionSet selectionSet
      definedFragmentNames = Set.fromList $ Map.keys uniqueFragmentDefinitions
      -- At the time of writing, this check is disabled using
      -- a local binding because, the master branch doesn't implement this
      -- check.
      -- TODO: Do this check using a feature flag
      isFragmentValidationEnabled = False
  when (isFragmentValidationEnabled && (usedFragmentNames /= definedFragmentNames)) $
    throw400 ValidationFailed $
      "following fragment(s) have been defined, but have not been used in the query - "
      <> T.concat (L.intersperse ", "
                   $ map unName $ Set.toList $
                   Set.difference definedFragmentNames usedFragmentNames)
  traverse inlineSelection selectionSet
    & flip evalStateT InlineState{ _isFragmentCache = mempty }
    & flip runReaderT InlineEnv
        { _ieFragmentDefinitions = uniqueFragmentDefinitions
        , _ieFragmentStack = [] }
  where
    fragmentsInSelectionSet :: SelectionSet FragmentSpread var -> [Name]
    fragmentsInSelectionSet selectionSet' = concatMap getFragFromSelection selectionSet'

    getFragFromSelection :: Selection FragmentSpread var -> [Name]
    getFragFromSelection = \case
      SelectionField fld -> fragmentsInSelectionSet $ _fSelectionSet fld
      SelectionFragmentSpread fragmentSpread -> [_fsName fragmentSpread]
      SelectionInlineFragment inlineFragment -> fragmentsInSelectionSet $ _ifSelectionSet inlineFragment

inlineSelection
  :: MonadInline var m
  => Selection FragmentSpread var
  -> m (Selection NoFragments var)
inlineSelection (SelectionField field@Field{ _fSelectionSet }) =
  withPathK "selectionSet" $ withPathK (unName $ _fName field) $ do
    selectionSet <- traverse inlineSelection _fSelectionSet
    pure $! SelectionField field{ _fSelectionSet = selectionSet }
inlineSelection (SelectionFragmentSpread spread) =
  withPathK "selectionSet" $
    SelectionInlineFragment <$> inlineFragmentSpread spread
inlineSelection (SelectionInlineFragment fragment@InlineFragment{ _ifSelectionSet }) = do
  selectionSet <- traverse inlineSelection _ifSelectionSet
  pure $! SelectionInlineFragment fragment{ _ifSelectionSet = selectionSet }

inlineFragmentSpread
  :: MonadInline var m
  => FragmentSpread var
  -> m (InlineFragment NoFragments var)
inlineFragmentSpread FragmentSpread{ _fsName, _fsDirectives } = do
  InlineEnv{ _ieFragmentDefinitions, _ieFragmentStack } <- ask
  InlineState{ _isFragmentCache } <- get

  if -- If we’ve already inlined this fragment, no need to process it again.
     | Just fragment <- Map.lookup _fsName _isFragmentCache ->
         pure $! addSpreadDirectives fragment

     -- Fragment cycles are always illegal; see
     -- http://spec.graphql.org/June2018/#sec-Fragment-spreads-must-not-form-cycles
     | (fragmentCycle, _:_) <- break (== _fsName) _ieFragmentStack ->
         throw400 ValidationFailed $ "the fragment definition(s) "
           <> englishList "and" (dquoteTxt <$> (_fsName :| reverse fragmentCycle))
           <> " form a cycle"

     -- We didn’t hit the fragment cache, so look up the definition and convert
     -- it to an inline fragment.
     | Just FragmentDefinition{ _fdTypeCondition, _fdSelectionSet }
         <- Map.lookup _fsName _ieFragmentDefinitions -> withPathK (unName _fsName) $ do

         selectionSet <- locally ieFragmentStack (_fsName :) $
           traverse inlineSelection (fmap absurd <$> _fdSelectionSet)

         let fragment = InlineFragment
               { _ifTypeCondition = Just _fdTypeCondition
               -- As far as I can tell, the GraphQL spec says that directives
               -- on the fragment definition do NOT apply to the fields in its
               -- selection set.
               , _ifDirectives = []
               , _ifSelectionSet = selectionSet
               }
         modify' $ over isFragmentCache $ Map.insert _fsName fragment
         pure $! addSpreadDirectives fragment

     -- If we get here, the fragment name is unbound; raise an error.
     -- http://spec.graphql.org/June2018/#sec-Fragment-spread-target-defined
     | otherwise -> throw400 ValidationFailed $
         "reference to undefined fragment " <>> _fsName
  where
    addSpreadDirectives fragment =
      fragment{ _ifDirectives = _ifDirectives fragment ++ _fsDirectives }
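The cycle detection in `inlineFragmentSpread` above works by carrying a stack of fragment names currently being inlined (`_ieFragmentStack`): spreading a name already on the stack is a cycle, and an unknown name is an unbound-fragment error. A toy model of that traversal over a drastically simplified selection type (the `Sel`/`inlineAll` names are invented for illustration; the real code uses the graphql-parser-hs AST plus a memoization cache):

```haskell
import qualified Data.Map.Strict as Map

-- A simplified selection: either a concrete field or a fragment spread
-- referring to a named selection set.
data Sel = SelField String | Spread String deriving (Eq, Show)

-- Inline every spread, failing on unknown fragments and on cycles.
-- The [String] argument plays the role of _ieFragmentStack.
inlineAll
  :: Map.Map String [Sel]  -- fragment definitions
  -> [String]              -- stack of fragments being inlined
  -> [Sel]
  -> Either String [Sel]
inlineAll defs stack = fmap concat . traverse go
  where
    go (SelField f) = Right [SelField f]
    go (Spread name)
      | name `elem` stack = Left ("fragment " ++ name ++ " forms a cycle")
      | otherwise = case Map.lookup name defs of
          Nothing   -> Left ("reference to undefined fragment " ++ name)
          Just sels -> inlineAll defs (name : stack) sels
```

Pushing the name onto the stack only for the recursive call is what lets the same fragment legally appear twice at sibling positions while still rejecting self-reference at any depth.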
318
server/src-lib/Hasura/GraphQL/Execute/Insert.hs
Normal file
@ -0,0 +1,318 @@
module Hasura.GraphQL.Execute.Insert
  ( traverseAnnInsert
  , convertToSQLTransaction
  ) where

import           Hasura.Prelude

import qualified Data.Aeson           as J
import qualified Data.Environment     as Env
import qualified Data.HashMap.Strict  as Map
import qualified Data.Sequence        as Seq
import qualified Data.Text            as T
import qualified Database.PG.Query    as Q

import qualified Hasura.RQL.DML.Insert          as RQL
import qualified Hasura.RQL.DML.Insert.Types    as RQL
import qualified Hasura.RQL.DML.Mutation        as RQL
import qualified Hasura.RQL.DML.RemoteJoin      as RQL
import qualified Hasura.RQL.DML.Returning       as RQL
import qualified Hasura.RQL.DML.Returning.Types as RQL
import qualified Hasura.RQL.GBoolExp            as RQL
import qualified Hasura.SQL.DML                 as S
import qualified Hasura.Tracing                 as Tracing

import           Hasura.Db
import           Hasura.EncJSON
import           Hasura.GraphQL.Schema.Insert
import           Hasura.RQL.Types
import           Hasura.Server.Version (HasVersion)
import           Hasura.SQL.Types
import           Hasura.SQL.Value


traverseAnnInsert
  :: (Applicative f)
  => (a -> f b)
  -> AnnInsert a
  -> f (AnnInsert b)
traverseAnnInsert f (AnnInsert fieldName isSingle (annIns, mutationOutput)) =
  AnnInsert fieldName isSingle
    <$> ( (,)
          <$> traverseMulti annIns
          <*> RQL.traverseMutationOutput f mutationOutput
        )
  where
    traverseMulti (AnnIns objs tableName conflictClause checkCond columns defaultValues) = AnnIns
      <$> traverse traverseObject objs
      <*> pure tableName
      <*> traverse (traverse f) conflictClause
      <*> ( (,)
            <$> traverseAnnBoolExp f (fst checkCond)
            <*> traverse (traverseAnnBoolExp f) (snd checkCond)
          )
      <*> pure columns
      <*> traverse f defaultValues
    traverseSingle (AnnIns obj tableName conflictClause checkCond columns defaultValues) = AnnIns
      <$> traverseObject obj
      <*> pure tableName
      <*> traverse (traverse f) conflictClause
      <*> ( (,)
            <$> traverseAnnBoolExp f (fst checkCond)
            <*> traverse (traverseAnnBoolExp f) (snd checkCond)
          )
      <*> pure columns
      <*> traverse f defaultValues
    traverseObject (AnnInsObj columns objRels arrRels) = AnnInsObj
      <$> traverse (traverse f) columns
      <*> traverse (traverseRel traverseSingle) objRels
      <*> traverse (traverseRel traverseMulti) arrRels
    traverseRel z (RelIns object relInfo) = RelIns <$> z object <*> pure relInfo


convertToSQLTransaction
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> AnnInsert S.SQLExp
  -> RQL.MutationRemoteJoinCtx
  -> Seq.Seq Q.PrepArg
  -> Bool
  -> m EncJSON
convertToSQLTransaction env (AnnInsert fieldName isSingle (annIns, mutationOutput)) rjCtx planVars stringifyNum =
  if null $ _aiInsObj annIns
  then pure $ RQL.buildEmptyMutResp mutationOutput
  else withPaths ["selectionSet", fieldName, "args", suffix] $
       insertMultipleObjects env annIns [] rjCtx mutationOutput planVars stringifyNum
  where
    withPaths p x = foldr ($) x $ withPathK <$> p
    suffix = bool "objects" "object" isSingle

insertMultipleObjects
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> MultiObjIns S.SQLExp
  -> [(PGCol, S.SQLExp)]
  -> RQL.MutationRemoteJoinCtx
  -> RQL.MutationOutput
  -> Seq.Seq Q.PrepArg
  -> Bool
  -> m EncJSON
insertMultipleObjects env multiObjIns additionalColumns rjCtx mutationOutput planVars stringifyNum =
  bool withoutRelsInsert withRelsInsert anyRelsToInsert
  where
    AnnIns insObjs table conflictClause checkCondition columnInfos defVals = multiObjIns
    allInsObjRels = concatMap _aioObjRels insObjs
    allInsArrRels = concatMap _aioArrRels insObjs
    anyRelsToInsert = not $ null allInsArrRels && null allInsObjRels

    withoutRelsInsert = do
      indexedForM_ (_aioColumns <$> insObjs) \column ->
        validateInsert (map fst column) [] (map fst additionalColumns)
      let columnValues = map (mkSQLRow defVals) $ union additionalColumns . _aioColumns <$> insObjs
          columnNames  = Map.keys defVals
          insertQuery  = RQL.InsertQueryP1
            table
            columnNames
            columnValues
            conflictClause
            checkCondition
            mutationOutput
            columnInfos
      RQL.execInsertQuery env stringifyNum (Just rjCtx) (insertQuery, planVars)

    withRelsInsert = do
      insertRequests <- indexedForM insObjs \obj -> do
        let singleObj = AnnIns obj table conflictClause checkCondition columnInfos defVals
        insertObject env singleObj additionalColumns rjCtx planVars stringifyNum
      let affectedRows = sum $ map fst insertRequests
          columnValues = mapMaybe snd insertRequests
      selectExpr <- RQL.mkSelCTEFromColVals table columnInfos columnValues
      let (mutOutputRJ, remoteJoins) = RQL.getRemoteJoinsMutationOutput mutationOutput
          sqlQuery = Q.fromBuilder $ toSQL $
            RQL.mkMutationOutputExp table columnInfos (Just affectedRows) selectExpr mutOutputRJ stringifyNum
      RQL.executeMutationOutputQuery env sqlQuery [] $ (,rjCtx) <$> remoteJoins

insertObject
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> SingleObjIns S.SQLExp
  -> [(PGCol, S.SQLExp)]
  -> RQL.MutationRemoteJoinCtx
  -> Seq.Seq Q.PrepArg
  -> Bool
  -> m (Int, Maybe (ColumnValues TxtEncodedPGVal))
insertObject env singleObjIns additionalColumns rjCtx planVars stringifyNum = do
  validateInsert (map fst columns) (map _riRelInfo objectRels) (map fst additionalColumns)

  -- insert all object relations and fetch this insert dependent column values
  objInsRes <- forM objectRels $ insertObjRel env planVars rjCtx stringifyNum

  -- prepare final insert columns
  let objRelAffRows = sum $ map fst objInsRes
      objRelDeterminedCols = concatMap snd objInsRes
      finalInsCols = columns <> objRelDeterminedCols <> additionalColumns

  cte <- mkInsertQ table onConflict finalInsCols defaultValues checkCond

  MutateResp affRows colVals <- liftTx $ RQL.mutateAndFetchCols table allColumns (cte, planVars) stringifyNum
  colValM <- asSingleObject colVals

  arrRelAffRows <- bool (withArrRels colValM) (return 0) $ null arrayRels
  let totAffRows = objRelAffRows + affRows + arrRelAffRows

  return (totAffRows, colValM)
  where
    AnnIns annObj table onConflict checkCond allColumns defaultValues = singleObjIns
    AnnInsObj columns objectRels arrayRels = annObj

    arrRelDepCols = flip getColInfos allColumns $
      concatMap (Map.keys . riMapping . _riRelInfo) arrayRels

    withArrRels colValM = do
      colVal <- onNothing colValM $ throw400 NotSupported cannotInsArrRelErr
      arrDepColsWithVal <- fetchFromColVals colVal arrRelDepCols
      arrInsARows <- forM arrayRels $ insertArrRel env arrDepColsWithVal rjCtx planVars stringifyNum
      return $ sum arrInsARows

    asSingleObject = \case
      []  -> pure Nothing
      [r] -> pure $ Just r
      _   -> throw500 "more than one row returned"

    cannotInsArrRelErr =
      "cannot proceed to insert array relations since insert to table "
      <> table <<> " affects zero rows"

insertObjRel
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> Seq.Seq Q.PrepArg
  -> RQL.MutationRemoteJoinCtx
  -> Bool
  -> ObjRelIns S.SQLExp
  -> m (Int, [(PGCol, S.SQLExp)])
insertObjRel env planVars rjCtx stringifyNum objRelIns =
  withPathK (relNameToTxt relName) $ do
    (affRows, colValM) <- withPathK "data" $ insertObject env singleObjIns [] rjCtx planVars stringifyNum
    colVal <- onNothing colValM $ throw400 NotSupported errMsg
    retColsWithVals <- fetchFromColVals colVal rColInfos
    let columns = flip mapMaybe (Map.toList mapCols) \(column, target) -> do
          value <- lookup target retColsWithVals
          Just (column, value)
    pure (affRows, columns)
  where
    RelIns singleObjIns relInfo = objRelIns
    relName   = riName relInfo
    table     = riRTable relInfo
    mapCols   = riMapping relInfo
    allCols   = _aiTableCols singleObjIns
    rCols     = Map.elems mapCols
    rColInfos = getColInfos rCols allCols
    errMsg = "cannot proceed to insert object relation "
      <> relName <<> " since insert to table "
      <> table <<> " affects zero rows"

insertArrRel
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> [(PGCol, S.SQLExp)]
  -> RQL.MutationRemoteJoinCtx
  -> Seq.Seq Q.PrepArg
  -> Bool
  -> ArrRelIns S.SQLExp
  -> m Int
insertArrRel env resCols rjCtx planVars stringifyNum arrRelIns =
  withPathK (relNameToTxt $ riName relInfo) $ do
    let additionalColumns = flip mapMaybe resCols \(column, value) -> do
          target <- Map.lookup column mapping
          Just (target, value)
    resBS <- withPathK "data" $
      insertMultipleObjects env multiObjIns additionalColumns rjCtx mutOutput planVars stringifyNum
    resObj <- decodeEncJSON resBS
    onNothing (Map.lookup ("affected_rows" :: T.Text) resObj) $
      throw500 "affected_rows not returned in array rel insert"
  where
    RelIns multiObjIns relInfo = arrRelIns
    mapping   = riMapping relInfo
    mutOutput = RQL.MOutMultirowFields [("affected_rows", RQL.MCount)]

-- | validate an insert object based on insert columns,
-- | insert object relations and additional columns from parent
validateInsert
  :: (MonadError QErr m)
  => [PGCol] -- ^ inserting columns
  -> [RelInfo] -- ^ object relation inserts
  -> [PGCol] -- ^ additional fields from parent
  -> m ()
validateInsert insCols objRels addCols = do
  -- validate insertCols
  unless (null insConflictCols) $ throw400 ValidationFailed $
    "cannot insert " <> showPGCols insConflictCols
    <> " columns as their values are already being determined by parent insert"

  forM_ objRels $ \relInfo -> do
    let lCols = Map.keys $ riMapping relInfo
        relName = riName relInfo
        relNameTxt = relNameToTxt relName
        lColConflicts = lCols `intersect` (addCols <> insCols)
    withPathK relNameTxt $ unless (null lColConflicts) $ throw400 ValidationFailed $
      "cannot insert object relationship " <> relName
      <<> " as " <> showPGCols lColConflicts
      <> " column values are already determined"
  where
    insConflictCols = insCols `intersect` addCols


mkInsertQ
  :: MonadError QErr m
  => QualifiedTable
  -> Maybe (RQL.ConflictClauseP1 S.SQLExp)
  -> [(PGCol, S.SQLExp)]
  -> Map.HashMap PGCol S.SQLExp
  -> (AnnBoolExpSQL, Maybe AnnBoolExpSQL)
  -> m S.CTE
mkInsertQ table onConflictM insCols defVals (insCheck, updCheck) = do
  let sqlConflict = RQL.toSQLConflict table <$> onConflictM
      sqlExps = mkSQLRow defVals insCols
      valueExp = S.ValuesExp [S.TupleExp sqlExps]
      tableCols = Map.keys defVals
      sqlInsert =
        S.SQLInsert table tableCols valueExp sqlConflict
          . Just
          $ S.RetExp
            [ S.selectStar
            , S.Extractor
                (RQL.insertOrUpdateCheckExpr table onConflictM
                  (RQL.toSQLBoolExp (S.QualTable table) insCheck)
                  (fmap (RQL.toSQLBoolExp (S.QualTable table)) updCheck))
                Nothing
            ]
  pure $ S.CTEInsert sqlInsert

fetchFromColVals
  :: MonadError QErr m
  => ColumnValues TxtEncodedPGVal
  -> [PGColumnInfo]
  -> m [(PGCol, S.SQLExp)]
fetchFromColVals colVal reqCols =
  forM reqCols $ \ci -> do
    let valM = Map.lookup (pgiColumn ci) colVal
    val <- onNothing valM $ throw500 $ "column "
      <> pgiColumn ci <<> " not found in given colVal"
    let pgColVal = case val of
          TENull  -> S.SENull
          TELit t -> S.SELit t
    return (pgiColumn ci, pgColVal)

mkSQLRow :: Map.HashMap PGCol S.SQLExp -> [(PGCol, S.SQLExp)] -> [S.SQLExp]
mkSQLRow defVals withPGCol = map snd $
  flip map (Map.toList defVals) $
    \(col, defVal) -> (col,) $ fromMaybe defVal $ Map.lookup col withPGColMap
  where
    withPGColMap = Map.fromList withPGCol

decodeEncJSON :: (J.FromJSON a, QErrM m) => EncJSON -> m a
decodeEncJSON =
  either (throw500 . T.pack) decodeValue .
  J.eitherDecode . encJToLBS
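`mkSQLRow` above is the heart of the multi-row insert: each VALUES tuple is built by walking the table's per-column default expressions in a fixed key order and overriding a default whenever the inserted object supplied a value, so every row has the same shape regardless of which columns each object mentioned. A standalone sketch of that merge, with column and value types simplified to strings (`mkRow` is a hypothetical name for illustration):

```haskell
import qualified Data.Map.Strict as Map
import Data.Maybe (fromMaybe)

-- Build one row: every column in defVals appears exactly once, in the
-- map's key order, using the supplied value when present and the column
-- default otherwise (mirroring mkSQLRow).
mkRow :: Map.Map String String -> [(String, String)] -> [String]
mkRow defVals given =
  [ fromMaybe defVal (Map.lookup col givenMap)
  | (col, defVal) <- Map.toList defVals ]
  where
    givenMap = Map.fromList given
```

Because the column order comes from `defVals` (which covers every insertable column), two objects inserting disjoint column sets still produce tuples that line up under one shared column list, which is what lets them share a single multi-row INSERT.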
@ -90,7 +90,7 @@ Additional details are provided by the documentation for individual bindings.
|
||||
module Hasura.GraphQL.Execute.LiveQuery
|
||||
( LiveQueryPlan
|
||||
, ReusableLiveQueryPlan
|
||||
, reuseLiveQueryPlan
|
||||
-- , reuseLiveQueryPlan
|
||||
, buildLiveQueryPlan
|
||||
|
||||
, LiveQueryPlanExplanation
|
||||
|
@ -18,7 +18,7 @@ module Hasura.GraphQL.Execute.LiveQuery.Plan
|
||||
, ReusableLiveQueryPlan
|
||||
, ValidatedQueryVariables
|
||||
, buildLiveQueryPlan
|
||||
, reuseLiveQueryPlan
|
||||
-- , reuseLiveQueryPlan
|
||||
|
||||
, LiveQueryPlanExplanation
|
||||
, explainLiveQueryPlan
|
||||
@ -27,41 +27,37 @@ module Hasura.GraphQL.Execute.LiveQuery.Plan
|
||||
import Hasura.Prelude
|
||||
import Hasura.Session
|
||||
|
||||
import qualified Data.Aeson.Casing as J
|
||||
import qualified Data.Aeson.Extended as J
|
||||
import qualified Data.Aeson.TH as J
|
||||
import qualified Data.ByteString as B
|
||||
import qualified Data.Environment as E
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashMap.Strict.InsOrd as OMap
|
||||
import qualified Data.Text as T
|
||||
import qualified Data.UUID.V4 as UUID
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
import qualified Data.Aeson.Casing as J
|
||||
import qualified Data.Aeson.Extended as J
|
||||
import qualified Data.Aeson.TH as J
|
||||
import qualified Data.ByteString as B
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashMap.Strict.InsOrd as OMap
|
||||
import qualified Data.Sequence as Seq
|
||||
import qualified Data.Text as T
|
||||
import qualified Data.UUID.V4 as UUID
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
|
||||
-- remove these when array encoding is merged
|
||||
import qualified Database.PG.Query.PTI as PTI
|
||||
import qualified PostgreSQL.Binary.Encoding as PE
|
||||
import qualified Database.PG.Query.PTI as PTI
|
||||
import qualified PostgreSQL.Binary.Encoding as PE
|
||||
|
||||
import Control.Lens
|
||||
import Data.Has
|
||||
import Data.UUID (UUID)
|
||||
import Data.UUID (UUID)
|
||||
|
||||
import qualified Hasura.GraphQL.Resolve as GR
|
||||
import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
|
||||
import qualified Hasura.GraphQL.Validate as GV
|
||||
import qualified Hasura.Logging as L
|
||||
import qualified Hasura.SQL.DML as S
|
||||
import qualified Hasura.Tracing as Tracing
|
||||
import qualified Hasura.GraphQL.Parser.Schema as PS
|
||||
-- import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
|
||||
import qualified Hasura.RQL.DML.RemoteJoin as RR
|
||||
import qualified Hasura.RQL.DML.Select as DS
|
||||
import qualified Hasura.SQL.DML as S
|
||||
|
||||
import Hasura.Db
|
||||
import Hasura.GraphQL.Resolve.Action
|
||||
import Hasura.GraphQL.Resolve.Types
|
||||
import Hasura.GraphQL.Utils
|
||||
import Hasura.GraphQL.Validate.SelectionSet
|
||||
import Hasura.GraphQL.Validate.Types
|
||||
import Hasura.GraphQL.Context
|
||||
import Hasura.GraphQL.Execute.Action
|
||||
import Hasura.GraphQL.Execute.Query
|
||||
import Hasura.GraphQL.Parser.Column
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.Server.Version (HasVersion)
|
||||
import Hasura.SQL.Error
|
||||
import Hasura.SQL.Types
|
||||
import Hasura.SQL.Value
|
||||
@ -72,7 +68,17 @@ import Hasura.SQL.Value
|
||||
newtype MultiplexedQuery = MultiplexedQuery { unMultiplexedQuery :: Q.Query }
|
||||
deriving (Show, Eq, Hashable, J.ToJSON)
|
||||
|
||||
mkMultiplexedQuery :: OMap.InsOrdHashMap G.Alias GR.QueryRootFldResolved -> MultiplexedQuery
|
||||
toSQLFromItem :: S.Alias -> SubscriptionRootFieldResolved -> S.FromItem
|
||||
toSQLFromItem alias = \case
|
||||
RFDB (QDBPrimaryKey s) -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
|
||||
RFDB (QDBSimple s) -> fromSelect $ DS.mkSQLSelect DS.JASMultipleRows s
|
||||
RFDB (QDBAggregation s) -> fromSelect $ DS.mkAggregateSelect s
|
||||
RFDB (QDBConnection s) -> S.mkSelectWithFromItem (DS.mkConnectionSelect s) alias
|
||||
RFAction s -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
|
||||
where
|
||||
fromSelect s = S.mkSelFromItem s alias
|
||||
|
||||
mkMultiplexedQuery :: OMap.InsOrdHashMap G.Name SubscriptionRootFieldResolved -> MultiplexedQuery
|
||||
mkMultiplexedQuery rootFields = MultiplexedQuery . Q.fromBuilder . toSQL $ S.mkSelect
|
||||
{ S.selExtr =
|
||||
-- SELECT _subs.result_id, _fld_resp.root AS result
|
||||
@ -94,38 +100,38 @@ mkMultiplexedQuery rootFields = MultiplexedQuery . Q.fromBuilder . toSQL $ S.mkS
|
||||
{ S.selExtr = [S.Extractor rootFieldsJsonAggregate (Just . S.Alias $ Iden "root")]
|
||||
, S.selFrom = Just . S.FromExp $
|
||||
flip map (OMap.toList rootFields) $ \(fieldAlias, resolvedAST) ->
|
||||
GR.toSQLFromItem (S.Alias $ aliasToIden fieldAlias) resolvedAST
|
||||
toSQLFromItem (S.Alias $ aliasToIden fieldAlias) resolvedAST
|
||||
}
|
||||
|
||||
-- json_build_object('field1', field1.root, 'field2', field2.root, ...)
|
||||
rootFieldsJsonAggregate = S.SEFnApp "json_build_object" rootFieldsJsonPairs Nothing
|
||||
rootFieldsJsonPairs = flip concatMap (OMap.keys rootFields) $ \fieldAlias ->
|
||||
[ S.SELit (G.unName $ G.unAlias fieldAlias)
|
||||
[ S.SELit (G.unName fieldAlias)
|
||||
, mkQualIden (aliasToIden fieldAlias) (Iden "root") ]
|
||||
|
||||
mkQualIden prefix = S.SEQIden . S.QIden (S.QualIden prefix Nothing) -- TODO fix this Nothing of course
|
||||
aliasToIden = Iden . G.unName . G.unAlias
|
||||
aliasToIden = Iden . G.unName
|
||||
|
||||
-- TODO fix this comment
|
||||
-- | Resolves an 'GR.UnresolvedVal' by converting 'GR.UVPG' values to SQL expressions that refer to
|
||||
-- the @result_vars@ input object, collecting variable values along the way.
|
||||
resolveMultiplexedValue
|
||||
:: (MonadState (GV.ReusableVariableValues, Seq (WithScalarType PGScalarValue)) m)
|
||||
=> GR.UnresolvedVal -> m S.SQLExp
|
||||
:: (MonadState (HashMap G.Name PGColumnValue, Seq PGColumnValue) m)
|
||||
=> UnpreparedValue -> m S.SQLExp
|
||||
resolveMultiplexedValue = \case
|
||||
GR.UVPG annPGVal -> do
|
||||
let GR.AnnPGVal varM _ colVal = annPGVal
|
||||
varJsonPath <- case varM of
|
||||
UVParameter colVal varM -> do
|
||||
varJsonPath <- case fmap PS.getName varM of
|
||||
Just varName -> do
|
||||
modifying _1 $ Map.insert varName colVal
|
||||
pure ["query", G.unName $ G.unVariable varName]
|
||||
pure ["query", G.unName varName]
|
||||
Nothing -> do
|
||||
syntheticVarIndex <- gets (length . snd)
|
||||
modifying _2 (|> colVal)
|
||||
pure ["synthetic", T.pack $ show syntheticVarIndex]
|
||||
pure $ fromResVars (PGTypeScalar $ pstType colVal) varJsonPath
|
||||
GR.UVSessVar ty sessVar -> pure $ fromResVars ty ["session", sessionVariableToText sessVar]
|
||||
GR.UVSQL sqlExp -> pure sqlExp
|
||||
GR.UVSession -> pure $ fromResVars (PGTypeScalar PGJSON) ["session"]
|
||||
pure $ fromResVars (PGTypeScalar $ pstType $ pcvValue colVal) varJsonPath
|
||||
UVSessionVar ty sessVar -> pure $ fromResVars ty ["session", sessionVariableToText sessVar]
|
||||
UVLiteral sqlExp -> pure sqlExp
|
||||
UVSession -> pure $ fromResVars (PGTypeScalar PGJSON) ["session"]
|
||||
where
|
||||
fromResVars pgType jPath = addTypeAnnotation pgType $ S.SEOpApp (S.SQLOp "#>>")
|
||||
[ S.SEQIden $ S.QIden (S.QualIden (Iden "_subs") Nothing) (Iden "result_vars")
|
||||
@ -215,7 +221,7 @@ deriving instance (Eq (f TxtEncodedPGVal)) => Eq (ValidatedVariables f)
|
||||
deriving instance (Hashable (f TxtEncodedPGVal)) => Hashable (ValidatedVariables f)
|
||||
deriving instance (J.ToJSON (f TxtEncodedPGVal)) => J.ToJSON (ValidatedVariables f)
|
||||
|
||||
type ValidatedQueryVariables = ValidatedVariables (Map.HashMap G.Variable)
|
||||
type ValidatedQueryVariables = ValidatedVariables (Map.HashMap G.Name)
|
||||
type ValidatedSyntheticVariables = ValidatedVariables []
|
||||
|
||||
-- | Checks if the provided arguments are valid values for their corresponding types.
|
||||
@ -264,76 +270,109 @@ data ReusableLiveQueryPlan
|
||||
= ReusableLiveQueryPlan
|
||||
{ _rlqpParameterizedPlan :: !ParameterizedLiveQueryPlan
|
||||
, _rlqpSyntheticVariableValues :: !ValidatedSyntheticVariables
|
||||
, _rlqpQueryVariableTypes :: !GV.ReusableVariableTypes
|
||||
, _rlqpQueryVariableTypes :: HashMap G.Name PGColumnType
|
||||
} deriving (Show)
|
||||
$(J.deriveToJSON (J.aesonDrop 4 J.snakeCase) ''ReusableLiveQueryPlan)
|
||||
|
||||
-- | Constructs a new execution plan for a live query and returns a reusable version of the plan if
|
||||
-- possible.
|
||||
|
||||
-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't need it
|
||||
-- here. We should evaluate if we need it here.
buildLiveQueryPlan
  :: ( MonadError QErr m
     , MonadReader r m
     , Has UserInfo r
     , Has FieldMap r
     , Has OrdByCtx r
     , Has QueryCtxMap r
     , Has SQLGenCtx r
     , Has (L.Logger L.Hasura) r
     , MonadIO m
     , Tracing.MonadTrace m
     , HasVersion
     )
  => E.Environment
  -> PGExecCtx
  -> QueryReusability
  -> QueryActionExecuter
  -> ObjectSelectionSet
  => PGExecCtx
  -> UserInfo
  -> InsOrdHashMap G.Name (SubscriptionRootField UnpreparedValue)
  -> m (LiveQueryPlan, Maybe ReusableLiveQueryPlan)
buildLiveQueryPlan env pgExecCtx initialReusability actionExecuter selectionSet = do
  ((resolvedASTMap, (queryVariableValues, syntheticVariableValues)), finalReusability) <-
    runReusabilityTWith initialReusability $
      flip runStateT mempty $ flip OMap.traverseWithKey (unAliasedFields $ unObjectSelectionSet selectionSet) $
        \_ field -> case GV._fName field of
          "__typename" -> throwVE "you cannot create a subscription on '__typename' field"
          _ -> do
            unresolvedAST <- GR.queryFldToPGAST env field actionExecuter
            resolvedAST <- GR.traverseQueryRootFldAST resolveMultiplexedValue unresolvedAST
buildLiveQueryPlan pgExecCtx userInfo unpreparedAST = do
  -- ((resolvedASTs, (queryVariableValues, syntheticVariableValues)), finalReusability) <-
  --   GV.runReusabilityTWith initialReusability . flip runStateT mempty $
  --     fmap Map.fromList . for (toList fields) $ \field -> case GV._fName field of
  --       "__typename" -> throwVE "you cannot create a subscription on '__typename' field"
  --       _ -> do
  --         unresolvedAST <- GR.queryFldToPGAST field actionExecutioner
  --         resolvedAST <- GR.traverseQueryRootFldAST resolveMultiplexedValue unresolvedAST

            let (_, remoteJoins) = GR.toPGQuery resolvedAST
            -- Reject remote relationships in subscription live query
            when (remoteJoins /= mempty) $
              throw400 NotSupported
                "Remote relationships are not allowed in subscriptions"
            pure resolvedAST
  -- let (_, remoteJoins) = GR.toPGQuery resolvedAST
  -- -- Reject remote relationships in subscription live query
  -- when (remoteJoins /= mempty) $
  --   throw400 NotSupported
  --     "Remote relationships are not allowed in subscriptions"
  -- pure (GV._fAlias field, resolvedAST)

  userInfo <- asks getter
  let multiplexedQuery = mkMultiplexedQuery resolvedASTMap
  -- Transform the RQL AST into a prepared SQL query
  {- preparedAST <- for unpreparedAST \unpreparedQuery -> do
       (preparedQuery, PlanningSt _ planVars planVals)
         <- flip runStateT initPlanningSt
          $ traverseSubscriptionRootField prepareWithPlan unpreparedQuery
       pure $! irToRootFieldPlan planVars planVals preparedQuery
  -}
  (preparedAST, (queryVariableValues, querySyntheticVariableValues)) <- flip runStateT (mempty, Seq.empty) $
    for unpreparedAST \unpreparedQuery -> do
      resolvedRootField <- traverseQueryRootField resolveMultiplexedValue unpreparedQuery
      case resolvedRootField of
        RFDB qDB -> do
          let remoteJoins = case qDB of
                QDBSimple s      -> snd $ RR.getRemoteJoins s
                QDBPrimaryKey s  -> snd $ RR.getRemoteJoins s
                QDBAggregation s -> snd $ RR.getRemoteJoinsAggregateSelect s
                QDBConnection s  -> snd $ RR.getRemoteJoinsConnectionSelect s
          when (remoteJoins /= mempty)
            $ throw400 NotSupported "Remote relationships are not allowed in subscriptions"
        _ -> pure ()
      traverseAction (DS.traverseAnnSimpleSelect resolveMultiplexedValue . resolveAsyncActionQuery userInfo) resolvedRootField

  let multiplexedQuery = mkMultiplexedQuery preparedAST
      roleName = _uiRole userInfo
      parameterizedPlan = ParameterizedLiveQueryPlan roleName multiplexedQuery

  -- We need to ensure that the values provided for variables are correct according to Postgres.
  -- Without this check an invalid value for a variable for one instance of the subscription will
  -- take down the entire multiplexed query.
  validatedQueryVars <- validateVariables pgExecCtx queryVariableValues
  validatedSyntheticVars <- validateVariables pgExecCtx (toList syntheticVariableValues)
  let cohortVariables = CohortVariables (_uiSession userInfo) validatedQueryVars validatedSyntheticVars
      plan = LiveQueryPlan parameterizedPlan cohortVariables
      varTypes = finalReusability ^? _Reusable
      reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars <$> varTypes
  pure (plan, reusablePlan)
  validatedQueryVars <- validateVariables pgExecCtx $ fmap pcvValue queryVariableValues
  validatedSyntheticVars <- validateVariables pgExecCtx $ map pcvValue $ toList querySyntheticVariableValues
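The comment above is the heart of multiplexing: subscriptions that differ only in their variable values are collapsed into one polled query, with the validated values acting as a cohort key. A minimal, illustrative sketch of that grouping idea, assuming nothing from Hasura's actual types (all names here are hypothetical):

```haskell
import qualified Data.Map.Strict as Map

-- Hypothetical stand-ins: a subscriber id plus the validated variable
-- values its subscription was started with.
type SubscriberId = Int
type CohortKey    = [(String, String)]

-- Group subscribers into cohorts: all subscribers with identical
-- variable values share one slot in the multiplexed query.
groupIntoCohorts :: [(SubscriberId, CohortKey)] -> Map.Map CohortKey [SubscriberId]
groupIntoCohorts subs = Map.fromListWith (++) [ (key, [sid]) | (sid, key) <- subs ]

main :: IO ()
main = print (groupIntoCohorts [(1, [("id", "42")]), (2, [("id", "42")]), (3, [("id", "7")])])
```

With this shape, one bad variable value would poison every cohort in the same batch, which is exactly why the real code validates the values against Postgres before admitting them into a cohort.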
reuseLiveQueryPlan
  :: (MonadError QErr m, MonadIO m)
  => PGExecCtx
  -> SessionVariables
  -> Maybe GH.VariableValues
  -> ReusableLiveQueryPlan
  -> m LiveQueryPlan
reuseLiveQueryPlan pgExecCtx sessionVars queryVars reusablePlan = do
  let ReusableLiveQueryPlan parameterizedPlan syntheticVars queryVarTypes = reusablePlan
  annVarVals <- GV.validateVariablesForReuse queryVarTypes queryVars
  validatedVars <- validateVariables pgExecCtx annVarVals
  pure $ LiveQueryPlan parameterizedPlan (CohortVariables sessionVars validatedVars syntheticVars)
  let -- TODO validatedQueryVars validatedSyntheticVars
      cohortVariables = CohortVariables (_uiSession userInfo) validatedQueryVars validatedSyntheticVars

      plan = LiveQueryPlan parameterizedPlan cohortVariables
      -- See Note [Temporarily disabling query plan caching]
      -- varTypes = finalReusability ^? GV._Reusable
      reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars mempty {- <$> _varTypes -}
  pure (plan, Just reusablePlan)

-- (astResolved, (queryVariableValues, syntheticVariableValues)) <- flip runStateT mempty $
--   GEQ.traverseSubscriptionRootField resolveMultiplexedValue _astUnresolved
-- let pgQuery = mkMultiplexedQuery $ _toPGQuery astResolved
--     parameterizedPlan = ParameterizedLiveQueryPlan (userRole userInfo) fieldAlias pgQuery

-- -- We need to ensure that the values provided for variables
-- -- are correct according to Postgres. Without this check
-- -- an invalid value for a variable for one instance of the
-- -- subscription will take down the entire multiplexed query
-- validatedQueryVars <- validateVariables pgExecCtx queryVariableValues
-- validatedSyntheticVars <- validateVariables pgExecCtx (toList syntheticVariableValues)
-- let cohortVariables = CohortVariables (userVars userInfo) validatedQueryVars validatedSyntheticVars
--     plan = LiveQueryPlan parameterizedPlan cohortVariables
--     reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars <$> _varTypes
-- pure (plan, reusablePlan)

-- See Note [Temporarily disabling query plan caching]
-- reuseLiveQueryPlan
--   :: (MonadError QErr m, MonadIO m)
--   => PGExecCtx
--   -> SessionVariables
--   -> Maybe GH.VariableValues
--   -> ReusableLiveQueryPlan
--   -> m LiveQueryPlan
-- reuseLiveQueryPlan pgExecCtx sessionVars queryVars reusablePlan = do
--   let ReusableLiveQueryPlan parameterizedPlan syntheticVars queryVarTypes = reusablePlan
--   annVarVals <- _validateVariablesForReuse queryVarTypes queryVars
--   validatedVars <- validateVariables pgExecCtx annVarVals
--   pure $ LiveQueryPlan parameterizedPlan (CohortVariables sessionVars validatedVars syntheticVars)

data LiveQueryPlanExplanation
  = LiveQueryPlanExplanation
@@ -1,3 +1,4 @@
{-# LANGUAGE CPP #-}
-- | Multiplexed live query poller threads; see "Hasura.GraphQL.Execute.LiveQuery" for details.
module Hasura.GraphQL.Execute.LiveQuery.Poll (
  -- * Pollers
@@ -39,7 +40,9 @@ module Hasura.GraphQL.Execute.LiveQuery.Poll (
  ) where

import Data.List.Split (chunksOf)
#ifndef PROFILING
import GHC.AssertNF
#endif
import Hasura.Prelude

import qualified Control.Concurrent.Async as A
@@ -215,7 +218,9 @@ pushResultToCohort result !respHashM (LiveQueryMetadata dTime) cohortSnapshot =
  (subscribersToPush, subscribersToIgnore) <-
    if isExecError result || respHashM /= prevRespHashM
    then do
#ifndef PROFILING
      $assertNFHere respHashM  -- so we don't write thunks to mutable vars
#endif
      STM.atomically $ STM.writeTVar respRef respHashM
      return (newSinks <> curSinks, mempty)
    else
@@ -225,6 +230,7 @@ pushResultToCohort result !respHashM (LiveQueryMetadata dTime) cohortSnapshot =
    (subscribersToPush, subscribersToIgnore)
  where
    CohortSnapshot _ respRef curSinks newSinks = cohortSnapshot

    response = result <&> \payload -> LiveQueryResponse payload dTime
    pushResultToSubscribers =
      A.mapConcurrently_ $ \(Subscriber _ _ action) -> action response
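The branch above pushes to all sinks only when execution failed or the response hash changed; an unchanged hash means existing subscribers are skipped and only newly added sinks receive the response. A pure sketch of that decision (hypothetical names, not Hasura's API):

```haskell
-- Decide which subscribers receive a freshly polled result, mirroring
-- the shape of the check in pushResultToCohort: on error or on a changed
-- response hash push to everyone, otherwise only to newly added sinks.
decidePush
  :: Bool          -- ^ did execution fail?
  -> Maybe String  -- ^ hash of the new response
  -> Maybe String  -- ^ hash of the previous response
  -> [a]           -- ^ current (old) subscribers
  -> [a]           -- ^ subscribers added since the last poll
  -> ([a], [a])    -- ^ (subscribers to push to, subscribers to ignore)
decidePush isError newHash prevHash curSinks newSinks
  | isError || newHash /= prevHash = (newSinks ++ curSinks, [])
  | otherwise                      = (newSinks, curSinks)

main :: IO ()
main = print (decidePush False (Just "h1") (Just "h1") [1, 2 :: Int] [3])
```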
@@ -375,10 +381,10 @@ they need to.

-- | see Note [Minimal LiveQuery Poller Log]
pollDetailMinimal :: PollDetails -> J.Value
pollDetailMinimal (PollDetails{..}) =
pollDetailMinimal PollDetails{..} =
  J.object [ "poller_id" J..= _pdPollerId
           , "snapshot_time" J..= _pdSnapshotTime
           , "batches" J..= (map batchExecutionDetailMinimal _pdBatches)
           , "batches" J..= map batchExecutionDetailMinimal _pdBatches
           , "total_time" J..= _pdTotalTime
           ]

@@ -389,7 +395,7 @@ type LiveQueryPostPollHook = PollDetails -> IO ()

-- the default LiveQueryPostPollHook
defaultLiveQueryPostPollHook :: L.Logger L.Hasura -> LiveQueryPostPollHook
defaultLiveQueryPostPollHook logger pd = L.unLogger logger pd
defaultLiveQueryPostPollHook = L.unLogger

-- | Where the magic happens: the top-level action run periodically by each
-- active 'Poller'. This needs to be async exception safe.
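The Poll module's new `Data.List.Split (chunksOf)` import is presumably what splits a cohort map into fixed-size batches before they are executed. For reference, `chunksOf` behaves like this hand-rolled, base-only equivalent:

```haskell
-- Split a list into sublists of at most n elements, like
-- Data.List.Split.chunksOf; batching this way keeps the size of each
-- multiplexed query bounded.
chunksOf :: Int -> [a] -> [[a]]
chunksOf n xs
  | n <= 0    = []
  | null xs   = []
  | otherwise = take n xs : chunksOf n (drop n xs)

main :: IO ()
main = print (chunksOf 2 [1 .. 5 :: Int])
```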
@@ -1,3 +1,4 @@
{-# LANGUAGE CPP #-}
-- | Top-level management of live query poller threads. The implementation of the polling itself is
-- in "Hasura.GraphQL.Execute.LiveQuery.Poll". See "Hasura.GraphQL.Execute.LiveQuery" for high-level
-- details.
@@ -23,7 +24,9 @@ import qualified StmContainers.Map as STMMap
import Control.Concurrent.Extended (forkImmortal, sleep)
import Control.Exception (mask_)
import Data.String
#ifndef PROFILING
import GHC.AssertNF
#endif

import qualified Hasura.GraphQL.Execute.LiveQuery.TMap as TMap
import qualified Hasura.Logging as L
@@ -83,7 +86,9 @@ addLiveQuery logger subscriberMetadata lqState plan onResultAction = do

  let !subscriber = Subscriber subscriberId subscriberMetadata onResultAction

#ifndef PROFILING
  $assertNFHere subscriber  -- so we don't write thunks to mutable vars
#endif

  -- a handler is returned only when it is newly created
  handlerM <- STM.atomically $
@@ -107,7 +112,9 @@ addLiveQuery logger subscriberMetadata lqState plan onResultAction = do
          pollQuery pollerId lqOpts pgExecCtx query (_pCohorts handler) postPollHook
          sleep $ unRefetchInterval refetchInterval
      let !pState = PollerIOState threadRef pollerId
#ifndef PROFILING
      $assertNFHere pState  -- so we don't write thunks to mutable vars
#endif
      STM.atomically $ STM.putTMVar (_pIOState handler) pState

  pure $ LiveQueryId handlerId cohortKey subscriberId
server/src-lib/Hasura/GraphQL/Execute/Mutation.hs (new file, 211 lines)
@@ -0,0 +1,211 @@
module Hasura.GraphQL.Execute.Mutation where

import Hasura.Prelude

import qualified Data.Aeson as J
import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Data.HashMap.Strict.InsOrd as OMap
import qualified Data.HashSet as Set
import qualified Data.IntMap as IntMap
import qualified Data.Sequence as Seq
import qualified Data.Sequence.NonEmpty as NE
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Types as HTTP

import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
import qualified Hasura.RQL.DML.Delete as RQL
import qualified Hasura.RQL.DML.Mutation as RQL
import qualified Hasura.RQL.DML.Returning.Types as RQL
import qualified Hasura.RQL.DML.Update as RQL
import qualified Hasura.Tracing as Tracing
import qualified Hasura.Logging as L

import Hasura.Db
import Hasura.EncJSON
import Hasura.GraphQL.Context
import Hasura.GraphQL.Execute.Action
import Hasura.GraphQL.Execute.Insert
import Hasura.GraphQL.Execute.Prepare
import Hasura.GraphQL.Execute.Resolve
import Hasura.GraphQL.Parser
import Hasura.GraphQL.Schema.Insert
import Hasura.RQL.Types
import Hasura.Server.Version (HasVersion)
import Hasura.Session
convertDelete
  :: ( HasVersion
     , MonadError QErr m
     , MonadTx tx
     , Tracing.MonadTrace tx
     , MonadIO tx)
  => Env.Environment
  -> SessionVariables
  -> RQL.MutationRemoteJoinCtx
  -> RQL.AnnDelG UnpreparedValue
  -> Bool
  -> m (tx EncJSON)
convertDelete env usrVars rjCtx deleteOperation stringifyNum = do
  let (preparedDelete, expectedVariables) = flip runState Set.empty $ RQL.traverseAnnDel prepareWithoutPlan deleteOperation
  validateSessionVariables expectedVariables usrVars
  pure $ RQL.execDeleteQuery env stringifyNum (Just rjCtx) (preparedDelete, Seq.empty)

convertUpdate
  :: ( HasVersion
     , MonadError QErr m
     , MonadTx tx
     , Tracing.MonadTrace tx
     , MonadIO tx
     )
  => Env.Environment
  -> SessionVariables
  -> RQL.MutationRemoteJoinCtx
  -> RQL.AnnUpdG UnpreparedValue
  -> Bool
  -> m (tx EncJSON)
convertUpdate env usrVars rjCtx updateOperation stringifyNum = do
  let (preparedUpdate, expectedVariables) = flip runState Set.empty $ RQL.traverseAnnUpd prepareWithoutPlan updateOperation
  if null $ RQL.uqp1OpExps updateOperation
  then pure $ pure $ RQL.buildEmptyMutResp $ RQL.uqp1Output preparedUpdate
  else do
    validateSessionVariables expectedVariables usrVars
    pure $ RQL.execUpdateQuery env stringifyNum (Just rjCtx) (preparedUpdate, Seq.empty)

convertInsert
  :: ( HasVersion
     , MonadError QErr m
     , MonadTx tx
     , Tracing.MonadTrace tx
     , MonadIO tx)
  => Env.Environment
  -> SessionVariables
  -> RQL.MutationRemoteJoinCtx
  -> AnnInsert UnpreparedValue
  -> Bool
  -> m (tx EncJSON)
convertInsert env usrVars rjCtx insertOperation stringifyNum = do
  let (preparedInsert, expectedVariables) = flip runState Set.empty $ traverseAnnInsert prepareWithoutPlan insertOperation
  validateSessionVariables expectedVariables usrVars
  pure $ convertToSQLTransaction env preparedInsert rjCtx Seq.empty stringifyNum

planVariablesSequence :: SessionVariables -> PlanningSt -> Seq.Seq Q.PrepArg
planVariablesSequence usrVars = Seq.fromList . map fst . withUserVars usrVars . IntMap.elems . _psPrepped
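Each converter above runs its traversal in `State`, so resolving values doubles as collecting the set of session variables the operation expects; `validateSessionVariables` then checks that set against the caller's session. A simplified, self-contained sketch of that accumulate-while-traversing pattern (types here are invented for illustration):

```haskell
import qualified Data.Set as Set

-- An invented value type: either an inline literal or a reference to a
-- session variable such as "x-hasura-user-id".
data Value = Literal Int | SessionVar String
  deriving Show

-- Resolve one value, recording any session variable it mentions
-- (a stand-in for what prepareWithoutPlan does inside State).
resolve :: Value -> Set.Set String -> (Int, Set.Set String)
resolve (Literal n)    seen = (n, seen)
resolve (SessionVar v) seen = (0, Set.insert v seen)  -- real value substituted later

-- Traverse all values, threading the accumulated variable set.
resolveAll :: [Value] -> ([Int], Set.Set String)
resolveAll = foldr step ([], Set.empty)
  where
    step v (acc, seen) = let (n, seen') = resolve v seen in (n : acc, seen')

main :: IO ()
main = print (resolveAll [Literal 1, SessionVar "x-hasura-user-id", Literal 2])
```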
convertMutationRootField
  :: forall m tx
   . ( HasVersion
     , MonadIO m
     , MonadError QErr m
     , Tracing.MonadTrace m
     , Tracing.MonadTrace tx
     , MonadIO tx
     , MonadTx tx
     )
  => Env.Environment
  -> L.Logger L.Hasura
  -> UserInfo
  -> HTTP.Manager
  -> HTTP.RequestHeaders
  -> Bool
  -> MutationRootField UnpreparedValue
  -> m (Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField)
convertMutationRootField env logger userInfo manager reqHeaders stringifyNum = \case
  RFDB (MDBInsert s)   -> noResponseHeaders =<< convertInsert env userSession rjCtx s stringifyNum
  RFDB (MDBUpdate s)   -> noResponseHeaders =<< convertUpdate env userSession rjCtx s stringifyNum
  RFDB (MDBDelete s)   -> noResponseHeaders =<< convertDelete env userSession rjCtx s stringifyNum
  RFRemote remote      -> pure $ Right remote
  RFAction (AMSync s)  -> Left . (_aerTransaction &&& _aerHeaders) <$> resolveActionExecution env logger userInfo s actionExecContext
  RFAction (AMAsync s) -> noResponseHeaders =<< resolveActionMutationAsync s reqHeaders userSession
  RFRaw s              -> noResponseHeaders $ pure $ encJFromJValue s
  where
    noResponseHeaders :: tx EncJSON -> m (Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField)
    noResponseHeaders rTx = pure $ Left (rTx, [])

    userSession = _uiSession userInfo
    actionExecContext = ActionExecContext manager reqHeaders $ _uiSession userInfo

    rjCtx = (manager, reqHeaders, userInfo)
convertMutationSelectionSet
  :: forall m tx
   . ( HasVersion
     , Tracing.MonadTrace m
     , MonadIO m
     , MonadError QErr m
     , MonadTx tx
     , Tracing.MonadTrace tx
     , MonadIO tx
     )
  => Env.Environment
  -> L.Logger L.Hasura
  -> GQLContext
  -> SQLGenCtx
  -> UserInfo
  -> HTTP.Manager
  -> HTTP.RequestHeaders
  -> G.SelectionSet G.NoFragments G.Name
  -> [G.VariableDefinition]
  -> Maybe GH.VariableValues
  -> m (ExecutionPlan (tx EncJSON, HTTP.ResponseHeaders) RemoteCall (G.Name, J.Value))
convertMutationSelectionSet env logger gqlContext sqlGenCtx userInfo manager reqHeaders fields varDefs varValsM = do
  mutationParser <- onNothing (gqlMutationParser gqlContext) $
    throw400 ValidationFailed "no mutations exist"
  -- Parse the GraphQL query into the RQL AST
  (unpreparedQueries, _reusability)
    :: (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue), QueryReusability)
    <- resolveVariables varDefs (fromMaybe Map.empty varValsM) fields
       >>= (mutationParser >>> (`onLeft` reportParseErrors))

  -- Transform the RQL AST into a prepared SQL query
  txs <- for unpreparedQueries $ convertMutationRootField env logger userInfo manager reqHeaders (stringifyNum sqlGenCtx)
  let txList = OMap.toList txs
  case (mapMaybe takeTx txList, mapMaybe takeRemote txList) of
    (dbPlans, []) -> do
      let allHeaders = concatMap (snd . snd) dbPlans
          combinedTx = toSingleTx $ map (G.unName *** fst) dbPlans
      pure $ ExecStepDB (combinedTx, allHeaders)
    ([], remotes@(firstRemote:_)) -> do
      let (remoteOperation, varValsM') =
            buildTypedOperation
              G.OperationTypeMutation
              varDefs
              (map (G.SelectionField . snd . snd) remotes)
              varValsM
      if all (\remote' -> fst (snd firstRemote) == fst (snd remote')) remotes
        then return $ ExecStepRemote (fst (snd firstRemote), remoteOperation, varValsM')
        else throw400 NotSupported "Mixed remote schemas are not supported"
    _ -> throw400 NotSupported "Heterogeneous execution of database and remote schemas not supported"
  -- Build and return an executable action from the generated SQL
  where
    reportParseErrors errs = case NE.head errs of
      -- TODO: Our error reporting machinery doesn’t currently support reporting
      -- multiple errors at once, so we’re throwing away all but the first one
      -- here. It would be nice to report all of them!
      ParseError{ pePath, peMessage, peCode } ->
        throwError (err400 peCode peMessage){ qePath = pePath }

    -- | A list of aliased transactions, e.g.
    --
    -- > [("f1", Tx r1), ("f2", Tx r2)]
    --
    -- is converted into a single transaction as follows
    --
    -- > Tx {"f1": r1, "f2": r2}
    toSingleTx :: [(Text, tx EncJSON)] -> tx EncJSON
    toSingleTx aliasedTxs =
      fmap encJFromAssocList $
        forM aliasedTxs $ \(al, tx) -> (,) al <$> tx
    takeTx
      :: (G.Name, Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField)
      -> Maybe (G.Name, (tx EncJSON, HTTP.ResponseHeaders))
    takeTx (name, Left tx) = Just (name, tx)
    takeTx _               = Nothing
    takeRemote
      :: (G.Name, Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField)
      -> Maybe (G.Name, RemoteField)
    takeRemote (name, Right remote) = Just (name, remote)
    takeRemote _                    = Nothing
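`toSingleTx` above merges aliased per-field transactions into one transaction that assembles the response object, so a failing field fails the whole batch. The same merge can be sketched over any monad; here `Maybe` stands in for the transaction type (the names are illustrative, not Hasura's):

```haskell
-- Merge aliased sub-actions into one action producing an association
-- list, in the spirit of toSingleTx. Maybe plays the role of the
-- transaction monad: one Nothing (a failed field) fails the batch.
toSingleAction :: [(String, Maybe Int)] -> Maybe [(String, Int)]
toSingleAction = mapM (\(alias, act) -> (,) alias <$> act)

main :: IO ()
main = do
  print (toSingleAction [("f1", Just 1), ("f2", Just 2)])
  print (toSingleAction [("f1", Just 1), ("f2", Nothing)])
```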
@@ -2,6 +2,7 @@ module Hasura.GraphQL.Execute.Plan
  ( ReusablePlan(..)
  , PlanCache
  , PlanCacheOptions(..)
  , mkPlanCacheOptions
  , getPlan
  , addPlan
  , initPlanCache
@@ -18,18 +19,32 @@ import Hasura.RQL.Types
import Hasura.Session

import qualified Hasura.Cache.Bounded as Cache
import qualified Hasura.GraphQL.Execute.LiveQuery as LQ
import qualified Hasura.GraphQL.Execute.Query as EQ
import qualified Hasura.GraphQL.Resolve as R
-- import qualified Hasura.GraphQL.Execute.LiveQuery as LQ
-- import qualified Hasura.GraphQL.Execute.Query as EQ

import qualified Hasura.GraphQL.Execute.Types as ET
import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH

{- Note [Temporarily disabling query plan caching]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Caching the incoming queries for re-usability is *temporarily* disabled.
This is being done as part of rewriting GraphQL schema generation and
execution (See https://github.com/hasura/graphql-engine/pull/4111)
until we figure out if we need query plan caching.

The code related to query caching in the GraphQL query execution code path
is simply commented out, with a reference to this note. The relevant variables are
commented inline (Ex: {- planCache -}) to help authors in re-enabling
the query caching feature (if needed).
-}

data PlanId
  = PlanId
  { _piSchemaCacheVersion :: !SchemaCacheVer
  , _piRole               :: !RoleName
  , _piOperationName      :: !(Maybe GH.OperationName)
  , _piQuery              :: !GH.GQLQueryText
  , _piQueryType          :: !EQ.GraphQLQueryType
  , _piQueryType          :: !ET.GraphQLQueryType
  } deriving (Show, Eq, Ord, Generic)

instance Hashable PlanId

@@ -47,27 +62,33 @@ instance J.ToJSON PlanId where
newtype PlanCache
  = PlanCache {_unPlanCache :: Cache.BoundedCache PlanId ReusablePlan}

data ReusablePlan
  = RPQuery !EQ.ReusableQueryPlan ![R.QueryRootFldUnresolved]
  | RPSubs !LQ.ReusableLiveQueryPlan
data ReusablePlan = ReusablePlan

instance J.ToJSON ReusablePlan where
  toJSON = \case
    RPQuery queryPlan _ -> J.toJSON queryPlan
    RPSubs subsPlan     -> J.toJSON subsPlan
-- See Note [Temporarily disabling query plan caching]
-- data ReusablePlan
--   = RPQuery !EQ.ReusableQueryPlan
--   | RPSubs !LQ.ReusableLiveQueryPlan

-- instance J.ToJSON ReusablePlan where
--   toJSON = \case
--     RPQuery queryPlan -> J.toJSON queryPlan
--     RPSubs subsPlan -> J.toJSON subsPlan

newtype PlanCacheOptions
  = PlanCacheOptions { unPlanCacheSize :: Cache.CacheSize }
  deriving (Show, Eq)
$(J.deriveJSON (J.aesonDrop 2 J.snakeCase) ''PlanCacheOptions)

mkPlanCacheOptions :: Cache.CacheSize -> PlanCacheOptions
mkPlanCacheOptions = PlanCacheOptions

initPlanCache :: PlanCacheOptions -> IO PlanCache
initPlanCache options =
  PlanCache <$> Cache.initialise (unPlanCacheSize options)

getPlan
  :: SchemaCacheVer -> RoleName -> Maybe GH.OperationName -> GH.GQLQueryText
  -> EQ.GraphQLQueryType -> PlanCache -> IO (Maybe ReusablePlan)
  -> ET.GraphQLQueryType -> PlanCache -> IO (Maybe ReusablePlan)
getPlan schemaVer rn opNameM q queryType (PlanCache planCache) =
  Cache.lookup planId planCache
  where
@@ -75,22 +96,28 @@ getPlan schemaVer rn opNameM q queryType (PlanCache planCache) =

addPlan
  :: SchemaCacheVer -> RoleName -> Maybe GH.OperationName -> GH.GQLQueryText
  -> ReusablePlan -> EQ.GraphQLQueryType -> PlanCache -> IO ()
  -> ReusablePlan -> ET.GraphQLQueryType -> PlanCache -> IO ()
addPlan schemaVer rn opNameM q queryPlan queryType (PlanCache planCache) =
  Cache.insert planId queryPlan planCache
  where
    planId = PlanId schemaVer rn opNameM q queryType

clearPlanCache :: PlanCache -> IO ()
clearPlanCache (PlanCache planCache) =
  Cache.clear planCache
-- See Note [Temporarily disabling query plan caching]
-- clearPlanCache :: PlanCache -> IO ()
clearPlanCache :: IO ()
clearPlanCache {- (PlanCache planCache) -} =
  pure ()
  -- Cache.clear planCache

dumpPlanCache :: PlanCache -> IO J.Value
dumpPlanCache (PlanCache cache) =
  J.toJSON . map (map dumpEntry) <$> Cache.getEntries cache
  where
    dumpEntry (planId, plan) =
      J.object
        [ "id" J..= planId
        , "plan" J..= plan
        ]
-- See Note [Temporarily disabling query plan caching]
-- dumpPlanCache :: PlanCache -> IO J.Value
dumpPlanCache :: IO J.Value
dumpPlanCache {- (PlanCache cache) -} =
  pure $ J.String "Plan cache is temporarily disabled"
  -- J.toJSON . map (map dumpEntry) <$> Cache.getEntries cache
  -- where
  --   dumpEntry (planId, plan) =
  --     J.object
  --       [ "id" J..= planId
  --       , "plan" J..= plan
  --       ]
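`PlanId` keys the (currently disabled) cache on schema version, role, operation name, query text and query type, so a plan is only reused when all of those match. An ordinary `Map` can stand in for the bounded cache to show the lookup discipline (key fields simplified to basic types; all names here are illustrative):

```haskell
import qualified Data.Map.Strict as Map

-- Simplified stand-in for PlanId: (schema version, role, operation
-- name, query text). Any field changing means a cache miss.
type PlanKey = (Int, String, Maybe String, String)

lookupPlan :: PlanKey -> Map.Map PlanKey plan -> Maybe plan
lookupPlan = Map.lookup

insertPlan :: PlanKey -> plan -> Map.Map PlanKey plan -> Map.Map PlanKey plan
insertPlan = Map.insert

main :: IO ()
main = do
  let key   = (1, "admin", Nothing, "{ users { id } }")
      cache = insertPlan key "plan-a" Map.empty
  print (lookupPlan key cache)
  -- bumping the schema version invalidates the entry:
  print (lookupPlan (2, "admin", Nothing, "{ users { id } }") cache)
```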
187
server/src-lib/Hasura/GraphQL/Execute/Prepare.hs
Normal file
187
server/src-lib/Hasura/GraphQL/Execute/Prepare.hs
Normal file
@ -0,0 +1,187 @@
|
||||
module Hasura.GraphQL.Execute.Prepare
  ( PlanVariables
  , PrepArgMap
  , PlanningSt(..)
  , RemoteCall
  , ExecutionPlan
  , ExecutionStep(..)
  , initPlanningSt
  , runPlan
  , prepareWithPlan
  , prepareWithoutPlan
  , validateSessionVariables
  , withUserVars
  , buildTypedOperation
  ) where

import Hasura.Prelude

import qualified Data.Aeson as J
import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Data.IntMap as IntMap
import qualified Data.Text as T
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
import qualified Hasura.SQL.DML as S

import Hasura.GraphQL.Parser.Column
import Hasura.GraphQL.Parser.Schema
import Hasura.RQL.DML.Internal (currentSession)
import Hasura.RQL.Types
import Hasura.Session
import Hasura.SQL.Types
import Hasura.SQL.Value
type PlanVariables = Map.HashMap G.Name Int

-- | The value is (Q.PrepArg, PGScalarValue) because we want to log the human-readable value of the
-- prepared argument and not the binary encoding in PG format
type PrepArgMap = IntMap.IntMap (Q.PrepArg, PGScalarValue)

-- | Full execution plan to process one GraphQL query. Once we work on
-- heterogeneous execution this will contain a mixture of things to run on the
-- database and things to run on remote schemas.
type ExecutionPlan db remote raw = ExecutionStep db remote raw

type RemoteCall = (RemoteSchemaInfo, G.TypedOperationDefinition G.NoFragments G.Name, Maybe GH.VariableValues)

-- | One execution step in processing a GraphQL query (e.g. one root field).
-- Polymorphic to allow the SQL to be generated in stages.
data ExecutionStep db remote raw
  = ExecStepDB db
  -- ^ A query to execute against the database
  | ExecStepRemote remote -- !RemoteSchemaInfo !(G.Selection G.NoFragments G.Name)
  -- ^ A query to execute against a remote schema
  | ExecStepRaw raw
  -- ^ Output a plain JSON object

data PlanningSt
  = PlanningSt
  { _psArgNumber        :: !Int
  , _psVariables        :: !PlanVariables
  , _psPrepped          :: !PrepArgMap
  , _psSessionVariables :: !(Set.HashSet SessionVariable)
  }

initPlanningSt :: PlanningSt
initPlanningSt =
  PlanningSt 2 Map.empty IntMap.empty Set.empty

runPlan :: StateT PlanningSt m a -> m (a, PlanningSt)
runPlan = flip runStateT initPlanningSt
prepareWithPlan :: (MonadState PlanningSt m) => UnpreparedValue -> m S.SQLExp
prepareWithPlan = \case
  UVParameter PGColumnValue{ pcvValue = colVal } varInfoM -> do
    argNum <- case fmap getName varInfoM of
      Just var -> getVarArgNum var
      Nothing  -> getNextArgNum
    addPrepArg argNum (toBinaryValue colVal, pstValue colVal)
    return $ toPrepParam argNum (pstType colVal)

  UVSessionVar ty sessVar -> do
    sessVarVal <- retrieveAndFlagSessionVariableValue insertSessionVariable sessVar currentSessionExp
    pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of
      PGTypeScalar colTy -> withConstructorFn colTy sessVarVal
      PGTypeArray _      -> sessVarVal

  UVLiteral sqlExp -> pure sqlExp
  UVSession        -> pure currentSessionExp
  where
    currentSessionExp = S.SEPrep 1
    insertSessionVariable sessVar plan =
      plan { _psSessionVariables = Set.insert sessVar $ _psSessionVariables plan }

prepareWithoutPlan :: (MonadState (Set.HashSet SessionVariable) m) => UnpreparedValue -> m S.SQLExp
prepareWithoutPlan = \case
  UVParameter pgValue _ -> pure $ toTxtValue $ pcvValue pgValue
  UVLiteral sqlExp      -> pure sqlExp
  UVSession             -> pure currentSession
  UVSessionVar ty sessVar -> do
    sessVarVal <- retrieveAndFlagSessionVariableValue Set.insert sessVar currentSession
    -- TODO: this piece of code appears at least three times: twice here
    -- and once in RQL.DML.Internal. Some de-duplication is in order.
    pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of
      PGTypeScalar colTy -> withConstructorFn colTy sessVarVal
      PGTypeArray _      -> sessVarVal
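`prepareWithPlan` gives every parameter a stable position: a named variable reuses the number recorded in `_psVariables` (via `getVarArgNum`), anonymous values take the next counter (`getNextArgNum`), and counting starts at 2 because `$1` is reserved for the session object (`currentSessionExp = S.SEPrep 1`). A base-plus-containers sketch of that numbering scheme (the function name is invented):

```haskell
import qualified Data.Map.Strict as Map

-- Assign prepared-argument positions to a stream of variable names.
-- Positions start at 2: position 1 is reserved for the session object.
numberVars :: [String] -> [(String, Int)]
numberVars = go 2 Map.empty
  where
    go _ _ [] = []
    go next seen (v : vs) =
      case Map.lookup v seen of
        Just n  -> (v, n)    : go next seen vs                           -- reuse
        Nothing -> (v, next) : go (next + 1) (Map.insert v next seen) vs -- fresh

main :: IO ()
main = print (numberVars ["limit", "offset", "limit"])
```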
retrieveAndFlagSessionVariableValue
  :: (MonadState s m)
  => (SessionVariable -> s -> s)
  -> SessionVariable
  -> S.SQLExp
  -> m S.SQLExp
retrieveAndFlagSessionVariableValue updateState sessVar currentSessionExp = do
  modify $ updateState sessVar
  pure $ S.SEOpApp (S.SQLOp "->>")
    [currentSessionExp, S.SELit $ sessionVariableToText sessVar]

withUserVars :: SessionVariables -> [(Q.PrepArg, PGScalarValue)] -> [(Q.PrepArg, PGScalarValue)]
withUserVars usrVars list =
  let usrVarsAsPgScalar = PGValJSON $ Q.JSON $ J.toJSON usrVars
      prepArg = Q.toPrepVal (Q.AltJ usrVars)
  in (prepArg, usrVarsAsPgScalar):list

validateSessionVariables :: MonadError QErr m => Set.HashSet SessionVariable -> SessionVariables -> m ()
validateSessionVariables requiredVariables sessionVariables = do
  let missingSessionVariables = requiredVariables `Set.difference` getSessionVariablesSet sessionVariables
  unless (null missingSessionVariables) do
    throw400 NotFound $ "missing session variables: " <> T.intercalate ", " (dquote . sessionVariableToText <$> toList missingSessionVariables)

getVarArgNum :: (MonadState PlanningSt m) => G.Name -> m Int
getVarArgNum var = do
  PlanningSt curArgNum vars prepped sessionVariables <- get
  case Map.lookup var vars of
    Just argNum -> pure argNum
    Nothing     -> do
      put $ PlanningSt (curArgNum + 1) (Map.insert var curArgNum vars) prepped sessionVariables
      pure curArgNum

addPrepArg
  :: (MonadState PlanningSt m)
  => Int -> (Q.PrepArg, PGScalarValue) -> m ()
addPrepArg argNum arg = do
  PlanningSt curArgNum vars prepped sessionVariables <- get
  put $ PlanningSt curArgNum vars (IntMap.insert argNum arg prepped) sessionVariables

getNextArgNum :: (MonadState PlanningSt m) => m Int
getNextArgNum = do
  PlanningSt curArgNum vars prepped sessionVariables <- get
  put $ PlanningSt (curArgNum + 1) vars prepped sessionVariables
  return curArgNum
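`validateSessionVariables` above is a set difference plus an error listing the missing names. The same check in miniature, with `Either` standing in for `MonadError QErr` and plain strings for session variable names:

```haskell
import Data.List (intercalate)
import qualified Data.Set as Set

-- Fail if any session variable the query was prepared against is
-- missing from the request's session (mirrors validateSessionVariables).
checkSessionVars :: Set.Set String -> Set.Set String -> Either String ()
checkSessionVars required provided
  | Set.null missing = Right ()
  | otherwise        = Left $
      "missing session variables: " ++ intercalate ", " (Set.toList missing)
  where
    missing = required `Set.difference` provided

main :: IO ()
main = do
  print (checkSessionVars (Set.fromList ["x-hasura-user-id"]) (Set.fromList ["x-hasura-user-id"]))
  print (checkSessionVars (Set.fromList ["x-hasura-user-id", "x-hasura-role"]) (Set.fromList ["x-hasura-role"]))
```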
||||
unresolveVariables
|
||||
:: forall fragments
|
||||
. Functor fragments
|
||||
=> G.SelectionSet fragments Variable
|
||||
-> G.SelectionSet fragments G.Name
|
||||
unresolveVariables =
|
||||
fmap (fmap (getName . vInfo))
|
||||
|
||||
collectVariables
|
||||
:: forall fragments var
|
||||
. (Foldable fragments, Hashable var, Eq var)
|
||||
=> G.SelectionSet fragments var
|
||||
-> Set.HashSet var
|
||||
collectVariables =
|
||||
Set.unions . fmap (foldMap Set.singleton)
|
||||
|
||||
buildTypedOperation
|
||||
:: forall frag
|
||||
. (Functor frag, Foldable frag)
|
||||
=> G.OperationType
|
||||
-> [G.VariableDefinition]
|
||||
-> G.SelectionSet frag Variable
|
||||
-> Maybe GH.VariableValues
|
||||
-> (G.TypedOperationDefinition frag G.Name, Maybe GH.VariableValues)
|
||||
buildTypedOperation tp varDefs selSet varValsM =
|
||||
let unresolvedSelSet = unresolveVariables selSet
|
||||
requiredVars = collectVariables unresolvedSelSet
|
||||
restrictedDefs = filter (\varDef -> G._vdName varDef `Set.member` requiredVars) varDefs
|
||||
restrictedValsM = flip Map.intersection (Set.toMap requiredVars) <$> varValsM
|
||||
in (G.TypedOperationDefinition tp Nothing restrictedDefs [] unresolvedSelSet, restrictedValsM)
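The planning helpers above thread a counter through `State` so that each query parameter gets a stable prepared-argument slot, reusing the slot when the same variable appears twice (slot 1 is reserved for the session-variables JSON, so numbering starts at 2). A minimal, self-contained sketch of that allocation logic, with illustrative names rather than the engine's own types:

```haskell
import           Control.Monad.State (State, gets, modify, runState)
import qualified Data.Map.Strict as M

-- Simplified planning state: the next free slot and a variable-to-slot map.
data PlanSt = PlanSt { nextSlot :: Int, varSlots :: M.Map String Int }

-- Slot 1 is reserved for the session-variables JSON, so we start at 2.
initSt :: PlanSt
initSt = PlanSt 2 M.empty

-- Reuse the slot of a previously seen variable; otherwise allocate a new one.
varArgNum :: String -> State PlanSt Int
varArgNum var = do
  existing <- gets (M.lookup var . varSlots)
  case existing of
    Just n  -> pure n
    Nothing -> do
      n <- gets nextSlot
      modify $ \st -> st { nextSlot = n + 1
                         , varSlots = M.insert var n (varSlots st) }
      pure n
```

Repeated uses of the same variable map to the same placeholder, which is what lets a reused or multiplexed plan bind each variable exactly once.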
@@ -1,10 +1,12 @@
module Hasura.GraphQL.Execute.Query
  ( convertQuerySelSet
  , queryOpFromPlan
  , ReusableQueryPlan
  -- , queryOpFromPlan
  -- , ReusableQueryPlan
  , GeneratedSqlMap
  , PreparedSql(..)
  , GraphQLQueryType(..)
  , traverseQueryRootField -- for live query planning
  , irToRootFieldPlan
  , parseGraphQLQuery
  ) where

import qualified Data.Aeson as J
@@ -12,43 +14,38 @@ import qualified Data.ByteString as B
import qualified Data.ByteString.Lazy as LBS
import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Data.HashMap.Strict.InsOrd as OMap
import qualified Data.IntMap as IntMap
import qualified Data.Sequence as Seq
import qualified Data.Sequence.NonEmpty as NESeq
import qualified Data.TByteString as TBS
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Types as N
import qualified Network.HTTP.Types as HTTP

import Control.Lens ((^?))
import Data.Has

import qualified Hasura.GraphQL.Resolve as R
import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
import qualified Hasura.GraphQL.Validate as GV
import qualified Hasura.GraphQL.Validate.SelectionSet as V
import qualified Hasura.Logging as L
import Hasura.Server.Version (HasVersion)
import qualified Hasura.SQL.DML as S
import qualified Hasura.Tracing as Tracing

import Hasura.Db
import Hasura.EncJSON
import Hasura.GraphQL.Context
import Hasura.GraphQL.Resolve.Action
import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Validate.Types
import Hasura.GraphQL.Execute.Action
import Hasura.GraphQL.Execute.Prepare
import Hasura.GraphQL.Execute.Resolve
import Hasura.GraphQL.Parser
import Hasura.Prelude
import Hasura.RQL.DML.RemoteJoin
import Hasura.RQL.DML.Select
import Hasura.RQL.DML.Select (asSingleRowJsonResp)
import Hasura.RQL.Types
import Hasura.Server.Version (HasVersion)
import Hasura.Session
import Hasura.SQL.Types
-- import Hasura.SQL.Types
import Hasura.SQL.Value

type PlanVariables = Map.HashMap G.Variable Int

-- | The value is (Q.PrepArg, PGScalarValue) because we want to log the
-- human-readable value of the prepared argument and not the binary encoding
-- in PG format
type PrepArgMap = IntMap.IntMap (Q.PrepArg, PGScalarValue)
import qualified Hasura.RQL.DML.Select as DS

data PGPlan
  = PGPlan
@@ -68,43 +65,57 @@ instance J.ToJSON PGPlan where
data RootFieldPlan
  = RFPRaw !B.ByteString
  | RFPPostgres !PGPlan

fldPlanFromJ :: (J.ToJSON a) => a -> RootFieldPlan
fldPlanFromJ = RFPRaw . LBS.toStrict . J.encode
  | RFPActionQuery !ActionExecuteTx

instance J.ToJSON RootFieldPlan where
  toJSON = \case
    RFPRaw encJson     -> J.toJSON $ TBS.fromBS encJson
    RFPPostgres pgPlan -> J.toJSON pgPlan
    RFPActionQuery _   -> J.String "Action Execution Tx"

type FieldPlans = [(G.Alias, RootFieldPlan)]
type FieldPlans = [(G.Name, RootFieldPlan)]

data ReusableQueryPlan
  = ReusableQueryPlan
  { _rqpVariableTypes :: !ReusableVariableTypes
  , _rqpFldPlans      :: !FieldPlans
  }
data ActionQueryPlan
  = AQPAsyncQuery !DS.AnnSimpleSel -- ^ Cacheable plan
  | AQPQuery !ActionExecuteTx      -- ^ Non-cacheable transaction

instance J.ToJSON ReusableQueryPlan where
  toJSON (ReusableQueryPlan varTypes fldPlans) =
    J.object [ "variables"   J..= varTypes
             , "field_plans" J..= fldPlans
             ]
actionQueryToRootFieldPlan
  :: PlanVariables -> PrepArgMap -> ActionQueryPlan -> RootFieldPlan
actionQueryToRootFieldPlan vars prepped = \case
  AQPAsyncQuery s -> RFPPostgres $
    PGPlan (DS.selectQuerySQL DS.JASSingleObject s) vars prepped Nothing
  AQPQuery tx -> RFPActionQuery tx

withPlan
  :: (MonadError QErr m)
  => SessionVariables -> PGPlan -> ReusableVariableValues -> m PreparedSql
withPlan usrVars (PGPlan q reqVars prepMap rq) annVars = do
  prepMap' <- foldM getVar prepMap (Map.toList reqVars)
  let args = withSessionVariables usrVars $ IntMap.elems prepMap'
  return $ PreparedSql q args rq
  where
    getVar accum (var, prepNo) = do
      let varName = G.unName $ G.unVariable var
      colVal <- onNothing (Map.lookup var annVars) $
        throw500 $ "missing variable in annVars : " <> varName
      let prepVal = (toBinaryValue colVal, pstValue colVal)
      return $ IntMap.insert prepNo prepVal accum
-- See Note [Temporarily disabling query plan caching]
-- data ReusableVariableTypes
-- data ReusableVariableValues

-- data ReusableQueryPlan
--   = ReusableQueryPlan
--   { _rqpVariableTypes :: !ReusableVariableTypes
--   , _rqpFldPlans :: !FieldPlans
--   }

-- instance J.ToJSON ReusableQueryPlan where
--   toJSON (ReusableQueryPlan varTypes fldPlans) =
--     J.object [ "variables" J..= () -- varTypes
--              , "field_plans" J..= fldPlans
--              ]

-- withPlan
--   :: (MonadError QErr m)
--   => SessionVariables -> PGPlan -> HashMap G.Name (WithScalarType PGScalarValue) -> m PreparedSql
-- withPlan usrVars (PGPlan q reqVars prepMap remoteJoins) annVars = do
--   prepMap' <- foldM getVar prepMap (Map.toList reqVars)
--   let args = withUserVars usrVars $ IntMap.elems prepMap'
--   return $ PreparedSql q args remoteJoins
--   where
--     getVar accum (var, prepNo) = do
--       let varName = G.unName var
--       colVal <- onNothing (Map.lookup var annVars) $
--         throw500 $ "missing variable in annVars : " <> varName
--       let prepVal = (toBinaryValue colVal, pstValue colVal)
--       return $ IntMap.insert prepNo prepVal accum

-- turn the current plan into a transaction
mkCurPlanTx
@@ -117,7 +128,7 @@ mkCurPlanTx
     )
  => Env.Environment
  -> HTTP.Manager
  -> [N.Header]
  -> [HTTP.Header]
  -> UserInfo
  -> FieldPlans
  -> m (tx EncJSON, GeneratedSqlMap)
@@ -126,85 +137,72 @@ mkCurPlanTx env manager reqHdrs userInfo fldPlans = do
  resolved <- forM fldPlans $ \(alias, fldPlan) -> do
    fldResp <- case fldPlan of
      RFPRaw resp -> return $ RRRaw resp
      RFPPostgres (PGPlan q _ prepMap rq) -> do
        let args = withSessionVariables (_uiSession userInfo) $ IntMap.elems prepMap
        return $ RRSql $ PreparedSql q args rq
      RFPPostgres (PGPlan q _ prepMap remoteJoins) -> do
        let args = withUserVars (_uiSession userInfo) $ IntMap.elems prepMap
        return $ RRSql $ PreparedSql q args remoteJoins
      RFPActionQuery tx -> pure $ RRActionQuery tx
    return (alias, fldResp)

  (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved)

withSessionVariables :: SessionVariables -> [(Q.PrepArg, PGScalarValue)] -> [(Q.PrepArg, PGScalarValue)]
withSessionVariables usrVars list =
  let usrVarsAsPgScalar = PGValJSON $ Q.JSON $ J.toJSON usrVars
      prepArg = Q.toPrepVal (Q.AltJ usrVars)
  in (prepArg, usrVarsAsPgScalar) : list

data PlanningSt
  = PlanningSt
  { _psArgNumber :: !Int
  , _psVariables :: !PlanVariables
  , _psPrepped   :: !PrepArgMap
  }

initPlanningSt :: PlanningSt
initPlanningSt =
  PlanningSt 2 Map.empty IntMap.empty

getVarArgNum :: (MonadState PlanningSt m) => G.Variable -> m Int
getVarArgNum var = do
  PlanningSt curArgNum vars prepped <- get
  case Map.lookup var vars of
    Just argNum -> pure argNum
    Nothing     -> do
      put $ PlanningSt (curArgNum + 1) (Map.insert var curArgNum vars) prepped
      pure curArgNum

addPrepArg
  :: (MonadState PlanningSt m)
  => Int -> (Q.PrepArg, PGScalarValue) -> m ()
addPrepArg argNum arg = do
  PlanningSt curArgNum vars prepped <- get
  put $ PlanningSt curArgNum vars $ IntMap.insert argNum arg prepped

getNextArgNum :: (MonadState PlanningSt m) => m Int
getNextArgNum = do
  PlanningSt curArgNum vars prepped <- get
  put $ PlanningSt (curArgNum + 1) vars prepped
  return curArgNum

prepareWithPlan :: (MonadState PlanningSt m) => UnresolvedVal -> m S.SQLExp
prepareWithPlan = \case
  R.UVPG annPGVal -> do
    let AnnPGVal varM _ colVal = annPGVal
    argNum <- case varM of
      Just var -> getVarArgNum var
      Nothing  -> getNextArgNum
    addPrepArg argNum (toBinaryValue colVal, pstValue colVal)
    return $ toPrepParam argNum (pstType colVal)

  R.UVSessVar ty sessVar -> do
    let sessVarVal =
          S.SEOpApp (S.SQLOp "->>")
            [currentSession, S.SELit $ sessionVariableToText sessVar]
    return $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of
      PGTypeScalar colTy -> withConstructorFn colTy sessVarVal
      PGTypeArray _      -> sessVarVal

  R.UVSQL sqlExp -> pure sqlExp
  R.UVSession    -> pure currentSession
-- convert a query from an intermediate representation to... another
irToRootFieldPlan
  :: PlanVariables
  -> PrepArgMap
  -> QueryDB S.SQLExp -> PGPlan
irToRootFieldPlan vars prepped = \case
  QDBSimple s     -> mkPGPlan (DS.selectQuerySQL DS.JASMultipleRows) s
  QDBPrimaryKey s -> mkPGPlan (DS.selectQuerySQL DS.JASSingleObject) s
  QDBAggregation s ->
    let (annAggSel, aggRemoteJoins) = getRemoteJoinsAggregateSelect s
    in PGPlan (DS.selectAggregateQuerySQL annAggSel) vars prepped aggRemoteJoins
  QDBConnection s ->
    let (connSel, connRemoteJoins) = getRemoteJoinsConnectionSelect s
    in PGPlan (DS.connectionSelectQuerySQL connSel) vars prepped connRemoteJoins
  where
    currentSession = S.SEPrep 1
    mkPGPlan f simpleSel =
      let (simpleSel', remoteJoins) = getRemoteJoins simpleSel
      in PGPlan (f simpleSel') vars prepped remoteJoins

traverseQueryRootField
  :: forall f a b c d h
   . Applicative f
  => (a -> f b)
  -> RootField (QueryDB a) c h d
  -> f (RootField (QueryDB b) c h d)
traverseQueryRootField f =
  traverseDB f'
  where
    f' :: QueryDB a -> f (QueryDB b)
    f' = \case
      QDBSimple s      -> QDBSimple      <$> DS.traverseAnnSimpleSelect f s
      QDBPrimaryKey s  -> QDBPrimaryKey  <$> DS.traverseAnnSimpleSelect f s
      QDBAggregation s -> QDBAggregation <$> DS.traverseAnnAggregateSelect f s
      QDBConnection s  -> QDBConnection  <$> DS.traverseConnectionSelect f s

parseGraphQLQuery
  :: MonadError QErr m
  => GQLContext
  -> [G.VariableDefinition]
  -> Maybe (HashMap G.Name J.Value)
  -> G.SelectionSet G.NoFragments G.Name
  -> m ( InsOrdHashMap G.Name (QueryRootField UnpreparedValue)
       , QueryReusability
       )
parseGraphQLQuery gqlContext varDefs varValsM fields =
  resolveVariables varDefs (fromMaybe Map.empty varValsM) fields
    >>= (gqlQueryParser gqlContext >>> (`onLeft` reportParseErrors))
  where
    reportParseErrors errs = case NESeq.head errs of
      -- TODO: Our error reporting machinery doesn’t currently support reporting
      -- multiple errors at once, so we’re throwing away all but the first one
      -- here. It would be nice to report all of them!
      ParseError{ pePath, peMessage, peCode } ->
        throwError (err400 peCode peMessage){ qePath = pePath }

convertQuerySelSet
  :: ( MonadError QErr m
     , MonadReader r m
     , Has TypeMap r
     , Has QueryCtxMap r
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has UserInfo r
     , Has (L.Logger L.Hasura) r
  :: forall m tx
   . ( MonadError QErr m
     , HasVersion
     , MonadIO m
     , Tracing.MonadTrace m
@@ -213,59 +211,111 @@ convertQuerySelSet
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> HTTP.Manager
  -> [N.Header]
  -> QueryReusability
  -> V.ObjectSelectionSet
  -> QueryActionExecuter
  -> m (tx EncJSON, Maybe ReusableQueryPlan, GeneratedSqlMap, [R.QueryRootFldUnresolved])
convertQuerySelSet env manager reqHdrs initialReusability selSet actionRunner = do
  userInfo <- asks getter
  (fldPlansAndAst, finalReusability) <- runReusabilityTWith initialReusability $ do
    result <- V.traverseObjectSelectionSet selSet $ \fld -> do
      case V._fName fld of
        "__type"     -> ((, Nothing) . fldPlanFromJ) <$> R.typeR fld
        "__schema"   -> ((, Nothing) . fldPlanFromJ) <$> R.schemaR fld
        "__typename" -> pure (fldPlanFromJ queryRootNamedType, Nothing)
        _ -> do
          unresolvedAst <- R.queryFldToPGAST env fld actionRunner
          (q, PlanningSt _ vars prepped) <- flip runStateT initPlanningSt $
            R.traverseQueryRootFldAST prepareWithPlan unresolvedAst
          let (query, remoteJoins) = R.toPGQuery q
          pure $ (RFPPostgres $ PGPlan query vars prepped remoteJoins, Just unresolvedAst)
    return $ map (\(alias, (fldPlan, ast)) -> ((G.Alias $ G.Name alias, fldPlan), ast)) result

  let varTypes = finalReusability ^? _Reusable
      fldPlans = map fst fldPlansAndAst
      reusablePlan = ReusableQueryPlan <$> varTypes <*> pure fldPlans
  (tx, sql) <- mkCurPlanTx env manager reqHdrs userInfo fldPlans
  pure (tx, reusablePlan, sql, mapMaybe snd fldPlansAndAst)

-- use the existing plan and new variables to create a pg query
queryOpFromPlan
  :: ( HasVersion
     , MonadError QErr m
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> HTTP.Manager
  -> [N.Header]
  -> L.Logger L.Hasura
  -> GQLContext
  -> UserInfo
  -> HTTP.Manager
  -> HTTP.RequestHeaders
  -> G.SelectionSet G.NoFragments G.Name
  -> [G.VariableDefinition]
  -> Maybe GH.VariableValues
  -> ReusableQueryPlan
  -> m (tx EncJSON, GeneratedSqlMap)
queryOpFromPlan env manager reqHdrs userInfo varValsM (ReusableQueryPlan varTypes fldPlans) = do
  validatedVars <- GV.validateVariablesForReuse varTypes varValsM
  -- generate the SQL and prepared vars or the bytestring
  resolved <- forM fldPlans $ \(alias, fldPlan) ->
    (alias,) <$> case fldPlan of
      RFPRaw resp -> return $ RRRaw resp
      RFPPostgres pgPlan -> RRSql <$> withPlan (_uiSession userInfo) pgPlan validatedVars
  -> m ( ExecutionPlan (tx EncJSON, GeneratedSqlMap) RemoteCall (G.Name, J.Value)
       -- , Maybe ReusableQueryPlan
       , [QueryRootField UnpreparedValue]
       )
convertQuerySelSet env logger gqlContext userInfo manager reqHeaders fields varDefs varValsM = do
  -- Parse the GraphQL query into the RQL AST
  (unpreparedQueries, _reusability) <- parseGraphQLQuery gqlContext varDefs varValsM fields

  (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved)
  -- Transform the RQL AST into a prepared SQL query
  queryPlans <- for unpreparedQueries \unpreparedQuery -> do
    (preparedQuery, PlanningSt _ planVars planVals expectedVariables)
      <- flip runStateT initPlanningSt
         $ traverseQueryRootField prepareWithPlan unpreparedQuery
           >>= traverseAction convertActionQuery
    validateSessionVariables expectedVariables $ _uiSession userInfo
    traverseDB (pure . irToRootFieldPlan planVars planVals) preparedQuery
      >>= traverseAction (pure . actionQueryToRootFieldPlan planVars planVals)

  -- This monster makes sure that consecutive database operations get executed together
  let dbPlans :: Seq.Seq (G.Name, RootFieldPlan)
      remoteFields :: Seq.Seq (G.Name, RemoteField)
      (dbPlans, remoteFields) = OMap.foldlWithKey' collectPlan (Seq.Empty, Seq.Empty) queryPlans

      collectPlan
        :: (Seq.Seq (G.Name, RootFieldPlan), Seq.Seq (G.Name, RemoteField))
        -> G.Name
        -> RootField PGPlan RemoteField RootFieldPlan J.Value
        -> (Seq.Seq (G.Name, RootFieldPlan), Seq.Seq (G.Name, RemoteField))

      collectPlan (seqDB, seqRemote) name (RFRemote r) =
        (seqDB, seqRemote Seq.:|> (name, r))

      collectPlan (seqDB, seqRemote) name (RFDB db) =
        (seqDB Seq.:|> (name, RFPPostgres db), seqRemote)

      collectPlan (seqDB, seqRemote) name (RFAction rfp) =
        (seqDB Seq.:|> (name, rfp), seqRemote)

      collectPlan (seqDB, seqRemote) name (RFRaw r) =
        (seqDB Seq.:|> (name, RFPRaw $ LBS.toStrict $ J.encode r), seqRemote)

  executionPlan <- case (dbPlans, remoteFields) of
    (dbs, Seq.Empty) -> ExecStepDB <$> mkCurPlanTx env manager reqHeaders userInfo (toList dbs)
    (Seq.Empty, remotes@(firstRemote Seq.:<| _)) -> do
      let (remoteOperation, _) =
            buildTypedOperation
              G.OperationTypeQuery
              varDefs
              (map (G.SelectionField . snd . snd) $ toList remotes)
              varValsM
      if all (\remote' -> fst (snd firstRemote) == fst (snd remote')) remotes
        then return $ ExecStepRemote (fst (snd firstRemote), remoteOperation, varValsM)
        else throw400 NotSupported "Mixed remote schemas are not supported"
    _ -> throw400 NotSupported "Heterogeneous execution of database and remote schemas is not supported"

  let asts :: [QueryRootField UnpreparedValue]
      asts = OMap.elems unpreparedQueries
  pure (executionPlan, asts) -- See Note [Temporarily disabling query plan caching]
  where
    usrVars = _uiSession userInfo

    convertActionQuery
      :: ActionQuery UnpreparedValue -> StateT PlanningSt m ActionQueryPlan
    convertActionQuery = \case
      AQQuery s -> lift $ do
        result <- resolveActionExecution env logger userInfo s $ ActionExecContext manager reqHeaders usrVars
        pure $ AQPQuery $ _aerTransaction result
      AQAsync s -> AQPAsyncQuery <$>
        DS.traverseAnnSimpleSelect prepareWithPlan (resolveAsyncActionQuery userInfo s)
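The `collectPlan` fold in `convertQuerySelSet` simply partitions the ordered root fields into a database sequence and a remote sequence while preserving field order. Stripped of the engine's types, the shape is the following (all names here are illustrative stand-ins, not the engine's own):

```haskell
import           Data.Foldable (foldl', toList)
import           Data.Sequence (Seq, (|>))
import qualified Data.Sequence as Seq

-- Stand-ins for the engine's root-field variants.
data Field = DB String | Remote String

-- Accumulate DB fields and remote fields into separate, order-preserving
-- sequences, keyed by the field's response name.
partitionFields :: [(String, Field)] -> (Seq (String, String), Seq (String, String))
partitionFields = foldl' step (Seq.empty, Seq.empty)
  where
    step (dbs, remotes) (name, DB p)     = (dbs |> (name, p), remotes)
    step (dbs, remotes) (name, Remote r) = (dbs, remotes |> (name, r))
```

`convertQuerySelSet` then refuses mixed results: it either runs the whole DB sequence in one transaction, sends a single-remote batch, or rejects heterogeneous selections.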

-- See Note [Temporarily disabling query plan caching]
-- use the existing plan and new variables to create a pg query
-- queryOpFromPlan
--   :: ( HasVersion
--      , MonadError QErr m
--      , Tracing.MonadTrace m
--      , MonadIO tx
--      , MonadTx tx
--      , Tracing.MonadTrace tx
--      )
--   => Env.Environment
--   -> HTTP.Manager
--   -> [HTTP.Header]
--   -> UserInfo
--   -> Maybe GH.VariableValues
--   -> ReusableQueryPlan
--   -> m (tx EncJSON, GeneratedSqlMap)
-- queryOpFromPlan env manager reqHdrs userInfo varValsM (ReusableQueryPlan varTypes fldPlans) = do
--   validatedVars <- _validateVariablesForReuse varTypes varValsM
--   -- generate the SQL and prepared vars or the bytestring
--   resolved <- forM fldPlans $ \(alias, fldPlan) ->
--     (alias,) <$> case fldPlan of
--       RFPRaw resp -> return $ RRRaw resp
--       RFPPostgres pgPlan -> RRSql <$> withPlan (_uiSession userInfo) pgPlan validatedVars

--   (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved)

data PreparedSql
  = PreparedSql
@@ -291,11 +341,12 @@ instance J.ToJSON PreparedSql where
data ResolvedQuery
  = RRRaw !B.ByteString
  | RRSql !PreparedSql
  | RRActionQuery !ActionExecuteTx

-- | The computed SQL with its alias, which can be logged. 'Nothing' here
-- represents no SQL, for cases like introspection responses: a tuple of alias
-- to a (maybe) prepared statement.
type GeneratedSqlMap = [(G.Alias, Maybe PreparedSql)]
type GeneratedSqlMap = [(G.Name, Maybe PreparedSql)]

mkLazyRespTx
  :: ( HasVersion
@@ -306,38 +357,28 @@ mkLazyRespTx
     )
  => Env.Environment
  -> HTTP.Manager
  -> [N.Header]
  -> [HTTP.Header]
  -> UserInfo
  -> [(G.Alias, ResolvedQuery)]
  -> [(G.Name, ResolvedQuery)]
  -> m (tx EncJSON)
mkLazyRespTx env manager reqHdrs userInfo resolved = do
mkLazyRespTx env manager reqHdrs userInfo resolved =
  pure $ fmap encJFromAssocList $ forM resolved $ \(alias, node) -> do
    resp <- case node of
      RRRaw bs -> return $ encJFromBS bs
      RRSql (PreparedSql q args maybeRemoteJoins) -> do
      RRRaw bs -> return $ encJFromBS bs
      RRSql (PreparedSql q args maybeRemoteJoins) -> do
        let prepArgs = map fst args
        case maybeRemoteJoins of
          Nothing -> Tracing.trace "Postgres" . liftTx $ asSingleRowJsonResp q prepArgs
          Just remoteJoins ->
            executeQueryWithRemoteJoins env manager reqHdrs userInfo q prepArgs remoteJoins
    return (G.unName $ G.unAlias alias, resp)
      RRActionQuery actionTx -> actionTx
    return (G.unName alias, resp)

mkGeneratedSqlMap :: [(G.Alias, ResolvedQuery)] -> GeneratedSqlMap
mkGeneratedSqlMap :: [(G.Name, ResolvedQuery)] -> GeneratedSqlMap
mkGeneratedSqlMap resolved =
  flip map resolved $ \(alias, node) ->
    let res = case node of
          RRRaw _  -> Nothing
          RRSql ps -> Just ps
          RRRaw _  -> Nothing
          RRSql ps -> Just ps
          RRActionQuery _ -> Nothing
    in (alias, res)

-- The GraphQL Query type
data GraphQLQueryType
  = QueryHasura
  | QueryRelay
  deriving (Show, Eq, Ord, Generic)
instance Hashable GraphQLQueryType

instance J.ToJSON GraphQLQueryType where
  toJSON = \case
    QueryHasura -> "hasura"
    QueryRelay  -> "relay"

server/src-lib/Hasura/GraphQL/Execute/Resolve.hs (new file, 84 lines)
@@ -0,0 +1,84 @@
-- | Implements /variable resolution/ for GraphQL queries, which annotates the
-- use site of each GraphQL variable with its value.
module Hasura.GraphQL.Execute.Resolve
  ( resolveVariables
  ) where

import Hasura.Prelude

import qualified Data.HashMap.Strict.Extended as Map
import qualified Data.HashSet as HS
import qualified Data.List as L
import qualified Data.Text as T
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH

import Hasura.GraphQL.Parser.Schema
import Hasura.RQL.Types.Error
import Hasura.SQL.Types

resolveVariables
  :: forall m fragments
   . (MonadError QErr m, Traversable fragments)
  => [G.VariableDefinition]
  -> GH.VariableValues
  -> G.SelectionSet fragments G.Name
  -> m (G.SelectionSet fragments Variable)
resolveVariables definitions jsonValues selSet = do
  variablesByName <- Map.groupOnNE getName <$> traverse buildVariable definitions
  uniqueVariables <- flip Map.traverseWithKey variablesByName
    \variableName variableDefinitions ->
      case variableDefinitions of
        a :| [] -> return a
        _       -> throw400 ParseFailed
                   $ "multiple definitions for variable " <>> variableName
  (selSet', usedVariables) <- flip runStateT mempty $
    traverse (traverse (resolveVariable uniqueVariables)) selSet
  let variablesByNameSet = HS.fromList . Map.keys $ variablesByName
      jsonVariableNames = HS.fromList $ Map.keys jsonValues
      -- At the time of writing, this check is disabled using a local binding
      -- because the master branch doesn't implement it.
      -- TODO: Do this check using a feature flag
      isVariableValidationEnabled = False

  when (isVariableValidationEnabled && usedVariables /= variablesByNameSet) $
    throw400 ValidationFailed $
      "following variable(s) have been defined, but have not been used in the query - "
      <> T.concat (L.intersperse ", " $
           map G.unName $ HS.toList $
             HS.difference variablesByNameSet usedVariables)

  -- Variables that have a default value may be omitted from the variables JSON
  -- map, so we only check the other direction: every variable included in the
  -- JSON map must actually be used in the query.
  when (HS.difference jsonVariableNames usedVariables /= HS.empty) $
    throw400 ValidationFailed $
      "unexpected variables in variableValues: "
      <> T.concat (L.intersperse ", " $
           map G.unName $ HS.toList $
             HS.difference jsonVariableNames usedVariables)
  return selSet'
  where
    buildVariable :: G.VariableDefinition -> m Variable
    buildVariable G.VariableDefinition{ G._vdName, G._vdType, G._vdDefaultValue } = do
      let defaultValue = fromMaybe G.VNull _vdDefaultValue
      value <- case Map.lookup _vdName jsonValues of
        Just jsonValue -> pure $ JSONValue jsonValue
        Nothing
          | G.isNullable _vdType -> pure $ GraphQLValue $ absurd <$> defaultValue
          | otherwise -> throw400 ValidationFailed $
              "expecting a value for non-nullable variable: " <>> _vdName
      pure $! Variable
        { vInfo = if G.isNullable _vdType
                  then VIOptional _vdName defaultValue
                  else VIRequired _vdName
        , vType = _vdType
        , vValue = value
        }
    resolveVariable :: HashMap G.Name Variable -> G.Name -> StateT (HS.HashSet G.Name) m Variable
    resolveVariable variables name = case Map.lookup name variables of
      Just variable -> modify (HS.insert name) >> pure variable
      Nothing       -> throw400 ValidationFailed $ "unbound variable " <>> name
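The per-variable decision in `buildVariable` reduces to: take the supplied JSON value when present, fall back to the declared default when the variable is nullable, and fail on a missing non-nullable variable. A standalone sketch of just that decision, using simplified stand-in types rather than the engine's:

```haskell
import qualified Data.Map.Strict as M

-- A simplified variable definition: name, nullability, and optional default.
data VarDef = VarDef { vdName :: String, vdNullable :: Bool, vdDefault :: Maybe String }

-- Resolve one variable against the request's JSON variable values.
resolveVar :: M.Map String String -> VarDef -> Either String String
resolveVar jsonValues (VarDef name nullable def) =
  case M.lookup name jsonValues of
    Just v  -> Right v                 -- supplied value wins
    Nothing
      | nullable  -> Right (maybe "null" id def)  -- default (or null) fills the gap
      | otherwise -> Left ("expecting a value for non-nullable variable: " ++ name)
```

The real code additionally tracks which variables were actually used, so it can reject values for variables the query never references.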

server/src-lib/Hasura/GraphQL/Execute/Types.hs (new file, 17 lines)
@@ -0,0 +1,17 @@
module Hasura.GraphQL.Execute.Types (GraphQLQueryType(..)) where

import Hasura.Prelude

import qualified Data.Aeson as J

-- graphql-engine supports two GraphQL interfaces: one at v1/graphql, and a Relay one at v1beta1/relay
data GraphQLQueryType
  = QueryHasura
  | QueryRelay
  deriving (Show, Eq, Ord, Generic)
instance Hashable GraphQLQueryType

instance J.ToJSON GraphQLQueryType where
  toJSON = \case
    QueryHasura -> "hasura"
    QueryRelay  -> "relay"
@@ -6,32 +6,29 @@ module Hasura.GraphQL.Explain
import qualified Data.Aeson as J
import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.TH as J
import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Data.HashMap.Strict.InsOrd as OMap
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.EncJSON
import Hasura.GraphQL.Context
import Hasura.GraphQL.Resolve.Action
import Hasura.GraphQL.Validate.Types (evalReusabilityT, runReusabilityT)
import Hasura.GraphQL.Parser
import Hasura.Prelude
import Hasura.RQL.DML.Internal
import Hasura.RQL.Types
import Hasura.Server.Version (HasVersion)
import Hasura.Session
import Hasura.SQL.Types
import Hasura.SQL.Value

import qualified Hasura.GraphQL.Execute as E
import qualified Hasura.GraphQL.Execute.Inline as E
import qualified Hasura.GraphQL.Execute.LiveQuery as E
import qualified Hasura.GraphQL.Resolve as RS
import qualified Hasura.GraphQL.Execute.Query as E
import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
import qualified Hasura.GraphQL.Validate as GV
import qualified Hasura.GraphQL.Validate.SelectionSet as GV
import qualified Hasura.Logging as L
import qualified Hasura.RQL.DML.RemoteJoin as RR
import qualified Hasura.RQL.DML.Select as DS
import qualified Hasura.SQL.DML as S
import qualified Hasura.Tracing as Tracing

data GQLExplain
  = GQLExplain
@@ -53,118 +50,96 @@ data FieldPlan

$(J.deriveJSON (J.aesonDrop 3 J.camelCase) ''FieldPlan)

type Explain r m =
  (ReaderT r (ExceptT QErr m))

runExplain
resolveUnpreparedValue
  :: (MonadError QErr m)
  => r -> Explain r m a -> m a
runExplain ctx m =
  either throwError return =<< runExceptT (runReaderT m ctx)
  => UserInfo -> UnpreparedValue -> m S.SQLExp
resolveUnpreparedValue userInfo = \case
  UVParameter pgValue _ -> pure $ toTxtValue $ pcvValue pgValue
  UVLiteral sqlExp -> pure sqlExp
  UVSession -> pure $ sessionInfoJsonExp $ _uiSession userInfo
  UVSessionVar ty sessionVariable -> do
    let maybeSessionVariableValue =
          getSessionVariableValue sessionVariable (_uiSession userInfo)

resolveVal
  :: (MonadError QErr m)
  => UserInfo -> RS.UnresolvedVal -> m S.SQLExp
resolveVal userInfo = \case
  RS.UVPG annPGVal ->
    RS.txtConverter annPGVal
  RS.UVSessVar ty sessVar -> do
    sessVarVal <- S.SELit <$> getSessVarVal userInfo sessVar
    return $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of
      PGTypeScalar colTy -> withConstructorFn colTy sessVarVal
      PGTypeArray _      -> sessVarVal
  RS.UVSQL sqlExp -> return sqlExp
  RS.UVSession -> pure $ sessionInfoJsonExp $ _uiSession userInfo
    sessionVariableValue <- fmap S.SELit $ onNothing maybeSessionVariableValue $
      throw400 UnexpectedPayload $ "missing required session variable for role "
      <> _uiRole userInfo <<> " : " <> sessionVariableToText sessionVariable

getSessVarVal
  :: (MonadError QErr m)
  => UserInfo -> SessionVariable -> m Text
getSessVarVal userInfo sessVar =
  onNothing (getSessionVariableValue sessVar sessionVariables) $
    throw400 UnexpectedPayload $
      "missing required session variable for role " <> rn <<>
      " : " <> sessionVariableToText sessVar
  where
    rn = _uiRole userInfo
    sessionVariables = _uiSession userInfo
    pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of
      PGTypeScalar colTy -> withConstructorFn colTy sessionVariableValue
      PGTypeArray _      -> sessionVariableValue

explainField
  :: (MonadError QErr m, MonadTx m, HasVersion, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> L.Logger L.Hasura
  -> UserInfo
  -> GCtx
  -> SQLGenCtx
  -> QueryActionExecuter
  -> GV.Field
-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't need it
-- here. We should evaluate if we need it here.
explainQueryField
  :: (MonadError QErr m, MonadTx m)
  => UserInfo
  -> G.Name
  -> QueryRootField UnpreparedValue
  -> m FieldPlan
explainField env logger userInfo gCtx sqlGenCtx actionExecuter fld =
  case fName of
    "__type"     -> return $ FieldPlan fName Nothing Nothing
    "__schema"   -> return $ FieldPlan fName Nothing Nothing
    "__typename" -> return $ FieldPlan fName Nothing Nothing
    _ -> do
      unresolvedAST <-
        runExplain (logger, queryCtxMap, userInfo, fldMap, orderByCtx, sqlGenCtx) $
          evalReusabilityT $ RS.queryFldToPGAST env fld actionExecuter
      resolvedAST <- RS.traverseQueryRootFldAST (resolveVal userInfo) unresolvedAST
      let (query, remoteJoins) = RS.toPGQuery resolvedAST
          txtSQL = Q.getQueryText query
explainQueryField userInfo fieldName rootField = do
  resolvedRootField <- E.traverseQueryRootField (resolveUnpreparedValue userInfo) rootField
  case resolvedRootField of
    RFRemote _ -> throw400 InvalidParams "only hasura queries can be explained"
    RFAction _ -> throw400 InvalidParams "query actions cannot be explained"
    RFRaw _ -> pure $ FieldPlan fieldName Nothing Nothing
    RFDB qDB -> do
      let (querySQL, remoteJoins) = case qDB of
            QDBSimple s      -> first (DS.selectQuerySQL DS.JASMultipleRows) $ RR.getRemoteJoins s
            QDBPrimaryKey s  -> first (DS.selectQuerySQL DS.JASSingleObject) $ RR.getRemoteJoins s
            QDBAggregation s -> first DS.selectAggregateQuerySQL $ RR.getRemoteJoinsAggregateSelect s
            QDBConnection s  -> first DS.connectionSelectQuerySQL $ RR.getRemoteJoinsConnectionSelect s
          textSQL = Q.getQueryText querySQL
          -- CAREFUL!: an `EXPLAIN ANALYZE` here would actually *execute* this
          -- query, resulting in potential privilege escalation:
          withExplain = "EXPLAIN (FORMAT TEXT) " <> txtSQL
          -- query, maybe resulting in privilege escalation:
|
||||
withExplain = "EXPLAIN (FORMAT TEXT) " <> textSQL
|
||||
-- Reject if query contains any remote joins
|
||||
when (remoteJoins /= mempty) $ throw400 NotSupported "Remote relationships are not allowed in explain query"
|
||||
planLines <- liftTx $ map runIdentity <$>
|
||||
Q.listQE dmlTxErrorHandler (Q.fromText withExplain) () True
|
||||
return $ FieldPlan fName (Just txtSQL) $ Just planLines
|
||||
where
|
||||
fName = GV._fName fld
|
||||
|
||||
queryCtxMap = _gQueryCtxMap gCtx
|
||||
fldMap = _gFields gCtx
|
||||
orderByCtx = _gOrdByCtx gCtx
|
||||
Q.listQE dmlTxErrorHandler (Q.fromText withExplain) () True
|
||||
pure $ FieldPlan fieldName (Just textSQL) $ Just planLines
|
||||
|
||||
-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't need it
|
||||
-- here. We should evaluate if we need it here.
|
||||
explainGQLQuery
|
||||
:: ( HasVersion
|
||||
, MonadError QErr m
|
||||
, MonadIO m
|
||||
, Tracing.MonadTrace m
|
||||
, MonadIO tx
|
||||
, MonadTx tx
|
||||
, Tracing.MonadTrace tx
|
||||
)
|
||||
=> Env.Environment
|
||||
-> L.Logger L.Hasura
|
||||
-> PGExecCtx
|
||||
-> (tx EncJSON -> m EncJSON)
|
||||
:: forall m
|
||||
. ( MonadError QErr m
|
||||
, MonadIO m
|
||||
)
|
||||
=> PGExecCtx
|
||||
-> SchemaCache
|
||||
-> SQLGenCtx
|
||||
-> QueryActionExecuter
|
||||
-> GQLExplain
|
||||
-> m EncJSON
|
||||
explainGQLQuery env logger pgExecCtx runInTx sc sqlGenCtx actionExecuter (GQLExplain query userVarsRaw maybeIsRelay) = do
|
||||
-- NOTE!: we will be executing what follows as though admin role. See e.g.
|
||||
-- notes in explainField:
|
||||
explainGQLQuery pgExecCtx sc (GQLExplain query userVarsRaw maybeIsRelay) = do
|
||||
-- NOTE!: we will be executing what follows as though admin role. See e.g. notes in explainField:
|
||||
userInfo <- mkUserInfo (URBFromSessionVariablesFallback adminRoleName) UAdminSecretSent sessionVariables
|
||||
-- we don't need to check in allow list as we consider it an admin endpoint
|
||||
(execPlan, queryReusability) <- runReusabilityT $
|
||||
E.getExecPlanPartial userInfo sc queryType query
|
||||
(gCtx, rootSelSet) <- case execPlan of
|
||||
E.GExPHasura (gCtx, rootSelSet) ->
|
||||
return (gCtx, rootSelSet)
|
||||
E.GExPRemote{} ->
|
||||
throw400 InvalidParams "only hasura queries can be explained"
|
||||
case rootSelSet of
|
||||
GV.RQuery selSet ->
|
||||
runInTx $ encJFromJValue . map snd <$>
|
||||
GV.traverseObjectSelectionSet selSet (explainField env logger userInfo gCtx sqlGenCtx actionExecuter)
|
||||
GV.RMutation _ ->
|
||||
let takeFragment =
|
||||
\case G.ExecutableDefinitionFragment f -> Just f; _ -> Nothing
|
||||
fragments = mapMaybe takeFragment $ GH.unGQLExecDoc $ GH._grQuery query
|
||||
(graphQLContext, queryParts) <- E.getExecPlanPartial userInfo sc queryType query
|
||||
case queryParts of
|
||||
G.TypedOperationDefinition G.OperationTypeQuery _ varDefs _ selSet -> do
|
||||
-- (Here the above fragment inlining is actually executed.)
|
||||
inlinedSelSet <- E.inlineSelectionSet fragments selSet
|
||||
(unpreparedQueries, _) <-
|
||||
E.parseGraphQLQuery graphQLContext varDefs (GH._grVariables query) inlinedSelSet
|
||||
runInTx $ encJFromJValue
|
||||
<$> for (OMap.toList unpreparedQueries) (uncurry (explainQueryField userInfo))
|
||||
|
||||
G.TypedOperationDefinition G.OperationTypeMutation _ _ _ _ ->
|
||||
throw400 InvalidParams "only queries can be explained"
|
||||
GV.RSubscription fields -> do
|
||||
(plan, _) <- E.getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo
|
||||
queryReusability actionExecuter fields
|
||||
|
||||
G.TypedOperationDefinition G.OperationTypeSubscription _ varDefs _ selSet -> do
|
||||
-- (Here the above fragment inlining is actually executed.)
|
||||
inlinedSelSet <- E.inlineSelectionSet fragments selSet
|
||||
(unpreparedQueries, _) <- E.parseGraphQLQuery graphQLContext varDefs (GH._grVariables query) inlinedSelSet
|
||||
validSubscriptionQueries <- for unpreparedQueries E.validateSubscriptionRootField
|
||||
(plan, _) <- E.buildLiveQueryPlan pgExecCtx userInfo validSubscriptionQueries
|
||||
runInTx $ encJFromJValue <$> E.explainLiveQueryPlan plan
|
||||
where
|
||||
queryType = bool E.QueryHasura E.QueryRelay $ fromMaybe False maybeIsRelay
|
||||
sessionVariables = mkSessionVariablesText $ maybe [] Map.toList userVarsRaw
|
||||
|
||||
runInTx :: LazyTx QErr EncJSON -> m EncJSON
|
||||
runInTx = liftEither <=< liftIO . runExceptT . runLazyTx pgExecCtx Q.ReadOnly
|
||||
|
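The "CAREFUL!" comment in the explain code above deserves emphasis: the server wraps the generated SQL in plain `EXPLAIN` rather than `EXPLAIN ANALYZE`, because `ANALYZE` actually runs the statement. A minimal standalone sketch (the helper name `mkExplainSQL` is hypothetical, not part of this patch):

```haskell
-- Hypothetical helper illustrating the wrapping done above: prefix the
-- generated SQL with plain EXPLAIN. Substituting "EXPLAIN ANALYZE" here
-- would *execute* the query, which is the privilege-escalation hazard the
-- CAREFUL! comment warns about, since explain runs under the admin role.
mkExplainSQL :: String -> String
mkExplainSQL sql = "EXPLAIN (FORMAT TEXT) " <> sql
```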
@@ -5,7 +5,7 @@ layer. In contrast with, logging at the HTTP server layer.

module Hasura.GraphQL.Logging
( QueryLog(..)
, MonadQueryLog (..)
, MonadQueryLog(..)
) where

import qualified Data.Aeson as J
@@ -43,9 +43,8 @@ instance L.ToEngineLog QueryLog L.Hasura where
-- | key-value map to be printed as JSON
encodeSql :: EQ.GeneratedSqlMap -> J.Value
encodeSql sql =
jValFromAssocList $ map (\(a, q) -> (alName a, fmap J.toJSON q)) sql
jValFromAssocList $ map (\(a, q) -> (G.unName a, fmap J.toJSON q)) sql
where
alName = G.unName . G.unAlias
jValFromAssocList xs = J.object $ map (uncurry (J..=)) xs

class Monad m => MonadQueryLog m where
@@ -247,9 +247,9 @@ instance IsField Typename where
getMemberSelectionSet
:: IsField f
=> G.NamedType -> ScopedSelectionSet f -> ObjectSelectionSet
getMemberSelectionSet namedType (ScopedSelectionSet {..}) =
getMemberSelectionSet namedType ScopedSelectionSet{..} =
fromMaybe (ObjectSelectionSet (fmap toField _sssBaseSelectionSet)) $
Map.lookup namedType $ _sssMemberSelectionSets
Map.lookup namedType _sssMemberSelectionSets

data AnnInpVal
= AnnInpVal
51
server/src-lib/Hasura/GraphQL/Parser.hs
Normal file
@@ -0,0 +1,51 @@
-- | This module exports the public API to our internal GraphQL query parser
-- combinator language. For more details, see the documentation for 'Parser'.
module Hasura.GraphQL.Parser
  ( Parser
  , parserType
  , runParser
  , bind
  , bindFields

  , ScalarRepresentation(..)
  , scalar
  , boolean
  , int
  , float
  , string
  , identifier
  , unsafeRawScalar

  , enum
  , nullable
  , list
  , object
  , selectionSet
  , selectionSetObject

  , InputFieldsParser
  , field
  , fieldWithDefault
  , fieldOptional

  , FieldParser
  , ParsedSelection(..)
  , handleTypename
  , selection
  , selection_
  , subselection
  , subselection_

  , jsonToGraphQL

  , module Hasura.GraphQL.Parser.Class
  , module Hasura.GraphQL.Parser.Column
  , module Hasura.GraphQL.Parser.Monad
  , module Hasura.GraphQL.Parser.Schema
  ) where

import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Parser.Column
import Hasura.GraphQL.Parser.Internal.Parser
import Hasura.GraphQL.Parser.Monad
import Hasura.GraphQL.Parser.Schema
193
server/src-lib/Hasura/GraphQL/Parser/Class.hs
Normal file
@@ -0,0 +1,193 @@
-- | Classes for monads used during schema construction and query parsing.
module Hasura.GraphQL.Parser.Class where

import Hasura.Prelude

import qualified Data.HashMap.Strict as Map
import qualified Language.Haskell.TH as TH

import Data.Has
import Data.Parser.JSONPath
import Data.Tuple.Extended
import GHC.Stack (HasCallStack)
import Type.Reflection (Typeable)

import {-# SOURCE #-} Hasura.GraphQL.Parser.Internal.Parser
import Hasura.RQL.Types.Error
import Hasura.RQL.Types.Table (TableCache, TableInfo)
import Hasura.Session (RoleName)
import Hasura.SQL.Types

{- Note [Tying the knot]
~~~~~~~~~~~~~~~~~~~~~~~~
GraphQL type definitions can be mutually recursive, and indeed, they quite often
are! For example, two tables that reference one another will be represented by
types such as the following:

  type author {
    id: Int!
    name: String!
    articles: [article!]!
  }

  type article {
    id: Int!
    title: String!
    content: String!
    author: author!
  }

This doesn’t cause any trouble if the schema is represented by a mapping from
type names to type definitions, but the Parser abstraction is all about avoiding
that kind of indirection to improve type safety — parsers refer to their
sub-parsers directly. This presents two problems during schema generation:

  1. Schema generation needs to terminate in finite time, so we need to ensure
     we don’t try to eagerly construct an infinitely-large schema due to the
     mutually-recursive structure.

  2. To serve introspection queries, we do eventually need to construct a
     mapping from names to types (a TypeMap), so we need to be able to
     recursively walk the entire schema in finite time.

Solving point number 1 could be done with either laziness or sharing, but
neither of those are enough to solve point number 2, which requires /observable/
sharing. We need to construct a Parser graph that contains enough information to
detect cycles during traversal.

It may seem appealing to just use type names to detect cycles, which would allow
us to get away with using laziness rather than true sharing. Unfortunately, that
leads to two further problems:

  * It’s possible to end up with two different types with the same name, which
    is an error and should be reported as such. Using names to break cycles
    prevents us from doing that, since we have no way to check that two types
    with the same name are actually the same.

  * Some Parser constructors can fail — the `column` parser checks that the type
    name is a valid GraphQL name, for example. This extra validation means lazy
    schema construction isn’t viable, since we need to eagerly build the schema
    to ensure all the validation checks hold.

So we’re forced to use sharing. But how do we do it? Somehow, we have to /tie
the knot/ — we have to build a cyclic data structure — and some of the cycles
may be quite large. Doing all this knot-tying by hand would be incredibly
tricky, and it would require a lot of inversion of control to thread the shared
parsers around.

To avoid contorting the program, we instead implement a form of memoization. The
MonadSchema class provides a mechanism to memoize a parser constructor function,
which allows us to get sharing mostly for free. The memoization strategy also
annotates cached parsers with a Unique that can be used to break cycles while
traversing the graph, so we get observable sharing as well. -}
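The caching half of that memoization strategy can be sketched in miniature. This is a simplified model, not the real `MonadSchema` implementation: it memoizes plain strings instead of `Parser`s, threads the cache through `State`, and omits the `Unique` annotation and the placeholder trick that make recursive (knot-tied) references work.

```haskell
import qualified Data.Map.Strict as Map
import Control.Monad.State (State, evalState, gets, modify)

type Cache = Map.Map String String

-- Simplified memoizeOn: on a cache hit, return the shared result without
-- running the constructor again; on a miss, run it once and cache it.
memoizeOnSketch :: String -> State Cache String -> State Cache String
memoizeOnSketch key build = do
  cached <- gets (Map.lookup key)
  case cached of
    Just value -> pure value                -- reuse the shared result
    Nothing    -> do
      value <- build
      modify (Map.insert key value)         -- cache for subsequent calls
      pure value
```

Running the same key twice demonstrates the sharing: the second constructor is never consulted, so both calls observe the first result.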
-- | A class that provides functionality used when building the GraphQL schema,
-- i.e. constructing the 'Parser' graph.
class (Monad m, MonadParse n) => MonadSchema n m | m -> n where
  -- | Memoizes a parser constructor function for the extent of a single schema
  -- construction process. This is mostly useful for recursive parsers;
  -- see Note [Tying the knot] for more details.
  memoizeOn
    :: (HasCallStack, Ord a, Typeable a, Typeable b, Typeable k)
    => TH.Name
    -- ^ A unique name used to identify the function being memoized. There isn’t
    -- really any metaprogramming going on here, we just use a Template Haskell
    -- 'TH.Name' as a convenient source for a static, unique identifier.
    -> a
    -- ^ The value to use as the memoization key. It’s the caller’s
    -- responsibility to ensure multiple calls to the same function don’t use
    -- the same key.
    -> m (Parser k n b) -> m (Parser k n b)

type MonadRole r m = (MonadReader r m, Has RoleName r)

-- | Gets the current role the schema is being built for.
askRoleName
  :: MonadRole r m
  => m RoleName
askRoleName = asks getter

type MonadTableInfo r m = (MonadReader r m, Has TableCache r, MonadError QErr m)

-- | Looks up table information for the given table name. This function
-- should never fail, since the schema cache construction process is
-- supposed to ensure all dependencies are resolved.
askTableInfo
  :: MonadTableInfo r m
  => QualifiedTable
  -> m TableInfo
askTableInfo tableName = do
  tableInfo <- asks $ Map.lookup tableName . getter
  -- This should never fail, since the schema cache construction process is
  -- supposed to ensure that all dependencies are resolved.
  tableInfo `onNothing` throw500 ("askTableInfo: no info for " <>> tableName)

-- | A wrapper around 'memoizeOn' that memoizes a function by using its argument
-- as the key.
memoize
  :: (HasCallStack, MonadSchema n m, Ord a, Typeable a, Typeable b, Typeable k)
  => TH.Name
  -> (a -> m (Parser k n b))
  -> (a -> m (Parser k n b))
memoize name f a = memoizeOn name a (f a)

memoize2
  :: (HasCallStack, MonadSchema n m, Ord a, Ord b, Typeable a, Typeable b, Typeable c, Typeable k)
  => TH.Name
  -> (a -> b -> m (Parser k n c))
  -> (a -> b -> m (Parser k n c))
memoize2 name = curry . memoize name . uncurry

memoize3
  :: ( HasCallStack, MonadSchema n m, Ord a, Ord b, Ord c
     , Typeable a, Typeable b, Typeable c, Typeable d, Typeable k )
  => TH.Name
  -> (a -> b -> c -> m (Parser k n d))
  -> (a -> b -> c -> m (Parser k n d))
memoize3 name = curry3 . memoize name . uncurry3

memoize4
  :: ( HasCallStack, MonadSchema n m, Ord a, Ord b, Ord c, Ord d
     , Typeable a, Typeable b, Typeable c, Typeable d, Typeable e, Typeable k )
  => TH.Name
  -> (a -> b -> c -> d -> m (Parser k n e))
  -> (a -> b -> c -> d -> m (Parser k n e))
memoize4 name = curry4 . memoize name . uncurry4

-- | A class that provides functionality for parsing GraphQL queries, i.e.
-- running a fully-constructed 'Parser'.
class Monad m => MonadParse m where
  withPath :: (JSONPath -> JSONPath) -> m a -> m a
  -- | Not the full power of 'MonadError' because parse errors cannot be
  -- caught.
  parseErrorWith :: Code -> Text -> m a
  -- | See 'QueryReusability'.
  markNotReusable :: m ()

parseError :: MonadParse m => Text -> m a
parseError = parseErrorWith ValidationFailed

-- | Tracks whether or not a query is /reusable/. Reusable queries are nice,
-- since we can cache their resolved ASTs and avoid re-resolving them if we
-- receive an identical query. However, we can’t always safely reuse queries if
-- they have variables, since some variable values can affect the generated SQL.
-- For example, consider the following query:
--
-- > query users_where($condition: users_bool_exp!) {
-- >   users(where: $condition) {
-- >     id
-- >   }
-- > }
--
-- Different values for @$condition@ will produce completely different queries,
-- so we can’t reuse its plan (unless the variable values were also all
-- identical, of course, but we don’t bother caching those).
data QueryReusability = Reusable | NotReusable

instance Semigroup QueryReusability where
  NotReusable <> _ = NotReusable
  _ <> NotReusable = NotReusable
  Reusable <> Reusable = Reusable

instance Monoid QueryReusability where
  mempty = Reusable
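The Semigroup/Monoid instances for QueryReusability encode a simple rule: a single non-reusable part makes the whole query non-reusable, and `mempty = Reusable` means a query with no variables stays reusable. A self-contained illustration (re-stating the type above so the snippet compiles on its own; `overallReusability` is an illustrative helper, not part of the patch):

```haskell
-- Re-statement of the QueryReusability type and instances for illustration.
data QueryReusability = Reusable | NotReusable
  deriving (Eq, Show)

instance Semigroup QueryReusability where
  NotReusable <> _           = NotReusable
  _           <> NotReusable = NotReusable
  Reusable    <> Reusable    = Reusable

instance Monoid QueryReusability where
  mempty = Reusable

-- Fold the per-variable verdicts into one verdict for the whole query:
-- any NotReusable wins, and an empty list (no variables) is Reusable.
overallReusability :: [QueryReusability] -> QueryReusability
overallReusability = mconcat
```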
5
server/src-lib/Hasura/GraphQL/Parser/Class.hs-boot
Normal file
@@ -0,0 +1,5 @@
module Hasura.GraphQL.Parser.Class where

import Data.Kind (Type)

class MonadParse (m :: Type -> Type)
274
server/src-lib/Hasura/GraphQL/Parser/Collect.hs
Normal file
@@ -0,0 +1,274 @@
{-# LANGUAGE StrictData #-}

{-| This module implements two parts of the GraphQL specification:

  1. <§ 5.3.2 Field Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging>
  2. <§ 6.3.2 Field Collection http://spec.graphql.org/June2018/#sec-Field-Collection>

These are described in completely different sections of the specification, but
they’re actually highly related: both essentially normalize fields in a
selection set. -}
module Hasura.GraphQL.Parser.Collect
  ( collectFields
  ) where

import Hasura.Prelude

import qualified Data.HashMap.Strict.Extended as Map
import qualified Data.HashMap.Strict.InsOrd as OMap

import Data.List.Extended (duplicates)
import Language.GraphQL.Draft.Syntax

import Hasura.GraphQL.Parser.Class
import {-# SOURCE #-} Hasura.GraphQL.Parser.Internal.Parser (boolean, runParser)
import Hasura.GraphQL.Parser.Schema
import Hasura.GraphQL.Utils (showNames)
import Hasura.SQL.Types

-- | Collects the effective set of fields queried by a selection set by
-- flattening fragments and merging duplicate fields.
collectFields
  :: (MonadParse m, Foldable t)
  => t Name
  -- ^ The names of the object types and interface types the 'SelectionSet' is
  -- selecting against.
  -> SelectionSet NoFragments Variable
  -> m (InsOrdHashMap Name (Field NoFragments Variable))
collectFields objectTypeNames selectionSet =
  mergeFields =<< flattenSelectionSet objectTypeNames selectionSet

-- | Flattens inline fragments in a selection set. For example,
--
-- > {
-- >   bar
-- >   ... on Foo {
-- >     baz
-- >     qux
-- >   }
-- > }
--
-- is flattened to:
--
-- > {
-- >   bar
-- >   baz
-- >   qux
-- > }
--
-- Nested fragments are similarly flattened, but only as is necessary: fragments
-- inside subselection sets of individual fields are /not/ flattened. For
-- example,
--
-- > {
-- >   bar
-- >   ... on Foo {
-- >     baz {
-- >       ... on Baz {
-- >         foo
-- >       }
-- >     }
-- >     qux
-- >   }
-- > }
--
-- is flattened to
--
-- > {
-- >   bar
-- >   baz {
-- >     ... on Baz {
-- >       foo
-- >     }
-- >   }
-- >   qux
-- > }
--
-- leaving the innermost fragment on @baz@ alone.
--
-- This function also applies @\@include@ and @\@skip@ directives, since they
-- should be applied before fragments are flattened.
flattenSelectionSet
  :: (MonadParse m, Foldable t)
  => t Name
  -- ^ The name of the object type the 'SelectionSet' is selecting against.
  -> SelectionSet NoFragments Variable
  -> m [Field NoFragments Variable]
flattenSelectionSet objectTypeNames = fmap concat . traverse flattenSelection
  where
    -- The easy case: just a single field.
    flattenSelection (SelectionField field) = do
      validateDirectives (_fDirectives field)
      applyInclusionDirectives (_fDirectives field) $ pure [field]

    -- Note: The 'SelectionFragmentSpread' case has already been eliminated by
    -- the fragment inliner.

    -- The involved case: we have an inline fragment to process.
    flattenSelection (SelectionInlineFragment fragment) = do
      validateDirectives (_ifDirectives fragment)
      applyInclusionDirectives (_ifDirectives fragment) $
        case _ifTypeCondition fragment of
          -- No type condition, so the fragment unconditionally applies.
          Nothing -> flattenInlineFragment fragment
          Just typeName
            -- There is a type condition, but it is just the type of the
            -- selection set; the fragment trivially applies.
            | typeName `elem` objectTypeNames -> flattenInlineFragment fragment

            -- Otherwise, the fragment must not apply, because we do not currently
            -- support interfaces or unions. According to the GraphQL spec, it is
            -- an *error* to select a fragment that cannot possibly apply to the
            -- given type; see
            -- http://spec.graphql.org/June2018/#sec-Fragment-spread-is-possible.
            -- Therefore, we raise an error.
            | otherwise -> return []
              {- parseError $ "illegal type condition in fragment; type "
                 <> typeName <<> " is unrelated to any of the types " <>
                 Text.intercalate ", " (fmap dquoteTxt (toList objectTypeNames))
              -}

    flattenInlineFragment InlineFragment{ _ifDirectives, _ifSelectionSet } = do
      validateDirectives _ifDirectives
      flattenSelectionSet objectTypeNames _ifSelectionSet

    applyInclusionDirectives directives continue
      | Just directive <- find ((== $$(litName "include")) . _dName) directives
      = applyInclusionDirective id directive continue
      | Just directive <- find ((== $$(litName "skip")) . _dName) directives
      = applyInclusionDirective not directive continue
      | otherwise = continue

    applyInclusionDirective adjust Directive{ _dName, _dArguments } continue = do
      ifArgument <- Map.lookup $$(litName "if") _dArguments `onNothing`
        parseError ("missing \"if\" argument for " <> _dName <<> " directive")
      value <- runParser boolean $ GraphQLValue ifArgument
      if adjust value then continue else pure []

    validateDirectives directives =
      case nonEmpty $ toList $ duplicates $ map _dName directives of
        Nothing -> pure ()
        Just duplicatedDirectives -> parseError
          $ "the following directives are used more than once: "
          <> showNames duplicatedDirectives

-- | Merges fields according to the rules in the GraphQL specification, specifically
-- <§ 5.3.2 Field Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging>.
mergeFields
  :: (MonadParse m, Eq var)
  => [Field NoFragments var]
  -> m (InsOrdHashMap Name (Field NoFragments var))
mergeFields = foldM addField OMap.empty
  where
    addField fields newField = case OMap.lookup alias fields of
      Nothing ->
        pure $! OMap.insert alias newField fields
      Just oldField -> do
        mergedField <- mergeField alias oldField newField
        pure $! OMap.insert alias mergedField fields
      where
        alias = fromMaybe (_fName newField) (_fAlias newField)

    mergeField alias oldField newField = do
      unless (_fName oldField == _fName newField) $ parseError $
        "selection of both " <> _fName oldField <<> " and " <>
        _fName newField <<> " specify the same response name, " <>> alias

      unless (_fArguments oldField == _fArguments newField) $ parseError $
        "inconsistent arguments between multiple selections of " <>
        "field " <>> _fName oldField

      pure $! Field
        { _fAlias = Just alias
        , _fName = _fName oldField
        , _fArguments = _fArguments oldField
        -- see Note [Drop directives from merged fields]
        , _fDirectives = []
        -- see Note [Lazily merge selection sets]
        , _fSelectionSet = _fSelectionSet oldField ++ _fSelectionSet newField
        }

{- Note [Drop directives from merged fields]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When we merge two fields, what do we do with directives? The GraphQL spec isn’t
very clear here, but it does explicitly state that directives only need to be
unique per unmerged field (§ 5.7.3 Directives Are Unique Per Location,
http://spec.graphql.org/June2018/#sec-Directives-Are-Unique-Per-Location). For
clarity, here is the example given by the spec:

  query ($foo: Boolean = true, $bar: Boolean = false) {
    field @skip(if: $foo) {
      subfieldA
    }
    field @skip(if: $bar) {
      subfieldB
    }
  }

The spec says this is totally fine, since the @skip directives appear in
different places. This forces our hand: we *must* process @include/@skip
directives prior to merging fields. And conveniently, aside from @include/@skip,
we don’t care about directives, so we don’t bother reconciling them during field
merging---we just drop them.

Note [Lazily merge selection sets]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Field merging is described in a recursive way in the GraphQL spec (§ 5.3.2 Field
Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging).
This makes sense: if fields have sub-selection sets, they should be recursively
merged. For example, suppose we have the following selection set:

  {
    field1 {
      field2 {
        field3
      }
      field5
    }
    field1 {
      field2 {
        field4
      }
      field5
    }
  }

After a single level of merging, we’ll merge the two occurrences of field1
together to get:

  {
    field1 {
      field2 {
        field3
      }
      field5
      field2 {
        field4
      }
      field5
    }
  }

It would be natural to then merge the inner selection set, too, yielding:

  {
    field1 {
      field2 {
        field3
        field4
      }
      field5
    }
  }

But we don’t do this. Instead, we stop after the first level of merging, so
field1’s sub-selection set still has duplication. Why? Because recursively
merging fields would also require recursively flattening fragments, and
flattening fragments is tricky: it requires knowledge of type information.

Fortunately, this lazy approach to field merging is totally okay, because we
call collectFields (and therefore mergeFields) each time we parse a selection
set. Once we get to processing the sub-selection set of field1, we’ll call
collectFields again, and it will merge things the rest of the way. This is
consistent with the way the rest of our parsing system works, where parsers
interpret their own inputs on an as-needed basis. -}
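The one-level merging described in Note [Lazily merge selection sets] can be modeled in miniature. This is a toy sketch with simplified types (a `Field` of just a name and sub-selections, not the real AST type), and it ignores aliases, arguments, and directives; duplicate fields are merged by concatenating their sub-selection sets, which are themselves merged only on a later pass.

```haskell
-- Toy model of one-level field merging: same response name => concatenate
-- sub-selection sets, preserving first-occurrence order. The concatenated
-- sub-selections stay unmerged, mirroring the lazy strategy in the Note.
data Field = Field { fName :: String, fSelections :: [Field] }
  deriving (Eq, Show)

mergeFieldsSketch :: [Field] -> [Field]
mergeFieldsSketch = foldl addField []
  where
    addField acc new = case break ((== fName new) . fName) acc of
      -- first occurrence of this response name: append, keeping order
      (_, []) -> acc ++ [new]
      -- duplicate: fold into the existing entry, one level deep only
      (before, old : after) ->
        before ++ [old { fSelections = fSelections old ++ fSelections new }] ++ after
```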
154
server/src-lib/Hasura/GraphQL/Parser/Column.hs
Normal file
@@ -0,0 +1,154 @@
{-# LANGUAGE StrictData #-}

module Hasura.GraphQL.Parser.Column
  ( PGColumnValue(..)
  , column
  , mkScalarTypeName

  , UnpreparedValue(..)

  , Opaque
  , openOpaque
  , mkParameter
  ) where

import Hasura.Prelude

import qualified Data.HashMap.Strict.Extended as M
import qualified Database.PG.Query as Q

import Language.GraphQL.Draft.Syntax (Description (..), Name (..),
                                      Nullability (..), Value (..), litName,
                                      mkName)

import qualified Hasura.RQL.Types.Column as RQL
import qualified Hasura.RQL.Types.CustomTypes as RQL

import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Parser.Internal.Parser
import Hasura.GraphQL.Parser.Schema
import Hasura.RQL.Types.Column hiding (EnumValue (..), EnumValueInfo (..))
import Hasura.RQL.Types.Error
import Hasura.Session (SessionVariable)
import Hasura.SQL.DML
import Hasura.SQL.Types
import Hasura.SQL.Value

-- -------------------------------------------------------------------------------------------------

data Opaque a = Opaque
  { _opVariable :: Maybe VariableInfo
  -- ^ The variable this value came from, if any.
  , _opValue :: a
  } -- Note: we intentionally don’t derive any instances here, since that would
    -- defeat the opaqueness!

openOpaque :: MonadParse m => Opaque a -> m a
openOpaque (Opaque Nothing value) = pure value
openOpaque (Opaque (Just _) value) = markNotReusable $> value

data UnpreparedValue
  -- | A SQL value that can be parameterized over.
  = UVParameter PGColumnValue
                (Maybe VariableInfo)
                -- ^ The GraphQL variable this value came from, if any.
  -- | A literal SQL expression that /cannot/ be parameterized over.
  | UVLiteral SQLExp
  -- | The entire session variables JSON object.
  | UVSession
  -- | A single session variable.
  | UVSessionVar (PGType PGScalarType) SessionVariable

data PGColumnValue = PGColumnValue
  { pcvType :: PGColumnType
  , pcvValue :: WithScalarType PGScalarValue
  }

mkParameter :: Opaque PGColumnValue -> UnpreparedValue
mkParameter (Opaque variable value) = UVParameter value variable

-- -------------------------------------------------------------------------------------------------

column
  :: (MonadSchema n m, MonadError QErr m)
  => PGColumnType
  -> Nullability
  -> m (Parser 'Both n (Opaque PGColumnValue))
column columnType (Nullability isNullable) =
  -- TODO(PDV): It might be worth memoizing this function even though it isn’t
  -- recursive simply for performance reasons, since it’s likely to be hammered
  -- during schema generation. Need to profile to see whether or not it’s a win.
  opaque . fmap (PGColumnValue columnType) <$> case columnType of
    PGColumnScalar scalarType -> withScalarType scalarType <$> case scalarType of
      PGInteger -> pure (PGValInteger <$> int)
      PGBoolean -> pure (PGValBoolean <$> boolean)
      PGFloat   -> pure (PGValDouble <$> float)
      PGText    -> pure (PGValText <$> string)
      PGVarchar -> pure (PGValVarchar <$> string)
      PGJSON    -> pure (PGValJSON . Q.JSON <$> json)
      PGJSONB   -> pure (PGValJSONB . Q.JSONB <$> jsonb)

      -- For all other scalars, we convert the value to JSON and use the
      -- FromJSON instance. The major upside is that this avoids having to write
      -- a new parser for each custom type: if the JSON parser is sound, so will
      -- this one be, and it avoids the risk of having two separate ways of
      -- parsing a value in the codebase, which could lead to inconsistencies.
      _ -> do
        name <- mkScalarTypeName scalarType
        let schemaType = NonNullable $ TNamed $ mkDefinition name Nothing TIScalar
        pure $ Parser
          { pType = schemaType
          , pParser =
              valueToJSON (toGraphQLType schemaType) >=>
              either (parseErrorWith ParseFailed . qeError) pure . runAesonParser (parsePGValue scalarType)
          }
    PGColumnEnumReference (EnumReference tableName enumValues) ->
      case nonEmpty (M.toList enumValues) of
        Just enumValuesList -> do
          name <- qualifiedObjectToName tableName <&> (<> $$(litName "_enum"))
          pure $ withScalarType PGText $ enum name Nothing (mkEnumValue <$> enumValuesList)
        Nothing -> throw400 ValidationFailed "empty enum values"
  where
    -- Sadly, this combinator is not sound in general, so we can’t export it
|
||||
-- for general-purpose use. If we did, someone could write this:
|
||||
--
|
||||
-- mkParameter <$> opaque do
|
||||
-- n <- int
|
||||
-- pure (mkIntColumnValue (n + 1))
|
||||
--
|
||||
-- Now we’d end up with a UVParameter that has a variable in it, so we’d
|
||||
-- parameterize over it. But when we’d reuse the plan, we wouldn’t know to
|
||||
-- increment the value by 1, so we’d use the wrong value!
|
||||
--
|
||||
-- We could theoretically solve this by retaining a reference to the parser
|
||||
-- itself and re-parsing each new value, using the saved parser, which
|
||||
-- would admittedly be neat. But it’s more complicated, and it isn’t clear
|
||||
-- that it would actually be useful, so for now we don’t support it.
|
||||
opaque :: MonadParse m => Parser 'Both m a -> Parser 'Both m (Opaque a)
|
||||
opaque parser = parser
|
||||
{ pParser = \case
|
||||
GraphQLValue (VVariable var@Variable{ vInfo, vValue }) -> do
|
||||
typeCheck False (toGraphQLType $ pType parser) var
|
||||
Opaque (Just vInfo) <$> pParser parser (absurd <$> vValue)
|
||||
value -> Opaque Nothing <$> pParser parser value
|
||||
}
|
||||
|
||||
withScalarType scalarType = fmap (WithScalarType scalarType) . possiblyNullable scalarType
|
||||
possiblyNullable scalarType
|
||||
| isNullable = fmap (fromMaybe $ PGNull scalarType) . nullable
|
||||
| otherwise = id
|
||||
|
||||
mkEnumValue (RQL.EnumValue value, RQL.EnumValueInfo description) =
|
||||
( mkDefinition value (Description <$> description) EnumValueInfo
|
||||
, PGValText $ unName value
|
||||
)
|
||||
|
||||
mkScalarTypeName :: MonadError QErr m => PGScalarType -> m Name
|
||||
mkScalarTypeName PGInteger = pure RQL.intScalar
|
||||
mkScalarTypeName PGBoolean = pure RQL.boolScalar
|
||||
mkScalarTypeName PGFloat = pure RQL.floatScalar
|
||||
mkScalarTypeName PGText = pure RQL.stringScalar
|
||||
mkScalarTypeName PGVarchar = pure RQL.stringScalar
|
||||
mkScalarTypeName scalarType = mkName (toSQLTxt scalarType) `onNothing` throw400 ValidationFailed
|
||||
("cannot use SQL type " <> scalarType <<> " in the GraphQL schema because its name is not a "
|
||||
<> "valid GraphQL identifier")
server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs (new file, 998 lines)
@@ -0,0 +1,998 @@
{-# OPTIONS_HADDOCK not-home #-}
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE StrictData #-}
{-# LANGUAGE ViewPatterns #-}

-- | Defines the 'Parser' type and its primitive combinators.
module Hasura.GraphQL.Parser.Internal.Parser where

import Hasura.Prelude

import qualified Data.Aeson as A
import qualified Data.HashMap.Strict.Extended as M
import qualified Data.HashMap.Strict.InsOrd as OMap
import qualified Data.HashSet as S
import qualified Data.Text as T

import Control.Lens.Extended hiding (enum, index)
import Data.Int (Int32, Int64)
import Data.Scientific (toBoundedInteger)
import Data.Parser.JSONPath
import Data.Type.Equality
import Language.GraphQL.Draft.Syntax hiding (Definition)

import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Parser.Collect
import Hasura.GraphQL.Parser.Schema
import Hasura.RQL.Types.CustomTypes
import Hasura.RQL.Types.Error
import Hasura.Server.Utils (englishList)
import Hasura.SQL.Types
import Hasura.SQL.Value


-- -----------------------------------------------------------------------------
-- type definitions

-- | A 'Parser' that corresponds to a type in the GraphQL schema. A 'Parser' is
-- really two things at once:
--
--   1. As its name implies, a 'Parser' can be used to parse GraphQL queries
--      (via 'runParser').
--
--   2. Less obviously, a 'Parser' represents a slice of the GraphQL schema,
--      since every 'Parser' corresponds to a particular GraphQL type, and
--      information about that type can be recovered (via 'parserType').
--
-- A natural way to view this is that 'Parser's support a sort of dynamic
-- reflection: in addition to running a 'Parser' on an input query, you can ask
-- it to tell you about what type of input it expects. Importantly, you can do
-- this even if you don’t have a query to parse; this is necessary to implement
-- GraphQL introspection, which provides precisely this sort of reflection on
-- types.
--
-- Another way of viewing a 'Parser' is a little more quantum: just as light
-- “sometimes behaves like a particle and sometimes behaves like a wave,” a
-- 'Parser' “sometimes behaves like a query parser and sometimes behaves like a
-- type.” In this way, you can think of a function that produces a 'Parser' as
-- simultaneously both a function that constructs a GraphQL schema and a
-- function that parses a GraphQL query. 'Parser' constructors therefore
-- interleave two concerns: information about a type definition (like the
-- type’s name and description) and information about how to parse a query on
-- that type.
--
-- Notably, these two concerns happen at totally different phases in the
-- program: GraphQL schema construction happens when @graphql-engine@ first
-- starts up, before it receives any GraphQL queries at all. But query parsing
-- obviously can’t happen until there is actually a query to parse. For that
-- reason, it’s useful to take care to distinguish which effects are happening
-- at which phase during 'Parser' construction, since otherwise you may get
-- mixed up!
--
-- For some more information about how to interpret the meaning of a 'Parser',
-- see Note [The meaning of Parser 'Output].
data Parser k m a = Parser
  { pType   :: ~(Type k)
  -- ^ Lazy for knot-tying reasons; see Note [Tying the knot] in
  -- Hasura.GraphQL.Parser.Class.
  , pParser :: ParserInput k -> m a
  } deriving (Functor)

parserType :: Parser k m a -> Type k
parserType = pType

runParser :: Parser k m a -> ParserInput k -> m a
runParser = pParser
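
The dual nature described above can be sketched with a toy model. This is purely illustrative and not the real types: `toyType` stands in for the structured `Type k` carried by `pType`, and the input here is a plain `String` rather than a GraphQL AST.

```haskell
-- A toy analogue of Parser: a schema-side description (here just a type
-- name) bundled with a query-side parse function. You can inspect the
-- expected type without any input to parse, mirroring how introspection
-- uses 'parserType'.
data ToyParser a = ToyParser
  { toyType   :: String            -- reflection: what input do we expect?
  , toyParser :: String -> Maybe a -- parsing: interpret an actual input
  }

toyInt :: ToyParser Int
toyInt = ToyParser "Int" parse
  where
    parse s
      | not (null s), all (`elem` ['0'..'9']) s = Just (read s)
      | otherwise                               = Nothing
```

Schema construction corresponds to consulting `toyType` (no input needed); query parsing corresponds to applying `toyParser` to an actual input.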

instance HasName (Parser k m a) where
  getName = getName . pType

instance HasDefinition (Parser k m a) (TypeInfo k) where
  definitionLens f parser = definitionLens f (pType parser) <&> \pType -> parser { pType }

type family ParserInput k where
  -- see Note [The 'Both kind] in Hasura.GraphQL.Parser.Schema
  ParserInput 'Both = InputValue Variable
  ParserInput 'Input = InputValue Variable
  -- see Note [The meaning of Parser 'Output]
  ParserInput 'Output = SelectionSet NoFragments Variable

{- Note [The meaning of Parser 'Output]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The ParserInput type family determines what a Parser accepts as input during
query parsing, which varies based on its Kind. A `Parser 'Input`,
unsurprisingly, parses GraphQL input values, much in the same way aeson
`Parser`s parse JSON values.

Therefore, one might naturally conclude that `Parser 'Output` ought to parse
GraphQL output values. But it doesn’t---a Parser is used to parse GraphQL
*queries*, and output values don’t show up in queries anywhere! Rather, the
output values are the results of executing the query, not something the user
sends us, so we don’t have to parse those at all.

What output types really correspond to in GraphQL queries is selection sets. For
example, if we have the GraphQL types

    type User {
      posts(filters: PostFilters): [Post]
    }

    input PostFilters {
      newer_than: Date
    }

    type Post {
      id: Int
      title: String
      body: String
    }

then we might receive a query that looks like this:

    query list_user_posts($user_id: Int, $date: Date) {
      user_by_id(id: $user_id) {
        posts(filters: {newer_than: $date}) {
          id
          title
        }
      }
    }

We have Parsers to represent each of these types: a `Parser 'Input` for
PostFilters, and two `Parser 'Output`s for User and Post. When we parse the
query, we pass the `{newer_than: $date}` input value to the PostFilters parser,
as expected. But what do we pass to the User parser? The answer is this
selection set:

    {
      posts(filters: {newer_than: $date}) {
        id
        title
      }
    }

Likewise, the Post parser eventually receives the inner selection set:

    {
      id
      title
    }

These Parsers handle interpreting the fields of the selection sets. This is why
`ParserInput 'Output` is SelectionSet---the GraphQL *type* associated with the
Parser is an output type, but the part of the *query* that corresponds to that
output type isn’t an output value but a selection set. -}

-- | The constraint @('Input '<:' k)@ entails
-- @('ParserInput' k ~ 'InputValue' 'Variable')@, but GHC can’t figure that out
-- on its own, so we have to be explicit to give it a little help.
inputParserInput :: forall k. 'Input <: k => ParserInput k :~: InputValue Variable
inputParserInput = case subKind @'Input @k of { KRefl -> Refl; KBoth -> Refl }

pInputParser :: forall k m a. 'Input <: k => Parser k m a -> InputValue Variable -> m a
pInputParser = gcastWith (inputParserInput @k) pParser

infixl 1 `bind`
bind :: Monad m => Parser k m a -> (a -> m b) -> Parser k m b
bind p f = p { pParser = pParser p >=> f }
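
`bind` simply post-composes a monadic step onto the parse function with `>=>`, leaving the schema side untouched. The same shape can be sketched on a stripped-down stand-in type (illustrative only; `P` is not the real `Parser`):

```haskell
import Control.Monad ((>=>))
import Text.Read (readMaybe)

-- A stripped-down stand-in for Parser: just the parse function, in Maybe.
newtype P a = P { runP :: String -> Maybe a }

-- The analogue of `bind`: chain a monadic post-processing step onto a
-- parser. Only the parse function changes; nothing else about the parser.
bindP :: P a -> (a -> Maybe b) -> P b
bindP p f = P (runP p >=> f)

pInt :: P Int
pInt = P readMaybe

-- Reject non-positive integers after parsing succeeds.
pPositive :: P Int
pPositive = pInt `bindP` \n -> if n > 0 then Just n else Nothing
```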

-- | Parses some collection of input fields. Build an 'InputFieldsParser' using
-- 'field', 'fieldWithDefault', or 'fieldOptional', combine several together
-- with the 'Applicative' instance, and finish it off using 'object' to turn it
-- into a 'Parser'.
data InputFieldsParser m a = InputFieldsParser
  -- Note: this is isomorphic to
  --     Compose ((,) [Definition (FieldInfo k)])
  --             (ReaderT (HashMap Name (FieldInput k)) m) a
  -- but working with that type sucks.
  { ifDefinitions :: [Definition InputFieldInfo]
  , ifParser      :: HashMap Name (InputValue Variable) -> m a
  } deriving (Functor)

infixl 1 `bindFields`
bindFields :: Monad m => InputFieldsParser m a -> (a -> m b) -> InputFieldsParser m b
bindFields p f = p { ifParser = ifParser p >=> f }

instance Applicative m => Applicative (InputFieldsParser m) where
  pure v = InputFieldsParser [] (const $ pure v)
  a <*> b = InputFieldsParser
    (ifDefinitions a <> ifDefinitions b)
    (liftA2 (<*>) (ifParser a) (ifParser b))
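
The `Applicative` instance concatenates the schema-side definitions while running both parsers against the same field map. A toy model of the same structure (illustrative only; the real type carries full `Definition`s and runs in an arbitrary monad `m` rather than `Maybe`):

```haskell
-- A toy analogue of InputFieldsParser: a list of field-name "definitions"
-- paired with a function that reads fields out of a key/value map.
data FP a = FP
  { defs :: [String]
  , run  :: [(String, Int)] -> Maybe a
  }

instance Functor FP where
  fmap f (FP d r) = FP d (fmap f . r)

instance Applicative FP where
  pure v = FP [] (const (Just v))
  FP d1 r1 <*> FP d2 r2 = FP (d1 <> d2) (\m -> r1 m <*> r2 m)

fieldP :: String -> FP Int
fieldP name = FP [name] (lookup name)

-- Combining two fields accumulates both definitions and both parsers.
pairP :: FP (Int, Int)
pairP = (,) <$> fieldP "limit" <*> fieldP "offset"
```

Combining parsers this way is what lets `object` later report the complete set of declared fields while parsing all of them in one pass.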

-- | A parser for a single field in a selection set. Build a 'FieldParser'
-- with 'selection' or 'subselection', and combine them together with
-- 'selectionSet' to obtain a 'Parser'.
data FieldParser m a = FieldParser
  { fDefinition :: Definition FieldInfo
  , fParser     :: Field NoFragments Variable -> m a
  } deriving (Functor)

infixl 1 `bindField`
bindField :: Monad m => FieldParser m a -> (a -> m b) -> FieldParser m b
bindField p f = p { fParser = fParser p >=> f }

-- | A single parsed field in a selection set.
data ParsedSelection a
  -- | An ordinary field.
  = SelectField a
  -- | The magical @__typename@ field, implicitly available on all objects as
  -- part of
  -- <http://spec.graphql.org/June2018/#sec-Type-Name-Introspection GraphQL introspection>.
  | SelectTypename Name
  deriving (Functor)

handleTypename :: (Name -> a) -> ParsedSelection a -> a
handleTypename _ (SelectField value)    = value
handleTypename f (SelectTypename name) = f name

-- -----------------------------------------------------------------------------
-- combinators

data ScalarRepresentation a where
  SRBoolean :: ScalarRepresentation Bool
  SRInt     :: ScalarRepresentation Int32
  SRFloat   :: ScalarRepresentation Double
  SRString  :: ScalarRepresentation Text

scalar
  :: MonadParse m
  => Name
  -> Maybe Description
  -> ScalarRepresentation a
  -> Parser 'Both m a
scalar name description representation = Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \v -> case representation of
      SRBoolean -> case v of
        GraphQLValue (VBoolean b) -> pure b
        JSONValue (A.Bool b)      -> pure b
        _                         -> typeMismatch name "a boolean" v
      SRInt -> case v of
        GraphQLValue (VInt i)  -> convertWith scientificToInteger $ fromInteger i
        JSONValue (A.Number n) -> convertWith scientificToInteger n
        _                      -> typeMismatch name "a 32-bit integer" v
      SRFloat -> case v of
        GraphQLValue (VFloat f) -> convertWith scientificToFloat f
        GraphQLValue (VInt i)   -> convertWith scientificToFloat $ fromInteger i
        JSONValue (A.Number n)  -> convertWith scientificToFloat n
        _                       -> typeMismatch name "a float" v
      SRString -> case v of
        GraphQLValue (VString s) -> pure s
        JSONValue (A.String s)   -> pure s
        _                        -> typeMismatch name "a string" v
  }
  where
    schemaType = NonNullable $ TNamed $ mkDefinition name description TIScalar
    convertWith f = either (parseErrorWith ParseFailed . qeError) pure . runAesonParser f

{- WIP NOTE (FIXME: make into an actual note by expanding on it a bit)

There's a delicate balance between GraphQL types and Postgres types.

The mapping is done in the 'column' parser. But we want to only have
one source of truth for parsing postgres values, which happens to be
the JSON parsing code in SQL.Value. So here we reuse some of that code
despite not having a JSON value.

-}

boolean :: MonadParse m => Parser 'Both m Bool
boolean = scalar boolScalar Nothing SRBoolean

int :: MonadParse m => Parser 'Both m Int32
int = scalar intScalar Nothing SRInt

float :: MonadParse m => Parser 'Both m Double
float = scalar floatScalar Nothing SRFloat

string :: MonadParse m => Parser 'Both m Text
string = scalar stringScalar Nothing SRString

-- | As an input type, any string or integer input value should be coerced
-- to ID as Text: https://spec.graphql.org/June2018/#sec-ID
identifier :: MonadParse m => Parser 'Both m Text
identifier = Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case
      GraphQLValue (VString  s) -> pure s
      GraphQLValue (VInt     i) -> pure $ T.pack $ show i
      JSONValue    (A.String s) -> pure s
      JSONValue    (A.Number n) -> parseScientific n
      v                         -> typeMismatch idName "a String or a 32-bit integer" v
  }
  where
    idName = idScalar
    schemaType = NonNullable $ TNamed $ mkDefinition idName Nothing TIScalar
    parseScientific = either (parseErrorWith ParseFailed . qeError)
      (pure . T.pack . show @Int) . runAesonParser scientificToInteger
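
The ID coercion rule used by `identifier` can be modeled in isolation: strings pass through, integers are rendered as text, and everything else is a type mismatch. This is a toy sketch only; the real parser works on GraphQL/JSON ASTs and reports proper parse errors rather than returning `Nothing`.

```haskell
-- A minimal model of GraphQL ID input coercion (spec §3.12 "ID"):
-- accept a string or an integer, coercing both to text.
data In = InString String | InInt Int | InBool Bool

coerceID :: In -> Maybe String
coerceID (InString s) = Just s
coerceID (InInt i)    = Just (show i)
coerceID _            = Nothing  -- anything else is a type mismatch
```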

namedJSON :: MonadParse m => Name -> Maybe Description -> Parser 'Both m A.Value
namedJSON name description = Parser
  { pType = schemaType
  , pParser = valueToJSON $ toGraphQLType schemaType
  }
  where
    schemaType = NonNullable $ TNamed $ mkDefinition name description TIScalar

json, jsonb :: MonadParse m => Parser 'Both m A.Value
json  = namedJSON $$(litName "json") Nothing
jsonb = namedJSON $$(litName "jsonb") Nothing

-- | Explicitly define any desired scalar type. This is unsafe because it does
-- not mark queries as unreusable when they should be.
unsafeRawScalar
  :: MonadParse n
  => Name
  -> Maybe Description
  -> Parser 'Both n (InputValue Variable)
unsafeRawScalar name description = Parser
  { pType = NonNullable $ TNamed $ mkDefinition name description TIScalar
  , pParser = pure
  }

enum
  :: MonadParse m
  => Name
  -> Maybe Description
  -> NonEmpty (Definition EnumValueInfo, a)
  -> Parser 'Both m a
enum name description values = Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case
      JSONValue (A.String stringValue)
        | Just enumValue <- mkName stringValue -> validate enumValue
      GraphQLValue (VEnum (EnumValue enumValue)) -> validate enumValue
      other -> typeMismatch name "an enum value" other
  }
  where
    schemaType = NonNullable $ TNamed $ mkDefinition name description $ TIEnum (fst <$> values)
    valuesMap = M.fromList $ over (traverse._1) dName $ toList values
    validate value = case M.lookup value valuesMap of
      Just result -> pure result
      Nothing -> parseError $ "expected one of the values "
        <> englishList "or" (dquoteTxt . dName . fst <$> values) <> " for type "
        <> name <<> ", but found " <>> value

nullable :: forall k m a. (MonadParse m, 'Input <: k) => Parser k m a -> Parser k m (Maybe a)
nullable parser = gcastWith (inputParserInput @k) Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case
      JSONValue    A.Null -> pure Nothing
      GraphQLValue VNull  -> pure Nothing
      value               -> Just <$> pParser parser value
  }
  where
    schemaType = nullableType $ pType parser

-- | Decorate a schema field as NON_NULL
nonNullableField :: forall m a. FieldParser m a -> FieldParser m a
nonNullableField (FieldParser (Definition n u d (FieldInfo as t)) p) =
  FieldParser (Definition n u d (FieldInfo as (nonNullableType t))) p

-- | Decorate a schema field as NULL
nullableField :: forall m a. FieldParser m a -> FieldParser m a
nullableField (FieldParser (Definition n u d (FieldInfo as t)) p) =
  FieldParser (Definition n u d (FieldInfo as (nullableType t))) p
{-
field = field
  { fDefinition = (fDefinition field)
      { dInfo = (dInfo (fDefinition field))
          { fType = nonNullableType (fType (dInfo (fDefinition field)))
          }
      }
  }
-}
-- | Decorate a schema output type as NON_NULL
nonNullableParser :: forall m a. Parser 'Output m a -> Parser 'Output m a
nonNullableParser parser = parser { pType = nonNullableType (pType parser) }

multiple :: Parser 'Output m a -> Parser 'Output m a
multiple parser = parser { pType = Nullable $ TList $ pType parser }

list :: forall k m a. (MonadParse m, 'Input <: k) => Parser k m a -> Parser k m [a]
list parser = gcastWith (inputParserInput @k) Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case
      GraphQLValue (VList values) -> for (zip [0..] values) \(index, value) ->
        withPath (++[Index index]) $ pParser parser $ GraphQLValue value
      JSONValue (A.Array values) -> for (zip [0..] $ toList values) \(index, value) ->
        withPath (++[Index index]) $ pParser parser $ JSONValue value
      -- List Input Coercion
      --
      -- According to section 3.11 of the GraphQL spec: iff the value
      -- passed as an input to a list type is not a list and not the
      -- null value, then the result of input coercion is a list of
      -- size one, where the single item value is the result of input
      -- coercion for the list’s item type on the provided value.
      --
      -- We need to explicitly test for VNull here, otherwise we could
      -- be returning `[null]` if the parser accepts a null value,
      -- which would contradict the spec.
      GraphQLValue VNull -> parseError "expected a list, but found null"
      JSONValue A.Null   -> parseError "expected a list, but found null"
      other -> fmap pure $ withPath (++[Index 0]) $ pParser parser other
  }
  where
    schemaType = NonNullable $ TList $ pType parser
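
The list input coercion rule handled above can be modeled in a few lines: lists parse element-wise, null is rejected outright, and any other single value is coerced to a one-element list. A toy sketch (illustrative only; the real code works on GraphQL/JSON ASTs and tracks error paths):

```haskell
-- A minimal model of GraphQL list input coercion (spec §3.11).
data V = VNull' | VInt' Int | VList' [V]
  deriving (Eq, Show)

parseIntV :: V -> Maybe Int
parseIntV (VInt' n) = Just n
parseIntV _         = Nothing

parseListOfInt :: V -> Maybe [Int]
parseListOfInt (VList' vs) = traverse parseIntV vs    -- element-wise
parseListOfInt VNull'      = Nothing                  -- "expected a list, but found null"
parseListOfInt other       = pure <$> parseIntV other -- singleton coercion
```

Note how the null case must be checked before the catch-all, exactly as in `list` above: otherwise a null-accepting element parser could yield `[null]`, contradicting the spec.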

object
  :: MonadParse m
  => Name
  -> Maybe Description
  -> InputFieldsParser m a
  -> Parser 'Input m a
object name description parser = Parser
  { pType = schemaType
  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case
      GraphQLValue (VObject fields) -> parseFields $ GraphQLValue <$> fields
      JSONValue (A.Object fields) -> do
        translatedFields <- M.fromList <$> for (M.toList fields) \(key, val) -> do
          name' <- mkName key `onNothing` parseError
            ("variable value contains object with key " <> key <<> ", which is not a legal GraphQL name")
          pure (name', JSONValue val)
        parseFields translatedFields
      other -> typeMismatch name "an object" other
  }
  where
    schemaType = NonNullable $ TNamed $ mkDefinition name description $
      TIInputObject (InputObjectInfo (ifDefinitions parser))
    fieldNames = S.fromList (dName <$> ifDefinitions parser)
    parseFields fields = do
      -- check for extraneous fields here, since the InputFieldsParser just
      -- handles parsing the fields it cares about
      for_ (M.keys fields) \fieldName ->
        unless (fieldName `S.member` fieldNames) $ withPath (++[Key (unName fieldName)]) $
          parseError $ "field " <> dquote fieldName <> " not found in type: " <> squote name
      ifParser parser fields

{- Note [Optional fields and nullability]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GraphQL conflates optional fields and nullability. A field of a GraphQL input
object (or an argument to a selection set field, which is really the same thing)
is optional if and only if its type is nullable. It’s worth fully spelling out
the implications here: if a field (or argument) is non-nullable, it /cannot/ be
omitted. So, for example, suppose we had a table type like this:

    type article {
      comments(limit: Int!): [comment!]!
    }

Since we made `limit` non-nullable, it is /illegal/ to omit the argument. You’d
/always/ have to provide some value---and that isn’t what we want, because the
row limit should be optional. We have no choice but to make it nullable:

    type article {
      comments(limit: Int): [comment!]!
    }

But this feels questionable. Should we really accept `null` values for `limit`?
That is, should this query be legal?

    {
      articles {
        comments(limit: null) { ... }
      }
    }

A tempting answer to that question is “yes”: we can just treat a `null` value
for any optional field as precisely equivalent to leaving the field off
entirely. That is, any field with no default value really just has a default
value of `null`. Unfortunately, this approach turns out to be a really bad idea.
It’s all too easy to write something like

    mutation delete_article_by_id($article_id: Int) {
      delete_articles(where: {id: {eq: $article_id}})
    }

then accidentally misspell `article_id` in the variables payload, and now you’ve
deleted all the articles in your database. Very bad.

So we’d really like to have a way to say “this field is optional, but `null` is
not a legal value,” and at first it seems like the GraphQL spec ties our hands.
Fortunately, there is a way out. The spec explicitly permits distinguishing
between the following two situations:

    comments { ... }
    comments(limit: null) { ... }

That is, the spec allows implementations to behave differently depending on
whether an argument was omitted or whether its value was `null`. This is spelled
out in a few different places in the spec, but §3.10 Input Objects
<http://spec.graphql.org/June2018/#sec-Input-Objects> is the most explicit:

> If the value `null` was provided for an input object field, and the field’s
> type is not a non‐null type, an entry in the coerced unordered map is given
> the value `null`. In other words, there is a semantic difference between the
> explicitly provided value `null` versus having not provided a value.

Note that this is only allowed for fields that don’t have any default value! If
the field were declared with an explicit `null` default value, like

    type article {
      comments(limit: Int = null): [comment!]!
    }

then it would not be legal to distinguish the two cases. Yes, this is all
terribly subtle.

Okay. So armed with that knowledge, what do we do about it? We offer three
different combinators for parsing input fields:

  1. `field` — Defines a field with no default value. The field’s nullability
     is taken directly from the nullability of the field’s value parser.
  2. `fieldOptional` — Defines a field with no default value that is always
     nullable. Returns Nothing if (and only if!) the field is omitted.
  3. `fieldWithDefault` — Defines a field with a default value.

The last of the three, `fieldWithDefault`, is actually the simplest. It
corresponds to a field with a default value, and the underlying value parser
will /always/ be called. If the field is omitted, the value parser is called
with the default value. (This makes it impossible to distinguish omitted fields
from those explicitly passed the default value, as mandated by the spec.) Use
`fieldWithDefault` for any field or argument with a non-`null` default value.

`field` is also fairly straightforward. It always calls its value parser, so if
the field is omitted, it calls it with a value of `null`. Notably, there is no
special handling for non-nullable fields, since the underlying parser will raise
an error in that case anyway. Use `field` for required fields, and combine
`field` with `nullable` for optional fields with a default value of `null`.

`fieldOptional` is the most interesting. Unlike `field` and `fieldWithDefault`,
`fieldOptional` does not call its underlying value parser if the field is not
provided; it simply returns Nothing. If a value /is/ provided, it is passed
along without modification. This yields an interesting interaction when the
value parser does not actually accept nulls, such as a parser like this:

    fieldOptional $$(litName "limit") Nothing int

This corresponds to the `limit` field from our original example. If the field is
omitted, the `int` parser is not called, and the field parser just returns
Nothing. But if a value of `null` is explicitly provided, it is forwarded to the
`int` parser, which then rejects it with a parse error, since it does not accept
nulls. This is exactly the behavior we want.

These semantics can appear confusing. We end up with a field with a nullable
type for which `null` is not a legal value! A strange interpretation of
“nullable”, indeed. But realize that the nullability really means “optional”,
and the behavior makes more sense.

As a final point, note that similar behavior can be obtained with
`fieldWithDefault`. The following creates a boolean field that defaults to
`false` and rejects `null` values:

    fieldWithDefault $$(litName "includeDeprecated") Nothing (VBoolean False) boolean

This is a perfectly reasonable thing to do for exactly the same rationale behind
the use of `fieldOptional` above. -}
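
The three behaviors described in the Note can be modeled over a simplified input representation, where a field is either absent (`Nothing`) or present with a value that may itself be null. This is a toy sketch only; the real combinators build `InputFieldsParser`s over GraphQL values.

```haskell
-- A minimal model of the field / fieldOptional / fieldWithDefault semantics.
data Val = Null | IntVal Int deriving (Eq, Show)

parseNonNullInt :: Val -> Maybe Int   -- a value parser that rejects null
parseNonNullInt (IntVal n) = Just n
parseNonNullInt Null       = Nothing

-- `field`: an omitted field behaves as if `null` were provided, so the
-- value parser decides whether that is acceptable.
fieldT :: Maybe Val -> Maybe Int
fieldT = parseNonNullInt . maybe Null id

-- `fieldOptional`: skip the value parser entirely when the field is
-- omitted, but forward an explicit null (which then fails here).
fieldOptionalT :: Maybe Val -> Maybe (Maybe Int)
fieldOptionalT Nothing  = Just Nothing
fieldOptionalT (Just v) = Just <$> parseNonNullInt v

-- `fieldWithDefault`: substitute the default when the field is omitted;
-- the value parser is always called.
fieldWithDefaultT :: Val -> Maybe Val -> Maybe Int
fieldWithDefaultT def = parseNonNullInt . maybe def id
```

The key contrast is `fieldOptionalT`: omission succeeds with `Nothing`, while an explicit null still reaches the null-rejecting value parser and fails, matching the “optional but never null” behavior the Note argues for.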
|
||||
|
||||
-- | Creates a parser for an input field. The field’s nullability is determined
|
||||
-- by the nullability of the given value parser; see Note [Optional fields and
|
||||
-- nullability] for more details.
|
||||
field
|
||||
:: (MonadParse m, 'Input <: k)
|
||||
=> Name
|
||||
-> Maybe Description
|
||||
-> Parser k m a
|
||||
-> InputFieldsParser m a
|
||||
field name description parser = case pType parser of
|
||||
NonNullable typ -> InputFieldsParser
|
||||
{ ifDefinitions = [mkDefinition name description $ IFRequired typ]
|
||||
, ifParser = \ values -> withPath (++[Key (unName name)]) do
|
||||
value <- onNothing (M.lookup name values) $
|
||||
parseError ("missing required field " <>> name)
|
||||
pInputParser parser value
|
||||
}
|
||||
-- nullable fields just have an implicit default value of `null`
|
||||
Nullable _ -> fieldWithDefault name description VNull parser
|
||||
|
||||
-- | Creates a parser for an input field with the given default value. The
-- resulting field will always be nullable, even if the underlying parser
-- rejects `null` values; see Note [Optional fields and nullability] for more
-- details.
fieldWithDefault
  :: (MonadParse m, 'Input <: k)
  => Name
  -> Maybe Description
  -> Value Void -- ^ default value
  -> Parser k m a
  -> InputFieldsParser m a
fieldWithDefault name description defaultValue parser = InputFieldsParser
  { ifDefinitions = [mkDefinition name description $ IFOptional (pType parser) (Just defaultValue)]
  , ifParser = M.lookup name >>> withPath (++[Key (unName name)]) . \case
      Just value -> peelVariableWith True expectedType value >>= parseValue expectedType
      Nothing    -> pInputParser parser $ GraphQLValue $ literal defaultValue
  }
  where
    expectedType = Just $ toGraphQLType $ pType parser
    parseValue _ value = pInputParser parser value

{-
FIXME!!!!
FIXME!!!!

parseValue expectedType value = case value of
  VVariable (var@Variable { vInfo, vValue }) -> do
    typeCheck expectedType var
    -- This case is tricky: if we get a nullable variable, we have to
    -- pessimistically mark the query non-reusable, regardless of its
    -- contents. Why? Well, suppose we have a type like
    --
    --     type Foo {
    --       bar(arg: Int = 42): String
    --     }
    --
    -- and suppose we receive the following query:
    --
    --     query blah($var: Int) {
    --       foo {
    --         bar(arg: $var)
    --       }
    --     }
    --
    -- Suppose no value is provided for $var, so it defaults to null. When
    -- we parse the arg field, we see it has a default value, so we
    -- substitute 42 for null and carry on. But now we’ve discarded the
    -- information that this value came from a variable at all, so if we
    -- cache the query plan, changes to the variable will be ignored, since
    -- we’ll always use 42!
    --
    -- Note that the problem doesn’t go away even if $var has a non-null
    -- value. In that case, we’d simply have flipped the problem around: now
    -- our cached query plan will do the wrong thing if $var *is* null,
    -- since we won’t know to substitute 42.
    --
    -- Theoretically, we could be smarter here: we could record a sort of
    -- “derived variable reference” that includes a new default value. But
    -- that would be more complicated, so for now we don’t do that.
    case vInfo of
      VIRequired _   -> pInputParser parser value
      VIOptional _ _ -> markNotReusable *> parseValue expectedType (literal vValue)
  VNull -> pInputParser parser $ literal defaultValue
  _     -> pInputParser parser value
-}

-- | Creates a parser for a nullable field with no default value. If the field
-- is omitted, the provided parser /will not be called/. This allows a field to
-- distinguish an omitted field from a field supplied with @null@ (which is
-- permitted by the GraphQL specification); see Note [Optional fields and
-- nullability] for more details.
--
-- If you want a field with a default value of @null@, combine 'field' with
-- 'nullable', instead.
fieldOptional
  :: (MonadParse m, 'Input <: k)
  => Name
  -> Maybe Description
  -> Parser k m a
  -> InputFieldsParser m (Maybe a)
fieldOptional name description parser = InputFieldsParser
  { ifDefinitions = [mkDefinition name description $
      IFOptional (nullableType $ pType parser) Nothing]
  , ifParser = M.lookup name >>> withPath (++[Key (unName name)]) .
      traverse (pInputParser parser <=< peelVariable expectedType)
  }
  where
    expectedType = Just $ toGraphQLType $ nullableType $ pType parser

-- | A variant of 'selectionSetObject' which doesn't implement any interfaces
selectionSet
  :: MonadParse m
  => Name
  -> Maybe Description
  -> [FieldParser m a]
  -> Parser 'Output m (OMap.InsOrdHashMap Name (ParsedSelection a))
selectionSet name desc fields = selectionSetObject name desc fields []

-- Should this rather take a non-empty `FieldParser` list?
-- See also Note [Selectability of tables].
selectionSetObject
  :: MonadParse m
  => Name
  -> Maybe Description
  -> [FieldParser m a]
  -- ^ Fields of this object, including any fields that are required from the
  -- interfaces that it implements. Note that we can't derive those fields from
  -- the list of interfaces (next argument), because the types of the fields of
  -- the object are only required to be *subtypes* of the types of the fields of
  -- the interfaces it implements.
  -> [Parser 'Output m b]
  -- ^ Interfaces implemented by this object;
  -- see Note [The interfaces story] in Hasura.GraphQL.Parser.Schema.
  -> Parser 'Output m (OMap.InsOrdHashMap Name (ParsedSelection a))
selectionSetObject name description parsers implementsInterfaces = Parser
  { pType = Nullable $ TNamed $ mkDefinition name description $
      TIObject $ ObjectInfo (map fDefinition parsers) interfaces
  , pParser = \input -> withPath (++[Key "selectionSet"]) do
      -- Not all fields have a selection set, but if they have one, it
      -- must contain at least one field. The GraphQL parser returns a
      -- list to represent this: an empty list indicates there was no
      -- selection set, as an empty set is rejected outright.
      -- Arguably, this would be better represented by a `Maybe
      -- (NonEmpty a)`.
      -- The parser can't know whether a given field needs a selection
      -- set or not; but if we're in this function, it means that yes:
      -- this field needs a selection set, and if none was provided,
      -- we must fail.
      when (null input) $
        parseError $ "missing selection set for " <>> name

      -- TODO(PDV) This probably accepts invalid queries, namely queries that use
      -- type names that do not exist.
      fields <- collectFields (name:parsedInterfaceNames) input
      for fields \selectionField@Field{ _fName, _fAlias } -> if
        | _fName == $$(litName "__typename") ->
            pure $ SelectTypename name
        | Just parser <- M.lookup _fName parserMap ->
            withPath (++[Key (unName _fName)]) $
              SelectField <$> parser selectionField
        | otherwise ->
            withPath (++[Key (unName _fName)]) $
              parseError $ "field " <> _fName <<> " not found in type: " <> squote name
  }
  where
    parserMap = parsers
      & map (\FieldParser{ fDefinition, fParser } -> (getName fDefinition, fParser))
      & M.fromList
    interfaces = mapMaybe (getInterfaceInfo . pType) implementsInterfaces
    parsedInterfaceNames = fmap getName interfaces

selectionSetInterface
  :: (MonadParse n, Traversable t)
  => Name
  -> Maybe Description
  -> [FieldParser n a]
  -- ^ Fields defined in this interface
  -> t (Parser 'Output n b)
  -- ^ Parsers for the object types that implement this interface; see
  -- Note [The interfaces story] in Hasura.GraphQL.Parser.Schema for details.
  -> Parser 'Output n (t b)
selectionSetInterface name description fields objectImplementations = Parser
  { pType = Nullable $ TNamed $ mkDefinition name description $
      TIInterface $ InterfaceInfo (map fDefinition fields) objects
  , pParser = \input -> for objectImplementations (($ input) . pParser)
  -- Note: This is somewhat suboptimal, since it parses a query against every
  -- possible object implementing this interface, possibly duplicating work for
  -- fields defined on the interface itself.
  --
  -- Furthermore, in our intended use case (Relay), based on a field argument,
  -- we can decide which object we are about to retrieve, so in theory we could
  -- save some work by only parsing against that object type. But it’s still
  -- useful to parse against all of them, since it checks the validity of any
  -- fragments on the other types.
  }
  where
    objects = catMaybes $ toList $ fmap (getObjectInfo . pType) objectImplementations

selectionSetUnion
  :: (MonadParse n, Traversable t)
  => Name
  -> Maybe Description
  -> t (Parser 'Output n b) -- ^ The member object types.
  -> Parser 'Output n (t b)
selectionSetUnion name description objectImplementations = Parser
  { pType = Nullable $ TNamed $ mkDefinition name description $
      TIUnion $ UnionInfo objects
  , pParser = \input -> for objectImplementations (($ input) . pParser)
  }
  where
    objects = catMaybes $ toList $ fmap (getObjectInfo . pType) objectImplementations

-- | An "escape hatch" that doesn't validate anything and just gives the
-- requested selection set. This is unsafe because it does not check the
-- selection set for validity.
unsafeRawParser
  :: forall m
   . MonadParse m
  => Type 'Output
  -> Parser 'Output m (SelectionSet NoFragments Variable)
unsafeRawParser tp = Parser
  { pType = tp
  , pParser = pure
  }

unsafeRawField
  :: forall m
   . MonadParse m
  => Definition FieldInfo
  -> FieldParser m (Field NoFragments Variable)
unsafeRawField def = FieldParser
  { fDefinition = def
  , fParser = pure
  }

-- | Builds a 'FieldParser' for a field that does not take a subselection set,
-- i.e. a field that returns a scalar or enum. The field’s type is taken from
-- the provided 'Parser', but the 'Parser' is not otherwise used.
--
-- See also Note [The delicate balance of GraphQL kinds] in "Hasura.GraphQL.Parser.Schema".
selection
  :: forall m a b
   . MonadParse m
  => Name
  -> Maybe Description
  -> InputFieldsParser m a -- ^ parser for the input arguments
  -> Parser 'Both m b -- ^ type of the result
  -> FieldParser m a
selection name description argumentsParser resultParser = FieldParser
  { fDefinition = mkDefinition name description $
      FieldInfo (ifDefinitions argumentsParser) (pType resultParser)
  , fParser = \Field{ _fArguments, _fSelectionSet } -> do
      unless (null _fSelectionSet) $
        parseError "unexpected subselection set for non-object field"
      -- check for extraneous arguments here, since the InputFieldsParser just
      -- handles parsing the fields it cares about
      for_ (M.keys _fArguments) \argumentName ->
        unless (argumentName `S.member` argumentNames) $
          parseError $ name <<> " has no argument named " <>> argumentName
      withPath (++[Key "args"]) $ ifParser argumentsParser $ GraphQLValue <$> _fArguments
  }
  where
    argumentNames = S.fromList (dName <$> ifDefinitions argumentsParser)

-- | Builds a 'FieldParser' for a field that takes a subselection set, i.e. a
-- field that returns an object.
--
-- See also Note [The delicate balance of GraphQL kinds] in "Hasura.GraphQL.Parser.Schema".
subselection
  :: forall m a b
   . MonadParse m
  => Name
  -> Maybe Description
  -> InputFieldsParser m a -- ^ parser for the input arguments
  -> Parser 'Output m b -- ^ parser for the subselection set
  -> FieldParser m (a, b)
subselection name description argumentsParser bodyParser = FieldParser
  { fDefinition = mkDefinition name description $
      FieldInfo (ifDefinitions argumentsParser) (pType bodyParser)
  , fParser = \Field{ _fArguments, _fSelectionSet } -> do
      -- check for extraneous arguments here, since the InputFieldsParser just
      -- handles parsing the fields it cares about
      for_ (M.keys _fArguments) \argumentName ->
        unless (argumentName `S.member` argumentNames) $
          parseError $ name <<> " has no argument named " <>> argumentName
      (,) <$> withPath (++[Key "args"]) (ifParser argumentsParser $ GraphQLValue <$> _fArguments)
          <*> pParser bodyParser _fSelectionSet
  }
  where
    argumentNames = S.fromList (dName <$> ifDefinitions argumentsParser)

-- | A shorthand for a 'selection' that takes no arguments.
selection_
  :: MonadParse m
  => Name
  -> Maybe Description
  -> Parser 'Both m a -- ^ type of the result
  -> FieldParser m ()
selection_ name description = selection name description (pure ())

-- | A shorthand for a 'subselection' that takes no arguments.
subselection_
  :: MonadParse m
  => Name
  -> Maybe Description
  -> Parser 'Output m a -- ^ parser for the subselection set
  -> FieldParser m a
subselection_ name description bodyParser =
  snd <$> subselection name description (pure ()) bodyParser


-- -----------------------------------------------------------------------------
-- helpers

valueToJSON :: MonadParse m => GType -> InputValue Variable -> m A.Value
valueToJSON expected = peelVariable (Just expected) >=> valueToJSON'
  where
    valueToJSON' = \case
      JSONValue    j -> pure j
      GraphQLValue g -> graphQLToJSON g
    graphQLToJSON = \case
      VNull               -> pure A.Null
      VInt i              -> pure $ A.toJSON i
      VFloat f            -> pure $ A.toJSON f
      VString t           -> pure $ A.toJSON t
      VBoolean b          -> pure $ A.toJSON b
      VEnum (EnumValue n) -> pure $ A.toJSON n
      VList values        -> A.toJSON <$> traverse graphQLToJSON values
      VObject objects     -> A.toJSON <$> traverse graphQLToJSON objects
      VVariable variable  -> valueToJSON' $ absurd <$> vValue variable

jsonToGraphQL :: (MonadError Text m) => A.Value -> m (Value Void)
jsonToGraphQL = \case
  A.Null        -> pure VNull
  A.Bool val    -> pure $ VBoolean val
  A.String val  -> pure $ VString val
  A.Number val  -> case toBoundedInteger val of
    Just intVal -> pure $ VInt $ fromIntegral @Int64 intVal
    Nothing     -> pure $ VFloat val
  A.Array vals  -> VList <$> traverse jsonToGraphQL (toList vals)
  A.Object vals -> VObject . M.fromList <$> for (M.toList vals) \(key, val) -> do
    graphQLName <- onNothing (mkName key) $ throwError $
      "variable value contains object with key " <> key <<> ", which is not a legal GraphQL name"
    (graphQLName,) <$> jsonToGraphQL val

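The number case above is the subtle one: a JSON number becomes a GraphQL `Int` only when it fits a bounded integer exactly, and a float otherwise. The real code uses `toBoundedInteger` on aeson's `Scientific` values; the following base-only sketch (types and names local to the example) approximates the same classification over `Double`.

```haskell
-- Minimal stand-in for the Int-vs-Float split performed by jsonToGraphQL.
data GVal = GInt Int | GFloat Double
  deriving (Eq, Show)

classifyNumber :: Double -> GVal
classifyNumber d =
  let (n, frac) = properFraction d :: (Int, Double)
      -- Only integral values that fit in Int become GInt, mirroring
      -- toBoundedInteger's bounds check.
      inRange = d >= fromIntegral (minBound :: Int)
             && d <= fromIntegral (maxBound :: Int)
  in if frac == 0 && inRange then GInt n else GFloat d
```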
peelVariable :: MonadParse m => Maybe GType -> InputValue Variable -> m (InputValue Variable)
peelVariable = peelVariableWith False

peelVariableWith :: MonadParse m => Bool -> Maybe GType -> InputValue Variable -> m (InputValue Variable)
peelVariableWith hasLocationDefaultValue expected = \case
  GraphQLValue (VVariable var) -> do
    onJust expected \locationType -> typeCheck hasLocationDefaultValue locationType var
    markNotReusable
    pure $ absurd <$> vValue var
  value -> pure value

typeCheck :: MonadParse m => Bool -> GType -> Variable -> m ()
typeCheck hasLocationDefaultValue locationType variable@Variable { vInfo, vType } =
  unless (isVariableUsageAllowed hasLocationDefaultValue locationType variable) $ parseError
    $ "variable " <> dquote (getName vInfo) <> " is declared as "
    <> showGT vType <> ", but used where "
    <> showGT locationType <> " is expected"

typeMismatch :: MonadParse m => Name -> Text -> InputValue Variable -> m a
typeMismatch name expected given = parseError $
  "expected " <> expected <> " for type " <> name <<> ", but found " <> describeValue given

describeValue :: InputValue Variable -> Text
describeValue = describeValueWith (describeValueWith absurd . vValue)

describeValueWith :: (var -> Text) -> InputValue var -> Text
describeValueWith describeVariable = \case
  JSONValue    jval -> describeJSON    jval
  GraphQLValue gval -> describeGraphQL gval
  where
    describeJSON = \case
      A.Null     -> "null"
      A.Bool _   -> "a boolean"
      A.String _ -> "a string"
      A.Number _ -> "a number"
      A.Array _  -> "a list"
      A.Object _ -> "an object"
    describeGraphQL = \case
      VVariable var -> describeVariable var
      VInt _        -> "an integer"
      VFloat _      -> "a float"
      VString _     -> "a string"
      VBoolean _    -> "a boolean"
      VNull         -> "null"
      VEnum _       -> "an enum value"
      VList _       -> "a list"
      VObject _     -> "an object"

-- | Checks whether the type of a variable is compatible with the type
-- at the location at which it is used. This is an implementation of
-- the function described in section 5.8.5 of the spec:
-- http://spec.graphql.org/June2018/#sec-All-Variable-Usages-are-Allowed
--
-- No input type coercion is allowed between variables: coercion
-- rules only apply when translating a value from a literal. It is
-- therefore not allowed to use an Int variable at a Float location,
-- despite the fact that it is legal to use an Int literal at a
-- Float location.
--
-- Furthermore, it's also worth noting that there's one tricky case
-- where we might allow a nullable variable at a non-nullable
-- location: when either side has a non-null default value. That's
-- because GraphQL conflates nullability and optional fields (see
-- Note [Optional fields and nullability] for more details).
isVariableUsageAllowed
  :: Bool     -- ^ does the location have a default value
  -> GType    -- ^ the location type
  -> Variable -- ^ the variable
  -> Bool
isVariableUsageAllowed hasLocationDefaultValue locationType variable
  | isNullable locationType       = areTypesCompatible locationType variableType
  | not $ isNullable variableType = areTypesCompatible locationType variableType
  | hasLocationDefaultValue       = areTypesCompatible locationType variableType
  | hasNonNullDefault variable    = areTypesCompatible locationType variableType
  | otherwise                     = False
  where
    areTypesCompatible = compareTypes `on` \case
      TypeNamed _ n -> TypeNamed (Nullability True) n
      TypeList  _ n -> TypeList  (Nullability True) n
    variableType = vType variable
    hasNonNullDefault = vInfo >>> \case
      VIRequired _       -> False
      VIOptional _ value -> value /= VNull
    compareTypes = curry \case
      (TypeList lNull lType, TypeList vNull vType)
        -> checkNull lNull vNull && areTypesCompatible lType vType
      (TypeNamed lNull lType, TypeNamed vNull vType)
        -> checkNull lNull vNull && lType == vType
      _ -> False
    checkNull (Nullability expectedNull) (Nullability actualNull) =
      expectedNull || not actualNull
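The structural comparison above can be sketched as a standalone program (all names here are local to the sketch, not the engine's API): a non-nullable location accepts only a non-nullable variable, a nullable location accepts either, and list element types are compared with outer nullability stripped, mirroring `areTypesCompatible`.

```haskell
-- Bool field records whether the type is nullable.
data GTy = Named Bool String | List Bool GTy
  deriving (Eq, Show)

-- Strip outer nullability, mirroring areTypesCompatible's normalization.
setNullable :: GTy -> GTy
setNullable (Named _ n) = Named True n
setNullable (List  _ t) = List  True t

-- A nullable location (expected) accepts any variable; a non-nullable
-- location requires a non-nullable variable.
checkNull :: Bool -> Bool -> Bool
checkNull expectedNull actualNull = expectedNull || not actualNull

compatible :: GTy -> GTy -> Bool
compatible (Named lNull lName) (Named vNull vName) =
  checkNull lNull vNull && lName == vName
compatible (List lNull lTy) (List vNull vTy) =
  checkNull lNull vNull && compatible (setNullable lTy) (setNullable vTy)
compatible _ _ = False
```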
server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs-boot (new file, 23 lines)
@@ -0,0 +1,23 @@
module Hasura.GraphQL.Parser.Internal.Parser where

import Hasura.Prelude

import qualified Data.Kind as K

import Language.GraphQL.Draft.Syntax

import {-# SOURCE #-} Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Parser.Schema

type role Parser nominal representational nominal
data Parser (k :: Kind) (m :: K.Type -> K.Type) (a :: K.Type)

runParser :: Parser k m a -> ParserInput k -> m a

type family ParserInput k where
  ParserInput 'Both   = InputValue Variable
  ParserInput 'Input  = InputValue Variable
  ParserInput 'Output = SelectionSet NoFragments Variable

boolean :: MonadParse m => Parser 'Both m Bool
server/src-lib/Hasura/GraphQL/Parser/Monad.hs (new file, 192 lines)
@@ -0,0 +1,192 @@
{-# LANGUAGE StrictData #-}

-- | Monad transformers for GraphQL schema construction and query parsing.
module Hasura.GraphQL.Parser.Monad
  ( SchemaT
  , runSchemaT

  , ParseT
  , runParseT
  , ParseError(..)
  ) where

import Hasura.Prelude

import qualified Data.Dependent.Map     as DM
import qualified Data.Kind              as K
import qualified Data.Sequence.NonEmpty as NE
import qualified Language.Haskell.TH    as TH

import Control.Monad.Unique
import Control.Monad.Validate
import Data.Dependent.Map (DMap)
import Data.GADT.Compare.Extended
import Data.IORef
import Data.Parser.JSONPath
import Data.Proxy (Proxy (..))
import System.IO.Unsafe (unsafeInterleaveIO)
import Type.Reflection ((:~:) (..), Typeable, typeRep)

import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Parser.Internal.Parser
import Hasura.GraphQL.Parser.Schema
import Hasura.RQL.Types.Error (Code)

-- -------------------------------------------------------------------------------------------------
-- schema construction

newtype SchemaT n m a = SchemaT
  { unSchemaT :: StateT (DMap ParserId (ParserById n)) m a
  } deriving (Functor, Applicative, Monad, MonadError e)

runSchemaT :: forall m n a . Monad m => SchemaT n m a -> m a
runSchemaT = flip evalStateT mempty . unSchemaT

-- | see Note [SchemaT requires MonadIO]
instance (MonadIO m, MonadUnique m, MonadParse n)
      => MonadSchema n (SchemaT n m) where
  memoizeOn name key buildParser = SchemaT do
    let parserId = ParserId name key
    parsersById <- get
    case DM.lookup parserId parsersById of
      Just (ParserById parser) -> pure parser
      Nothing -> do
        -- We manually do eager blackholing here using a MutVar rather than
        -- relying on MonadFix and ordinary thunk blackholing. Why? A few
        -- reasons:
        --
        --   1. We have more control. We aren’t at the whims of whatever
        --      MonadFix instance happens to get used.
        --
        --   2. We can be more precise. GHC’s lazy blackholing doesn’t always
        --      kick in when you’d expect.
        --
        --   3. We can provide more useful error reporting if things go wrong.
        --      Most usefully, we can include a HasCallStack source location.
        cell <- liftIO $ newIORef Nothing

        -- We use unsafeInterleaveIO here, which sounds scary, but
        -- unsafeInterleaveIO is actually far more safe than unsafePerformIO.
        -- unsafeInterleaveIO just defers the execution of the action until its
        -- result is needed, adding some laziness.
        --
        -- That laziness can be dangerous if the action has side-effects, since
        -- the point at which the effect is performed can be unpredictable. But
        -- this action just reads, never writes, so that isn’t a concern.
        parserById <- liftIO $ unsafeInterleaveIO $ readIORef cell >>= \case
          Just parser -> pure $ ParserById parser
          Nothing -> error $ unlines
            [ "memoize: parser was forced before being fully constructed"
            , "  parser constructor: " ++ TH.pprint name ]
        put $! DM.insert parserId parserById parsersById

        unique <- newUnique
        parser <- addDefinitionUnique unique <$> unSchemaT buildParser
        liftIO $ writeIORef cell (Just parser)
        pure parser

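The blackholing trick in `memoizeOn` can be demonstrated in isolation with a base-only sketch (the names here are local to the example): a cell is created before the construction runs, `unsafeInterleaveIO` defers the read so a consumer who forces the memoized value too early gets a clear error instead of a hang, and the cell is filled once construction completes.

```haskell
import Data.IORef
import System.IO.Unsafe (unsafeInterleaveIO)

-- Returns (deferred view, constructed value). The deferred view is safe to
-- hand out before `build` runs, as long as nobody forces it until after.
memoCell :: IO a -> IO (a, a)
memoCell build = do
  cell <- newIORef Nothing
  -- Deferred read: only executes when the first component is demanded.
  deferred <- unsafeInterleaveIO $ readIORef cell >>= \mv -> case mv of
    Just v  -> pure v
    Nothing -> error "memoCell: value forced before construction finished"
  v <- build
  writeIORef cell (Just v)
  pure (deferred, v)
```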
-- We can add a reader in two places. I'm not sure which one is the correct
-- one. But since we don't seem to change the values that are being read, I
-- don't think it matters.

deriving instance Monad m => MonadReader a (SchemaT n (ReaderT a m))

instance (MonadIO m, MonadUnique m, MonadParse n)
      => MonadSchema n (ReaderT a (SchemaT n m)) where
  memoizeOn name key = mapReaderT (memoizeOn name key)

{- Note [SchemaT requires MonadIO]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The MonadSchema instance for SchemaT requires MonadIO, which is unsatisfying.
The only reason the constraint is needed is to implement knot-tying via IORefs
(see Note [Tying the knot] in Hasura.GraphQL.Parser.Class), which really only
requires the power of ST. Using ST would be much nicer, since we could discharge
the burden locally, but unfortunately we also want to use MonadUnique, which
is handled by IO in the end.

This means that we need IO at the base of our monad, so to use STRefs, we’d need
a hypothetical STT transformer (i.e. a monad transformer version of ST). But
such a thing isn’t safe in general, since reentrant monads like ListT or ContT
would incorrectly share state between the different threads of execution.

In theory, this can be resolved by using something like Vault (from the vault
package) to create “splittable” sets of variable references. That would allow
you to create a transformer with an STRef-like interface that works over any
arbitrary monad. However, while the interface would be safe, the implementation
of such an abstraction requires unsafe primitives, and to the best of my
knowledge no such transformer exists in any existing libraries.

So we decide it isn’t worth the trouble and just use MonadIO. If `eff` ever pans
out, it should be able to support this more naturally, so we can fix it then. -}

-- | A key used to distinguish calls to 'memoize'd functions. The 'TH.Name'
-- distinguishes calls to completely different parsers, and the @a@ value
-- records the arguments.
data ParserId (t :: (Kind, K.Type)) where
  ParserId :: (Ord a, Typeable a, Typeable b, Typeable k) => TH.Name -> a -> ParserId '(k, b)

instance GEq ParserId where
  geq (ParserId name1 (arg1 :: a1) :: ParserId t1)
      (ParserId name2 (arg2 :: a2) :: ParserId t2)
    | _ :: Proxy '(k1, b1) <- Proxy @t1
    , _ :: Proxy '(k2, b2) <- Proxy @t2
    , name1 == name2
    , Just Refl <- typeRep @a1 `geq` typeRep @a2
    , arg1 == arg2
    , Just Refl <- typeRep @k1 `geq` typeRep @k2
    , Just Refl <- typeRep @b1 `geq` typeRep @b2
    = Just Refl
    | otherwise = Nothing

instance GCompare ParserId where
  gcompare (ParserId name1 (arg1 :: a1) :: ParserId t1)
           (ParserId name2 (arg2 :: a2) :: ParserId t2)
    | _ :: Proxy '(k1, b1) <- Proxy @t1
    , _ :: Proxy '(k2, b2) <- Proxy @t2
    = strengthenOrdering (compare name1 name2)
      `extendGOrdering` gcompare (typeRep @a1) (typeRep @a2)
      `extendGOrdering` strengthenOrdering (compare arg1 arg2)
      `extendGOrdering` gcompare (typeRep @k1) (typeRep @k2)
      `extendGOrdering` gcompare (typeRep @b1) (typeRep @b2)
      `extendGOrdering` GEQ

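The core idea behind these instances, comparing values of different types by first comparing their runtime type representations, can be shown with a base-only sketch (the `Key`/`keyEq` names are local to the example, and it uses base's `eqTypeRep` where the instances above use `geq` from dependent-sum):

```haskell
{-# LANGUAGE GADTs, ScopedTypeVariables, TypeApplications #-}
import Data.Type.Equality ((:~~:) (..))
import Type.Reflection (Typeable, eqTypeRep, typeRep)

-- A heterogeneous key: the payload type is hidden behind the constructor.
data Key where
  Key :: (Eq a, Typeable a) => a -> Key

-- Two keys are equal only if their TypeReps match (yielding an equality
-- proof HRefl that lets us compare the payloads) and the payloads agree.
keyEq :: Key -> Key -> Bool
keyEq (Key (a :: a)) (Key (b :: b)) =
  case eqTypeRep (typeRep @a) (typeRep @b) of
    Just HRefl -> a == b   -- types proven equal, so (==) applies
    Nothing    -> False
```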
-- | A newtype wrapper around a 'Parser' that rearranges the type parameters
-- so that it can be indexed by a 'ParserId' in a 'DMap'.
--
-- This is really just a single newtype, but it’s implemented as a data family
-- because GHC doesn’t allow ordinary datatype declarations to pattern-match on
-- type parameters, and we want to match on the tuple.
data family ParserById (m :: K.Type -> K.Type) (a :: (Kind, K.Type))
newtype instance ParserById m '(k, a) = ParserById (Parser k m a)

-- -------------------------------------------------------------------------------------------------
-- query parsing

newtype ParseT m a = ParseT
  { unParseT :: ReaderT JSONPath (StateT QueryReusability (ValidateT (NESeq ParseError) m)) a
  } deriving (Functor, Applicative, Monad)

runParseT
  :: Functor m
  => ParseT m a
  -> m (Either (NESeq ParseError) (a, QueryReusability))
runParseT = unParseT
  >>> flip runReaderT []
  >>> flip runStateT mempty
  >>> runValidateT

instance MonadTrans ParseT where
  lift = ParseT . lift . lift . lift

instance Monad m => MonadParse (ParseT m) where
  withPath f x = ParseT $ withReaderT f $ unParseT x
  parseErrorWith code text = ParseT $ do
    path <- ask
    lift $ refute $ NE.singleton ParseError{ peCode = code, pePath = path, peMessage = text }
  markNotReusable = ParseT $ lift $ put NotReusable

data ParseError = ParseError
  { pePath    :: JSONPath
  , peMessage :: Text
  , peCode    :: Code
  }
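The shape of this stack, a reader for the current path, state for reusability, and an error layer that captures the path at the failure site, can be reproduced with only the transformers library that ships with GHC. This is a standalone sketch, not the engine's API: `ExceptT` stands in for `ValidateT`, and all names are local to the example.

```haskell
import Control.Monad.Trans.Class  (lift)
import Control.Monad.Trans.Except (ExceptT, runExceptT, throwE)
import Control.Monad.Trans.Reader (ReaderT, ask, local, runReaderT)
import Control.Monad.Trans.State  (StateT, put, runStateT)

-- Path segments play the role of JSONPath; Bool plays QueryReusability.
type MiniParse = ReaderT [String] (StateT Bool (ExceptT ([String], String) IO))

-- Analogue of parseErrorWith: report the path recorded at the failure site.
failAt :: String -> MiniParse a
failAt msg = do
  path <- ask
  lift $ lift $ throwE (path, msg)

-- Analogue of markNotReusable.
markStale :: MiniParse ()
markStale = lift (put False)

-- Analogue of runParseT: peel the layers in the same order.
runMiniParse :: MiniParse a -> IO (Either ([String], String) (a, Bool))
runMiniParse p = runExceptT (runStateT (runReaderT p []) True)
```

Extending the path for a sub-parser is just `local (++ ["args"])`, mirroring `withPath` above.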
server/src-lib/Hasura/GraphQL/Parser/Schema.hs (new file, 803 lines)
@@ -0,0 +1,803 @@
{-# LANGUAGE StrictData #-}

-- | Types for representing a GraphQL schema.
module Hasura.GraphQL.Parser.Schema (
  -- * Kinds
    Kind(..)
  , (:<:)(..)
  , type (<:)(..)

  -- * Types
  , Type(..)
  , NonNullableType(..)
  , TypeInfo(..)
  , SomeTypeInfo(..)
  , eqType
  , eqNonNullableType
  , eqTypeInfo
  , discardNullability
  , nullableType
  , nonNullableType
  , toGraphQLType
  , getObjectInfo
  , getInterfaceInfo

  , EnumValueInfo(..)
  , InputFieldInfo(..)
  , FieldInfo(..)
  , InputObjectInfo(..)
  , ObjectInfo(..)
  , InterfaceInfo(..)
  , UnionInfo(..)

  -- * Definitions
  , Definition(..)
  , mkDefinition
  , addDefinitionUnique
  , HasDefinition(..)

  -- * Schemas
  , Schema(..)
  , ConflictingDefinitions(..)
  , HasTypeDefinitions(..)
  , collectTypeDefinitions

  -- * Miscellany
  , HasName(..)
  , InputValue(..)
  , Variable(..)
  , VariableInfo(..)
  , DirectiveInfo(..)
  ) where

import Hasura.Prelude

import qualified Data.Aeson                   as J
import qualified Data.HashMap.Strict.Extended as Map
import qualified Data.HashSet                 as Set
import Data.Hashable (Hashable (..))

import Control.Lens.Extended
import Control.Monad.Unique
import Data.Functor.Classes
import Language.GraphQL.Draft.Syntax ( Description (..), Name (..)
                                     , Value (..), Nullability (..)
                                     , GType (..), DirectiveLocation (..)
                                     )

class HasName a where
  getName :: a -> Name
instance HasName Name where
  getName = id

-- | GraphQL types are divided into two classes: input types and output types.
-- The GraphQL spec does not use the word “kind” to describe these classes, but
-- it’s an apt term.
--
-- Some GraphQL types can be used at either kind, so we also include the 'Both'
-- kind, the superkind of both 'Input' and 'Output'. The '<:' class provides
-- kind subsumption constraints.
--
-- For more details, see <http://spec.graphql.org/June2018/#sec-Input-and-Output-Types>.
data Kind
  = Both   -- ^ see Note [The 'Both kind]
  | Input
  | Output

{- Note [The 'Both kind]
~~~~~~~~~~~~~~~~~~~~~~~~
As described in the Haddock comments for Kind and <:, we use Kind to index
various types, such as Type and Parser. We use this to enforce various
correctness constraints mandated by the GraphQL spec; for example, we don’t
allow input object fields to have output types and we don’t allow output object
fields to have input types.

But scalars and enums can be used as input types *or* output types. A natural
encoding of that in Haskell would be to make constructors for those types
polymorphic, like this:

    data Kind = Input | Output

    data TypeInfo k where
      TIScalar      :: TypeInfo k              -- \ Polymorphic!
      TIEnum        :: ... -> TypeInfo k       -- /
      TIInputObject :: ... -> TypeInfo 'Input
      TIObject      :: ... -> TypeInfo 'Output

Naturally, this would give the `scalar` parser constructor a similarly
polymorphic type:

    scalar
      :: MonadParse m
      => Name
      -> Maybe Description
      -> ScalarRepresentation a
      -> Parser k m a -- Polymorphic!

But if we actually try that, we run into problems. The trouble is that we want
to use the Kind to influence several different things:

  * As mentioned above, we use it to ensure that the types we generate are
    well-kinded according to the GraphQL spec rules.

  * We use it to determine what a Parser consumes as input. Parsers for input
    types parse GraphQL input values, but Parsers for output types parse
    selection sets. (See Note [The meaning of Parser 'Output] in
    Hasura.GraphQL.Parser.Internal.Parser for an explanation of why.)

  * We use it to know when to expect a sub-selection set for a field of an
    output object (see Note [The delicate balance of GraphQL kinds]).

These many uses of Kind cause some trouble for a polymorphic representation. For
example, consider our `scalar` parser constructor above---if we were to
instantiate it at kind 'Output, we’d receive a `Parser 'Output`, which we would
then expect to be able to apply to a selection set. But that doesn’t make any
sense, since scalar fields don’t have selection sets!

Another issue with this representation has to do with effectful parser
constructors (such as constructors that can throw errors). These have types like

    mkFooParser :: MonadSchema n m => Blah -> m (Parser k n Foo)

where the parser construction is itself monadic. This causes some annoyance,
since even if mkFooParser returns a Parser of a polymorphic kind, code like this
will not typecheck:

    (fooParser :: forall k. Parser k n Foo) <- mkFooParser blah

The issue is that we have to instantiate k to a particular type to be able to
call mkFooParser. If we want to use the result at both kinds, we’d have to call
mkFooParser twice:

    (fooInputParser  :: Parser 'Input  n Foo) <- mkFooParser blah
    (fooOutputParser :: Parser 'Output n Foo) <- mkFooParser blah

Other situations encounter similar difficulties, and they are not easy to
resolve without impredicative polymorphism (which GHC does not support).

To avoid this problem, we don’t use polymorphic kinds, but instead introduce a
form of kind subsumption. Types that can be used as both input and output types
are explicitly given the kind 'Both. This allows us to get the best of both
worlds:

  * We use the <: typeclass to accept 'Both in most places where we expect
    either input or output types.

  * We can treat 'Both specially to avoid requiring `scalar` to supply a
    selection set parser (see Note [The delicate balance of GraphQL kinds] for
    further explanation).

  * Because we avoid the polymorphism, we don’t run into the aforementioned
    issue with monadic parser constructors.

All of this is subtle and somewhat complicated, but unfortunately there isn’t
much of a way around that: GraphQL is subtle and complicated. Our use of an
explicit 'Both kind isn’t the only way to encode these things, but it’s the
particular set of compromises we’ve chosen to accept.

Note [The delicate balance of GraphQL kinds]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
As discussed in Note [The 'Both kind], we use GraphQL kinds to distinguish
several different things. One of them is which output types take sub-selection
sets. For example, scalars don’t accept sub-selection sets, so if we have a
schema like

    type Query {
      users: [User!]!
    }

    type User {
      id: Int!
    }

then the following query is illegal:

    query {
      users {
        id {
          blah
        }
      }
    }

The id field has a scalar type, so it should not take a sub-selection set. This
is actually something we care about distinguishing at the type level, because it
affects the type of the `selection` parser combinator. Suppose we have a
`Parser 'Output m UserQuery` for the User type. When we parse a field with that
type, we expect to receive a UserQuery as a result, unsurprisingly. But what if
we parse an output field using the `int` parser, which has this type:

    int :: MonadParse m => Parser 'Both m Int32

If we follow the same logic as for the User parser above, we’d expect to receive
an Int32 as a result... but that doesn’t make any sense, since the Int32
corresponds to the result *we* are supposed to produce as a result of executing
the query, not something user-specified.

One way to solve this would be to associate every Parser with two result types:
one when given an input object, and one when given a selection set. Then our
parsers could be given these types, instead:

    user :: MonadParse m => Parser 'Output m Void UserQuery
    int  :: MonadParse m => Parser 'Both   m Int32 ()

But if you work through this, you’ll find that *all* parsers will have either
Void or () for at least one of their input result types or their output result
types, depending on their kind:

  * All 'Input parsers must have Void for their output result type, since they
    aren’t allowed to be used in output contexts at all.

  * All 'Output parsers must have Void for their input result type, since they
    aren’t allowed to be used in input contexts at all.

  * That just leaves 'Both. The only types of kind 'Both are scalars and enums,
    neither of which accept a sub-selection set. Their output result type would
    therefore be (), since they are allowed to appear in output contexts, but
    they don’t return any results.

The end result of this is that we clutter all our types with Voids and ()s, with
little actual benefit.

If you really think about it, the fact that no types of kind 'Both accept a
sub-selection set is really something of a coincidence. In theory, one could
imagine a future version of the GraphQL spec adding a type that can be used as
both an input type and an output type, but accepts a sub-selection set. If that
ever happens, we’ll have to tweak our encoding, but for now, we can take
advantage of this happy coincidence and make the kinds serve double duty:

  * We can make `ParserInput 'Both` identical to `ParserInput 'Input`, since
    all parsers of kind 'Both only parse input values.

  * We can require types of kind 'Both in `selection`, which does not expect a
    sub-selection set, and types of kind 'Output in `subselection`, which does.

Relying on this coincidence might seem a little gross, and perhaps it is
somewhat. But it’s enormously convenient: not doing this would make some types
significantly more complicated, since we would have to thread around more
information at the type level and we couldn’t make as many simplifying
assumptions. So until GraphQL adds a type that violates these assumptions, we
are happy to take advantage of this coincidence. -}

-- | Evidence for '<:'.
data k1 :<: k2 where
  KRefl :: k :<: k
  KBoth :: k :<: 'Both

-- | 'Kind' subsumption. The GraphQL kind hierarchy is extremely simple:
--
-- >     Both
-- >    /    \
-- > Input  Output
--
-- Various functions in this module use '<:' to allow 'Both' to be used in
-- places where 'Input' or 'Output' would otherwise be expected.
class k1 <: k2 where
  subKind :: k1 :<: k2
instance k1 ~ k2 => k1 <: k2 where
  subKind = KRefl
instance {-# OVERLAPPING #-} k <: 'Both where
  subKind = KBoth

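{- A small usage sketch (illustrative only, not part of the original module):
code constrained by '<:' can recover the concrete kind by pattern-matching on
the 'subKind' evidence, e.g. (with ScopedTypeVariables and TypeApplications):

    kindName :: forall k. ('Input <: k) => String
    kindName = case subKind @'Input @k of
      KRefl -> "Input" -- matching KRefl refines k ~ 'Input
      KBoth -> "Both"  -- matching KBoth refines k ~ 'Both

Matching on KRefl or KBoth brings the corresponding type equality into scope,
which is how constructors such as 'IFRequired' below, whose kind is
existentially quantified, can later be consumed at a known kind. -}
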
data Type k
  = NonNullable (NonNullableType k)
  | Nullable (NonNullableType k)

instance Eq (Type k) where
  (==) = eqType

-- | Like '==', but can compare 'Type's of different kinds.
eqType :: Type k1 -> Type k2 -> Bool
eqType (NonNullable a) (NonNullable b) = eqNonNullableType a b
eqType (Nullable    a) (Nullable    b) = eqNonNullableType a b
eqType _ _ = False

instance HasName (Type k) where
  getName = getName . discardNullability

instance HasDefinition (Type k) (TypeInfo k) where
  definitionLens f (NonNullable t) = NonNullable <$> definitionLens f t
  definitionLens f (Nullable t)    = Nullable <$> definitionLens f t

discardNullability :: Type k -> NonNullableType k
discardNullability (NonNullable t) = t
discardNullability (Nullable t)    = t

nullableType :: Type k -> Type k
nullableType = Nullable . discardNullability

nonNullableType :: Type k -> Type k
nonNullableType = NonNullable . discardNullability

data NonNullableType k
  = TNamed (Definition (TypeInfo k))
  | TList (Type k)

instance Eq (NonNullableType k) where
  (==) = eqNonNullableType

toGraphQLType :: Type k -> GType
toGraphQLType = \case
    NonNullable t -> translateWith False t
    Nullable    t -> translateWith True  t
  where
    translateWith nullability = \case
      TNamed typeInfo -> TypeNamed (Nullability nullability) $ getName typeInfo
      TList  typeInfo -> TypeList  (Nullability nullability) $ toGraphQLType typeInfo

-- | Like '==', but can compare 'NonNullableType's of different kinds.
eqNonNullableType :: NonNullableType k1 -> NonNullableType k2 -> Bool
eqNonNullableType (TNamed a) (TNamed b) = liftEq eqTypeInfo a b
eqNonNullableType (TList  a) (TList  b) = eqType a b
eqNonNullableType _ _ = False

instance HasName (NonNullableType k) where
  getName (TNamed definition) = getName definition
  getName (TList t)           = getName t

instance HasDefinition (NonNullableType k) (TypeInfo k) where
  definitionLens f (TNamed definition) = TNamed <$> f definition
  definitionLens f (TList t)           = TList <$> definitionLens f t

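{- For intuition (an illustrative sketch; `intDef` is a hypothetical
`Definition (TypeInfo 'Both)` for the Int scalar, not a binding in this
module), 'toGraphQLType' maps the internal nullability encoding onto GraphQL
type syntax roughly as follows:

    NonNullable (TNamed intDef)                        ~>  Int!
    Nullable    (TNamed intDef)                        ~>  Int
    NonNullable (TList (NonNullable (TNamed intDef)))  ~>  [Int!]!
    Nullable    (TList (Nullable    (TNamed intDef)))  ~>  [Int]

Note that the Bool passed to 'Nullability' records *nullability*, which is why
'NonNullable' translates with False and 'Nullable' with True. -}
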
{- Note [The interfaces story]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GraphQL interfaces are not conceptually complicated, but they pose some
non-obvious challenges for our implementation. First, familiarize yourself with
GraphQL interfaces themselves:

  * https://graphql.org/learn/schema/#interfaces
  * http://spec.graphql.org/June2018/#sec-Interfaces
  * http://spec.graphql.org/June2018/#sec-Objects

The most logical representation of object and interface types is to have objects
reference the interfaces they implement, but not the other way around. After
all, that’s how it works in the GraphQL language: when you declare an interface,
you just specify its fields, and you specify which interfaces each object type
implements as part of their declarations.

However, this representation is actually not very useful for us. We /also/ need
the interfaces to reference the objects that implement them---forming a circular
structure---for two reasons:

  1. Most directly, we need this information for introspection queries.
     Introspection queries for object types return the set of interfaces they
     implement <http://spec.graphql.org/June2018/#sec-Object>, and introspection
     queries for interfaces return the set of object types that implement them
     <http://spec.graphql.org/June2018/#sec-Interface>.

  2. Less obviously, it’s more natural to specify the relationships “backwards”
     like this when building the schema using the parser combinator language.

     From the parser’s point of view, each implementation of an interface
     corresponds to a distinct parsing possibility. For example, when we
     generate a Relay schema, the type of the `node` root field is an interface,
     and each table is a type that implements it:

         type query_root {
           node(id: ID!): Node
           ...
         }

         interface Node {
           id: ID!
         }

         type author implements Node {
           id: ID!
           name: String!
           ...
         }

         type article implements Node {
           id: ID!
           title: String!
           body: String!
           ...
         }

     A query will use fragments on the Node type to access table-specific fields:

         query get_article_info($article_id: ID!) {
           node(id: $article_id) {
             ... on article {
               title
               body
             }
           }
         }

     The query parser needs to know which types implement the interface (and
     how to parse their selection sets) so that it can parse the fragments.

This presents some complications, since we need to build this information in a
circular fashion. Currently, we do this in a very naïve way:

  * We require selectionSetObject to specify the interfaces it implements /and/
    require selectionSetInterface to specify the objects that implement it.

  * We take advantage of our existing memoization mechanism to do the knot-tying
    for us (see Note [Tying the knot] in Hasura.GraphQL.Parser.Class).

You may notice that this makes it possible for the definitions to be
inconsistent: we could construct an interface parser that parses some object
type, but forget to specify that the object type implements the interface. This
inconsistency is currently completely unchecked, which is quite unfortunate. It
also means we don’t support remote schema-defined object types that implement
interfaces we generate, since we don’t know anything about those types when we
construct the interface.

Since we don’t make very much use of interface types at the time of this
writing, this isn’t much of a problem in practice. But if that changes, it would
be worth implementing a more sophisticated solution that can gather up all the
different sources of information and make sure they’re consistent. -}

data InputObjectInfo = InputObjectInfo ~[Definition InputFieldInfo]
-- Note that we can't check for equality of the fields since there may be
-- circularity. So instead we check for equality of names.
instance Eq InputObjectInfo where
  InputObjectInfo fields1 == InputObjectInfo fields2
    = Set.fromList (fmap dName fields1) == Set.fromList (fmap dName fields2)

data ObjectInfo = ObjectInfo
  { oiFields :: ~[Definition FieldInfo]
  -- ^ The fields that this object has. This consists of the fields of the
  -- interfaces that it implements, as well as any additional fields.
  , oiImplements :: ~[Definition InterfaceInfo]
  -- ^ The interfaces that this object implements (inheriting all their
  -- fields). See Note [The interfaces story] for more details.
  }
-- Note that we can't check for equality of the fields and the interfaces since
-- there may be circularity. So instead we check for equality of names.
instance Eq ObjectInfo where
  ObjectInfo fields1 interfaces1 == ObjectInfo fields2 interfaces2
    =  Set.fromList (fmap dName fields1    ) == Set.fromList (fmap dName fields2    )
    && Set.fromList (fmap dName interfaces1) == Set.fromList (fmap dName interfaces2)

-- | Type information for a GraphQL interface; see Note [The interfaces story]
-- for more details.
--
-- Note: in the current working draft of the GraphQL specification (> June
-- 2018), interfaces may implement other interfaces, but we currently don't
-- support this.
data InterfaceInfo = InterfaceInfo
  { iiFields :: ~[Definition FieldInfo]
  -- ^ Fields declared by this interface. Every object implementing this
  -- interface must include those fields.
  , iiPossibleTypes :: ~[Definition ObjectInfo]
  -- ^ Objects that implement this interface. See Note [The interfaces story]
  -- for why we include that information here.
  }
-- Note that we can't check for equality of the fields and the interfaces since
-- there may be circularity. So instead we check for equality of names.
instance Eq InterfaceInfo where
  InterfaceInfo fields1 objects1 == InterfaceInfo fields2 objects2
    =  Set.fromList (fmap dName fields1 ) == Set.fromList (fmap dName fields2 )
    && Set.fromList (fmap dName objects1) == Set.fromList (fmap dName objects2)

data UnionInfo = UnionInfo
  { uiPossibleTypes :: ~[Definition ObjectInfo]
  -- ^ The member object types of this union.
  }

data TypeInfo k where
  TIScalar      :: TypeInfo 'Both
  TIEnum        :: NonEmpty (Definition EnumValueInfo) -> TypeInfo 'Both
  TIInputObject :: InputObjectInfo -> TypeInfo 'Input
  TIObject      :: ObjectInfo -> TypeInfo 'Output
  TIInterface   :: InterfaceInfo -> TypeInfo 'Output
  TIUnion       :: UnionInfo -> TypeInfo 'Output

instance Eq (TypeInfo k) where
  (==) = eqTypeInfo

-- | Like '==', but can compare 'TypeInfo's of different kinds.
eqTypeInfo :: TypeInfo k1 -> TypeInfo k2 -> Bool
eqTypeInfo TIScalar TIScalar = True
eqTypeInfo (TIEnum values1) (TIEnum values2)
  = Set.fromList (toList values1) == Set.fromList (toList values2)
-- NB the case for input objects currently has quadratic complexity, which is
-- probably avoidable. HashSets should be able to get this down to
-- O(n*log(n)). But this requires writing some Hashable instances by hand
-- because we use some existential types and GADTs.
eqTypeInfo (TIInputObject ioi1) (TIInputObject ioi2) = ioi1 == ioi2
eqTypeInfo (TIObject oi1) (TIObject oi2) = oi1 == oi2
eqTypeInfo (TIInterface ii1) (TIInterface ii2) = ii1 == ii2
eqTypeInfo (TIUnion (UnionInfo objects1)) (TIUnion (UnionInfo objects2))
  = Set.fromList (fmap dName objects1) == Set.fromList (fmap dName objects2)
eqTypeInfo _ _ = False

getObjectInfo :: Type k -> Maybe (Definition ObjectInfo)
getObjectInfo = traverse getTI . (^. definitionLens)
  where
    getTI :: TypeInfo k -> Maybe ObjectInfo
    getTI (TIObject oi) = Just oi
    getTI _             = Nothing

getInterfaceInfo :: Type 'Output -> Maybe (Definition InterfaceInfo)
getInterfaceInfo = traverse getTI . (^. definitionLens)
  where
    getTI :: TypeInfo 'Output -> Maybe InterfaceInfo
    getTI (TIInterface ii) = Just ii
    getTI _                = Nothing

data SomeTypeInfo = forall k. SomeTypeInfo (TypeInfo k)

instance Eq SomeTypeInfo where
  SomeTypeInfo a == SomeTypeInfo b = eqTypeInfo a b

data Definition a = Definition
  { dName :: Name
  , dUnique :: Maybe Unique
  -- ^ A unique identifier used to break cycles in mutually-recursive type
  -- definitions. If two 'Definition's have the same 'Unique', they can be
  -- assumed to be identical. Note that the inverse is /not/ true: two
  -- definitions with different 'Unique's might still be otherwise identical.
  --
  -- Also see Note [Tying the knot] in Hasura.GraphQL.Parser.Class.
  , dDescription :: Maybe Description
  , dInfo :: ~a
  -- ^ Lazy to allow mutually-recursive type definitions.
  } deriving (Functor, Foldable, Traversable, Generic)
instance Hashable a => Hashable (Definition a) where
  hashWithSalt salt Definition{..} =
    salt `hashWithSalt` dName `hashWithSalt` dInfo

mkDefinition :: Name -> Maybe Description -> a -> Definition a
mkDefinition name description info = Definition name Nothing description info

instance Eq a => Eq (Definition a) where
  (==) = eq1

instance Eq1 Definition where
  liftEq eq (Definition name1 maybeUnique1 _ info1)
            (Definition name2 maybeUnique2 _ info2)
    | Just unique1 <- maybeUnique1
    , Just unique2 <- maybeUnique2
    , unique1 == unique2
    = True
    | otherwise
    = name1 == name2 && eq info1 info2

instance HasName (Definition a) where
  getName = dName

class HasDefinition s a | s -> a where
  definitionLens :: Lens' s (Definition a)
instance HasDefinition (Definition a) a where
  definitionLens = id

-- | Adds a 'Unique' to a 'Definition' that does not yet have one. If the
-- definition already has a 'Unique', the existing 'Unique' is kept.
addDefinitionUnique :: HasDefinition s a => Unique -> s -> s
addDefinitionUnique unique = over definitionLens \definition ->
  definition { dUnique = dUnique definition <|> Just unique }

-- | Enum values have no extra information except for the information common to
-- all definitions, so this is just a placeholder for use as @'Definition'
-- 'EnumValueInfo'@.
data EnumValueInfo = EnumValueInfo
  deriving (Eq, Generic)
instance Hashable EnumValueInfo

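{- A worked illustration of the 'Eq1' short-circuit above (the bindings
someName, u, infoA, and infoB are hypothetical, used only for this sketch):
two definitions carrying the same 'Unique' compare equal without ever forcing
their (possibly knot-tied, mutually-recursive) 'dInfo' fields:

    defA = (mkDefinition someName Nothing infoA) { dUnique = Just u }
    defB = (mkDefinition someName Nothing infoB) { dUnique = Just u }
    -- defA == defB short-circuits to True via the matching 'Unique',
    -- keeping equality productive on recursive definitions.

Without a shared 'Unique', equality falls back to comparing names and the
(lazy) payloads, which is why 'dInfo' must stay lazy. -}
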
data InputFieldInfo
  -- | A required field with a non-nullable type.
  = forall k. ('Input <: k) => IFRequired (NonNullableType k)
  -- | An optional input field with a nullable type and possibly a default
  -- value. If a default value is provided, it should be a valid value for the
  -- type.
  --
  -- Note that a default value of 'VNull' is subtly different from having no
  -- default value at all. If no default value is provided, the GraphQL
  -- specification allows distinguishing provided @null@ values from values left
  -- completely absent; see Note [Optional fields and nullability] in
  -- Hasura.GraphQL.Parser.Internal.Parser.
  | forall k. ('Input <: k) => IFOptional (Type k) (Maybe (Value Void))

instance Eq InputFieldInfo where
  IFRequired t1    == IFRequired t2    = eqNonNullableType t1 t2
  IFOptional t1 v1 == IFOptional t2 v2 = eqType t1 t2 && v1 == v2
  _                == _                = False

data FieldInfo = forall k. ('Output <: k) => FieldInfo
  { fArguments :: [Definition InputFieldInfo]
  , fType      :: Type k
  }

instance Eq FieldInfo where
  FieldInfo args1 t1 == FieldInfo args2 t2 = args1 == args2 && eqType t1 t2

{- Note [Parsing variable values]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
GraphQL includes its own tiny language for input values, which is similar to
JSON but not quite the same---GraphQL input values can be enum values, and there
are restrictions on the names of input object keys. Despite these differences,
variables’ values are passed as JSON, so we actually need to be able to parse
values expressed in both languages.

It’s tempting to contain this complexity by simply converting the JSON values to
GraphQL input values up front, and for booleans, numbers, arrays, and most
objects, this conversion is viable. But JSON strings pose a problem, since they
are used to represent both GraphQL strings and GraphQL enums. For example,
consider a query like this:

    enum FooBar {
      FOO
      BAR
    }

    query some_query($a: String, $b: FooBar) {
      ...
    }

We might receive an accompanying variables payload like this:

    {
      "a": "FOO",
      "b": "FOO"
    }

To properly convert these JSON values to GraphQL, we’d need to use the type
information to guide the parsing. Since $a has type String, its value should be
parsed as the GraphQL string "FOO", while $b has type FooBar, so its value
should be parsed as the GraphQL enum value FOO.

We could do this type-directed parsing, but there are some advantages to being
lazier. For one, we can use JSON values directly when used as a column value of
type json or jsonb, rather than converting them to GraphQL and back; which, in
turn, solves another problem with JSON objects: JSON object keys are arbitrary
strings, while GraphQL input object keys are GraphQL names, and therefore
restricted: not all JSON objects can be represented by a GraphQL input object.

Arguably such columns should really be represented as strings containing encoded
JSON, not GraphQL lists/objects, but the decision to treat them otherwise is
old, and it would be backwards-incompatible to change now. We can also avoid
needing to interpret the values of variables for types outside our control
(i.e. those from a remote schema), which can be useful in the case of custom
scalars or extensions of the GraphQL protocol.

So instead we use the InputValue type to represent that an input value might be
a GraphQL literal value or a JSON value from the variables payload. This means
each input parser constructor needs to be able to parse both GraphQL values and
JSON values, but fortunately, the duplication of logic is minimal. -}

-- | See Note [Parsing variable values].
data InputValue v
  = GraphQLValue (Value v)
  | JSONValue J.Value
  deriving (Show, Eq, Functor)

data Variable = Variable
  { vInfo  :: VariableInfo
  , vType  :: GType
  , vValue :: InputValue Void
  -- ^ Note: if the variable was null or was not provided and the field has a
  -- non-null default value, this field contains the default value, not 'VNull'.
  } deriving (Show, Eq)

data VariableInfo
  = VIRequired Name
  -- | Unlike fields (see 'IFOptional'), nullable variables with no default
  -- value are indistinguishable from variables with a default value of null, so
  -- we don’t distinguish those cases here.
  | VIOptional Name (Value Void)
  deriving (Show, Eq)

instance HasName Variable where
  getName = getName . vInfo
instance HasName VariableInfo where
  getName (VIRequired name)   = name
  getName (VIOptional name _) = name

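{- To make the two 'InputValue' representations concrete (an illustrative
sketch; the variable $a is hypothetical): a value supplied through the JSON
variables payload {"a": "FOO"} reaches the parser as

    JSONValue (J.String "FOO")

whereas the same value written inline in the query document as a string
literal arrives as

    GraphQLValue (VString "FOO")

Each input parser must accept both constructors, per Note [Parsing variable
values] above. -}
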
-- -----------------------------------------------------------------------------
|
||||
-- support for introspection queries
|
||||
|
||||
-- | This type represents the directives information to be served over GraphQL introspection
|
||||
data DirectiveInfo = DirectiveInfo
|
||||
{ diName :: !Name
|
||||
, diDescription :: !(Maybe Description)
|
||||
, diArguments :: ![Definition InputFieldInfo]
|
||||
, diLocations :: ![DirectiveLocation]
|
||||
}
|
||||
|
||||
-- | This type contains all the information needed to efficiently serve GraphQL
|
||||
-- introspection queries. It corresponds to the GraphQL @__Schema@ type defined
|
||||
-- in <§ 4.5 Schema Introspection http://spec.graphql.org/June2018/#sec-Introspection>.
|
||||
data Schema = Schema
|
||||
{ sDescription :: Maybe Description
|
||||
, sTypes :: HashMap Name (Definition SomeTypeInfo)
|
||||
, sQueryType :: Type 'Output
|
||||
, sMutationType :: Maybe (Type 'Output)
|
||||
, sSubscriptionType :: Maybe (Type 'Output)
|
||||
, sDirectives :: [DirectiveInfo]
|
||||
}
|
||||
|
||||
-- | Recursively collects all type definitions accessible from the given value.
|
||||
collectTypeDefinitions
|
||||
:: (HasTypeDefinitions a, MonadError ConflictingDefinitions m)
|
||||
=> a -> m (HashMap Name (Definition SomeTypeInfo))
|
||||
collectTypeDefinitions = flip execStateT Map.empty . accumulateTypeDefinitions
|
||||
|
||||
data ConflictingDefinitions
|
||||
= ConflictingDefinitions (Definition SomeTypeInfo) (Definition SomeTypeInfo)
|
||||
|
||||
class HasTypeDefinitions a where
|
||||
-- | Recursively accumulates all type definitions accessible from the given
|
||||
-- value. This is done statefully to avoid infinite loops arising from
|
||||
-- recursive type definitions; see Note [Tying the knot] in Hasura.GraphQL.Parser.Class.
|
||||
accumulateTypeDefinitions
|
||||
:: ( MonadError ConflictingDefinitions m
|
||||
, MonadState (HashMap Name (Definition SomeTypeInfo)) m )
|
||||
=> a -> m ()
|
||||
|
||||
instance HasTypeDefinitions (Definition (TypeInfo k)) where
|
||||
accumulateTypeDefinitions definition = do
|
||||
-- This is the important case! We actually have a type definition, so we
|
||||
-- need to add it to the state.
|
||||
definitions <- get
|
||||
let new = SomeTypeInfo <$> definition
|
||||
case Map.lookup (dName new) definitions of
|
||||
Nothing -> do
|
||||
put $! Map.insert (dName new) new definitions
|
||||
-- This type definition might reference other type definitions, so we
|
||||
-- still need to recur.
|
||||
accumulateTypeDefinitions (dInfo definition)
|
||||
Just old
|
||||
-- It’s important we /don’t/ recur if we’ve already seen this definition
|
||||
-- before to avoid infinite loops; see Note [Tying the knot] in Hasura.GraphQL.Parser.Class.
|
||||
| old == new -> pure ()
|
||||
| otherwise -> throwError $ ConflictingDefinitions old new
|
||||
|
||||
instance HasTypeDefinitions a => HasTypeDefinitions [a] where
|
||||
accumulateTypeDefinitions = traverse_ accumulateTypeDefinitions
|
||||
|
||||
instance HasTypeDefinitions (Type k) where
|
||||
accumulateTypeDefinitions = \case
|
||||
NonNullable t -> accumulateTypeDefinitions t
|
||||
    Nullable t -> accumulateTypeDefinitions t

instance HasTypeDefinitions (NonNullableType k) where
  accumulateTypeDefinitions = \case
    TNamed d -> accumulateTypeDefinitions d
    TList t -> accumulateTypeDefinitions t

instance HasTypeDefinitions (TypeInfo k) where
  accumulateTypeDefinitions = \case
    TIScalar -> pure ()
    TIEnum _ -> pure ()
    TIInputObject (InputObjectInfo fields) -> accumulateTypeDefinitions fields
    TIObject (ObjectInfo fields interfaces) ->
      accumulateTypeDefinitions fields >> accumulateTypeDefinitions interfaces
    TIInterface (InterfaceInfo fields objects) ->
      accumulateTypeDefinitions fields
      >> accumulateTypeDefinitions objects
    TIUnion (UnionInfo objects) -> accumulateTypeDefinitions objects

instance HasTypeDefinitions (Definition InputObjectInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIInputObject

instance HasTypeDefinitions (Definition InputFieldInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . dInfo

instance HasTypeDefinitions InputFieldInfo where
  accumulateTypeDefinitions = \case
    IFRequired t -> accumulateTypeDefinitions t
    IFOptional t _ -> accumulateTypeDefinitions t

instance HasTypeDefinitions (Definition FieldInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . dInfo

instance HasTypeDefinitions FieldInfo where
  accumulateTypeDefinitions (FieldInfo args t) = do
    accumulateTypeDefinitions args
    accumulateTypeDefinitions t

instance HasTypeDefinitions (Definition ObjectInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIObject

instance HasTypeDefinitions (Definition InterfaceInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIInterface

instance HasTypeDefinitions (Definition UnionInfo) where
  accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIUnion
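The `HasTypeDefinitions` instances above recursively walk a GraphQL type AST and accumulate every named type definition they reach. The same traversal pattern can be sketched in miniature (hypothetical, simplified stand-in types, not Hasura's):

```haskell
{-# LANGUAGE LambdaCase #-}

-- Hypothetical, simplified stand-ins for the real GraphQL type AST.
data GType
  = TNamed String   -- a reference to a named type definition
  | TList GType     -- list wrapper
  | TNonNull GType  -- non-null wrapper

-- Analogue of 'accumulateTypeDefinitions': peel the wrappers and
-- collect every named type encountered along the way.
collectNames :: GType -> [String]
collectNames = \case
  TNamed n   -> [n]
  TList t    -> collectNames t
  TNonNull t -> collectNames t

main :: IO ()
main = print (collectNames (TNonNull (TList (TNamed "Int"))))
-- prints ["Int"]
```

The real code accumulates into a monadic context (hence `pure ()` for leaves such as scalars and enums), but the recursion structure is the same.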
@@ -1,425 +0,0 @@
module Hasura.GraphQL.RelaySchema where

import Control.Lens.Extended hiding (op)

import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Data.Text as T
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Context
import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.Server.Utils (duplicates)
import Hasura.Session
import Hasura.SQL.Types

import Hasura.GraphQL.Schema
import Hasura.GraphQL.Schema.BoolExp
import Hasura.GraphQL.Schema.Builder
import Hasura.GraphQL.Schema.Common
import Hasura.GraphQL.Schema.Function
import Hasura.GraphQL.Schema.OrderBy
import Hasura.GraphQL.Schema.Select
mkNodeInterface :: [QualifiedTable] -> IFaceTyInfo
mkNodeInterface relayTableNames =
  let description = G.Description "An object with globally unique ID"
  in mkIFaceTyInfo (Just description) nodeType (mapFromL _fiName [idField]) $
     Set.fromList $ map mkTableTy relayTableNames
  where
    idField =
      let description = G.Description "A globally unique identifier"
      in mkHsraObjFldInfo (Just description) "id" mempty nodeIdType

-- | The Relay schema should only expose tables, and relationships whose remote
-- tables have a mandatory primary key.
tablesWithOnlyPrimaryKey :: TableCache -> TableCache
tablesWithOnlyPrimaryKey tableCache =
  flip Map.mapMaybe tableCache $ \tableInfo ->
    tableInfo ^. tiCoreInfo.tciPrimaryKey *>
    Just (infoWithPrimaryKeyRelations tableInfo)
  where
    infoWithPrimaryKeyRelations =
      tiCoreInfo.tciFieldInfoMap %~ Map.mapMaybe (_FIRelationship %%~ withPrimaryKey)

    withPrimaryKey relInfo =
      let remoteTable = riRTable relInfo
          maybePrimaryKey =
            (tableCache ^. at remoteTable) >>= (^. tiCoreInfo.tciPrimaryKey)
      in maybePrimaryKey *> Just relInfo
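The `maybePrimaryKey *> Just relInfo` idiom keeps a value only when some other `Maybe`-valued check succeeds, and `Map.mapMaybe` then drops every entry whose check failed. A small self-contained sketch of the same filtering (hypothetical `Table` type, `Data.Map.Strict` standing in for `Data.HashMap.Strict`):

```haskell
import qualified Data.Map.Strict as Map

-- Hypothetical, simplified stand-in for TableInfo.
data Table = Table { tableName :: String, primaryKey :: Maybe String }

-- Mirror of 'tablesWithOnlyPrimaryKey': keep a table only when it has
-- a primary key; 'pk *> Just t' is Nothing exactly when 'pk' is Nothing.
withPrimaryKeyOnly :: Map.Map String Table -> Map.Map String Table
withPrimaryKeyOnly = Map.mapMaybe $ \t -> primaryKey t *> Just t

main :: IO ()
main = print . Map.keys . withPrimaryKeyOnly $ Map.fromList
  [ ("users", Table "users" (Just "id"))
  , ("logs",  Table "logs"  Nothing)
  ]
-- prints ["users"]
```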

mkRelayGCtxMap
  :: forall m. (MonadError QErr m)
  => TableCache -> FunctionCache -> m RelayGCtxMap
mkRelayGCtxMap tableCache functionCache = do
  typesMapL <- mapM (mkRelayGCtxMapTable relayTableCache functionCache) relayTables
  typesMap <- combineTypes typesMapL
  let gCtxMap = flip Map.map typesMap $
        \(ty, flds, insCtx) -> mkGCtx ty flds insCtx
  pure gCtxMap
  where
    relayTableCache = tablesWithOnlyPrimaryKey tableCache
    relayTables =
      filter (tableFltr . _tiCoreInfo) $ Map.elems relayTableCache

    tableFltr ti =
      not (isSystemDefined $ _tciSystemDefined ti)
      && isValidObjectName (_tciName ti)

    combineTypes
      :: [Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap)]
      -> m (Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap))
    combineTypes maps = do
      let listMap = foldr (Map.unionWith (++) . Map.map pure) mempty maps
      flip Map.traverseWithKey listMap $ \roleName typeList -> do
        let relayTableNames = map (_tciName . _tiCoreInfo) relayTables
            tyAgg = foldr addTypeInfoToTyAgg (mconcat $ map (^. _1) typeList)
              [ TIIFace $ mkNodeInterface relayTableNames
              , TIObj pageInfoObj
              ]
            insCtx = mconcat $ map (^. _3) typeList
        rootFields <- combineRootFields roleName $ map (^. _2) typeList
        pure (tyAgg, rootFields, insCtx)

    combineRootFields :: RoleName -> [RootFields] -> m RootFields
    combineRootFields roleName rootFields = do
      let duplicateQueryFields = duplicates $
            concatMap (Map.keys . _rootQueryFields) rootFields
          duplicateMutationFields = duplicates $
            concatMap (Map.keys . _rootMutationFields) rootFields

      -- TODO: The following exception should result in inconsistency
      when (not $ null duplicateQueryFields) $
        throw400 Unexpected $ "following query root fields are duplicated: "
        <> showNames duplicateQueryFields

      when (not $ null duplicateMutationFields) $
        throw400 Unexpected $ "following mutation root fields are duplicated: "
        <> showNames duplicateMutationFields

      pure $ mconcat $ mkNodeQueryRootFields roleName relayTables : rootFields
mkRelayGCtxMapTable
  :: (MonadError QErr m)
  => TableCache
  -> FunctionCache
  -> TableInfo
  -> m (Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap))
mkRelayGCtxMapTable tableCache funcCache tabInfo = do
  m <- flip Map.traverseWithKey rolePerms $
    mkRelayGCtxRole tableCache tn descM fields primaryKey validConstraints tabFuncs viewInfo customConfig
  adminSelFlds <- mkAdminSelFlds fields tableCache
  adminInsCtx <- mkAdminInsCtx tableCache fields
  let adminCtx = mkRelayTyAggRole tn descM (Just (cols, icRelations adminInsCtx))
        (Just (True, adminSelFlds)) (Just cols) (Just ())
        primaryKey validConstraints viewInfo tabFuncs
      adminInsCtxMap = Map.singleton tn adminInsCtx
  return $ Map.insert adminRoleName (adminCtx, adminRootFlds, adminInsCtxMap) m
  where
    TableInfo coreInfo rolePerms _ = tabInfo
    TableCoreInfo tn descM _ fields primaryKey _ _ viewInfo _ customConfig = coreInfo
    validConstraints = mkValidConstraints $ map _cName (tciUniqueOrPrimaryKeyConstraints coreInfo)
    tabFuncs = filter (isValidObjectName . fiName) $
      getFuncsOfTable tn funcCache
    cols = getValidCols fields
    adminRootFlds =
      let insertPermDetails = Just ([], True)
          selectPermDetails = Just (noFilter, Nothing, [], True)
          updatePermDetails = Just (getValidCols fields, mempty, noFilter, Nothing, [])
          deletePermDetails = Just (noFilter, [])

          queryFields = getRelayQueryRootFieldsRole tn primaryKey fields tabFuncs
                        selectPermDetails
          mutationFields = getMutationRootFieldsRole tn primaryKey
                           validConstraints fields insertPermDetails
                           selectPermDetails updatePermDetails
                           deletePermDetails viewInfo customConfig
      in RootFields queryFields mutationFields
mkRelayGCtxRole
  :: (MonadError QErr m)
  => TableCache
  -> QualifiedTable
  -> Maybe PGDescription
  -> FieldInfoMap FieldInfo
  -> Maybe (PrimaryKey PGColumnInfo)
  -> [ConstraintName]
  -> [FunctionInfo]
  -> Maybe ViewInfo
  -> TableConfig
  -> RoleName
  -> RolePermInfo
  -> m (TyAgg, RootFields, InsCtxMap)
mkRelayGCtxRole tableCache tn descM fields primaryKey constraints funcs viM tabConfigM role permInfo = do
  selPermM <- mapM (getSelPerm tableCache fields role) selM
  tabInsInfoM <- forM (_permIns permInfo) $ \ipi -> do
    ctx <- mkInsCtx role tableCache fields ipi $ _permUpd permInfo
    let permCols = flip getColInfos allCols $ Set.toList $ ipiCols ipi
    return (ctx, (permCols, icRelations ctx))
  let insPermM = snd <$> tabInsInfoM
      insCtxM = fst <$> tabInsInfoM
      updColsM = filterColumnFields . upiCols <$> _permUpd permInfo
      tyAgg = mkRelayTyAggRole tn descM insPermM selPermM updColsM
              (void $ _permDel permInfo) primaryKey constraints viM funcs
      queryRootFlds = getRelayQueryRootFieldsRole tn primaryKey fields funcs
                      (mkSel <$> _permSel permInfo)
      mutationRootFlds = getMutationRootFieldsRole tn primaryKey constraints fields
                         (mkIns <$> insM) (mkSel <$> selM)
                         (mkUpd <$> updM) (mkDel <$> delM) viM tabConfigM
      insCtxMap = maybe Map.empty (Map.singleton tn) insCtxM
  return (tyAgg, RootFields queryRootFlds mutationRootFlds, insCtxMap)
  where
    RolePermInfo insM selM updM delM = permInfo
    allCols = getCols fields
    filterColumnFields allowedSet =
      filter ((`Set.member` allowedSet) . pgiColumn) $ getValidCols fields
    mkIns i = (ipiRequiredHeaders i, isJust updM)
    mkSel s = ( spiFilter s, spiLimit s
              , spiRequiredHeaders s, spiAllowAgg s
              )
    mkUpd u = ( flip getColInfos allCols $ Set.toList $ upiCols u
              , upiSet u
              , upiFilter u
              , upiCheck u
              , upiRequiredHeaders u
              )
    mkDel d = (dpiFilter d, dpiRequiredHeaders d)
mkRelayTyAggRole
  :: QualifiedTable
  -> Maybe PGDescription
  -- ^ Postgres description
  -> Maybe ([PGColumnInfo], RelationInfoMap)
  -- ^ insert permission
  -> Maybe (Bool, [SelField])
  -- ^ select permission
  -> Maybe [PGColumnInfo]
  -- ^ update cols
  -> Maybe ()
  -- ^ delete cols
  -> Maybe (PrimaryKey PGColumnInfo)
  -> [ConstraintName]
  -- ^ constraints
  -> Maybe ViewInfo
  -> [FunctionInfo]
  -- ^ all functions
  -> TyAgg
mkRelayTyAggRole tn descM insPermM selPermM updColsM delPermM pkeyCols constraints viM funcs =
  let (mutationTypes, mutationFields) =
        mkMutationTypesAndFieldsRole tn insPermM selFldsM updColsM delPermM pkeyCols constraints viM
  in TyAgg (mkTyInfoMap allTypes <> mutationTypes)
           (fieldMap <> mutationFields)
           scalars ordByCtx
  where
    ordByCtx = fromMaybe Map.empty ordByCtxM

    funcInpArgTys = bool [] (map TIInpObj funcArgInpObjs) $ isJust selFldsM

    allTypes = queryTypes <> aggQueryTypes <> funcInpArgTys <> computedFieldFuncArgsInps

    queryTypes = map TIObj selectObjects <>
      catMaybes
      [ TIInpObj <$> boolExpInpObjM
      , TIInpObj <$> ordByInpObjM
      , TIEnum <$> selColInpTyM
      ]
    aggQueryTypes = map TIObj aggObjs <> map TIInpObj aggOrdByInps

    fieldMap = Map.unions $ catMaybes [boolExpInpObjFldsM, selObjFldsM]
    scalars = selByPkScalarSet <> funcArgScalarSet <> computedFieldFuncArgScalars

    selFldsM = snd <$> selPermM
    selColNamesM = map pgiName . getPGColumnFields <$> selFldsM
    selColInpTyM = mkSelColumnTy tn <$> selColNamesM
    -- boolexp input type
    boolExpInpObjM = case selFldsM of
      Just selFlds -> Just $ mkBoolExpInp tn selFlds
      -- no select permission
      Nothing ->
        -- but update/delete is defined
        if isJust updColsM || isJust delPermM
        then Just $ mkBoolExpInp tn []
        else Nothing

    -- funcargs input type
    funcArgInpObjs = flip mapMaybe funcs $ \func ->
      mkFuncArgsInp (fiName func) (getInputArgs func)
    -- funcArgCtx = Map.unions funcArgCtxs
    funcArgScalarSet = funcs ^.. folded.to getInputArgs.folded.to (_qptName.faType)

    -- helper
    mkFldMap ty = Map.fromList . concatMap (mkFld ty)
    mkFld ty = \case
      SFPGColumn ci -> [((ty, pgiName ci), RFPGColumn ci)]
      SFRelationship (RelationshipFieldInfo relInfo allowAgg cols permFilter permLimit maybePkCols _) ->
        let relationshipName = riName relInfo
            relFld = ( (ty, mkRelName relationshipName)
                     , RFRelationship $ RelationshipField relInfo RFKSimple cols permFilter permLimit
                     )
            aggRelFld = ( (ty, mkAggRelName relationshipName)
                        , RFRelationship $ RelationshipField relInfo RFKAggregate cols permFilter permLimit
                        )
            maybeConnFld = maybePkCols <&> \pkCols ->
              ( (ty, mkConnectionRelName relationshipName)
              , RFRelationship $ RelationshipField relInfo
                  (RFKConnection pkCols) cols permFilter permLimit
              )
        in case riType relInfo of
          ObjRel -> [relFld]
          ArrRel -> bool [relFld] [relFld, aggRelFld] allowAgg
                    <> maybeToList maybeConnFld

      SFComputedField cf -> pure
        ( (ty, mkComputedFieldName $ _cfName cf)
        , RFComputedField cf
        )
      SFRemoteRelationship remoteField -> pure
        ( (ty, G.Name (remoteRelationshipNameToText (_rfiName remoteField)))
        , RFRemoteRelationship remoteField
        )

    -- the fields used in bool exp
    boolExpInpObjFldsM = mkFldMap (mkBoolExpTy tn) <$> selFldsM

    -- table obj
    selectObjects = case selPermM of
      Just (_, selFlds) ->
        [ (mkRelayTableObj tn descM selFlds)
          {_otiImplIFaces = Set.singleton nodeType}
        , mkTableEdgeObj tn
        , mkTableConnectionObj tn
        ]
      Nothing -> []

    -- aggregate objs and order by inputs
    (aggObjs, aggOrdByInps) = case selPermM of
      Just (True, selFlds) ->
        let cols = getPGColumnFields selFlds
            numCols = onlyNumCols cols
            compCols = onlyComparableCols cols
            objs = [ mkTableAggObj tn
                   , mkTableAggregateFieldsObj tn (numCols, numAggregateOps) (compCols, compAggregateOps)
                   ] <> mkColAggregateFieldsObjs selFlds
            ordByInps = mkTabAggOrdByInpObj tn (numCols, numAggregateOps) (compCols, compAggregateOps)
              : mkTabAggregateOpOrdByInpObjs tn (numCols, numAggregateOps) (compCols, compAggregateOps)
        in (objs, ordByInps)
      _ -> ([], [])

    getNumericCols = onlyNumCols . getPGColumnFields
    getComparableCols = onlyComparableCols . getPGColumnFields
    onlyFloat = const $ mkScalarTy PGFloat

    mkTypeMaker "sum" = mkColumnType
    mkTypeMaker _ = onlyFloat

    mkColAggregateFieldsObjs flds =
      let numCols = getNumericCols flds
          compCols = getComparableCols flds
          mkNumObjFld n = mkTableColAggregateFieldsObj tn n (mkTypeMaker n) numCols
          mkCompObjFld n = mkTableColAggregateFieldsObj tn n mkColumnType compCols
          numFldsObjs = bool (map mkNumObjFld numAggregateOps) [] $ null numCols
          compFldsObjs = bool (map mkCompObjFld compAggregateOps) [] $ null compCols
      in numFldsObjs <> compFldsObjs
    -- the fields used in table object
    nodeFieldM = RFNodeId tn . _pkColumns <$> pkeyCols
    selObjFldsM = mkFldMap (mkTableTy tn) <$> selFldsM >>=
      \fm -> nodeFieldM <&> \nodeField ->
        Map.insert (mkTableTy tn, "id") nodeField fm
    -- the scalar set for table_by_pk arguments
    selByPkScalarSet = pkeyCols ^.. folded.to _pkColumns.folded.to pgiType._PGColumnScalar

    ordByInpCtxM = mkOrdByInpObj tn <$> selFldsM
    (ordByInpObjM, ordByCtxM) = case ordByInpCtxM of
      Just (a, b) -> (Just a, Just b)
      Nothing -> (Nothing, Nothing)

    -- computed fields' function args input objects and scalar types
    mkComputedFieldRequiredTypes computedFieldInfo =
      let ComputedFieldFunction qf inputArgs _ _ _ = _cfFunction computedFieldInfo
          scalarArgs = map (_qptName . faType) $ toList inputArgs
      in (, scalarArgs) <$> mkFuncArgsInp qf inputArgs

    computedFieldReqTypes = catMaybes $
      maybe [] (map mkComputedFieldRequiredTypes . getComputedFields) selFldsM

    computedFieldFuncArgsInps = map (TIInpObj . fst) computedFieldReqTypes
    computedFieldFuncArgScalars = Set.fromList $ concatMap snd computedFieldReqTypes
mkSelectOpCtx
  :: QualifiedTable
  -> [PGColumnInfo]
  -> (AnnBoolExpPartialSQL, Maybe Int, [T.Text]) -- select filter
  -> SelOpCtx
mkSelectOpCtx tn allCols (fltr, pLimit, hdrs) =
  SelOpCtx tn hdrs colGNameMap fltr pLimit
  where
    colGNameMap = mkPGColGNameMap allCols
getRelayQueryRootFieldsRole
  :: QualifiedTable
  -> Maybe (PrimaryKey PGColumnInfo)
  -> FieldInfoMap FieldInfo
  -> [FunctionInfo]
  -> Maybe (AnnBoolExpPartialSQL, Maybe Int, [T.Text], Bool) -- select filter
  -> QueryRootFieldMap
getRelayQueryRootFieldsRole tn primaryKey fields funcs selM =
  makeFieldMap $
    funcConnectionQueries
    <> catMaybes
    [ getSelConnectionDet <$> selM <*> maybePrimaryKeyColumns
    ]
  where
    maybePrimaryKeyColumns = fmap _pkColumns primaryKey
    colGNameMap = mkPGColGNameMap $ getCols fields

    funcConnectionQueries = fromMaybe [] $ getFuncQueryConnectionFlds
                            <$> selM <*> maybePrimaryKeyColumns

    getSelConnectionDet (selFltr, pLimit, hdrs, _) primaryKeyColumns =
      ( QCSelectConnection primaryKeyColumns $ mkSelectOpCtx tn (getCols fields) (selFltr, pLimit, hdrs)
      , mkSelFldConnection Nothing tn
      )

    getFuncQueryConnectionFlds (selFltr, pLimit, hdrs, _) primaryKeyColumns =
      flip map funcs $ \fi ->
        ( QCFuncConnection primaryKeyColumns $
            FuncQOpCtx (fiName fi) (mkFuncArgItemSeq fi) hdrs colGNameMap selFltr pLimit
        , mkFuncQueryConnectionFld fi $ fiDescription fi
        )
mkNodeQueryRootFields :: RoleName -> [TableInfo] -> RootFields
mkNodeQueryRootFields roleName relayTables =
  RootFields (mapFromL (_fiName . snd) [nodeQueryDet]) mempty
  where
    nodeQueryDet =
      ( QCNodeSelect nodeSelMap
      , nodeQueryField
      )

    nodeQueryField =
      let nodeParams = fromInpValL $ pure $
            InpValInfo (Just $ G.Description "A globally unique id")
            "id" Nothing nodeIdType
      in mkHsraObjFldInfo Nothing "node" nodeParams $ G.toGT nodeType

    nodeSelMap =
      Map.fromList $ flip mapMaybe relayTables $ \table ->
        let tableName = _tciName $ _tiCoreInfo table
            allColumns = getCols $ _tciFieldInfoMap $ _tiCoreInfo table
            selectPermM = _permSel <$> Map.lookup roleName
                          (_tiRolePermInfoMap table)
            permDetailsM = join selectPermM <&> \perm ->
              ( spiFilter perm
              , spiLimit perm
              , spiRequiredHeaders perm
              )
            adminPermDetails = (noFilter, Nothing, [])
        in (mkTableTy tableName,) <$>
           ((,) <$>
            (mkSelectOpCtx tableName allColumns <$>
             bool permDetailsM (Just adminPermDetails) (isAdmin roleName)
            ) <*> (table ^? tiCoreInfo.tciPrimaryKey._Just.pkColumns)
           )
@@ -1,9 +1,13 @@
-module Hasura.GraphQL.RemoteServer where
+module Hasura.GraphQL.RemoteServer
+  ( fetchRemoteSchema
+  , IntrospectionResult
+  , execRemoteGQ'
+  ) where
 
 import Control.Exception (try)
 import Control.Lens ((^.))
 import Control.Monad.Unique
 import Data.Aeson ((.:), (.:?))
 import Data.Foldable (foldlM)
 import Hasura.HTTP
 import Hasura.Prelude
 
@@ -12,14 +16,17 @@ import qualified Data.ByteString.Lazy as BL
 import qualified Data.Environment as Env
 import qualified Data.HashMap.Strict as Map
 import qualified Data.Text as T
-import qualified Language.GraphQL.Draft.Syntax as G
+import qualified Hasura.Tracing as Tracing
+import qualified Language.GraphQL.Draft.Parser as G
+import qualified Language.GraphQL.Draft.Syntax as G
 import qualified Language.Haskell.TH.Syntax as TH
 import qualified Network.HTTP.Client as HTTP
 import qualified Network.HTTP.Types as N
 import qualified Network.Wreq as Wreq
 
-import Hasura.GraphQL.Schema.Merge
 
+import qualified Hasura.GraphQL.Parser.Monad as P
+import Hasura.GraphQL.Schema.Remote
 import Hasura.GraphQL.Transport.HTTP.Protocol
 import Hasura.RQL.DDL.Headers (makeHeadersFromConf)
 import Hasura.RQL.Types
@@ -27,11 +34,6 @@ import Hasura.Server.Utils
 import Hasura.Server.Version (HasVersion)
 import Hasura.Session
 
-import qualified Hasura.GraphQL.Context as GC
-import qualified Hasura.GraphQL.Schema as GS
-import qualified Hasura.GraphQL.Validate.Types as VT
-import qualified Hasura.Tracing as Tracing
-
 introspectionQuery :: GQLReqParsed
 introspectionQuery =
   $(do
@@ -44,12 +46,14 @@ introspectionQuery =
   )
 
 fetchRemoteSchema
-  :: (HasVersion, MonadIO m, MonadError QErr m)
+  :: forall m
+   . (HasVersion, MonadIO m, MonadUnique m, MonadError QErr m)
   => Env.Environment
   -> HTTP.Manager
+  -> RemoteSchemaName
   -> RemoteSchemaInfo
-  -> m GC.RemoteGCtx
-fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeout) = do
+  -> m RemoteSchemaCtx
+fetchRemoteSchema env manager schemaName schemaInfo@(RemoteSchemaInfo url headerConf _ timeout) = do
   headers <- makeHeadersFromConf env headerConf
   let hdrsWithDefaults = addDefaultHeaders headers
 
@@ -68,28 +72,25 @@ fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeou
       statusCode = resp ^. Wreq.responseStatus . Wreq.statusCode
   when (statusCode /= 200) $ throwNon200 statusCode respData
 
-  introspectRes :: (FromIntrospection IntrospectionResult) <-
+  -- Parse the JSON into flat GraphQL type AST
+  (FromIntrospection introspectRes) :: (FromIntrospection IntrospectionResult) <-
     either (remoteSchemaErr . T.pack) return $ J.eitherDecode respData
-  let (sDoc, qRootN, mRootN, sRootN) =
-        fromIntrospection introspectRes
-  typMap <- either remoteSchemaErr return $ VT.fromSchemaDoc sDoc $
-            VT.TLRemoteType name def
-  let mQrTyp = Map.lookup qRootN typMap
-      mMrTyp = (`Map.lookup` typMap) =<< mRootN
-      mSrTyp = (`Map.lookup` typMap) =<< sRootN
-  qrTyp <- liftMaybe noQueryRoot mQrTyp
-  let mRmQR = VT.getObjTyM qrTyp
-      mRmMR = VT.getObjTyM =<< mMrTyp
-      mRmSR = VT.getObjTyM =<< mSrTyp
-  rmQR <- liftMaybe (err400 Unexpected "query root has to be an object type") mRmQR
-  return $ GC.RemoteGCtx typMap rmQR mRmMR mRmSR
+
+  -- Check that the parsed GraphQL type info is valid by running the schema generation
+  (queryParsers, mutationParsers, subscriptionParsers) <-
+    P.runSchemaT @m @(P.ParseT Identity) $ buildRemoteParser introspectRes schemaInfo
+
+  -- The 'rawIntrospectionResult' contains the 'ByteString' response of
+  -- the introspection result of the remote server. We store this in the
+  -- 'RemoteSchemaCtx' because we can use it when 'introspect_remote_schema'
+  -- is called, by simply encoding the result to JSON.
+  return $ RemoteSchemaCtx schemaName introspectRes schemaInfo respData $
+    ParsedIntrospection queryParsers mutationParsers subscriptionParsers
   where
-    noQueryRoot = err400 Unexpected "query root not found in remote schema"
-    remoteSchemaErr :: (MonadError QErr m) => T.Text -> m a
+    remoteSchemaErr :: T.Text -> m a
     remoteSchemaErr = throw400 RemoteSchemaError
 
-    throwHttpErr :: (MonadError QErr m) => HTTP.HttpException -> m a
+    throwHttpErr :: HTTP.HttpException -> m a
     throwHttpErr = throwWithInternal httpExceptMsg . httpExceptToJSON
 
     throwNon200 st = throwWithInternal (non200Msg st) . decodeNon200Resp
@@ -108,36 +109,9 @@ fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeou
       Right a -> J.object ["response" J..= (a :: J.Value)]
       Left _ -> J.object ["raw_body" J..= bsToTxt (BL.toStrict bs)]
 
-mergeSchemas
-  :: (MonadError QErr m)
-  => RemoteSchemaMap
-  -> GS.GCtxMap
-  -- the merged GCtxMap and the default GCtx without roles
-  -> m (GS.GCtxMap, GS.GCtx)
-mergeSchemas rmSchemaMap gCtxMap = do
-  def <- mkDefaultRemoteGCtx remoteSchemas
-  merged <- mergeRemoteSchema gCtxMap def
-  return (merged, def)
-  where
-    remoteSchemas = map rscGCtx $ Map.elems rmSchemaMap
-
-mkDefaultRemoteGCtx
-  :: (MonadError QErr m)
-  => [GC.GCtx] -> m GS.GCtx
-mkDefaultRemoteGCtx =
-  foldlM mergeGCtx GC.emptyGCtx
-
--- merge a remote schema `gCtx` into current `gCtxMap`
-mergeRemoteSchema
-  :: (MonadError QErr m)
-  => GS.GCtxMap
-  -> GS.GCtx
-  -> m GS.GCtxMap
-mergeRemoteSchema ctxMap mergedRemoteGCtx =
-  flip Map.traverseWithKey ctxMap $ \_ schemaCtx ->
-    for schemaCtx $ \gCtx -> mergeGCtx gCtx mergedRemoteGCtx
-
--- | Parsing the introspection query result
+-- | Parsing the introspection query result. We use this newtype wrapper to
+-- avoid orphan instances and parse JSON in the way that we need for GraphQL
+-- introspection results.
 newtype FromIntrospection a
   = FromIntrospection { fromIntrospection :: a }
   deriving (Show, Eq, Generic)
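The comment above explains why `FromIntrospection` exists: instances like `J.FromJSON (FromIntrospection G.ScalarTypeDefinition)` attach custom JSON parsing to types defined in another package without declaring orphan instances. A minimal sketch of that pattern (hypothetical `Wrap` newtype and a toy `ParseText` class standing in for aeson's `FromJSON`):

```haskell
{-# LANGUAGE FlexibleInstances #-}

-- Local wrapper: instances on 'Wrap X' live in this module, so no
-- orphan instance for the foreign type 'X' is ever declared.
newtype Wrap a = Wrap { unWrap :: a }

-- Toy parsing class, standing in for Data.Aeson's FromJSON.
class ParseText a where
  parseText :: String -> Maybe a

-- 'Int' is foreign (from base); the instance targets 'Wrap Int' instead.
instance ParseText (Wrap Int) where
  parseText s = Wrap <$> lookup s [("one", 1), ("two", 2)]

main :: IO ()
main = print (unWrap <$> (parseText "two" :: Maybe (Wrap Int)))
-- prints Just 2
```

The caller unwraps with the field accessor (`fromIntrospection` in the real code, `unWrap` here) right after parsing, so the wrapper never leaks into the rest of the program.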
@@ -153,7 +127,7 @@ instance J.FromJSON (FromIntrospection G.Description) where
 
 instance J.FromJSON (FromIntrospection G.ScalarTypeDefinition) where
   parseJSON = J.withObject "ScalarTypeDefinition" $ \o -> do
-    kind <- o .: "kind"
+    kind <- o .: "kind"
     name <- o .: "name"
     desc <- o .:? "description"
     when (kind /= "SCALAR") $ kindErr kind "scalar"
@@ -163,14 +137,13 @@ instance J.FromJSON (FromIntrospection G.ScalarTypeDefinition) where
 
 instance J.FromJSON (FromIntrospection G.ObjectTypeDefinition) where
   parseJSON = J.withObject "ObjectTypeDefinition" $ \o -> do
-    kind <- o .: "kind"
-    name <- o .: "name"
-    desc <- o .:? "description"
-    fields <- o .:? "fields"
-    interfaces <- o .:? "interfaces"
+    kind <- o .: "kind"
+    name <- o .: "name"
+    desc <- o .:? "description"
+    fields <- o .:? "fields"
+    interfaces :: Maybe [FromIntrospection (G.InterfaceTypeDefinition [G.Name])] <- o .:? "interfaces"
     when (kind /= "OBJECT") $ kindErr kind "object"
-    let implIfaces = map (G.NamedType . G._itdName) $
-                     maybe [] (fmap fromIntrospection) interfaces
+    let implIfaces = map G._itdName $ maybe [] (fmap fromIntrospection) interfaces
         flds = maybe [] (fmap fromIntrospection) fields
         desc' = fmap fromIntrospection desc
         r = G.ObjectTypeDefinition desc' name implIfaces [] flds
@@ -196,8 +169,7 @@ instance J.FromJSON (FromIntrospection G.GType) where
       ("NON_NULL", _, Just typ) -> return $ mkNotNull (fromIntrospection typ)
       ("NON_NULL", _, Nothing) -> pErr "NON_NULL should have `ofType`"
       ("LIST", _, Just typ) ->
-        return $ G.TypeList (G.Nullability True)
-                 (G.ListType $ fromIntrospection typ)
+        return $ G.TypeList (G.Nullability True) (fromIntrospection typ)
       ("LIST", _, Nothing) -> pErr "LIST should have `ofType`"
       (_, Just name, _) -> return $ G.TypeNamed (G.Nullability True) name
       _ -> pErr $ "kind: " <> kind <> " should have name"
@@ -208,7 +180,6 @@ instance J.FromJSON (FromIntrospection G.GType) where
       G.TypeList _ ty -> G.TypeList (G.Nullability False) ty
       G.TypeNamed _ n -> G.TypeNamed (G.Nullability False) n
 
-
 instance J.FromJSON (FromIntrospection G.InputValueDefinition) where
   parseJSON = J.withObject "InputValueDefinition" $ \o -> do
     name <- o .: "name"
@@ -220,20 +191,25 @@ instance J.FromJSON (FromIntrospection G.InputValueDefinition) where
       r = G.InputValueDefinition desc' name (fromIntrospection _type) defVal'
     return $ FromIntrospection r
 
-instance J.FromJSON (FromIntrospection G.ValueConst) where
-  parseJSON = J.withText "defaultValue" $ \t -> fmap FromIntrospection
-    $ either (fail . T.unpack) return $ G.parseValueConst t
+instance J.FromJSON (FromIntrospection (G.Value Void)) where
+  parseJSON = J.withText "Value Void" $ \t ->
+    let parseValueConst = G.runParser G.value
+    in fmap FromIntrospection $ either (fail . T.unpack) return $ parseValueConst t
 
-instance J.FromJSON (FromIntrospection G.InterfaceTypeDefinition) where
+instance J.FromJSON (FromIntrospection (G.InterfaceTypeDefinition [G.Name])) where
   parseJSON = J.withObject "InterfaceTypeDefinition" $ \o -> do
     kind <- o .: "kind"
     name <- o .: "name"
     desc <- o .:? "description"
     fields <- o .:? "fields"
+    possibleTypes :: Maybe [FromIntrospection G.ObjectTypeDefinition] <- o .:? "possibleTypes"
     let flds = maybe [] (fmap fromIntrospection) fields
         desc' = fmap fromIntrospection desc
+        possTps = map G._otdName $ maybe [] (fmap fromIntrospection) possibleTypes
     when (kind /= "INTERFACE") $ kindErr kind "interface"
-    let r = G.InterfaceTypeDefinition desc' name [] flds
+    -- TODO (non PDV) track which interfaces implement which other interfaces, after a
+    -- GraphQL spec > Jun 2018 is released.
+    let r = G.InterfaceTypeDefinition desc' name [] flds possTps
     return $ FromIntrospection r
 
 instance J.FromJSON (FromIntrospection G.UnionTypeDefinition) where
@@ -242,11 +218,10 @@ instance J.FromJSON (FromIntrospection G.UnionTypeDefinition) where
     name <- o .: "name"
     desc <- o .:? "description"
     possibleTypes <- o .: "possibleTypes"
-    let memberTys = map (G.NamedType . G._otdName) $
-                    fmap fromIntrospection possibleTypes
+    let possibleTypes' = map G._otdName $ fmap fromIntrospection possibleTypes
         desc' = fmap fromIntrospection desc
     when (kind /= "UNION") $ kindErr kind "union"
-    let r = G.UnionTypeDefinition desc' name [] memberTys
+    let r = G.UnionTypeDefinition desc' name [] possibleTypes'
     return $ FromIntrospection r
 
 instance J.FromJSON (FromIntrospection G.EnumTypeDefinition) where
@@ -280,7 +255,7 @@ instance J.FromJSON (FromIntrospection G.InputObjectTypeDefinition) where
     let r = G.InputObjectTypeDefinition desc' name [] inputFields
     return $ FromIntrospection r
 
-instance J.FromJSON (FromIntrospection G.TypeDefinition) where
+instance J.FromJSON (FromIntrospection (G.TypeDefinition [G.Name])) where
   parseJSON = J.withObject "TypeDefinition" $ \o -> do
     kind :: Text <- o .: "kind"
     r <- case kind of
@@ -299,12 +274,6 @@ instance J.FromJSON (FromIntrospection G.TypeDefinition) where
       _ -> pErr $ "unknown kind: " <> kind
     return $ FromIntrospection r
 
-type IntrospectionResult = ( G.SchemaDocument
-                           , G.NamedType
-                           , Maybe G.NamedType
-                           , Maybe G.NamedType
-                           )
-
 instance J.FromJSON (FromIntrospection IntrospectionResult) where
   parseJSON = J.withObject "SchemaDocument" $ \o -> do
     _data <- o .: "data"
@@ -328,23 +297,10 @@ instance J.FromJSON (FromIntrospection IntrospectionResult) where
       Just subsType -> do
         subRoot <- subsType .: "name"
         return $ Just subRoot
-    let r = ( G.SchemaDocument (fmap fromIntrospection types)
-            , queryRoot
-            , mutationRoot
-            , subsRoot
-            )
+    let r = IntrospectionResult (G.SchemaIntrospection (fmap fromIntrospection types))
+            queryRoot mutationRoot subsRoot
     return $ FromIntrospection r
 
-
-getNamedTyp :: G.TypeDefinition -> G.Name
-getNamedTyp ty = case ty of
-  G.TypeDefinitionScalar t -> G._stdName t
-  G.TypeDefinitionObject t -> G._otdName t
-  G.TypeDefinitionInterface t -> G._itdName t
-  G.TypeDefinitionUnion t -> G._utdName t
-  G.TypeDefinitionEnum t -> G._etdName t
-  G.TypeDefinitionInputObject t -> G._iotdName t
-
 execRemoteGQ'
   :: ( HasVersion
      , MonadIO m
@@ -385,7 +341,7 @@ execRemoteGQ' env manager userInfo reqHdrs q rsi opType = do
   resp <- either httpThrow return res
   pure (time, mkSetCookieHeaders resp, resp ^. Wreq.responseBody)
   where
-    RemoteSchemaInfo _ url hdrConf fwdClientHdrs timeout = rsi
+    RemoteSchemaInfo url hdrConf fwdClientHdrs timeout = rsi
     httpThrow :: (MonadError QErr m) => HTTP.HttpException -> m a
     httpThrow = \case
       HTTP.HttpExceptionRequest _req content -> throw500 $ T.pack . show $ content
@ -1,244 +0,0 @@
module Hasura.GraphQL.Resolve
  ( mutFldToTx

  , queryFldToPGAST
  , traverseQueryRootFldAST
  , UnresolvedVal(..)

  , AnnPGVal(..)
  , txtConverter

  , QueryRootFldAST(..)
  , QueryRootFldUnresolved
  , QueryRootFldResolved
  , toPGQuery
  , toSQLFromItem

  , RIntro.schemaR
  , RIntro.typeR
  ) where

import Data.Has
import Hasura.Session

import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Types as HTTP

import Hasura.EncJSON
import Hasura.GraphQL.Resolve.Context
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.Server.Version (HasVersion)
import Hasura.SQL.Types

import qualified Hasura.GraphQL.Resolve.Action as RA
import qualified Hasura.GraphQL.Resolve.Insert as RI
import qualified Hasura.GraphQL.Resolve.Introspect as RIntro
import qualified Hasura.GraphQL.Resolve.Mutation as RM
import qualified Hasura.GraphQL.Resolve.Select as RS
import qualified Hasura.GraphQL.Schema.Common as GS
import qualified Hasura.GraphQL.Validate as V
import qualified Hasura.Logging as L
import qualified Hasura.RQL.DML.RemoteJoin as RR
import qualified Hasura.RQL.DML.Select as DS
import qualified Hasura.SQL.DML as S
import qualified Hasura.Tracing as Tracing

data QueryRootFldAST v
  = QRFNode !(DS.AnnSimpleSelG v)
  | QRFPk !(DS.AnnSimpleSelG v)
  | QRFSimple !(DS.AnnSimpleSelG v)
  | QRFAgg !(DS.AnnAggregateSelectG v)
  | QRFConnection !(DS.ConnectionSelect v)
  | QRFActionSelect !(DS.AnnSimpleSelG v)
  | QRFActionExecuteObject !(DS.AnnSimpleSelG v)
  | QRFActionExecuteList !(DS.AnnSimpleSelG v)
  deriving (Show, Eq)

type QueryRootFldUnresolved = QueryRootFldAST UnresolvedVal
type QueryRootFldResolved = QueryRootFldAST S.SQLExp

traverseQueryRootFldAST
  :: (Applicative f)
  => (a -> f b)
  -> QueryRootFldAST a
  -> f (QueryRootFldAST b)
traverseQueryRootFldAST f = \case
  QRFNode s                -> QRFNode <$> DS.traverseAnnSimpleSelect f s
  QRFPk s                  -> QRFPk <$> DS.traverseAnnSimpleSelect f s
  QRFSimple s              -> QRFSimple <$> DS.traverseAnnSimpleSelect f s
  QRFAgg s                 -> QRFAgg <$> DS.traverseAnnAggregateSelect f s
  QRFActionSelect s        -> QRFActionSelect <$> DS.traverseAnnSimpleSelect f s
  QRFActionExecuteObject s -> QRFActionExecuteObject <$> DS.traverseAnnSimpleSelect f s
  QRFActionExecuteList s   -> QRFActionExecuteList <$> DS.traverseAnnSimpleSelect f s
  QRFConnection s          -> QRFConnection <$> DS.traverseConnectionSelect f s

toPGQuery :: QueryRootFldResolved -> (Q.Query, Maybe RR.RemoteJoins)
toPGQuery = \case
  QRFNode s                -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
  QRFPk s                  -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
  QRFSimple s              -> first (toQuery . DS.mkSQLSelect DS.JASMultipleRows) $ RR.getRemoteJoins s
  QRFAgg s                 -> first (toQuery . DS.mkAggregateSelect) $ RR.getRemoteJoinsAggregateSelect s
  QRFActionSelect s        -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
  QRFActionExecuteObject s -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
  QRFActionExecuteList s   -> first (toQuery . DS.mkSQLSelect DS.JASMultipleRows) $ RR.getRemoteJoins s
  QRFConnection s          -> first (toQuery . DS.mkConnectionSelect) $ RR.getRemoteJoinsConnectionSelect s
  where
    toQuery :: ToSQL a => a -> Q.Query
    toQuery = Q.fromBuilder . toSQL

validateHdrs
  :: (Foldable t, QErrM m) => UserInfo -> t Text -> m ()
validateHdrs userInfo hdrs = do
  let receivedVars = _uiSession userInfo
  forM_ hdrs $ \hdr ->
    unless (isJust $ getSessionVariableValue (mkSessionVariable hdr) receivedVars) $
      throw400 NotFound $ hdr <<> " header is expected but not found"

queryFldToPGAST
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has UserInfo r
     , Has QueryCtxMap r
     , Has (L.Logger L.Hasura) r
     , HasVersion
     , MonadIO m
     , Tracing.MonadTrace m
     )
  => Env.Environment
  -> V.Field
  -> RA.QueryActionExecuter
  -> m QueryRootFldUnresolved
queryFldToPGAST env fld actionExecuter = do
  opCtx <- getOpCtx $ V._fName fld
  userInfo <- asks getter
  case opCtx of
    QCNodeSelect nodeSelectMap -> do
      NodeIdV1 (V1NodeId table columnValues) <- RS.resolveNodeId fld
      case Map.lookup (GS.mkTableTy table) nodeSelectMap of
        Nothing -> throwVE $ "table " <> table <<> " not found"
        Just (selOpCtx, pkeyColumns) -> do
          validateHdrs userInfo (_socHeaders selOpCtx)
          QRFNode <$> RS.convertNodeSelect selOpCtx pkeyColumns columnValues fld
    QCSelect ctx -> do
      validateHdrs userInfo (_socHeaders ctx)
      QRFSimple <$> RS.convertSelect ctx fld
    QCSelectPkey ctx -> do
      validateHdrs userInfo (_spocHeaders ctx)
      QRFPk <$> RS.convertSelectByPKey ctx fld
    QCSelectAgg ctx -> do
      validateHdrs userInfo (_socHeaders ctx)
      QRFAgg <$> RS.convertAggSelect ctx fld
    QCFuncQuery ctx -> do
      validateHdrs userInfo (_fqocHeaders ctx)
      QRFSimple <$> RS.convertFuncQuerySimple ctx fld
    QCFuncAggQuery ctx -> do
      validateHdrs userInfo (_fqocHeaders ctx)
      QRFAgg <$> RS.convertFuncQueryAgg ctx fld
    QCAsyncActionFetch ctx ->
      QRFActionSelect <$> RA.resolveAsyncActionQuery userInfo ctx fld
    QCAction ctx -> do
      -- query actions should not be marked reusable because we aren't
      -- capturing the variable value in the state as re-usable variables.
      -- The variables captured in non-action queries are used to generate
      -- an SQL query, but in case of query actions it's converted into JSON
      -- and included in the action's webhook payload.
      markNotReusable
      let jsonAggType = RA.mkJsonAggSelect $ _saecOutputType ctx
          f = case jsonAggType of
            DS.JASMultipleRows -> QRFActionExecuteList
            DS.JASSingleObject -> QRFActionExecuteObject
      f <$> actionExecuter (RA.resolveActionQuery env fld ctx (_uiSession userInfo))
    QCSelectConnection pk ctx -> do
      validateHdrs userInfo (_socHeaders ctx)
      QRFConnection <$> RS.convertConnectionSelect pk ctx fld
    QCFuncConnection pk ctx -> do
      validateHdrs userInfo (_fqocHeaders ctx)
      QRFConnection <$> RS.convertConnectionFuncQuery pk ctx fld

mutFldToTx
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has UserInfo r
     , Has MutationCtxMap r
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has InsCtxMap r
     , Has HTTP.Manager r
     , Has [HTTP.Header] r
     , Has (L.Logger L.Hasura) r
     , MonadIO m
     , Tracing.MonadTrace m
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> V.Field
  -> m (tx EncJSON, HTTP.ResponseHeaders)
mutFldToTx env fld = do
  userInfo <- asks getter
  reqHeaders <- asks getter
  httpManager <- asks getter
  let rjCtx = (httpManager, reqHeaders, userInfo)
  opCtx <- getOpCtx $ V._fName fld
  let noRespHeaders = fmap (,[])
      roleName = _uiRole userInfo
  case opCtx of
    MCInsert ctx -> do
      validateHdrs userInfo (_iocHeaders ctx)
      noRespHeaders $ RI.convertInsert env rjCtx roleName (_iocTable ctx) fld
    MCInsertOne ctx -> do
      validateHdrs userInfo (_iocHeaders ctx)
      noRespHeaders $ RI.convertInsertOne env rjCtx roleName (_iocTable ctx) fld
    MCUpdate ctx -> do
      validateHdrs userInfo (_uocHeaders ctx)
      noRespHeaders $ RM.convertUpdate env ctx rjCtx fld
    MCUpdateByPk ctx -> do
      validateHdrs userInfo (_uocHeaders ctx)
      noRespHeaders $ RM.convertUpdateByPk env ctx rjCtx fld
    MCDelete ctx -> do
      validateHdrs userInfo (_docHeaders ctx)
      noRespHeaders $ RM.convertDelete env ctx rjCtx fld
    MCDeleteByPk ctx -> do
      validateHdrs userInfo (_docHeaders ctx)
      noRespHeaders $ RM.convertDeleteByPk env ctx rjCtx fld
    MCAction ctx ->
      RA.resolveActionMutation env fld ctx userInfo

getOpCtx
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has (OpCtxMap a) r
     )
  => G.Name -> m a
getOpCtx f = do
  opCtxMap <- asks getter
  onNothing (Map.lookup f opCtxMap) $ throw500 $
    "lookup failed: opctx: " <> showName f

toSQLFromItem :: S.Alias -> QueryRootFldResolved -> S.FromItem
toSQLFromItem alias = \case
  QRFNode s                -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
  QRFPk s                  -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
  QRFSimple s              -> fromSelect $ DS.mkSQLSelect DS.JASMultipleRows s
  QRFAgg s                 -> fromSelect $ DS.mkAggregateSelect s
  QRFActionSelect s        -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
  QRFActionExecuteObject s -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
  QRFActionExecuteList s   -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
  QRFConnection s          -> flip (S.FISelectWith (S.Lateral False)) alias
                              $ DS.mkConnectionSelect s
  where
    fromSelect = flip (S.FISelect (S.Lateral False)) alias
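`QueryRootFldAST` / `traverseQueryRootFldAST` above is an instance of a recurring pattern in this refactor: an AST parameterized by its leaf value type, with an `Applicative` traversal that resolves every leaf, turning an unresolved tree into a resolved one in a single pass. The pattern in isolation (toy `QueryAST` and resolver are mine, not the engine's types):

```haskell
{-# LANGUAGE LambdaCase #-}

-- Toy stand-in for QueryRootFldAST: a query AST parameterized by the
-- type of its leaf values (unresolved references vs. resolved SQL values).
data QueryAST v
  = SimpleSelect v
  | AggregateSelect v
  deriving (Show, Eq)

-- Mirrors traverseQueryRootFldAST: push an effectful resolver through
-- every leaf, producing QueryAST b inside the effect f.
traverseQueryAST :: Applicative f => (a -> f b) -> QueryAST a -> f (QueryAST b)
traverseQueryAST f = \case
  SimpleSelect v    -> SimpleSelect <$> f v
  AggregateSelect v -> AggregateSelect <$> f v

-- Example resolver: a session-variable leaf either resolves or fails,
-- and the failure propagates through the whole traversal.
resolveLeaf :: String -> Either String Int
resolveLeaf "x-hasura-user-id" = Right 42
resolveLeaf v                  = Left ("unknown variable: " <> v)

main :: IO ()
main = do
  print (traverseQueryAST resolveLeaf (SimpleSelect "x-hasura-user-id"))
  print (traverseQueryAST resolveLeaf (AggregateSelect "bogus"))
```

Because the traversal is `Applicative`, the same AST definition serves both `QueryRootFldUnresolved` and `QueryRootFldResolved`, exactly as the two type synonyms above do.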
@ -1,211 +0,0 @@
module Hasura.GraphQL.Resolve.BoolExp
  ( parseBoolExp
  , pgColValToBoolExp
  ) where

import Data.Has
import Hasura.Prelude

import qualified Data.HashMap.Strict as Map
import qualified Data.HashMap.Strict.InsOrd as OMap
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Context
import Hasura.GraphQL.Resolve.InputValue
import Hasura.GraphQL.Validate.Types
import Hasura.RQL.Types
import Hasura.SQL.Types
import Hasura.SQL.Value

import qualified Hasura.SQL.DML as S

type OpExp = OpExpG UnresolvedVal

parseOpExps :: (MonadReusability m, MonadError QErr m) => PGColumnType -> AnnInpVal -> m [OpExp]
parseOpExps colTy annVal = do
  opExpsM <- flip withObjectM annVal $ \nt objM -> forM objM $ \obj ->
    forM (OMap.toList obj) $ \(k, v) ->
      case k of
        "_cast" -> fmap ACast <$> parseCastExpression v

        "_eq"      -> fmap (AEQ True) <$> asOpRhs v
        "_ne"      -> fmap (ANE True) <$> asOpRhs v
        "_neq"     -> fmap (ANE True) <$> asOpRhs v
        "_is_null" -> resolveIsNull v

        "_in"  -> fmap AIN <$> asPGArray colTy v
        "_nin" -> fmap ANIN <$> asPGArray colTy v

        "_gt"  -> fmap AGT <$> asOpRhs v
        "_lt"  -> fmap ALT <$> asOpRhs v
        "_gte" -> fmap AGTE <$> asOpRhs v
        "_lte" -> fmap ALTE <$> asOpRhs v

        "_like"  -> fmap ALIKE <$> asOpRhs v
        "_nlike" -> fmap ANLIKE <$> asOpRhs v

        "_ilike"  -> fmap AILIKE <$> asOpRhs v
        "_nilike" -> fmap ANILIKE <$> asOpRhs v

        "_similar"  -> fmap ASIMILAR <$> asOpRhs v
        "_nsimilar" -> fmap ANSIMILAR <$> asOpRhs v

        -- jsonb related operators
        "_contains"     -> fmap AContains <$> asOpRhs v
        "_contained_in" -> fmap AContainedIn <$> asOpRhs v
        "_has_key"      -> fmap AHasKey <$> asOpRhs v

        "_has_keys_any" -> fmap AHasKeysAny <$> asPGArray (PGColumnScalar PGText) v
        "_has_keys_all" -> fmap AHasKeysAll <$> asPGArray (PGColumnScalar PGText) v

        -- geometry/geography type related operators
        "_st_contains"   -> fmap ASTContains <$> asOpRhs v
        "_st_crosses"    -> fmap ASTCrosses <$> asOpRhs v
        "_st_equals"     -> fmap ASTEquals <$> asOpRhs v
        "_st_intersects" -> fmap ASTIntersects <$> asOpRhs v
        "_st_overlaps"   -> fmap ASTOverlaps <$> asOpRhs v
        "_st_touches"    -> fmap ASTTouches <$> asOpRhs v
        "_st_within"     -> fmap ASTWithin <$> asOpRhs v
        "_st_d_within"   -> parseAsObjectM v parseAsSTDWithinObj

        -- raster type related operators
        "_st_intersects_rast"       -> fmap ASTIntersectsRast <$> asOpRhs v
        "_st_intersects_nband_geom" -> parseAsObjectM v parseAsSTIntersectsNbandGeomObj
        "_st_intersects_geom_nband" -> parseAsObjectM v parseAsSTIntersectsGeomNbandObj

        _ ->
          throw500
            $ "unexpected operator found in opexp of "
            <> showNamedTy nt
            <> ": "
            <> showName k
  return $ catMaybes $ fromMaybe [] opExpsM
  where
    asOpRhs = fmap (fmap mkParameterizablePGValue) . asPGColumnValueM

    parseAsObjectM v f = asObjectM v >>= mapM f

    asPGArray rhsTy v = do
      valsM <- parseMany (openOpaqueValue <=< asPGColumnValue) v
      forM valsM $ \vals -> do
        let arrayExp = S.SEArray $ map (txtEncoder . pstValue . _apvValue) vals
        return $ UVSQL $ S.SETyAnn arrayExp $ S.mkTypeAnn $
          -- Safe here because asPGColumnValue ensured all the values are of the right type, but if the
          -- list is empty, we don’t actually have a scalar type to use, so we need to use
          -- unsafePGColumnToRepresentation to create it. (It would be nice to refactor things to
          -- somehow get rid of this.)
          PGTypeArray (unsafePGColumnToRepresentation rhsTy)

    resolveIsNull v = asPGColumnValueM v >>= traverse openOpaqueValue >>= \case
      Nothing -> pure Nothing
      Just annPGVal -> case pstValue $ _apvValue annPGVal of
        PGValBoolean b -> pure . Just $ bool ANISNOTNULL ANISNULL b
        _ -> throw500 "boolean value is expected"

    parseAsSTDWithinObj obj = do
      distanceVal <- onNothing (OMap.lookup "distance" obj) $
        throw500 "expected \"distance\" input field in st_d_within"
      dist <- mkParameterizablePGValue <$> asPGColumnValue distanceVal
      fromVal <- onNothing (OMap.lookup "from" obj) $
        throw500 "expected \"from\" input field in st_d_within"
      from <- mkParameterizablePGValue <$> asPGColumnValue fromVal
      case colTy of
        PGColumnScalar PGGeography -> do
          useSpheroidVal <-
            onNothing (OMap.lookup "use_spheroid" obj) $
              throw500 "expected \"use_spheroid\" input field in st_d_within"
          useSpheroid <- mkParameterizablePGValue <$> asPGColumnValue useSpheroidVal
          return $ ASTDWithinGeog $ DWithinGeogOp dist from useSpheroid
        PGColumnScalar PGGeometry ->
          return $ ASTDWithinGeom $ DWithinGeomOp dist from
        _ -> throw500 "expected PGGeometry/PGGeography column for st_d_within"

    parseAsSTIntersectsNbandGeomObj obj = do
      nbandVal <- onNothing (OMap.lookup "nband" obj) $
        throw500 "expected \"nband\" input field"
      nband <- mkParameterizablePGValue <$> asPGColumnValue nbandVal
      geommin <- parseGeommin obj
      return $ ASTIntersectsNbandGeom $ STIntersectsNbandGeommin nband geommin

    parseAsSTIntersectsGeomNbandObj obj = do
      nbandMM <- fmap (fmap mkParameterizablePGValue) <$>
        traverse asPGColumnValueM (OMap.lookup "nband" obj)
      geommin <- parseGeommin obj
      return $ ASTIntersectsGeomNband $ STIntersectsGeomminNband geommin $ join nbandMM

    parseGeommin obj = do
      geomminVal <- onNothing (OMap.lookup "geommin" obj) $
        throw500 "expected \"geommin\" input field"
      mkParameterizablePGValue <$> asPGColumnValue geomminVal

parseCastExpression
  :: (MonadReusability m, MonadError QErr m)
  => AnnInpVal -> m (Maybe (CastExp UnresolvedVal))
parseCastExpression =
  withObjectM $ \_ objM -> forM objM $ \obj -> do
    targetExps <- forM (OMap.toList obj) $ \(targetTypeName, castedComparisonExpressionInput) -> do
      let targetType = textToPGScalarType $ G.unName targetTypeName
      castedComparisonExpressions <- parseOpExps (PGColumnScalar targetType) castedComparisonExpressionInput
      return (targetType, castedComparisonExpressions)
    return $ Map.fromList targetExps

parseColExp
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     )
  => G.NamedType -> G.Name -> AnnInpVal
  -> m (AnnBoolExpFld UnresolvedVal)
parseColExp nt n val = do
  fldInfo <- getFldInfo nt n
  case fldInfo of
    RFPGColumn pgColInfo -> do
      opExps <- parseOpExps (pgiType pgColInfo) val
      return $ AVCol pgColInfo opExps
    RFRelationship (RelationshipField relInfo _ _ permExp _) -> do
      relBoolExp <- parseBoolExp val
      return $ AVRel relInfo $ andAnnBoolExps relBoolExp $
        fmapAnnBoolExp partialSQLExpToUnresolvedVal permExp
    RFComputedField _ -> throw500
      "computed fields are not allowed in bool_exp"
    RFRemoteRelationship _ -> throw500
      "remote relationships are not allowed in bool_exp"
    RFNodeId _ _ -> throw500
      "node id is not allowed in bool_exp"

parseBoolExp
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     )
  => AnnInpVal -> m (AnnBoolExp UnresolvedVal)
parseBoolExp annGVal = do
  boolExpsM <-
    flip withObjectM annGVal
    $ \nt objM -> forM objM $ \obj -> forM (OMap.toList obj) $ \(k, v) -> if
      | k == "_or"  -> BoolOr . fromMaybe []
                       <$> parseMany parseBoolExp v
      | k == "_and" -> BoolAnd . fromMaybe []
                       <$> parseMany parseBoolExp v
      | k == "_not" -> BoolNot <$> parseBoolExp v
      | otherwise   -> BoolFld <$> parseColExp nt k v
  return $ BoolAnd $ fromMaybe [] boolExpsM

type PGColValMap = Map.HashMap G.Name AnnInpVal

pgColValToBoolExp
  :: (MonadReusability m, MonadError QErr m)
  => PGColArgMap -> PGColValMap -> m AnnBoolExpUnresolved
pgColValToBoolExp colArgMap colValMap = do
  colExps <- forM colVals $ \(name, val) ->
    BoolFld <$> do
      opExp <- AEQ True . mkParameterizablePGValue <$> asPGColumnValue val
      colInfo <- onNothing (Map.lookup name colArgMap) $
        throw500 $ "column name " <> showName name
        <> " not found in column arguments map"
      return $ AVCol colInfo [opExp]
  return $ BoolAnd colExps
  where
    colVals = Map.toList colValMap
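`parseBoolExp` above recursively folds a GraphQL input object into a boolean-expression AST: `_and`, `_or`, and `_not` keys recurse, every other key is treated as a column predicate, and the top-level object is an implicit conjunction. The shape of that recursion, stripped of the engine's types (toy `Val`/`BoolExp` are mine):

```haskell
{-# LANGUAGE LambdaCase #-}

-- Toy boolean-expression AST, mirroring BoolAnd/BoolOr/BoolNot/BoolFld.
data BoolExp
  = BAnd [BoolExp]
  | BOr [BoolExp]
  | BNot BoolExp
  | BFld String Int     -- toy column predicate: column == value
  deriving (Show, Eq)

-- Toy parsed input value, standing in for an AnnInpVal object/list/scalar.
data Val = VObj [(String, Val)] | VList [Val] | VInt Int

parseBoolExp :: Val -> Either String BoolExp
parseBoolExp = \case
  VObj kvs -> BAnd <$> mapM parsePair kvs   -- top level is an implicit AND
  _        -> Left "expected an object"
  where
    parsePair (k, v)
      | k == "_and" = BAnd <$> parseList v
      | k == "_or"  = BOr <$> parseList v
      | k == "_not" = BNot <$> parseBoolExp v
      | otherwise   = case v of             -- any other key: column predicate
          VInt n -> Right (BFld k n)
          _      -> Left ("expected a scalar for column " <> k)
    parseList (VList vs) = mapM parseBoolExp vs
    parseList _          = Left "expected a list"

main :: IO ()
main = print $ parseBoolExp $
  VObj [("_or", VList [VObj [("id", VInt 1)], VObj [("id", VInt 2)]])]
```

The real parser differs only in plumbing: it runs in a `MonadReader`/`MonadError` stack, looks each non-special key up in the `FieldMap`, and also handles relationships via `parseColExp`.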
@ -1,151 +0,0 @@
module Hasura.GraphQL.Resolve.Context
  ( FunctionArgItem(..)
  , OrdByItem(..)
  , UpdPermForIns(..)
  , InsCtx(..)
  , RespTx
  , LazyRespTx
  , AnnPGVal(..)
  , UnresolvedVal(..)
  , resolveValTxt
  , InsertTxConflictCtx(..)
  , getFldInfo
  , getPGColInfo
  , getArg
  , withArg
  , withArgM
  , nameAsPath

  , PrepArgs
  , prepare
  , prepareColVal
  , withPrepArgs

  , txtConverter

  , traverseObjectSelectionSet
  , fieldAsPath
  , resolvePGCol
  , module Hasura.GraphQL.Utils
  , module Hasura.GraphQL.Resolve.Types
  ) where

import Data.Has
import Hasura.Prelude

import qualified Data.HashMap.Strict as Map
import qualified Data.Sequence as Seq
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Utils
import Hasura.GraphQL.Validate.SelectionSet
import Hasura.GraphQL.Validate.Types
import Hasura.RQL.DML.Internal (currentSession, sessVarFromCurrentSetting)
import Hasura.RQL.Types
import Hasura.SQL.Types
import Hasura.SQL.Value

import qualified Hasura.SQL.DML as S

getFldInfo
  :: (MonadError QErr m, MonadReader r m, Has FieldMap r)
  => G.NamedType -> G.Name
  -> m ResolveField
getFldInfo nt n = do
  fldMap <- asks getter
  onNothing (Map.lookup (nt, n) fldMap) $
    throw500 $ "could not lookup " <> showName n <> " in " <>
    showNamedTy nt

getPGColInfo
  :: (MonadError QErr m, MonadReader r m, Has FieldMap r)
  => G.NamedType -> G.Name -> m PGColumnInfo
getPGColInfo nt n = do
  fldInfo <- getFldInfo nt n
  case fldInfo of
    RFPGColumn pgColInfo   -> return pgColInfo
    RFRelationship _       -> throw500 $ mkErrMsg "relation"
    RFComputedField _      -> throw500 $ mkErrMsg "computed field"
    RFRemoteRelationship _ -> throw500 $ mkErrMsg "remote relationship"
    RFNodeId _ _           -> throw500 $ mkErrMsg "node id"
  where
    mkErrMsg ty =
      "found " <> ty <> " when expecting pgcolinfo for "
      <> showNamedTy nt <> ":" <> showName n

getArg
  :: (MonadError QErr m)
  => ArgsMap
  -> G.Name
  -> m AnnInpVal
getArg args arg =
  onNothing (Map.lookup arg args) $
    throw500 $ "missing argument: " <> showName arg

prependArgsInPath
  :: (MonadError QErr m)
  => m a -> m a
prependArgsInPath = withPathK "args"

nameAsPath
  :: (MonadError QErr m)
  => G.Name -> m a -> m a
nameAsPath name = withPathK (G.unName name)

withArg
  :: (MonadError QErr m)
  => ArgsMap
  -> G.Name
  -> (AnnInpVal -> m a)
  -> m a
withArg args arg f = prependArgsInPath $ nameAsPath arg $
  getArg args arg >>= f

withArgM
  :: (MonadReusability m, MonadError QErr m)
  => ArgsMap
  -> G.Name
  -> (AnnInpVal -> m a)
  -> m (Maybe a)
withArgM args argName f = do
  wrappedArg <- for (Map.lookup argName args) $ \arg -> do
    when (isJust (_aivVariable arg) && G.isNullable (_aivType arg)) markNotReusable
    pure . bool (Just arg) Nothing $ hasNullVal (_aivValue arg)
  prependArgsInPath . nameAsPath argName $ traverse f (join wrappedArg)

type PrepArgs = Seq.Seq Q.PrepArg

prepare :: (MonadState PrepArgs m) => AnnPGVal -> m S.SQLExp
prepare (AnnPGVal _ _ scalarValue) = prepareColVal scalarValue

resolveValTxt :: (Applicative f) => UnresolvedVal -> f S.SQLExp
resolveValTxt = \case
  UVPG annPGVal         -> txtConverter annPGVal
  UVSessVar colTy sessVar -> sessVarFromCurrentSetting colTy sessVar
  UVSQL sqlExp          -> pure sqlExp
  UVSession             -> pure currentSession

withPrepArgs :: StateT PrepArgs m a -> m (a, PrepArgs)
withPrepArgs m = runStateT m Seq.empty

prepareColVal
  :: (MonadState PrepArgs m)
  => WithScalarType PGScalarValue -> m S.SQLExp
prepareColVal (WithScalarType scalarType colVal) = do
  preparedArgs <- get
  put (preparedArgs Seq.|> binEncoder colVal)
  return $ toPrepParam (Seq.length preparedArgs + 1) scalarType

txtConverter :: Applicative f => AnnPGVal -> f S.SQLExp
txtConverter (AnnPGVal _ _ scalarValue) = pure $ toTxtValue scalarValue

fieldAsPath :: (MonadError QErr m) => Field -> m a -> m a
fieldAsPath = nameAsPath . _fName

resolvePGCol :: (MonadError QErr m)
             => PGColGNameMap -> G.Name -> m PGColumnInfo
resolvePGCol colFldMap fldName =
  onNothing (Map.lookup fldName colFldMap) $ throw500 $
    "no column associated with name " <> G.unName fldName
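`prepareColVal` above is the classic prepared-statement accumulation: each column value is appended to a growing argument sequence and replaced in the SQL AST by a positional parameter whose index is the sequence's new length, so the generated SQL carries `$1`, `$2`, … instead of inlined values. A minimal sketch of that pattern, independent of `pg-client` (toy `String` values; uses `mtl` and `containers`):

```haskell
import Control.Monad.State (State, get, put, runState)
import qualified Data.Sequence as Seq

-- Append a value to the prepared-argument sequence and return the
-- positional placeholder ($1, $2, ...) that refers to it, exactly as
-- prepareColVal returns `toPrepParam (length + 1)`.
prepareValue :: String -> State (Seq.Seq String) String
prepareValue v = do
  args <- get
  put (args Seq.|> v)
  pure ("$" <> show (Seq.length args + 1))

main :: IO ()
main = do
  -- Mirrors withPrepArgs: run the accumulation starting from Seq.empty.
  let (placeholders, args) =
        runState (mapM prepareValue ["alice", "bob"]) Seq.empty
  print placeholders  -- ["$1","$2"]
  print args
```

The placeholders end up in the query text while the sequence is shipped alongside it as the prepared-argument list, which is what keeps user input out of the SQL string itself.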
@ -1,228 +0,0 @@
module Hasura.GraphQL.Resolve.InputValue
  ( withNotNull
  , tyMismatch

  , OpaqueValue
  , OpaquePGValue
  , mkParameterizablePGValue
  , openOpaqueValue
  , asPGColumnTypeAndValueM
  , asPGColumnValueM
  , asPGColumnValue

  , asScalarValM
  , asScalarVal
  , asEnumVal
  , asEnumValM
  , withObject
  , asObject
  , withObjectM
  , asObjectM
  , withArray
  , asArray
  , withArrayM
  , parseMany
  , asPGColText
  , asPGColTextM
  , annInpValueToJson
  ) where

import Hasura.Prelude

import qualified Text.Builder as TB

import qualified Language.GraphQL.Draft.Syntax as G

import qualified Data.Aeson as J
import qualified Hasura.RQL.Types as RQL

import Hasura.GraphQL.Resolve.Context
import Hasura.GraphQL.Validate.Types
import Hasura.RQL.Types
import Hasura.SQL.Types
import Hasura.SQL.Value

withNotNull
  :: (MonadError QErr m)
  => G.NamedType -> Maybe a -> m a
withNotNull nt v =
  onNothing v $ throw500 $
    "unexpected null for a value of type " <> showNamedTy nt

tyMismatch
  :: (MonadError QErr m) => Text -> AnnInpVal -> m a
tyMismatch expectedTy v =
  throw500 $ "expected " <> expectedTy <> ", found " <>
    getAnnInpValKind (_aivValue v) <> " for value of type " <>
    G.showGT (_aivType v)

-- | As part of query reusability tracking (see 'QueryReusability'), functions that parse input
-- values call 'markNotReusable' when the value comes from a variable. However, always calling
-- 'markNotReusable' when parsing column values (using 'asPGColumnValue' and its variants) would be
-- much too conservative: often the value is simply validated and wrapped immediately in 'UVPG',
-- which allows it to be parameterized over.
--
-- Always omitting the check would be incorrect, as some callers inspect the column values and use
-- them to generate different SQL, which is where 'OpaqueValue' comes in. Functions like
-- 'asPGColumnValue' return an 'OpaquePGValue', which can be safely converted to an 'UnresolvedVal'
-- via 'mkParameterizablePGValue' without marking the query as non-reusable. Other callers that wish
-- to inspect the value can instead call 'openOpaqueValue' to get the value out, and /that/ will
-- mark the query non-reusable, instead.
--
-- In other words, 'OpaqueValue' is a mechanism of delaying the 'markNotReusable' call until we’re
-- confident its value will actually affect the generated SQL.
data OpaqueValue a
  = OpaqueValue
  { _opgvValue      :: !a
  , _opgvIsVariable :: !Bool
  } deriving (Show)
type OpaquePGValue = OpaqueValue AnnPGVal

mkParameterizablePGValue :: OpaquePGValue -> UnresolvedVal
mkParameterizablePGValue (OpaqueValue v _) = UVPG v

openOpaqueValue :: (MonadReusability m) => OpaqueValue a -> m a
openOpaqueValue (OpaqueValue v isVariable) = when isVariable markNotReusable $> v
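The `OpaqueValue` mechanism documented above — wrap a value together with an "is it from a variable?" flag, and fire the side effect only when the value is actually inspected — can be sketched independently of the engine's monad stack. A toy `Writer`-based reusability effect stands in for the real `MonadReusability` (all names here are illustrative, not the engine's; uses `mtl`):

```haskell
import Control.Monad.Writer (Writer, runWriter, tell)
import Data.Monoid (Any (..))

-- Toy reusability effect: Any True means "the query is not reusable".
type Reusability = Writer Any

markNotReusable :: Reusability ()
markNotReusable = tell (Any True)

-- Mirrors OpaqueValue: the payload plus whether it came from a variable.
data Opaque a = Opaque { opValue :: a, opIsVariable :: Bool }

-- Mirrors mkParameterizablePGValue: use the value without inspecting it,
-- so reusability is untouched even for variable-sourced values.
parameterize :: Opaque a -> Reusability a
parameterize (Opaque v _) = pure v

-- Mirrors openOpaqueValue: inspecting the value marks the query
-- non-reusable iff it came from a variable.
openOpaque :: Opaque a -> Reusability a
openOpaque (Opaque v isVar) = do
  if isVar then markNotReusable else pure ()
  pure v

main :: IO ()
main = do
  let v = Opaque (1 :: Int) True
  -- Parameterizing keeps the query reusable (Any False)...
  print (getAny (snd (runWriter (parameterize v))))
  -- ...while opening it marks it non-reusable (Any True).
  print (getAny (snd (runWriter (openOpaque v))))
```

This is exactly the trade the doc comment describes: the `markNotReusable` call is delayed until the caller proves, by opening the value, that its contents will shape the generated SQL.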
asPGColumnTypeAndValueM
  :: (MonadReusability m, MonadError QErr m)
  => AnnInpVal
  -> m (PGColumnType, WithScalarType (Maybe (OpaqueValue PGScalarValue)))
asPGColumnTypeAndValueM v = do
  (columnType, scalarValueM) <- case _aivValue v of
    AGScalar colTy val -> pure (PGColumnScalar colTy, WithScalarType colTy val)
    AGEnum _ (AGEReference reference maybeValue) -> do
      let maybeScalarValue = PGValText . RQL.getEnumValue <$> maybeValue
      pure (PGColumnEnumReference reference, WithScalarType PGText maybeScalarValue)
    _ -> tyMismatch "pgvalue" v

  for_ (_aivVariable v) $ \variableName -> if
    -- If the value is a nullable variable, then the caller might make a different decision based on
    -- whether the result is 'Nothing' or 'Just', which would change the generated query, so we have
    -- to unconditionally mark the query non-reusable.
    | G.isNullable (_aivType v) -> markNotReusable
    | otherwise -> recordVariableUse variableName columnType

  let isVariable = isJust $ _aivVariable v
  pure (columnType, fmap (flip OpaqueValue isVariable) <$> scalarValueM)

asPGColumnTypeAndAnnValueM
  :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (PGColumnType, Maybe OpaquePGValue)
asPGColumnTypeAndAnnValueM v = do
  (columnType, scalarValueM) <- asPGColumnTypeAndValueM v
  let mkAnnPGColVal = AnnPGVal (_aivVariable v) (G.isNullable (_aivType v))
      replaceOpaqueValue (WithScalarType scalarType (OpaqueValue scalarValue isVariable)) =
        OpaqueValue (mkAnnPGColVal (WithScalarType scalarType scalarValue)) isVariable
  pure (columnType, replaceOpaqueValue <$> sequence scalarValueM)

asPGColumnValueM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe OpaquePGValue)
asPGColumnValueM = fmap snd . asPGColumnTypeAndAnnValueM

asPGColumnValue :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m OpaquePGValue
asPGColumnValue v = do
  (columnType, annPGValM) <- asPGColumnTypeAndAnnValueM v
  onNothing annPGValM $ throw500 ("unexpected null for type " <>> columnType)

openInputValue :: (MonadReusability m) => AnnInpVal -> m AnnGValue
openInputValue v = when (isJust $ _aivVariable v) markNotReusable $> _aivValue v

asScalarValM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> PGScalarType -> m (Maybe PGScalarValue)
asScalarValM v tp = openInputValue v >>= \case
  AGScalar tp' vM ->
    if tp == tp'
    then pure vM
    else tyMismatch "scalar" v
  _ -> tyMismatch "scalar" v

asScalarVal :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> PGScalarType -> m PGScalarValue
asScalarVal v tp = asScalarValM v tp >>= \case
  Just val -> pure val
  Nothing  -> throw500 $ "unexpected null for ty " <> TB.run (toSQL tp)

-- | Note: only handles “synthetic” enums (see 'EnumValuesInfo'). Enum table references are handled
-- by 'asPGColumnType' and its variants.
asEnumVal :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (G.NamedType, G.EnumValue)
asEnumVal = asEnumValM >=> \case
  (ty, Just val) -> pure (ty, val)
  (ty, Nothing)  -> throw500 $ "unexpected null for ty " <> showNamedTy ty

-- | Like 'asEnumVal', only handles “synthetic” enums.
asEnumValM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (G.NamedType, Maybe G.EnumValue)
asEnumValM v = openInputValue v >>= \case
  AGEnum ty (AGESynthetic valM) -> return (ty, valM)
  _ -> tyMismatch "enum" v

withObject
  :: (MonadReusability m, MonadError QErr m) => (G.NamedType -> AnnGObject -> m a) -> AnnInpVal -> m a
withObject fn v = openInputValue v >>= \case
  AGObject nt (Just obj) -> fn nt obj
  AGObject _ Nothing     ->
    throw500 $ "unexpected null for ty"
    <> G.showGT (_aivType v)
  _ -> tyMismatch "object" v

asObject :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m AnnGObject
asObject = withObject (\_ o -> return o)

withObjectM
  :: (MonadReusability m, MonadError QErr m)
  => (G.NamedType -> Maybe AnnGObject -> m a) -> AnnInpVal -> m a
withObjectM fn v = openInputValue v >>= \case
  AGObject nt objM -> fn nt objM
  _ -> tyMismatch "object" v

asObjectM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe AnnGObject)
asObjectM = withObjectM (\_ o -> return o)

withArrayM
  :: (MonadReusability m, MonadError QErr m)
  => (G.ListType -> Maybe [AnnInpVal] -> m a) -> AnnInpVal -> m a
withArrayM fn v = openInputValue v >>= \case
  AGArray lt listM -> fn lt listM
  _ -> tyMismatch "array" v

withArray
  :: (MonadReusability m, MonadError QErr m)
  => (G.ListType -> [AnnInpVal] -> m a) -> AnnInpVal -> m a
withArray fn v = openInputValue v >>= \case
  AGArray lt (Just l) -> fn lt l
  AGArray _ Nothing   -> throw500 $ "unexpected null for ty"
                         <> G.showGT (_aivType v)
  _ -> tyMismatch "array" v

asArray :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m [AnnInpVal]
asArray = withArray (\_ vals -> return vals)

parseMany
  :: (MonadReusability m, MonadError QErr m) => (AnnInpVal -> m a) -> AnnInpVal -> m (Maybe [a])
parseMany fn v = openInputValue v >>= \case
  AGArray _ arrM -> mapM (mapM fn) arrM
  _ -> tyMismatch "array" v

onlyText
  :: (MonadError QErr m)
  => PGScalarValue -> m Text
onlyText = \case
  PGValText t    -> return t
  PGValVarchar t -> return t
  _ -> throw500 "expecting text for asPGColText"

asPGColText :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m Text
asPGColText val = do
  pgColVal <- openOpaqueValue =<< asPGColumnValue val
  onlyText (pstValue $ _apvValue pgColVal)

asPGColTextM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe Text)
asPGColTextM val = do
  pgColValM <- traverse openOpaqueValue =<< asPGColumnValueM val
  traverse onlyText (pstValue . _apvValue <$> pgColValM)

annInpValueToJson :: AnnInpVal -> J.Value
annInpValueToJson annInpValue =
  case _aivValue annInpValue of
    AGScalar _ pgColumnValueM -> maybe J.Null pgScalarValueToJson pgColumnValueM
    AGEnum _ enumValue -> case enumValue of
|
||||
AGESynthetic enumValueM -> J.toJSON enumValueM
|
||||
AGEReference _ enumValueM -> J.toJSON enumValueM
|
||||
AGObject _ objectM -> J.toJSON $ fmap (fmap annInpValueToJson) objectM
|
||||
AGArray _ valuesM -> J.toJSON $ fmap (fmap annInpValueToJson) valuesM
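A minimal standalone sketch of the recursion in `annInpValueToJson`, using a toy `Value` type and illustrative constructors (not the real Hasura AST): each node converts a missing payload to JSON null and recurses structurally into objects and arrays.

```haskell
-- Toy sketch (simplified AST; constructor names are illustrative, not the
-- real Hasura ones): a Nothing payload becomes VNull, and object/array
-- children are converted recursively, mirroring annInpValueToJson above.
data Value = VNull | VStr String | VArr [Value] | VObj [(String, Value)]
  deriving (Show, Eq)

data AnnValue
  = AScalar (Maybe String)
  | AArray  (Maybe [AnnValue])
  | AObject (Maybe [(String, AnnValue)])

toValue :: AnnValue -> Value
toValue (AScalar m) = maybe VNull VStr m
toValue (AArray  m) = maybe VNull (VArr . map toValue) m
toValue (AObject m) = maybe VNull (VObj . map (fmap toValue)) m

main :: IO ()
main = print $ toValue $
  AObject (Just [("tags", AArray (Just [AScalar (Just "a"), AScalar Nothing]))])
```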
@@ -1,592 +0,0 @@
module Hasura.GraphQL.Resolve.Insert
  ( convertInsert
  , convertInsertOne
  )
where

import           Data.Has
import           Hasura.EncJSON
import           Hasura.Prelude
import           Hasura.Session

import qualified Data.Aeson                    as J
import qualified Data.Aeson.Casing             as J
import qualified Data.Aeson.TH                 as J
import qualified Data.Environment              as Env
import qualified Data.HashMap.Strict           as Map
import qualified Data.HashMap.Strict.InsOrd    as OMap
import qualified Data.Sequence                 as Seq
import qualified Data.Text                     as T
import qualified Database.PG.Query             as Q
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.RQL.DML.Insert         as RI
import qualified Hasura.RQL.DML.Returning      as RR
import qualified Hasura.SQL.DML                as S
import qualified Hasura.Tracing                as Tracing

import           Hasura.GraphQL.Resolve.BoolExp
import           Hasura.GraphQL.Resolve.Context
import           Hasura.GraphQL.Resolve.InputValue
import           Hasura.GraphQL.Resolve.Mutation
import           Hasura.GraphQL.Resolve.Select
import           Hasura.GraphQL.Validate.SelectionSet
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.DML.Insert         (insertOrUpdateCheckExpr)
import           Hasura.RQL.DML.Internal       (convAnnBoolExpPartialSQL, convPartialSQLExp,
                                                sessVarFromCurrentSetting)
import           Hasura.RQL.DML.Mutation
import           Hasura.RQL.DML.RemoteJoin
import           Hasura.RQL.GBoolExp           (toSQLBoolExp)
import           Hasura.RQL.Types
import           Hasura.Server.Version         (HasVersion)
import           Hasura.SQL.Types
import           Hasura.SQL.Value

type ColumnValuesText = ColumnValues TxtEncodedPGVal

newtype InsResp
  = InsResp
  { _irResponse :: Maybe J.Object
  } deriving (Show, Eq)
$(J.deriveJSON (J.aesonDrop 3 J.snakeCase) ''InsResp)

data AnnIns a
  = AnnIns
  { _aiInsObj         :: !a
  , _aiConflictClause :: !(Maybe RI.ConflictClauseP1)
  , _aiCheckCond      :: !(AnnBoolExpPartialSQL, Maybe AnnBoolExpPartialSQL)
  , _aiTableCols      :: ![PGColumnInfo]
  , _aiDefVals        :: !(Map.HashMap PGCol S.SQLExp)
  } deriving (Show, Eq, Functor, Foldable, Traversable)

type SingleObjIns = AnnIns AnnInsObj
type MultiObjIns  = AnnIns [AnnInsObj]

multiToSingles :: MultiObjIns -> [SingleObjIns]
multiToSingles = sequenceA

data RelIns a
  = RelIns
  { _riAnnIns  :: !a
  , _riRelInfo :: !RelInfo
  } deriving (Show, Eq)

type ObjRelIns = RelIns SingleObjIns
type ArrRelIns = RelIns MultiObjIns

type PGColWithValue = (PGCol, WithScalarType PGScalarValue)

data CTEExp
  = CTEExp
  { _iweExp      :: !S.CTE
  , _iwePrepArgs :: !(Seq.Seq Q.PrepArg)
  } deriving (Show, Eq)

data AnnInsObj
  = AnnInsObj
  { _aioColumns :: ![PGColWithValue]
  , _aioObjRels :: ![ObjRelIns]
  , _aioArrRels :: ![ArrRelIns]
  } deriving (Show, Eq)

mkAnnInsObj
  :: (MonadReusability m, MonadError QErr m, Has InsCtxMap r, MonadReader r m, Has FieldMap r)
  => RelationInfoMap
  -> PGColGNameMap
  -> AnnGObject
  -> m AnnInsObj
mkAnnInsObj relInfoMap allColMap annObj =
  foldrM (traverseInsObj relInfoMap allColMap) emptyInsObj $ OMap.toList annObj
  where
    emptyInsObj = AnnInsObj [] [] []

traverseInsObj
  :: (MonadReusability m, MonadError QErr m, Has InsCtxMap r, MonadReader r m, Has FieldMap r)
  => RelationInfoMap
  -> PGColGNameMap
  -> (G.Name, AnnInpVal)
  -> AnnInsObj
  -> m AnnInsObj
traverseInsObj rim allColMap (gName, annVal) defVal@(AnnInsObj cols objRels arrRels) =
  case _aivValue annVal of
    AGScalar{} -> parseValue
    AGEnum{}   -> parseValue
    _          -> parseObject
  where
    parseValue = do
      (_, WithScalarType scalarType maybeScalarValue) <- asPGColumnTypeAndValueM annVal
      columnInfo <- onNothing (Map.lookup gName allColMap) $
        throw500 "column not found in PGColGNameMap"
      let columnName = pgiColumn columnInfo
      scalarValue <- maybe (pure $ PGNull scalarType) openOpaqueValue maybeScalarValue
      pure $ AnnInsObj ((columnName, WithScalarType scalarType scalarValue):cols) objRels arrRels

    parseObject = do
      objM <- asObjectM annVal
      -- if the relational insert input is 'null', ignore it and
      -- return the default value
      fmap (fromMaybe defVal) $ forM objM $ \obj -> do
        let relNameM    = RelName <$> mkNonEmptyText (G.unName gName)
            onConflictM = OMap.lookup "on_conflict" obj
        relName <- onNothing relNameM $ throw500 "found empty GName String"
        dataVal <- onNothing (OMap.lookup "data" obj) $
          throw500 "\"data\" object not found"
        relInfo <- onNothing (Map.lookup relName rim) $
          throw500 $ "relation " <> relName <<> " not found"

        let rTable = riRTable relInfo
        InsCtx rtColMap checkCond rtDefVals rtRelInfoMap rtUpdPerm <- getInsCtx rTable
        let rtCols = Map.elems rtColMap
        rtDefValsRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) rtDefVals

        withPathK (G.unName gName) $ case riType relInfo of
          ObjRel -> do
            dataObj <- asObject dataVal
            annDataObj <- mkAnnInsObj rtRelInfoMap rtColMap dataObj
            ccM <- forM onConflictM $ parseOnConflict rTable rtUpdPerm rtColMap
            let singleObjIns = AnnIns annDataObj ccM (checkCond, rtUpdPerm >>= upfiCheck) rtCols rtDefValsRes
                objRelIns = RelIns singleObjIns relInfo
            return (AnnInsObj cols (objRelIns:objRels) arrRels)

          ArrRel -> do
            arrDataVals <- asArray dataVal
            let withNonEmptyArrData = do
                  annDataObjs <- forM arrDataVals $ \arrDataVal -> do
                    dataObj <- asObject arrDataVal
                    mkAnnInsObj rtRelInfoMap rtColMap dataObj
                  ccM <- forM onConflictM $ parseOnConflict rTable rtUpdPerm rtColMap
                  let multiObjIns = AnnIns annDataObjs ccM (checkCond, rtUpdPerm >>= upfiCheck) rtCols rtDefValsRes
                      arrRelIns = RelIns multiObjIns relInfo
                  return (AnnInsObj cols objRels (arrRelIns:arrRels))
            -- if the array relation insert input has no objects,
            -- ignore it and return the default value
            bool withNonEmptyArrData (return defVal) $ null arrDataVals

parseOnConflict
  :: (MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r)
  => QualifiedTable
  -> Maybe UpdPermForIns
  -> PGColGNameMap
  -> AnnInpVal
  -> m RI.ConflictClauseP1
parseOnConflict tn updFiltrM allColMap val = withPathK "on_conflict" $
  flip withObject val $ \_ obj -> do
    constraint <- RI.CTConstraint <$> parseConstraint obj
    updCols <- getUpdCols obj
    case updCols of
      [] -> return $ RI.CP1DoNothing $ Just constraint
      _  -> do
        UpdPermForIns _ _ updFiltr preSet <- onNothing updFiltrM $ throw500
          "cannot update columns since update permission is not defined"
        preSetRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) preSet
        updFltrRes <- traverseAnnBoolExp
          (convPartialSQLExp sessVarFromCurrentSetting)
          updFiltr
        whereExp <- parseWhereExp obj
        let updateBoolExp = toSQLBoolExp (S.mkQual tn) updFltrRes
            whereCondition = S.BEBin S.AndOp updateBoolExp whereExp
        return $ RI.CP1Update constraint updCols preSetRes whereCondition

  where
    getUpdCols o = do
      updColsVal <- onNothing (OMap.lookup "update_columns" o) $ throw500
        "\"update_columns\" argument is expected in \"on_conflict\" field"
      parseColumns allColMap updColsVal

    parseConstraint o = do
      v <- onNothing (OMap.lookup "constraint" o) $ throw500
        "\"constraint\" is expected, but not found"
      (_, enumVal) <- asEnumVal v
      return $ ConstraintName $ G.unName $ G.unEnumValue enumVal

    parseWhereExp =
      OMap.lookup "where"
      >>> traverse (parseBoolExp >=> traverse (traverse resolveValTxt))
      >>> fmap (maybe (S.BELit True) (toSQLBoolExp (S.mkQual tn)))

toSQLExps
  :: (MonadError QErr m, MonadState PrepArgs m)
  => [PGColWithValue]
  -> m [(PGCol, S.SQLExp)]
toSQLExps cols =
  forM cols $ \(c, v) -> do
    prepExp <- prepareColVal v
    return (c, prepExp)

mkSQLRow :: Map.HashMap PGCol S.SQLExp -> [(PGCol, S.SQLExp)] -> [S.SQLExp]
mkSQLRow defVals withPGCol = map snd $
  flip map (Map.toList defVals) $
    \(col, defVal) -> (col,) $ fromMaybe defVal $ Map.lookup col withPGColMap
  where
    withPGColMap = Map.fromList withPGCol
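A standalone sketch of the `mkSQLRow` idea (an assumption-laden simplification: plain assoc lists stand in for `Data.HashMap.Strict`, and `String` for the SQL expression type): every table column produces exactly one expression, preferring the user-supplied value and falling back to the column's default.

```haskell
import Data.Maybe (fromMaybe)

-- For each (column, default) pair, take the user-supplied expression when
-- present, otherwise the default -- the same merge mkSQLRow performs over
-- a HashMap of defaults.
mkRow :: [(String, String)] -> [(String, String)] -> [String]
mkRow defVals given =
  [ fromMaybe defVal (lookup col given) | (col, defVal) <- defVals ]

main :: IO ()
main = print $ mkRow [("id", "DEFAULT"), ("name", "DEFAULT")] [("name", "'alice'")]
```

Unlike the real `mkSQLRow`, the sketch preserves the input list's column order; the original iterates `Map.toList defVals`.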
mkInsertQ
  :: MonadError QErr m
  => QualifiedTable
  -> Maybe RI.ConflictClauseP1
  -> [PGColWithValue]
  -> Map.HashMap PGCol S.SQLExp
  -> RoleName
  -> (AnnBoolExpSQL, Maybe AnnBoolExpSQL)
  -> m CTEExp
mkInsertQ tn onConflictM insCols defVals role (insCheck, updCheck) = do
  (givenCols, args) <- flip runStateT Seq.Empty $ toSQLExps insCols
  let sqlConflict = RI.toSQLConflict <$> onConflictM
      sqlExps = mkSQLRow defVals givenCols
      valueExp = S.ValuesExp [S.TupleExp sqlExps]
      tableCols = Map.keys defVals
      sqlInsert =
        S.SQLInsert tn tableCols valueExp sqlConflict
          . Just
          $ S.RetExp
            [ S.selectStar
            , S.Extractor
                (insertOrUpdateCheckExpr tn onConflictM
                  (toSQLBoolExp (S.QualTable tn) insCheck)
                  (fmap (toSQLBoolExp (S.QualTable tn)) updCheck))
                Nothing
            ]

      adminIns = return (CTEExp (S.CTEInsert sqlInsert) args)
      nonAdminInsert = do
        let cteIns = S.CTEInsert sqlInsert
        return (CTEExp cteIns args)

  bool nonAdminInsert adminIns $ isAdmin role

fetchFromColVals
  :: MonadError QErr m
  => ColumnValuesText
  -> [PGColumnInfo]
  -> m [(PGCol, WithScalarType PGScalarValue)]
fetchFromColVals colVal reqCols =
  forM reqCols $ \ci -> do
    let valM = Map.lookup (pgiColumn ci) colVal
    val <- onNothing valM $ throw500 $ "column "
      <> pgiColumn ci <<> " not found in given colVal"
    pgColVal <- parseTxtEncodedPGValue (pgiType ci) val
    return (pgiColumn ci, pgColVal)

-- | Validate an insert object based on insert columns,
-- insert object relations, and additional columns from the parent.
validateInsert
  :: (MonadError QErr m)
  => [PGCol] -- ^ inserting columns
  -> [RelInfo] -- ^ object relation inserts
  -> [PGCol] -- ^ additional fields from parent
  -> m ()
validateInsert insCols objRels addCols = do
  -- validate insertCols
  unless (null insConflictCols) $ throwVE $
    "cannot insert " <> showPGCols insConflictCols
    <> " columns as their values are already being determined by parent insert"

  forM_ objRels $ \relInfo -> do
    let lCols = Map.keys $ riMapping relInfo
        relName = riName relInfo
        relNameTxt = relNameToTxt relName
        lColConflicts = lCols `intersect` (addCols <> insCols)
    withPathK relNameTxt $ unless (null lColConflicts) $ throwVE $
      "cannot insert object relationship " <> relName
      <<> " as " <> showPGCols lColConflicts
      <> " column values are already determined"
  where
    insConflictCols = insCols `intersect` addCols

-- | Insert an object relationship and return the affected rows
-- and parent-dependent columns.
insertObjRel
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> Bool
  -> MutationRemoteJoinCtx
  -> RoleName
  -> ObjRelIns
  -> m (Int, [PGColWithValue])
insertObjRel env strfyNum rjCtx role objRelIns =
  withPathK relNameTxt $ do
    (affRows, colValM) <- withPathK "data" $ insertObj env strfyNum rjCtx role tn singleObjIns []
    colVal <- onNothing colValM $ throw400 NotSupported errMsg
    retColsWithVals <- fetchFromColVals colVal rColInfos
    let c = mergeListsWith (Map.toList mapCols) retColsWithVals
          (\(_, rCol) (col, _) -> rCol == col)
          (\(lCol, _) (_, cVal) -> (lCol, cVal))
    return (affRows, c)
  where
    RelIns singleObjIns relInfo = objRelIns
    relName = riName relInfo
    relNameTxt = relNameToTxt relName
    mapCols = riMapping relInfo
    tn = riRTable relInfo
    allCols = _aiTableCols singleObjIns
    rCols = Map.elems mapCols
    rColInfos = getColInfos rCols allCols
    errMsg = "cannot proceed to insert object relation "
             <> relName <<> " since insert to table "
             <> tn <<> " affects zero rows"

decodeEncJSON :: (J.FromJSON a, QErrM m) => EncJSON -> m a
decodeEncJSON =
  either (throw500 . T.pack) decodeValue .
  J.eitherDecode . encJToLBS

-- | Insert an array relationship and return the affected rows.
insertArrRel
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> Bool
  -> MutationRemoteJoinCtx
  -> RoleName
  -> [PGColWithValue]
  -> ArrRelIns
  -> m Int
insertArrRel env strfyNum rjCtx role resCols arrRelIns =
  withPathK relNameTxt $ do
    let addCols = mergeListsWith resCols (Map.toList colMapping)
          (\(col, _) (lCol, _) -> col == lCol)
          (\(_, colVal) (_, rCol) -> (rCol, colVal))

    resBS <- insertMultipleObjects env strfyNum rjCtx role tn multiObjIns addCols mutOutput "data"
    resObj <- decodeEncJSON resBS
    onNothing (Map.lookup ("affected_rows" :: T.Text) resObj) $
      throw500 "affected_rows not returned in array rel insert"
  where
    RelIns multiObjIns relInfo = arrRelIns
    colMapping = riMapping relInfo
    tn = riRTable relInfo
    relNameTxt = relNameToTxt $ riName relInfo
    mutOutput = RR.MOutMultirowFields [("affected_rows", RR.MCount)]

-- | Insert an object along with its object and array relationships.
insertObj
  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
  => Env.Environment
  -> Bool
  -> MutationRemoteJoinCtx
  -> RoleName
  -> QualifiedTable
  -> SingleObjIns
  -> [PGColWithValue] -- ^ additional fields
  -> m (Int, Maybe ColumnValuesText)
insertObj env strfyNum rjCtx role tn singleObjIns addCols = do
  -- validate insert
  validateInsert (map fst cols) (map _riRelInfo objRels) $ map fst addCols

  -- insert all object relations and fetch the column values this insert depends on
  objInsRes <- forM objRels $ insertObjRel env strfyNum rjCtx role

  -- prepare final insert columns
  let objRelAffRows = sum $ map fst objInsRes
      objRelDeterminedCols = concatMap snd objInsRes
      finalInsCols = cols <> objRelDeterminedCols <> addCols

  -- prepare the insert query as a with expression
  insCheck <- convAnnBoolExpPartialSQL sessVarFromCurrentSetting insCond
  updCheck <- traverse (convAnnBoolExpPartialSQL sessVarFromCurrentSetting) updCond

  CTEExp cte insPArgs <-
    mkInsertQ tn onConflictM finalInsCols defVals role (insCheck, updCheck)

  MutateResp affRows colVals <- liftTx $ mutateAndFetchCols tn allCols (cte, insPArgs) strfyNum
  colValM <- asSingleObject colVals

  arrRelAffRows <- bool (withArrRels colValM) (return 0) $ null arrRels
  let totAffRows = objRelAffRows + affRows + arrRelAffRows

  return (totAffRows, colValM)
  where
    AnnIns annObj onConflictM (insCond, updCond) allCols defVals = singleObjIns
    AnnInsObj cols objRels arrRels = annObj

    arrRelDepCols = flip getColInfos allCols $
      concatMap (Map.keys . riMapping . _riRelInfo) arrRels

    withArrRels colValM = do
      colVal <- onNothing colValM $ throw400 NotSupported cannotInsArrRelErr
      arrDepColsWithVal <- fetchFromColVals colVal arrRelDepCols
      arrInsARows <- forM arrRels $ insertArrRel env strfyNum rjCtx role arrDepColsWithVal
      return $ sum arrInsARows

    asSingleObject = \case
      []  -> pure Nothing
      [r] -> pure $ Just r
      _   -> throw500 "more than one row returned"

    cannotInsArrRelErr =
      "cannot proceed to insert array relations since insert to table "
      <> tn <<> " affects zero rows"


-- | Insert multiple objects in Postgres.
insertMultipleObjects
  :: ( HasVersion
     , MonadTx m
     , MonadIO m
     , Tracing.MonadTrace m
     )
  => Env.Environment
  -> Bool
  -> MutationRemoteJoinCtx
  -> RoleName
  -> QualifiedTable
  -> MultiObjIns
  -> [PGColWithValue] -- ^ additional fields
  -> RR.MutationOutput
  -> T.Text -- ^ error path
  -> m EncJSON
insertMultipleObjects env strfyNum rjCtx role tn multiObjIns addCols mutOutput errP =
  bool withoutRelsInsert withRelsInsert anyRelsToInsert
  where
    AnnIns insObjs onConflictM (insCond, updCond) tableColInfos defVals = multiObjIns
    singleObjInserts = multiToSingles multiObjIns
    insCols = map _aioColumns insObjs
    allInsObjRels = concatMap _aioObjRels insObjs
    allInsArrRels = concatMap _aioArrRels insObjs
    anyRelsToInsert = not $ null allInsArrRels && null allInsObjRels

    withErrPath = withPathK errP

    -- insert all column rows in one go
    withoutRelsInsert = withErrPath $ do
      indexedForM_ insCols $ \insCol ->
        validateInsert (map fst insCol) [] $ map fst addCols

      let withAddCols = flip map insCols $ union addCols
          tableCols = Map.keys defVals

      (sqlRows, prepArgs) <- flip runStateT Seq.Empty $ do
        rowsWithCol <- mapM toSQLExps withAddCols
        return $ map (mkSQLRow defVals) rowsWithCol

      insCheck <- convAnnBoolExpPartialSQL sessVarFromCurrentSetting insCond
      updCheck <- traverse (convAnnBoolExpPartialSQL sessVarFromCurrentSetting) updCond

      let insQP1 = RI.InsertQueryP1 tn tableCols sqlRows onConflictM
                     (insCheck, updCheck) mutOutput tableColInfos
          p1 = (insQP1, prepArgs)
      RI.execInsertQuery env strfyNum (Just rjCtx) p1

    -- insert each object along with its relations
    withRelsInsert = withErrPath $ do
      insResps <- indexedForM singleObjInserts $ \objIns ->
        insertObj env strfyNum rjCtx role tn objIns addCols

      let affRows = sum $ map fst insResps
          columnValues = mapMaybe snd insResps
      cteExp <- mkSelCTEFromColVals tn tableColInfos columnValues
      let (mutOutputRJ, remoteJoins) = getRemoteJoinsMutationOutput mutOutput
          sqlQuery = Q.fromBuilder $ toSQL $
            RR.mkMutationOutputExp tn tableColInfos (Just affRows) cteExp mutOutputRJ strfyNum
      executeMutationOutputQuery env sqlQuery [] $ (,rjCtx) <$> remoteJoins

prefixErrPath :: (MonadError QErr m) => Field -> m a -> m a
prefixErrPath fld =
  withPathK "selectionSet" . fieldAsPath fld . withPathK "args"

convertInsert
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has InsCtxMap r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> MutationRemoteJoinCtx
  -> RoleName
  -> QualifiedTable -- table
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertInsert env rjCtx role tn fld = prefixErrPath fld $ do
  selSet <- asObjectSelectionSet $ _fSelSet fld
  mutOutputUnres <- RR.MOutMultirowFields <$> resolveMutationFields (_fType fld) selSet
  mutOutputRes <- RR.traverseMutationOutput resolveValTxt mutOutputUnres
  annVals <- withArg arguments "objects" asArray
  -- if the insert input is an empty array, do not perform
  -- the insert and return the mutation response directly
  bool (withNonEmptyObjs annVals mutOutputRes)
    (withEmptyObjs mutOutputRes) $ null annVals
  where
    withNonEmptyObjs annVals mutOutput = do
      InsCtx tableColMap checkCond defValMap relInfoMap updPerm <- getInsCtx tn
      annObjs <- mapM asObject annVals
      annInsObjs <- forM annObjs $ mkAnnInsObj relInfoMap tableColMap
      conflictClauseM <- forM onConflictM $ parseOnConflict tn updPerm tableColMap
      defValMapRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting)
                      defValMap
      let multiObjIns = AnnIns annInsObjs conflictClauseM (checkCond, updPerm >>= upfiCheck)
                        tableCols defValMapRes
          tableCols = Map.elems tableColMap
      strfyNum <- stringifyNum <$> asks getter
      return $ prefixErrPath fld $ insertMultipleObjects env strfyNum rjCtx role tn
        multiObjIns [] mutOutput "objects"
    withEmptyObjs mutOutput =
      return $ return $ buildEmptyMutResp mutOutput
    arguments = _fArguments fld
    onConflictM = Map.lookup "on_conflict" arguments

convertInsertOne
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , Has InsCtxMap r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> MutationRemoteJoinCtx
  -> RoleName
  -> QualifiedTable -- table
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertInsertOne env rjCtx role qt field = prefixErrPath field $ do
  selSet <- asObjectSelectionSet $ _fSelSet field
  tableSelFields <- processTableSelectionSet (_fType field) selSet
  let mutationOutputUnresolved = RR.MOutSinglerowObject tableSelFields
  mutationOutputResolved <- RR.traverseMutationOutput resolveValTxt mutationOutputUnresolved
  annInputObj <- withArg arguments "object" asObject
  InsCtx tableColMap checkCond defValMap relInfoMap updPerm <- getInsCtx qt
  annInsertObj <- mkAnnInsObj relInfoMap tableColMap annInputObj
  conflictClauseM <- forM (Map.lookup "on_conflict" arguments) $ parseOnConflict qt updPerm tableColMap
  defValMapRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) defValMap
  let multiObjIns = AnnIns [annInsertObj] conflictClauseM (checkCond, updPerm >>= upfiCheck)
                    tableCols defValMapRes
      tableCols = Map.elems tableColMap
  strfyNum <- stringifyNum <$> asks getter
  pure $ prefixErrPath field $ insertMultipleObjects env strfyNum rjCtx role qt
    multiObjIns [] mutationOutputResolved "object"
  where
    arguments = _fArguments field

-- helper functions
getInsCtx
  :: (MonadError QErr m, MonadReader r m, Has InsCtxMap r)
  => QualifiedTable -> m InsCtx
getInsCtx tn = do
  ctxMap <- asks getter
  insCtx <- onNothing (Map.lookup tn ctxMap) $
    throw500 $ "table " <> tn <<> " not found"
  let defValMap = fmap PSESQLExp $ S.mkColDefValMap $ map pgiColumn $
                  Map.elems $ icAllCols insCtx
      setCols = icSet insCtx
  return $ insCtx {icSet = Map.union setCols defValMap}

mergeListsWith
  :: [a] -> [b] -> (a -> b -> Bool) -> (a -> b -> c) -> [c]
mergeListsWith _ [] _ _ = []
mergeListsWith [] _ _ _ = []
mergeListsWith (x:xs) l b f = case find (b x) l of
  Nothing -> mergeListsWith xs l b f
  Just y  -> f x y : mergeListsWith xs l b f
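A standalone copy of `mergeListsWith` with a small driver, to show that the helper is an inner join driven by the left list: each left element is combined with the first matching right element, and non-matching left elements are dropped. The column/mapping data in `main` is illustrative, not from the codebase.

```haskell
import Data.List (find)

-- Same definition as in the module above, reproduced so this file compiles
-- on its own: join [a] against [b] using predicate b, combining with f.
mergeListsWith :: [a] -> [b] -> (a -> b -> Bool) -> (a -> b -> c) -> [c]
mergeListsWith _  [] _ _ = []
mergeListsWith [] _  _ _ = []
mergeListsWith (x:xs) l b f = case find (b x) l of
  Nothing -> mergeListsWith xs l b f
  Just y  -> f x y : mergeListsWith xs l b f

main :: IO ()
main = do
  -- rename returned column values through a column mapping, the way
  -- insertArrRel matches resCols against riMapping
  let vals    = [("id", 1 :: Int), ("name", 2), ("extra", 3)]
      mapping = [("id", "author_id"), ("name", "author_name")]
      joined  = mergeListsWith vals mapping
                  (\(c, _) (lc, _) -> c == lc)
                  (\(_, v) (_, rc) -> (rc, v))
  print joined
```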
@@ -1,424 +0,0 @@
module Hasura.GraphQL.Resolve.Introspect
  ( schemaR
  , typeR
  ) where

import           Data.Has
import           Hasura.Prelude

import qualified Data.Aeson                    as J
import qualified Data.HashMap.Strict           as Map
import qualified Data.HashSet                  as Set
import qualified Data.Text                     as T
import qualified Hasura.SQL.Types              as S
import qualified Hasura.SQL.Value              as S
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Context
import           Hasura.GraphQL.Resolve.Context
import           Hasura.GraphQL.Resolve.InputValue
import           Hasura.GraphQL.Validate.Context
import           Hasura.GraphQL.Validate.InputValue
import           Hasura.GraphQL.Validate.SelectionSet
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.Types

data TypeKind
  = TKSCALAR
  | TKOBJECT
  | TKINTERFACE
  | TKUNION
  | TKENUM
  | TKINPUT_OBJECT
  | TKLIST
  | TKNON_NULL
  deriving (Show, Eq)

instance J.ToJSON TypeKind where
  toJSON = J.toJSON . T.pack . drop 2 . show

withSubFields
  :: (MonadError QErr m)
  => SelectionSet
  -> (Field -> m J.Value)
  -> m J.Object
withSubFields selSet fn = do
  objectSelectionSet <- asObjectSelectionSet selSet
  Map.fromList <$> traverseObjectSelectionSet objectSelectionSet fn

namedTyToTxt :: G.NamedType -> Text
namedTyToTxt = G.unName . G.unNamedType

retJ :: (Applicative m, J.ToJSON a) => a -> m J.Value
retJ = pure . J.toJSON

retJT :: (Applicative m) => Text -> m J.Value
retJT = pure . J.toJSON

-- 4.5.2.1
scalarR
  :: (MonadReusability m, MonadError QErr m)
  => ScalarTyInfo
  -> Field
  -> m J.Object
scalarR (ScalarTyInfo descM name _ _) fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename"  -> retJT "__Type"
      "kind"        -> retJ TKSCALAR
      "description" -> retJ $ fmap G.unDescription descM
      "name"        -> retJ name
      _             -> return J.Null

-- 4.5.2.2
objectTypeR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => ObjTyInfo
  -> Field
  -> m J.Object
objectTypeR objectType fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename"  -> retJT "__Type"
      "kind"        -> retJ TKOBJECT
      "name"        -> retJ $ namedTyToTxt n
      "description" -> retJ $ fmap G.unDescription descM
      "interfaces"  -> fmap J.toJSON $ mapM (`ifaceR` subFld) $ Set.toList iFaces
      "fields"      -> fmap J.toJSON $ mapM (`fieldR` subFld) $
                       sortOn _fiName $
                       filter notBuiltinFld $ Map.elems flds
      _             -> return J.Null
  where
    descM = _otiDesc objectType
    n = _otiName objectType
    iFaces = _otiImplIFaces objectType
    flds = _otiFields objectType

notBuiltinFld :: ObjFldInfo -> Bool
notBuiltinFld f =
  fldName /= "__typename" && fldName /= "__type" && fldName /= "__schema"
  where
    fldName = _fiName f

getImplTypes :: (MonadReader t m, Has TypeMap t) => AsObjType -> m [ObjTyInfo]
getImplTypes aot = do
  tyInfo <- asks getter
  return $ sortOn _otiName $
    Map.elems $ getPossibleObjTypes tyInfo aot

-- 4.5.2.3
unionR
  :: (MonadReader t m, MonadError QErr m, Has TypeMap t, MonadReusability m)
  => UnionTyInfo -> Field -> m J.Object
unionR u@(UnionTyInfo descM n _) fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename"    -> retJT "__Type"
      "kind"          -> retJ TKUNION
      "name"          -> retJ $ namedTyToTxt n
      "description"   -> retJ $ fmap G.unDescription descM
      "possibleTypes" -> fmap J.toJSON $
                         mapM (`objectTypeR` subFld) =<< getImplTypes (AOTUnion u)
      _               -> return J.Null

-- 4.5.2.4
ifaceR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => G.NamedType
  -> Field
  -> m J.Object
ifaceR n fld = do
  tyInfo <- getTyInfo n
  case tyInfo of
    TIIFace ifaceTyInfo -> ifaceR' ifaceTyInfo fld
    _                   -> throw500 $ "Unknown interface " <> showNamedTy n

ifaceR'
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => IFaceTyInfo
  -> Field
  -> m J.Object
ifaceR' ifaceTyInfo fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename"    -> retJT "__Type"
      "kind"          -> retJ TKINTERFACE
      "name"          -> retJ $ namedTyToTxt name
      "description"   -> retJ $ fmap G.unDescription maybeDescription
      "fields"        -> fmap J.toJSON $ mapM (`fieldR` subFld) $
                         sortOn _fiName $
                         filter notBuiltinFld $ Map.elems fields
      "possibleTypes" -> fmap J.toJSON $ mapM (`objectTypeR` subFld)
                         =<< getImplTypes (AOTIFace ifaceTyInfo)
      _               -> return J.Null
  where
    maybeDescription = _ifDesc ifaceTyInfo
    name = _ifName ifaceTyInfo
    fields = _ifFields ifaceTyInfo

-- 4.5.2.5
enumTypeR
  :: ( Monad m, MonadReusability m, MonadError QErr m )
  => EnumTyInfo
  -> Field
  -> m J.Object
enumTypeR (EnumTyInfo descM n vals _) fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename"  -> retJT "__Type"
      "kind"        -> retJ TKENUM
      "name"        -> retJ $ namedTyToTxt n
      "description" -> retJ $ fmap G.unDescription descM
      "enumValues"  -> do
        includeDeprecated <- readIncludeDeprecated subFld
        fmap J.toJSON $
          mapM (enumValueR subFld) $
          filter (\val -> includeDeprecated || not (_eviIsDeprecated val)) $
          sortOn _eviVal $
          Map.elems (normalizeEnumValues vals)
      _             -> return J.Null

readIncludeDeprecated
  :: ( Monad m, MonadReusability m, MonadError QErr m )
  => Field
  -> m Bool
readIncludeDeprecated subFld = do
  let argM = Map.lookup "includeDeprecated" (_fArguments subFld)
  case argM of
    Nothing  -> pure False
    Just arg -> asScalarVal arg S.PGBoolean >>= \case
      S.PGValBoolean b -> pure b
      _ -> throw500 "unexpected non-Boolean argument for includeDeprecated"
|
||||
|
||||
{- Note [Reusability of introspection queries with variables]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Introspection queries can have variables, too, in particular to influence one of
two arguments: the @name@ argument of the @__type@ field, and the
@includeDeprecated@ argument of the @fields@ and @enumValues@ fields. The
current code does not cache all introspection queries with variables correctly.
As a workaround, whenever a variable is passed to an @includeDeprecated@
argument, we mark the query as unreusable. This is the purpose of
'dummyReadIncludeDeprecated'.

Now @fields@ and @enumValues@ are intended to be used when introspecting,
respectively, [object and interface types] and enum types. However, it does not
suffice to only call 'dummyReadIncludeDeprecated' for such types, since @fields@
and @enumValues@ are valid GraphQL fields regardless of what type we are looking
at. So precisely because @__Type@ is _thought of_ as a union, but is _not
actually_ a union, we need to call 'dummyReadIncludeDeprecated' in all cases.

See also issue #4547.
-}

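The marking described in the note can be pictured with a toy model of reusability tracking. This is a hypothetical sketch, not the engine's actual `MonadReusability` class: the state is a single `Bool` that flips to `False` as soon as a variable-supplied argument is consumed.

```haskell
{-# LANGUAGE LambdaCase #-}
import Control.Monad.State (State, runState, put)

-- Hypothetical stand-in for the engine's reusability tracking: the state is
-- True while the query plan is still considered reusable.
type Reusability = State Bool

markNotReusable :: Reusability ()
markNotReusable = put False

-- Reading @includeDeprecated@: a value supplied through a variable (written
-- here as a leading '$') poisons reusability; a literal leaves it intact.
readIncludeDeprecatedToy :: Maybe String -> Reusability Bool
readIncludeDeprecatedToy = \case
  Nothing      -> pure False                    -- argument absent: default
  Just ('$':_) -> markNotReusable >> pure True  -- came from a variable
  Just "true"  -> pure True
  Just _       -> pure False

main :: IO ()
main = do
  -- literal argument: the plan stays reusable
  print $ runState (readIncludeDeprecatedToy (Just "true")) True   -- (True,True)
  -- variable argument: the plan is marked non-reusable
  print $ runState (readIncludeDeprecatedToy (Just "$flag")) True  -- (True,False)
```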
dummyReadIncludeDeprecated
  :: ( Monad m, MonadReusability m, MonadError QErr m )
  => Field
  -> m ()
dummyReadIncludeDeprecated fld = do
  selSet <- unAliasedFields . unObjectSelectionSet
            <$> asObjectSelectionSet (_fSelSet fld)
  forM_ (toList selSet) $ \subFld ->
    case _fName subFld of
      "fields"     -> readIncludeDeprecated subFld
      "enumValues" -> readIncludeDeprecated subFld
      _            -> return False

-- 4.5.2.6
inputObjR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => InpObjTyInfo
  -> Field
  -> m J.Object
inputObjR (InpObjTyInfo descM nt flds _) fld = do
  dummyReadIncludeDeprecated fld
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__Type"
      "kind" -> retJ TKINPUT_OBJECT
      "name" -> retJ $ namedTyToTxt nt
      "description" -> retJ $ fmap G.unDescription descM
      "inputFields" -> fmap J.toJSON $ mapM (inputValueR subFld) $
                       sortOn _iviName $ Map.elems flds
      _ -> return J.Null

-- 4.5.2.7
listTypeR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => G.ListType -> Field -> m J.Object
listTypeR (G.ListType ty) fld =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__Type"
      "kind" -> retJ TKLIST
      "ofType" -> J.toJSON <$> gtypeR ty subFld
      _ -> return J.Null

-- 4.5.2.8
nonNullR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => G.GType -> Field -> m J.Object
nonNullR gTyp fld =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__Type"
      "kind" -> retJ TKNON_NULL
      "ofType" -> case gTyp of
        G.TypeNamed (G.Nullability False) nt -> J.toJSON <$> namedTypeR nt subFld
        G.TypeList (G.Nullability False) lt -> J.toJSON <$> listTypeR lt subFld
        _ -> throw500 "nullable type passed to nonNullR"
      _ -> return J.Null

namedTypeR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => G.NamedType
  -> Field
  -> m J.Object
namedTypeR nt fld = do
  tyInfo <- getTyInfo nt
  namedTypeR' fld tyInfo

namedTypeR'
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => Field
  -> TypeInfo
  -> m J.Object
namedTypeR' fld tyInfo =
  -- Now fetch the required type information from the corresponding
  -- information generator
  case tyInfo of
    TIScalar colTy -> scalarR colTy fld
    TIObj objTyInfo -> objectTypeR objTyInfo fld
    TIEnum enumTypeInfo -> enumTypeR enumTypeInfo fld
    TIInpObj inpObjTyInfo -> inputObjR inpObjTyInfo fld
    TIIFace iFaceTyInfo -> ifaceR' iFaceTyInfo fld
    TIUnion unionTyInfo -> unionR unionTyInfo fld

-- 4.5.3
fieldR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => ObjFldInfo -> Field -> m J.Object
fieldR (ObjFldInfo descM n params ty _) fld =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__Field"
      "name" -> retJ $ G.unName n
      "description" -> retJ $ fmap G.unDescription descM
      "args" -> fmap J.toJSON $ mapM (inputValueR subFld) $
                sortOn _iviName $ Map.elems params
      "type" -> J.toJSON <$> gtypeR ty subFld
      "isDeprecated" -> retJ False
      _ -> return J.Null

-- 4.5.4
inputValueR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => Field -> InpValInfo -> m J.Object
inputValueR fld (InpValInfo descM n defM ty) =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__InputValue"
      "name" -> retJ $ G.unName n
      "description" -> retJ $ fmap G.unDescription descM
      "type" -> J.toJSON <$> gtypeR ty subFld
      -- TODO: figure out what the spec means by 'string encoding'
      "defaultValue" -> retJ $ pPrintValueC <$> defM
      _ -> return J.Null

-- 4.5.5
enumValueR
  :: (MonadError QErr m)
  => Field -> EnumValInfo -> m J.Object
enumValueR fld (EnumValInfo descM enumVal isDeprecated) =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__EnumValue"
      "name" -> retJ $ G.unName $ G.unEnumValue enumVal
      "description" -> retJ $ fmap G.unDescription descM
      "isDeprecated" -> retJ isDeprecated
      _ -> return J.Null

-- 4.5.6
directiveR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => Field -> DirectiveInfo -> m J.Object
directiveR fld (DirectiveInfo descM n args locs) =
  withSubFields (_fSelSet fld) $ \subFld ->
    case _fName subFld of
      "__typename" -> retJT "__Directive"
      "name" -> retJ $ G.unName n
      "description" -> retJ $ fmap G.unDescription descM
      "locations" -> retJ $ map showDirLoc locs
      "args" -> fmap J.toJSON $ mapM (inputValueR subFld) $
                sortOn _iviName $ Map.elems args
      _ -> return J.Null

showDirLoc :: G.DirectiveLocation -> Text
showDirLoc = \case
  G.DLExecutable edl -> T.pack $ drop 3 $ show edl
  G.DLTypeSystem tsdl -> T.pack $ drop 4 $ show tsdl

gtypeR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => G.GType -> Field -> m J.Object
gtypeR ty fld =
  case ty of
    G.TypeList (G.Nullability True) lt -> listTypeR lt fld
    G.TypeList (G.Nullability False) _ -> nonNullR ty fld
    G.TypeNamed (G.Nullability True) nt -> namedTypeR nt fld
    G.TypeNamed (G.Nullability False) _ -> nonNullR ty fld

schemaR
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m, MonadReusability m)
  => Field -> m J.Object
schemaR fld =
  withSubFields (_fSelSet fld) $ \subFld -> do
    (tyMap :: TypeMap) <- asks getter
    case _fName subFld of
      "__typename" -> retJT "__Schema"
      "types" -> fmap J.toJSON $ mapM (namedTypeR' subFld) $
                 sortOn getNamedTy $ Map.elems tyMap
      "queryType" -> J.toJSON <$> namedTypeR queryRootNamedType subFld
      "mutationType" -> typeR' mutationRootNamedType subFld
      "subscriptionType" -> typeR' subscriptionRootNamedType subFld
      "directives" -> J.toJSON <$> mapM (directiveR subFld)
                      (sortOn _diName defaultDirectives)
      _ -> return J.Null

typeR
  :: (MonadReusability m, MonadError QErr m, MonadReader r m, Has TypeMap r)
  => Field -> m J.Value
typeR fld = do
  name <- asPGColText =<< getArg args "name"
  typeR' (G.NamedType $ G.Name name) fld
  where
    args = _fArguments fld

typeR'
  :: (MonadReader r m, Has TypeMap r, MonadError QErr m, MonadReusability m)
  => G.NamedType -> Field -> m J.Value
typeR' n fld = do
  tyMap <- asks getter
  case Map.lookup n tyMap of
    Nothing -> return J.Null
    Just tyInfo -> J.Object <$> namedTypeR' fld tyInfo
@@ -1,411 +0,0 @@
{-# LANGUAGE ScopedTypeVariables #-}

module Hasura.GraphQL.Resolve.Mutation
  ( convertUpdate
  , convertUpdateByPk
  , convertDelete
  , convertDeleteByPk
  , resolveMutationFields
  , buildEmptyMutResp
  ) where

import           Data.Has
import           Hasura.Prelude

import qualified Control.Monad.Validate              as MV
import qualified Data.Aeson                          as J
import qualified Data.HashMap.Strict                 as Map
import qualified Data.HashMap.Strict.InsOrd          as OMap
import qualified Data.HashMap.Strict.InsOrd.Extended as OMap
import qualified Data.Sequence.NonEmpty              as NESeq
import qualified Data.Text                           as T
import qualified Language.GraphQL.Draft.Syntax       as G

import qualified Hasura.RQL.DML.Delete               as RD
import qualified Hasura.RQL.DML.Returning            as RR
import qualified Hasura.RQL.DML.Update               as RU

import qualified Hasura.RQL.DML.Select               as RS
import qualified Hasura.SQL.DML                      as S
import qualified Data.Environment                    as Env
import qualified Hasura.Tracing                      as Tracing

import           Hasura.EncJSON
import           Hasura.GraphQL.Resolve.BoolExp
import           Hasura.GraphQL.Resolve.Context
import           Hasura.GraphQL.Resolve.InputValue
import           Hasura.GraphQL.Resolve.Select       (processTableSelectionSet)
import           Hasura.GraphQL.Validate.SelectionSet
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.DML.Internal             (currentSession, sessVarFromCurrentSetting)
import           Hasura.RQL.DML.Mutation             (MutationRemoteJoinCtx)
import           Hasura.RQL.Types
import           Hasura.Server.Version               (HasVersion)
import           Hasura.SQL.Types
import           Hasura.SQL.Value

resolveMutationFields
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => G.NamedType -> ObjectSelectionSet -> m (RR.MutFldsG UnresolvedVal)
resolveMutationFields ty selSet = fmap (map (first FieldName)) $
  traverseObjectSelectionSet selSet $ \fld -> case _fName fld of
    "__typename"    -> return $ RR.MExp $ G.unName $ G.unNamedType ty
    "affected_rows" -> return RR.MCount
    "returning"     -> do
      annFlds <- asObjectSelectionSet (_fSelSet fld)
                 >>= processTableSelectionSet (_fType fld)
      annFldsResolved <- traverse
        (traverse (RS.traverseAnnField convertPGValueToTextValue)) annFlds
      return $ RR.MRet annFldsResolved
    G.Name t        -> throw500 $ "unexpected field in mutation resp : " <> t
  where
    convertPGValueToTextValue = \case
      UVPG annPGVal -> UVSQL <$> txtConverter annPGVal
      UVSessVar colTy sessVar -> pure $ UVSessVar colTy sessVar
      UVSQL sqlExp -> pure $ UVSQL sqlExp
      UVSession -> pure UVSession

convertRowObj
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap
  -> AnnInpVal
  -> m [(PGCol, UnresolvedVal)]
convertRowObj colGNameMap val =
  flip withObject val $ \_ obj ->
    forM (OMap.toList obj) $ \(k, v) -> do
      prepExpM <- fmap mkParameterizablePGValue <$> asPGColumnValueM v
      pgCol <- pgiColumn <$> resolvePGCol colGNameMap k
      let prepExp = fromMaybe (UVSQL S.SENull) prepExpM
      return (pgCol, prepExp)

type ApplySQLOp = (PGCol, S.SQLExp) -> S.SQLExp

-- SET x = x <op> <value>
rhsExpOp :: S.SQLOp -> S.TypeAnn -> ApplySQLOp
rhsExpOp op annTy (col, e) =
  S.mkSQLOpExp op (S.SEIden $ toIden col) annExp
  where
    annExp = S.SETyAnn e annTy

-- SET x = <value> <op> x
lhsExpOp :: S.SQLOp -> S.TypeAnn -> ApplySQLOp
lhsExpOp op annTy (col, e) =
  S.mkSQLOpExp op annExp $ S.SEIden $ toIden col
  where
    annExp = S.SETyAnn e annTy

-- Automatically generate the type annotation by looking up the column name
typedRhsExpOp :: S.SQLOp -> S.TypeAnn -> PGColGNameMap -> ApplySQLOp
typedRhsExpOp op defaultAnnTy colGNameMap (colName, e) =
  let annTypeM :: Maybe S.TypeAnn
      annTypeM = do
        fieldType <- pgiType <$> Map.lookup (G.Name $ getPGColTxt colName) colGNameMap
        case fieldType of
          PGColumnScalar x -> return $ S.mkTypeAnn $ PGTypeScalar x
          _ -> Nothing
      annType :: S.TypeAnn
      annType = fromMaybe defaultAnnTy annTypeM
  in rhsExpOp op annType (colName, e)

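The two operator shapes above can be pictured as plain SQL text. This sketch renders the clauses as strings for illustration only; the real code builds an `S.SQLExp` AST, and the operator/type names below are just examples.

```haskell
-- rhsExpOp puts the column on the left of the operator, lhsExpOp on the
-- right; both cast the incoming value to the annotation type.
rhsOpSql, lhsOpSql :: String -> String -> String -> String -> String
rhsOpSql op annTy col val =
  col ++ " = " ++ col ++ " " ++ op ++ " (" ++ val ++ ")::" ++ annTy
lhsOpSql op annTy col val =
  col ++ " = (" ++ val ++ ")::" ++ annTy ++ " " ++ op ++ " " ++ col

main :: IO ()
main = do
  putStrLn $ rhsOpSql "+"  "numeric" "score" "1"    -- like _inc
  putStrLn $ rhsOpSql "||" "jsonb"   "meta"  "'{}'" -- like _append
  putStrLn $ lhsOpSql "||" "jsonb"   "meta"  "'{}'" -- like _prepend
```

For example, `rhsOpSql "+" "numeric" "score" "1"` yields `score = score + (1)::numeric`.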
convObjWithOp
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap -> ApplySQLOp -> AnnInpVal -> m [(PGCol, UnresolvedVal)]
convObjWithOp colGNameMap opFn val =
  flip withObject val $ \_ obj -> forM (OMap.toList obj) $ \(k, v) -> do
    colVal <- openOpaqueValue =<< asPGColumnValue v
    pgCol <- pgiColumn <$> resolvePGCol colGNameMap k
    -- TODO: why are we using txtEncoder here?
    let encVal = txtEncoder $ pstValue $ _apvValue colVal
        sqlExp = opFn (pgCol, encVal)
    return (pgCol, UVSQL sqlExp)

convDeleteAtPathObj
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap -> AnnInpVal -> m [(PGCol, UnresolvedVal)]
convDeleteAtPathObj colGNameMap val =
  flip withObject val $ \_ obj -> forM (OMap.toList obj) $ \(k, v) -> do
    vals <- traverse (openOpaqueValue <=< asPGColumnValue) =<< asArray v
    pgCol <- pgiColumn <$> resolvePGCol colGNameMap k
    let valExps = map (txtEncoder . pstValue . _apvValue) vals
        annEncVal = S.SETyAnn (S.SEArray valExps) S.textArrTypeAnn
        sqlExp = S.SEOpApp S.jsonbDeleteAtPathOp
                 [S.SEIden $ toIden pgCol, annEncVal]
    return (pgCol, UVSQL sqlExp)

convertUpdateP1
  :: forall m . (MonadReusability m, MonadError QErr m)
  => UpdOpCtx -- the update context
  -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool expression parser
  -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver
  -> Field -- the mutation field
  -> m (RU.AnnUpdG UnresolvedVal)
convertUpdateP1 opCtx boolExpParser selectionResolver fld = do
  -- a set expression is the same as a row object
  setExpM <- resolveUpdateOperator "_set" $ convertRowObj colGNameMap
  -- where bool expression to filter column
  whereExp <- boolExpParser args
  -- increment operator on integer columns
  incExpM <- resolveUpdateOperator "_inc" $
    convObjWithOp' $ typedRhsExpOp S.incOp S.numericTypeAnn colGNameMap
  -- append jsonb value
  appendExpM <- resolveUpdateOperator "_append" $
    convObjWithOp' $ rhsExpOp S.jsonbConcatOp S.jsonbTypeAnn
  -- prepend jsonb value
  prependExpM <- resolveUpdateOperator "_prepend" $
    convObjWithOp' $ lhsExpOp S.jsonbConcatOp S.jsonbTypeAnn
  -- delete a key in a jsonb object
  deleteKeyExpM <- resolveUpdateOperator "_delete_key" $
    convObjWithOp' $ rhsExpOp S.jsonbDeleteOp S.textTypeAnn
  -- delete an element in a jsonb array
  deleteElemExpM <- resolveUpdateOperator "_delete_elem" $
    convObjWithOp' $ rhsExpOp S.jsonbDeleteOp S.intTypeAnn
  -- delete at a path in a jsonb value
  deleteAtPathExpM <- resolveUpdateOperator "_delete_at_path" $
    convDeleteAtPathObj colGNameMap

  updateItems <- combineUpdateExpressions
    [ setExpM, incExpM, appendExpM, prependExpM
    , deleteKeyExpM, deleteElemExpM, deleteAtPathExpM
    ]

  mutOutput <- selectionResolver fld

  pure $ RU.AnnUpd tn updateItems (unresolvedPermFilter, whereExp) unresolvedPermCheck mutOutput allCols
  where
    convObjWithOp' = convObjWithOp colGNameMap
    allCols = Map.elems colGNameMap
    UpdOpCtx tn _ colGNameMap filterExp checkExpr preSetCols = opCtx
    args = _fArguments fld
    resolvedPreSetItems = Map.toList $ fmap partialSQLExpToUnresolvedVal preSetCols
    unresolvedPermFilter = fmapAnnBoolExp partialSQLExpToUnresolvedVal filterExp
    unresolvedPermCheck = maybe annBoolExpTrue (fmapAnnBoolExp partialSQLExpToUnresolvedVal) checkExpr

    resolveUpdateOperator operator resolveAction =
      (operator,) <$> withArgM args operator resolveAction

    combineUpdateExpressions :: [(G.Name, Maybe [(PGCol, UnresolvedVal)])]
                             -> m [(PGCol, UnresolvedVal)]
    combineUpdateExpressions updateExps = do
      let allOperatorNames = map fst updateExps
          updateItems :: [(G.Name, [(PGCol, UnresolvedVal)])]
          updateItems = mapMaybe (\(op, itemsM) -> (op,) <$> itemsM) updateExps
      -- At least one operator is expected, or the preset expressions shouldn't be empty
      if null updateItems && null resolvedPreSetItems then
        throwVE $ "at least one of " <> showNames allOperatorNames <> " is expected"
      else do
        let itemsWithOps :: [(PGCol, (G.Name, UnresolvedVal))]
            itemsWithOps = concatMap (\(op, items) -> map (second (op,)) items) updateItems
            validateMultiOps col items = do
              when (length items > 1) $ MV.dispute [(col, map fst $ toList items)]
              pure $ snd $ NESeq.head items
            eitherResult :: Either
                              [(PGCol, [G.Name])]
                              (OMap.InsOrdHashMap PGCol UnresolvedVal)
            eitherResult = MV.runValidate $ OMap.traverseWithKey validateMultiOps $
                           OMap.groupTuples itemsWithOps
        case eitherResult of
          -- A column shouldn't be present in more than one operator.
          -- If it is, the generated UPDATE statement throws an unexpected query error.
          Left columnsWithMultiOps -> throwVE $
            "column found in multiple operators; "
            <> T.intercalate ". "
               (map (\(col, ops) -> col <<> " in " <> showNames ops)
                columnsWithMultiOps)
          Right items -> pure $ resolvedPreSetItems <> OMap.toList items

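The duplicate-column check inside `combineUpdateExpressions` can be sketched without the engine's types: group the (operator, column) pairs by column and fail if any column is claimed by more than one operator. This simplified version uses plain `Data.Map` instead of `monad-validate` and insertion-ordered maps, and omits preset columns.

```haskell
import qualified Data.Map.Strict as M

-- Group update items by column; columns named in more than one operator are
-- an error, mirroring the validation in combineUpdateExpressions above.
combineOps :: [(String, String)]            -- (operator, column)
           -> Either [(String, [String])]   -- columns with multiple operators
                     [(String, String)]     -- (column, its single operator)
combineOps items =
  let grouped = M.fromListWith (++) [ (col, [op]) | (op, col) <- items ]
      multi   = [ (col, reverse ops) | (col, ops) <- M.toList grouped
                                     , length ops > 1 ]
  in if null multi
       then Right [ (col, head ops) | (col, ops) <- M.toList grouped ]
       else Left multi

main :: IO ()
main = do
  -- column "a" appears under both _set and _inc: rejected
  print $ combineOps [("_set", "a"), ("_inc", "a"), ("_set", "b")]
  -- each column under one operator: accepted
  print $ combineOps [("_set", "a"), ("_inc", "b")]
```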
convertUpdateGeneric
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> UpdOpCtx -- the update context
  -> MutationRemoteJoinCtx
  -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool exp parser
  -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver
  -> Field
  -> m (tx EncJSON)
convertUpdateGeneric env opCtx rjCtx boolExpParser selectionResolver fld = do
  annUpdUnresolved <- convertUpdateP1 opCtx boolExpParser selectionResolver fld
  (annUpdResolved, prepArgs) <- withPrepArgs $ RU.traverseAnnUpd
                                resolveValPrep annUpdUnresolved
  strfyNum <- stringifyNum <$> asks getter
  let whenNonEmptyItems = return $ RU.execUpdateQuery env strfyNum
                          (Just rjCtx) (annUpdResolved, prepArgs)
      whenEmptyItems = return $ return $
                       buildEmptyMutResp $ RU.uqp1Output annUpdResolved
  -- if there are no set items then do not perform
  -- the update and return an empty mutation response
  bool whenNonEmptyItems whenEmptyItems $ null $ RU.uqp1SetExps annUpdResolved

convertUpdate
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> UpdOpCtx -- the update context
  -> MutationRemoteJoinCtx
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertUpdate env opCtx rjCtx =
  convertUpdateGeneric env opCtx rjCtx whereExpressionParser mutationFieldsResolver

convertUpdateByPk
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> UpdOpCtx -- the update context
  -> MutationRemoteJoinCtx
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertUpdateByPk env opCtx rjCtx field =
  convertUpdateGeneric env opCtx rjCtx boolExpParser tableSelectionAsMutationOutput field
  where
    boolExpParser args = withArg args "pk_columns" $ \inpVal -> do
      obj <- asObject inpVal
      pgColValToBoolExp (_uocAllCols opCtx) $ Map.fromList $ OMap.toList obj

convertDeleteGeneric
  :: ( HasVersion, MonadReusability m
     , MonadReader r m
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> DelOpCtx -- the delete context
  -> MutationRemoteJoinCtx
  -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool exp parser
  -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertDeleteGeneric env opCtx rjCtx boolExpParser selectionResolver fld = do
  whereExp <- boolExpParser $ _fArguments fld
  mutOutput <- selectionResolver fld
  let unresolvedPermFltr =
        fmapAnnBoolExp partialSQLExpToUnresolvedVal filterExp
      annDelUnresolved = RD.AnnDel tn (unresolvedPermFltr, whereExp)
                         mutOutput allCols
  (annDelResolved, prepArgs) <- withPrepArgs $ RD.traverseAnnDel
                                resolveValPrep annDelUnresolved
  strfyNum <- stringifyNum <$> asks getter
  return $ RD.execDeleteQuery env strfyNum (Just rjCtx) (annDelResolved, prepArgs)
  where
    DelOpCtx tn _ colGNameMap filterExp = opCtx
    allCols = Map.elems colGNameMap

convertDelete
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> DelOpCtx -- the delete context
  -> MutationRemoteJoinCtx
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertDelete env opCtx rjCtx =
  convertDeleteGeneric env opCtx rjCtx whereExpressionParser mutationFieldsResolver

convertDeleteByPk
  :: ( HasVersion
     , MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     , MonadIO tx
     , MonadTx tx
     , Tracing.MonadTrace tx
     )
  => Env.Environment
  -> DelOpCtx -- the delete context
  -> MutationRemoteJoinCtx
  -> Field -- the mutation field
  -> m (tx EncJSON)
convertDeleteByPk env opCtx rjCtx field =
  convertDeleteGeneric env opCtx rjCtx boolExpParser tableSelectionAsMutationOutput field
  where
    boolExpParser = pgColValToBoolExp (_docAllCols opCtx)

whereExpressionParser
  :: ( MonadReusability m, MonadError QErr m
     , MonadReader r m, Has FieldMap r
     )
  => ArgsMap -> m AnnBoolExpUnresolved
whereExpressionParser args = withArg args "where" parseBoolExp

mutationFieldsResolver
  :: ( MonadReusability m, MonadError QErr m
     , MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => Field -> m (RR.MutationOutputG UnresolvedVal)
mutationFieldsResolver field =
  asObjectSelectionSet (_fSelSet field) >>= \selSet ->
    RR.MOutMultirowFields <$> resolveMutationFields (_fType field) selSet

tableSelectionAsMutationOutput
  :: ( MonadReusability m, MonadError QErr m
     , MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => Field -> m (RR.MutationOutputG UnresolvedVal)
tableSelectionAsMutationOutput field =
  asObjectSelectionSet (_fSelSet field) >>= \selSet ->
    RR.MOutSinglerowObject <$> processTableSelectionSet (_fType field) selSet

-- | Build the mutation response for empty objects
buildEmptyMutResp :: RR.MutationOutput -> EncJSON
buildEmptyMutResp = mkTx
  where
    mkTx = \case
      RR.MOutMultirowFields mutFlds -> encJFromJValue $ OMap.fromList $ map (second convMutFld) mutFlds
      RR.MOutSinglerowObject _ -> encJFromJValue $ J.Object mempty
    -- generate an empty mutation response
    convMutFld = \case
      RR.MCount -> J.toJSON (0 :: Int)
      RR.MExp e -> J.toJSON e
      RR.MRet _ -> J.toJSON ([] :: [J.Value])

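The empty response above renders each requested mutation field to a neutral value. A self-contained sketch of the same shapes, writing plain JSON text rather than going through `EncJSON`/aeson as the real code does:

```haskell
{-# LANGUAGE LambdaCase #-}

-- Simplified analog of RR.MutFld and convMutFld above.
data MutFld = MCount | MExp String | MRet

emptyMutFldJson :: MutFld -> String
emptyMutFldJson = \case
  MCount -> "0"     -- affected_rows of a skipped mutation is zero
  MExp e -> show e  -- literal expressions (e.g. __typename) still render
  MRet   -> "[]"    -- returning yields an empty row list

main :: IO ()
main = putStrLn $
  "{\"affected_rows\": " ++ emptyMutFldJson MCount
  ++ ", \"returning\": " ++ emptyMutFldJson MRet ++ "}"
```

This is what an update with no `_set`/`_inc`/jsonb items falls back to instead of issuing an UPDATE statement.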
resolveValPrep
  :: (MonadState PrepArgs m)
  => UnresolvedVal -> m S.SQLExp
resolveValPrep = \case
  UVPG annPGVal -> prepare annPGVal
  UVSessVar colTy sessVar -> sessVarFromCurrentSetting colTy sessVar
  UVSQL sqlExp -> pure sqlExp
  UVSession -> pure currentSession
@@ -1,833 +0,0 @@
module Hasura.GraphQL.Resolve.Select
  ( convertSelect
  , convertConnectionSelect
  , convertConnectionFuncQuery
  , convertSelectByPKey
  , convertAggSelect
  , convertFuncQuerySimple
  , convertFuncQueryAgg
  , parseColumns
  , processTableSelectionSet
  , resolveNodeId
  , convertNodeSelect
  , AnnSimpleSelect
  ) where

import           Control.Lens                  (to, (^..), (^?), _2)
import           Data.Has
import           Data.Parser.JSONPath
import           Hasura.Prelude

import qualified Data.Aeson                    as J
import qualified Data.Aeson.Extended           as J
import qualified Data.Aeson.Internal           as J
import qualified Data.ByteString.Lazy          as BL
import qualified Data.HashMap.Strict           as Map
import qualified Data.HashMap.Strict.InsOrd    as OMap
import qualified Data.List.NonEmpty            as NE
import qualified Data.Sequence                 as Seq
import qualified Data.Sequence.NonEmpty        as NESeq
import qualified Data.Text                     as T
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.RQL.DML.Select         as RS
import qualified Hasura.SQL.DML                as S

import           Hasura.GraphQL.Resolve.BoolExp
import           Hasura.GraphQL.Resolve.Context
import           Hasura.GraphQL.Resolve.InputValue
import           Hasura.GraphQL.Schema         (isAggregateField)
import           Hasura.GraphQL.Schema.Common  (mkTableTy)
import           Hasura.GraphQL.Validate
import           Hasura.GraphQL.Validate.SelectionSet
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.DML.Internal       (onlyPositiveInt)
import           Hasura.RQL.Types
import           Hasura.Server.Utils
import           Hasura.SQL.Types
import           Hasura.SQL.Value

jsonPathToColExp :: (MonadError QErr m) => T.Text -> m (Maybe S.SQLExp)
jsonPathToColExp t = case parseJSONPath t of
  Left s -> throw400 ParseFailed $ T.pack $ "parse json path error: " ++ s
  Right [] -> return Nothing
  Right jPaths -> return $ Just $ S.SEArray $ map elToColExp jPaths
  where
    elToColExp (Key k) = S.SELit k
    elToColExp (Index i) = S.SELit $ T.pack (show i)

argsToColumnOp :: (MonadReusability m, MonadError QErr m) => ArgsMap -> m (Maybe RS.ColumnOp)
argsToColumnOp args = case Map.lookup "path" args of
  Nothing -> return Nothing
  Just txt -> do
    mColTxt <- asPGColTextM txt
    mColExps <- maybe (return Nothing) jsonPathToColExp mColTxt
    pure $ RS.ColumnOp S.jsonbPathOp <$> mColExps

type AnnFields = RS.AnnFieldsG UnresolvedVal

resolveComputedField
  :: ( MonadReusability m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r, MonadError QErr m
     )
  => ComputedField -> Field -> m (RS.ComputedFieldSelect UnresolvedVal)
resolveComputedField computedField fld = fieldAsPath fld $ do
  funcArgs <- parseFunctionArgs argSeq argFn $ Map.lookup "args" $ _fArguments fld
  let argsWithTableArgument = withTableAndSessionArgument funcArgs
  case fieldType of
    CFTScalar scalarTy -> do
      colOpM <- argsToColumnOp $ _fArguments fld
      pure $ RS.CFSScalar $
        RS.ComputedFieldScalarSelect qf argsWithTableArgument scalarTy colOpM
    CFTTable (ComputedFieldTable _ cols permFilter permLimit) -> do
      let functionFrom = RS.FromFunction qf argsWithTableArgument Nothing
      RS.CFSTable RS.JASMultipleRows <$> fromField functionFrom cols permFilter permLimit fld
  where
    ComputedField _ function argSeq fieldType = computedField
    ComputedFieldFunction qf _ tableArg sessionArg _ = function
    argFn :: FunctionArgItem -> InputFunctionArgument
    argFn = IFAUnknown
    withTableAndSessionArgument :: RS.FunctionArgsExpG UnresolvedVal
                                -> RS.FunctionArgsExpTableRow UnresolvedVal
    withTableAndSessionArgument resolvedArgs =
      let argsExp@(RS.FunctionArgsExp positional named) = RS.AEInput <$> resolvedArgs
          tableRowArg = RS.AETableRow Nothing
          withTable = case tableArg of
            FTAFirst ->
              RS.FunctionArgsExp (tableRowArg:positional) named
            FTANamed argName index ->
              RS.insertFunctionArg argName index tableRowArg argsExp
          sessionArgVal = RS.AESession UVSession
          alsoWithSession = case sessionArg of
            Nothing -> withTable
            Just (FunctionSessionArgument argName index) ->
              RS.insertFunctionArg argName index sessionArgVal withTable
      in alsoWithSession

processTableSelectionSet
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => G.NamedType -> ObjectSelectionSet -> m AnnFields
processTableSelectionSet fldTy flds =
  fmap (map (\(a, b) -> (FieldName a, b))) $ traverseObjectSelectionSet flds $ \fld -> do
    let fldName = _fName fld
    case fldName of
      "__typename" -> return $ RS.AFExpression $ G.unName $ G.unNamedType fldTy
      _ -> do
        fldInfo <- getFldInfo fldTy fldName
        case fldInfo of
          RFNodeId tn pkeys -> pure $ RS.AFNodeId tn pkeys
          RFPGColumn colInfo ->
            RS.mkAnnColumnField colInfo <$> argsToColumnOp (_fArguments fld)
          RFComputedField computedField ->
            RS.AFComputedField <$> resolveComputedField computedField fld
          RFRelationship (RelationshipField relInfo fieldKind colGNameMap tableFilter tableLimit) -> do
            let relTN = riRTable relInfo
                colMapping = riMapping relInfo
                rn = riName relInfo
            case fieldKind of
              RFKSimple ->
                case riType relInfo of
                  ObjRel -> do
                    annFields <- asObjectSelectionSet (_fSelSet fld)
                                 >>= processTableSelectionSet (_fType fld)
                    pure $ RS.AFObjectRelation $ RS.AnnRelationSelectG rn colMapping $
                      RS.AnnObjectSelectG annFields relTN $
                        fmapAnnBoolExp partialSQLExpToUnresolvedVal tableFilter
                  ArrRel -> do
                    annSel <- fromField (RS.FromTable relTN) colGNameMap tableFilter tableLimit fld
                    pure $ RS.AFArrayRelation $ RS.ASSimple $
                      RS.AnnRelationSelectG rn colMapping annSel
              RFKAggregate -> do
                aggSel <- fromAggField (RS.FromTable relTN) colGNameMap tableFilter tableLimit fld
                pure $ RS.AFArrayRelation $ RS.ASAggregate $ RS.AnnRelationSelectG rn colMapping aggSel
              RFKConnection pkCols -> do
                connSel <- fromConnectionField (RS.FromTable relTN) pkCols tableFilter tableLimit fld
                pure $ RS.AFArrayRelation $ RS.ASConnection $ RS.AnnRelationSelectG rn colMapping connSel

          RFRemoteRelationship info ->
            pure $ RS.AFRemote $ RS.RemoteSelect
              (unValidateArgsMap $ _fArguments fld)   -- Unvalidate the input arguments
              (unValidateSelectionSet $ _fSelSet fld) -- Unvalidate the selection fields
              (_rfiHasuraFields info)
              (_rfiRemoteFields info)
              (_rfiRemoteSchema info)

type TableAggregateFields = RS.TableAggregateFieldsG UnresolvedVal

fromAggSelSet
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m TableAggregateFields
fromAggSelSet colGNameMap fldTy selSet = fmap toFields $
  traverseObjectSelectionSet selSet $ \Field{..} ->
    case _fName of
      "__typename" -> return $ RS.TAFExp $ G.unName $ G.unNamedType fldTy
      "aggregate" -> do
        objSelSet <- asObjectSelectionSet _fSelSet
        RS.TAFAgg <$> convertAggregateField colGNameMap _fType objSelSet
      "nodes" -> do
        objSelSet <- asObjectSelectionSet _fSelSet
        RS.TAFNodes <$> processTableSelectionSet _fType objSelSet
      G.Name t -> throw500 $ "unexpected field in _agg node: " <> t

fromConnectionSelSet
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => G.NamedType -> ObjectSelectionSet -> m (RS.ConnectionFields UnresolvedVal)
fromConnectionSelSet fldTy selSet = fmap toFields $
  traverseObjectSelectionSet selSet $ \Field{..} ->
    case _fName of
      "__typename" -> return $ RS.ConnectionTypename $ G.unName $ G.unNamedType fldTy
      "pageInfo" -> do
        fSelSet <- asObjectSelectionSet _fSelSet
        RS.ConnectionPageInfo <$> parsePageInfoSelectionSet _fType fSelSet
      "edges" -> do
        fSelSet <- asObjectSelectionSet _fSelSet
        RS.ConnectionEdges <$> parseEdgeSelectionSet _fType fSelSet
      -- "aggregate" -> RS.TAFAgg <$> convertAggregateField colGNameMap fTy fSelSet
      -- "nodes" -> RS.TAFNodes <$> processTableSelectionSet fTy fSelSet
      G.Name t -> throw500 $ "unexpected field in _connection node: " <> t

parseEdgeSelectionSet
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => G.NamedType -> ObjectSelectionSet -> m (RS.EdgeFields UnresolvedVal)
parseEdgeSelectionSet fldTy selSet = fmap toFields $
  traverseObjectSelectionSet selSet $ \f -> do
    let fTy = _fType f
    case _fName f of
      "__typename" -> pure $ RS.EdgeTypename $ G.unName $ G.unNamedType fldTy
      "cursor" -> pure RS.EdgeCursor
      "node" -> do
        fSelSet <- asObjectSelectionSet $ _fSelSet f
        RS.EdgeNode <$> processTableSelectionSet fTy fSelSet
      G.Name t -> throw500 $ "unexpected field in Edge node: " <> t

parsePageInfoSelectionSet
  :: ( MonadReusability m, MonadError QErr m )
  => G.NamedType -> ObjectSelectionSet -> m RS.PageInfoFields
parsePageInfoSelectionSet fldTy selSet =
  fmap toFields $ traverseObjectSelectionSet selSet $ \f ->
    case _fName f of
      "__typename" -> pure $ RS.PageInfoTypename $ G.unName $ G.unNamedType fldTy
      "hasNextPage" -> pure RS.PageInfoHasNextPage
      "hasPreviousPage" -> pure RS.PageInfoHasPreviousPage
      "startCursor" -> pure RS.PageInfoStartCursor
      "endCursor" -> pure RS.PageInfoEndCursor
      G.Name t -> throw500 $ "unexpected field in PageInfo node: " <> t

type SelectArgs = RS.SelectArgsG UnresolvedVal

parseSelectArgs
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m
     , Has FieldMap r, Has OrdByCtx r
     )
  => PGColGNameMap -> ArgsMap -> m SelectArgs
parseSelectArgs colGNameMap args = do
  whereExpM <- withArgM args "where" parseBoolExp
  ordByExpML <- withArgM args "order_by" parseOrderBy
  let ordByExpM = NE.nonEmpty =<< ordByExpML
  limitExpM <- withArgM args "limit" $
    parseNonNegativeInt "expecting Integer value for \"limit\""
  offsetExpM <- withArgM args "offset" $ asPGColumnValue >=> openOpaqueValue >=> txtConverter
  distOnColsML <- withArgM args "distinct_on" $ parseColumns colGNameMap
  let distOnColsM = NE.nonEmpty =<< distOnColsML
  mapM_ (validateDistOn ordByExpM) distOnColsM
  return $ RS.SelectArgs whereExpM ordByExpM limitExpM offsetExpM distOnColsM
  where
    validateDistOn Nothing _ = return ()
    validateDistOn (Just ordBys) cols = withPathK "args" $ do
      let colsLen = length cols
          initOrdBys = take colsLen $ toList ordBys
          initOrdByCols = flip mapMaybe initOrdBys $ \ob ->
            case obiColumn ob of
              RS.AOCColumn pgCol -> Just $ pgiColumn pgCol
              _                  -> Nothing
          isValid = (colsLen == length initOrdByCols)
                    && all (`elem` initOrdByCols) (toList cols)

      unless isValid $ throwVE
        "\"distinct_on\" columns must match initial \"order_by\" columns"

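The `validateDistOn` check above enforces the PostgreSQL `SELECT DISTINCT ON` requirement: every `distinct_on` column must appear, in some order, among the first N `order_by` columns. A minimal standalone sketch of that rule, using hypothetical names and plain column-name strings instead of the module's column-info types:

```haskell
-- Hypothetical, simplified restatement of the validateDistOn rule:
-- the distinct_on columns must be exactly the leading order_by columns,
-- in any order among themselves.
distinctOnMatchesOrderBy
  :: [String]  -- ^ distinct_on columns
  -> [String]  -- ^ order_by columns, in order
  -> Bool
distinctOnMatchesOrderBy distCols orderCols =
  let initCols = take (length distCols) orderCols
  in length initCols == length distCols
     && all (`elem` initCols) distCols

main :: IO ()
main = do
  print $ distinctOnMatchesOrderBy ["a", "b"] ["b", "a", "c"] -- True
  print $ distinctOnMatchesOrderBy ["a", "b"] ["a", "c", "b"] -- False
```

The real code additionally requires the leading `order_by` items to be plain columns (not relationship or aggregate orderings), which is why `initOrdByCols` filters on `RS.AOCColumn`.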
type AnnSimpleSelect = RS.AnnSimpleSelG UnresolvedVal

fromField
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => RS.SelectFromG UnresolvedVal
  -> PGColGNameMap
  -> AnnBoolExpPartialSQL
  -> Maybe Int
  -> Field -> m AnnSimpleSelect
fromField selFrom colGNameMap permFilter permLimitM fld = fieldAsPath fld $ do
  tableArgs <- parseSelectArgs colGNameMap args
  selSet <- asObjectSelectionSet $ _fSelSet fld
  annFlds <- processTableSelectionSet (_fType fld) selSet
  let unresolvedPermFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter
  let tabPerm = RS.TablePerm unresolvedPermFltr permLimitM
  strfyNum <- stringifyNum <$> asks getter
  return $ RS.AnnSelectG annFlds selFrom tabPerm tableArgs strfyNum
  where
    args = _fArguments fld

getOrdByItemMap
  :: ( MonadError QErr m
     , MonadReader r m
     , Has OrdByCtx r
     )
  => G.NamedType -> m OrdByItemMap
getOrdByItemMap nt = do
  ordByCtx <- asks getter
  onNothing (Map.lookup nt ordByCtx) $
    throw500 $ "could not lookup " <> showNamedTy nt

parseOrderBy
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has OrdByCtx r
     )
  => AnnInpVal
  -> m [RS.AnnOrderByItemG UnresolvedVal]
parseOrderBy = fmap concat . withArray f
  where
    f _ = mapM (withObject (getAnnObItems id))

getAnnObItems
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has OrdByCtx r
     )
  => (RS.AnnOrderByElement UnresolvedVal -> RS.AnnOrderByElement UnresolvedVal)
  -> G.NamedType
  -> AnnGObject
  -> m [RS.AnnOrderByItemG UnresolvedVal]
getAnnObItems f nt obj = do
  ordByItemMap <- getOrdByItemMap nt
  fmap concat $ forM (OMap.toList obj) $ \(k, v) -> do
    ordByItem <- onNothing (Map.lookup k ordByItemMap) $ throw500 $
      "cannot lookup " <> showName k <> " order by item in "
      <> showNamedTy nt <> " map"
    case ordByItem of
      OBIPGCol ci -> do
        let aobCol = f $ RS.AOCColumn ci
        (_, enumValM) <- asEnumValM v
        ordByItemM <- forM enumValM $ \enumVal -> do
          (ordTy, nullsOrd) <- parseOrderByEnum enumVal
          return $ mkOrdByItemG ordTy aobCol nullsOrd
        return $ maybe [] pure ordByItemM

      OBIRel ri fltr -> do
        let unresolvedFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal fltr
        let annObColFn = f . RS.AOCObjectRelation ri unresolvedFltr
        flip withObjectM v $ \nameTy objM ->
          maybe (pure []) (getAnnObItems annObColFn nameTy) objM

      OBIAgg ri relColGNameMap fltr -> do
        let unresolvedFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal fltr
        let aobColFn = f . RS.AOCArrayAggregation ri unresolvedFltr
        flip withObjectM v $ \_ objM ->
          maybe (pure []) (parseAggOrdBy relColGNameMap aobColFn) objM

mkOrdByItemG :: S.OrderType -> a -> S.NullsOrder -> OrderByItemG a
mkOrdByItemG ordTy aobCol nullsOrd =
  OrderByItemG (Just $ OrderType ordTy) aobCol (Just $ NullsOrder nullsOrd)

parseAggOrdBy
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap
  -> (RS.AnnAggregateOrderBy -> RS.AnnOrderByElement UnresolvedVal)
  -> AnnGObject
  -> m [RS.AnnOrderByItemG UnresolvedVal]
parseAggOrdBy colGNameMap f annObj =
  fmap concat <$> forM (OMap.toList annObj) $ \(op, obVal) ->
    case op of
      "count" -> do
        (_, enumValM) <- asEnumValM obVal
        ordByItemM <- forM enumValM $ \enumVal -> do
          (ordTy, nullsOrd) <- parseOrderByEnum enumVal
          return $ mkOrdByItemG ordTy (f RS.AAOCount) nullsOrd
        return $ maybe [] pure ordByItemM

      G.Name opText ->
        flip withObject obVal $ \_ opObObj -> fmap catMaybes $
          forM (OMap.toList opObObj) $ \(colName, eVal) -> do
            (_, enumValM) <- asEnumValM eVal
            forM enumValM $ \enumVal -> do
              (ordTy, nullsOrd) <- parseOrderByEnum enumVal
              col <- resolvePGCol colGNameMap colName
              let aobCol = f $ RS.AAOOp opText col
              return $ mkOrdByItemG ordTy aobCol nullsOrd

parseOrderByEnum
  :: (MonadError QErr m)
  => G.EnumValue
  -> m (S.OrderType, S.NullsOrder)
parseOrderByEnum = \case
  G.EnumValue "asc"              -> return (S.OTAsc, S.NLast)
  G.EnumValue "asc_nulls_last"   -> return (S.OTAsc, S.NLast)
  G.EnumValue "asc_nulls_first"  -> return (S.OTAsc, S.NFirst)
  G.EnumValue "desc"             -> return (S.OTDesc, S.NFirst)
  G.EnumValue "desc_nulls_first" -> return (S.OTDesc, S.NFirst)
  G.EnumValue "desc_nulls_last"  -> return (S.OTDesc, S.NLast)
  G.EnumValue v                  -> throw500 $
    "enum value " <> showName v <> " not found in type order_by"

parseNonNegativeInt
  :: (MonadReusability m, MonadError QErr m) => Text -> AnnInpVal -> m Int
parseNonNegativeInt errMsg v = do
  pgColVal <- openOpaqueValue =<< asPGColumnValue v
  limit <- maybe (throwVE errMsg) return . pgColValueToInt . pstValue $ _apvValue pgColVal
  -- validate int value
  onlyPositiveInt limit
  return limit

type AnnSimpleSel = RS.AnnSimpleSelG UnresolvedVal

fromFieldByPKey
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => QualifiedTable -> PGColArgMap
  -> AnnBoolExpPartialSQL -> Field -> m AnnSimpleSel
fromFieldByPKey tn colArgMap permFilter fld = fieldAsPath fld $ do
  boolExp <- pgColValToBoolExp colArgMap $ _fArguments fld
  selSet <- asObjectSelectionSet $ _fSelSet fld
  annFlds <- processTableSelectionSet fldTy selSet
  let tabFrom = RS.FromTable tn
      unresolvedPermFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal
                           permFilter
      tabPerm = RS.TablePerm unresolvedPermFltr Nothing
      tabArgs = RS.noSelectArgs { RS._saWhere = Just boolExp }
  strfyNum <- stringifyNum <$> asks getter
  return $ RS.AnnSelectG annFlds tabFrom tabPerm tabArgs strfyNum
  where
    fldTy = _fType fld

convertSelect
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => SelOpCtx -> Field -> m (RS.AnnSimpleSelG UnresolvedVal)
convertSelect opCtx fld =
  withPathK "selectionSet" $
    fromField (RS.FromTable qt) colGNameMap permFilter permLimit fld
  where
    SelOpCtx qt _ colGNameMap permFilter permLimit = opCtx

convertSelectByPKey
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => SelPkOpCtx -> Field -> m (RS.AnnSimpleSelG UnresolvedVal)
convertSelectByPKey opCtx fld =
  withPathK "selectionSet" $
    fromFieldByPKey qt colArgMap permFilter fld
  where
    SelPkOpCtx qt _ permFilter colArgMap = opCtx

-- agg select related
parseColumns
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap -> AnnInpVal -> m [PGCol]
parseColumns allColFldMap val =
  flip withArray val $ \_ vals ->
    forM vals $ \v -> do
      (_, G.EnumValue enumVal) <- asEnumVal v
      pgiColumn <$> resolvePGCol allColFldMap enumVal

convertCount
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap -> ArgsMap -> m S.CountType
convertCount colGNameMap args = do
  columnsM <- withArgM args "columns" $ parseColumns colGNameMap
  isDistinct <- or <$> withArgM args "distinct" parseDistinct
  maybe (return S.CTStar) (mkCType isDistinct) columnsM
  where
    parseDistinct v = do
      val <- openOpaqueValue =<< asPGColumnValue v
      case pstValue $ _apvValue val of
        PGValBoolean b -> return b
        _              ->
          throw500 "expecting Boolean for \"distinct\""

    mkCType isDistinct cols = return $
      bool (S.CTSimple cols) (S.CTDistinct cols) isDistinct

toFields :: [(T.Text, a)] -> RS.Fields a
toFields = map (first FieldName)

convertColumnFields
  :: (MonadError QErr m)
  => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m RS.ColumnFields
convertColumnFields colGNameMap ty selSet = fmap toFields $
  traverseObjectSelectionSet selSet $ \fld ->
    case _fName fld of
      "__typename" -> return $ RS.PCFExp $ G.unName $ G.unNamedType ty
      n            -> RS.PCFCol . pgiColumn <$> resolvePGCol colGNameMap n

convertAggregateField
  :: (MonadReusability m, MonadError QErr m)
  => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m RS.AggregateFields
convertAggregateField colGNameMap ty selSet = fmap toFields $
  traverseObjectSelectionSet selSet $ \Field{..} ->
    case _fName of
      "__typename" -> return $ RS.AFExp $ G.unName $ G.unNamedType ty
      "count"      -> RS.AFCount <$> convertCount colGNameMap _fArguments
      n            -> do
        fSelSet <- asObjectSelectionSet _fSelSet
        colFlds <- convertColumnFields colGNameMap _fType fSelSet
        unless (isAggregateField n) $ throwInvalidFld n
        return $ RS.AFOp $ RS.AggregateOp (G.unName n) colFlds
  where
    throwInvalidFld (G.Name t) =
      throw500 $ "unexpected field in _aggregate node: " <> t

type AnnAggregateSelect = RS.AnnAggregateSelectG UnresolvedVal

fromAggField
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => RS.SelectFromG UnresolvedVal
  -> PGColGNameMap
  -> AnnBoolExpPartialSQL
  -> Maybe Int
  -> Field -> m AnnAggregateSelect
fromAggField selectFrom colGNameMap permFilter permLimit fld = fieldAsPath fld $ do
  tableArgs <- parseSelectArgs colGNameMap args
  selSet <- asObjectSelectionSet $ _fSelSet fld
  aggSelFlds <- fromAggSelSet colGNameMap (_fType fld) selSet
  let unresolvedPermFltr =
        fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter
  let tabPerm = RS.TablePerm unresolvedPermFltr permLimit
  strfyNum <- stringifyNum <$> asks getter
  return $ RS.AnnSelectG aggSelFlds selectFrom tabPerm tableArgs strfyNum
  where
    args = _fArguments fld

fromConnectionField
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => RS.SelectFromG UnresolvedVal
  -> PrimaryKeyColumns
  -> AnnBoolExpPartialSQL
  -> Maybe Int
  -> Field -> m (RS.ConnectionSelect UnresolvedVal)
fromConnectionField selectFrom pkCols permFilter permLimit fld = fieldAsPath fld $ do
  (tableArgs, slice, split) <- parseConnectionArgs pkCols args
  selSet <- asObjectSelectionSet $ _fSelSet fld
  connSelFlds <- fromConnectionSelSet (_fType fld) selSet
  strfyNum <- stringifyNum <$> asks getter
  let unresolvedPermFltr =
        fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter
      tabPerm = RS.TablePerm unresolvedPermFltr permLimit
      annSel = RS.AnnSelectG connSelFlds selectFrom tabPerm tableArgs strfyNum
  pure $ RS.ConnectionSelect pkCols split slice annSel
  where
    args = _fArguments fld

parseConnectionArgs
  :: forall r m.
     ( MonadReusability m, MonadError QErr m, MonadReader r m
     , Has FieldMap r, Has OrdByCtx r
     )
  => PrimaryKeyColumns
  -> ArgsMap
  -> m ( SelectArgs
       , Maybe RS.ConnectionSlice
       , Maybe (NE.NonEmpty (RS.ConnectionSplit UnresolvedVal))
       )
parseConnectionArgs pKeyColumns args = do
  whereExpM <- withArgM args "where" parseBoolExp
  ordByExpML <- withArgM args "order_by" parseOrderBy

  slice <- case (Map.lookup "first" args, Map.lookup "last" args) of
    (Nothing, Nothing) -> pure Nothing
    (Just _, Just _)   -> throwVE "\"first\" and \"last\" are not allowed at once"
    (Just v, Nothing)  -> Just . RS.SliceFirst <$> parseNonNegativeInt
                          "expecting Integer value for \"first\"" v
    (Nothing, Just v)  -> Just . RS.SliceLast <$> parseNonNegativeInt
                          "expecting Integer value for \"last\"" v

  maybeSplit <- case (Map.lookup "after" args, Map.lookup "before" args) of
    (Nothing, Nothing) -> pure Nothing
    (Just _, Just _)   -> throwVE "\"after\" and \"before\" are not allowed at once"
    (Just v, Nothing)  -> fmap ((RS.CSKAfter,) . base64Decode) <$> asPGColTextM v
    (Nothing, Just v)  -> fmap ((RS.CSKBefore,) . base64Decode) <$> asPGColTextM v

  let ordByExpM = NE.nonEmpty =<< appendPrimaryKeyOrderBy <$> ordByExpML
      tableArgs = RS.SelectArgs whereExpM ordByExpM Nothing Nothing Nothing

  split <- mapM (uncurry (validateConnectionSplit ordByExpM)) maybeSplit
  pure (tableArgs, slice, split)
  where
    appendPrimaryKeyOrderBy :: [RS.AnnOrderByItemG v] -> [RS.AnnOrderByItemG v]
    appendPrimaryKeyOrderBy orderBys =
      let orderByColumnNames =
            orderBys ^.. traverse . to obiColumn . RS._AOCColumn . to pgiColumn
          pkeyOrderBys = flip mapMaybe (toList pKeyColumns) $ \pgColumnInfo ->
            if pgiColumn pgColumnInfo `elem` orderByColumnNames then Nothing
            else Just $ OrderByItemG Nothing (RS.AOCColumn pgColumnInfo) Nothing
      in orderBys <> pkeyOrderBys

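Relay-style cursors only work when the row ordering is total, so `appendPrimaryKeyOrderBy` above appends every primary-key column not already mentioned in `order_by` as a tie-breaker. A minimal standalone sketch of that idea (hypothetical names, plain column-name strings in place of the module's order-by items):

```haskell
-- Hypothetical, simplified version of appendPrimaryKeyOrderBy: the
-- user-supplied ordering is kept, and any primary-key column it does not
-- cover is appended so that cursors identify rows unambiguously.
appendPrimaryKeyOrderBy
  :: [String]  -- ^ primary-key columns
  -> [String]  -- ^ user-supplied order_by columns
  -> [String]
appendPrimaryKeyOrderBy pkCols orderBys =
  orderBys ++ filter (`notElem` orderBys) pkCols

main :: IO ()
main = print $ appendPrimaryKeyOrderBy ["id"] ["created_at"]
-- ["created_at","id"]
```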
    validateConnectionSplit
      :: Maybe (NonEmpty (RS.AnnOrderByItemG UnresolvedVal))
      -> RS.ConnectionSplitKind
      -> BL.ByteString
      -> m (NonEmpty (RS.ConnectionSplit UnresolvedVal))
    validateConnectionSplit maybeOrderBys splitKind cursorSplit = do
      cursorValue <- either (const throwInvalidCursor) pure $
        J.eitherDecode cursorSplit
      case maybeOrderBys of
        Nothing -> forM (NESeq.toNonEmpty pKeyColumns) $
          \pgColumnInfo -> do
            let columnJsonPath = [J.Key $ getPGColTxt $ pgiColumn pgColumnInfo]
            pgColumnValue <- maybe throwInvalidCursor pure $ iResultToMaybe $
              executeJSONPath columnJsonPath cursorValue
            pgValue <- parsePGScalarValue (pgiType pgColumnInfo) pgColumnValue
            let unresolvedValue = UVPG $ AnnPGVal Nothing False pgValue
            pure $ RS.ConnectionSplit splitKind unresolvedValue $
              OrderByItemG Nothing (RS.AOCColumn pgColumnInfo) Nothing
        Just orderBys ->
          forM orderBys $ \orderBy -> do
            let OrderByItemG orderType annObCol nullsOrder = orderBy
            orderByItemValue <- maybe throwInvalidCursor pure $ iResultToMaybe $
              executeJSONPath (getPathFromOrderBy annObCol) cursorValue
            pgValue <- parsePGScalarValue (getOrderByColumnType annObCol) orderByItemValue
            let unresolvedValue = UVPG $ AnnPGVal Nothing False pgValue
            pure $ RS.ConnectionSplit splitKind unresolvedValue $
              OrderByItemG orderType (() <$ annObCol) nullsOrder
      where
        throwInvalidCursor = throwVE "the \"after\" or \"before\" cursor is invalid"

        iResultToMaybe = \case
          J.ISuccess v -> Just v
          J.IError{}   -> Nothing

        getPathFromOrderBy = \case
          RS.AOCColumn pgColInfo ->
            let pathElement = J.Key $ getPGColTxt $ pgiColumn pgColInfo
            in [pathElement]
          RS.AOCObjectRelation relInfo _ obCol ->
            let pathElement = J.Key $ relNameToTxt $ riName relInfo
            in pathElement : getPathFromOrderBy obCol
          RS.AOCArrayAggregation relInfo _ aggOb ->
            let fieldName = J.Key $ relNameToTxt (riName relInfo) <> "_aggregate"
            in fieldName : case aggOb of
                 RS.AAOCount    -> [J.Key "count"]
                 RS.AAOOp t col -> [J.Key t, J.Key $ getPGColTxt $ pgiColumn col]

        getOrderByColumnType = \case
          RS.AOCColumn pgColInfo         -> pgiType pgColInfo
          RS.AOCObjectRelation _ _ obCol -> getOrderByColumnType obCol
          RS.AOCArrayAggregation _ _ aggOb ->
            case aggOb of
              RS.AAOCount        -> PGColumnScalar PGInteger
              RS.AAOOp _ colInfo -> pgiType colInfo

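`getPathFromOrderBy` above maps each order-by element to the JSON path at which its value lives inside the decoded cursor: a plain column is a single key, an object relationship prefixes the relationship name, and an array aggregation targets the `<rel>_aggregate` object. A standalone sketch with hypothetical constructors and plain strings:

```haskell
-- Hypothetical, simplified model of getPathFromOrderBy: turn an order-by
-- element into the key path of its value inside the cursor's JSON object.
data OrderByElem
  = Column String                          -- order by a table column
  | ObjectRel String OrderByElem           -- relationship name, nested element
  | ArrayAgg String (Maybe (String, String)) -- rel name; Nothing = count, Just (op, col)

pathFromOrderBy :: OrderByElem -> [String]
pathFromOrderBy (Column col)                    = [col]
pathFromOrderBy (ObjectRel rel nested)          = rel : pathFromOrderBy nested
pathFromOrderBy (ArrayAgg rel Nothing)          = [rel ++ "_aggregate", "count"]
pathFromOrderBy (ArrayAgg rel (Just (op, col))) = [rel ++ "_aggregate", op, col]

main :: IO ()
main = print $ pathFromOrderBy (ObjectRel "author" (Column "name"))
-- ["author","name"]
```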
convertAggSelect
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => SelOpCtx -> Field -> m (RS.AnnAggregateSelectG UnresolvedVal)
convertAggSelect opCtx fld =
  withPathK "selectionSet" $
    fromAggField (RS.FromTable qt) colGNameMap permFilter permLimit fld
  where
    SelOpCtx qt _ colGNameMap permFilter permLimit = opCtx

convertConnectionSelect
  :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r
     , Has OrdByCtx r, Has SQLGenCtx r
     )
  => PrimaryKeyColumns -> SelOpCtx -> Field -> m (RS.ConnectionSelect UnresolvedVal)
convertConnectionSelect pkCols opCtx fld =
  withPathK "selectionSet" $
    fromConnectionField (RS.FromTable qt) pkCols permFilter permLimit fld
  where
    SelOpCtx qt _ _ permFilter permLimit = opCtx

parseFunctionArgs
  :: (MonadReusability m, MonadError QErr m)
  => Seq.Seq a
  -> (a -> InputFunctionArgument)
  -> Maybe AnnInpVal
  -> m (RS.FunctionArgsExpG UnresolvedVal)
parseFunctionArgs argSeq argFn = withPathK "args" . \case
  Nothing -> do
    -- The input "args" field is not provided, hence resolve only known
    -- input arguments as positional arguments
    let positionalArgs = mapMaybe ((^? _IFAKnown._2) . argFn) $ toList argSeq
    pure RS.emptyFunctionArgsExp{RS._faePositional = positionalArgs}

  Just val -> flip withObject val $ \_ obj -> do
    (positionalArgs, argsLeft) <- spanMaybeM (parsePositionalArg obj) argSeq
    namedArgs <- Map.fromList . catMaybes <$> traverse (parseNamedArg obj) argsLeft
    pure $ RS.FunctionArgsExp positionalArgs namedArgs
  where
    parsePositionalArg obj inputArg = case argFn inputArg of
      IFAKnown _ resolvedVal -> pure $ Just resolvedVal
      IFAUnknown (FunctionArgItem gqlName _ _) ->
        maybe (pure Nothing) (fmap Just . parseArg) $ OMap.lookup gqlName obj

    parseArg = fmap (maybe (UVSQL S.SENull) mkParameterizablePGValue) . asPGColumnValueM

    parseNamedArg obj inputArg = case argFn inputArg of
      IFAKnown argName resolvedVal ->
        pure $ Just (getFuncArgNameTxt argName, resolvedVal)
      IFAUnknown (FunctionArgItem gqlName maybeSqlName hasDefault) ->
        case OMap.lookup gqlName obj of
          Just argInpVal -> case maybeSqlName of
            Just sqlName -> Just . (getFuncArgNameTxt sqlName,) <$> parseArg argInpVal
            Nothing -> throw400 NotSupported
                       "Only last set of positional arguments can be omitted"
          Nothing -> if not (unHasDefault hasDefault) then
                       throw400 NotSupported "Non default arguments cannot be omitted"
                     else pure Nothing

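The `spanMaybeM` step in `parseFunctionArgs` above consumes declared arguments positionally for as long as the caller keeps supplying them, then hands the remainder to the named-argument path. A standalone sketch of that split, with hypothetical names and plain maps standing in for the resolved argument machinery:

```haskell
-- Hypothetical, simplified model of the positional/named split in
-- parseFunctionArgs: leading declared arguments that the caller supplied
-- become positional (in declaration order); the rest stay named.
import qualified Data.Map.Strict as M

splitArgs
  :: [String]          -- ^ declared argument names, in order
  -> M.Map String Int  -- ^ arguments actually supplied by the caller
  -> ([Int], M.Map String Int)
splitArgs declared supplied =
  let (positionalNames, rest) = span (`M.member` supplied) declared
      positional = map (supplied M.!) positionalNames
      named = M.filterWithKey (\k _ -> k `elem` rest) supplied
  in (positional, named)

main :: IO ()
main = print $ splitArgs ["a", "b", "c"] (M.fromList [("a", 1), ("c", 3)])
-- ([1],fromList [("c",3)])
```

The real code additionally rejects an omitted argument unless it has a SQL default, which this sketch does not model.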
makeFunctionSelectFrom
  :: (MonadReusability m, MonadError QErr m)
  => QualifiedFunction
  -> FunctionArgSeq
  -> Field
  -> m (RS.SelectFromG UnresolvedVal)
makeFunctionSelectFrom qf argSeq fld = withPathK "args" $ do
  funcArgs <- parseFunctionArgs argSeq argFn $ Map.lookup "args" $ _fArguments fld
  pure $ RS.FromFunction qf (RS.AEInput <$> funcArgs) Nothing
  where
    argFn (IAUserProvided val)         = IFAUnknown val
    argFn (IASessionVariables argName) = IFAKnown argName UVSession

convertFuncQuerySimple
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => FuncQOpCtx -> Field -> m AnnSimpleSelect
convertFuncQuerySimple funcOpCtx fld =
  withPathK "selectionSet" $ fieldAsPath fld $ do
    selectFrom <- makeFunctionSelectFrom qf argSeq fld
    fromField selectFrom colGNameMap permFilter permLimit fld
  where
    FuncQOpCtx qf argSeq _ colGNameMap permFilter permLimit = funcOpCtx

convertFuncQueryAgg
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => FuncQOpCtx -> Field -> m AnnAggregateSelect
convertFuncQueryAgg funcOpCtx fld =
  withPathK "selectionSet" $ fieldAsPath fld $ do
    selectFrom <- makeFunctionSelectFrom qf argSeq fld
    fromAggField selectFrom colGNameMap permFilter permLimit fld
  where
    FuncQOpCtx qf argSeq _ colGNameMap permFilter permLimit = funcOpCtx

convertConnectionFuncQuery
  :: ( MonadReusability m
     , MonadError QErr m
     , MonadReader r m
     , Has FieldMap r
     , Has OrdByCtx r
     , Has SQLGenCtx r
     )
  => PrimaryKeyColumns -> FuncQOpCtx -> Field -> m (RS.ConnectionSelect UnresolvedVal)
convertConnectionFuncQuery pkCols funcOpCtx fld =
  withPathK "selectionSet" $ fieldAsPath fld $ do
    selectFrom <- makeFunctionSelectFrom qf argSeq fld
    fromConnectionField selectFrom pkCols permFilter permLimit fld
  where
    FuncQOpCtx qf argSeq _ _ permFilter permLimit = funcOpCtx

throwInvalidNodeId :: MonadError QErr m => Text -> m a
throwInvalidNodeId t = throwVE $ "the node id is invalid: " <> t

resolveNodeId
  :: ( MonadError QErr m
     , MonadReusability m
     )
  => Field -> m NodeId
resolveNodeId field =
  withPathK "selectionSet" $ fieldAsPath field $
    withArg (_fArguments field) "id" $ asPGColText >=>
      either (throwInvalidNodeId . T.pack) pure . J.eitherDecode . base64Decode

convertNodeSelect
  :: forall m r. ( MonadReusability m
                 , MonadError QErr m
                 , MonadReader r m
                 , Has FieldMap r
                 , Has OrdByCtx r
                 , Has SQLGenCtx r
                 )
  => SelOpCtx
  -> PrimaryKeyColumns
  -> NESeq.NESeq J.Value
  -> Field
  -> m (RS.AnnSimpleSelG UnresolvedVal)
convertNodeSelect selOpCtx pkeyColumns columnValues field =
  withPathK "selectionSet" $ fieldAsPath field $ do
    -- Parse selection set as interface
    ifaceSelectionSet <- asInterfaceSelectionSet $ _fSelSet field
    let tableObjectType = mkTableTy table
        selSet = getMemberSelectionSet tableObjectType ifaceSelectionSet
        unresolvedPermFilter = fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter
        tablePerm = RS.TablePerm unresolvedPermFilter permLimit
    -- Resolve the table selection set
    annFields <- processTableSelectionSet tableObjectType selSet
    -- Resolve the Node id primary key column values
    pkeyColumnValues <- alignPkeyColumnValues
    unresolvedPkeyValues <- flip Map.traverseWithKey pkeyColumnValues $
      \columnInfo jsonValue ->
        let modifyErrFn t = "value of column " <> pgiColumn columnInfo
                            <<> " in node id: " <> t
        in modifyErr modifyErrFn $
           (,columnInfo) . UVPG . AnnPGVal Nothing False <$>
           parsePGScalarValue (pgiType columnInfo) jsonValue

    -- Generate the bool expression from the primary key column values
    let pkeyBoolExp = BoolAnd $ flip map (Map.elems unresolvedPkeyValues) $
          \(unresolvedValue, columnInfo) -> (BoolFld . AVCol columnInfo) [AEQ True unresolvedValue]
        selectArgs = RS.noSelectArgs{RS._saWhere = Just pkeyBoolExp}
    strfyNum <- stringifyNum <$> asks getter
    pure $ RS.AnnSelectG annFields (RS.FromTable table) tablePerm selectArgs strfyNum
  where
    SelOpCtx table _ _ permFilter permLimit = selOpCtx

    alignPkeyColumnValues :: m (Map.HashMap PGColumnInfo J.Value)
    alignPkeyColumnValues = do
      let NESeq.NESeq (firstPkColumn, remainingPkColumns) = pkeyColumns
          NESeq.NESeq (firstColumnValue, remainingColumns) = columnValues
          (nonAlignedPkColumns, nonAlignedColumnValues, alignedTuples) =
            partitionThese $ toList $ align remainingPkColumns remainingColumns

      when (not $ null nonAlignedPkColumns) $ throwInvalidNodeId $
        "primary key columns " <> dquoteList (map pgiColumn nonAlignedPkColumns) <> " are missing"

      when (not $ null nonAlignedColumnValues) $ throwInvalidNodeId $
        "unexpected column values " <> J.encodeToStrictText nonAlignedColumnValues

      pure $ Map.fromList $ (firstPkColumn, firstColumnValue):alignedTuples
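`alignPkeyColumnValues` above pairs the table's primary-key columns with the values decoded from the node id, and fails with a distinct message for each mismatch direction: columns left without a value, or values left without a column. A standalone sketch of that alignment, with hypothetical names and plain lists instead of `align`/`partitionThese`:

```haskell
-- Hypothetical, simplified model of alignPkeyColumnValues: zip primary-key
-- columns against node-id values; leftover columns mean the id is missing
-- values, leftover values mean the id carries values for unknown columns.
alignPkValues :: [String] -> [Int] -> Either String [(String, Int)]
alignPkValues pkCols vals
  | length vals < length pkCols =
      Left $ "primary key columns missing: " ++ show (drop (length vals) pkCols)
  | length vals > length pkCols =
      Left $ "unexpected column values: " ++ show (drop (length pkCols) vals)
  | otherwise = Right (zip pkCols vals)

main :: IO ()
main = print $ alignPkValues ["id", "org_id"] [1, 2]
-- Right [("id",1),("org_id",2)]
```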
|
@ -1,337 +0,0 @@
|
||||
module Hasura.GraphQL.Resolve.Types
  ( module Hasura.GraphQL.Resolve.Types
  -- * Re-exports
  , MonadReusability(..)
  ) where

import           Control.Lens.TH
import           Hasura.Prelude

import qualified Data.Aeson                    as J
import qualified Data.HashMap.Strict           as Map
import qualified Data.Sequence                 as Seq
import qualified Data.Sequence.NonEmpty        as NESeq
import qualified Data.Text                     as T
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.DDL.Headers        (HeaderConf)
import           Hasura.RQL.Types.Action
import           Hasura.RQL.Types.BoolExp
import           Hasura.RQL.Types.Column
import           Hasura.RQL.Types.Common
import           Hasura.RQL.Types.ComputedField
import           Hasura.RQL.Types.CustomTypes
import           Hasura.RQL.Types.Function
import           Hasura.RQL.Types.RemoteRelationship
import           Hasura.Session
import           Hasura.SQL.Types
import           Hasura.SQL.Value

import qualified Hasura.SQL.DML                as S

type NodeSelectMap = Map.HashMap G.NamedType (SelOpCtx, PrimaryKeyColumns)

data QueryCtx
  = QCNodeSelect !NodeSelectMap
  | QCSelect !SelOpCtx
  | QCSelectConnection !PrimaryKeyColumns !SelOpCtx
  | QCSelectPkey !SelPkOpCtx
  | QCSelectAgg !SelOpCtx
  | QCFuncQuery !FuncQOpCtx
  | QCFuncAggQuery !FuncQOpCtx
  | QCFuncConnection !PrimaryKeyColumns !FuncQOpCtx
  | QCAsyncActionFetch !ActionSelectOpContext
  | QCAction !ActionExecutionContext
  deriving (Show, Eq)

data MutationCtx
  = MCInsert !InsOpCtx
  | MCInsertOne !InsOpCtx
  | MCUpdate !UpdOpCtx
  | MCUpdateByPk !UpdOpCtx
  | MCDelete !DelOpCtx
  | MCDeleteByPk !DelOpCtx
  | MCAction !ActionMutationExecutionContext
  deriving (Show, Eq)

type OpCtxMap a = Map.HashMap G.Name a
type QueryCtxMap = OpCtxMap QueryCtx
type MutationCtxMap = OpCtxMap MutationCtx

data InsOpCtx
  = InsOpCtx
  { _iocTable   :: !QualifiedTable
  , _iocHeaders :: ![T.Text]
  } deriving (Show, Eq)
data SelOpCtx
  = SelOpCtx
  { _socTable   :: !QualifiedTable
  , _socHeaders :: ![T.Text]
  , _socAllCols :: !PGColGNameMap
  , _socFilter  :: !AnnBoolExpPartialSQL
  , _socLimit   :: !(Maybe Int)
  } deriving (Show, Eq)

type PGColArgMap = Map.HashMap G.Name PGColumnInfo

data SelPkOpCtx
  = SelPkOpCtx
  { _spocTable   :: !QualifiedTable
  , _spocHeaders :: ![T.Text]
  , _spocFilter  :: !AnnBoolExpPartialSQL
  , _spocArgMap  :: !PGColArgMap
  } deriving (Show, Eq)

type FunctionArgSeq = Seq.Seq (InputArgument FunctionArgItem)

data FuncQOpCtx
  = FuncQOpCtx
  { _fqocFunction :: !QualifiedFunction
  , _fqocArgs     :: !FunctionArgSeq
  , _fqocHeaders  :: ![T.Text]
  , _fqocAllCols  :: !PGColGNameMap
  , _fqocFilter   :: !AnnBoolExpPartialSQL
  , _fqocLimit    :: !(Maybe Int)
  } deriving (Show, Eq)

data UpdOpCtx
  = UpdOpCtx
  { _uocTable      :: !QualifiedTable
  , _uocHeaders    :: ![T.Text]
  , _uocAllCols    :: !PGColGNameMap
  , _uocFilter     :: !AnnBoolExpPartialSQL
  , _uocCheck      :: !(Maybe AnnBoolExpPartialSQL)
  , _uocPresetCols :: !PreSetColsPartial
  } deriving (Show, Eq)

data DelOpCtx
  = DelOpCtx
  { _docTable   :: !QualifiedTable
  , _docHeaders :: ![T.Text]
  , _docAllCols :: !PGColGNameMap
  , _docFilter  :: !AnnBoolExpPartialSQL
  } deriving (Show, Eq)

data ActionExecutionContext
  = ActionExecutionContext
  { _saecName                 :: !ActionName
  , _saecOutputType           :: !GraphQLType
  , _saecOutputFields         :: !ActionOutputFields
  , _saecDefinitionList       :: ![(PGCol, PGScalarType)]
  , _saecWebhook              :: !ResolvedWebhook
  , _saecHeaders              :: ![HeaderConf]
  , _saecForwardClientHeaders :: !Bool
  } deriving (Show, Eq)

data ActionMutationExecutionContext
  = ActionMutationSyncWebhook !ActionExecutionContext
  | ActionMutationAsync
  deriving (Show, Eq)

data ActionSelectOpContext
  = ActionSelectOpContext
  { _asocOutputType     :: !GraphQLType
  , _asocDefinitionList :: ![(PGCol, PGScalarType)]
  } deriving (Show, Eq)
-- (custom name | generated name) -> PG column info
-- used in resolvers
type PGColGNameMap = Map.HashMap G.Name PGColumnInfo

data RelationshipFieldKind
  = RFKAggregate
  | RFKSimple
  | RFKConnection !PrimaryKeyColumns
  deriving (Show, Eq)

data RelationshipField
  = RelationshipField
  { _rfInfo       :: !RelInfo
  , _rfKind       :: !RelationshipFieldKind
  , _rfCols       :: !PGColGNameMap
  , _rfPermFilter :: !AnnBoolExpPartialSQL
  , _rfPermLimit  :: !(Maybe Int)
  } deriving (Show, Eq)

data ComputedFieldTable
  = ComputedFieldTable
  { _cftTable      :: !QualifiedTable
  , _cftCols       :: !PGColGNameMap
  , _cftPermFilter :: !AnnBoolExpPartialSQL
  , _cftPermLimit  :: !(Maybe Int)
  } deriving (Show, Eq)

data ComputedFieldType
  = CFTScalar !PGScalarType
  | CFTTable !ComputedFieldTable
  deriving (Show, Eq)

type ComputedFieldFunctionArgSeq = Seq.Seq FunctionArgItem

data ComputedField
  = ComputedField
  { _cfName     :: !ComputedFieldName
  , _cfFunction :: !ComputedFieldFunction
  , _cfArgSeq   :: !ComputedFieldFunctionArgSeq
  , _cfType     :: !ComputedFieldType
  } deriving (Show, Eq)

data ResolveField
  = RFPGColumn !PGColumnInfo
  | RFRelationship !RelationshipField
  | RFComputedField !ComputedField
  | RFRemoteRelationship !RemoteFieldInfo
  | RFNodeId !QualifiedTable !PrimaryKeyColumns
  deriving (Show, Eq)

type FieldMap = Map.HashMap (G.NamedType, G.Name) ResolveField

-- order by context
data OrdByItem
  = OBIPGCol !PGColumnInfo
  | OBIRel !RelInfo !AnnBoolExpPartialSQL
  | OBIAgg !RelInfo !PGColGNameMap !AnnBoolExpPartialSQL
  deriving (Show, Eq)

type OrdByItemMap = Map.HashMap G.Name OrdByItem

type OrdByCtx = Map.HashMap G.NamedType OrdByItemMap

data FunctionArgItem
  = FunctionArgItem
  { _faiInputArgName :: !G.Name
  , _faiSqlArgName   :: !(Maybe FunctionArgName)
  , _faiHasDefault   :: !HasDefault
  } deriving (Show, Eq)

-- insert context
type RelationInfoMap = Map.HashMap RelName RelInfo

data UpdPermForIns
  = UpdPermForIns
  { upfiCols   :: ![PGCol]
  , upfiCheck  :: !(Maybe AnnBoolExpPartialSQL)
  , upfiFilter :: !AnnBoolExpPartialSQL
  , upfiSet    :: !PreSetColsPartial
  } deriving (Show, Eq)

data InsCtx
  = InsCtx
  { icAllCols   :: !PGColGNameMap
  , icCheck     :: !AnnBoolExpPartialSQL
  , icSet       :: !PreSetColsPartial
  , icRelations :: !RelationInfoMap
  , icUpdPerm   :: !(Maybe UpdPermForIns)
  } deriving (Show, Eq)

type InsCtxMap = Map.HashMap QualifiedTable InsCtx

data AnnPGVal
  = AnnPGVal
  { _apvVariable   :: !(Maybe G.Variable)
  , _apvIsNullable :: !Bool
  , _apvValue      :: !(WithScalarType PGScalarValue)
  } deriving (Show, Eq)
type PrepFn m = AnnPGVal -> m S.SQLExp

-- lifts PartialSQLExp to UnresolvedVal
partialSQLExpToUnresolvedVal :: PartialSQLExp -> UnresolvedVal
partialSQLExpToUnresolvedVal = \case
  PSESessVar ty sessVar -> UVSessVar ty sessVar
  PSESQLExp s           -> UVSQL s

-- | A value that will be converted to an sql expression eventually
data UnresolvedVal
  -- | an entire session variables JSON object
  = UVSession
  | UVSessVar !(PGType PGScalarType) !SessionVariable
  -- | a SQL value literal that can be parameterized over
  | UVPG !AnnPGVal
  -- | an arbitrary SQL expression, which /cannot/ be parameterized over
  | UVSQL !S.SQLExp
  deriving (Show, Eq)

type AnnBoolExpUnresolved = AnnBoolExp UnresolvedVal

data InputFunctionArgument
  = IFAKnown !FunctionArgName !UnresolvedVal -- ^ Known value
  | IFAUnknown !FunctionArgItem              -- ^ Unknown value, needs to be parsed
  deriving (Show, Eq)

{- Note [Relay Node Id]
~~~~~~~~~~~~~~~~~~~~~~~

The 'Node' interface in the Relay schema has exactly one field, which returns
a non-null 'ID' value. Each table object type in the Relay schema should
implement the 'Node' interface to provide global object identification.
See https://relay.dev/graphql/objectidentification.htm for more details.

To identify each row in a table, we need to encode the table information
(schema and name) and primary key column values in the 'Node' id.

Node id data:
-------------
We are using the JSON format for encoding and decoding the node id. The JSON
schema looks like the following:

'[<version-integer>, "<table-schema>", "<table-name>", "column-1", "column-2", ... "column-n"]'

It is represented in the type @'NodeId'. The 'version-integer' represents the JSON
schema version, to enable backward compatibility if it is broken in upcoming versions.

The stringified JSON is Base64 encoded and sent to the client. The same
base64 encoded JSON string is accepted for the 'node' field resolver's 'id' input.
-}

data NodeIdVersion
  = NIVersion1
  deriving (Show, Eq)

nodeIdVersionInt :: NodeIdVersion -> Int
nodeIdVersionInt NIVersion1 = 1

currentNodeIdVersion :: NodeIdVersion
currentNodeIdVersion = NIVersion1

instance J.FromJSON NodeIdVersion where
  parseJSON v = do
    versionInt :: Int <- J.parseJSON v
    case versionInt of
      1 -> pure NIVersion1
      _ -> fail $ "expecting version 1 for node id, but got " <> show versionInt

data V1NodeId
  = V1NodeId
  { _nidTable   :: !QualifiedTable
  , _nidColumns :: !(NESeq.NESeq J.Value)
  } deriving (Show, Eq)

-- | The Relay 'Node' interface's 'id' field value.
-- See Note [Relay Node Id].
data NodeId
  = NodeIdV1 !V1NodeId
  deriving (Show, Eq)

instance J.FromJSON NodeId where
  parseJSON v = do
    valueList <- J.parseJSON v
    case valueList of
      []              -> fail "unexpected GUID format, found empty list"
      J.Number 1:rest -> NodeIdV1 <$> parseNodeIdV1 rest
      J.Number n:_    -> fail $ "unsupported GUID version: " <> show n
      _               -> fail "unexpected GUID format, needs to start with a version number"
    where
      parseNodeIdV1 (schemaValue:(nameValue:(firstColumn:remainingColumns))) =
        V1NodeId
        <$> (QualifiedObject <$> J.parseJSON schemaValue <*> J.parseJSON nameValue)
        <*> pure (NESeq.NESeq (firstColumn, Seq.fromList remainingColumns))
      parseNodeIdV1 _ = fail "GUID version 1: expecting schema name, table name and at least one column value"

-- template haskell related
$(makePrisms ''ResolveField)
$(makeLenses ''ComputedField)
$(makePrisms ''ComputedFieldType)
$(makePrisms ''InputFunctionArgument)
File diff suppressed because it is too large
@@ -1,380 +1,324 @@
module Hasura.GraphQL.Schema.Action
  ( mkActionsSchema
  ( actionExecute
  , actionAsyncMutation
  , actionAsyncQuery
  ) where

import qualified Data.HashMap.Strict           as Map
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Schema.Builder
import           Hasura.GraphQL.Schema.Common  (mkDescriptionWith)

import           Hasura.GraphQL.Resolve.Types
import           Hasura.GraphQL.Validate.Types
import           Hasura.Prelude
import           Data.Has

import qualified Data.Aeson                    as J
import qualified Data.HashMap.Strict           as Map
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.GraphQL.Parser                 as P
import qualified Hasura.GraphQL.Parser.Internal.Parser as P
import qualified Hasura.RQL.DML.Internal               as RQL
import qualified Hasura.RQL.DML.Select.Types           as RQL

import           Hasura.GraphQL.Parser         (FieldParser, InputFieldsParser, Kind (..),
                                                Parser, UnpreparedValue (..))
import           Hasura.GraphQL.Parser.Class
import           Hasura.GraphQL.Schema.Common
import           Hasura.GraphQL.Schema.Select
import           Hasura.RQL.Types
import           Hasura.Session
import           Hasura.SQL.Types
import           Hasura.SQL.Value
mkAsyncActionSelectionType :: ActionName -> G.NamedType
mkAsyncActionSelectionType = G.NamedType . unActionName

mkAsyncActionQueryResponseObj
  :: ActionName
  -- Name of the action
  -> GraphQLType
  -- output type
  -> ObjTyInfo
mkAsyncActionQueryResponseObj actionName outputType =
  mkHsraObjTyInfo
    (Just description)
    (mkAsyncActionSelectionType actionName) -- "(action_name)"
    mempty -- no arguments
    (mapFromL _fiName fieldDefinitions)
  where
    description = G.Description $ "fields of action: " <>> actionName

    mkFieldDefinition (fieldName, fieldDescription, fieldType) =
      mkHsraObjFldInfo
        (Just fieldDescription)
        fieldName
        mempty
        fieldType

    fieldDefinitions = map mkFieldDefinition
      [ ( "id", "the unique id of an action"
        , G.toGT $ mkScalarTy PGUUID)
      , ( "created_at", "the time at which this action was created"
        , G.toGT $ mkScalarTy PGTimeStampTZ)
      , ( "errors", "errors related to the invocation"
        , G.toGT $ mkScalarTy PGJSON)
      , ( "output", "the output fields of this action"
        , unGraphQLType outputType)
      ]

mkQueryActionField
  :: ActionName
-- | actionExecute is used to execute either a query action or a synchronous
-- mutation action. A query action or a synchronous mutation action accepts
-- the field name, input arguments, and a selection set. The
-- input argument and selection set types are defined by the user.
--
--
-- > action_name(action_input_arguments) {
-- >   col1: col1_type
-- >   col2: col2_type
-- > }
actionExecute
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => NonObjectTypeMap
  -> ActionInfo
  -> [(PGCol, PGScalarType)]
  -> (ActionExecutionContext, ObjFldInfo)
mkQueryActionField actionName actionInfo definitionList =
  ( actionExecutionContext
  , fieldInfo
  )
  -> m (Maybe (FieldParser n (AnnActionExecution UnpreparedValue)))
actionExecute nonObjectTypeMap actionInfo = runMaybeT do
  roleName <- lift askRoleName
  guard $ roleName == adminRoleName || roleName `Map.member` permissions
  let fieldName = unActionName actionName
      description = G.Description <$> comment
  inputArguments <- lift $ actionInputArguments nonObjectTypeMap $ _adArguments definition
  selectionSet <- lift $ actionOutputFields outputObject
  stringifyNum <- asks $ qcStringifyNum . getter
  pure $ P.subselection fieldName description inputArguments selectionSet
    <&> \(argsJson, fields) -> AnnActionExecution
      { _aaeName = actionName
      , _aaeFields = fields
      , _aaePayload = argsJson
      , _aaeOutputType = _adOutputType definition
      , _aaeOutputFields = getActionOutputFields outputObject
      , _aaeDefinitionList = mkDefinitionList outputObject
      , _aaeWebhook = _adHandler definition
      , _aaeHeaders = _adHeaders definition
      , _aaeForwardClientHeaders = _adForwardClientHeaders definition
      , _aaeStrfyNum = stringifyNum
      }
  where
    definition = _aiDefinition actionInfo
    actionExecutionContext =
      ActionExecutionContext
        actionName
        (_adOutputType definition)
        (getActionOutputFields $ _aiOutputObject actionInfo)
        definitionList
        (_adHandler definition)
        (_adHeaders definition)
        (_adForwardClientHeaders definition)
    ActionInfo actionName outputObject definition permissions comment = actionInfo

    description = mkDescriptionWith (PGDescription <$> _aiComment actionInfo) $
      "perform the action: " <>> actionName

    fieldInfo =
      mkHsraObjFldInfo
        (Just description)
        (unActionName actionName)
        (mapFromL _iviName $ map mkActionArgument $ _adArguments definition)
        actionFieldResponseType

    mkActionArgument argument =
      InpValInfo (_argDescription argument) (unArgumentName $ _argName argument)
      Nothing $ unGraphQLType $ _argType argument

    actionFieldResponseType = unGraphQLType $ _adOutputType definition
mkMutationActionField
  :: ActionName
-- | actionAsyncMutation is used to execute an asynchronous mutation action. An
-- asynchronous action expects the field name and the input arguments to the
-- action. A selection set is *not* expected. An action ID (UUID) will be
-- returned after performing the action.
--
-- > action_name(action_input_arguments)
actionAsyncMutation
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => NonObjectTypeMap
  -> ActionInfo
  -> [(PGCol, PGScalarType)]
  -> ActionMutationKind
  -> (ActionMutationExecutionContext, ObjFldInfo)
mkMutationActionField actionName actionInfo definitionList kind =
  ( actionExecutionContext
  , fieldInfo
  )
  -> m (Maybe (FieldParser n AnnActionMutationAsync))
actionAsyncMutation nonObjectTypeMap actionInfo = runMaybeT do
  roleName <- lift askRoleName
  guard $ roleName == adminRoleName || roleName `Map.member` permissions
  inputArguments <- lift $ actionInputArguments nonObjectTypeMap $ _adArguments definition
  actionId <- lift actionIdParser
  let fieldName = unActionName actionName
      description = G.Description <$> comment
  pure $ P.selection fieldName description inputArguments actionId
    <&> AnnActionMutationAsync actionName
  where
    definition = _aiDefinition actionInfo
    actionExecutionContext =
      case kind of
        ActionSynchronous ->
          ActionMutationSyncWebhook $ ActionExecutionContext actionName
            (_adOutputType definition)
            (getActionOutputFields $ _aiOutputObject actionInfo)
            definitionList
            (_adHandler definition)
            (_adHeaders definition)
            (_adForwardClientHeaders definition)
        ActionAsynchronous -> ActionMutationAsync
    ActionInfo actionName _ definition permissions comment = actionInfo

    description = mkDescriptionWith (PGDescription <$> _aiComment actionInfo) $
      "perform the action: " <>> actionName
-- | actionAsyncQuery is used to query/subscribe to the result of an
-- asynchronous mutation action. The only input argument to an
-- asynchronous mutation action is the action ID (UUID), and a selection
-- set is expected; the selection set contains 4 fields, namely 'id',
-- 'created_at', 'errors' and 'output'. The result of the action can be queried
-- through the 'output' field.
--
-- > action_name (id: UUID!) {
-- >   id: UUID!
-- >   created_at: timestamptz!
-- >   errors: JSON
-- >   output: user_defined_type!
-- > }
actionAsyncQuery
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => ActionInfo
  -> m (Maybe (FieldParser n (AnnActionAsyncQuery UnpreparedValue)))
actionAsyncQuery actionInfo = runMaybeT do
  roleName <- lift askRoleName
  guard $ roleName == adminRoleName || roleName `Map.member` permissions
  actionId <- lift actionIdParser
  actionOutputParser <- lift $ actionOutputFields outputObject
  createdAtFieldParser <-
    lift $ P.column (PGColumnScalar PGTimeStampTZ) (G.Nullability False)
  errorsFieldParser <-
    lift $ P.column (PGColumnScalar PGJSON) (G.Nullability True)

    fieldInfo =
      mkHsraObjFldInfo
        (Just description)
        (unActionName actionName)
        (mapFromL _iviName $ map mkActionArgument $ _adArguments definition)
        actionFieldResponseType
  let fieldName = unActionName actionName
      description = G.Description <$> comment
      actionIdInputField =
        P.field idFieldName (Just idFieldDescription) actionId
      allFieldParsers =
        let idField        = P.selection_ idFieldName (Just idFieldDescription) actionId $> AsyncId
            createdAtField = P.selection_ $$(G.litName "created_at")
                             (Just "the time at which this action was created")
                             createdAtFieldParser $> AsyncCreatedAt
            errorsField    = P.selection_ $$(G.litName "errors")
                             (Just "errors related to the invocation")
                             errorsFieldParser $> AsyncErrors
            outputField    = P.subselection_ $$(G.litName "output")
                             (Just "the output fields of this action")
                             actionOutputParser <&> AsyncOutput
        in [idField, createdAtField, errorsField, outputField]
      selectionSet =
        let outputTypeName = unActionName actionName
            desc = G.Description $ "fields of action: " <>> actionName
        in P.selectionSet outputTypeName (Just desc) allFieldParsers
           <&> parsedSelectionsToFields AsyncTypename

    mkActionArgument argument =
      InpValInfo (_argDescription argument) (unArgumentName $ _argName argument)
      Nothing $ unGraphQLType $ _argType argument

    actionFieldResponseType =
      case kind of
        ActionSynchronous  -> unGraphQLType $ _adOutputType definition
        ActionAsynchronous -> G.toGT $ G.toNT $ mkScalarTy PGUUID
mkQueryField
  :: ActionName
  -> Maybe Text
  -> ResolvedActionDefinition
  -> [(PGCol, PGScalarType)]
  -> ActionMutationKind
  -> Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
mkQueryField actionName comment definition definitionList kind =
  case kind of
    ActionAsynchronous ->
      Just ( ActionSelectOpContext (_adOutputType definition) definitionList

           , mkHsraObjFldInfo (Just description) (unActionName actionName)
             (mapFromL _iviName [idArgument])
             (G.toGT $ G.toGT $ mkAsyncActionSelectionType actionName)

           , TIObj $ mkAsyncActionQueryResponseObj actionName $
             _adOutputType definition
           )
    ActionSynchronous -> Nothing
  stringifyNum <- asks $ qcStringifyNum . getter
  pure $ P.subselection fieldName description actionIdInputField selectionSet
    <&> \(idArg, fields) -> AnnActionAsyncQuery
      { _aaaqName = actionName
      , _aaaqActionId = idArg
      , _aaaqOutputType = _adOutputType definition
      , _aaaqFields = fields
      , _aaaqDefinitionList = mkDefinitionList outputObject
      , _aaaqStringifyNum = stringifyNum
      }
  where
    description = mkDescriptionWith (PGDescription <$> comment) $
      "retrieve the result of action: " <>> actionName
    ActionInfo actionName outputObject definition permissions comment = actionInfo
    idFieldName = $$(G.litName "id")
    idFieldDescription = "the unique id of an action"

    idArgument =
      InpValInfo (Just idDescription) "id" Nothing $ G.toNT $ mkScalarTy PGUUID
      where
        idDescription = G.Description $ "id of the action: " <>> actionName
-- | Async action's unique id
actionIdParser
  :: (MonadSchema n m, MonadError QErr m)
  => m (Parser 'Both n UnpreparedValue)
actionIdParser =
  fmap P.mkParameter <$> P.column (PGColumnScalar PGUUID) (G.Nullability False)

mkPGFieldType
  :: ObjectFieldName
  -> (G.GType, OutputFieldTypeInfo)
  -> HashMap ObjectFieldName PGColumnInfo
  -> PGScalarType
mkPGFieldType fieldName (fieldType, fieldTypeInfo) fieldReferences =
  case (G.isListType fieldType, fieldTypeInfo) of
    -- for scalar lists, we treat them as json columns
    (True, _) -> PGJSON
    -- enums the same
    (False, OutputFieldEnum _) -> PGJSON
    -- default to PGJSON unless you have to join with a postgres table
    -- i.e, if this field is specified as part of some relationship's
    -- mapping, we can cast this column's value as the remote column's type
    (False, OutputFieldScalar _) ->
      case Map.lookup fieldName fieldReferences of
        Just columnInfo -> unsafePGColumnToRepresentation $ pgiType columnInfo
        Nothing         -> PGJSON

mkDefinitionList :: AnnotatedObjectType -> HashMap ObjectFieldName PGColumnInfo -> [(PGCol, PGScalarType)]
mkDefinitionList annotatedOutputType fieldReferences =
  [ (unsafePGCol $ coerce k, mkPGFieldType k v fieldReferences)
  | (k, v) <- Map.toList $ _aotAnnotatedFields annotatedOutputType
  ]

mkFieldMap
  :: AnnotatedObjectType
  -> ActionInfo
  -> HashMap ObjectFieldName PGColumnInfo
  -> RoleName
  -> HashMap (G.NamedType, G.Name) ResolveField
mkFieldMap annotatedOutputType actionInfo fieldReferences roleName =
  Map.fromList $ fields <> catMaybes relationships
actionOutputFields
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => AnnotatedObjectType
  -> m (Parser 'Output n (RQL.AnnFieldsG UnpreparedValue))
actionOutputFields outputObject = do
  let scalarOrEnumFields = map scalarOrEnumFieldParser $ toList $ _otdFields outputObject
  relationshipFields <- forM (_otdRelationships outputObject) $ traverse relationshipFieldParser
  let allFieldParsers = scalarOrEnumFields <>
                        maybe [] (catMaybes . toList) relationshipFields
      outputTypeName = unObjectTypeName $ _otdName outputObject
      outputTypeDescription = _otdDescription outputObject
  pure $ P.selectionSet outputTypeName outputTypeDescription allFieldParsers
    <&> parsedSelectionsToFields RQL.AFExpression
  where
    fields =
      flip map (Map.toList $ _aotAnnotatedFields annotatedOutputType) $
        \(fieldName, (fieldType, fieldTypeInfo)) ->
          ( (actionOutputBaseType, unObjectFieldName fieldName)
          , RFPGColumn $ PGColumnInfo
              (unsafePGCol $ coerce fieldName)
              (coerce fieldName)
              0
              (PGColumnScalar $ mkPGFieldType fieldName (fieldType, fieldTypeInfo) fieldReferences)
              (G.isNullable fieldType)
              Nothing
          )
    relationships =
      flip map (Map.toList $ _aotRelationships annotatedOutputType) $
        \(relationshipName, relationship) ->
          let remoteTableInfo = _trRemoteTable relationship
              remoteTable = _tciName $ _tiCoreInfo remoteTableInfo
              filterAndLimitM = getFilterAndLimit remoteTableInfo
    scalarOrEnumFieldParser
      :: ObjectFieldDefinition (G.GType, AnnotatedObjectFieldType)
      -> FieldParser n (RQL.AnnFieldG UnpreparedValue)
    scalarOrEnumFieldParser (ObjectFieldDefinition name _ description ty) =
      let (gType, objectFieldType) = ty
          fieldName = unObjectFieldName name
          -- FIXME? (from master)
          pgColumnInfo = PGColumnInfo (unsafePGCol $ G.unName fieldName)
                         fieldName 0 (PGColumnScalar PGJSON) (G.isNullable gType) Nothing
          fieldParser = case objectFieldType of
            AOFTScalar def -> customScalarParser def
            AOFTEnum def   -> customEnumParser def
      in bool P.nonNullableField id (G.isNullable gType) $
         P.selection_ (unObjectFieldName name) description fieldParser
         $> RQL.mkAnnColumnField pgColumnInfo Nothing
    relationshipFieldParser
      :: TypeRelationship TableInfo PGColumnInfo
      -> m (Maybe (FieldParser n (RQL.AnnFieldG UnpreparedValue)))
    relationshipFieldParser typeRelationship = runMaybeT do
      let TypeRelationship relName relType tableInfo fieldMapping = typeRelationship
          tableName = _tciName $ _tiCoreInfo tableInfo
          fieldName = unRelationshipName relName
      roleName <- lift askRoleName
      tablePerms <- MaybeT $ pure $ RQL.getPermInfoMaybe roleName PASelect tableInfo
      tableParser <- lift $ selectTable tableName fieldName Nothing tablePerms
      pure $ tableParser <&> \selectExp ->
        let tableRelName = RelName $ mkNonEmptyTextUnsafe $ G.unName fieldName
            columnMapping = Map.fromList $
              [ (unsafePGCol $ coerce k, pgiColumn v)
              | (k, v) <- Map.toList $ _trFieldMapping relationship
              ]
          in case filterAndLimitM of
               Just (tableFilter, tableLimit) ->
                 Just ( ( actionOutputBaseType
                        , unRelationshipName relationshipName
                        )
                      , RFRelationship $ RelationshipField
                          (RelInfo
                            -- RelationshipName, which is a newtype wrapper over G.Name, is always
                            -- non-empty text so as to conform to the GraphQL spec
                            (RelName $ mkNonEmptyTextUnsafe $ coerce relationshipName)
                            (_trType relationship)
                            columnMapping remoteTable True)
                          RFKSimple mempty
                          tableFilter
                          tableLimit
                      )
               Nothing -> Nothing
              [ (unsafePGCol $ G.unName $ unObjectFieldName k, pgiColumn v)
              | (k, v) <- Map.toList fieldMapping
              ]
        in case relType of
             ObjRel -> RQL.AFObjectRelation $ RQL.AnnRelationSelectG tableRelName columnMapping $
                       RQL.AnnObjectSelectG (RQL._asnFields selectExp) tableName $
                       RQL._tpFilter $ RQL._asnPerm selectExp
             ArrRel -> RQL.AFArrayRelation $ RQL.ASSimple $
                       RQL.AnnRelationSelectG tableRelName columnMapping selectExp

    getFilterAndLimit remoteTableInfo =
      if roleName == adminRoleName
      then Just (annBoolExpTrue, Nothing)
      else do
        selectPermissionInfo <-
          getSelectPermissionInfoM remoteTableInfo roleName
        return (spiFilter selectPermissionInfo, spiLimit selectPermissionInfo)

    actionOutputBaseType =
      G.getBaseType $ unGraphQLType $ _adOutputType $ _aiDefinition actionInfo

mkFieldReferences :: AnnotatedObjectType -> HashMap ObjectFieldName PGColumnInfo
mkFieldReferences annotatedOutputType =
  Map.unions $ map _trFieldMapping $ Map.elems $
    _aotRelationships annotatedOutputType

mkMutationActionFieldsAndTypes
  :: ActionInfo
  -> ActionPermissionInfo
  -> ActionMutationKind
  -> ( Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
       -- context, field, response type info
     , (ActionMutationExecutionContext, ObjFldInfo) -- mutation field
     , FieldMap
     )
mkMutationActionFieldsAndTypes actionInfo permission kind =
  ( mkQueryField actionName comment definition definitionList kind
  , mkMutationActionField actionName actionInfo definitionList kind
  , fieldMap
  )
mkDefinitionList :: AnnotatedObjectType -> [(PGCol, PGScalarType)]
mkDefinitionList ObjectTypeDefinition{..} =
  flip map (toList _otdFields) $ \ObjectFieldDefinition{..} ->
    (unsafePGCol . G.unName . unObjectFieldName $ _ofdName,) $
      case Map.lookup _ofdName fieldReferences of
        Nothing         -> fieldTypeToScalarType $ snd _ofdType
        Just columnInfo -> unsafePGColumnToRepresentation $ pgiType columnInfo
  where
    actionName = _aiName actionInfo
    annotatedOutputType = _aiOutputObject actionInfo
    definition = _aiDefinition actionInfo
    roleName = _apiRole permission
    comment = _aiComment actionInfo
    fieldReferences =
      Map.unions $ map _trFieldMapping $ maybe [] toList _otdRelationships

    -- all the possible field references
    fieldReferences = mkFieldReferences annotatedOutputType

    definitionList = mkDefinitionList annotatedOutputType fieldReferences

    fieldMap = mkFieldMap annotatedOutputType actionInfo fieldReferences roleName

mkQueryActionFieldsAndTypes
  :: ActionInfo
  -> ActionPermissionInfo
  -> ((ActionExecutionContext, ObjFldInfo)
     , FieldMap
     )
mkQueryActionFieldsAndTypes actionInfo permission =
  ( mkQueryActionField actionName actionInfo definitionList
  , fieldMap
  )
actionInputArguments
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => NonObjectTypeMap
  -> [ArgumentDefinition (G.GType, NonObjectCustomType)]
  -> m (InputFieldsParser n J.Value)
actionInputArguments nonObjectTypeMap arguments = do
  argumentParsers <- for arguments $ \argument -> do
    let ArgumentDefinition argumentName (gType, nonObjectType) argumentDescription = argument
        name = unArgumentName argumentName
    (name,) <$> argumentParser name argumentDescription gType nonObjectType
  pure $ J.Object <$> inputFieldsToObject argumentParsers
  where
    actionName = _aiName actionInfo
    roleName = _apiRole permission
    annotatedOutputType = _aiOutputObject actionInfo
    inputFieldsToObject
      :: [(G.Name, InputFieldsParser n (Maybe J.Value))]
      -> InputFieldsParser n J.Object
    inputFieldsToObject inputFields =
      let mkTuple (name, parser) = fmap (G.unName name,) <$> parser
      in fmap (Map.fromList . catMaybes) $ traverse mkTuple inputFields

    fieldReferences = mkFieldReferences annotatedOutputType
    argumentParser
      :: G.Name
      -> Maybe G.Description
      -> G.GType
      -> NonObjectCustomType
      -> m (InputFieldsParser n (Maybe J.Value))
    argumentParser name description gType = \case
      NOCTScalar def -> pure $ mkArgumentInputFieldParser name description gType $ customScalarParser def
|
||||
NOCTEnum def -> pure $ mkArgumentInputFieldParser name description gType $ customEnumParser def
|
||||
NOCTInputObject def -> do
|
||||
let InputObjectTypeDefinition typeName objectDescription inputFields = def
|
||||
objectName = unInputObjectTypeName typeName
|
||||
inputFieldsParsers <- forM (toList inputFields) $ \inputField -> do
|
||||
let InputObjectFieldName fieldName = _iofdName inputField
|
||||
GraphQLType fieldType = _iofdType inputField
|
||||
nonObjectFieldType <-
|
||||
onNothing (Map.lookup (G.getBaseType fieldType) nonObjectTypeMap) $
|
||||
throw500 "object type for a field found in custom input object type"
|
||||
(fieldName,) <$> argumentParser fieldName (_iofdDescription inputField) fieldType nonObjectFieldType
|
||||
|
||||
definitionList = mkDefinitionList annotatedOutputType fieldReferences
|
||||
pure $ mkArgumentInputFieldParser name description gType $
|
||||
P.object objectName objectDescription $
|
||||
J.Object <$> inputFieldsToObject inputFieldsParsers
|
||||
|
||||
fieldMap = mkFieldMap annotatedOutputType actionInfo fieldReferences roleName
|
||||
|
||||
mkMutationActionSchemaOne
|
||||
:: ActionInfo
|
||||
-> ActionMutationKind
|
||||
-> Map.HashMap RoleName
|
||||
( Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
|
||||
, (ActionMutationExecutionContext, ObjFldInfo)
|
||||
, FieldMap
|
||||
)
|
||||
mkMutationActionSchemaOne actionInfo kind =
|
||||
flip Map.map permissions $ \permission ->
|
||||
mkMutationActionFieldsAndTypes actionInfo permission kind
|
||||
mkArgumentInputFieldParser
|
||||
:: forall m k. (MonadParse m, 'Input P.<: k)
|
||||
=> G.Name
|
||||
-> Maybe G.Description
|
||||
-> G.GType
|
||||
-> Parser k m J.Value
|
||||
-> InputFieldsParser m (Maybe J.Value)
|
||||
mkArgumentInputFieldParser name description gType parser =
|
||||
if G.isNullable gType
|
||||
then P.fieldOptional name description modifiedParser
|
||||
else Just <$> P.field name description modifiedParser
|
||||
where
|
||||
adminPermission = ActionPermissionInfo adminRoleName
|
||||
permissions = Map.insert adminRoleName adminPermission $ _aiPermissions actionInfo
|
||||
modifiedParser = parserModifier gType parser
|
||||
|
||||
mkQueryActionSchemaOne
|
||||
:: ActionInfo
|
||||
-> Map.HashMap RoleName
|
||||
( (ActionExecutionContext, ObjFldInfo)
|
||||
, FieldMap
|
||||
)
|
||||
mkQueryActionSchemaOne actionInfo =
|
||||
flip Map.map permissions $ \permission ->
|
||||
mkQueryActionFieldsAndTypes actionInfo permission
|
||||
where
|
||||
adminPermission = ActionPermissionInfo adminRoleName
|
||||
permissions = Map.insert adminRoleName adminPermission $ _aiPermissions actionInfo
|
||||
|
||||
mkActionsSchema
|
||||
:: ActionCache
|
||||
-> Map.HashMap RoleName (RootFields, TyAgg)
|
||||
mkActionsSchema =
|
||||
foldl'
|
||||
(\aggregate actionInfo ->
|
||||
case _adType $ _aiDefinition actionInfo of
|
||||
ActionQuery ->
|
||||
Map.foldrWithKey (accumulateQuery (_aiPgScalars actionInfo)) aggregate $
|
||||
mkQueryActionSchemaOne actionInfo
|
||||
ActionMutation kind ->
|
||||
Map.foldrWithKey (accumulateMutation (_aiPgScalars actionInfo)) aggregate $
|
||||
mkMutationActionSchemaOne actionInfo kind
|
||||
)
|
||||
mempty
|
||||
where
|
||||
-- we'll need to add uuid and timestamptz for actions
|
||||
initRoleState =
|
||||
( mempty
|
||||
, foldr addScalarToTyAgg mempty [PGJSON, PGTimeStampTZ, PGUUID]
|
||||
)
|
||||
|
||||
addScalarsToTyAgg = foldr addScalarToTyAgg
|
||||
|
||||
accumulateQuery pgScalars roleName (actionField, fields) =
|
||||
Map.alter (Just . addToStateSync . fromMaybe initRoleState) roleName
|
||||
parserModifier
|
||||
:: G.GType -> Parser k m J.Value -> Parser k m J.Value
|
||||
parserModifier = \case
|
||||
G.TypeNamed nullable _ -> nullableModifier nullable
|
||||
G.TypeList nullable ty ->
|
||||
nullableModifier nullable . fmap J.toJSON . P.list . parserModifier ty
|
||||
where
|
||||
addToStateSync (rootFields, tyAgg) =
|
||||
( addQueryField (first QCAction actionField) rootFields
|
||||
-- Add reused PG scalars to TyAgg
|
||||
, addFieldsToTyAgg fields $ addScalarsToTyAgg tyAgg pgScalars
|
||||
)
|
||||
nullableModifier =
|
||||
bool (fmap J.toJSON) (fmap J.toJSON . P.nullable) . G.unNullability
|
||||
|
||||
accumulateMutation pgScalars roleName (queryFieldM, actionField, fields) =
|
||||
Map.alter (Just . addToState . fromMaybe initRoleState) roleName
|
||||
where
|
||||
addToState (rootFields, tyAgg) =
|
||||
let rootFieldsWithActionMutation =
|
||||
addMutationField (first MCAction actionField) rootFields
|
||||
-- Add reused PG scalars to TyAgg
|
||||
tyAggWithFieldsAndPgScalars =
|
||||
addFieldsToTyAgg fields $ addScalarsToTyAgg tyAgg pgScalars
|
||||
in case queryFieldM of
|
||||
Just (fldCtx, fldDefinition, responseTypeInfo) ->
|
||||
-- Add async action's query resolver and response type
|
||||
( addQueryField (QCAsyncActionFetch fldCtx, fldDefinition)
|
||||
rootFieldsWithActionMutation
|
||||
, addTypeInfoToTyAgg responseTypeInfo tyAggWithFieldsAndPgScalars
|
||||
)
|
||||
Nothing -> (rootFieldsWithActionMutation, tyAggWithFieldsAndPgScalars)
|
||||
customScalarParser
|
||||
:: MonadParse m
|
||||
=> AnnotatedScalarType -> Parser 'Both m J.Value
|
||||
customScalarParser = \case
|
||||
ASTCustom ScalarTypeDefinition{..} ->
|
||||
if | _stdName == idScalar -> J.toJSON <$> P.identifier
|
||||
| _stdName == intScalar -> J.toJSON <$> P.int
|
||||
| _stdName == floatScalar -> J.toJSON <$> P.float
|
||||
| _stdName == stringScalar -> J.toJSON <$> P.string
|
||||
| _stdName == boolScalar -> J.toJSON <$> P.boolean
|
||||
| otherwise -> P.namedJSON _stdName _stdDescription
|
||||
ASTReusedPgScalar name pgScalarType ->
|
||||
let schemaType = P.NonNullable $ P.TNamed $ P.mkDefinition name Nothing P.TIScalar
|
||||
in P.Parser
|
||||
{ pType = schemaType
|
||||
, pParser = P.valueToJSON (P.toGraphQLType schemaType) >=>
|
||||
either (parseErrorWith ParseFailed . qeError)
|
||||
(pure . pgScalarValueToJson) . runAesonParser (parsePGValue pgScalarType)
|
||||
}
|
||||
|
||||
customEnumParser
|
||||
:: MonadParse m
|
||||
=> EnumTypeDefinition -> Parser 'Both m J.Value
|
||||
customEnumParser (EnumTypeDefinition typeName description enumValues) =
|
||||
let enumName = unEnumTypeName typeName
|
||||
enumValueDefinitions = enumValues <&> \enumValue ->
|
||||
let valueName = G.unEnumValue $ _evdValue enumValue
|
||||
in (, J.toJSON valueName) $ P.mkDefinition valueName
|
||||
(_evdDescription enumValue) P.EnumValueInfo
|
||||
in P.enum enumName description enumValueDefinitions
|
||||
|
module Hasura.GraphQL.Schema.BoolExp
( geoInputTypes
, rasterIntersectsInputTypes
, mkCompExpInp

, mkBoolExpTy
, mkBoolExpInp
( boolExp
) where

import qualified Data.HashMap.Strict as Map
import Hasura.Prelude

import qualified Data.HashMap.Strict.Extended as M
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Schema.Common
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import qualified Hasura.GraphQL.Parser as P

import Hasura.GraphQL.Parser (InputFieldsParser, Kind (..), Parser,
UnpreparedValue, mkParameter)
import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Schema.Table
import Hasura.RQL.Types
import Hasura.SQL.DML
import Hasura.SQL.Types
import Hasura.SQL.Value

typeToDescription :: G.NamedType -> G.Description
typeToDescription = G.Description . G.unName . G.unNamedType
type ComparisonExp = OpExpG UnpreparedValue

mkCompExpTy :: PGColumnType -> G.NamedType
mkCompExpTy = addTypeSuffix "_comparison_exp" . mkColumnType
-- |
-- > input type_bool_exp {
-- >   _or: [type_bool_exp!]
-- >   _and: [type_bool_exp!]
-- >   _not: type_bool_exp
-- >   column: type_comparison_exp
-- >   ...
-- > }
boolExp
:: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
=> QualifiedTable
-> Maybe SelPermInfo
-> m (Parser 'Input n (AnnBoolExp UnpreparedValue))
boolExp table selectPermissions = memoizeOn 'boolExp table $ do
name <- qualifiedObjectToName table <&> (<> $$(G.litName "_bool_exp"))
let description = G.Description $
"Boolean expression to filter rows from the table " <> table <<>
". All fields are combined with a logical 'AND'."

mkCastExpTy :: PGColumnType -> G.NamedType
mkCastExpTy = addTypeSuffix "_cast_exp" . mkColumnType
tableFieldParsers <- catMaybes <$> maybe
(pure [])
(traverse mkField <=< tableSelectFields table)
selectPermissions
recur <- boolExp table selectPermissions
-- Bafflingly, ApplicativeDo doesn’t work if we inline this definition (I
-- think the TH splices throw it off), so we have to define it separately.
let specialFieldParsers =
[ P.fieldOptional $$(G.litName "_or")  Nothing (BoolOr  <$> P.list recur)
, P.fieldOptional $$(G.litName "_and") Nothing (BoolAnd <$> P.list recur)
, P.fieldOptional $$(G.litName "_not") Nothing (BoolNot <$> recur)
]
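For reference, a boolean expression combining these special `_or`/`_and`/`_not` operators with a column comparison might look like the following (the `users` table and its `name`, `age`, and `status` fields are hypothetical; the operator names come from the parsers above):

```graphql
query {
  users(where: {
    _or: [
      { name: { _eq: "Alice" } }
      { _and: [{ age: { _gte: 18 } }, { _not: { status: { _eq: "banned" } } }] }
    ]
  }) {
    id
  }
}
```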

-- TODO(shahidhk) this should ideally be st_d_within_geometry
{-
input st_d_within_input {
distance: Float!
from: geometry!
}
-}
stDWithinGeometryInpTy :: G.NamedType
stDWithinGeometryInpTy = G.NamedType "st_d_within_input"

{-
input st_d_within_geography_input {
distance: Float!
from: geography!
use_spheroid: Bool!
}
-}
stDWithinGeographyInpTy :: G.NamedType
stDWithinGeographyInpTy = G.NamedType "st_d_within_geography_input"

-- | Makes an input type declaration for the @_cast@ field of a comparison expression.
-- (Currently only used for casting between geometry and geography types.)
mkCastExpressionInputType :: PGColumnType -> [PGColumnType] -> InpObjTyInfo
mkCastExpressionInputType sourceType targetTypes =
mkHsraInpTyInfo (Just description) (mkCastExpTy sourceType) (fromInpValL targetFields)
pure $ BoolAnd <$> P.object name (Just description) do
tableFields <- map BoolFld . catMaybes <$> sequenceA tableFieldParsers
specialFields <- catMaybes <$> sequenceA specialFieldParsers
pure (tableFields ++ specialFields)
where
description = mconcat
[ "Expression to compare the result of casting a column of type "
, typeToDescription $ mkColumnType sourceType
, ". Multiple cast targets are combined with logical 'AND'."
]
targetFields = map targetField targetTypes
targetField targetType = InpValInfo
Nothing
(G.unNamedType $ mkColumnType targetType)
Nothing
(G.toGT $ mkCompExpTy targetType)
mkField
:: FieldInfo -> m (Maybe (InputFieldsParser n (Maybe (AnnBoolExpFld UnpreparedValue))))
mkField fieldInfo = runMaybeT do
fieldName <- MaybeT $ pure $ fieldInfoGraphQLName fieldInfo
P.fieldOptional fieldName Nothing <$> case fieldInfo of
-- field_name: field_type_comparison_exp
FIColumn columnInfo ->
lift $ fmap (AVCol columnInfo) <$> comparisonExps (pgiType columnInfo)

-- | make compare expression input type
mkCompExpInp :: PGColumnType -> InpObjTyInfo
mkCompExpInp colTy =
InpObjTyInfo (Just tyDesc) (mkCompExpTy colTy) (fromInpValL $ concat
[ map (mk colGqlType) eqOps
, guard (isScalarWhere (/= PGRaster)) *> map (mk colGqlType) compOps
, map (mk $ G.toLT $ G.toNT colGqlType) listOps
, guard (isScalarWhere isStringType) *> map (mk $ mkScalarTy PGText) stringOps
, guard (isScalarWhere (== PGJSONB)) *> map opToInpVal jsonbOps
, guard (isScalarWhere (== PGGeometry)) *>
(stDWithinGeoOpInpVal stDWithinGeometryInpTy : map geoOpToInpVal (geoOps ++ geomOps))
, guard (isScalarWhere (== PGGeography)) *>
(stDWithinGeoOpInpVal stDWithinGeographyInpTy : map geoOpToInpVal geoOps)
, [InpValInfo Nothing "_is_null" Nothing $ G.TypeNamed (G.Nullability True) $ G.NamedType "Boolean"]
, castOpInputValues
, guard (isScalarWhere (== PGRaster)) *> map opToInpVal rasterOps
]) TLHasuraType
-- field_name: field_type_bool_exp
FIRelationship relationshipInfo -> do
let remoteTable = riRTable relationshipInfo
remotePermissions <- lift $ tableSelectPermissions remoteTable
lift $ fmap (AVRel relationshipInfo) <$> boolExp remoteTable remotePermissions

-- Using computed fields in boolean expressions is not currently supported.
FIComputedField _ -> empty

-- Using remote relationship fields in boolean expressions is not supported.
FIRemoteRelationship _ -> empty

comparisonExps
:: forall m n. (MonadSchema n m, MonadError QErr m)
=> PGColumnType -> m (Parser 'Input n [ComparisonExp])
comparisonExps = P.memoize 'comparisonExps \columnType -> do
geogInputParser <- geographyWithinDistanceInput
geomInputParser <- geometryWithinDistanceInput
ignInputParser <- intersectsGeomNbandInput
ingInputParser <- intersectsNbandGeomInput
-- see Note [Columns in comparison expression are never nullable]
columnParser <- P.column columnType (G.Nullability False)
nullableTextParser <- P.column (PGColumnScalar PGText) (G.Nullability True)
textParser <- P.column (PGColumnScalar PGText) (G.Nullability False)
maybeCastParser <- castExp columnType
let name = P.getName columnParser <> $$(G.litName "_comparison_exp")
desc = G.Description $ "Boolean expression to compare columns of type "
<> P.getName columnParser
<<> ". All fields are combined with logical 'AND'."
textListParser = P.list textParser `P.bind` traverse P.openOpaque
columnListParser = P.list columnParser `P.bind` traverse P.openOpaque
pure $ P.object name (Just desc) $ fmap catMaybes $ sequenceA $ concat
[ flip (maybe []) maybeCastParser $ \castParser ->
[ P.fieldOptional $$(G.litName "_cast") Nothing (ACast <$> castParser)
]
-- Common ops for all types
, [ P.fieldOptional $$(G.litName "_is_null") Nothing (bool ANISNOTNULL ANISNULL <$> P.boolean)
, P.fieldOptional $$(G.litName "_eq") Nothing (AEQ True . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_neq") Nothing (ANE True . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_in") Nothing (AIN . mkListLiteral columnType <$> columnListParser)
, P.fieldOptional $$(G.litName "_nin") Nothing (ANIN . mkListLiteral columnType <$> columnListParser)
]
-- Comparison ops for non-Raster types
, guard (isScalarColumnWhere (/= PGRaster) columnType) *>
[ P.fieldOptional $$(G.litName "_gt") Nothing (AGT . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_lt") Nothing (ALT . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_gte") Nothing (AGTE . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_lte") Nothing (ALTE . mkParameter <$> columnParser)
]
-- Ops for Raster types
, guard (isScalarColumnWhere (== PGRaster) columnType) *>
[ P.fieldOptional $$(G.litName "_st_intersects_rast")
Nothing
(ASTIntersectsRast . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_intersects_nband_geom")
Nothing
(ASTIntersectsNbandGeom <$> ingInputParser)
, P.fieldOptional $$(G.litName "_st_intersects_geom_nband")
Nothing
(ASTIntersectsGeomNband <$> ignInputParser)
]
-- Ops for String-like types
, guard (isScalarColumnWhere isStringType columnType) *>
[ P.fieldOptional $$(G.litName "_like")
(Just "does the column match the given pattern")
(ALIKE . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_nlike")
(Just "does the column NOT match the given pattern")
(ANLIKE . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_ilike")
(Just "does the column match the given case-insensitive pattern")
(AILIKE . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_nilike")
(Just "does the column NOT match the given case-insensitive pattern")
(ANILIKE . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_similar")
(Just "does the column match the given SQL regular expression")
(ASIMILAR . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_nsimilar")
(Just "does the column NOT match the given SQL regular expression")
(ANSIMILAR . mkParameter <$> columnParser)
]
-- Ops for JSONB type
, guard (isScalarColumnWhere (== PGJSONB) columnType) *>
[ P.fieldOptional $$(G.litName "_contains")
(Just "does the column contain the given json value at the top level")
(AContains . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_contained_in")
(Just "is the column contained in the given json value")
(AContainedIn . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_has_key")
(Just "does the string exist as a top-level key in the column")
(AHasKey . mkParameter <$> nullableTextParser)
, P.fieldOptional $$(G.litName "_has_keys_any")
(Just "do any of these strings exist as top-level keys in the column")
(AHasKeysAny . mkListLiteral (PGColumnScalar PGText) <$> textListParser)
, P.fieldOptional $$(G.litName "_has_keys_all")
(Just "do all of these strings exist as top-level keys in the column")
(AHasKeysAll . mkListLiteral (PGColumnScalar PGText) <$> textListParser)
]
-- Ops for Geography type
, guard (isScalarColumnWhere (== PGGeography) columnType) *>
[ P.fieldOptional $$(G.litName "_st_intersects")
(Just "does the column spatially intersect the given geography value")
(ASTIntersects . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_d_within")
(Just "is the column within a given distance from the given geography value")
(ASTDWithinGeog <$> geogInputParser)
]
-- Ops for Geometry type
, guard (isScalarColumnWhere (== PGGeometry) columnType) *>
[ P.fieldOptional $$(G.litName "_st_contains")
(Just "does the column contain the given geometry value")
(ASTContains . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_crosses")
(Just "does the column cross the given geometry value")
(ASTCrosses . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_equals")
(Just "is the column equal to given geometry value (directionality is ignored)")
(ASTEquals . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_overlaps")
(Just "does the column 'spatially overlap' (intersect but not completely contain) the given geometry value")
(ASTOverlaps . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_touches")
(Just "does the column have at least one point in common with the given geometry value")
(ASTTouches . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_within")
(Just "is the column contained in the given geometry value")
(ASTWithin . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_intersects")
(Just "does the column spatially intersect the given geometry value")
(ASTIntersects . mkParameter <$> columnParser)
, P.fieldOptional $$(G.litName "_st_d_within")
(Just "is the column within a given distance from the given geometry value")
(ASTDWithinGeom <$> geomInputParser)
]
]
where
colGqlType = mkColumnType colTy
colTyDesc = typeToDescription colGqlType
tyDesc =
"expression to compare columns of type " <> colTyDesc
<> ". All fields are combined with logical 'AND'."
isScalarWhere = flip isScalarColumnWhere colTy
mk t n = InpValInfo Nothing n Nothing $ G.toGT t
mkListLiteral :: PGColumnType -> [P.PGColumnValue] -> UnpreparedValue
mkListLiteral columnType columnValues = P.UVLiteral $ SETyAnn
(SEArray $ txtEncoder . pstValue . P.pcvValue <$> columnValues)
(mkTypeAnn $ PGTypeArray $ unsafePGColumnToRepresentation columnType)

-- colScalarListTy = GA.GTList colGTy
eqOps =
["_eq", "_neq"]
compOps =
["_gt", "_lt", "_gte", "_lte"]
castExp :: PGColumnType -> m (Maybe (Parser 'Input n (CastExp UnpreparedValue)))
castExp sourceType = do
let maybeScalars = case sourceType of
PGColumnScalar PGGeography -> Just (PGGeography, PGGeometry)
PGColumnScalar PGGeometry -> Just (PGGeometry, PGGeography)
_ -> Nothing

listOps =
[ "_in", "_nin" ]
-- TODO
-- columnOps =
--   [ "_ceq", "_cneq", "_cgt", "_clt", "_cgte", "_clte"]
stringOps =
[ "_like", "_nlike", "_ilike", "_nilike"
, "_similar", "_nsimilar"
]
forM maybeScalars $ \(sourceScalar, targetScalar) -> do
sourceName <- P.mkScalarTypeName sourceScalar <&> (<> $$(G.litName "_cast_exp"))
targetName <- P.mkScalarTypeName targetScalar
targetOpExps <- comparisonExps $ PGColumnScalar targetScalar
let field = P.fieldOptional targetName Nothing $ (targetScalar, ) <$> targetOpExps
pure $ P.object sourceName Nothing $ M.fromList . maybeToList <$> field

opToInpVal (opName, ty, desc) = InpValInfo (Just desc) opName Nothing ty
geographyWithinDistanceInput
:: forall m n. (MonadSchema n m, MonadError QErr m)
=> m (Parser 'Input n (DWithinGeogOp UnpreparedValue))
geographyWithinDistanceInput = do
geographyParser <- P.column (PGColumnScalar PGGeography) (G.Nullability False)
-- FIXME
-- It doesn't make sense for this value to be nullable; it only is for
-- backwards compatibility; if an explicit Null value is given, it will be
-- forwarded to the underlying SQL function, which in turn treats a null value
-- as an error. We can fix this by rejecting explicit null values, by marking
-- this field non-nullable in a future release.
booleanParser <- P.column (PGColumnScalar PGBoolean) (G.Nullability True)
floatParser <- P.column (PGColumnScalar PGFloat) (G.Nullability False)
pure $ P.object $$(G.litName "st_d_within_geography_input") Nothing $
DWithinGeogOp <$> (mkParameter <$> P.field $$(G.litName "distance") Nothing floatParser)
<*> (mkParameter <$> P.field $$(G.litName "from") Nothing geographyParser)
<*> (mkParameter <$> P.fieldWithDefault $$(G.litName "use_spheroid") Nothing (G.VBoolean True) booleanParser)
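A query using this input object would look roughly as follows (the `shops` table, its `location` geography column, and the GeoJSON literal are all hypothetical; the `distance`, `from`, and `use_spheroid` field names come from the parser above):

```graphql
query {
  shops(where: {
    location: {
      _st_d_within: {
        distance: 1000
        from: { type: "Point", coordinates: [-73.99, 40.73] }
        use_spheroid: true
      }
    }
  }) {
    id
  }
}
```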

jsonbOps =
[ ( "_contains"
, G.toGT $ mkScalarTy PGJSONB
, "does the column contain the given json value at the top level"
)
, ( "_contained_in"
, G.toGT $ mkScalarTy PGJSONB
, "is the column contained in the given json value"
)
, ( "_has_key"
, G.toGT $ mkScalarTy PGText
, "does the string exist as a top-level key in the column"
)
, ( "_has_keys_any"
, G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText
, "do any of these strings exist as top-level keys in the column"
)
, ( "_has_keys_all"
, G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText
, "do all of these strings exist as top-level keys in the column"
)
]
geometryWithinDistanceInput
:: forall m n. (MonadSchema n m, MonadError QErr m)
=> m (Parser 'Input n (DWithinGeomOp UnpreparedValue))
geometryWithinDistanceInput = do
geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False)
floatParser <- P.column (PGColumnScalar PGFloat) (G.Nullability False)
pure $ P.object $$(G.litName "st_d_within_input") Nothing $
DWithinGeomOp <$> (mkParameter <$> P.field $$(G.litName "distance") Nothing floatParser)
<*> (mkParameter <$> P.field $$(G.litName "from") Nothing geometryParser)

castOpInputValues =
-- currently, only geometry/geography types support casting
guard (isScalarWhere isGeoType) $>
InpValInfo Nothing "_cast" Nothing (G.toGT $ mkCastExpTy colTy)
intersectsNbandGeomInput
:: forall m n. (MonadSchema n m, MonadError QErr m)
=> m (Parser 'Input n (STIntersectsNbandGeommin UnpreparedValue))
intersectsNbandGeomInput = do
geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False)
integerParser <- P.column (PGColumnScalar PGInteger) (G.Nullability False)
pure $ P.object $$(G.litName "st_intersects_nband_geom_input") Nothing $
STIntersectsNbandGeommin <$> (mkParameter <$> P.field $$(G.litName "nband") Nothing integerParser)
<*> (mkParameter <$> P.field $$(G.litName "geommin") Nothing geometryParser)

stDWithinGeoOpInpVal ty =
InpValInfo (Just stDWithinGeoDesc) "_st_d_within" Nothing $ G.toGT ty
stDWithinGeoDesc =
"is the column within a distance from a " <> colTyDesc <> " value"
intersectsGeomNbandInput
:: forall m n. (MonadSchema n m, MonadError QErr m)
=> m (Parser 'Input n (STIntersectsGeomminNband UnpreparedValue))
intersectsGeomNbandInput = do
geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False)
integerParser <- P.column (PGColumnScalar PGInteger) (G.Nullability False)
pure $ P.object $$(G.litName "st_intersects_geom_nband_input") Nothing $ STIntersectsGeomminNband
<$> ( mkParameter <$> P.field $$(G.litName "geommin") Nothing geometryParser)
<*> (fmap mkParameter <$> P.fieldOptional $$(G.litName "nband") Nothing integerParser)

geoOpToInpVal (opName, desc) =
InpValInfo (Just desc) opName Nothing $ G.toGT colGqlType
{- Note [Columns in comparison expression are never nullable]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In comparisonExps, we hardcode `Nullability False` when calling `column`, which
might seem a bit sketchy. Shouldn’t the nullability depend on the nullability of
the underlying Postgres column?

-- operators applicable only to geometry types
geomOps :: [(G.Name, G.Description)]
geomOps =
[
( "_st_contains"
, "does the column contain the given geometry value"
)
, ( "_st_crosses"
, "does the column cross the given geometry value"
)
, ( "_st_equals"
, "is the column equal to given geometry value. Directionality is ignored"
)
, ( "_st_overlaps"
, "does the column 'spatially overlap' (intersect but not completely contain) the given geometry value"
)
, ( "_st_touches"
, "does the column have at least one point in common with the given geometry value"
)
, ( "_st_within"
, "is the column contained in the given geometry value"
)
]
No. If we did that, then we would allow boolean expressions like this:

-- operators applicable to geometry and geography types
geoOps =
[
( "_st_intersects"
, "does the column spatially intersect the given " <> colTyDesc <> " value"
)
]
delete_users(where: {status: {eq: null}})

-- raster related operators
rasterOps =
[
( "_st_intersects_rast"
, G.toGT $ mkScalarTy PGRaster
, boolFnMsg <> "ST_Intersects(raster <raster-col>, raster <raster-input>)"
)
, ( "_st_intersects_nband_geom"
, G.toGT stIntersectsNbandGeomInputTy
, boolFnMsg <> "ST_Intersects(raster <raster-col>, integer nband, geometry geommin)"
)
, ( "_st_intersects_geom_nband"
, G.toGT stIntersectsGeomNbandInputTy
, boolFnMsg <> "ST_Intersects(raster <raster-col> , geometry geommin, integer nband=NULL)"
)
]
The user expects this to generate SQL like

boolFnMsg = "evaluates the following boolean Postgres function; "
DELETE FROM users WHERE users.status IS NULL

geoInputTypes :: [InpObjTyInfo]
geoInputTypes =
[ stDWithinGeometryInputType
, stDWithinGeographyInputType
, mkCastExpressionInputType (PGColumnScalar PGGeometry) [PGColumnScalar PGGeography]
, mkCastExpressionInputType (PGColumnScalar PGGeography) [PGColumnScalar PGGeometry]
]
where
stDWithinGeometryInputType =
mkHsraInpTyInfo Nothing stDWithinGeometryInpTy $ fromInpValL
[ InpValInfo Nothing "from" Nothing $ G.toGT $ G.toNT $ mkScalarTy PGGeometry
, InpValInfo Nothing "distance" Nothing $ G.toNT $ mkScalarTy PGFloat
]
stDWithinGeographyInputType =
mkHsraInpTyInfo Nothing stDWithinGeographyInpTy $ fromInpValL
[ InpValInfo Nothing "from" Nothing $ G.toGT $ G.toNT $ mkScalarTy PGGeography
, InpValInfo Nothing "distance" Nothing $ G.toNT $ mkScalarTy PGFloat
, InpValInfo
Nothing "use_spheroid" (Just $ G.VCBoolean True) $ G.toGT $ mkScalarTy PGBoolean
]
but it doesn’t. We treat null to mean “no condition was specified” (since
that’s how GraphQL indicates an optional field was omitted), and we actually
generate SQL like this:

stIntersectsNbandGeomInputTy :: G.NamedType
stIntersectsNbandGeomInputTy = G.NamedType "st_intersects_nband_geom_input"
DELETE FROM users

stIntersectsGeomNbandInputTy :: G.NamedType
stIntersectsGeomNbandInputTy = G.NamedType "st_intersects_geom_nband_input"
Now we’ve gone and deleted every user in the database. Hopefully the user had
backups!
|
||||
|
||||
rasterIntersectsInputTypes :: [InpObjTyInfo]
|
||||
rasterIntersectsInputTypes =
|
||||
[ stIntersectsNbandGeomInput
|
||||
, stIntersectsGeomNbandInput
|
||||
]
|
||||
where
|
||||
stIntersectsNbandGeomInput =
|
||||
mkHsraInpTyInfo Nothing stIntersectsNbandGeomInputTy $ fromInpValL
|
||||
[ InpValInfo Nothing "nband" Nothing $
|
||||
G.toGT $ G.toNT $ mkScalarTy PGInteger
|
||||
, InpValInfo Nothing "geommin" Nothing $
|
||||
G.toGT $ G.toNT $ mkScalarTy PGGeometry
|
||||
]
|
||||
|
||||
stIntersectsGeomNbandInput =
|
||||
mkHsraInpTyInfo Nothing stIntersectsGeomNbandInputTy $ fromInpValL
|
||||
[ InpValInfo Nothing "geommin" Nothing $
|
||||
G.toGT $ G.toNT $ mkScalarTy PGGeometry
|
||||
, InpValInfo Nothing "nband" Nothing $
|
||||
G.toGT $ mkScalarTy PGInteger
|
||||
]
|
||||
|
||||
mkBoolExpName :: QualifiedTable -> G.Name
|
||||
mkBoolExpName tn =
|
||||
qualObjectToName tn <> "_bool_exp"
|
||||
|
||||
mkBoolExpTy :: QualifiedTable -> G.NamedType
|
||||
mkBoolExpTy =
|
||||
G.NamedType . mkBoolExpName
|
||||
|
||||
-- table_bool_exp
|
||||
mkBoolExpInp
|
||||
:: QualifiedTable
|
||||
-- the fields that are allowed
|
||||
-> [SelField]
|
||||
-> InpObjTyInfo
|
||||
mkBoolExpInp tn fields =
|
||||
mkHsraInpTyInfo (Just desc) boolExpTy $ Map.fromList
|
||||
[(_iviName inpVal, inpVal) | inpVal <- inpValues]
|
||||
where
|
||||
desc = G.Description $
|
||||
"Boolean expression to filter rows from the table " <> tn <<>
|
||||
". All fields are combined with a logical 'AND'."
|
||||
|
||||
-- the type of this boolean expression
|
||||
boolExpTy = mkBoolExpTy tn
|
||||
|
||||
-- all the fields of this input object
|
||||
inpValues = combinators <> mapMaybe mkFldExpInp fields
|
||||
|
||||
mk n ty = InpValInfo Nothing n Nothing $ G.toGT ty
|
||||
|
||||
boolExpListTy = G.toLT boolExpTy
|
||||
|
||||
combinators =
|
||||
[ mk "_not" boolExpTy
|
||||
, mk "_and" boolExpListTy
|
||||
, mk "_or" boolExpListTy
|
||||
]
|
||||
|
||||
mkFldExpInp = \case
|
||||
SFPGColumn (PGColumnInfo _ name _ colTy _ _) ->
|
||||
Just $ mk name (mkCompExpTy colTy)
|
||||
SFRelationship relationshipField ->
|
||||
let relationshipName = riName $ _rfiInfo relationshipField
|
||||
remoteTable = riRTable $ _rfiInfo relationshipField
|
||||
in Just $ mk (mkRelName relationshipName) (mkBoolExpTy remoteTable)
|
||||
SFComputedField _ -> Nothing -- TODO: support computed fields in bool exps
|
||||
SFRemoteRelationship{} -> Nothing
|
||||
We avoid this problem by making the column value non-nullable (which is correct,
|
||||
since we never treat a null value as a SQL NULL), then creating the field using
|
||||
fieldOptional. This creates a parser that rejects nulls, but won’t be called at
|
||||
all if the field is not specified, which is permitted by the GraphQL
|
||||
specification. See Note [Optional fields and nullability] in
|
||||
Hasura.GraphQL.Parser.Internal.Parser for more details. -}
|
||||
|
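The behaviour the note describes can be modelled in miniature. The following is a self-contained sketch (plain Haskell; `FieldValue` and `toCondition` are illustrative stand-ins, not Hasura's actual parser types) of the distinction between an omitted optional field and an explicitly-null one:

```haskell
-- Minimal model of the fieldOptional approach: an omitted field contributes
-- no condition at all, while an explicit null is rejected instead of
-- silently matching every row.
data FieldValue
  = Omitted       -- the field was not specified at all
  | ExplicitNull  -- the field was specified as literal null
  | Value Int     -- the field carried an actual value
  deriving (Show, Eq)

toCondition :: FieldValue -> Either String (Maybe Int)
toCondition Omitted      = Right Nothing                -- no WHERE condition generated
toCondition ExplicitNull = Left "unexpected null value" -- parse error, not "match all rows"
toCondition (Value n)    = Right (Just n)

main :: IO ()
main = print (map toCondition [Omitted, ExplicitNull, Value 1])
```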
@ -1,86 +0,0 @@
module Hasura.GraphQL.Schema.Builder
  ( TyAgg(..)
  , FieldMap
  , taTypes
  , taFields
  , taScalars
  , taOrdBy
  , addFieldsToTyAgg
  , addTypeInfoToTyAgg
  , addScalarToTyAgg
  , QueryRootFieldMap
  , MutationRootFieldMap
  , RootFields(..)
  , addQueryField
  , addMutationField
  ) where

import Control.Lens

import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import Hasura.SQL.Types

-- | A /types aggregate/, which holds role-specific information about visible GraphQL types.
-- Importantly, it holds more than just the information needed by GraphQL: it also includes how the
-- GraphQL types relate to Postgres types, which is used to validate literals provided for
-- Postgres-specific scalars.
data TyAgg
  = TyAgg
  { _taTypes   :: !TypeMap
  , _taFields  :: !FieldMap
  , _taScalars :: !(Set.HashSet PGScalarType)
  , _taOrdBy   :: !OrdByCtx
  } deriving (Show, Eq)
$(makeLenses ''TyAgg)

addFieldsToTyAgg :: FieldMap -> TyAgg -> TyAgg
addFieldsToTyAgg fields =
  over taFields (Map.union fields)

addTypeInfoToTyAgg :: TypeInfo -> TyAgg -> TyAgg
addTypeInfoToTyAgg typeInfo tyAgg =
  tyAgg & taTypes.at (getNamedTy typeInfo) ?~ typeInfo

addScalarToTyAgg :: PGScalarType -> TyAgg -> TyAgg
addScalarToTyAgg pgScalarType =
  over taScalars (Set.insert pgScalarType)

instance Semigroup TyAgg where
  (TyAgg t1 f1 s1 o1) <> (TyAgg t2 f2 s2 o2) =
    TyAgg (Map.union t1 t2) (Map.union f1 f2)
          (Set.union s1 s2) (Map.union o1 o2)

instance Monoid TyAgg where
  mempty = TyAgg Map.empty Map.empty Set.empty Map.empty

type QueryRootFieldMap = Map.HashMap G.Name (QueryCtx, ObjFldInfo)
type MutationRootFieldMap = Map.HashMap G.Name (MutationCtx, ObjFldInfo)

-- | A role-specific mapping from root field names to allowed operations.
data RootFields
  = RootFields
  { _rootQueryFields    :: !QueryRootFieldMap
  , _rootMutationFields :: !MutationRootFieldMap
  } deriving (Show, Eq)
$(makeLenses ''RootFields)

addQueryField :: (QueryCtx, ObjFldInfo) -> RootFields -> RootFields
addQueryField v rootFields =
  rootFields & rootQueryFields.at (_fiName $ snd v) ?~ v

addMutationField :: (MutationCtx, ObjFldInfo) -> RootFields -> RootFields
addMutationField v rootFields =
  rootFields & rootMutationFields.at (_fiName $ snd v) ?~ v

instance Semigroup RootFields where
  RootFields a1 b1 <> RootFields a2 b2
    = RootFields (Map.union a1 a2) (Map.union b1 b2)

instance Monoid RootFields where
  mempty = RootFields Map.empty Map.empty
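Both `TyAgg` and `RootFields` merge their maps with `Map.union`, which is left-biased: on a key collision, the left operand's entry wins. A self-contained sketch of that bias (using `Data.Map.Strict` from containers purely for illustration, not Hasura's `Data.HashMap.Strict`):

```haskell
import qualified Data.Map.Strict as Map

-- Map.union is left-biased: when both maps define the same key, the left
-- argument's value is kept, so (a <> b) on these aggregates prefers a's fields.
merged :: Map.Map String Int
merged = Map.union (Map.fromList [("id", 1)])
                   (Map.fromList [("id", 2), ("name", 3)])

main :: IO ()
main = print (Map.toList merged)  -- [("id",1),("name",3)]
```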
@ -1,170 +1,96 @@
module Hasura.GraphQL.Schema.Common
  ( qualObjectToName
  , addTypeSuffix
  , fromInpValL
module Hasura.GraphQL.Schema.Common where

  , RelationshipFieldInfo(..)
  , SelField(..)
  , _SFPGColumn
  , getPGColumnFields
  , getRelationshipFields
  , getComputedFields
  , getRemoteRelationships

  , mkColumnType
  , mkRelName
  , mkAggRelName
  , mkConnectionRelName
  , mkComputedFieldName

  , mkTableTy
  , mkTableConnectionTy
  , mkTableEdgeTy
  , mkTableEnumType
  , mkTableAggTy

  , mkColumnEnumVal
  , mkColumnInputVal
  , mkDescriptionWith

  , mkFuncArgsTy

  , mkPGColGNameMap

  , numAggregateOps
  , compAggregateOps

  , nodeType
  , nodeIdType
  ) where

import qualified Data.HashMap.Strict as Map
import qualified Data.Text as T
import qualified Language.GraphQL.Draft.Syntax as G

import Control.Lens

import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude

import qualified Data.Aeson as J
import qualified Data.HashMap.Strict.InsOrd as OMap

import Language.GraphQL.Draft.Syntax as G

import qualified Data.Text as T
import qualified Hasura.GraphQL.Execute.Types as ET (GraphQLQueryType)
import qualified Hasura.GraphQL.Parser as P
import qualified Hasura.RQL.DML.Select.Types as RQL (Fields)

import Hasura.RQL.Types
import Hasura.SQL.Types

data RelationshipFieldInfo
  = RelationshipFieldInfo
  { _rfiInfo              :: !RelInfo
  , _rfiAllowAgg          :: !Bool
  , _rfiColumns           :: !PGColGNameMap
  , _rfiPermFilter        :: !AnnBoolExpPartialSQL
  , _rfiPermLimit         :: !(Maybe Int)
  , _rfiPrimaryKeyColumns :: !(Maybe PrimaryKeyColumns)
  , _rfiIsNullable        :: !Bool
  } deriving (Show, Eq)
data QueryContext =
  QueryContext
  { qcStringifyNum :: !Bool
  , qcQueryType    :: !ET.GraphQLQueryType
  , qcRemoteFields :: !(HashMap RemoteSchemaName [P.Definition P.FieldInfo])
  }

data SelField
  = SFPGColumn !PGColumnInfo
  | SFRelationship !RelationshipFieldInfo
  | SFComputedField !ComputedField
  | SFRemoteRelationship !RemoteFieldInfo
textToName :: MonadError QErr m => Text -> m G.Name
textToName textName = G.mkName textName `onNothing` throw400 ValidationFailed
                      ("cannot include " <> textName <<> " in the GraphQL schema because "
                      <> " it is not a valid GraphQL identifier")
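`textToName` fails for any text that is not a valid GraphQL name. Per the GraphQL specification, a Name must match `/[_A-Za-z][_0-9A-Za-z]*/`. A self-contained sketch of just that check (`isValidGraphQLName` is a hypothetical stand-in for what `G.mkName` validates, not Hasura code):

```haskell
-- Checks the GraphQL Name grammar /[_A-Za-z][_0-9A-Za-z]*/ from the spec:
-- a leading underscore or ASCII letter, then underscores, letters, or digits.
isValidGraphQLName :: String -> Bool
isValidGraphQLName []       = False
isValidGraphQLName (c : cs) = startChar c && all bodyChar cs
  where
    startChar ch = ch == '_' || (ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z')
    bodyChar ch  = startChar ch || (ch >= '0' && ch <= '9')

main :: IO ()
main = print (map isValidGraphQLName ["users", "_id", "2fast", "bad-name"])
```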

partialSQLExpToUnpreparedValue :: PartialSQLExp -> P.UnpreparedValue
partialSQLExpToUnpreparedValue (PSESessVar pftype var) = P.UVSessionVar pftype var
partialSQLExpToUnpreparedValue (PSESQLExp sqlExp)      = P.UVLiteral sqlExp

mapField
  :: Functor m
  => P.InputFieldsParser m (Maybe a)
  -> (a -> b)
  -> P.InputFieldsParser m (Maybe b)
mapField fp f = fmap (fmap f) fp

parsedSelectionsToFields
  :: (Text -> a) -- ^ how to handle @__typename@ fields
  -> OMap.InsOrdHashMap G.Name (P.ParsedSelection a)
  -> RQL.Fields a
parsedSelectionsToFields mkTypename = OMap.toList
  >>> map (FieldName . G.unName *** P.handleTypename (mkTypename . G.unName))

numericAggOperators :: [G.Name]
numericAggOperators =
  [ $$(G.litName "sum")
  , $$(G.litName "avg")
  , $$(G.litName "stddev")
  , $$(G.litName "stddev_samp")
  , $$(G.litName "stddev_pop")
  , $$(G.litName "variance")
  , $$(G.litName "var_samp")
  , $$(G.litName "var_pop")
  ]

comparisonAggOperators :: [G.Name]
comparisonAggOperators = [$$(litName "max"), $$(litName "min")]

data NodeIdVersion
  = NIVersion1
  deriving (Show, Eq)
$(makePrisms ''SelField)

getPGColumnFields :: [SelField] -> [PGColumnInfo]
getPGColumnFields = mapMaybe (^? _SFPGColumn)
nodeIdVersionInt :: NodeIdVersion -> Int
nodeIdVersionInt NIVersion1 = 1

getRelationshipFields :: [SelField] -> [RelationshipFieldInfo]
getRelationshipFields = mapMaybe (^? _SFRelationship)
currentNodeIdVersion :: NodeIdVersion
currentNodeIdVersion = NIVersion1

getComputedFields :: [SelField] -> [ComputedField]
getComputedFields = mapMaybe (^? _SFComputedField)

getRemoteRelationships :: [SelField] -> [RemoteFieldInfo]
getRemoteRelationships = mapMaybe (^? _SFRemoteRelationship)

qualObjectToName :: (ToTxt a) => QualifiedObject a -> G.Name
qualObjectToName = G.Name . snakeCaseQualObject

addTypeSuffix :: Text -> G.NamedType -> G.NamedType
addTypeSuffix suffix baseType =
  G.NamedType $ G.unNamedType baseType <> G.Name suffix

fromInpValL :: [InpValInfo] -> Map.HashMap G.Name InpValInfo
fromInpValL = mapFromL _iviName

mkRelName :: RelName -> G.Name
mkRelName rn = G.Name $ relNameToTxt rn

mkAggRelName :: RelName -> G.Name
mkAggRelName rn = G.Name $ relNameToTxt rn <> "_aggregate"

mkConnectionRelName :: RelName -> G.Name
mkConnectionRelName rn = G.Name $ relNameToTxt rn <> "_connection"

mkComputedFieldName :: ComputedFieldName -> G.Name
mkComputedFieldName = G.Name . computedFieldNameToText

mkColumnType :: PGColumnType -> G.NamedType
mkColumnType = \case
  PGColumnScalar scalarType -> mkScalarTy scalarType
  PGColumnEnumReference (EnumReference enumTable _) -> mkTableEnumType enumTable

mkTableTy :: QualifiedTable -> G.NamedType
mkTableTy = G.NamedType . qualObjectToName

mkTableConnectionTy :: QualifiedTable -> G.NamedType
mkTableConnectionTy = addTypeSuffix "Connection" . mkTableTy

mkTableEdgeTy :: QualifiedTable -> G.NamedType
mkTableEdgeTy = addTypeSuffix "Edge" . mkTableTy

mkTableEnumType :: QualifiedTable -> G.NamedType
mkTableEnumType = addTypeSuffix "_enum" . mkTableTy

mkTableAggTy :: QualifiedTable -> G.NamedType
mkTableAggTy = addTypeSuffix "_aggregate" . mkTableTy

-- used for 'distinct_on' in select and upsert's 'update columns'
mkColumnEnumVal :: G.Name -> EnumValInfo
mkColumnEnumVal colName =
  EnumValInfo (Just "column name") (G.EnumValue colName) False

mkColumnInputVal :: PGColumnInfo -> InpValInfo
mkColumnInputVal ci =
  InpValInfo (mkDescription <$> pgiDescription ci) (pgiName ci)
  Nothing $ G.toGT $ G.toNT $ mkColumnType $ pgiType ci
instance J.FromJSON NodeIdVersion where
  parseJSON v = do
    versionInt :: Int <- J.parseJSON v
    case versionInt of
      1 -> pure NIVersion1
      _ -> fail $ "expecting version 1 for node id, but got " <> show versionInt

mkDescriptionWith :: Maybe PGDescription -> Text -> G.Description
mkDescriptionWith descM defaultTxt = G.Description $ case descM of
  Nothing                      -> defaultTxt
  Just (PGDescription descTxt) -> T.unlines [descTxt, "\n", defaultTxt]

mkDescription :: PGDescription -> G.Description
mkDescription = G.Description . getPGDescription

mkFuncArgsName :: QualifiedFunction -> G.Name
mkFuncArgsName fn =
  qualObjectToName fn <> "_args"

mkFuncArgsTy :: QualifiedFunction -> G.NamedType
mkFuncArgsTy =
  G.NamedType . mkFuncArgsName

mkPGColGNameMap :: [PGColumnInfo] -> PGColGNameMap
mkPGColGNameMap cols = Map.fromList $
  flip map cols $ \ci -> (pgiName ci, ci)

numAggregateOps :: [G.Name]
numAggregateOps = [ "sum", "avg", "stddev", "stddev_samp", "stddev_pop"
                  , "variance", "var_samp", "var_pop"
                  ]

compAggregateOps :: [G.Name]
compAggregateOps = ["max", "min"]

nodeType :: G.NamedType
nodeType =
  G.NamedType "Node"

nodeIdType :: G.GType
nodeIdType =
  G.toGT $ G.toNT $ G.NamedType "ID"
-- | The default @'skip' and @'include' directives
defaultDirectives :: [P.DirectiveInfo]
defaultDirectives =
  [mkDirective $$(G.litName "skip"), mkDirective $$(G.litName "include")]
  where
    ifInputField =
      P.mkDefinition $$(G.litName "if") Nothing $ P.IFRequired $ P.TNamed $
      P.mkDefinition $$(G.litName "Boolean") Nothing P.TIScalar
    dirLocs = map G.DLExecutable
      [G.EDLFIELD, G.EDLFRAGMENT_SPREAD, G.EDLINLINE_FRAGMENT]
    mkDirective name =
      P.DirectiveInfo name Nothing [ifInputField] dirLocs
@ -1,176 +0,0 @@
module Hasura.GraphQL.Schema.CustomTypes
  ( buildCustomTypesSchemaPartial
  , buildCustomTypesSchema
  ) where

import Control.Lens

import qualified Data.HashMap.Strict as Map
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Context (defaultTypes)
import Hasura.GraphQL.Schema.Common (mkTableTy)
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.Session
import Hasura.SQL.Types

import qualified Hasura.GraphQL.Validate.Types as VT

buildObjectTypeInfo :: RoleName -> AnnotatedObjectType -> VT.ObjTyInfo
buildObjectTypeInfo roleName annotatedObjectType =
  let description = _otdDescription objectDefinition
      namedType = unObjectTypeName $ _otdName objectDefinition
      fieldMap = mapFromL VT._fiName $ fields <> catMaybes relationships
  -- 'mkObjTyInfo' function takes care of inserting `__typename` field
  in VT.mkObjTyInfo description namedType mempty fieldMap VT.TLCustom
  where
    objectDefinition = _aotDefinition annotatedObjectType

    relationships =
      flip map (toList $ _aotRelationships annotatedObjectType) $
      \(TypeRelationship name ty remoteTableInfo _) ->
        if isJust (getSelectPermissionInfoM remoteTableInfo roleName) ||
           roleName == adminRoleName
        then Just (relationshipToFieldInfo name ty $ _tciName $ _tiCoreInfo remoteTableInfo)
        else Nothing
      where
        relationshipToFieldInfo name relTy remoteTableName =
          let fieldTy = case relTy of
                ObjRel -> G.toGT $ mkTableTy remoteTableName
                ArrRel -> G.toGT $ G.toLT $ mkTableTy remoteTableName
          in VT.ObjFldInfo
             { VT._fiDesc = Nothing -- TODO
             , VT._fiName = unRelationshipName name
             , VT._fiParams = mempty
             , VT._fiTy = fieldTy
             , VT._fiLoc = VT.TLCustom
             }

    fields =
      map convertObjectFieldDefinition $
      toList $ _otdFields objectDefinition
      where
        convertObjectFieldDefinition fieldDefinition =
          VT.ObjFldInfo
          { VT._fiDesc = _ofdDescription fieldDefinition
          , VT._fiName = unObjectFieldName $ _ofdName fieldDefinition
          , VT._fiParams = mempty
          , VT._fiTy = unGraphQLType $ _ofdType fieldDefinition
          , VT._fiLoc = VT.TLCustom
          }

buildCustomTypesSchema
  :: NonObjectTypeMap -> AnnotatedObjects -> RoleName -> VT.TypeMap
buildCustomTypesSchema nonObjectTypeMap annotatedObjectTypes roleName =
  unNonObjectTypeMap nonObjectTypeMap <> objectTypeInfos
  where
    objectTypeInfos =
      mapFromL VT.getNamedTy $
      map (VT.TIObj . buildObjectTypeInfo roleName) $
      Map.elems annotatedObjectTypes

annotateObjectType
  :: (MonadError QErr m)
  => TableCache -> NonObjectTypeMap -> ObjectTypeDefinition -> m AnnotatedObjectType
annotateObjectType tableCache nonObjectTypeMap objectDefinition = do
  annotatedFields <-
    fmap Map.fromList $ forM (toList $ _otdFields objectDefinition) $
    \objectField -> do
      let fieldName = _ofdName objectField
          fieldType = unGraphQLType $ _ofdType objectField
          fieldBaseType = G.getBaseType fieldType
      baseTypeInfo <- getFieldTypeInfo fieldBaseType
      return (fieldName, (fieldType, baseTypeInfo))
  annotatedRelationships <-
    fmap Map.fromList $ forM relationships $
    \relationship -> do
      let relationshipName = _trName relationship
          remoteTable = _trRemoteTable relationship
      remoteTableInfo <- onNothing (Map.lookup remoteTable tableCache) $
        throw500 $ "missing table info for: " <>> remoteTable
      annotatedFieldMapping <-
        forM (_trFieldMapping relationship) $ \remoteTableColumn -> do
          let fieldName = fromPGCol remoteTableColumn
          onNothing (getPGColumnInfoM remoteTableInfo fieldName) $
            throw500 $ "missing column info of " <> fieldName
            <<> " in table" <>> remoteTable
      return ( relationshipName
             , relationship & trRemoteTable .~ remoteTableInfo
                            & trFieldMapping .~ annotatedFieldMapping)
  return $ AnnotatedObjectType objectDefinition
           annotatedFields annotatedRelationships
  where
    relationships = fromMaybe [] $ _otdRelationships objectDefinition
    getFieldTypeInfo typeName = do
      let inputTypeInfos = unNonObjectTypeMap nonObjectTypeMap
                           <> mapFromL VT.getNamedTy defaultTypes
      typeInfo <- onNothing (Map.lookup typeName inputTypeInfos) $
        throw500 $ "the type: " <> VT.showNamedTy typeName <>
        " is not found in non-object custom types"
      case typeInfo of
        VT.TIScalar scalarTypeInfo -> return $ OutputFieldScalar scalarTypeInfo
        VT.TIEnum enumTypeInfo -> return $ OutputFieldEnum enumTypeInfo
        _ -> throw500 $
          "expecting only scalar/enum typeinfo for an object type's field: " <>
          VT.showNamedTy typeName

buildCustomTypesSchemaPartial
  :: (QErrM m)
  => TableCache
  -> CustomTypes
  -> HashSet PGScalarType
  -- ^ Postgres base types used in the custom type definitions;
  -- see Note [Postgres scalars in custom types].
  -> m (NonObjectTypeMap, AnnotatedObjects)
buildCustomTypesSchemaPartial tableCache customTypes pgScalars = do
  let typeInfos =
        map (VT.TIEnum . convertEnumDefinition) enumDefinitions <>
        map (VT.TIInpObj . convertInputObjectDefinition) inputObjectDefinitions <>
        map (VT.TIScalar . convertScalarDefinition) scalarDefinitions <>
        map (VT.TIScalar . VT.mkHsraScalarTyInfo) (toList pgScalars)
      nonObjectTypeMap = NonObjectTypeMap $ mapFromL VT.getNamedTy typeInfos

  annotatedObjectTypes <- mapFromL (_otdName . _aotDefinition) <$>
    traverse (annotateObjectType tableCache nonObjectTypeMap) objectDefinitions

  return (nonObjectTypeMap, annotatedObjectTypes)
  where
    inputObjectDefinitions = fromMaybe [] $ _ctInputObjects customTypes
    objectDefinitions = fromMaybe [] $ _ctObjects customTypes
    scalarDefinitions = fromMaybe [] $ _ctScalars customTypes
    enumDefinitions = fromMaybe [] $ _ctEnums customTypes

    convertScalarDefinition scalarDefinition =
      flip VT.fromScalarTyDef VT.TLCustom $ G.ScalarTypeDefinition
      (_stdDescription scalarDefinition)
      (G.unNamedType $ _stdName scalarDefinition) mempty

    convertEnumDefinition enumDefinition =
      VT.EnumTyInfo (_etdDescription enumDefinition)
      (unEnumTypeName $ _etdName enumDefinition)
      (VT.EnumValuesSynthetic $ mapFromL VT._eviVal $
       map convertEnumValueDefinition $ toList $ _etdValues enumDefinition)
      VT.TLCustom
      where
        convertEnumValueDefinition enumValueDefinition =
          VT.EnumValInfo (_evdDescription enumValueDefinition)
          (_evdValue enumValueDefinition)
          (fromMaybe False $ _evdIsDeprecated enumValueDefinition)

    convertInputObjectDefinition inputObjectDefinition =
      VT.InpObjTyInfo
      { VT._iotiDesc = _iotdDescription inputObjectDefinition
      , VT._iotiName = unInputObjectTypeName $ _iotdName inputObjectDefinition
      , VT._iotiFields = mapFromL VT._iviName $ map convertInputFieldDefinition $
                         toList $ _iotdFields inputObjectDefinition
      , VT._iotiLoc = VT.TLCustom
      }
      where
        convertInputFieldDefinition fieldDefinition =
          VT.InpValInfo
          { VT._iviDesc = _iofdDescription fieldDefinition
          , VT._iviName = unInputObjectFieldName $ _iofdName fieldDefinition
          , VT._iviDefVal = Nothing
          , VT._iviType = unGraphQLType $ _iofdType fieldDefinition
          }
@ -1,149 +0,0 @@
module Hasura.GraphQL.Schema.Function
  ( procFuncArgs
  , mkFuncArgsInp
  , mkFuncQueryFld
  , mkFuncQueryConnectionFld
  , mkFuncAggQueryFld
  , mkFuncArgsTy
  , mkFuncArgItemSeq
  ) where

import qualified Data.Sequence as Seq
import qualified Data.Text as T
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Types
import Hasura.GraphQL.Schema.Common
import Hasura.GraphQL.Schema.Select
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.SQL.Types

{-
input function_args {
  arg1: arg-type1!
  .         .
  .         .
  argn: arg-typen!
}
-}

procFuncArgs :: Seq.Seq a -> (a -> Maybe FunctionArgName) -> (a -> Text -> b) -> [b]
procFuncArgs argSeq nameFn resultFn =
  fst $ foldl mkItem ([], 1::Int) argSeq
  where
    mkItem (items, argNo) fa =
      case nameFn fa of
        Just argName ->
          let argT = getFuncArgNameTxt argName
          in (items <> pure (resultFn fa argT), argNo)
        Nothing ->
          let argT = "arg_" <> T.pack (show argNo)
          in (items <> pure (resultFn fa argT), argNo + 1)
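`procFuncArgs` falls back to positional names for unnamed function arguments, and advances the counter only when the fallback is actually used, so named arguments never consume a position. A self-contained sketch of just the naming scheme (`procNames` is a hypothetical simplification, not the real function):

```haskell
-- Mirrors procFuncArgs' naming: named arguments keep their name, unnamed
-- ones become arg_1, arg_2, ..., with the counter advancing only on
-- unnamed arguments.
procNames :: [Maybe String] -> [String]
procNames = go (1 :: Int)
  where
    go _ []                 = []
    go n (Just name : rest) = name : go n rest
    go n (Nothing   : rest) = ("arg_" ++ show n) : go (n + 1) rest

main :: IO ()
main = print (procNames [Just "limit", Nothing, Just "offset", Nothing])
-- ["limit","arg_1","offset","arg_2"]
```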
mkFuncArgsInp :: QualifiedFunction -> Seq.Seq FunctionArg -> Maybe InpObjTyInfo
mkFuncArgsInp funcName funcArgs =
  bool (Just inpObj) Nothing $ null funcArgs
  where
    funcArgsTy = mkFuncArgsTy funcName

    inpObj = mkHsraInpTyInfo Nothing funcArgsTy $
             fromInpValL argInps

    argInps = procFuncArgs funcArgs faName mkInpVal

    mkInpVal fa t =
      InpValInfo Nothing (G.Name t) Nothing $
      G.toGT $ mkScalarTy $ _qptName $ faType fa

{-

function(
  args: function_args
  where: table_bool_exp
  limit: Int
  offset: Int
): [table!]!

-}

mkFuncArgs :: FunctionInfo -> ParamMap
mkFuncArgs funInfo =
  fromInpValL $ funcInpArgs <> mkSelArgs retTable
  where
    funcName = fiName funInfo
    funcArgs = getInputArgs funInfo
    retTable = fiReturnType funInfo

    funcArgDesc = G.Description $ "input parameters for function " <>> funcName
    funcInpArg = InpValInfo (Just funcArgDesc) "args" Nothing $ G.toGT $ G.toNT $
                 mkFuncArgsTy funcName
    funcInpArgs = bool [funcInpArg] [] $ null funcArgs

mkFuncQueryFld
  :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo
mkFuncQueryFld funInfo descM =
  mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty
  where
    retTable = fiReturnType funInfo
    funcName = fiName funInfo

    desc = mkDescriptionWith descM $ "execute function " <> funcName
           <<> " which returns " <>> retTable
    fldName = qualObjectToName funcName

    ty = G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkTableTy retTable

mkFuncQueryConnectionFld
  :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo
mkFuncQueryConnectionFld funInfo descM =
  mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty
  where
    retTable = fiReturnType funInfo
    funcName = fiName funInfo

    desc = mkDescriptionWith descM $ "execute function " <> funcName
           <<> " which returns " <>> retTable
    fldName = qualObjectToName funcName <> "_connection"

    ty = G.toGT $ G.toNT $ mkTableConnectionTy retTable

{-

function_aggregate(
  args: function_args
  where: table_bool_exp
  limit: Int
  offset: Int
): table_aggregate!

-}

mkFuncAggQueryFld
  :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo
mkFuncAggQueryFld funInfo descM =
  mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty
  where
    funcName = fiName funInfo
    retTable = fiReturnType funInfo

    desc = mkDescriptionWith descM $ "execute function " <> funcName
           <<> " and query aggregates on result of table type "
           <>> retTable

    fldName = qualObjectToName funcName <> "_aggregate"

    ty = G.toGT $ G.toNT $ mkTableAggTy retTable


mkFuncArgItemSeq :: FunctionInfo -> Seq (InputArgument FunctionArgItem)
mkFuncArgItemSeq functionInfo =
  let inputArgs = fiInputArgs functionInfo
  in Seq.fromList $ procFuncArgs inputArgs nameFn resultFn
  where
    nameFn = \case
      IAUserProvided fa -> faName fa
      IASessionVariables name -> Just name
    resultFn arg gName = flip fmap arg $
      \fa -> FunctionArgItem (G.Name gName) (faName fa) (faHasDefault fa)
76
server/src-lib/Hasura/GraphQL/Schema/Insert.hs
Normal file
@ -0,0 +1,76 @@
module Hasura.GraphQL.Schema.Insert where

import Hasura.Prelude

import qualified Hasura.RQL.DML.Insert.Types as RQL
import qualified Hasura.RQL.DML.Returning.Types as RQL

import Hasura.RQL.Types.BoolExp
import Hasura.RQL.Types.Column
import Hasura.RQL.Types.Common
import Hasura.SQL.Types


-- At time of writing (August 2020), GraphQL queries and mutations get
-- translated into corresponding RQL queries: RQL is used as the internal
-- intermediary representation, before a query gets translated into
-- SQL. However, the RQL insert representation does not support nested
-- insertions, which means that GraphQL inserts need a separate
-- representation, found here.

-- FIXME: this code doesn't belong in this folder: arguably, since this is an
-- internal representation of a mutation, it should belong alongside RQL rather
-- than alongside the schema code, especially if we transition RQL to only be an
-- intermediary representation library rather than an actual API (see [1] for
-- more information).
-- [1] https://gist.github.com/abooij/07165b5ac36097178a334bc03805c33b

-- FIXME: this representation was lifted almost verbatim from pre-PDV code, and
-- hasn't been adapted to reflect the changes that PDV brought. It is therefore
-- quite likely that some of the information stored in those structures is
-- redundant, and that they can be simplified.

data AnnInsert v
  = AnnInsert
  { _aiFieldName :: !Text
  , _aiIsSingle  :: Bool
  , _aiData      :: AnnMultiInsert v
  }

data AnnIns a v
  = AnnIns
  { _aiInsObj         :: !a
  , _aiTableName      :: !QualifiedTable
  , _aiConflictClause :: !(Maybe (RQL.ConflictClauseP1 v))
  , _aiCheckCond      :: !(AnnBoolExp v, Maybe (AnnBoolExp v))
  , _aiTableCols      :: ![PGColumnInfo]
  , _aiDefVals        :: !(PreSetColsG v)
  } deriving (Show, Eq)

type SingleObjIns v = AnnIns (AnnInsObj v) v
type MultiObjIns v = AnnIns [AnnInsObj v] v

data RelIns a
  = RelIns
  { _riAnnIns  :: !a
  , _riRelInfo :: !RelInfo
  } deriving (Show, Eq)

type ObjRelIns v = RelIns (SingleObjIns v)
type ArrRelIns v = RelIns (MultiObjIns v)

data AnnInsObj v
  = AnnInsObj
  { _aioColumns :: ![(PGCol, v)]
  , _aioObjRels :: ![ObjRelIns v]
  , _aioArrRels :: ![ArrRelIns v]
  } deriving (Show, Eq)

type AnnSingleInsert v = (SingleObjIns v, RQL.MutationOutputG v)
type AnnMultiInsert v = (MultiObjIns v, RQL.MutationOutputG v)

instance Semigroup (AnnInsObj v) where
  (AnnInsObj col1 obj1 rel1) <> (AnnInsObj col2 obj2 rel2) =
    AnnInsObj (col1 <> col2) (obj1 <> obj2) (rel1 <> rel2)

instance Monoid (AnnInsObj v) where
  mempty = AnnInsObj [] [] []
602
server/src-lib/Hasura/GraphQL/Schema/Introspect.hs
Normal file
@ -0,0 +1,602 @@
module Hasura.GraphQL.Schema.Introspect where

import           Hasura.Prelude
-- import qualified Hasura.RQL.Types

import qualified Data.Aeson                     as J
import qualified Data.HashMap.Strict            as Map
import qualified Data.HashMap.Strict.InsOrd     as OMap
import qualified Data.List.NonEmpty             as NE
import qualified Data.Text                      as T
import qualified Data.Vector                    as V
import qualified Language.GraphQL.Draft.Printer as GP
import qualified Language.GraphQL.Draft.Syntax  as G
import qualified Text.Builder                   as T

import qualified Hasura.GraphQL.Parser          as P

import           Hasura.GraphQL.Parser          (FieldParser, Kind (..), Parser, Schema (..))
import           Hasura.GraphQL.Parser.Class

{-
Note [Basics of introspection schema generation]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We generate the introspection schema from the existing schema for queries,
mutations and subscriptions. In other words, we generate one @Parser@ from some
other @Parser@s. In this way, we avoid having to remember explicitly which
types we have to expose through introspection, as we did in a previous version
of graphql-engine.

However the schema information is obtained, the @Schema@ type stores it. From a
@Schema@ object we then produce one @FieldParser@ that reads a `__schema` field,
and one that reads a `__type` field. The idea is that these parsers simply
output a JSON value directly, and so indeed the type of @schema@, for instance,
is @FieldParser n J.Value@.

The idea of "just output the JSON object directly" breaks down when we want to
output a list of things, however, such as in the `types` field of `__schema`.
In the case of `types`, the JSON object to be generated is influenced by the
underlying selection set, so that, for instance,

```
query {
  __schema {
    types {
      name
    }
  }
}
```

means that we only output the _name_ of every type in our schema. One naive
approach would be to have a parser

```
typeField :: P.Type k -> Parser n J.Value
```

that takes a type and is able to produce a JSON value for it, and then to apply
this parser to all the types in our schema.

However, we only have *one* selection set to parse: so which of the parsers we
obtained should we use to parse it? And what should we do in the theoretical
case that we have a schema without any types? (The latter is actually not
possible, since we always have `query_root`, but it illustrates the problem
that there is no canonical choice of type to use to parse the selection set.)
Additionally, this would only give us the JSON output for *one* type, rather
than for our list of types. After all, @Parser n@ is *not* a @Monad@ (it's not
even an @Applicative@), so we don't have a map
@(a -> Parser n b) -> [a] -> Parser n [b]@.

In order to resolve this conundrum, let's consider what the ordinary Postgres
schema generates for a query such as the following.

```
query {
  author {
    articles {
      title
    }
  }
}
```

Now the @Parser@ for an article's title does not directly give the desired
output data: indeed, there would be many different titles, rather than any
single one we could return. Instead, it returns a value that can, after
parsing, be used to build an output, along the lines of:

```
articleTitle :: FieldParser n SQLArticleTitle
```

(This is a heavily simplified representation of reality.)

These values can be thought of as an intermediate representation that can then
be used to generate and run SQL that gives the desired JSON output at a later
stage. In other words, in the above example, @SQLArticleTitle@ can be thought
of as a function @Article -> Title@ that, given an article, gives back its
title.

Such an approach could help us as well: from instructions on how to generate a
JSON return for a given `__Type`, we can later simply apply this construction
to all desired types.

However, we don't _need_ to build an intermediate AST to do this: we can simply
output the conversion functions directly. So the type of @typeField@ is closer
to:

```
typeField :: Parser n (P.Type k -> J.Value)
```

This says that we are always able to parse a selection set for a `__Type`, and
once we do, we obtain a map, which we refer to as `printer` in this module,
which can output a JSON object for a given GraphQL type from our schema.

To use `typeField` as part of another selection set, we build up a
corresponding `FieldParser`, thus obtaining a printer, then apply this printer
to all desired types, and output the final JSON object as a J.Array of the
printed results, like so (again, heavily simplified):

```
types :: FieldParser n J.Value
types = do
  printer <- P.subselection_ $$(G.litName "types") Nothing typeField
  return $ J.Array $ map printer $ allSchemaTypes
```

Upon reading this you may wonder how we are able to use do notation for
@FieldParser@, which has neither a @Monad@ instance nor even an @Applicative@
instance. It just so happens that, as long as we write our do blocks carefully,
so that we only use the functoriality of @FieldParser@, the simplification
rules of GHC kick in just in time to avoid any application of @(>>=)@ or
@return@. Arguably the above notation is prettier than equivalent code that
explicitly reduces this to applications of @fmap@. If you, dear reader, feel
that the do notation adds more confusion than value, you should feel free to
change this, as there is no deeper meaning to the use of do notation than ease
of reading.
-}
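
{- As a concrete illustration of the point about do notation above (a sketch,
not part of the module proper; @allSchemaTypes@ is a hypothetical list of the
schema's types), the do block for `types` reduces to a single application of
@fmap@:

```
types :: FieldParser n J.Value
types =
  P.subselection_ $$(G.litName "types") Nothing typeField
    <&> \printer -> J.Array $ V.fromList $ map printer allSchemaTypes
```

Only the @Functor@ instance of @FieldParser@ is involved, which is why neither
a @Monad@ nor an @Applicative@ instance is needed.
-}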

-- | Generate a __type introspection parser
typeIntrospection
  :: forall n
   . MonadParse n
  => Schema
  -> FieldParser n J.Value
typeIntrospection fakeSchema = do
  let nameArg :: P.InputFieldsParser n G.Name
      nameArg = G.unsafeMkName <$> P.field $$(G.litName "name") Nothing P.string
  name'printer <- P.subselection $$(G.litName "__type") Nothing nameArg typeField
  return $ case Map.lookup (fst name'printer) (sTypes fakeSchema) of
    Nothing -> J.Null
    Just (P.Definition n u d (P.SomeTypeInfo i)) ->
      snd name'printer (SomeType (P.Nullable (P.TNamed (P.Definition n u d i))))
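
{- For illustration (not part of the module): with the parser above, a request
such as

```
query {
  __type(name: "query_root") {
    kind
    name
  }
}
```

yields a JSON object built by the `typeField` printer for the looked-up type,
while asking for a name that is absent from the schema yields `null`.
-}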

-- | Generate a __schema introspection parser.
schema
  :: forall n
   . MonadParse n
  => Schema
  -> FieldParser n J.Value
schema fakeSchema =
  let schemaSetParser = schemaSet fakeSchema
  in P.subselection_ $$(G.litName "__schema") Nothing schemaSetParser

{-
type __Type {
  kind: __TypeKind!
  name: String
  description: String

  # should be non-null for OBJECT and INTERFACE only, must be null for the others
  fields(includeDeprecated: Boolean = false): [__Field!]

  # should be non-null for OBJECT and INTERFACE only, must be null for the others
  interfaces: [__Type!]

  # should be non-null for INTERFACE and UNION only, always null for the others
  possibleTypes: [__Type!]

  # should be non-null for ENUM only, must be null for the others
  enumValues(includeDeprecated: Boolean = false): [__EnumValue!]

  # should be non-null for INPUT_OBJECT only, must be null for the others
  inputFields: [__InputValue!]

  # should be non-null for NON_NULL and LIST only, must be null for the others
  ofType: __Type
}
-}

data SomeType = forall k . SomeType (P.Type k)

typeField
  :: forall n
   . MonadParse n
  => Parser 'Output n (SomeType -> J.Value)
typeField =
  let
    includeDeprecated :: P.InputFieldsParser n Bool
    includeDeprecated =
      P.fieldWithDefault $$(G.litName "includeDeprecated") Nothing (G.VBoolean False) (P.nullable P.boolean)
      <&> fromMaybe False
    kind :: FieldParser n (SomeType -> J.Value)
    kind = P.selection_ $$(G.litName "kind") Nothing typeKind $>
      \case SomeType tp ->
              case tp of
                P.NonNullable _ ->
                  J.String "NON_NULL"
                P.Nullable (P.TList _) ->
                  J.String "LIST"
                P.Nullable (P.TNamed (P.Definition _ _ _ P.TIScalar)) ->
                  J.String "SCALAR"
                P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIEnum _))) ->
                  J.String "ENUM"
                P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInputObject _))) ->
                  J.String "INPUT_OBJECT"
                P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject _))) ->
                  J.String "OBJECT"
                P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface _))) ->
                  J.String "INTERFACE"
                P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIUnion _))) ->
                  J.String "UNION"
    name :: FieldParser n (SomeType -> J.Value)
    name = P.selection_ $$(G.litName "name") Nothing P.string $>
      \case SomeType tp ->
              case tp of
                P.Nullable (P.TNamed (P.Definition name' _ _ _)) ->
                  nameAsJSON name'
                _ -> J.Null
    description :: FieldParser n (SomeType -> J.Value)
    description = P.selection_ $$(G.litName "description") Nothing P.string $>
      \case SomeType tp ->
              case P.discardNullability tp of
                P.TNamed (P.Definition _ _ (Just desc) _) ->
                  J.String (G.unDescription desc)
                _ -> J.Null
    fields :: FieldParser n (SomeType -> J.Value)
    fields = do
      -- TODO handle the value of includeDeprecated
      includeDeprecated'printer <- P.subselection $$(G.litName "fields") Nothing includeDeprecated fieldField
      return $
        \case SomeType tp ->
                case tp of
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject (P.ObjectInfo fields' _interfaces')))) ->
                    J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName fields'
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface (P.InterfaceInfo fields' _objects')))) ->
                    J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName fields'
                  _ -> J.Null
    interfaces :: FieldParser n (SomeType -> J.Value)
    interfaces = do
      printer <- P.subselection_ $$(G.litName "interfaces") Nothing typeField
      return $
        \case SomeType tp ->
                case tp of
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject (P.ObjectInfo _fields' interfaces')))) ->
                    J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . fmap P.TIInterface) $ sortOn P.dName interfaces'
                  _ -> J.Null
    possibleTypes :: FieldParser n (SomeType -> J.Value)
    possibleTypes = do
      printer <- P.subselection_ $$(G.litName "possibleTypes") Nothing typeField
      return $
        \case SomeType tp ->
                case tp of
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface (P.InterfaceInfo _fields' objects')))) ->
                    J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . fmap P.TIObject) $ sortOn P.dName objects'
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIUnion (P.UnionInfo objects')))) ->
                    J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . fmap P.TIObject) $ sortOn P.dName objects'
                  _ -> J.Null
    enumValues :: FieldParser n (SomeType -> J.Value)
    enumValues = do
      -- TODO handle the value of includeDeprecated
      includeDeprecated'printer <- P.subselection $$(G.litName "enumValues") Nothing includeDeprecated enumValue
      return $
        \case SomeType tp ->
                case tp of
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIEnum vals))) ->
                    J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName $ toList vals
                  _ -> J.Null
    inputFields :: FieldParser n (SomeType -> J.Value)
    inputFields = do
      printer <- P.subselection_ $$(G.litName "inputFields") Nothing inputValue
      return $
        \case SomeType tp ->
                case tp of
                  P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInputObject (P.InputObjectInfo fieldDefs)))) ->
                    J.Array $ V.fromList $ map printer $ sortOn P.dName fieldDefs
                  _ -> J.Null
    ofType :: FieldParser n (SomeType -> J.Value)
    ofType = do
      printer <- P.subselection_ $$(G.litName "ofType") Nothing typeField
      return $ \case
        SomeType (P.NonNullable x) ->
          printer $ SomeType $ P.Nullable x
        SomeType (P.Nullable (P.TList x)) ->
          printer $ SomeType x
        _ -> J.Null
  in
    applyPrinter <$>
    P.selectionSet
      $$(G.litName "__Type")
      Nothing
      [ kind
      , name
      , description
      , fields
      , interfaces
      , possibleTypes
      , enumValues
      , inputFields
      , ofType
      ]

{-
type __InputValue {
  name: String!
  description: String
  type: __Type!
  defaultValue: String
}
-}
inputValue
  :: forall n
   . MonadParse n
  => Parser 'Output n (P.Definition P.InputFieldInfo -> J.Value)
inputValue =
  let
    name :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value)
    name = P.selection_ $$(G.litName "name") Nothing P.string $>
      nameAsJSON . P.dName
    description :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value)
    description = P.selection_ $$(G.litName "description") Nothing P.string $>
      maybe J.Null (J.String . G.unDescription) . P.dDescription
    typeF :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value)
    typeF = do
      printer <- P.subselection_ $$(G.litName "type") Nothing typeField
      return $ \defInfo -> case P.dInfo defInfo of
        P.IFRequired tp   -> printer $ SomeType $ P.NonNullable tp
        P.IFOptional tp _ -> printer $ SomeType tp
    defaultValue :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value)
    defaultValue = P.selection_ $$(G.litName "defaultValue") Nothing P.string $>
      \defInfo -> case P.dInfo defInfo of
        P.IFOptional _ (Just val) -> J.String $ T.run $ GP.value val
        _                         -> J.Null
  in
    applyPrinter <$>
    P.selectionSet
      $$(G.litName "__InputValue")
      Nothing
      [ name
      , description
      , typeF
      , defaultValue
      ]

{-
type __EnumValue {
  name: String!
  description: String
  isDeprecated: Boolean!
  deprecationReason: String
}
-}
enumValue
  :: forall n
   . MonadParse n
  => Parser 'Output n (P.Definition P.EnumValueInfo -> J.Value)
enumValue =
  let
    name :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value)
    name = P.selection_ $$(G.litName "name") Nothing P.string $>
      nameAsJSON . P.dName
    description :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value)
    description = P.selection_ $$(G.litName "description") Nothing P.string $>
      maybe J.Null (J.String . G.unDescription) . P.dDescription
    -- TODO We don't seem to support enum value deprecation
    isDeprecated :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value)
    isDeprecated = P.selection_ $$(G.litName "isDeprecated") Nothing P.string $>
      const (J.Bool False)
    deprecationReason :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value)
    deprecationReason = P.selection_ $$(G.litName "deprecationReason") Nothing P.string $>
      const J.Null
  in
    applyPrinter <$>
    P.selectionSet
      $$(G.litName "__EnumValue")
      Nothing
      [ name
      , description
      , isDeprecated
      , deprecationReason
      ]

{-
enum __TypeKind {
  ENUM
  INPUT_OBJECT
  INTERFACE
  LIST
  NON_NULL
  OBJECT
  SCALAR
  UNION
}
-}
typeKind
  :: forall n
   . MonadParse n
  => Parser 'Both n ()
typeKind = P.enum
  $$(G.litName "__TypeKind")
  Nothing
  (NE.fromList
    [ mkDefinition $$(G.litName "ENUM")
    , mkDefinition $$(G.litName "INPUT_OBJECT")
    , mkDefinition $$(G.litName "INTERFACE")
    , mkDefinition $$(G.litName "LIST")
    , mkDefinition $$(G.litName "NON_NULL")
    , mkDefinition $$(G.litName "OBJECT")
    , mkDefinition $$(G.litName "SCALAR")
    , mkDefinition $$(G.litName "UNION")
    ])
  where
    mkDefinition name = (P.Definition name Nothing Nothing P.EnumValueInfo, ())

{-
type __Field {
  name: String!
  description: String
  args: [__InputValue!]!
  type: __Type!
  isDeprecated: Boolean!
  deprecationReason: String
}
-}
fieldField
  :: forall n
   . MonadParse n
  => Parser 'Output n (P.Definition P.FieldInfo -> J.Value)
fieldField =
  let
    name :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    name = P.selection_ $$(G.litName "name") Nothing P.string $>
      nameAsJSON . P.dName
    description :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    description = P.selection_ $$(G.litName "description") Nothing P.string $> \defInfo ->
      case P.dDescription defInfo of
        Nothing   -> J.Null
        Just desc -> J.String (G.unDescription desc)
    args :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    args = do
      printer <- P.subselection_ $$(G.litName "args") Nothing inputValue
      return $ J.Array . V.fromList . map printer . sortOn P.dName . P.fArguments . P.dInfo
    typeF :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    typeF = do
      printer <- P.subselection_ $$(G.litName "type") Nothing typeField
      return $ printer . (\case P.FieldInfo _ tp -> SomeType tp) . P.dInfo
    -- TODO We don't seem to track deprecation info
    isDeprecated :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    isDeprecated = P.selection_ $$(G.litName "isDeprecated") Nothing P.string $>
      const (J.Bool False)
    deprecationReason :: FieldParser n (P.Definition P.FieldInfo -> J.Value)
    deprecationReason = P.selection_ $$(G.litName "deprecationReason") Nothing P.string $>
      const J.Null
  in
    applyPrinter <$>
    P.selectionSet $$(G.litName "__Field") Nothing
      [ name
      , description
      , args
      , typeF
      , isDeprecated
      , deprecationReason
      ]

{-
type __Directive {
  name: String!
  description: String
  locations: [__DirectiveLocation!]!
  args: [__InputValue!]!
  isRepeatable: Boolean!
}
-}

directiveSet
  :: forall n
   . MonadParse n
  => Parser 'Output n (P.DirectiveInfo -> J.Value)
directiveSet =
  let
    name :: FieldParser n (P.DirectiveInfo -> J.Value)
    name = P.selection_ $$(G.litName "name") Nothing P.string $>
      (J.toJSON . P.diName)
    description :: FieldParser n (P.DirectiveInfo -> J.Value)
    description = P.selection_ $$(G.litName "description") Nothing P.string $>
      (J.toJSON . P.diDescription)
    locations :: FieldParser n (P.DirectiveInfo -> J.Value)
    locations = P.selection_ $$(G.litName "locations") Nothing P.string $>
      (J.toJSON . map showDirLoc . P.diLocations)
    args :: FieldParser n (P.DirectiveInfo -> J.Value)
    args = do
      printer <- P.subselection_ $$(G.litName "args") Nothing inputValue
      pure $ J.toJSON . map printer . P.diArguments
    isRepeatable :: FieldParser n (P.DirectiveInfo -> J.Value)
    isRepeatable = P.selection_ $$(G.litName "isRepeatable") Nothing P.string $>
      const J.Null
  in
    applyPrinter <$> P.selectionSet $$(G.litName "__Directive") Nothing
      [ name
      , description
      , locations
      , args
      , isRepeatable
      ]
  where
    showDirLoc :: G.DirectiveLocation -> Text
    showDirLoc = \case
      G.DLExecutable edl  -> T.pack $ drop 3 $ show edl
      G.DLTypeSystem tsdl -> T.pack $ drop 4 $ show tsdl

{-
type __Schema {
  description: String
  types: [__Type!]!
  queryType: __Type!
  mutationType: __Type
  subscriptionType: __Type
  directives: [__Directive!]!
}
-}
schemaSet
  :: forall n
   . MonadParse n
  => Schema
  -> Parser 'Output n J.Value
schemaSet fakeSchema =
  let
    description :: FieldParser n J.Value
    description = P.selection_ $$(G.litName "description") Nothing P.string $>
      case sDescription fakeSchema of
        Nothing -> J.Null
        Just s  -> J.String $ G.unDescription s
    types :: FieldParser n J.Value
    types = do
      printer <- P.subselection_ $$(G.litName "types") Nothing typeField
      return $ J.Array $ V.fromList $ map (printer . schemaTypeToSomeType) $
        sortOn P.dName $ Map.elems $ sTypes fakeSchema
      where
        schemaTypeToSomeType
          :: P.Definition P.SomeTypeInfo
          -> SomeType
        schemaTypeToSomeType (P.Definition n u d (P.SomeTypeInfo i)) =
          SomeType $ P.Nullable $ P.TNamed (P.Definition n u d i)
    queryType :: FieldParser n J.Value
    queryType = do
      printer <- P.subselection_ $$(G.litName "queryType") Nothing typeField
      return $ printer $ SomeType $ sQueryType fakeSchema
    mutationType :: FieldParser n J.Value
    mutationType = do
      printer <- P.subselection_ $$(G.litName "mutationType") Nothing typeField
      return $ case sMutationType fakeSchema of
        Nothing -> J.Null
        Just tp -> printer $ SomeType tp
    subscriptionType :: FieldParser n J.Value
    subscriptionType = do
      printer <- P.subselection_ $$(G.litName "subscriptionType") Nothing typeField
      return $ case sSubscriptionType fakeSchema of
        Nothing -> J.Null
        Just tp -> printer $ SomeType tp
    directives :: FieldParser n J.Value
    directives = do
      printer <- P.subselection_ $$(G.litName "directives") Nothing directiveSet
      return $ J.toJSON $ map printer $ sDirectives fakeSchema
  in
    selectionSetToJSON . fmap (P.handleTypename nameAsJSON) <$>
    P.selectionSet
      $$(G.litName "__Schema")
      Nothing
      [ description
      , types
      , queryType
      , mutationType
      , subscriptionType
      , directives
      ]

selectionSetToJSON
  :: OMap.InsOrdHashMap G.Name J.Value
  -> J.Value
selectionSetToJSON = J.Object . OMap.toHashMap . OMap.mapKeys G.unName

applyPrinter
  :: OMap.InsOrdHashMap G.Name (P.ParsedSelection (a -> J.Value))
  -> a
  -> J.Value
applyPrinter = flip (\x -> selectionSetToJSON . fmap (($ x) . P.handleTypename (const . nameAsJSON)))

nameAsJSON :: G.Name -> J.Value
nameAsJSON = J.String . G.unName
server/src-lib/Hasura/GraphQL/Schema/Merge.hs (deleted file, 152 lines)
@@ -1,152 +0,0 @@
module Hasura.GraphQL.Schema.Merge
  ( mergeGCtx
  , checkSchemaConflicts
  , checkConflictingNode
  ) where

import qualified Data.HashMap.Strict           as Map
import qualified Data.Text                     as T
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Context
import           Hasura.GraphQL.Validate.Types
import           Hasura.Prelude
import           Hasura.RQL.Types

mergeGCtx :: (MonadError QErr m) => GCtx -> GCtx -> m GCtx
mergeGCtx lGCtx rGCtx = do
  checkSchemaConflicts lGCtx rGCtx
  pure GCtx { _gTypes          = mergedTypeMap
            , _gFields         = _gFields lGCtx <> _gFields rGCtx
            , _gQueryRoot      = mergedQueryRoot
            , _gMutRoot        = mergedMutationRoot
            , _gSubRoot        = mergedSubRoot
            , _gOrdByCtx       = _gOrdByCtx lGCtx <> _gOrdByCtx rGCtx
            , _gQueryCtxMap    = _gQueryCtxMap lGCtx <> _gQueryCtxMap rGCtx
            , _gMutationCtxMap = _gMutationCtxMap lGCtx <> _gMutationCtxMap rGCtx
            , _gInsCtxMap      = _gInsCtxMap lGCtx <> _gInsCtxMap rGCtx
            }
  where
    mergedQueryRoot    = _gQueryRoot lGCtx <> _gQueryRoot rGCtx
    mergedMutationRoot = _gMutRoot lGCtx <> _gMutRoot rGCtx
    mergedSubRoot      = _gSubRoot lGCtx <> _gSubRoot rGCtx
    mergedTypeMap =
      let mergedTypes = _gTypes lGCtx <> _gTypes rGCtx
          modifyQueryRootField = Map.insert queryRootNamedType (TIObj mergedQueryRoot)
          modifyMaybeRootField tyname maybeObj m = case maybeObj of
            Nothing  -> m
            Just obj -> Map.insert tyname (TIObj obj) m
      in modifyMaybeRootField subscriptionRootNamedType mergedSubRoot $
         modifyMaybeRootField mutationRootNamedType mergedMutationRoot $
         modifyQueryRootField mergedTypes

checkSchemaConflicts
  :: (MonadError QErr m)
  => GCtx -> GCtx -> m ()
checkSchemaConflicts gCtx remoteCtx = do
  let typeMap = _gTypes gCtx -- hasura typemap
  -- check type conflicts
  let hTypes = Map.elems typeMap
      hTyNames = map G.unNamedType $ Map.keys typeMap
      -- get the root names from the remote schema
      rmQRootName = _otiName $ _gQueryRoot remoteCtx
      rmMRootName = maybeToList $ _otiName <$> _gMutRoot remoteCtx
      rmSRootName = maybeToList $ _otiName <$> _gSubRoot remoteCtx
      rmRootNames = map G.unNamedType (rmQRootName:(rmMRootName ++ rmSRootName))
  let rmTypes = Map.filterWithKey
        (\k _ -> G.unNamedType k `notElem` builtinTy ++ rmRootNames)
        $ _gTypes remoteCtx

      isTyInfoSame ty = any (`tyinfoEq` ty) hTypes
      -- name is same and structure is not same
      isSame n ty = G.unNamedType n `elem` hTyNames &&
                    not (isTyInfoSame ty)
      conflictedTypes = Map.filterWithKey isSame rmTypes
      conflictedTyNames = map G.unNamedType $ Map.keys conflictedTypes

  unless (Map.null conflictedTypes) $
    throw400 RemoteSchemaConflicts $ tyMsg conflictedTyNames

  -- check node conflicts
  let rmQRoot = _otiFields $ _gQueryRoot remoteCtx
      rmMRoot = _otiFields <$> _gMutRoot remoteCtx
      rmRoots = filter (`notElem` builtinNodes ++ rmRootNames) . Map.keys <$>
                mergeMaybeMaps (Just rmQRoot) rmMRoot
      hQR = _otiFields <$>
            (getObjTyM =<< Map.lookup hQRName typeMap)
      hMR = _otiFields <$>
            (getObjTyM =<< Map.lookup hMRName typeMap)
      hRoots = Map.keys <$> mergeMaybeMaps hQR hMR

  case (rmRoots, hRoots) of
    (Just rmR, Just hR) -> do
      let conflictedNodes = filter (`elem` hR) rmR
      unless (null conflictedNodes) $
        throw400 RemoteSchemaConflicts $ nodesMsg conflictedNodes
    _ -> return ()

  where
    tyinfoEq a b = case (a, b) of
      (TIScalar t1, TIScalar t2) -> typeEq t1 t2
      (TIObj t1, TIObj t2)       -> typeEq t1 t2
      (TIEnum t1, TIEnum t2)     -> typeEq t1 t2
      (TIInpObj t1, TIInpObj t2) -> typeEq t1 t2
      _                          -> False

    hQRName = queryRootNamedType
    hMRName = mutationRootNamedType
    tyMsg ty = "types: [ " <> namesToTxt ty <>
               " ] have mismatch with current graphql schema. HINT: Types must be same."
    nodesMsg n = "top-level nodes: [ " <> namesToTxt n <>
                 " ] already exist in current graphql schema. HINT: Top-level nodes can't be same."
    namesToTxt = T.intercalate ", " . map G.unName
    builtinNodes = ["__type", "__schema", "__typename"]
    builtinTy = [ "__Directive"
                , "__DirectiveLocation"
                , "__EnumValue"
                , "__Field"
                , "__InputValue"
                , "__Schema"
                , "__Type"
                , "__TypeKind"
                , "Int"
                , "Float"
                , "String"
                , "Boolean"
                , "ID"
                ]

checkConflictingNode
  :: (MonadError QErr m)
  => TypeMap
  -- ^ See 'GCtx'.
  -> G.Name
  -> m ()
checkConflictingNode typeMap node = do
  let hQR = _otiFields <$>
        (getObjTyM =<< Map.lookup hQRName typeMap)
      hMR = _otiFields <$>
        (getObjTyM =<< Map.lookup hMRName typeMap)
      hRoots = Map.keys <$> mergeMaybeMaps hQR hMR
  case hRoots of
    Just hR ->
      when (node `elem` hR) $
        throw400 RemoteSchemaConflicts msg
    _ -> return ()
  where
    hQRName = queryRootNamedType
    hMRName = mutationRootNamedType
    msg = "node " <> G.unName node <>
          " already exists in current graphql schema"

mergeMaybeMaps
  :: (Eq k, Hashable k)
  => Maybe (Map.HashMap k v)
  -> Maybe (Map.HashMap k v)
  -> Maybe (Map.HashMap k v)
mergeMaybeMaps m1 m2 = case (m1, m2) of
  (Nothing, Nothing)   -> Nothing
  (Just m1', Nothing)  -> Just m1'
  (Nothing, Just m2')  -> Just m2'
  (Just m1', Just m2') -> Just $ Map.union m1' m2'
server/src-lib/Hasura/GraphQL/Schema/Mutation.hs (new file, 531 lines)
@@ -0,0 +1,531 @@
{-# LANGUAGE ViewPatterns #-}

module Hasura.GraphQL.Schema.Mutation
  ( insertIntoTable
  , insertOneIntoTable
  , updateTable
  , updateTableByPk
  , deleteFromTable
  , deleteFromTableByPk
  ) where

import           Data.Has
import           Hasura.Prelude

import qualified Data.HashMap.Strict                 as Map
import qualified Data.HashMap.Strict.InsOrd.Extended as OMap
import qualified Data.HashSet                        as Set
import qualified Data.Text                           as T
import qualified Language.GraphQL.Draft.Syntax       as G

import qualified Hasura.GraphQL.Parser               as P
import qualified Hasura.RQL.DML.Delete.Types         as RQL
import qualified Hasura.RQL.DML.Insert.Types         as RQL
import qualified Hasura.RQL.DML.Returning.Types      as RQL
import qualified Hasura.RQL.DML.Update               as RQL
import qualified Hasura.RQL.DML.Update.Types         as RQL
import qualified Hasura.SQL.DML                      as S

import           Hasura.GraphQL.Parser               (FieldParser, InputFieldsParser, Kind (..),
                                                      Parser, UnpreparedValue (..), mkParameter)
import           Hasura.GraphQL.Parser.Class
import           Hasura.GraphQL.Schema.BoolExp
import           Hasura.GraphQL.Schema.Common
import           Hasura.GraphQL.Schema.Insert
import           Hasura.GraphQL.Schema.Select
import           Hasura.GraphQL.Schema.Table
import           Hasura.RQL.Types
import           Hasura.SQL.Types


-- insert

-- | Construct a root field, normally called insert_tablename, that can be used to add several rows to a DB table
insertIntoTable
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> InsPermInfo          -- ^ insert permissions of the table
  -> Maybe SelPermInfo    -- ^ select permissions of the table (if any)
  -> Maybe UpdPermInfo    -- ^ update permissions of the table (if any)
  -> m (FieldParser n (AnnInsert UnpreparedValue))
insertIntoTable table fieldName description insertPerms selectPerms updatePerms = do
  columns         <- tableColumns table
  selectionParser <- mutationSelectionSet table selectPerms
  objectsParser   <- P.list <$> tableFieldsInput table insertPerms
  conflictParser  <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms
  let objectsName = $$(G.litName "objects")
      objectsDesc = "the rows to be inserted"
      argsParser = do
        conflictClause <- mkConflictClause conflictParser
        objects <- P.field objectsName (Just objectsDesc) objectsParser
        pure (conflictClause, objects)
  pure $ P.subselection fieldName description argsParser selectionParser
    <&> \((conflictClause, objects), output) -> AnnInsert (G.unName fieldName) False
      ( mkInsertObject objects table columns conflictClause insertPerms updatePerms
      , RQL.MOutMultirowFields output
      )
|
||||
|
||||
mkConflictClause :: MonadParse n => Maybe (Parser 'Input n a) -> InputFieldsParser n (Maybe a)
mkConflictClause conflictParser
  = maybe
    (pure Nothing)  -- empty 'InputFieldsParser' (no arguments allowed)
    (P.fieldOptional conflictName (Just conflictDesc))
    conflictParser
  where
    conflictName = $$(G.litName "on_conflict")
    conflictDesc = "on conflict condition"

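The `fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms` pattern used throughout this module collapses two layers of `Maybe`: `conflictObject` runs in the schema monad and may itself decline to produce a parser, and it is only invoked when update permissions exist. A minimal, self-contained sketch of the idiom, with `Identity` standing in for the schema monad (all names here are illustrative, not the Hasura ones):

```haskell
import Control.Monad (join)
import Data.Functor.Identity (Identity (..))

-- Maybe perm -> Maybe (m (Maybe parser)) -> m (Maybe (Maybe parser)) -> m (Maybe parser):
-- fmap over the permissions, sequence the monad outwards, then flatten the
-- two Maybe layers with join.
collapse :: Monad m => (perm -> m (Maybe parser)) -> Maybe perm -> m (Maybe parser)
collapse f mPerm = fmap join $ sequenceA $ f <$> mPerm

main :: IO ()
main = do
  let f n = Identity (if n > (0 :: Int) then Just "parser" else Nothing)
  print $ runIdentity $ collapse f (Just 1)  -- Just "parser"
  print $ runIdentity $ collapse f (Just 0)  -- Nothing
  print $ runIdentity $ collapse f Nothing   -- Nothing
```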
-- | Variant of 'insertIntoTable' that inserts a single row.
insertOneIntoTable
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> InsPermInfo          -- ^ insert permissions of the table
  -> SelPermInfo          -- ^ select permissions of the table
  -> Maybe UpdPermInfo    -- ^ update permissions of the table (if any)
  -> m (FieldParser n (AnnInsert UnpreparedValue))
insertOneIntoTable table fieldName description insertPerms selectPerms updatePerms = do
  columns         <- tableColumns table
  selectionParser <- tableSelectionSet table selectPerms
  objectParser    <- tableFieldsInput table insertPerms
  conflictParser  <- fmap join $ sequenceA $ conflictObject table (Just selectPerms) <$> updatePerms
  let objectName = $$(G.litName "object")
      objectDesc = "the row to be inserted"
      argsParser = do
        conflictClause <- mkConflictClause conflictParser
        object         <- P.field objectName (Just objectDesc) objectParser
        pure (conflictClause, object)
  pure $ P.subselection fieldName description argsParser selectionParser
    <&> \((conflictClause, object), output) -> AnnInsert (G.unName fieldName) True
      ( mkInsertObject [object] table columns conflictClause insertPerms updatePerms
      , RQL.MOutSinglerowObject output
      )

-- | We specify the data of an individual row to insert through this input parser.
tableFieldsInput
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable  -- ^ qualified name of the table
  -> InsPermInfo     -- ^ insert permissions of the table
  -> m (Parser 'Input n (AnnInsObj UnpreparedValue))
tableFieldsInput table insertPerms = memoizeOn 'tableFieldsInput table do
  tableName    <- qualifiedObjectToName table
  allFields    <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table
  objectFields <- catMaybes <$> for (Map.elems allFields) \case
    FIComputedField _      -> pure Nothing
    FIRemoteRelationship _ -> pure Nothing
    FIColumn columnInfo ->
      whenMaybe (Set.member (pgiColumn columnInfo) (ipiCols insertPerms)) do
        let columnName = pgiName columnInfo
            columnDesc = pgiDescription columnInfo
        fieldParser <- P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo)
        pure $ P.fieldOptional columnName columnDesc fieldParser `mapField`
          \(mkParameter -> value) -> AnnInsObj [(pgiColumn columnInfo, value)] [] []
    FIRelationship relationshipInfo -> runMaybeT $ do
      let otherTable = riRTable relationshipInfo
          relName    = riName   relationshipInfo
      permissions  <- MaybeT $ tablePermissions otherTable
      relFieldName <- lift $ textToName $ relNameToTxt relName
      insPerms     <- MaybeT $ pure $ _permIns permissions
      let selPerms = _permSel permissions
          updPerms = _permUpd permissions
      lift $ case riType relationshipInfo of
        ObjRel -> do
          parser <- objectRelationshipInput otherTable insPerms selPerms updPerms
          pure $ P.fieldOptional relFieldName Nothing parser `mapField`
            \objRelIns -> AnnInsObj [] [RelIns objRelIns relationshipInfo] []
        ArrRel -> do
          parser <- P.nullable <$> arrayRelationshipInput otherTable insPerms selPerms updPerms
          pure $ P.fieldOptional relFieldName Nothing parser <&> \arrRelIns -> do
            rel <- join arrRelIns
            Just $ AnnInsObj [] [] [RelIns rel relationshipInfo | not $ null $ _aiInsObj rel]
  let objectName = tableName <> $$(G.litName "_insert_input")
      objectDesc = G.Description $ "input type for inserting data into table " <>> table
  pure $ P.object objectName (Just objectDesc) $ catMaybes <$> sequenceA objectFields
    <&> mconcat

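Each field parser in `tableFieldsInput` yields a "singleton" `AnnInsObj` (one column, one object relationship, or one array relationship), and the final `mconcat` merges them into one value. A minimal sketch of why that works, with a hypothetical simplified stand-in for `AnnInsObj` whose three lists merge pointwise:

```haskell
-- Hypothetical stand-in for AnnInsObj: columns, object relationships,
-- array relationships. Merging is pointwise list concatenation, so each
-- field can contribute a singleton and the object parser just mconcats.
data InsObj = InsObj [String] [String] [String] deriving (Eq, Show)

instance Semigroup InsObj where
  InsObj a b c <> InsObj x y z = InsObj (a ++ x) (b ++ y) (c ++ z)

instance Monoid InsObj where
  mempty = InsObj [] [] []

main :: IO ()
main = print $ mconcat
  [ InsObj ["name"] [] []      -- a column field
  , InsObj [] ["author"] []    -- an object relationship
  , InsObj [] [] ["articles"]  -- an array relationship
  ]
  -- InsObj ["name"] ["author"] ["articles"]
```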
-- | Used by 'tableFieldsInput' for object data that is nested through object relationships.
objectRelationshipInput
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> InsPermInfo
  -> Maybe SelPermInfo
  -> Maybe UpdPermInfo
  -> m (Parser 'Input n (SingleObjIns UnpreparedValue))
objectRelationshipInput table insertPerms selectPerms updatePerms =
  memoizeOn 'objectRelationshipInput table do
    tableName      <- qualifiedObjectToName table
    columns        <- tableColumns table
    objectParser   <- tableFieldsInput table insertPerms
    conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms
    let objectName  = $$(G.litName "data")
        inputName   = tableName <> $$(G.litName "_obj_rel_insert_input")
        inputDesc   = G.Description $ "input type for inserting object relation for remote table " <>> table
        inputParser = do
          conflictClause <- mkConflictClause conflictParser
          object         <- P.field objectName Nothing objectParser
          pure $ mkInsertObject object table columns conflictClause insertPerms updatePerms
    pure $ P.object inputName (Just inputDesc) inputParser

-- | Used by 'tableFieldsInput' for object data that is nested through array relationships.
arrayRelationshipInput
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> InsPermInfo
  -> Maybe SelPermInfo
  -> Maybe UpdPermInfo
  -> m (Parser 'Input n (MultiObjIns UnpreparedValue))
arrayRelationshipInput table insertPerms selectPerms updatePerms =
  memoizeOn 'arrayRelationshipInput table do
    tableName      <- qualifiedObjectToName table
    columns        <- tableColumns table
    objectParser   <- tableFieldsInput table insertPerms
    conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms
    let objectsName = $$(G.litName "data")
        inputName   = tableName <> $$(G.litName "_arr_rel_insert_input")
        inputDesc   = G.Description $ "input type for inserting array relation for remote table " <>> table
        inputParser = do
          conflictClause <- mkConflictClause conflictParser
          objects        <- P.field objectsName Nothing $ P.list objectParser
          pure $ mkInsertObject objects table columns conflictClause insertPerms updatePerms
    pure $ P.object inputName (Just inputDesc) inputParser

mkInsertObject
  :: a
  -> QualifiedTable
  -> [PGColumnInfo]
  -> Maybe (RQL.ConflictClauseP1 UnpreparedValue)
  -> InsPermInfo
  -> Maybe UpdPermInfo
  -> AnnIns a UnpreparedValue
mkInsertObject objects table columns conflictClause insertPerms updatePerms =
  AnnIns { _aiInsObj         = objects
         , _aiTableName      = table
         , _aiConflictClause = conflictClause
         , _aiCheckCond      = (insertCheck, updateCheck)
         , _aiTableCols      = columns
         , _aiDefVals        = defaultValues
         }
  where insertCheck = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ ipiCheck insertPerms
        updateCheck = fmapAnnBoolExp partialSQLExpToUnpreparedValue <$> (upiCheck =<< updatePerms)
        defaultValues = Map.union (partialSQLExpToUnpreparedValue <$> ipiSet insertPerms)
          $ fmap UVLiteral $ S.mkColDefValMap $ map pgiColumn columns

-- | Specifies the @ON CONFLICT@ SQL clause.
conflictObject
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> Maybe SelPermInfo
  -> UpdPermInfo
  -> m (Maybe (Parser 'Input n (RQL.ConflictClauseP1 UnpreparedValue)))
conflictObject table selectPerms updatePerms = runMaybeT $ do
  tableName        <- lift $ qualifiedObjectToName table
  columnsEnum      <- MaybeT $ tableUpdateColumnsEnum table updatePerms
  constraints      <- MaybeT $ tciUniqueOrPrimaryKeyConstraints . _tiCoreInfo <$> askTableInfo table
  constraintParser <- lift $ conflictConstraint constraints table
  whereExpParser   <- lift $ boolExp table selectPerms
  let objectName     = tableName <> $$(G.litName "_on_conflict")
      objectDesc     = G.Description $ "on conflict condition type for table " <>> table
      constraintName = $$(G.litName "constraint")
      columnsName    = $$(G.litName "update_columns")
      whereExpName   = $$(G.litName "where")
      fieldsParser = do
        constraint <- RQL.CTConstraint <$> P.field constraintName Nothing constraintParser
        columns    <- P.field columnsName Nothing $ P.list columnsEnum
        whereExp   <- P.fieldOptional whereExpName Nothing whereExpParser
        pure $ case columns of
          [] -> RQL.CP1DoNothing $ Just constraint
          _  -> RQL.CP1Update constraint columns preSetColumns $
            BoolAnd $ catMaybes [whereExp, Just $ fmapAnnBoolExp partialSQLExpToUnpreparedValue $ upiFilter updatePerms]
  pure $ P.object objectName (Just objectDesc) fieldsParser
  where preSetColumns = partialSQLExpToUnpreparedValue <$> upiSet updatePerms

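The conflict-clause parser chooses between the two upsert actions based on `update_columns`: an empty list means `ON CONFLICT ... DO NOTHING`, a non-empty list means an update restricted to those columns. A simplified sketch of that decision, using hypothetical stand-ins for `RQL.ConflictClauseP1`:

```haskell
-- Hypothetical, simplified stand-in for RQL.ConflictClauseP1.
data ConflictClause
  = DoNothing (Maybe String)  -- ON CONFLICT ... DO NOTHING
  | Update String [String]    -- ON CONFLICT ... DO UPDATE SET <columns>
  deriving (Eq, Show)

mkClause :: String -> [String] -> ConflictClause
mkClause constraint columns = case columns of
  [] -> DoNothing (Just constraint)
  _  -> Update constraint columns

main :: IO ()
main = do
  print $ mkClause "users_pkey" []        -- DoNothing (Just "users_pkey")
  print $ mkClause "users_pkey" ["name"]  -- Update "users_pkey" ["name"]
```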
conflictConstraint
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => NonEmpty Constraint
  -> QualifiedTable
  -> m (Parser 'Both n ConstraintName)
conflictConstraint constraints table = memoizeOn 'conflictConstraint table $ do
  tableName <- qualifiedObjectToName table
  constraintEnumValues <- for constraints \constraint -> do
    name <- textToName $ getConstraintTxt $ _cName constraint
    pure ( P.mkDefinition name (Just "unique or primary key constraint") P.EnumValueInfo
         , _cName constraint
         )
  let enumName = tableName <> $$(G.litName "_constraint")
      enumDesc = G.Description $ "unique or primary key constraints on table " <>> table
  pure $ P.enum enumName (Just enumDesc) constraintEnumValues



-- update

-- | Construct a root field, normally called @update_tablename@, that can be
-- used to update rows in a DB table specified by filters. Only returns a
-- parser if there are columns the user is allowed to update; otherwise returns
-- Nothing.
updateTable
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> UpdPermInfo          -- ^ update permissions of the table
  -> Maybe SelPermInfo    -- ^ select permissions of the table (if any)
  -> m (Maybe (FieldParser n (RQL.AnnUpdG UnpreparedValue)))
updateTable table fieldName description updatePerms selectPerms = runMaybeT $ do
  let whereName = $$(G.litName "where")
      whereDesc = "filter the rows which have to be updated"
  opArgs    <- MaybeT $ updateOperators table updatePerms
  columns   <- lift $ tableColumns table
  whereArg  <- lift $ P.field whereName (Just whereDesc) <$> boolExp table selectPerms
  selection <- lift $ mutationSelectionSet table selectPerms
  let argsParser = liftA2 (,) opArgs whereArg
  pure $ P.subselection fieldName description argsParser selection
    <&> mkUpdateObject table columns updatePerms . fmap RQL.MOutMultirowFields

-- | Construct a root field, normally called @update_tablename_by_pk@, that can
-- be used to update a single row in a DB table, specified by primary key.
-- Only returns a parser if there are columns the user is allowed to update and
-- if the user has select permissions on all primary keys; otherwise returns
-- Nothing.
updateTableByPk
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> UpdPermInfo          -- ^ update permissions of the table
  -> SelPermInfo          -- ^ select permissions of the table
  -> m (Maybe (FieldParser n (RQL.AnnUpdG UnpreparedValue)))
updateTableByPk table fieldName description updatePerms selectPerms = runMaybeT $ do
  tableName <- qualifiedObjectToName table
  columns   <- lift $ tableSelectColumns table selectPerms
  pkArgs    <- MaybeT $ primaryKeysArguments table selectPerms
  opArgs    <- MaybeT $ updateOperators table updatePerms
  selection <- lift $ tableSelectionSet table selectPerms
  let pkFieldName  = $$(G.litName "pk_columns")
      pkObjectName = tableName <> $$(G.litName "_pk_columns_input")
      pkObjectDesc = G.Description $ "primary key columns input for table: " <> G.unName tableName
      argsParser = do
        operators   <- opArgs
        primaryKeys <- P.field pkFieldName Nothing $ P.object pkObjectName (Just pkObjectDesc) pkArgs
        pure (operators, primaryKeys)
  pure $ P.subselection fieldName description argsParser selection
    <&> mkUpdateObject table columns updatePerms . fmap RQL.MOutSinglerowObject

mkUpdateObject
  :: QualifiedTable
  -> [PGColumnInfo]
  -> UpdPermInfo
  -> ( ( [(PGCol, RQL.UpdOpExpG UnpreparedValue)]
       , AnnBoolExp UnpreparedValue
       )
     , RQL.MutationOutputG UnpreparedValue
     )
  -> RQL.AnnUpdG UnpreparedValue
mkUpdateObject table columns updatePerms ((opExps, whereExp), mutationOutput) =
  RQL.AnnUpd { RQL.uqp1Table   = table
             , RQL.uqp1OpExps  = opExps
             , RQL.uqp1Where   = (permissionFilter, whereExp)
             , RQL.uqp1Check   = checkExp
             , RQL.uqp1Output  = mutationOutput
             , RQL.uqp1AllCols = columns
             }
  where
    permissionFilter = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ upiFilter updatePerms
    checkExp = maybe annBoolExpTrue (fmapAnnBoolExp partialSQLExpToUnpreparedValue) $ upiCheck updatePerms

-- | Various update operators
updateOperators
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => QualifiedTable  -- ^ qualified name of the table
  -> UpdPermInfo     -- ^ update permissions of the table
  -> m (Maybe (InputFieldsParser n [(PGCol, RQL.UpdOpExpG UnpreparedValue)]))
updateOperators table updatePermissions = do
  tableName <- qualifiedObjectToName table
  columns   <- tableUpdateColumns table updatePermissions
  let numericCols = onlyNumCols   columns
      jsonCols    = onlyJSONBCols columns
  parsers <- catMaybes <$> sequenceA
    [ updateOperator tableName $$(G.litName "_set")
        columnParser RQL.UpdSet columns
        "sets the columns of the filtered rows to the given values"
        (G.Description $ "input type for updating data in table " <>> table)

    , updateOperator tableName $$(G.litName "_inc")
        columnParser RQL.UpdInc numericCols
        "increments the numeric columns with given value of the filtered values"
        (G.Description $ "input type for incrementing numeric columns in table " <>> table)

    , let desc = "prepend existing jsonb value of filtered columns with new jsonb value"
      in updateOperator tableName $$(G.litName "_prepend")
         columnParser RQL.UpdPrepend jsonCols desc desc

    , let desc = "append existing jsonb value of filtered columns with new jsonb value"
      in updateOperator tableName $$(G.litName "_append")
         columnParser RQL.UpdAppend jsonCols desc desc

    , let desc = "delete key/value pair or string element. key/value pairs are matched based on their key value"
      in updateOperator tableName $$(G.litName "_delete_key")
         nullableTextParser RQL.UpdDeleteKey jsonCols desc desc

    , let desc = "delete the array element with specified index (negative integers count from the end). "
              <> "throws an error if top level container is not an array"
      in updateOperator tableName $$(G.litName "_delete_elem")
         nonNullableIntParser RQL.UpdDeleteElem jsonCols desc desc

    , let desc = "delete the field or element with specified path (for JSON arrays, negative integers count from the end)"
      in updateOperator tableName $$(G.litName "_delete_at_path")
         (fmap P.list . nonNullableTextParser) RQL.UpdDeleteAtPath jsonCols desc desc
    ]
  whenMaybe (not $ null parsers) do
    let allowedOperators = fst <$> parsers
    pure $ fmap catMaybes (sequenceA $ snd <$> parsers)
      `P.bindFields` \opExps -> do
        -- there needs to be at least one operator in the update, even if it is empty
        let presetColumns = Map.toList $ RQL.UpdSet . partialSQLExpToUnpreparedValue <$> upiSet updatePermissions
        when (null opExps && null presetColumns) $ parseError $
          "at least any one of " <> T.intercalate ", " allowedOperators <> " is expected"

        -- no column should appear twice
        let flattenedExps = concat opExps
            erroneousExps = OMap.filter ((>1) . length) $ OMap.groupTuples flattenedExps
        unless (OMap.null erroneousExps) $ parseError $
          "column found in multiple operators; " <>
          T.intercalate ". " [ dquote column <> " in " <> T.intercalate ", " (toList $ RQL.updateOperatorText <$> ops)
                             | (column, ops) <- OMap.toList erroneousExps
                             ]

        pure $ presetColumns <> flattenedExps
  where
    columnParser columnInfo = fmap P.mkParameter <$> P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo)
    nonNullableTextParser _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGText)    (G.Nullability False)
    nullableTextParser    _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGText)    (G.Nullability True)
    nonNullableIntParser  _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGInteger) (G.Nullability False)

    updateOperator
      :: G.Name
      -> G.Name
      -> (PGColumnInfo -> m (Parser 'Both n a))
      -> (a -> RQL.UpdOpExpG UnpreparedValue)
      -> [PGColumnInfo]
      -> G.Description
      -> G.Description
      -> m (Maybe (Text, InputFieldsParser n (Maybe [(PGCol, RQL.UpdOpExpG UnpreparedValue)])))
    updateOperator tableName opName mkParser updOpExp columns opDesc objDesc =
      whenMaybe (not $ null columns) do
        fields <- for columns \columnInfo -> do
          let fieldName = pgiName columnInfo
              fieldDesc = pgiDescription columnInfo
          fieldParser <- mkParser columnInfo
          pure $ P.fieldOptional fieldName fieldDesc fieldParser
            `mapField` \value -> (pgiColumn columnInfo, updOpExp value)
        let objName = tableName <> opName <> $$(G.litName "_input")
        pure $ (G.unName opName,)
          $ P.fieldOptional opName (Just opDesc)
          $ P.object objName (Just objDesc)
          $ catMaybes <$> sequenceA fields

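The duplicate-column check in `updateOperators` groups every `(column, operator)` pair by column and rejects any column claimed by more than one operator. A minimal sketch of the grouping, using `Data.Map` as a stand-in for the insertion-ordered map (`OMap`) used in the source:

```haskell
import qualified Data.Map.Strict as Map

-- Group (column, operator) pairs by column; any column mapped to more than
-- one operator is erroneous. fromListWith (flip (++)) preserves the order
-- in which operators were seen for each column.
duplicates :: [(String, String)] -> [(String, [String])]
duplicates exps =
  Map.toList $ Map.filter ((> 1) . length) $
    Map.fromListWith (flip (++)) [ (col, [op]) | (col, op) <- exps ]

main :: IO ()
main = print $ duplicates [("age", "_set"), ("age", "_inc"), ("name", "_set")]
  -- [("age",["_set","_inc"])]
```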


-- delete

-- | Construct a root field, normally called @delete_tablename@, that can be
-- used to delete several rows from a DB table.
deleteFromTable
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> DelPermInfo          -- ^ delete permissions of the table
  -> Maybe SelPermInfo    -- ^ select permissions of the table (if any)
  -> m (FieldParser n (RQL.AnnDelG UnpreparedValue))
deleteFromTable table fieldName description deletePerms selectPerms = do
  let whereName = $$(G.litName "where")
      whereDesc = "filter the rows which have to be deleted"
  whereArg  <- P.field whereName (Just whereDesc) <$> boolExp table selectPerms
  selection <- mutationSelectionSet table selectPerms
  columns   <- tableColumns table
  pure $ P.subselection fieldName description whereArg selection
    <&> mkDeleteObject table columns deletePerms . fmap RQL.MOutMultirowFields

-- | Construct a root field, normally called @delete_tablename_by_pk@, that can
-- be used to delete an individual row from a DB table, specified by primary key.
deleteFromTableByPk
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable       -- ^ qualified name of the table
  -> G.Name               -- ^ field display name
  -> Maybe G.Description  -- ^ field description, if any
  -> DelPermInfo          -- ^ delete permissions of the table
  -> SelPermInfo          -- ^ select permissions of the table
  -> m (Maybe (FieldParser n (RQL.AnnDelG UnpreparedValue)))
deleteFromTableByPk table fieldName description deletePerms selectPerms = runMaybeT $ do
  columns   <- lift $ tableSelectColumns table selectPerms
  pkArgs    <- MaybeT $ primaryKeysArguments table selectPerms
  selection <- lift $ tableSelectionSet table selectPerms
  pure $ P.subselection fieldName description pkArgs selection
    <&> mkDeleteObject table columns deletePerms . fmap RQL.MOutSinglerowObject

mkDeleteObject
  :: QualifiedTable
  -> [PGColumnInfo]
  -> DelPermInfo
  -> (AnnBoolExp UnpreparedValue, RQL.MutationOutputG UnpreparedValue)
  -> RQL.AnnDelG UnpreparedValue
mkDeleteObject table columns deletePerms (whereExp, mutationOutput) =
  RQL.AnnDel { RQL.dqp1Table   = table
             , RQL.dqp1Where   = (permissionFilter, whereExp)
             , RQL.dqp1Output  = mutationOutput
             , RQL.dqp1AllCols = columns
             }
  where
    permissionFilter = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ dpiFilter deletePerms



-- common

-- | All mutations allow returning results, such as what the updated database
-- rows look like. This parser allows a query to specify what data to fetch.
mutationSelectionSet
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
  => QualifiedTable
  -> Maybe SelPermInfo
  -> m (Parser 'Output n (RQL.MutFldsG UnpreparedValue))
mutationSelectionSet table selectPerms =
  memoizeOn 'mutationSelectionSet table do
    tableName <- qualifiedObjectToName table
    returning <- runMaybeT do
      permissions <- MaybeT $ pure selectPerms
      tableSet    <- lift $ tableSelectionList table permissions
      let returningName = $$(G.litName "returning")
          returningDesc = "data from the rows affected by the mutation"
      pure $ RQL.MRet <$> P.subselection_ returningName (Just returningDesc) tableSet
    let affectedRowsName = $$(G.litName "affected_rows")
        affectedRowsDesc = "number of rows affected by the mutation"
        selectionName    = tableName <> $$(G.litName "_mutation_response")
        selectionDesc    = G.Description $ "response of any mutation on the table " <>> table

        selectionFields  = catMaybes
          [ Just $ RQL.MCount <$
              P.selection_ affectedRowsName (Just affectedRowsDesc) P.int
          , returning
          ]
    pure $ P.selectionSet selectionName (Just selectionDesc) selectionFields
      <&> parsedSelectionsToFields RQL.MExp

-- | How to specify a database row by primary key.
primaryKeysArguments
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => QualifiedTable
  -> SelPermInfo
  -> m (Maybe (InputFieldsParser n (AnnBoolExp UnpreparedValue)))
primaryKeysArguments table selectPerms = runMaybeT $ do
  primaryKeys <- MaybeT $ _tciPrimaryKey . _tiCoreInfo <$> askTableInfo table
  let columns = _pkColumns primaryKeys
  guard $ all (\c -> pgiColumn c `Set.member` spiCols selectPerms) columns
  lift $ fmap (BoolAnd . toList) . sequenceA <$> for columns \columnInfo -> do
    field <- P.column (pgiType columnInfo) (G.Nullability False)
    pure $ BoolFld . AVCol columnInfo . pure . AEQ True . mkParameter <$>
      P.field (pgiName columnInfo) (pgiDescription columnInfo) field
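`primaryKeysArguments` conjoins one equality check per primary-key column into a single `BoolAnd`, and only produces a value when every column's argument is present. A simplified sketch of that shape, with a hypothetical `BoolExp` stand-in (not the Hasura type):

```haskell
-- Hypothetical stand-in: each primary-key column contributes one equality
-- check; traverse fails the whole argument set if any column is missing,
-- and the successes are conjoined under BoolAnd.
data BoolExp = BoolAnd [BoolExp] | Eq String Int deriving (Eq, Show)

pkBoolExp :: [(String, Maybe Int)] -> Maybe BoolExp
pkBoolExp cols = BoolAnd <$> traverse (\(col, mVal) -> Eq col <$> mVal) cols

main :: IO ()
main = do
  print $ pkBoolExp [("id", Just 1), ("org_id", Just 2)]
  print $ pkBoolExp [("id", Just 1), ("org_id", Nothing)]
```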
@@ -1,51 +0,0 @@
module Hasura.GraphQL.Schema.Mutation.Common
  ( mkPGColInp
  , mkMutRespTy
  , mkMutRespObj
  ) where

import qualified Data.HashMap.Strict           as Map
import qualified Data.HashSet                  as Set
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Schema.Common
import           Hasura.GraphQL.Validate.Types
import           Hasura.Prelude
import           Hasura.RQL.Types
import           Hasura.SQL.Types

mkPGColInp :: PGColumnInfo -> InpValInfo
mkPGColInp ci =
  InpValInfo Nothing (pgiName ci) Nothing $ G.toGT $ mkColumnType $ pgiType ci

-- table_mutation_response
mkMutRespTy :: QualifiedTable -> G.NamedType
mkMutRespTy tn =
  G.NamedType $ qualObjectToName tn <> "_mutation_response"

{-
type table_mutation_response {
  affected_rows: Int!
  returning: [table!]!
}
-}
mkMutRespObj
  :: QualifiedTable
  -> Bool -- is sel perm defined
  -> ObjTyInfo
mkMutRespObj tn sel =
  mkHsraObjTyInfo (Just objDesc) (mkMutRespTy tn) Set.empty $ mapFromL _fiName
    $ affectedRowsFld : bool [] [returningFld] sel
  where
    objDesc = G.Description $
      "response of any mutation on the table " <>> tn
    affectedRowsFld =
      mkHsraObjFldInfo (Just desc) "affected_rows" Map.empty $
        G.toGT $ G.toNT $ mkScalarTy PGInteger
      where
        desc = "number of affected rows by the mutation"
    returningFld =
      mkHsraObjFldInfo (Just desc) "returning" Map.empty $
        G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkTableTy tn
      where
        desc = "data of the affected rows by the mutation"
@@ -1,57 +0,0 @@
module Hasura.GraphQL.Schema.Mutation.Delete
  ( mkDelMutFld
  , mkDeleteByPkMutationField
  ) where

import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Schema.BoolExp
import           Hasura.GraphQL.Schema.Common
import           Hasura.GraphQL.Schema.Mutation.Common
import           Hasura.GraphQL.Validate.Types
import           Hasura.Prelude
import           Hasura.RQL.Types
import           Hasura.SQL.Types

{-

delete_table(
  where : table_bool_exp!
): table_mutation_response

-}

mkDelMutFld :: Maybe G.Name -> QualifiedTable -> ObjFldInfo
mkDelMutFld mCustomName tn =
  mkHsraObjFldInfo (Just desc) fldName (fromInpValL [filterArg]) $
    G.toGT $ mkMutRespTy tn
  where
    desc = G.Description $ "delete data from the table: " <>> tn

    defFldName = "delete_" <> qualObjectToName tn
    fldName = fromMaybe defFldName mCustomName

    filterArgDesc = "filter the rows which have to be deleted"
    filterArg =
      InpValInfo (Just filterArgDesc) "where" Nothing $ G.toGT $
        G.toNT $ mkBoolExpTy tn

{-
delete_table_by_pk(
  col1: col-ty1!
  col2: col-ty2!
): table
-}

mkDeleteByPkMutationField
  :: Maybe G.Name
  -> QualifiedTable
  -> PrimaryKey PGColumnInfo
  -> ObjFldInfo
mkDeleteByPkMutationField mCustomName qt primaryKey =
  mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputArgs) $
    G.toGT $ mkTableTy qt
  where
    description = G.Description $ "delete single row from the table: " <>> qt
    fieldName = flip fromMaybe mCustomName $ "delete_" <> qualObjectToName qt <> "_by_pk"
    inputArgs = map mkColumnInputVal $ toList $ _pkColumns primaryKey
@@ -1,236 +0,0 @@
module Hasura.GraphQL.Schema.Mutation.Insert
  ( mkInsInp
  , mkInsInpTy
  , mkRelInsInps
  , mkInsMutFld
  , mkInsertOneMutationField
  , mkOnConflictTypes
  ) where

import qualified Data.HashMap.Strict           as Map
import qualified Language.GraphQL.Draft.Syntax as G

import           Hasura.GraphQL.Resolve.Types
import           Hasura.GraphQL.Schema.BoolExp
import           Hasura.GraphQL.Schema.Common
import           Hasura.GraphQL.Schema.Mutation.Common
import           Hasura.GraphQL.Validate.Types
import           Hasura.Prelude
import           Hasura.RQL.Types
import           Hasura.SQL.Types

-- table_insert_input
mkInsInpTy :: QualifiedTable -> G.NamedType
mkInsInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_insert_input"

-- table_obj_rel_insert_input
mkObjInsInpTy :: QualifiedTable -> G.NamedType
mkObjInsInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_obj_rel_insert_input"

-- table_arr_rel_insert_input
mkArrInsInpTy :: QualifiedTable -> G.NamedType
mkArrInsInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_arr_rel_insert_input"


-- table_on_conflict
mkOnConflictInpTy :: QualifiedTable -> G.NamedType
mkOnConflictInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_on_conflict"

-- table_constraint
mkConstraintInpTy :: QualifiedTable -> G.NamedType
mkConstraintInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_constraint"

-- table_update_column
mkUpdColumnInpTy :: QualifiedTable -> G.NamedType
mkUpdColumnInpTy tn =
  G.NamedType $ qualObjectToName tn <> "_update_column"

{-
input table_obj_rel_insert_input {
  data: table_insert_input!
  on_conflict: table_on_conflict
}
-}

{-
input table_arr_rel_insert_input {
  data: [table_insert_input!]!
  on_conflict: table_on_conflict
}
-}

mkRelInsInps
  :: QualifiedTable -> Bool -> [InpObjTyInfo]
mkRelInsInps tn upsertAllowed = [objRelInsInp, arrRelInsInp]
  where
    onConflictInpVal =
      InpValInfo Nothing "on_conflict" Nothing $ G.toGT $ mkOnConflictInpTy tn

    onConflictInp = bool [] [onConflictInpVal] upsertAllowed

    objRelDesc = G.Description $
      "input type for inserting object relation for remote table " <>> tn

    objRelDataInp = InpValInfo Nothing "data" Nothing $ G.toGT $
                    G.toNT $ mkInsInpTy tn
    objRelInsInp = mkHsraInpTyInfo (Just objRelDesc) (mkObjInsInpTy tn)
                   $ fromInpValL $ objRelDataInp : onConflictInp

    arrRelDesc = G.Description $
      "input type for inserting array relation for remote table " <>> tn

    arrRelDataInp = InpValInfo Nothing "data" Nothing $ G.toGT $
                    G.toNT $ G.toLT $ G.toNT $ mkInsInpTy tn
    arrRelInsInp = mkHsraInpTyInfo (Just arrRelDesc) (mkArrInsInpTy tn)
                   $ fromInpValL $ arrRelDataInp : onConflictInp

{-

input table_insert_input {
  col1: colty1
  .
  .
  coln: coltyn
}

-}

mkInsInp
  :: QualifiedTable -> [PGColumnInfo] -> RelationInfoMap -> InpObjTyInfo
mkInsInp tn insCols relInfoMap =
  mkHsraInpTyInfo (Just desc) (mkInsInpTy tn) $ fromInpValL $
    map mkPGColInp insCols <> relInps
  where
    desc = G.Description $
      "input type for inserting data into table " <>> tn

    relInps = flip map (Map.toList relInfoMap) $
      \(relName, relInfo) ->
        let remoteQT = riRTable relInfo
            tyMaker = case riType relInfo of
              ObjRel -> mkObjInsInpTy
              ArrRel -> mkArrInsInpTy
        in InpValInfo Nothing (mkRelName relName) Nothing $
             G.toGT $ tyMaker remoteQT


{-

input table_on_conflict {
  constraint: table_constraint!
  update_columns: [table_column!]
  where: table_bool_exp
}

-}

mkOnConflictInp :: QualifiedTable -> InpObjTyInfo
mkOnConflictInp tn =
  mkHsraInpTyInfo (Just desc) (mkOnConflictInpTy tn) $ fromInpValL
    [constraintInpVal, updateColumnsInpVal, whereInpVal]
  where
    desc = G.Description $
      "on conflict condition type for table " <>> tn

    constraintInpVal = InpValInfo Nothing (G.Name "constraint") Nothing $
      G.toGT $ G.toNT $ mkConstraintInpTy tn

    updateColumnsInpVal = InpValInfo Nothing (G.Name "update_columns") Nothing $
      G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkUpdColumnInpTy tn

    whereInpVal = InpValInfo Nothing (G.Name "where") Nothing $
      G.toGT $ mkBoolExpTy tn

{-

insert_table(
  objects: [table_insert_input!]!
  on_conflict: table_on_conflict
): table_mutation_response!
-}

mkInsMutFld :: Maybe G.Name -> QualifiedTable -> Bool -> ObjFldInfo
mkInsMutFld mCustomName tn isUpsertable =
  mkHsraObjFldInfo (Just desc) fldName (fromInpValL inputVals) $
    G.toGT $ mkMutRespTy tn
  where
    inputVals = catMaybes [Just objectsArg, mkOnConflictInputVal tn isUpsertable]
    desc = G.Description $
      "insert data into the table: " <>> tn

    defFldName = "insert_" <> qualObjectToName tn
    fldName = fromMaybe defFldName mCustomName

    objsArgDesc = "the rows to be inserted"
    objectsArg =
      InpValInfo (Just objsArgDesc) "objects" Nothing $ G.toGT $
        G.toNT $ G.toLT $ G.toNT $ mkInsInpTy tn

mkConstraintTy :: QualifiedTable -> [ConstraintName] -> EnumTyInfo
|
||||
mkConstraintTy tn cons = enumTyInfo
|
||||
where
|
||||
enumTyInfo = mkHsraEnumTyInfo (Just desc) (mkConstraintInpTy tn) $
|
||||
EnumValuesSynthetic . mapFromL _eviVal $ map mkConstraintEnumVal cons
|
||||
|
||||
desc = G.Description $
|
||||
"unique or primary key constraints on table " <>> tn
|
||||
|
||||
mkConstraintEnumVal (ConstraintName n) =
|
||||
EnumValInfo (Just "unique or primary key constraint")
|
||||
(G.EnumValue $ G.Name n) False
|
||||
|
||||
mkUpdColumnTy :: QualifiedTable -> [G.Name] -> EnumTyInfo
|
||||
mkUpdColumnTy tn cols = enumTyInfo
|
||||
where
|
||||
enumTyInfo = mkHsraEnumTyInfo (Just desc) (mkUpdColumnInpTy tn) $
|
||||
EnumValuesSynthetic . mapFromL _eviVal $ map mkColumnEnumVal cols
|
||||
|
||||
desc = G.Description $
|
||||
"update columns of table " <>> tn
|
||||
|
||||
mkOnConflictTypes
|
||||
:: QualifiedTable -> [ConstraintName] -> [G.Name] -> Bool -> [TypeInfo]
|
||||
mkOnConflictTypes tn uniqueOrPrimaryCons cols =
|
||||
bool [] tyInfos
|
||||
where
|
||||
tyInfos = [ TIEnum $ mkConstraintTy tn uniqueOrPrimaryCons
|
||||
, TIEnum $ mkUpdColumnTy tn cols
|
||||
, TIInpObj $ mkOnConflictInp tn
|
||||
]
|
||||
|
||||
mkOnConflictInputVal :: QualifiedTable -> Bool -> Maybe InpValInfo
|
||||
mkOnConflictInputVal qt =
|
||||
bool Nothing (Just onConflictArg)
|
||||
where
|
||||
onConflictDesc = "on conflict condition"
|
||||
onConflictArg = InpValInfo (Just onConflictDesc) "on_conflict"
|
||||
Nothing $ G.toGT $ mkOnConflictInpTy qt
|
||||
|
||||
|
||||
{-
|
||||
insert_table_one(
|
||||
object: table_insert_input!
|
||||
on_conflict: table_on_conflict
|
||||
): table
|
||||
-}
|
||||
|
||||
mkInsertOneMutationField :: Maybe G.Name -> QualifiedTable -> Bool -> ObjFldInfo
|
||||
mkInsertOneMutationField mCustomName qt isUpsertable =
|
||||
mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputVals) $
|
||||
G.toGT $ mkTableTy qt
|
||||
where
|
||||
description = G.Description $ "insert a single row into the table: " <>> qt
|
||||
|
||||
fieldName = flip fromMaybe mCustomName $ "insert_" <> qualObjectToName qt <> "_one"
|
||||
|
||||
inputVals = catMaybes [Just objectArg, mkOnConflictInputVal qt isUpsertable]
|
||||
|
||||
objectArgDesc = "the row to be inserted"
|
||||
objectArg = InpValInfo (Just objectArgDesc) "object" Nothing $ G.toGT $
|
||||
G.toNT $ mkInsInpTy qt
|
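The upsert gating in `mkRelInsInps`, `mkOnConflictTypes`, and `mkOnConflictInputVal` leans on `Data.Bool.bool`, which returns its first argument when the flag is `False` and its second when `True`. A minimal standalone sketch of the same idiom, using a hypothetical `Arg` type in place of `InpValInfo`:

```haskell
import Data.Bool (bool)

-- Stand-in for InpValInfo: just a named argument (illustrative only).
newtype Arg = Arg String deriving (Eq, Show)

-- Mirrors mkOnConflictInputVal: expose "on_conflict" only when upserts are allowed.
optionalOnConflict :: Bool -> Maybe Arg
optionalOnConflict = bool Nothing (Just (Arg "on_conflict"))

-- Mirrors onConflictInp in mkRelInsInps: empty or singleton argument list.
onConflictArgs :: Bool -> [Arg]
onConflictArgs = bool [] [Arg "on_conflict"]

main :: IO ()
main = do
  print (optionalOnConflict False) -- Nothing
  print (onConflictArgs True)      -- [Arg "on_conflict"]
```

The same pattern then composes with `catMaybes`, as in `mkInsMutFld`, so the optional argument simply disappears from the field's input list when upserts are not permitted.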
@@ -1,301 +0,0 @@
module Hasura.GraphQL.Schema.Mutation.Update
  ( mkUpdSetInp
  , mkUpdIncInp
  , mkUpdJSONOpInp
  , mkUpdSetTy
  , mkUpdMutFld
  , mkPKeyColumnsInpObj
  , mkUpdateByPkMutationField
  ) where

import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Schema.BoolExp
import Hasura.GraphQL.Schema.Common
import Hasura.GraphQL.Schema.Mutation.Common
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.SQL.Types

-- table_set_input
mkUpdSetTy :: QualifiedTable -> G.NamedType
mkUpdSetTy tn =
  G.NamedType $ qualObjectToName tn <> "_set_input"

{-
input table_set_input {
  col1: colty1
  .
  .
  coln: coltyn
}
-}
mkUpdSetInp
  :: QualifiedTable -> [PGColumnInfo] -> InpObjTyInfo
mkUpdSetInp tn cols =
  mkHsraInpTyInfo (Just desc) (mkUpdSetTy tn) $
  fromInpValL $ map mkPGColInp cols
  where
    desc = G.Description $
      "input type for updating data in table " <>> tn

-- table_inc_input
mkUpdIncTy :: QualifiedTable -> G.NamedType
mkUpdIncTy tn =
  G.NamedType $ qualObjectToName tn <> "_inc_input"

{-
input table_inc_input {
  integer-col1: int
  .
  .
  integer-coln: int
}
-}

mkUpdIncInp
  :: QualifiedTable -> Maybe [PGColumnInfo] -> Maybe InpObjTyInfo
mkUpdIncInp tn = maybe Nothing mkType
  where
    mkType cols = let numCols = onlyNumCols cols
                      incObjTy =
                        mkHsraInpTyInfo (Just desc) (mkUpdIncTy tn) $
                        fromInpValL $ map mkPGColInp numCols
                  in bool (Just incObjTy) Nothing $ null numCols
    desc = G.Description $
      "input type for incrementing integer column in table " <>> tn

-- table_<json-op>_input
mkJSONOpTy :: QualifiedTable -> G.Name -> G.NamedType
mkJSONOpTy tn op =
  G.NamedType $ qualObjectToName tn <> op <> "_input"

-- json ops are _concat, _delete_key, _delete_elem, _delete_at_path
{-
input table_concat_input {
  jsonb-col1: json
  .
  .
  jsonb-coln: json
}
-}

{-
input table_delete_key_input {
  jsonb-col1: string
  .
  .
  jsonb-coln: string
}
-}

{-
input table_delete_elem_input {
  jsonb-col1: int
  .
  .
  jsonb-coln: int
}
-}

{-
input table_delete_at_path_input {
  jsonb-col1: [string]
  .
  .
  jsonb-coln: [string]
}
-}

-- jsonb operators and descriptions
prependOp :: G.Name
prependOp = "_prepend"

prependDesc :: G.Description
prependDesc = "prepend existing jsonb value of filtered columns with new jsonb value"

appendOp :: G.Name
appendOp = "_append"

appendDesc :: G.Description
appendDesc = "append existing jsonb value of filtered columns with new jsonb value"

deleteKeyOp :: G.Name
deleteKeyOp = "_delete_key"

deleteKeyDesc :: G.Description
deleteKeyDesc = "delete key/value pair or string element."
                <> " key/value pairs are matched based on their key value"

deleteElemOp :: G.Name
deleteElemOp = "_delete_elem"

deleteElemDesc :: G.Description
deleteElemDesc = "delete the array element with specified index (negative integers count from the end)."
                 <> " throws an error if top level container is not an array"

deleteAtPathOp :: G.Name
deleteAtPathOp = "_delete_at_path"

deleteAtPathDesc :: G.Description
deleteAtPathDesc = "delete the field or element with specified path"
                   <> " (for JSON arrays, negative integers count from the end)"

mkUpdJSONOpInp
  :: QualifiedTable -> [PGColumnInfo] -> [InpObjTyInfo]
mkUpdJSONOpInp tn cols = bool inpObjs [] $ null jsonbCols
  where
    jsonbCols = onlyJSONBCols cols
    jsonbColNames = map pgiName jsonbCols

    inpObjs = [ prependInpObj, appendInpObj, deleteKeyInpObj
              , deleteElemInpObj, deleteAtPathInpObj
              ]

    appendInpObj =
      mkHsraInpTyInfo (Just appendDesc) (mkJSONOpTy tn appendOp) $
      fromInpValL $ map mkPGColInp jsonbCols

    prependInpObj =
      mkHsraInpTyInfo (Just prependDesc) (mkJSONOpTy tn prependOp) $
      fromInpValL $ map mkPGColInp jsonbCols

    deleteKeyInpObj =
      mkHsraInpTyInfo (Just deleteKeyDesc) (mkJSONOpTy tn deleteKeyOp) $
      fromInpValL $ map deleteKeyInpVal jsonbColNames
    deleteKeyInpVal n =
      InpValInfo Nothing n Nothing $ G.toGT $ G.NamedType "String"

    deleteElemInpObj =
      mkHsraInpTyInfo (Just deleteElemDesc) (mkJSONOpTy tn deleteElemOp) $
      fromInpValL $ map deleteElemInpVal jsonbColNames
    deleteElemInpVal n =
      InpValInfo Nothing n Nothing $ G.toGT $ G.NamedType "Int"

    deleteAtPathInpObj =
      mkHsraInpTyInfo (Just deleteAtPathDesc) (mkJSONOpTy tn deleteAtPathOp) $
      fromInpValL $ map deleteAtPathInpVal jsonbColNames
    deleteAtPathInpVal n =
      InpValInfo Nothing n Nothing $ G.toGT $ G.toLT $ G.NamedType "String"

{-

update_table(
  where : table_bool_exp!
  _set : table_set_input
  _inc : table_inc_input
  _concat: table_concat_input
  _delete_key: table_delete_key_input
  _delete_elem: table_delete_elem_input
  _delete_path_at: table_delete_path_at_input
): table_mutation_response

-}

mkIncInpVal :: QualifiedTable -> [PGColumnInfo] -> Maybe InpValInfo
mkIncInpVal tn cols = bool (Just incArg) Nothing $ null numCols
  where
    numCols = onlyNumCols cols
    incArgDesc = "increments the integer columns with given value of the filtered values"
    incArg =
      InpValInfo (Just incArgDesc) "_inc" Nothing $ G.toGT $ mkUpdIncTy tn

mkJSONOpInpVals :: QualifiedTable -> [PGColumnInfo] -> [InpValInfo]
mkJSONOpInpVals tn cols = bool jsonbOpArgs [] $ null jsonbCols
  where
    jsonbCols = onlyJSONBCols cols
    jsonbOpArgs = [appendArg, prependArg, deleteKeyArg, deleteElemArg, deleteAtPathArg]

    appendArg =
      InpValInfo (Just appendDesc) appendOp Nothing $ G.toGT $ mkJSONOpTy tn appendOp

    prependArg =
      InpValInfo (Just prependDesc) prependOp Nothing $ G.toGT $ mkJSONOpTy tn prependOp

    deleteKeyArg =
      InpValInfo (Just deleteKeyDesc) deleteKeyOp Nothing $
      G.toGT $ mkJSONOpTy tn deleteKeyOp

    deleteElemArg =
      InpValInfo (Just deleteElemDesc) deleteElemOp Nothing $
      G.toGT $ mkJSONOpTy tn deleteElemOp

    deleteAtPathArg =
      InpValInfo (Just deleteAtPathDesc) deleteAtPathOp Nothing $
      G.toGT $ mkJSONOpTy tn deleteAtPathOp

mkUpdateOpInputs :: QualifiedTable -> [PGColumnInfo] -> [InpValInfo]
mkUpdateOpInputs qt cols =
  catMaybes [Just setInp, mkIncInpVal qt cols] <> mkJSONOpInpVals qt cols
  where
    setArgDesc = "sets the columns of the filtered rows to the given values"
    setInp =
      InpValInfo (Just setArgDesc) "_set" Nothing $ G.toGT $ mkUpdSetTy qt


mkUpdMutFld :: Maybe G.Name -> QualifiedTable -> [PGColumnInfo] -> ObjFldInfo
mkUpdMutFld mCustomName tn cols =
  mkHsraObjFldInfo (Just desc) fldName (fromInpValL inputValues) $
  G.toGT $ mkMutRespTy tn
  where
    inputValues = [filterArg] <> mkUpdateOpInputs tn cols
    desc = G.Description $ "update data of the table: " <>> tn

    defFldName = "update_" <> qualObjectToName tn
    fldName = fromMaybe defFldName mCustomName

    filterArgDesc = "filter the rows which have to be updated"
    filterArg =
      InpValInfo (Just filterArgDesc) "where" Nothing $ G.toGT $
      G.toNT $ mkBoolExpTy tn

{-

update_table_by_pk(
  columns: table_pk_columns_input!
  _set : table_set_input
  _inc : table_inc_input
  _concat: table_concat_input
  _delete_key: table_delete_key_input
  _delete_elem: table_delete_elem_input
  _delete_path_at: table_delete_path_at_input
)
-}

{-
input table_pk_columns_input {
  col1: col-ty1!
  col2: col-ty2!
}

where col1, col2 are primary key columns
-}

mkPKeyColumnsInpTy :: QualifiedTable -> G.NamedType
mkPKeyColumnsInpTy qt =
  G.NamedType $ qualObjectToName qt <> "_pk_columns_input"

mkPKeyColumnsInpObj :: QualifiedTable -> PrimaryKey PGColumnInfo -> InpObjTyInfo
mkPKeyColumnsInpObj qt primaryKey =
  mkHsraInpTyInfo (Just description) (mkPKeyColumnsInpTy qt) $
  fromInpValL $ map mkColumnInputVal $ toList $ _pkColumns primaryKey
  where
    description = G.Description $ "primary key columns input for table: " <>> qt

mkUpdateByPkMutationField
  :: Maybe G.Name
  -> QualifiedTable
  -> [PGColumnInfo]
  -> PrimaryKey PGColumnInfo
  -> ObjFldInfo
mkUpdateByPkMutationField mCustomName qt cols _ =
  mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputArgs) $
  G.toGT $ mkTableTy qt
  where
    description = G.Description $ "update single row of the table: " <>> qt
    fieldName = flip fromMaybe mCustomName $ "update_" <> qualObjectToName qt <> "_by_pk"

    inputArgs = pure primaryKeyColumnsInp <> mkUpdateOpInputs qt cols
    primaryKeyColumnsInp =
      InpValInfo Nothing "pk_columns" Nothing $ G.toGT $ G.toNT $ mkPKeyColumnsInpTy qt
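The generated GraphQL type names in this module are plain text concatenations on the table name (`mkUpdSetTy`, `mkUpdIncTy`, `mkJSONOpTy`). A hypothetical standalone sketch of the same naming scheme, using `String` in place of `QualifiedTable`/`G.NamedType`:

```haskell
-- Stand-ins for qualObjectToName / G.NamedType: plain Strings (illustrative only).
mkSetInputName, mkIncInputName :: String -> String
mkSetInputName tbl = tbl ++ "_set_input"
mkIncInputName tbl = tbl ++ "_inc_input"

-- Mirrors mkJSONOpTy: the op (e.g. "_prepend") carries its own leading underscore.
mkJSONOpName :: String -> String -> String
mkJSONOpName tbl op = tbl ++ op ++ "_input"

main :: IO ()
main = mapM_ putStrLn
  [ mkSetInputName "author"          -- author_set_input
  , mkIncInputName "author"          -- author_inc_input
  , mkJSONOpName "author" "_prepend" -- author_prepend_input
  ]
```

This is why `mkJSONOpTy` can splice the operator directly between the table name and the `"_input"` suffix without inserting a separator.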
@@ -1,178 +1,171 @@
module Hasura.GraphQL.Schema.OrderBy
  ( mkOrdByTy
  , ordByEnumTy
  , mkOrdByInpObj
  , mkTabAggOrdByInpObj
  , mkTabAggregateOpOrdByInpObjs
  ( orderByExp
  ) where

import qualified Data.HashMap.Strict as Map
import Hasura.Prelude

import qualified Data.List.NonEmpty as NE
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.Types
import qualified Hasura.GraphQL.Parser as P
import qualified Hasura.RQL.DML.Select as RQL
import Hasura.RQL.Types as RQL
import Hasura.SQL.DML as SQL

import Hasura.GraphQL.Parser (InputFieldsParser, Kind (..), Parser,
                              UnpreparedValue)
import Hasura.GraphQL.Parser.Class
import Hasura.GraphQL.Schema.Common
import Hasura.GraphQL.Validate.Types
import Hasura.Prelude
import Hasura.RQL.Types
import Hasura.GraphQL.Schema.Table
import Hasura.SQL.Types

ordByTy :: G.NamedType
ordByTy = G.NamedType "order_by"

ordByEnumTy :: EnumTyInfo
ordByEnumTy =
  mkHsraEnumTyInfo (Just desc) ordByTy $
  EnumValuesSynthetic . mapFromL _eviVal $ map mkEnumVal enumVals
-- | Corresponds to an object type for an order by.
--
-- > input table_order_by {
-- >   col1: order_by
-- >   col2: order_by
-- >   .     .
-- >   .     .
-- >   coln: order_by
-- >   obj-rel: <remote-table>_order_by
-- > }
orderByExp
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> SelPermInfo
  -> m (Parser 'Input n [RQL.AnnOrderByItemG UnpreparedValue])
orderByExp table selectPermissions = memoizeOn 'orderByExp table $ do
  name <- qualifiedObjectToName table <&> (<> $$(G.litName "_order_by"))
  let description = G.Description $
        "Ordering options when selecting data from " <> table <<> "."
  tableFields  <- tableSelectFields table selectPermissions
  fieldParsers <- sequenceA . catMaybes <$> traverse mkField tableFields
  pure $ concat . catMaybes <$> P.object name (Just description) fieldParsers
  where
    desc = G.Description "column ordering options"
    mkEnumVal (n, d) =
      EnumValInfo (Just d) (G.EnumValue n) False
    enumVals =
      [ ( "asc"
        , "in the ascending order, nulls last"
        )
      , ( "asc_nulls_last"
        , "in the ascending order, nulls last"
        )
      , ( "asc_nulls_first"
        , "in the ascending order, nulls first"
        )
      , ( "desc"
        , "in the descending order, nulls first"
        )
      , ( "desc_nulls_first"
        , "in the descending order, nulls first"
        )
      , ( "desc_nulls_last"
        , "in the descending order, nulls last"
        )
      ]
    mkField
      :: FieldInfo
      -> m (Maybe (InputFieldsParser n (Maybe [RQL.AnnOrderByItemG UnpreparedValue])))
    mkField fieldInfo = runMaybeT $
      case fieldInfo of
        FIColumn columnInfo -> do
          let fieldName = pgiName columnInfo
          pure $ P.fieldOptional fieldName Nothing orderByOperator
            <&> fmap (pure . mkOrderByItemG (RQL.AOCColumn columnInfo)) . join
        FIRelationship relationshipInfo -> do
          let remoteTable = riRTable relationshipInfo
          fieldName <- MaybeT $ pure $ G.mkName $ relNameToTxt $ riName relationshipInfo
          perms <- MaybeT $ tableSelectPermissions remoteTable
          let newPerms = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ spiFilter perms
          case riType relationshipInfo of
            ObjRel -> do
              otherTableParser <- lift $ orderByExp remoteTable perms
              pure $ do
                otherTableOrderBy <- join <$> P.fieldOptional fieldName Nothing (P.nullable otherTableParser)
                pure $ fmap (map $ fmap $ RQL.AOCObjectRelation relationshipInfo newPerms) otherTableOrderBy
            ArrRel -> do
              let aggregateFieldName = fieldName <> $$(G.litName "_aggregate")
              aggregationParser <- lift $ orderByAggregation remoteTable perms
              pure $ do
                aggregationOrderBy <- join <$> P.fieldOptional aggregateFieldName Nothing (P.nullable aggregationParser)
                pure $ fmap (map $ fmap $ RQL.AOCArrayAggregation relationshipInfo newPerms) aggregationOrderBy
        FIComputedField _ -> empty
        FIRemoteRelationship _ -> empty

mkTabAggregateOpOrdByTy :: QualifiedTable -> G.Name -> G.NamedType
mkTabAggregateOpOrdByTy tn op =
  G.NamedType $ qualObjectToName tn <> "_" <> op <> "_order_by"

{-
input table_<op>_order_by {
  col1: order_by
  .     .
  .     .
}
-}

mkTabAggregateOpOrdByInpObjs
  :: QualifiedTable
  -> ([PGColumnInfo], [G.Name])
  -> ([PGColumnInfo], [G.Name])
  -> [InpObjTyInfo]
mkTabAggregateOpOrdByInpObjs tn (numCols, numericAggregateOps) (compCols, compareAggregateOps) =
  mapMaybe (mkInpObjTyM numCols) numericAggregateOps
  <> mapMaybe (mkInpObjTyM compCols) compareAggregateOps
-- local definitions

type OrderInfo = (SQL.OrderType, SQL.NullsOrder)


orderByAggregation
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> SelPermInfo
  -> m (Parser 'Input n [OrderByItemG RQL.AnnAggregateOrderBy])
orderByAggregation table selectPermissions = do
  -- WIP NOTE
  -- there is heavy duplication between this and Select.tableAggregationFields
  -- it might be worth putting some of it in common, just to avoid issues when
  -- we change one but not the other?
  tableName  <- qualifiedObjectToName table
  allColumns <- tableSelectColumns table selectPermissions
  let numColumns  = onlyNumCols allColumns
      compColumns = onlyComparableCols allColumns
      numFields   = catMaybes <$> traverse mkField numColumns
      compFields  = catMaybes <$> traverse mkField compColumns
      aggFields   = fmap (concat . catMaybes . concat) $ sequenceA $ catMaybes
        [ -- count
          Just $ P.fieldOptional $$(G.litName "count") Nothing orderByOperator
            <&> pure . fmap (pure . mkOrderByItemG RQL.AAOCount) . join
        , -- operators on numeric columns
          if null numColumns then Nothing else Just $
            for numericAggOperators \operator ->
              parseOperator operator tableName numFields
        , -- operators on comparable columns
          if null compColumns then Nothing else Just $
            for comparisonAggOperators \operator ->
              parseOperator operator tableName compFields
        ]
  let objectName  = tableName <> $$(G.litName "_aggregate_order_by")
      description = G.Description $ "order by aggregate values of table " <>> table
  pure $ P.object objectName (Just description) aggFields
  where
    mkField :: PGColumnInfo -> InputFieldsParser n (Maybe (PGColumnInfo, OrderInfo))
    mkField columnInfo =
      P.fieldOptional (pgiName columnInfo) (pgiDescription columnInfo) orderByOperator
        <&> fmap (columnInfo,) . join

    mkDesc (G.Name op) =
      G.Description $ "order by " <> op <> "() on columns of table " <>> tn
    parseOperator
      :: G.Name
      -> G.Name
      -> InputFieldsParser n [(PGColumnInfo, OrderInfo)]
      -> InputFieldsParser n (Maybe [OrderByItemG RQL.AnnAggregateOrderBy])
    parseOperator operator tableName columns =
      let opText     = G.unName operator
          objectName = tableName <> $$(G.litName "_") <> operator <> $$(G.litName "_order_by")
          objectDesc = Just $ G.Description $ "order by " <> opText <> "() on columns of table " <>> table
      in P.fieldOptional operator Nothing (P.object objectName objectDesc columns)
         `mapField` map (\(col, info) -> mkOrderByItemG (RQL.AAOOp opText col) info)

    mkInpObjTyM cols op = bool (Just $ mkInpObjTy cols op) Nothing $ null cols
    mkInpObjTy cols op =
      mkHsraInpTyInfo (Just $ mkDesc op) (mkTabAggregateOpOrdByTy tn op) $
      fromInpValL $ map mkColInpVal cols

    mkColInpVal ci = InpValInfo Nothing (pgiName ci) Nothing $ G.toGT
                     ordByTy

mkTabAggOrdByTy :: QualifiedTable -> G.NamedType
mkTabAggOrdByTy tn =
  G.NamedType $ qualObjectToName tn <> "_aggregate_order_by"

{-
input table_aggregate_order_by {
  count: order_by
  <op-name>: table_<op-name>_order_by
}
-}

mkTabAggOrdByInpObj
  :: QualifiedTable
  -> ([PGColumnInfo], [G.Name])
  -> ([PGColumnInfo], [G.Name])
  -> InpObjTyInfo
mkTabAggOrdByInpObj tn (numCols, numericAggregateOps) (compCols, compareAggregateOps) =
  mkHsraInpTyInfo (Just desc) (mkTabAggOrdByTy tn) $ fromInpValL $
  numOpOrdBys <> compOpOrdBys <> [countInpVal]
orderByOperator :: MonadParse m => Parser 'Both m (Maybe OrderInfo)
orderByOperator =
  P.nullable $ P.enum $$(G.litName "order_by") (Just "column ordering options") $ NE.fromList
    [ ( define $$(G.litName "asc") "in ascending order, nulls last"
      , (SQL.OTAsc, SQL.NLast)
      )
    , ( define $$(G.litName "asc_nulls_first") "in ascending order, nulls first"
      , (SQL.OTAsc, SQL.NFirst)
      )
    , ( define $$(G.litName "asc_nulls_last") "in ascending order, nulls last"
      , (SQL.OTAsc, SQL.NLast)
      )
    , ( define $$(G.litName "desc") "in descending order, nulls first"
      , (SQL.OTDesc, SQL.NFirst)
      )
    , ( define $$(G.litName "desc_nulls_first") "in descending order, nulls first"
      , (SQL.OTDesc, SQL.NFirst)
      )
    , ( define $$(G.litName "desc_nulls_last") "in descending order, nulls last"
      , (SQL.OTDesc, SQL.NLast)
      )
    ]
  where
    desc = G.Description $
      "order by aggregate values of table " <>> tn
    define name desc = P.mkDefinition name (Just desc) P.EnumValueInfo

    numOpOrdBys  = bool (map mkInpValInfo numericAggregateOps) [] $ null numCols
    compOpOrdBys = bool (map mkInpValInfo compareAggregateOps) [] $ null compCols
    mkInpValInfo op = InpValInfo Nothing op Nothing $ G.toGT $
                      mkTabAggregateOpOrdByTy tn op

    countInpVal = InpValInfo Nothing "count" Nothing $ G.toGT ordByTy

mkOrdByTy :: QualifiedTable -> G.NamedType
mkOrdByTy tn =
  G.NamedType $ qualObjectToName tn <> "_order_by"
-- local helpers

{-
input table_order_by {
  col1: order_by
  col2: order_by
  .     .
  .     .
  coln: order_by
  obj-rel: <remote-table>_order_by
}
-}
mkOrderByItemG :: a -> OrderInfo -> OrderByItemG a
mkOrderByItemG column (orderType, nullsOrder) =
  OrderByItemG { obiType   = Just $ RQL.OrderType orderType
               , obiColumn = column
               , obiNulls  = Just $ RQL.NullsOrder nullsOrder
               }

mkOrdByInpObj
  :: QualifiedTable -> [SelField] -> (InpObjTyInfo, OrdByCtx)
mkOrdByInpObj tn selFlds = (inpObjTy, ordByCtx)
  where
    inpObjTy =
      mkHsraInpTyInfo (Just desc) namedTy $ fromInpValL $
      map mkColOrdBy pgColumnFields <> map mkObjRelOrdBy objRels
      <> mapMaybe mkArrayAggregateSelectOrdBy arrRels

    namedTy = mkOrdByTy tn
    desc = G.Description $
      "ordering options when selecting data from " <>> tn

    pgColumnFields = getPGColumnFields selFlds
    relFltr ty = flip filter (getRelationshipFields selFlds) $
      \rf -> riType (_rfiInfo rf) == ty
    objRels = relFltr ObjRel
    arrRels = relFltr ArrRel

    mkColOrdBy columnInfo =
      InpValInfo Nothing (pgiName columnInfo) Nothing $ G.toGT ordByTy
    mkObjRelOrdBy relationshipField =
      let ri = _rfiInfo relationshipField
      in InpValInfo Nothing (mkRelName $ riName ri) Nothing $
         G.toGT $ mkOrdByTy $ riRTable ri

    mkArrayAggregateSelectOrdBy relationshipField =
      let ri = _rfiInfo relationshipField
          isAggAllowed = _rfiAllowAgg relationshipField
          ivi = InpValInfo Nothing (mkAggRelName $ riName ri) Nothing $
                G.toGT $ mkTabAggOrdByTy $ riRTable ri
      in bool Nothing (Just ivi) isAggAllowed

    ordByCtx = Map.singleton namedTy $ Map.fromList $
      colOrdBys <> relOrdBys <> arrRelOrdBys
    colOrdBys = map (pgiName &&& OBIPGCol) pgColumnFields
    relOrdBys = flip map objRels $
      \relationshipField ->
        let ri   = _rfiInfo relationshipField
            fltr = _rfiPermFilter relationshipField
        in ( mkRelName $ riName ri
           , OBIRel ri fltr
           )

    arrRelOrdBys = flip mapMaybe arrRels $
      \(RelationshipFieldInfo ri isAggAllowed colGNameMap fltr _ _ _) ->
        let obItem = ( mkAggRelName $ riName ri
                     , OBIAgg ri colGNameMap fltr
                     )
        in bool Nothing (Just obItem) isAggAllowed
aliasToName :: G.Name -> FieldName
aliasToName = FieldName . G.unName
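The new `orderByOperator` parser is, at heart, a fixed mapping from six enum value names to `(order type, nulls order)` pairs. A minimal standalone sketch of that mapping, with hypothetical local stand-ins for `SQL.OrderType`/`SQL.NullsOrder`:

```haskell
-- Local stand-ins for SQL.OTAsc/OTDesc and SQL.NFirst/NLast (illustrative only).
data OrderType  = Asc | Desc            deriving (Eq, Show)
data NullsOrder = NullsFirst | NullsLast deriving (Eq, Show)

-- Mirrors the six enum values accepted by orderByOperator; note the defaults:
-- plain "asc" implies nulls last, plain "desc" implies nulls first.
orderByValue :: String -> Maybe (OrderType, NullsOrder)
orderByValue name = lookup name
  [ ("asc",              (Asc,  NullsLast))
  , ("asc_nulls_first",  (Asc,  NullsFirst))
  , ("asc_nulls_last",   (Asc,  NullsLast))
  , ("desc",             (Desc, NullsFirst))
  , ("desc_nulls_first", (Desc, NullsFirst))
  , ("desc_nulls_last",  (Desc, NullsLast))
  ]

main :: IO ()
main = do
  print (orderByValue "desc")  -- Just (Desc,NullsFirst)
  print (orderByValue "bogus") -- Nothing
```

`mkOrderByItemG` then wraps whichever pair was parsed into an `OrderByItemG` alongside the column or relationship being ordered.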
server/src-lib/Hasura/GraphQL/Schema/Remote.hs (new file, 420 lines)
@@ -0,0 +1,420 @@
|
||||
module Hasura.GraphQL.Schema.Remote
|
||||
( buildRemoteParser
|
||||
, remoteFieldFullSchema
|
||||
, inputValueDefinitionParser
|
||||
, lookupObject
|
||||
, lookupType
|
||||
, lookupScalar
|
||||
) where
|
||||
|
||||
import Hasura.Prelude
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.SQL.Types
|
||||
|
||||
import Language.GraphQL.Draft.Syntax as G
|
||||
import qualified Data.List.NonEmpty as NE
|
||||
import Data.Type.Equality
|
||||
import Data.Foldable (sequenceA_)
|
||||
|
||||
import Hasura.GraphQL.Parser as P
|
||||
import qualified Hasura.GraphQL.Parser.Internal.Parser as P
|
||||
|
||||
buildRemoteParser
|
||||
:: forall m n
|
||||
. (MonadSchema n m, MonadError QErr m)
|
||||
=> IntrospectionResult
|
||||
-> RemoteSchemaInfo
|
||||
-> m ( [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
|
||||
, Maybe [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
|
||||
, Maybe [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)])
|
||||
buildRemoteParser (IntrospectionResult sdoc query_root mutation_root subscription_root) info = do
|
||||
queryT <- makeParsers query_root
|
||||
mutationT <- traverse makeParsers mutation_root
|
||||
subscriptionT <- traverse makeParsers subscription_root
|
||||
return (queryT, mutationT, subscriptionT)
|
||||
where
|
||||
makeFieldParser :: G.FieldDefinition -> m (P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable))
|
||||
makeFieldParser fieldDef = do
|
||||
fp <- remoteField' sdoc fieldDef
|
||||
return $ do
|
||||
raw <- P.unsafeRawField (P.fDefinition fp)
|
||||
return (info, raw)
|
||||
makeParsers :: G.Name -> m [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
|
||||
makeParsers rootName =
|
||||
case lookupType sdoc rootName of
|
||||
Just (G.TypeDefinitionObject o) ->
|
||||
traverse makeFieldParser $ _otdFieldsDefinition o
|
||||
_ -> throw400 Unexpected $ rootName <<> " has to be an object type"
|
||||
|
||||
-- | 'remoteFieldFullSchema' takes the 'SchemaIntrospection' and a 'G.Name' and will
|
||||
-- return a 'SelectionSet' parser if the 'G.Name' is found and is a 'TypeDefinitionObject',
|
||||
-- otherwise, an error will be thrown.
|
||||
remoteFieldFullSchema
|
||||
:: forall n m
|
||||
. (MonadSchema n m, MonadError QErr m)
|
  => SchemaIntrospection
  -> G.Name
  -> m (Parser 'Output n (G.SelectionSet NoFragments Variable))
remoteFieldFullSchema sdoc name =
  P.memoizeOn 'remoteFieldFullSchema name do
    fieldObjectType <-
      case lookupType sdoc name of
        Just (G.TypeDefinitionObject o) -> pure o
        _ -> throw400 RemoteSchemaError $ "object with " <> G.unName name <> " not found"
    fieldParser <- remoteSchemaObject sdoc fieldObjectType
    pure $ P.unsafeRawParser (P.pType fieldParser)

remoteField'
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.FieldDefinition
  -> m (FieldParser n ())
remoteField' schemaDoc (G.FieldDefinition description name argsDefinition gType _) =
  let
    addNullableList :: FieldParser n () -> FieldParser n ()
    addNullableList (P.FieldParser (Definition name' un desc (FieldInfo args typ)) parser)
      = P.FieldParser (Definition name' un desc (FieldInfo args (Nullable (TList typ)))) parser

    addNonNullableList :: FieldParser n () -> FieldParser n ()
    addNonNullableList (P.FieldParser (Definition name' un desc (FieldInfo args typ)) parser)
      = P.FieldParser (Definition name' un desc (FieldInfo args (NonNullable (TList typ)))) parser

    -- TODO add directives, deprecation
    convertType :: G.GType -> m (FieldParser n ())
    convertType gType' = do
      case gType' of
        G.TypeNamed (Nullability True) fieldTypeName ->
          P.nullableField <$> remoteFieldFromName schemaDoc name description fieldTypeName argsDefinition
        G.TypeList (Nullability True) gType'' ->
          addNullableList <$> convertType gType''
        G.TypeNamed (Nullability False) fieldTypeName -> do
          P.nullableField <$> remoteFieldFromName schemaDoc name description fieldTypeName argsDefinition
        G.TypeList (Nullability False) gType'' ->
          addNonNullableList <$> convertType gType''
  in convertType gType

-- | 'remoteSchemaObject' returns an output parser for a given 'ObjectTypeDefinition'.
remoteSchemaObject
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.ObjectTypeDefinition
  -> m (Parser 'Output n ())
remoteSchemaObject schemaDoc defn@(G.ObjectTypeDefinition description name interfaces _directives subFields) =
  P.memoizeOn 'remoteSchemaObject defn do
    subFieldParsers <- traverse (remoteField' schemaDoc) subFields
    interfaceDefs <- traverse getInterface interfaces
    implements <- traverse (remoteSchemaInterface schemaDoc) interfaceDefs
    -- TODO: also check sub-interfaces, when these are supported in a future graphql spec
    traverse_ validateImplementsFields interfaceDefs
    pure $ void $ P.selectionSetObject name description subFieldParsers implements
  where
    getInterface :: G.Name -> m (G.InterfaceTypeDefinition [G.Name])
    getInterface interfaceName =
      onNothing (lookupInterface schemaDoc interfaceName) $
        throw400 RemoteSchemaError $ "Could not find interface " <> squote interfaceName
        <> " implemented by Object type " <> squote name
    validateImplementsFields :: G.InterfaceTypeDefinition [G.Name] -> m ()
    validateImplementsFields interface =
      traverse_ (validateImplementsField (_itdName interface)) (G._itdFieldsDefinition interface)
    validateImplementsField :: G.Name -> G.FieldDefinition -> m ()
    validateImplementsField interfaceName interfaceField =
      case lookup (G._fldName interfaceField) (zip (fmap G._fldName subFields) subFields) of
        Nothing -> throw400 RemoteSchemaError $
          "Interface field " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
          <> " expected, but " <> squote name <> " does not provide it"
        Just f -> do
          unless (validateSubType (G._fldType f) (G._fldType interfaceField)) $
            throw400 RemoteSchemaError $
            "The type of Object field " <> squote name <> "." <> dquote (G._fldName f)
            <> " (" <> G.showGT (G._fldType f)
            <> ") is not the same type/sub type of Interface field "
            <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
            <> " (" <> G.showGT (G._fldType interfaceField) <> ")"
          traverse_ (validateArgument (G._fldArgumentsDefinition f)) (G._fldArgumentsDefinition interfaceField)
          traverse_ (validateNoExtraNonNull (G._fldArgumentsDefinition interfaceField)) (G._fldArgumentsDefinition f)
          where
            validateArgument :: G.ArgumentsDefinition -> G.InputValueDefinition -> m ()
            validateArgument objectFieldArgs ifaceArgument =
              case lookup (G._ivdName ifaceArgument) (zip (fmap G._ivdName objectFieldArgs) objectFieldArgs) of
                Nothing ->
                  throw400 RemoteSchemaError $
                  "Interface field argument " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
                  <> "(" <> dquote (G._ivdName ifaceArgument) <> ":) required, but Object field " <> squote name <> "." <> dquote (G._fldName f)
                  <> " does not provide it"
                Just a ->
                  unless (G._ivdType a == G._ivdType ifaceArgument) $
                    throw400 RemoteSchemaError $
                    "Interface field argument " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
                    <> "(" <> dquote (G._ivdName ifaceArgument) <> ":) expects type "
                    <> G.showGT (G._ivdType ifaceArgument)
                    <> ", but " <> squote name <> "." <> dquote (G._fldName f) <> "("
                    <> dquote (G._ivdName ifaceArgument) <> ":) has type "
                    <> G.showGT (G._ivdType a)
            validateNoExtraNonNull :: G.ArgumentsDefinition -> G.InputValueDefinition -> m ()
            validateNoExtraNonNull ifaceArguments objectFieldArg =
              case lookup (G._ivdName objectFieldArg) (zip (fmap G._ivdName ifaceArguments) ifaceArguments) of
                Just _ -> pure ()
                Nothing ->
                  unless (G.isNullable (G._ivdType objectFieldArg)) $
                    throw400 RemoteSchemaError $
                    "Object field argument " <> squote name <> "." <> dquote (G._fldName f) <> "("
                    <> dquote (G._ivdName objectFieldArg) <> ":) is of required type "
                    <> G.showGT (G._ivdType objectFieldArg) <> ", but is not provided by Interface field "
                    <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
    validateSubType :: G.GType -> G.GType -> Bool
    -- TODO this ignores nullability which is probably wrong, even though the GraphQL spec is ambiguous
    validateSubType (G.TypeList _ x) (G.TypeList _ y) = validateSubType x y
    -- It is OK to "upgrade" the strictness
    validateSubType (G.TypeNamed (Nullability False) x) (G.TypeNamed (Nullability True) y) =
      validateSubType (G.TypeNamed (Nullability True) x) (G.TypeNamed (Nullability True) y)
    validateSubType (G.TypeNamed nx x) (G.TypeNamed ny y) =
      case (lookupType schemaDoc x, lookupType schemaDoc y) of
        (Just x', Just y') -> nx == ny && validateSubTypeDefinition x' y'
        _ -> False
    validateSubType _ _ = False
    validateSubTypeDefinition x' y' | x' == y' = True
    validateSubTypeDefinition (TypeDefinitionObject otd) (TypeDefinitionInterface itd)
      = G._otdName otd `elem` G._itdPossibleTypes itd
    validateSubTypeDefinition (TypeDefinitionObject _otd) (TypeDefinitionUnion _utd)
      = True -- TODO write appropriate check (may require saving 'possibleTypes' in Syntax.hs)
    validateSubTypeDefinition _ _ = False

-- | 'remoteSchemaInterface' returns an output parser for a given 'InterfaceTypeDefinition'.
remoteSchemaInterface
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.InterfaceTypeDefinition [G.Name]
  -> m (Parser 'Output n ())
remoteSchemaInterface schemaDoc defn@(G.InterfaceTypeDefinition description name _directives fields possibleTypes) =
  P.memoizeOn 'remoteSchemaObject defn do
    subFieldParsers <- traverse (remoteField' schemaDoc) fields
    objs :: [Parser 'Output n ()] <-
      traverse (getObject >=> remoteSchemaObject schemaDoc) possibleTypes
    -- In the Draft GraphQL spec (> June 2018), interfaces can themselves
    -- implement superinterfaces. In the future, we may need to support this
    -- here.
    when (null subFieldParsers) $
      throw400 RemoteSchemaError $ "List of fields cannot be empty for interface " <> squote name
    -- TODO: another way to obtain 'possibleTypes' is to lookup all the object
    -- types in the schema document that claim to implement this interface. We
    -- should have a check that expresses that that collection of objects is equal
    -- to 'possibleTypes'.
    pure $ void $ P.selectionSetInterface name description subFieldParsers objs
  where
    getObject :: G.Name -> m G.ObjectTypeDefinition
    getObject objectName =
      onNothing (lookupObject schemaDoc objectName) $
        case lookupInterface schemaDoc objectName of
          Nothing -> throw400 RemoteSchemaError $ "Could not find type " <> squote objectName
            <> ", which is defined as a member type of Interface " <> squote name
          Just _ -> throw400 RemoteSchemaError $ "Interface type " <> squote name <>
            " can only include object types. It cannot include " <> squote objectName

-- | 'remoteSchemaUnion' returns an output parser for a given 'UnionTypeDefinition'.
remoteSchemaUnion
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.UnionTypeDefinition
  -> m (Parser 'Output n ())
remoteSchemaUnion schemaDoc defn@(G.UnionTypeDefinition description name _directives objectNames) =
  P.memoizeOn 'remoteSchemaObject defn do
    objDefs <- traverse getObject objectNames
    objs :: [Parser 'Output n ()] <- traverse (remoteSchemaObject schemaDoc) objDefs
    when (null objs) $
      throw400 RemoteSchemaError $ "List of member types cannot be empty for union type " <> squote name
    pure $ void $ P.selectionSetUnion name description objs
  where
    getObject :: G.Name -> m G.ObjectTypeDefinition
    getObject objectName =
      onNothing (lookupObject schemaDoc objectName) $
        case lookupInterface schemaDoc objectName of
          Nothing -> throw400 RemoteSchemaError $ "Could not find type " <> squote objectName
            <> ", which is defined as a member type of Union " <> squote name
          Just _ -> throw400 RemoteSchemaError $ "Union type " <> squote name <>
            " can only include object types. It cannot include " <> squote objectName

-- | remoteSchemaInputObject returns an input parser for a given 'G.InputObjectTypeDefinition'
remoteSchemaInputObject
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.InputObjectTypeDefinition
  -> m (Parser 'Input n ())
remoteSchemaInputObject schemaDoc defn@(G.InputObjectTypeDefinition desc name _ valueDefns) =
  P.memoizeOn 'remoteSchemaInputObject defn do
    argsParser <- argumentsParser valueDefns schemaDoc
    pure $ P.object name desc argsParser

lookupType :: SchemaIntrospection -> G.Name -> Maybe (G.TypeDefinition [G.Name])
lookupType (SchemaIntrospection types) name = find (\tp -> getNamedTyp tp == name) types
  where
    getNamedTyp :: G.TypeDefinition possibleTypes -> G.Name
    getNamedTyp ty = case ty of
      G.TypeDefinitionScalar t -> G._stdName t
      G.TypeDefinitionObject t -> G._otdName t
      G.TypeDefinitionInterface t -> G._itdName t
      G.TypeDefinitionUnion t -> G._utdName t
      G.TypeDefinitionEnum t -> G._etdName t
      G.TypeDefinitionInputObject t -> G._iotdName t

lookupObject :: SchemaIntrospection -> G.Name -> Maybe G.ObjectTypeDefinition
lookupObject (SchemaIntrospection types) name = go types
  where
    go :: [TypeDefinition possibleTypes] -> Maybe G.ObjectTypeDefinition
    go ((G.TypeDefinitionObject t):tps)
      | G._otdName t == name = Just t
      | otherwise = go tps
    go (_:tps) = go tps
    go [] = Nothing

lookupInterface :: SchemaIntrospection -> G.Name -> Maybe (G.InterfaceTypeDefinition [G.Name])
lookupInterface (SchemaIntrospection types) name = go types
  where
    go :: [TypeDefinition possibleTypes] -> Maybe (G.InterfaceTypeDefinition possibleTypes)
    go ((G.TypeDefinitionInterface t):tps)
      | G._itdName t == name = Just t
      | otherwise = go tps
    go (_:tps) = go tps
    go [] = Nothing

lookupScalar :: SchemaIntrospection -> G.Name -> Maybe G.ScalarTypeDefinition
lookupScalar (SchemaIntrospection types) name = go types
  where
    go :: [TypeDefinition possibleTypes] -> Maybe G.ScalarTypeDefinition
    go ((G.TypeDefinitionScalar t):tps)
      | G._stdName t == name = Just t
      | otherwise = go tps
    go (_:tps) = go tps
    go [] = Nothing

-- | 'remoteFieldFromName' accepts a GraphQL name and searches for its definition
-- in the 'SchemaIntrospection'.
remoteFieldFromName
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.Name
  -> Maybe G.Description
  -> G.Name
  -> G.ArgumentsDefinition
  -> m (FieldParser n ())
remoteFieldFromName sdoc fieldName description fieldTypeName argsDefns =
  case lookupType sdoc fieldTypeName of
    Nothing -> throw400 RemoteSchemaError $ "Could not find type with name " <> G.unName fieldTypeName
    Just typeDef -> remoteField sdoc fieldName description argsDefns typeDef

-- | 'inputValueDefinitionParser' accepts a 'G.InputValueDefinition' and will return an
-- 'InputFieldsParser' for it. If a non 'Input' GraphQL type is found in the 'type' of
-- the 'InputValueDefinition' then an error will be thrown.
inputValueDefinitionParser
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => G.SchemaIntrospection
  -> G.InputValueDefinition
  -> m (InputFieldsParser n (Maybe (InputValue Variable)))
inputValueDefinitionParser schemaDoc (G.InputValueDefinition desc name fieldType maybeDefaultVal) =
  let fieldConstructor :: forall k. 'Input <: k => Parser k n () -> InputFieldsParser n (Maybe (InputValue Variable))
      fieldConstructor parser =
        let wrappedParser :: Parser k n (InputValue Variable)
            wrappedParser =
              P.Parser
                { P.pType = P.pType parser
                , P.pParser = \value -> P.pParser parser value $> castWith (P.inputParserInput @k) value
                }
        in case maybeDefaultVal of
          Nothing ->
            if G.isNullable fieldType
            then fieldOptional name desc wrappedParser
            else Just <$> field name desc wrappedParser
          Just defaultVal -> Just <$> fieldWithDefault name desc defaultVal wrappedParser
      doNullability :: forall k. 'Input <: k => G.Nullability -> Parser k n () -> Parser k n ()
      doNullability (G.Nullability True)  = void . P.nullable
      doNullability (G.Nullability False) = id
      buildField
        :: G.GType
        -> (forall k. 'Input <: k => Parser k n () -> InputFieldsParser n (Maybe (InputValue Variable)))
        -> m (InputFieldsParser n (Maybe (InputValue Variable)))
      buildField fieldType' fieldConstructor' = case fieldType' of
        G.TypeNamed nullability typeName ->
          case lookupType schemaDoc typeName of
            Nothing -> throw400 RemoteSchemaError $ "Could not find type with name " <> G.unName typeName
            Just typeDef ->
              case typeDef of
                G.TypeDefinitionScalar (G.ScalarTypeDefinition scalarDesc name' _) ->
                  pure $ fieldConstructor' $ doNullability nullability $ remoteFieldScalarParser name' scalarDesc
                G.TypeDefinitionEnum defn ->
                  pure $ fieldConstructor' $ doNullability nullability $ remoteFieldEnumParser defn
                G.TypeDefinitionObject _ ->
                  -- couldn't find the equivalent error in Validate/Types.hs, so using a new error message
                  throw400 RemoteSchemaError "expected input type, but got output type"
                G.TypeDefinitionInputObject defn ->
                  fieldConstructor' . doNullability nullability <$> remoteSchemaInputObject schemaDoc defn
                G.TypeDefinitionUnion _ -> throw400 RemoteSchemaError "expected input type, but got output type"
                G.TypeDefinitionInterface _ -> throw400 RemoteSchemaError "expected input type, but got output type"
        G.TypeList nullability subType -> buildField subType (fieldConstructor' . doNullability nullability . void . P.list)
  in buildField fieldType fieldConstructor

argumentsParser
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => G.ArgumentsDefinition
  -> G.SchemaIntrospection
  -> m (InputFieldsParser n ())
argumentsParser args schemaDoc =
  sequenceA_ <$> traverse (inputValueDefinitionParser schemaDoc) args

-- | 'remoteField' accepts a 'G.TypeDefinition' and returns a 'FieldParser' for it.
-- Note that the 'G.TypeDefinition' should be of the GraphQL 'Output' kind; when a
-- GraphQL 'Input' kind is provided, an error will be thrown.
remoteField
  :: forall n m
   . (MonadSchema n m, MonadError QErr m)
  => SchemaIntrospection
  -> G.Name
  -> Maybe G.Description
  -> G.ArgumentsDefinition
  -> G.TypeDefinition [G.Name]
  -> m (FieldParser n ()) -- TODO return something useful, maybe?
remoteField sdoc fieldName description argsDefn typeDefn = do
  -- TODO add directives
  argsParser <- argumentsParser argsDefn sdoc
  case typeDefn of
    G.TypeDefinitionObject objTypeDefn -> do
      remoteSchemaObj <- remoteSchemaObject sdoc objTypeDefn
      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
    G.TypeDefinitionScalar (G.ScalarTypeDefinition desc name' _) ->
      pure $ P.selection fieldName description argsParser $ remoteFieldScalarParser name' desc
    G.TypeDefinitionEnum enumTypeDefn ->
      pure $ P.selection fieldName description argsParser $ remoteFieldEnumParser enumTypeDefn
    G.TypeDefinitionInterface ifaceTypeDefn -> do
      remoteSchemaObj <- remoteSchemaInterface sdoc ifaceTypeDefn
      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
    G.TypeDefinitionUnion unionTypeDefn -> do
      remoteSchemaObj <- remoteSchemaUnion sdoc unionTypeDefn
      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
    _ -> throw400 RemoteSchemaError "expected output type, but got input type"

remoteFieldScalarParser
  :: MonadParse n
  => G.Name
  -> Maybe G.Description
  -> Parser 'Both n ()
remoteFieldScalarParser name description =
  case G.unName name of
    "Boolean" -> P.boolean $> ()
    "Int" -> P.int $> ()
    "Float" -> P.float $> ()
    "String" -> P.string $> ()
    "ID" -> P.identifier $> ()
    _ -> P.unsafeRawScalar name description $> ()

remoteFieldEnumParser
  :: MonadParse n
  => G.EnumTypeDefinition
  -> Parser 'Both n ()
remoteFieldEnumParser (G.EnumTypeDefinition desc name _ valueDefns) =
  let enumValDefns = valueDefns <&> \(G.EnumValueDefinition enumDesc enumName _) ->
        (mkDefinition (G.unEnumValue enumName) enumDesc P.EnumValueInfo, ())
  in P.enum name desc $ NE.fromList enumValDefns
(File diff suppressed because it is too large.)

server/src-lib/Hasura/GraphQL/Schema/Table.hs (new file, 171 lines)
@@ -0,0 +1,171 @@
-- | Helper functions for generating the schema of database tables
module Hasura.GraphQL.Schema.Table
  ( tableSelectColumnsEnum
  , tableUpdateColumnsEnum
  , tablePermissions
  , tableSelectPermissions
  , tableUpdatePermissions
  , tableDeletePermissions
  , tableSelectFields
  , tableColumns
  , tableSelectColumns
  , tableUpdateColumns
  ) where

import Hasura.Prelude

import qualified Data.HashMap.Strict as Map
import qualified Data.HashSet as Set
import qualified Language.GraphQL.Draft.Syntax as G

import qualified Hasura.GraphQL.Parser as P

import Hasura.GraphQL.Parser (Kind (..), Parser)
import Hasura.GraphQL.Parser.Class
import Hasura.RQL.DML.Internal (getRolePermInfo)
import Hasura.RQL.Types
import Hasura.SQL.Types

-- | Table select columns enum
--
-- Parser for an enum type that matches the columns of the given
-- table. Used as a parameter for "distinct", among others. Maps to
-- the table_select_column object.
--
-- Return Nothing if there's no column the current user has "select"
-- permissions for.
tableSelectColumnsEnum
  :: (MonadSchema n m, MonadRole r m, MonadTableInfo r m)
  => QualifiedTable
  -> SelPermInfo
  -> m (Maybe (Parser 'Both n PGCol))
tableSelectColumnsEnum table selectPermissions = do
  tableName <- qualifiedObjectToName table
  columns <- tableSelectColumns table selectPermissions
  let enumName = tableName <> $$(G.litName "_select_column")
      description = Just $ G.Description $
        "select columns of table " <>> table
  pure $ P.enum enumName description <$> nonEmpty
    [ ( define $ pgiName column
      , pgiColumn column
      )
    | column <- columns
    ]
  where
    define name =
      P.mkDefinition name (Just $ G.Description "column name") P.EnumValueInfo

-- | Table update columns enum
--
-- Parser for an enum type that matches the columns of the given
-- table. Used for conflict resolution in "insert" mutations, among
-- others. Maps to the table_update_column object.
--
-- Return Nothing if there's no column the current user has "update"
-- permissions for.
tableUpdateColumnsEnum
  :: (MonadSchema n m, MonadRole r m, MonadTableInfo r m)
  => QualifiedTable
  -> UpdPermInfo
  -> m (Maybe (Parser 'Both n PGCol))
tableUpdateColumnsEnum table updatePermissions = do
  tableName <- qualifiedObjectToName table
  columns <- tableUpdateColumns table updatePermissions
  let enumName = tableName <> $$(G.litName "_update_column")
      description = Just $ G.Description $
        "update columns of table " <>> table
  pure $ P.enum enumName description <$> nonEmpty
    [ ( define $ pgiName column
      , pgiColumn column
      )
    | column <- columns
    ]
  where
    define name =
      P.mkDefinition name (Just $ G.Description "column name") P.EnumValueInfo

tablePermissions
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> m (Maybe RolePermInfo)
tablePermissions table = do
  roleName <- askRoleName
  tableInfo <- askTableInfo table
  pure $ getRolePermInfo roleName tableInfo

tableSelectPermissions
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> m (Maybe SelPermInfo)
tableSelectPermissions table = (_permSel =<<) <$> tablePermissions table

tableUpdatePermissions
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> m (Maybe UpdPermInfo)
tableUpdatePermissions table = (_permUpd =<<) <$> tablePermissions table

tableDeletePermissions
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> m (Maybe DelPermInfo)
tableDeletePermissions table = (_permDel =<<) <$> tablePermissions table

tableSelectFields
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> SelPermInfo
  -> m [FieldInfo]
tableSelectFields table permissions = do
  tableFields <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table
  filterM canBeSelected $ Map.elems tableFields
  where
    canBeSelected (FIColumn columnInfo) =
      pure $ Set.member (pgiColumn columnInfo) (spiCols permissions)
    canBeSelected (FIRelationship relationshipInfo) =
      isJust <$> tableSelectPermissions (riRTable relationshipInfo)
    canBeSelected (FIComputedField computedFieldInfo) =
      case _cfiReturnType computedFieldInfo of
        CFRScalar _ ->
          pure $ Set.member (_cfiName computedFieldInfo) $ spiScalarComputedFields permissions
        CFRSetofTable tableName ->
          isJust <$> tableSelectPermissions tableName
    -- TODO (from master): Derive permissions for remote relationships
    canBeSelected (FIRemoteRelationship _) = pure True

tableColumns
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => QualifiedTable
  -> m [PGColumnInfo]
tableColumns table =
  mapMaybe columnInfo . Map.elems . _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table
  where
    columnInfo (FIColumn ci) = Just ci
    columnInfo _ = Nothing

tableSelectColumns
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
  => QualifiedTable
  -> SelPermInfo
  -> m [PGColumnInfo]
tableSelectColumns table permissions =
  mapMaybe columnInfo <$> tableSelectFields table permissions
  where
    columnInfo (FIColumn ci) = Just ci
    columnInfo _ = Nothing

tableUpdateColumns
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
  => QualifiedTable
  -> UpdPermInfo
  -> m [PGColumnInfo]
tableUpdateColumns table permissions = do
  tableFields <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table
  pure $ mapMaybe isUpdatable $ Map.elems tableFields
  where
    isUpdatable (FIColumn columnInfo) =
      if Set.member (pgiColumn columnInfo) (upiCols permissions)
         && not (Map.member (pgiColumn columnInfo) (upiSet permissions))
      then Just columnInfo
      else Nothing
    isUpdatable _ = Nothing
@ -16,7 +16,9 @@ module Hasura.GraphQL.Transport.HTTP
|
||||
import Control.Monad.Morph (hoist)
|
||||
|
||||
import Hasura.EncJSON
|
||||
import Hasura.GraphQL.Context
|
||||
import Hasura.GraphQL.Logging (MonadQueryLog (..))
|
||||
import Hasura.GraphQL.Parser.Column (UnpreparedValue)
|
||||
import Hasura.GraphQL.Transport.HTTP.Protocol
|
||||
import Hasura.HTTP
|
||||
import Hasura.Prelude
|
||||
@ -27,11 +29,12 @@ import Hasura.Server.Version (HasVersion)
|
||||
import Hasura.Session
|
||||
import Hasura.Tracing (MonadTrace, TraceT, trace)
|
||||
|
||||
import qualified Data.Aeson as J
|
||||
import qualified Data.Environment as Env
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Hasura.GraphQL.Execute as E
|
||||
import qualified Hasura.GraphQL.Execute.Query as EQ
|
||||
import qualified Hasura.GraphQL.Resolve as R
|
||||
import qualified Hasura.Logging as L
|
||||
import qualified Hasura.Server.Telemetry.Counters as Telem
|
||||
import qualified Hasura.Tracing as Tracing
|
||||
@ -43,7 +46,7 @@ import qualified Network.Wai.Extended as Wai
|
||||
class Monad m => MonadExecuteQuery m where
|
||||
executeQuery
|
||||
:: GQLReqParsed
|
||||
-> [R.QueryRootFldUnresolved]
|
||||
-> [QueryRootField UnpreparedValue]
|
||||
-> Maybe EQ.GeneratedSqlMap
|
||||
-> PGExecCtx
|
||||
-> Q.TxAccess
|
||||
@ -84,30 +87,61 @@ runGQ env logger reqId userInfo ipAddress reqHeaders queryType reqUnparsed = do
|
||||
-- The response and misc telemetry data:
|
||||
let telemTransport = Telem.HTTP
|
||||
(telemTimeTot_DT, (telemCacheHit, telemLocality, (telemTimeIO_DT, telemQueryType, !resp))) <- withElapsedTime $ do
|
||||
E.ExecutionCtx _ sqlGenCtx pgExecCtx planCache sc scVer httpManager enableAL <- ask
|
||||
E.ExecutionCtx _ sqlGenCtx pgExecCtx {- planCache -} sc scVer httpManager enableAL <- ask
|
||||
|
||||
-- run system authorization on the GraphQL API
|
||||
reqParsed <- E.checkGQLExecution userInfo (reqHeaders, ipAddress) enableAL sc reqUnparsed
|
||||
>>= flip onLeft throwError
|
||||
|
||||
(telemCacheHit, execPlan) <- E.getResolvedExecPlan env logger pgExecCtx planCache
|
||||
(telemCacheHit, execPlan) <- E.getResolvedExecPlan env logger pgExecCtx {- planCache -}
|
||||
userInfo sqlGenCtx sc scVer queryType
|
||||
httpManager reqHeaders (reqUnparsed, reqParsed)
|
||||
case execPlan of
|
||||
E.QueryExecutionPlan queryPlan asts ->
|
||||
case queryPlan of
|
||||
E.ExecStepDB txGenSql -> do
|
||||
(telemTimeIO, telemQueryType, respHdrs, resp) <-
|
||||
runQueryDB reqId (reqUnparsed,reqParsed) asts userInfo txGenSql
|
||||
return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, HttpResponse resp respHdrs))
|
||||
E.ExecStepRemote (rsi, opDef, _varValsM) ->
|
||||
runRemoteGQ telemCacheHit rsi opDef
|
||||
E.ExecStepRaw (name, json) -> do
|
||||
(telemTimeIO, obj) <- withElapsedTime $
|
||||
return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json
|
||||
return (telemCacheHit, Telem.Local, (telemTimeIO, Telem.Query, HttpResponse obj []))
|
||||
E.MutationExecutionPlan mutationPlan ->
|
||||
case mutationPlan of
|
||||
E.ExecStepDB (tx, responseHeaders) -> do
|
||||
(telemTimeIO, telemQueryType, resp) <- runMutationDB reqId reqUnparsed userInfo tx
|
||||
return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, HttpResponse resp responseHeaders))
|
||||
E.ExecStepRemote (rsi, opDef, _varValsM) ->
|
||||
runRemoteGQ telemCacheHit rsi opDef
|
||||
E.ExecStepRaw (name, json) -> do
|
||||
(telemTimeIO, obj) <- withElapsedTime $
|
||||
return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json
|
||||
return (telemCacheHit, Telem.Local, (telemTimeIO, Telem.Query, HttpResponse obj []))
|
||||
E.SubscriptionExecutionPlan _sub ->
|
||||
throw400 UnexpectedPayload "subscriptions are not supported over HTTP, use websockets instead"
|
||||
{-
|
||||
E.GExPHasura resolvedOp -> do
|
||||
(telemTimeIO, telemQueryType, respHdrs, resp) <- runHasuraGQ reqId (reqUnparsed, reqParsed) userInfo resolvedOp
|
||||
return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, HttpResponse resp respHdrs))
|
||||
E.GExPRemote rsi opDef -> do
|
||||
let telemQueryType | G._todType opDef == G.OperationTypeMutation = Telem.Mutation
|
||||
| otherwise = Telem.Query
|
||||
(telemTimeIO, resp) <- E.execRemoteGQ env reqId userInfo reqHeaders reqUnparsed rsi $ G._todType opDef
|
||||
pure (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp))
|
||||
|
||||
(telemTimeIO, resp) <- E.execRemoteGQ reqId userInfo reqHeaders reqUnparsed rsi $ G._todType opDef
|
||||
return (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp))
|
||||
-}
|
||||
let telemTimeIO = convertDuration telemTimeIO_DT
|
||||
telemTimeTot = convertDuration telemTimeTot_DT
|
||||
|
||||
Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..}
|
||||
return resp
|
||||
where
|
||||
runRemoteGQ telemCacheHit rsi opDef = do
|
||||
let telemQueryType | G._todType opDef == G.OperationTypeMutation = Telem.Mutation
|
||||
| otherwise = Telem.Query
|
||||
(telemTimeIO, resp) <- E.execRemoteGQ env reqId userInfo reqHeaders reqUnparsed rsi opDef
|
||||
return (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp))
|
||||
|
||||
-- | Run (execute) a batched GraphQL query (see 'GQLBatchedReqs')
|
||||
runGQBatched
|
@@ -131,7 +165,7 @@ runGQBatched
  -> GQLBatchedReqs GQLQueryText
  -- ^ the batched request with unparsed GraphQL query
  -> m (HttpResponse EncJSON)
runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs queryType query = do
runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs queryType query =
  case query of
    GQLSingleRequest req ->
      runGQ env logger reqId userInfo ipAddress reqHdrs queryType req
@@ -150,6 +184,60 @@ runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs qu
    try = flip catchError (pure . Left) . fmap Right


runQueryDB
  :: ( MonadIO m
     , MonadError QErr m
     , MonadReader E.ExecutionCtx m
     , MonadQueryLog m
     , MonadTrace m
     , MonadExecuteQuery m
     )
  => RequestId
  -> (GQLReqUnparsed, GQLReqParsed)
  -> [QueryRootField UnpreparedValue]
  -> UserInfo
  -> (Tracing.TraceT (LazyTx QErr) EncJSON, EQ.GeneratedSqlMap)
  -> m (DiffTime, Telem.QueryType, HTTP.ResponseHeaders, EncJSON)
  -- ^ Also return 'Mutation' when the operation was a mutation, and the time
  -- spent in the PG query; for telemetry.
runQueryDB reqId (query, queryParsed) asts _userInfo (tx, genSql) = do
  -- log the generated SQL and the graphql query
  E.ExecutionCtx logger _ pgExecCtx _ _ _ _ <- ask
  logQueryLog logger query (Just genSql) reqId
  (telemTimeIO, respE) <- withElapsedTime $ runExceptT $ trace "pg" $
    Tracing.interpTraceT id $ executeQuery queryParsed asts (Just genSql) pgExecCtx Q.ReadOnly tx
  (respHdrs, resp) <- liftEither respE
  let !json = encodeGQResp $ GQSuccess $ encJToLBS resp
      telemQueryType = Telem.Query
  return (telemTimeIO, telemQueryType, respHdrs, json)

runMutationDB
  :: ( MonadIO m
     , MonadError QErr m
     , MonadReader E.ExecutionCtx m
     , MonadQueryLog m
     , MonadTrace m
     )
  => RequestId
  -> GQLReqUnparsed
  -> UserInfo
  -> Tracing.TraceT (LazyTx QErr) EncJSON
  -> m (DiffTime, Telem.QueryType, EncJSON)
  -- ^ Also return 'Mutation' when the operation was a mutation, and the time
  -- spent in the PG query; for telemetry.
runMutationDB reqId query userInfo tx = do
  E.ExecutionCtx logger _ pgExecCtx _ _ _ _ <- ask
  -- log the graphql query
  logQueryLog logger query Nothing reqId
  ctx <- Tracing.currentContext
  (telemTimeIO, respE) <- withElapsedTime $ runExceptT $ trace "pg" $
    Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadWrite . withTraceContext ctx . withUserInfo userInfo) tx
  resp <- liftEither respE
  let !json = encodeGQResp $ GQSuccess $ encJToLBS resp
      telemQueryType = Telem.Mutation
  return (telemTimeIO, telemQueryType, json)

{-
runHasuraGQ
  :: ( MonadIO m
     , MonadError QErr m
@@ -187,3 +275,4 @@ runHasuraGQ reqId (query, queryParsed) userInfo resolvedOp = do
  let !json = encodeGQResp $ GQSuccess $ encJToLBS resp
      telemQueryType = case resolvedOp of E.ExOpMutation{} -> Telem.Mutation ; _ -> Telem.Query
  return (telemTimeIO, telemQueryType, respHdrs, json)
-}
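The timing returned by `runQueryDB`/`runMutationDB` comes from wrapping the transaction in `withElapsedTime`, so only the database round-trip (not network IO to the client) is counted for telemetry. A minimal, self-contained sketch of that pattern; `withElapsedTime` here is a stand-in written for illustration, not the actual helper from `Hasura.Prelude`:

```haskell
-- Run an action and return how long it took alongside its result, so
-- the caller can record the duration as a telemetry metric.
import Data.Time.Clock (NominalDiffTime, diffUTCTime, getCurrentTime)

withElapsedTime :: IO a -> IO (NominalDiffTime, a)
withElapsedTime action = do
  start  <- getCurrentTime
  result <- action          -- the timed span covers only this action
  end    <- getCurrentTime
  pure (end `diffUTCTime` start, result)
```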
@@ -1,6 +1,3 @@
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE RecordWildCards #-}

module Hasura.GraphQL.Transport.HTTP.Protocol
  ( GQLReq(..)
  , GQLBatchedReqs(..)
@@ -19,37 +16,25 @@ module Hasura.GraphQL.Transport.HTTP.Protocol
  , RemoteGqlResp(..)
  , GraphqlResponse(..)
  , encodeGraphqlResponse
  , GQRespValue(..), gqRespData, gqRespErrors
  , encodeGQRespValue
  , parseGQRespValue
  , parseEncJObject
  , GQJoinError(..), gqJoinErrorToValue
  ) where

import Control.Lens
import Hasura.EncJSON
import Hasura.GraphQL.Utils
import Hasura.Prelude
import Hasura.RQL.Types

import Language.GraphQL.Draft.Instances ()
import Language.Haskell.TH.Syntax (Lift)

import qualified Data.Aeson as J
import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.Ordered as OJ
import qualified Data.Aeson.TH as J
import qualified Data.ByteString.Lazy as BL
import qualified Data.HashMap.Strict as Map
import qualified Data.Vector as V
import qualified Language.GraphQL.Draft.Parser as G
import qualified Language.GraphQL.Draft.Syntax as G
import qualified VectorBuilder.Builder as VB
import qualified VectorBuilder.Vector as VB
import qualified Data.Aeson as J
import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.TH as J
import qualified Data.ByteString.Lazy as BL
import qualified Data.HashMap.Strict as Map
import qualified Language.GraphQL.Draft.Parser as G
import qualified Language.GraphQL.Draft.Syntax as G

newtype GQLExecDoc
  = GQLExecDoc { unGQLExecDoc :: [G.ExecutableDefinition] }
  deriving (Ord, Show, Eq, Hashable, Lift)
  = GQLExecDoc { unGQLExecDoc :: [G.ExecutableDefinition G.Name] }
  deriving (Ord, Show, Eq, Hashable, Lift)

instance J.FromJSON GQLExecDoc where
  parseJSON v = GQLExecDoc . G.getExecutableDefinitions <$> J.parseJSON v
@@ -62,9 +47,9 @@ newtype OperationName
  deriving (Ord, Show, Eq, Hashable, J.ToJSON, Lift)

instance J.FromJSON OperationName where
  parseJSON v = OperationName . G.Name <$> J.parseJSON v
  parseJSON v = OperationName <$> J.parseJSON v

type VariableValues = Map.HashMap G.Variable J.Value
type VariableValues = Map.HashMap G.Name J.Value

data GQLReq a
  = GQLReq
@@ -105,7 +90,7 @@ type GQLReqParsed = GQLReq GQLExecDoc

toParsed :: (MonadError QErr m) => GQLReqUnparsed -> m GQLReqParsed
toParsed req = case G.parseExecutableDoc gqlText of
  Left _  -> withPathK "query" $ throwVE "not a valid graphql query"
  Left _  -> withPathK "query" $ throw400 ValidationFailed "not a valid graphql query"
  Right a -> return $ req { _grQuery = GQLExecDoc $ G.getExecutableDefinitions a }
  where
    gqlText = _unGQLQueryText $ _grQuery req
@@ -114,40 +99,11 @@ encodeGQErr :: Bool -> QErr -> J.Value
encodeGQErr includeInternal qErr =
  J.object [ "errors" J..= [encodeGQLErr includeInternal qErr]]

-- | https://graphql.github.io/graphql-spec/June2018/#sec-Response-Format
--
-- NOTE: this type and parseGQRespValue are a lax representation of the spec,
-- since...
--   - remote GraphQL servers may not conform strictly, and...
--   - we use this type as an accumulator.
--
-- Ideally we'd have something correct by construction for hasura results
-- someplace.
data GQRespValue =
  GQRespValue
  { _gqRespData   :: OJ.Object
  -- ^ 'OJ.empty' (corresponding to the invalid `"data": {}`) indicates an error.
  , _gqRespErrors :: VB.Builder OJ.Value
  -- ^ An 'OJ.Array', but with efficient cons and concatenation. Null indicates
  -- query success.
  }

makeLenses ''GQRespValue

newtype GQJoinError = GQJoinError Text
  deriving (Show, Eq, IsString, Monoid, Semigroup)

-- | https://graphql.github.io/graphql-spec/June2018/#sec-Errors "Error result format"
gqJoinErrorToValue :: GQJoinError -> OJ.Value
gqJoinErrorToValue (GQJoinError msg) =
  OJ.Object (OJ.fromList [("message", OJ.String msg)])

data GQResult a
  = GQSuccess !a
  | GQPreExecError ![J.Value]
  | GQExecError ![J.Value]
  | GQGeneric !GQRespValue
  deriving (Functor, Foldable, Traversable)
  deriving (Show, Eq, Functor, Foldable, Traversable)

type GQResponse = GQResult BL.ByteString

@@ -156,6 +112,13 @@ isExecError = \case
  GQExecError _ -> True
  _             -> False

encodeGQResp :: GQResponse -> EncJSON
encodeGQResp gqResp =
  encJFromAssocList $ case gqResp of
    GQSuccess r      -> [("data", encJFromLBS r)]
    GQPreExecError e -> [("errors", encJFromJValue e)]
    GQExecError e    -> [("data", "null"), ("errors", encJFromJValue e)]

-- | Represents GraphQL response from a remote server
data RemoteGqlResp
  = RemoteGqlResp
@@ -181,53 +144,3 @@ encodeGraphqlResponse :: GraphqlResponse -> EncJSON
encodeGraphqlResponse = \case
  GRHasura resp -> encodeGQResp resp
  GRRemote resp -> encodeRemoteGqlResp resp

-- emptyResp :: GQRespValue
-- emptyResp = GQRespValue OJ.empty VB.empty

parseEncJObject :: EncJSON -> Either String OJ.Object
parseEncJObject = OJ.eitherDecode . encJToLBS >=> \case
  OJ.Object obj -> pure obj
  _             -> Left "expected object for GraphQL response"

parseGQRespValue :: EncJSON -> Either String GQRespValue
parseGQRespValue = parseEncJObject >=> \obj -> do
  _gqRespData <-
    case OJ.lookup "data" obj of
      -- "an error was encountered before execution began":
      Nothing -> pure OJ.empty
      -- "an error was encountered during the execution that prevented a valid response":
      Just OJ.Null -> pure OJ.empty
      Just (OJ.Object dobj) -> pure dobj
      Just _ -> Left "expected object or null for GraphQL data response"
  _gqRespErrors <-
    case OJ.lookup "errors" obj of
      Nothing -> pure VB.empty
      Just (OJ.Array vec) -> pure $ VB.vector vec
      Just _ -> Left "expected array for GraphQL error response"
  pure (GQRespValue {_gqRespData, _gqRespErrors})

encodeGQRespValue :: GQRespValue -> EncJSON
encodeGQRespValue GQRespValue{..} = OJ.toEncJSON $ OJ.Object $ OJ.fromList $
  -- "If the data entry in the response is not present, the errors entry in the
  -- response must not be empty. It must contain at least one error. "
  if _gqRespData == OJ.empty && not anyErrors
    then
      let msg = "Somehow did not accumulate any errors or data from graphql queries"
      in [("errors", OJ.Array $ V.singleton $ OJ.Object (OJ.fromList [("message", OJ.String msg)]) )]
    else
      -- NOTE: "If an error was encountered during the execution that prevented
      -- a valid response, the data entry in the response should be null."
      -- TODO it's not clear to me how we can enforce that here or if we should try.
      ("data", OJ.Object _gqRespData) :
      [("errors", OJ.Array gqRespErrorsV) | anyErrors ]
  where
    gqRespErrorsV = VB.build _gqRespErrors
    anyErrors = not $ V.null gqRespErrorsV

encodeGQResp :: GQResponse -> EncJSON
encodeGQResp = \case
  GQSuccess r      -> encJFromAssocList [("data", encJFromLBS r)]
  GQPreExecError e -> encJFromAssocList [("errors", encJFromJValue e)]
  GQExecError e    -> encJFromAssocList [("data", "null"), ("errors", encJFromJValue e)]
  GQGeneric v      -> encodeGQRespValue v
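`encodeGQRespValue` above enforces the spec rule it quotes: a response missing `data` must carry at least one error, `data` should be `null` when execution failed, and `errors` is omitted on success. A condensed sketch of that decision table, using plain `Data.Aeson` instead of the ordered `OJ` variant for brevity (`mkResponse` is an illustrative name, not a function from the codebase):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Build a spec-shaped GraphQL response body from an optional data value
-- and a list of accumulated errors.
import qualified Data.Aeson as J

mkResponse :: Maybe J.Value -> [J.Value] -> J.Value
mkResponse mData errs = J.object $ case (mData, errs) of
  -- no data and no errors is invalid per spec; synthesise an error
  (Nothing, []) -> [("errors", J.toJSON [J.object [("message", "no data or errors accumulated")]])]
  -- execution failed: data must be null, errors non-empty
  (Nothing, es) -> [("data", J.Null), ("errors", J.toJSON es)]
  -- full success: errors entry is omitted entirely
  (Just d,  []) -> [("data", d)]
  -- partial success: both entries present
  (Just d,  es) -> [("data", d), ("errors", J.toJSON es)]
```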
@@ -1,3 +1,4 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE RecordWildCards #-}

@@ -22,6 +23,7 @@ import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.TH as J
import qualified Data.ByteString.Lazy as BL
import qualified Data.CaseInsensitive as CI
import qualified Data.Environment as Env
import qualified Data.HashMap.Strict as Map
import qualified Data.Text as T
import qualified Data.Text.Encoding as TE
@@ -34,16 +36,17 @@ import qualified Network.HTTP.Types as H
import qualified Network.Wai.Extended as Wai
import qualified Network.WebSockets as WS
import qualified StmContainers.Map as STMMap
import qualified Data.Environment as Env

import Control.Concurrent.Extended (sleep)
import Control.Exception.Lifted
import Data.String
#ifndef PROFILING
import GHC.AssertNF
#endif

import Hasura.EncJSON
import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery(..))
import Hasura.GraphQL.Logging (MonadQueryLog (..))
import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery (..))
import Hasura.GraphQL.Transport.HTTP.Protocol
import Hasura.GraphQL.Transport.WebSocket.Protocol
import Hasura.HTTP
@@ -52,8 +55,7 @@ import Hasura.RQL.Types
import Hasura.Server.Auth (AuthMode, UserAuthentication,
                           resolveUserInfo)
import Hasura.Server.Cors
import Hasura.Server.Utils (RequestId,
                            getRequestId)
import Hasura.Server.Utils (RequestId, getRequestId)
import Hasura.Server.Version (HasVersion)
import Hasura.Session

@@ -217,7 +219,7 @@ data WSServerEnv
  , _wseHManager :: !H.Manager
  , _wseCorsPolicy :: !CorsPolicy
  , _wseSQLCtx :: !SQLGenCtx
  , _wseQueryCache :: !E.PlanCache
  -- , _wseQueryCache :: !E.PlanCache -- See Note [Temporarily disabling query plan caching]
  , _wseServer :: !WSServer
  , _wseEnableAllowlist :: !Bool
  }
@@ -341,12 +343,89 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
  reqParsedE <- lift $ E.checkGQLExecution userInfo (reqHdrs, ipAddress) enableAL sc q
  reqParsed <- either (withComplete . preExecErr requestId) return reqParsedE
  execPlanE <- runExceptT $ E.getResolvedExecPlan env logger pgExecCtx
    planCache userInfo sqlGenCtx sc scVer queryType httpMgr reqHdrs (q, reqParsed)
    {- planCache -} userInfo sqlGenCtx sc scVer queryType httpMgr reqHdrs (q, reqParsed)

  (telemCacheHit, execPlan) <- either (withComplete . preExecErr requestId) return execPlanE
  let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx planCache sc scVer httpMgr enableAL
  let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx {- planCache -} sc scVer httpMgr enableAL

  case execPlan of
    E.QueryExecutionPlan queryPlan asts ->
      case queryPlan of
        E.ExecStepDB (tx, genSql) -> Tracing.trace "Query" $
          execQueryOrMut timerTot Telem.Query telemCacheHit (Just genSql) requestId $
            fmap snd $ Tracing.interpTraceT id $ executeQuery reqParsed asts (Just genSql) pgExecCtx Q.ReadOnly tx
        E.ExecStepRemote (rsi, opDef, _varValsM) ->
          runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi
        E.ExecStepRaw (name, json) ->
          execQueryOrMut timerTot Telem.Query telemCacheHit Nothing requestId $
            return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json
    E.MutationExecutionPlan mutationPlan ->
      case mutationPlan of
        E.ExecStepDB (tx, _) -> Tracing.trace "Mutate" do
          ctx <- Tracing.currentContext
          execQueryOrMut timerTot Telem.Mutation telemCacheHit Nothing requestId $
            Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadWrite . withTraceContext ctx . withUserInfo userInfo) tx
        E.ExecStepRemote (rsi, opDef, _varValsM) ->
          runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi
        E.ExecStepRaw (name, json) ->
          execQueryOrMut timerTot Telem.Query telemCacheHit Nothing requestId $
            return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json
    E.SubscriptionExecutionPlan lqOp -> do
      -- log the graphql query
      logQueryLog logger q Nothing requestId
      let subscriberMetadata = LQ.mkSubscriberMetadata $ J.object
            [ "websocket_id" J..= WS.getWSId wsConn
            , "operation_id" J..= opId
            ]
      -- NOTE!: we mask async exceptions higher in the call stack, but it's
      -- crucial we don't lose lqId after addLiveQuery returns successfully.
      !lqId <- liftIO $ LQ.addLiveQuery logger subscriberMetadata lqMap lqOp liveQOnChange
      let !opName = _grOperationName q
#ifndef PROFILING
      liftIO $ $assertNFHere $! (lqId, opName) -- so we don't write thunks to mutable vars
#endif
      liftIO $ STM.atomically $
        -- NOTE: see crucial `lookup` check above, ensuring this doesn't clobber:
        STMMap.insert (lqId, opName) opId opMap
      logOpEv ODStarted (Just requestId)

  -- case execPlan of
  --   E.GExPHasura resolvedOp ->
  --     runHasuraGQ timerTot telemCacheHit requestId q userInfo resolvedOp
  --   E.GExPRemote rsi opDef ->
  --     runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi
  where
    telemTransport = Telem.WebSocket
    execQueryOrMut
      :: ExceptT () m DiffTime
      -> Telem.QueryType
      -> Telem.CacheHit
      -> Maybe EQ.GeneratedSqlMap
      -> RequestId
      -> ExceptT QErr (ExceptT () m) EncJSON
      -> ExceptT () m ()
    execQueryOrMut timerTot telemQueryType telemCacheHit genSql requestId action = do
      let telemLocality = Telem.Local
      logOpEv ODStarted (Just requestId)
      -- log the generated SQL and the graphql query
      logQueryLog logger q genSql requestId
      withElapsedTime (runExceptT action) >>= \case
        (_, Left err) -> postExecErr requestId err
        (telemTimeIO_DT, Right encJson) -> do
          -- Telemetry. NOTE: don't time network IO:
          telemTimeTot <- Seconds <$> timerTot
          sendSuccResp encJson $ LQ.LiveQueryMetadata telemTimeIO_DT
          let telemTimeIO = convertDuration telemTimeIO_DT
          Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..}

      sendCompleted (Just requestId)
{-
    runHasuraGQ :: ExceptT () m DiffTime
      -> Telem.CacheHit -> RequestId -> GQLReqUnparsed -> UserInfo -> E.ExecOp
      -> ExceptT () m ()
    runHasuraGQ timerTot telemCacheHit reqId query userInfo = \case
      E.ExOpQuery opTx genSql _asts ->
        execQueryOrMut Telem.Query genSql $ runQueryTx pgExecCtx opTx
      E.GExPHasura resolvedOp ->
        runHasuraGQ timerTot telemCacheHit requestId q reqParsed userInfo resolvedOp
      E.GExPRemote rsi opDef ->
@@ -410,10 +489,11 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
          Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..}

      sendCompleted (Just reqId)
-}

    runRemoteGQ :: ExceptT () m DiffTime
      -> Telem.CacheHit -> E.ExecutionCtx -> RequestId -> UserInfo -> [H.Header]
      -> G.TypedOperationDefinition -> RemoteSchemaInfo
      -> G.TypedOperationDefinition G.NoFragments G.Name -> RemoteSchemaInfo
      -> ExceptT () m ()
    runRemoteGQ timerTot telemCacheHit execCtx reqId userInfo reqHdrs opDef rsi = do
      let telemLocality = Telem.Remote
@@ -425,8 +505,8 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
        G.OperationTypeQuery -> return Telem.Query

      -- if it's not a subscription, use HTTP to execute the query on the remote
      (runExceptT $ flip runReaderT execCtx $
        E.execRemoteGQ env reqId userInfo reqHdrs q rsi (G._todType opDef)) >>= \case
      runExceptT (flip runReaderT execCtx $ E.execRemoteGQ env reqId userInfo reqHdrs q rsi opDef)
        >>= \case
        Left err -> postExecErr reqId err
        Right (telemTimeIO_DT, !val) -> do
          -- Telemetry. NOTE: don't time network IO:
@@ -445,7 +525,7 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
    invalidGqlErr err = err500 Unexpected $
      "Failed parsing GraphQL response from remote: " <> err

    WSServerEnv logger pgExecCtx lqMap getSchemaCache httpMgr _ sqlGenCtx planCache
    WSServerEnv logger pgExecCtx lqMap getSchemaCache httpMgr _ sqlGenCtx {- planCache -}
      _ enableAL = serverEnv

    WSConnData userInfoR opMap errRespTy queryType = WS.getData wsConn
@@ -507,7 +587,7 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
          (SMData $ DataMsg opId $ GRHasura $ GQSuccess $ BL.fromStrict bs)
          (LQ.LiveQueryMetadata dTime)
      resp -> sendMsg wsConn $ SMData $ DataMsg opId $ GRHasura $
        (BL.fromStrict . LQ._lqrPayload) <$> resp
        BL.fromStrict . LQ._lqrPayload <$> resp

    catchAndIgnore :: ExceptT () m () -> m ()
    catchAndIgnore m = void $ runExceptT m
@@ -603,7 +683,7 @@ onConnInit
  :: (HasVersion, MonadIO m, UserAuthentication (Tracing.TraceT m))
  => L.Logger L.Hasura -> H.Manager -> WSConn -> AuthMode -> Maybe ConnParams -> Tracing.TraceT m ()
onConnInit logger manager wsConn authMode connParamsM = do
  -- TODO: what should be the behaviour of connection_init message when a
  -- TODO(from master): what should be the behaviour of connection_init message when a
  -- connection is already iniatilized? Currently, we seem to be doing
  -- something arbitrary which isn't correct. Ideally, we should stick to
  -- this:
@@ -621,7 +701,9 @@ onConnInit logger manager wsConn authMode connParamsM = do
    Left e -> do
      let !initErr = CSInitError $ qeError e
      liftIO $ do
#ifndef PROFILING
        $assertNFHere initErr -- so we don't write thunks to mutable vars
#endif
        STM.atomically $ STM.writeTVar (_wscUser $ WS.getData wsConn) initErr

      let connErr = ConnErrMsg $ qeError e
@@ -631,11 +713,13 @@ onConnInit logger manager wsConn authMode connParamsM = do
    Right (userInfo, expTimeM) -> do
      let !csInit = CSInitialised $ WsClientState userInfo expTimeM paramHeaders ipAddress
      liftIO $ do
#ifndef PROFILING
        $assertNFHere csInit -- so we don't write thunks to mutable vars
#endif
        STM.atomically $ STM.writeTVar (_wscUser $ WS.getData wsConn) csInit

      sendMsg wsConn SMConnAck
      -- TODO: send it periodically? Why doesn't apollo's protocol use
      -- TODO(from master): send it periodically? Why doesn't apollo's protocol use
      -- ping/pong frames of websocket spec?
      sendMsg wsConn SMConnKeepAlive
  where
@@ -685,14 +769,14 @@ createWSServerEnv
  -> CorsPolicy
  -> SQLGenCtx
  -> Bool
  -> E.PlanCache
  -- -> E.PlanCache
  -> m WSServerEnv
createWSServerEnv logger isPgCtx lqState getSchemaCache httpManager
  corsPolicy sqlGenCtx enableAL planCache = do
  corsPolicy sqlGenCtx enableAL {- planCache -} = do
  wsServer <- liftIO $ STM.atomically $ WS.createWSServer logger
  return $
    WSServerEnv logger isPgCtx lqState getSchemaCache httpManager corsPolicy
      sqlGenCtx planCache wsServer enableAL
      sqlGenCtx {- planCache -} wsServer enableAL

createWSServerApp
  :: ( HasVersion

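The `case execPlan of` dispatch in `onStart` above routes each kind of resolved plan to a different executor: queries and mutations run to completion over the socket, while a subscription registers a long-lived poller via `addLiveQuery` and returns. A stripped-down skeleton of that routing (all types and names here are invented for illustration, not the engine's actual API):

```haskell
{-# LANGUAGE LambdaCase #-}
-- Route an execution plan to the matching executor; only the
-- subscription branch leaves behind a running poller.
data ExecPlan tx
  = QueryPlan tx
  | MutationPlan tx
  | SubscriptionPlan tx

dispatch
  :: (tx -> IO ())  -- run a read-only query and send one result
  -> (tx -> IO ())  -- run a read-write mutation and send one result
  -> (tx -> IO ())  -- register a live query that streams results
  -> ExecPlan tx
  -> IO ()
dispatch runQuery runMutation startLiveQuery = \case
  QueryPlan tx        -> runQuery tx
  MutationPlan tx     -> runMutation tx
  SubscriptionPlan tx -> startLiveQuery tx
```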
@@ -1,5 +1,6 @@
{-# LANGUAGE NondecreasingIndentation #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE CPP #-}

module Hasura.GraphQL.Transport.WebSocket.Server
  ( WSId(..)
@@ -45,7 +46,9 @@ import qualified Data.TByteString as TBS
import qualified Data.UUID as UUID
import qualified Data.UUID.V4 as UUID
import Data.Word (Word16)
#ifndef PROFILING
import GHC.AssertNF
#endif
import GHC.Int (Int64)
import Hasura.Prelude
import qualified ListT
@@ -116,10 +119,6 @@ $(J.deriveToJSON
  }
  ''WSLog)

instance L.ToEngineLog WSLog L.Hasura where
  toEngineLog wsLog =
    (L.LevelDebug, L.ELTInternal L.ILTWsServer, J.toJSON wsLog)

class Monad m => MonadWSLog m where
  -- | Takes WS server log data and logs it
  -- logWSServer
@@ -131,6 +130,10 @@ instance MonadWSLog m => MonadWSLog (ExceptT e m) where
instance MonadWSLog m => MonadWSLog (ReaderT r m) where
  logWSLog l ws = lift $ logWSLog l ws

instance L.ToEngineLog WSLog L.Hasura where
  toEngineLog wsLog =
    (L.LevelDebug, L.ELTInternal L.ILTWsServer, J.toJSON wsLog)

data WSQueueResponse
  = WSQueueResponse
  { _wsqrMessage :: !BL.ByteString
@@ -172,7 +175,9 @@ closeConnWithCode wsConn code bs = do
-- so that sendMsg doesn't block
sendMsg :: WSConn a -> WSQueueResponse -> IO ()
sendMsg wsConn = \ !resp -> do
#ifndef PROFILING
  $assertNFHere resp -- so we don't write thunks to mutable vars
#endif
  STM.atomically $ STM.writeTQueue (_wcSendQ wsConn) resp

type ConnMap a = STMMap.Map WSId (WSConn a)
@@ -362,7 +367,6 @@ createServerApp (WSServer logger@(L.Logger writeLog) serverStatus) wsHandlers !i
  _hOnClose wsHandlers wsConn
  logWSLog logger $ WSLog (_wcConnId wsConn) EClosed Nothing


shutdown :: WSServer a -> IO ()
shutdown (WSServer (L.Logger writeLog) serverStatus) = do
  writeLog $ L.debugT "Shutting websockets server down"

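`sendMsg` above stays non-blocking because it only enqueues the response onto the connection's STM `TQueue`; a separate writer thread owns the socket and drains the queue. A minimal model of that producer/consumer split, using the `stm` package (the function names `enqueue`/`drain` are illustrative, not from the codebase):

```haskell
-- Producer side enqueues cheaply; only the dedicated writer thread
-- blocks waiting for messages and touches the socket.
import Control.Concurrent.STM

enqueue :: TQueue String -> String -> IO ()
enqueue q msg = atomically $ writeTQueue q msg  -- never blocks the caller

drain :: TQueue String -> (String -> IO ()) -> IO ()
drain q send = do
  msg <- atomically $ readTQueue q              -- retries until a message arrives
  send msg                                      -- e.g. write to the websocket
  drain q send
```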
@@ -1,23 +1,14 @@
module Hasura.GraphQL.Utils
  ( showName
  , showNamedTy
  , throwVE
  , getBaseTy
  , groupTuples
  , groupListWith
  , mkMapWith
  , showNames
  , unwrapTy
  , simpleGraphQLQuery
  , jsonValueToGValue
  ) where

import Hasura.Prelude
import Hasura.RQL.Types.Error

import Data.Scientific (floatingOrInteger)

import qualified Data.Aeson as A
import qualified Data.HashMap.Strict as Map
import qualified Data.List.NonEmpty as NE
import qualified Data.Text as T
@@ -26,26 +17,6 @@ import qualified Language.GraphQL.Draft.Syntax as G
showName :: G.Name -> Text
showName name = "\"" <> G.unName name <> "\""

throwVE :: (MonadError QErr m) => Text -> m a
throwVE = throw400 ValidationFailed

showNamedTy :: G.NamedType -> Text
showNamedTy nt =
  "'" <> G.showNT nt <> "'"

getBaseTy :: G.GType -> G.NamedType
getBaseTy = \case
  G.TypeNamed _ n -> n
  G.TypeList _ lt -> getBaseTyL lt
  where
    getBaseTyL = getBaseTy . G.unListType

unwrapTy :: G.GType -> G.GType
unwrapTy =
  \case
    G.TypeList _ lt -> G.unListType lt
    nt -> nt

groupListWith
  :: (Eq k, Hashable k, Foldable t, Functor t)
  => (v -> k) -> t v -> Map.HashMap k (NE.NonEmpty v)
@@ -81,15 +52,3 @@ showNames names =
-- A simple graphql query to be used in generators
simpleGraphQLQuery :: Text
simpleGraphQLQuery = "query {author {id name}}"

-- | Convert a JSON value to a GraphQL value.
jsonValueToGValue :: A.Value -> G.Value
jsonValueToGValue = \case
  A.String t -> G.VString $ G.StringValue t
  -- TODO: Note the danger zone of scientific:
  A.Number n -> either (\(_::Float) -> G.VFloat n) G.VInt (floatingOrInteger n)
  A.Bool b -> G.VBoolean b
  A.Object o -> G.VObject $ G.ObjectValueG $
    map (uncurry G.ObjectFieldG . (G.Name *** jsonValueToGValue)) $ Map.toList o
  A.Array a -> G.VList $ G.ListValueG $ map jsonValueToGValue $ toList a
  A.Null -> G.VNull

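The "danger zone of scientific" comment in `jsonValueToGValue` refers to `Data.Scientific.floatingOrInteger`, which classifies a `Scientific` as either floating or integral; realising a number with a huge exponent (e.g. `1e1000000`) as an `Integer` can consume enormous amounts of memory, which is why such conversions deserve care. A small sketch of the classification itself (the `classify` wrapper is illustrative, only pinning the result types):

```haskell
-- Resolve the ambiguous return type of floatingOrInteger to
-- Either Double Integer, mirroring how jsonValueToGValue branches on it.
import Data.Scientific (Scientific, floatingOrInteger)

classify :: Scientific -> Either Double Integer
classify = floatingOrInteger
```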
@ -1,331 +0,0 @@
|
||||
module Hasura.GraphQL.Validate
|
||||
( validateGQ
|
||||
, showVars
|
||||
, RootSelectionSet(..)
|
||||
, SelectionSet(..)
|
||||
, Field(..)
|
||||
, getTypedOp
|
||||
, QueryParts(..)
|
||||
, getQueryParts
|
||||
|
||||
, ReusableVariableTypes(..)
|
||||
, ReusableVariableValues
|
||||
, validateVariablesForReuse
|
||||
|
||||
, isQueryInAllowlist
|
||||
|
||||
, unValidateArgsMap
|
||||
, unValidateSelectionSet
|
||||
, unValidateField
|
||||
) where
|
||||
|
||||
import Hasura.Prelude
|
||||
|
||||
import Data.Has
|
||||
import Data.Time
|
||||
|
||||
import qualified Data.Aeson as A
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashMap.Strict.InsOrd as OMap
|
||||
import qualified Data.HashSet as HS
|
||||
import qualified Data.Text as T
|
||||
import qualified Data.UUID as UUID
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
|
||||
import Hasura.GraphQL.NormalForm
|
||||
import Hasura.GraphQL.Resolve.InputValue (annInpValueToJson)
|
||||
import Hasura.GraphQL.Schema
|
||||
import Hasura.GraphQL.Transport.HTTP.Protocol
|
||||
import Hasura.GraphQL.Utils
|
||||
import Hasura.GraphQL.Validate.Context
|
||||
import Hasura.GraphQL.Validate.InputValue
|
||||
import Hasura.GraphQL.Validate.SelectionSet
|
||||
import Hasura.GraphQL.Validate.Types
|
||||
import Hasura.RQL.DML.Select.Types
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.SQL.Time
|
||||
import Hasura.SQL.Value
|
||||
|
||||
data QueryParts
|
||||
= QueryParts
|
||||
{ qpOpDef :: !G.TypedOperationDefinition
|
||||
, qpOpRoot :: !ObjTyInfo
|
||||
, qpFragDefsL :: ![G.FragmentDefinition]
|
||||
, qpVarValsM :: !(Maybe VariableValues)
|
||||
} deriving (Show, Eq)
|
||||
|
||||
getTypedOp
|
||||
:: (MonadError QErr m)
|
||||
=> Maybe OperationName
|
||||
-> [G.SelectionSet]
|
||||
-> [G.TypedOperationDefinition]
|
||||
-> m G.TypedOperationDefinition
|
||||
getTypedOp opNameM selSets opDefs =
|
||||
case (opNameM, selSets, opDefs) of
|
||||
(Just opName, [], _) -> do
|
||||
let n = _unOperationName opName
|
||||
opDefM = find (\opDef -> G._todName opDef == Just n) opDefs
|
||||
onNothing opDefM $ throwVE $
|
||||
"no such operation found in the document: " <> showName n
|
||||
(Just _, _, _) ->
|
||||
throwVE $ "operationName cannot be used when " <>
|
||||
"an anonymous operation exists in the document"
|
||||
(Nothing, [selSet], []) ->
|
||||
return $ G.TypedOperationDefinition G.OperationTypeQuery Nothing [] [] selSet
|
||||
(Nothing, [], [opDef]) ->
|
||||
return opDef
|
||||
(Nothing, _, _) ->
|
||||
throwVE $ "exactly one operation has to be present " <>
|
||||
"in the document when operationName is not specified"
|
||||
|
||||
-- | For all the variables defined there will be a value in the final map
|
||||
-- If no default, not in variables and nullable, then null value
|
||||
validateVariables
|
||||
:: (MonadReader r m, Has TypeMap r, MonadError QErr m)
|
||||
=> [G.VariableDefinition] -> VariableValues -> m AnnVarVals
|
||||
validateVariables varDefsL inpVals = withPathK "variableValues" $ do
|
||||
varDefs <- onLeft (mkMapWith G._vdVariable varDefsL) $ \dups ->
|
||||
throwVE $ "the following variables are defined more than once: " <>
|
||||
showVars dups
|
||||
|
||||
let unexpectedVars = filter (not . (`Map.member` varDefs)) $ Map.keys inpVals
|
||||
unless (null unexpectedVars) $
|
||||
throwVE $ "unexpected variables in variableValues: " <>
|
||||
showVars unexpectedVars
|
||||
|
||||
traverse validateVariable varDefs
|
||||
where
|
||||
validateVariable (G.VariableDefinition var ty defM) = do
|
||||
let baseTy = getBaseTy ty
|
||||
baseTyInfo <- getTyInfoVE baseTy
|
||||
-- check that the variable is defined on input types
|
||||
when (isObjTy baseTyInfo) $ throwVE $
|
||||
"variables can only be defined on input types"
|
||||
<> "(enums, scalars, input objects), but "
|
||||
<> showNamedTy baseTy <> " is an object type"
|
||||
|
||||
let defM' = bool (defM <|> Just G.VCNull) defM $ G.isNotNull ty
|
||||
annDefM <- withPathK "defaultValue" $
|
||||
mapM (validateInputValue constValueParser ty) defM'
|
||||
let inpValM = Map.lookup var inpVals
|
||||
annInpValM <- withPathK (G.unName $ G.unVariable var) $
|
||||
mapM (validateInputValue jsonParser ty) inpValM
|
||||
let varValM = annInpValM <|> annDefM
|
||||
onNothing varValM $ throwVE $
|
||||
"expecting a value for non-nullable variable: " <>
|
||||
showVars [var] <>
|
||||
" of type: " <> G.showGT ty <>
|
||||
" in variableValues"
|
||||
|
||||
|
||||
showVars :: (Functor f, Foldable f) => f G.Variable -> Text
|
||||
showVars = showNames . fmap G.unVariable

-- | This is similar in spirit to 'validateVariables' but uses preexisting 'ReusableVariableTypes'
-- information to parse Postgres values directly for use with a reusable query plan. (Ideally, it
-- would be nice to be able to share more of the logic instead of duplicating it.)
validateVariablesForReuse
  :: (MonadError QErr m)
  => ReusableVariableTypes -> Maybe VariableValues -> m ReusableVariableValues
validateVariablesForReuse (ReusableVariableTypes varTypes) varValsM =
  withPathK "variableValues" $ do
    let unexpectedVars = filter (not . (`Map.member` varTypes)) $ Map.keys varVals
    unless (null unexpectedVars) $
      throwVE $ "unexpected variables: " <> showVars unexpectedVars

    flip Map.traverseWithKey varTypes $ \varName varType ->
      withPathK (G.unName $ G.unVariable varName) $ do
        varVal <- onNothing (Map.lookup varName varVals) $
          throwVE "expected a value for non-nullable variable"
          -- TODO: we don't have the graphql type
          -- <> " of type: " <> T.pack (show varType)
        parsePGScalarValue varType varVal
  where
    varVals = fromMaybe Map.empty varValsM

validateFrag
  :: (MonadError QErr m, MonadReader r m, Has TypeMap r)
  => G.FragmentDefinition -> m FragDef
validateFrag (G.FragmentDefinition n onTy dirs selSet) = do
  unless (null dirs) $ throwVE
    "unexpected directives at fragment definition"
  fragmentTypeInfo <- getFragmentTyInfo onTy
  return $ FragDef n fragmentTypeInfo selSet

validateGQ
  :: (MonadError QErr m, MonadReader GCtx m, MonadReusability m)
  => QueryParts -> m RootSelectionSet
validateGQ (QueryParts opDef opRoot fragDefsL varValsM) = do
  ctx <- ask

  -- annotate the variables of this operation
  annVarVals <- validateVariables (G._todVariableDefinitions opDef) $ fromMaybe Map.empty varValsM

  -- annotate the fragments
  fragDefs <- onLeft (mkMapWith G._fdName fragDefsL) $ \dups ->
    throwVE $ "the following fragments are defined more than once: " <>
    showNames dups
  annFragDefs <- mapM validateFrag fragDefs

  -- build a validation ctx
  let valCtx = ValidationCtx (_gTypes ctx) annVarVals annFragDefs

  selSet <- flip runReaderT valCtx $ parseObjectSelectionSet valCtx opRoot $
            G._todSelectionSet opDef

  case G._todType opDef of
    G.OperationTypeQuery        -> return $ RQuery selSet
    G.OperationTypeMutation     -> return $ RMutation selSet
    G.OperationTypeSubscription ->
      case OMap.toList $ unAliasedFields $ unObjectSelectionSet selSet of
        []      -> throw500 "empty selset for subscription"
        (_:rst) -> do
          -- As an internal testing feature, we support subscribing to multiple
          -- selection sets. First check if the corresponding directive is set.
          let multipleAllowed = G.Directive "_multiple_top_level_fields" [] `elem` G._todDirectives opDef
          unless (multipleAllowed || null rst) $
            throwVE "subscriptions must select one top level field"
          return $ RSubscription selSet
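The subscription branch of `validateGQ` above enforces a single top-level root field unless the internal `_multiple_top_level_fields` directive is present. A base-only sketch of that rule (`checkSubscriptionRoot` is an illustrative name, not from the codebase):

```haskell
-- Sketch of the subscription root-field rule: exactly one top-level field,
-- unless multiple fields were explicitly allowed via the internal directive.
checkSubscriptionRoot :: Bool -> [String] -> Either String [String]
checkSubscriptionRoot multipleAllowed fields = case fields of
  []       -> Left "empty selset for subscription"
  (_:rest)
    | multipleAllowed || null rest -> Right fields
    | otherwise -> Left "subscriptions must select one top level field"

main :: IO ()
main = mapM_ print
  [ checkSubscriptionRoot False ["users"]            -- accepted
  , checkSubscriptionRoot False ["users", "orders"]  -- rejected
  , checkSubscriptionRoot True  ["users", "orders"]  -- accepted via directive
  ]
```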
isQueryInAllowlist :: GQLExecDoc -> HS.HashSet GQLQuery -> Bool
isQueryInAllowlist q = HS.member gqlQuery
  where
    gqlQuery = GQLQuery $ G.ExecutableDocument $ stripTypenames $
               unGQLExecDoc q

getQueryParts
  :: ( MonadError QErr m, MonadReader GCtx m)
  => GQLReqParsed
  -> m QueryParts
getQueryParts (GQLReq opNameM q varValsM) = do
  -- get the operation that needs to be evaluated
  opDef <- getTypedOp opNameM selSets opDefs
  ctx <- ask

  -- get the operation root
  opRoot <- case G._todType opDef of
    G.OperationTypeQuery        -> return $ _gQueryRoot ctx
    G.OperationTypeMutation     ->
      onNothing (_gMutRoot ctx) $ throwVE "no mutations exist"
    G.OperationTypeSubscription ->
      onNothing (_gSubRoot ctx) $ throwVE "no subscriptions exist"
  return $ QueryParts opDef opRoot fragDefsL varValsM
  where
    (selSets, opDefs, fragDefsL) = G.partitionExDefs $ unGQLExecDoc q

-- | Convert the validated arguments to GraphQL parser AST arguments
unValidateArgsMap :: ArgsMap -> [RemoteFieldArgument]
unValidateArgsMap argsMap =
  map (\(n, inpVal) ->
         let _rfaArgument = G.Argument n $ unValidateInpVal inpVal
             _rfaVariable = unValidateInpVariable inpVal
         in RemoteFieldArgument {..})
  . Map.toList $ argsMap

-- | Convert the validated field to GraphQL parser AST field
unValidateField :: G.Alias -> Field -> G.Field
unValidateField alias (Field name _ argsMap selSet) =
  let args = map (\(n, inpVal) -> G.Argument n $ unValidateInpVal inpVal) $
             Map.toList argsMap
  in G.Field (Just alias) name args [] $ unValidateSelectionSet selSet

-- | Convert the validated selection set to GraphQL parser AST selection set
unValidateSelectionSet :: SelectionSet -> G.SelectionSet
unValidateSelectionSet = \case
  SelectionSetObject selectionSet    -> fromSelectionSet selectionSet
  SelectionSetInterface selectionSet -> fromScopedSelectionSet selectionSet
  SelectionSetUnion selectionSet     -> fromScopedSelectionSet selectionSet
  SelectionSetNone                   -> mempty
  where
    fromAliasedFields :: (IsField f) => AliasedFields f -> G.SelectionSet
    fromAliasedFields =
      map (G.SelectionField . uncurry unValidateField) .
      OMap.toList . fmap toField . unAliasedFields
    fromSelectionSet =
      fromAliasedFields . unObjectSelectionSet
    toInlineSelection typeName =
      G.SelectionInlineFragment . G.InlineFragment (Just typeName) mempty .
      fromSelectionSet
    fromScopedSelectionSet (ScopedSelectionSet base specific) =
      map (uncurry toInlineSelection) (Map.toList specific) <> fromAliasedFields base

-- | Get the variable definition and its value (if one exists)
unValidateInpVariable :: AnnInpVal -> Maybe [(G.VariableDefinition, A.Value)]
unValidateInpVariable inputValue =
  case (_aivValue inputValue) of
    AGScalar _ _ -> mkVariableDefnValueTuple inputValue
    AGEnum _ _   -> mkVariableDefnValueTuple inputValue
    AGObject _ o ->
      (\obj ->
         let listObjects = OMap.toList obj
         in concat $
            mapMaybe (\(_, inpVal) -> unValidateInpVariable inpVal) listObjects)
      <$> o
    AGArray _ _  -> mkVariableDefnValueTuple inputValue
  where
    mkVariableDefnValueTuple val = maybe Nothing (\vars -> Just [vars]) $
                                   variableDefnValueTuple val

variableDefnValueTuple :: AnnInpVal -> Maybe (G.VariableDefinition, A.Value)
variableDefnValueTuple inpVal@AnnInpVal {..} =
  let varDefn = G.VariableDefinition <$> _aivVariable <*> Just _aivType <*> Just Nothing
  in (,) <$> varDefn <*> Just (annInpValueToJson inpVal)

-- | Convert the validated input value to a GraphQL value; if the input value
-- is a variable, it is returned without being resolved, otherwise it
-- is resolved
unValidateInpVal :: AnnInpVal -> G.Value
unValidateInpVal (AnnInpVal _ var val) = fromMaybe G.VNull $
  -- if a variable is found, directly return that; if not, convert the
  -- value into a G.Value and return it
  case var of
    Just var' -> Just $ G.VVariable var'
    Nothing ->
      case val of
        AGScalar _ v -> pgScalarToGValue <$> v
        AGEnum _ v   -> pgEnumToGEnum v
        AGObject _ o ->
          (G.VObject . G.ObjectValueG
          . map (uncurry G.ObjectFieldG . (second unValidateInpVal))
          . OMap.toList
          ) <$> o
        AGArray _ vs -> (G.VList . G.ListValueG . map unValidateInpVal) <$> vs

  where
    pgEnumToGEnum :: AnnGEnumValue -> Maybe G.Value
    pgEnumToGEnum = \case
      AGESynthetic v   -> G.VEnum <$> v
      AGEReference _ v -> (G.VEnum . G.EnumValue . G.Name . getEnumValue) <$> v

    pgScalarToGValue :: PGScalarValue -> G.Value
    pgScalarToGValue = \case
      PGValInteger i  -> G.VInt $ fromIntegral i
      PGValSmallInt i -> G.VInt $ fromIntegral i
      PGValBigInt i   -> G.VInt $ fromIntegral i
      PGValFloat f    -> G.VFloat $ realToFrac f
      PGValDouble d   -> G.VFloat $ realToFrac d
      -- TODO: Scientific is a danger zone; use its safe conv function.
      PGValNumeric sc -> G.VFloat $ realToFrac sc
      PGValMoney m    -> G.VFloat $ realToFrac m
      PGValBoolean b  -> G.VBoolean b
      PGValChar t     -> toStringValue $ T.singleton t
      PGValVarchar t  -> toStringValue t
      PGValText t     -> toStringValue t
      PGValCitext t   -> toStringValue t
      PGValDate d     -> toStringValue $ T.pack $ showGregorian d
      PGValTimeStampTZ u -> toStringValue $ T.pack $
                            formatTime defaultTimeLocale "%FT%T%QZ" u
      PGValTimeStamp u   -> toStringValue $ T.pack $
                            formatTime defaultTimeLocale "%FT%T%QZ" u
      PGValTimeTZ (ZonedTimeOfDay tod tz) ->
        toStringValue $ T.pack (show tod ++ timeZoneOffsetString tz)
      PGNull _               -> G.VNull
      PGValJSON (Q.JSON v)   -> jsonValueToGValue v
      PGValJSONB (Q.JSONB v) -> jsonValueToGValue v
      PGValGeo v             -> jsonValueToGValue $ A.toJSON v
      PGValRaster v          -> jsonValueToGValue $ A.toJSON v
      PGValUUID u            -> toStringValue $ UUID.toText u
      PGValUnknown t         -> toStringValue t
      where
        toStringValue = G.VString . G.StringValue
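The key rule in `unValidateInpVal` is that a value bound to a variable is emitted as the variable reference itself, while anything else is resolved to a literal. A base-only sketch with toy stand-ins (`Val`, `unvalidate` are illustrative names, not from the codebase):

```haskell
-- Toy stand-in for G.Value: either a variable reference, a literal, or null.
data Val = VVar String | VInt Int | VNull deriving (Eq, Show)

-- Mirrors the branch order of unValidateInpVal: the variable, if present,
-- wins and is left unresolved; otherwise the literal value (or null) is used.
unvalidate :: Maybe String -> Maybe Int -> Val
unvalidate (Just var) _     = VVar var
unvalidate Nothing (Just n) = VInt n
unvalidate Nothing Nothing  = VNull

main :: IO ()
main = mapM_ print
  [ unvalidate (Just "x") (Just 1)  -- variable reference preserved
  , unvalidate Nothing (Just 1)     -- literal resolved
  , unvalidate Nothing Nothing      -- null
  ]
```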
@ -1,78 +0,0 @@
module Hasura.GraphQL.Validate.Context
  ( ValidationCtx(..)
  , getFieldInfo
  , getInpFieldInfo
  , getTyInfo
  , getTyInfoVE
  , getFragmentTyInfo
  , module Hasura.GraphQL.Utils
  ) where

import           Hasura.Prelude

import qualified Data.HashMap.Strict           as Map
import qualified Language.GraphQL.Draft.Syntax as G

import           Data.Has
import           Hasura.GraphQL.Utils
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.Types

getFieldInfo
  :: ( MonadError QErr m)
  => G.NamedType -> ObjFieldMap -> G.Name -> m ObjFldInfo
getFieldInfo typeName fieldMap fldName =
  onNothing (Map.lookup fldName fieldMap) $ throwVE $
  "field " <> showName fldName <>
  " not found in type: " <> showNamedTy typeName

getInpFieldInfo
  :: ( MonadError QErr m)
  => InpObjTyInfo -> G.Name -> m G.GType
getInpFieldInfo tyInfo fldName =
  fmap _iviType $ onNothing (Map.lookup fldName $ _iotiFields tyInfo) $
  throwVE $ "field " <> showName fldName <>
  " not found in type: " <> showNamedTy (_iotiName tyInfo)

data ValidationCtx
  = ValidationCtx
  { _vcTypeMap    :: !TypeMap
  -- these are in the scope of the operation
  , _vcVarVals    :: !AnnVarVals
  -- all the fragments
  , _vcFragDefMap :: !FragDefMap
  } deriving (Show, Eq)

instance Has TypeMap ValidationCtx where
  getter = _vcTypeMap
  modifier f ctx = ctx { _vcTypeMap = f $ _vcTypeMap ctx }

getTyInfo
  :: ( MonadReader r m , Has TypeMap r
     , MonadError QErr m)
  => G.NamedType
  -> m TypeInfo
getTyInfo namedTy = do
  tyMap <- asks getter
  onNothing (Map.lookup namedTy tyMap) $
    throw500 $ "type info not found for: " <> showNamedTy namedTy

getTyInfoVE
  :: ( MonadReader r m , Has TypeMap r
     , MonadError QErr m)
  => G.NamedType
  -> m TypeInfo
getTyInfoVE namedTy = do
  tyMap <- asks getter
  onNothing (Map.lookup namedTy tyMap) $
    throwVE $ "no such type exists in the schema: " <> showNamedTy namedTy

getFragmentTyInfo
  :: (MonadReader r m, Has TypeMap r, MonadError QErr m)
  => G.NamedType -> m FragmentTypeInfo
getFragmentTyInfo onType =
  getTyInfoVE onType >>= \case
    TIObj tyInfo   -> pure $ FragmentTyObject tyInfo
    TIIFace tyInfo -> pure $ FragmentTyInterface tyInfo
    TIUnion tyInfo -> pure $ FragmentTyUnion tyInfo
    _ -> throwVE "fragments can only be defined on object/interface/union types"
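The `Has TypeMap r` constraint lets `getTyInfo` work against any reader environment that can surrender a type map, not just `ValidationCtx`. A minimal, base-only sketch of the pattern (all names here are toy stand-ins, not the real `Data.Has` API):

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances, FlexibleContexts #-}

-- Toy stand-in: an association list instead of a HashMap of type info.
type TypeMap' = [(String, String)]

class Has a r where
  getter :: r -> a

data Ctx = Ctx { ctxTypes :: TypeMap', ctxVersion :: Int }

instance Has TypeMap' Ctx where
  getter = ctxTypes

-- Mirrors getTyInfo: look a name up in whatever environment carries a TypeMap'.
lookupTy :: Has TypeMap' r => r -> String -> Maybe String
lookupTy r n = lookup n (getter r)

main :: IO ()
main = print (lookupTy (Ctx [("user", "object type")] 1) "user")
```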
@ -1,347 +0,0 @@
module Hasura.GraphQL.Validate.InputValue
  ( validateInputValue
  , jsonParser
  , valueParser
  , constValueParser
  , pPrintValueC
  ) where

import           Hasura.Prelude

import           Data.Has
import           Data.List.Extended              (duplicates)

import qualified Data.Aeson                      as J
import qualified Data.HashMap.Strict             as Map
import qualified Data.HashMap.Strict.InsOrd      as OMap
import qualified Data.Text                       as T
import qualified Data.Vector                     as V
import qualified Language.GraphQL.Draft.Syntax   as G

import qualified Hasura.RQL.Types                as RQL

import           Hasura.GraphQL.Utils
import           Hasura.GraphQL.Validate.Context
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.Types
import           Hasura.SQL.Value

newtype P a = P { unP :: Maybe (Either (G.Variable, AnnInpVal) a)}

pNull :: (Monad m) => m (P a)
pNull = return $ P Nothing

pVal :: (Monad m) => a -> m (P a)
pVal = return . P . Just . Right

resolveVar
  :: ( MonadError QErr m
     , MonadReader ValidationCtx m)
  => G.Variable -> m AnnInpVal
resolveVar var = do
  varVals <- _vcVarVals <$> ask
  onNothing (Map.lookup var varVals) $
    throwVE $ "no such variable defined in the operation: "
    <> showName (G.unVariable var)

pVar
  :: ( MonadError QErr m
     , MonadReader ValidationCtx m)
  => G.Variable -> m (P a)
pVar var = do
  annInpVal <- resolveVar var
  return . P . Just $ Left (var, annInpVal)

data InputValueParser a m
  = InputValueParser
  { getScalar :: a -> m (P J.Value)
  , getList   :: a -> m (P [a])
  , getObject :: a -> m (P [(G.Name, a)])
  , getEnum   :: a -> m (P G.EnumValue)
  }

jsonParser :: (MonadError QErr m) => InputValueParser J.Value m
jsonParser =
  InputValueParser jScalar jList jObject jEnum
  where
    jEnum (J.String t) = pVal $ G.EnumValue $ G.Name t
    jEnum J.Null       = pNull
    jEnum _            = throwVE "expecting a JSON string for Enum"

    jList (J.Array l) = pVal $ V.toList l
    jList J.Null      = pNull
    jList v           = pVal [v]

    jObject (J.Object m) = pVal [(G.Name t, v) | (t, v) <- Map.toList m]
    jObject J.Null       = pNull
    jObject _            = throwVE "expecting a JSON object"

    jScalar J.Null = pNull
    jScalar v      = pVal v
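Note the `jList v = pVal [v]` fallthrough above: per GraphQL's input coercion rules, a single non-list value supplied where a list is expected is coerced into a one-element list, while `null` stands for the absence of a value. A base-only sketch of that coercion, with a toy JSON type (not aeson's):

```haskell
-- Toy JSON type; JNull / JNum / JArr stand in for aeson's Value constructors.
data J = JNull | JNum Int | JArr [J] deriving (Eq, Show)

-- Mirrors jList: arrays pass through, null means "no value",
-- and any single value is wrapped into a one-element list.
coerceList :: J -> Maybe [J]
coerceList (JArr xs) = Just xs
coerceList JNull     = Nothing
coerceList v         = Just [v]

main :: IO ()
main = mapM_ print
  [ coerceList (JNum 1)                -- single value becomes [JNum 1]
  , coerceList (JArr [JNum 1, JNum 2]) -- array passes through
  , coerceList JNull                   -- absent value
  ]
```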
toJValue :: (MonadError QErr m) => G.Value -> m J.Value
toJValue = \case
  G.VVariable _ ->
    throwVE "variables are not allowed in scalars"
  G.VInt i -> return $ J.toJSON i
  G.VFloat f -> return $ J.toJSON f
  G.VString (G.StringValue t) -> return $ J.toJSON t
  G.VBoolean b -> return $ J.toJSON b
  G.VNull -> return J.Null
  G.VEnum (G.EnumValue n) -> return $ J.toJSON n
  G.VList (G.ListValueG vals) ->
    J.toJSON <$> mapM toJValue vals
  G.VObject (G.ObjectValueG objs) ->
    J.toJSON . Map.fromList <$> mapM toTup objs
  where
    toTup (G.ObjectFieldG f v) = (f,) <$> toJValue v

valueParser
  :: ( MonadError QErr m
     , MonadReader ValidationCtx m)
  => InputValueParser G.Value m
valueParser =
  InputValueParser pScalar pList pObject pEnum
  where
    pEnum (G.VVariable var) = pVar var
    pEnum (G.VEnum e)       = pVal e
    pEnum G.VNull           = pNull
    pEnum _                 = throwVE "expecting an enum"

    pList (G.VVariable var) = pVar var
    pList (G.VList lv)      = pVal $ G.unListValue lv
    pList G.VNull           = pNull
    pList v                 = pVal [v]

    pObject (G.VVariable var) = pVar var
    pObject (G.VObject ov)    = pVal
      [(G._ofName oFld, G._ofValue oFld) | oFld <- G.unObjectValue ov]
    pObject G.VNull           = pNull
    pObject _                 = throwVE "expecting an object"

    -- scalar json
    pScalar (G.VVariable var) = pVar var
    pScalar G.VNull           = pNull
    pScalar (G.VInt v)        = pVal $ J.Number $ fromIntegral v
    pScalar (G.VFloat v)      = pVal $ J.Number v
    pScalar (G.VBoolean b)    = pVal $ J.Bool b
    pScalar (G.VString sv)    = pVal $ J.String $ G.unStringValue sv
    pScalar (G.VEnum _)       = throwVE "unexpected enum for a scalar"
    pScalar v                 = pVal =<< toJValue v

pPrintValueC :: G.ValueConst -> Text
pPrintValueC = \case
  G.VCInt i -> T.pack $ show i
  G.VCFloat f -> T.pack $ show f
  G.VCString (G.StringValue t) -> T.pack $ show t
  G.VCBoolean b -> bool "false" "true" b
  G.VCNull -> "null"
  G.VCEnum (G.EnumValue n) -> G.unName n
  G.VCList (G.ListValueG vals) -> withSquareBraces $ T.intercalate ", " $ map pPrintValueC vals
  G.VCObject (G.ObjectValueG objs) -> withCurlyBraces $ T.intercalate ", " $ map ppObjFld objs
  where
    ppObjFld (G.ObjectFieldG f v) = G.unName f <> ": " <> pPrintValueC v
    withSquareBraces t = "[" <> t <> "]"
    withCurlyBraces t = "{" <> t <> "}"
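`pPrintValueC` renders a constant value back into GraphQL literal syntax. A base-only sketch of the same shape, with a toy constant type (`VC`, `pp` are illustrative names, not from the codebase):

```haskell
import Data.List (intercalate)

-- Toy stand-in for G.ValueConst.
data VC = CInt Int | CBool Bool | CList [VC] | CObj [(String, VC)]

-- Mirrors pPrintValueC: GraphQL literal syntax, comma-separated
-- elements in square braces, "name: value" fields in curly braces.
pp :: VC -> String
pp (CInt i)   = show i
pp (CBool b)  = if b then "true" else "false"
pp (CList vs) = "[" ++ intercalate ", " (map pp vs) ++ "]"
pp (CObj fs)  = "{" ++ intercalate ", " (map (\(k, v) -> k ++ ": " ++ pp v) fs) ++ "}"

main :: IO ()
main = putStrLn (pp (CObj [("limit", CInt 10), ("flags", CList [CBool True, CBool False])]))
-- prints {limit: 10, flags: [true, false]}
```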

toJValueC :: G.ValueConst -> J.Value
toJValueC = \case
  G.VCInt i -> J.toJSON i
  G.VCFloat f -> J.toJSON f
  G.VCString (G.StringValue t) -> J.toJSON t
  G.VCBoolean b -> J.toJSON b
  G.VCNull -> J.Null
  G.VCEnum (G.EnumValue n) -> J.toJSON n
  G.VCList (G.ListValueG vals) ->
    J.toJSON $ map toJValueC vals
  G.VCObject (G.ObjectValueG objs) ->
    J.toJSON . OMap.fromList $ map toTup objs
  where
    toTup (G.ObjectFieldG f v) = (f, toJValueC v)

constValueParser :: (MonadError QErr m) => InputValueParser G.ValueConst m
constValueParser =
  InputValueParser pScalar pList pObject pEnum
  where
    pEnum (G.VCEnum e) = pVal e
    pEnum G.VCNull     = pNull
    pEnum _            = throwVE "expecting an enum"

    pList (G.VCList lv) = pVal $ G.unListValue lv
    pList G.VCNull      = pNull
    pList v             = pVal [v]

    pObject (G.VCObject ov) = pVal
      [(G._ofName oFld, G._ofValue oFld) | oFld <- G.unObjectValue ov]
    pObject G.VCNull        = pNull
    pObject _               = throwVE "expecting an object"

    -- scalar json
    pScalar G.VCNull        = pNull
    pScalar (G.VCInt v)     = pVal $ J.Number $ fromIntegral v
    pScalar (G.VCFloat v)   = pVal $ J.Number v
    pScalar (G.VCBoolean b) = pVal $ J.Bool b
    pScalar (G.VCString sv) = pVal $ J.String $ G.unStringValue sv
    pScalar (G.VCEnum _)    = throwVE "unexpected enum for a scalar"
    pScalar v               = pVal $ toJValueC v

validateObject
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m
     )
  => InputValueParser a m
  -> InpObjTyInfo -> [(G.Name, a)] -> m AnnGObject
validateObject valParser tyInfo flds = do

  -- check duplicates
  unless (null dups) $
    throwVE $ "when parsing a value of type: " <> showNamedTy (_iotiName tyInfo)
    <> ", the following fields are duplicated: "
    <> showNames dups

  -- make default values object
  defValObj <- fmap (OMap.fromList . catMaybes) $
    forM (Map.toList $ _iotiFields tyInfo) $
      \(fldName, inpValInfo) -> do
        let ty = _iviType inpValInfo
            isNotNull = G.isNotNull ty
            defValM = _iviDefVal inpValInfo
            hasDefVal = isJust defValM
            fldPresent = fldName `elem` inpFldNames

        when (not fldPresent && isNotNull && not hasDefVal) $
          throwVE $ "field " <> G.unName fldName <> " of type "
          <> G.showGT ty <> " is required, but not found"

        convDefValM <- validateInputValue constValueParser ty `mapM` defValM
        return $ (fldName,) <$> convDefValM

  -- compute input values object
  inpValObj <- fmap OMap.fromList $ forM flds $ \(fldName, fldVal) ->
    withPathK (G.unName fldName) $ do
      fldTy <- getInpFieldInfo tyInfo fldName
      convFldVal <- validateInputValue valParser fldTy fldVal
      return (fldName, convFldVal)

  return $ inpValObj `OMap.union` defValObj

  where
    inpFldNames = map fst flds
    dups = duplicates inpFldNames

validateNamedTypeVal
  :: ( MonadReader r m, Has TypeMap r
     , MonadError QErr m)
  => InputValueParser a m
  -> (G.Nullability, G.NamedType) -> a -> m AnnInpVal
validateNamedTypeVal inpValParser (nullability, nt) val = do
  tyInfo <- getTyInfo nt
  case tyInfo of
    -- this should never happen
    TIObj _ ->
      throwUnexpTypeErr "object"
    TIIFace _ ->
      throwUnexpTypeErr "interface"
    TIUnion _ ->
      throwUnexpTypeErr "union"
    TIInpObj ioti ->
      withParsed gType (getObject inpValParser) val $
      fmap (AGObject nt) . mapM (validateObject inpValParser ioti)
    TIEnum eti ->
      withParsed gType (getEnum inpValParser) val $
      fmap (AGEnum nt) . validateEnum eti
    TIScalar (ScalarTyInfo _ _ pgColTy _) ->
      withParsed gType (getScalar inpValParser) val $
      fmap (AGScalar pgColTy) . mapM (validateScalar pgColTy)
  where
    throwUnexpTypeErr ty = throw500 $ "unexpected " <> ty <> " type info for: "
      <> showNamedTy nt

    validateEnum enumTyInfo maybeEnumValue = case (_etiValues enumTyInfo, maybeEnumValue) of
      (EnumValuesSynthetic _, Nothing) -> pure $ AGESynthetic Nothing
      (EnumValuesReference reference, Nothing) -> pure $ AGEReference reference Nothing
      (EnumValuesSynthetic values, Just enumValue)
        | Map.member enumValue values -> pure $ AGESynthetic (Just enumValue)
      (EnumValuesReference reference@(EnumReference _ values), Just enumValue)
        | rqlEnumValue <- RQL.EnumValue . G.unName $ G.unEnumValue enumValue
        , Map.member rqlEnumValue values
        -> pure $ AGEReference reference (Just rqlEnumValue)
      (_, Just enumValue) -> throwVE $
        "unexpected value " <> showName (G.unEnumValue enumValue) <> " for enum: " <> showNamedTy nt

    validateScalar pgColTy = runAesonParser (parsePGValue pgColTy)
    gType = G.TypeNamed nullability nt

validateList
  :: (MonadError QErr m, MonadReader r m, Has TypeMap r)
  => InputValueParser a m
  -> (G.Nullability, G.ListType)
  -> a
  -> m AnnInpVal
validateList inpValParser (nullability, listTy) val =
  withParsed ty (getList inpValParser) val $ \lM -> do
    let baseTy = G.unListType listTy
    AGArray listTy <$>
      mapM (indexedMapM (validateInputValue inpValParser baseTy)) lM
  where
    ty = G.TypeList nullability listTy

validateInputValue
  :: (MonadError QErr m, MonadReader r m, Has TypeMap r)
  => InputValueParser a m
  -> G.GType
  -> a
  -> m AnnInpVal
validateInputValue inpValParser ty val =
  case ty of
    G.TypeNamed nullability nt ->
      validateNamedTypeVal inpValParser (nullability, nt) val
    G.TypeList nullability lt ->
      validateList inpValParser (nullability, lt) val

withParsed
  :: (Monad m, MonadError QErr m)
  => G.GType
  -> (val -> m (P specificVal))
  -> val
  -> (Maybe specificVal -> m AnnGValue)
  -> m AnnInpVal
withParsed expectedTy valParser val fn = do
  parsedVal <- valParser val
  case unP parsedVal of
    Nothing ->
      if G.isNullable expectedTy
      then AnnInpVal expectedTy Nothing <$> fn Nothing
      else throwVE $ "null value found for non-nullable type: "
           <> G.showGT expectedTy
    Just (Right v) -> AnnInpVal expectedTy Nothing <$> fn (Just v)
    Just (Left (var, v)) -> do
      let varTxt = G.unName $ G.unVariable var
      unless (isTypeAllowed expectedTy $ _aivType v) $
        throwVE $ "variable " <> varTxt
        <> " of type " <> G.showGT (_aivType v)
        <> " is used in position expecting " <> G.showGT expectedTy
      return $ v { _aivVariable = Just var }
  where
    -- is the type 'ofType' allowed at a position of type 'atType'?
    -- Examples:
    -- . a! is allowed at a
    -- . [a!]! is allowed at [a]
    -- . but 'a' is not allowed at 'a!'
    isTypeAllowed ofType atType =
      case (ofType, atType) of
        (G.TypeNamed ofTyN ofNt, G.TypeNamed atTyN atNt) ->
          checkNullability ofTyN atTyN && (ofNt == atNt)
        (G.TypeList ofTyN ofLt, G.TypeList atTyN atLt) ->
          checkNullability ofTyN atTyN &&
          isTypeAllowed (G.unListType ofLt) (G.unListType atLt)
        _ -> False

    -- only when 'atType' is non-nullable and 'ofType' is nullable
    -- does this check fail
    checkNullability (G.Nullability ofNullable) (G.Nullability atNullable) =
      case (ofNullable, atNullable) of
        (True, _)      -> True
        (False, False) -> True
        (False, True)  -> False
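The `isTypeAllowed`/`checkNullability` pair above encodes the usual GraphQL rule for variable usage: a variable may be used at a position unless the position demands non-null and the variable's type is nullable, applied recursively through list types. A base-only sketch with a toy type (`Ty`'s Bool is a non-null flag; the names are illustrative, not the real AST):

```haskell
-- Toy stand-in for G.GType: the Bool marks a non-null type.
data Ty = Named Bool String | List Bool Ty deriving (Eq, Show)

-- Fails only when the position demands non-null but the variable is nullable.
nullOk :: Bool -> Bool -> Bool
nullOk ofNonNull atNonNull = ofNonNull || not atNonNull

allowed :: Ty -> Ty -> Bool
allowed (Named n1 t1) (Named n2 t2) = nullOk n1 n2 && t1 == t2
allowed (List n1 e1)  (List n2 e2)  = nullOk n1 n2 && allowed e1 e2
allowed _ _ = False

main :: IO ()
main = mapM_ print
  [ allowed (Named True "a") (Named False "a")  -- a!  at a  : True
  , allowed (Named False "a") (Named True "a")  -- a   at a! : False
  , allowed (List True (Named True "a"))
            (List False (Named False "a"))      -- [a!]! at [a] : True
  ]
```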
@ -1,550 +0,0 @@
{-# LANGUAGE RecordWildCards        #-}
{-# LANGUAGE TypeFamilyDependencies #-}
module Hasura.GraphQL.Validate.SelectionSet
  ( ArgsMap
  , Field(..)
  , AliasedFields(..)
  , SelectionSet(..)
  , ObjectSelectionSet(..)
  , traverseObjectSelectionSet
  , InterfaceSelectionSet
  , UnionSelectionSet
  , RootSelectionSet(..)
  , parseObjectSelectionSet
  , asObjectSelectionSet
  , asInterfaceSelectionSet
  , getMemberSelectionSet
  ) where

import           Hasura.Prelude

import qualified Data.HashMap.Strict                 as Map
import qualified Data.HashMap.Strict.InsOrd.Extended as OMap
import qualified Data.HashSet                        as Set
import qualified Data.List                           as L
import qualified Data.Sequence.NonEmpty              as NE
import qualified Data.Text                           as T
import qualified Language.GraphQL.Draft.Syntax       as G

import           Hasura.GraphQL.NormalForm
import           Hasura.GraphQL.Validate.Context
import           Hasura.GraphQL.Validate.InputValue
import           Hasura.GraphQL.Validate.Types
import           Hasura.RQL.Types
import           Hasura.SQL.Value

class HasSelectionSet a where

  getTypename :: a -> G.NamedType
  getMemberTypes :: a -> Set.HashSet G.NamedType

  fieldToSelectionSet
    :: G.Alias -> NormalizedField a -> NormalizedSelectionSet a

  parseField_
    :: ( MonadReader ValidationCtx m
       , MonadError QErr m
       , MonadReusability m
       , MonadState [G.Name] m
       )
    => a
    -> G.Field
    -> m (Maybe (NormalizedField a))

  mergeNormalizedSelectionSets
    :: ( MonadReader ValidationCtx m
       , MonadError QErr m
       , MonadReusability m
       )
    => [NormalizedSelectionSet a]
    -> m (NormalizedSelectionSet a)

  fromObjectSelectionSet
    :: G.NamedType
    -- ^ parent typename
    -> G.NamedType
    -- ^ fragment typename
    -> Set.HashSet G.NamedType
    -- ^ common types
    -> NormalizedSelectionSet ObjTyInfo
    -> NormalizedSelectionSet a

  fromInterfaceSelectionSet
    :: G.NamedType
    -- ^ parent typename
    -> G.NamedType
    -- ^ fragment typename
    -> Set.HashSet G.NamedType
    -- ^ common types
    -> NormalizedSelectionSet IFaceTyInfo
    -> NormalizedSelectionSet a

  fromUnionSelectionSet
    :: G.NamedType
    -- ^ parent typename
    -> G.NamedType
    -- ^ fragment typename
    -> Set.HashSet G.NamedType
    -- ^ common types
    -> NormalizedSelectionSet UnionTyInfo
    -> NormalizedSelectionSet a

parseObjectSelectionSet
  :: ( MonadError QErr m
     , MonadReusability m
     )
  => ValidationCtx
  -> ObjTyInfo
  -> G.SelectionSet
  -> m ObjectSelectionSet
parseObjectSelectionSet validationCtx objectTypeInfo selectionSet =
  flip evalStateT [] $ flip runReaderT validationCtx $
    parseSelectionSet objectTypeInfo selectionSet

selectionToSelectionSet
  :: HasSelectionSet a
  => NormalizedSelection a -> NormalizedSelectionSet a
selectionToSelectionSet = \case
  SelectionField alias fld -> fieldToSelectionSet alias fld
  SelectionInlineFragmentSpread selectionSet -> selectionSet
  SelectionFragmentSpread _ selectionSet -> selectionSet

parseSelectionSet
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     , MonadReusability m
     , HasSelectionSet a
     , MonadState [G.Name] m
     )
  => a
  -> G.SelectionSet
  -> m (NormalizedSelectionSet a)
parseSelectionSet fieldTypeInfo selectionSet = do
  visitedFragments <- get
  withPathK "selectionSet" $ do
    -- The visited fragments state shouldn't accumulate over a selection set.
    normalizedSelections <-
      catMaybes <$> mapM (parseSelection visitedFragments fieldTypeInfo) selectionSet
    mergeNormalizedSelections normalizedSelections
  where
    mergeNormalizedSelections = mergeNormalizedSelectionSets . map selectionToSelectionSet

-- | While interfaces and objects have fields, unions do not, so this is a
-- specialized function for each object type
parseSelection
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     , MonadReusability m
     , HasSelectionSet a
     )
  => [G.Name]
  -> a -- parent type info
  -> G.Selection
  -> m (Maybe (NormalizedSelection a))
parseSelection visitedFragments parentTypeInfo =
  flip evalStateT visitedFragments . \case
    G.SelectionField fld -> withPathK (G.unName $ G._fName fld) $ do
      let fieldName = G._fName fld
          fieldAlias = fromMaybe (G.Alias fieldName) $ G._fAlias fld
      fmap (SelectionField fieldAlias) <$> parseField_ parentTypeInfo fld
    G.SelectionFragmentSpread (G.FragmentSpread name directives) -> do
      FragDef _ fragmentTyInfo fragmentSelectionSet <- getFragmentInfo name
      withPathK (G.unName name) $
        fmap (SelectionFragmentSpread name) <$>
        parseFragment parentTypeInfo fragmentTyInfo directives fragmentSelectionSet
    G.SelectionInlineFragment G.InlineFragment{..} -> do
      let fragmentType = fromMaybe (getTypename parentTypeInfo) _ifTypeCondition
      fragmentTyInfo <- getFragmentTyInfo fragmentType
      withPathK "inlineFragment" $ fmap SelectionInlineFragmentSpread <$>
        parseFragment parentTypeInfo fragmentTyInfo _ifDirectives _ifSelectionSet

parseFragment
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     , MonadReusability m
     , MonadState [G.Name] m
     , HasSelectionSet a
     )
  => a
  -> FragmentTypeInfo
  -> [G.Directive]
  -> G.SelectionSet
  -> m (Maybe (NormalizedSelectionSet a))
parseFragment parentTyInfo fragmentTyInfo directives fragmentSelectionSet = do
  commonTypes <- validateSpread
  case fragmentTyInfo of
    FragmentTyObject objTyInfo ->
      withDirectives directives $
      fmap (fromObjectSelectionSet parentType fragmentType commonTypes) $
      parseSelectionSet objTyInfo fragmentSelectionSet
    FragmentTyInterface interfaceTyInfo ->
      withDirectives directives $
      fmap (fromInterfaceSelectionSet parentType fragmentType commonTypes) $
      parseSelectionSet interfaceTyInfo fragmentSelectionSet
    FragmentTyUnion unionTyInfo ->
      withDirectives directives $
      fmap (fromUnionSelectionSet parentType fragmentType commonTypes) $
      parseSelectionSet unionTyInfo fragmentSelectionSet
  where
    validateSpread = do
      let commonTypes = parentTypeMembers `Set.intersection` fragmentTypeMembers
      if null commonTypes then
        -- TODO: better error location by capturing the fragment source -
        -- named or otherwise
        -- throwVE $ "cannot spread fragment " <> showName name <> " defined on " <>
        throwVE $ "cannot spread fragment defined on " <> showNamedTy fragmentType
        <> " when selecting fields of type " <> showNamedTy parentType
      else pure commonTypes

    parentType = getTypename parentTyInfo
    parentTypeMembers = getMemberTypes parentTyInfo

    fragmentType = case fragmentTyInfo of
      FragmentTyObject tyInfo    -> getTypename tyInfo
      FragmentTyInterface tyInfo -> getTypename tyInfo
      FragmentTyUnion tyInfo     -> getTypename tyInfo
    fragmentTypeMembers = case fragmentTyInfo of
      FragmentTyObject tyInfo    -> getMemberTypes tyInfo
      FragmentTyInterface tyInfo -> getMemberTypes tyInfo
      FragmentTyUnion tyInfo     -> getMemberTypes tyInfo

class IsField f => MergeableField f where

  checkFieldMergeability
    :: (MonadError QErr m) => G.Alias -> NE.NESeq f -> m f

instance MergeableField Field where

  checkFieldMergeability alias fields = do
    let groupedFlds = toList $ NE.toSeq fields
        fldNames = L.nub $ map getFieldName groupedFlds
        args = L.nub $ map getFieldArguments groupedFlds
    when (length fldNames > 1) $
      throwVE $ "cannot merge different fields under the same alias ("
      <> showName (G.unAlias alias) <> "): "
      <> showNames fldNames
    when (length args > 1) $
      throwVE $ "cannot merge fields with different arguments"
      <> " under the same alias: "
      <> showName (G.unAlias alias)
    let fld = NE.head fields
    mergedGroupSelectionSet <- mergeSelectionSets $ fmap _fSelSet fields
    return $ fld { _fSelSet = mergedGroupSelectionSet }
|
||||
|
||||
instance MergeableField Typename where
|
||||
|
||||
checkFieldMergeability _ fields = pure $ NE.head fields
|
||||
|
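-- An illustrative (hypothetical) query for the rules enforced above: the
-- two selections below reuse the alias "a" for different fields, so
-- 'checkFieldMergeability' rejects the document rather than merging them:
--
--   {
--     a: author { id }
--     a: article { id }
--   }
--
-- Two selections of the *same* field under one alias, by contrast, are
-- merged into a single field whose sub-selection sets are combined.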
parseArguments
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     )
  => ParamMap
  -> [G.Argument]
  -> m ArgsMap
parseArguments fldParams argsL = do

  args <- onLeft (mkMapWith G._aName argsL) $ \dups ->
    throwVE $ "the following arguments are defined more than once: " <>
    showNames dups

  let requiredParams = Map.filter (G.isNotNull . _iviType) fldParams

  inpArgs <- forM args $ \(G.Argument argName argVal) ->
    withPathK (G.unName argName) $ do
      argTy <- getArgTy argName
      validateInputValue valueParser argTy argVal

  forM_ requiredParams $ \argDef -> do
    let param = _iviName argDef
    onNothing (Map.lookup param inpArgs) $ throwVE $ mconcat
      [ "the required argument ", showName param, " is missing"]

  return inpArgs

  where
    getArgTy argName =
      onNothing (_iviType <$> Map.lookup argName fldParams) $ throwVE $
      "no such argument " <> showName argName <> " is expected"

mergeFields
  :: ( MonadError QErr m
     , MergeableField f
     )
  => [AliasedFields f]
  -> m (AliasedFields f)
mergeFields flds =
  AliasedFields <$> OMap.traverseWithKey checkFieldMergeability groups
  where
    groups = foldr (OMap.unionWith (<>)) mempty $
             map (fmap NE.init . unAliasedFields) flds

appendSelectionSets
  :: (MonadError QErr m) => SelectionSet -> SelectionSet -> m SelectionSet
appendSelectionSets = curry $ \case
  (SelectionSetObject s1, SelectionSetObject s2) ->
    SelectionSetObject <$> mergeObjectSelectionSets [s1, s2]
  (SelectionSetInterface s1, SelectionSetInterface s2) ->
    SelectionSetInterface <$> appendScopedSelectionSet s1 s2
  (SelectionSetUnion s1, SelectionSetUnion s2) ->
    SelectionSetUnion <$> appendScopedSelectionSet s1 s2
  (SelectionSetNone, SelectionSetNone) -> pure SelectionSetNone
  (_, _) -> throw500 $ "appendSelectionSets: 'same kind' assertion failed"


-- query q {
--   author {
--     id
--   }
--   author {
--     name
--   }
-- }
--
-- | When we are merging two selection sets down two different trees, they
-- should be of the same type; however, as this is not enforced in the type
-- system, an internal error is thrown when the assumption is violated.
mergeSelectionSets
  :: (MonadError QErr m) => NE.NESeq SelectionSet -> m SelectionSet
mergeSelectionSets selectionSets =
  foldM appendSelectionSets (NE.head selectionSets) $ NE.tail selectionSets

mergeObjectSelectionSets
  :: (MonadError QErr m) => [ObjectSelectionSet] -> m ObjectSelectionSet
mergeObjectSelectionSets =
  fmap ObjectSelectionSet . mergeFields . map unObjectSelectionSet

mergeObjectSelectionSetMaps
  :: (MonadError QErr m) => [ObjectSelectionSetMap] -> m ObjectSelectionSetMap
mergeObjectSelectionSetMaps selectionSetMaps =
  traverse mergeObjectSelectionSets $
    foldr (Map.unionWith (<>)) mempty $ map (fmap (:[])) selectionSetMaps

appendScopedSelectionSet
  :: (MonadError QErr m, MergeableField f)
  => ScopedSelectionSet f -> ScopedSelectionSet f -> m (ScopedSelectionSet f)
appendScopedSelectionSet s1 s2 =
  ScopedSelectionSet
    <$> mergeFields [_sssBaseSelectionSet s1, _sssBaseSelectionSet s2]
    <*> mergeObjectSelectionSetMaps [s1MembersUnified, s2MembersUnified]

  where
    s1Base = fmap toField $ _sssBaseSelectionSet s1
    s2Base = fmap toField $ _sssBaseSelectionSet s2

    s1MembersUnified =
      (_sssMemberSelectionSets s1)
      <> fmap (const (ObjectSelectionSet s1Base)) (_sssMemberSelectionSets s2)

    s2MembersUnified =
      (_sssMemberSelectionSets s2)
      <> fmap (const (ObjectSelectionSet s2Base)) (_sssMemberSelectionSets s1)

mergeScopedSelectionSets
  :: (MonadError QErr m, MergeableField f)
  => [ScopedSelectionSet f] -> m (ScopedSelectionSet f)
mergeScopedSelectionSets selectionSets =
  foldM appendScopedSelectionSet emptyScopedSelectionSet selectionSets

withDirectives
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     , MonadReusability m
     )
  => [G.Directive]
  -> m a
  -> m (Maybe a)
withDirectives dirs act = do
  procDirs <- withPathK "directives" $ do
    dirDefs <- onLeft (mkMapWith G._dName dirs) $ \dups ->
      throwVE $ "the following directives are used more than once: " <>
      showNames dups

    flip Map.traverseWithKey dirDefs $ \name dir ->
      withPathK (G.unName name) $ do
        dirInfo <- onNothing (Map.lookup (G._dName dir) defDirectivesMap) $
                   throwVE $ "unexpected directive: " <> showName name
        procArgs <- withPathK "args" $ parseArguments (_diParams dirInfo)
                    (G._dArguments dir)
        getIfArg procArgs

  let shouldSkip = fromMaybe False $ Map.lookup "skip" procDirs
      shouldInclude = fromMaybe True $ Map.lookup "include" procDirs

  if not shouldSkip && shouldInclude
    then Just <$> act
    else return Nothing

  where
    getIfArg m = do
      val <- onNothing (Map.lookup "if" m) $ throw500
             "missing if argument in the directive"
      when (isJust $ _aivVariable val) markNotReusable
      case _aivValue val of
        AGScalar _ (Just (PGValBoolean v)) -> return v
        _ -> throw500 "did not find boolean scalar for if argument"

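-- To illustrate 'withDirectives' (hypothetical selections, not tied to any
-- particular schema): a field survives only when it is not skipped *and* is
-- included, so
--
--   author @skip(if: true)                      -- dropped
--   author @include(if: false)                  -- dropped
--   author @skip(if: false) @include(if: true)  -- kept
--
-- and a field with no directives defaults to shouldSkip = False,
-- shouldInclude = True, i.e. it is kept.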
getFragmentInfo
  :: (MonadReader ValidationCtx m, MonadError QErr m, MonadState [G.Name] m)
  => G.Name
  -- ^ fragment name
  -> m FragDef
getFragmentInfo name = do
  -- check for cycles
  visitedFragments <- get
  if name `elem` visitedFragments
    then throwVE $ "cannot spread fragment " <> showName name
         <> " within itself via "
         <> T.intercalate "," (map G.unName visitedFragments)
    else put $ name:visitedFragments
  fragInfo <- Map.lookup name <$> asks _vcFragDefMap
  onNothing fragInfo $ throwVE $ "fragment '" <> G.unName name <> "' not found"

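-- As an example of the cycle check (hypothetical fragments): spreading a
-- fragment within itself, directly or via another fragment, is rejected:
--
--   fragment authorFields on author { ...articleFields }
--   fragment articleFields on article { ...authorFields }
--
-- Visiting authorFields a second time finds it in the visited list and
-- raises the "cannot spread fragment ... within itself" error above.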
denormalizeField
  :: ( MonadReader ValidationCtx m
     , MonadError QErr m
     , MonadReusability m
     , MonadState [G.Name] m
     )
  => ObjFldInfo
  -> G.Field
  -> m (Maybe Field)
denormalizeField fldInfo (G.Field _ name args dirs selSet) = do

  let fldTy = _fiTy fldInfo
      fldBaseTy = getBaseTy fldTy

  fldTyInfo <- getTyInfo fldBaseTy

  argMap <- withPathK "args" $ parseArguments (_fiParams fldInfo) args

  fields <- case (fldTyInfo, selSet) of

    (TIObj _, []) ->
      throwVE $ "field " <> showName name <> " of type "
      <> G.showGT fldTy <> " must have a selection of subfields"

    (TIObj objTyInfo, _) ->
      SelectionSetObject <$> parseSelectionSet objTyInfo selSet

    (TIIFace _, []) ->
      throwVE $ "field " <> showName name <> " of type "
      <> G.showGT fldTy <> " must have a selection of subfields"

    (TIIFace interfaceTyInfo, _) ->
      SelectionSetInterface <$> parseSelectionSet interfaceTyInfo selSet

    (TIUnion _, []) ->
      throwVE $ "field " <> showName name <> " of type "
      <> G.showGT fldTy <> " must have a selection of subfields"

    (TIUnion unionTyInfo, _) ->
      SelectionSetUnion <$> parseSelectionSet unionTyInfo selSet

    (TIScalar _, []) -> return SelectionSetNone
    -- a scalar/enum field with a non-empty selection set is an error
    (TIScalar _, _) ->
      throwVE $ "field " <> showName name <> " must not have a "
      <> "selection since type " <> G.showGT fldTy <> " has no subfields"

    (TIEnum _, []) -> return SelectionSetNone
    (TIEnum _, _) ->
      throwVE $ "field " <> showName name <> " must not have a "
      <> "selection since type " <> G.showGT fldTy <> " has no subfields"

    (TIInpObj _, _) ->
      throwVE $ "internal error: unexpected input type for field: "
      <> showName name

  withDirectives dirs $ pure $ Field name fldBaseTy argMap fields

type instance NormalizedSelectionSet ObjTyInfo = ObjectSelectionSet
type instance NormalizedField ObjTyInfo = Field

instance HasSelectionSet ObjTyInfo where

  getTypename = _otiName
  getMemberTypes = Set.singleton . _otiName

  parseField_ objTyInfo field = do
    fieldInfo <- getFieldInfo (_otiName objTyInfo) (_otiFields objTyInfo) $ G._fName field
    denormalizeField fieldInfo field

  fieldToSelectionSet alias fld =
    ObjectSelectionSet $ AliasedFields $ OMap.singleton alias fld

  mergeNormalizedSelectionSets = mergeObjectSelectionSets

  fromObjectSelectionSet _ _ _ objectSelectionSet =
    objectSelectionSet

  fromInterfaceSelectionSet parentType _ _ interfaceSelectionSet =
    getMemberSelectionSet parentType interfaceSelectionSet

  fromUnionSelectionSet parentType _ _ unionSelectionSet =
    getMemberSelectionSet parentType unionSelectionSet

type instance NormalizedSelectionSet IFaceTyInfo = InterfaceSelectionSet
type instance NormalizedField IFaceTyInfo = Field

instance HasSelectionSet IFaceTyInfo where

  getTypename = _ifName
  getMemberTypes = _ifMemberTypes

  parseField_ interfaceTyInfo field = do
    fieldInfo <- getFieldInfo (_ifName interfaceTyInfo) (_ifFields interfaceTyInfo)
                 $ G._fName field
    denormalizeField fieldInfo field

  fieldToSelectionSet alias field =
    ScopedSelectionSet (AliasedFields $ OMap.singleton alias field) mempty

  mergeNormalizedSelectionSets = mergeScopedSelectionSets

  fromObjectSelectionSet _ fragmentType _ objectSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.singleton fragmentType objectSelectionSet

  fromInterfaceSelectionSet _ _ commonTypes interfaceSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.fromList $ flip map (toList commonTypes) $
    \commonType -> (commonType, getMemberSelectionSet commonType interfaceSelectionSet)

  fromUnionSelectionSet _ _ commonTypes unionSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.fromList $ flip map (toList commonTypes) $
    \commonType -> (commonType, getMemberSelectionSet commonType unionSelectionSet)

type instance NormalizedSelectionSet UnionTyInfo = UnionSelectionSet
type instance NormalizedField UnionTyInfo = Typename

instance HasSelectionSet UnionTyInfo where

  getTypename = _utiName
  getMemberTypes = _utiMemberTypes

  parseField_ unionTyInfo field = do
    let fieldMap = Map.singleton (_fiName typenameFld) typenameFld
    fieldInfo <- getFieldInfo (_utiName unionTyInfo) fieldMap $ G._fName field
    fmap (const Typename) <$> denormalizeField fieldInfo field

  fieldToSelectionSet alias field =
    ScopedSelectionSet (AliasedFields $ OMap.singleton alias field) mempty

  mergeNormalizedSelectionSets = mergeScopedSelectionSets

  fromObjectSelectionSet _ fragmentType _ objectSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.singleton fragmentType objectSelectionSet

  fromInterfaceSelectionSet _ _ commonTypes interfaceSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.fromList $ flip map (toList commonTypes) $
    \commonType -> (commonType, getMemberSelectionSet commonType interfaceSelectionSet)

  fromUnionSelectionSet _ _ commonTypes unionSelectionSet =
    ScopedSelectionSet (AliasedFields mempty) $
    Map.fromList $ flip map (toList commonTypes) $
    \commonType -> (commonType, getMemberSelectionSet commonType unionSelectionSet)
@@ -1,819 +0,0 @@
{-# LANGUAGE GADTs #-}
module Hasura.GraphQL.Validate.Types
  ( InpValInfo(..)
  , ParamMap

  , typenameFld
  , ObjFldInfo(..)
  , mkHsraObjFldInfo
  , ObjFieldMap

  -- Don't expose 'ObjTyInfo' constructor. Instead use 'mkObjTyInfo' or 'mkHsraObjTyInfo',
  -- which will auto-insert the compulsory '__typename' field.
  , ObjTyInfo
  , _otiDesc
  , _otiName
  , _otiImplIFaces
  , _otiFields
  , mkObjTyInfo
  , mkHsraObjTyInfo

  -- Don't expose 'IFaceTyInfo' constructor. Instead use 'mkIFaceTyInfo',
  -- which will auto-insert the compulsory '__typename' field.
  , IFaceTyInfo
  , _ifDesc
  , _ifName
  , _ifFields
  , _ifMemberTypes
  , mkIFaceTyInfo

  , IFacesSet
  , UnionTyInfo(..)
  , FragDef(..)
  , FragmentTypeInfo(..)
  , FragDefMap
  , AnnVarVals
  , AnnInpVal(..)

  , EnumTyInfo(..)
  , mkHsraEnumTyInfo

  , EnumValuesInfo(..)
  , normalizeEnumValues
  , EnumValInfo(..)
  , InpObjFldMap
  , InpObjTyInfo(..)
  , mkHsraInpTyInfo

  , ScalarTyInfo(..)
  , fromScalarTyDef
  , mkHsraScalarTyInfo

  , DirectiveInfo(..)
  , AsObjType(..)
  , defaultDirectives
  , defDirectivesMap
  , defaultSchema
  , TypeInfo(..)
  , isObjTy
  , isIFaceTy
  , getPossibleObjTypes
  , getObjTyM
  , getUnionTyM
  , mkScalarTy
  , pgColTyToScalar
  , getNamedTy
  , mkTyInfoMap
  , fromTyDef
  , fromSchemaDoc
  , fromSchemaDocQ
  , TypeMap
  , TypeLoc (..)
  , typeEq
  , AnnGValue(..)
  , AnnGEnumValue(..)
  , AnnGObject
  , hasNullVal
  , getAnnInpValKind
  , stripTypenames

  , ReusableVariableTypes(..)
  , ReusableVariableValues

  , QueryReusability(..)
  , _Reusable
  , _NotReusable
  , MonadReusability(..)
  , ReusabilityT
  , runReusabilityT
  , runReusabilityTWith
  , evalReusabilityT

  , module Hasura.GraphQL.Utils
  ) where

import           Hasura.Prelude

import qualified Data.Aeson                    as J
import qualified Data.Aeson.Casing             as J
import qualified Data.Aeson.TH                 as J
import qualified Data.HashMap.Strict           as Map
import qualified Data.HashSet                  as Set
import qualified Data.Text                     as T
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Language.GraphQL.Draft.TH     as G
import qualified Language.Haskell.TH.Syntax    as TH

import           Control.Lens                  (makePrisms)

import qualified Hasura.RQL.Types.Column       as RQL
import qualified Hasura.Tracing                as Tracing

import           Hasura.GraphQL.NormalForm
import           Hasura.GraphQL.Utils
import           Hasura.RQL.Instances          ()
import           Hasura.RQL.Types.Common
import           Hasura.RQL.Types.RemoteSchema (RemoteSchemaInfo, RemoteSchemaName)
import           Hasura.SQL.Types
import           Hasura.SQL.Value

typeEq :: (EquatableGType a, Eq (EqProps a)) => a -> a -> Bool
typeEq a b = getEqProps a == getEqProps b

data EnumValInfo
  = EnumValInfo
  { _eviDesc         :: !(Maybe G.Description)
  , _eviVal          :: !G.EnumValue
  , _eviIsDeprecated :: !Bool
  } deriving (Show, Eq, TH.Lift)

fromEnumValDef :: G.EnumValueDefinition -> EnumValInfo
fromEnumValDef (G.EnumValueDefinition descM val _) =
  EnumValInfo descM val False

data EnumValuesInfo
  = EnumValuesSynthetic !(Map.HashMap G.EnumValue EnumValInfo)
  -- ^ Values for an enum that exists only in the GraphQL schema and does not
  -- have any external source of truth.
  | EnumValuesReference !RQL.EnumReference
  -- ^ Values for an enum that is backed by an enum table reference (see
  -- "Hasura.RQL.Schema.Enum").
  deriving (Show, Eq, TH.Lift)

normalizeEnumValues :: EnumValuesInfo -> Map.HashMap G.EnumValue EnumValInfo
normalizeEnumValues = \case
  EnumValuesSynthetic values -> values
  EnumValuesReference (RQL.EnumReference _ values) ->
    mapFromL _eviVal . flip map (Map.toList values) $
      \(RQL.EnumValue name, RQL.EnumValueInfo maybeDescription) -> EnumValInfo
        { _eviVal = G.EnumValue $ G.Name name
        , _eviDesc = G.Description <$> maybeDescription
        , _eviIsDeprecated = False }

data EnumTyInfo
  = EnumTyInfo
  { _etiDesc   :: !(Maybe G.Description)
  , _etiName   :: !G.NamedType
  , _etiValues :: !EnumValuesInfo
  , _etiLoc    :: !TypeLoc
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType EnumTyInfo where
  type EqProps EnumTyInfo = (G.NamedType, Map.HashMap G.EnumValue EnumValInfo)
  getEqProps ety = (,) (_etiName ety) (normalizeEnumValues $ _etiValues ety)

fromEnumTyDef :: G.EnumTypeDefinition -> TypeLoc -> EnumTyInfo
fromEnumTyDef (G.EnumTypeDefinition descM n _ valDefs) loc =
  EnumTyInfo descM (G.NamedType n) (EnumValuesSynthetic enumVals) loc
  where
    enumVals = Map.fromList
      [(G._evdName valDef, fromEnumValDef valDef) | valDef <- valDefs]

mkHsraEnumTyInfo
  :: Maybe G.Description
  -> G.NamedType
  -> EnumValuesInfo
  -> EnumTyInfo
mkHsraEnumTyInfo descM ty enumVals =
  EnumTyInfo descM ty enumVals TLHasuraType

fromInpValDef :: G.InputValueDefinition -> InpValInfo
fromInpValDef (G.InputValueDefinition descM n ty defM) =
  InpValInfo descM n defM ty

type ParamMap = Map.HashMap G.Name InpValInfo

-- | location of the type: a hasura type or a remote type
data TypeLoc
  = TLHasuraType
  | TLRemoteType !RemoteSchemaName !RemoteSchemaInfo
  | TLCustom
  deriving (Show, Eq, TH.Lift, Generic)

$(J.deriveJSON
  J.defaultOptions { J.constructorTagModifier = J.snakeCase . drop 2
                   , J.sumEncoding = J.TaggedObject "type" "detail"
                   }
  ''TypeLoc)

instance Hashable TypeLoc

data ObjFldInfo
  = ObjFldInfo
  { _fiDesc   :: !(Maybe G.Description)
  , _fiName   :: !G.Name
  , _fiParams :: !ParamMap
  , _fiTy     :: !G.GType
  , _fiLoc    :: !TypeLoc
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType ObjFldInfo where
  type EqProps ObjFldInfo = (G.Name, G.GType, ParamMap)
  getEqProps o = (,,) (_fiName o) (_fiTy o) (_fiParams o)

fromFldDef :: G.FieldDefinition -> TypeLoc -> ObjFldInfo
fromFldDef (G.FieldDefinition descM n args ty _) loc =
  ObjFldInfo descM n params ty loc
  where
    params = Map.fromList [(G._ivdName arg, fromInpValDef arg) | arg <- args]

mkHsraObjFldInfo
  :: Maybe G.Description
  -> G.Name
  -> ParamMap
  -> G.GType
  -> ObjFldInfo
mkHsraObjFldInfo descM name params ty =
  ObjFldInfo descM name params ty TLHasuraType

type ObjFieldMap = Map.HashMap G.Name ObjFldInfo

type IFacesSet = Set.HashSet G.NamedType

data ObjTyInfo
  = ObjTyInfo
  { _otiDesc       :: !(Maybe G.Description)
  , _otiName       :: !G.NamedType
  , _otiImplIFaces :: !IFacesSet
  , _otiFields     :: !ObjFieldMap
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType ObjTyInfo where
  type EqProps ObjTyInfo =
    (G.NamedType, Set.HashSet G.NamedType, Map.HashMap G.Name (G.Name, G.GType, ParamMap))
  getEqProps a = (,,) (_otiName a) (_otiImplIFaces a) (Map.map getEqProps (_otiFields a))

instance Monoid ObjTyInfo where
  mempty = ObjTyInfo Nothing (G.NamedType "") Set.empty Map.empty

instance Semigroup ObjTyInfo where
  objA <> objB =
    objA { _otiFields = Map.union (_otiFields objA) (_otiFields objB)
         , _otiImplIFaces = _otiImplIFaces objA `Set.union` _otiImplIFaces objB
         }

mkObjTyInfo
  :: Maybe G.Description -> G.NamedType
  -> IFacesSet -> ObjFieldMap -> TypeLoc -> ObjTyInfo
mkObjTyInfo descM ty iFaces flds _ =
  ObjTyInfo descM ty iFaces $ Map.insert (_fiName newFld) newFld flds
  where newFld = typenameFld

mkHsraObjTyInfo
  :: Maybe G.Description
  -> G.NamedType
  -> IFacesSet
  -> ObjFieldMap
  -> ObjTyInfo
mkHsraObjTyInfo descM ty implIFaces flds =
  mkObjTyInfo descM ty implIFaces flds TLHasuraType

mkIFaceTyInfo
  :: Maybe G.Description -> G.NamedType
  -> Map.HashMap G.Name ObjFldInfo -> MemberTypes -> IFaceTyInfo
mkIFaceTyInfo descM ty flds =
  IFaceTyInfo descM ty $ Map.insert (_fiName newFld) newFld flds
  where
    newFld = typenameFld

typenameFld :: ObjFldInfo
typenameFld =
  ObjFldInfo (Just desc) "__typename" Map.empty
  (G.toGT $ G.toNT $ G.NamedType "String") TLHasuraType
  where
    desc = "The name of the current Object type at runtime"

fromObjTyDef :: G.ObjectTypeDefinition -> TypeLoc -> ObjTyInfo
fromObjTyDef (G.ObjectTypeDefinition descM n ifaces _ flds) loc =
  mkObjTyInfo descM (G.NamedType n) (Set.fromList ifaces) fldMap loc
  where
    fldMap = Map.fromList [(G._fldName fld, fromFldDef fld loc) | fld <- flds]

data IFaceTyInfo
  = IFaceTyInfo
  { _ifDesc        :: !(Maybe G.Description)
  , _ifName        :: !G.NamedType
  , _ifFields      :: !ObjFieldMap
  , _ifMemberTypes :: !MemberTypes
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType IFaceTyInfo where
  type EqProps IFaceTyInfo =
    (G.NamedType, Map.HashMap G.Name (G.Name, G.GType, ParamMap))
  getEqProps a = (,) (_ifName a) (Map.map getEqProps (_ifFields a))

instance Semigroup IFaceTyInfo where
  objA <> objB =
    objA { _ifFields = Map.union (_ifFields objA) (_ifFields objB)
         }

fromIFaceDef
  :: InterfaceImplementations -> G.InterfaceTypeDefinition -> TypeLoc -> IFaceTyInfo
fromIFaceDef interfaceImplementations (G.InterfaceTypeDefinition descM n _ flds) loc =
  mkIFaceTyInfo descM (G.NamedType n) fldMap implementations
  where
    fldMap = Map.fromList [(G._fldName fld, fromFldDef fld loc) | fld <- flds]
    implementations = fromMaybe mempty $ Map.lookup (G.NamedType n) interfaceImplementations

type MemberTypes = Set.HashSet G.NamedType

data UnionTyInfo
  = UnionTyInfo
  { _utiDesc        :: !(Maybe G.Description)
  , _utiName        :: !G.NamedType
  , _utiMemberTypes :: !MemberTypes
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType UnionTyInfo where
  type EqProps UnionTyInfo =
    (G.NamedType, Set.HashSet G.NamedType)
  getEqProps a = (,) (_utiName a) (_utiMemberTypes a)

instance Monoid UnionTyInfo where
  mempty = UnionTyInfo Nothing (G.NamedType "") Set.empty

instance Semigroup UnionTyInfo where
  objA <> objB =
    objA { _utiMemberTypes = Set.union (_utiMemberTypes objA) (_utiMemberTypes objB)
         }

fromUnionTyDef :: G.UnionTypeDefinition -> UnionTyInfo
fromUnionTyDef (G.UnionTypeDefinition descM n _ mt) =
  UnionTyInfo descM (G.NamedType n) $ Set.fromList mt

type InpObjFldMap = Map.HashMap G.Name InpValInfo

data InpObjTyInfo
  = InpObjTyInfo
  { _iotiDesc   :: !(Maybe G.Description)
  , _iotiName   :: !G.NamedType
  , _iotiFields :: !InpObjFldMap
  , _iotiLoc    :: !TypeLoc
  } deriving (Show, Eq, TH.Lift)

instance EquatableGType InpObjTyInfo where
  type EqProps InpObjTyInfo = (G.NamedType, Map.HashMap G.Name (G.Name, G.GType))
  getEqProps a = (,) (_iotiName a) (Map.map getEqProps $ _iotiFields a)

fromInpObjTyDef :: G.InputObjectTypeDefinition -> TypeLoc -> InpObjTyInfo
fromInpObjTyDef (G.InputObjectTypeDefinition descM n _ inpFlds) loc =
  InpObjTyInfo descM (G.NamedType n) fldMap loc
  where
    fldMap = Map.fromList
      [(G._ivdName inpFld, fromInpValDef inpFld) | inpFld <- inpFlds]

mkHsraInpTyInfo
  :: Maybe G.Description
  -> G.NamedType
  -> InpObjFldMap
  -> InpObjTyInfo
mkHsraInpTyInfo descM ty flds =
  InpObjTyInfo descM ty flds TLHasuraType

data ScalarTyInfo
  = ScalarTyInfo
  { _stiDesc :: !(Maybe G.Description)
  , _stiName :: !G.Name
  , _stiType :: !PGScalarType
  , _stiLoc  :: !TypeLoc
  } deriving (Show, Eq, TH.Lift)

mkHsraScalarTyInfo :: PGScalarType -> ScalarTyInfo
mkHsraScalarTyInfo ty =
  ScalarTyInfo Nothing (G.Name $ pgColTyToScalar ty) ty TLHasuraType

instance EquatableGType ScalarTyInfo where
  type EqProps ScalarTyInfo = PGScalarType
  getEqProps = _stiType

fromScalarTyDef
  :: G.ScalarTypeDefinition
  -> TypeLoc
  -> ScalarTyInfo
fromScalarTyDef (G.ScalarTypeDefinition descM n _) =
  ScalarTyInfo descM n ty
  where
    ty = case n of
      "Int"     -> PGInteger
      "Float"   -> PGFloat
      "String"  -> PGText
      "Boolean" -> PGBoolean
      "ID"      -> PGText
      _         -> textToPGScalarType $ G.unName n

data TypeInfo
  = TIScalar !ScalarTyInfo
  | TIObj !ObjTyInfo
  | TIEnum !EnumTyInfo
  | TIInpObj !InpObjTyInfo
  | TIIFace !IFaceTyInfo
  | TIUnion !UnionTyInfo
  deriving (Show, Eq, TH.Lift)

instance J.ToJSON TypeInfo where
  toJSON _ = J.String "toJSON not implemented for TypeInfo"

data AsObjType
  = AOTIFace IFaceTyInfo
  | AOTUnion UnionTyInfo

getPossibleObjTypes :: TypeMap -> AsObjType -> Map.HashMap G.NamedType ObjTyInfo
getPossibleObjTypes tyMap = \case
  (AOTIFace i) ->
    toObjMap $ mapMaybe (extrObjTyInfoM tyMap) $ Set.toList $ _ifMemberTypes i
  (AOTUnion u) ->
    toObjMap $ mapMaybe (extrObjTyInfoM tyMap) $ Set.toList $ _utiMemberTypes u

toObjMap :: [ObjTyInfo] -> Map.HashMap G.NamedType ObjTyInfo
toObjMap = foldr (\o -> Map.insert (_otiName o) o) Map.empty

isObjTy :: TypeInfo -> Bool
isObjTy = \case
  (TIObj _) -> True
  _         -> False

getObjTyM :: TypeInfo -> Maybe ObjTyInfo
getObjTyM = \case
  (TIObj t) -> return t
  _         -> Nothing

getUnionTyM :: TypeInfo -> Maybe UnionTyInfo
getUnionTyM = \case
  (TIUnion u) -> return u
  _           -> Nothing

isIFaceTy :: TypeInfo -> Bool
isIFaceTy = \case
  (TIIFace _) -> True
  _           -> False

data SchemaPath
  = SchemaPath
  { _spTypeName :: !(Maybe G.NamedType)
  , _spFldName  :: !(Maybe G.Name)
  , _spArgName  :: !(Maybe G.Name)
  , _spType     :: !(Maybe T.Text)
  }

setFldNameSP :: SchemaPath -> G.Name -> SchemaPath
setFldNameSP sp fn = sp { _spFldName = Just fn }

setArgNameSP :: SchemaPath -> G.Name -> SchemaPath
setArgNameSP sp an = sp { _spArgName = Just an }

showSP :: SchemaPath -> Text
showSP (SchemaPath t f a _) = maybe "" (\x -> showNamedTy x <> fN) t
  where
    fN = maybe "" (\x -> "." <> showName x <> aN) f
    aN = maybe "" showArg a
    showArg x = "(" <> showName x <> ":)"

showSPTxt' :: SchemaPath -> Text
showSPTxt' (SchemaPath _ f a t) = maybe "" (<> " " <> fld) t
  where
    fld = maybe "" (const $ "field " <> arg) f
    arg = maybe "" (const "argument ") a

showSPTxt :: SchemaPath -> Text
showSPTxt p = showSPTxt' p <> showSP p

validateIFace :: MonadError Text f => IFaceTyInfo -> f ()
validateIFace (IFaceTyInfo _ n flds _) =
  when (isFldListEmpty flds) $ throwError $
    "List of fields cannot be empty for interface " <> showNamedTy n

validateObj :: TypeMap -> ObjTyInfo -> Either Text ()
validateObj tyMap objTyInfo@(ObjTyInfo _ n _ flds) = do
  when (isFldListEmpty flds) $ throwError $
    "List of fields cannot be empty for " <> objTxt
  mapM_ (extrIFaceTyInfo' >=> validateIFaceImpl objTyInfo) $ _otiImplIFaces objTyInfo
  where
    extrIFaceTyInfo' t = withObjTxt $ extrIFaceTyInfo tyMap t
    withObjTxt x = x `catchError` \e -> throwError $ e <> " implemented by " <> objTxt
    objTxt = "Object type " <> showNamedTy n
    validateIFaceImpl = implmntsIFace tyMap

isFldListEmpty :: ObjFieldMap -> Bool
isFldListEmpty = Map.null . Map.delete "__typename"

validateUnion :: MonadError Text m => TypeMap -> UnionTyInfo -> m ()
validateUnion tyMap (UnionTyInfo _ un mt) = do
  when (Set.null mt) $ throwError $
    "List of member types cannot be empty for union type " <> showNamedTy un
  mapM_ valIsObjTy $ Set.toList mt
  where
    valIsObjTy mn = case Map.lookup mn tyMap of
      Just (TIObj t) -> return t
      Nothing -> throwError $ "Could not find type " <> showNamedTy mn
        <> ", which is defined as a member type of Union " <> showNamedTy un
      _ -> throwError $ "Union type " <> showNamedTy un
        <> " can only include object types. It cannot include " <> showNamedTy mn

implmntsIFace :: TypeMap -> ObjTyInfo -> IFaceTyInfo -> Either Text ()
implmntsIFace tyMap objTyInfo iFaceTyInfo = do
  let path =
        ( SchemaPath (Just $ _otiName objTyInfo) Nothing Nothing (Just "Object")
        , SchemaPath (Just $ _ifName iFaceTyInfo) Nothing Nothing (Just "Interface")
        )
  mapM_ (includesIFaceFld path) $ _ifFields iFaceTyInfo
  where
    includesIFaceFld (spO, spIF) ifFld = do
      let pathA@(spOA, spIFA) = (spO, setFldNameSP spIF $ _fiName ifFld)
      objFld <- sameNameFld pathA ifFld
      let pathB = (setFldNameSP spOA $ _fiName objFld, spIFA)
      validateIsSubType' pathB (_fiTy objFld) (_fiTy ifFld)
      hasAllArgs pathB objFld ifFld
      isExtraArgsNullable pathB objFld ifFld

    validateIsSubType' (spO, spIF) oFld iFld =
      validateIsSubType tyMap oFld iFld `catchError` \_ ->
        throwError $ "The type of " <> showSPTxt spO <> " (" <> G.showGT oFld <>
          ") is not the same type/sub type of " <> showSPTxt spIF <> " (" <> G.showGT iFld <> ")"

    sameNameFld (spO, spIF) ifFld = do
      let spIFN = setFldNameSP spIF $ _fiName ifFld
      onNothing (Map.lookup (_fiName ifFld) objFlds)
        $ throwError $ showSPTxt spIFN <> " expected, but " <> showSP spO <> " does not provide it"

    hasAllArgs (spO, spIF) objFld ifFld = forM_ (_fiParams ifFld) $ \ifArg -> do
      objArg <- sameNameArg ifArg
      let (spON, spIFN) = (setArgNameSP spO $ _iviName objArg, setArgNameSP spIF $ _iviName ifArg)
      unless (_iviType objArg == _iviType ifArg) $ throwError $
        showSPTxt spIFN <> " expects type " <> G.showGT (_iviType ifArg) <> ", but " <>
        showSP spON <> " has type " <> G.showGT (_iviType objArg)
      where
        sameNameArg ivi = do
          let spIFN = setArgNameSP spIF $ _iviName ivi
          onNothing (Map.lookup (_iviName ivi) objArgs) $ throwError $
            showSPTxt spIFN <> " required, but " <> showSPTxt spO <> " does not provide it"
        objArgs = _fiParams objFld

    isExtraArgsNullable (spO, spIF) objFld ifFld = forM_ extraArgs isInpValNullable
      where
        extraArgs = Map.difference (_fiParams objFld) (_fiParams ifFld)
        isInpValNullable ivi = unless (G.isNullable $ _iviType ivi) $ throwError $
          showSPTxt (setArgNameSP spO $ _iviName ivi) <> " is of required type "
          <> G.showGT (_iviType ivi) <> ", but is not provided by " <> showSPTxt spIF

    objFlds = _otiFields objTyInfo

extrTyInfo :: TypeMap -> G.NamedType -> Either Text TypeInfo
extrTyInfo tyMap tn = maybe
  (throwError $ "Could not find type with name " <> showNamedTy tn)
  return
  $ Map.lookup tn tyMap

extrIFaceTyInfo :: MonadError Text m => Map.HashMap G.NamedType TypeInfo -> G.NamedType -> m IFaceTyInfo
extrIFaceTyInfo tyMap tn = case Map.lookup tn tyMap of
  Just (TIIFace i) -> return i
  _ -> throwError $ "Could not find interface " <> showNamedTy tn

extrObjTyInfoM :: TypeMap -> G.NamedType -> Maybe ObjTyInfo
extrObjTyInfoM tyMap tn = case Map.lookup tn tyMap of
  Just (TIObj o) -> return o
  _              -> Nothing

validateIsSubType :: Map.HashMap G.NamedType TypeInfo -> G.GType -> G.GType -> Either Text ()
validateIsSubType tyMap subFldTy supFldTy = do
  checkNullMismatch subFldTy supFldTy
  case (subFldTy, supFldTy) of
    (G.TypeNamed _ subTy, G.TypeNamed _ supTy) -> do
      subTyInfo <- extrTyInfo tyMap subTy
|
||||
supTyInfo <- extrTyInfo tyMap supTy
|
||||
isSubTypeBase subTyInfo supTyInfo
|
||||
(G.TypeList _ (G.ListType sub), G.TypeList _ (G.ListType sup) ) ->
|
||||
validateIsSubType tyMap sub sup
|
||||
_ -> throwError $ showIsListTy subFldTy <> " Type " <> G.showGT subFldTy <>
|
||||
" cannot be a sub-type of " <> showIsListTy supFldTy <> " Type " <> G.showGT supFldTy
|
||||
where
|
||||
checkNullMismatch subTy supTy = when (G.isNotNull supTy && G.isNullable subTy ) $
|
||||
throwError $ "Nullable Type " <> G.showGT subFldTy <> " cannot be a sub-type of Non-Null Type " <> G.showGT supFldTy
|
||||
showIsListTy = \case
|
||||
G.TypeList {} -> "List"
|
||||
G.TypeNamed {} -> "Named"
|
||||
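The subtyping rule above can be seen in isolation with a minimal, self-contained sketch. The `GType` here is a local stand-in for the graphql-parser type (a `Bool` nullability flag instead of `G.Nullability`), not the real API: a nullable type is never a subtype of a non-null type, lists recurse element-wise, and list/named mismatches are rejected.

```haskell
{-# LANGUAGE LambdaCase #-}

-- Minimal stand-ins for the graphql-parser types, for illustration only.
data GType
  = TypeNamed Bool String   -- Bool: is the type nullable?
  | TypeList  Bool GType
  deriving (Show, Eq)

isNullable :: GType -> Bool
isNullable = \case
  TypeNamed n _ -> n
  TypeList  n _ -> n

-- Same shape as validateIsSubType, minus the TypeMap lookups.
isSubTypeOf :: GType -> GType -> Either String ()
isSubTypeOf sub sup
  | not (isNullable sup) && isNullable sub =
      Left "Nullable Type cannot be a sub-type of Non-Null Type"
isSubTypeOf (TypeNamed _ a) (TypeNamed _ b)
  | a == b    = Right ()
  | otherwise = Left ("Type " <> a <> " is not a sub type of " <> b)
isSubTypeOf (TypeList _ a) (TypeList _ b) = a `isSubTypeOf` b
isSubTypeOf _ _ = Left "List/Named type mismatch"

main :: IO ()
main = do
  print (TypeNamed True "Int" `isSubTypeOf` TypeNamed False "Int")
  print (TypeList False (TypeNamed False "Int") `isSubTypeOf` TypeList True (TypeNamed False "Int"))
```

Note that a non-null list of non-null elements is accepted where a nullable list is expected, matching the GraphQL spec's covariance for list element types.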

-- TODO Should we check the schema location as well?
isSubTypeBase :: (MonadError Text m) => TypeInfo -> TypeInfo -> m ()
isSubTypeBase subTyInfo supTyInfo = case (subTyInfo,supTyInfo) of
  (TIObj obj, TIIFace iFace) -> unless (_ifName iFace `elem` _otiImplIFaces obj) notSubTyErr
  _ -> unless (subTyInfo == supTyInfo) notSubTyErr
  where
    showTy = showNamedTy . getNamedTy
    notSubTyErr = throwError $ "Type " <> showTy subTyInfo <> " is not a sub type of " <> showTy supTyInfo

-- map postgres types to builtin scalars
pgColTyToScalar :: PGScalarType -> Text
pgColTyToScalar = \case
  PGInteger -> "Int"
  PGBoolean -> "Boolean"
  PGFloat   -> "Float"
  PGText    -> "String"
  PGVarchar -> "String"
  t         -> toSQLTxt t

mkScalarTy :: PGScalarType -> G.NamedType
mkScalarTy =
  G.NamedType . G.Name . pgColTyToScalar

getNamedTy :: TypeInfo -> G.NamedType
getNamedTy = \case
  TIScalar t -> G.NamedType $ _stiName t
  TIObj t    -> _otiName t
  TIIFace i  -> _ifName i
  TIEnum t   -> _etiName t
  TIInpObj t -> _iotiName t
  TIUnion u  -> _utiName u

mkTyInfoMap :: [TypeInfo] -> TypeMap
mkTyInfoMap tyInfos =
  Map.fromList [(getNamedTy tyInfo, tyInfo) | tyInfo <- tyInfos]

fromTyDef :: InterfaceImplementations -> TypeLoc -> G.TypeDefinition -> TypeInfo
fromTyDef interfaceImplementations loc tyDef = case tyDef of
  G.TypeDefinitionScalar t -> TIScalar $ fromScalarTyDef t loc
  G.TypeDefinitionObject t -> TIObj $ fromObjTyDef t loc
  G.TypeDefinitionInterface t -> TIIFace $ fromIFaceDef interfaceImplementations t loc
  G.TypeDefinitionUnion t -> TIUnion $ fromUnionTyDef t
  G.TypeDefinitionEnum t -> TIEnum $ fromEnumTyDef t loc
  G.TypeDefinitionInputObject t -> TIInpObj $ fromInpObjTyDef t loc

type InterfaceImplementations = Map.HashMap G.NamedType MemberTypes

fromSchemaDoc :: G.SchemaDocument -> TypeLoc -> Either Text TypeMap
fromSchemaDoc (G.SchemaDocument tyDefs) loc = do
  let tyMap = mkTyInfoMap $ map (fromTyDef interfaceImplementations loc) tyDefs
  validateTypeMap tyMap
  return tyMap
  where
    interfaceImplementations :: InterfaceImplementations
    interfaceImplementations =
      foldr (Map.unionWith (<>)) mempty $ flip mapMaybe tyDefs $ \case
        G.TypeDefinitionObject objectDefinition ->
          Just $ Map.fromList $ zip
            (G._otdImplementsInterfaces objectDefinition)
            (repeat $ Set.singleton $ G.NamedType $ G._otdName objectDefinition)
        _ -> Nothing

validateTypeMap :: TypeMap -> Either Text ()
validateTypeMap tyMap = mapM_ validateTy $ Map.elems tyMap
  where
    validateTy (TIObj o)   = validateObj tyMap o
    validateTy (TIUnion u) = validateUnion tyMap u
    validateTy (TIIFace i) = validateIFace i
    validateTy _           = return ()

fromSchemaDocQ :: G.SchemaDocument -> TypeLoc -> TH.Q TH.Exp
fromSchemaDocQ sd loc = case fromSchemaDoc sd loc of
  Left e -> fail $ T.unpack e
  Right tyMap -> TH.ListE <$> mapM TH.lift (Map.elems tyMap)

defaultSchema :: G.SchemaDocument
defaultSchema = $(G.parseSchemaDocQ "src-rsr/schema.graphql")

-- fromBaseSchemaFileQ :: FilePath -> TH.Q TH.Exp
-- fromBaseSchemaFileQ fp =
--   fromSchemaDocQ $(G.parseSchemaDocQ fp)

type TypeMap = Map.HashMap G.NamedType TypeInfo

data DirectiveInfo
  = DirectiveInfo
  { _diDescription :: !(Maybe G.Description)
  , _diName        :: !G.Name
  , _diParams      :: !ParamMap
  , _diLocations   :: ![G.DirectiveLocation]
  } deriving (Show, Eq)

-- TODO: generate this from template haskell once we have a parser for directive defs
-- directive @skip(if: Boolean!) on FIELD | FRAGMENT_SPREAD | INLINE_FRAGMENT
defaultDirectives :: [DirectiveInfo]
defaultDirectives =
  [mkDirective "skip", mkDirective "include"]
  where
    mkDirective n = DirectiveInfo Nothing n args dirLocs
    args = Map.singleton "if" $ InpValInfo Nothing "if" Nothing $
           G.TypeNamed (G.Nullability False) $ mkScalarTy PGBoolean
    dirLocs = map G.DLExecutable
              [G.EDLFIELD, G.EDLFRAGMENT_SPREAD, G.EDLINLINE_FRAGMENT]

defDirectivesMap :: Map.HashMap G.Name DirectiveInfo
defDirectivesMap = mapFromL _diName defaultDirectives

data FragDef
  = FragDef
  { _fdName   :: !G.Name
  , _fdTyInfo :: !FragmentTypeInfo
  , _fdSelSet :: !G.SelectionSet
  } deriving (Show, Eq)

data FragmentTypeInfo
  = FragmentTyObject !ObjTyInfo
  | FragmentTyInterface !IFaceTyInfo
  | FragmentTyUnion !UnionTyInfo
  deriving (Show, Eq)

type FragDefMap = Map.HashMap G.Name FragDef

type AnnVarVals =
  Map.HashMap G.Variable AnnInpVal

stripTypenames :: [G.ExecutableDefinition] -> [G.ExecutableDefinition]
stripTypenames = map filterExecDef
  where
    filterExecDef = \case
      G.ExecutableDefinitionOperation opDef ->
        G.ExecutableDefinitionOperation $ filterOpDef opDef
      G.ExecutableDefinitionFragment fragDef ->
        let newSelset = filterSelSet $ G._fdSelectionSet fragDef
        in G.ExecutableDefinitionFragment fragDef{G._fdSelectionSet = newSelset}

    filterOpDef = \case
      G.OperationDefinitionTyped typeOpDef ->
        let newSelset = filterSelSet $ G._todSelectionSet typeOpDef
        in G.OperationDefinitionTyped typeOpDef{G._todSelectionSet = newSelset}
      G.OperationDefinitionUnTyped selset ->
        G.OperationDefinitionUnTyped $ filterSelSet selset

    filterSelSet = mapMaybe filterSel
    filterSel s = case s of
      G.SelectionField f ->
        if G._fName f == "__typename"
        then Nothing
        else
          let newSelset = filterSelSet $ G._fSelectionSet f
          in Just $ G.SelectionField f{G._fSelectionSet = newSelset}
      _ -> Just s
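The recursion in `filterSelSet`/`filterSel` can be sketched self-contained. The `Selection` type below is an illustrative stand-in (field name plus sub-selections), not the graphql-parser AST: any `__typename` field is dropped, and the transformation recurses into the sub-selections of everything that survives.

```haskell
-- Illustrative stand-in for the graphql-parser selection-set types.
data Selection
  = SelectionField String [Selection]  -- field name, sub-selections
  | SelectionFragmentSpread String
  deriving (Show, Eq)

-- Drop any field named "__typename" and recurse into the rest,
-- mirroring the filterSelSet/filterSel pair above.
stripTypenames :: [Selection] -> [Selection]
stripTypenames = concatMap go
  where
    go (SelectionField "__typename" _) = []
    go (SelectionField name subSels)   = [SelectionField name (stripTypenames subSels)]
    go other                           = [other]

main :: IO ()
main = print $ stripTypenames
  [ SelectionField "__typename" []
  , SelectionField "user" [SelectionField "__typename" [], SelectionField "id" []]
  ]
```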

-- | Used by 'Hasura.GraphQL.Validate.validateVariablesForReuse' to parse new sets of variables for
-- reusable query plans; see also 'QueryReusability'.
newtype ReusableVariableTypes
  = ReusableVariableTypes { unReusableVarTypes :: Map.HashMap G.Variable RQL.PGColumnType }
  deriving (Show, Eq, Semigroup, Monoid, J.ToJSON)
type ReusableVariableValues = Map.HashMap G.Variable (WithScalarType PGScalarValue)

-- | Tracks whether or not a query is /reusable/. Reusable queries are nice, since we can cache
-- their resolved ASTs and avoid re-resolving them if we receive an identical query. However, we
-- can’t always safely reuse queries if they have variables, since some variable values can affect
-- the generated SQL. For example, consider the following query:
--
-- > query users_where($condition: users_bool_exp!) {
-- >   users(where: $condition) {
-- >     id
-- >   }
-- > }
--
-- Different values for @$condition@ will produce completely different queries, so we can’t reuse
-- its plan (unless the variable values were also all identical, of course, but we don’t bother
-- caching those).
--
-- If a query does turn out to be reusable, we build up a 'ReusableVariableTypes' value that maps
-- variable names to their types so that we can use a fast path for validating new sets of
-- variables (namely 'Hasura.GraphQL.Validate.validateVariablesForReuse').
data QueryReusability
  = Reusable !ReusableVariableTypes
  | NotReusable
  deriving (Show, Eq)
$(makePrisms ''QueryReusability)

instance Semigroup QueryReusability where
  Reusable a <> Reusable b = Reusable (a <> b)
  _          <> _          = NotReusable
instance Monoid QueryReusability where
  mempty = Reusable mempty
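The monoid structure is what lets reusability be accumulated piecewise while walking a query: every part starts out `Reusable mempty`, variable uses merge their type maps, and a single `NotReusable` anywhere poisons the whole. A self-contained sketch of the same law, with the variable-type map simplified to `String`/`String` (using `Data.Map.Strict` as a stand-in for `Data.HashMap.Strict`):

```haskell
import qualified Data.Map.Strict as M  -- stand-in for Data.HashMap.Strict

-- Simplified QueryReusability: variable types collapsed to String/String.
data QueryReusability
  = Reusable (M.Map String String)
  | NotReusable
  deriving (Show, Eq)

-- Same combining law as the instance above: reusability is an
-- all-or-nothing property, so NotReusable is absorbing.
instance Semigroup QueryReusability where
  Reusable a <> Reusable b = Reusable (a <> b)
  _          <> _          = NotReusable

instance Monoid QueryReusability where
  mempty = Reusable mempty

main :: IO ()
main = do
  let a = Reusable (M.singleton "limit" "Int")
      b = Reusable (M.singleton "name"  "String")
  print (a <> b)            -- both reusable: variable maps merge
  print (a <> NotReusable)  -- one non-reusable part poisons the whole
```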

class (Monad m) => MonadReusability m where
  recordVariableUse :: G.Variable -> RQL.PGColumnType -> m ()
  markNotReusable :: m ()

instance (MonadReusability m) => MonadReusability (ReaderT r m) where
  recordVariableUse a b = lift $ recordVariableUse a b
  markNotReusable = lift markNotReusable

instance (MonadReusability m) => MonadReusability (StateT s m) where
  recordVariableUse a b = lift $ recordVariableUse a b
  markNotReusable = lift markNotReusable

newtype ReusabilityT m a = ReusabilityT { unReusabilityT :: StateT QueryReusability m a }
  deriving (Functor, Applicative, Monad, MonadError e, MonadReader r, MonadIO, MonadTrans)

instance (Monad m) => MonadReusability (ReusabilityT m) where
  recordVariableUse varName varType = ReusabilityT $
    modify' (<> Reusable (ReusableVariableTypes $ Map.singleton varName varType))
  markNotReusable = ReusabilityT $ put NotReusable

instance Tracing.MonadTrace m => Tracing.MonadTrace (ReusabilityT m) where
  trace name (ReusabilityT ma) = ReusabilityT (Tracing.trace name ma)
  currentContext = lift Tracing.currentContext
  currentReporter = lift Tracing.currentReporter
  attachMetadata = lift . Tracing.attachMetadata

runReusabilityT :: ReusabilityT m a -> m (a, QueryReusability)
runReusabilityT = runReusabilityTWith mempty

-- | Like 'runReusabilityT', but starting from an existing 'QueryReusability' state.
runReusabilityTWith :: QueryReusability -> ReusabilityT m a -> m (a, QueryReusability)
runReusabilityTWith initialReusability = flip runStateT initialReusability . unReusabilityT

evalReusabilityT :: (Monad m) => ReusabilityT m a -> m a
evalReusabilityT = flip evalStateT mempty . unReusabilityT

@@ -11,7 +11,6 @@ import qualified Data.URL.Template as UT
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Network.URI.Extended as N

import Control.Applicative
import Data.Aeson (Value)
import Data.CaseInsensitive (CI)
import Data.Functor.Classes (Eq1 (..), Eq2 (..))
@@ -165,6 +164,7 @@ instance Cacheable Integer where unchanged _ = (==)
instance Cacheable Scientific where unchanged _ = (==)
instance Cacheable Text where unchanged _ = (==)
instance Cacheable N.URIAuth where unchanged _ = (==)
instance Cacheable G.Name where unchanged _ = (==)
instance Cacheable DiffTime where unchanged _ = (==)
instance Cacheable NominalDiffTime where unchanged _ = (==)
instance Cacheable UTCTime where unchanged _ = (==)
@@ -202,44 +202,49 @@ instance (Cacheable a, Cacheable b, Cacheable c, Cacheable d) => Cacheable (a, b
instance (Cacheable a, Cacheable b, Cacheable c, Cacheable d, Cacheable e) => Cacheable (a, b, c, d, e)

instance Cacheable Bool
instance Cacheable Void
instance Cacheable Value
instance Cacheable G.Argument
instance Cacheable G.Directive
instance Cacheable G.ExecutableDefinition
instance Cacheable G.Field
instance Cacheable G.FragmentDefinition
instance Cacheable G.FragmentSpread
instance Cacheable G.GType
instance Cacheable G.InlineFragment
instance Cacheable G.Nullability
instance Cacheable G.OperationDefinition
instance Cacheable G.OperationType
instance Cacheable G.Selection
instance Cacheable G.TypedOperationDefinition
instance Cacheable G.Value
instance Cacheable G.ValueConst
instance Cacheable G.VariableDefinition
instance Cacheable G.InputValueDefinition
instance Cacheable G.EnumValueDefinition
instance Cacheable G.FieldDefinition
instance Cacheable G.ScalarTypeDefinition
instance Cacheable G.UnionTypeDefinition
instance Cacheable possibleTypes => Cacheable (G.InterfaceTypeDefinition possibleTypes)
instance Cacheable G.EnumTypeDefinition
instance Cacheable G.InputObjectTypeDefinition
instance Cacheable G.ObjectTypeDefinition
instance Cacheable possibleTypes => Cacheable (G.TypeDefinition possibleTypes)
instance Cacheable N.URI
instance Cacheable UT.Variable
instance Cacheable UT.TemplateItem
instance Cacheable UT.URLTemplate
instance (Cacheable a) => Cacheable (Maybe a)
instance (Cacheable a, Cacheable b) => Cacheable (Either a b)
instance (Cacheable a) => Cacheable [a]
instance (Cacheable a) => Cacheable (NonEmpty a)
instance (Cacheable a) => Cacheable (G.ObjectFieldG a)
instance (Cacheable a) => Cacheable (NESeq a)
instance Cacheable a => Cacheable [a]
instance Cacheable a => Cacheable (NonEmpty a)
instance Cacheable a => Cacheable (G.Directive a)
instance Cacheable a => Cacheable (G.ExecutableDefinition a)
instance (Cacheable (a b), Cacheable b) => Cacheable (G.Field a b)
instance Cacheable a => Cacheable (G.FragmentSpread a)
instance (Cacheable (a b), Cacheable b) => Cacheable (G.InlineFragment a b)
instance (Cacheable (a b), Cacheable b) => Cacheable (G.OperationDefinition a b)
instance (Cacheable (a b), Cacheable b) => Cacheable (G.Selection a b)
instance (Cacheable (a b), Cacheable b) => Cacheable (G.TypedOperationDefinition a b)

instance Cacheable a => Cacheable (G.Value a)

deriving instance Cacheable G.Alias
deriving instance Cacheable G.EnumValue
deriving instance Cacheable G.ExecutableDocument
deriving instance Cacheable G.ListType
deriving instance Cacheable G.Name
deriving instance Cacheable G.NamedType
deriving instance Cacheable G.StringValue
deriving instance Cacheable G.Variable
deriving instance Cacheable G.Description
deriving instance (Cacheable a) => Cacheable (G.ListValueG a)
deriving instance (Cacheable a) => Cacheable (G.ObjectValueG a)
deriving instance Cacheable G.EnumValue
deriving instance Cacheable a => Cacheable (G.ExecutableDocument a)

instance Cacheable G.SchemaDocument
instance Cacheable G.SchemaIntrospection

class GCacheable f where
  gunchanged :: f p -> f p -> Accesses -> Bool

@@ -9,7 +9,6 @@ import Hasura.Prelude hiding (id, (.))

import qualified Data.HashMap.Strict as HM

import Control.Applicative hiding (liftA)
import Control.Arrow.Extended
import Control.Category
import Data.Profunctor

@@ -1,6 +1,4 @@
{-# OPTIONS_HADDOCK not-home #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE RoleAnnotations #-}

module Hasura.Incremental.Select
  ( Select(..)
@@ -27,6 +25,7 @@ import Control.Monad.Unique
import Data.GADT.Compare
import Data.Kind
import Data.Proxy (Proxy (..))
import Data.Type.Equality
import GHC.OverloadedLabels (IsLabel (..))
import GHC.Records (HasField (..))
import GHC.TypeLits (KnownSymbol, sameSymbol, symbolVal)

@@ -7,6 +7,7 @@ module Hasura.Prelude
  , onNothing
  , onJust
  , onLeft
  , whenMaybe
  , choice
  , afold
  , bsToTxt
@@ -24,7 +25,7 @@ module Hasura.Prelude
  , module Data.Time.Clock.Units
  ) where

import Control.Applicative as M (Alternative (..))
import Control.Applicative as M (Alternative (..), liftA2)
import Control.Arrow as M (first, second, (&&&), (***), (<<<), (>>>))
import Control.DeepSeq as M (NFData, deepseq, force)
import Control.Monad.Base as M
@@ -32,34 +33,39 @@ import Control.Monad.Except as M
import Control.Monad.Identity as M
import Control.Monad.Reader as M
import Control.Monad.State.Strict as M
import Control.Monad.Trans.Maybe as M (MaybeT (..))
import Control.Monad.Writer.Strict as M (MonadWriter (..), WriterT (..),
                                         execWriterT, runWriterT)
import Data.Align as M (Semialign (align, alignWith))
import Data.Bool as M (bool)
import Data.Data as M (Data (..))
import Data.Either as M (lefts, partitionEithers, rights)
import Data.Foldable as M (asum, foldrM, for_, toList, traverse_)
import Data.Foldable as M (asum, fold, foldrM, for_, toList,
                           traverse_)
import Data.Function as M (on, (&))
import Data.Functor as M (($>), (<&>))
import Data.Hashable as M (Hashable)
import Data.HashMap.Strict as M (HashMap)
import Data.HashMap.Strict.InsOrd as M (InsOrdHashMap)
import Data.HashSet as M (HashSet)
import Data.List as M (find, findIndex, foldl', group,
                       intercalate, intersect, lookup, sort,
                       sortBy, sortOn, union, unionBy, (\\))
import Data.List.NonEmpty as M (NonEmpty (..))
import Data.List.NonEmpty as M (NonEmpty (..), nonEmpty)
import Data.Maybe as M (catMaybes, fromMaybe, isJust, isNothing,
                        listToMaybe, mapMaybe, maybeToList)
import Data.Monoid as M (getAlt)
import Data.Ord as M (comparing)
import Data.Semigroup as M (Semigroup (..))
import Data.Sequence as M (Seq)
import Data.Sequence.NonEmpty as M (NESeq)
import Data.String as M (IsString)
import Data.Text as M (Text)
import Data.These as M (These (..), fromThese, mergeThese,
                        mergeTheseWith, partitionThese, these)
import Data.Time.Clock.Units
import Data.Traversable as M (for)
import Data.Void as M (Void, absurd)
import Data.Word as M (Word64)
import GHC.Generics as M (Generic)
import Prelude as M hiding (fail, init, lookup)
@@ -67,10 +73,9 @@ import Test.QuickCheck.Arbitrary.Generic as M
import Text.Read as M (readEither, readMaybe)

import qualified Data.ByteString as B
import qualified Data.ByteString.Lazy as BL

import qualified Data.ByteString.Base64.Lazy as Base64
import Data.Coerce
import qualified Data.ByteString.Lazy as BL
import qualified Data.HashMap.Strict as Map
import qualified Data.Set as Set
import qualified Data.Text as T
@@ -98,6 +103,10 @@ onJust m action = maybe (return ()) action m
onLeft :: (Monad m) => Either e a -> (e -> m a) -> m a
onLeft e f = either f return e

whenMaybe :: Applicative m => Bool -> m a -> m (Maybe a)
whenMaybe True  = fmap Just
whenMaybe False = const $ pure Nothing

choice :: (Alternative f) => [f a] -> f a
choice = asum

@@ -114,7 +123,6 @@ base64Decode :: Text -> BL.ByteString
base64Decode =
  Base64.decodeLenient . BL.fromStrict . txtToBs


-- Like 'span', but monadic and with a function that produces 'Maybe' instead of 'Bool'
spanMaybeM
  :: (Foldable f, Monad m)
@@ -142,7 +150,7 @@ findWithIndex p l = do
  i <- findIndex p l
  pure (v, i)

-- TODO: Move to Data.HashMap.Strict.Extended; rename to fromListWith?
-- TODO (from master): Move to Data.HashMap.Strict.Extended; rename to fromListWith?
mapFromL :: (Eq k, Hashable k) => (a -> k) -> [a] -> Map.HashMap k a
mapFromL f = Map.fromList . map (\v -> (f v, v))
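`mapFromL` indexes a list by a key projection. A self-contained sketch, using `Data.Map.Strict` as a stand-in for `Data.HashMap.Strict` (both `fromList`s keep the last binding for a repeated key, so when two items share a key the later one wins):

```haskell
import qualified Data.Map.Strict as Map  -- stand-in for Data.HashMap.Strict

-- Same shape as mapFromL above: index a list by a key projection.
mapFromL :: Ord k => (a -> k) -> [a] -> Map.Map k a
mapFromL f = Map.fromList . map (\v -> (f v, v))

main :: IO ()
main = do
  -- e.g. how defDirectivesMap keys directives by _diName
  print $ mapFromL fst [("skip", 1 :: Int), ("include", 2)]
  -- repeated keys: the last occurrence wins
  print $ mapFromL fst [("a", 1 :: Int), ("a", 2)]
```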
|
||||
|
@ -24,21 +24,18 @@ module Hasura.RQL.DDL.Action
|
||||
) where
|
||||
|
||||
import Hasura.EncJSON
|
||||
import Hasura.GraphQL.Context (defaultTypes)
|
||||
import Hasura.GraphQL.Utils
|
||||
import Hasura.Prelude
|
||||
import Hasura.RQL.DDL.CustomTypes (lookupPGScalar)
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.Session
|
||||
import Hasura.SQL.Types
|
||||
|
||||
import qualified Hasura.GraphQL.Validate.Types as VT
|
||||
|
||||
import qualified Data.Aeson as J
|
||||
import qualified Data.Aeson.Casing as J
|
||||
import qualified Data.Aeson.TH as J
|
||||
import qualified Data.Environment as Env
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashSet as Set
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
|
||||
@ -99,55 +96,38 @@ referred scalars.
|
||||
resolveAction
|
||||
:: QErrM m
|
||||
=> Env.Environment
|
||||
-> (NonObjectTypeMap, AnnotatedObjects)
|
||||
-> HashSet PGScalarType
|
||||
-- ^ List of all Postgres scalar types.
|
||||
-> AnnotatedCustomTypes
|
||||
-> ActionDefinitionInput
|
||||
-> HashSet PGScalarType -- See Note [Postgres scalars in custom types]
|
||||
-> m ( ResolvedActionDefinition
|
||||
, AnnotatedObjectType
|
||||
, HashSet PGScalarType
|
||||
-- ^ see Note [Postgres scalars in action input arguments].
|
||||
)
|
||||
resolveAction env customTypes allPGScalars actionDefinition = do
|
||||
let responseType = unGraphQLType $ _adOutputType actionDefinition
|
||||
responseBaseType = G.getBaseType responseType
|
||||
|
||||
reusedPGScalars <- execWriterT $
|
||||
forM (_adArguments actionDefinition) $ \argument -> do
|
||||
let argumentBaseType = G.getBaseType $ unGraphQLType $ _argType argument
|
||||
maybeArgTypeInfo = getNonObjectTypeInfo argumentBaseType
|
||||
maybePGScalar = find ((==) argumentBaseType . VT.mkScalarTy) allPGScalars
|
||||
|
||||
if | Just argTypeInfo <- maybeArgTypeInfo ->
|
||||
case argTypeInfo of
|
||||
VT.TIScalar _ -> pure ()
|
||||
VT.TIEnum _ -> pure ()
|
||||
VT.TIInpObj _ -> pure ()
|
||||
_ -> throw400 InvalidParams $ "the argument's base type: "
|
||||
<> showNamedTy argumentBaseType <>
|
||||
" should be a scalar/enum/input_object"
|
||||
-- Collect the referred Postgres scalar. See Note [Postgres scalars in action input arguments].
|
||||
| Just pgScalar <- maybePGScalar -> tell $ Set.singleton pgScalar
|
||||
| Nothing <- maybeArgTypeInfo ->
|
||||
throw400 NotExists $ "the type: " <> showNamedTy argumentBaseType
|
||||
<> " is not defined in custom types"
|
||||
| otherwise -> pure ()
|
||||
resolveAction env AnnotatedCustomTypes{..} ActionDefinition{..} allPGScalars = do
|
||||
resolvedArguments <- forM _adArguments $ \argumentDefinition -> do
|
||||
forM argumentDefinition $ \argumentType -> do
|
||||
let gType = unGraphQLType argumentType
|
||||
argumentBaseType = G.getBaseType gType
|
||||
(gType,) <$>
|
||||
if | Just pgScalar <- lookupPGScalar allPGScalars argumentBaseType ->
|
||||
pure $ NOCTScalar $ ASTReusedPgScalar argumentBaseType pgScalar
|
||||
| Just nonObjectType <- Map.lookup argumentBaseType _actNonObjects ->
|
||||
pure nonObjectType
|
||||
| otherwise ->
|
||||
throw400 InvalidParams $
|
||||
"the type: " <> showName argumentBaseType
|
||||
<> " is not defined in custom types or it is not a scalar/enum/input_object"
|
||||
|
||||
-- Check if the response type is an object
|
||||
outputObject <- getObjectTypeInfo responseBaseType
|
||||
resolvedDef <- traverse (resolveWebhook env) actionDefinition
|
||||
pure (resolvedDef, outputObject, reusedPGScalars)
|
||||
where
|
||||
getNonObjectTypeInfo typeName =
|
||||
let nonObjectTypeMap = unNonObjectTypeMap $ fst $ customTypes
|
||||
inputTypeInfos = nonObjectTypeMap <> mapFromL VT.getNamedTy defaultTypes
|
||||
in Map.lookup typeName inputTypeInfos
|
||||
|
||||
getObjectTypeInfo typeName =
|
||||
onNothing (Map.lookup (ObjectTypeName typeName) (snd customTypes)) $
|
||||
throw400 NotExists $ "the type: "
|
||||
<> showNamedTy typeName <>
|
||||
" is not an object type defined in custom types"
|
||||
let outputType = unGraphQLType _adOutputType
|
||||
outputBaseType = G.getBaseType outputType
|
||||
outputObject <- onNothing (Map.lookup outputBaseType _actObjects) $
|
||||
throw400 NotExists $ "the type: " <> showName outputBaseType
|
||||
<> " is not an object type defined in custom types"
|
||||
resolvedWebhook <- resolveWebhook env _adHandler
|
||||
pure ( ActionDefinition resolvedArguments _adOutputType _adType
|
||||
_adHeaders _adForwardClientHeaders resolvedWebhook
|
||||
, outputObject
|
||||
)
|
||||
|
||||
runUpdateAction
|
||||
:: forall m. ( QErrM m , CacheRWM m, MonadTx m)
|
||||
|
@ -134,13 +134,13 @@ addComputedFieldP2Setup trackedTables table computedField definition rawFunction
|
||||
(rfiReturnTypeName rawFunctionInfo)
|
||||
(rfiReturnTypeType rawFunctionInfo)
|
||||
|
||||
computedFieldGraphQLName = G.Name $ computedFieldNameToText computedField
|
||||
computedFieldGraphQLName = G.mkName $ computedFieldNameToText computedField
|
||||
|
||||
mkComputedFieldInfo :: (MV.MonadValidate [ComputedFieldValidateError] m)
|
||||
=> m ComputedFieldInfo
|
||||
mkComputedFieldInfo = do
|
||||
-- Check if computed field name is a valid GraphQL name
|
||||
unless (G.isValidName computedFieldGraphQLName) $
|
||||
unless (isJust computedFieldGraphQLName) $
|
||||
MV.dispute $ pure $ CFVENotValidGraphQLName computedField
|
||||
|
||||
-- Check if function is VOLATILE
|
||||
|
@ -1,27 +1,26 @@
|
||||
{-# LANGUAGE RecordWildCards #-}
|
||||
module Hasura.RQL.DDL.CustomTypes
|
||||
( runSetCustomTypes
|
||||
, persistCustomTypes
|
||||
, clearCustomTypes
|
||||
, resolveCustomTypes
|
||||
, lookupPGScalar
|
||||
) where
|
||||
|
||||
import Control.Monad.Validate
|
||||
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashSet as Set
|
||||
import qualified Data.List.Extended as L
|
||||
import qualified Data.Text as T
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashSet as Set
|
||||
import qualified Data.List.Extended as L
|
||||
import qualified Data.Text as T
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
|
||||
import Hasura.EncJSON
|
||||
import Hasura.GraphQL.Validate.Types (mkScalarTy)
|
||||
import Hasura.Prelude
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.SQL.Types
|
||||
|
||||
import Hasura.GraphQL.Schema.CustomTypes (buildCustomTypesSchemaPartial)
|
||||
|
||||
{- Note [Postgres scalars in custom types]
|
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||
It’s very convenient to be able to reference Postgres scalars in custom type
|
||||
@ -53,13 +52,21 @@ validateCustomTypeDefinitions
|
||||
:: (MonadValidate [CustomTypeValidationError] m)
|
||||
=> TableCache
|
||||
-> CustomTypes
|
||||
-> HashSet PGScalarType -- ^ all Postgres base types
|
||||
-> m (HashSet PGScalarType) -- ^ see Note [Postgres scalars in custom types]
|
||||
validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT do
|
||||
-> HashSet PGScalarType
|
||||
-- ^ all Postgres base types. See Note [Postgres scalars in custom types]
|
||||
-> m AnnotatedCustomTypes
|
||||
validateCustomTypeDefinitions tableCache customTypes allPGScalars = do
|
||||
unless (null duplicateTypes) $ dispute $ pure $ DuplicateTypeNames duplicateTypes
|
||||
traverse_ validateEnum enumDefinitions
|
||||
traverse_ validateInputObject inputObjectDefinitions
|
||||
traverse_ validateObject objectDefinitions
|
||||
reusedPGScalars <- execWriterT $ traverse_ validateInputObject inputObjectDefinitions
|
||||
annotatedObjects <- mapFromL (unObjectTypeName . _otdName) <$>
|
||||
traverse validateObject objectDefinitions
|
||||
let scalarTypeMap = Map.map NOCTScalar $
|
||||
Map.map ASTCustom scalarTypes <> Map.mapWithKey ASTReusedPgScalar reusedPGScalars
|
||||
enumTypeMap = Map.map NOCTEnum enumTypes
|
||||
inputObjectTypeMap = Map.map NOCTInputObject inputObjectTypes
|
||||
nonObjectTypeMap = scalarTypeMap <> enumTypeMap <> inputObjectTypeMap
|
||||
pure $ AnnotatedCustomTypes nonObjectTypeMap annotatedObjects
|
||||
where
|
||||
inputObjectDefinitions = fromMaybe [] $ _ctInputObjects customTypes
|
||||
objectDefinitions = fromMaybe [] $ _ctObjects customTypes
|
||||
@ -74,12 +81,13 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT
|
||||
map (unObjectTypeName . _otdName) objectDefinitions
|
||||
|
||||
scalarTypes =
|
||||
Set.fromList $ map _stdName scalarDefinitions <> defaultScalars
|
||||
mapFromL _stdName $ scalarDefinitions <> defaultScalars
|
||||
|
||||
enumTypes =
|
||||
Set.fromList $ map (unEnumTypeName . _etdName) enumDefinitions
|
||||
mapFromL (unEnumTypeName . _etdName) enumDefinitions
|
||||
|
||||
defaultScalars = map G.NamedType ["Int", "Float", "String", "Boolean", "ID"]
|
||||
inputObjectTypes =
|
||||
mapFromL (unInputObjectTypeName . _iotdName) inputObjectDefinitions
|
||||
|
||||
validateEnum
|
||||
:: (MonadValidate [CustomTypeValidationError] m)
|
||||
@ -94,7 +102,7 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT
|
||||
|
||||
validateInputObject
|
||||
:: ( MonadValidate [CustomTypeValidationError] m
|
||||
, MonadWriter (Set.HashSet PGScalarType) m
|
||||
, MonadWriter (Map.HashMap G.Name PGScalarType) m
|
||||
)
|
||||
=> InputObjectTypeDefinition -> m ()
|
||||
validateInputObject inputObjectDefinition = do
|
||||
@@ -108,118 +116,126 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT
       dispute $ pure $ InputObjectDuplicateFields
         inputObjectTypeName duplicateFieldNames

-    let inputObjectTypes =
-          Set.fromList $ map (unInputObjectTypeName . _iotdName)
-          inputObjectDefinitions
-
-    let inputTypes =
-          scalarTypes `Set.union` enumTypes `Set.union` inputObjectTypes
+    let mapToSet = Set.fromList . Map.keys
+        inputTypes =
+          mapToSet scalarTypes `Set.union` mapToSet enumTypes `Set.union` mapToSet inputObjectTypes

     -- check that fields reference input types
     for_ (_iotdFields inputObjectDefinition) $ \inputObjectField -> do
       let fieldBaseType = G.getBaseType $ unGraphQLType $ _iofdType inputObjectField
       if | Set.member fieldBaseType inputTypes -> pure ()
-         | Just pgScalar <- lookupPGScalar fieldBaseType ->
-             tell $ Set.singleton pgScalar
+         | Just pgScalar <- lookupPGScalar allPGScalars fieldBaseType ->
+             tell $ Map.singleton fieldBaseType pgScalar
          | otherwise ->
             refute $ pure $ InputObjectFieldTypeDoesNotExist
               (_iotdName inputObjectDefinition)
               (_iofdName inputObjectField) fieldBaseType

   validateObject
-    :: ( MonadValidate [CustomTypeValidationError] m
-       , MonadWriter (Set.HashSet PGScalarType) m
-       )
-    => ObjectTypeDefinition -> m ()
+    :: (MonadValidate [CustomTypeValidationError] m)
+    => ObjectType -> m AnnotatedObjectType
   validateObject objectDefinition = do
     let objectTypeName = _otdName objectDefinition
         fieldNames = map (unObjectFieldName . _ofdName) $
                      toList (_otdFields objectDefinition)
-        relationships = fromMaybe [] $ _otdRelationships objectDefinition
-        relNames = map (unRelationshipName . _trName) relationships
+        maybeRelationships = _otdRelationships objectDefinition
+        relNames = maybe []
+          (map (unRelationshipName . _trName) . toList) maybeRelationships
         duplicateFieldNames = L.duplicates $ fieldNames <> relNames
-        fields = toList $ _otdFields objectDefinition
+        fields = _otdFields objectDefinition

     -- check for duplicate field names
     unless (null duplicateFieldNames) $
       dispute $ pure $ ObjectDuplicateFields objectTypeName duplicateFieldNames

-    scalarFields <- fmap Map.fromList $ for fields $ \objectField -> do
-      let fieldType = _ofdType objectField
-          fieldBaseType = G.getBaseType $ unGraphQLType fieldType
-          fieldName = _ofdName objectField
-
+    scalarOrEnumFields <- for fields $ \objectField -> do
+      let fieldName = _ofdName objectField
       -- check that arguments are not defined
       when (isJust $ _ofdArguments objectField) $
         dispute $ pure $ ObjectFieldArgumentsNotAllowed
           objectTypeName fieldName

-      let objectTypes = Set.fromList $ map (unObjectTypeName . _otdName)
-                        objectDefinitions
+      forM objectField $ \fieldType -> do
+        let fieldBaseType = G.getBaseType $ unGraphQLType fieldType
+            objectTypes = Set.fromList $ map (unObjectTypeName . _otdName)
+                          objectDefinitions

-      -- check that the fields only reference scalars and enums
-      -- and not other object types
-      if | Set.member fieldBaseType scalarTypes -> pure ()
-         | Set.member fieldBaseType enumTypes -> pure ()
-         | Set.member fieldBaseType objectTypes ->
-             dispute $ pure $ ObjectFieldObjectBaseType
-               objectTypeName fieldName fieldBaseType
-         | Just pgScalar <- lookupPGScalar fieldBaseType ->
-             tell $ Set.singleton pgScalar
-         | otherwise ->
-             dispute $ pure $ ObjectFieldTypeDoesNotExist
-               objectTypeName fieldName fieldBaseType
+        -- check that the fields only reference scalars and enums
+        -- and not other object types
+        annotatedObjectFieldType <-
+          if | Just scalarDef <- Map.lookup fieldBaseType scalarTypes ->
+                 pure $ AOFTScalar $ ASTCustom scalarDef
+             | Just enumDef <- Map.lookup fieldBaseType enumTypes ->
+                 pure $ AOFTEnum enumDef
+             | Set.member fieldBaseType objectTypes ->
+                 refute $ pure $ ObjectFieldObjectBaseType
+                   objectTypeName fieldName fieldBaseType
+             | Just pgScalar <- lookupPGScalar allPGScalars fieldBaseType ->
+                 pure $ AOFTScalar $ ASTReusedPgScalar fieldBaseType pgScalar
+             | otherwise ->
+                 refute $ pure $ ObjectFieldTypeDoesNotExist
+                   objectTypeName fieldName fieldBaseType
+        pure (unGraphQLType fieldType, annotatedObjectFieldType)

-      pure (fieldName, fieldType)

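The relationship checks that follow pair each field name with its type via `(&&&)` from `Control.Arrow`. As a quick refresher, `f &&& g` is the fan-out combinator: it feeds a single input to both functions and tuples the results.

```haskell
import Control.Arrow ((&&&))

-- One input, both results: pairUp 3 evaluates to (4, 6).
pairUp :: Int -> (Int, Int)
pairUp = (+ 1) &&& (* 2)
```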
-    for_ relationships $ \relationshipField -> do
-      let relationshipName = _trName relationshipField
-          remoteTable = _trRemoteTable relationshipField
-          fieldMapping = _trFieldMapping relationshipField
+    let scalarOrEnumFieldMap = Map.fromList $
+          map (_ofdName &&& (fst . _ofdType)) $ toList $ scalarOrEnumFields

+    annotatedRelationships <- forM maybeRelationships $ \relationships ->
+      forM relationships $ \TypeRelationship{..} -> do
         --check that the table exists
-      remoteTableInfo <- onNothing (Map.lookup remoteTable tableCache) $
+        remoteTableInfo <- onNothing (Map.lookup _trRemoteTable tableCache) $
           refute $ pure $ ObjectRelationshipTableDoesNotExist
-          objectTypeName relationshipName remoteTable
+            objectTypeName _trName _trRemoteTable

         -- check that the column mapping is sane
-      forM_ (Map.toList fieldMapping) $ \(fieldName, columnName) -> do
+        annotatedFieldMapping <- flip Map.traverseWithKey _trFieldMapping $
+          \fieldName columnName -> do
+            case Map.lookup fieldName scalarOrEnumFieldMap of
+              Nothing -> dispute $ pure $ ObjectRelationshipFieldDoesNotExist
+                objectTypeName _trName fieldName
+              Just fieldType ->
+                -- the field should be a non-list type scalar
+                when (G.isListType fieldType) $
+                  dispute $ pure $ ObjectRelationshipFieldListType
+                    objectTypeName _trName fieldName

-        case Map.lookup fieldName scalarFields of
-          Nothing -> dispute $ pure $ ObjectRelationshipFieldDoesNotExist
-            objectTypeName relationshipName fieldName
-          Just fieldType ->
-            -- the field should be a non-list type scalar
-            when (isListType fieldType) $
-              dispute $ pure $ ObjectRelationshipFieldListType
-                objectTypeName relationshipName fieldName
+            -- the column should be a column of the table
+            case getPGColumnInfoM remoteTableInfo (fromPGCol columnName) of
+              Nothing ->
+                refute $ pure $ ObjectRelationshipColumnDoesNotExist
+                  objectTypeName _trName _trRemoteTable columnName
+              Just pgColumnInfo -> pure pgColumnInfo

-        -- the column should be a column of the table
-        when (getPGColumnInfoM remoteTableInfo (fromPGCol columnName) == Nothing) $
-          dispute $ pure $ ObjectRelationshipColumnDoesNotExist
-            objectTypeName relationshipName remoteTable columnName
-      return ()
+        pure $ TypeRelationship _trName _trType remoteTableInfo annotatedFieldMapping

-    lookupPGScalar baseType = -- see Note [Postgres scalars in custom types]
-      find ((==) baseType . mkScalarTy) allPGScalars
+    pure $ ObjectTypeDefinition objectTypeName (_otdDescription objectDefinition)
+      scalarOrEnumFields annotatedRelationships

+-- see Note [Postgres scalars in custom types]
+lookupPGScalar :: Set.HashSet PGScalarType -> G.Name -> Maybe PGScalarType
+lookupPGScalar allPGScalars baseType =
+  fmap snd
+    $ find ((==) baseType . fst)
+    $ flip mapMaybe (toList allPGScalars)
+    $ \pgScalar -> (,pgScalar) <$> G.mkName (toSQLTxt pgScalar)

 data CustomTypeValidationError
-  = DuplicateTypeNames !(Set.HashSet G.NamedType)
+  = DuplicateTypeNames !(Set.HashSet G.Name)
   -- ^ type names have to be unique across all types
   | InputObjectFieldTypeDoesNotExist
-      !InputObjectTypeName !InputObjectFieldName !G.NamedType
+      !InputObjectTypeName !InputObjectFieldName !G.Name
   -- ^ field name and the field's base type
   | InputObjectDuplicateFields
       !InputObjectTypeName !(Set.HashSet InputObjectFieldName)
   -- ^ duplicate field declaration in input objects
   | ObjectFieldTypeDoesNotExist
-      !ObjectTypeName !ObjectFieldName !G.NamedType
+      !ObjectTypeName !ObjectFieldName !G.Name
   -- ^ field name and the field's base type
   | ObjectDuplicateFields !ObjectTypeName !(Set.HashSet G.Name)
   -- ^ duplicate field declaration in objects
   | ObjectFieldArgumentsNotAllowed !ObjectTypeName !ObjectFieldName
   -- ^ object fields can't have arguments
-  | ObjectFieldObjectBaseType !ObjectTypeName !ObjectFieldName !G.NamedType
+  | ObjectFieldObjectBaseType !ObjectTypeName !ObjectFieldName !G.Name
   -- ^ object fields can't have object types as base types
   | ObjectRelationshipTableDoesNotExist
       !ObjectTypeName !RelationshipName !QualifiedTable
@@ -315,11 +331,13 @@ clearCustomTypes = do

 resolveCustomTypes
   :: (MonadError QErr m)
-  => TableCache -> CustomTypes -> HashSet PGScalarType -> m (NonObjectTypeMap, AnnotatedObjects)
-resolveCustomTypes tableCache customTypes allPGScalars = do
-  reusedPGScalars <- either (throw400 ConstraintViolation . showErrors) pure
+  => TableCache
+  -> CustomTypes
+  -> HashSet PGScalarType
+  -> m AnnotatedCustomTypes
+resolveCustomTypes tableCache customTypes allPGScalars =
+  either (throw400 ConstraintViolation . showErrors) pure
     =<< runValidateT (validateCustomTypeDefinitions tableCache customTypes allPGScalars)
-  buildCustomTypesSchemaPartial tableCache customTypes reusedPGScalars
   where
     showErrors :: [CustomTypeValidationError] -> T.Text
     showErrors allErrors =

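Throughout these hunks, `dispute` records an error and keeps validating, while `refute` records an error and aborts the surrounding computation; both come from the monad-validate package. A minimal sketch of the distinction, with a hypothetical validator:

```haskell
import Control.Monad          (when)
import Control.Monad.Validate (Validate, dispute, refute)

-- dispute: non-fatal, execution continues and errors accumulate;
-- refute: fatal, this branch cannot produce a result at all.
checkName :: String -> Validate [String] String
checkName name = do
  when (length name > 10) $ dispute ["name is suspiciously long"]
  if null name
    then refute ["name must be non-empty"]
    else pure name
```

Running it with `runValidate` yields `Right` only when no errors were raised at all; a disputed-but-completed run still surfaces its accumulated errors as a `Left`.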
@@ -7,7 +7,7 @@ module Hasura.RQL.DDL.EventTrigger
   , runRedeliverEvent
   , runInvokeEventTrigger

-  -- TODO: review
+  -- TODO(from master): review
   , delEventTriggerFromCatalog
   , subTableP2
   , subTableP2Setup

@@ -74,8 +74,10 @@ runClearMetadata _ = do
 applyQP1
   :: (QErrM m)
   => ReplaceMetadata -> m ()
-applyQP1 (ReplaceMetadata _ tables functionsMeta schemas collections
-          allowlist _ actions cronTriggers) = do
+applyQP1 (ReplaceMetadata _ tables functionsMeta schemas
+          collections
+          allowlist _ actions
+          cronTriggers) = do
   withPathK "tables" $ do

     checkMultipleDecls "tables" $ map _tmTable tables
@@ -299,12 +301,11 @@ fetchMetadata = do

   customTypes <- fetchCustomTypes

-  -- fetch actions
+  -- -- fetch actions
   actions <- fetchActions

   cronTriggers <- fetchCronTriggers

-
   return $ ReplaceMetadata currentMetadataVersion
     (HMIns.elems postRelMap)
     functions

@@ -3,21 +3,7 @@ module Hasura.RQL.DDL.Metadata.Generator
   (genReplaceMetadata)
 where

-import Hasura.GraphQL.Utils (simpleGraphQLQuery)
 import Hasura.Prelude
-import Hasura.RQL.DDL.Headers
-import Hasura.RQL.DDL.Metadata.Types
-import Hasura.RQL.Types
-import Hasura.SQL.Types
-
-import qualified Hasura.RQL.DDL.ComputedField as ComputedField
-import qualified Hasura.RQL.DDL.Permission as Permission
-import qualified Hasura.RQL.DDL.Permission.Internal as Permission
-import qualified Hasura.RQL.DDL.QueryCollection as Collection
-import qualified Hasura.RQL.DDL.Relationship as Relationship
-import qualified Hasura.RQL.DDL.Schema as Schema
-
-import System.Cron.Types

 import qualified Data.Aeson as J
 import qualified Data.Text as T
@@ -31,11 +17,25 @@ import qualified System.Cron.Parser as Cr

 import Data.List.Extended (duplicates)
 import Data.Scientific
+import System.Cron.Types
 import Test.QuickCheck
 import Test.QuickCheck.Instances.Semigroup ()
 import Test.QuickCheck.Instances.Time ()
 import Test.QuickCheck.Instances.UnorderedContainers ()

+import qualified Hasura.RQL.DDL.ComputedField as ComputedField
+import qualified Hasura.RQL.DDL.Permission as Permission
+import qualified Hasura.RQL.DDL.Permission.Internal as Permission
+import qualified Hasura.RQL.DDL.QueryCollection as Collection
+import qualified Hasura.RQL.DDL.Relationship as Relationship
+import qualified Hasura.RQL.DDL.Schema as Schema
+
+import Hasura.GraphQL.Utils (simpleGraphQLQuery)
+import Hasura.RQL.DDL.Headers
+import Hasura.RQL.DDL.Metadata.Types
+import Hasura.RQL.Types
+import Hasura.SQL.Types

 genReplaceMetadata :: Gen ReplaceMetadata
 genReplaceMetadata = do
   version <- arbitrary
@@ -55,7 +55,7 @@ genReplaceMetadata = do
     MVVersion2 -> FMVersion2 <$> arbitrary

 instance Arbitrary G.Name where
-  arbitrary = G.Name . T.pack <$> listOf1 (elements ['a'..'z'])
+  arbitrary = G.unsafeMkName . T.pack <$> listOf1 (elements ['a'..'z'])

 instance Arbitrary MetadataVersion where
   arbitrary = genericArbitrary
@@ -205,18 +205,12 @@ instance Arbitrary Collection.CreateCollection where
 instance Arbitrary Collection.CollectionReq where
   arbitrary = genericArbitrary

-instance Arbitrary G.NamedType where
-  arbitrary = G.NamedType <$> arbitrary
-
 instance Arbitrary G.Description where
   arbitrary = G.Description <$> arbitrary

 instance Arbitrary G.Nullability where
   arbitrary = genericArbitrary

-instance Arbitrary G.ListType where
-  arbitrary = G.ListType <$> arbitrary
-
 instance Arbitrary G.GType where
   arbitrary = genericArbitrary

@@ -247,16 +241,16 @@ instance Arbitrary RelationshipName where
 instance Arbitrary ObjectFieldName where
   arbitrary = genericArbitrary

-instance Arbitrary TypeRelationshipDefinition where
+instance (Arbitrary a, Arbitrary b) => Arbitrary (TypeRelationship a b) where
   arbitrary = genericArbitrary

 instance Arbitrary ObjectTypeName where
   arbitrary = genericArbitrary

-instance Arbitrary ObjectFieldDefinition where
+instance (Arbitrary a) => Arbitrary (ObjectFieldDefinition a) where
   arbitrary = genericArbitrary

-instance Arbitrary ObjectTypeDefinition where
+instance (Arbitrary a, Arbitrary b, Arbitrary c) => Arbitrary (ObjectTypeDefinition a b c) where
   arbitrary = genericArbitrary

 instance Arbitrary ScalarTypeDefinition where
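The hunk above follows one pattern: metadata types gained type parameters, so their generic `Arbitrary` instances now lift an `Arbitrary` constraint over each parameter. A sketch with a hypothetical single-parameter record, assuming the generic-arbitrary package's `genericArbitrary`:

```haskell
{-# LANGUAGE DeriveGeneric #-}

import GHC.Generics (Generic)
import Test.QuickCheck (Arbitrary (..))
import Test.QuickCheck.Arbitrary.Generic (genericArbitrary)

-- Hypothetical stand-in for a type like ObjectFieldDefinition a.
data FieldDef a = FieldDef
  { fdName :: String
  , fdType :: a
  } deriving (Show, Generic)

-- Any instantiation with an Arbitrary parameter gets a generator for free.
instance (Arbitrary a) => Arbitrary (FieldDef a) where
  arbitrary = genericArbitrary
```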
@@ -277,7 +271,7 @@ instance Arbitrary CustomTypes where
 instance Arbitrary ArgumentName where
   arbitrary = genericArbitrary

-instance Arbitrary ArgumentDefinition where
+instance (Arbitrary a) => Arbitrary (ArgumentDefinition a) where
   arbitrary = genericArbitrary

 instance Arbitrary ActionMutationKind where
@@ -286,7 +280,7 @@ instance Arbitrary ActionMutationKind where
 instance Arbitrary ActionType where
   arbitrary = genericArbitrary

-instance (Arbitrary a) => Arbitrary (ActionDefinition a) where
+instance (Arbitrary a, Arbitrary b) => Arbitrary (ActionDefinition a b) where
   arbitrary = genericArbitrary

 instance Arbitrary ActionName where
@@ -301,19 +295,11 @@ instance Arbitrary ActionPermissionMetadata where
 instance Arbitrary ActionMetadata where
   arbitrary = genericArbitrary

-deriving instance Arbitrary G.StringValue
-deriving instance Arbitrary G.Variable
-deriving instance Arbitrary G.ListValue
-deriving instance Arbitrary G.ObjectValue
-
-instance Arbitrary G.Value where
-  arbitrary = genericArbitrary
-
 instance (Arbitrary a) => Arbitrary (G.ObjectFieldG a) where
   arbitrary = genericArbitrary

 deriving instance Arbitrary RemoteArguments

+instance Arbitrary a => Arbitrary (G.Value a) where
+  arbitrary = genericArbitrary
+
 instance Arbitrary FieldCall where
   arbitrary = genericArbitrary