From 7e970177c19169ffff4e06fe37fe9d0aafe04ff8 Mon Sep 17 00:00:00 2001
From: Alexis King
Date: Fri, 21 Aug 2020 12:27:01 -0500
Subject: [PATCH] Rewrite GraphQL schema generation and query parsing (close #2801) (#4111)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Aka “the PDV refactor.” History is preserved on the branch 2801-graphql-schema-parser-refactor.

* [skip ci] remove stale benchmark commit from commit_diff
* [skip ci] Check for root field name conflicts between remotes
* [skip ci] Additionally check for conflicts between remotes and DB
* [skip ci] Check for conflicts in schema when tracking a table
* [skip ci] Fix equality checking in GraphQL AST
* server: fix mishandling of GeoJSON inputs in subscriptions (fix #3239) (#4551)
  * Add support for multiple top-level fields in a subscription to improve testability of subscriptions
  * Add an internal flag to enable multiple subscriptions
  * Add missing call to withConstructorFn in live queries (fix #3239)
  Co-authored-by: Alexis King
* Scheduled triggers (close #1914) (#3553)
  server: add scheduled triggers
  Co-authored-by: Alexis King
  Co-authored-by: Marion Schleifer
  Co-authored-by: Karthikeyan Chinnakonda
  Co-authored-by: Aleksandra Sikora
* dev.sh: bump version due to addition of croniter python dependency
* server: fix an introspection query caching issue (fix #4547) (#4661)
  Introspection queries accept variables, but we need to make sure to also touch the variables that we ignore, so that an introspection query is marked not reusable if we are not able to build a correct query plan for it. A better solution here would be to deal with such unused variables correctly, so that more introspection queries become reusable. An even better solution would be to type-safely track *how* to reuse which variables, rather than to split the reuse marking from the planning.
  Co-authored-by: Tirumarai Selvan
* flush log buffer on exception in mkWaiApp (fix #4772) (#4801)
  * flush log buffer on exception in mkWaiApp
  * add comment to explain the introduced change
  * add changelog
* allow logging details of a live query polling thread (#4959)
  * changes for poller-log: add various multiplexed query info in poller-log
  * minor cleanup; also fixes a bug which would return duplicate data
  * Live query poller stats can now be logged. This also removes the in-memory stats that were collected about batched query execution, as the log lines, when piped into a monitoring tool, will give us better insights.
  * allow poller-log to be configurable
  * log minimal information in the livequery-poller-log; other information can be retrieved from /dev/subscriptions/extended
  * fix a few review comments
  * avoid marshalling and unmarshalling from ByteString to EncJSON
  * separate out SubscriberId and SubscriberMetadata
  Co-authored-by: Anon Ray
* Don't compile in developer APIs by default
* Tighten up handling of admin secret, more docs
  Store the admin secret only as a hash to prevent leaking the secret inadvertently, and to prevent timing attacks on the secret.
  NOTE: best practice for stored user passwords is a function with a tunable cost like bcrypt, but our threat model is quite different (even if we thought we could reasonably protect the secret from an attacker who could read arbitrary regions of memory), and bcrypt is far too slow (by design) to perform on each request. We'd have to rely on our (technically savvy) users to choose high-entropy passwords in any case.
  Referencing #4736
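  To make the hashed-secret idea concrete, here is a minimal sketch (hypothetical names, assuming the cryptonite package; the real module layout differs):

    import           Crypto.Hash        (Digest, SHA256, hash)
    import           Data.Text          (Text)
    import qualified Data.Text.Encoding as TE

    -- Only the digest is retained; the plaintext secret is never stored.
    newtype HashedAdminSecret = HashedAdminSecret (Digest SHA256)
      deriving (Eq)

    hashAdminSecret :: Text -> HashedAdminSecret
    hashAdminSecret = HashedAdminSecret . hash . TE.encodeUtf8

    -- Comparing fixed-size digests, rather than variable-length plaintext,
    -- removes the timing side-channel on the secret itself.
    checkAdminSecret :: HashedAdminSecret -> Text -> Bool
    checkAdminSecret stored provided = stored == hashAdminSecret provided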
* server/docs: add instructions to fix loss of float precision in PostgreSQL <= 11 (#5187)
  This adds a server flag, --pg-connection-options, that can be used to set a PostgreSQL connection parameter, extra_float_digits, that needs to be used to avoid loss of data on older versions of PostgreSQL, which have odd default behavior when returning float values. (fixes #5092)
* [skip ci] Add new commits from master to the commit diff
* [skip ci] serve default directives (skip & include) over introspection
* [skip ci] Update non-Haskell assets with the version on master
* server: refactor GQL execution check and config API (#5094)
  Co-authored-by: Vamshi Surabhi
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] fix js issues in tests by pinning dependencies version
* [skip ci] bump graphql version
* [skip ci] Add note about memory usage
* generalize query execution logic on Postgres (#5110)
  * generalize PGExecCtx to support specialized functions for various operations
  * fix tests compilation
  * allow customising PGExecCtx when starting the web server
* server: changes catalog initialization and logging for pro customization (#5139)
  * new typeclass to abstract the logic of QueryLog-ing
  * abstract the logic of logging websocket-server logs; introduce a MonadWSLog typeclass
  * move catalog initialization to init step; expose a helper function to migrate catalog; create schema cache in initialiseCtx
  * expose various modules and functions for pro
* [skip ci] cosmetic change
* [skip ci] fix test calling a mutation that does not exist
* [skip ci] minor text change
* [skip ci] refactored input values
* [skip ci] remove VString Origin
* server: fix the updating of headers in the update cron trigger API, and create future events immediately (#5151)
  * server: fix bug to update headers in an existing cron trigger and create future events
  Co-authored-by: Tirumarai Selvan
* Lower stack chunk size in RTS to reduce thread STACK memory (closes #5190)
  This reduces memory consumption for new idle subscriptions significantly (see linked ticket). The hypothesis is: we fork a lot of threads per websocket, and some of these use slightly more than the initial 1K stack size, so the first overflow balloons to 32K, when significantly less is required. However, running with `+RTS -K1K -xc` did not seem to show evidence of any overflows! So it's a mystery why this improves things. GHC should probably also be doubling the stack buffer at each overflow, or doing something even smarter; the knobs we have aren't so helpful.
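  For reference, the GHC RTS knobs involved here are `-ki` (initial thread stack size, default 1k), `-kc` (stack chunk size allocated on overflow, default 32k), and `-kb` (chunk buffer size). An invocation along these lines, with purely illustrative values and assuming the binary is linked with `-rtsopts`, is the kind of tuning being discussed:

    $ graphql-engine +RTS -kc8k -RTS serve ...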
* [skip ci] fix todo and schema generation for aggregate fields
* 5087 libpq pool leak (#5089)
  Shrink libpq buffers to 1MB before returning connection to pool. Closes #5087.
  See: https://github.com/hasura/pg-client-hs/pull/19
  Also related: #3388 #4077
* bump pg-client-hs version (fixes a build issue on some environments) (#5267)
* do not use prepared statements for mutations
* server: unlock scheduled events on graceful shutdown (#4928)
* Fix buggy parsing of new --conn-lifetime flag in 2b0e3774
* [skip ci] remove cherry-picked commit from commit_diff.txt
* server: include additional fields in scheduled trigger webhook payload (#5262)
  * include scheduled triggers metadata in the webhook body
  Co-authored-by: Tirumarai Selvan
* server: call the webhook asynchronously in event triggers (#5352)
  * server: call the webhook asynchronously in event triggers
* Expose all modules in Cabal file (#5371)
* [skip ci] update commit_diff.txt
* [skip ci] fix cast exp parser & a few TODOs
* [skip ci] fix remote fields arguments
* [skip ci] fix a few more TODOs; no-op refactor; move resolve/action.hs to execute/action.hs
* Pass environment variables around as a data structure, via @sordina (#5374)
  * Pass environment variables around as a data structure, via @sordina
  * Resolving build error
  * Adding Environment passing note to changelog
  * Removing references to ILTPollerLog as this seems to have been reintroduced from a bad merge
  * removing commented-out imports
  * Language pragmas already set by project
  * Linking async thread
  * Apply suggestions from code review: use `runQueryTx` instead of `runLazyTx` for queries
  * remove the non-user-facing entry in the changelog
  Co-authored-by: Phil Freeman
  Co-authored-by: Phil Freeman
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] fix: restrict remote relationship field generation for hasura queries
* [skip ci] no-op refactor; move insert execution code from schema parser module
* [skip ci] implement header checking
  Probably closes #14 and #3659.
* server: refactor 'pollQuery' to have a hook to process 'PollDetails' (#5391)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* update pg-client (#5421)
* [skip ci] update commit_diff
* Fix latency buckets for telemetry data
  These must have gotten messed up during a refactor. As a consequence, almost all samples received so far fall into the single erroneous 0-to-1K-seconds (originally supposed to be 1ms?) bucket. I also re-thought what the numbers should be, but these are still arbitrary and might want adjusting in the future.
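  To pin down what the bucketing is supposed to do, a minimal sketch (invented bucket boundaries; the real ones live in the telemetry module and differ):

    -- Upper bounds of latency histogram buckets, in milliseconds; a sample
    -- is counted against the first bound it does not exceed.
    bucketBounds :: [Double]
    bucketBounds = [1, 10, 50, 100, 500, 1000, 10000]

    bucketFor :: Double -> Int
    bucketFor sample = length (takeWhile (< sample) bucketBounds)
    -- bucketFor 0.4 == 0, bucketFor 75 == 3, and anything past the last
    -- bound lands in the overflow bucket; a misplaced unit (seconds vs.
    -- milliseconds) silently funnels almost everything into one bucket.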
* [skip ci] include the latest commit compared against master in commit_diff
* [skip ci] include new commits from master in commit_diff
* [skip ci] improve description generation
* [skip ci] sort all introspect arrays
* [skip ci] allow parsers to specify error codes
* [skip ci] fix integer and float parsing error code
* [skip ci] scalar-from-JSON errors are now parse errors
* [skip ci] fixed negative integer error message and code
* [skip ci] Re-fix nullability in relationships
* [skip ci] no-op refactor and removed a couple of FIXMEs
* [skip ci] uncomment code in 'deleteMetadataObject'
* [skip ci] Fix re-fix of nullability for relationships
* [skip ci] fix default arguments error code
* [skip ci] updated test error message
  !!! WARNING !!!
  Since all fields accept `null`, they are all technically optional in the new schema. This means there is no such thing as a missing mandatory field anymore: a field that doesn't have a default value, and which therefore isn't labelled as "optional" in the schema, will be assumed to be null if it's missing, so it is no longer possible to get an error for a missing mandatory field. The only possible error now is when an optional positional argument is omitted but is not the last positional argument.
* [skip ci] cleanup of int scalar parser
* [skip ci] retro-compatibility of offset as string
* [skip ci] Remove commit from commit_diff.txt
  Although, strictly speaking, we don't know whether this would work correctly in PDV if we were to implement query plan caching, the fact is that in the theoretical case that we had the same issue in PDV, it would probably apply not just to introspection, and the fix would be written completely differently. So this old commit is of no value to us other than the heads-up "make sure query plan caching works correctly even in the presence of unused variables", which is already part of the test suite.
* Add MonadTrace and MonadExecuteQuery abstractions (#5383)
* [skip ci] Fix accumulation of input object types
  Just like object types, interface types, and union types, we have to avoid circularities when collecting input types from the GraphQL AST. Additionally, this fixes equality checks for input object types (whose fields are unordered, and hence should be compared as sets) and enum types (ditto).
* [skip ci] fix fragment error path
* [skip ci] fix node error code
* [skip ci] fix paths in insert queries
* [skip ci] fix path in objects
* [skip ci] manually alter node id path for consistency
* [skip ci] more node error fixups
* [skip ci] one last relay error message fix
* [skip ci] update commit_diff
* Propagate the trace context to event triggers (#5409)
  * Propagate the trace context to event triggers
  * Handle missing trace and span IDs
  * Store trace context as one LOCAL
  * Add migrations
  * Documentation
  * changelog
  * Fix warnings
  * Respond to code review suggestions
  * Respond to code review
  * Undo changelog
  * Update CHANGELOG.md
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* server: log request/response sizes for event triggers (#5463)
  * server: log request/response sizes for event triggers; event triggers (and scheduled triggers) now have request/response sizes in their logs
  * add changelog entry
* Tracing: Simplify HTTP traced request (#5451)
  Remove the Inversion of Control (SuspendRequest) and simplify the tracing of HTTP requests.
  Co-authored-by: Phil Freeman
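  The shape of that simplification, roughly: rather than suspending the request and handing control back to the caller, the traced helper takes the request and the continuation that performs it, and brackets the callback in a span directly. A sketch under assumed names (`MonadTrace` here is a stand-in, not the real class):

    import Data.Text (Text)

    -- a stand-in for the tracing interface; names are illustrative
    class Monad m => MonadTrace m where
      trace :: Text -> m a -> m a  -- run an action inside a named span

    -- no SuspendRequest: the caller passes the request and how to perform
    -- it, and tracing wraps the callback directly
    tracedHttpRequest :: MonadTrace m => Text -> req -> (req -> m a) -> m a
    tracedHttpRequest label req perform = trace label (perform req)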
* Attach request ID as tracing metadata (#5456)
  * Propagate the trace context to event triggers
  * Handle missing trace and span IDs
  * Store trace context as one LOCAL
  * Add migrations
  * Documentation
  * Include the request ID as trace metadata
  * changelog
  * Fix warnings
  * Respond to code review suggestions
  * Respond to code review
  * Undo changelog
  * Update CHANGELOG.md
  * Typo
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* server: add logging for action handlers (#5471)
  * server: add logging for action handlers
  * add changelog entry
  * change action-handler log type from internal to non-internal
  * fix action-handler-log name
* server: pass http and websocket request to logging context (#5470)
  * pass request body to logging context in all cases
  * add message size logging on the websocket API; this is required by graphql-engine-pro/#416
  * message size logging on websocket API: as we need to log all messages received/sent by the websocket server, it makes sense to log them as part of the websocket server event logs. Previously, messages received were logged inside the onMessage handler, and messages sent were logged only for "data" messages (as a server event log).
  * fix review comments
  Co-authored-by: Phil Freeman
* server: stop eventing subsystem threads when shutting down (#5479)
  * server: stop eventing subsystem threads when shutting down
  * Apply suggestions from code review
  Co-authored-by: Karthikeyan Chinnakonda
  Co-authored-by: Phil Freeman
  Co-authored-by: Phil Freeman
  Co-authored-by: Karthikeyan Chinnakonda
* [skip ci] update commit_diff with new commits added in master
* Bugfix to support 0-size HASURA_GRAPHQL_QUERY_PLAN_CACHE_SIZE (a sketch of the zero-capacity behaviour follows below)
  Also some minor refactoring of the bounded cache module:
  - the maxBound check in `trim` was confusing and unnecessary
  - consequently, trim was unnecessary for lookupPure
  Also add some basic tests
* Support only the bounded cache, with a default HASURA_GRAPHQL_QUERY_PLAN_CACHE_SIZE of 4000. Closes #5363
* [skip ci] remove merge commit from commit_diff
* server: Fix compiler warning caused by GHC upgrade (#5489)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* [skip ci] update all non-server code from master
* [skip ci] aligned object field error message with master
* [skip ci] fix remaining undefined?
* [skip ci] remove unused import
* [skip ci] revert to previous error message, fix tests
* Move nullableType/nonNullableType to Schema.hs
  These are functions on Types, not on Parsers.
* [skip ci] fix setup to fix backend-only test
  The order in which permission checks are performed on the branch is slightly different than on master, resulting in a slightly different error if there are no other mutations the user has access to. By adding update permissions, we go back to the expected case.
* [skip ci] fix insert geojson tests to reflect new paths
* [skip ci] fix enum test for better error message
* [skip ci] fix header test for better error message
* [skip ci] fix fragment cycle test for better error message
* [skip ci] fix error message for type mismatch
* [skip ci] fix variable path in test
* [skip ci] adjust tests after bug fix
* [skip ci] more tests fixing
* Add hdb_catalog.current_setting abstraction for reading Hasura settings
  As the comment in the function's definition explains, this is needed to work around an awkward Postgres behavior.
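Returning to the bounded plan cache item above: the zero-capacity case simply degenerates into a cache that stores nothing. A minimal sketch with simplified types (the real module also tracks usage in order to pick eviction victims):

    import           Data.Hashable       (Hashable)
    import qualified Data.HashMap.Strict as Map

    data BoundedCache k v = BoundedCache
      { cacheCapacity :: Int
      , cacheEntries  :: Map.HashMap k v
      }

    insert :: (Eq k, Hashable k) => k -> v -> BoundedCache k v -> BoundedCache k v
    insert k v (BoundedCache cap m)
      | cap <= 0          = BoundedCache cap m  -- 0-size: caching disabled, not an error
      | Map.size m >= cap = BoundedCache cap (Map.insert k v (evictOne m))
      | otherwise         = BoundedCache cap (Map.insert k v m)
      where
        -- crude stand-in eviction: drop an arbitrary entry
        evictOne = Map.fromList . drop 1 . Map.toList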
* [skip ci] Update CONTRIBUTING.md to mention Node setup for Python tests
* [skip ci] Add missing Python tests env var to CONTRIBUTING.md
* [skip ci] fix order of result when subscription is run with multiple nodes
* [skip ci] no-op refactor: fix a warning in Internal/Parser.hs
* [skip ci] throw error when a subscription contains remote joins
* [skip ci] Enable easier profiling by hiding AssertNF behind a flag
  In order to compile a profiling build, run:
    $ cabal new-build -f profiling --enable-profiling
* [skip ci] Fix two warnings
  We used to look up the objects that implement a given interface by filtering all objects in the schema document. However, one of the tests expects us to generate a warning if the provided `implements` field of an introspection query specifies an object that does not implement some interface. So we use that field instead.
* [skip ci] Fix warnings by commenting out query plan caching
* [skip ci] improve masking/commenting of query-caching-related code & a few warning fixes
* [skip ci] Fixed compiler warnings in graphql-parser-hs
* Sync non-Haskell assets with master
* [skip ci] add a test inserting an invalid GraphQL but valid JSON value in a jsonb column
* [skip ci] Avoid converting to/from Map
* [skip ci] Apply some hlint suggestions
* [skip ci] remove redundant constraints from buildLiveQueryPlan and explainGQLQuery
* [skip ci] add NOTEs about missing Tracing constraints in PDV from master
* Remove -fdefer-typed-holes, fix warnings
* Update cabal.project.freeze
* Limit GHC's heap size to 8GB in CI to avoid the OOM killer
* Commit package-lock.json for Python tests' remote schema server
* restrict env variables starting with HASURA_GRAPHQL_ for headers configuration in actions, event triggers & remote schemas (#5519)
  * restrict env variables starting with HASURA_GRAPHQL_ for headers definition in actions & event triggers
  * update CHANGELOG.md
  * Apply suggestions from code review
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* add test for the table_by_pk node when the role doesn't have permission to the PK
* [skip ci] fix introspection query if any enum column present in primary key (fix #5200) (#5522)
* [skip ci] test case fix for a6450e126bc2d98bcfd3791501986e4627ce6c6f
* [skip ci] add tests for agg queries when the role doesn't have access to any cols
* fix backend test
* Simplify subscription execution
* [skip ci] add test to check if required headers are present while querying
  Suppose table B is related to table A, and querying B requires certain headers; the test checks that we throw an error when such a header is not set and B is queried through A. (A sketch of this header-union rule follows below.)
* fix mutations not checking for view mutability
* [skip ci] add variable type checking and corresponding tests
* [skip ci] add test to check if update headers are present while doing an upsert
* [skip ci] add positive counterparts to some of the negative permission tests
* fix args missing their description in introspect
* [skip ci] Remove unused function; insert missing markNotReusable call
* [skip ci] Add a Note about InputValue
* [skip ci] Delete LegacySchema/ 🎉
* [skip ci] Delete GraphQL/{Resolve,Validate}/ 🎉
* [skip ci] Delete top-level Resolve/Validate modules; tidy .cabal file
* [skip ci] Delete LegacySchema top-level module
  Somehow I missed this one.
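The header-union rule being tested is a small recursion. A sketch with assumed types (see also the changelog entry later in this patch): the headers required for a selection are the union over every table it touches, not just the root table:

    import Data.HashSet (HashSet)
    import Data.Text    (Text)

    type Table = Text

    -- a selection is a root table plus the relationships it follows
    data Selection = Selection
      { selTable :: Table
      , selRels  :: [Selection]
      }

    -- given each table's required headers, a selection demands their union
    requiredHeaders :: (Table -> HashSet Text) -> Selection -> HashSet Text
    requiredHeaders headersOf (Selection tbl rels) =
      headersOf tbl <> foldMap (requiredHeaders headersOf) rels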
* fix input value to json
* [skip ci] elaborate on JSON objects in GraphQL
* [skip ci] add missing file
* [skip ci] add a test with a subscription containing remote joins
* add a test with remote joins in mutation output
* [skip ci] Add some comments to Schema/Mutation.hs
* [skip ci] Remove no-longer-needed code from RemoteServer.hs
* [skip ci] Use a helper function to generate conflict clause parsers
* [skip ci] fix type checker error in fields with default value
* capitalize the header keys in select_articles_without_required_headers
  Somehow, this was the reason the tests were failing. I have no idea why!
* [skip ci] Add a long Note about optional fields and nullability
* Improve comments a bit; simplify Schema/Common.hs a bit
* [skip ci] full implementation of 5.8.5 type checking (a sketch of the rule follows below)
* [skip ci] fix validation test teardown
* [skip ci] fix schema stitching test
* fix remote schema ignoring enum nullability
* [skip ci] fix fieldOptional to not discard nullability
* revert nullability of use_spheroid
* fix comment
* add required remote fields with arguments for tests
* [skip ci] add missing docstrings
* [skip ci] fixed description of remote fields
* [skip ci] change docstring for consistency
* fix several schema inconsistencies
* revert behaviour change in function arguments parsing
* fix remaining nullability issues in new schema
* minor no-op refactor; use isListType from graphql-parser-hs
* use nullability of remote schema node while creating a remote relationship
* fix 'ID' input coercing & action 'ID' type relationship mapping
* include ASTs in MonadExecuteQuery
  Needed for the PRO code-base.
* Delete code for "interfaces implementing ifaces" (draft GraphQL spec)
  Previously I started writing some code that adds support for a future GraphQL feature where interfaces may themselves be sub-types of other interfaces. However, this code was incomplete and partially incorrect, so this commit deletes support for that entirely.
* Ignore a remote schema test during the upgrade/downgrade test
  The PDV refactor does a better job of exposing a minimal set of types through introspection. In particular, not every type that is present in a remote schema is re-exposed by Hasura. The test test_schema_stitching.py::TestRemoteSchemaBasic::test_introspection assumed that all types were re-exposed (which is not required for GraphQL compatibility) in order to test some aspect of our support for remote schemas. So while this particular test has been updated on PDV, the PDV branch no longer passes the old test, which we argue to be incorrect. Hence this test is disabled while we await a release, after which we can re-enable it. This also re-enables a test that was previously disabled for similar, though unrelated, reasons.
* add haddock documentation to the action's field parsers
* Deselecting some tests in server-upgrade
  Some tests with the current build are failing on server upgrade when they should not: the response is more accurate than it was before. Also, the upgrade tests were not throwing errors when a test was expected to return an error but succeeded; the test framework is patched to catch this case.
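The §5.8.5 rule ("All Variable Usages Are Allowed") boils down to a structural comparison of the variable's declared type against the type of the position it is used in. A sketch over a simplified type representation (the real implementation works on graphql-parser-hs types and also handles the default-value special case, elided here):

    -- a simplified GraphQL type: a named type or a list, with a nullability flag
    data GType
      = TNamed Bool String  -- nullable? / type name
      | TList  Bool GType   -- nullable? / element type
      deriving (Eq, Show)

    -- a variable of type 'varTy' may be used where 'locTy' is expected if it
    -- is at least as non-null as the location, element-wise through lists
    usageAllowed :: GType -> GType -> Bool
    usageAllowed varTy locTy = case (varTy, locTy) of
      (TNamed vn v, TNamed ln l) -> v == l && nullOk vn ln
      (TList  vn v, TList  ln l) -> usageAllowed v l && nullOk vn ln
      _                          -> False
      where
        -- a nullable variable must not flow into a non-null location
        nullOk varNull locNull = locNull || not varNull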
* [skip ci] Add a long Note about interfaces and object types
* send the response headers back to the client after running a query
* Deselect a few more tests during upgrade/downgrade test
* Update commit_diff.txt
* change log kind from db_migrate to catalog_migrate (#5531)
* Show method and complete URI in traced HTTP calls (#5525)
  Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
* fix introspection query if any enum column present in primary key (fix #5200) (#5522)
* Fix telemetry reporting of transport (websocket was reported as http)
* add log kinds in cli-migrations image (#5529)
  * add log kinds in cli-migrations image
  * give hint to resolve timeout error
  * minor changes and CHANGELOG
* server: set hasura.tracecontext in RQL mutations [#5542] (#5555)
  * server: set hasura.tracecontext in RQL mutations [#5542]
  * Update test suite
  Co-authored-by: Tirumarai Selvan
* Add bulldozer auto-merge and -update configuration
  We still need to add the GitHub app (as of the time of opening this PR). Afterwards, devs should be able to allow bulldozer to automatically "update" the branch, merging in the parent when it changes, as well as automatically merge when all checks pass. This is opt-in by adding the `auto-update-auto-merge` label to the PR.
* Remove 'bulldozer' config, try 'kodiak' for auto-merge
  See: https://github.com/chdsbd/kodiak
  The main issue that bit us was not being able to auto-update forked branches; also:
  https://github.com/palantir/bulldozer/issues/66
  https://github.com/palantir/bulldozer/issues/145
* Cherry-picked all commits
* [skip ci] Slightly improve formatting
* Revert "fix introspection query if any enum column present in primary key (fix #5200) (#5522)"
  This reverts commit 0f9a5afa59a88f6824f4d63d58db246a5ba3fb03. This undoes a cherry-pick of 34288e1eb5f2c5dad9e6d1e05453dd52397dc970 that was already done previously in a6450e126bc2d98bcfd3791501986e4627ce6c6f, and subsequently fixed for PDV in 70e89dc250f8ddc6e2b7930bbe2b3eeaa6dbe1db
* Do a small bit of tidying in Hasura.GraphQL.Parser.Collect
* Fix cherry-picking work
  Some previous cherry-picks ended up modifying code that is commented out
* [skip ci] clarified comment regarding insert representation
* [skip ci] removed obsolete todos
* cosmetic change
* fix action error message
* [skip ci] remove obsolete comment
* [skip ci] synchronize stylish-haskell extensions list
* use previously defined scalar names in parsers rather than ad-hoc literals
* Apply most syntax hlint hints.
* Clarify comment on update mutation.
* [skip ci] Clarify what fields should be specified for objects
* Update "_inc" description.
* Use record types rather than tuples for IntrospectionResult and ParsedIntrospection
* Get rid of checkFieldNamesUnique (use Data.List.Extended.duplicates)
* Throw more errors when collecting query root names
* [skip ci] clean column parser comment
* Remove dead code inserted in ab65b39
* avoid converting to non-empty list where not needed
* add note and TODO about the disabled checks in PDV
* minor refactor in remoteField' function
* Unify two getObject methods
* Nitpicks in Remote.hs
* Update CHANGELOG.md
* Revert "Unify two getObject methods"
  This reverts commit bd6bb40355b3d189a46c0312eb52225e18be57b3. We do need two different getObject functions, as the corresponding error messages are different
* Fix error message in Remote.hs
* Update CHANGELOG.md
  Co-authored-by: Auke Booij
* Apply suggested changelog fix.
  Co-authored-by: Auke Booij
* Fix typo in changelog.
* [skip ci] Update changelog.
* reuse type names to avoid duplication
* Fix Hashable instance for Definition (a sketch of the law at stake follows below)
  The presence of `Maybe Unique`, and an optional description, as part of `Definition`s means that `Definition`s that are considered `Eq`ual may get different hashes. This can happen, for instance, when one object is memoized but another is not.
* [skip ci] Update commit_diff.txt
* Bump parser version.
* Bump freeze file after changes in parser.
* [skip ci] Incorporate commits from master
* Fix developer flag in server/cabal.project.freeze
  Co-authored-by: Auke Booij
* Deselect a changed ENUM test for upgrade/downgrade CI
* Deselect test here as well
* [skip ci] remove dead code
* Disable more tests for upgrade/downgrade
* Fix which test gets deselected
* Revert "Add hdb_catalog.current_setting abstraction for reading Hasura settings"
  This reverts commit 66e85ab9fbd56cca2c28a80201f6604fbe811b85.
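The law at stake in the Hashable fix: `a == b` must imply `hash a == hash b`, so any field the `Eq` instance ignores (the memoization `Unique`, the description) must be ignored by `hashWithSalt` as well. A sketch with invented field names:

    import Data.Hashable (Hashable (..))

    data Definition = Definition
      { defName        :: String
      , defDescription :: Maybe String  -- ignored by Eq
      , defUnique      :: Maybe Int     -- memoization marker, ignored by Eq
      }

    instance Eq Definition where
      a == b = defName a == defName b

    -- hash only what Eq inspects, or memoized and non-memoized copies of
    -- the same definition end up in different hash buckets
    instance Hashable Definition where
      hashWithSalt salt = hashWithSalt salt . defName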
* Remove circular reference in cabal.project.freeze

Co-authored-by: Karthikeyan Chinnakonda
Co-authored-by: Auke Booij
Co-authored-by: Tirumarai Selvan
Co-authored-by: Marion Schleifer
Co-authored-by: Aleksandra Sikora
Co-authored-by: Brandon Simmons
Co-authored-by: Vamshi Surabhi <0x777@users.noreply.github.com>
Co-authored-by: Anon Ray
Co-authored-by: rakeshkky <12475069+rakeshkky@users.noreply.github.com>
Co-authored-by: Anon Ray
Co-authored-by: Vamshi Surabhi
Co-authored-by: Antoine Leblanc
Co-authored-by: Brandon Simmons
Co-authored-by: Phil Freeman
Co-authored-by: Lyndon Maydwell
Co-authored-by: Phil Freeman
Co-authored-by: Naveen Naidu
Co-authored-by: Karthikeyan Chinnakonda
Co-authored-by: Nizar Malangadan
Co-authored-by: Antoine Leblanc
Co-authored-by: Auke Booij
---
 .../server-upgrade-downgrade/err_msg.patch | 13 +
 .circleci/server-upgrade-downgrade/run-dev.sh | 11 +-
 .circleci/server-upgrade-downgrade/run.sh | 33 +-
 .circleci/test-server.sh | 4 +-
 .kodiak.toml | 2 +-
 CHANGELOG.md | 16 +-
 server/.stylish-haskell.yaml | 7 +
 server/CONTRIBUTING.md | 23 +-
 server/cabal.project | 11 +-
 server/cabal.project.ci | 8 +-
 server/cabal.project.dev | 6 +-
 server/cabal.project.freeze | 11 +-
 server/commit_diff.txt | 1 +
 server/graphql-engine.cabal | 311 +--
 server/src-bench-cache/Main.hs | 4 +-
 server/src-lib/Data/Aeson/Ordered.hs | 3 +-
 server/src-lib/Data/GADT/Compare/Extended.hs | 29 +
 .../Data/HashMap/Strict/InsOrd/Extended.hs | 4 +-
 server/src-lib/Data/Sequence/NonEmpty.hs | 95 +-
 server/src-lib/Data/Tuple/Extended.hs | 23 +
 server/src-lib/Hasura/App.hs | 14 +-
 server/src-lib/Hasura/Db.hs | 1 +
 server/src-lib/Hasura/EncJSON.hs | 2 +-
 server/src-lib/Hasura/GraphQL/Context.hs | 163 +-
 server/src-lib/Hasura/GraphQL/Execute.hs | 499 ++---
 .../GraphQL/{Resolve => Execute}/Action.hs | 360 ++--
 .../src-lib/Hasura/GraphQL/Execute/Inline.hs | 180 ++
 .../src-lib/Hasura/GraphQL/Execute/Insert.hs | 318 +++
 .../Hasura/GraphQL/Execute/LiveQuery.hs | 2 +-
 .../Hasura/GraphQL/Execute/LiveQuery/Plan.hs | 227 ++-
 .../Hasura/GraphQL/Execute/LiveQuery/Poll.hs | 12 +-
 .../Hasura/GraphQL/Execute/LiveQuery/State.hs | 7 +
 .../Hasura/GraphQL/Execute/Mutation.hs | 211 ++
 server/src-lib/Hasura/GraphQL/Execute/Plan.hs | 77 +-
 .../src-lib/Hasura/GraphQL/Execute/Prepare.hs | 187 ++
 .../src-lib/Hasura/GraphQL/Execute/Query.hs | 429 ++--
 .../src-lib/Hasura/GraphQL/Execute/Resolve.hs | 84 +
 .../src-lib/Hasura/GraphQL/Execute/Types.hs | 17 +
 server/src-lib/Hasura/GraphQL/Explain.hs | 181 +-
 server/src-lib/Hasura/GraphQL/Logging.hs | 5 +-
 server/src-lib/Hasura/GraphQL/NormalForm.hs | 4 +-
 server/src-lib/Hasura/GraphQL/Parser.hs | 51 +
 server/src-lib/Hasura/GraphQL/Parser/Class.hs | 193 ++
 .../Hasura/GraphQL/Parser/Class.hs-boot | 5 +
 .../src-lib/Hasura/GraphQL/Parser/Collect.hs | 274 +++
 .../src-lib/Hasura/GraphQL/Parser/Column.hs | 154 ++
 .../Hasura/GraphQL/Parser/Internal/Parser.hs | 998 ++++++++++
 .../GraphQL/Parser/Internal/Parser.hs-boot | 23 +
 server/src-lib/Hasura/GraphQL/Parser/Monad.hs | 192 ++
 .../src-lib/Hasura/GraphQL/Parser/Schema.hs | 803 ++++++++
 server/src-lib/Hasura/GraphQL/RelaySchema.hs | 425 ----
 server/src-lib/Hasura/GraphQL/RemoteServer.hs | 158 +-
 server/src-lib/Hasura/GraphQL/Resolve.hs | 244 ---
 .../src-lib/Hasura/GraphQL/Resolve/BoolExp.hs | 211 --
 .../src-lib/Hasura/GraphQL/Resolve/Context.hs | 151 --
 .../Hasura/GraphQL/Resolve/InputValue.hs | 228 ---
 .../src-lib/Hasura/GraphQL/Resolve/Insert.hs | 592 ------
 .../Hasura/GraphQL/Resolve/Introspect.hs | 424 ----
 .../Hasura/GraphQL/Resolve/Mutation.hs | 411 ----
 .../src-lib/Hasura/GraphQL/Resolve/Select.hs | 833 --------
 .../src-lib/Hasura/GraphQL/Resolve/Types.hs | 337 ----
 server/src-lib/Hasura/GraphQL/Schema.hs | 1427 +++++---------
 .../src-lib/Hasura/GraphQL/Schema/Action.hs | 636 +++---
 .../src-lib/Hasura/GraphQL/Schema/BoolExp.hs | 537 ++---
 .../src-lib/Hasura/GraphQL/Schema/Builder.hs | 86 -
 .../src-lib/Hasura/GraphQL/Schema/Common.hs | 234 +--
 .../Hasura/GraphQL/Schema/CustomTypes.hs | 176 --
 .../src-lib/Hasura/GraphQL/Schema/Function.hs | 149 --
 .../src-lib/Hasura/GraphQL/Schema/Insert.hs | 76 +
 .../Hasura/GraphQL/Schema/Introspect.hs | 602 ++++++
 server/src-lib/Hasura/GraphQL/Schema/Merge.hs | 152 --
 .../src-lib/Hasura/GraphQL/Schema/Mutation.hs | 531 +++++
 .../Hasura/GraphQL/Schema/Mutation/Common.hs | 51 -
 .../Hasura/GraphQL/Schema/Mutation/Delete.hs | 57 -
 .../Hasura/GraphQL/Schema/Mutation/Insert.hs | 236 ---
 .../Hasura/GraphQL/Schema/Mutation/Update.hs | 301 ---
 .../src-lib/Hasura/GraphQL/Schema/OrderBy.hs | 301 ++-
 .../src-lib/Hasura/GraphQL/Schema/Remote.hs | 420 ++++
 .../src-lib/Hasura/GraphQL/Schema/Select.hs | 1747 ++++++++++++-----
 server/src-lib/Hasura/GraphQL/Schema/Table.hs | 171 ++
 .../src-lib/Hasura/GraphQL/Transport/HTTP.hs | 107 +-
 .../Hasura/GraphQL/Transport/HTTP/Protocol.hs | 127 +-
 .../Hasura/GraphQL/Transport/WebSocket.hs | 118 +-
 .../GraphQL/Transport/WebSocket/Server.hs | 14 +-
 server/src-lib/Hasura/GraphQL/Utils.hs | 41 -
 server/src-lib/Hasura/GraphQL/Validate.hs | 331 ----
 .../Hasura/GraphQL/Validate/Context.hs | 78 -
 .../Hasura/GraphQL/Validate/InputValue.hs | 347 ----
 .../Hasura/GraphQL/Validate/SelectionSet.hs | 550 ------
 .../src-lib/Hasura/GraphQL/Validate/Types.hs | 819 --------
 .../Hasura/Incremental/Internal/Dependency.hs | 55 +-
 .../Hasura/Incremental/Internal/Rule.hs | 1 -
 server/src-lib/Hasura/Incremental/Select.hs | 3 +-
 server/src-lib/Hasura/Prelude.hs | 22 +-
 server/src-lib/Hasura/RQL/DDL/Action.hs | 74 +-
 .../src-lib/Hasura/RQL/DDL/ComputedField.hs | 4 +-
 server/src-lib/Hasura/RQL/DDL/CustomTypes.hs | 186 +-
 server/src-lib/Hasura/RQL/DDL/EventTrigger.hs | 2 +-
 server/src-lib/Hasura/RQL/DDL/Metadata.hs | 9 +-
 .../Hasura/RQL/DDL/Metadata/Generator.hs | 60 +-
 .../src-lib/Hasura/RQL/DDL/Metadata/Types.hs | 8 +-
 server/src-lib/Hasura/RQL/DDL/Permission.hs | 6 +-
 server/src-lib/Hasura/RQL/DDL/Relationship.hs | 11 +-
 .../Hasura/RQL/DDL/RemoteRelationship.hs | 34 +-
 .../RQL/DDL/RemoteRelationship/Validate.hs | 553 +++---
 server/src-lib/Hasura/RQL/DDL/RemoteSchema.hs | 82 +-
 server/src-lib/Hasura/RQL/DDL/Schema/Cache.hs | 173 +-
 .../Hasura/RQL/DDL/Schema/Cache/Common.hs | 26 +-
 .../RQL/DDL/Schema/Cache/Dependencies.hs | 6 +-
 .../Hasura/RQL/DDL/Schema/Cache/Fields.hs | 17 +-
 server/src-lib/Hasura/RQL/DDL/Schema/Enum.hs | 37 +-
 .../src-lib/Hasura/RQL/DDL/Schema/Function.hs | 19 +-
 .../src-lib/Hasura/RQL/DDL/Schema/Rename.hs | 37 +-
 server/src-lib/Hasura/RQL/DDL/Schema/Table.hs | 144 +-
 server/src-lib/Hasura/RQL/DML/Delete.hs | 19 +-
 server/src-lib/Hasura/RQL/DML/Delete/Types.hs | 21 +
 server/src-lib/Hasura/RQL/DML/Insert.hs | 98 +-
 server/src-lib/Hasura/RQL/DML/Insert/Types.hs | 34 +
 server/src-lib/Hasura/RQL/DML/Internal.hs | 25 +-
 server/src-lib/Hasura/RQL/DML/Mutation.hs | 10 +-
 server/src-lib/Hasura/RQL/DML/RemoteJoin.hs | 238 ++-
 server/src-lib/Hasura/RQL/DML/Returning.hs | 26 +-
 .../src-lib/Hasura/RQL/DML/Returning/Types.hs | 42 +
 server/src-lib/Hasura/RQL/DML/Select.hs | 23 +-
 .../src-lib/Hasura/RQL/DML/Select/Internal.hs | 6 +-
 server/src-lib/Hasura/RQL/DML/Select/Types.hs | 33 +-
 server/src-lib/Hasura/RQL/DML/Update.hs | 90 +-
 server/src-lib/Hasura/RQL/DML/Update/Types.hs | 36 +
 server/src-lib/Hasura/RQL/GBoolExp.hs | 4 +-
 server/src-lib/Hasura/RQL/Instances.hs | 54 +-
 server/src-lib/Hasura/RQL/Types.hs | 22 +-
 server/src-lib/Hasura/RQL/Types/Action.hs | 113 +-
 server/src-lib/Hasura/RQL/Types/BoolExp.hs | 19 +-
 server/src-lib/Hasura/RQL/Types/Column.hs | 32 +-
 server/src-lib/Hasura/RQL/Types/Common.hs | 16 +-
 .../src-lib/Hasura/RQL/Types/ComputedField.hs | 12 +-
 .../src-lib/Hasura/RQL/Types/CustomTypes.hs | 189 +-
 server/src-lib/Hasura/RQL/Types/Error.hs | 2 +-
 .../Hasura/RQL/Types/QueryCollection.hs | 67 +-
 .../Hasura/RQL/Types/RemoteRelationship.hs | 127 +-
 .../src-lib/Hasura/RQL/Types/RemoteSchema.hs | 15 +-
 .../src-lib/Hasura/RQL/Types/SchemaCache.hs | 72 +-
 .../Hasura/RQL/Types/SchemaCacheTypes.hs | 34 +-
 server/src-lib/Hasura/RQL/Types/Table.hs | 42 +-
 server/src-lib/Hasura/SQL/DML.hs | 3 +
 server/src-lib/Hasura/SQL/Types.hs | 102 +-
 server/src-lib/Hasura/SQL/Value.hs | 42 +-
 server/src-lib/Hasura/Server/API/PGDump.hs | 10 +-
 server/src-lib/Hasura/Server/API/Query.hs | 6 +-
 server/src-lib/Hasura/Server/App.hs | 65 +-
 server/src-lib/Hasura/Server/Auth.hs | 1 +
 server/src-lib/Hasura/Server/Auth/JWT.hs | 18 +-
 server/src-lib/Hasura/Server/Auth/WebHook.hs | 7 +-
 server/src-lib/Hasura/Server/Cors.hs | 4 +-
 server/src-lib/Hasura/Server/Init.hs | 12 +-
 server/src-lib/Hasura/Server/Init/Config.hs | 4 +-
 server/src-lib/Hasura/Server/Logging.hs | 2 +-
 server/src-lib/Hasura/Server/SchemaUpdate.hs | 5 +
 server/src-lib/Hasura/Server/Telemetry.hs | 31 +-
 server/src-lib/Hasura/Server/Utils.hs | 8 +-
 server/src-lib/Hasura/Session.hs | 12 +
 server/src-lib/Hasura/Tracing.hs | 41 +-
 server/tests-py/graphql_server.py | 6 +-
 .../enums/insert_enum_field_bad_value.yaml | 2 +-
 .../basic/can_insert_in_insertable_view.yaml | 27 +
 .../cannot_insert_in_non_insertable_view.yaml | 25 +
 ...id_variable_but_invalid_graphql_value.yaml | 33 +
 .../insert/basic/schema_setup.yaml | 29 +-
 .../insert/basic/schema_teardown.yaml | 2 +
 .../insert_area_less_than_4_points_err.yaml | 2 +-
 .../insert_geometry_unexpected_type_err.yaml | 2 +-
 .../insert_landmark_single_position_err.yaml | 2 +-
 ...ing_last_point_not_equal_to_first_err.yaml | 2 +-
 .../geojson/insert_road_single_point_err.yaml | 4 +-
 .../permissions/backend_user_insert_fail.yaml | 2 +-
 .../backend_user_no_admin_secret_fail.yaml | 2 +-
 .../leads_upsert_check_with_headers.yaml | 49 +
 .../insert/permissions/schema_setup.yaml | 66 +
 .../insert/permissions/schema_teardown.yaml | 1 +
 .../user_with_no_backend_privilege.yaml | 6 +-
 .../article_column_multiple_operators.yaml | 5 +-
 .../update/basic/schema_setup.yaml | 1 +
 ...icle_agg_with_role_with_select_access.yaml | 42 +
 ...e_agg_with_role_without_select_access.yaml | 63 +
 .../queries/graphql_query/agg_perm/setup.yaml | 20 +
 .../basic/select_query_fragment_cycles.yaml | 2 +-
 .../basic/select_query_test_types.yaml | 56 +-
 .../queries/graphql_query/basic/setup.yaml | 18 +-
 .../raster/query_st_intersects_rast_fail.yaml | 2 +-
 .../enums/introspect_user_role.yaml | 26 +-
 .../enums/select_where_enum_eq_bad_value.yaml | 2 +-
 .../enums/select_where_enum_eq_string.yaml | 2 +-
 ...lect_where_enum_eq_variable_bad_value.yaml | 4 +-
 ...lect_query_article_string_limit_error.yaml | 2 +-
 .../artist_select_query_Track_fail.yaml | 4 +-
 ...ect_articles_without_required_headers.yaml | 55 +
 .../graphql_query/permissions/setup.yaml | 57 +
 .../graphql_query/permissions/teardown.yaml | 1 +
 ...uld_not_be_able_to_access_books_by_pk.yaml | 20 +
 .../backward/page_1.yaml | 49 +
 .../backward/page_2.yaml | 45 +
 .../backward/page_3.yaml | 40 +
 .../forward/page_1.yaml | 49 +
 .../forward/page_2.yaml | 45 +
 .../forward/page_3.yaml | 40 +
 .../backward/page_1.yaml | 44 +
 .../backward/page_2.yaml | 51 +
 .../backward/page_3.yaml | 45 +
 .../forward/page_1.yaml | 50 +
 .../forward/page_2.yaml | 51 +
 .../relay/basic/invalid_node_id.yaml | 19 +
 .../pagination_errors/after_and_before.yaml | 2 +-
 .../pagination_errors/first_and_last.yaml | 2 +-
 .../pagination_errors/after_and_before.yaml | 21 +
 .../pagination_errors/first_and_last.yaml | 21 +
 .../queries/graphql_query/relay/setup.yaml | 79 +
 .../queries/graphql_query/relay/teardown.yaml | 8 +
 .../graphql_validation/json_column_value.yaml | 38 +
 .../graphql_validation/null_value_err.yaml | 2 +-
 .../null_variable_value_err.yaml | 7 +-
 .../queries/graphql_validation/setup.yaml | 45 +-
 .../queries/graphql_validation/teardown.yaml | 8 +
 .../variable_type_mismatch.yaml | 223 +++
 .../add_remote_schema_err_missing_arg.yaml | 2 +-
 ...te_schema_with_union_err_wrapped_type.yaml | 2 +-
 ...tation_output_with_remote_join_fields.yaml | 60 +
 .../query_with_errors_arr.yaml | 6 +-
 .../query_with_errors_obj.yaml | 6 +-
 .../remote_relationships/setup.yaml | 12 +
 .../setup_remote_rel_basic_with_authors.yaml | 11 +
 .../subscription_with_remote_join_fields.yaml | 21 +
 .../remote_relationships/teardown.yaml | 5 +
 .../tests-py/remote_schemas/nodejs/.gitignore | 1 -
 .../remote_schemas/nodejs/package-lock.json | 1437 ++++++++++++++
 .../remote_schemas/nodejs/package.json | 6 +-
 server/tests-py/test_graphql_mutations.py | 12 +
 server/tests-py/test_graphql_queries.py | 12 +
 server/tests-py/test_remote_relationships.py | 19 +-
 server/tests-py/test_schema_stitching.py | 25 +-
 server/tests-py/test_validation.py | 6 +
 server/tests-py/validate.py | 2 +-
 241 files changed, 14804 insertions(+), 12656 deletions(-)
 create mode 100644 .circleci/server-upgrade-downgrade/err_msg.patch
 create mode 100644 server/commit_diff.txt
 create mode 100644 server/src-lib/Data/GADT/Compare/Extended.hs
 create mode 100644 server/src-lib/Data/Tuple/Extended.hs
 rename server/src-lib/Hasura/GraphQL/{Resolve => Execute}/Action.hs (66%)
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Inline.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Insert.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Mutation.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Prepare.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Resolve.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Execute/Types.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Class.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Class.hs-boot
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Collect.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Column.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs-boot
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Monad.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Parser/Schema.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/RelaySchema.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/BoolExp.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Context.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/InputValue.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Insert.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Introspect.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Mutation.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Select.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Resolve/Types.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Builder.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/CustomTypes.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Function.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Schema/Insert.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Schema/Introspect.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Merge.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Schema/Mutation.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Mutation/Common.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Mutation/Delete.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Mutation/Insert.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Schema/Mutation/Update.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Schema/Remote.hs
 create mode 100644 server/src-lib/Hasura/GraphQL/Schema/Table.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Validate.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Validate/Context.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Validate/InputValue.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Validate/SelectionSet.hs
 delete mode 100644 server/src-lib/Hasura/GraphQL/Validate/Types.hs
 create mode 100644 server/src-lib/Hasura/RQL/DML/Delete/Types.hs
 create mode 100644 server/src-lib/Hasura/RQL/DML/Insert/Types.hs
 create mode 100644 server/src-lib/Hasura/RQL/DML/Returning/Types.hs
 create mode 100644 server/src-lib/Hasura/RQL/DML/Update/Types.hs
 create mode 100644 server/tests-py/queries/graphql_mutation/insert/basic/can_insert_in_insertable_view.yaml
 create mode 100644 server/tests-py/queries/graphql_mutation/insert/basic/cannot_insert_in_non_insertable_view.yaml
 create mode 100644 server/tests-py/queries/graphql_mutation/insert/basic/person_valid_variable_but_invalid_graphql_value.yaml
 create mode 100644 server/tests-py/queries/graphql_mutation/insert/permissions/leads_upsert_check_with_headers.yaml
 create mode 100644 server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_with_select_access.yaml
 create mode 100644 server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_without_select_access.yaml
 create mode 100644 server/tests-py/queries/graphql_query/permissions/select_articles_without_required_headers.yaml
 create mode 100644 server/tests-py/queries/graphql_query/permissions/user_should_not_be_able_to_access_books_by_pk.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_1.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_2.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_3.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_1.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_2.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_3.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_1.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_2.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_3.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_1.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_2.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/basic/invalid_node_id.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/pagination_errors/after_and_before.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/pagination_errors/first_and_last.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/setup.yaml
 create mode 100644 server/tests-py/queries/graphql_query/relay/teardown.yaml
 create mode 100644 server/tests-py/queries/graphql_validation/json_column_value.yaml
 create mode 100644 server/tests-py/queries/graphql_validation/variable_type_mismatch.yaml
 create mode 100644 server/tests-py/queries/remote_schemas/remote_relationships/mutation_output_with_remote_join_fields.yaml
 create mode 100644 server/tests-py/queries/remote_schemas/remote_relationships/setup_remote_rel_basic_with_authors.yaml
 create mode 100644 server/tests-py/queries/remote_schemas/remote_relationships/subscription_with_remote_join_fields.yaml
 create mode 100644 server/tests-py/remote_schemas/nodejs/package-lock.json

diff --git a/.circleci/server-upgrade-downgrade/err_msg.patch b/.circleci/server-upgrade-downgrade/err_msg.patch
new file mode 100644
index 00000000000..1e542382474
--- /dev/null
+++ b/.circleci/server-upgrade-downgrade/err_msg.patch
@@ -0,0 +1,13 @@
+diff --git a/server/tests-py/validate.py b/server/tests-py/validate.py
+index 3eecd52a..a18b3113 100644
+--- a/server/tests-py/validate.py
++++ b/server/tests-py/validate.py
+@@ -318,7 +318,7 @@ def assert_graphql_resp_expected(resp_orig, exp_response_orig, query, resp_hdrs=
+     # If it is a batch GraphQL query, compare each individual response separately
+     for (exp, out) in zip(as_list(exp_response), as_list(resp)):
+         matched_ = equal_CommentedMap(exp, out)
+-        if is_err_msg(exp):
++        if is_err_msg(exp) and is_err_msg(out):
+             if not matched_:
+                 warnings.warn("Response does not have the expected error message\n" + dump_str.getvalue())
+             return resp, matched

diff --git a/.circleci/server-upgrade-downgrade/run-dev.sh b/.circleci/server-upgrade-downgrade/run-dev.sh
index 7d9575ea618..aed009ba4c8 100755
--- a/.circleci/server-upgrade-downgrade/run-dev.sh
+++ b/.circleci/server-upgrade-downgrade/run-dev.sh
@@ -6,7 +6,9 @@
 # and sets some of the required variables that run.sh needs,
 # before executing run.sh
 set -euo pipefail
-ROOT="${BASH_SOURCE[0]%/*}"
+cd "${BASH_SOURCE[0]%/*}"
+ROOT="${PWD}"
+cd - > /dev/null
 
 SERVER_DIR="$ROOT/../../server"
 
@@ -18,8 +20,8 @@ echo "server binary: $SERVER_BINARY"
 cd -
 set +x
 
-export SERVER_OUTPUT_DIR="server-output"
-export LATEST_SERVER_BINARY="./graphql-engine-latest"
+export SERVER_OUTPUT_DIR="$ROOT/server-output"
+export LATEST_SERVER_BINARY="$ROOT/graphql-engine-latest"
 
 # Create Python virtualenv
 if ! [ -f ".venv/bin/activate" ] ; then
@@ -40,7 +42,8 @@ log_duration=on
 port=$PG_PORT
 EOF
 )
-export HASURA_GRAPHQL_DATABASE_URL="postgres://postgres:$PGPASSWORD@127.0.0.1:$PG_PORT/postgres"
+# Pytest gives deprecation warnings when postgres:// is used
+export HASURA_GRAPHQL_DATABASE_URL="postgresql://postgres:$PGPASSWORD@127.0.0.1:$PG_PORT/postgres"
 
 DOCKER_PSQL="docker exec -u postgres -it $PG_CONTAINER_NAME psql -p $PG_PORT"

diff --git a/.circleci/server-upgrade-downgrade/run.sh b/.circleci/server-upgrade-downgrade/run.sh
index 0e7166fd43e..4a37477a3fb 100755
--- a/.circleci/server-upgrade-downgrade/run.sh
+++ b/.circleci/server-upgrade-downgrade/run.sh
@@ -12,7 +12,9 @@ set -euo pipefail
 # # echo an error message before exiting
 # trap 'echo "\"${last_command}\" command filed with exit code $?."' EXIT
 
-ROOT="${BASH_SOURCE[0]%/*}"
+cd "${BASH_SOURCE[0]%/*}"
+ROOT="${PWD}"
+cd - > /dev/null
 
 download_with_etag_check() {
   URL="$1"
@@ -119,6 +121,17 @@ trap rm_worktree ERR
 make_latest_release_worktree() {
   git worktree add --detach "$WORKTREE_DIR" "$RELEASE_VERSION"
+  cd "$WORKTREE_DIR"
+  # FIX ME: Remove the patch below after the next stable release
+  # The --avoid-error-message-checks in pytest was implemented as a more relaxed check than
+  # we intended to have. In versions <= v1.3.0, this check allows the response to be a
+  # success even if the expected response is a failure. The patch below fixes that issue.
+  # The `git apply` should give errors from the next release onwards,
+  # since this change is going to be included in the next release version
+  git apply "${ROOT}/err_msg.patch" || \
+    (log "Remove the git apply in make_latest_release_worktree function" && false)
+  cd - > /dev/null
 }
 
 cleanup_hasura_metadata_if_present() {
@@ -148,7 +161,18 @@ get_server_upgrade_tests() {
   cd $RELEASE_PYTEST_DIR
   tmpfile="$(mktemp --dry-run)"
   set -x
-  python3 -m pytest -q --collect-only --collect-upgrade-tests-to-file "$tmpfile" -m 'allow_server_upgrade_test and not skip_server_upgrade_test' "${args[@]}" 1>/dev/null 2>/dev/null
+  # FIX ME: Deselecting some introspection tests from the previous test suite,
+  # which throw errors on the latest build even when the output of the current build is more accurate.
+  # Remove these deselects after the next stable release
+  python3 -m pytest -q --collect-only --collect-upgrade-tests-to-file "$tmpfile" \
+    -m 'allow_server_upgrade_test and not skip_server_upgrade_test' \
+    --deselect test_schema_stitching.py::TestRemoteSchemaBasic::test_introspection \
+    --deselect test_schema_stitching.py::TestAddRemoteSchemaCompareRootQueryFields::test_schema_check_arg_default_values_and_field_and_arg_types \
+    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_user_with_no_backend_privilege \
+    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_backend_user_no_admin_secret_fail \
+    --deselect test_graphql_mutations.py::TestGraphqlMutationCustomSchema::test_update_article \
+    --deselect test_graphql_queries.py::TestGraphQLQueryEnums::test_introspect_user_role \
+    "${args[@]}" 1>/dev/null 2>/dev/null
   set +x
   cat "$tmpfile"
   cd - >/dev/null
@@ -174,11 +198,12 @@ run_server_upgrade_pytest() {
   set -x
   # With --avoid-error-message-checks, we are only going to throw warnings if the error message has changed between releases
-  # FIX ME: Remove the deselect below after the next stable release
   pytest --hge-urls "${HGE_URL}" --pg-urls "$HASURA_GRAPHQL_DATABASE_URL" \
     --avoid-error-message-checks "$@" \
     -m 'allow_server_upgrade_test and not skip_server_upgrade_test' \
-    --deselect test_graphql_mutations.py::TestGraphqlUpdateBasic::test_numerics_inc \
+    --deselect test_graphql_mutations.py::TestGraphqlInsertPermission::test_user_with_no_backend_privilege \
+    --deselect test_graphql_mutations.py::TestGraphqlMutationCustomSchema::test_update_article \
+    --deselect test_graphql_queries.py::TestGraphQLQueryEnums::test_introspect_user_role \
     -v $tests_to_run
   set +x
   cd -

diff --git a/.circleci/test-server.sh b/.circleci/test-server.sh
index dc1db1e9f4c..e097cf051a6 100755
--- a/.circleci/test-server.sh
+++ b/.circleci/test-server.sh
@@ -191,9 +191,7 @@ pip3 install -r requirements.txt
 # node js deps
 curl -sL https://deb.nodesource.com/setup_8.x | bash -
 apt-get install -y nodejs
-npm_config_loglevel=error npm install $PYTEST_ROOT/remote_schemas/nodejs/
-
-npm install apollo-server graphql
+(cd remote_schemas/nodejs && npm_config_loglevel=error npm ci)
 
 mkdir -p "$OUTPUT_FOLDER/hpc"

diff --git a/.kodiak.toml b/.kodiak.toml
index 0da457e7677..55aef81308e 100644
--- a/.kodiak.toml
+++ b/.kodiak.toml
@@ -1,7 +1,7 @@
 ###############################################################################
 ##   Configuration for auto-merge / auto-update bot
 ##
-##   See: https://kodiakhq.com/
+##   See: https://kodiakhq.com/
 ###############################################################################
 
 # Kodiak's configuration file should be placed at `.kodiak.toml` (repository

diff --git a/CHANGELOG.md b/CHANGELOG.md
index e997527dd26..e4bd535b5c4 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,19 +2,31 @@
 
 ## Next release
 
+### Breaking changes
+
+This release contains the [PDV refactor (#4111)](https://github.com/hasura/graphql-engine/pull/4111), a significant rewrite of the internals of the server, which includes some breaking changes:
+
+- The semantics of explicit `null` values in `where` filters have changed according to the discussion in [issue 704](https://github.com/hasura/graphql-engine/issues/704#issuecomment-635571407): an explicit `null` value in a comparison input object will be treated as an error rather than resulting in the expression being evaluated to `True`.
+  For instance: `delete_users(where: {id: {_eq: $userId}}) { name }` will yield an error if `$userId` is `null`, instead of deleting all users.
+- The validation of required headers has been fixed (closing #14 and #3659):
+  - if a query selects table `bar` through table `foo` via a relationship, the required permissions headers will be the union of the required headers of table `foo` and table `bar` (we used to only check the headers of the root table);
+  - if an insert does not have an `on_conflict` clause, it will not require the update permissions headers.
+
 ### Bug fixes and improvements
 
 (Add entries here in the order of: server, console, cli, docs, others)
 
-- docs: add docs page on networking with docker (close #4346) (#4811)
+- server: some mutations that cannot be performed will no longer be in the schema (for instance, `delete_by_pk` mutations won't be shown to users that do not have select permissions on all primary keys) (#4111)
+- server: miscellaneous description changes (#4111)
+- server: treat the absence of `backend_only` configuration and `backend_only: false` equally (closing #5059) (#4111)
 - cli: add missing global flags for seeds command (#5565)
+- docs: add docs page on networking with docker (close #4346) (#4811)
 
 ## `v1.3.1`, `v1.3.1-beta.1`
 
 ### Breaking change
 
-Headers from environment variables starting with `HASURA_GRAPHQL_` are not allowed
+Headers from environment variables starting with `HASURA_GRAPHQL_` are not allowed in event triggers, actions & remote schemas.
 If you do have such headers configured, then you must update the header configuration before upgrading.

diff --git a/server/.stylish-haskell.yaml b/server/.stylish-haskell.yaml
index e7672a411c0..90266fb71a7 100644
--- a/server/.stylish-haskell.yaml
+++ b/server/.stylish-haskell.yaml
@@ -229,6 +229,7 @@ language_extensions:
 - BangPatterns
 - BlockArguments
 - ConstraintKinds
+- DataKinds
 - DefaultSignatures
 - DeriveDataTypeable
 - DeriveFoldable
@@ -238,9 +239,11 @@ language_extensions:
 - DeriveTraversable
 - DerivingVia
 - EmptyCase
+- ExistentialQuantification
 - FlexibleContexts
 - FlexibleInstances
 - FunctionalDependencies
+- GADTs
 - GeneralizedNewtypeDeriving
 - InstanceSigs
 - LambdaCase
@@ -249,12 +252,16 @@ language_extensions:
 - NamedFieldPuns
 - NoImplicitPrelude
 - OverloadedStrings
+- QuantifiedConstraints
 - QuasiQuotes
 - RankNTypes
+- RecordWildCards
+- RoleAnnotations
 - ScopedTypeVariables
 - StandaloneDeriving
 - TemplateHaskell
 - TupleSections
 - TypeApplications
 - TypeFamilies
+- TypeFamilyDependencies
 - TypeOperators

diff --git a/server/CONTRIBUTING.md b/server/CONTRIBUTING.md
index 69d7104af5a..ee9a96aaa41 100644
--- a/server/CONTRIBUTING.md
+++ b/server/CONTRIBUTING.md
@@ -24,6 +24,11 @@ Additionally, you will need a way to run a Postgres database server. The `dev.s
 - [PostgreSQL](https://www.postgresql.org) >= 9.5
 - [postgis](https://postgis.net)
 
+Additionally, you will need a way to run a Postgres database server. The `dev.sh` script (described below) can set up a Postgres instance for you via [Docker](https://www.docker.com), but if you want to run it yourself, you’ll need:
+
+- [PostgreSQL](https://www.postgresql.org) >= 9.5
+- [postgis](https://postgis.net)
+
 ### Upgrading npm
 
 If your npm is too old (>= 5.7 required):
@@ -116,16 +121,13 @@ cabal new-run -- test:graphql-engine-tests \
 
 ##### Running the Python test suite
 
It is - recommended that you do this in a self-contained Python venv, which is supported by Python 3.3+ - out of the box. To create one, run: +1. To run the Python tests, you’ll need to install the necessary Python dependencies first. It is recommended that you do this in a self-contained Python venv, which is supported by Python 3.3+ out of the box. To create one, run: ``` python3 -m venv .python-venv ``` - (The second argument names a directory where the venv sandbox will be created; it can be anything - you like, but `.python-venv` is `.gitignore`d.) + (The second argument names a directory where the venv sandbox will be created; it can be anything you like, but `.python-venv` is `.gitignore`d.) With the venv created, you can enter into it in your current shell session by running: @@ -141,11 +143,18 @@ cabal new-run -- test:graphql-engine-tests \ pip3 install -r tests-py/requirements.txt ``` -3. Start an instance of `graphql-engine` for the test suite to use: +3. Install the dependencies for the Node server used by the remote schema tests: + + ``` + (cd tests-py/remote_schemas/nodejs && npm ci) + ``` + +4. Start an instance of `graphql-engine` for the test suite to use: ``` env EVENT_WEBHOOK_HEADER=MyEnvValue \ WEBHOOK_FROM_ENV=http://localhost:5592/ \ + SCHEDULED_TRIGGERS_WEBHOOK_DOMAIN=http://127.0.0.1:5594 \ cabal new-run -- exe:graphql-engine \ --database-url='postgres://:@:/' \ serve --stringify-numeric-types @@ -153,7 +162,7 @@ cabal new-run -- test:graphql-engine-tests \ The environment variables are needed for a couple tests, and the `--stringify-numeric-types` option is used to avoid the need to do floating-point comparisons. -4. With the server running, run the test suite: +5. With the server running, run the test suite: ``` cd tests-py diff --git a/server/cabal.project b/server/cabal.project index 63b3a4210dd..d02bd69fc07 100644 --- a/server/cabal.project +++ b/server/cabal.project @@ -15,10 +15,17 @@ packages: . constraints: - -- ensure we don’t end up with a freeze file that forces an incompatible + -- ensure we don't end up with a freeze file that forces an incompatible -- version in CI for Setup.hs scripts. setup.Cabal <3.4 +allow-newer: + -- dependent-map depends on constraints-extras, but its bounds have not yet + -- been relaxed for GHC 8.10. + constraints-extras-0.3.0.2:base, + constraints-extras-0.3.0.2:constraints, + constraints-extras-0.3.0.2:template-haskell + package * optimization: 2 -- For tooling, e.g. 'weeder', and IDE-like stuff: @@ -41,7 +48,7 @@ source-repository-package source-repository-package type: git location: https://github.com/hasura/graphql-parser-hs.git - tag: f4a093981ca5626982a17c2bfaad047cc0834a81 + tag: 8f1cd3a9bf6ec91f1ba1d83f704ab078113e035b source-repository-package type: git diff --git a/server/cabal.project.ci b/server/cabal.project.ci index 5bc6968f6d9..221c5c20b12 100644 --- a/server/cabal.project.ci +++ b/server/cabal.project.ci @@ -3,6 +3,12 @@ reject-unconstrained-dependencies: all package graphql-engine - ghc-options: -j3 -Werror + ghc-options: + -j3 -Werror + -- Limit heap size to 8GB, which is the amount of available memory on a + -- CircleCI `large` instance. (GHC interprets G as GB, i.e. 1,000^3 bytes, + -- but instances seem to have 8GiB, i.e. 1,024^3 bytes, so that leaves us + -- a little breathing room.) 
+ +RTS -M8G -RTS tests: true benchmarks: true diff --git a/server/cabal.project.dev b/server/cabal.project.dev index bddc8b12304..329ce17acd9 100644 --- a/server/cabal.project.dev +++ b/server/cabal.project.dev @@ -6,13 +6,15 @@ -- Or, if you want to customize the configuration: -- $ cp cabal.project.dev cabal.project.local +with-compiler: ghc-8.10.1 + package * documentation: true package graphql-engine -- NOTE: this seems to work so long as there is no 'ghc-options: -O2' in the cabal file, - -- but new-build will report 'Build profile: -O1' for some reason. - -- See:https://github.com/haskell/cabal/issues/6221 + -- but new-build will report 'Build profile: -O1' for some reason. + -- See:https://github.com/haskell/cabal/issues/6221 optimization: 0 documentation: false flags: +developer diff --git a/server/cabal.project.freeze b/server/cabal.project.freeze index 30e6678ccaf..32b15940cd8 100644 --- a/server/cabal.project.freeze +++ b/server/cabal.project.freeze @@ -79,6 +79,8 @@ constraints: any.Cabal ==3.2.0.0, any.conduit ==1.3.2, any.connection ==0.3.1, any.constraints ==0.12, + any.constraints-extras ==0.3.0.2, + constraints-extras +build-readme, any.containers ==0.6.2.1, any.contravariant ==1.5.2, contravariant +semigroups +statevar +tagged, @@ -112,8 +114,8 @@ constraints: any.Cabal ==3.2.0.0, any.deepseq ==1.4.4.0, any.deferred-folds ==0.9.10.1, any.dense-linear-algebra ==0.1.0.0, - any.dependent-map ==0.2.4.0, - any.dependent-sum ==0.4, + any.dependent-map ==0.4.0.0, + any.dependent-sum ==0.7.1.0, any.directory ==1.3.6.1, any.distributive ==0.6.2, distributive +semigroups +tagged, @@ -127,7 +129,6 @@ constraints: any.Cabal ==3.2.0.0, any.erf ==2.0.0.0, any.errors ==2.3.0, any.exceptions ==0.10.4, - exceptions +transformers-0-4, any.fail ==4.9.0.0, any.fast-logger ==3.0.1, any.file-embed ==0.0.11.2, @@ -242,8 +243,6 @@ constraints: any.Cabal ==3.2.0.0, any.primitive-unlifted ==0.1.3.0, any.process ==1.6.8.2, any.profunctors ==5.5.2, - any.protolude ==0.3.0, - protolude -dev, any.psqueues ==0.2.7.2, any.quickcheck-instances ==0.3.22, quickcheck-instances -bytestring-builder, @@ -277,6 +276,8 @@ constraints: any.Cabal ==3.2.0.0, any.simple-sendfile ==0.2.30, simple-sendfile +allow-bsd, any.socks ==0.6.1, + any.some ==1.0.1, + some +newtype-unsafe, any.split ==0.2.3.4, any.splitmix ==0.0.4, splitmix -optimised-mixer +random, diff --git a/server/commit_diff.txt b/server/commit_diff.txt new file mode 100644 index 00000000000..eaf5fe97a52 --- /dev/null +++ b/server/commit_diff.txt @@ -0,0 +1 @@ +**** Latest commit compared against master - fd7fb580831fe9054164a285441c99562f34c815 diff --git a/server/graphql-engine.cabal b/server/graphql-engine.cabal index 6b420b721e6..07a92af5127 100644 --- a/server/graphql-engine.cabal +++ b/server/graphql-engine.cabal @@ -20,6 +20,11 @@ flag developer default: False manual: True +flag profiling + description: Configures the project to be profiling-compatible + default: False + manual: True + common common-all ghc-options: -fmax-simplifier-iterations=20 -foptimal-applicative-do @@ -27,16 +32,51 @@ common common-all if flag(developer) cpp-options: -DDeveloperAPIs + if flag(profiling) + cpp-options: -DPROFILING default-language: Haskell2010 default-extensions: - ApplicativeDo BangPatterns BlockArguments ConstraintKinds DataKinds DefaultSignatures DeriveDataTypeable - DeriveFoldable DeriveFunctor DeriveGeneric DeriveLift DeriveTraversable DerivingVia EmptyCase - FlexibleContexts FlexibleInstances FunctionalDependencies GeneralizedNewtypeDeriving - 
InstanceSigs LambdaCase MultiParamTypeClasses MultiWayIf NamedFieldPuns NoImplicitPrelude - OverloadedStrings QuantifiedConstraints QuasiQuotes RankNTypes RecordWildCards ScopedTypeVariables - StandaloneDeriving TemplateHaskell TupleSections TypeApplications TypeFamilies TypeOperators + ApplicativeDo + BangPatterns + BlockArguments + ConstraintKinds + DataKinds + DefaultSignatures + DeriveDataTypeable + DeriveFoldable + DeriveFunctor + DeriveGeneric + DeriveLift + DeriveTraversable + DerivingVia + EmptyCase + ExistentialQuantification + FlexibleContexts + FlexibleInstances + FunctionalDependencies + GADTs + GeneralizedNewtypeDeriving + InstanceSigs + LambdaCase + MultiParamTypeClasses + MultiWayIf + NamedFieldPuns + NoImplicitPrelude + OverloadedStrings + QuantifiedConstraints + QuasiQuotes + RankNTypes RecordWildCards + RoleAnnotations + ScopedTypeVariables + StandaloneDeriving + TemplateHaskell + TupleSections + TypeApplications + TypeFamilies + TypeFamilyDependencies + TypeOperators common common-exe ghc-options: @@ -50,12 +90,12 @@ common common-exe -- than that is highly unlikely to ever be helpful. More benchmarking would be useful to know if -- this is the right decision. It’s possible it would better to just turn it off completely. -- - -- `-kc8K` helps limit memory consumption in websockets (perhaps elsewhere) by making the - -- cost of a thread's first (and probably only) stack overflow less severe. - -- See:https://github.com/hasura/graphql-engine/issues/5190 + -- `-kc8K` helps limit memory consumption in websockets (perhaps elsewhere) by making the + -- cost of a thread's first (and probably only) stack overflow less severe. + -- See:https://github.com/hasura/graphql-engine/issues/5190 -- - -- `--disable-delayed-os-memory-return` seems to help lower reported residency, in particular - -- in situations where we seem to be dealing with haskell heap fragmentation. This is more a + -- `--disable-delayed-os-memory-return` seems to help lower reported residency, in particular + -- in situations where we seem to be dealing with haskell heap fragmentation. This is more a -- workaround for limitations in monitoring tools than anything... 
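(Context for the `-kc8K` flag in the options string below: GHC's default stack chunk size is 32K, so a thread's first stack overflow normally allocates a full 32K chunk; `-kc8K` shrinks that first allocation to 8K.)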
"-with-rtsopts=-N -I0 -T -qn2 -kc8K --disable-delayed-os-memory-return" @@ -68,6 +108,7 @@ library , validation , lifted-base , pg-client + , validation , text , text-builder >= 0.6 , vector-builder @@ -100,8 +141,8 @@ library , http-client-tls , profunctors , deepseq - , dependent-map >=0.2.4 && <0.4 - , dependent-sum >=0.4 && <0.5 + , dependent-map >=0.4 && <0.5 + , dependent-sum >=0.7.1 && <0.8 , exceptions , these , semialign @@ -133,10 +174,10 @@ library , lens -- GraphQL related - , graphql-parser + , graphql-parser >=0.2 && <0.3 -- URL parser related - , network-uri + , network-uri >=2.6.3.0 && <2.7 , uri-encode -- String related @@ -208,23 +249,35 @@ library , generic-arbitrary , quickcheck-instances - -- 0.6.1 is supposedly not okay for ghc 8.6: - -- https://github.com/nomeata/ghc-heap-view/issues/27 - , ghc-heap-view - , directory - + , random , mmorph , http-api-data , lens-aeson , safe - + , semigroups >= 0.19.1 -- scheduled triggers , cron >= 0.6.2 + , random + , mmorph + , http-api-data + , lens-aeson + , safe + + , semigroups >= 0.19.1 + + -- scheduled triggers + , cron >= 0.6.2 + if !flag(profiling) + build-depends: + -- 0.6.1 is supposedly not okay for ghc 8.6: + -- https://github.com/nomeata/ghc-heap-view/issues/27 + ghc-heap-view + exposed-modules: Control.Arrow.Extended , Control.Arrow.Trans , Control.Concurrent.Extended @@ -246,76 +299,85 @@ library , Data.Time.Clock.Units , Data.URL.Template , Hasura.App - , Hasura.Cache.Bounded , Hasura.Db + -- Exposed for benchmark: + , Hasura.Cache.Bounded + , Hasura.Logging + , Hasura.HTTP + , Hasura.Incremental + , Hasura.Server.App + , Hasura.Server.Auth + , Hasura.Server.Init + , Hasura.Server.Init.Config + , Hasura.Server.API.Query + , Hasura.Server.Utils + , Hasura.Server.Version + , Hasura.Server.Logging + , Hasura.Server.Migrate + , Hasura.Server.Compression + , Hasura.Server.API.PGDump + , Hasura.Prelude + , Hasura.EncJSON - , Hasura.Eventing.Common - , Hasura.Eventing.EventTrigger - , Hasura.Eventing.HTTP - , Hasura.Eventing.ScheduledTrigger - , Hasura.GraphQL.Context + , Hasura.GraphQL.Execute.Query + , Hasura.GraphQL.Logging + , Hasura.Incremental.Select + , Hasura.RQL.DML.Select + , Hasura.RQL.Types.Run + , Hasura.Session + + -- exposed for Pro + , Hasura.Server.API.Config + , Hasura.Server.Telemetry + -- Exposed for testing: + , Hasura.Server.Telemetry.Counters + , Hasura.Server.Auth.JWT , Hasura.GraphQL.Execute , Hasura.GraphQL.Execute.LiveQuery - , Hasura.GraphQL.Execute.LiveQuery.Options - , Hasura.GraphQL.Execute.LiveQuery.Plan - , Hasura.GraphQL.Execute.LiveQuery.Poll - , Hasura.GraphQL.Execute.LiveQuery.State - , Hasura.GraphQL.Execute.LiveQuery.TMap - , Hasura.GraphQL.Execute.Plan - , Hasura.GraphQL.Execute.Query - , Hasura.GraphQL.Explain - , Hasura.GraphQL.Logging - , Hasura.GraphQL.NormalForm - , Hasura.GraphQL.RelaySchema - , Hasura.GraphQL.RemoteServer - , Hasura.GraphQL.Resolve - , Hasura.GraphQL.Resolve.Action - , Hasura.GraphQL.Resolve.BoolExp - , Hasura.GraphQL.Resolve.Context - , Hasura.GraphQL.Resolve.InputValue - , Hasura.GraphQL.Resolve.Insert - , Hasura.GraphQL.Resolve.Introspect - , Hasura.GraphQL.Resolve.Mutation - , Hasura.GraphQL.Resolve.Select - , Hasura.GraphQL.Resolve.Types - , Hasura.GraphQL.Schema - , Hasura.GraphQL.Schema.Action - , Hasura.GraphQL.Schema.BoolExp - , Hasura.GraphQL.Schema.Builder - , Hasura.GraphQL.Schema.Common - , Hasura.GraphQL.Schema.CustomTypes - , Hasura.GraphQL.Schema.Function - , Hasura.GraphQL.Schema.Merge - , Hasura.GraphQL.Schema.Mutation.Common - , 
Hasura.GraphQL.Schema.Mutation.Delete - , Hasura.GraphQL.Schema.Mutation.Insert - , Hasura.GraphQL.Schema.Mutation.Update - , Hasura.GraphQL.Schema.OrderBy - , Hasura.GraphQL.Schema.Select , Hasura.GraphQL.Transport.HTTP , Hasura.GraphQL.Transport.HTTP.Protocol , Hasura.GraphQL.Transport.WebSocket , Hasura.GraphQL.Transport.WebSocket.Protocol , Hasura.GraphQL.Transport.WebSocket.Server , Hasura.GraphQL.Utils - , Hasura.GraphQL.Validate - , Hasura.GraphQL.Validate.Context - , Hasura.GraphQL.Validate.InputValue - , Hasura.GraphQL.Validate.SelectionSet - , Hasura.GraphQL.Validate.Types - , Hasura.HTTP - , Hasura.Incremental , Hasura.Incremental.Internal.Cache , Hasura.Incremental.Internal.Dependency , Hasura.Incremental.Internal.Rule - , Hasura.Incremental.Select - , Hasura.Logging - , Hasura.Prelude + , Hasura.Server.Auth.WebHook + , Hasura.Server.Middleware + , Hasura.Server.Cors + , Hasura.Server.CheckUpdates + , Hasura.Server.SchemaUpdate + , Hasura.Server.Migrate.Version + , Hasura.Server.Auth.JWT.Internal + , Hasura.Server.Auth.JWT.Logging + , Hasura.RQL.Instances + , Hasura.RQL.Types + , Hasura.RQL.Types.SchemaCache + , Hasura.RQL.Types.Table + , Hasura.RQL.Types.SchemaCache.Build + , Hasura.RQL.Types.SchemaCacheTypes + , Hasura.RQL.Types.BoolExp + , Hasura.RQL.Types.Function + , Hasura.RQL.Types.Catalog + , Hasura.RQL.Types.Column + , Hasura.RQL.Types.Common + , Hasura.RQL.Types.ComputedField + , Hasura.RQL.Types.DML + , Hasura.RQL.Types.Error + , Hasura.RQL.Types.EventTrigger + , Hasura.RQL.Types.Metadata + , Hasura.RQL.Types.Permission + , Hasura.RQL.Types.QueryCollection + , Hasura.RQL.Types.Action + , Hasura.RQL.Types.RemoteSchema + , Hasura.RQL.Types.RemoteRelationship + , Hasura.RQL.Types.ScheduledTrigger , Hasura.RQL.DDL.Action , Hasura.RQL.DDL.ComputedField , Hasura.RQL.DDL.CustomTypes + , Hasura.RQL.Types.CustomTypes , Hasura.RQL.DDL.Deps - , Hasura.RQL.DDL.EventTrigger , Hasura.RQL.DDL.Headers , Hasura.RQL.DDL.Metadata , Hasura.RQL.DDL.Metadata.Generator @@ -329,7 +391,6 @@ library , Hasura.RQL.DDL.RemoteRelationship , Hasura.RQL.DDL.RemoteRelationship.Validate , Hasura.RQL.DDL.RemoteSchema - , Hasura.RQL.DDL.ScheduledTrigger , Hasura.RQL.DDL.Schema , Hasura.RQL.DDL.Schema.Cache , Hasura.RQL.DDL.Schema.Cache.Common @@ -343,66 +404,63 @@ library , Hasura.RQL.DDL.Schema.Rename , Hasura.RQL.DDL.Schema.Table , Hasura.RQL.DDL.Utils - , Hasura.RQL.DML.Count + , Hasura.RQL.DDL.EventTrigger + , Hasura.RQL.DDL.ScheduledTrigger , Hasura.RQL.DML.Delete - , Hasura.RQL.DML.Insert + , Hasura.RQL.DML.Delete.Types , Hasura.RQL.DML.Internal + , Hasura.RQL.DML.Insert + , Hasura.RQL.DML.Insert.Types , Hasura.RQL.DML.Mutation , Hasura.RQL.DML.RemoteJoin , Hasura.RQL.DML.Returning - , Hasura.RQL.DML.Select + , Hasura.RQL.DML.Returning.Types , Hasura.RQL.DML.Select.Internal , Hasura.RQL.DML.Select.Types , Hasura.RQL.DML.Update + , Hasura.RQL.DML.Update.Types + , Hasura.RQL.DML.Count , Hasura.RQL.GBoolExp - , Hasura.RQL.Instances - , Hasura.RQL.Types - , Hasura.RQL.Types.Action - , Hasura.RQL.Types.BoolExp - , Hasura.RQL.Types.Catalog - , Hasura.RQL.Types.Column - , Hasura.RQL.Types.Common - , Hasura.RQL.Types.ComputedField - , Hasura.RQL.Types.CustomTypes - , Hasura.RQL.Types.DML - , Hasura.RQL.Types.Error - , Hasura.RQL.Types.EventTrigger - , Hasura.RQL.Types.Function - , Hasura.RQL.Types.Metadata - , Hasura.RQL.Types.Permission - , Hasura.RQL.Types.QueryCollection - , Hasura.RQL.Types.RemoteRelationship - , Hasura.RQL.Types.RemoteSchema - , Hasura.RQL.Types.Run - , 
Hasura.RQL.Types.ScheduledTrigger - , Hasura.RQL.Types.SchemaCache - , Hasura.RQL.Types.SchemaCache.Build - , Hasura.RQL.Types.SchemaCacheTypes - , Hasura.RQL.Types.Table - , Hasura.Server.API.Config - , Hasura.Server.API.PGDump - , Hasura.Server.API.Query - , Hasura.Server.App - , Hasura.Server.Auth - , Hasura.Server.Auth.JWT - , Hasura.Server.Auth.JWT.Internal - , Hasura.Server.Auth.JWT.Logging - , Hasura.Server.Auth.WebHook - , Hasura.Server.CheckUpdates - , Hasura.Server.Compression - , Hasura.Server.Cors - , Hasura.Server.Init - , Hasura.Server.Init.Config - , Hasura.Server.Logging - , Hasura.Server.Middleware - , Hasura.Server.Migrate - , Hasura.Server.Migrate.Version - , Hasura.Server.SchemaUpdate - , Hasura.Server.Telemetry - , Hasura.Server.Telemetry.Counters - , Hasura.Server.Utils - , Hasura.Server.Version - , Hasura.Session + , Hasura.GraphQL.Explain + , Hasura.GraphQL.Execute.Action + , Hasura.GraphQL.Execute.Inline + , Hasura.GraphQL.Execute.Insert + , Hasura.GraphQL.Execute.Plan + , Hasura.GraphQL.Execute.Types + , Hasura.GraphQL.Execute.Mutation + , Hasura.GraphQL.Execute.Resolve + , Hasura.GraphQL.Execute.Prepare + , Hasura.GraphQL.Execute.LiveQuery.Options + , Hasura.GraphQL.Execute.LiveQuery.Plan + , Hasura.GraphQL.Execute.LiveQuery.Poll + , Hasura.GraphQL.Execute.LiveQuery.State + , Hasura.GraphQL.Execute.LiveQuery.TMap + , Hasura.GraphQL.RemoteServer + , Hasura.GraphQL.Context + , Hasura.GraphQL.Parser + , Hasura.GraphQL.Parser.Class + , Hasura.GraphQL.Parser.Collect + , Hasura.GraphQL.Parser.Column + , Hasura.GraphQL.Parser.Internal.Parser + , Hasura.GraphQL.Parser.Monad + , Hasura.GraphQL.Parser.Schema + , Hasura.GraphQL.Schema + , Hasura.GraphQL.Schema.Action + , Hasura.GraphQL.Schema.BoolExp + , Hasura.GraphQL.Schema.Common + , Hasura.GraphQL.Schema.Insert + , Hasura.GraphQL.Schema.Introspect + , Hasura.GraphQL.Schema.Mutation + , Hasura.GraphQL.Schema.OrderBy + , Hasura.GraphQL.Schema.Remote + , Hasura.GraphQL.Schema.Select + , Hasura.GraphQL.Schema.Table + , Hasura.Eventing.HTTP + , Hasura.Eventing.EventTrigger + , Hasura.Eventing.ScheduledTrigger + , Hasura.Eventing.Common + , Data.GADT.Compare.Extended + , Data.Tuple.Extended , Hasura.SQL.DML , Hasura.SQL.Error , Hasura.SQL.GeoJSON @@ -414,6 +472,7 @@ library , Network.URI.Extended , Network.Wai.Extended , Network.Wai.Handler.WebSockets.Custom + executable graphql-engine import: common-all, common-exe hs-source-dirs: src-exec diff --git a/server/src-bench-cache/Main.hs b/server/src-bench-cache/Main.hs index a717ff257c6..09b06952b52 100644 --- a/server/src-bench-cache/Main.hs +++ b/server/src-bench-cache/Main.hs @@ -82,7 +82,7 @@ main = defaultMain [ -- correct, or might be incorrect for some users. Or it might be that many -- users interact with hasura ONLY with parameterized queries with variables, -- where all of these fit into a fairly small cache (but where occurrences of --- these are zipf-distributed). (TODO It should be simple to adapt this to the latter +-- these are zipf-distributed). (TODO (from master) It should be simple to adapt this to the latter -- case (just test on zipf Word8 domain), but these benchmarks don't seem very -- useful if we assume we effectively get only cache hits). 
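A minimal sketch, not part of this patch, of the "zipf Word8 domain" workload the TODO above alludes to; the function name `zipfWord8` and the exponent parameter `s` are hypothetical:

```
import Data.Word     (Word8)
import System.Random (randomRIO)

-- Draw a Word8 key with P(rank r) proportional to 1/r^s (a Zipf-like
-- distribution over a 256-element key space) by walking cumulative weights.
zipfWord8 :: Double -> IO Word8
zipfWord8 s = do
  let weights = [ 1 / fromIntegral r ** s | r <- [1 .. 256 :: Int] ]
  u <- randomRIO (0, sum weights)
  let go _   []              = 255  -- unreachable: u <= sum weights
      go acc ((k, w) : rest)
        | acc + w >= u = k
        | otherwise    = go (acc + w) rest
  pure $ go (0 :: Double) (zip [0 .. 255] weights)
```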
-- @@ -141,7 +141,7 @@ realisticBenches name wrk = _hitsMisses <- forConcurrently localPayloads $ \payloadL -> do foldM lookupInsertLoop (0,0) payloadL aft <- getMonotonicTimeNSec - -- TODO we need to decide whether to rewrite these benchmarks or fix + -- TODO (from master) we need to decide whether to rewrite these benchmarks or fix -- criterion so it can support what I want here (to run a slow benchmark -- perhaps one time, with an actual time limit). -- We should also look into just generating a report by hand that takes diff --git a/server/src-lib/Data/Aeson/Ordered.hs b/server/src-lib/Data/Aeson/Ordered.hs index 0104aae961f..74ed4130665 100644 --- a/server/src-lib/Data/Aeson/Ordered.hs +++ b/server/src-lib/Data/Aeson/Ordered.hs @@ -16,8 +16,8 @@ module Data.Aeson.Ordered , decode , Data.Aeson.Ordered.toList , fromList - , object , asObject + , object , array , insert , delete @@ -41,7 +41,6 @@ import qualified Data.ByteString.Lazy as L import Data.Data import Data.Functor import qualified Data.HashMap.Strict as Map -import Data.HashMap.Strict.InsOrd (InsOrdHashMap) import qualified Data.HashMap.Strict.InsOrd as OMap import Data.Scientific import qualified Data.Text as T diff --git a/server/src-lib/Data/GADT/Compare/Extended.hs b/server/src-lib/Data/GADT/Compare/Extended.hs new file mode 100644 index 00000000000..a7b4a2eeb4e --- /dev/null +++ b/server/src-lib/Data/GADT/Compare/Extended.hs @@ -0,0 +1,29 @@ +{-# OPTIONS_GHC -fno-warn-orphans #-} +{-# LANGUAGE PolyKinds #-} + +module Data.GADT.Compare.Extended + ( module Data.GADT.Compare + , strengthenOrdering + , extendGOrdering + ) where + +import Prelude + +import Data.GADT.Compare +import Type.Reflection + +instance GEq ((:~~:) a) where + geq HRefl HRefl = Just Refl +instance GCompare ((:~~:) a) where + gcompare HRefl HRefl = GEQ + +strengthenOrdering :: Ordering -> GOrdering a a +strengthenOrdering LT = GLT +strengthenOrdering EQ = GEQ +strengthenOrdering GT = GGT + +infixr 6 `extendGOrdering` +extendGOrdering :: GOrdering a b -> (a ~ b => GOrdering c d) -> GOrdering c d +extendGOrdering GLT _ = GLT +extendGOrdering GEQ x = x +extendGOrdering GGT _ = GGT diff --git a/server/src-lib/Data/HashMap/Strict/InsOrd/Extended.hs b/server/src-lib/Data/HashMap/Strict/InsOrd/Extended.hs index 832b7420602..dd6f28e243d 100644 --- a/server/src-lib/Data/HashMap/Strict/InsOrd/Extended.hs +++ b/server/src-lib/Data/HashMap/Strict/InsOrd/Extended.hs @@ -10,7 +10,7 @@ import qualified Data.List as L import Data.Hashable (Hashable) -import Prelude (Eq, Foldable, Functor, fmap, ($)) +import Prelude (Eq, Foldable, Functor, flip, fmap, ($), (<>)) groupTuples :: (Eq k, Hashable k, Foldable t) @@ -19,7 +19,7 @@ groupTuples = L.foldl' groupFlds OMap.empty where groupFlds m (k, v) = - OMap.insertWith (\_ c -> c NE.|> v) k (NE.init v) m + OMap.insertWith (flip (<>)) k (NE.singleton v) m groupListWith :: (Eq k, Hashable k, Foldable t, Functor t) diff --git a/server/src-lib/Data/Sequence/NonEmpty.hs b/server/src-lib/Data/Sequence/NonEmpty.hs index 99f2a14a17e..f9740437946 100644 --- a/server/src-lib/Data/Sequence/NonEmpty.hs +++ b/server/src-lib/Data/Sequence/NonEmpty.hs @@ -1,39 +1,49 @@ +{-# LANGUAGE PatternSynonyms #-} +{-# LANGUAGE ViewPatterns #-} + module Data.Sequence.NonEmpty - ( NESeq(..) 
- , (<|) - , (|>) - , init + ( NESeq + , pattern (:<||) + , pattern (:||>) + , singleton , head , tail , toSeq - , fromSeq , toNonEmpty ) where -import qualified Data.Foldable as Foldable -import qualified Data.Functor as Functor +import Prelude hiding (head, tail) + import qualified Data.List.NonEmpty as NE import qualified Data.Sequence as Seq +import Control.DeepSeq (NFData) import Data.Aeson -import Hasura.Incremental (Cacheable) -import Hasura.Prelude hiding (head, tail) +import Data.Foldable +import GHC.Generics (Generic) -infixr 5 <| -infixl 5 |> - -newtype NESeq a - = NESeq { unNESeq :: (a, Seq.Seq a)} - deriving (Show, Eq, Generic, Traversable) +data NESeq a = NESeq + { head :: a + , tail :: Seq.Seq a + } deriving (Show, Eq, Functor, Traversable, Generic) instance (NFData a) => NFData (NESeq a) -instance (Cacheable a) => Cacheable (NESeq a) -instance Functor.Functor NESeq where - fmap f (NESeq (a, rest)) - = NESeq (f a, Functor.fmap f rest) +instance Semigroup (NESeq a) where + NESeq x xs <> NESeq y ys = NESeq x (xs Seq.>< y Seq.<| ys) -instance Foldable.Foldable NESeq where - foldr f v = Foldable.foldr f v . toSeq +instance Foldable NESeq where + null _ = False + toList (NESeq x xs) = x : toList xs + length (NESeq _ xs) = 1 + length xs + foldl1 f (NESeq x xs) = foldl f x xs + + fold = fold . toSeq + foldMap f = foldMap f . toSeq + foldl f v = foldl f v . toSeq + foldl' f v = foldl' f v . toSeq + foldr f v = foldr f v . toSeq + foldr' f v = foldr' f v . toSeq + foldr1 f = foldr1 f . toSeq instance FromJSON a => FromJSON (NESeq a) where parseJSON v = do @@ -43,32 +53,33 @@ instance FromJSON a => FromJSON (NESeq a) where instance ToJSON a => ToJSON (NESeq a) where toJSON = toJSON . toSeq -init :: a -> NESeq a -init a = NESeq (a, Seq.empty) - -head :: NESeq a -> a -head = fst . unNESeq - -tail :: NESeq a -> Seq.Seq a -tail = snd . 
unNESeq - -(|>) :: NESeq a -> a -> NESeq a -(NESeq (h, l)) |> v = NESeq (h, l Seq.|> v) - -(<|) :: a -> NESeq a -> NESeq a -v <| (NESeq (h, l)) = NESeq (v, h Seq.<| l) +singleton :: a -> NESeq a +singleton a = NESeq a Seq.empty toSeq :: NESeq a -> Seq.Seq a -toSeq (NESeq (v, l)) = v Seq.<| l +toSeq (NESeq v l) = v Seq.<| l fromSeq :: Seq.Seq a -> Maybe (NESeq a) fromSeq = \case Seq.Empty -> Nothing - h Seq.:<| l -> Just $ NESeq (h, l) + h Seq.:<| l -> Just $ NESeq h l -toNonEmpty :: NESeq a -> NonEmpty a -toNonEmpty (NESeq (v, l)) = v NE.:| toList l +pattern (:<||) :: a -> Seq.Seq a -> NESeq a +pattern x :<|| xs = NESeq x xs +{-# COMPLETE (:<||) #-} -instance Semigroup (NESeq a) where - (NESeq (h, l)) <> r = - NESeq (h, l <> toSeq r) +unsnoc :: NESeq a -> (Seq.Seq a, a) +unsnoc (x :<|| (xs Seq.:|> y)) = (x Seq.:<| xs, y) +unsnoc (x :<|| Seq.Empty ) = (Seq.Empty , x) +{-# INLINE unsnoc #-} + +pattern (:||>) :: Seq.Seq a -> a -> NESeq a +pattern xs :||> x <- (unsnoc->(!xs, x)) + where + (x Seq.:<| xs) :||> y = x :<|| (xs Seq.:|> y) + Seq.Empty :||> y = y :<|| Seq.Empty +{-# COMPLETE (:||>) #-} + +toNonEmpty :: NESeq a -> NE.NonEmpty a +toNonEmpty (NESeq head tail) = + head NE.:| toList tail diff --git a/server/src-lib/Data/Tuple/Extended.hs b/server/src-lib/Data/Tuple/Extended.hs new file mode 100644 index 00000000000..040364c2d6d --- /dev/null +++ b/server/src-lib/Data/Tuple/Extended.hs @@ -0,0 +1,23 @@ +module Data.Tuple.Extended + ( module Data.Tuple + , curry3 + , curry4 + , uncurry3 + , uncurry4 + ) where + +import Prelude + +import Data.Tuple + +curry3 :: ((a, b, c) -> d) -> a -> b -> c -> d +curry3 f a b c = f (a, b, c) + +curry4 :: ((a, b, c, d) -> e) -> a -> b -> c -> d -> e +curry4 f a b c d = f (a, b, c, d) + +uncurry3 :: (a -> b -> c -> d) -> (a, b, c) -> d +uncurry3 f (a, b, c) = f a b c + +uncurry4 :: (a -> b -> c -> d -> e) -> (a, b, c, d) -> e +uncurry4 f (a, b, c, d) = f a b c d diff --git a/server/src-lib/Hasura/App.hs b/server/src-lib/Hasura/App.hs index ae34203f349..4e202bf8709 100644 --- a/server/src-lib/Hasura/App.hs +++ b/server/src-lib/Hasura/App.hs @@ -1,4 +1,5 @@ {-# LANGUAGE UndecidableInstances #-} +{-# LANGUAGE CPP #-} module Hasura.App where @@ -12,9 +13,12 @@ import Control.Monad.Morph (hoist) import Control.Monad.Stateless import Control.Monad.STM (atomically) import Control.Monad.Trans.Control (MonadBaseControl (..)) +import Control.Monad.Unique import Data.Aeson ((.=)) import Data.Time.Clock (UTCTime) +#ifndef PROFILING import GHC.AssertNF +#endif import GHC.Stats import Options.Applicative import System.Environment (getEnvironment) @@ -45,8 +49,8 @@ import Hasura.Eventing.EventTrigger import Hasura.Eventing.ScheduledTrigger import Hasura.GraphQL.Execute (MonadGQLExecutionCheck (..), checkQueryInAllowlist) +import Hasura.GraphQL.Execute.Action (asyncActionsProcessor) import Hasura.GraphQL.Logging (MonadQueryLog (..), QueryLog (..)) -import Hasura.GraphQL.Resolve.Action (asyncActionsProcessor) import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery (..)) import Hasura.GraphQL.Transport.HTTP.Protocol (toParsed) import Hasura.Logging @@ -165,7 +169,7 @@ data InitCtx } -- | Collection of the LoggerCtx, the regular Logger and the PGLogger --- TODO: better naming? +-- TODO (from master): better naming? data Loggers = Loggers { _lsLoggerCtx :: !(LoggerCtx Hasura) @@ -319,7 +323,9 @@ runHGEServer env ServeOptions{..} InitCtx{..} pgExecCtx initTime shutdownApp pos -- tool. -- -- NOTE: be sure to compile WITHOUT code coverage, for this to work properly. 
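Stepping back to the `Data.Sequence.NonEmpty` rewrite above, here is a hedged usage sketch of the new `COMPLETE` pattern synonyms, which make matching from either end total without the removed `(<|)`/`(|>)` operators; `firstAndLast` is a hypothetical example, not code from this patch:

```
import qualified Data.Sequence as Seq
import Data.Sequence.NonEmpty

-- Both pattern synonyms carry COMPLETE pragmas, so neither match below
-- can fall through on a non-empty sequence.
firstAndLast :: NESeq a -> (a, a)
firstAndLast s@(x :<|| _) =
  case s of
    _ :||> y -> (x, y)

-- Construction works from either end as well:
--   singleton 1 <> singleton 2    -- the sequence [1, 2]
--   Seq.fromList [1, 2] :||> 3    -- the sequence [1, 2, 3]
```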
+#ifndef PROFILING liftIO disableAssertNF +#endif let sqlGenCtx = SQLGenCtx soStringifyNum Loggers loggerCtx logger _ = _icLoggers @@ -597,6 +603,7 @@ execQuery , CacheRWM m , MonadTx m , MonadIO m + , MonadUnique m , HasHttpManager m , HasSQLGenCtx m , UserInfoM m @@ -625,7 +632,7 @@ instance HttpLog AppM where mkHttpAccessLogContext userInfoM reqId waiReq compressedResponse qTime cType headers instance MonadExecuteQuery AppM where - executeQuery _ _ _ pgCtx _txAccess tx = + executeQuery _ _ _ pgCtx _txAccess tx = ([],) <$> hoist (runQueryTx pgCtx) tx instance UserAuthentication (Tracing.TraceT AppM) where @@ -660,7 +667,6 @@ instance MonadQueryLog AppM where instance WS.MonadWSLog AppM where logWSLog = unLogger - --- helper functions --- mkConsoleHTML :: HasVersion => Text -> AuthMode -> Bool -> Maybe Text -> Either String Text diff --git a/server/src-lib/Hasura/Db.hs b/server/src-lib/Hasura/Db.hs index 1fd38e13e6a..726010fb255 100644 --- a/server/src-lib/Hasura/Db.hs +++ b/server/src-lib/Hasura/Db.hs @@ -19,6 +19,7 @@ module Hasura.Db , LazyRespTx , defaultTxErrorHandler , mkTxErrorHandler + , lazyTxToQTx ) where import Control.Lens diff --git a/server/src-lib/Hasura/EncJSON.hs b/server/src-lib/Hasura/EncJSON.hs index 459433757c1..0bb8ca59260 100644 --- a/server/src-lib/Hasura/EncJSON.hs +++ b/server/src-lib/Hasura/EncJSON.hs @@ -25,7 +25,7 @@ import qualified Data.Text.Encoding as TE import qualified Database.PG.Query as Q -- encoded json --- TODO: can be improved with gadts capturing bytestring, lazybytestring +-- TODO (from master): can be improved with gadts capturing bytestring, lazybytestring -- and builder newtype EncJSON = EncJSON { unEncJSON :: BB.Builder } diff --git a/server/src-lib/Hasura/GraphQL/Context.hs b/server/src-lib/Hasura/GraphQL/Context.hs index c1b21620c7a..f5ba13b3927 100644 --- a/server/src-lib/Hasura/GraphQL/Context.hs +++ b/server/src-lib/Hasura/GraphQL/Context.hs @@ -1,54 +1,43 @@ -module Hasura.GraphQL.Context where +{-# LANGUAGE StrictData #-} + +module Hasura.GraphQL.Context + ( RoleContext(..) + , GQLContext(..) + , ParserFn + , RootField(..) + , traverseDB + , traverseAction + , RemoteField + , QueryDB(..) + , ActionQuery(..) + , QueryRootField + , MutationDB(..) + , ActionMutation(..) + , MutationRootField + , SubscriptionRootField + , SubscriptionRootFieldResolved + ) where import Hasura.Prelude -import Data.Aeson +import qualified Data.Aeson as J import Data.Aeson.Casing import Data.Aeson.TH -import Data.Has - -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set import qualified Language.GraphQL.Draft.Syntax as G -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Validate.Types -import Hasura.Session +import qualified Hasura.RQL.DML.Delete.Types as RQL +import qualified Hasura.RQL.DML.Select.Types as RQL +import qualified Hasura.RQL.DML.Update.Types as RQL +import qualified Hasura.RQL.Types.Action as RQL +import qualified Hasura.RQL.Types.RemoteSchema as RQL +import qualified Hasura.SQL.DML as S --- | A /GraphQL context/, aka the final output of GraphQL schema generation. Used to both validate --- incoming queries and respond to introspection queries. --- --- Combines information from 'TyAgg', 'RootFields', and 'InsCtxMap' datatypes and adds a bit more on --- top. Constructed via the 'mkGCtx' smart constructor. 
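The `GCtx` record removed here gives way to `GQLContext` (defined in the additions that follow), which carries the generated schema directly as parsers. A hedged sketch of consuming it; `runQueryParser` is a hypothetical wrapper whose type simply unfolds the `ParserFn` alias from this hunk:

```
import qualified Data.HashMap.Strict.InsOrd    as OMap
import qualified Language.GraphQL.Draft.Syntax as G

import Data.Sequence.NonEmpty         (NESeq)
import Hasura.GraphQL.Context
import Hasura.GraphQL.Parser
import Hasura.GraphQL.Parser.Column   (UnpreparedValue)

-- Run the role-specific query parser over a fragment-free selection set,
-- producing typed root fields (or parse errors) plus a reusability marker.
runQueryParser
  :: GQLContext
  -> G.SelectionSet G.NoFragments Variable
  -> Either (NESeq ParseError)
            (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue), QueryReusability)
runQueryParser = gqlQueryParser
```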
-data GCtx - = GCtx - -- GraphQL type information - { _gTypes :: !TypeMap - , _gFields :: !FieldMap - , _gQueryRoot :: !ObjTyInfo - , _gMutRoot :: !(Maybe ObjTyInfo) - , _gSubRoot :: !(Maybe ObjTyInfo) - -- Postgres type information - , _gOrdByCtx :: !OrdByCtx - , _gQueryCtxMap :: !QueryCtxMap - , _gMutationCtxMap :: !MutationCtxMap - , _gInsCtxMap :: !InsCtxMap - } deriving (Show, Eq) +import Hasura.GraphQL.Parser +import Hasura.GraphQL.Schema.Insert (AnnInsert) -data RemoteGCtx - = RemoteGCtx - { _rgTypes :: !TypeMap - , _rgQueryRoot :: !ObjTyInfo - , _rgMutationRoot :: !(Maybe ObjTyInfo) - , _rgSubscriptionRoot :: !(Maybe ObjTyInfo) - } deriving (Show, Eq) - -instance Has TypeMap GCtx where - getter = _gTypes - modifier f ctx = ctx { _gTypes = f $ _gTypes ctx } - -instance ToJSON GCtx where - toJSON _ = String "ToJSON for GCtx is not implemented" +-- | For storing both a normal GQLContext and one for the backend variant. +-- Currently, this is to enable the backend variant to have certain insert +-- permissions which the frontend variant does not. data RoleContext a = RoleContext @@ -57,37 +46,71 @@ data RoleContext a } deriving (Show, Eq, Functor, Foldable, Traversable) $(deriveToJSON (aesonDrop 5 snakeCase) ''RoleContext) -type GCtxMap = Map.HashMap RoleName (RoleContext GCtx) -type RelayGCtxMap = Map.HashMap RoleName GCtx +data GQLContext = GQLContext + { gqlQueryParser :: ParserFn (InsOrdHashMap G.Name (QueryRootField UnpreparedValue)) + , gqlMutationParser :: Maybe (ParserFn (InsOrdHashMap G.Name (MutationRootField UnpreparedValue))) + } -queryRootNamedType :: G.NamedType -queryRootNamedType = G.NamedType "query_root" +instance J.ToJSON GQLContext where + toJSON GQLContext{} = J.String "The GraphQL schema parsers" -mutationRootNamedType :: G.NamedType -mutationRootNamedType = G.NamedType "mutation_root" +type ParserFn a + = G.SelectionSet G.NoFragments Variable + -> Either (NESeq ParseError) (a, QueryReusability) -subscriptionRootNamedType :: G.NamedType -subscriptionRootNamedType = G.NamedType "subscription_root" +data RootField db remote action raw + = RFDB db + | RFRemote remote + | RFAction action + | RFRaw raw -mkQueryRootTyInfo :: [ObjFldInfo] -> ObjTyInfo -mkQueryRootTyInfo flds = - mkHsraObjTyInfo (Just "query root") queryRootNamedType Set.empty $ - mapFromL _fiName $ schemaFld:typeFld:flds - where - schemaFld = mkHsraObjFldInfo Nothing "__schema" Map.empty $ - G.toGT $ G.toNT $ G.NamedType "__Schema" - typeFld = mkHsraObjFldInfo Nothing "__type" typeFldArgs $ - G.toGT $ G.NamedType "__Type" - typeFldArgs = mapFromL _iviName $ pure $ - InpValInfo (Just "name of the type") "name" Nothing - $ G.toGT $ G.toNT $ G.NamedType "String" +traverseDB :: forall db db' remote action raw f + . Applicative f + => (db -> f db') + -> RootField db remote action raw + -> f (RootField db' remote action raw) +traverseDB f = \case + RFDB x -> RFDB <$> f x + RFRemote x -> pure $ RFRemote x + RFAction x -> pure $ RFAction x + RFRaw x -> pure $ RFRaw x -defaultTypes :: [TypeInfo] -defaultTypes = $(fromSchemaDocQ defaultSchema TLHasuraType) +traverseAction :: forall db remote action action' raw f + . 
Applicative f + => (action -> f action') + -> RootField db remote action raw + -> f (RootField db remote action' raw) +traverseAction f = \case + RFDB x -> pure $ RFDB x + RFRemote x -> pure $ RFRemote x + RFAction x -> RFAction <$> f x + RFRaw x -> pure $ RFRaw x -emptyGCtx :: GCtx -emptyGCtx = - let queryRoot = mkQueryRootTyInfo [] - allTys = mkTyInfoMap $ TIObj queryRoot:defaultTypes - -- for now subscription root is query root - in GCtx allTys mempty queryRoot Nothing Nothing mempty mempty mempty mempty +data QueryDB v + = QDBSimple (RQL.AnnSimpleSelG v) + | QDBPrimaryKey (RQL.AnnSimpleSelG v) + | QDBAggregation (RQL.AnnAggregateSelectG v) + | QDBConnection (RQL.ConnectionSelect v) + +data ActionQuery v + = AQQuery !(RQL.AnnActionExecution v) + | AQAsync !(RQL.AnnActionAsyncQuery v) + +type RemoteField = (RQL.RemoteSchemaInfo, G.Field G.NoFragments Variable) + +type QueryRootField v = RootField (QueryDB v) RemoteField (ActionQuery v) J.Value + +data MutationDB v + = MDBInsert (AnnInsert v) + | MDBUpdate (RQL.AnnUpdG v) + | MDBDelete (RQL.AnnDelG v) + +data ActionMutation v + = AMSync !(RQL.AnnActionExecution v) + | AMAsync !RQL.AnnActionMutationAsync + +type MutationRootField v = + RootField (MutationDB v) RemoteField (ActionMutation v) J.Value + +type SubscriptionRootField v = RootField (QueryDB v) Void (RQL.AnnActionAsyncQuery v) Void +type SubscriptionRootFieldResolved = RootField (QueryDB S.SQLExp) Void RQL.AnnSimpleSel Void diff --git a/server/src-lib/Hasura/GraphQL/Execute.hs b/server/src-lib/Hasura/GraphQL/Execute.hs index bc642e60832..014e77bb685 100644 --- a/server/src-lib/Hasura/GraphQL/Execute.hs +++ b/server/src-lib/Hasura/GraphQL/Execute.hs @@ -1,21 +1,19 @@ module Hasura.GraphQL.Execute - ( GQExecPlan(..) - , EQ.GraphQLQueryType(..) - - , ExecPlanPartial - , getExecPlanPartial - - , ExecOp(..) - , GQExecPlanResolved + ( EPr.ExecutionStep(..) + , ResolvedExecutionPlan(..) + , ET.GraphQLQueryType(..) , getResolvedExecPlan + , getExecPlanPartial , execRemoteGQ - , getSubsOp + , validateSubscriptionRootField + -- , getSubsOp - , EP.PlanCache - , EP.PlanCacheOptions(..) - , EP.initPlanCache - , EP.clearPlanCache - , EP.dumpPlanCache + -- , EP.PlanCache + -- , EP.mkPlanCacheOptions + -- , EP.PlanCacheOptions(..) + -- , EP.initPlanCache + -- , EP.clearPlanCache + -- , EP.dumpPlanCache , EQ.PreparedSql(..) , ExecutionCtx(..) 
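The `RootField` traversals defined in the hunk above compose with ordinary `Traversable` machinery. A hedged sketch, with `resolveDBFields` as a hypothetical helper, of running an effectful pass over only the database fields of a parsed operation:

```
import qualified Data.HashMap.Strict.InsOrd    as OMap
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Context

-- Apply an effectful resolver to every RFDB field, leaving remote schema
-- fields, actions, and raw (introspection) fields untouched.
resolveDBFields
  :: Applicative f
  => (db -> f db')
  -> OMap.InsOrdHashMap G.Name (RootField db remote action raw)
  -> f (OMap.InsOrdHashMap G.Name (RootField db' remote action raw))
resolveDBFields f = traverse (traverseDB f)
```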
@@ -23,55 +21,46 @@ module Hasura.GraphQL.Execute , checkQueryInAllowlist ) where -import Control.Lens -import Data.Has +import Hasura.Prelude import qualified Data.Aeson as J import qualified Data.Environment as Env import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set + +import qualified Data.HashSet as HS import qualified Language.GraphQL.Draft.Syntax as G import qualified Network.HTTP.Client as HTTP import qualified Network.HTTP.Types as HTTP import qualified Network.Wai.Extended as Wai import Hasura.EncJSON -import Hasura.GraphQL.Context import Hasura.GraphQL.Logging +import Hasura.GraphQL.Parser.Column (UnpreparedValue) import Hasura.GraphQL.RemoteServer (execRemoteGQ') -import Hasura.GraphQL.Resolve.Action -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Schema import Hasura.GraphQL.Transport.HTTP.Protocol -import Hasura.GraphQL.Validate.Types +import Hasura.GraphQL.Utils (showName) import Hasura.HTTP -import Hasura.Prelude import Hasura.RQL.Types import Hasura.Server.Utils (RequestId) import Hasura.Server.Version (HasVersion) import Hasura.Session -import qualified Hasura.GraphQL.Context as GC +import qualified Hasura.GraphQL.Context as C +import qualified Hasura.GraphQL.Execute.Inline as EI + import qualified Hasura.GraphQL.Execute.LiveQuery as EL -import qualified Hasura.GraphQL.Execute.Plan as EP +import qualified Hasura.GraphQL.Execute.Mutation as EM +-- import qualified Hasura.GraphQL.Execute.Plan as EP +import qualified Hasura.GraphQL.Execute.Prepare as EPr import qualified Hasura.GraphQL.Execute.Query as EQ -import qualified Hasura.GraphQL.Resolve as GR -import qualified Hasura.GraphQL.Validate as VQ -import qualified Hasura.GraphQL.Validate.SelectionSet as VQ -import qualified Hasura.GraphQL.Validate.Types as VT +import qualified Hasura.GraphQL.Execute.Types as ET + import qualified Hasura.Logging as L import qualified Hasura.Server.Telemetry.Counters as Telem import qualified Hasura.Tracing as Tracing --- The current execution plan of a graphql operation, it is --- currently, either local pg execution or a remote execution --- --- The 'a' is parameterised so this AST can represent --- intermediate passes -data GQExecPlan a - = GExPHasura !a - | GExPRemote !RemoteSchemaInfo !G.TypedOperationDefinition - deriving (Functor, Foldable, Traversable) + +type QueryParts = G.TypedOperationDefinition G.FragmentSpread G.Name -- | Execution context data ExecutionCtx @@ -79,7 +68,7 @@ data ExecutionCtx { _ecxLogger :: !(L.Logger L.Hasura) , _ecxSqlGenCtx :: !SQLGenCtx , _ecxPgExecCtx :: !PGExecCtx - , _ecxPlanCache :: !EP.PlanCache + -- , _ecxPlanCache :: !EP.PlanCache , _ecxSchemaCache :: !SchemaCache , _ecxSchemaCacheVer :: !SchemaCacheVer , _ecxHttpManager :: !HTTP.Manager @@ -90,7 +79,7 @@ data ExecutionCtx -- before a GraphQL query should be allowed to be executed. In OSS, the safety -- check is to check in the query is in the allow list. --- | TODO: Limitation: This parses the query, which is not ideal if we already +-- | TODO (from master): Limitation: This parses the query, which is not ideal if we already -- have the query cached. The parsing happens unnecessary. But getting this to -- either return a plan or parse was tricky and complicated. 
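The `MonadGQLExecutionCheck` class below is written to be liftable through transformer stacks; this hunk provides an instance for `Tracing.TraceT`, and the same pattern extends to other layers. A hedged sketch for a hypothetical `ReaderT` wrapper, mirroring the `TraceT` instance:

```
import Control.Monad.Reader      (ReaderT)
import Control.Monad.Trans.Class (lift)

-- Pass the check through unchanged; only the innermost monad decides.
instance MonadGQLExecutionCheck m => MonadGQLExecutionCheck (ReaderT r m) where
  checkGQLExecution userInfo reqDetails enableAL schemaCache req =
    lift $ checkGQLExecution userInfo reqDetails enableAL schemaCache req
```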
class Monad m => MonadGQLExecutionCheck m where @@ -117,75 +106,83 @@ instance MonadGQLExecutionCheck m => MonadGQLExecutionCheck (Tracing.TraceT m) w checkGQLExecution ui det enableAL sc req = lift $ checkGQLExecution ui det enableAL sc req --- Enforces the current limitation -assertSameLocationNodes - :: (MonadError QErr m) => [VT.TypeLoc] -> m VT.TypeLoc -assertSameLocationNodes typeLocs = - case Set.toList (Set.fromList typeLocs) of - -- this shouldn't happen - [] -> return VT.TLHasuraType - [loc] -> return loc - _ -> throw400 NotSupported msg - where - msg = "cannot mix top level fields from two different graphql servers" - --- TODO: we should fix this function asap --- as this will fail when there is a fragment at the top level -getTopLevelNodes :: G.TypedOperationDefinition -> [G.Name] -getTopLevelNodes opDef = - mapMaybe f $ G._todSelectionSet opDef - where - f = \case - G.SelectionField fld -> Just $ G._fName fld - G.SelectionFragmentSpread _ -> Nothing - G.SelectionInlineFragment _ -> Nothing - -gatherTypeLocs :: GCtx -> [G.Name] -> [VT.TypeLoc] -gatherTypeLocs gCtx nodes = - catMaybes $ flip map nodes $ \node -> - VT._fiLoc <$> Map.lookup node schemaNodes - where - schemaNodes = - let qr = VT._otiFields $ _gQueryRoot gCtx - mr = VT._otiFields <$> _gMutRoot gCtx - in maybe qr (Map.union qr) mr - --- This is for when the graphql query is validated -type ExecPlanPartial = GQExecPlan (GCtx, VQ.RootSelectionSet) - getExecPlanPartial - :: (MonadReusability m, MonadError QErr m) + :: (MonadError QErr m) => UserInfo -> SchemaCache - -> EQ.GraphQLQueryType + -> ET.GraphQLQueryType -> GQLReqParsed - -> m ExecPlanPartial -getExecPlanPartial userInfo sc queryType req = do - let gCtx = case queryType of - EQ.QueryHasura -> getGCtx (_uiBackendOnlyFieldAccess userInfo) sc roleName - EQ.QueryRelay -> fromMaybe GC.emptyGCtx $ Map.lookup roleName $ scRelayGCtxMap sc - - queryParts <- flip runReaderT gCtx $ VQ.getQueryParts req - - let opDef = VQ.qpOpDef queryParts - topLevelNodes = getTopLevelNodes opDef - -- gather TypeLoc of topLevelNodes - typeLocs = gatherTypeLocs gCtx topLevelNodes - - -- see if they are all the same - typeLoc <- assertSameLocationNodes typeLocs - - case typeLoc of - VT.TLHasuraType -> do - rootSelSet <- runReaderT (VQ.validateGQ queryParts) gCtx - pure $ GExPHasura (gCtx, rootSelSet) - VT.TLRemoteType _ rsi -> - pure $ GExPRemote rsi opDef - VT.TLCustom -> - throw500 "unexpected custom type for top level field" + -> m (C.GQLContext, QueryParts) +getExecPlanPartial userInfo sc queryType req = + (getGCtx ,) <$> getQueryParts req where roleName = _uiRole userInfo + contextMap = + case queryType of + ET.QueryHasura -> scGQLContext sc + ET.QueryRelay -> scRelayContext sc + + defaultContext = + case queryType of + ET.QueryHasura -> scUnauthenticatedGQLContext sc + ET.QueryRelay -> scUnauthenticatedRelayContext sc + + getGCtx :: C.GQLContext + getGCtx = + case Map.lookup roleName contextMap of + Nothing -> defaultContext + Just (C.RoleContext frontend backend) -> + case _uiBackendOnlyFieldAccess userInfo of + BOFAAllowed -> fromMaybe frontend backend + BOFADisallowed -> frontend + + -- | Depending on the request parameters, fetch the correct typed operation + -- definition from the GraphQL query + getQueryParts + :: MonadError QErr m + => GQLReqParsed + -> m QueryParts + getQueryParts (GQLReq opNameM q _varValsM) = do + let (selSets, opDefs, _fragDefsL) = G.partitionExDefs $ unGQLExecDoc q + case (opNameM, selSets, opDefs) of + (Just opName, [], _) -> do + let n = _unOperationName 
opName + opDefM = find (\opDef -> G._todName opDef == Just n) opDefs + onNothing opDefM $ throw400 ValidationFailed $ + "no such operation found in the document: " <> showName n + (Just _, _, _) -> + throw400 ValidationFailed $ "operationName cannot be used when " <> + "an anonymous operation exists in the document" + (Nothing, [selSet], []) -> + return $ G.TypedOperationDefinition G.OperationTypeQuery Nothing [] [] selSet + (Nothing, [], [opDef]) -> + return opDef + (Nothing, _, _) -> + throw400 ValidationFailed $ "exactly one operation has to be present " <> + "in the document when operationName is not specified" + +-- The graphql query is resolved into a sequence of execution operations +data ResolvedExecutionPlan m + = QueryExecutionPlan + (EPr.ExecutionPlan (m EncJSON, EQ.GeneratedSqlMap) EPr.RemoteCall (G.Name, J.Value)) [C.QueryRootField UnpreparedValue] + -- ^ query execution; remote schemas and introspection possible + | MutationExecutionPlan (EPr.ExecutionPlan (m EncJSON, HTTP.ResponseHeaders) EPr.RemoteCall (G.Name, J.Value)) + -- ^ mutation execution; only __typename introspection supported + | SubscriptionExecutionPlan EL.LiveQueryPlan + -- ^ live query execution; remote schemas and introspection not supported + +validateSubscriptionRootField + :: MonadError QErr m + => C.QueryRootField v -> m (C.SubscriptionRootField v) +validateSubscriptionRootField = \case + C.RFDB x -> pure $ C.RFDB x + C.RFAction (C.AQAsync s) -> pure $ C.RFAction s + C.RFAction (C.AQQuery _) -> throw400 NotSupported "query actions cannot be run as a subscription" + C.RFRemote _ -> throw400 NotSupported "subscription to remote server is not supported" + C.RFRaw _ -> throw400 NotSupported "Introspection not supported over subscriptions" + + checkQueryInAllowlist :: (MonadError QErr m) => Bool -> UserInfo -> GQLReqParsed -> SchemaCache -> m () checkQueryInAllowlist enableAL userInfo req sc = @@ -193,24 +190,18 @@ checkQueryInAllowlist enableAL userInfo req sc = -- check if query is in allowlist when (enableAL && (_uiRole userInfo /= adminRoleName)) $ do let notInAllowlist = - not $ VQ.isQueryInAllowlist (_grQuery req) (scAllowlist sc) - when notInAllowlist $ modifyQErr modErr $ throwVE "query is not allowed" + not $ isQueryInAllowlist (_grQuery req) (scAllowlist sc) + when notInAllowlist $ modifyQErr modErr $ throw400 ValidationFailed "query is not allowed" where modErr e = let msg = "query is not in any of the allowlists" in e{qeInternal = Just $ J.object [ "message" J..= J.String msg]} - --- An execution operation, in case of queries and mutations it is just a --- transaction to be executed -data ExecOp m - = ExOpQuery !(m EncJSON) !(Maybe EQ.GeneratedSqlMap) ![GR.QueryRootFldUnresolved] - | ExOpMutation !HTTP.ResponseHeaders !(m EncJSON) - | ExOpSubs !EL.LiveQueryPlan - --- The graphql query is resolved into an execution operation -type GQExecPlanResolved m = GQExecPlan (ExecOp m) + isQueryInAllowlist q = HS.member gqlQuery + where + gqlQuery = GQLQuery $ G.ExecutableDocument $ stripTypenames $ + unGQLExecDoc q getResolvedExecPlan :: forall m tx @@ -225,207 +216,104 @@ getResolvedExecPlan => Env.Environment -> L.Logger L.Hasura -> PGExecCtx - -> EP.PlanCache + -- -> EP.PlanCache -> UserInfo -> SQLGenCtx -> SchemaCache -> SchemaCacheVer - -> EQ.GraphQLQueryType + -> ET.GraphQLQueryType -> HTTP.Manager -> [HTTP.Header] -> (GQLReqUnparsed, GQLReqParsed) - -> m (Telem.CacheHit, GQExecPlanResolved tx) -getResolvedExecPlan env logger pgExecCtx planCache userInfo sqlGenCtx - sc scVer queryType httpManager 
reqHeaders (reqUnparsed, reqParsed) = do + -> m (Telem.CacheHit, ResolvedExecutionPlan tx) +getResolvedExecPlan env logger pgExecCtx {- planCache-} userInfo sqlGenCtx + sc scVer queryType httpManager reqHeaders (reqUnparsed, reqParsed) = -- do - planM <- liftIO $ EP.getPlan scVer (_uiRole userInfo) operationNameM queryStr - queryType planCache - let usrVars = _uiSession userInfo - case planM of - -- plans are only for queries and subscriptions - Just plan -> (Telem.Hit,) . GExPHasura <$> case plan of - EP.RPQuery queryPlan asts -> do - (tx, genSql) <- EQ.queryOpFromPlan env httpManager reqHeaders userInfo queryVars queryPlan - pure $ ExOpQuery tx (Just genSql) asts - EP.RPSubs subsPlan -> - ExOpSubs <$> EL.reuseLiveQueryPlan pgExecCtx usrVars queryVars subsPlan - Nothing -> (Telem.Miss,) <$> noExistingPlan + -- See Note [Temporarily disabling query plan caching] + -- planM <- liftIO $ EP.getPlan scVer (_uiRole userInfo) opNameM queryStr + -- queryType planCache +-- case planM of +-- -- plans are only for queries and subscriptions +-- Just plan -> (Telem.Hit,) <$> case plan of +-- EP.RPQuery queryPlan -> do +-- -- (tx, genSql) <- EQ.queryOpFromPlan env httpManager reqHeaders userInfo queryVars queryPlan +-- return $ QueryExecutionPlan _ -- tx (Just genSql) +-- EP.RPSubs subsPlan -> +-- return $ SubscriptionExecutionPlan _ -- <$> EL.reuseLiveQueryPlan pgExecCtx usrVars queryVars subsPlan +-- Nothing -> (Telem.Miss,) <$> noExistingPlan + (Telem.Miss,) <$> noExistingPlan where - GQLReq operationNameM queryStr queryVars = reqUnparsed - addPlanToCache plan = - liftIO $ EP.addPlan scVer (_uiRole userInfo) - operationNameM queryStr plan queryType planCache - - noExistingPlan :: m (GQExecPlanResolved tx) + GQLReq opNameM queryStr queryVars = reqUnparsed + -- addPlanToCache plan = + -- liftIO $ EP.addPlan scVer (userRole userInfo) + -- opNameM queryStr plan planCache + noExistingPlan :: m (ResolvedExecutionPlan tx) noExistingPlan = do + -- GraphQL requests may incorporate fragments which insert a pre-defined + -- part of a GraphQL query. Here we make sure to remember those + -- pre-defined sections, so that when we encounter a fragment spread + -- later, we can inline it instead. -- req <- toParsed reqUnparsed - (partialExecPlan, queryReusability) <- runReusabilityT $ - getExecPlanPartial userInfo sc queryType reqParsed - forM partialExecPlan $ \(gCtx, rootSelSet) -> - case rootSelSet of - VQ.RMutation selSet -> do - (tx, respHeaders) <- getMutOp env logger gCtx sqlGenCtx userInfo httpManager reqHeaders selSet - pure $ ExOpMutation respHeaders tx - VQ.RQuery selSet -> do - (queryTx, plan, genSql, asts) <- getQueryOp env logger gCtx sqlGenCtx httpManager reqHeaders userInfo - queryReusability (allowQueryActionExecuter httpManager reqHeaders) selSet - traverse_ (addPlanToCache . flip EP.RPQuery asts) plan - return $ ExOpQuery queryTx (Just genSql) asts - VQ.RSubscription fields -> do - (lqOp, plan) <- getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo queryReusability - (restrictActionExecuter "query actions cannot be run as a subscription") fields - traverse_ (addPlanToCache . 
EP.RPSubs) plan - return $ ExOpSubs lqOp + let takeFragment = \case G.ExecutableDefinitionFragment f -> Just f; _ -> Nothing + fragments = + mapMaybe takeFragment $ unGQLExecDoc $ _grQuery reqParsed + (gCtx, queryParts) <- getExecPlanPartial userInfo sc queryType reqParsed + case queryParts of + G.TypedOperationDefinition G.OperationTypeQuery _ varDefs _ selSet -> do + -- (Here the above fragment inlining is actually executed.) + inlinedSelSet <- EI.inlineSelectionSet fragments selSet + -- (unpreparedQueries, _) <- + -- E.parseGraphQLQuery gCtx varDefs + (execPlan,asts) {-, plan-} <- + EQ.convertQuerySelSet env logger gCtx userInfo httpManager reqHeaders inlinedSelSet varDefs (_grVariables reqUnparsed) + -- See Note [Temporarily disabling query plan caching] + -- traverse_ (addPlanToCache . EP.RPQuery) plan + return $ QueryExecutionPlan execPlan asts + G.TypedOperationDefinition G.OperationTypeMutation _ varDefs _ selSet -> do + -- (Here the above fragment inlining is actually executed.) + inlinedSelSet <- EI.inlineSelectionSet fragments selSet + queryTx <- EM.convertMutationSelectionSet env logger gCtx sqlGenCtx userInfo httpManager reqHeaders + inlinedSelSet varDefs (_grVariables reqUnparsed) + -- See Note [Temporarily disabling query plan caching] + -- traverse_ (addPlanToCache . EP.RPQuery) plan + return $ MutationExecutionPlan queryTx + G.TypedOperationDefinition G.OperationTypeSubscription _ varDefs directives selSet -> do + -- (Here the above fragment inlining is actually executed.) + inlinedSelSet <- EI.inlineSelectionSet fragments selSet + -- Parse as query to check correctness + (unpreparedAST, _reusability) <- + EQ.parseGraphQLQuery gCtx varDefs (_grVariables reqUnparsed) inlinedSelSet + -- A subscription should have exactly one root field + -- As an internal testing feature, we support subscribing to multiple + -- root fields in a subcription. First, we check if the corresponding directive + -- (@_multiple_top_level_fields) is set. 
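The directive check in the case expression that follows can also be read as a standalone predicate; `allowsMultipleRootFields` is a hypothetical name wrapped around the exact expression used below:

```
import qualified Language.GraphQL.Draft.Syntax as G

-- True when the internal @_multiple_top_level_fields directive is present,
-- which relaxes the single-root-field rule for subscriptions (an internal
-- testing feature, per the comment above).
allowsMultipleRootFields :: Eq var => [G.Directive var] -> Bool
allowsMultipleRootFields directives =
  G.Directive $$(G.litName "_multiple_top_level_fields") mempty `elem` directives
```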
+ case inlinedSelSet of + [] -> throw500 "empty selset for subscription" + [_] -> pure () + (_:rst) -> + let multipleAllowed = + G.Directive $$(G.litName "_multiple_top_level_fields") mempty `elem` directives + in + unless (multipleAllowed || null rst) $ + throw400 ValidationFailed "subscriptions must select one top level field" + validSubscriptionAST <- for unpreparedAST validateSubscriptionRootField + (lqOp, _plan) <- EL.buildLiveQueryPlan pgExecCtx userInfo validSubscriptionAST + -- getSubsOpM pgExecCtx userInfo inlinedSelSet + return $ SubscriptionExecutionPlan lqOp --- Monad for resolving a hasura query/mutation -type E m = - ReaderT ( UserInfo - , QueryCtxMap - , MutationCtxMap - , TypeMap - , FieldMap - , OrdByCtx - , InsCtxMap - , SQLGenCtx - , L.Logger L.Hasura - ) (ExceptT QErr m) - -runE - :: (MonadError QErr m) - => L.Logger L.Hasura - -> GCtx - -> SQLGenCtx - -> UserInfo - -> E m a - -> m a -runE logger ctx sqlGenCtx userInfo action = do - res <- runExceptT $ runReaderT action - (userInfo, queryCtxMap, mutationCtxMap, typeMap, fldMap, ordByCtx, insCtxMap, sqlGenCtx, logger) - either throwError return res - where - queryCtxMap = _gQueryCtxMap ctx - mutationCtxMap = _gMutationCtxMap ctx - typeMap = _gTypes ctx - fldMap = _gFields ctx - ordByCtx = _gOrdByCtx ctx - insCtxMap = _gInsCtxMap ctx - -getQueryOp - :: ( HasVersion - , MonadError QErr m - , MonadIO m - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> L.Logger L.Hasura - -> GCtx - -> SQLGenCtx - -> HTTP.Manager - -> [HTTP.Header] - -> UserInfo - -> QueryReusability - -> QueryActionExecuter - -> VQ.ObjectSelectionSet - -> m (tx EncJSON, Maybe EQ.ReusableQueryPlan, EQ.GeneratedSqlMap, [GR.QueryRootFldUnresolved]) -getQueryOp env logger gCtx sqlGenCtx manager reqHdrs userInfo queryReusability actionExecuter selSet = - runE logger gCtx sqlGenCtx userInfo $ EQ.convertQuerySelSet env manager reqHdrs queryReusability selSet actionExecuter - -resolveMutSelSet - :: ( HasVersion - , MonadError QErr m - , MonadReader r m - , Has UserInfo r - , Has MutationCtxMap r - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has InsCtxMap r - , Has HTTP.Manager r - , Has [HTTP.Header] r - , Has (L.Logger L.Hasura) r - , MonadIO m - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> VQ.ObjectSelectionSet - -> m (tx EncJSON, HTTP.ResponseHeaders) -resolveMutSelSet env fields = do - aliasedTxs <- traverseObjectSelectionSet fields $ \fld -> - case VQ._fName fld of - "__typename" -> return (return $ encJFromJValue mutationRootNamedType, []) - _ -> evalReusabilityT $ GR.mutFldToTx env fld - - -- combines all transactions into a single transaction - return (toSingleTx aliasedTxs, concatMap (snd . 
snd) aliasedTxs) - where - -- A list of aliased transactions for eg - -- [("f1", Tx r1), ("f2", Tx r2)] - -- are converted into a single transaction as follows - -- Tx {"f1": r1, "f2": r2} - -- toSingleTx :: [(Text, LazyRespTx)] -> LazyRespTx - toSingleTx aliasedTxs = - fmap encJFromAssocList $ forM aliasedTxs $ \(al, (tx, _)) -> (,) al <$> tx - -getMutOp - :: ( HasVersion - , MonadError QErr m - , MonadIO m - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> L.Logger L.Hasura - -> GCtx - -> SQLGenCtx - -> UserInfo - -> HTTP.Manager - -> [HTTP.Header] - -> VQ.ObjectSelectionSet - -> m (tx EncJSON, HTTP.ResponseHeaders) -getMutOp env logger ctx sqlGenCtx userInfo manager reqHeaders selSet = - peelReaderT $ resolveMutSelSet env selSet - where - peelReaderT action = - runReaderT action - ( userInfo, queryCtxMap, mutationCtxMap - , typeMap, fldMap, ordByCtx, insCtxMap, sqlGenCtx - , manager, reqHeaders, logger - ) - where - queryCtxMap = _gQueryCtxMap ctx - mutationCtxMap = _gMutationCtxMap ctx - typeMap = _gTypes ctx - fldMap = _gFields ctx - ordByCtx = _gOrdByCtx ctx - insCtxMap = _gInsCtxMap ctx - -getSubsOp - :: ( MonadError QErr m - , MonadIO m - , HasVersion - , Tracing.MonadTrace m - ) - => Env.Environment - -> L.Logger L.Hasura - -> PGExecCtx - -> GCtx - -> SQLGenCtx - -> UserInfo - -> QueryReusability - -> QueryActionExecuter - -> VQ.ObjectSelectionSet - -> m (EL.LiveQueryPlan, Maybe EL.ReusableLiveQueryPlan) -getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo queryReusability actionExecuter = - runE logger gCtx sqlGenCtx userInfo . - EL.buildLiveQueryPlan env pgExecCtx queryReusability actionExecuter + -- forM partialExecPlan $ \(gCtx, rootSelSet) -> + -- case rootSelSet of + -- VQ.RMutation selSet -> do + -- (tx, respHeaders) <- getMutOp gCtx sqlGenCtx userInfo httpManager reqHeaders selSet + -- pure $ ExOpMutation respHeaders tx + -- VQ.RQuery selSet -> do + -- (queryTx, plan, genSql) <- getQueryOp gCtx sqlGenCtx userInfo queryReusability (allowQueryActionExecuter httpManager reqHeaders) selSet + -- traverse_ (addPlanToCache . EP.RPQuery) plan + -- return $ ExOpQuery queryTx (Just genSql) + -- VQ.RSubscription fld -> do + -- (lqOp, plan) <- getSubsOp pgExecCtx gCtx sqlGenCtx userInfo queryReusability (restrictActionExecuter "query actions cannot be run as a subscription") fld + -- traverse_ (addPlanToCache . EP.RPSubs) plan + -- return $ ExOpSubs lqOp execRemoteGQ :: ( HasVersion @@ -441,13 +329,14 @@ execRemoteGQ -> [HTTP.Header] -> GQLReqUnparsed -> RemoteSchemaInfo - -> G.OperationType + -> G.TypedOperationDefinition G.NoFragments G.Name -> m (DiffTime, HttpResponse EncJSON) -- ^ Also returns time spent in http request, for telemetry. 
-execRemoteGQ env reqId userInfo reqHdrs q rsi opType = do +execRemoteGQ env reqId userInfo reqHdrs q rsi opDef = do execCtx <- ask let logger = _ecxLogger execCtx manager = _ecxHttpManager execCtx + opType = G._todType opDef logQueryLog logger q Nothing reqId (time, respHdrs, resp) <- execRemoteGQ' env manager userInfo reqHdrs q rsi opType let !httpResp = HttpResponse (encJFromLBS resp) respHdrs diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Action.hs b/server/src-lib/Hasura/GraphQL/Execute/Action.hs similarity index 66% rename from server/src-lib/Hasura/GraphQL/Resolve/Action.hs rename to server/src-lib/Hasura/GraphQL/Execute/Action.hs index 5eea760108d..d9c23475c14 100644 --- a/server/src-lib/Hasura/GraphQL/Resolve/Action.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/Action.hs @@ -1,35 +1,21 @@ --- This pragma is needed for allowQueryActionExecuter -{-# OPTIONS_GHC -fno-warn-redundant-constraints #-} - -module Hasura.GraphQL.Resolve.Action - ( resolveActionMutation +module Hasura.GraphQL.Execute.Action + ( ActionExecuteTx + , ActionExecuteResult(..) , resolveAsyncActionQuery , asyncActionsProcessor - , resolveActionQuery - , mkJsonAggSelect - , QueryActionExecuter - , allowQueryActionExecuter - , restrictActionExecuter + , resolveActionExecution + , resolveActionMutationAsync ) where import Hasura.Prelude -import Control.Concurrent (threadDelay) -import Control.Exception (try) -import Control.Lens -import Control.Monad.Trans.Control (MonadBaseControl) -import Data.Has -import Data.Int (Int64) -import Data.IORef - -import qualified Control.Concurrent.Async.Lifted.Safe as LA -import qualified Data.Environment as Env import qualified Data.Aeson as J import qualified Data.Aeson.Casing as J import qualified Data.Aeson.TH as J import qualified Data.ByteString.Lazy as BL import qualified Data.CaseInsensitive as CI import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set import qualified Data.Text as T import qualified Data.UUID as UUID import qualified Database.PG.Query as Q @@ -38,19 +24,29 @@ import qualified Network.HTTP.Client as HTTP import qualified Network.HTTP.Types as HTTP import qualified Network.Wreq as Wreq -import qualified Hasura.GraphQL.Resolve.Select as GRS +import Control.Concurrent (threadDelay) +import Control.Exception (try) +import Control.Lens +import Data.Has +import Data.Int (Int64) +import Data.IORef + import qualified Hasura.RQL.DML.RemoteJoin as RJ import qualified Hasura.RQL.DML.Select as RS -import qualified Hasura.Tracing as Tracing +-- import qualified Hasura.GraphQL.Resolve.Select as GRS +import Control.Monad.Trans.Control (MonadBaseControl) + +import qualified Control.Concurrent.Async.Lifted.Safe as LA +import qualified Data.Environment as Env import qualified Hasura.Logging as L +import qualified Hasura.Tracing as Tracing import Hasura.EncJSON -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Resolve.InputValue -import Hasura.GraphQL.Resolve.Select (processTableSelectionSet) -import Hasura.GraphQL.Validate.SelectionSet +import Hasura.GraphQL.Execute.Prepare +import Hasura.GraphQL.Parser hiding (column) +import Hasura.GraphQL.Utils (showNames) import Hasura.HTTP -import Hasura.RQL.DDL.Headers (makeHeadersFromConf, toHeadersConf) +import Hasura.RQL.DDL.Headers import Hasura.RQL.DDL.Schema.Cache import Hasura.RQL.DML.Select (asSingleRowJsonResp) import Hasura.RQL.Types @@ -61,6 +57,9 @@ import Hasura.Session import Hasura.SQL.Types import Hasura.SQL.Value (PGScalarValue (..), toTxtValue) +type ActionExecuteTx = + forall 
tx. (MonadIO tx, MonadTx tx, Tracing.MonadTrace tx) => tx EncJSON + newtype ActionContext = ActionContext {_acName :: ActionName} deriving (Show, Eq) @@ -82,19 +81,19 @@ data ActionWebhookErrorResponse $(J.deriveJSON (J.aesonDrop 5 J.snakeCase) ''ActionWebhookErrorResponse) data ActionWebhookResponse - = AWRArray ![J.Object] - | AWRObject !J.Object + = AWRArray ![Map.HashMap G.Name J.Value] + | AWRObject !(Map.HashMap G.Name J.Value) deriving (Show, Eq) instance J.FromJSON ActionWebhookResponse where parseJSON v = case v of J.Array{} -> AWRArray <$> J.parseJSON v - J.Object o -> pure $ AWRObject o + J.Object{} -> AWRObject <$> J.parseJSON v _ -> fail $ "expecting object or array of objects for action webhook response" instance J.ToJSON ActionWebhookResponse where toJSON (AWRArray objects) = J.toJSON objects - toJSON (AWRObject object) = J.toJSON object + toJSON (AWRObject obj) = J.toJSON obj data ActionRequestInfo = ActionRequestInfo @@ -131,142 +130,55 @@ $(J.deriveJSON (J.aesonDrop 4 J.snakeCase){J.omitNothingFields=True} ''ActionHan instance L.ToEngineLog ActionHandlerLog L.Hasura where toEngineLog ahl = (L.LevelInfo, L.ELTActionHandler, J.toJSON ahl) -resolveActionMutation - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , MonadIO m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has HTTP.Manager r - , Has [HTTP.Header] r - , Has (L.Logger L.Hasura) r - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> Field - -> ActionMutationExecutionContext - -> UserInfo - -> m (tx EncJSON, HTTP.ResponseHeaders) -resolveActionMutation env field executionContext userInfo = - case executionContext of - ActionMutationSyncWebhook executionContextSync -> - resolveActionMutationSync env field executionContextSync userInfo - ActionMutationAsync -> - (,[]) <$> resolveActionMutationAsync field userInfo + +data ActionExecuteResult + = ActionExecuteResult + { _aerTransaction :: !ActionExecuteTx + , _aerHeaders :: !HTTP.ResponseHeaders + } -- | Synchronously execute webhook handler and resolve response to action "output" -resolveActionMutationSync +resolveActionExecution :: ( HasVersion - , MonadReusability m , MonadError QErr m - , MonadReader r m , MonadIO m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has HTTP.Manager r - , Has [HTTP.Header] r - , Has (L.Logger L.Hasura) r , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx ) => Env.Environment - -> Field - -> ActionExecutionContext + -> L.Logger L.Hasura -> UserInfo - -> m (tx EncJSON, HTTP.ResponseHeaders) -resolveActionMutationSync env field executionContext userInfo = do - let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field - actionContext = ActionContext actionName - sessionVariables = _uiSession userInfo - handlerPayload = ActionWebhookPayload actionContext sessionVariables inputArgs - manager <- asks getter - reqHeaders <- asks getter - (webhookRes, respHeaders) <- callWebhook env manager outputType outputFields reqHeaders confHeaders + -> AnnActionExecution UnpreparedValue + -> ActionExecContext + -> m ActionExecuteResult +resolveActionExecution env logger userInfo annAction execContext = do + let actionContext = ActionContext actionName + handlerPayload = ActionWebhookPayload actionContext sessionVariables inputPayload + (webhookRes, respHeaders) <- flip runReaderT logger $ callWebhook env manager outputType outputFields reqHeaders confHeaders forwardClientHeaders resolvedWebhook 
handlerPayload - let webhookResponseExpression = RS.AEInput $ UVSQL $ + let webhookResponseExpression = RS.AEInput $ UVLiteral $ toTxtValue $ WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON webhookRes - selSet <- asObjectSelectionSet $ _fSelSet field - selectAstUnresolved <- - processOutputSelectionSet webhookResponseExpression outputType definitionList - (_fType field) selSet - astResolved <- RS.traverseAnnSimpleSelect resolveValTxt selectAstUnresolved - let (astResolvedWithoutRemoteJoins,maybeRemoteJoins) = RJ.getRemoteJoins astResolved - jsonAggType = mkJsonAggSelect outputType - return $ (,respHeaders) $ - case maybeRemoteJoins of - Just remoteJoins -> - let query = Q.fromBuilder $ toSQL $ - RS.mkSQLSelect jsonAggType astResolvedWithoutRemoteJoins - in RJ.executeQueryWithRemoteJoins env manager reqHeaders userInfo query [] remoteJoins - Nothing -> - liftTx $ asSingleRowJsonResp (Q.fromBuilder $ toSQL $ RS.mkSQLSelect jsonAggType astResolved) [] - + selectAstUnresolved = processOutputSelectionSet webhookResponseExpression + outputType definitionList annFields stringifyNum + (astResolved, _expectedVariables) <- flip runStateT Set.empty $ RS.traverseAnnSimpleSelect prepareWithoutPlan selectAstUnresolved + return $ ActionExecuteResult (executeAction astResolved) respHeaders where - ActionExecutionContext actionName outputType outputFields definitionList resolvedWebhook confHeaders - forwardClientHeaders = executionContext + AnnActionExecution actionName outputType annFields inputPayload + outputFields definitionList resolvedWebhook confHeaders + forwardClientHeaders stringifyNum = annAction + ActionExecContext manager reqHeaders sessionVariables = execContext --- QueryActionExecuter is a type for a higher function, this is being used --- to allow or disallow where a query action can be executed. We would like --- to explicitly control where a query action can be run. --- Example: We do not explain a query action, so we use the `restrictActionExecuter` --- to prevent resolving the action query. -type QueryActionExecuter = - forall m a. 
(MonadError QErr m) - => (HTTP.Manager -> [HTTP.Header] -> m a) - -> m a -allowQueryActionExecuter :: HTTP.Manager -> [HTTP.Header] -> QueryActionExecuter -allowQueryActionExecuter manager reqHeaders actionResolver = - actionResolver manager reqHeaders - -restrictActionExecuter :: Text -> QueryActionExecuter -restrictActionExecuter errMsg _ = - throw400 NotSupported errMsg - -resolveActionQuery - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , MonadIO m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has (L.Logger L.Hasura) r - , Tracing.MonadTrace m - ) - => Env.Environment - -> Field - -> ActionExecutionContext - -> SessionVariables - -> HTTP.Manager - -> [HTTP.Header] - -> m (RS.AnnSimpleSelG UnresolvedVal) -resolveActionQuery env field executionContext sessionVariables httpManager reqHeaders = do - let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field - actionContext = ActionContext actionName - handlerPayload = ActionWebhookPayload actionContext sessionVariables inputArgs - (webhookRes, _) <- callWebhook env httpManager outputType outputFields reqHeaders confHeaders - forwardClientHeaders resolvedWebhook handlerPayload - let webhookResponseExpression = RS.AEInput $ UVSQL $ - toTxtValue $ WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON webhookRes - selSet <- asObjectSelectionSet $ _fSelSet field - selectAstUnresolved <- - processOutputSelectionSet webhookResponseExpression outputType definitionList - (_fType field) selSet - return selectAstUnresolved - where - ActionExecutionContext actionName outputType outputFields definitionList resolvedWebhook confHeaders - forwardClientHeaders = executionContext + executeAction :: RS.AnnSimpleSel -> ActionExecuteTx + executeAction astResolved = do + let (astResolvedWithoutRemoteJoins,maybeRemoteJoins) = RJ.getRemoteJoins astResolved + jsonAggType = mkJsonAggSelect outputType + case maybeRemoteJoins of + Just remoteJoins -> + let query = Q.fromBuilder $ toSQL $ + RS.mkSQLSelect jsonAggType astResolvedWithoutRemoteJoins + in RJ.executeQueryWithRemoteJoins env manager reqHeaders userInfo query [] remoteJoins + Nothing -> + liftTx $ asSingleRowJsonResp (Q.fromBuilder $ toSQL $ RS.mkSQLSelect jsonAggType astResolved) [] {- Note: [Async action architecture] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -284,17 +196,13 @@ table provides the action response. See Note [Resolving async action query/subsc -- | Resolve asynchronous action mutation which returns only the action uuid resolveActionMutationAsync :: ( MonadError QErr m - , MonadReader r m - , Has [HTTP.Header] r , MonadTx tx ) - => Field - -> UserInfo + => AnnActionMutationAsync + -> [HTTP.Header] + -> SessionVariables -> m (tx EncJSON) -resolveActionMutationAsync field userInfo = do - let sessionVariables = _uiSession userInfo - reqHeaders <- asks getter - let inputArgs = J.toJSON $ fmap annInpValueToJson $ _fArguments field +resolveActionMutationAsync annAction reqHeaders sessionVariables = do pure $ liftTx do actionId <- runIdentity . Q.getRow <$> Q.withQE defaultTxErrorHandler [Q.sql| INSERT INTO @@ -308,7 +216,7 @@ resolveActionMutationAsync field userInfo = do pure $ encJFromJValue $ UUID.toText actionId where - actionName = G.unName $ _fName field + AnnActionMutationAsync actionName inputArgs = annAction toHeadersMap = Map.fromList . map ((bsToTxt . CI.original) *** bsToTxt) {- Note: [Resolving async action query/subscription] @@ -322,67 +230,51 @@ action's type. 
Here, we treat the "output" field as a computed field to hdb_acti `jsonb_to_record` as custom SQL function. -} + +-- TODO: Add tracing here? Avoided now because currently the function is pure resolveAsyncActionQuery - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => UserInfo - -> ActionSelectOpContext - -> Field - -> m GRS.AnnSimpleSelect -resolveAsyncActionQuery userInfo selectOpCtx field = do - actionId <- withArg (_fArguments field) "id" parseActionId - stringifyNumerics <- stringifyNum <$> asks getter + :: UserInfo + -> AnnActionAsyncQuery UnpreparedValue + -> RS.AnnSimpleSelG UnpreparedValue +resolveAsyncActionQuery userInfo annAction = + let annotatedFields = asyncFields <&> second \case + AsyncTypename t -> RS.AFExpression t + AsyncOutput annFields -> + -- See Note [Resolving async action query/subscription] + let inputTableArgument = RS.AETableRow $ Just $ Iden "response_payload" + jsonAggSelect = mkJsonAggSelect outputType + in RS.AFComputedField $ RS.CFSTable jsonAggSelect $ + processOutputSelectionSet inputTableArgument outputType + definitionList annFields stringifyNumerics - selSet <- asObjectSelectionSet $ _fSelSet field + AsyncId -> mkAnnFldFromPGCol "id" PGUUID + AsyncCreatedAt -> mkAnnFldFromPGCol "created_at" PGTimeStampTZ + AsyncErrors -> mkAnnFldFromPGCol "errors" PGJSONB - annotatedFields <- fmap (map (first FieldName)) $ traverseObjectSelectionSet selSet $ \fld -> - case _fName fld of - "__typename" -> return $ RS.AFExpression $ G.unName $ G.unNamedType $ _fType field - "output" -> do - -- See Note [Resolving async action query/subscription] - let inputTableArgument = RS.AETableRow $ Just $ Iden "response_payload" - ActionSelectOpContext outputType definitionList = selectOpCtx - jsonAggSelect = mkJsonAggSelect outputType - fldSelSet <- asObjectSelectionSet $ _fSelSet fld - (RS.AFComputedField . 
RS.CFSTable jsonAggSelect) - <$> processOutputSelectionSet inputTableArgument outputType - definitionList (_fType fld) fldSelSet - - -- The metadata columns - "id" -> return $ mkAnnFieldFromPGCol "id" PGUUID - "created_at" -> return $ mkAnnFieldFromPGCol "created_at" PGTimeStampTZ - "errors" -> return $ mkAnnFieldFromPGCol "errors" PGJSONB - G.Name t -> throw500 $ "unexpected field in actions' httpResponse : " <> t - - let tableFromExp = RS.FromTable actionLogTable + tableFromExp = RS.FromTable actionLogTable tableArguments = RS.noSelectArgs - { RS._saWhere = Just $ mkTableBoolExpression actionId} + { RS._saWhere = Just tableBoolExpression} tablePermissions = RS.TablePerm annBoolExpTrue Nothing - selectAstUnresolved = RS.AnnSelectG annotatedFields tableFromExp tablePermissions - tableArguments stringifyNumerics - return selectAstUnresolved + + in RS.AnnSelectG annotatedFields tableFromExp tablePermissions + tableArguments stringifyNumerics where + AnnActionAsyncQuery actionName actionId outputType asyncFields definitionList stringifyNumerics = annAction actionLogTable = QualifiedObject (SchemaName "hdb_catalog") (TableName "hdb_action_log") - -- TODO:- Avoid using PGColumnInfo - mkAnnFieldFromPGCol column columnType = + -- TODO (from master):- Avoid using PGColumnInfo + mkAnnFldFromPGCol column' columnType = flip RS.mkAnnColumnField Nothing $ - PGColumnInfo (unsafePGCol column) (G.Name column) 0 (PGColumnScalar columnType) True Nothing + PGColumnInfo (unsafePGCol column') (G.unsafeMkName column') 0 (PGColumnScalar columnType) True Nothing - parseActionId annInpValue = mkParameterizablePGValue <$> asPGColumnValue annInpValue - - mkTableBoolExpression actionId = - let actionIdColumnInfo = PGColumnInfo (unsafePGCol "id") "id" 0 (PGColumnScalar PGUUID) False Nothing + tableBoolExpression = + let actionIdColumnInfo = PGColumnInfo (unsafePGCol "id") $$(G.litName "id") + 0 (PGColumnScalar PGUUID) False Nothing actionIdColumnEq = BoolFld $ AVCol actionIdColumnInfo [AEQ True actionId] - sessionVarsColumnInfo = PGColumnInfo (unsafePGCol "session_variables") "session_variables" + sessionVarsColumnInfo = PGColumnInfo (unsafePGCol "session_variables") $$(G.litName "session_variables") 0 (PGColumnScalar PGJSONB) False Nothing - sessionVarValue = UVPG $ AnnPGVal Nothing False $ WithScalarType PGJSONB - $ PGValJSONB $ Q.JSONB $ J.toJSON $ _uiSession userInfo + sessionVarValue = flip UVParameter Nothing $ PGColumnValue (PGColumnScalar PGJSONB) $ + WithScalarType PGJSONB $ PGValJSONB $ Q.JSONB $ J.toJSON $ _uiSession userInfo sessionVarsColumnEq = BoolFld $ AVCol sessionVarsColumnInfo [AEQ True sessionVarValue] -- For non-admin roles, accessing an async action's response should be allowed only for the user @@ -403,13 +295,13 @@ data ActionLogItem -- | Process async actions from hdb_catalog.hdb_action_log table. This functions is executed in a background thread. -- See Note [Async action architecture] above asyncActionsProcessor - :: forall m void . - ( HasVersion - , MonadIO m - , MonadBaseControl IO m - , LA.Forall (LA.Pure m) - , Tracing.HasReporter m - ) + :: forall m void + . 
( HasVersion + , MonadIO m + , MonadBaseControl IO m + , LA.Forall (LA.Pure m) + , Tracing.HasReporter m + ) => Env.Environment -> L.Logger L.Hasura -> IORef (RebuildableSchemaCache Run, SchemaCacheVer) @@ -443,8 +335,8 @@ asyncActionsProcessor env logger cacheRef pgPool httpManager = forever $ do actionContext = ActionContext actionName eitherRes <- runExceptT $ flip runReaderT logger $ callWebhook env httpManager outputType outputFields reqHeaders confHeaders - forwardClientHeaders webhookUrl $ - ActionWebhookPayload actionContext sessionVariables inputPayload + forwardClientHeaders webhookUrl $ + ActionWebhookPayload actionContext sessionVariables inputPayload liftIO $ case eitherRes of Left e -> setError actionId e Right (responsePayload, _) -> setCompleted actionId $ J.toJSON responsePayload @@ -529,7 +421,7 @@ callWebhook env manager outputType outputFields reqHeaders confHeaders requestBody = J.encode postPayload requestBodySize = BL.length requestBody url = unResolvedWebhook resolvedWebhook - httpResponse <- do + httpResponse <- do initReq <- liftIO $ HTTP.parseRequest (T.unpack url) let req = initReq { HTTP.method = "POST" , HTTP.requestHeaders = addDefaultHeaders hdrs @@ -602,13 +494,13 @@ callWebhook env manager outputType outputFields reqHeaders confHeaders -- Webhook response object should conform to action output fields validateResponseObject obj = do -- Fields not specified in the output type shouldn't be present in the response - let extraFields = filter (not . flip Map.member outputFields) $ map G.Name $ Map.keys obj + let extraFields = filter (not . flip Map.member outputFields) $ Map.keys obj when (not $ null extraFields) $ throwUnexpected $ "unexpected fields in webhook response: " <> showNames extraFields void $ flip Map.traverseWithKey outputFields $ \fieldName fieldTy -> -- When field is non-nullable, it has to present in the response with no null value - when (not $ G.isNullable fieldTy) $ case Map.lookup (G.unName fieldName) obj of + when (not $ G.isNullable fieldTy) $ case Map.lookup fieldName obj of Nothing -> throwUnexpected $ "field " <> fieldName <<> " expected in webhook response, but not found" Just v -> when (v == J.Null) $ throwUnexpected $ @@ -619,23 +511,14 @@ mkJsonAggSelect = bool RS.JASSingleObject RS.JASMultipleRows . 
isListType processOutputSelectionSet - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => RS.ArgumentExp UnresolvedVal + :: RS.ArgumentExp v -> GraphQLType -> [(PGCol, PGScalarType)] - -> G.NamedType -> ObjectSelectionSet -> m GRS.AnnSimpleSelect -processOutputSelectionSet tableRowInput actionOutputType definitionList fldTy flds = do - stringifyNumerics <- stringifyNum <$> asks getter - annotatedFields <- processTableSelectionSet fldTy flds - let annSel = RS.AnnSelectG annotatedFields selectFrom - RS.noTablePermissions RS.noSelectArgs stringifyNumerics - pure annSel + -> RS.AnnFieldsG v + -> Bool + -> RS.AnnSimpleSelG v +processOutputSelectionSet tableRowInput actionOutputType definitionList annotatedFields = + RS.AnnSelectG annotatedFields selectFrom RS.noTablePermissions RS.noSelectArgs where jsonbToPostgresRecordFunction = QualifiedObject "pg_catalog" $ FunctionName $ @@ -645,4 +528,3 @@ processOutputSelectionSet tableRowInput actionOutputType definitionList fldTy fl functionArgs = RS.FunctionArgsExp [tableRowInput] mempty selectFrom = RS.FromFunction jsonbToPostgresRecordFunction functionArgs $ Just definitionList - diff --git a/server/src-lib/Hasura/GraphQL/Execute/Inline.hs b/server/src-lib/Hasura/GraphQL/Execute/Inline.hs new file mode 100644 index 00000000000..da0b4ace8f6 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Inline.hs @@ -0,0 +1,180 @@ +{-# LANGUAGE StrictData #-} + +{-| This module implements /fragment inlining/, which converts all fragment +spreads in a GraphQL query to inline fragments. For example, given a query like + +> query { +> users { +> id +> ...userFields +> } +> } +> +> fragment userFields on User { +> name +> favoriteColor +> } + +the fragment inliner will convert it to this: + +> query { +> users { +> id +> ... on User { +> name +> favoriteColor +> } +> } +> } + +This is a straightforward and mechanical transformation, but it simplifies +further processing, since we catch unbound fragments and recursive fragment +definitions early in the pipeline, so parsing does not have to worry about it. +In that sense, fragment inlining is similar to the variable resolution pass +performed by "Hasura.GraphQL.Execute.Resolve", but for fragment definitions +rather than variables. -} +module Hasura.GraphQL.Execute.Inline + ( inlineSelectionSet + ) where + +import Hasura.Prelude + +import qualified Data.HashMap.Strict.Extended as Map +import qualified Data.HashSet as Set +import qualified Data.List as L +import qualified Data.Text as T + +import Control.Lens +import Language.GraphQL.Draft.Syntax + +import Hasura.RQL.Types.Error +import Hasura.Server.Utils +import Hasura.SQL.Types + +-- | Internal bookkeeping used during inlining. +data InlineEnv = InlineEnv + { _ieFragmentDefinitions :: HashMap Name FragmentDefinition + -- ^ All known fragment definitions. + , _ieFragmentStack :: [Name] + -- ^ Fragments we’re currently inlining higher up in the call stack, used to + -- detect fragment cycles. + } + +-- | Internal bookkeeping used during inlining. +newtype InlineState var = InlineState + { _isFragmentCache :: HashMap Name (InlineFragment NoFragments var) + -- ^ A cache of fragment definitions we’ve already inlined, so we don’t need + -- to inline them again. 
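+ --
+ -- For example (editor's note): if a fragment 'userFields' like the one in
+ -- the module header is spread in several places in one query, it is
+ -- converted to an 'InlineFragment' only once; every later spread of
+ -- 'userFields' is served from this cache by 'inlineFragmentSpread'.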
+  }
+
+$(makeLensesFor [("_ieFragmentStack", "ieFragmentStack")] ''InlineEnv)
+$(makeLenses ''InlineState)
+
+type MonadInline var m =
+  ( MonadError QErr m
+  , MonadReader InlineEnv m
+  , MonadState (InlineState var) m
+  )
+
+-- | Inlines all fragment spreads in a 'SelectionSet'; see the module
+-- documentation for "Hasura.GraphQL.Execute.Inline" for details.
+inlineSelectionSet
+  :: (MonadError QErr m, Foldable t)
+  => t FragmentDefinition
+  -> SelectionSet FragmentSpread var
+  -> m (SelectionSet NoFragments var)
+inlineSelectionSet fragmentDefinitions selectionSet = do
+  let fragmentDefinitionMap = Map.groupOnNE _fdName fragmentDefinitions
+  uniqueFragmentDefinitions <- flip Map.traverseWithKey fragmentDefinitionMap
+    \fragmentName fragmentDefinitions' ->
+      case fragmentDefinitions' of
+        a :| [] -> return a
+        _ -> throw400 ParseFailed $ "multiple definitions for fragment " <>> fragmentName
+  let usedFragmentNames = Set.fromList $ fragmentsInSelectionSet selectionSet
+      definedFragmentNames = Set.fromList $ Map.keys uniqueFragmentDefinitions
+      -- At the time of writing, this check is disabled using a local binding
+      -- because the master branch doesn't implement it.
+      -- TODO: Do this check using a feature flag
+      isFragmentValidationEnabled = False
+  when (isFragmentValidationEnabled && (usedFragmentNames /= definedFragmentNames)) $
+    throw400 ValidationFailed $
+      "the following fragment(s) have been defined, but have not been used in the query - "
+      <> T.concat (L.intersperse ", "
+         $ map unName $ Set.toList $
+         Set.difference definedFragmentNames usedFragmentNames)
+  traverse inlineSelection selectionSet
+    & flip evalStateT InlineState{ _isFragmentCache = mempty }
+    & flip runReaderT InlineEnv
+        { _ieFragmentDefinitions = uniqueFragmentDefinitions
+        , _ieFragmentStack = [] }
+  where
+    fragmentsInSelectionSet :: SelectionSet FragmentSpread var -> [Name]
+    fragmentsInSelectionSet selectionSet' = concatMap getFragFromSelection selectionSet'
+
+    getFragFromSelection :: Selection FragmentSpread var -> [Name]
+    getFragFromSelection = \case
+      SelectionField fld -> fragmentsInSelectionSet $ _fSelectionSet fld
+      SelectionFragmentSpread fragmentSpread -> [_fsName fragmentSpread]
+      SelectionInlineFragment inlineFragment -> fragmentsInSelectionSet $ _ifSelectionSet inlineFragment
+
+inlineSelection
+  :: MonadInline var m
+  => Selection FragmentSpread var
+  -> m (Selection NoFragments var)
+inlineSelection (SelectionField field@Field{ _fSelectionSet }) =
+  withPathK "selectionSet" $ withPathK (unName $ _fName field) $ do
+    selectionSet <- traverse inlineSelection _fSelectionSet
+    pure $! SelectionField field{ _fSelectionSet = selectionSet }
+inlineSelection (SelectionFragmentSpread spread) =
+  withPathK "selectionSet" $
+    SelectionInlineFragment <$> inlineFragmentSpread spread
+inlineSelection (SelectionInlineFragment fragment@InlineFragment{ _ifSelectionSet }) = do
+  selectionSet <- traverse inlineSelection _ifSelectionSet
+  pure $! SelectionInlineFragment fragment{ _ifSelectionSet = selectionSet }
+
+inlineFragmentSpread
+  :: MonadInline var m
+  => FragmentSpread var
+  -> m (InlineFragment NoFragments var)
+inlineFragmentSpread FragmentSpread{ _fsName, _fsDirectives } = do
+  InlineEnv{ _ieFragmentDefinitions, _ieFragmentStack } <- ask
+  InlineState{ _isFragmentCache } <- get
+
+  if -- If we’ve already inlined this fragment, no need to process it again.
+     | Just fragment <- Map.lookup _fsName _isFragmentCache ->
+         pure $! 
addSpreadDirectives fragment + + -- Fragment cycles are always illegal; see + -- http://spec.graphql.org/June2018/#sec-Fragment-spreads-must-not-form-cycles + | (fragmentCycle, _:_) <- break (== _fsName) _ieFragmentStack -> + throw400 ValidationFailed $ "the fragment definition(s) " + <> englishList "and" (dquoteTxt <$> (_fsName :| reverse fragmentCycle)) + <> " form a cycle" + + -- We didn’t hit the fragment cache, so look up the definition and convert + -- it to an inline fragment. + | Just FragmentDefinition{ _fdTypeCondition, _fdSelectionSet } + <- Map.lookup _fsName _ieFragmentDefinitions -> withPathK (unName _fsName) $ do + + selectionSet <- locally ieFragmentStack (_fsName :) $ + traverse inlineSelection (fmap absurd <$> _fdSelectionSet) + + let fragment = InlineFragment + { _ifTypeCondition = Just _fdTypeCondition + -- As far as I can tell, the GraphQL spec says that directives + -- on the fragment definition do NOT apply to the fields in its + -- selection set. + , _ifDirectives = [] + , _ifSelectionSet = selectionSet + } + modify' $ over isFragmentCache $ Map.insert _fsName fragment + pure $! addSpreadDirectives fragment + + -- If we get here, the fragment name is unbound; raise an error. + -- http://spec.graphql.org/June2018/#sec-Fragment-spread-target-defined + | otherwise -> throw400 ValidationFailed $ + "reference to undefined fragment " <>> _fsName + where + addSpreadDirectives fragment = + fragment{ _ifDirectives = _ifDirectives fragment ++ _fsDirectives } diff --git a/server/src-lib/Hasura/GraphQL/Execute/Insert.hs b/server/src-lib/Hasura/GraphQL/Execute/Insert.hs new file mode 100644 index 00000000000..9c6afb5a1f4 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Insert.hs @@ -0,0 +1,318 @@ +module Hasura.GraphQL.Execute.Insert + ( traverseAnnInsert + , convertToSQLTransaction + ) where + +import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.Environment as Env +import qualified Data.HashMap.Strict as Map +import qualified Data.Sequence as Seq +import qualified Data.Text as T +import qualified Database.PG.Query as Q + + +import qualified Hasura.RQL.DML.Insert as RQL +import qualified Hasura.RQL.DML.Insert.Types as RQL +import qualified Hasura.RQL.DML.Mutation as RQL +import qualified Hasura.RQL.DML.RemoteJoin as RQL +import qualified Hasura.RQL.DML.Returning as RQL +import qualified Hasura.RQL.DML.Returning.Types as RQL +import qualified Hasura.RQL.GBoolExp as RQL +import qualified Hasura.SQL.DML as S +import qualified Hasura.Tracing as Tracing + +import Hasura.Db +import Hasura.EncJSON +import Hasura.GraphQL.Schema.Insert +import Hasura.RQL.Types +import Hasura.Server.Version (HasVersion) +import Hasura.SQL.Types +import Hasura.SQL.Value + + +traverseAnnInsert + :: (Applicative f) + => (a -> f b) + -> AnnInsert a + -> f (AnnInsert b) +traverseAnnInsert f (AnnInsert fieldName isSingle (annIns, mutationOutput)) = + AnnInsert fieldName isSingle + <$> ( (,) + <$> traverseMulti annIns + <*> RQL.traverseMutationOutput f mutationOutput + ) + where + traverseMulti (AnnIns objs tableName conflictClause checkCond columns defaultValues) = AnnIns + <$> traverse traverseObject objs + <*> pure tableName + <*> traverse (traverse f) conflictClause + <*> ( (,) + <$> traverseAnnBoolExp f (fst checkCond) + <*> traverse (traverseAnnBoolExp f) (snd checkCond) + ) + <*> pure columns + <*> traverse f defaultValues + traverseSingle (AnnIns obj tableName conflictClause checkCond columns defaultValues) = AnnIns + <$> traverseObject obj + <*> pure tableName + <*> 
traverse (traverse f) conflictClause + <*> ( (,) + <$> traverseAnnBoolExp f (fst checkCond) + <*> traverse (traverseAnnBoolExp f) (snd checkCond) + ) + <*> pure columns + <*> traverse f defaultValues + traverseObject (AnnInsObj columns objRels arrRels) = AnnInsObj + <$> traverse (traverse f) columns + <*> traverse (traverseRel traverseSingle) objRels + <*> traverse (traverseRel traverseMulti) arrRels + traverseRel z (RelIns object relInfo) = RelIns <$> z object <*> pure relInfo + + +convertToSQLTransaction + :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) + => Env.Environment + -> AnnInsert S.SQLExp + -> RQL.MutationRemoteJoinCtx + -> Seq.Seq Q.PrepArg + -> Bool + -> m EncJSON +convertToSQLTransaction env (AnnInsert fieldName isSingle (annIns, mutationOutput)) rjCtx planVars stringifyNum = + if null $ _aiInsObj annIns + then pure $ RQL.buildEmptyMutResp mutationOutput + else withPaths ["selectionSet", fieldName, "args", suffix] $ + insertMultipleObjects env annIns [] rjCtx mutationOutput planVars stringifyNum + where + withPaths p x = foldr ($) x $ withPathK <$> p + suffix = bool "objects" "object" isSingle + +insertMultipleObjects + :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) + => Env.Environment + -> MultiObjIns S.SQLExp + -> [(PGCol, S.SQLExp)] + -> RQL.MutationRemoteJoinCtx + -> RQL.MutationOutput + -> Seq.Seq Q.PrepArg + -> Bool + -> m EncJSON +insertMultipleObjects env multiObjIns additionalColumns rjCtx mutationOutput planVars stringifyNum = + bool withoutRelsInsert withRelsInsert anyRelsToInsert + where + AnnIns insObjs table conflictClause checkCondition columnInfos defVals = multiObjIns + allInsObjRels = concatMap _aioObjRels insObjs + allInsArrRels = concatMap _aioArrRels insObjs + anyRelsToInsert = not $ null allInsArrRels && null allInsObjRels + + withoutRelsInsert = do + indexedForM_ (_aioColumns <$> insObjs) \column -> + validateInsert (map fst column) [] (map fst additionalColumns) + let columnValues = map (mkSQLRow defVals) $ union additionalColumns . 
_aioColumns <$> insObjs + columnNames = Map.keys defVals + insertQuery = RQL.InsertQueryP1 + table + columnNames + columnValues + conflictClause + checkCondition + mutationOutput + columnInfos + RQL.execInsertQuery env stringifyNum (Just rjCtx) (insertQuery, planVars) + + withRelsInsert = do + insertRequests <- indexedForM insObjs \obj -> do + let singleObj = AnnIns obj table conflictClause checkCondition columnInfos defVals + insertObject env singleObj additionalColumns rjCtx planVars stringifyNum + let affectedRows = sum $ map fst insertRequests + columnValues = mapMaybe snd insertRequests + selectExpr <- RQL.mkSelCTEFromColVals table columnInfos columnValues + let (mutOutputRJ, remoteJoins) = RQL.getRemoteJoinsMutationOutput mutationOutput + sqlQuery = Q.fromBuilder $ toSQL $ + RQL.mkMutationOutputExp table columnInfos (Just affectedRows) selectExpr mutOutputRJ stringifyNum + RQL.executeMutationOutputQuery env sqlQuery [] $ (,rjCtx) <$> remoteJoins + +insertObject + :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) + => Env.Environment + -> SingleObjIns S.SQLExp + -> [(PGCol, S.SQLExp)] + -> RQL.MutationRemoteJoinCtx + -> Seq.Seq Q.PrepArg + -> Bool + -> m (Int, Maybe (ColumnValues TxtEncodedPGVal)) +insertObject env singleObjIns additionalColumns rjCtx planVars stringifyNum = do + validateInsert (map fst columns) (map _riRelInfo objectRels) (map fst additionalColumns) + + -- insert all object relations and fetch this insert dependent column values + objInsRes <- forM objectRels $ insertObjRel env planVars rjCtx stringifyNum + + -- prepare final insert columns + let objRelAffRows = sum $ map fst objInsRes + objRelDeterminedCols = concatMap snd objInsRes + finalInsCols = columns <> objRelDeterminedCols <> additionalColumns + + cte <- mkInsertQ table onConflict finalInsCols defaultValues checkCond + + MutateResp affRows colVals <- liftTx $ RQL.mutateAndFetchCols table allColumns (cte, planVars) stringifyNum + colValM <- asSingleObject colVals + + arrRelAffRows <- bool (withArrRels colValM) (return 0) $ null arrayRels + let totAffRows = objRelAffRows + affRows + arrRelAffRows + + return (totAffRows, colValM) + where + AnnIns annObj table onConflict checkCond allColumns defaultValues = singleObjIns + AnnInsObj columns objectRels arrayRels = annObj + + arrRelDepCols = flip getColInfos allColumns $ + concatMap (Map.keys . riMapping . 
_riRelInfo) arrayRels
+
+    withArrRels colValM = do
+      colVal <- onNothing colValM $ throw400 NotSupported cannotInsArrRelErr
+      arrDepColsWithVal <- fetchFromColVals colVal arrRelDepCols
+      arrInsARows <- forM arrayRels $ insertArrRel env arrDepColsWithVal rjCtx planVars stringifyNum
+      return $ sum arrInsARows
+
+    asSingleObject = \case
+      []  -> pure Nothing
+      [r] -> pure $ Just r
+      _   -> throw500 "more than one row returned"
+
+    cannotInsArrRelErr =
+      "cannot proceed to insert array relations since insert to table "
+      <> table <<> " affects zero rows"
+
+insertObjRel
+  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
+  => Env.Environment
+  -> Seq.Seq Q.PrepArg
+  -> RQL.MutationRemoteJoinCtx
+  -> Bool
+  -> ObjRelIns S.SQLExp
+  -> m (Int, [(PGCol, S.SQLExp)])
+insertObjRel env planVars rjCtx stringifyNum objRelIns =
+  withPathK (relNameToTxt relName) $ do
+    (affRows, colValM) <- withPathK "data" $ insertObject env singleObjIns [] rjCtx planVars stringifyNum
+    colVal <- onNothing colValM $ throw400 NotSupported errMsg
+    retColsWithVals <- fetchFromColVals colVal rColInfos
+    let columns = flip mapMaybe (Map.toList mapCols) \(column, target) -> do
+          value <- lookup target retColsWithVals
+          Just (column, value)
+    pure (affRows, columns)
+  where
+    RelIns singleObjIns relInfo = objRelIns
+    relName = riName relInfo
+    table = riRTable relInfo
+    mapCols = riMapping relInfo
+    allCols = _aiTableCols singleObjIns
+    rCols = Map.elems mapCols
+    rColInfos = getColInfos rCols allCols
+    errMsg = "cannot proceed to insert object relation "
+      <> relName <<> " since insert to table "
+      <> table <<> " affects zero rows"
+
+insertArrRel
+  :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m)
+  => Env.Environment
+  -> [(PGCol, S.SQLExp)]
+  -> RQL.MutationRemoteJoinCtx
+  -> Seq.Seq Q.PrepArg
+  -> Bool
+  -> ArrRelIns S.SQLExp
+  -> m Int
+insertArrRel env resCols rjCtx planVars stringifyNum arrRelIns =
+  withPathK (relNameToTxt $ riName relInfo) $ do
+    let additionalColumns = flip mapMaybe resCols \(column, value) -> do
+          target <- Map.lookup column mapping
+          Just (target, value)
+    resBS <- withPathK "data" $
+      insertMultipleObjects env multiObjIns additionalColumns rjCtx mutOutput planVars stringifyNum
+    resObj <- decodeEncJSON resBS
+    onNothing (Map.lookup ("affected_rows" :: T.Text) resObj) $
+      throw500 "affected_rows not returned in array rel insert"
+  where
+    RelIns multiObjIns relInfo = arrRelIns
+    mapping = riMapping relInfo
+    mutOutput = RQL.MOutMultirowFields [("affected_rows", RQL.MCount)]
+
+-- | Validate an insert object based on insert columns,
+-- insert object relations, and additional columns from parent.
+validateInsert
+  :: (MonadError QErr m)
+  => [PGCol]   -- ^ inserting columns
+  -> [RelInfo] -- ^ object relation inserts
+  -> [PGCol]   -- ^ additional fields from parent
+  -> m ()
+validateInsert insCols objRels addCols = do
+  -- validate insertCols
+  unless (null insConflictCols) $ throw400 ValidationFailed $
+    "cannot insert " <> showPGCols insConflictCols
+    <> " columns as their values are already being determined by parent insert"
+
+  forM_ objRels $ \relInfo -> do
+    let lCols = Map.keys $ riMapping relInfo
+        relName = riName relInfo
+        relNameTxt = relNameToTxt relName
+        lColConflicts = lCols `intersect` (addCols <> insCols)
+    withPathK relNameTxt $ unless (null lColConflicts) $ throw400 ValidationFailed $
+      "cannot insert object relationship " <> relName
+      <<> " as " <> showPGCols lColConflicts
+      <> " column values are already determined"
+  where
+    insConflictCols = insCols 
`intersect` addCols + + +mkInsertQ + :: MonadError QErr m + => QualifiedTable + -> Maybe (RQL.ConflictClauseP1 S.SQLExp) + -> [(PGCol, S.SQLExp)] + -> Map.HashMap PGCol S.SQLExp + -> (AnnBoolExpSQL, Maybe AnnBoolExpSQL) + -> m S.CTE +mkInsertQ table onConflictM insCols defVals (insCheck, updCheck) = do + let sqlConflict = RQL.toSQLConflict table <$> onConflictM + sqlExps = mkSQLRow defVals insCols + valueExp = S.ValuesExp [S.TupleExp sqlExps] + tableCols = Map.keys defVals + sqlInsert = + S.SQLInsert table tableCols valueExp sqlConflict + . Just + $ S.RetExp + [ S.selectStar + , S.Extractor + (RQL.insertOrUpdateCheckExpr table onConflictM + (RQL.toSQLBoolExp (S.QualTable table) insCheck) + (fmap (RQL.toSQLBoolExp (S.QualTable table)) updCheck)) + Nothing + ] + pure $ S.CTEInsert sqlInsert + +fetchFromColVals + :: MonadError QErr m + => ColumnValues TxtEncodedPGVal + -> [PGColumnInfo] + -> m [(PGCol, S.SQLExp)] +fetchFromColVals colVal reqCols = + forM reqCols $ \ci -> do + let valM = Map.lookup (pgiColumn ci) colVal + val <- onNothing valM $ throw500 $ "column " + <> pgiColumn ci <<> " not found in given colVal" + let pgColVal = case val of + TENull -> S.SENull + TELit t -> S.SELit t + return (pgiColumn ci, pgColVal) + +mkSQLRow :: Map.HashMap PGCol S.SQLExp -> [(PGCol, S.SQLExp)] -> [S.SQLExp] +mkSQLRow defVals withPGCol = map snd $ + flip map (Map.toList defVals) $ + \(col, defVal) -> (col,) $ fromMaybe defVal $ Map.lookup col withPGColMap + where + withPGColMap = Map.fromList withPGCol + +decodeEncJSON :: (J.FromJSON a, QErrM m) => EncJSON -> m a +decodeEncJSON = + either (throw500 . T.pack) decodeValue . + J.eitherDecode . encJToLBS diff --git a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery.hs b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery.hs index d25a613c8b9..468f967f5fb 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery.hs @@ -90,7 +90,7 @@ Additional details are provided by the documentation for individual bindings. 
module Hasura.GraphQL.Execute.LiveQuery ( LiveQueryPlan , ReusableLiveQueryPlan - , reuseLiveQueryPlan + -- , reuseLiveQueryPlan , buildLiveQueryPlan , LiveQueryPlanExplanation diff --git a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Plan.hs b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Plan.hs index e9cf84c71c0..67a968273e4 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Plan.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Plan.hs @@ -18,7 +18,7 @@ module Hasura.GraphQL.Execute.LiveQuery.Plan , ReusableLiveQueryPlan , ValidatedQueryVariables , buildLiveQueryPlan - , reuseLiveQueryPlan + -- , reuseLiveQueryPlan , LiveQueryPlanExplanation , explainLiveQueryPlan @@ -27,41 +27,37 @@ module Hasura.GraphQL.Execute.LiveQuery.Plan import Hasura.Prelude import Hasura.Session -import qualified Data.Aeson.Casing as J -import qualified Data.Aeson.Extended as J -import qualified Data.Aeson.TH as J -import qualified Data.ByteString as B -import qualified Data.Environment as E -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd as OMap -import qualified Data.Text as T -import qualified Data.UUID.V4 as UUID -import qualified Database.PG.Query as Q -import qualified Language.GraphQL.Draft.Syntax as G +import qualified Data.Aeson.Casing as J +import qualified Data.Aeson.Extended as J +import qualified Data.Aeson.TH as J +import qualified Data.ByteString as B +import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.Sequence as Seq +import qualified Data.Text as T +import qualified Data.UUID.V4 as UUID +import qualified Database.PG.Query as Q +import qualified Language.GraphQL.Draft.Syntax as G -- remove these when array encoding is merged -import qualified Database.PG.Query.PTI as PTI -import qualified PostgreSQL.Binary.Encoding as PE +import qualified Database.PG.Query.PTI as PTI +import qualified PostgreSQL.Binary.Encoding as PE import Control.Lens -import Data.Has -import Data.UUID (UUID) +import Data.UUID (UUID) -import qualified Hasura.GraphQL.Resolve as GR -import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH -import qualified Hasura.GraphQL.Validate as GV -import qualified Hasura.Logging as L -import qualified Hasura.SQL.DML as S -import qualified Hasura.Tracing as Tracing +import qualified Hasura.GraphQL.Parser.Schema as PS +-- import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH +import qualified Hasura.RQL.DML.RemoteJoin as RR +import qualified Hasura.RQL.DML.Select as DS +import qualified Hasura.SQL.DML as S import Hasura.Db -import Hasura.GraphQL.Resolve.Action -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Utils -import Hasura.GraphQL.Validate.SelectionSet -import Hasura.GraphQL.Validate.Types +import Hasura.GraphQL.Context +import Hasura.GraphQL.Execute.Action +import Hasura.GraphQL.Execute.Query +import Hasura.GraphQL.Parser.Column import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) import Hasura.SQL.Error import Hasura.SQL.Types import Hasura.SQL.Value @@ -72,7 +68,17 @@ import Hasura.SQL.Value newtype MultiplexedQuery = MultiplexedQuery { unMultiplexedQuery :: Q.Query } deriving (Show, Eq, Hashable, J.ToJSON) -mkMultiplexedQuery :: OMap.InsOrdHashMap G.Alias GR.QueryRootFldResolved -> MultiplexedQuery +toSQLFromItem :: S.Alias -> SubscriptionRootFieldResolved -> S.FromItem +toSQLFromItem alias = \case + RFDB (QDBPrimaryKey s) -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s + RFDB (QDBSimple s) -> fromSelect $ 
DS.mkSQLSelect DS.JASMultipleRows s + RFDB (QDBAggregation s) -> fromSelect $ DS.mkAggregateSelect s + RFDB (QDBConnection s) -> S.mkSelectWithFromItem (DS.mkConnectionSelect s) alias + RFAction s -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s + where + fromSelect s = S.mkSelFromItem s alias + +mkMultiplexedQuery :: OMap.InsOrdHashMap G.Name SubscriptionRootFieldResolved -> MultiplexedQuery mkMultiplexedQuery rootFields = MultiplexedQuery . Q.fromBuilder . toSQL $ S.mkSelect { S.selExtr = -- SELECT _subs.result_id, _fld_resp.root AS result @@ -94,38 +100,38 @@ mkMultiplexedQuery rootFields = MultiplexedQuery . Q.fromBuilder . toSQL $ S.mkS { S.selExtr = [S.Extractor rootFieldsJsonAggregate (Just . S.Alias $ Iden "root")] , S.selFrom = Just . S.FromExp $ flip map (OMap.toList rootFields) $ \(fieldAlias, resolvedAST) -> - GR.toSQLFromItem (S.Alias $ aliasToIden fieldAlias) resolvedAST + toSQLFromItem (S.Alias $ aliasToIden fieldAlias) resolvedAST } -- json_build_object('field1', field1.root, 'field2', field2.root, ...) rootFieldsJsonAggregate = S.SEFnApp "json_build_object" rootFieldsJsonPairs Nothing rootFieldsJsonPairs = flip concatMap (OMap.keys rootFields) $ \fieldAlias -> - [ S.SELit (G.unName $ G.unAlias fieldAlias) + [ S.SELit (G.unName fieldAlias) , mkQualIden (aliasToIden fieldAlias) (Iden "root") ] mkQualIden prefix = S.SEQIden . S.QIden (S.QualIden prefix Nothing) -- TODO fix this Nothing of course - aliasToIden = Iden . G.unName . G.unAlias + aliasToIden = Iden . G.unName +-- TODO fix this comment -- | Resolves an 'GR.UnresolvedVal' by converting 'GR.UVPG' values to SQL expressions that refer to -- the @result_vars@ input object, collecting variable values along the way. resolveMultiplexedValue - :: (MonadState (GV.ReusableVariableValues, Seq (WithScalarType PGScalarValue)) m) - => GR.UnresolvedVal -> m S.SQLExp + :: (MonadState (HashMap G.Name PGColumnValue, Seq PGColumnValue) m) + => UnpreparedValue -> m S.SQLExp resolveMultiplexedValue = \case - GR.UVPG annPGVal -> do - let GR.AnnPGVal varM _ colVal = annPGVal - varJsonPath <- case varM of + UVParameter colVal varM -> do + varJsonPath <- case fmap PS.getName varM of Just varName -> do modifying _1 $ Map.insert varName colVal - pure ["query", G.unName $ G.unVariable varName] + pure ["query", G.unName varName] Nothing -> do syntheticVarIndex <- gets (length . 
snd) modifying _2 (|> colVal) pure ["synthetic", T.pack $ show syntheticVarIndex] - pure $ fromResVars (PGTypeScalar $ pstType colVal) varJsonPath - GR.UVSessVar ty sessVar -> pure $ fromResVars ty ["session", sessionVariableToText sessVar] - GR.UVSQL sqlExp -> pure sqlExp - GR.UVSession -> pure $ fromResVars (PGTypeScalar PGJSON) ["session"] + pure $ fromResVars (PGTypeScalar $ pstType $ pcvValue colVal) varJsonPath + UVSessionVar ty sessVar -> pure $ fromResVars ty ["session", sessionVariableToText sessVar] + UVLiteral sqlExp -> pure sqlExp + UVSession -> pure $ fromResVars (PGTypeScalar PGJSON) ["session"] where fromResVars pgType jPath = addTypeAnnotation pgType $ S.SEOpApp (S.SQLOp "#>>") [ S.SEQIden $ S.QIden (S.QualIden (Iden "_subs") Nothing) (Iden "result_vars") @@ -215,7 +221,7 @@ deriving instance (Eq (f TxtEncodedPGVal)) => Eq (ValidatedVariables f) deriving instance (Hashable (f TxtEncodedPGVal)) => Hashable (ValidatedVariables f) deriving instance (J.ToJSON (f TxtEncodedPGVal)) => J.ToJSON (ValidatedVariables f) -type ValidatedQueryVariables = ValidatedVariables (Map.HashMap G.Variable) +type ValidatedQueryVariables = ValidatedVariables (Map.HashMap G.Name) type ValidatedSyntheticVariables = ValidatedVariables [] -- | Checks if the provided arguments are valid values for their corresponding types. @@ -264,76 +270,109 @@ data ReusableLiveQueryPlan = ReusableLiveQueryPlan { _rlqpParameterizedPlan :: !ParameterizedLiveQueryPlan , _rlqpSyntheticVariableValues :: !ValidatedSyntheticVariables - , _rlqpQueryVariableTypes :: !GV.ReusableVariableTypes + , _rlqpQueryVariableTypes :: HashMap G.Name PGColumnType } deriving (Show) $(J.deriveToJSON (J.aesonDrop 4 J.snakeCase) ''ReusableLiveQueryPlan) -- | Constructs a new execution plan for a live query and returns a reusable version of the plan if -- possible. + +-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't need it +-- here. We should evaluate if we need it here. buildLiveQueryPlan :: ( MonadError QErr m - , MonadReader r m - , Has UserInfo r - , Has FieldMap r - , Has OrdByCtx r - , Has QueryCtxMap r - , Has SQLGenCtx r - , Has (L.Logger L.Hasura) r , MonadIO m - , Tracing.MonadTrace m - , HasVersion ) - => E.Environment - -> PGExecCtx - -> QueryReusability - -> QueryActionExecuter - -> ObjectSelectionSet + => PGExecCtx + -> UserInfo + -> InsOrdHashMap G.Name (SubscriptionRootField UnpreparedValue) -> m (LiveQueryPlan, Maybe ReusableLiveQueryPlan) -buildLiveQueryPlan env pgExecCtx initialReusability actionExecuter selectionSet = do - ((resolvedASTMap, (queryVariableValues, syntheticVariableValues)), finalReusability) <- - runReusabilityTWith initialReusability $ - flip runStateT mempty $ flip OMap.traverseWithKey (unAliasedFields $ unObjectSelectionSet selectionSet) $ - \_ field -> case GV._fName field of - "__typename" -> throwVE "you cannot create a subscription on '__typename' field" - _ -> do - unresolvedAST <- GR.queryFldToPGAST env field actionExecuter - resolvedAST <- GR.traverseQueryRootFldAST resolveMultiplexedValue unresolvedAST +buildLiveQueryPlan pgExecCtx userInfo unpreparedAST = do + -- ((resolvedASTs, (queryVariableValues, syntheticVariableValues)), finalReusability) <- + -- GV.runReusabilityTWith initialReusability . flip runStateT mempty $ + -- fmap Map.fromList . 
for (toList fields) $ \field -> case GV._fName field of + -- "__typename" -> throwVE "you cannot create a subscription on '__typename' field" + -- _ -> do + -- unresolvedAST <- GR.queryFldToPGAST field actionExecutioner + -- resolvedAST <- GR.traverseQueryRootFldAST resolveMultiplexedValue unresolvedAST - let (_, remoteJoins) = GR.toPGQuery resolvedAST - -- Reject remote relationships in subscription live query - when (remoteJoins /= mempty) $ - throw400 NotSupported - "Remote relationships are not allowed in subscriptions" - pure resolvedAST + -- let (_, remoteJoins) = GR.toPGQuery resolvedAST + -- -- Reject remote relationships in subscription live query + -- when (remoteJoins /= mempty) $ + -- throw400 NotSupported + -- "Remote relationships are not allowed in subscriptions" + -- pure (GV._fAlias field, resolvedAST) - userInfo <- asks getter - let multiplexedQuery = mkMultiplexedQuery resolvedASTMap + -- Transform the RQL AST into a prepared SQL query +{- preparedAST <- for unpreparedAST \unpreparedQuery -> do + (preparedQuery, PlanningSt _ planVars planVals) + <- flip runStateT initPlanningSt + $ traverseSubscriptionRootField prepareWithPlan unpreparedQuery + pure $! irToRootFieldPlan planVars planVals preparedQuery +-} + (preparedAST, (queryVariableValues, querySyntheticVariableValues)) <- flip runStateT (mempty, Seq.empty) $ + for unpreparedAST \unpreparedQuery -> do + resolvedRootField <- traverseQueryRootField resolveMultiplexedValue unpreparedQuery + case resolvedRootField of + RFDB qDB -> do + let remoteJoins = case qDB of + QDBSimple s -> snd $ RR.getRemoteJoins s + QDBPrimaryKey s -> snd $ RR.getRemoteJoins s + QDBAggregation s -> snd $ RR.getRemoteJoinsAggregateSelect s + QDBConnection s -> snd $ RR.getRemoteJoinsConnectionSelect s + when (remoteJoins /= mempty) + $ throw400 NotSupported "Remote relationships are not allowed in subscriptions" + _ -> pure () + traverseAction (DS.traverseAnnSimpleSelect resolveMultiplexedValue . resolveAsyncActionQuery userInfo) resolvedRootField + + let multiplexedQuery = mkMultiplexedQuery preparedAST roleName = _uiRole userInfo parameterizedPlan = ParameterizedLiveQueryPlan roleName multiplexedQuery -- We need to ensure that the values provided for variables are correct according to Postgres. -- Without this check an invalid value for a variable for one instance of the subscription will -- take down the entire multiplexed query. - validatedQueryVars <- validateVariables pgExecCtx queryVariableValues - validatedSyntheticVars <- validateVariables pgExecCtx (toList syntheticVariableValues) - let cohortVariables = CohortVariables (_uiSession userInfo) validatedQueryVars validatedSyntheticVars - plan = LiveQueryPlan parameterizedPlan cohortVariables - varTypes = finalReusability ^? 
_Reusable - reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars <$> varTypes - pure (plan, reusablePlan) + validatedQueryVars <- validateVariables pgExecCtx $ fmap pcvValue queryVariableValues + validatedSyntheticVars <- validateVariables pgExecCtx $ map pcvValue $ toList querySyntheticVariableValues -reuseLiveQueryPlan - :: (MonadError QErr m, MonadIO m) - => PGExecCtx - -> SessionVariables - -> Maybe GH.VariableValues - -> ReusableLiveQueryPlan - -> m LiveQueryPlan -reuseLiveQueryPlan pgExecCtx sessionVars queryVars reusablePlan = do - let ReusableLiveQueryPlan parameterizedPlan syntheticVars queryVarTypes = reusablePlan - annVarVals <- GV.validateVariablesForReuse queryVarTypes queryVars - validatedVars <- validateVariables pgExecCtx annVarVals - pure $ LiveQueryPlan parameterizedPlan (CohortVariables sessionVars validatedVars syntheticVars) + let -- TODO validatedQueryVars validatedSyntheticVars + cohortVariables = CohortVariables (_uiSession userInfo) validatedQueryVars validatedSyntheticVars + + plan = LiveQueryPlan parameterizedPlan cohortVariables + -- See Note [Temporarily disabling query plan caching] + -- varTypes = finalReusability ^? GV._Reusable + reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars mempty {- <$> _varTypes -} + pure (plan, Just reusablePlan) + + -- (astResolved, (queryVariableValues, syntheticVariableValues)) <- flip runStateT mempty $ + -- GEQ.traverseSubscriptionRootField resolveMultiplexedValue _astUnresolved + -- let pgQuery = mkMultiplexedQuery $ _toPGQuery astResolved + -- parameterizedPlan = ParameterizedLiveQueryPlan (userRole userInfo) fieldAlias pgQuery + + -- -- We need to ensure that the values provided for variables + -- -- are correct according to Postgres. Without this check + -- -- an invalid value for a variable for one instance of the + -- -- subscription will take down the entire multiplexed query + -- validatedQueryVars <- validateVariables pgExecCtx queryVariableValues + -- validatedSyntheticVars <- validateVariables pgExecCtx (toList syntheticVariableValues) + -- let cohortVariables = CohortVariables (userVars userInfo) validatedQueryVars validatedSyntheticVars + -- plan = LiveQueryPlan parameterizedPlan cohortVariables + -- reusablePlan = ReusableLiveQueryPlan parameterizedPlan validatedSyntheticVars <$> _varTypes + -- pure (plan, reusablePlan) + +-- See Note [Temporarily disabling query plan caching] +-- reuseLiveQueryPlan +-- :: (MonadError QErr m, MonadIO m) +-- => PGExecCtx +-- -> SessionVariables +-- -> Maybe GH.VariableValues +-- -> ReusableLiveQueryPlan +-- -> m LiveQueryPlan +-- reuseLiveQueryPlan pgExecCtx sessionVars queryVars reusablePlan = do +-- let ReusableLiveQueryPlan parameterizedPlan syntheticVars queryVarTypes = reusablePlan +-- annVarVals <- _validateVariablesForReuse queryVarTypes queryVars +-- validatedVars <- validateVariables pgExecCtx annVarVals +-- pure $ LiveQueryPlan parameterizedPlan (CohortVariables sessionVars validatedVars syntheticVars) data LiveQueryPlanExplanation = LiveQueryPlanExplanation diff --git a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Poll.hs b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Poll.hs index 5019168b97b..0b8550c38ec 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Poll.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/Poll.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE CPP #-} -- | Multiplexed live query poller threads; see "Hasura.GraphQL.Execute.LiveQuery" for details. 
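+--
+-- Editor's sketch of the idea: subscribers whose subscriptions share the same
+-- parameterized query are handled by a single poller, so each tick runs one
+-- multiplexed SQL query for the whole group rather than one query per
+-- subscriber; 'PollDetails' (logged via 'pollDetailMinimal' below) records the
+-- per-poll snapshot time and per-batch execution details.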
module Hasura.GraphQL.Execute.LiveQuery.Poll ( -- * Pollers @@ -39,7 +40,9 @@ module Hasura.GraphQL.Execute.LiveQuery.Poll ( ) where import Data.List.Split (chunksOf) +#ifndef PROFILING import GHC.AssertNF +#endif import Hasura.Prelude import qualified Control.Concurrent.Async as A @@ -215,7 +218,9 @@ pushResultToCohort result !respHashM (LiveQueryMetadata dTime) cohortSnapshot = (subscribersToPush, subscribersToIgnore) <- if isExecError result || respHashM /= prevRespHashM then do +#ifndef PROFILING $assertNFHere respHashM -- so we don't write thunks to mutable vars +#endif STM.atomically $ STM.writeTVar respRef respHashM return (newSinks <> curSinks, mempty) else @@ -225,6 +230,7 @@ pushResultToCohort result !respHashM (LiveQueryMetadata dTime) cohortSnapshot = (subscribersToPush, subscribersToIgnore) where CohortSnapshot _ respRef curSinks newSinks = cohortSnapshot + response = result <&> \payload -> LiveQueryResponse payload dTime pushResultToSubscribers = A.mapConcurrently_ $ \(Subscriber _ _ action) -> action response @@ -375,10 +381,10 @@ they need to. -- | see Note [Minimal LiveQuery Poller Log] pollDetailMinimal :: PollDetails -> J.Value -pollDetailMinimal (PollDetails{..}) = +pollDetailMinimal PollDetails{..} = J.object [ "poller_id" J..= _pdPollerId , "snapshot_time" J..= _pdSnapshotTime - , "batches" J..= (map batchExecutionDetailMinimal _pdBatches) + , "batches" J..= map batchExecutionDetailMinimal _pdBatches , "total_time" J..= _pdTotalTime ] @@ -389,7 +395,7 @@ type LiveQueryPostPollHook = PollDetails -> IO () -- the default LiveQueryPostPollHook defaultLiveQueryPostPollHook :: L.Logger L.Hasura -> LiveQueryPostPollHook -defaultLiveQueryPostPollHook logger pd = L.unLogger logger pd +defaultLiveQueryPostPollHook = L.unLogger -- | Where the magic happens: the top-level action run periodically by each -- active 'Poller'. This needs to be async exception safe. diff --git a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/State.hs b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/State.hs index d9b9a7d661f..5bc3e02d793 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/State.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/LiveQuery/State.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE CPP #-} -- | Top-level management of live query poller threads. The implementation of the polling itself is -- in "Hasura.GraphQL.Execute.LiveQuery.Poll". See "Hasura.GraphQL.Execute.LiveQuery" for high-level -- details. 
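+--
+-- Editor's note: the CPP pragma added above guards the 'GHC.AssertNF'
+-- machinery; the '$assertNFHere' checks in this file are compiled out for
+-- profiling builds (see the '#ifndef PROFILING' blocks below), presumably
+-- because AssertNF's heap inspection is unavailable when profiling.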
@@ -23,7 +24,9 @@ import qualified StmContainers.Map as STMMap import Control.Concurrent.Extended (forkImmortal, sleep) import Control.Exception (mask_) import Data.String +#ifndef PROFILING import GHC.AssertNF +#endif import qualified Hasura.GraphQL.Execute.LiveQuery.TMap as TMap import qualified Hasura.Logging as L @@ -83,7 +86,9 @@ addLiveQuery logger subscriberMetadata lqState plan onResultAction = do let !subscriber = Subscriber subscriberId subscriberMetadata onResultAction +#ifndef PROFILING $assertNFHere subscriber -- so we don't write thunks to mutable vars +#endif -- a handler is returned only when it is newly created handlerM <- STM.atomically $ @@ -107,7 +112,9 @@ addLiveQuery logger subscriberMetadata lqState plan onResultAction = do pollQuery pollerId lqOpts pgExecCtx query (_pCohorts handler) postPollHook sleep $ unRefetchInterval refetchInterval let !pState = PollerIOState threadRef pollerId +#ifndef PROFILING $assertNFHere pState -- so we don't write thunks to mutable vars +#endif STM.atomically $ STM.putTMVar (_pIOState handler) pState pure $ LiveQueryId handlerId cohortKey subscriberId diff --git a/server/src-lib/Hasura/GraphQL/Execute/Mutation.hs b/server/src-lib/Hasura/GraphQL/Execute/Mutation.hs new file mode 100644 index 00000000000..d5bdd6f1914 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Mutation.hs @@ -0,0 +1,211 @@ +module Hasura.GraphQL.Execute.Mutation where + +import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.Environment as Env +import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.HashSet as Set +import qualified Data.IntMap as IntMap +import qualified Data.Sequence as Seq +import qualified Data.Sequence.NonEmpty as NE +import qualified Database.PG.Query as Q +import qualified Language.GraphQL.Draft.Syntax as G +import qualified Network.HTTP.Client as HTTP +import qualified Network.HTTP.Types as HTTP + +import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH +import qualified Hasura.RQL.DML.Delete as RQL +import qualified Hasura.RQL.DML.Mutation as RQL +import qualified Hasura.RQL.DML.Returning.Types as RQL +import qualified Hasura.RQL.DML.Update as RQL +import qualified Hasura.Tracing as Tracing +import qualified Hasura.Logging as L + + +import Hasura.Db +import Hasura.EncJSON +import Hasura.GraphQL.Context +import Hasura.GraphQL.Execute.Action +import Hasura.GraphQL.Execute.Insert +import Hasura.GraphQL.Execute.Prepare +import Hasura.GraphQL.Execute.Resolve +import Hasura.GraphQL.Parser +import Hasura.GraphQL.Schema.Insert +import Hasura.RQL.Types +import Hasura.Server.Version (HasVersion) +import Hasura.Session + +convertDelete + :: ( HasVersion + , MonadError QErr m + , MonadTx tx + , Tracing.MonadTrace tx + , MonadIO tx) + => Env.Environment + -> SessionVariables + -> RQL.MutationRemoteJoinCtx + -> RQL.AnnDelG UnpreparedValue + -> Bool + -> m (tx EncJSON) +convertDelete env usrVars rjCtx deleteOperation stringifyNum = do + let (preparedDelete, expectedVariables) = flip runState Set.empty $ RQL.traverseAnnDel prepareWithoutPlan deleteOperation + validateSessionVariables expectedVariables usrVars + pure $ RQL.execDeleteQuery env stringifyNum (Just rjCtx) (preparedDelete, Seq.empty) + +convertUpdate + :: ( HasVersion + , MonadError QErr m + , MonadTx tx + , Tracing.MonadTrace tx + , MonadIO tx + ) + => Env.Environment + -> SessionVariables + -> RQL.MutationRemoteJoinCtx + -> RQL.AnnUpdG UnpreparedValue + -> Bool + -> m (tx EncJSON) 
+convertUpdate env usrVars rjCtx updateOperation stringifyNum = do + let (preparedUpdate, expectedVariables) = flip runState Set.empty $ RQL.traverseAnnUpd prepareWithoutPlan updateOperation + if null $ RQL.uqp1OpExps updateOperation + then pure $ pure $ RQL.buildEmptyMutResp $ RQL.uqp1Output preparedUpdate + else do + validateSessionVariables expectedVariables usrVars + pure $ RQL.execUpdateQuery env stringifyNum (Just rjCtx) (preparedUpdate, Seq.empty) + +convertInsert + :: ( HasVersion + , MonadError QErr m + , MonadTx tx + , Tracing.MonadTrace tx + , MonadIO tx) + => Env.Environment + -> SessionVariables + -> RQL.MutationRemoteJoinCtx + -> AnnInsert UnpreparedValue + -> Bool + -> m (tx EncJSON) +convertInsert env usrVars rjCtx insertOperation stringifyNum = do + let (preparedInsert, expectedVariables) = flip runState Set.empty $ traverseAnnInsert prepareWithoutPlan insertOperation + validateSessionVariables expectedVariables usrVars + pure $ convertToSQLTransaction env preparedInsert rjCtx Seq.empty stringifyNum + +planVariablesSequence :: SessionVariables -> PlanningSt -> Seq.Seq Q.PrepArg +planVariablesSequence usrVars = Seq.fromList . map fst . withUserVars usrVars . IntMap.elems . _psPrepped + +convertMutationRootField + :: forall m tx + . ( HasVersion + , MonadIO m + , MonadError QErr m + , Tracing.MonadTrace m + , Tracing.MonadTrace tx + , MonadIO tx + , MonadTx tx + ) + => Env.Environment + -> L.Logger L.Hasura + -> UserInfo + -> HTTP.Manager + -> HTTP.RequestHeaders + -> Bool + -> MutationRootField UnpreparedValue + -> m (Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField) +convertMutationRootField env logger userInfo manager reqHeaders stringifyNum = \case + RFDB (MDBInsert s) -> noResponseHeaders =<< convertInsert env userSession rjCtx s stringifyNum + RFDB (MDBUpdate s) -> noResponseHeaders =<< convertUpdate env userSession rjCtx s stringifyNum + RFDB (MDBDelete s) -> noResponseHeaders =<< convertDelete env userSession rjCtx s stringifyNum + RFRemote remote -> pure $ Right remote + RFAction (AMSync s) -> Left . (_aerTransaction &&& _aerHeaders) <$> resolveActionExecution env logger userInfo s actionExecContext + RFAction (AMAsync s) -> noResponseHeaders =<< resolveActionMutationAsync s reqHeaders userSession + RFRaw s -> noResponseHeaders $ pure $ encJFromJValue s + where + noResponseHeaders :: tx EncJSON -> m (Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField) + noResponseHeaders rTx = pure $ Left (rTx, []) + + userSession = _uiSession userInfo + actionExecContext = ActionExecContext manager reqHeaders $ _uiSession userInfo + + rjCtx = (manager, reqHeaders, userInfo) + +convertMutationSelectionSet + :: forall m tx + . 
( HasVersion + , Tracing.MonadTrace m + , MonadIO m + , MonadError QErr m + , MonadTx tx + , Tracing.MonadTrace tx + , MonadIO tx + ) + => Env.Environment + -> L.Logger L.Hasura + -> GQLContext + -> SQLGenCtx + -> UserInfo + -> HTTP.Manager + -> HTTP.RequestHeaders + -> G.SelectionSet G.NoFragments G.Name + -> [G.VariableDefinition] + -> Maybe GH.VariableValues + -> m (ExecutionPlan (tx EncJSON, HTTP.ResponseHeaders) RemoteCall (G.Name, J.Value)) +convertMutationSelectionSet env logger gqlContext sqlGenCtx userInfo manager reqHeaders fields varDefs varValsM = do + mutationParser <- onNothing (gqlMutationParser gqlContext) $ + throw400 ValidationFailed "no mutations exist" + -- Parse the GraphQL query into the RQL AST + (unpreparedQueries, _reusability) + :: (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue), QueryReusability) + <- resolveVariables varDefs (fromMaybe Map.empty varValsM) fields + >>= (mutationParser >>> (`onLeft` reportParseErrors)) + + -- Transform the RQL AST into a prepared SQL query + txs <- for unpreparedQueries $ convertMutationRootField env logger userInfo manager reqHeaders (stringifyNum sqlGenCtx) + let txList = OMap.toList txs + case (mapMaybe takeTx txList, mapMaybe takeRemote txList) of + (dbPlans, []) -> do + let allHeaders = concatMap (snd . snd) dbPlans + combinedTx = toSingleTx $ map (G.unName *** fst) dbPlans + pure $ ExecStepDB (combinedTx, allHeaders) + ([], remotes@(firstRemote:_)) -> do + let (remoteOperation, varValsM') = + buildTypedOperation + G.OperationTypeMutation + varDefs + (map (G.SelectionField . snd . snd) remotes) + varValsM + if all (\remote' -> fst (snd firstRemote) == fst (snd remote')) remotes + then return $ ExecStepRemote (fst (snd firstRemote), remoteOperation, varValsM') + else throw400 NotSupported "Mixed remote schemas are not supported" + _ -> throw400 NotSupported "Heterogeneous execution of database and remote schemas not supported" + -- Build and return an executable action from the generated SQL + where + reportParseErrors errs = case NE.head errs of + -- TODO: Our error reporting machinery doesn’t currently support reporting + -- multiple errors at once, so we’re throwing away all but the first one + -- here. It would be nice to report all of them! + ParseError{ pePath, peMessage, peCode } -> + throwError (err400 peCode peMessage){ qePath = pePath } + + -- | A list of aliased transactions, e.g. + -- + -- > [("f1", Tx r1), ("f2", Tx r2)] + -- + -- is converted into a single transaction as follows: + -- + -- > Tx {"f1": r1, "f2": r2} + toSingleTx :: [(Text, tx EncJSON)] -> tx EncJSON + toSingleTx aliasedTxs = + fmap encJFromAssocList $ + forM aliasedTxs $ \(al, tx) -> (,) al <$> tx + takeTx + :: (G.Name, Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField) + -> Maybe (G.Name, (tx EncJSON, HTTP.ResponseHeaders)) + takeTx (name, Left tx) = Just (name, tx) + takeTx _ = Nothing + takeRemote + :: (G.Name, Either (tx EncJSON, HTTP.ResponseHeaders) RemoteField) + -> Maybe (G.Name, RemoteField) + takeRemote (name, Right remote) = Just (name, remote) + takeRemote _ = Nothing diff --git a/server/src-lib/Hasura/GraphQL/Execute/Plan.hs b/server/src-lib/Hasura/GraphQL/Execute/Plan.hs index 3bdd7ce5225..4d4671a5107 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/Plan.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/Plan.hs @@ -2,6 +2,7 @@ module Hasura.GraphQL.Execute.Plan ( ReusablePlan(..) , PlanCache , PlanCacheOptions(..)
+ , mkPlanCacheOptions , getPlan , addPlan , initPlanCache @@ -18,18 +19,32 @@ import Hasura.RQL.Types import Hasura.Session import qualified Hasura.Cache.Bounded as Cache -import qualified Hasura.GraphQL.Execute.LiveQuery as LQ -import qualified Hasura.GraphQL.Execute.Query as EQ -import qualified Hasura.GraphQL.Resolve as R +-- import qualified Hasura.GraphQL.Execute.LiveQuery as LQ +-- import qualified Hasura.GraphQL.Execute.Query as EQ + +import qualified Hasura.GraphQL.Execute.Types as ET import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH +{- Note [Temporarily disabling query plan caching] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Caching the incoming queries for re-usability is *temporarily* disabled. +This is being done as part of rewriting GraphQL schema generation and +execution (See https://github.com/hasura/graphql-engine/pull/4111) +until we figure out if we need query plan caching. + +The code related to query caching in the GraphQL query execution code path +is commented out, with references to this note. The relevant variables are +commented inline (e.g. {- planCache -}) to help authors re-enable +the query caching feature (if needed). +-} + data PlanId = PlanId { _piSchemaCacheVersion :: !SchemaCacheVer , _piRole :: !RoleName , _piOperationName :: !(Maybe GH.OperationName) , _piQuery :: !GH.GQLQueryText - , _piQueryType :: !EQ.GraphQLQueryType + , _piQueryType :: !ET.GraphQLQueryType } deriving (Show, Eq, Ord, Generic) instance Hashable PlanId @@ -47,27 +62,33 @@ instance J.ToJSON PlanId where newtype PlanCache = PlanCache {_unPlanCache :: Cache.BoundedCache PlanId ReusablePlan} -data ReusablePlan - = RPQuery !EQ.ReusableQueryPlan ![R.QueryRootFldUnresolved] - | RPSubs !LQ.ReusableLiveQueryPlan +data ReusablePlan = ReusablePlan -instance J.ToJSON ReusablePlan where - toJSON = \case - RPQuery queryPlan _ -> J.toJSON queryPlan - RPSubs subsPlan -> J.toJSON subsPlan +-- See Note [Temporarily disabling query plan caching] +-- data ReusablePlan +-- = RPQuery !EQ.ReusableQueryPlan +-- | RPSubs !LQ.ReusableLiveQueryPlan + +-- instance J.ToJSON ReusablePlan where +-- toJSON = \case +-- RPQuery queryPlan -> J.toJSON queryPlan +-- RPSubs subsPlan -> J.toJSON subsPlan newtype PlanCacheOptions = PlanCacheOptions { unPlanCacheSize :: Cache.CacheSize } deriving (Show, Eq) $(J.deriveJSON (J.aesonDrop 2 J.snakeCase) ''PlanCacheOptions) +mkPlanCacheOptions :: Cache.CacheSize -> PlanCacheOptions +mkPlanCacheOptions = PlanCacheOptions + initPlanCache :: PlanCacheOptions -> IO PlanCache initPlanCache options = PlanCache <$> Cache.initialise (unPlanCacheSize options) getPlan :: SchemaCacheVer -> RoleName -> Maybe GH.OperationName -> GH.GQLQueryText - -> EQ.GraphQLQueryType -> PlanCache -> IO (Maybe ReusablePlan) + -> ET.GraphQLQueryType -> PlanCache -> IO (Maybe ReusablePlan) getPlan schemaVer rn opNameM q queryType (PlanCache planCache) = Cache.lookup planId planCache where @@ -75,22 +96,28 @@ getPlan schemaVer rn opNameM q queryType (PlanCache planCache) = addPlan :: SchemaCacheVer -> RoleName -> Maybe GH.OperationName -> GH.GQLQueryText - -> ReusablePlan -> EQ.GraphQLQueryType -> PlanCache -> IO () + -> ReusablePlan -> ET.GraphQLQueryType -> PlanCache -> IO () addPlan schemaVer rn opNameM q queryPlan queryType (PlanCache planCache) = Cache.insert planId queryPlan planCache where planId = PlanId schemaVer rn opNameM q queryType -clearPlanCache :: PlanCache -> IO () -clearPlanCache (PlanCache planCache) = - Cache.clear planCache +-- See Note [Temporarily disabling query 
plan caching] +-- clearPlanCache :: PlanCache -> IO () +clearPlanCache :: IO () +clearPlanCache {- (PlanCache planCache) -} = + pure () + -- Cache.clear planCache -dumpPlanCache :: PlanCache -> IO J.Value -dumpPlanCache (PlanCache cache) = - J.toJSON . map (map dumpEntry) <$> Cache.getEntries cache - where - dumpEntry (planId, plan) = - J.object - [ "id" J..= planId - , "plan" J..= plan - ] +-- See Note [Temporarily disabling query plan caching] +-- dumpPlanCache :: PlanCache -> IO J.Value +dumpPlanCache :: IO J.Value +dumpPlanCache {- (PlanCache cache) -} = + pure $ J.String "Plan cache is temporarily disabled" + -- J.toJSON . map (map dumpEntry) <$> Cache.getEntries cache + -- where + -- dumpEntry (planId, plan) = + -- J.object + -- [ "id" J..= planId + -- , "plan" J..= plan + -- ] diff --git a/server/src-lib/Hasura/GraphQL/Execute/Prepare.hs b/server/src-lib/Hasura/GraphQL/Execute/Prepare.hs new file mode 100644 index 00000000000..18459fdb4cd --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Prepare.hs @@ -0,0 +1,187 @@ +module Hasura.GraphQL.Execute.Prepare + ( PlanVariables + , PrepArgMap + , PlanningSt(..) + , RemoteCall + , ExecutionPlan + , ExecutionStep(..) + , initPlanningSt + , runPlan + , prepareWithPlan + , prepareWithoutPlan + , validateSessionVariables + , withUserVars + , buildTypedOperation + ) where + + +import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set +import qualified Data.IntMap as IntMap +import qualified Data.Text as T +import qualified Database.PG.Query as Q +import qualified Language.GraphQL.Draft.Syntax as G + +import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH +import qualified Hasura.SQL.DML as S + +import Hasura.GraphQL.Parser.Column +import Hasura.GraphQL.Parser.Schema +import Hasura.RQL.DML.Internal (currentSession) +import Hasura.RQL.Types +import Hasura.Session +import Hasura.SQL.Types +import Hasura.SQL.Value + + +type PlanVariables = Map.HashMap G.Name Int + +-- | The value is (Q.PrepArg, PGScalarValue) because we want to log the human-readable value of the +-- prepared argument and not the binary encoding in PG format +type PrepArgMap = IntMap.IntMap (Q.PrepArg, PGScalarValue) + +-- | Full execution plan to process one GraphQL query. Once we work on +-- heterogeneous execution this will contain a mixture of things to run on the +-- database and things to run on remote schemas. +type ExecutionPlan db remote raw = ExecutionStep db remote raw + +type RemoteCall = (RemoteSchemaInfo, G.TypedOperationDefinition G.NoFragments G.Name, Maybe GH.VariableValues) + +-- | One execution step in processing a GraphQL query (e.g. one root field). +-- Polymorphic to allow the SQL to be generated in stages.
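(The ExecutionStep type declared next is consumed by pattern matching: each constructor selects a different backend. Below is a schematic consumer, assuming only the declaration that follows; executeStep and its handler arguments are invented for illustration, and the real consumers live elsewhere in the execution pipeline.)

    {-# LANGUAGE LambdaCase #-}

    -- Dispatch one step to the appropriate backend. Since ExecutionPlan is
    -- currently a single step, running a whole plan is running one of these.
    executeStep
      :: Monad m
      => (db -> m r)     -- run a database transaction
      -> (remote -> m r) -- call a remote schema
      -> (raw -> m r)    -- emit a constant JSON value
      -> ExecutionStep db remote raw
      -> m r
    executeStep onDB onRemote onRaw = \case
      ExecStepDB db         -> onDB db
      ExecStepRemote remote -> onRemote remote
      ExecStepRaw raw       -> onRaw raw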
+data ExecutionStep db remote raw + = ExecStepDB db + -- ^ A query to execute against the database + | ExecStepRemote remote -- !RemoteSchemaInfo !(G.Selection G.NoFragments G.Name) + -- ^ A query to execute against a remote schema + | ExecStepRaw raw + -- ^ Output a plain JSON object + +data PlanningSt + = PlanningSt + { _psArgNumber :: !Int + , _psVariables :: !PlanVariables + , _psPrepped :: !PrepArgMap + , _psSessionVariables :: !(Set.HashSet SessionVariable) + } + +initPlanningSt :: PlanningSt +initPlanningSt = + PlanningSt 2 Map.empty IntMap.empty Set.empty + +runPlan :: StateT PlanningSt m a -> m (a, PlanningSt) +runPlan = flip runStateT initPlanningSt + +prepareWithPlan :: (MonadState PlanningSt m) => UnpreparedValue -> m S.SQLExp +prepareWithPlan = \case + UVParameter PGColumnValue{ pcvValue = colVal } varInfoM -> do + argNum <- case fmap getName varInfoM of + Just var -> getVarArgNum var + Nothing -> getNextArgNum + addPrepArg argNum (toBinaryValue colVal, pstValue colVal) + return $ toPrepParam argNum (pstType colVal) + + UVSessionVar ty sessVar -> do + sessVarVal <- retrieveAndFlagSessionVariableValue insertSessionVariable sessVar currentSessionExp + pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of + PGTypeScalar colTy -> withConstructorFn colTy sessVarVal + PGTypeArray _ -> sessVarVal + + UVLiteral sqlExp -> pure sqlExp + UVSession -> pure currentSessionExp + where + currentSessionExp = S.SEPrep 1 + insertSessionVariable sessVar plan = + plan { _psSessionVariables = Set.insert sessVar $ _psSessionVariables plan } + +prepareWithoutPlan :: (MonadState (Set.HashSet SessionVariable) m) => UnpreparedValue -> m S.SQLExp +prepareWithoutPlan = \case + UVParameter pgValue _ -> pure $ toTxtValue $ pcvValue pgValue + UVLiteral sqlExp -> pure sqlExp + UVSession -> pure currentSession + UVSessionVar ty sessVar -> do + sessVarVal <- retrieveAndFlagSessionVariableValue Set.insert sessVar currentSession + -- TODO: this piece of code appears at least three times: twice here + -- and once in RQL.DML.Internal. Some de-duplication is in order. + pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of + PGTypeScalar colTy -> withConstructorFn colTy sessVarVal + PGTypeArray _ -> sessVarVal + +retrieveAndFlagSessionVariableValue + :: (MonadState s m) + => (SessionVariable -> s -> s) + -> SessionVariable + -> S.SQLExp + -> m S.SQLExp +retrieveAndFlagSessionVariableValue updateState sessVar currentSessionExp = do + modify $ updateState sessVar + pure $ S.SEOpApp (S.SQLOp "->>") + [currentSessionExp, S.SELit $ sessionVariableToText sessVar] + +withUserVars :: SessionVariables -> [(Q.PrepArg, PGScalarValue)] -> [(Q.PrepArg, PGScalarValue)] +withUserVars usrVars list = + let usrVarsAsPgScalar = PGValJSON $ Q.JSON $ J.toJSON usrVars + prepArg = Q.toPrepVal (Q.AltJ usrVars) + in (prepArg, usrVarsAsPgScalar):list + +validateSessionVariables :: MonadError QErr m => Set.HashSet SessionVariable -> SessionVariables -> m () +validateSessionVariables requiredVariables sessionVariables = do + let missingSessionVariables = requiredVariables `Set.difference` getSessionVariablesSet sessionVariables + unless (null missingSessionVariables) do + throw400 NotFound $ "missing session variables: " <> T.intercalate ", " (dquote . 
sessionVariableToText <$> toList missingSessionVariables) + +getVarArgNum :: (MonadState PlanningSt m) => G.Name -> m Int +getVarArgNum var = do + PlanningSt curArgNum vars prepped sessionVariables <- get + case Map.lookup var vars of + Just argNum -> pure argNum + Nothing -> do + put $ PlanningSt (curArgNum + 1) (Map.insert var curArgNum vars) prepped sessionVariables + pure curArgNum + +addPrepArg + :: (MonadState PlanningSt m) + => Int -> (Q.PrepArg, PGScalarValue) -> m () +addPrepArg argNum arg = do + PlanningSt curArgNum vars prepped sessionVariables <- get + put $ PlanningSt curArgNum vars (IntMap.insert argNum arg prepped) sessionVariables + +getNextArgNum :: (MonadState PlanningSt m) => m Int +getNextArgNum = do + PlanningSt curArgNum vars prepped sessionVariables <- get + put $ PlanningSt (curArgNum + 1) vars prepped sessionVariables + return curArgNum + +unresolveVariables + :: forall fragments + . Functor fragments + => G.SelectionSet fragments Variable + -> G.SelectionSet fragments G.Name +unresolveVariables = + fmap (fmap (getName . vInfo)) + +collectVariables + :: forall fragments var + . (Foldable fragments, Hashable var, Eq var) + => G.SelectionSet fragments var + -> Set.HashSet var +collectVariables = + Set.unions . fmap (foldMap Set.singleton) + +buildTypedOperation + :: forall frag + . (Functor frag, Foldable frag) + => G.OperationType + -> [G.VariableDefinition] + -> G.SelectionSet frag Variable + -> Maybe GH.VariableValues + -> (G.TypedOperationDefinition frag G.Name, Maybe GH.VariableValues) +buildTypedOperation tp varDefs selSet varValsM = + let unresolvedSelSet = unresolveVariables selSet + requiredVars = collectVariables unresolvedSelSet + restrictedDefs = filter (\varDef -> G._vdName varDef `Set.member` requiredVars) varDefs + restrictedValsM = flip Map.intersection (Set.toMap requiredVars) <$> varValsM + in (G.TypedOperationDefinition tp Nothing restrictedDefs [] unresolvedSelSet, restrictedValsM) diff --git a/server/src-lib/Hasura/GraphQL/Execute/Query.hs b/server/src-lib/Hasura/GraphQL/Execute/Query.hs index e0d1391795d..c5617a7cdb3 100644 --- a/server/src-lib/Hasura/GraphQL/Execute/Query.hs +++ b/server/src-lib/Hasura/GraphQL/Execute/Query.hs @@ -1,10 +1,12 @@ module Hasura.GraphQL.Execute.Query ( convertQuerySelSet - , queryOpFromPlan - , ReusableQueryPlan + -- , queryOpFromPlan + -- , ReusableQueryPlan , GeneratedSqlMap , PreparedSql(..) - , GraphQLQueryType(..) 
+ , traverseQueryRootField -- for live query planning + , irToRootFieldPlan + , parseGraphQLQuery ) where import qualified Data.Aeson as J @@ -12,43 +14,38 @@ import qualified Data.ByteString as B import qualified Data.ByteString.Lazy as LBS import qualified Data.Environment as Env import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap import qualified Data.IntMap as IntMap +import qualified Data.Sequence as Seq +import qualified Data.Sequence.NonEmpty as NESeq import qualified Data.TByteString as TBS import qualified Database.PG.Query as Q import qualified Language.GraphQL.Draft.Syntax as G import qualified Network.HTTP.Client as HTTP -import qualified Network.HTTP.Types as N +import qualified Network.HTTP.Types as HTTP -import Control.Lens ((^?)) -import Data.Has - -import qualified Hasura.GraphQL.Resolve as R import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH -import qualified Hasura.GraphQL.Validate as GV -import qualified Hasura.GraphQL.Validate.SelectionSet as V import qualified Hasura.Logging as L +import Hasura.Server.Version (HasVersion) import qualified Hasura.SQL.DML as S import qualified Hasura.Tracing as Tracing +import Hasura.Db import Hasura.EncJSON import Hasura.GraphQL.Context -import Hasura.GraphQL.Resolve.Action -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Validate.Types +import Hasura.GraphQL.Execute.Action +import Hasura.GraphQL.Execute.Prepare +import Hasura.GraphQL.Execute.Resolve +import Hasura.GraphQL.Parser import Hasura.Prelude import Hasura.RQL.DML.RemoteJoin -import Hasura.RQL.DML.Select +import Hasura.RQL.DML.Select (asSingleRowJsonResp) import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) import Hasura.Session -import Hasura.SQL.Types +-- import Hasura.SQL.Types import Hasura.SQL.Value -type PlanVariables = Map.HashMap G.Variable Int - --- | The value is (Q.PrepArg, PGScalarValue) because we want to log the human-readable value of the --- prepared argument and not the binary encoding in PG format -type PrepArgMap = IntMap.IntMap (Q.PrepArg, PGScalarValue) +import qualified Hasura.RQL.DML.Select as DS data PGPlan = PGPlan @@ -68,43 +65,57 @@ instance J.ToJSON PGPlan where data RootFieldPlan = RFPRaw !B.ByteString | RFPPostgres !PGPlan - -fldPlanFromJ :: (J.ToJSON a) => a -> RootFieldPlan -fldPlanFromJ = RFPRaw . LBS.toStrict . 
J.encode + | RFPActionQuery !ActionExecuteTx instance J.ToJSON RootFieldPlan where toJSON = \case RFPRaw encJson -> J.toJSON $ TBS.fromBS encJson RFPPostgres pgPlan -> J.toJSON pgPlan + RFPActionQuery _ -> J.String "Action Execution Tx" -type FieldPlans = [(G.Alias, RootFieldPlan)] +type FieldPlans = [(G.Name, RootFieldPlan)] -data ReusableQueryPlan - = ReusableQueryPlan - { _rqpVariableTypes :: !ReusableVariableTypes - , _rqpFldPlans :: !FieldPlans - } +data ActionQueryPlan + = AQPAsyncQuery !DS.AnnSimpleSel -- ^ Cacheable plan + | AQPQuery !ActionExecuteTx -- ^ Non cacheable transaction -instance J.ToJSON ReusableQueryPlan where - toJSON (ReusableQueryPlan varTypes fldPlans) = - J.object [ "variables" J..= varTypes - , "field_plans" J..= fldPlans - ] +actionQueryToRootFieldPlan + :: PlanVariables -> PrepArgMap -> ActionQueryPlan -> RootFieldPlan +actionQueryToRootFieldPlan vars prepped = \case + AQPAsyncQuery s -> RFPPostgres $ + PGPlan (DS.selectQuerySQL DS.JASSingleObject s) vars prepped Nothing + AQPQuery tx -> RFPActionQuery tx -withPlan - :: (MonadError QErr m) - => SessionVariables -> PGPlan -> ReusableVariableValues -> m PreparedSql -withPlan usrVars (PGPlan q reqVars prepMap rq) annVars = do - prepMap' <- foldM getVar prepMap (Map.toList reqVars) - let args = withSessionVariables usrVars $ IntMap.elems prepMap' - return $ PreparedSql q args rq - where - getVar accum (var, prepNo) = do - let varName = G.unName $ G.unVariable var - colVal <- onNothing (Map.lookup var annVars) $ - throw500 $ "missing variable in annVars : " <> varName - let prepVal = (toBinaryValue colVal, pstValue colVal) - return $ IntMap.insert prepNo prepVal accum +-- See Note [Temporarily disabling query plan caching] +-- data ReusableVariableTypes +-- data ReusableVariableValues + +-- data ReusableQueryPlan +-- = ReusableQueryPlan +-- { _rqpVariableTypes :: !ReusableVariableTypes +-- , _rqpFldPlans :: !FieldPlans +-- } + +-- instance J.ToJSON ReusableQueryPlan where +-- toJSON (ReusableQueryPlan varTypes fldPlans) = +-- J.object [ "variables" J..= () -- varTypes +-- , "field_plans" J..= fldPlans +-- ] + +-- withPlan +-- :: (MonadError QErr m) +-- => SessionVariables -> PGPlan -> HashMap G.Name (WithScalarType PGScalarValue) -> m PreparedSql +-- withPlan usrVars (PGPlan q reqVars prepMap remoteJoins) annVars = do +-- prepMap' <- foldM getVar prepMap (Map.toList reqVars) +-- let args = withUserVars usrVars $ IntMap.elems prepMap' +-- return $ PreparedSql q args remoteJoins +-- where +-- getVar accum (var, prepNo) = do +-- let varName = G.unName var +-- colVal <- onNothing (Map.lookup var annVars) $ +-- throw500 $ "missing variable in annVars : " <> varName +-- let prepVal = (toBinaryValue colVal, pstValue colVal) +-- return $ IntMap.insert prepNo prepVal accum -- turn the current plan into a transaction mkCurPlanTx @@ -117,7 +128,7 @@ mkCurPlanTx ) => Env.Environment -> HTTP.Manager - -> [N.Header] + -> [HTTP.Header] -> UserInfo -> FieldPlans -> m (tx EncJSON, GeneratedSqlMap) @@ -126,85 +137,72 @@ mkCurPlanTx env manager reqHdrs userInfo fldPlans = do resolved <- forM fldPlans $ \(alias, fldPlan) -> do fldResp <- case fldPlan of RFPRaw resp -> return $ RRRaw resp - RFPPostgres (PGPlan q _ prepMap rq) -> do - let args = withSessionVariables (_uiSession userInfo) $ IntMap.elems prepMap - return $ RRSql $ PreparedSql q args rq + RFPPostgres (PGPlan q _ prepMap remoteJoins) -> do + let args = withUserVars (_uiSession userInfo) $ IntMap.elems prepMap + return $ RRSql $ PreparedSql q args remoteJoins + RFPActionQuery 
tx -> pure $ RRActionQuery tx return (alias, fldResp) (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved) -withSessionVariables :: SessionVariables -> [(Q.PrepArg, PGScalarValue)] -> [(Q.PrepArg, PGScalarValue)] -withSessionVariables usrVars list = - let usrVarsAsPgScalar = PGValJSON $ Q.JSON $ J.toJSON usrVars - prepArg = Q.toPrepVal (Q.AltJ usrVars) - in (prepArg, usrVarsAsPgScalar):list - -data PlanningSt - = PlanningSt - { _psArgNumber :: !Int - , _psVariables :: !PlanVariables - , _psPrepped :: !PrepArgMap - } - -initPlanningSt :: PlanningSt -initPlanningSt = - PlanningSt 2 Map.empty IntMap.empty - -getVarArgNum :: (MonadState PlanningSt m) => G.Variable -> m Int -getVarArgNum var = do - PlanningSt curArgNum vars prepped <- get - case Map.lookup var vars of - Just argNum -> pure argNum - Nothing -> do - put $ PlanningSt (curArgNum + 1) (Map.insert var curArgNum vars) prepped - pure curArgNum - -addPrepArg - :: (MonadState PlanningSt m) - => Int -> (Q.PrepArg, PGScalarValue) -> m () -addPrepArg argNum arg = do - PlanningSt curArgNum vars prepped <- get - put $ PlanningSt curArgNum vars $ IntMap.insert argNum arg prepped - -getNextArgNum :: (MonadState PlanningSt m) => m Int -getNextArgNum = do - PlanningSt curArgNum vars prepped <- get - put $ PlanningSt (curArgNum + 1) vars prepped - return curArgNum - -prepareWithPlan :: (MonadState PlanningSt m) => UnresolvedVal -> m S.SQLExp -prepareWithPlan = \case - R.UVPG annPGVal -> do - let AnnPGVal varM _ colVal = annPGVal - argNum <- case varM of - Just var -> getVarArgNum var - Nothing -> getNextArgNum - addPrepArg argNum (toBinaryValue colVal, pstValue colVal) - return $ toPrepParam argNum (pstType colVal) - - R.UVSessVar ty sessVar -> do - let sessVarVal = - S.SEOpApp (S.SQLOp "->>") - [currentSession, S.SELit $ sessionVariableToText sessVar] - return $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of - PGTypeScalar colTy -> withConstructorFn colTy sessVarVal - PGTypeArray _ -> sessVarVal - - R.UVSQL sqlExp -> pure sqlExp - R.UVSession -> pure currentSession +-- convert a query from an intermediate representation to... another +irToRootFieldPlan + :: PlanVariables + -> PrepArgMap + -> QueryDB S.SQLExp -> PGPlan +irToRootFieldPlan vars prepped = \case + QDBSimple s -> mkPGPlan (DS.selectQuerySQL DS.JASMultipleRows) s + QDBPrimaryKey s -> mkPGPlan (DS.selectQuerySQL DS.JASSingleObject) s + QDBAggregation s -> + let (annAggSel, aggRemoteJoins) = getRemoteJoinsAggregateSelect s + in PGPlan (DS.selectAggregateQuerySQL annAggSel) vars prepped aggRemoteJoins + QDBConnection s -> + let (connSel, connRemoteJoins) = getRemoteJoinsConnectionSelect s + in PGPlan (DS.connectionSelectQuerySQL connSel) vars prepped connRemoteJoins where - currentSession = S.SEPrep 1 + mkPGPlan f simpleSel = + let (simpleSel',remoteJoins) = getRemoteJoins simpleSel + in PGPlan (f simpleSel') vars prepped remoteJoins + +traverseQueryRootField + :: forall f a b c d h + . 
Applicative f + => (a -> f b) + -> RootField (QueryDB a) c h d + -> f (RootField (QueryDB b) c h d) +traverseQueryRootField f = + traverseDB f' + where + f' :: QueryDB a -> f (QueryDB b) + f' = \case + QDBSimple s -> QDBSimple <$> DS.traverseAnnSimpleSelect f s + QDBPrimaryKey s -> QDBPrimaryKey <$> DS.traverseAnnSimpleSelect f s + QDBAggregation s -> QDBAggregation <$> DS.traverseAnnAggregateSelect f s + QDBConnection s -> QDBConnection <$> DS.traverseConnectionSelect f s + +parseGraphQLQuery + :: MonadError QErr m + => GQLContext + -> [G.VariableDefinition] + -> Maybe (HashMap G.Name J.Value) + -> G.SelectionSet G.NoFragments G.Name + -> m ( InsOrdHashMap G.Name (QueryRootField UnpreparedValue) + , QueryReusability + ) +parseGraphQLQuery gqlContext varDefs varValsM fields = + resolveVariables varDefs (fromMaybe Map.empty varValsM) fields + >>= (gqlQueryParser gqlContext >>> (`onLeft` reportParseErrors)) + where + reportParseErrors errs = case NESeq.head errs of + -- TODO: Our error reporting machinery doesn’t currently support reporting + -- multiple errors at once, so we’re throwing away all but the first one + -- here. It would be nice to report all of them! + ParseError{ pePath, peMessage, peCode } -> + throwError (err400 peCode peMessage){ qePath = pePath } convertQuerySelSet - :: ( MonadError QErr m - , MonadReader r m - , Has TypeMap r - , Has QueryCtxMap r - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has UserInfo r - , Has (L.Logger L.Hasura) r + :: forall m tx . + ( MonadError QErr m , HasVersion , MonadIO m , Tracing.MonadTrace m @@ -213,59 +211,111 @@ convertQuerySelSet , Tracing.MonadTrace tx ) => Env.Environment - -> HTTP.Manager - -> [N.Header] - -> QueryReusability - -> V.ObjectSelectionSet - -> QueryActionExecuter - -> m (tx EncJSON, Maybe ReusableQueryPlan, GeneratedSqlMap, [R.QueryRootFldUnresolved]) -convertQuerySelSet env manager reqHdrs initialReusability selSet actionRunner = do - userInfo <- asks getter - (fldPlansAndAst, finalReusability) <- runReusabilityTWith initialReusability $ do - result <- V.traverseObjectSelectionSet selSet $ \fld -> do - case V._fName fld of - "__type" -> ((, Nothing) . fldPlanFromJ) <$> R.typeR fld - "__schema" -> ((, Nothing) . fldPlanFromJ) <$> R.schemaR fld - "__typename" -> pure (fldPlanFromJ queryRootNamedType, Nothing) - _ -> do - unresolvedAst <- R.queryFldToPGAST env fld actionRunner - (q, PlanningSt _ vars prepped) <- flip runStateT initPlanningSt $ - R.traverseQueryRootFldAST prepareWithPlan unresolvedAst - let (query, remoteJoins) = R.toPGQuery q - pure $ (RFPPostgres $ PGPlan query vars prepped remoteJoins, Just unresolvedAst) - return $ map (\(alias, (fldPlan, ast)) -> ((G.Alias $ G.Name alias, fldPlan), ast)) result - - let varTypes = finalReusability ^? 
_Reusable - fldPlans = map fst fldPlansAndAst - reusablePlan = ReusableQueryPlan <$> varTypes <*> pure fldPlans - (tx, sql) <- mkCurPlanTx env manager reqHdrs userInfo fldPlans - pure (tx, reusablePlan, sql, mapMaybe snd fldPlansAndAst) - --- use the existing plan and new variables to create a pg query -queryOpFromPlan - :: ( HasVersion - , MonadError QErr m - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> HTTP.Manager - -> [N.Header] + -> L.Logger L.Hasura + -> GQLContext -> UserInfo + -> HTTP.Manager + -> HTTP.RequestHeaders + -> G.SelectionSet G.NoFragments G.Name + -> [G.VariableDefinition] -> Maybe GH.VariableValues - -> ReusableQueryPlan - -> m (tx EncJSON, GeneratedSqlMap) -queryOpFromPlan env manager reqHdrs userInfo varValsM (ReusableQueryPlan varTypes fldPlans) = do - validatedVars <- GV.validateVariablesForReuse varTypes varValsM - -- generate the SQL and prepared vars or the bytestring - resolved <- forM fldPlans $ \(alias, fldPlan) -> - (alias,) <$> case fldPlan of - RFPRaw resp -> return $ RRRaw resp - RFPPostgres pgPlan -> RRSql <$> withPlan (_uiSession userInfo) pgPlan validatedVars + -> m ( ExecutionPlan (tx EncJSON, GeneratedSqlMap) RemoteCall (G.Name, J.Value) + -- , Maybe ReusableQueryPlan + , [QueryRootField UnpreparedValue] + ) +convertQuerySelSet env logger gqlContext userInfo manager reqHeaders fields varDefs varValsM = do + -- Parse the GraphQL query into the RQL AST + (unpreparedQueries, _reusability) <- parseGraphQLQuery gqlContext varDefs varValsM fields - (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved) + -- Transform the RQL AST into a prepared SQL query + queryPlans <- for unpreparedQueries \unpreparedQuery -> do + (preparedQuery, PlanningSt _ planVars planVals expectedVariables) + <- flip runStateT initPlanningSt + $ traverseQueryRootField prepareWithPlan unpreparedQuery + >>= traverseAction convertActionQuery + validateSessionVariables expectedVariables $ _uiSession userInfo + traverseDB (pure . irToRootFieldPlan planVars planVals) preparedQuery + >>= traverseAction (pure . actionQueryToRootFieldPlan planVars planVals) + + -- This monster makes sure that consecutive database operations get executed together + let dbPlans :: Seq.Seq (G.Name, RootFieldPlan) + remoteFields :: Seq.Seq (G.Name, RemoteField) + (dbPlans, remoteFields) = OMap.foldlWithKey' collectPlan (Seq.Empty, Seq.Empty) queryPlans + + collectPlan + :: (Seq.Seq (G.Name, RootFieldPlan), Seq.Seq (G.Name, RemoteField)) + -> G.Name + -> RootField PGPlan RemoteField RootFieldPlan J.Value + -> (Seq.Seq (G.Name, RootFieldPlan), Seq.Seq (G.Name, RemoteField)) + + collectPlan (seqDB, seqRemote) name (RFRemote r) = + (seqDB, seqRemote Seq.:|> (name, r)) + + collectPlan (seqDB, seqRemote) name (RFDB db) = + (seqDB Seq.:|> (name, RFPPostgres db), seqRemote) + + collectPlan (seqDB, seqRemote) name (RFAction rfp) = + (seqDB Seq.:|> (name, rfp), seqRemote) + + collectPlan (seqDB, seqRemote) name (RFRaw r) = + (seqDB Seq.:|> (name, RFPRaw $ LBS.toStrict $ J.encode r), seqRemote) + + + executionPlan <- case (dbPlans, remoteFields) of + (dbs, Seq.Empty) -> ExecStepDB <$> mkCurPlanTx env manager reqHeaders userInfo (toList dbs) + (Seq.Empty, remotes@(firstRemote Seq.:<| _)) -> do + let (remoteOperation, _) = + buildTypedOperation + G.OperationTypeQuery + varDefs + (map (G.SelectionField . snd . 
snd) $ toList remotes) + varValsM + if all (\remote' -> fst (snd firstRemote) == fst (snd remote')) remotes + then return $ ExecStepRemote (fst (snd firstRemote), remoteOperation, varValsM) + else throw400 NotSupported "Mixed remote schemas are not supported" + _ -> throw400 NotSupported "Heterogeneous execution of database and remote schemas not supported" + + let asts :: [QueryRootField UnpreparedValue] + asts = OMap.elems unpreparedQueries + pure (executionPlan,asts) -- See Note [Temporarily disabling query plan caching] + where + usrVars = _uiSession userInfo + + convertActionQuery + :: ActionQuery UnpreparedValue -> StateT PlanningSt m ActionQueryPlan + convertActionQuery = \case + AQQuery s -> lift $ do + result <- resolveActionExecution env logger userInfo s $ ActionExecContext manager reqHeaders usrVars + pure $ AQPQuery $ _aerTransaction result + AQAsync s -> AQPAsyncQuery <$> + DS.traverseAnnSimpleSelect prepareWithPlan (resolveAsyncActionQuery userInfo s) + +-- See Note [Temporarily disabling query plan caching] +-- use the existing plan and new variables to create a pg query +-- queryOpFromPlan +-- :: ( HasVersion +-- , MonadError QErr m +-- , Tracing.MonadTrace m +-- , MonadIO tx +-- , MonadTx tx +-- , Tracing.MonadTrace tx +-- ) +-- => Env.Environment +-- -> HTTP.Manager +-- -> [HTTP.Header] +-- -> UserInfo +-- -> Maybe GH.VariableValues +-- -> ReusableQueryPlan +-- -> m (tx EncJSON, GeneratedSqlMap) +-- queryOpFromPlan env manager reqHdrs userInfo varValsM (ReusableQueryPlan varTypes fldPlans) = do +-- validatedVars <- _validateVariablesForReuse varTypes varValsM +-- -- generate the SQL and prepared vars or the bytestring +-- resolved <- forM fldPlans $ \(alias, fldPlan) -> +-- (alias,) <$> case fldPlan of +-- RFPRaw resp -> return $ RRRaw resp +-- RFPPostgres pgPlan -> RRSql <$> withPlan (_uiSession userInfo) pgPlan validatedVars + +-- (,) <$> mkLazyRespTx env manager reqHdrs userInfo resolved <*> pure (mkGeneratedSqlMap resolved) data PreparedSql = PreparedSql @@ -291,11 +341,12 @@ instance J.ToJSON PreparedSql where data ResolvedQuery = RRRaw !B.ByteString | RRSql !PreparedSql + | RRActionQuery !ActionExecuteTx -- | The computed SQL with alias which can be logged. Nothing here represents no -- SQL for cases like introspection responses. Tuple of alias to a (maybe) -- prepared statement -type GeneratedSqlMap = [(G.Alias, Maybe PreparedSql)] +type GeneratedSqlMap = [(G.Name, Maybe PreparedSql)] mkLazyRespTx :: ( HasVersion @@ -306,38 +357,28 @@ mkLazyRespTx ) => Env.Environment -> HTTP.Manager - -> [N.Header] + -> [HTTP.Header] -> UserInfo - -> [(G.Alias, ResolvedQuery)] + -> [(G.Name, ResolvedQuery)] -> m (tx EncJSON) -mkLazyRespTx env manager reqHdrs userInfo resolved = do +mkLazyRespTx env manager reqHdrs userInfo resolved = pure $ fmap encJFromAssocList $ forM resolved $ \(alias, node) -> do resp <- case node of - RRRaw bs -> return $ encJFromBS bs - RRSql (PreparedSql q args maybeRemoteJoins) -> do + RRRaw bs -> return $ encJFromBS bs + RRSql (PreparedSql q args maybeRemoteJoins) -> do let prepArgs = map fst args case maybeRemoteJoins of Nothing -> Tracing.trace "Postgres" . 
liftTx $ asSingleRowJsonResp q prepArgs Just remoteJoins -> executeQueryWithRemoteJoins env manager reqHdrs userInfo q prepArgs remoteJoins - return (G.unName $ G.unAlias alias, resp) + RRActionQuery actionTx -> actionTx + return (G.unName alias, resp) -mkGeneratedSqlMap :: [(G.Alias, ResolvedQuery)] -> GeneratedSqlMap +mkGeneratedSqlMap :: [(G.Name, ResolvedQuery)] -> GeneratedSqlMap mkGeneratedSqlMap resolved = flip map resolved $ \(alias, node) -> let res = case node of - RRRaw _ -> Nothing - RRSql ps -> Just ps + RRRaw _ -> Nothing + RRSql ps -> Just ps + RRActionQuery _ -> Nothing in (alias, res) - --- The GraphQL Query type -data GraphQLQueryType - = QueryHasura - | QueryRelay - deriving (Show, Eq, Ord, Generic) -instance Hashable GraphQLQueryType - -instance J.ToJSON GraphQLQueryType where - toJSON = \case - QueryHasura -> "hasura" - QueryRelay -> "relay" diff --git a/server/src-lib/Hasura/GraphQL/Execute/Resolve.hs b/server/src-lib/Hasura/GraphQL/Execute/Resolve.hs new file mode 100644 index 00000000000..9ddf7908fac --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Resolve.hs @@ -0,0 +1,84 @@ +-- | Implements /variable resolution/ for GraphQL queries, which annotates the +-- use site of each GraphQL variable with its value. +module Hasura.GraphQL.Execute.Resolve + ( resolveVariables + ) where + +import Hasura.Prelude + +import qualified Data.HashMap.Strict.Extended as Map +import qualified Data.HashSet as HS +import qualified Data.List as L +import qualified Data.Text as T +import qualified Language.GraphQL.Draft.Syntax as G + +import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH + +import Hasura.GraphQL.Parser.Schema +import Hasura.RQL.Types.Error +import Hasura.SQL.Types + +resolveVariables + :: forall m fragments + . (MonadError QErr m, Traversable fragments) + => [G.VariableDefinition] + -> GH.VariableValues + -> G.SelectionSet fragments G.Name + -> m (G.SelectionSet fragments Variable) +resolveVariables definitions jsonValues selSet = do + variablesByName <- Map.groupOnNE getName <$> traverse buildVariable definitions + uniqueVariables <- flip Map.traverseWithKey variablesByName + \variableName variableDefinitions -> + case variableDefinitions of + a :| [] -> return a + _ -> throw400 ParseFailed + $ "multiple definitions for variable " <>> variableName + (selSet', usedVariables) <- flip runStateT mempty $ + traverse (traverse (resolveVariable uniqueVariables)) selSet + let variablesByNameSet = HS.fromList . Map.keys $ variablesByName + jsonVariableNames = HS.fromList $ Map.keys jsonValues + -- At the time of writing, this check is disabled using + -- a local binding because the master branch doesn't implement this + -- check. + -- TODO: Do this check using a feature flag + isVariableValidationEnabled = False + + when (isVariableValidationEnabled && usedVariables /= variablesByNameSet) $ + throw400 ValidationFailed $ + "following variable(s) have been defined, but have not been used in the query - " + <> T.concat (L.intersperse ", " $ + map G.unName $ HS.toList $ + HS.difference variablesByNameSet usedVariables) + + -- There may be variables which have a default value and may not be + -- included in the variables JSON Map. 
So we only check that, if a + -- variable is included in the JSON Map, then it must be used in the + -- query + when (HS.difference jsonVariableNames usedVariables /= HS.empty) $ + throw400 ValidationFailed $ + "unexpected variables in variableValues: " + <> T.concat (L.intersperse ", " $ + map G.unName $ HS.toList $ + HS.difference jsonVariableNames usedVariables) + return selSet' + where + buildVariable :: G.VariableDefinition -> m Variable + buildVariable G.VariableDefinition{ G._vdName, G._vdType, G._vdDefaultValue } = do + let defaultValue = fromMaybe G.VNull _vdDefaultValue + value <- case Map.lookup _vdName jsonValues of + Just jsonValue -> pure $ JSONValue jsonValue + Nothing + | G.isNullable _vdType -> pure $ GraphQLValue $ absurd <$> defaultValue + | otherwise -> throw400 ValidationFailed $ + "expecting a value for non-nullable variable: " <>> _vdName + pure $! Variable + { vInfo = if G.isNullable _vdType + then VIOptional _vdName defaultValue + else VIRequired _vdName + , vType = _vdType + , vValue = value + } + resolveVariable :: HashMap G.Name Variable -> G.Name -> StateT (HS.HashSet G.Name) m Variable + resolveVariable variables name = case Map.lookup name variables of + Just variable -> modify (HS.insert name) >> pure variable + Nothing -> throw400 ValidationFailed $ "unbound variable " <>> name diff --git a/server/src-lib/Hasura/GraphQL/Execute/Types.hs b/server/src-lib/Hasura/GraphQL/Execute/Types.hs new file mode 100644 index 00000000000..f25c4b86c5c --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Execute/Types.hs @@ -0,0 +1,17 @@ +module Hasura.GraphQL.Execute.Types (GraphQLQueryType(..)) where + +import Hasura.Prelude + +import qualified Data.Aeson as J + +-- graphql-engine supports two GraphQL interfaces: one at v1/graphql, and a Relay one at v1beta1/relay +data GraphQLQueryType + = QueryHasura + | QueryRelay + deriving (Show, Eq, Ord, Generic) +instance Hashable GraphQLQueryType + +instance J.ToJSON GraphQLQueryType where + toJSON = \case + QueryHasura -> "hasura" + QueryRelay -> "relay" diff --git a/server/src-lib/Hasura/GraphQL/Explain.hs b/server/src-lib/Hasura/GraphQL/Explain.hs index 85b482d2fef..3f327331ddc 100644 --- a/server/src-lib/Hasura/GraphQL/Explain.hs +++ b/server/src-lib/Hasura/GraphQL/Explain.hs @@ -6,32 +6,29 @@ module Hasura.GraphQL.Explain import qualified Data.Aeson as J import qualified Data.Aeson.Casing as J import qualified Data.Aeson.TH as J -import qualified Data.Environment as Env import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap import qualified Database.PG.Query as Q import qualified Language.GraphQL.Draft.Syntax as G import Hasura.EncJSON import Hasura.GraphQL.Context -import Hasura.GraphQL.Resolve.Action -import Hasura.GraphQL.Validate.Types (evalReusabilityT, runReusabilityT) +import Hasura.GraphQL.Parser import Hasura.Prelude import Hasura.RQL.DML.Internal import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) import Hasura.Session import Hasura.SQL.Types import Hasura.SQL.Value import qualified Hasura.GraphQL.Execute as E +import qualified Hasura.GraphQL.Execute.Inline as E import qualified Hasura.GraphQL.Execute.LiveQuery as E -import qualified Hasura.GraphQL.Resolve as RS +import qualified Hasura.GraphQL.Execute.Query as E import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH -import qualified Hasura.GraphQL.Validate as GV -import qualified Hasura.GraphQL.Validate.SelectionSet as GV -import qualified Hasura.Logging as L +import qualified Hasura.RQL.DML.RemoteJoin 
as RR +import qualified Hasura.RQL.DML.Select as DS import qualified Hasura.SQL.DML as S -import qualified Hasura.Tracing as Tracing data GQLExplain = GQLExplain @@ -53,118 +50,96 @@ data FieldPlan $(J.deriveJSON (J.aesonDrop 3 J.camelCase) ''FieldPlan) -type Explain r m = - (ReaderT r (ExceptT QErr m)) - -runExplain +resolveUnpreparedValue :: (MonadError QErr m) - => r -> Explain r m a -> m a -runExplain ctx m = - either throwError return =<< runExceptT (runReaderT m ctx) + => UserInfo -> UnpreparedValue -> m S.SQLExp +resolveUnpreparedValue userInfo = \case + UVParameter pgValue _ -> pure $ toTxtValue $ pcvValue pgValue + UVLiteral sqlExp -> pure sqlExp + UVSession -> pure $ sessionInfoJsonExp $ _uiSession userInfo + UVSessionVar ty sessionVariable -> do + let maybeSessionVariableValue = + getSessionVariableValue sessionVariable (_uiSession userInfo) -resolveVal - :: (MonadError QErr m) - => UserInfo -> RS.UnresolvedVal -> m S.SQLExp -resolveVal userInfo = \case - RS.UVPG annPGVal -> - RS.txtConverter annPGVal - RS.UVSessVar ty sessVar -> do - sessVarVal <- S.SELit <$> getSessVarVal userInfo sessVar - return $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of - PGTypeScalar colTy -> withConstructorFn colTy sessVarVal - PGTypeArray _ -> sessVarVal - RS.UVSQL sqlExp -> return sqlExp - RS.UVSession -> pure $ sessionInfoJsonExp $ _uiSession userInfo + sessionVariableValue <- fmap S.SELit $ onNothing maybeSessionVariableValue $ + throw400 UnexpectedPayload $ "missing required session variable for role " + <> _uiRole userInfo <<> " : " <> sessionVariableToText sessionVariable -getSessVarVal - :: (MonadError QErr m) - => UserInfo -> SessionVariable -> m Text -getSessVarVal userInfo sessVar = - onNothing (getSessionVariableValue sessVar sessionVariables) $ - throw400 UnexpectedPayload $ - "missing required session variable for role " <> rn <<> - " : " <> sessionVariableToText sessVar - where - rn = _uiRole userInfo - sessionVariables = _uiSession userInfo + pure $ flip S.SETyAnn (S.mkTypeAnn ty) $ case ty of + PGTypeScalar colTy -> withConstructorFn colTy sessionVariableValue + PGTypeArray _ -> sessionVariableValue -explainField - :: (MonadError QErr m, MonadTx m, HasVersion, MonadIO m, Tracing.MonadTrace m) - => Env.Environment - -> L.Logger L.Hasura - -> UserInfo - -> GCtx - -> SQLGenCtx - -> QueryActionExecuter - -> GV.Field +-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't use it +-- here. We should evaluate whether it is actually needed.
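(Note how resolveUnpreparedValue above inlines both query parameters and session variables as SQL literals, whereas the normal execution path (prepareWithPlan in Execute.Prepare) emits references to prepared arguments; this keeps the SQL produced for explain self-contained. Below is a rough rendering of the two shapes for a session variable of a scalar column type; the values and helper names are invented for illustration.)

    {-# LANGUAGE OverloadedStrings #-}
    import Data.Text (Text)

    -- Explain path: the session variable's current value is inlined and cast,
    --   inlineSessionVar "integer" "42"  ==  "('42')::integer"
    inlineSessionVar :: Text -> Text -> Text
    inlineSessionVar ty val = "('" <> val <> "')::" <> ty

    -- Execution path: the whole session object is prepared argument $1 and
    -- the variable is looked up at run time,
    --   preparedSessionVar "integer" "x-hasura-user-id"
    --     ==  "($1 ->> 'x-hasura-user-id')::integer"
    preparedSessionVar :: Text -> Text -> Text
    preparedSessionVar ty var = "($1 ->> '" <> var <> "')::" <> ty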
+explainQueryField + :: (MonadError QErr m, MonadTx m) + => UserInfo + -> G.Name + -> QueryRootField UnpreparedValue -> m FieldPlan -explainField env logger userInfo gCtx sqlGenCtx actionExecuter fld = - case fName of - "__type" -> return $ FieldPlan fName Nothing Nothing - "__schema" -> return $ FieldPlan fName Nothing Nothing - "__typename" -> return $ FieldPlan fName Nothing Nothing - _ -> do - unresolvedAST <- - runExplain (logger, queryCtxMap, userInfo, fldMap, orderByCtx, sqlGenCtx) $ - evalReusabilityT $ RS.queryFldToPGAST env fld actionExecuter - resolvedAST <- RS.traverseQueryRootFldAST (resolveVal userInfo) unresolvedAST - let (query, remoteJoins) = RS.toPGQuery resolvedAST - txtSQL = Q.getQueryText query +explainQueryField userInfo fieldName rootField = do + resolvedRootField <- E.traverseQueryRootField (resolveUnpreparedValue userInfo) rootField + case resolvedRootField of + RFRemote _ -> throw400 InvalidParams "only hasura queries can be explained" + RFAction _ -> throw400 InvalidParams "query actions cannot be explained" + RFRaw _ -> pure $ FieldPlan fieldName Nothing Nothing + RFDB qDB -> do + let (querySQL, remoteJoins) = case qDB of + QDBSimple s -> first (DS.selectQuerySQL DS.JASMultipleRows) $ RR.getRemoteJoins s + QDBPrimaryKey s -> first (DS.selectQuerySQL DS.JASSingleObject) $ RR.getRemoteJoins s + QDBAggregation s -> first DS.selectAggregateQuerySQL $ RR.getRemoteJoinsAggregateSelect s + QDBConnection s -> first DS.connectionSelectQuerySQL $ RR.getRemoteJoinsConnectionSelect s + textSQL = Q.getQueryText querySQL -- CAREFUL!: an `EXPLAIN ANALYZE` here would actually *execute* this - -- query, resulting in potential privilege escalation: - withExplain = "EXPLAIN (FORMAT TEXT) " <> txtSQL + -- query, maybe resulting in privilege escalation: + withExplain = "EXPLAIN (FORMAT TEXT) " <> textSQL -- Reject if query contains any remote joins when (remoteJoins /= mempty) $ throw400 NotSupported "Remote relationships are not allowed in explain query" planLines <- liftTx $ map runIdentity <$> - Q.listQE dmlTxErrorHandler (Q.fromText withExplain) () True - return $ FieldPlan fName (Just txtSQL) $ Just planLines - where - fName = GV._fName fld - - queryCtxMap = _gQueryCtxMap gCtx - fldMap = _gFields gCtx - orderByCtx = _gOrdByCtx gCtx + Q.listQE dmlTxErrorHandler (Q.fromText withExplain) () True + pure $ FieldPlan fieldName (Just textSQL) $ Just planLines +-- NOTE: This function has a 'MonadTrace' constraint in master, but we don't use it +-- here. We should evaluate whether it is actually needed. explainGQLQuery - :: ( HasVersion - , MonadError QErr m - , MonadIO m - , Tracing.MonadTrace m - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> L.Logger L.Hasura - -> PGExecCtx - -> (tx EncJSON -> m EncJSON) + :: forall m + . ( MonadError QErr m + , MonadIO m + ) + => PGExecCtx -> SchemaCache - -> SQLGenCtx - -> QueryActionExecuter -> GQLExplain -> m EncJSON -explainGQLQuery env logger pgExecCtx runInTx sc sqlGenCtx actionExecuter (GQLExplain query userVarsRaw maybeIsRelay) = do - -- NOTE!: we will be executing what follows as though admin role. See e.g. - -- notes in explainField: +explainGQLQuery pgExecCtx sc (GQLExplain query userVarsRaw maybeIsRelay) = do + -- NOTE!: we will be executing what follows as though admin role. See e.g. 
notes in explainField: userInfo <- mkUserInfo (URBFromSessionVariablesFallback adminRoleName) UAdminSecretSent sessionVariables -- we don't need to check in allow list as we consider it an admin endpoint - (execPlan, queryReusability) <- runReusabilityT $ - E.getExecPlanPartial userInfo sc queryType query - (gCtx, rootSelSet) <- case execPlan of - E.GExPHasura (gCtx, rootSelSet) -> - return (gCtx, rootSelSet) - E.GExPRemote{} -> - throw400 InvalidParams "only hasura queries can be explained" - case rootSelSet of - GV.RQuery selSet -> - runInTx $ encJFromJValue . map snd <$> - GV.traverseObjectSelectionSet selSet (explainField env logger userInfo gCtx sqlGenCtx actionExecuter) - GV.RMutation _ -> + let takeFragment = + \case G.ExecutableDefinitionFragment f -> Just f; _ -> Nothing + fragments = mapMaybe takeFragment $ GH.unGQLExecDoc $ GH._grQuery query + (graphQLContext, queryParts) <- E.getExecPlanPartial userInfo sc queryType query + case queryParts of + G.TypedOperationDefinition G.OperationTypeQuery _ varDefs _ selSet -> do + -- (Here the above fragment inlining is actually executed.) + inlinedSelSet <- E.inlineSelectionSet fragments selSet + (unpreparedQueries, _) <- + E.parseGraphQLQuery graphQLContext varDefs (GH._grVariables query) inlinedSelSet + runInTx $ encJFromJValue + <$> for (OMap.toList unpreparedQueries) (uncurry (explainQueryField userInfo)) + + G.TypedOperationDefinition G.OperationTypeMutation _ _ _ _ -> throw400 InvalidParams "only queries can be explained" - GV.RSubscription fields -> do - (plan, _) <- E.getSubsOp env logger pgExecCtx gCtx sqlGenCtx userInfo - queryReusability actionExecuter fields + + G.TypedOperationDefinition G.OperationTypeSubscription _ varDefs _ selSet -> do + -- (Here the above fragment inlining is actually executed.) + inlinedSelSet <- E.inlineSelectionSet fragments selSet + (unpreparedQueries, _) <- E.parseGraphQLQuery graphQLContext varDefs (GH._grVariables query) inlinedSelSet + validSubscriptionQueries <- for unpreparedQueries E.validateSubscriptionRootField + (plan, _) <- E.buildLiveQueryPlan pgExecCtx userInfo validSubscriptionQueries runInTx $ encJFromJValue <$> E.explainLiveQueryPlan plan where queryType = bool E.QueryHasura E.QueryRelay $ fromMaybe False maybeIsRelay sessionVariables = mkSessionVariablesText $ maybe [] Map.toList userVarsRaw + + runInTx :: LazyTx QErr EncJSON -> m EncJSON + runInTx = liftEither <=< liftIO . runExceptT . runLazyTx pgExecCtx Q.ReadOnly diff --git a/server/src-lib/Hasura/GraphQL/Logging.hs b/server/src-lib/Hasura/GraphQL/Logging.hs index f284adfadb9..c5a6ae419de 100644 --- a/server/src-lib/Hasura/GraphQL/Logging.hs +++ b/server/src-lib/Hasura/GraphQL/Logging.hs @@ -5,7 +5,7 @@ layer. In contrast with, logging at the HTTP server layer. module Hasura.GraphQL.Logging ( QueryLog(..) - , MonadQueryLog (..) + , MonadQueryLog(..) ) where import qualified Data.Aeson as J @@ -43,9 +43,8 @@ instance L.ToEngineLog QueryLog L.Hasura where -- | key-value map to be printed as JSON encodeSql :: EQ.GeneratedSqlMap -> J.Value encodeSql sql = - jValFromAssocList $ map (\(a, q) -> (alName a, fmap J.toJSON q)) sql + jValFromAssocList $ map (\(a, q) -> (G.unName a, fmap J.toJSON q)) sql where - alName = G.unName . 
G.unAlias jValFromAssocList xs = J.object $ map (uncurry (J..=)) xs class Monad m => MonadQueryLog m where diff --git a/server/src-lib/Hasura/GraphQL/NormalForm.hs b/server/src-lib/Hasura/GraphQL/NormalForm.hs index 58e20980168..869e70e66b1 100644 --- a/server/src-lib/Hasura/GraphQL/NormalForm.hs +++ b/server/src-lib/Hasura/GraphQL/NormalForm.hs @@ -247,9 +247,9 @@ instance IsField Typename where getMemberSelectionSet :: IsField f => G.NamedType -> ScopedSelectionSet f -> ObjectSelectionSet -getMemberSelectionSet namedType (ScopedSelectionSet {..}) = +getMemberSelectionSet namedType ScopedSelectionSet{..} = fromMaybe (ObjectSelectionSet (fmap toField _sssBaseSelectionSet)) $ - Map.lookup namedType $ _sssMemberSelectionSets + Map.lookup namedType _sssMemberSelectionSets data AnnInpVal = AnnInpVal diff --git a/server/src-lib/Hasura/GraphQL/Parser.hs b/server/src-lib/Hasura/GraphQL/Parser.hs new file mode 100644 index 00000000000..3bb0e893fa0 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser.hs @@ -0,0 +1,51 @@ +-- | This module exports the public API to our internal GraphQL query parser +-- combinator language. For more details, see the documentation for 'Parser'. +module Hasura.GraphQL.Parser + ( Parser + , parserType + , runParser + , bind + , bindFields + + , ScalarRepresentation(..) + , scalar + , boolean + , int + , float + , string + , identifier + , unsafeRawScalar + + , enum + , nullable + , list + , object + , selectionSet + , selectionSetObject + + , InputFieldsParser + , field + , fieldWithDefault + , fieldOptional + + , FieldParser + , ParsedSelection(..) + , handleTypename + , selection + , selection_ + , subselection + , subselection_ + + , jsonToGraphQL + + , module Hasura.GraphQL.Parser.Class + , module Hasura.GraphQL.Parser.Column + , module Hasura.GraphQL.Parser.Monad + , module Hasura.GraphQL.Parser.Schema + ) where + +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Column +import Hasura.GraphQL.Parser.Internal.Parser +import Hasura.GraphQL.Parser.Monad +import Hasura.GraphQL.Parser.Schema diff --git a/server/src-lib/Hasura/GraphQL/Parser/Class.hs b/server/src-lib/Hasura/GraphQL/Parser/Class.hs new file mode 100644 index 00000000000..687fc26f3e3 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Class.hs @@ -0,0 +1,193 @@ +-- | Classes for monads used during schema construction and query parsing. +module Hasura.GraphQL.Parser.Class where + +import Hasura.Prelude + +import qualified Data.HashMap.Strict as Map +import qualified Language.Haskell.TH as TH + +import Data.Has +import Data.Parser.JSONPath +import Data.Tuple.Extended +import GHC.Stack (HasCallStack) +import Type.Reflection (Typeable) + +import {-# SOURCE #-} Hasura.GraphQL.Parser.Internal.Parser +import Hasura.RQL.Types.Error +import Hasura.RQL.Types.Table (TableCache, TableInfo) +import Hasura.Session (RoleName) +import Hasura.SQL.Types + +{- Note [Tying the knot] +~~~~~~~~~~~~~~~~~~~~~~~~ +GraphQL type definitions can be mutually recursive, and indeed, they quite often +are! For example, two tables that reference one another will be represented by +types such as the following: + + type author { + id: Int! + name: String! + articles: [article!]! + } + + type article { + id: Int! + title: String! + content: String! + author: author! 
+ } + +This doesn’t cause any trouble if the schema is represented by a mapping from +type names to type definitions, but the Parser abstraction is all about avoiding +that kind of indirection to improve type safety — parsers refer to their +sub-parsers directly. This presents two problems during schema generation: + + 1. Schema generation needs to terminate in finite time, so we need to ensure + we don’t try to eagerly construct an infinitely-large schema due to the + mutually-recursive structure. + + 2. To serve introspection queries, we do eventually need to construct a + mapping from names to types (a TypeMap), so we need to be able to + recursively walk the entire schema in finite time. + +Solving point number 1 could be done with either laziness or sharing, but +neither of those are enough to solve point number 2, which requires /observable/ +sharing. We need to construct a Parser graph that contains enough information to +detect cycles during traversal. + +It may seem appealing to just use type names to detect cycles, which would allow +us to get away with using laziness rather than true sharing. Unfortunately, that +leads to two further problems: + + * It’s possible to end up with two different types with the same name, which + is an error and should be reported as such. Using names to break cycles + prevents us from doing that, since we have no way to check that two types + with the same name are actually the same. + + * Some Parser constructors can fail — the `column` parser checks that the type + name is a valid GraphQL name, for example. This extra validation means lazy + schema construction isn’t viable, since we need to eagerly build the schema + to ensure all the validation checks hold. + +So we’re forced to use sharing. But how do we do it? Somehow, we have to /tie +the knot/ — we have to build a cyclic data structure — and some of the cycles +may be quite large. Doing all this knot-tying by hand would be incredibly +tricky, and it would require a lot of inversion of control to thread the shared +parsers around. + +To avoid contorting the program, we instead implement a form of memoization. The +MonadSchema class provides a mechanism to memoize a parser constructor function, +which allows us to get sharing mostly for free. The memoization strategy also +annotates cached parsers with a Unique that can be used to break cycles while +traversing the graph, so we get observable sharing as well. -} + +-- | A class that provides functionality used when building the GraphQL schema, +-- i.e. constructing the 'Parser' graph. +class (Monad m, MonadParse n) => MonadSchema n m | m -> n where + -- | Memoizes a parser constructor function for the extent of a single schema + -- construction process. This is mostly useful for recursive parsers; + -- see Note [Tying the knot] for more details. + memoizeOn + :: (HasCallStack, Ord a, Typeable a, Typeable b, Typeable k) + => TH.Name + -- ^ A unique name used to identify the function being memoized. There isn’t + -- really any metaprogramming going on here, we just use a Template Haskell + -- 'TH.Name' as a convenient source for a static, unique identifier. + -> a + -- ^ The value to use as the memoization key. It’s the caller’s + -- responsibility to ensure multiple calls to the same function don’t use + -- the same key. + -> m (Parser k n b) -> m (Parser k n b) + +type MonadRole r m = (MonadReader r m, Has RoleName r) + +-- | Gets the current role the schema is being built for. 
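+--
+-- A hypothetical usage sketch ('fieldsForRole', 'allFields', and
+-- 'publicFields' are illustrative names, not part of this patch): schema
+-- generation can branch on the role to decide which fields to expose.
+--
+-- > fieldsForRole = do
+-- >   roleName <- askRoleName
+-- >   pure $ if roleName == adminRoleName then allFields else publicFields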
+askRoleName + :: MonadRole r m + => m RoleName +askRoleName = asks getter + +type MonadTableInfo r m = (MonadReader r m, Has TableCache r, MonadError QErr m) + +-- | Looks up table information for the given table name. This function +-- should never fail, since the schema cache construction process is +-- supposed to ensure all dependencies are resolved. +askTableInfo + :: MonadTableInfo r m + => QualifiedTable + -> m TableInfo +askTableInfo tableName = do + tableInfo <- asks $ Map.lookup tableName . getter + -- This should never fail, since the schema cache construction process is + -- supposed to ensure that all dependencies are resolved. + tableInfo `onNothing` throw500 ("askTableInfo: no info for " <>> tableName) + +-- | A wrapper around 'memoizeOn' that memoizes a function by using its argument +-- as the key. +memoize + :: (HasCallStack, MonadSchema n m, Ord a, Typeable a, Typeable b, Typeable k) + => TH.Name + -> (a -> m (Parser k n b)) + -> (a -> m (Parser k n b)) +memoize name f a = memoizeOn name a (f a) + +memoize2 + :: (HasCallStack, MonadSchema n m, Ord a, Ord b, Typeable a, Typeable b, Typeable c, Typeable k) + => TH.Name + -> (a -> b -> m (Parser k n c)) + -> (a -> b -> m (Parser k n c)) +memoize2 name = curry . memoize name . uncurry + +memoize3 + :: ( HasCallStack, MonadSchema n m, Ord a, Ord b, Ord c + , Typeable a, Typeable b, Typeable c, Typeable d, Typeable k ) + => TH.Name + -> (a -> b -> c -> m (Parser k n d)) + -> (a -> b -> c -> m (Parser k n d)) +memoize3 name = curry3 . memoize name . uncurry3 + +memoize4 + :: ( HasCallStack, MonadSchema n m, Ord a, Ord b, Ord c, Ord d + , Typeable a, Typeable b, Typeable c, Typeable d, Typeable e, Typeable k ) + => TH.Name + -> (a -> b -> c -> d -> m (Parser k n e)) + -> (a -> b -> c -> d -> m (Parser k n e)) +memoize4 name = curry4 . memoize name . uncurry4 + +-- | A class that provides functionality for parsing GraphQL queries, i.e. +-- running a fully-constructed 'Parser'. +class Monad m => MonadParse m where + withPath :: (JSONPath -> JSONPath) -> m a -> m a + -- | Not the full power of 'MonadError' because parse errors cannot be + -- caught. + parseErrorWith :: Code -> Text -> m a + -- | See 'QueryReusability'. + markNotReusable :: m () + +parseError :: MonadParse m => Text -> m a +parseError = parseErrorWith ValidationFailed + +-- | Tracks whether or not a query is /reusable/. Reusable queries are nice, +-- since we can cache their resolved ASTs and avoid re-resolving them if we +-- receive an identical query. However, we can’t always safely reuse queries if +-- they have variables, since some variable values can affect the generated SQL. +-- For example, consider the following query: +-- +-- > query users_where($condition: users_bool_exp!) { +-- > users(where: $condition) { +-- > id +-- > } +-- > } +-- +-- Different values for @$condition@ will produce completely different queries, +-- so we can’t reuse its plan (unless the variable values were also all +-- identical, of course, but we don’t bother caching those). 
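+--
+-- The 'Semigroup' and 'Monoid' instances below combine reusability
+-- pessimistically: a query is reusable only if every one of its parts is.
+-- Purely as an illustration (the type has no 'Eq' instance, so this shows the
+-- intended algebra rather than runnable code):
+--
+-- > Reusable <> NotReusable <> Reusable  -- acts as NotReusable
+-- > mempty                               -- acts as Reusable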
+data QueryReusability = Reusable | NotReusable + +instance Semigroup QueryReusability where + NotReusable <> _ = NotReusable + _ <> NotReusable = NotReusable + Reusable <> Reusable = Reusable + +instance Monoid QueryReusability where + mempty = Reusable diff --git a/server/src-lib/Hasura/GraphQL/Parser/Class.hs-boot b/server/src-lib/Hasura/GraphQL/Parser/Class.hs-boot new file mode 100644 index 00000000000..79a9c3f3c56 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Class.hs-boot @@ -0,0 +1,5 @@ +module Hasura.GraphQL.Parser.Class where + +import Data.Kind (Type) + +class MonadParse (m :: Type -> Type) diff --git a/server/src-lib/Hasura/GraphQL/Parser/Collect.hs b/server/src-lib/Hasura/GraphQL/Parser/Collect.hs new file mode 100644 index 00000000000..5cd805a9ee7 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Collect.hs @@ -0,0 +1,274 @@ +{-# LANGUAGE StrictData #-} + +{-| This module implements two parts of the GraphQL specification: + + 1. <§ 5.3.2 Field Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging> + 2. <§ 6.3.2 Field Collection http://spec.graphql.org/June2018/#sec-Field-Collection> + +These are described in completely different sections of the specification, but +they’re actually highly related: both essentially normalize fields in a +selection set. -} +module Hasura.GraphQL.Parser.Collect + ( collectFields + ) where + +import Hasura.Prelude + +import qualified Data.HashMap.Strict.Extended as Map +import qualified Data.HashMap.Strict.InsOrd as OMap + +import Data.List.Extended (duplicates) +import Language.GraphQL.Draft.Syntax + +import Hasura.GraphQL.Parser.Class +import {-# SOURCE #-} Hasura.GraphQL.Parser.Internal.Parser (boolean, runParser) +import Hasura.GraphQL.Parser.Schema +import Hasura.GraphQL.Utils (showNames) +import Hasura.SQL.Types + +-- | Collects the effective set of fields queried by a selection set by +-- flattening fragments and merging duplicate fields. +collectFields + :: (MonadParse m, Foldable t) + => t Name + -- ^ The names of the object types and interface types the 'SelectionSet' is + -- selecting against. + -> SelectionSet NoFragments Variable + -> m (InsOrdHashMap Name (Field NoFragments Variable)) +collectFields objectTypeNames selectionSet = + mergeFields =<< flattenSelectionSet objectTypeNames selectionSet + +-- | Flattens inline fragments in a selection set. For example, +-- +-- > { +-- > bar +-- > ... on Foo { +-- > baz +-- > qux +-- > } +-- > } +-- +-- is flattened to: +-- +-- > { +-- > bar +-- > baz +-- > qux +-- > } +-- +-- Nested fragments are similarly flattened, but only as is necessary: fragments +-- inside subselection sets of individual fields are /not/ flattened. For +-- example, +-- +-- > { +-- > bar +-- > ... on Foo { +-- > baz { +-- > ... on Baz { +-- > foo +-- > } +-- > } +-- > qux +-- > } +-- > } +-- +-- is flattened to +-- +-- > { +-- > bar +-- > baz { +-- > ... on Baz { +-- > foo +-- > } +-- > } +-- > qux +-- > } +-- +-- leaving the innermost fragment on @baz@ alone. +-- +-- This function also applies @\@include@ and @\@skip@ directives, since they +-- should be applied before fragments are flattened. +flattenSelectionSet + :: (MonadParse m, Foldable t) + => t Name + -- ^ The name of the object type the 'SelectionSet' is selecting against. + -> SelectionSet NoFragments Variable + -> m [Field NoFragments Variable] +flattenSelectionSet objectTypeNames = fmap concat . traverse flattenSelection + where + -- The easy case: just a single field. 
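+    -- Note that even a bare field can carry inclusion directives, e.g.
+    -- @{ bar @skip(if: $flag) }@, so we validate and apply them here too.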
+    flattenSelection (SelectionField field) = do
+      validateDirectives (_fDirectives field)
+      applyInclusionDirectives (_fDirectives field) $ pure [field]
+
+    -- Note: The 'SelectionFragmentSpread' case has already been eliminated by
+    -- the fragment inliner.
+
+    -- The involved case: we have an inline fragment to process.
+    flattenSelection (SelectionInlineFragment fragment) = do
+      validateDirectives (_ifDirectives fragment)
+      applyInclusionDirectives (_ifDirectives fragment) $
+        case _ifTypeCondition fragment of
+          -- No type condition, so the fragment unconditionally applies.
+          Nothing -> flattenInlineFragment fragment
+          Just typeName
+            -- There is a type condition, but it is just the type of the
+            -- selection set; the fragment trivially applies.
+            | typeName `elem` objectTypeNames -> flattenInlineFragment fragment
+
+            -- Otherwise, the fragment must not apply, because we do not currently
+            -- support interfaces or unions. According to the GraphQL spec, it is
+            -- an *error* to select a fragment that cannot possibly apply to the
+            -- given type; see
+            -- http://spec.graphql.org/June2018/#sec-Fragment-spread-is-possible.
+            -- Raising an error here would therefore be the spec-faithful
+            -- behavior, but for now we conservatively drop the fragment
+            -- instead (the error is left commented out below).
+            | otherwise -> return []
+            {- parseError $ "illegal type condition in fragment; type "
+               <> typeName <<> " is unrelated to any of the types " <>
+               Text.intercalate ", " (fmap dquoteTxt (toList objectTypeNames))
+            -}
+
+    flattenInlineFragment InlineFragment{ _ifDirectives, _ifSelectionSet } = do
+      validateDirectives _ifDirectives
+      flattenSelectionSet objectTypeNames _ifSelectionSet
+
+    applyInclusionDirectives directives continue
+      | Just directive <- find ((== $$(litName "include")) . _dName) directives
+      = applyInclusionDirective id directive continue
+      | Just directive <- find ((== $$(litName "skip")) . _dName) directives
+      = applyInclusionDirective not directive continue
+      | otherwise = continue
+
+    applyInclusionDirective adjust Directive{ _dName, _dArguments } continue = do
+      ifArgument <- Map.lookup $$(litName "if") _dArguments `onNothing`
+        parseError ("missing \"if\" argument for " <> _dName <<> " directive")
+      value <- runParser boolean $ GraphQLValue ifArgument
+      if adjust value then continue else pure []
+
+    validateDirectives directives =
+      case nonEmpty $ toList $ duplicates $ map _dName directives of
+        Nothing -> pure ()
+        Just duplicatedDirectives -> parseError
+          $ "the following directives are used more than once: "
+          <> showNames duplicatedDirectives
+
+-- | Merges fields according to the rules in the GraphQL specification, specifically
+-- <§ 5.3.2 Field Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging>.
+mergeFields
+  :: (MonadParse m, Eq var)
+  => [Field NoFragments var]
+  -> m (InsOrdHashMap Name (Field NoFragments var))
+mergeFields = foldM addField OMap.empty
+  where
+    addField fields newField = case OMap.lookup alias fields of
+      Nothing ->
+        pure $! OMap.insert alias newField fields
+      Just oldField -> do
+        mergedField <- mergeField alias oldField newField
+        pure $! OMap.insert alias mergedField fields
+      where
+        alias = fromMaybe (_fName newField) (_fAlias newField)
+
+    mergeField alias oldField newField = do
+      unless (_fName oldField == _fName newField) $ parseError $
+        "selection of both " <> _fName oldField <<> " and " <>
+        _fName newField <<> " specify the same response name, " <>> alias
+
+      unless (_fArguments oldField == _fArguments newField) $ parseError $
+        "inconsistent arguments between multiple selections of " <>
+        "field " <>> _fName oldField
+
+      pure $! 
Field + { _fAlias = Just alias + , _fName = _fName oldField + , _fArguments = _fArguments oldField + -- see Note [Drop directives from merged fields] + , _fDirectives = [] + -- see Note [Lazily merge selection sets] + , _fSelectionSet = _fSelectionSet oldField ++ _fSelectionSet newField + } + +{- Note [Drop directives from merged fields] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +When we merge two fields, what do we do with directives? The GraphQL spec isn’t +very clear here, but it does explicitly state that directives only need to be +unique per unmerged field (§ 5.7.3 Directives Are Unique Per Location, +http://spec.graphql.org/June2018/#sec-Directives-Are-Unique-Per-Location). For +clarity, here is the example given by the spec: + + query ($foo: Boolean = true, $bar: Boolean = false) { + field @skip(if: $foo) { + subfieldA + } + field @skip(if: $bar) { + subfieldB + } + } + +The spec says this is totally fine, since the @skip directives appear in +different places. This forces our hand: we *must* process @include/@skip +directives prior to merging fields. And conveniently, aside from @include/@skip, +we don’t care about directives, so we don’t bother reconciling them during field +merging---we just drop them. + +Note [Lazily merge selection sets] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Field merging is described in a recursive way in the GraphQL spec (§ 5.3.2 Field +Selection Merging http://spec.graphql.org/June2018/#sec-Field-Selection-Merging). +This makes sense: if fields have sub-selection sets, they should be recursively +merged. For example, suppose we have the following selection set: + + { + field1 { + field2 { + field3 + } + field5 + } + field1 { + field2 { + field4 + } + field5 + } + } + +After a single level of merging, we’ll merge the two occurrences of field1 +together to get: + + { + field1 { + field2 { + field3 + } + field5 + field2 { + field4 + } + field5 + } + } + +It would be natural to then merge the inner selection set, too, yielding: + + { + field1 { + field2 { + field3 + field4 + } + field5 + } + } + +But we don’t do this. Instead, we stop after the first level of merging, so +field1’s sub-selection set still has duplication. Why? Because recursively +merging fields would also require recursively flattening fragments, and +flattening fragments is tricky: it requires knowledge of type information. + +Fortunately, this lazy approach to field merging is totally okay, because we +call collectFields (and therefore mergeFields) each time we parse a selection +set. Once we get to processing the sub-selection set of field1, we’ll call +collectFields again, and it will merge things the rest of the way. This is +consistent with the way the rest of our parsing system works, where parsers +interpret their own inputs on an as-needed basis. -} diff --git a/server/src-lib/Hasura/GraphQL/Parser/Column.hs b/server/src-lib/Hasura/GraphQL/Parser/Column.hs new file mode 100644 index 00000000000..49a626373eb --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Column.hs @@ -0,0 +1,154 @@ +{-# LANGUAGE StrictData #-} + +module Hasura.GraphQL.Parser.Column + ( PGColumnValue(..) + , column + , mkScalarTypeName + + , UnpreparedValue(..) 
+
+  , Opaque
+  , openOpaque
+  , mkParameter
+  ) where
+
+import           Hasura.Prelude
+
+import qualified Data.HashMap.Strict.Extended  as M
+import qualified Database.PG.Query             as Q
+
+import           Language.GraphQL.Draft.Syntax (Description (..), Name (..),
+                                                Nullability (..), Value (..), litName,
+                                                mkName)
+
+import qualified Hasura.RQL.Types.Column       as RQL
+import qualified Hasura.RQL.Types.CustomTypes  as RQL
+
+import           Hasura.GraphQL.Parser.Class
+import           Hasura.GraphQL.Parser.Internal.Parser
+import           Hasura.GraphQL.Parser.Schema
+import           Hasura.RQL.Types.Column       hiding (EnumValue (..), EnumValueInfo (..))
+import           Hasura.RQL.Types.Error
+import           Hasura.Session                (SessionVariable)
+import           Hasura.SQL.DML
+import           Hasura.SQL.Types
+import           Hasura.SQL.Value
+
+-- -------------------------------------------------------------------------------------------------
+
+data Opaque a = Opaque
+  { _opVariable :: Maybe VariableInfo
+  -- ^ The variable this value came from, if any.
+  , _opValue    :: a
+  } -- Note: we intentionally don’t derive any instances here, since that would
+    -- defeat the opaqueness!
+
+openOpaque :: MonadParse m => Opaque a -> m a
+openOpaque (Opaque Nothing  value) = pure value
+openOpaque (Opaque (Just _) value) = markNotReusable $> value
+
+data UnpreparedValue
+  -- | A SQL value that can be parameterized over.
+  = UVParameter PGColumnValue
+                (Maybe VariableInfo)
+                -- ^ The GraphQL variable this value came from, if any.
+  -- | A literal SQL expression that /cannot/ be parameterized over.
+  | UVLiteral SQLExp
+  -- | The entire session variables JSON object.
+  | UVSession
+  -- | A single session variable.
+  | UVSessionVar (PGType PGScalarType) SessionVariable
+
+data PGColumnValue = PGColumnValue
+  { pcvType  :: PGColumnType
+  , pcvValue :: WithScalarType PGScalarValue
+  }
+
+mkParameter :: Opaque PGColumnValue -> UnpreparedValue
+mkParameter (Opaque variable value) = UVParameter value variable
+
+-- -------------------------------------------------------------------------------------------------
+
+column
+  :: (MonadSchema n m, MonadError QErr m)
+  => PGColumnType
+  -> Nullability
+  -> m (Parser 'Both n (Opaque PGColumnValue))
+column columnType (Nullability isNullable) =
+  -- TODO(PDV): It might be worth memoizing this function even though it isn’t
+  -- recursive simply for performance reasons, since it’s likely to be hammered
+  -- during schema generation. Need to profile to see whether or not it’s a win.
+  opaque . fmap (PGColumnValue columnType) <$> case columnType of
+    PGColumnScalar scalarType -> withScalarType scalarType <$> case scalarType of
+      PGInteger -> pure (PGValInteger <$> int)
+      PGBoolean -> pure (PGValBoolean <$> boolean)
+      PGFloat   -> pure (PGValDouble  <$> float)
+      PGText    -> pure (PGValText    <$> string)
+      PGVarchar -> pure (PGValVarchar <$> string)
+      PGJSON    -> pure (PGValJSON  . Q.JSON  <$> json)
+      PGJSONB   -> pure (PGValJSONB . Q.JSONB <$> jsonb)
+
+      -- For all other scalars, we convert the value to JSON and use the
+      -- FromJSON instance. The major upside is that this avoids having to
+      -- write a new parser for each custom type: if the JSON parser is sound,
+      -- so is this one, and it avoids the risk of having two separate ways of
+      -- parsing a value in the codebase, which could lead to inconsistencies.
+      _ -> do
+        name <- mkScalarTypeName scalarType
+        let schemaType = NonNullable $ TNamed $ mkDefinition name Nothing TIScalar
+        pure $ Parser
+          { pType = schemaType
+          , pParser =
+              valueToJSON (toGraphQLType schemaType) >=>
+              either (parseErrorWith ParseFailed . qeError) pure . 
runAesonParser (parsePGValue scalarType) + } + PGColumnEnumReference (EnumReference tableName enumValues) -> + case nonEmpty (M.toList enumValues) of + Just enumValuesList -> do + name <- qualifiedObjectToName tableName <&> (<> $$(litName "_enum")) + pure $ withScalarType PGText $ enum name Nothing (mkEnumValue <$> enumValuesList) + Nothing -> throw400 ValidationFailed "empty enum values" + where + -- Sadly, this combinator is not sound in general, so we can’t export it + -- for general-purpose use. If we did, someone could write this: + -- + -- mkParameter <$> opaque do + -- n <- int + -- pure (mkIntColumnValue (n + 1)) + -- + -- Now we’d end up with a UVParameter that has a variable in it, so we’d + -- parameterize over it. But when we’d reuse the plan, we wouldn’t know to + -- increment the value by 1, so we’d use the wrong value! + -- + -- We could theoretically solve this by retaining a reference to the parser + -- itself and re-parsing each new value, using the saved parser, which + -- would admittedly be neat. But it’s more complicated, and it isn’t clear + -- that it would actually be useful, so for now we don’t support it. + opaque :: MonadParse m => Parser 'Both m a -> Parser 'Both m (Opaque a) + opaque parser = parser + { pParser = \case + GraphQLValue (VVariable var@Variable{ vInfo, vValue }) -> do + typeCheck False (toGraphQLType $ pType parser) var + Opaque (Just vInfo) <$> pParser parser (absurd <$> vValue) + value -> Opaque Nothing <$> pParser parser value + } + + withScalarType scalarType = fmap (WithScalarType scalarType) . possiblyNullable scalarType + possiblyNullable scalarType + | isNullable = fmap (fromMaybe $ PGNull scalarType) . nullable + | otherwise = id + + mkEnumValue (RQL.EnumValue value, RQL.EnumValueInfo description) = + ( mkDefinition value (Description <$> description) EnumValueInfo + , PGValText $ unName value + ) + +mkScalarTypeName :: MonadError QErr m => PGScalarType -> m Name +mkScalarTypeName PGInteger = pure RQL.intScalar +mkScalarTypeName PGBoolean = pure RQL.boolScalar +mkScalarTypeName PGFloat = pure RQL.floatScalar +mkScalarTypeName PGText = pure RQL.stringScalar +mkScalarTypeName PGVarchar = pure RQL.stringScalar +mkScalarTypeName scalarType = mkName (toSQLTxt scalarType) `onNothing` throw400 ValidationFailed + ("cannot use SQL type " <> scalarType <<> " in the GraphQL schema because its name is not a " + <> "valid GraphQL identifier") diff --git a/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs b/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs new file mode 100644 index 00000000000..7c6443088eb --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs @@ -0,0 +1,998 @@ +{-# OPTIONS_HADDOCK not-home #-} +{-# LANGUAGE AllowAmbiguousTypes #-} +{-# LANGUAGE StrictData #-} +{-# LANGUAGE ViewPatterns #-} + +-- | Defines the 'Parser' type and its primitive combinators. 
+module Hasura.GraphQL.Parser.Internal.Parser where + +import Hasura.Prelude + +import qualified Data.Aeson as A +import qualified Data.HashMap.Strict.Extended as M +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.HashSet as S +import qualified Data.Text as T + +import Control.Lens.Extended hiding (enum, index) +import Data.Int (Int32, Int64) +import Data.Scientific (toBoundedInteger) +import Data.Parser.JSONPath +import Data.Type.Equality +import Language.GraphQL.Draft.Syntax hiding (Definition) + +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Collect +import Hasura.GraphQL.Parser.Schema +import Hasura.RQL.Types.CustomTypes +import Hasura.RQL.Types.Error +import Hasura.Server.Utils (englishList) +import Hasura.SQL.Types +import Hasura.SQL.Value + + +-- ----------------------------------------------------------------------------- +-- type definitions + +-- | A 'Parser' that corresponds to a type in the GraphQL schema. A 'Parser' is +-- really two things at once: +-- +-- 1. As its name implies, a 'Parser' can be used to parse GraphQL queries +-- (via 'runParser'). +-- +-- 2. Less obviously, a 'Parser' represents a slice of the GraphQL schema, +-- since every 'Parser' corresponds to a particular GraphQL type, and +-- information about that type can be recovered (via 'parserType'). +-- +-- A natural way to view this is that 'Parser's support a sort of dynamic +-- reflection: in addition to running a 'Parser' on an input query, you can ask +-- it to tell you about what type of input it expects. Importantly, you can do +-- this even if you don’t have a query to parse; this is necessary to implement +-- GraphQL introspection, which provides precisely this sort of reflection on +-- types. +-- +-- Another way of viewing a 'Parser' is a little more quantum: just as light +-- “sometimes behaves like a particle and sometimes behaves like a wave,” a +-- 'Parser' “sometimes behaves like a query parser and sometimes behaves like a +-- type.” In this way, you can think of a function that produces a 'Parser' as +-- simultaneously both a function that constructs a GraphQL schema and a +-- function that parses a GraphQL query. 'Parser' constructors therefore +-- interleave two concerns: information about a type definition (like the type’s +-- name and description) and information about how to parse a query on that type. +-- +-- Notably, these two concerns happen at totally different phases in the +-- program: GraphQL schema construction happens when @graphql-engine@ first +-- starts up, before it receives any GraphQL queries at all. But query parsing +-- obviously can’t happen until there is actually a query to parse. For that +-- reason, it’s useful to take care to distinguish which effects are happening +-- at which phase during 'Parser' construction, since otherwise you may get +-- mixed up! +-- +-- For some more information about how to interpret the meaning of a 'Parser', +-- see Note [The meaning of Parser 'Output]. +data Parser k m a = Parser + { pType :: ~(Type k) + -- ^ Lazy for knot-tying reasons; see Note [Tying the knot] in + -- Hasura.GraphQL.Parser.Class. + , pParser :: ParserInput k -> m a + } deriving (Functor) + +parserType :: Parser k m a -> Type k +parserType = pType + +runParser :: Parser k m a -> ParserInput k -> m a +runParser = pParser + +instance HasName (Parser k m a) where + getName = getName . 
pType + +instance HasDefinition (Parser k m a) (TypeInfo k) where + definitionLens f parser = definitionLens f (pType parser) <&> \pType -> parser { pType } + +type family ParserInput k where + -- see Note [The 'Both kind] in Hasura.GraphQL.Parser.Schema + ParserInput 'Both = InputValue Variable + ParserInput 'Input = InputValue Variable + -- see Note [The meaning of Parser 'Output] + ParserInput 'Output = SelectionSet NoFragments Variable + +{- Note [The meaning of Parser 'Output] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +The ParserInput type family determines what a Parser accepts as input during +query parsing, which varies based on its Kind. A `Parser 'Input`, +unsurprisingly, parses GraphQL input values, much in the same way aeson +`Parser`s parse JSON values. + +Therefore, one might naturally conclude that `Parser 'Output` ought to parse +GraphQL output values. But it doesn’t---a Parser is used to parse GraphQL +*queries*, and output values don’t show up in queries anywhere! Rather, the +output values are the results of executing the query, not something the user +sends us, so we don’t have to parse those at all. + +What output types really correspond to in GraphQL queries is selection sets. For +example, if we have the GraphQL types + + type User { + posts(filters: PostFilters): [Post] + } + + input PostFilters { + newer_than: Date + } + + type Post { + id: Int + title: String + body: String + } + +then we might receive a query that looks like this: + + query list_user_posts($user_id: Int, $date: Date) { + user_by_id(id: $user_id) { + posts(filters: {newer_than: $date}) { + id + title + } + } + } + +We have Parsers to represent each of these types: a `Parser 'Input` for +PostFilters, and two `Parser 'Output`s for User and Post. When we parse the +query, we pass the `{newer_than: $date}` input value to the PostFilters parser, +as expected. But what do we pass to the User parser? The answer is this +selection set: + + { + posts(filters: {newer_than: $date}) { + id + title + } + } + +Likewise, the Post parser eventually receives the inner selection set: + + { + id + title + } + +These Parsers handle interpreting the fields of the selection sets. This is why +`ParserInput 'Output` is SelectionSet---the GraphQL *type* associated with the +Parser is an output type, but the part of the *query* that corresponds to that +output type isn’t an output value but a selection set. -} + +-- | The constraint @(''Input' '<:' k)@ entails @('ParserInput' k ~ 'Value')@, +-- but GHC can’t figure that out on its own, so we have to be explicit to give +-- it a little help. +inputParserInput :: forall k. 'Input <: k => ParserInput k :~: InputValue Variable +inputParserInput = case subKind @'Input @k of { KRefl -> Refl; KBoth -> Refl } + +pInputParser :: forall k m a. 'Input <: k => Parser k m a -> InputValue Variable -> m a +pInputParser = gcastWith (inputParserInput @k) pParser + +infixl 1 `bind` +bind :: Monad m => Parser k m a -> (a -> m b) -> Parser k m b +bind p f = p { pParser = pParser p >=> f } + +-- | Parses some collection of input fields. Build an 'InputFieldsParser' using +-- 'field', 'fieldWithDefault', or 'fieldOptional', combine several together +-- with the 'Applicative' instance, and finish it off using 'object' to turn it +-- into a 'Parser'. +data InputFieldsParser m a = InputFieldsParser + -- Note: this is isomorphic to + -- Compose ((,) [Definition (FieldInfo k)]) + -- (ReaderT (HashMap Name (FieldInput k)) m) a + -- but working with that type sucks. 
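+  -- The two record fields below mirror that shape: a static list of field
+  -- definitions, usable for schema generation and introspection without
+  -- parsing anything, paired with a function that consumes the input fields
+  -- actually supplied in a query.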
+  { ifDefinitions :: [Definition InputFieldInfo]
+  , ifParser      :: HashMap Name (InputValue Variable) -> m a
+  } deriving (Functor)
+
+infixl 1 `bindFields`
+bindFields :: Monad m => InputFieldsParser m a -> (a -> m b) -> InputFieldsParser m b
+bindFields p f = p { ifParser = ifParser p >=> f }
+
+instance Applicative m => Applicative (InputFieldsParser m) where
+  pure v = InputFieldsParser [] (const $ pure v)
+  a <*> b = InputFieldsParser
+    (ifDefinitions a <> ifDefinitions b)
+    (liftA2 (<*>) (ifParser a) (ifParser b))
+
+-- | A parser for a single field in a selection set. Build a 'FieldParser'
+-- with 'selection' or 'subselection', and combine them together with
+-- 'selectionSet' to obtain a 'Parser'.
+data FieldParser m a = FieldParser
+  { fDefinition :: Definition FieldInfo
+  , fParser     :: Field NoFragments Variable -> m a
+  } deriving (Functor)
+
+infixl 1 `bindField`
+bindField :: Monad m => FieldParser m a -> (a -> m b) -> FieldParser m b
+bindField p f = p { fParser = fParser p >=> f }
+
+-- | A single parsed field in a selection set.
+data ParsedSelection a
+  -- | An ordinary field.
+  = SelectField a
+  -- | The magical @__typename@ field, implicitly available on all objects
+  -- <http://spec.graphql.org/June2018/#sec-Type-Name-Introspection>.
+  | SelectTypename Name
+  deriving (Functor)
+
+handleTypename :: (Name -> a) -> ParsedSelection a -> a
+handleTypename _ (SelectField value)   = value
+handleTypename f (SelectTypename name) = f name
+
+-- -----------------------------------------------------------------------------
+-- combinators
+
+data ScalarRepresentation a where
+  SRBoolean :: ScalarRepresentation Bool
+  SRInt :: ScalarRepresentation Int32
+  SRFloat :: ScalarRepresentation Double
+  SRString :: ScalarRepresentation Text
+
+scalar
+  :: MonadParse m
+  => Name
+  -> Maybe Description
+  -> ScalarRepresentation a
+  -> Parser 'Both m a
+scalar name description representation = Parser
+  { pType = schemaType
+  , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \v -> case representation of
+      SRBoolean -> case v of
+        GraphQLValue (VBoolean b) -> pure b
+        JSONValue (A.Bool b)      -> pure b
+        _                         -> typeMismatch name "a boolean" v
+      SRInt -> case v of
+        GraphQLValue (VInt i)  -> convertWith scientificToInteger $ fromInteger i
+        JSONValue (A.Number n) -> convertWith scientificToInteger n
+        _                      -> typeMismatch name "a 32-bit integer" v
+      SRFloat -> case v of
+        GraphQLValue (VFloat f) -> convertWith scientificToFloat f
+        GraphQLValue (VInt i)   -> convertWith scientificToFloat $ fromInteger i
+        JSONValue (A.Number n)  -> convertWith scientificToFloat n
+        _                       -> typeMismatch name "a float" v
+      SRString -> case v of
+        GraphQLValue (VString s) -> pure s
+        JSONValue (A.String s)   -> pure s
+        _                        -> typeMismatch name "a string" v
+  }
+  where
+    schemaType = NonNullable $ TNamed $ mkDefinition name description TIScalar
+    convertWith f = either (parseErrorWith ParseFailed . qeError) pure . runAesonParser f
+
+{- WIP NOTE (FIXME: make into an actual note by expanding on it a bit)
+
+There's a delicate balance between GraphQL types and Postgres types.
+
+The mapping is done in the 'column' parser. But we want to only have
+one source of truth for parsing postgres values, which happens to be
+the JSON parsing code in SQL.Value. So here we reuse some of that code
+despite not having a JSON value.
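+
+For example (an illustrative case, not specific code from this patch): a value
+destined for a @timestamptz@ column arrives as a GraphQL string literal; rather
+than reimplementing timestamp validation at the GraphQL layer, the 'column'
+parser converts the value to JSON and defers to the existing 'parsePGValue'
+code path in SQL.Value to interpret it.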
+ +-} + +boolean :: MonadParse m => Parser 'Both m Bool +boolean = scalar boolScalar Nothing SRBoolean + +int :: MonadParse m => Parser 'Both m Int32 +int = scalar intScalar Nothing SRInt + +float :: MonadParse m => Parser 'Both m Double +float = scalar floatScalar Nothing SRFloat + +string :: MonadParse m => Parser 'Both m Text +string = scalar stringScalar Nothing SRString + +-- | As an input type, any string or integer input value should be coerced to ID as Text +-- https://spec.graphql.org/June2018/#sec-ID +identifier :: MonadParse m => Parser 'Both m Text +identifier = Parser + { pType = schemaType + , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case + GraphQLValue (VString s) -> pure s + GraphQLValue (VInt i) -> pure $ T.pack $ show i + JSONValue (A.String s) -> pure s + JSONValue (A.Number n) -> parseScientific n + v -> typeMismatch idName "a String or a 32-bit integer" v + } + where + idName = idScalar + schemaType = NonNullable $ TNamed $ mkDefinition idName Nothing TIScalar + parseScientific = either (parseErrorWith ParseFailed . qeError) + (pure . T.pack . show @Int) . runAesonParser scientificToInteger + +namedJSON :: MonadParse m => Name -> Maybe Description -> Parser 'Both m A.Value +namedJSON name description = Parser + { pType = schemaType + , pParser = valueToJSON $ toGraphQLType schemaType + } + where + schemaType = NonNullable $ TNamed $ mkDefinition name description TIScalar + +json, jsonb :: MonadParse m => Parser 'Both m A.Value +json = namedJSON $$(litName "json") Nothing +jsonb = namedJSON $$(litName "jsonb") Nothing + +-- | Explicitly define any desired scalar type. This is unsafe because it does +-- not mark queries as unreusable when they should be. +unsafeRawScalar + :: MonadParse n + => Name + -> Maybe Description + -> Parser 'Both n (InputValue Variable) +unsafeRawScalar name description = Parser + { pType = NonNullable $ TNamed $ mkDefinition name description TIScalar + , pParser = pure + } + +enum + :: MonadParse m + => Name + -> Maybe Description + -> NonEmpty (Definition EnumValueInfo, a) + -> Parser 'Both m a +enum name description values = Parser + { pType = schemaType + , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case + JSONValue (A.String stringValue) + | Just enumValue <- mkName stringValue -> validate enumValue + GraphQLValue (VEnum (EnumValue enumValue)) -> validate enumValue + other -> typeMismatch name "an enum value" other + } + where + schemaType = NonNullable $ TNamed $ mkDefinition name description $ TIEnum (fst <$> values) + valuesMap = M.fromList $ over (traverse._1) dName $ toList values + validate value = case M.lookup value valuesMap of + Just result -> pure result + Nothing -> parseError $ "expected one of the values " + <> englishList "or" (dquoteTxt . dName . fst <$> values) <> " for type " + <> name <<> ", but found " <>> value + +nullable :: forall k m a. (MonadParse m, 'Input <: k) => Parser k m a -> Parser k m (Maybe a) +nullable parser = gcastWith (inputParserInput @k) Parser + { pType = schemaType + , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case + JSONValue A.Null -> pure Nothing + GraphQLValue VNull -> pure Nothing + value -> Just <$> pParser parser value + } + where + schemaType = nullableType $ pType parser + +-- | Decorate a schema field as NON_NULL +nonNullableField :: forall m a . 
FieldParser m a -> FieldParser m a +nonNullableField (FieldParser (Definition n u d (FieldInfo as t)) p) = + FieldParser (Definition n u d (FieldInfo as (nonNullableType t))) p + +-- | Decorate a schema field as NULL +nullableField :: forall m a . FieldParser m a -> FieldParser m a +nullableField (FieldParser (Definition n u d (FieldInfo as t)) p) = + FieldParser (Definition n u d (FieldInfo as (nullableType t))) p +{- +field = field + { fDefinition = (fDefinition field) + { dInfo = (dInfo (fDefinition field)) + { fType = nonNullableType (fType (dInfo (fDefinition field))) + } + } + } +-} +-- | Decorate a schema output type as NON_NULL +nonNullableParser :: forall m a . Parser 'Output m a -> Parser 'Output m a +nonNullableParser parser = parser { pType = nonNullableType (pType parser) } + +multiple :: Parser 'Output m a -> Parser 'Output m a +multiple parser = parser { pType = Nullable $ TList $ pType parser } + +list :: forall k m a. (MonadParse m, 'Input <: k) => Parser k m a -> Parser k m [a] +list parser = gcastWith (inputParserInput @k) Parser + { pType = schemaType + , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case + GraphQLValue (VList values) -> for (zip [0..] values) \(index, value) -> + withPath (++[Index index]) $ pParser parser $ GraphQLValue value + JSONValue (A.Array values) -> for (zip [0..] $ toList values) \(index, value) -> + withPath (++[Index index]) $ pParser parser $ JSONValue value + -- List Input Coercion + -- + -- According to section 3.11 of the GraphQL spec: iff the value + -- passed as an input to a list type is not a list and not the + -- null value, then the result of input coercion is a list of + -- size one, where the single item value is the result of input + -- coercion for the list’s item type on the provided value. + -- + -- We need to explicitly test for VNull here, otherwise we could + -- be returning `[null]` if the parser accepts a null value, + -- which would contradict the spec. 
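+      -- Concretely: supplying the single value @5@ where @[Int]@ is expected
+      -- coerces to @[5]@ (the fall-through case at the bottom), while
+      -- supplying @null@ must remain @null@ rather than becoming @[null]@,
+      -- hence the explicit cases below.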
+ GraphQLValue VNull -> parseError "expected a list, but found null" + JSONValue A.Null -> parseError "expected a list, but found null" + other -> fmap pure $ withPath (++[Index 0]) $ pParser parser other + } + where + schemaType = NonNullable $ TList $ pType parser + +object + :: MonadParse m + => Name + -> Maybe Description + -> InputFieldsParser m a + -> Parser 'Input m a +object name description parser = Parser + { pType = schemaType + , pParser = peelVariable (Just $ toGraphQLType schemaType) >=> \case + GraphQLValue (VObject fields) -> parseFields $ GraphQLValue <$> fields + JSONValue (A.Object fields) -> do + translatedFields <- M.fromList <$> for (M.toList fields) \(key, val) -> do + name' <- mkName key `onNothing` parseError + ("variable value contains object with key " <> key <<> ", which is not a legal GraphQL name") + pure (name', JSONValue val) + parseFields translatedFields + other -> typeMismatch name "an object" other + } + where + schemaType = NonNullable $ TNamed $ mkDefinition name description $ + TIInputObject (InputObjectInfo (ifDefinitions parser)) + fieldNames = S.fromList (dName <$> ifDefinitions parser) + parseFields fields = do + -- check for extraneous fields here, since the InputFieldsParser just + -- handles parsing the fields it cares about + for_ (M.keys fields) \fieldName -> + unless (fieldName `S.member` fieldNames) $ withPath (++[Key (unName fieldName)]) $ + parseError $ "field " <> dquote fieldName <> " not found in type: " <> squote name + ifParser parser fields + +{- Note [Optional fields and nullability] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +GraphQL conflates optional fields and nullability. A field of a GraphQL input +object (or an argument to a selection set field, which is really the same thing) +is optional if and only if its type is nullable. It’s worth fully spelling out +the implications here: if a field (or argument) is non-nullable, it /cannot/ be +omitted. So, for example, suppose we had a table type like this: + + type article { + comments(limit: Int!): [comment!]! + } + +Since we made `limit` non-nullable, it is /illegal/ to omit the argument. You’d +/always/ have to provide some value---and that isn’t what we want, because the +row limit should be optional. We have no choice but to make it nullable: + + type article { + comments(limit: Int): [comment!]! + } + +But this feels questionable. Should we really accept `null` values for `limit`? +That is, should this query be legal? + + { + articles { + comments(limit: null) { ... } + } + } + +A tempting answer to that question is “yes”: we can just treat a `null` value +for any optional field as precisely equivalent to leaving the field off +entirely. That is, any field with no default value really just has a default +value of `null`. Unfortunately, this approach turns out to be a really bad idea. +It’s all too easy to write something like + + mutation delete_article_by_id($article_id: Int) { + delete_articles(where: {id: {eq: $article_id}}) + } + +then accidentally misspell `article_id` in the variables payload, and now you’ve +deleted all the articles in your database. Very bad. + +So we’d really like to be able to have a way to say “this field is optional, but +`null` is not a legal value,” but at first it seems like the GraphQL spec ties +our hands. Fortunately, there is a way out. The spec explicitly permits +distinguishing between the following two situations: + + comments { ... } + comments(limit: null) { ... 
}
+
+That is, the spec allows implementations to behave differently depending on
+whether an argument was omitted or whether its value was `null`. This is spelled
+out in a few different places in the spec, but §3.10 Input Objects
+<http://spec.graphql.org/June2018/#sec-Input-Objects> is the most explicit:
+
+> If the value `null` was provided for an input object field, and the field’s
+> type is not a non‐null type, an entry in the coerced unordered map is given
+> the value `null`. In other words, there is a semantic difference between the
+> explicitly provided value `null` versus having not provided a value.
+
+Note that this is only allowed for fields that don’t have any default value! If
+the field were declared with an explicit `null` default value, like
+
+  type article {
+    comments(limit: Int = null): [comment!]!
+  }
+
+then it would not be legal to distinguish the two cases. Yes, this is all
+terribly subtle.
+
+Okay. So armed with that knowledge, what do we do about it? We offer three
+different combinators for parsing input fields:
+
+  1. `field` — Defines a field with no default value. The field’s nullability is
+     taken directly from the nullability of the field’s value parser.
+  2. `fieldOptional` — Defines a field with no default value that is always
+     nullable. Returns Nothing if (and only if!) the field is omitted.
+  3. `fieldWithDefault` — Defines a field with a default value.
+
+The last of the three, `fieldWithDefault`, is actually the simplest. It
+corresponds to a field with a default value, and the underlying value parser
+will /always/ be called. If the field is omitted, the value parser is called
+with the default value. (This makes it impossible to distinguish omitted fields
+from those explicitly passed the default value, as mandated by the spec.) Use
+`fieldWithDefault` for any field or argument with a non-`null` default value.
+
+`field` is also fairly straightforward. It always calls its value parser, so if
+the field is omitted, it calls it with a value of `null`. Notably, there is no
+special handling for non-nullable fields, since the underlying parser will raise
+an error in that case, anyway. Use `field` for required fields, and combine
+`field` with `nullable` for optional fields with a default value of `null`.
+
+`fieldOptional` is the most interesting. Unlike `field` and `fieldWithDefault`,
+`fieldOptional` does not call its underlying value parser if the field is not
+provided; it simply returns Nothing. If a value /is/ provided, it is passed
+along without modification. This yields an interesting interaction when the
+value parser does not actually accept nulls, such as a parser like this:
+
+  fieldOptional $$(litName "limit") Nothing int
+
+This corresponds to the `limit` field from our original example. If the field is
+omitted, the `int` parser is not called, and the field parser just returns
+Nothing. But if a value of `null` is explicitly provided, it is forwarded to the
+`int` parser, which then rejects it with a parse error, since it does not accept
+nulls. This is exactly the behavior we want.
+
+This semantics can appear confusing. We end up with a field with a nullable type
+for which `null` is not a legal value! A strange interpretation of “nullable”,
+indeed. But realize that the nullability really means “optional”, and the
+behavior makes more sense.
+
+As a final point, note that similar behavior can be obtained with
+`fieldWithDefault`. 
The following creates a boolean field that defaults to +`false` and rejects `null` values: + + fieldWithDefault $$(litName "includeDeprecated") Nothing (VBoolean False) boolean + +This is a perfectly reasonable thing to do for exactly the same rationale behind +the use of `fieldOptional` above. -} + +-- | Creates a parser for an input field. The field’s nullability is determined +-- by the nullability of the given value parser; see Note [Optional fields and +-- nullability] for more details. +field + :: (MonadParse m, 'Input <: k) + => Name + -> Maybe Description + -> Parser k m a + -> InputFieldsParser m a +field name description parser = case pType parser of + NonNullable typ -> InputFieldsParser + { ifDefinitions = [mkDefinition name description $ IFRequired typ] + , ifParser = \ values -> withPath (++[Key (unName name)]) do + value <- onNothing (M.lookup name values) $ + parseError ("missing required field " <>> name) + pInputParser parser value + } + -- nullable fields just have an implicit default value of `null` + Nullable _ -> fieldWithDefault name description VNull parser + +-- | Creates a parser for an input field with the given default value. The +-- resulting field will always be nullable, even if the underlying parser +-- rejects `null` values; see Note [Optional fields and nullability] for more +-- details. +fieldWithDefault + :: (MonadParse m, 'Input <: k) + => Name + -> Maybe Description + -> Value Void -- ^ default value + -> Parser k m a + -> InputFieldsParser m a +fieldWithDefault name description defaultValue parser = InputFieldsParser + { ifDefinitions = [mkDefinition name description $ IFOptional (pType parser) (Just defaultValue)] + , ifParser = M.lookup name >>> withPath (++[Key (unName name)]) . \case + Just value -> peelVariableWith True expectedType value >>= parseValue expectedType + Nothing -> pInputParser parser $ GraphQLValue $ literal defaultValue + } + where + expectedType = Just $ toGraphQLType $ pType parser + parseValue _ value = pInputParser parser value + {- + FIXME!!!! + FIXME!!!! + + parseValue expectedType value = case value of + VVariable (var@Variable { vInfo, vValue }) -> do + typeCheck expectedType var + -- This case is tricky: if we get a nullable variable, we have to + -- pessimistically mark the query non-reusable, regardless of its + -- contents. Why? Well, suppose we have a type like + -- + -- type Foo { + -- bar(arg: Int = 42): String + -- } + -- + -- and suppose we receive the following query: + -- + -- query blah($var: Int) { + -- foo { + -- bar(arg: $var) + -- } + -- } + -- + -- Suppose no value is provided for $var, so it defaults to null. When + -- we parse the arg field, we see it has a default value, so we + -- substitute 42 for null and carry on. But now we’ve discarded the + -- information that this value came from a variable at all, so if we + -- cache the query plan, changes to the variable will be ignored, since + -- we’ll always use 42! + -- + -- Note that the problem doesn’t go away even if $var has a non-null + -- value. In that case, we’d simply have flipped the problem around: now + -- our cached query plan will do the wrong thing if $var *is* null, + -- since we won’t know to substitute 42. + -- + -- Theoretically, we could be smarter here: we could record a sort of + -- “derived variable reference” that includes a new default value. But + -- that would be more complicated, so for now we don’t do that. 
+ case vInfo of + VIRequired _ -> pInputParser parser value + VIOptional _ _ -> markNotReusable *> parseValue expectedType (literal vValue) + VNull -> pInputParser parser $ literal defaultValue + _ -> pInputParser parser value + -} + +-- | Creates a parser for a nullable field with no default value. If the field +-- is omitted, the provided parser /will not be called/. This allows a field to +-- distinguish an omitted field from a field supplied with @null@ (which is +-- permitted by the GraphQL specification); see Note [Optional fields and +-- nullability] for more details. +-- +-- If you want a field with a default value of @null@, combine 'field' with +-- 'nullable', instead. +fieldOptional + :: (MonadParse m, 'Input <: k) + => Name + -> Maybe Description + -> Parser k m a + -> InputFieldsParser m (Maybe a) +fieldOptional name description parser = InputFieldsParser + { ifDefinitions = [mkDefinition name description $ + IFOptional (nullableType $ pType parser) Nothing] + , ifParser = M.lookup name >>> withPath (++[Key (unName name)]) . + traverse (pInputParser parser <=< peelVariable expectedType) + } + where + expectedType = Just $ toGraphQLType $ nullableType $ pType parser + +-- | A variant of 'selectionSetObject' which doesn't implement any interfaces +selectionSet + :: MonadParse m + => Name + -> Maybe Description + -> [FieldParser m a] + -> Parser 'Output m (OMap.InsOrdHashMap Name (ParsedSelection a)) +selectionSet name desc fields = selectionSetObject name desc fields [] + +-- Should this rather take a non-empty `FieldParser` list? +-- See also Note [Selectability of tables]. +selectionSetObject + :: MonadParse m + => Name + -> Maybe Description + -> [FieldParser m a] + -- ^ Fields of this object, including any fields that are required from the + -- interfaces that it implements. Note that we can't derive those fields from + -- the list of interfaces (next argument), because the types of the fields of + -- the object are only required to be *subtypes* of the types of the fields of + -- the interfaces it implements. + -> [Parser 'Output m b] + -- ^ Interfaces implemented by this object; + -- see Note [The interfaces story] in Hasura.GraphQL.Parser.Schema. + -> Parser 'Output m (OMap.InsOrdHashMap Name (ParsedSelection a)) +selectionSetObject name description parsers implementsInterfaces = Parser + { pType = Nullable $ TNamed $ mkDefinition name description $ + TIObject $ ObjectInfo (map fDefinition parsers) interfaces + , pParser = \input -> withPath (++[Key "selectionSet"]) do + -- Not all fields have a selection set, but if they have one, it + -- must contain at least one field. The GraphQL parser returns a + -- list to represent this: an empty list indicates there was no + -- selection set, as an empty set is rejected outright. + -- Arguably, this would be better represented by a `Maybe + -- (NonEmpty a)`. + -- The parser can't know whether a given field needs a selection + -- set or not; but if we're in this function, it means that yes: + -- this field needs a selection set, and if none was provided, + -- we must fail. + when (null input) $ + parseError $ "missing selection set for " <>> name + + -- TODO(PDV) This probably accepts invalid queries, namely queries that use + -- type names that do not exist. 
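+      -- For example, a fragment with a type condition naming a nonexistent
+      -- type, such as @... on DoesNotExist { field }@, is silently dropped by
+      -- the flattening step in "Hasura.GraphQL.Parser.Collect" rather than
+      -- rejected with a validation error.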
+ fields <- collectFields (name:parsedInterfaceNames) input + for fields \selectionField@Field{ _fName, _fAlias } -> if + | _fName == $$(litName "__typename") -> + pure $ SelectTypename name + | Just parser <- M.lookup _fName parserMap -> + withPath (++[Key (unName _fName)]) $ + SelectField <$> parser selectionField + | otherwise -> + withPath (++[Key (unName _fName)]) $ + parseError $ "field " <> _fName <<> " not found in type: " <> squote name + } + where + parserMap = parsers + & map (\FieldParser{ fDefinition, fParser } -> (getName fDefinition, fParser)) + & M.fromList + interfaces = mapMaybe (getInterfaceInfo . pType) implementsInterfaces + parsedInterfaceNames = fmap getName interfaces + +selectionSetInterface + :: (MonadParse n, Traversable t) + => Name + -> Maybe Description + -> [FieldParser n a] + -- ^ Fields defined in this interface + -> t (Parser 'Output n b) + -- ^ Parsers for the object types that implement this interface; see + -- Note [The interfaces story] in Hasura.GraphQL.Parser.Schema for details. + -> Parser 'Output n (t b) +selectionSetInterface name description fields objectImplementations = Parser + { pType = Nullable $ TNamed $ mkDefinition name description $ + TIInterface $ InterfaceInfo (map fDefinition fields) objects + , pParser = \input -> for objectImplementations (($ input) . pParser) + -- Note: This is somewhat suboptimal, since it parses a query against every + -- possible object implementing this interface, possibly duplicating work for + -- fields defined on the interface itself. + -- + -- Furthermore, in our intended use case (Relay), based on a field argument, + -- we can decide which object we are about to retrieve, so in theory we could + -- save some work by only parsing against that object type. But it’s still + -- useful to parse against all of them, since it checks the validity of any + -- fragments on the other types. + } + where + objects = catMaybes $ toList $ fmap (getObjectInfo . pType) objectImplementations + +selectionSetUnion + :: (MonadParse n, Traversable t) + => Name + -> Maybe Description + -> t (Parser 'Output n b) -- ^ The member object types. + -> Parser 'Output n (t b) +selectionSetUnion name description objectImplementations = Parser + { pType = Nullable $ TNamed $ mkDefinition name description $ + TIUnion $ UnionInfo objects + , pParser = \input -> for objectImplementations (($ input) . pParser) + } + where + objects = catMaybes $ toList $ fmap (getObjectInfo . pType) objectImplementations + +-- | An "escape hatch" that doesn't validate anything and just gives the +-- requested selection set. This is unsafe because it does not check the +-- selection set for validity. +unsafeRawParser + :: forall m + . MonadParse m + => Type 'Output + -> Parser 'Output m (SelectionSet NoFragments Variable) +unsafeRawParser tp = Parser + { pType = tp + , pParser = pure + } + +unsafeRawField + :: forall m + . MonadParse m + => Definition FieldInfo + -> FieldParser m (Field NoFragments Variable) +unsafeRawField def = FieldParser + { fDefinition = def + , fParser = pure + } + +-- | Builds a 'FieldParser' for a field that does not take a subselection set, +-- i.e. a field that returns a scalar or enum. The field’s type is taken from +-- the provided 'Parser', but the 'Parser' is not otherwise used. +-- +-- See also Note [The delicate balance of GraphQL kinds] in "Hasura.GraphQL.Parser.Schema". +selection + :: forall m a b + . 
MonadParse m + => Name + -> Maybe Description + -> InputFieldsParser m a -- ^ parser for the input arguments + -> Parser 'Both m b -- ^ type of the result + -> FieldParser m a +selection name description argumentsParser resultParser = FieldParser + { fDefinition = mkDefinition name description $ + FieldInfo (ifDefinitions argumentsParser) (pType resultParser) + , fParser = \Field{ _fArguments, _fSelectionSet } -> do + unless (null _fSelectionSet) $ + parseError "unexpected subselection set for non-object field" + -- check for extraneous arguments here, since the InputFieldsParser just + -- handles parsing the fields it cares about + for_ (M.keys _fArguments) \argumentName -> + unless (argumentName `S.member` argumentNames) $ + parseError $ name <<> " has no argument named " <>> argumentName + withPath (++[Key "args"]) $ ifParser argumentsParser $ GraphQLValue <$> _fArguments + } + where + argumentNames = S.fromList (dName <$> ifDefinitions argumentsParser) + +-- | Builds a 'FieldParser' for a field that takes a subselection set, i.e. a +-- field that returns an object. +-- +-- See also Note [The delicate balance of GraphQL kinds] in "Hasura.GraphQL.Parser.Schema". +subselection + :: forall m a b + . MonadParse m + => Name + -> Maybe Description + -> InputFieldsParser m a -- ^ parser for the input arguments + -> Parser 'Output m b -- ^ parser for the subselection set + -> FieldParser m (a, b) +subselection name description argumentsParser bodyParser = FieldParser + { fDefinition = mkDefinition name description $ + FieldInfo (ifDefinitions argumentsParser) (pType bodyParser) + , fParser = \Field{ _fArguments, _fSelectionSet } -> do + -- check for extraneous arguments here, since the InputFieldsParser just + -- handles parsing the fields it cares about + for_ (M.keys _fArguments) \argumentName -> + unless (argumentName `S.member` argumentNames) $ + parseError $ name <<> " has no argument named " <>> argumentName + (,) <$> withPath (++[Key "args"]) (ifParser argumentsParser $ GraphQLValue <$> _fArguments) + <*> pParser bodyParser _fSelectionSet + } + where + argumentNames = S.fromList (dName <$> ifDefinitions argumentsParser) + +-- | A shorthand for a 'selection' that takes no arguments. +selection_ + :: MonadParse m + => Name + -> Maybe Description + -> Parser 'Both m a -- ^ type of the result + -> FieldParser m () +selection_ name description = selection name description (pure ()) + +-- | A shorthand for a 'subselection' that takes no arguments. 
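+--
+-- As a hypothetical sketch (every name below is a placeholder, not code from
+-- this patch), an object type with one scalar field and one nested object
+-- field could be assembled from these shorthands:
+--
+-- > objParser = selectionSet objName Nothing
+-- >   [ selection_ scalarFieldName Nothing string
+-- >   , void $ subselection_ nestedFieldName Nothing nestedObjParser
+-- >   ]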
+subselection_ + :: MonadParse m + => Name + -> Maybe Description + -> Parser 'Output m a -- ^ parser for the subselection set + -> FieldParser m a +subselection_ name description bodyParser = + snd <$> subselection name description (pure ()) bodyParser + + +-- ----------------------------------------------------------------------------- +-- helpers + +valueToJSON :: MonadParse m => GType -> InputValue Variable -> m A.Value +valueToJSON expected = peelVariable (Just expected) >=> valueToJSON' + where + valueToJSON' = \case + JSONValue j -> pure j + GraphQLValue g -> graphQLToJSON g + graphQLToJSON = \case + VNull -> pure A.Null + VInt i -> pure $ A.toJSON i + VFloat f -> pure $ A.toJSON f + VString t -> pure $ A.toJSON t + VBoolean b -> pure $ A.toJSON b + VEnum (EnumValue n) -> pure $ A.toJSON n + VList values -> A.toJSON <$> traverse graphQLToJSON values + VObject objects -> A.toJSON <$> traverse graphQLToJSON objects + VVariable variable -> valueToJSON' $ absurd <$> vValue variable + +jsonToGraphQL :: (MonadError Text m) => A.Value -> m (Value Void) +jsonToGraphQL = \case + A.Null -> pure VNull + A.Bool val -> pure $ VBoolean val + A.String val -> pure $ VString val + A.Number val -> case toBoundedInteger val of + Just intVal -> pure $ VInt $ fromIntegral @Int64 intVal + Nothing -> pure $ VFloat val + A.Array vals -> VList <$> traverse jsonToGraphQL (toList vals) + A.Object vals -> VObject . M.fromList <$> for (M.toList vals) \(key, val) -> do + graphQLName <- onNothing (mkName key) $ throwError $ + "variable value contains object with key " <> key <<> ", which is not a legal GraphQL name" + (graphQLName,) <$> jsonToGraphQL val + +peelVariable :: MonadParse m => Maybe GType -> InputValue Variable -> m (InputValue Variable) +peelVariable = peelVariableWith False + +peelVariableWith :: MonadParse m => Bool -> Maybe GType -> InputValue Variable -> m (InputValue Variable) +peelVariableWith hasLocationDefaultValue expected = \case + GraphQLValue (VVariable var) -> do + onJust expected \locationType -> typeCheck hasLocationDefaultValue locationType var + markNotReusable + pure $ absurd <$> vValue var + value -> pure value + +typeCheck :: MonadParse m => Bool -> GType -> Variable -> m () +typeCheck hasLocationDefaultValue locationType variable@Variable { vInfo, vType } = + unless (isVariableUsageAllowed hasLocationDefaultValue locationType variable) $ parseError + $ "variable " <> dquote (getName vInfo) <> " is declared as " + <> showGT vType <> ", but used where " + <> showGT locationType <> " is expected" + +typeMismatch :: MonadParse m => Name -> Text -> InputValue Variable -> m a +typeMismatch name expected given = parseError $ + "expected " <> expected <> " for type " <> name <<> ", but found " <> describeValue given + +describeValue :: InputValue Variable -> Text +describeValue = describeValueWith (describeValueWith absurd . 
vValue) + +describeValueWith :: (var -> Text) -> InputValue var -> Text +describeValueWith describeVariable = \case + JSONValue jval -> describeJSON jval + GraphQLValue gval -> describeGraphQL gval + where + describeJSON = \case + A.Null -> "null" + A.Bool _ -> "a boolean" + A.String _ -> "a string" + A.Number _ -> "a number" + A.Array _ -> "a list" + A.Object _ -> "an object" + describeGraphQL = \case + VVariable var -> describeVariable var + VInt _ -> "an integer" + VFloat _ -> "a float" + VString _ -> "a string" + VBoolean _ -> "a boolean" + VNull -> "null" + VEnum _ -> "an enum value" + VList _ -> "a list" + VObject _ -> "an object" + +-- | Checks whether the type of a variable is compatible with the type +-- at the location at which it is used. This is an implementation of +-- the function described in section 5.8.5 of the spec: +-- http://spec.graphql.org/June2018/#sec-All-Variable-Usages-are-Allowed +-- No input type coercion is allowed between variables: coercion +-- rules only apply when translating a value from a literal. It is +-- therefore not allowed to use an Int variable at a Float location, +-- despite the fact that it is legal to use an Int literal at a +-- Float location. +-- Furthermore, it's also worth noting that there's one tricky case +-- where we might allow a nullable variable at a non-nullable +-- location: when either side has a non-null default value. That's +-- because GraphQL conflates nullability and optional fields (see +-- Note [Optional fields and nullability] for more details). +isVariableUsageAllowed + :: Bool -- ^ does the location have a default value + -> GType -- ^ the location type + -> Variable -- ^ the variable + -> Bool +isVariableUsageAllowed hasLocationDefaultValue locationType variable + | isNullable locationType = areTypesCompatible locationType variableType + | not $ isNullable variableType = areTypesCompatible locationType variableType + | hasLocationDefaultValue = areTypesCompatible locationType variableType + | hasNonNullDefault variable = areTypesCompatible locationType variableType + | otherwise = False + where + areTypesCompatible = compareTypes `on` \case + TypeNamed _ n -> TypeNamed (Nullability True) n + TypeList _ n -> TypeList (Nullability True) n + variableType = vType variable + hasNonNullDefault = vInfo >>> \case + VIRequired _ -> False + VIOptional _ value -> value /= VNull + compareTypes = curry \case + (TypeList lNull lType, TypeList vNull vType) + -> checkNull lNull vNull && areTypesCompatible lType vType + (TypeNamed lNull lType, TypeNamed vNull vType) + -> checkNull lNull vNull && lType == vType + _ -> False + checkNull (Nullability expectedNull) (Nullability actualNull) = + expectedNull || not actualNull diff --git a/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs-boot b/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs-boot new file mode 100644 index 00000000000..782c6e2078b --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Internal/Parser.hs-boot @@ -0,0 +1,23 @@ +module Hasura.GraphQL.Parser.Internal.Parser where + +import Hasura.Prelude + +import qualified Data.Kind as K + +import Language.GraphQL.Draft.Syntax + + +import {-# SOURCE #-} Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Schema + +type role Parser nominal representational nominal +data Parser (k :: Kind) (m :: K.Type -> K.Type) (a :: K.Type) + +runParser :: Parser k m a -> ParserInput k -> m a + +type family ParserInput k where + ParserInput 'Both = InputValue Variable + ParserInput 'Input = InputValue Variable + ParserInput 
'Output = SelectionSet NoFragments Variable + +boolean :: MonadParse m => Parser 'Both m Bool diff --git a/server/src-lib/Hasura/GraphQL/Parser/Monad.hs b/server/src-lib/Hasura/GraphQL/Parser/Monad.hs new file mode 100644 index 00000000000..573cc26c594 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Monad.hs @@ -0,0 +1,192 @@ +{-# LANGUAGE StrictData #-} + +-- | Monad transformers for GraphQL schema construction and query parsing. +module Hasura.GraphQL.Parser.Monad + ( SchemaT + , runSchemaT + + , ParseT + , runParseT + , ParseError(..) + ) where + +import Hasura.Prelude + +import qualified Data.Dependent.Map as DM +import qualified Data.Kind as K +import qualified Data.Sequence.NonEmpty as NE +import qualified Language.Haskell.TH as TH + +import Control.Monad.Unique +import Control.Monad.Validate +import Data.Dependent.Map (DMap) +import Data.GADT.Compare.Extended +import Data.IORef +import Data.Parser.JSONPath +import Data.Proxy (Proxy (..)) +import System.IO.Unsafe (unsafeInterleaveIO) +import Type.Reflection ((:~:) (..), Typeable, typeRep) + +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Internal.Parser +import Hasura.GraphQL.Parser.Schema +import Hasura.RQL.Types.Error (Code) + +-- ------------------------------------------------------------------------------------------------- +-- schema construction + +newtype SchemaT n m a = SchemaT + { unSchemaT :: StateT (DMap ParserId (ParserById n)) m a + } deriving (Functor, Applicative, Monad, MonadError e) + +runSchemaT :: forall m n a . Monad m => SchemaT n m a -> m a +runSchemaT = flip evalStateT mempty . unSchemaT + +-- | see Note [SchemaT requires MonadIO] +instance (MonadIO m, MonadUnique m, MonadParse n) + => MonadSchema n (SchemaT n m) where + memoizeOn name key buildParser = SchemaT do + let parserId = ParserId name key + parsersById <- get + case DM.lookup parserId parsersById of + Just (ParserById parser) -> pure parser + Nothing -> do + -- We manually do eager blackholing here using a MutVar rather than + -- relying on MonadFix and ordinary thunk blackholing. Why? A few + -- reasons: + -- + -- 1. We have more control. We aren’t at the whims of whatever + -- MonadFix instance happens to get used. + -- + -- 2. We can be more precise. GHC’s lazy blackholing doesn’t always + -- kick in when you’d expect. + -- + -- 3. We can provide more useful error reporting if things go wrong. + -- Most usefully, we can include a HasCallStack source location. + cell <- liftIO $ newIORef Nothing + + -- We use unsafeInterleaveIO here, which sounds scary, but + -- unsafeInterleaveIO is actually far more safe than unsafePerformIO. + -- unsafeInterleaveIO just defers the execution of the action until its + -- result is needed, adding some laziness. + -- + -- That laziness can be dangerous if the action has side-effects, since + -- the point at which the effect is performed can be unpredictable. But + -- this action just reads, never writes, so that isn’t a concern. + parserById <- liftIO $ unsafeInterleaveIO $ readIORef cell >>= \case + Just parser -> pure $ ParserById parser + Nothing -> error $ unlines + [ "memoize: parser was forced before being fully constructed" + , " parser constructor: " ++ TH.pprint name ] + put $! DM.insert parserId parserById parsersById + + unique <- newUnique + parser <- addDefinitionUnique unique <$> unSchemaT buildParser + liftIO $ writeIORef cell (Just parser) + pure parser + +-- We can add a reader in two places. I'm not sure which one is the correct +-- one. 
But since we don't seem to change the values that are being read, I +-- don't think it matters. + +deriving instance Monad m => MonadReader a (SchemaT n (ReaderT a m)) + +instance (MonadIO m, MonadUnique m, MonadParse n) + => MonadSchema n (ReaderT a (SchemaT n m)) where + memoizeOn name key = mapReaderT (memoizeOn name key) + +{- Note [SchemaT requires MonadIO] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +The MonadSchema instance for SchemaT requires MonadIO, which is unsatisfying. +The only reason the constraint is needed is to implement knot-tying via IORefs +(see Note [Tying the knot] in Hasura.GraphQL.Parser.Class), which really only +requires the power of ST. Using ST would be much nicer, since we could discharge +the burden locally, but unfortunately we also want to use MonadUnique, which +is handled by IO in the end. + +This means that we need IO at the base of our monad, so to use STRefs, we’d need +a hypothetical STT transformer (i.e. a monad transformer version of ST). But +such a thing isn’t safe in general, since reentrant monads like ListT or ContT +would incorrectly share state between the different threads of execution. + +In theory, this can be resolved by using something like Vault (from the vault +package) to create “splittable” sets of variable references. That would allow +you to create a transformer with an STRef-like interface that works over any +arbitrary monad. However, while the interface would be safe, the implementation +of such an abstraction requires unsafe primitives, and to the best of my +knowledge no such transformer exists in any existing libraries. + +So we decide it isn’t worth the trouble and just use MonadIO. If `eff` ever pans +out, it should be able to support this more naturally, so we can fix it then. -} + +-- | A key used to distinguish calls to 'memoize'd functions. The 'TH.Name' +-- distinguishes calls to completely different parsers, and the @a@ value +-- records the arguments. +data ParserId (t :: (Kind, K.Type)) where + ParserId :: (Ord a, Typeable a, Typeable b, Typeable k) => TH.Name -> a -> ParserId '(k, b) + +instance GEq ParserId where + geq (ParserId name1 (arg1 :: a1) :: ParserId t1) + (ParserId name2 (arg2 :: a2) :: ParserId t2) + | _ :: Proxy '(k1, b1) <- Proxy @t1 + , _ :: Proxy '(k2, b2) <- Proxy @t2 + , name1 == name2 + , Just Refl <- typeRep @a1 `geq` typeRep @a2 + , arg1 == arg2 + , Just Refl <- typeRep @k1 `geq` typeRep @k2 + , Just Refl <- typeRep @b1 `geq` typeRep @b2 + = Just Refl + | otherwise = Nothing + +instance GCompare ParserId where + gcompare (ParserId name1 (arg1 :: a1) :: ParserId t1) + (ParserId name2 (arg2 :: a2) :: ParserId t2) + | _ :: Proxy '(k1, b1) <- Proxy @t1 + , _ :: Proxy '(k2, b2) <- Proxy @t2 + = strengthenOrdering (compare name1 name2) + `extendGOrdering` gcompare (typeRep @a1) (typeRep @a2) + `extendGOrdering` strengthenOrdering (compare arg1 arg2) + `extendGOrdering` gcompare (typeRep @k1) (typeRep @k2) + `extendGOrdering` gcompare (typeRep @b1) (typeRep @b2) + `extendGOrdering` GEQ + +-- | A newtype wrapper around a 'Parser' that rearranges the type parameters +-- so that it can be indexed by a 'ParserId' in a 'DMap'. +-- +-- This is really just a single newtype, but it’s implemented as a data family +-- because GHC doesn’t allow ordinary datatype declarations to pattern-match on +-- type parameters, and we want to match on the tuple. 
+data family ParserById (m :: K.Type -> K.Type) (a :: (Kind, K.Type)) +newtype instance ParserById m '(k, a) = ParserById (Parser k m a) + +-- ------------------------------------------------------------------------------------------------- +-- query parsing + +newtype ParseT m a = ParseT + { unParseT :: ReaderT JSONPath (StateT QueryReusability (ValidateT (NESeq ParseError) m)) a + } deriving (Functor, Applicative, Monad) + +runParseT + :: Functor m + => ParseT m a + -> m (Either (NESeq ParseError) (a, QueryReusability)) +runParseT = unParseT + >>> flip runReaderT [] + >>> flip runStateT mempty + >>> runValidateT + +instance MonadTrans ParseT where + lift = ParseT . lift . lift . lift + +instance Monad m => MonadParse (ParseT m) where + withPath f x = ParseT $ withReaderT f $ unParseT x + parseErrorWith code text = ParseT $ do + path <- ask + lift $ refute $ NE.singleton ParseError{ peCode = code, pePath = path, peMessage = text } + markNotReusable = ParseT $ lift $ put NotReusable + +data ParseError = ParseError + { pePath :: JSONPath + , peMessage :: Text + , peCode :: Code + } diff --git a/server/src-lib/Hasura/GraphQL/Parser/Schema.hs b/server/src-lib/Hasura/GraphQL/Parser/Schema.hs new file mode 100644 index 00000000000..654fd79e0ca --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Parser/Schema.hs @@ -0,0 +1,803 @@ +{-# LANGUAGE StrictData #-} + +-- | Types for representing a GraphQL schema. +module Hasura.GraphQL.Parser.Schema ( + -- * Kinds + Kind(..) + , (:<:)(..) + , type (<:)(..) + + -- * Types + , Type(..) + , NonNullableType(..) + , TypeInfo(..) + , SomeTypeInfo(..) + , eqType + , eqNonNullableType + , eqTypeInfo + , discardNullability + , nullableType + , nonNullableType + , toGraphQLType + , getObjectInfo + , getInterfaceInfo + + , EnumValueInfo(..) + , InputFieldInfo(..) + , FieldInfo(..) + , InputObjectInfo(..) + , ObjectInfo(..) + , InterfaceInfo(..) + , UnionInfo(..) + + -- * Definitions + , Definition(..) + , mkDefinition + , addDefinitionUnique + , HasDefinition(..) + + -- * Schemas + , Schema(..) + , ConflictingDefinitions(..) + , HasTypeDefinitions(..) + , collectTypeDefinitions + + -- * Miscellany + , HasName(..) + , InputValue(..) + , Variable(..) + , VariableInfo(..) + , DirectiveInfo(..) + ) where + +import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict.Extended as Map +import qualified Data.HashSet as Set +import Data.Hashable ( Hashable (..) ) + +import Control.Lens.Extended +import Control.Monad.Unique +import Data.Functor.Classes +import Language.GraphQL.Draft.Syntax ( Description (..), Name (..) + , Value (..), Nullability(..) + , GType (..), DirectiveLocation(..) + ) + +class HasName a where + getName :: a -> Name +instance HasName Name where + getName = id + +-- | GraphQL types are divided into two classes: input types and output types. +-- The GraphQL spec does not use the word “kind” to describe these classes, but +-- it’s an apt term. +-- +-- Some GraphQL types can be used at either kind, so we also include the 'Both' +-- kind, the superkind of both 'Input' and 'Output'. The '<:' class provides +-- kind subsumption constraints. +-- +-- For more details, see Note [The 'Both kind] below. +data Kind + = Both -- ^ see Note [The 'Both kind] + | Input + | Output + +{- Note [The 'Both kind] +~~~~~~~~~~~~~~~~~~~~~~~~ +As described in the Haddock comments for Kind and <:, we use Kind to index +various types, such as Type and Parser. 
We use this to enforce various +correctness constraints mandated by the GraphQL spec; for example, we don’t +allow input object fields to have output types and we don’t allow output object +fields to have input types. + +But scalars and enums can be used as input types *or* output types. A natural +encoding of that in Haskell would be to make constructors for those types +polymorphic, like this: + + data Kind = Input | Output + + data TypeInfo k where + TIScalar :: TypeInfo k -- \ Polymorphic! + TIEnum :: ... -> TypeInfo k -- / + TIInputObject :: ... -> TypeInfo 'Input + TIObject :: ... -> TypeInfo 'Output + +Naturally, this would give the `scalar` parser constructor a similarly +polymorphic type: + + scalar + :: MonadParse m + => Name + -> Maybe Description + -> ScalarRepresentation a + -> Parser k m a -- Polymorphic! + +But if we actually try that, we run into problems. The trouble is that we want +to use the Kind to influence several different things: + + * As mentioned above, we use it to ensure that the types we generate are + well-kinded according to the GraphQL spec rules. + + * We use it to determine what a Parser consumes as input. Parsers for input + types parse GraphQL input values, but Parsers for output types parse + selection sets. (See Note [The meaning of Parser 'Output] in + Hasura.GraphQL.Parser.Internal.Parser for an explanation of why.) + + * We use it to know when to expect a sub-selection set for a field of an + output object (see Note [The delicate balance of GraphQL kinds]). + +These many uses of Kind cause some trouble for a polymorphic representation. For +example, consider our `scalar` parser constructor above---if we were to +instantiate it at kind 'Output, we’d receive a `Parser 'Output`, which we would +then expect to be able to apply to a selection set. But that doesn’t make any +sense, since scalar fields don’t have selection sets! + +Another issue with this representation has to do with effectful parser +constructors (such as constructors that can throw errors). These have types like + + mkFooParser :: MonadSchema n m => Blah -> m (Parser k n Foo) + +where the parser construction is itself monadic. This causes some annoyance, +since even if mkFooParser returns a Parser of a polymorphic kind, code like this +will not typecheck: + + (fooParser :: forall k. Parser k n Foo) <- mkFooParser blah + +The issue is that we have to instantiate k to a particular type to be able to +call mkFooParser. If we want to use the result at both kinds, we’d have to call +mkFooParser twice: + + (fooInputParser :: Parser 'Input n Foo) <- mkFooParser blah + (fooOutputParser :: Parser 'Output n Foo) <- mkFooParser blah + +Other situations encounter similar difficulties, and they are not easy to +resolve without impredicative polymorphism (which GHC does not support). + +To avoid this problem, we don’t use polymorphic kinds, but instead introduce a +form of kind subsumption. Types that can be used as both input and output types +are explicitly given the kind 'Both. This allows us to get the best of both +worlds: + + * We use the <: typeclass to accept 'Both in most places where we expect + either input or output types. + + * We can treat 'Both specially to avoid requiring `scalar` to supply a + selection set parser (see Note [The delicate balance of GraphQL kinds] for + further explanation). + + * Because we avoid the polymorphism, we don’t run into the aforementioned + issue with monadic parser constructors. 
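To make the subsumption constraint concrete, here is roughly the shape of a combinator that consumes input types, modeled on the `field` combinator in Hasura.GraphQL.Parser.Internal.Parser (a simplified sketch, not the verbatim definition):

    field
      :: (MonadParse m, 'Input <: k)
      => Name
      -> Maybe Description
      -> Parser k m a
      -> InputFieldsParser m a

Thanks to the ('Input <: k) constraint, such a combinator accepts both a Parser 'Input and a Parser 'Both (e.g. the result of `scalar`), while a Parser 'Output is rejected by the type checker.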
+ +All of this is subtle and somewhat complicated, but unfortunately there isn’t +much of a way around that: GraphQL is subtle and complicated. Our use of an +explicit 'Both kind isn’t the only way to encode these things, but it’s the +particular set of compromises we’ve chosen to accept. + +Note [The delicate balance of GraphQL kinds] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +As discussed in Note [The 'Both kind], we use GraphQL kinds to distinguish +several different things. One of them is which output types take sub-selection +sets. For example, scalars don’t accept sub-selection sets, so if we have a +schema like + + type Query { + users: [User!]! + } + + type User { + id: Int! + } + +then the following query is illegal: + + query { + users { + id { + blah + } + } + } + +The id field has a scalar type, so it should not take a sub-selection set. This +is actually something we care about distinguishing at the type level, because it +affects the type of the `selection` parser combinator. Suppose we have a +`Parser 'Output m UserQuery` for the User type. When we parse a field with that +type, we expect to receive a UserQuery as a result, unsurprisingly. But what if +we parse an output field using the `int` parser, which has this type: + + int :: MonadParse m => Parser 'Both m Int32 + +If we follow the same logic as for the User parser above, we’d expect to receive +an Int32 as a result... but that doesn’t make any sense, since the Int32 +corresponds to the result *we* are supposed to produce as a result of executing +the query, not something user-specified. + +One way to solve this would be to associate every Parser with two result types: +one when given an input object, and one when given a selection set. Then our +parsers could be given these types, instead: + + user :: MonadParse m => Parser 'Output m Void UserQuery + int :: MonadParse m => Parser 'Both m Int32 () + +But if you work through this, you’ll find that *all* parsers will either have +Void or () for at least one of their input result types or their output result +types, depending on their kind: + + * All 'Input parsers must have Void for their output result type, since they + aren’t allowed to be used in output contexts at all. + + * All 'Output parsers must have Void for their input result type, since they + aren’t allowed to be used in input contexts at all. + + * That just leaves 'Both. The only types of kind 'Both are scalars and enums, + neither of which accept a sub-selection set. Their output result type would + therefore be (), since they are allowed to appear in output contexts, but + they don’t return any results. + +The end result of this is that we clutter all our types with Voids and ()s, with +little actual benefit. + +If you really think about it, the fact that no types of kind 'Both accept a +sub-selection set is really something of a coincidence. In theory, one could +imagine a future version of the GraphQL spec adding a type that can be used as +both an input type and an output type, but accepts a sub-selection set. If that +ever happens, we’ll have to tweak our encoding, but for now, we can take +advantage of this happy coincidence and make the kinds serve double duty: + + * We can make `ParserInput 'Both` identical to `ParserInput 'Input`, since + all parsers of kind 'Both only parse input values. + + * We can require types of kind 'Both in `selection`, which does not expect a + sub-selection set, and types of kind 'Output in `subselection`, which does. 
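Concretely, this is why the two field combinators defined earlier in this patch demand different kinds; their simplified types are repeated here for illustration:

    selection    :: MonadParse m => Name -> Maybe Description
                 -> InputFieldsParser m a -> Parser 'Both m b -> FieldParser m a
    subselection :: MonadParse m => Name -> Maybe Description
                 -> InputFieldsParser m a -> Parser 'Output m b -> FieldParser m (a, b)

Requiring Parser 'Both rules out a sub-selection set at the type level, while requiring Parser 'Output demands one.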
+ +Relying on this coincidence might seem a little gross, and perhaps it is +somewhat. But it’s enormously convenient: not doing this would make some types +significantly more complicated, since we would have to thread around more +information at the type level and we couldn’t make as many simplifying +assumptions. So until GraphQL adds a type that violates these assumptions, we +are happy to take advantage of this coincidence. -} + +-- | Evidence for '<:'. +data k1 :<: k2 where + KRefl :: k :<: k + KBoth :: k :<: 'Both + +-- | 'Kind' subsumption. The GraphQL kind hierarchy is extremely simple: +-- +-- > Both +-- > / \ +-- > Input Output +-- +-- Various functions in this module use '<:' to allow 'Both' to be used in +-- places where 'Input' or 'Output' would otherwise be expected. +class k1 <: k2 where + subKind :: k1 :<: k2 +instance k1 ~ k2 => k1 <: k2 where + subKind = KRefl +instance {-# OVERLAPPING #-} k <: 'Both where + subKind = KBoth + +data Type k + = NonNullable (NonNullableType k) + | Nullable (NonNullableType k) + +instance Eq (Type k) where + (==) = eqType + +-- | Like '==', but can compare 'Type's of different kinds. +eqType :: Type k1 -> Type k2 -> Bool +eqType (NonNullable a) (NonNullable b) = eqNonNullableType a b +eqType (Nullable a) (Nullable b) = eqNonNullableType a b +eqType _ _ = False + +instance HasName (Type k) where + getName = getName . discardNullability + +instance HasDefinition (Type k) (TypeInfo k) where + definitionLens f (NonNullable t) = NonNullable <$> definitionLens f t + definitionLens f (Nullable t) = Nullable <$> definitionLens f t + +discardNullability :: Type k -> NonNullableType k +discardNullability (NonNullable t) = t +discardNullability (Nullable t) = t + +nullableType :: Type k -> Type k +nullableType = Nullable . discardNullability + +nonNullableType :: Type k -> Type k +nonNullableType = NonNullable . discardNullability + +data NonNullableType k + = TNamed (Definition (TypeInfo k)) + | TList (Type k) + +instance Eq (NonNullableType k) where + (==) = eqNonNullableType + +toGraphQLType :: Type k -> GType +toGraphQLType = \case + NonNullable t -> translateWith False t + Nullable t -> translateWith True t + where + translateWith nullability = \case + TNamed typeInfo -> TypeNamed (Nullability nullability) $ getName typeInfo + TList typeInfo -> TypeList (Nullability nullability) $ toGraphQLType typeInfo + + +-- | Like '==', but can compare 'NonNullableType's of different kinds. +eqNonNullableType :: NonNullableType k1 -> NonNullableType k2 -> Bool +eqNonNullableType (TNamed a) (TNamed b) = liftEq eqTypeInfo a b +eqNonNullableType (TList a) (TList b) = eqType a b +eqNonNullableType _ _ = False + +instance HasName (NonNullableType k) where + getName (TNamed definition) = getName definition + getName (TList t) = getName t + +instance HasDefinition (NonNullableType k) (TypeInfo k) where + definitionLens f (TNamed definition) = TNamed <$> f definition + definitionLens f (TList t) = TList <$> definitionLens f t + +{- Note [The interfaces story] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +GraphQL interfaces are not conceptually complicated, but they pose some +non-obvious challenges for our implementation. 
First, familiarize yourself with +GraphQL interfaces themselves: + + * https://graphql.org/learn/schema/#interfaces + * http://spec.graphql.org/June2018/#sec-Interfaces + * http://spec.graphql.org/June2018/#sec-Objects + +The most logical representation of object and interface types is to have objects +reference the interfaces they implement, but not the other way around. After +all, that’s how it works in the GraphQL language: when you declare an interface, +you just specify its fields, and you specify which interfaces each object type +implements as part of its declaration. + +However, this representation is actually not very useful for us. We /also/ need +the interfaces to reference the objects that implement them---forming a circular +structure---for two reasons: + + 1. Most directly, we need this information for introspection queries. + Introspection queries for object types return the set of interfaces they + implement, and introspection + queries for interfaces return the set of object types that implement them. + + 2. Less obviously, it’s more natural to specify the relationships “backwards” + like this when building the schema using the parser combinator language. + + From the parser’s point of view, each implementation of an interface + corresponds to a distinct parsing possibility. For example, when we + generate a Relay schema, the type of the `node` root field is an interface, + and each table is a type that implements it: + + type query_root { + node(id: ID!): Node + ... + } + + interface Node { + id: ID! + } + + type author implements Node { + id: ID! + name: String! + ... + } + + type article implements Node { + id: ID! + title: String! + body: String! + ... + } + + A query will use fragments on the Node type to access table-specific fields: + + query get_article_info($article_id: ID!) { + node(id: $article_id) { + ... on article { + title + body + } + } + } + + The query parser needs to know which types implement the interface (and + how to parse their selection sets) so that it can parse the fragments. + +This presents some complications, since we need to build this information in a +circular fashion. Currently, we do this in a very naïve way: + + * We require selectionSetObject to specify the interfaces it implements /and/ + require selectionSetInterface to specify the objects that implement it. + + * We take advantage of our existing memoization mechanism to do the knot-tying + for us (see Note [Tying the knot] in Hasura.GraphQL.Parser.Class). + +You may notice that this makes it possible for the definitions to be +inconsistent: we could construct an interface parser that parses some object +type, but forget to specify that the object type implements the interface. This +inconsistency is currently completely unchecked, which is quite unfortunate. It +also means we don’t support remote schema-defined object types that implement +interfaces we generate, since we don’t know anything about those types when we +construct the interface. + +Since we don’t make very much use of interface types at the time of this +writing, this isn’t much of a problem in practice. But if that changes, it would +be worth implementing a more sophisticated solution that can gather up all the +different sources of information and make sure they’re consistent. -} + +data InputObjectInfo = InputObjectInfo ~[Definition InputFieldInfo] +-- Note that we can't check for equality of the fields since there may be +-- circularity. So we instead check for equality of names.
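(One consequence of this name-based comparison, worth keeping in mind when reading the Eq instances that follow: two definitions whose fields agree in name but differ in type or description still compare equal.)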
+instance Eq InputObjectInfo where + InputObjectInfo fields1 == InputObjectInfo fields2 + = Set.fromList (fmap dName fields1) == Set.fromList (fmap dName fields2) + +data ObjectInfo = ObjectInfo + { oiFields :: ~[Definition FieldInfo] + -- ^ The fields that this object has. This consists of the fields of the + -- interfaces that it implements, as well as any additional fields. + , oiImplements :: ~[Definition InterfaceInfo] + -- ^ The interfaces that this object implements (inheriting all their + -- fields). See Note [The interfaces story] for more details. + } +-- Note that we can't check for equality of the fields and the interfaces since +-- there may be circularity. So we instead check for equality of names. +instance Eq ObjectInfo where + ObjectInfo fields1 interfaces1 == ObjectInfo fields2 interfaces2 + = Set.fromList (fmap dName fields1 ) == Set.fromList (fmap dName fields2 ) + && Set.fromList (fmap dName interfaces1) == Set.fromList (fmap dName interfaces2) + +-- | Type information for a GraphQL interface; see Note [The interfaces story] +-- for more details. +-- +-- Note: in the current working draft of the GraphQL specification (> June +-- 2018), interfaces may implement other interfaces, but we currently don't +-- support this. +data InterfaceInfo = InterfaceInfo + { iiFields :: ~[Definition FieldInfo] + -- ^ Fields declared by this interface. Every object implementing this + -- interface must include those fields. + , iiPossibleTypes :: ~[Definition ObjectInfo] + -- ^ Objects that implement this interface. See Note [The interfaces story] + -- for why we include that information here. + } +-- Note that we can't check for equality of the fields and the interfaces since +-- there may be circularity. So we instead check for equality of names. +instance Eq InterfaceInfo where + InterfaceInfo fields1 objects1 == InterfaceInfo fields2 objects2 + = Set.fromList (fmap dName fields1 ) == Set.fromList (fmap dName fields2 ) + && Set.fromList (fmap dName objects1 ) == Set.fromList (fmap dName objects2 ) + +data UnionInfo = UnionInfo + { uiPossibleTypes :: ~[Definition ObjectInfo] + -- ^ The member object types of this union. + } + +data TypeInfo k where + TIScalar :: TypeInfo 'Both + TIEnum :: NonEmpty (Definition EnumValueInfo) -> TypeInfo 'Both + TIInputObject :: InputObjectInfo -> TypeInfo 'Input + TIObject :: ObjectInfo -> TypeInfo 'Output + TIInterface :: InterfaceInfo -> TypeInfo 'Output + TIUnion :: UnionInfo -> TypeInfo 'Output + +instance Eq (TypeInfo k) where + (==) = eqTypeInfo + +-- | Like '==', but can compare 'TypeInfo's of different kinds. +eqTypeInfo :: TypeInfo k1 -> TypeInfo k2 -> Bool +eqTypeInfo TIScalar TIScalar = True +eqTypeInfo (TIEnum values1) (TIEnum values2) + = Set.fromList (toList values1) == Set.fromList (toList values2) +-- NB the case for input objects currently has quadratic complexity, which is +-- probably avoidable. HashSets should be able to get this down to +-- O(n*log(n)). But this requires writing some Hashable instances by hand +-- because we use some existential types and GADTs. +eqTypeInfo (TIInputObject ioi1) (TIInputObject ioi2) = ioi1 == ioi2 +eqTypeInfo (TIObject oi1) (TIObject oi2) = oi1 == oi2 +eqTypeInfo (TIInterface ii1) (TIInterface ii2) = ii1 == ii2 +eqTypeInfo (TIUnion (UnionInfo objects1)) (TIUnion (UnionInfo objects2)) + = Set.fromList (fmap dName objects1) == Set.fromList (fmap dName objects2) +eqTypeInfo _ _ = False + +getObjectInfo :: Type k -> Maybe (Definition ObjectInfo) +getObjectInfo = traverse getTI . 
(^.definitionLens) + where + getTI :: TypeInfo k -> Maybe ObjectInfo + getTI (TIObject oi) = Just oi + getTI _ = Nothing + +getInterfaceInfo :: Type 'Output -> Maybe (Definition InterfaceInfo) +getInterfaceInfo = traverse getTI . (^.definitionLens) + where + getTI :: TypeInfo 'Output -> Maybe InterfaceInfo + getTI (TIInterface ii) = Just ii + getTI _ = Nothing + +data SomeTypeInfo = forall k. SomeTypeInfo (TypeInfo k) + +instance Eq SomeTypeInfo where + SomeTypeInfo a == SomeTypeInfo b = eqTypeInfo a b + +data Definition a = Definition + { dName :: Name + , dUnique :: Maybe Unique + -- ^ A unique identifier used to break cycles in mutually-recursive type + -- definitions. If two 'Definition's have the same 'Unique', they can be + -- assumed to be identical. Note that the inverse is /not/ true: two + -- definitions with different 'Unique's might still be otherwise identical. + -- + -- Also see Note [Tying the knot] in Hasura.GraphQL.Parser.Class. + , dDescription :: Maybe Description + , dInfo :: ~a + -- ^ Lazy to allow mutually-recursive type definitions. + } deriving (Functor, Foldable, Traversable, Generic) +instance Hashable a => Hashable (Definition a) where + hashWithSalt salt Definition{..} = + salt `hashWithSalt` dName `hashWithSalt` dInfo + +mkDefinition :: Name -> Maybe Description -> a -> Definition a +mkDefinition name description info = Definition name Nothing description info + +instance Eq a => Eq (Definition a) where + (==) = eq1 + +instance Eq1 Definition where + liftEq eq (Definition name1 maybeUnique1 _ info1) + (Definition name2 maybeUnique2 _ info2) + | Just unique1 <- maybeUnique1 + , Just unique2 <- maybeUnique2 + , unique1 == unique2 + = True + | otherwise + = name1 == name2 && eq info1 info2 + +instance HasName (Definition a) where + getName = dName + +class HasDefinition s a | s -> a where + definitionLens :: Lens' s (Definition a) +instance HasDefinition (Definition a) a where + definitionLens = id + +-- | Adds a 'Unique' to a 'Definition' that does not yet have one. If the +-- definition already has a 'Unique', the existing 'Unique' is kept. +addDefinitionUnique :: HasDefinition s a => Unique -> s -> s +addDefinitionUnique unique = over definitionLens \definition -> + definition { dUnique = dUnique definition <|> Just unique } + +-- | Enum values have no extra information except for the information common to +-- all definitions, so this is just a placeholder for use as @'Definition' +-- 'EnumValueInfo'@. +data EnumValueInfo = EnumValueInfo + deriving (Eq, Generic) +instance Hashable EnumValueInfo + +data InputFieldInfo + -- | A required field with a non-nullable type. + = forall k. ('Input <: k) => IFRequired (NonNullableType k) + -- | An optional input field with a nullable type and possibly a default + -- value. If a default value is provided, it should be a valid value for the + -- type. + -- + -- Note that a default value of 'VNull' is subtly different from having no + -- default value at all. If no default value is provided, the GraphQL + -- specification allows distinguishing provided @null@ values from values left + -- completely absent; see Note [Optional fields and nullability] in + -- Hasura.GraphQL.Parser.Internal.Parser. + | forall k. ('Input <: k) => IFOptional (Type k) (Maybe (Value Void)) + +instance Eq InputFieldInfo where + IFRequired t1 == IFRequired t2 = eqNonNullableType t1 t2 + IFOptional t1 v1 == IFOptional t2 v2 = eqType t1 t2 && v1 == v2 + _ == _ = False + +data FieldInfo = forall k. 
('Output <: k) => FieldInfo + { fArguments :: [Definition InputFieldInfo] + , fType :: Type k + } + +instance Eq FieldInfo where + FieldInfo args1 t1 == FieldInfo args2 t2 = args1 == args2 && eqType t1 t2 + +{- Note [Parsing variable values] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +GraphQL includes its own tiny language for input values, which is similar to +JSON but not quite the same---GraphQL input values can be enum values, and there +are restrictions on the names of input object keys. Despite these differences, +variables’ values are passed as JSON, so we actually need to be able to parse +values expressed in both languages. + +It’s tempting to contain this complexity by simply converting the JSON values to +GraphQL input values up front, and for booleans, numbers, arrays, and most +objects, this conversion is viable. But JSON strings pose a problem, since they +are used to represent both GraphQL strings and GraphQL enums. For example, +consider a query like this: + + enum FooBar { + FOO + BAR + } + + query some_query($a: String, $b: FooBar) { + ... + } + +We might receive an accompanying variables payload like this: + + { + "a": "FOO", + "b": "FOO" + } + +To properly convert these JSON values to GraphQL, we’d need to use the type +information to guide the parsing. Since $a has type String, its value should be +parsed as the GraphQL string "FOO", while $b has type FooBar, so its value +should be parsed as the GraphQL enum value FOO. + +We could do this type-directed parsing, but there are some advantages to being +lazier. For one, we can use JSON values directly when they are used as a column +value of type json or jsonb, rather than converting them to GraphQL and back, +which, in turn, solves another problem with JSON objects: JSON object keys are +arbitrary strings, while GraphQL input object keys are GraphQL names, and +therefore restricted: not all JSON objects can be represented by a GraphQL input object. + +Arguably such columns should really be represented as strings containing encoded +JSON, not GraphQL lists/objects, but the decision to treat them otherwise is +old, and it would be backwards-incompatible to change now. We can also avoid +needing to interpret the values of variables for types outside our control +(i.e. those from a remote schema), which can be useful in the case of custom +scalars or extensions of the GraphQL protocol. + +So instead we use the InputValue type to represent that an input value might be +a GraphQL literal value or a JSON value from the variables payload. This means +each input parser constructor needs to be able to parse both GraphQL values and +JSON values, but fortunately, the duplication of logic is minimal. -} + +-- | See Note [Parsing variable values]. +data InputValue v + = GraphQLValue (Value v) + | JSONValue J.Value + deriving (Show, Eq, Functor) + +data Variable = Variable + { vInfo :: VariableInfo + , vType :: GType + , vValue :: InputValue Void + -- ^ Note: if the variable was null or was not provided and the field has a + -- non-null default value, this field contains the default value, not 'VNull'. + } deriving (Show,Eq) + + +data VariableInfo + = VIRequired Name + -- | Unlike fields (see 'IFOptional'), nullable variables with no default + -- value are indistinguishable from variables with a default value of null, so + -- we don’t distinguish those cases here. + | VIOptional Name (Value Void) + deriving (Show,Eq) + +instance HasName Variable where + getName = getName . 
vInfo +instance HasName VariableInfo where + getName (VIRequired name) = name + getName (VIOptional name _) = name + +-- ----------------------------------------------------------------------------- +-- support for introspection queries + +-- | This type represents the directives information to be served over GraphQL introspection +data DirectiveInfo = DirectiveInfo + { diName :: !Name + , diDescription :: !(Maybe Description) + , diArguments :: ![Definition InputFieldInfo] + , diLocations :: ![DirectiveLocation] + } + +-- | This type contains all the information needed to efficiently serve GraphQL +-- introspection queries. It corresponds to the GraphQL @__Schema@ type defined +-- in <§ 4.5 Schema Introspection http://spec.graphql.org/June2018/#sec-Introspection>. +data Schema = Schema + { sDescription :: Maybe Description + , sTypes :: HashMap Name (Definition SomeTypeInfo) + , sQueryType :: Type 'Output + , sMutationType :: Maybe (Type 'Output) + , sSubscriptionType :: Maybe (Type 'Output) + , sDirectives :: [DirectiveInfo] + } + +-- | Recursively collects all type definitions accessible from the given value. +collectTypeDefinitions + :: (HasTypeDefinitions a, MonadError ConflictingDefinitions m) + => a -> m (HashMap Name (Definition SomeTypeInfo)) +collectTypeDefinitions = flip execStateT Map.empty . accumulateTypeDefinitions + +data ConflictingDefinitions + = ConflictingDefinitions (Definition SomeTypeInfo) (Definition SomeTypeInfo) + +class HasTypeDefinitions a where + -- | Recursively accumulates all type definitions accessible from the given + -- value. This is done statefully to avoid infinite loops arising from + -- recursive type definitions; see Note [Tying the knot] in Hasura.GraphQL.Parser.Class. + accumulateTypeDefinitions + :: ( MonadError ConflictingDefinitions m + , MonadState (HashMap Name (Definition SomeTypeInfo)) m ) + => a -> m () + +instance HasTypeDefinitions (Definition (TypeInfo k)) where + accumulateTypeDefinitions definition = do + -- This is the important case! We actually have a type definition, so we + -- need to add it to the state. + definitions <- get + let new = SomeTypeInfo <$> definition + case Map.lookup (dName new) definitions of + Nothing -> do + put $! Map.insert (dName new) new definitions + -- This type definition might reference other type definitions, so we + -- still need to recur. + accumulateTypeDefinitions (dInfo definition) + Just old + -- It’s important we /don’t/ recur if we’ve already seen this definition + -- before to avoid infinite loops; see Note [Tying the knot] in Hasura.GraphQL.Parser.Class. 
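(Note that the `old == new` comparison in the guard below relies on the Eq instances defined earlier in this module: Definition equality short-circuits on matching 'Unique's and otherwise falls back to the name-based comparisons, so this conflict check is an approximation of structural equality rather than the real thing.)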
+ | old == new -> pure () + | otherwise -> throwError $ ConflictingDefinitions old new + +instance HasTypeDefinitions a => HasTypeDefinitions [a] where + accumulateTypeDefinitions = traverse_ accumulateTypeDefinitions + +instance HasTypeDefinitions (Type k) where + accumulateTypeDefinitions = \case + NonNullable t -> accumulateTypeDefinitions t + Nullable t -> accumulateTypeDefinitions t + +instance HasTypeDefinitions (NonNullableType k) where + accumulateTypeDefinitions = \case + TNamed d -> accumulateTypeDefinitions d + TList t -> accumulateTypeDefinitions t + +instance HasTypeDefinitions (TypeInfo k) where + accumulateTypeDefinitions = \case + TIScalar -> pure () + TIEnum _ -> pure () + TIInputObject (InputObjectInfo fields) -> accumulateTypeDefinitions fields + TIObject (ObjectInfo fields interfaces) -> + accumulateTypeDefinitions fields >> accumulateTypeDefinitions interfaces + TIInterface (InterfaceInfo fields objects) -> + accumulateTypeDefinitions fields + >> accumulateTypeDefinitions objects + TIUnion (UnionInfo objects) -> accumulateTypeDefinitions objects + +instance HasTypeDefinitions (Definition InputObjectInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIInputObject + +instance HasTypeDefinitions (Definition InputFieldInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . dInfo + +instance HasTypeDefinitions InputFieldInfo where + accumulateTypeDefinitions = \case + IFRequired t -> accumulateTypeDefinitions t + IFOptional t _ -> accumulateTypeDefinitions t + +instance HasTypeDefinitions (Definition FieldInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . dInfo + +instance HasTypeDefinitions FieldInfo where + accumulateTypeDefinitions (FieldInfo args t) = do + accumulateTypeDefinitions args + accumulateTypeDefinitions t + +instance HasTypeDefinitions (Definition ObjectInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIObject + +instance HasTypeDefinitions (Definition InterfaceInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . fmap TIInterface + +instance HasTypeDefinitions (Definition UnionInfo) where + accumulateTypeDefinitions = accumulateTypeDefinitions . 
fmap TIUnion diff --git a/server/src-lib/Hasura/GraphQL/RelaySchema.hs b/server/src-lib/Hasura/GraphQL/RelaySchema.hs deleted file mode 100644 index 5ccf9e2223f..00000000000 --- a/server/src-lib/Hasura/GraphQL/RelaySchema.hs +++ /dev/null @@ -1,425 +0,0 @@ -module Hasura.GraphQL.RelaySchema where - -import Control.Lens.Extended hiding (op) - -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Context -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.Server.Utils (duplicates) -import Hasura.Session -import Hasura.SQL.Types - -import Hasura.GraphQL.Schema -import Hasura.GraphQL.Schema.BoolExp -import Hasura.GraphQL.Schema.Builder -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Function -import Hasura.GraphQL.Schema.OrderBy -import Hasura.GraphQL.Schema.Select - -mkNodeInterface :: [QualifiedTable] -> IFaceTyInfo -mkNodeInterface relayTableNames = - let description = G.Description "An object with globally unique ID" - in mkIFaceTyInfo (Just description) nodeType (mapFromL _fiName [idField]) $ - Set.fromList $ map mkTableTy relayTableNames - where - idField = - let description = G.Description "A globally unique identifier" - in mkHsraObjFldInfo (Just description) "id" mempty nodeIdType - --- | Relay schema should contain tables and relationships (whose remote tables) --- with a mandatory primary key -tablesWithOnlyPrimaryKey :: TableCache -> TableCache -tablesWithOnlyPrimaryKey tableCache = - flip Map.mapMaybe tableCache $ \tableInfo -> - tableInfo ^. tiCoreInfo.tciPrimaryKey *> - Just (infoWithPrimaryKeyRelations tableInfo) - where - infoWithPrimaryKeyRelations = - tiCoreInfo.tciFieldInfoMap %~ Map.mapMaybe (_FIRelationship %%~ withPrimaryKey) - - withPrimaryKey relInfo = - let remoteTable = riRTable relInfo - maybePrimaryKey = - (tableCache ^. at remoteTable) >>= (^. tiCoreInfo.tciPrimaryKey) - in maybePrimaryKey *> Just relInfo - -mkRelayGCtxMap - :: forall m. (MonadError QErr m) - => TableCache -> FunctionCache -> m RelayGCtxMap -mkRelayGCtxMap tableCache functionCache = do - typesMapL <- mapM (mkRelayGCtxMapTable relayTableCache functionCache) relayTables - typesMap <- combineTypes typesMapL - let gCtxMap = flip Map.map typesMap $ - \(ty, flds, insCtx) -> mkGCtx ty flds insCtx - pure gCtxMap - where - relayTableCache = tablesWithOnlyPrimaryKey tableCache - relayTables = - filter (tableFltr . _tiCoreInfo) $ Map.elems relayTableCache - - tableFltr ti = - not (isSystemDefined $ _tciSystemDefined ti) - && isValidObjectName (_tciName ti) - - combineTypes - :: [Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap)] - -> m (Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap)) - combineTypes maps = do - let listMap = foldr (Map.unionWith (++) . Map.map pure) mempty maps - flip Map.traverseWithKey listMap $ \roleName typeList -> do - let relayTableNames = map (_tciName . _tiCoreInfo) relayTables - tyAgg = foldr addTypeInfoToTyAgg (mconcat $ map (^. _1) typeList) - [ TIIFace $ mkNodeInterface relayTableNames - , TIObj pageInfoObj - ] - insCtx = mconcat $ map (^. _3) typeList - rootFields <- combineRootFields roleName $ map (^. _2) typeList - pure (tyAgg, rootFields, insCtx) - - combineRootFields :: RoleName -> [RootFields] -> m RootFields - combineRootFields roleName rootFields = do - let duplicateQueryFields = duplicates $ - concatMap (Map.keys . 
_rootQueryFields) rootFields - duplicateMutationFields = duplicates $ - concatMap (Map.keys . _rootMutationFields) rootFields - - -- TODO: The following exception should result in inconsistency - when (not $ null duplicateQueryFields) $ - throw400 Unexpected $ "following query root fields are duplicated: " - <> showNames duplicateQueryFields - - when (not $ null duplicateMutationFields) $ - throw400 Unexpected $ "following mutation root fields are duplicated: " - <> showNames duplicateMutationFields - - pure $ mconcat $ mkNodeQueryRootFields roleName relayTables : rootFields - -mkRelayGCtxMapTable - :: (MonadError QErr m) - => TableCache - -> FunctionCache - -> TableInfo - -> m (Map.HashMap RoleName (TyAgg, RootFields, InsCtxMap)) -mkRelayGCtxMapTable tableCache funcCache tabInfo = do - m <- flip Map.traverseWithKey rolePerms $ - mkRelayGCtxRole tableCache tn descM fields primaryKey validConstraints tabFuncs viewInfo customConfig - adminSelFlds <- mkAdminSelFlds fields tableCache - adminInsCtx <- mkAdminInsCtx tableCache fields - let adminCtx = mkRelayTyAggRole tn descM (Just (cols, icRelations adminInsCtx)) - (Just (True, adminSelFlds)) (Just cols) (Just ()) - primaryKey validConstraints viewInfo tabFuncs - adminInsCtxMap = Map.singleton tn adminInsCtx - return $ Map.insert adminRoleName (adminCtx, adminRootFlds, adminInsCtxMap) m - where - TableInfo coreInfo rolePerms _ = tabInfo - TableCoreInfo tn descM _ fields primaryKey _ _ viewInfo _ customConfig = coreInfo - validConstraints = mkValidConstraints $ map _cName (tciUniqueOrPrimaryKeyConstraints coreInfo) - tabFuncs = filter (isValidObjectName . fiName) $ - getFuncsOfTable tn funcCache - cols = getValidCols fields - adminRootFlds = - let insertPermDetails = Just ([], True) - selectPermDetails = Just (noFilter, Nothing, [], True) - updatePermDetails = Just (getValidCols fields, mempty, noFilter, Nothing, []) - deletePermDetails = Just (noFilter, []) - - queryFields = getRelayQueryRootFieldsRole tn primaryKey fields tabFuncs - selectPermDetails - mutationFields = getMutationRootFieldsRole tn primaryKey - validConstraints fields insertPermDetails - selectPermDetails updatePermDetails - deletePermDetails viewInfo customConfig - in RootFields queryFields mutationFields - -mkRelayGCtxRole - :: (MonadError QErr m) - => TableCache - -> QualifiedTable - -> Maybe PGDescription - -> FieldInfoMap FieldInfo - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] - -> [FunctionInfo] - -> Maybe ViewInfo - -> TableConfig - -> RoleName - -> RolePermInfo - -> m (TyAgg, RootFields, InsCtxMap) -mkRelayGCtxRole tableCache tn descM fields primaryKey constraints funcs viM tabConfigM role permInfo = do - selPermM <- mapM (getSelPerm tableCache fields role) selM - tabInsInfoM <- forM (_permIns permInfo) $ \ipi -> do - ctx <- mkInsCtx role tableCache fields ipi $ _permUpd permInfo - let permCols = flip getColInfos allCols $ Set.toList $ ipiCols ipi - return (ctx, (permCols, icRelations ctx)) - let insPermM = snd <$> tabInsInfoM - insCtxM = fst <$> tabInsInfoM - updColsM = filterColumnFields . 
            upiCols <$> _permUpd permInfo
-      tyAgg = mkRelayTyAggRole tn descM insPermM selPermM updColsM
-              (void $ _permDel permInfo) primaryKey constraints viM funcs
-      queryRootFlds = getRelayQueryRootFieldsRole tn primaryKey fields funcs
-                      (mkSel <$> _permSel permInfo)
-      mutationRootFlds = getMutationRootFieldsRole tn primaryKey constraints fields
-                         (mkIns <$> insM) (mkSel <$> selM)
-                         (mkUpd <$> updM) (mkDel <$> delM) viM tabConfigM
-      insCtxMap = maybe Map.empty (Map.singleton tn) insCtxM
-  return (tyAgg, RootFields queryRootFlds mutationRootFlds, insCtxMap)
-  where
-    RolePermInfo insM selM updM delM = permInfo
-    allCols = getCols fields
-    filterColumnFields allowedSet =
-      filter ((`Set.member` allowedSet) . pgiColumn) $ getValidCols fields
-    mkIns i = (ipiRequiredHeaders i, isJust updM)
-    mkSel s = ( spiFilter s, spiLimit s
-              , spiRequiredHeaders s, spiAllowAgg s
-              )
-    mkUpd u = ( flip getColInfos allCols $ Set.toList $ upiCols u
-              , upiSet u
-              , upiFilter u
-              , upiCheck u
-              , upiRequiredHeaders u
-              )
-    mkDel d = (dpiFilter d, dpiRequiredHeaders d)
-
-mkRelayTyAggRole
-  :: QualifiedTable
-  -> Maybe PGDescription
-  -- ^ Postgres description
-  -> Maybe ([PGColumnInfo], RelationInfoMap)
-  -- ^ insert permission
-  -> Maybe (Bool, [SelField])
-  -- ^ select permission
-  -> Maybe [PGColumnInfo]
-  -- ^ update cols
-  -> Maybe ()
-  -- ^ delete cols
-  -> Maybe (PrimaryKey PGColumnInfo)
-  -> [ConstraintName]
-  -- ^ constraints
-  -> Maybe ViewInfo
-  -> [FunctionInfo]
-  -- ^ all functions
-  -> TyAgg
-mkRelayTyAggRole tn descM insPermM selPermM updColsM delPermM pkeyCols constraints viM funcs =
-  let (mutationTypes, mutationFields) =
-        mkMutationTypesAndFieldsRole tn insPermM selFldsM updColsM delPermM pkeyCols constraints viM
-  in TyAgg (mkTyInfoMap allTypes <> mutationTypes)
-           (fieldMap <> mutationFields)
-           scalars ordByCtx
-  where
-    ordByCtx = fromMaybe Map.empty ordByCtxM
-
-    funcInpArgTys = bool [] (map TIInpObj funcArgInpObjs) $ isJust selFldsM
-
-    allTypes = queryTypes <> aggQueryTypes <> funcInpArgTys <> computedFieldFuncArgsInps
-
-    queryTypes = map TIObj selectObjects <>
-      catMaybes
-      [ TIInpObj <$> boolExpInpObjM
-      , TIInpObj <$> ordByInpObjM
-      , TIEnum <$> selColInpTyM
-      ]
-    aggQueryTypes = map TIObj aggObjs <> map TIInpObj aggOrdByInps
-
-    fieldMap = Map.unions $ catMaybes [boolExpInpObjFldsM, selObjFldsM]
-    scalars = selByPkScalarSet <> funcArgScalarSet <> computedFieldFuncArgScalars
-
-    selFldsM = snd <$> selPermM
-    selColNamesM = map pgiName . getPGColumnFields <$> selFldsM
-    selColInpTyM = mkSelColumnTy tn <$> selColNamesM
-    -- boolexp input type
-    boolExpInpObjM = case selFldsM of
-      Just selFlds -> Just $ mkBoolExpInp tn selFlds
-      -- no select permission
-      Nothing ->
-        -- but update/delete is defined
-        if isJust updColsM || isJust delPermM
-        then Just $ mkBoolExpInp tn []
-        else Nothing
-
-    -- funcargs input type
-    funcArgInpObjs = flip mapMaybe funcs $ \func ->
-      mkFuncArgsInp (fiName func) (getInputArgs func)
-    -- funcArgCtx = Map.unions funcArgCtxs
-    funcArgScalarSet = funcs ^.. folded.to getInputArgs.folded.to (_qptName.faType)
-
-    -- helper
-    mkFldMap ty = Map.fromList . concatMap (mkFld ty)
-    mkFld ty = \case
-      SFPGColumn ci -> [((ty, pgiName ci), RFPGColumn ci)]
-      SFRelationship (RelationshipFieldInfo relInfo allowAgg cols permFilter permLimit maybePkCols _) ->
-        let relationshipName = riName relInfo
-            relFld = ( (ty, mkRelName relationshipName)
-                     , RFRelationship $ RelationshipField relInfo RFKSimple cols permFilter permLimit
-                     )
-            aggRelFld = ( (ty, mkAggRelName relationshipName)
-                        , RFRelationship $ RelationshipField relInfo RFKAggregate cols permFilter permLimit
-                        )
-            maybeConnFld = maybePkCols <&> \pkCols ->
-              ( (ty, mkConnectionRelName relationshipName)
-              , RFRelationship $ RelationshipField relInfo
-                (RFKConnection pkCols) cols permFilter permLimit
-              )
-        in case riType relInfo of
-          ObjRel -> [relFld]
-          ArrRel -> bool [relFld] [relFld, aggRelFld] allowAgg
-                    <> maybeToList maybeConnFld
-
-      SFComputedField cf -> pure
-        ( (ty, mkComputedFieldName $ _cfName cf)
-        , RFComputedField cf
-        )
-      SFRemoteRelationship remoteField -> pure
-        ( (ty, G.Name (remoteRelationshipNameToText (_rfiName remoteField)))
-        , RFRemoteRelationship remoteField
-        )
-
-    -- the fields used in bool exp
-    boolExpInpObjFldsM = mkFldMap (mkBoolExpTy tn) <$> selFldsM
-
-    -- table obj
-    selectObjects = case selPermM of
-      Just (_, selFlds) ->
-        [ (mkRelayTableObj tn descM selFlds)
-          {_otiImplIFaces = Set.singleton nodeType}
-        , mkTableEdgeObj tn
-        , mkTableConnectionObj tn
-        ]
-      Nothing -> []
-
-    -- aggregate objs and order by inputs
-    (aggObjs, aggOrdByInps) = case selPermM of
-      Just (True, selFlds) ->
-        let cols = getPGColumnFields selFlds
-            numCols = onlyNumCols cols
-            compCols = onlyComparableCols cols
-            objs = [ mkTableAggObj tn
-                   , mkTableAggregateFieldsObj tn (numCols, numAggregateOps) (compCols, compAggregateOps)
-                   ] <> mkColAggregateFieldsObjs selFlds
-            ordByInps = mkTabAggOrdByInpObj tn (numCols, numAggregateOps) (compCols, compAggregateOps)
-                        : mkTabAggregateOpOrdByInpObjs tn (numCols, numAggregateOps) (compCols, compAggregateOps)
-        in (objs, ordByInps)
-      _ -> ([], [])
-
-    getNumericCols = onlyNumCols . getPGColumnFields
-    getComparableCols = onlyComparableCols . getPGColumnFields
-    onlyFloat = const $ mkScalarTy PGFloat
-
-    mkTypeMaker "sum" = mkColumnType
-    mkTypeMaker _ = onlyFloat
-
-    mkColAggregateFieldsObjs flds =
-      let numCols = getNumericCols flds
-          compCols = getComparableCols flds
-          mkNumObjFld n = mkTableColAggregateFieldsObj tn n (mkTypeMaker n) numCols
-          mkCompObjFld n = mkTableColAggregateFieldsObj tn n mkColumnType compCols
-          numFldsObjs = bool (map mkNumObjFld numAggregateOps) [] $ null numCols
-          compFldsObjs = bool (map mkCompObjFld compAggregateOps) [] $ null compCols
-      in numFldsObjs <> compFldsObjs
-    -- the fields used in table object
-    nodeFieldM = RFNodeId tn . _pkColumns <$> pkeyCols
-    selObjFldsM = mkFldMap (mkTableTy tn) <$> selFldsM >>=
-      \fm -> nodeFieldM <&> \nodeField ->
-        Map.insert (mkTableTy tn, "id") nodeField fm
-    -- the scalar set for table_by_pk arguments
-    selByPkScalarSet = pkeyCols ^.. folded.to _pkColumns.folded.to pgiType._PGColumnScalar
-
-    ordByInpCtxM = mkOrdByInpObj tn <$> selFldsM
-    (ordByInpObjM, ordByCtxM) = case ordByInpCtxM of
-      Just (a, b) -> (Just a, Just b)
-      Nothing -> (Nothing, Nothing)
-
-    -- computed fields' function args input objects and scalar types
-    mkComputedFieldRequiredTypes computedFieldInfo =
-      let ComputedFieldFunction qf inputArgs _ _ _ = _cfFunction computedFieldInfo
-          scalarArgs = map (_qptName . faType) $ toList inputArgs
-      in (, scalarArgs) <$> mkFuncArgsInp qf inputArgs
-
-    computedFieldReqTypes = catMaybes $
-      maybe [] (map mkComputedFieldRequiredTypes . getComputedFields) selFldsM
-
-    computedFieldFuncArgsInps = map (TIInpObj . fst) computedFieldReqTypes
-    computedFieldFuncArgScalars = Set.fromList $ concatMap snd computedFieldReqTypes
-
-mkSelectOpCtx
-  :: QualifiedTable
-  -> [PGColumnInfo]
-  -> (AnnBoolExpPartialSQL, Maybe Int, [T.Text]) -- select filter
-  -> SelOpCtx
-mkSelectOpCtx tn allCols (fltr, pLimit, hdrs) =
-  SelOpCtx tn hdrs colGNameMap fltr pLimit
-  where
-    colGNameMap = mkPGColGNameMap allCols
-
-getRelayQueryRootFieldsRole
-  :: QualifiedTable
-  -> Maybe (PrimaryKey PGColumnInfo)
-  -> FieldInfoMap FieldInfo
-  -> [FunctionInfo]
-  -> Maybe (AnnBoolExpPartialSQL, Maybe Int, [T.Text], Bool) -- select filter
-  -> QueryRootFieldMap
-getRelayQueryRootFieldsRole tn primaryKey fields funcs selM =
-  makeFieldMap $
-    funcConnectionQueries
-    <> catMaybes
-       [ getSelConnectionDet <$> selM <*> maybePrimaryKeyColumns
-       ]
-  where
-    maybePrimaryKeyColumns = fmap _pkColumns primaryKey
-    colGNameMap = mkPGColGNameMap $ getCols fields
-
-    funcConnectionQueries = fromMaybe [] $ getFuncQueryConnectionFlds
-                            <$> selM <*> maybePrimaryKeyColumns
-
-    getSelConnectionDet (selFltr, pLimit, hdrs, _) primaryKeyColumns =
-      ( QCSelectConnection primaryKeyColumns $ mkSelectOpCtx tn (getCols fields) (selFltr, pLimit, hdrs)
-      , mkSelFldConnection Nothing tn
-      )
-
-    getFuncQueryConnectionFlds (selFltr, pLimit, hdrs, _) primaryKeyColumns =
-      flip map funcs $ \fi ->
-        ( QCFuncConnection primaryKeyColumns $
-          FuncQOpCtx (fiName fi) (mkFuncArgItemSeq fi) hdrs colGNameMap selFltr pLimit
-        , mkFuncQueryConnectionFld fi $ fiDescription fi
-        )
-
-mkNodeQueryRootFields :: RoleName -> [TableInfo] -> RootFields
-mkNodeQueryRootFields roleName relayTables =
-  RootFields (mapFromL (_fiName . snd) [nodeQueryDet]) mempty
-  where
-    nodeQueryDet =
-      ( QCNodeSelect nodeSelMap
-      , nodeQueryField
-      )
-
-    nodeQueryField =
-      let nodeParams = fromInpValL $ pure $
-            InpValInfo (Just $ G.Description "A globally unique id")
-            "id" Nothing nodeIdType
-      in mkHsraObjFldInfo Nothing "node" nodeParams $ G.toGT nodeType
-
-    nodeSelMap =
-      Map.fromList $ flip mapMaybe relayTables $ \table ->
-        let tableName = _tciName $ _tiCoreInfo table
-            allColumns = getCols $ _tciFieldInfoMap $ _tiCoreInfo table
-            selectPermM = _permSel <$> Map.lookup roleName
-                          (_tiRolePermInfoMap table)
-            permDetailsM = join selectPermM <&> \perm ->
-              ( spiFilter perm
-              , spiLimit perm
-              , spiRequiredHeaders perm
-              )
-            adminPermDetails = (noFilter, Nothing, [])
-        in (mkTableTy tableName,) <$>
-           ((,) <$>
-            (mkSelectOpCtx tableName allColumns <$>
-             bool permDetailsM (Just adminPermDetails) (isAdmin roleName)
-            ) <*> (table ^? tiCoreInfo.tciPrimaryKey._Just.pkColumns)
-           )
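The deleted code above wires up the Relay `node` root field: every Relay table object implements the Node interface and exposes an opaque, globally unique `id` from which the table and primary-key values can be recovered (see `QCNodeSelect`/`resolveNodeId` further down). As a rough illustration of that idea only — the exact encoding Hasura uses is not shown in this hunk, so the base64-of-JSON scheme below is an assumption, and all names are hypothetical:

{-# LANGUAGE OverloadedStrings #-}
module NodeIdSketch where

import qualified Data.Aeson             as J
import qualified Data.ByteString.Base64 as B64
import qualified Data.ByteString.Lazy   as BL
import           Data.Text              (Text)

-- A node id pairs a table with its primary-key column values.
data NodeId = NodeId
  { nodeTable  :: Text      -- qualified table name
  , nodePkVals :: [J.Value] -- primary-key values, in column order
  } deriving (Show, Eq)

-- Encode as base64 of a JSON array: [table, pk1, pk2, ...] (assumed scheme).
encodeNodeId :: NodeId -> BL.ByteString
encodeNodeId (NodeId tbl pks) =
  BL.fromStrict . B64.encode . BL.toStrict . J.encode $ (J.String tbl : pks)

-- Decode back, failing on anything that is not such an array.
decodeNodeId :: BL.ByteString -> Either String NodeId
decodeNodeId bs = do
  raw  <- B64.decode (BL.toStrict bs)
  vals <- J.eitherDecode (BL.fromStrict raw)
  case vals of
    J.String tbl : pks -> Right (NodeId tbl pks)
    _                  -> Left "malformed node id"

The point of the opacity is that clients treat the `id` as a stable cache key and hand it back verbatim to `node(id: ...)`, which is exactly the lookup `nodeSelMap` serves per table.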
diff --git a/server/src-lib/Hasura/GraphQL/RemoteServer.hs b/server/src-lib/Hasura/GraphQL/RemoteServer.hs
index 2c14bb43517..f65d10f906b 100644
--- a/server/src-lib/Hasura/GraphQL/RemoteServer.hs
+++ b/server/src-lib/Hasura/GraphQL/RemoteServer.hs
@@ -1,9 +1,13 @@
-module Hasura.GraphQL.RemoteServer where
+module Hasura.GraphQL.RemoteServer
+  ( fetchRemoteSchema
+  , IntrospectionResult
+  , execRemoteGQ'
+  ) where
 
 import           Control.Exception             (try)
 import           Control.Lens                  ((^.))
+import           Control.Monad.Unique
 import           Data.Aeson                    ((.:), (.:?))
-import           Data.Foldable                 (foldlM)
 
 import           Hasura.HTTP
 import           Hasura.Prelude
@@ -12,14 +16,17 @@ import qualified Data.ByteString.Lazy as BL
 import qualified Data.Environment              as Env
 import qualified Data.HashMap.Strict           as Map
 import qualified Data.Text                     as T
-import qualified Language.GraphQL.Draft.Syntax as G
+import qualified Hasura.Tracing                as Tracing
 import qualified Language.GraphQL.Draft.Parser as G
+import qualified Language.GraphQL.Draft.Syntax as G
 import qualified Language.Haskell.TH.Syntax    as TH
 import qualified Network.HTTP.Client           as HTTP
 import qualified Network.HTTP.Types            as N
 import qualified Network.Wreq                  as Wreq
 
-import           Hasura.GraphQL.Schema.Merge
+
+import qualified Hasura.GraphQL.Parser.Monad   as P
+import           Hasura.GraphQL.Schema.Remote
 import           Hasura.GraphQL.Transport.HTTP.Protocol
 import           Hasura.RQL.DDL.Headers        (makeHeadersFromConf)
 import           Hasura.RQL.Types
@@ -27,11 +34,6 @@ import Hasura.Server.Utils
 import           Hasura.Server.Version         (HasVersion)
 import           Hasura.Session
 
-import qualified Hasura.GraphQL.Context        as GC
-import qualified Hasura.GraphQL.Schema         as GS
-import qualified Hasura.GraphQL.Validate.Types as VT
-import qualified Hasura.Tracing                as Tracing
-
 introspectionQuery :: GQLReqParsed
 introspectionQuery =
   $(do
   )
 
 fetchRemoteSchema
-  :: (HasVersion, MonadIO m, MonadError QErr m)
+  :: forall m
+   . (HasVersion, MonadIO m, MonadUnique m, MonadError QErr m)
   => Env.Environment
   -> HTTP.Manager
+  -> RemoteSchemaName
   -> RemoteSchemaInfo
-  -> m GC.RemoteGCtx
-fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeout) = do
+  -> m RemoteSchemaCtx
+fetchRemoteSchema env manager schemaName schemaInfo@(RemoteSchemaInfo url headerConf _ timeout) = do
   headers <- makeHeadersFromConf env headerConf
   let hdrsWithDefaults = addDefaultHeaders headers
@@ -68,28 +72,25 @@ fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeou
       statusCode = resp ^. Wreq.responseStatus . Wreq.statusCode
   when (statusCode /= 200) $ throwNon200 statusCode respData
 
-  introspectRes :: (FromIntrospection IntrospectionResult) <-
+  -- Parse the JSON into flat GraphQL type AST
+  (FromIntrospection introspectRes) :: (FromIntrospection IntrospectionResult) <-
     either (remoteSchemaErr . T.pack) return $ J.eitherDecode respData
-  let (sDoc, qRootN, mRootN, sRootN) =
-        fromIntrospection introspectRes
-  typMap <- either remoteSchemaErr return $ VT.fromSchemaDoc sDoc $
-    VT.TLRemoteType name def
-  let mQrTyp = Map.lookup qRootN typMap
-      mMrTyp = (`Map.lookup` typMap) =<< mRootN
-      mSrTyp = (`Map.lookup` typMap) =<< sRootN
-  qrTyp <- liftMaybe noQueryRoot mQrTyp
-  let mRmQR = VT.getObjTyM qrTyp
-      mRmMR = VT.getObjTyM =<< mMrTyp
-      mRmSR = VT.getObjTyM =<< mSrTyp
-  rmQR <- liftMaybe (err400 Unexpected "query root has to be an object type") mRmQR
-  return $ GC.RemoteGCtx typMap rmQR mRmMR mRmSR
+
+  -- Check that the parsed GraphQL type info is valid by running the schema generation
+  (queryParsers, mutationParsers, subscriptionParsers) <-
+    P.runSchemaT @m @(P.ParseT Identity) $ buildRemoteParser introspectRes schemaInfo
+
+  -- The 'rawIntrospectionResult' contains the 'ByteString' response of
+  -- the introspection result of the remote server. We store this in the
+  -- 'RemoteSchemaCtx' because we can use it when 'introspect_remote_schema'
+  -- is called, by simply encoding the result to JSON.
+  return $ RemoteSchemaCtx schemaName introspectRes schemaInfo respData $
+    ParsedIntrospection queryParsers mutationParsers subscriptionParsers
   where
-    noQueryRoot = err400 Unexpected "query root not found in remote schema"
-    remoteSchemaErr :: (MonadError QErr m) => T.Text -> m a
+    remoteSchemaErr :: T.Text -> m a
    remoteSchemaErr = throw400 RemoteSchemaError
 
-    throwHttpErr :: (MonadError QErr m) => HTTP.HttpException -> m a
+    throwHttpErr :: HTTP.HttpException -> m a
     throwHttpErr = throwWithInternal httpExceptMsg . httpExceptToJSON
 
     throwNon200 st = throwWithInternal (non200Msg st) . decodeNon200Resp
@@ -108,36 +109,9 @@ fetchRemoteSchema env manager def@(RemoteSchemaInfo name url headerConf _ timeou
       Right a -> J.object ["response" J..= (a :: J.Value)]
       Left _  -> J.object ["raw_body" J..= bsToTxt (BL.toStrict bs)]
 
-mergeSchemas
-  :: (MonadError QErr m)
-  => RemoteSchemaMap
-  -> GS.GCtxMap
-  -- the merged GCtxMap and the default GCtx without roles
-  -> m (GS.GCtxMap, GS.GCtx)
-mergeSchemas rmSchemaMap gCtxMap = do
-  def <- mkDefaultRemoteGCtx remoteSchemas
-  merged <- mergeRemoteSchema gCtxMap def
-  return (merged, def)
-  where
-    remoteSchemas = map rscGCtx $ Map.elems rmSchemaMap
-
-mkDefaultRemoteGCtx
-  :: (MonadError QErr m)
-  => [GC.GCtx] -> m GS.GCtx
-mkDefaultRemoteGCtx =
-  foldlM mergeGCtx GC.emptyGCtx
-
--- merge a remote schema `gCtx` into current `gCtxMap`
-mergeRemoteSchema
-  :: (MonadError QErr m)
-  => GS.GCtxMap
-  -> GS.GCtx
-  -> m GS.GCtxMap
-mergeRemoteSchema ctxMap mergedRemoteGCtx =
-  flip Map.traverseWithKey ctxMap $ \_ schemaCtx ->
-    for schemaCtx $ \gCtx -> mergeGCtx gCtx mergedRemoteGCtx
-
--- | Parsing the introspection query result
+-- | Parsing the introspection query result. We use this newtype wrapper to
+-- avoid orphan instances and parse JSON in the way that we need for GraphQL
+-- introspection results.
 newtype FromIntrospection a
   = FromIntrospection { fromIntrospection :: a }
   deriving (Show, Eq, Generic)
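The newtype above is the standard Haskell trick for giving library types a bespoke FromJSON instance without writing orphans: wrap, write the instance on the wrapper, unwrap at the use site. A minimal self-contained sketch of the same pattern — the types here are illustrative, not Hasura's:

{-# LANGUAGE FlexibleInstances #-}
module FromIntrospectionSketch where

import qualified Data.Aeson       as J
import qualified Data.Aeson.Types as J
import qualified Data.Text        as T

-- Pretend this type comes from a library we don't own, so a direct
-- FromJSON instance for it would be an orphan.
newtype Description = Description String deriving (Show, Eq)

newtype Wrapped a = Wrapped { unwrap :: a }

-- The instance lives on our own wrapper, so it is not an orphan.
instance J.FromJSON (Wrapped Description) where
  parseJSON = J.withText "Description" $
    pure . Wrapped . Description . T.unpack

-- Unwrap at the boundary, exactly like `fromIntrospection` is used above.
parseDescription :: J.Value -> Either String Description
parseDescription = fmap unwrap . J.parseEither J.parseJSON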
"description" when (kind /= "SCALAR") $ kindErr kind "scalar" @@ -163,14 +137,13 @@ instance J.FromJSON (FromIntrospection G.ScalarTypeDefinition) where instance J.FromJSON (FromIntrospection G.ObjectTypeDefinition) where parseJSON = J.withObject "ObjectTypeDefinition" $ \o -> do - kind <- o .: "kind" - name <- o .: "name" - desc <- o .:? "description" - fields <- o .:? "fields" - interfaces <- o .:? "interfaces" + kind <- o .: "kind" + name <- o .: "name" + desc <- o .:? "description" + fields <- o .:? "fields" + interfaces :: Maybe [FromIntrospection (G.InterfaceTypeDefinition [G.Name])] <- o .:? "interfaces" when (kind /= "OBJECT") $ kindErr kind "object" - let implIfaces = map (G.NamedType . G._itdName) $ - maybe [] (fmap fromIntrospection) interfaces + let implIfaces = map G._itdName $ maybe [] (fmap fromIntrospection) interfaces flds = maybe [] (fmap fromIntrospection) fields desc' = fmap fromIntrospection desc r = G.ObjectTypeDefinition desc' name implIfaces [] flds @@ -196,8 +169,7 @@ instance J.FromJSON (FromIntrospection G.GType) where ("NON_NULL", _, Just typ) -> return $ mkNotNull (fromIntrospection typ) ("NON_NULL", _, Nothing) -> pErr "NON_NULL should have `ofType`" ("LIST", _, Just typ) -> - return $ G.TypeList (G.Nullability True) - (G.ListType $ fromIntrospection typ) + return $ G.TypeList (G.Nullability True) (fromIntrospection typ) ("LIST", _, Nothing) -> pErr "LIST should have `ofType`" (_, Just name, _) -> return $ G.TypeNamed (G.Nullability True) name _ -> pErr $ "kind: " <> kind <> " should have name" @@ -208,7 +180,6 @@ instance J.FromJSON (FromIntrospection G.GType) where G.TypeList _ ty -> G.TypeList (G.Nullability False) ty G.TypeNamed _ n -> G.TypeNamed (G.Nullability False) n - instance J.FromJSON (FromIntrospection G.InputValueDefinition) where parseJSON = J.withObject "InputValueDefinition" $ \o -> do name <- o .: "name" @@ -220,20 +191,25 @@ instance J.FromJSON (FromIntrospection G.InputValueDefinition) where r = G.InputValueDefinition desc' name (fromIntrospection _type) defVal' return $ FromIntrospection r -instance J.FromJSON (FromIntrospection G.ValueConst) where - parseJSON = J.withText "defaultValue" $ \t -> fmap FromIntrospection - $ either (fail . T.unpack) return $ G.parseValueConst t +instance J.FromJSON (FromIntrospection (G.Value Void)) where + parseJSON = J.withText "Value Void" $ \t -> + let parseValueConst = G.runParser G.value + in fmap FromIntrospection $ either (fail . T.unpack) return $ parseValueConst t -instance J.FromJSON (FromIntrospection G.InterfaceTypeDefinition) where +instance J.FromJSON (FromIntrospection (G.InterfaceTypeDefinition [G.Name])) where parseJSON = J.withObject "InterfaceTypeDefinition" $ \o -> do kind <- o .: "kind" name <- o .: "name" desc <- o .:? "description" fields <- o .:? "fields" + possibleTypes :: Maybe [FromIntrospection G.ObjectTypeDefinition] <- o .:? "possibleTypes" let flds = maybe [] (fmap fromIntrospection) fields desc' = fmap fromIntrospection desc + possTps = map G._otdName $ maybe [] (fmap fromIntrospection) possibleTypes when (kind /= "INTERFACE") $ kindErr kind "interface" - let r = G.InterfaceTypeDefinition desc' name [] flds + -- TODO (non PDV) track which interfaces implement which other interfaces, after a + -- GraphQL spec > Jun 2018 is released. 
@@ -220,20 +191,25 @@ instance J.FromJSON (FromIntrospection G.InputValueDefinition) where
         r = G.InputValueDefinition desc' name (fromIntrospection _type) defVal'
     return $ FromIntrospection r
 
-instance J.FromJSON (FromIntrospection G.ValueConst) where
-  parseJSON = J.withText "defaultValue" $ \t -> fmap FromIntrospection
-    $ either (fail . T.unpack) return $ G.parseValueConst t
+instance J.FromJSON (FromIntrospection (G.Value Void)) where
+  parseJSON = J.withText "Value Void" $ \t ->
+    let parseValueConst = G.runParser G.value
+    in fmap FromIntrospection $ either (fail . T.unpack) return $ parseValueConst t
 
-instance J.FromJSON (FromIntrospection G.InterfaceTypeDefinition) where
+instance J.FromJSON (FromIntrospection (G.InterfaceTypeDefinition [G.Name])) where
   parseJSON = J.withObject "InterfaceTypeDefinition" $ \o -> do
     kind <- o .: "kind"
     name <- o .: "name"
     desc <- o .:? "description"
     fields <- o .:? "fields"
+    possibleTypes :: Maybe [FromIntrospection G.ObjectTypeDefinition] <- o .:? "possibleTypes"
     let flds = maybe [] (fmap fromIntrospection) fields
         desc' = fmap fromIntrospection desc
+        possTps = map G._otdName $ maybe [] (fmap fromIntrospection) possibleTypes
     when (kind /= "INTERFACE") $ kindErr kind "interface"
-    let r = G.InterfaceTypeDefinition desc' name [] flds
+    -- TODO (non PDV) track which interfaces implement which other interfaces, after a
+    -- GraphQL spec > Jun 2018 is released.
+    let r = G.InterfaceTypeDefinition desc' name [] flds possTps
     return $ FromIntrospection r
 
 instance J.FromJSON (FromIntrospection G.UnionTypeDefinition) where
@@ -242,11 +218,10 @@
     kind <- o .: "kind"
     name <- o .: "name"
     desc <- o .:? "description"
     possibleTypes <- o .: "possibleTypes"
-    let memberTys = map (G.NamedType . G._otdName) $
-                    fmap fromIntrospection possibleTypes
+    let possibleTypes' = map G._otdName $ fmap fromIntrospection possibleTypes
         desc' = fmap fromIntrospection desc
     when (kind /= "UNION") $ kindErr kind "union"
-    let r = G.UnionTypeDefinition desc' name [] memberTys
+    let r = G.UnionTypeDefinition desc' name [] possibleTypes'
     return $ FromIntrospection r
 
 instance J.FromJSON (FromIntrospection G.EnumTypeDefinition) where
@@ -280,7 +255,7 @@ instance J.FromJSON (FromIntrospection G.InputObjectTypeDefinition) where
     let r = G.InputObjectTypeDefinition desc' name [] inputFields
     return $ FromIntrospection r
 
-instance J.FromJSON (FromIntrospection G.TypeDefinition) where
+instance J.FromJSON (FromIntrospection (G.TypeDefinition [G.Name])) where
   parseJSON = J.withObject "TypeDefinition" $ \o -> do
     kind :: Text <- o .: "kind"
     r <- case kind of
@@ -299,12 +274,6 @@ instance J.FromJSON (FromIntrospection G.TypeDefinition) where
       _ -> pErr $ "unknown kind: " <> kind
     return $ FromIntrospection r
 
-type IntrospectionResult = ( G.SchemaDocument
-                           , G.NamedType
-                           , Maybe G.NamedType
-                           , Maybe G.NamedType
-                           )
-
 instance J.FromJSON (FromIntrospection IntrospectionResult) where
   parseJSON = J.withObject "SchemaDocument" $ \o -> do
     _data <- o .: "data"
@@ -328,23 +297,10 @@ instance J.FromJSON (FromIntrospection IntrospectionResult) where
       Just subsType -> do
         subRoot <- subsType .: "name"
        return $ Just subRoot
-    let r = ( G.SchemaDocument (fmap fromIntrospection types)
-            , queryRoot
-            , mutationRoot
-            , subsRoot
-            )
+    let r = IntrospectionResult (G.SchemaIntrospection (fmap fromIntrospection types))
+            queryRoot mutationRoot subsRoot
     return $ FromIntrospection r
-
-getNamedTyp :: G.TypeDefinition -> G.Name
-getNamedTyp ty = case ty of
-  G.TypeDefinitionScalar t -> G._stdName t
-  G.TypeDefinitionObject t -> G._otdName t
-  G.TypeDefinitionInterface t -> G._itdName t
-  G.TypeDefinitionUnion t -> G._utdName t
-  G.TypeDefinitionEnum t -> G._etdName t
-  G.TypeDefinitionInputObject t -> G._iotdName t
-
 execRemoteGQ'
   :: ( HasVersion
     , MonadIO m
@@ -385,7 +341,7 @@ execRemoteGQ' env manager userInfo reqHdrs q rsi opType = do
   resp <- either httpThrow return res
   pure (time, mkSetCookieHeaders resp, resp ^. Wreq.responseBody)
   where
-    RemoteSchemaInfo _ url hdrConf fwdClientHdrs timeout = rsi
+    RemoteSchemaInfo url hdrConf fwdClientHdrs timeout = rsi
     httpThrow :: (MonadError QErr m) => HTTP.HttpException -> m a
     httpThrow = \case
      HTTP.HttpExceptionRequest _req content -> throw500 $ T.pack . show $ content
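The change above replaces the old 4-tuple IntrospectionResult with a proper record. A standalone sketch of the shape that the FromJSON instance digs out of `data.__schema` — just the root operation names here; the real record also carries the parsed type definitions, and the record name and fields below are illustrative:

{-# LANGUAGE OverloadedStrings   #-}
{-# LANGUAGE ScopedTypeVariables #-}
module IntrospectionRootsSketch where

import qualified Data.Aeson       as J
import qualified Data.Aeson.Types as J
import           Data.Text        (Text)

data IntrospectionRoots = IntrospectionRoots
  { irQueryRoot        :: Text
  , irMutationRoot     :: Maybe Text
  , irSubscriptionRoot :: Maybe Text
  } deriving (Show, Eq)

instance J.FromJSON IntrospectionRoots where
  parseJSON = J.withObject "introspection response" $ \o -> do
    -- introspection responses nest everything under data.__schema
    _data :: J.Object <- o J..: "data"
    schema :: J.Object <- _data J..: "__schema"
    queryType :: J.Object <- schema J..: "queryType"
    queryRoot <- queryType J..: "name"
    -- mutation/subscription roots are optional
    mutationType :: Maybe J.Object <- schema J..:? "mutationType"
    mutationRoot <- traverse (J..: "name") mutationType
    subsType :: Maybe J.Object <- schema J..:? "subscriptionType"
    subsRoot <- traverse (J..: "name") subsType
    pure $ IntrospectionRoots queryRoot mutationRoot subsRoot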
diff --git a/server/src-lib/Hasura/GraphQL/Resolve.hs b/server/src-lib/Hasura/GraphQL/Resolve.hs
deleted file mode 100644
index ac7d1f6e087..00000000000
--- a/server/src-lib/Hasura/GraphQL/Resolve.hs
+++ /dev/null
@@ -1,244 +0,0 @@
-module Hasura.GraphQL.Resolve
-  ( mutFldToTx
-
-  , queryFldToPGAST
-  , traverseQueryRootFldAST
-  , UnresolvedVal(..)
-
-  , AnnPGVal(..)
-  , txtConverter
-
-  , QueryRootFldAST(..)
-  , QueryRootFldUnresolved
-  , QueryRootFldResolved
-  , toPGQuery
-  , toSQLFromItem
-
-  , RIntro.schemaR
-  , RIntro.typeR
-  ) where
-
-import           Data.Has
-import           Hasura.Session
-
-import qualified Data.Environment              as Env
-import qualified Data.HashMap.Strict           as Map
-import qualified Database.PG.Query             as Q
-import qualified Language.GraphQL.Draft.Syntax as G
-import qualified Network.HTTP.Client           as HTTP
-import qualified Network.HTTP.Types            as HTTP
-
-import           Hasura.EncJSON
-import           Hasura.GraphQL.Resolve.Context
-import           Hasura.Prelude
-import           Hasura.RQL.Types
-import           Hasura.Server.Version         (HasVersion)
-import           Hasura.SQL.Types
-
-import qualified Hasura.GraphQL.Resolve.Action     as RA
-import qualified Hasura.GraphQL.Resolve.Insert     as RI
-import qualified Hasura.GraphQL.Resolve.Introspect as RIntro
-import qualified Hasura.GraphQL.Resolve.Mutation   as RM
-import qualified Hasura.GraphQL.Resolve.Select     as RS
-import qualified Hasura.GraphQL.Schema.Common      as GS
-import qualified Hasura.GraphQL.Validate           as V
-import qualified Hasura.Logging                    as L
-import qualified Hasura.RQL.DML.RemoteJoin         as RR
-import qualified Hasura.RQL.DML.Select             as DS
-import qualified Hasura.SQL.DML                    as S
-import qualified Hasura.Tracing                    as Tracing
-
-data QueryRootFldAST v
-  = QRFNode !(DS.AnnSimpleSelG v)
-  | QRFPk !(DS.AnnSimpleSelG v)
-  | QRFSimple !(DS.AnnSimpleSelG v)
-  | QRFAgg !(DS.AnnAggregateSelectG v)
-  | QRFConnection !(DS.ConnectionSelect v)
-  | QRFActionSelect !(DS.AnnSimpleSelG v)
-  | QRFActionExecuteObject !(DS.AnnSimpleSelG v)
-  | QRFActionExecuteList !(DS.AnnSimpleSelG v)
-  deriving (Show, Eq)
-
-type QueryRootFldUnresolved = QueryRootFldAST UnresolvedVal
-type QueryRootFldResolved = QueryRootFldAST S.SQLExp
-
-traverseQueryRootFldAST
-  :: (Applicative f)
-  => (a -> f b)
-  -> QueryRootFldAST a
-  -> f (QueryRootFldAST b)
-traverseQueryRootFldAST f = \case
-  QRFNode s                -> QRFNode <$> DS.traverseAnnSimpleSelect f s
-  QRFPk s                  -> QRFPk <$> DS.traverseAnnSimpleSelect f s
-  QRFSimple s              -> QRFSimple <$> DS.traverseAnnSimpleSelect f s
-  QRFAgg s                 -> QRFAgg <$> DS.traverseAnnAggregateSelect f s
-  QRFActionSelect s        -> QRFActionSelect <$> DS.traverseAnnSimpleSelect f s
-  QRFActionExecuteObject s -> QRFActionExecuteObject <$> DS.traverseAnnSimpleSelect f s
-  QRFActionExecuteList s   -> QRFActionExecuteList <$> DS.traverseAnnSimpleSelect f s
-  QRFConnection s          -> QRFConnection <$> DS.traverseConnectionSelect f s
-
-toPGQuery :: QueryRootFldResolved -> (Q.Query, Maybe RR.RemoteJoins)
-toPGQuery = \case
-  QRFNode s                -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
-  QRFPk s                  -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
-  QRFSimple s              -> first (toQuery . DS.mkSQLSelect DS.JASMultipleRows) $ RR.getRemoteJoins s
-  QRFAgg s                 -> first (toQuery . DS.mkAggregateSelect) $ RR.getRemoteJoinsAggregateSelect s
-  QRFActionSelect s        -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
-  QRFActionExecuteObject s -> first (toQuery . DS.mkSQLSelect DS.JASSingleObject) $ RR.getRemoteJoins s
-  QRFActionExecuteList s   -> first (toQuery . DS.mkSQLSelect DS.JASMultipleRows) $ RR.getRemoteJoins s
-  QRFConnection s          -> first (toQuery . DS.mkConnectionSelect) $ RR.getRemoteJoinsConnectionSelect s
-  where
-    toQuery :: ToSQL a => a -> Q.Query
-    toQuery = Q.fromBuilder . toSQL
-
-validateHdrs
-  :: (Foldable t, QErrM m) => UserInfo -> t Text -> m ()
-validateHdrs userInfo hdrs = do
-  let receivedVars = _uiSession userInfo
-  forM_ hdrs $ \hdr ->
-    unless (isJust $ getSessionVariableValue (mkSessionVariable hdr) receivedVars) $
-      throw400 NotFound $ hdr <<> " header is expected but not found"
-
-queryFldToPGAST
-  :: ( MonadReusability m
-     , MonadError QErr m
-     , MonadReader r m
-     , Has FieldMap r
-     , Has OrdByCtx r
-     , Has SQLGenCtx r
-     , Has UserInfo r
-     , Has QueryCtxMap r
-     , Has (L.Logger L.Hasura) r
-     , HasVersion
-     , MonadIO m
-     , Tracing.MonadTrace m
-     )
-  => Env.Environment
-  -> V.Field
-  -> RA.QueryActionExecuter
-  -> m QueryRootFldUnresolved
-queryFldToPGAST env fld actionExecuter = do
-  opCtx <- getOpCtx $ V._fName fld
-  userInfo <- asks getter
-  case opCtx of
-    QCNodeSelect nodeSelectMap -> do
-      NodeIdV1 (V1NodeId table columnValues) <- RS.resolveNodeId fld
-      case Map.lookup (GS.mkTableTy table) nodeSelectMap of
-        Nothing -> throwVE $ "table " <> table <<> " not found"
-        Just (selOpCtx, pkeyColumns) -> do
-          validateHdrs userInfo (_socHeaders selOpCtx)
-          QRFNode <$> RS.convertNodeSelect selOpCtx pkeyColumns columnValues fld
-    QCSelect ctx -> do
-      validateHdrs userInfo (_socHeaders ctx)
-      QRFSimple <$> RS.convertSelect ctx fld
-    QCSelectPkey ctx -> do
-      validateHdrs userInfo (_spocHeaders ctx)
-      QRFPk <$> RS.convertSelectByPKey ctx fld
-    QCSelectAgg ctx -> do
-      validateHdrs userInfo (_socHeaders ctx)
-      QRFAgg <$> RS.convertAggSelect ctx fld
-    QCFuncQuery ctx -> do
-      validateHdrs userInfo (_fqocHeaders ctx)
-      QRFSimple <$> RS.convertFuncQuerySimple ctx fld
-    QCFuncAggQuery ctx -> do
-      validateHdrs userInfo (_fqocHeaders ctx)
-      QRFAgg <$> RS.convertFuncQueryAgg ctx fld
-    QCAsyncActionFetch ctx ->
-      QRFActionSelect <$> RA.resolveAsyncActionQuery userInfo ctx fld
-    QCAction ctx -> do
-      -- query actions should not be marked reusable because we aren't
-      -- capturing the variable value in the state as re-usable variables.
-      -- The variables captured in non-action queries are used to generate
-      -- an SQL query, but in case of query actions it's converted into JSON
-      -- and included in the action's webhook payload.
-      markNotReusable
-      let jsonAggType = RA.mkJsonAggSelect $ _saecOutputType ctx
-          f = case jsonAggType of
-            DS.JASMultipleRows -> QRFActionExecuteList
-            DS.JASSingleObject -> QRFActionExecuteObject
-      f <$> actionExecuter (RA.resolveActionQuery env fld ctx (_uiSession userInfo))
-    QCSelectConnection pk ctx -> do
-      validateHdrs userInfo (_socHeaders ctx)
-      QRFConnection <$> RS.convertConnectionSelect pk ctx fld
-    QCFuncConnection pk ctx -> do
-      validateHdrs userInfo (_fqocHeaders ctx)
-      QRFConnection <$> RS.convertConnectionFuncQuery pk ctx fld
-
-mutFldToTx
-  :: ( HasVersion
-     , MonadReusability m
-     , MonadError QErr m
-     , MonadReader r m
-     , Has UserInfo r
-     , Has MutationCtxMap r
-     , Has FieldMap r
-     , Has OrdByCtx r
-     , Has SQLGenCtx r
-     , Has InsCtxMap r
-     , Has HTTP.Manager r
-     , Has [HTTP.Header] r
-     , Has (L.Logger L.Hasura) r
-     , MonadIO m
-     , Tracing.MonadTrace m
-     , MonadIO tx
-     , MonadTx tx
-     , Tracing.MonadTrace tx
-     )
-  => Env.Environment
-  -> V.Field
-  -> m (tx EncJSON, HTTP.ResponseHeaders)
-mutFldToTx env fld = do
-  userInfo <- asks getter
-  reqHeaders <- asks getter
-  httpManager <- asks getter
-  let rjCtx = (httpManager, reqHeaders, userInfo)
-  opCtx <- getOpCtx $ V._fName fld
-  let noRespHeaders = fmap (,[])
-      roleName = _uiRole userInfo
-  case opCtx of
-    MCInsert ctx -> do
-      validateHdrs userInfo (_iocHeaders ctx)
-      noRespHeaders $ RI.convertInsert env rjCtx roleName (_iocTable ctx) fld
-    MCInsertOne ctx -> do
-      validateHdrs userInfo (_iocHeaders ctx)
-      noRespHeaders $ RI.convertInsertOne env rjCtx roleName (_iocTable ctx) fld
-    MCUpdate ctx -> do
-      validateHdrs userInfo (_uocHeaders ctx)
-      noRespHeaders $ RM.convertUpdate env ctx rjCtx fld
-    MCUpdateByPk ctx -> do
-      validateHdrs userInfo (_uocHeaders ctx)
-      noRespHeaders $ RM.convertUpdateByPk env ctx rjCtx fld
-    MCDelete ctx -> do
-      validateHdrs userInfo (_docHeaders ctx)
-      noRespHeaders $ RM.convertDelete env ctx rjCtx fld
-    MCDeleteByPk ctx -> do
-      validateHdrs userInfo (_docHeaders ctx)
-      noRespHeaders $ RM.convertDeleteByPk env ctx rjCtx fld
-    MCAction ctx ->
-      RA.resolveActionMutation env fld ctx userInfo
-
-getOpCtx
-  :: ( MonadReusability m
-     , MonadError QErr m
-     , MonadReader r m
-     , Has (OpCtxMap a) r
-     )
-  => G.Name -> m a
-getOpCtx f = do
-  opCtxMap <- asks getter
-  onNothing (Map.lookup f opCtxMap) $ throw500 $
-    "lookup failed: opctx: " <> showName f
-
-toSQLFromItem :: S.Alias -> QueryRootFldResolved -> S.FromItem
-toSQLFromItem alias = \case
-  QRFNode s                -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
-  QRFPk s                  -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
-  QRFSimple s              -> fromSelect $ DS.mkSQLSelect DS.JASMultipleRows s
-  QRFAgg s                 -> fromSelect $ DS.mkAggregateSelect s
-  QRFActionSelect s        -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
-  QRFActionExecuteObject s -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
-  QRFActionExecuteList s   -> fromSelect $ DS.mkSQLSelect DS.JASSingleObject s
-  QRFConnection s          -> flip (S.FISelectWith (S.Lateral False)) alias
-                              $ DS.mkConnectionSelect s
-  where
-    fromSelect = flip (S.FISelect (S.Lateral False)) alias
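The module deleted above keeps the pre-PDV pipeline in a two-phase shape: a validated root field is first turned into a typed intermediate AST (queryFldToPGAST) and only afterwards compiled to a single SQL query (toPGQuery / toSQLFromItem). A toy standalone version of that split, with stand-in types — none of these names are Hasura's:

{-# LANGUAGE LambdaCase #-}
module TwoPhaseSketch where

-- What the validated GraphQL field asks for.
data RootField
  = SelectRows String -- table name
  | SelectByPk String
  deriving (Show, Eq)

-- Phase 1 output: still a typed AST, not yet SQL.
data QueryAST
  = MultipleRows String
  | SingleObject String
  deriving (Show, Eq)

resolveField :: RootField -> QueryAST
resolveField = \case
  SelectRows t -> MultipleRows t
  SelectByPk t -> SingleObject t

-- Phase 2: compile the AST to (a stand-in for) one SQL text.
toSQLText :: QueryAST -> String
toSQLText = \case
  MultipleRows t -> "SELECT coalesce(json_agg(r), '[]') FROM " <> t <> " AS r"
  SingleObject t -> "SELECT row_to_json(r) FROM " <> t <> " AS r LIMIT 1"

Keeping the AST value-polymorphic (QueryRootFldAST v above) is what lets the same structure hold unresolved session variables first and concrete SQL expressions later, via traverseQueryRootFldAST.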
diff --git a/server/src-lib/Hasura/GraphQL/Resolve/BoolExp.hs b/server/src-lib/Hasura/GraphQL/Resolve/BoolExp.hs
deleted file mode 100644
index 45ea56b79d5..00000000000
--- a/server/src-lib/Hasura/GraphQL/Resolve/BoolExp.hs
+++ /dev/null
@@ -1,211 +0,0 @@
-module Hasura.GraphQL.Resolve.BoolExp
-  ( parseBoolExp
-  , pgColValToBoolExp
-  ) where
-
-import           Data.Has
-import           Hasura.Prelude
-
-import qualified Data.HashMap.Strict           as Map
-import qualified Data.HashMap.Strict.InsOrd    as OMap
-import qualified Language.GraphQL.Draft.Syntax as G
-
-import           Hasura.GraphQL.Resolve.Context
-import           Hasura.GraphQL.Resolve.InputValue
-import           Hasura.GraphQL.Validate.Types
-import           Hasura.RQL.Types
-import           Hasura.SQL.Types
-import           Hasura.SQL.Value
-
-import qualified Hasura.SQL.DML                as S
-
-type OpExp = OpExpG UnresolvedVal
-
-parseOpExps :: (MonadReusability m, MonadError QErr m) => PGColumnType -> AnnInpVal -> m [OpExp]
-parseOpExps colTy annVal = do
-  opExpsM <- flip withObjectM annVal $ \nt objM -> forM objM $ \obj ->
-    forM (OMap.toList obj) $ \(k, v) ->
-      case k of
-        "_cast" -> fmap ACast <$> parseCastExpression v
-
-        "_eq" -> fmap (AEQ True) <$> asOpRhs v
-        "_ne" -> fmap (ANE True) <$> asOpRhs v
-        "_neq" -> fmap (ANE True) <$> asOpRhs v
-        "_is_null" -> resolveIsNull v
-
-        "_in" -> fmap AIN <$> asPGArray colTy v
-        "_nin" -> fmap ANIN <$> asPGArray colTy v
-
-        "_gt" -> fmap AGT <$> asOpRhs v
-        "_lt" -> fmap ALT <$> asOpRhs v
-        "_gte" -> fmap AGTE <$> asOpRhs v
-        "_lte" -> fmap ALTE <$> asOpRhs v
-
-        "_like" -> fmap ALIKE <$> asOpRhs v
-        "_nlike" -> fmap ANLIKE <$> asOpRhs v
-
-        "_ilike" -> fmap AILIKE <$> asOpRhs v
-        "_nilike" -> fmap ANILIKE <$> asOpRhs v
-
-        "_similar" -> fmap ASIMILAR <$> asOpRhs v
-        "_nsimilar" -> fmap ANSIMILAR <$> asOpRhs v
-
-        -- jsonb related operators
-        "_contains" -> fmap AContains <$> asOpRhs v
-        "_contained_in" -> fmap AContainedIn <$> asOpRhs v
-        "_has_key" -> fmap AHasKey <$> asOpRhs v
-
-        "_has_keys_any" -> fmap AHasKeysAny <$> asPGArray (PGColumnScalar PGText) v
-        "_has_keys_all" -> fmap AHasKeysAll <$> asPGArray (PGColumnScalar PGText) v
-
-        -- geometry/geography type related operators
-        "_st_contains" -> fmap ASTContains <$> asOpRhs v
-        "_st_crosses" -> fmap ASTCrosses <$> asOpRhs v
-        "_st_equals" -> fmap ASTEquals <$> asOpRhs v
-        "_st_intersects" -> fmap ASTIntersects <$> asOpRhs v
-        "_st_overlaps" -> fmap ASTOverlaps <$> asOpRhs v
-        "_st_touches" -> fmap ASTTouches <$> asOpRhs v
-        "_st_within" -> fmap ASTWithin <$> asOpRhs v
-        "_st_d_within" -> parseAsObjectM v parseAsSTDWithinObj
-
-        -- raster type related operators
-        "_st_intersects_rast" -> fmap ASTIntersectsRast <$> asOpRhs v
-        "_st_intersects_nband_geom" -> parseAsObjectM v parseAsSTIntersectsNbandGeomObj
-        "_st_intersects_geom_nband" -> parseAsObjectM v parseAsSTIntersectsGeomNbandObj
-
-        _ ->
-          throw500
-            $ "unexpected operator found in opexp of "
-            <> showNamedTy nt
-            <> ": "
-            <> showName k
-  return $ catMaybes $ fromMaybe [] opExpsM
-  where
-    asOpRhs = fmap (fmap mkParameterizablePGValue) . asPGColumnValueM
-
-    parseAsObjectM v f = asObjectM v >>= mapM f
-
-    asPGArray rhsTy v = do
-      valsM <- parseMany (openOpaqueValue <=< asPGColumnValue) v
-      forM valsM $ \vals -> do
-        let arrayExp = S.SEArray $ map (txtEncoder . pstValue . _apvValue) vals
-        return $ UVSQL $ S.SETyAnn arrayExp $ S.mkTypeAnn $
-          -- Safe here because asPGColumnValue ensured all the values are of the right type, but if the
-          -- list is empty, we don’t actually have a scalar type to use, so we need to use
-          -- unsafePGColumnToRepresentation to create it. (It would be nice to refactor things to
-          -- somehow get rid of this.)
-          PGTypeArray (unsafePGColumnToRepresentation rhsTy)
-
-    resolveIsNull v = asPGColumnValueM v >>= traverse openOpaqueValue >>= \case
-      Nothing -> pure Nothing
-      Just annPGVal -> case pstValue $ _apvValue annPGVal of
-        PGValBoolean b -> pure . Just $ bool ANISNOTNULL ANISNULL b
-        _ -> throw500 "boolean value is expected"
-
-    parseAsSTDWithinObj obj = do
-      distanceVal <- onNothing (OMap.lookup "distance" obj) $
-        throw500 "expected \"distance\" input field in st_d_within"
-      dist <- mkParameterizablePGValue <$> asPGColumnValue distanceVal
-      fromVal <- onNothing (OMap.lookup "from" obj) $
-        throw500 "expected \"from\" input field in st_d_within"
-      from <- mkParameterizablePGValue <$> asPGColumnValue fromVal
-      case colTy of
-        PGColumnScalar PGGeography -> do
-          useSpheroidVal <-
-            onNothing (OMap.lookup "use_spheroid" obj) $
-            throw500 "expected \"use_spheroid\" input field in st_d_within"
-          useSpheroid <- mkParameterizablePGValue <$> asPGColumnValue useSpheroidVal
-          return $ ASTDWithinGeog $ DWithinGeogOp dist from useSpheroid
-        PGColumnScalar PGGeometry ->
-          return $ ASTDWithinGeom $ DWithinGeomOp dist from
-        _ -> throw500 "expected PGGeometry/PGGeography column for st_d_within"
-
-    parseAsSTIntersectsNbandGeomObj obj = do
-      nbandVal <- onNothing (OMap.lookup "nband" obj) $
-        throw500 "expected \"nband\" input field"
-      nband <- mkParameterizablePGValue <$> asPGColumnValue nbandVal
-      geommin <- parseGeommin obj
-      return $ ASTIntersectsNbandGeom $ STIntersectsNbandGeommin nband geommin
-
-    parseAsSTIntersectsGeomNbandObj obj = do
-      nbandMM <- fmap (fmap mkParameterizablePGValue) <$>
-        traverse asPGColumnValueM (OMap.lookup "nband" obj)
-      geommin <- parseGeommin obj
-      return $ ASTIntersectsGeomNband $ STIntersectsGeomminNband geommin $ join nbandMM
-
-    parseGeommin obj = do
-      geomminVal <- onNothing (OMap.lookup "geommin" obj) $
-        throw500 "expected \"geommin\" input field"
-      mkParameterizablePGValue <$> asPGColumnValue geomminVal
-
-parseCastExpression
-  :: (MonadReusability m, MonadError QErr m)
-  => AnnInpVal -> m (Maybe (CastExp UnresolvedVal))
-parseCastExpression =
-  withObjectM $ \_ objM -> forM objM $ \obj -> do
-    targetExps <- forM (OMap.toList obj) $ \(targetTypeName, castedComparisonExpressionInput) -> do
-      let targetType = textToPGScalarType $ G.unName targetTypeName
-      castedComparisonExpressions <- parseOpExps (PGColumnScalar targetType) castedComparisonExpressionInput
-      return (targetType, castedComparisonExpressions)
-    return $ Map.fromList targetExps
-
-parseColExp
-  :: ( MonadReusability m
-     , MonadError QErr m
-     , MonadReader r m
-     , Has FieldMap r
-     )
-  => G.NamedType -> G.Name -> AnnInpVal
-  -> m (AnnBoolExpFld UnresolvedVal)
-parseColExp nt n val = do
-  fldInfo <- getFldInfo nt n
-  case fldInfo of
-    RFPGColumn pgColInfo -> do
-      opExps <- parseOpExps (pgiType pgColInfo) val
-      return $ AVCol pgColInfo opExps
-    RFRelationship (RelationshipField relInfo _ _ permExp _)-> do
-      relBoolExp <- parseBoolExp val
-      return $ AVRel relInfo $ andAnnBoolExps relBoolExp $
-        fmapAnnBoolExp partialSQLExpToUnresolvedVal permExp
-    RFComputedField _ -> throw500
-      "computed fields are not allowed in bool_exp"
-    RFRemoteRelationship _ -> throw500
-      "remote relationships are not allowed in bool_exp"
-    RFNodeId _ _ -> throw500
-      "node id is not allowed in bool_exp"
-
-parseBoolExp
-  :: ( MonadReusability m
-     , MonadError QErr m
-     , MonadReader r m
-     , Has FieldMap r
-     )
-  => AnnInpVal -> m (AnnBoolExp UnresolvedVal)
-parseBoolExp annGVal = do
-  boolExpsM <-
-    flip withObjectM annGVal
-    $ \nt objM -> forM objM $ \obj -> forM (OMap.toList obj) $ \(k, v) -> if
-      | k == "_or"  -> BoolOr . fromMaybe []
-                       <$> parseMany parseBoolExp v
-      | k == "_and" -> BoolAnd . fromMaybe []
-                       <$> parseMany parseBoolExp v
-      | k == "_not" -> BoolNot <$> parseBoolExp v
-      | otherwise   -> BoolFld <$> parseColExp nt k v
-  return $ BoolAnd $ fromMaybe [] boolExpsM
-
-type PGColValMap = Map.HashMap G.Name AnnInpVal
-
-pgColValToBoolExp
-  :: (MonadReusability m, MonadError QErr m)
-  => PGColArgMap -> PGColValMap -> m AnnBoolExpUnresolved
-pgColValToBoolExp colArgMap colValMap = do
-  colExps <- forM colVals $ \(name, val) ->
-    BoolFld <$> do
-      opExp <- AEQ True . mkParameterizablePGValue <$> asPGColumnValue val
-      colInfo <- onNothing (Map.lookup name colArgMap) $
-        throw500 $ "column name " <> showName name
-        <> " not found in column arguments map"
-      return $ AVCol colInfo [opExp]
-  return $ BoolAnd colExps
-  where
-    colVals = Map.toList colValMap
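parseBoolExp above recurses through the generated boolean-expression input object: `_and`/`_or` take lists of sub-expressions, `_not` takes one, and any other key is a column comparison. A standalone miniature of the same recursion over plain JSON, with operator coverage reduced to `_eq` — this assumes aeson 1.x, where Object is a HashMap of Text keys, matching this codebase:

{-# LANGUAGE OverloadedStrings #-}
module BoolExpSketch where

import qualified Data.Aeson          as J
import qualified Data.HashMap.Strict as Map
import           Data.Text           (Text)

data BoolExp
  = BoolAnd [BoolExp]
  | BoolOr  [BoolExp]
  | BoolNot BoolExp
  | ColEq   Text J.Value
  deriving (Show, Eq)

parseBoolExp' :: J.Value -> Either String BoolExp
parseBoolExp' (J.Object o) = BoolAnd <$> traverse go (Map.toList o)
  where
    go (k, v) = case k of
      "_and" -> BoolAnd <$> parseList v
      "_or"  -> BoolOr  <$> parseList v
      "_not" -> BoolNot <$> parseBoolExp' v
      _      -> parseColumn k v
    parseList (J.Array vs) = traverse parseBoolExp' (foldr (:) [] vs)
    parseList _            = Left "expected a list of boolean expressions"
    parseColumn col (J.Object ops) = case Map.toList ops of
      [("_eq", rhs)] -> Right (ColEq col rhs)
      _              -> Left "only _eq is modelled in this sketch"
    parseColumn _ _ = Left "expected an operator object"
parseBoolExp' _ = Left "expected an object"

Note how the implicit top level is an AND of all keys, exactly as in the deleted code's final `BoolAnd $ fromMaybe [] boolExpsM`.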
diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Context.hs b/server/src-lib/Hasura/GraphQL/Resolve/Context.hs
deleted file mode 100644
index 044fbc0fc9e..00000000000
--- a/server/src-lib/Hasura/GraphQL/Resolve/Context.hs
+++ /dev/null
@@ -1,151 +0,0 @@
-module Hasura.GraphQL.Resolve.Context
-  ( FunctionArgItem(..)
-  , OrdByItem(..)
-  , UpdPermForIns(..)
-  , InsCtx(..)
-  , RespTx
-  , LazyRespTx
-  , AnnPGVal(..)
-  , UnresolvedVal(..)
-  , resolveValTxt
-  , InsertTxConflictCtx(..)
-  , getFldInfo
-  , getPGColInfo
-  , getArg
-  , withArg
-  , withArgM
-  , nameAsPath
-
-  , PrepArgs
-  , prepare
-  , prepareColVal
-  , withPrepArgs
-
-  , txtConverter
-
-  , traverseObjectSelectionSet
-  , fieldAsPath
-  , resolvePGCol
-  , module Hasura.GraphQL.Utils
-  , module Hasura.GraphQL.Resolve.Types
-  ) where
-
-import           Data.Has
-import           Hasura.Prelude
-
-import qualified Data.HashMap.Strict           as Map
-import qualified Data.Sequence                 as Seq
-import qualified Database.PG.Query             as Q
-import qualified Language.GraphQL.Draft.Syntax as G
-
-import           Hasura.GraphQL.Resolve.Types
-import           Hasura.GraphQL.Utils
-import           Hasura.GraphQL.Validate.SelectionSet
-import           Hasura.GraphQL.Validate.Types
-import           Hasura.RQL.DML.Internal       (currentSession, sessVarFromCurrentSetting)
-import           Hasura.RQL.Types
-import           Hasura.SQL.Types
-import           Hasura.SQL.Value
-
-import qualified Hasura.SQL.DML                as S
-
-getFldInfo
-  :: (MonadError QErr m, MonadReader r m, Has FieldMap r)
-  => G.NamedType -> G.Name
-  -> m ResolveField
-getFldInfo nt n = do
-  fldMap <- asks getter
-  onNothing (Map.lookup (nt,n) fldMap) $
-    throw500 $ "could not lookup " <> showName n <> " in " <>
-    showNamedTy nt
-
-getPGColInfo
-  :: (MonadError QErr m, MonadReader r m, Has FieldMap r)
-  => G.NamedType -> G.Name -> m PGColumnInfo
-getPGColInfo nt n = do
-  fldInfo <- getFldInfo nt n
-  case fldInfo of
-    RFPGColumn pgColInfo -> return pgColInfo
-    RFRelationship _ -> throw500 $ mkErrMsg "relation"
-    RFComputedField _ -> throw500 $ mkErrMsg "computed field"
-    RFRemoteRelationship _ -> throw500 $ mkErrMsg "remote relationship"
-    RFNodeId _ _ -> throw500 $ mkErrMsg "node id"
-  where
-    mkErrMsg ty =
-      "found " <> ty <> " when expecting pgcolinfo for "
-      <> showNamedTy nt <> ":" <> showName n
-
-getArg
-  :: (MonadError QErr m)
-  => ArgsMap
-  -> G.Name
-  -> m AnnInpVal
-getArg args arg =
-  onNothing (Map.lookup arg args) $
-    throw500 $ "missing argument: " <> showName arg
-
-prependArgsInPath
-  :: (MonadError QErr m)
-  => m a -> m a
-prependArgsInPath = withPathK "args"
-
-nameAsPath
-  :: (MonadError QErr m)
-  => G.Name -> m a -> m a
-nameAsPath name = withPathK (G.unName name)
-
-withArg
-  :: (MonadError QErr m)
-  => ArgsMap
-  -> G.Name
-  -> (AnnInpVal -> m a)
-  -> m a
-withArg args arg f = prependArgsInPath $ nameAsPath arg $
-  getArg args arg >>= f
-
-withArgM
-  :: (MonadReusability m, MonadError QErr m)
-  => ArgsMap
-  -> G.Name
-  -> (AnnInpVal -> m a)
-  -> m (Maybe a)
-withArgM args argName f = do
-  wrappedArg <- for (Map.lookup argName args) $ \arg -> do
    when (isJust (_aivVariable arg) && G.isNullable (_aivType arg)) markNotReusable
-    pure . bool (Just arg) Nothing $ hasNullVal (_aivValue arg)
-  prependArgsInPath . nameAsPath argName $ traverse f (join wrappedArg)
-
-type PrepArgs = Seq.Seq Q.PrepArg
-
-prepare :: (MonadState PrepArgs m) => AnnPGVal -> m S.SQLExp
-prepare (AnnPGVal _ _ scalarValue) = prepareColVal scalarValue
-
-resolveValTxt :: (Applicative f) => UnresolvedVal -> f S.SQLExp
-resolveValTxt = \case
-  UVPG annPGVal -> txtConverter annPGVal
-  UVSessVar colTy sessVar -> sessVarFromCurrentSetting colTy sessVar
-  UVSQL sqlExp -> pure sqlExp
-  UVSession -> pure currentSession
-
-withPrepArgs :: StateT PrepArgs m a -> m (a, PrepArgs)
-withPrepArgs m = runStateT m Seq.empty
-
-prepareColVal
-  :: (MonadState PrepArgs m)
-  => WithScalarType PGScalarValue -> m S.SQLExp
-prepareColVal (WithScalarType scalarType colVal) = do
-  preparedArgs <- get
-  put (preparedArgs Seq.|> binEncoder colVal)
-  return $ toPrepParam (Seq.length preparedArgs + 1) scalarType
-
-txtConverter :: Applicative f => AnnPGVal -> f S.SQLExp
-txtConverter (AnnPGVal _ _ scalarValue) = pure $ toTxtValue scalarValue
-
-fieldAsPath :: (MonadError QErr m) => Field -> m a -> m a
-fieldAsPath = nameAsPath . _fName
-
-resolvePGCol :: (MonadError QErr m)
-  => PGColGNameMap -> G.Name -> m PGColumnInfo
-resolvePGCol colFldMap fldName =
-  onNothing (Map.lookup fldName colFldMap) $ throw500 $
-    "no column associated with name " <> G.unName fldName
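prepareColVal in the deleted module above shows the trick behind parameterized queries: a State of already-collected prepared arguments, where each new value is appended and replaced in the SQL by its positional placeholder ($1, $2, ...). A standalone miniature, with String standing in for the SQL expression type:

module PrepArgsSketch where

import           Control.Monad.State
import qualified Data.Sequence       as Seq

-- Append the value to the accumulated argument list and return its
-- 1-based positional placeholder for use in the SQL text.
prepareVal :: v -> State (Seq.Seq v) String
prepareVal v = do
  args <- get
  put (args Seq.|> v)
  pure $ "$" <> show (Seq.length args + 1)

-- ghci> runState (traverse prepareVal ["x", "y"]) Seq.empty
-- (["$1","$2"],fromList ["x","y"])

This is why withPrepArgs runs the whole resolution inside one StateT: every column value prepared anywhere in the query lands in a single, ordered argument sequence.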
diff --git a/server/src-lib/Hasura/GraphQL/Resolve/InputValue.hs b/server/src-lib/Hasura/GraphQL/Resolve/InputValue.hs
deleted file mode 100644
index 4c76ef71558..00000000000
--- a/server/src-lib/Hasura/GraphQL/Resolve/InputValue.hs
+++ /dev/null
@@ -1,228 +0,0 @@
-module Hasura.GraphQL.Resolve.InputValue
-  ( withNotNull
-  , tyMismatch
-
-  , OpaqueValue
-  , OpaquePGValue
-  , mkParameterizablePGValue
-  , openOpaqueValue
-  , asPGColumnTypeAndValueM
-  , asPGColumnValueM
-  , asPGColumnValue
-
-  , asScalarValM
-  , asScalarVal
-  , asEnumVal
-  , asEnumValM
-  , withObject
-  , asObject
-  , withObjectM
-  , asObjectM
-  , withArray
-  , asArray
-  , withArrayM
-  , parseMany
-  , asPGColText
-  , asPGColTextM
-  , annInpValueToJson
-  ) where
-
-import           Hasura.Prelude
-
-import qualified Text.Builder                  as TB
-
-import qualified Language.GraphQL.Draft.Syntax as G
-
-import qualified Data.Aeson                    as J
-import qualified Hasura.RQL.Types              as RQL
-
-import           Hasura.GraphQL.Resolve.Context
-import           Hasura.GraphQL.Validate.Types
-import           Hasura.RQL.Types
-import           Hasura.SQL.Types
-import           Hasura.SQL.Value
-
-withNotNull
-  :: (MonadError QErr m)
-  => G.NamedType -> Maybe a -> m a
-withNotNull nt v =
-  onNothing v $ throw500 $
-    "unexpected null for a value of type " <> showNamedTy nt
-
-tyMismatch
-  :: (MonadError QErr m) => Text -> AnnInpVal -> m a
-tyMismatch expectedTy v =
-  throw500 $ "expected " <> expectedTy <> ", found " <>
-    getAnnInpValKind (_aivValue v) <> " for value of type " <>
-    G.showGT (_aivType v)
-
--- | As part of query reusability tracking (see 'QueryReusability'), functions that parse input
--- values call 'markNotReusable' when the value comes from a variable. However, always calling
-- 'markNotReusable' when parsing column values (using 'asPGColumnValue' and its variants) would be
--- much too conservative: often the value is simply validated and wrapped immediately in 'UVPG',
--- which allows it to be parameterized over.
--
-- Always omitting the check would be incorrect, as some callers inspect the column values and use
-- them to generate different SQL, which is where 'OpaqueValue' comes in. Functions like
-- 'asPGColumnValue' return an 'OpaquePGValue', which can be safely converted to an 'UnresolvedVal'
-- via 'mkParameterizablePGValue' without marking the query as non-reusable. Other callers that wish
-- to inspect the value can instead call 'openOpaqueValue' to get the value out, and /that/ will
-- mark the query non-reusable, instead.
--
-- In other words, 'OpaqueValue' is a mechanism of delaying the 'markNotReusable' call until we’re
-- confident its value will actually affect the generated SQL.
-data OpaqueValue a
-  = OpaqueValue
-  { _opgvValue      :: !a
-  , _opgvIsVariable :: !Bool
-  } deriving (Show)
-type OpaquePGValue = OpaqueValue AnnPGVal
-
-mkParameterizablePGValue :: OpaquePGValue -> UnresolvedVal
-mkParameterizablePGValue (OpaqueValue v _) = UVPG v
-
-openOpaqueValue :: (MonadReusability m) => OpaqueValue a -> m a
-openOpaqueValue (OpaqueValue v isVariable) = when isVariable markNotReusable $> v
-
-asPGColumnTypeAndValueM
-  :: (MonadReusability m, MonadError QErr m)
-  => AnnInpVal
-  -> m (PGColumnType, WithScalarType (Maybe (OpaqueValue PGScalarValue)))
-asPGColumnTypeAndValueM v = do
-  (columnType, scalarValueM) <- case _aivValue v of
-    AGScalar colTy val -> pure (PGColumnScalar colTy, WithScalarType colTy val)
-    AGEnum _ (AGEReference reference maybeValue) -> do
-      let maybeScalarValue = PGValText . RQL.getEnumValue <$> maybeValue
-      pure (PGColumnEnumReference reference, WithScalarType PGText maybeScalarValue)
-    _ -> tyMismatch "pgvalue" v
-
-  for_ (_aivVariable v) $ \variableName -> if
-    -- If the value is a nullable variable, then the caller might make a different decision based on
-    -- whether the result is 'Nothing' or 'Just', which would change the generated query, so we have
-    -- to unconditionally mark the query non-reusable.
-    | G.isNullable (_aivType v) -> markNotReusable
-    | otherwise -> recordVariableUse variableName columnType
-
-  let isVariable = isJust $ _aivVariable v
-  pure (columnType, fmap (flip OpaqueValue isVariable) <$> scalarValueM)
-
-asPGColumnTypeAndAnnValueM
-  :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (PGColumnType, Maybe OpaquePGValue)
-asPGColumnTypeAndAnnValueM v = do
-  (columnType, scalarValueM) <- asPGColumnTypeAndValueM v
-  let mkAnnPGColVal = AnnPGVal (_aivVariable v) (G.isNullable (_aivType v))
-      replaceOpaqueValue (WithScalarType scalarType (OpaqueValue scalarValue isVariable)) =
-        OpaqueValue (mkAnnPGColVal (WithScalarType scalarType scalarValue)) isVariable
-  pure (columnType, replaceOpaqueValue <$> sequence scalarValueM)
-
-asPGColumnValueM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe OpaquePGValue)
-asPGColumnValueM = fmap snd . asPGColumnTypeAndAnnValueM
-
-asPGColumnValue :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m OpaquePGValue
-asPGColumnValue v = do
-  (columnType, annPGValM) <- asPGColumnTypeAndAnnValueM v
-  onNothing annPGValM $ throw500 ("unexpected null for type " <>> columnType)
-
-openInputValue :: (MonadReusability m) => AnnInpVal -> m AnnGValue
-openInputValue v = when (isJust $ _aivVariable v) markNotReusable $> _aivValue v
-
-asScalarValM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> PGScalarType -> m (Maybe PGScalarValue)
-asScalarValM v tp = openInputValue v >>= \case
-  AGScalar tp' vM ->
-    if tp == tp'
-    then pure vM
-    else tyMismatch "scalar" v
-  _ -> tyMismatch "scalar" v
-
-asScalarVal :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> PGScalarType -> m PGScalarValue
-asScalarVal v tp = asScalarValM v tp >>= \case
-  Just val -> pure val
-  Nothing  -> throw500 $ "unexpected null for ty " <> TB.run (toSQL tp)
-
--- | Note: only handles “synthetic” enums (see 'EnumValuesInfo'). Enum table references are handled
-- by 'asPGColumnType' and its variants.
-asEnumVal :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (G.NamedType, G.EnumValue)
-asEnumVal = asEnumValM >=> \case
-  (ty, Just val) -> pure (ty, val)
-  (ty, Nothing)  -> throw500 $ "unexpected null for ty " <> showNamedTy ty
-
--- | Like 'asEnumVal', only handles “synthetic” enums.
-asEnumValM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (G.NamedType, Maybe G.EnumValue)
-asEnumValM v = openInputValue v >>= \case
-  AGEnum ty (AGESynthetic valM) -> return (ty, valM)
-  _ -> tyMismatch "enum" v
-
-withObject
-  :: (MonadReusability m, MonadError QErr m) => (G.NamedType -> AnnGObject -> m a) -> AnnInpVal -> m a
-withObject fn v = openInputValue v >>= \case
-  AGObject nt (Just obj) -> fn nt obj
-  AGObject _ Nothing ->
-    throw500 $ "unexpected null for ty"
-    <> G.showGT (_aivType v)
-  _ -> tyMismatch "object" v
-
-asObject :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m AnnGObject
-asObject = withObject (\_ o -> return o)
-
-withObjectM
-  :: (MonadReusability m, MonadError QErr m)
-  => (G.NamedType -> Maybe AnnGObject -> m a) -> AnnInpVal -> m a
-withObjectM fn v = openInputValue v >>= \case
-  AGObject nt objM -> fn nt objM
-  _ -> tyMismatch "object" v
-
-asObjectM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe AnnGObject)
-asObjectM = withObjectM (\_ o -> return o)
-
-withArrayM
-  :: (MonadReusability m, MonadError QErr m)
-  => (G.ListType -> Maybe [AnnInpVal] -> m a) -> AnnInpVal -> m a
-withArrayM fn v = openInputValue v >>= \case
-  AGArray lt listM -> fn lt listM
-  _ -> tyMismatch "array" v
-
-withArray
-  :: (MonadReusability m, MonadError QErr m)
-  => (G.ListType -> [AnnInpVal] -> m a) -> AnnInpVal -> m a
-withArray fn v = openInputValue v >>= \case
-  AGArray lt (Just l) -> fn lt l
-  AGArray _ Nothing -> throw500 $ "unexpected null for ty"
-                       <> G.showGT (_aivType v)
-  _ -> tyMismatch "array" v
-
-asArray :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m [AnnInpVal]
-asArray = withArray (\_ vals -> return vals)
-
-parseMany
-  :: (MonadReusability m, MonadError QErr m) => (AnnInpVal -> m a) -> AnnInpVal -> m (Maybe [a])
-parseMany fn v = openInputValue v >>= \case
-  AGArray _ arrM -> mapM (mapM fn) arrM
-  _ -> tyMismatch "array" v
-
-onlyText
-  :: (MonadError QErr m)
-  => PGScalarValue -> m Text
-onlyText = \case
-  PGValText t -> return t
-  PGValVarchar t -> return t
-  _ -> throw500 "expecting text for asPGColText"
-
-asPGColText :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m Text
-asPGColText val = do
-  pgColVal <- openOpaqueValue =<< asPGColumnValue val
-  onlyText (pstValue $ _apvValue pgColVal)
-
-asPGColTextM :: (MonadReusability m, MonadError QErr m) => AnnInpVal -> m (Maybe Text)
-asPGColTextM val = do
-  pgColValM <- traverse openOpaqueValue =<< asPGColumnValueM val
-  traverse onlyText (pstValue . _apvValue <$> pgColValM)
-
-annInpValueToJson :: AnnInpVal -> J.Value
-annInpValueToJson annInpValue =
-  case _aivValue annInpValue of
-    AGScalar _ pgColumnValueM -> maybe J.Null pgScalarValueToJson pgColumnValueM
-    AGEnum _ enumValue -> case enumValue of
-      AGESynthetic enumValueM   -> J.toJSON enumValueM
-      AGEReference _ enumValueM -> J.toJSON enumValueM
-    AGObject _ objectM -> J.toJSON $ fmap (fmap annInpValueToJson) objectM
-    AGArray _ valuesM -> J.toJSON $ fmap (fmap annInpValueToJson) valuesM
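The OpaqueValue machinery documented in the haddock above delays the "this query is not reusable" decision until a variable's value is actually inspected, rather than merely passed through to a parameterized SQL expression. A standalone miniature of that idea, with a Writer standing in for the real reusability-tracking monad (all names here are illustrative):

module OpaqueValueSketch where

import Control.Monad        (when)
import Control.Monad.Writer (Writer, tell)
import Data.Monoid          (Any (..))

data Opaque a = Opaque
  { opValue      :: a
  , opIsVariable :: Bool
  }

-- Passing the value straight through never taints reusability,
-- mirroring mkParameterizablePGValue.
parameterize :: Opaque a -> a
parameterize = opValue

-- Inspecting the value taints reusability exactly when it came from a
-- variable: Any True here plays the role of markNotReusable,
-- mirroring openOpaqueValue.
open :: Opaque a -> Writer Any a
open (Opaque v isVar) = do
  when isVar $ tell (Any True)
  pure v

-- ghci> runWriter (open (Opaque (42 :: Int) True))
-- (42,Any {getAny = True})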
, _iwePrepArgs :: !(Seq.Seq Q.PrepArg) - } deriving (Show, Eq) - -data AnnInsObj - = AnnInsObj - { _aioColumns :: ![PGColWithValue] - , _aioObjRels :: ![ObjRelIns] - , _aioArrRels :: ![ArrRelIns] - } deriving (Show, Eq) - -mkAnnInsObj - :: (MonadReusability m, MonadError QErr m, Has InsCtxMap r, MonadReader r m, Has FieldMap r) - => RelationInfoMap - -> PGColGNameMap - -> AnnGObject - -> m AnnInsObj -mkAnnInsObj relInfoMap allColMap annObj = - foldrM (traverseInsObj relInfoMap allColMap) emptyInsObj $ OMap.toList annObj - where - emptyInsObj = AnnInsObj [] [] [] - -traverseInsObj - :: (MonadReusability m, MonadError QErr m, Has InsCtxMap r, MonadReader r m, Has FieldMap r) - => RelationInfoMap - -> PGColGNameMap - -> (G.Name, AnnInpVal) - -> AnnInsObj - -> m AnnInsObj -traverseInsObj rim allColMap (gName, annVal) defVal@(AnnInsObj cols objRels arrRels) = - case _aivValue annVal of - AGScalar{} -> parseValue - AGEnum{} -> parseValue - _ -> parseObject - where - parseValue = do - (_, WithScalarType scalarType maybeScalarValue) <- asPGColumnTypeAndValueM annVal - columnInfo <- onNothing (Map.lookup gName allColMap) $ - throw500 "column not found in PGColGNameMap" - let columnName = pgiColumn columnInfo - scalarValue <- maybe (pure $ PGNull scalarType) openOpaqueValue maybeScalarValue - pure $ AnnInsObj ((columnName, WithScalarType scalarType scalarValue):cols) objRels arrRels - - parseObject = do - objM <- asObjectM annVal - -- if relational insert input is 'null' then ignore - -- return default value - fmap (fromMaybe defVal) $ forM objM $ \obj -> do - let relNameM = RelName <$> mkNonEmptyText (G.unName gName) - onConflictM = OMap.lookup "on_conflict" obj - relName <- onNothing relNameM $ throw500 "found empty GName String" - dataVal <- onNothing (OMap.lookup "data" obj) $ - throw500 "\"data\" object not found" - relInfo <- onNothing (Map.lookup relName rim) $ - throw500 $ "relation " <> relName <<> " not found" - - let rTable = riRTable relInfo - InsCtx rtColMap checkCond rtDefVals rtRelInfoMap rtUpdPerm <- getInsCtx rTable - let rtCols = Map.elems rtColMap - rtDefValsRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) rtDefVals - - withPathK (G.unName gName) $ case riType relInfo of - ObjRel -> do - dataObj <- asObject dataVal - annDataObj <- mkAnnInsObj rtRelInfoMap rtColMap dataObj - ccM <- forM onConflictM $ parseOnConflict rTable rtUpdPerm rtColMap - let singleObjIns = AnnIns annDataObj ccM (checkCond, rtUpdPerm >>= upfiCheck) rtCols rtDefValsRes - objRelIns = RelIns singleObjIns relInfo - return (AnnInsObj cols (objRelIns:objRels) arrRels) - - ArrRel -> do - arrDataVals <- asArray dataVal - let withNonEmptyArrData = do - annDataObjs <- forM arrDataVals $ \arrDataVal -> do - dataObj <- asObject arrDataVal - mkAnnInsObj rtRelInfoMap rtColMap dataObj - ccM <- forM onConflictM $ parseOnConflict rTable rtUpdPerm rtColMap - let multiObjIns = AnnIns annDataObjs ccM (checkCond, rtUpdPerm >>= upfiCheck) rtCols rtDefValsRes - arrRelIns = RelIns multiObjIns relInfo - return (AnnInsObj cols objRels (arrRelIns:arrRels)) - -- if array relation insert input data has empty objects - -- then ignore and return default value - bool withNonEmptyArrData (return defVal) $ null arrDataVals - -parseOnConflict - :: (MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r) - => QualifiedTable - -> Maybe UpdPermForIns - -> PGColGNameMap - -> AnnInpVal - -> m RI.ConflictClauseP1 -parseOnConflict tn updFiltrM allColMap val = withPathK "on_conflict" $ - flip withObject val $ \_ obj -> do - 
constraint <- RI.CTConstraint <$> parseConstraint obj - updCols <- getUpdCols obj - case updCols of - [] -> return $ RI.CP1DoNothing $ Just constraint - _ -> do - UpdPermForIns _ _ updFiltr preSet <- onNothing updFiltrM $ throw500 - "cannot update columns since update permission is not defined" - preSetRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) preSet - updFltrRes <- traverseAnnBoolExp - (convPartialSQLExp sessVarFromCurrentSetting) - updFiltr - whereExp <- parseWhereExp obj - let updateBoolExp = toSQLBoolExp (S.mkQual tn) updFltrRes - whereCondition = S.BEBin S.AndOp updateBoolExp whereExp - return $ RI.CP1Update constraint updCols preSetRes whereCondition - - where - getUpdCols o = do - updColsVal <- onNothing (OMap.lookup "update_columns" o) $ throw500 - "\"update_columns\" argument in expected in \"on_conflict\" field " - parseColumns allColMap updColsVal - - parseConstraint o = do - v <- onNothing (OMap.lookup "constraint" o) $ throw500 - "\"constraint\" is expected, but not found" - (_, enumVal) <- asEnumVal v - return $ ConstraintName $ G.unName $ G.unEnumValue enumVal - - parseWhereExp = - OMap.lookup "where" - >>> traverse (parseBoolExp >=> traverse (traverse resolveValTxt)) - >>> fmap (maybe (S.BELit True) (toSQLBoolExp (S.mkQual tn))) - -toSQLExps - :: (MonadError QErr m, MonadState PrepArgs m) - => [PGColWithValue] - -> m [(PGCol, S.SQLExp)] -toSQLExps cols = - forM cols $ \(c, v) -> do - prepExp <- prepareColVal v - return (c, prepExp) - -mkSQLRow :: Map.HashMap PGCol S.SQLExp -> [(PGCol, S.SQLExp)] -> [S.SQLExp] -mkSQLRow defVals withPGCol = map snd $ - flip map (Map.toList defVals) $ - \(col, defVal) -> (col,) $ fromMaybe defVal $ Map.lookup col withPGColMap - where - withPGColMap = Map.fromList withPGCol - -mkInsertQ - :: MonadError QErr m - => QualifiedTable - -> Maybe RI.ConflictClauseP1 - -> [PGColWithValue] - -> Map.HashMap PGCol S.SQLExp - -> RoleName - -> (AnnBoolExpSQL, Maybe AnnBoolExpSQL) - -> m CTEExp -mkInsertQ tn onConflictM insCols defVals role (insCheck, updCheck) = do - (givenCols, args) <- flip runStateT Seq.Empty $ toSQLExps insCols - let sqlConflict = RI.toSQLConflict <$> onConflictM - sqlExps = mkSQLRow defVals givenCols - valueExp = S.ValuesExp [S.TupleExp sqlExps] - tableCols = Map.keys defVals - sqlInsert = - S.SQLInsert tn tableCols valueExp sqlConflict - . 
Just - $ S.RetExp - [ S.selectStar - , S.Extractor - (insertOrUpdateCheckExpr tn onConflictM - (toSQLBoolExp (S.QualTable tn) insCheck) - (fmap (toSQLBoolExp (S.QualTable tn)) updCheck)) - Nothing - ] - - adminIns = return (CTEExp (S.CTEInsert sqlInsert) args) - nonAdminInsert = do - let cteIns = S.CTEInsert sqlInsert - return (CTEExp cteIns args) - - bool nonAdminInsert adminIns $ isAdmin role - -fetchFromColVals - :: MonadError QErr m - => ColumnValuesText - -> [PGColumnInfo] - -> m [(PGCol, WithScalarType PGScalarValue)] -fetchFromColVals colVal reqCols = - forM reqCols $ \ci -> do - let valM = Map.lookup (pgiColumn ci) colVal - val <- onNothing valM $ throw500 $ "column " - <> pgiColumn ci <<> " not found in given colVal" - pgColVal <- parseTxtEncodedPGValue (pgiType ci) val - return (pgiColumn ci, pgColVal) - --- | validate an insert object based on insert columns, --- | insert object relations and additional columns from parent -validateInsert - :: (MonadError QErr m) - => [PGCol] -- ^ inserting columns - -> [RelInfo] -- ^ object relation inserts - -> [PGCol] -- ^ additional fields from parent - -> m () -validateInsert insCols objRels addCols = do - -- validate insertCols - unless (null insConflictCols) $ throwVE $ - "cannot insert " <> showPGCols insConflictCols - <> " columns as their values are already being determined by parent insert" - - forM_ objRels $ \relInfo -> do - let lCols = Map.keys $ riMapping relInfo - relName = riName relInfo - relNameTxt = relNameToTxt relName - lColConflicts = lCols `intersect` (addCols <> insCols) - withPathK relNameTxt $ unless (null lColConflicts) $ throwVE $ - "cannot insert object relation ship " <> relName - <<> " as " <> showPGCols lColConflicts - <> " column values are already determined" - where - insConflictCols = insCols `intersect` addCols - --- | insert an object relationship and return affected rows --- | and parent dependent columns -insertObjRel - :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) - => Env.Environment - -> Bool - -> MutationRemoteJoinCtx - -> RoleName - -> ObjRelIns - -> m (Int, [PGColWithValue]) -insertObjRel env strfyNum rjCtx role objRelIns = - withPathK relNameTxt $ do - (affRows, colValM) <- withPathK "data" $ insertObj env strfyNum rjCtx role tn singleObjIns [] - colVal <- onNothing colValM $ throw400 NotSupported errMsg - retColsWithVals <- fetchFromColVals colVal rColInfos - let c = mergeListsWith (Map.toList mapCols) retColsWithVals - (\(_, rCol) (col, _) -> rCol == col) - (\(lCol, _) (_, cVal) -> (lCol, cVal)) - return (affRows, c) - where - RelIns singleObjIns relInfo = objRelIns - -- multiObjIns = singleToMulti singleObjIns - relName = riName relInfo - relNameTxt = relNameToTxt relName - mapCols = riMapping relInfo - tn = riRTable relInfo - allCols = _aiTableCols singleObjIns - rCols = Map.elems mapCols - rColInfos = getColInfos rCols allCols - errMsg = "cannot proceed to insert object relation " - <> relName <<> " since insert to table " - <> tn <<> " affects zero rows" - -decodeEncJSON :: (J.FromJSON a, QErrM m) => EncJSON -> m a -decodeEncJSON = - either (throw500 . T.pack) decodeValue . - J.eitherDecode . 
--- | insert an array relationship and return affected rows -insertArrRel - :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) - => Env.Environment - -> Bool - -> MutationRemoteJoinCtx - -> RoleName - -> [PGColWithValue] - -> ArrRelIns - -> m Int -insertArrRel env strfyNum rjCtx role resCols arrRelIns = - withPathK relNameTxt $ do - let addCols = mergeListsWith resCols (Map.toList colMapping) - (\(col, _) (lCol, _) -> col == lCol) - (\(_, colVal) (_, rCol) -> (rCol, colVal)) - - resBS <- insertMultipleObjects env strfyNum rjCtx role tn multiObjIns addCols mutOutput "data" - resObj <- decodeEncJSON resBS - onNothing (Map.lookup ("affected_rows" :: T.Text) resObj) $ - throw500 "affected_rows not returned in array rel insert" - where - RelIns multiObjIns relInfo = arrRelIns - colMapping = riMapping relInfo - tn = riRTable relInfo - relNameTxt = relNameToTxt $ riName relInfo - mutOutput = RR.MOutMultirowFields [("affected_rows", RR.MCount)] - --- | insert an object with object and array relationships -insertObj - :: (HasVersion, MonadTx m, MonadIO m, Tracing.MonadTrace m) - => Env.Environment - -> Bool - -> MutationRemoteJoinCtx - -> RoleName - -> QualifiedTable - -> SingleObjIns - -> [PGColWithValue] -- ^ additional fields - -> m (Int, Maybe ColumnValuesText) -insertObj env strfyNum rjCtx role tn singleObjIns addCols = do - -- validate insert - validateInsert (map fst cols) (map _riRelInfo objRels) $ map fst addCols - - -- insert all object relations and fetch this insert dependent column values - objInsRes <- forM objRels $ insertObjRel env strfyNum rjCtx role - - -- prepare final insert columns - let objRelAffRows = sum $ map fst objInsRes - objRelDeterminedCols = concatMap snd objInsRes - finalInsCols = cols <> objRelDeterminedCols <> addCols - - -- prepare insert query as with expression - insCheck <- convAnnBoolExpPartialSQL sessVarFromCurrentSetting insCond - updCheck <- traverse (convAnnBoolExpPartialSQL sessVarFromCurrentSetting) updCond - - CTEExp cte insPArgs <- - mkInsertQ tn onConflictM finalInsCols defVals role (insCheck, updCheck) - - MutateResp affRows colVals <- liftTx $ mutateAndFetchCols tn allCols (cte, insPArgs) strfyNum - colValM <- asSingleObject colVals - - arrRelAffRows <- bool (withArrRels colValM) (return 0) $ null arrRels - let totAffRows = objRelAffRows + affRows + arrRelAffRows - - return (totAffRows, colValM) - where - AnnIns annObj onConflictM (insCond, updCond) allCols defVals = singleObjIns - AnnInsObj cols objRels arrRels = annObj - - arrRelDepCols = flip getColInfos allCols $ - concatMap (Map.keys . riMapping . 
_riRelInfo) arrRels - - withArrRels colValM = do - colVal <- onNothing colValM $ throw400 NotSupported cannotInsArrRelErr - arrDepColsWithVal <- fetchFromColVals colVal arrRelDepCols - arrInsARows <- forM arrRels $ insertArrRel env strfyNum rjCtx role arrDepColsWithVal - return $ sum arrInsARows - - asSingleObject = \case - [] -> pure Nothing - [r] -> pure $ Just r - _ -> throw500 "more than one row returned" - - cannotInsArrRelErr = - "cannot proceed to insert array relations since insert to table " - <> tn <<> " affects zero rows" - - --- | insert multiple objects in Postgres -insertMultipleObjects - :: ( HasVersion - , MonadTx m - , MonadIO m - , Tracing.MonadTrace m - ) - => Env.Environment - -> Bool - -> MutationRemoteJoinCtx - -> RoleName - -> QualifiedTable - -> MultiObjIns - -> [PGColWithValue] -- ^ additional fields - -> RR.MutationOutput - -> T.Text -- ^ error path - -> m EncJSON -insertMultipleObjects env strfyNum rjCtx role tn multiObjIns addCols mutOutput errP = - bool withoutRelsInsert withRelsInsert anyRelsToInsert - where - AnnIns insObjs onConflictM (insCond, updCond) tableColInfos defVals = multiObjIns - singleObjInserts = multiToSingles multiObjIns - insCols = map _aioColumns insObjs - allInsObjRels = concatMap _aioObjRels insObjs - allInsArrRels = concatMap _aioArrRels insObjs - anyRelsToInsert = not $ null allInsArrRels && null allInsObjRels - - withErrPath = withPathK errP - - -- insert all column rows in one go - withoutRelsInsert = withErrPath $ do - indexedForM_ insCols $ \insCol -> - validateInsert (map fst insCol) [] $ map fst addCols - - let withAddCols = flip map insCols $ union addCols - tableCols = Map.keys defVals - - (sqlRows, prepArgs) <- flip runStateT Seq.Empty $ do - rowsWithCol <- mapM toSQLExps withAddCols - return $ map (mkSQLRow defVals) rowsWithCol - - insCheck <- convAnnBoolExpPartialSQL sessVarFromCurrentSetting insCond - updCheck <- traverse (convAnnBoolExpPartialSQL sessVarFromCurrentSetting) updCond - - let insQP1 = RI.InsertQueryP1 tn tableCols sqlRows onConflictM - (insCheck, updCheck) mutOutput tableColInfos - p1 = (insQP1, prepArgs) - RI.execInsertQuery env strfyNum (Just rjCtx) p1 - - -- insert each object with relations - withRelsInsert = withErrPath $ do - insResps <- indexedForM singleObjInserts $ \objIns -> - insertObj env strfyNum rjCtx role tn objIns addCols - - let affRows = sum $ map fst insResps - columnValues = mapMaybe snd insResps - cteExp <- mkSelCTEFromColVals tn tableColInfos columnValues - let (mutOutputRJ, remoteJoins) = getRemoteJoinsMutationOutput mutOutput - sqlQuery = Q.fromBuilder $ toSQL $ - RR.mkMutationOutputExp tn tableColInfos (Just affRows) cteExp mutOutputRJ strfyNum - executeMutationOutputQuery env sqlQuery [] $ (,rjCtx) <$> remoteJoins
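The two strategies above differ observably in the generated SQL. As a hypothetical illustration (the article/author schema is invented, not part of this patch), a mutation such as

  mutation {
    insert_article(objects: [{title: "a"}, {title: "b"}]) {
      affected_rows
    }
  }

carries no nested relationship data, so withoutRelsInsert packs both rows into a single multi-row INSERT. Adding author: {data: {name: "x"}} to an object instead forces withRelsInsert, which inserts each object separately so that the object relationship can be inserted first and its key columns spliced into the parent row.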
withPathK "args" - -convertInsert - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has InsCtxMap r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> MutationRemoteJoinCtx - -> RoleName - -> QualifiedTable -- table - -> Field -- the mutation field - -> m (tx EncJSON) -convertInsert env rjCtx role tn fld = prefixErrPath fld $ do - selSet <- asObjectSelectionSet $ _fSelSet fld - mutOutputUnres <- RR.MOutMultirowFields <$> resolveMutationFields (_fType fld) selSet - mutOutputRes <- RR.traverseMutationOutput resolveValTxt mutOutputUnres - annVals <- withArg arguments "objects" asArray - -- if insert input objects is empty array then - -- do not perform insert and return mutation response - bool (withNonEmptyObjs annVals mutOutputRes) - (withEmptyObjs mutOutputRes) $ null annVals - where - withNonEmptyObjs annVals mutOutput = do - InsCtx tableColMap checkCond defValMap relInfoMap updPerm <- getInsCtx tn - annObjs <- mapM asObject annVals - annInsObjs <- forM annObjs $ mkAnnInsObj relInfoMap tableColMap - conflictClauseM <- forM onConflictM $ parseOnConflict tn updPerm tableColMap - defValMapRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) - defValMap - let multiObjIns = AnnIns annInsObjs conflictClauseM (checkCond, updPerm >>= upfiCheck) - tableCols defValMapRes - tableCols = Map.elems tableColMap - strfyNum <- stringifyNum <$> asks getter - return $ prefixErrPath fld $ insertMultipleObjects env strfyNum rjCtx role tn - multiObjIns [] mutOutput "objects" - withEmptyObjs mutOutput = - return $ return $ buildEmptyMutResp mutOutput - arguments = _fArguments fld - onConflictM = Map.lookup "on_conflict" arguments - -convertInsertOne - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , Has InsCtxMap r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> MutationRemoteJoinCtx - -> RoleName - -> QualifiedTable -- table - -> Field -- the mutation field - -> m (tx EncJSON) -convertInsertOne env rjCtx role qt field = prefixErrPath field $ do - selSet <- asObjectSelectionSet $ _fSelSet field - tableSelFields <- processTableSelectionSet (_fType field) selSet - let mutationOutputUnresolved = RR.MOutSinglerowObject tableSelFields - mutationOutputResolved <- RR.traverseMutationOutput resolveValTxt mutationOutputUnresolved - annInputObj <- withArg arguments "object" asObject - InsCtx tableColMap checkCond defValMap relInfoMap updPerm <- getInsCtx qt - annInsertObj <- mkAnnInsObj relInfoMap tableColMap annInputObj - conflictClauseM <- forM (Map.lookup "on_conflict" arguments) $ parseOnConflict qt updPerm tableColMap - defValMapRes <- mapM (convPartialSQLExp sessVarFromCurrentSetting) defValMap - let multiObjIns = AnnIns [annInsertObj] conflictClauseM (checkCond, updPerm >>= upfiCheck) - tableCols defValMapRes - tableCols = Map.elems tableColMap - strfyNum <- stringifyNum <$> asks getter - pure $ prefixErrPath field $ insertMultipleObjects env strfyNum rjCtx role qt - multiObjIns [] mutationOutputResolved "object" - where - arguments = _fArguments field - --- helper functions -getInsCtx - :: (MonadError QErr m, MonadReader r m, Has InsCtxMap r) - => QualifiedTable -> m InsCtx -getInsCtx tn = do - ctxMap <- asks getter - insCtx <- onNothing (Map.lookup tn ctxMap) $ - throw500 $ "table " <> tn <<> " not found" - let defValMap = fmap PSESQLExp $ 
S.mkColDefValMap $ map pgiColumn $ - Map.elems $ icAllCols insCtx - setCols = icSet insCtx - return $ insCtx {icSet = Map.union setCols defValMap} - -mergeListsWith - :: [a] -> [b] -> (a -> b -> Bool) -> (a -> b -> c) -> [c] -mergeListsWith _ [] _ _ = [] -mergeListsWith [] _ _ _ = [] -mergeListsWith (x:xs) l b f = case find (b x) l of - Nothing -> mergeListsWith xs l b f - Just y -> f x y : mergeListsWith xs l b f diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Introspect.hs b/server/src-lib/Hasura/GraphQL/Resolve/Introspect.hs deleted file mode 100644 index 75627cf9077..00000000000 --- a/server/src-lib/Hasura/GraphQL/Resolve/Introspect.hs +++ /dev/null @@ -1,424 +0,0 @@ -module Hasura.GraphQL.Resolve.Introspect - ( schemaR - , typeR - ) where - -import Data.Has -import Hasura.Prelude - -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Data.Text as T -import qualified Hasura.SQL.Types as S -import qualified Hasura.SQL.Value as S -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Context -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Resolve.InputValue -import Hasura.GraphQL.Validate.Context -import Hasura.GraphQL.Validate.InputValue -import Hasura.GraphQL.Validate.SelectionSet -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.Types - -data TypeKind - = TKSCALAR - | TKOBJECT - | TKINTERFACE - | TKUNION - | TKENUM - | TKINPUT_OBJECT - | TKLIST - | TKNON_NULL - deriving (Show, Eq) - -instance J.ToJSON TypeKind where - toJSON = J.toJSON . T.pack . drop 2 . show - -withSubFields - :: (MonadError QErr m) - => SelectionSet - -> (Field -> m J.Value) - -> m J.Object -withSubFields selSet fn = do - objectSelectionSet <- asObjectSelectionSet selSet - Map.fromList <$> traverseObjectSelectionSet objectSelectionSet fn - -- val <- fn fld - -- return (G.unName $ G.unAlias $ _fAlias fld, val) - -namedTyToTxt :: G.NamedType -> Text -namedTyToTxt = G.unName . G.unNamedType - -retJ :: (Applicative m, J.ToJSON a) => a -> m J.Value -retJ = pure . J.toJSON - -retJT :: (Applicative m) => Text -> m J.Value -retJT = pure . 
J.toJSON - --- 4.5.2.1 -scalarR - :: (MonadReusability m, MonadError QErr m) - => ScalarTyInfo - -> Field - -> m J.Object -scalarR (ScalarTyInfo descM name _ _) fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKSCALAR - "description" -> retJ $ fmap G.unDescription descM - "name" -> retJ name - _ -> return J.Null - --- 4.5.2.2 -objectTypeR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => ObjTyInfo - -> Field - -> m J.Object -objectTypeR objectType fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKOBJECT - "name" -> retJ $ namedTyToTxt n - "description" -> retJ $ fmap G.unDescription descM - "interfaces" -> fmap J.toJSON $ mapM (`ifaceR` subFld) $ Set.toList iFaces - "fields" -> fmap J.toJSON $ mapM (`fieldR` subFld) $ - sortOn _fiName $ - filter notBuiltinFld $ Map.elems flds - _ -> return J.Null - where - descM = _otiDesc objectType - n = _otiName objectType - iFaces = _otiImplIFaces objectType - flds = _otiFields objectType - -notBuiltinFld :: ObjFldInfo -> Bool -notBuiltinFld f = - fldName /= "__typename" && fldName /= "__type" && fldName /= "__schema" - where - fldName = _fiName f - -getImplTypes :: (MonadReader t m, Has TypeMap t) => AsObjType -> m [ObjTyInfo] -getImplTypes aot = do - tyInfo <- asks getter - return $ sortOn _otiName $ - Map.elems $ getPossibleObjTypes tyInfo aot - --- 4.5.2.3 -unionR - :: (MonadReader t m, MonadError QErr m, Has TypeMap t, MonadReusability m) - => UnionTyInfo -> Field -> m J.Object -unionR u@(UnionTyInfo descM n _) fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKUNION - "name" -> retJ $ namedTyToTxt n - "description" -> retJ $ fmap G.unDescription descM - "possibleTypes" -> fmap J.toJSON $ - mapM (`objectTypeR` subFld) =<< getImplTypes (AOTUnion u) - _ -> return J.Null - --- 4.5.2.4 -ifaceR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => G.NamedType - -> Field - -> m J.Object -ifaceR n fld = do - tyInfo <- getTyInfo n - case tyInfo of - TIIFace ifaceTyInfo -> ifaceR' ifaceTyInfo fld - _ -> throw500 $ "Unknown interface " <> showNamedTy n - -ifaceR' - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => IFaceTyInfo - -> Field - -> m J.Object -ifaceR' ifaceTyInfo fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKINTERFACE - "name" -> retJ $ namedTyToTxt name - "description" -> retJ $ fmap G.unDescription maybeDescription - "fields" -> fmap J.toJSON $ mapM (`fieldR` subFld) $ - sortOn _fiName $ - filter notBuiltinFld $ Map.elems fields - "possibleTypes" -> fmap J.toJSON $ mapM (`objectTypeR` subFld) - =<< getImplTypes (AOTIFace ifaceTyInfo) - _ -> return J.Null - where - maybeDescription = _ifDesc ifaceTyInfo - name = _ifName ifaceTyInfo - fields = _ifFields ifaceTyInfo - --- 4.5.2.5 -enumTypeR - :: ( Monad m, MonadReusability m, MonadError QErr m ) - => EnumTyInfo - -> Field - -> m J.Object -enumTypeR (EnumTyInfo descM n vals _) fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKENUM - "name" -> 
retJ $ namedTyToTxt n - "description" -> retJ $ fmap G.unDescription descM - "enumValues" -> do - includeDeprecated <- readIncludeDeprecated subFld - fmap J.toJSON $ - mapM (enumValueR subFld) $ - filter (\val -> includeDeprecated || not (_eviIsDeprecated val)) $ - sortOn _eviVal $ - Map.elems (normalizeEnumValues vals) - _ -> return J.Null - -readIncludeDeprecated - :: ( Monad m, MonadReusability m, MonadError QErr m ) - => Field - -> m Bool -readIncludeDeprecated subFld = do - let argM = Map.lookup "includeDeprecated" (_fArguments subFld) - case argM of - Nothing -> pure False - Just arg -> asScalarVal arg S.PGBoolean >>= \case - S.PGValBoolean b -> pure b - _ -> throw500 "unexpected non-Boolean argument for includeDeprecated" - -{- Note [Reusability of introspection queries with variables] -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Introspection queries can have variables, too, in particular to influence one of -two arguments: the @name@ argument of the @__type@ field, and the -@includeDeprecated@ argument of the @fields@ and @enumValues@ fields. The -current code does not cache all introspection queries with variables correctly. -As a workaround to this, whenever a variable is passed to an @includeDeprecated@ -argument, we mark the query as unreusable. This is the purpose of -'dummyReadIncludeDeprecated'. - -Now @fields@ and @enumValues@ are intended to be used when introspecting, -respectively [object and interface types] and enum types. However, it does not -suffice to only call 'dummyReadIncludeDeprecated' for such types, since @fields@ -and @enumValues@ are valid GraphQL fields regardless of what type we are looking -at. So precisely because @__Type@ is _thought of_ as a union, but _not -actually_ a union, we need to call 'dummyReadIncludeDeprecated' in all cases. - -See also issue #4547. --}
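As a concrete illustration of the note above (a hypothetical query, and the type name "order_by" is arbitrary): both selections below are syntactically valid on __Type no matter which kind of type the lookup returns, so a variable reaching either includeDeprecated argument must mark the whole query as non-reusable in every resolver, not just in the enum and object/interface cases:

  query ($includeDeprecated: Boolean) {
    __type(name: "order_by") {
      fields(includeDeprecated: $includeDeprecated) { name }
      enumValues(includeDeprecated: $includeDeprecated) { name }
    }
  }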
-dummyReadIncludeDeprecated - :: ( Monad m, MonadReusability m, MonadError QErr m ) - => Field - -> m () -dummyReadIncludeDeprecated fld = do - selSet <- unAliasedFields . unObjectSelectionSet - <$> asObjectSelectionSet (_fSelSet fld) - forM_ (toList selSet) $ \subFld -> - case _fName subFld of - "fields" -> readIncludeDeprecated subFld - "enumValues" -> readIncludeDeprecated subFld - _ -> return False - - --- 4.5.2.6 -inputObjR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => InpObjTyInfo - -> Field - -> m J.Object -inputObjR (InpObjTyInfo descM nt flds _) fld = do - dummyReadIncludeDeprecated fld - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKINPUT_OBJECT - "name" -> retJ $ namedTyToTxt nt - "description" -> retJ $ fmap G.unDescription descM - "inputFields" -> fmap J.toJSON $ mapM (inputValueR subFld) $ - sortOn _iviName $ Map.elems flds - _ -> return J.Null - --- 4.5.2.7 -listTypeR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => G.ListType -> Field -> m J.Object -listTypeR (G.ListType ty) fld = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKLIST - "ofType" -> J.toJSON <$> gtypeR ty subFld - _ -> return J.Null - --- 4.5.2.8 -nonNullR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => G.GType -> Field -> m J.Object -nonNullR gTyp fld = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Type" - "kind" -> retJ TKNON_NULL - "ofType" -> case gTyp of - G.TypeNamed (G.Nullability False) nt -> J.toJSON <$> namedTypeR nt subFld - G.TypeList (G.Nullability False) lt -> J.toJSON <$> listTypeR lt subFld - _ -> throw500 "nullable type passed to nonNullR" - _ -> return J.Null - -namedTypeR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => G.NamedType - -> Field - -> m J.Object -namedTypeR nt fld = do - tyInfo <- getTyInfo nt - namedTypeR' fld tyInfo - -namedTypeR' - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => Field - -> TypeInfo - -> m J.Object -namedTypeR' fld tyInfo = do - -- Now fetch the required type information from the corresponding - -- information generator - case tyInfo of - TIScalar colTy -> scalarR colTy fld - TIObj objTyInfo -> objectTypeR objTyInfo fld - TIEnum enumTypeInfo -> enumTypeR enumTypeInfo fld - TIInpObj inpObjTyInfo -> inputObjR inpObjTyInfo fld - TIIFace iFaceTyInfo -> ifaceR' iFaceTyInfo fld - TIUnion unionTyInfo -> unionR unionTyInfo fld - --- 4.5.3 -fieldR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => ObjFldInfo -> Field -> m J.Object -fieldR (ObjFldInfo descM n params ty _) fld = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Field" - "name" -> retJ $ G.unName n - "description" -> retJ $ fmap G.unDescription descM - "args" -> fmap J.toJSON $ mapM (inputValueR subFld) $ - sortOn _iviName $ Map.elems params - "type" -> J.toJSON <$> gtypeR ty subFld - "isDeprecated" -> retJ False - _ -> return J.Null - --- 4.5.4 -inputValueR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => Field -> InpValInfo -> m J.Object -inputValueR fld (InpValInfo descM n defM ty) = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__InputValue" - "name" -> retJ $ G.unName n - "description" -> retJ $ fmap G.unDescription descM - "type" -> J.toJSON <$> gtypeR ty subFld - -- TODO: figure out what the spec means by 'string encoding' - 
"defaultValue" -> retJ $ pPrintValueC <$> defM - _ -> return J.Null - --- 4.5.5 -enumValueR - :: (MonadError QErr m) - => Field -> EnumValInfo -> m J.Object -enumValueR fld (EnumValInfo descM enumVal isDeprecated) = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__EnumValue" - "name" -> retJ $ G.unName $ G.unEnumValue enumVal - "description" -> retJ $ fmap G.unDescription descM - "isDeprecated" -> retJ isDeprecated - _ -> return J.Null - --- 4.5.6 -directiveR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => Field -> DirectiveInfo -> m J.Object -directiveR fld (DirectiveInfo descM n args locs) = - withSubFields (_fSelSet fld) $ \subFld -> - case _fName subFld of - "__typename" -> retJT "__Directive" - "name" -> retJ $ G.unName n - "description" -> retJ $ fmap G.unDescription descM - "locations" -> retJ $ map showDirLoc locs - "args" -> fmap J.toJSON $ mapM (inputValueR subFld) $ - sortOn _iviName $ Map.elems args - _ -> return J.Null - -showDirLoc :: G.DirectiveLocation -> Text -showDirLoc = \case - G.DLExecutable edl -> T.pack $ drop 3 $ show edl - G.DLTypeSystem tsdl -> T.pack $ drop 4 $ show tsdl - -gtypeR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => G.GType -> Field -> m J.Object -gtypeR ty fld = - case ty of - G.TypeList (G.Nullability True) lt -> listTypeR lt fld - G.TypeList (G.Nullability False) _ -> nonNullR ty fld - G.TypeNamed (G.Nullability True) nt -> namedTypeR nt fld - G.TypeNamed (G.Nullability False) _ -> nonNullR ty fld - -schemaR - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m, MonadReusability m) - => Field -> m J.Object -schemaR fld = - withSubFields (_fSelSet fld) $ \subFld -> do - (tyMap :: TypeMap) <- asks getter - case _fName subFld of - "__typename" -> retJT "__Schema" - "types" -> fmap J.toJSON $ mapM (namedTypeR' subFld) $ - sortOn getNamedTy $ Map.elems tyMap - "queryType" -> J.toJSON <$> namedTypeR queryRootNamedType subFld - "mutationType" -> typeR' mutationRootNamedType subFld - "subscriptionType" -> typeR' subscriptionRootNamedType subFld - "directives" -> J.toJSON <$> mapM (directiveR subFld) - (sortOn _diName defaultDirectives) - _ -> return J.Null - -typeR - :: (MonadReusability m, MonadError QErr m, MonadReader r m, Has TypeMap r) - => Field -> m J.Value -typeR fld = do - name <- asPGColText =<< getArg args "name" - typeR' (G.NamedType $ G.Name name) fld - where - args = _fArguments fld - -typeR' - :: (MonadReader r m, Has TypeMap r, MonadError QErr m, MonadReusability m) - => G.NamedType -> Field -> m J.Value -typeR' n fld = do - tyMap <- asks getter - case Map.lookup n tyMap of - Nothing -> return J.Null - Just tyInfo -> J.Object <$> namedTypeR' fld tyInfo diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Mutation.hs b/server/src-lib/Hasura/GraphQL/Resolve/Mutation.hs deleted file mode 100644 index b06b81e3812..00000000000 --- a/server/src-lib/Hasura/GraphQL/Resolve/Mutation.hs +++ /dev/null @@ -1,411 +0,0 @@ -{-# LANGUAGE ScopedTypeVariables #-} - -module Hasura.GraphQL.Resolve.Mutation - ( convertUpdate - , convertUpdateByPk - , convertDelete - , convertDeleteByPk - , resolveMutationFields - , buildEmptyMutResp - ) where - -import Data.Has -import Hasura.Prelude - -import qualified Control.Monad.Validate as MV -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd as OMap -import qualified Data.HashMap.Strict.InsOrd.Extended as OMap -import 
qualified Data.Sequence.NonEmpty as NESeq -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import qualified Hasura.RQL.DML.Delete as RD -import qualified Hasura.RQL.DML.Returning as RR -import qualified Hasura.RQL.DML.Update as RU - -import qualified Hasura.RQL.DML.Select as RS -import qualified Hasura.SQL.DML as S -import qualified Data.Environment as Env -import qualified Hasura.Tracing as Tracing - -import Hasura.EncJSON -import Hasura.GraphQL.Resolve.BoolExp -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Resolve.InputValue -import Hasura.GraphQL.Resolve.Select (processTableSelectionSet) -import Hasura.GraphQL.Validate.SelectionSet -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.DML.Internal (currentSession, sessVarFromCurrentSetting) -import Hasura.RQL.DML.Mutation (MutationRemoteJoinCtx) -import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) -import Hasura.SQL.Types -import Hasura.SQL.Value - -resolveMutationFields - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => G.NamedType -> ObjectSelectionSet -> m (RR.MutFldsG UnresolvedVal) -resolveMutationFields ty selSet = fmap (map (first FieldName)) $ - traverseObjectSelectionSet selSet $ \fld -> case _fName fld of - "__typename" -> return $ RR.MExp $ G.unName $ G.unNamedType ty - "affected_rows" -> return RR.MCount - "returning" -> do - annFlds <- asObjectSelectionSet (_fSelSet fld) - >>= processTableSelectionSet (_fType fld) - annFldsResolved <- traverse - (traverse (RS.traverseAnnField convertPGValueToTextValue)) annFlds - return $ RR.MRet annFldsResolved - G.Name t -> throw500 $ "unexpected field in mutation resp : " <> t - where - convertPGValueToTextValue = \case - UVPG annPGVal -> UVSQL <$> txtConverter annPGVal - UVSessVar colTy sessVar -> pure $ UVSessVar colTy sessVar - UVSQL sqlExp -> pure $ UVSQL sqlExp - UVSession -> pure UVSession - -convertRowObj - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap - -> AnnInpVal - -> m [(PGCol, UnresolvedVal)] -convertRowObj colGNameMap val = - flip withObject val $ \_ obj -> - forM (OMap.toList obj) $ \(k, v) -> do - prepExpM <- fmap mkParameterizablePGValue <$> asPGColumnValueM v - pgCol <- pgiColumn <$> resolvePGCol colGNameMap k - let prepExp = fromMaybe (UVSQL S.SENull) prepExpM - return (pgCol, prepExp) - -type ApplySQLOp = (PGCol, S.SQLExp) -> S.SQLExp - --- SET x = x -rhsExpOp :: S.SQLOp -> S.TypeAnn -> ApplySQLOp -rhsExpOp op annTy (col, e) = - S.mkSQLOpExp op (S.SEIden $ toIden col) annExp - where - annExp = S.SETyAnn e annTy - -lhsExpOp :: S.SQLOp -> S.TypeAnn -> ApplySQLOp -lhsExpOp op annTy (col, e) = - S.mkSQLOpExp op annExp $ S.SEIden $ toIden col - where - annExp = S.SETyAnn e annTy - --- Automatically generate type annotation by looking up the column name -typedRhsExpOp :: S.SQLOp -> S.TypeAnn -> PGColGNameMap -> ApplySQLOp -typedRhsExpOp op defaultAnnTy colGNameMap (colName, e) = - let annTypeM :: Maybe S.TypeAnn - annTypeM = do - fieldType <- pgiType <$> Map.lookup (G.Name $ getPGColTxt colName) colGNameMap - case fieldType of - PGColumnScalar x -> return $ S.mkTypeAnn $ PGTypeScalar x - _ -> Nothing - annType :: S.TypeAnn - annType = fromMaybe defaultAnnTy annTypeM - in rhsExpOp op annType (colName, e) - -convObjWithOp - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap -> ApplySQLOp -> AnnInpVal -> m [(PGCol, UnresolvedVal)] -convObjWithOp colGNameMap opFn val = - flip withObject val $ \_ obj -> forM 
(OMap.toList obj) $ \(k, v) -> do - colVal <- openOpaqueValue =<< asPGColumnValue v - pgCol <- pgiColumn <$> resolvePGCol colGNameMap k - -- TODO: why are we using txtEncoder here? - let encVal = txtEncoder $ pstValue $ _apvValue colVal - sqlExp = opFn (pgCol, encVal) - return (pgCol, UVSQL sqlExp) - -convDeleteAtPathObj - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap -> AnnInpVal -> m [(PGCol, UnresolvedVal)] -convDeleteAtPathObj colGNameMap val = - flip withObject val $ \_ obj -> forM (OMap.toList obj) $ \(k, v) -> do - vals <- traverse (openOpaqueValue <=< asPGColumnValue) =<< asArray v - pgCol <- pgiColumn <$> resolvePGCol colGNameMap k - let valExps = map (txtEncoder . pstValue . _apvValue) vals - annEncVal = S.SETyAnn (S.SEArray valExps) S.textArrTypeAnn - sqlExp = S.SEOpApp S.jsonbDeleteAtPathOp - [S.SEIden $ toIden pgCol, annEncVal] - return (pgCol, UVSQL sqlExp) - -convertUpdateP1 - :: forall m . (MonadReusability m, MonadError QErr m) - => UpdOpCtx -- the update context - -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool expression parser - -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver - -> Field -- the mutation field - -> m (RU.AnnUpdG UnresolvedVal) -convertUpdateP1 opCtx boolExpParser selectionResolver fld = do - -- a set expression is the same as a row object - setExpM <- resolveUpdateOperator "_set" $ convertRowObj colGNameMap - -- where bool expression to filter rows - whereExp <- boolExpParser args - -- increment operator on integer columns - incExpM <- resolveUpdateOperator "_inc" $ - convObjWithOp' $ typedRhsExpOp S.incOp S.numericTypeAnn colGNameMap - -- append jsonb value - appendExpM <- resolveUpdateOperator "_append" $ - convObjWithOp' $ rhsExpOp S.jsonbConcatOp S.jsonbTypeAnn - -- prepend jsonb value - prependExpM <- resolveUpdateOperator "_prepend" $ - convObjWithOp' $ lhsExpOp S.jsonbConcatOp S.jsonbTypeAnn - -- delete a key in jsonb object - deleteKeyExpM <- resolveUpdateOperator "_delete_key" $ - convObjWithOp' $ rhsExpOp S.jsonbDeleteOp S.textTypeAnn - -- delete an element in jsonb array - deleteElemExpM <- resolveUpdateOperator "_delete_elem" $ - convObjWithOp' $ rhsExpOp S.jsonbDeleteOp S.intTypeAnn - -- delete at path in jsonb value - deleteAtPathExpM <- resolveUpdateOperator "_delete_at_path" $ - convDeleteAtPathObj colGNameMap - - updateItems <- combineUpdateExpressions - [ setExpM, incExpM, appendExpM, prependExpM - , deleteKeyExpM, deleteElemExpM, deleteAtPathExpM - ] - - mutOutput <- selectionResolver fld - - pure $ RU.AnnUpd tn updateItems (unresolvedPermFilter, whereExp) unresolvedPermCheck mutOutput allCols - where - convObjWithOp' = convObjWithOp colGNameMap - allCols = Map.elems colGNameMap - UpdOpCtx tn _ colGNameMap filterExp checkExpr preSetCols = opCtx - args = _fArguments fld - resolvedPreSetItems = Map.toList $ fmap partialSQLExpToUnresolvedVal preSetCols - unresolvedPermFilter = fmapAnnBoolExp partialSQLExpToUnresolvedVal filterExp - unresolvedPermCheck = maybe annBoolExpTrue (fmapAnnBoolExp partialSQLExpToUnresolvedVal) checkExpr - - resolveUpdateOperator operator resolveAction = - (operator,) <$> withArgM args operator resolveAction - - combineUpdateExpressions :: [(G.Name, Maybe [(PGCol, UnresolvedVal)])] - -> m [(PGCol, UnresolvedVal)] - combineUpdateExpressions updateExps = do - let allOperatorNames = map fst updateExps - updateItems :: [(G.Name, [(PGCol, UnresolvedVal)])] - updateItems = mapMaybe (\(op, itemsM) -> (op,) <$> itemsM) updateExps - -- At least one operator is expected, or the preset expressions shouldn't be empty - if null updateItems && null resolvedPreSetItems then - throwVE $ "at least one of " <> showNames allOperatorNames <> " is expected" - else do - let itemsWithOps :: [(PGCol, (G.Name, UnresolvedVal))] - itemsWithOps = concatMap (\(op, items) -> map (second (op,)) items) updateItems - validateMultiOps col items = do - when (length items > 1) $ MV.dispute [(col, map fst $ toList items)] - pure $ snd $ NESeq.head items - eitherResult :: Either - [(PGCol, [G.Name])] - (OMap.InsOrdHashMap PGCol UnresolvedVal) - eitherResult = MV.runValidate $ OMap.traverseWithKey validateMultiOps $ - OMap.groupTuples itemsWithOps - case eitherResult of - -- A column shouldn't be present in more than one operator; if it were, - -- the generated UPDATE statement would fail with an unexpected query error - Left columnsWithMultiOps -> throwVE $ - "column found in multiple operators; " - <> T.intercalate ". " - (map (\(col, ops) -> col <<> " in " <> showNames ops) - columnsWithMultiOps) - Right items -> pure $ resolvedPreSetItems <> OMap.toList items
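To make the multi-operator check tangible, here is a hypothetical mutation (table and column names invented) that combineUpdateExpressions rejects, because the column score is claimed by two operators at once:

  mutation {
    update_article(
      where: { id: { _eq: 1 } }
      _set: { score: 0 }
      _inc: { score: 1 }
    ) {
      affected_rows
    }
  }

The validation above collects score under both _set and _inc, MV.dispute records the collision, and the request fails with a "column found in multiple operators" error instead of producing an ambiguous UPDATE.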
-convertUpdateGeneric - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> UpdOpCtx -- the update context - -> MutationRemoteJoinCtx - -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool exp parser - -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver - -> Field - -> m (tx EncJSON) -convertUpdateGeneric env opCtx rjCtx boolExpParser selectionResolver fld = do - annUpdUnresolved <- convertUpdateP1 opCtx boolExpParser selectionResolver fld - (annUpdResolved, prepArgs) <- withPrepArgs $ RU.traverseAnnUpd - resolveValPrep annUpdUnresolved - strfyNum <- stringifyNum <$> asks getter - let whenNonEmptyItems = return $ RU.execUpdateQuery env strfyNum - (Just rjCtx) (annUpdResolved, prepArgs) - whenEmptyItems = return $ return $ - buildEmptyMutResp $ RU.uqp1Output annUpdResolved - -- if there are no set items then do not perform the - -- update and return an empty mutation response - bool whenNonEmptyItems whenEmptyItems $ null $ RU.uqp1SetExps annUpdResolved - -convertUpdate - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> UpdOpCtx -- the update context - -> MutationRemoteJoinCtx - -> Field -- the mutation field - -> m (tx EncJSON) -convertUpdate env opCtx rjCtx = - convertUpdateGeneric env opCtx rjCtx whereExpressionParser mutationFieldsResolver - -convertUpdateByPk - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> UpdOpCtx -- the update context - -> MutationRemoteJoinCtx - -> Field -- the mutation field - -> m (tx EncJSON) -convertUpdateByPk env opCtx rjCtx field = - convertUpdateGeneric env opCtx rjCtx boolExpParser tableSelectionAsMutationOutput field - where - boolExpParser args = withArg args "pk_columns" $ \inpVal -> do - obj <- asObject inpVal - pgColValToBoolExp (_uocAllCols opCtx) $ Map.fromList $ OMap.toList obj - - -convertDeleteGeneric - :: ( HasVersion, MonadReusability m - , MonadReader r m - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> DelOpCtx -- the delete context 
- -> MutationRemoteJoinCtx - -> (ArgsMap -> m AnnBoolExpUnresolved) -- the bool exp parser - -> (Field -> m (RR.MutationOutputG UnresolvedVal)) -- the selection set resolver - -> Field -- the mutation field - -> m (tx EncJSON) -convertDeleteGeneric env opCtx rjCtx boolExpParser selectionResolver fld = do - whereExp <- boolExpParser $ _fArguments fld - mutOutput <- selectionResolver fld - let unresolvedPermFltr = - fmapAnnBoolExp partialSQLExpToUnresolvedVal filterExp - annDelUnresolved = RD.AnnDel tn (unresolvedPermFltr, whereExp) - mutOutput allCols - (annDelResolved, prepArgs) <- withPrepArgs $ RD.traverseAnnDel - resolveValPrep annDelUnresolved - strfyNum <- stringifyNum <$> asks getter - return $ RD.execDeleteQuery env strfyNum (Just rjCtx) (annDelResolved, prepArgs) - where - DelOpCtx tn _ colGNameMap filterExp = opCtx - allCols = Map.elems colGNameMap - -convertDelete - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> DelOpCtx -- the delete context - -> MutationRemoteJoinCtx - -> Field -- the mutation field - -> m (tx EncJSON) -convertDelete env opCtx rjCtx = - convertDeleteGeneric env opCtx rjCtx whereExpressionParser mutationFieldsResolver - -convertDeleteByPk - :: ( HasVersion - , MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - , MonadIO tx - , MonadTx tx - , Tracing.MonadTrace tx - ) - => Env.Environment - -> DelOpCtx -- the delete context - -> MutationRemoteJoinCtx - -> Field -- the mutation field - -> m (tx EncJSON) -convertDeleteByPk env opCtx rjCtx field = - convertDeleteGeneric env opCtx rjCtx boolExpParser tableSelectionAsMutationOutput field - where - boolExpParser = pgColValToBoolExp (_docAllCols opCtx) - -whereExpressionParser - :: ( MonadReusability m, MonadError QErr m - , MonadReader r m, Has FieldMap r - ) - => ArgsMap -> m AnnBoolExpUnresolved -whereExpressionParser args = withArg args "where" parseBoolExp - -mutationFieldsResolver - :: ( MonadReusability m, MonadError QErr m - , MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => Field -> m (RR.MutationOutputG UnresolvedVal) -mutationFieldsResolver field = do - asObjectSelectionSet (_fSelSet field) >>= \selSet -> - RR.MOutMultirowFields <$> resolveMutationFields (_fType field) selSet - -tableSelectionAsMutationOutput - :: ( MonadReusability m, MonadError QErr m - , MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => Field -> m (RR.MutationOutputG UnresolvedVal) -tableSelectionAsMutationOutput field = - asObjectSelectionSet (_fSelSet field) >>= \selSet -> - RR.MOutSinglerowObject <$> processTableSelectionSet (_fType field) selSet - --- | build an empty mutation response, used when there is nothing to mutate -buildEmptyMutResp :: RR.MutationOutput -> EncJSON -buildEmptyMutResp = mkTx - where - mkTx = \case - RR.MOutMultirowFields mutFlds -> encJFromJValue $ OMap.fromList $ map (second convMutFld) mutFlds - RR.MOutSinglerowObject _ -> encJFromJValue $ J.Object mempty - -- generate empty mutation response - convMutFld = \case - RR.MCount -> J.toJSON (0 :: Int) - RR.MExp e -> J.toJSON e - RR.MRet _ -> J.toJSON ([] :: [J.Value])
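For intuition, assuming the client selected affected_rows and returning (hypothetical shapes, not taken from the test suite), the empty responses built here are:

  {"affected_rows": 0, "returning": []}   -- RR.MOutMultirowFields
  {}                                      -- RR.MOutSinglerowObject

so the client sees the same shape it asked for, with zero rows touched; an MExp field such as __typename still echoes its literal value.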
-resolveValPrep - :: (MonadState PrepArgs m) - => UnresolvedVal -> m S.SQLExp -resolveValPrep = \case - UVPG annPGVal -> prepare annPGVal - UVSessVar colTy sessVar -> sessVarFromCurrentSetting colTy sessVar - UVSQL sqlExp -> pure sqlExp - UVSession -> pure currentSession diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Select.hs b/server/src-lib/Hasura/GraphQL/Resolve/Select.hs deleted file mode 100644 index 2c54ac03cca..00000000000 --- a/server/src-lib/Hasura/GraphQL/Resolve/Select.hs +++ /dev/null @@ -1,833 +0,0 @@ -module Hasura.GraphQL.Resolve.Select - ( convertSelect - , convertConnectionSelect - , convertConnectionFuncQuery - , convertSelectByPKey - , convertAggSelect - , convertFuncQuerySimple - , convertFuncQueryAgg - , parseColumns - , processTableSelectionSet - , resolveNodeId - , convertNodeSelect - , AnnSimpleSelect - ) where - -import Control.Lens (to, (^..), (^?), _2) -import Data.Has -import Data.Parser.JSONPath -import Hasura.Prelude - -import qualified Data.Aeson as J -import qualified Data.Aeson.Extended as J -import qualified Data.Aeson.Internal as J -import qualified Data.ByteString.Lazy as BL -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd as OMap -import qualified Data.List.NonEmpty as NE -import qualified Data.Sequence as Seq -import qualified Data.Sequence.NonEmpty as NESeq -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import qualified Hasura.RQL.DML.Select as RS -import qualified Hasura.SQL.DML as S - -import Hasura.GraphQL.Resolve.BoolExp -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Resolve.InputValue -import Hasura.GraphQL.Schema (isAggregateField) -import Hasura.GraphQL.Schema.Common (mkTableTy) -import Hasura.GraphQL.Validate -import Hasura.GraphQL.Validate.SelectionSet -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.DML.Internal (onlyPositiveInt) -import Hasura.RQL.Types -import Hasura.Server.Utils -import Hasura.SQL.Types -import Hasura.SQL.Value - -jsonPathToColExp :: (MonadError QErr m) => T.Text -> m (Maybe S.SQLExp) -jsonPathToColExp t = case parseJSONPath t of - Left s -> throw400 ParseFailed $ T.pack $ "parse json path error: " ++ s - Right [] -> return Nothing - Right jPaths -> return $ Just $ S.SEArray $ map elToColExp jPaths - where - elToColExp (Key k) = S.SELit k - elToColExp (Index i) = S.SELit $ T.pack (show i) - - -argsToColumnOp :: (MonadReusability m, MonadError QErr m) => ArgsMap -> m (Maybe RS.ColumnOp) -argsToColumnOp args = case Map.lookup "path" args of - Nothing -> return Nothing - Just txt -> do - mColTxt <- asPGColTextM txt - mColExps <- maybe (return Nothing) jsonPathToColExp mColTxt - pure $ RS.ColumnOp S.jsonbPathOp <$> mColExps - -type AnnFields = RS.AnnFieldsG UnresolvedVal - -resolveComputedField - :: ( MonadReusability m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r, MonadError QErr m - ) - => ComputedField -> Field -> m (RS.ComputedFieldSelect UnresolvedVal) -resolveComputedField computedField fld = fieldAsPath fld $ do - funcArgs <- parseFunctionArgs argSeq argFn $ Map.lookup "args" $ _fArguments fld - let argsWithTableArgument = withTableAndSessionArgument funcArgs - case fieldType of - CFTScalar scalarTy -> do - colOpM <- argsToColumnOp $ _fArguments fld - pure $ RS.CFSScalar $ - RS.ComputedFieldScalarSelect qf argsWithTableArgument scalarTy colOpM - CFTTable (ComputedFieldTable _ cols permFilter permLimit) -> do - let functionFrom = RS.FromFunction qf argsWithTableArgument Nothing - RS.CFSTable RS.JASMultipleRows <$> fromField functionFrom cols permFilter permLimit fld - where - ComputedField _ function argSeq fieldType = computedField - ComputedFieldFunction qf _ tableArg sessionArg _ = function - argFn :: 
FunctionArgItem -> InputFunctionArgument - argFn = IFAUnknown - withTableAndSessionArgument :: RS.FunctionArgsExpG UnresolvedVal - -> RS.FunctionArgsExpTableRow UnresolvedVal - withTableAndSessionArgument resolvedArgs = - let argsExp@(RS.FunctionArgsExp positional named) = RS.AEInput <$> resolvedArgs - tableRowArg = RS.AETableRow Nothing - withTable = case tableArg of - FTAFirst -> - RS.FunctionArgsExp (tableRowArg:positional) named - FTANamed argName index -> - RS.insertFunctionArg argName index tableRowArg argsExp - sessionArgVal = RS.AESession UVSession - alsoWithSession = case sessionArg of - Nothing -> withTable - Just (FunctionSessionArgument argName index) -> - RS.insertFunctionArg argName index sessionArgVal withTable - in alsoWithSession - -processTableSelectionSet - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => G.NamedType -> ObjectSelectionSet -> m AnnFields -processTableSelectionSet fldTy flds = - fmap (map (\(a, b) -> (FieldName a, b))) $ traverseObjectSelectionSet flds $ \fld -> do - let fldName = _fName fld - case fldName of - "__typename" -> return $ RS.AFExpression $ G.unName $ G.unNamedType fldTy - _ -> do - fldInfo <- getFldInfo fldTy fldName - case fldInfo of - RFNodeId tn pkeys -> pure $ RS.AFNodeId tn pkeys - RFPGColumn colInfo -> - RS.mkAnnColumnField colInfo <$> argsToColumnOp (_fArguments fld) - RFComputedField computedField -> - RS.AFComputedField <$> resolveComputedField computedField fld - RFRelationship (RelationshipField relInfo fieldKind colGNameMap tableFilter tableLimit) -> do - let relTN = riRTable relInfo - colMapping = riMapping relInfo - rn = riName relInfo - case fieldKind of - RFKSimple -> - case riType relInfo of - ObjRel -> do - annFields <- asObjectSelectionSet (_fSelSet fld) - >>= processTableSelectionSet (_fType fld) - pure $ RS.AFObjectRelation $ RS.AnnRelationSelectG rn colMapping $ - RS.AnnObjectSelectG annFields relTN $ - fmapAnnBoolExp partialSQLExpToUnresolvedVal tableFilter - ArrRel -> do - annSel <- fromField (RS.FromTable relTN) colGNameMap tableFilter tableLimit fld - pure $ RS.AFArrayRelation $ RS.ASSimple $ - RS.AnnRelationSelectG rn colMapping annSel - RFKAggregate -> do - aggSel <- fromAggField (RS.FromTable relTN) colGNameMap tableFilter tableLimit fld - pure $ RS.AFArrayRelation $ RS.ASAggregate $ RS.AnnRelationSelectG rn colMapping aggSel - RFKConnection pkCols -> do - connSel <- fromConnectionField (RS.FromTable relTN) pkCols tableFilter tableLimit fld - pure $ RS.AFArrayRelation $ RS.ASConnection $ RS.AnnRelationSelectG rn colMapping connSel - - RFRemoteRelationship info -> - pure $ RS.AFRemote $ RS.RemoteSelect - (unValidateArgsMap $ _fArguments fld) -- Unvalidate the input arguments - (unValidateSelectionSet $ _fSelSet fld) -- Unvalidate the selection fields - (_rfiHasuraFields info) - (_rfiRemoteFields info) - (_rfiRemoteSchema info) - -type TableAggregateFields = RS.TableAggregateFieldsG UnresolvedVal - -fromAggSelSet - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m TableAggregateFields -fromAggSelSet colGNameMap fldTy selSet = fmap toFields $ - traverseObjectSelectionSet selSet $ \Field{..} -> - case _fName of - "__typename" -> return $ RS.TAFExp $ G.unName $ G.unNamedType fldTy - "aggregate" -> do - objSelSet <- asObjectSelectionSet _fSelSet - RS.TAFAgg <$> convertAggregateField colGNameMap _fType objSelSet - "nodes" -> 
do - objSelSet <- asObjectSelectionSet _fSelSet - RS.TAFNodes <$> processTableSelectionSet _fType objSelSet - G.Name t -> throw500 $ "unexpected field in _agg node: " <> t - -fromConnectionSelSet - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => G.NamedType -> ObjectSelectionSet -> m (RS.ConnectionFields UnresolvedVal) -fromConnectionSelSet fldTy selSet = fmap toFields $ - traverseObjectSelectionSet selSet $ \Field{..} -> - case _fName of - "__typename" -> return $ RS.ConnectionTypename $ G.unName $ G.unNamedType fldTy - "pageInfo" -> do - fSelSet <- asObjectSelectionSet _fSelSet - RS.ConnectionPageInfo <$> parsePageInfoSelectionSet _fType fSelSet - "edges" -> do - fSelSet <- asObjectSelectionSet _fSelSet - RS.ConnectionEdges <$> parseEdgeSelectionSet _fType fSelSet - -- "aggregate" -> RS.TAFAgg <$> convertAggregateField colGNameMap fTy fSelSet - -- "nodes" -> RS.TAFNodes <$> processTableSelectionSet fTy fSelSet - G.Name t -> throw500 $ "unexpected field in _connection node: " <> t - -parseEdgeSelectionSet - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => G.NamedType -> ObjectSelectionSet -> m (RS.EdgeFields UnresolvedVal) -parseEdgeSelectionSet fldTy selSet = fmap toFields $ - traverseObjectSelectionSet selSet $ \f -> do - let fTy = _fType f - case _fName f of - "__typename" -> pure $ RS.EdgeTypename $ G.unName $ G.unNamedType fldTy - "cursor" -> pure RS.EdgeCursor - "node" -> do - fSelSet <- asObjectSelectionSet $ _fSelSet f - RS.EdgeNode <$> processTableSelectionSet fTy fSelSet - G.Name t -> throw500 $ "unexpected field in Edge node: " <> t - -parsePageInfoSelectionSet - :: ( MonadReusability m, MonadError QErr m) - => G.NamedType -> ObjectSelectionSet -> m RS.PageInfoFields -parsePageInfoSelectionSet fldTy selSet = - fmap toFields $ traverseObjectSelectionSet selSet $ \f -> - case _fName f of - "__typename" -> pure $ RS.PageInfoTypename $ G.unName $ G.unNamedType fldTy - "hasNextPage" -> pure RS.PageInfoHasNextPage - "hasPreviousPage" -> pure RS.PageInfoHasPreviousPage - "startCursor" -> pure RS.PageInfoStartCursor - "endCursor" -> pure RS.PageInfoEndCursor - -- "aggregate" -> RS.TAFAgg <$> convertAggregateField colGNameMap fTy fSelSet - -- "nodes" -> RS.TAFNodes <$> processTableSelectionSet fTy fSelSet - G.Name t -> throw500 $ "unexpected field in PageInfo node: " <> t - -type SelectArgs = RS.SelectArgsG UnresolvedVal - -parseSelectArgs - :: ( MonadReusability m, MonadError QErr m, MonadReader r m - , Has FieldMap r, Has OrdByCtx r - ) - => PGColGNameMap -> ArgsMap -> m SelectArgs -parseSelectArgs colGNameMap args = do - whereExpM <- withArgM args "where" parseBoolExp - ordByExpML <- withArgM args "order_by" parseOrderBy - let ordByExpM = NE.nonEmpty =<< ordByExpML - limitExpM <- withArgM args "limit" $ - parseNonNegativeInt "expecting Integer value for \"limit\"" - offsetExpM <- withArgM args "offset" $ asPGColumnValue >=> openOpaqueValue >=> txtConverter - distOnColsML <- withArgM args "distinct_on" $ parseColumns colGNameMap - let distOnColsM = NE.nonEmpty =<< distOnColsML - mapM_ (validateDistOn ordByExpM) distOnColsM - return $ RS.SelectArgs whereExpM ordByExpM limitExpM offsetExpM distOnColsM - where - validateDistOn Nothing _ = return () - validateDistOn (Just ordBys) cols = withPathK "args" $ do - let colsLen = length cols - initOrdBys = take colsLen $ toList ordBys - initOrdByCols = flip mapMaybe initOrdBys $ \ob -> - case obiColumn 
ob of - RS.AOCColumn pgCol -> Just $ pgiColumn pgCol - _ -> Nothing - isValid = (colsLen == length initOrdByCols) - && all (`elem` initOrdByCols) (toList cols) - - unless isValid $ throwVE - "\"distinct_on\" columns must match initial \"order_by\" columns" - -type AnnSimpleSelect = RS.AnnSimpleSelG UnresolvedVal - -fromField - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => RS.SelectFromG UnresolvedVal - -> PGColGNameMap - -> AnnBoolExpPartialSQL - -> Maybe Int - -> Field -> m AnnSimpleSelect -fromField selFrom colGNameMap permFilter permLimitM fld = fieldAsPath fld $ do - tableArgs <- parseSelectArgs colGNameMap args - selSet <- asObjectSelectionSet $ _fSelSet fld - annFlds <- processTableSelectionSet (_fType fld) selSet - let unresolvedPermFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter - let tabPerm = RS.TablePerm unresolvedPermFltr permLimitM - strfyNum <- stringifyNum <$> asks getter - return $ RS.AnnSelectG annFlds selFrom tabPerm tableArgs strfyNum - where - args = _fArguments fld - -getOrdByItemMap - :: ( MonadError QErr m - , MonadReader r m - , Has OrdByCtx r - ) - => G.NamedType -> m OrdByItemMap -getOrdByItemMap nt = do - ordByCtx <- asks getter - onNothing (Map.lookup nt ordByCtx) $ - throw500 $ "could not lookup " <> showNamedTy nt - -parseOrderBy - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has OrdByCtx r - ) - => AnnInpVal - -> m [RS.AnnOrderByItemG UnresolvedVal] -parseOrderBy = fmap concat . withArray f - where - f _ = mapM (withObject (getAnnObItems id)) - -getAnnObItems - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has OrdByCtx r - ) - => (RS.AnnOrderByElement UnresolvedVal -> RS.AnnOrderByElement UnresolvedVal) - -> G.NamedType - -> AnnGObject - -> m [RS.AnnOrderByItemG UnresolvedVal] -getAnnObItems f nt obj = do - ordByItemMap <- getOrdByItemMap nt - fmap concat $ forM (OMap.toList obj) $ \(k, v) -> do - ordByItem <- onNothing (Map.lookup k ordByItemMap) $ throw500 $ - "cannot lookup " <> showName k <> " order by item in " - <> showNamedTy nt <> " map" - case ordByItem of - OBIPGCol ci -> do - let aobCol = f $ RS.AOCColumn ci - (_, enumValM) <- asEnumValM v - ordByItemM <- forM enumValM $ \enumVal -> do - (ordTy, nullsOrd) <- parseOrderByEnum enumVal - return $ mkOrdByItemG ordTy aobCol nullsOrd - return $ maybe [] pure ordByItemM - - OBIRel ri fltr -> do - let unresolvedFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal fltr - let annObColFn = f . RS.AOCObjectRelation ri unresolvedFltr - flip withObjectM v $ \nameTy objM -> - maybe (pure []) (getAnnObItems annObColFn nameTy) objM - - OBIAgg ri relColGNameMap fltr -> do - let unresolvedFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal fltr - let aobColFn = f . 
RS.AOCArrayAggregation ri unresolvedFltr - flip withObjectM v $ \_ objM -> - maybe (pure []) (parseAggOrdBy relColGNameMap aobColFn) objM - -mkOrdByItemG :: S.OrderType -> a -> S.NullsOrder -> OrderByItemG a -mkOrdByItemG ordTy aobCol nullsOrd = - OrderByItemG (Just $ OrderType ordTy) aobCol (Just $ NullsOrder nullsOrd) - -parseAggOrdBy - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap - -> (RS.AnnAggregateOrderBy -> RS.AnnOrderByElement UnresolvedVal) - -> AnnGObject - -> m [RS.AnnOrderByItemG UnresolvedVal] -parseAggOrdBy colGNameMap f annObj = - fmap concat <$> forM (OMap.toList annObj) $ \(op, obVal) -> - case op of - "count" -> do - (_, enumValM) <- asEnumValM obVal - ordByItemM <- forM enumValM $ \enumVal -> do - (ordTy, nullsOrd) <- parseOrderByEnum enumVal - return $ mkOrdByItemG ordTy (f RS.AAOCount) nullsOrd - return $ maybe [] pure ordByItemM - - G.Name opText -> - flip withObject obVal $ \_ opObObj -> fmap catMaybes $ - forM (OMap.toList opObObj) $ \(colName, eVal) -> do - (_, enumValM) <- asEnumValM eVal - forM enumValM $ \enumVal -> do - (ordTy, nullsOrd) <- parseOrderByEnum enumVal - col <- resolvePGCol colGNameMap colName - let aobCol = f $ RS.AAOOp opText col - return $ mkOrdByItemG ordTy aobCol nullsOrd - -parseOrderByEnum - :: (MonadError QErr m) - => G.EnumValue - -> m (S.OrderType, S.NullsOrder) -parseOrderByEnum = \case - G.EnumValue "asc" -> return (S.OTAsc, S.NLast) - G.EnumValue "asc_nulls_last" -> return (S.OTAsc, S.NLast) - G.EnumValue "asc_nulls_first" -> return (S.OTAsc, S.NFirst) - G.EnumValue "desc" -> return (S.OTDesc, S.NFirst) - G.EnumValue "desc_nulls_first" -> return (S.OTDesc, S.NFirst) - G.EnumValue "desc_nulls_last" -> return (S.OTDesc, S.NLast) - G.EnumValue v -> throw500 $ - "enum value " <> showName v <> " not found in type order_by" - -parseNonNegativeInt - :: (MonadReusability m, MonadError QErr m) => Text -> AnnInpVal -> m Int -parseNonNegativeInt errMsg v = do - pgColVal <- openOpaqueValue =<< asPGColumnValue v - limit <- maybe (throwVE errMsg) return . pgColValueToInt . 
pstValue $ _apvValue pgColVal - -- validate int value - onlyPositiveInt limit - return limit - -type AnnSimpleSel = RS.AnnSimpleSelG UnresolvedVal - -fromFieldByPKey - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => QualifiedTable -> PGColArgMap - -> AnnBoolExpPartialSQL -> Field -> m AnnSimpleSel -fromFieldByPKey tn colArgMap permFilter fld = fieldAsPath fld $ do - boolExp <- pgColValToBoolExp colArgMap $ _fArguments fld - selSet <- asObjectSelectionSet $ _fSelSet fld - annFlds <- processTableSelectionSet fldTy selSet - let tabFrom = RS.FromTable tn - unresolvedPermFltr = fmapAnnBoolExp partialSQLExpToUnresolvedVal - permFilter - tabPerm = RS.TablePerm unresolvedPermFltr Nothing - tabArgs = RS.noSelectArgs { RS._saWhere = Just boolExp} - strfyNum <- stringifyNum <$> asks getter - return $ RS.AnnSelectG annFlds tabFrom tabPerm tabArgs strfyNum - where - fldTy = _fType fld - -convertSelect - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => SelOpCtx -> Field -> m (RS.AnnSimpleSelG UnresolvedVal) -convertSelect opCtx fld = - withPathK "selectionSet" $ - fromField (RS.FromTable qt) colGNameMap permFilter permLimit fld - where - SelOpCtx qt _ colGNameMap permFilter permLimit = opCtx - -convertSelectByPKey - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => SelPkOpCtx -> Field -> m (RS.AnnSimpleSelG UnresolvedVal) -convertSelectByPKey opCtx fld = - withPathK "selectionSet" $ - fromFieldByPKey qt colArgMap permFilter fld - where - SelPkOpCtx qt _ permFilter colArgMap = opCtx - --- agg select related -parseColumns - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap -> AnnInpVal -> m [PGCol] -parseColumns allColFldMap val = - flip withArray val $ \_ vals -> - forM vals $ \v -> do - (_, G.EnumValue enumVal) <- asEnumVal v - pgiColumn <$> resolvePGCol allColFldMap enumVal - -convertCount - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap -> ArgsMap -> m S.CountType -convertCount colGNameMap args = do - columnsM <- withArgM args "columns" $ parseColumns colGNameMap - isDistinct <- or <$> withArgM args "distinct" parseDistinct - maybe (return S.CTStar) (mkCType isDistinct) columnsM - where - parseDistinct v = do - val <- openOpaqueValue =<< asPGColumnValue v - case pstValue $ _apvValue val of - PGValBoolean b -> return b - _ -> - throw500 "expecting Boolean for \"distinct\"" - - mkCType isDistinct cols = return $ - bool (S.CTSimple cols) (S.CTDistinct cols) isDistinct - -toFields :: [(T.Text, a)] -> RS.Fields a -toFields = map (first FieldName) - -convertColumnFields - :: (MonadError QErr m) - => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m RS.ColumnFields -convertColumnFields colGNameMap ty selSet = fmap toFields $ - traverseObjectSelectionSet selSet $ \fld -> - case _fName fld of - "__typename" -> return $ RS.PCFExp $ G.unName $ G.unNamedType ty - n -> RS.PCFCol . 
pgiColumn <$> resolvePGCol colGNameMap n - -convertAggregateField - :: (MonadReusability m, MonadError QErr m) - => PGColGNameMap -> G.NamedType -> ObjectSelectionSet -> m RS.AggregateFields -convertAggregateField colGNameMap ty selSet = fmap toFields $ - traverseObjectSelectionSet selSet $ \Field{..} -> - case _fName of - "__typename" -> return $ RS.AFExp $ G.unName $ G.unNamedType ty - "count" -> RS.AFCount <$> convertCount colGNameMap _fArguments - n -> do - fSelSet <- asObjectSelectionSet _fSelSet - colFlds <- convertColumnFields colGNameMap _fType fSelSet - unless (isAggregateField n) $ throwInvalidFld n - return $ RS.AFOp $ RS.AggregateOp (G.unName n) colFlds - where - throwInvalidFld (G.Name t) = - throw500 $ "unexpected field in _aggregate node: " <> t - -type AnnAggregateSelect = RS.AnnAggregateSelectG UnresolvedVal - -fromAggField - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => RS.SelectFromG UnresolvedVal - -> PGColGNameMap - -> AnnBoolExpPartialSQL - -> Maybe Int - -> Field -> m AnnAggregateSelect -fromAggField selectFrom colGNameMap permFilter permLimit fld = fieldAsPath fld $ do - tableArgs <- parseSelectArgs colGNameMap args - selSet <- asObjectSelectionSet $ _fSelSet fld - aggSelFlds <- fromAggSelSet colGNameMap (_fType fld) selSet - let unresolvedPermFltr = - fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter - let tabPerm = RS.TablePerm unresolvedPermFltr permLimit - strfyNum <- stringifyNum <$> asks getter - return $ RS.AnnSelectG aggSelFlds selectFrom tabPerm tableArgs strfyNum - where - args = _fArguments fld - -fromConnectionField - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => RS.SelectFromG UnresolvedVal - -> PrimaryKeyColumns - -> AnnBoolExpPartialSQL - -> Maybe Int - -> Field -> m (RS.ConnectionSelect UnresolvedVal) -fromConnectionField selectFrom pkCols permFilter permLimit fld = fieldAsPath fld $ do - (tableArgs, slice, split) <- parseConnectionArgs pkCols args - selSet <- asObjectSelectionSet $ _fSelSet fld - connSelFlds <- fromConnectionSelSet (_fType fld) selSet - strfyNum <- stringifyNum <$> asks getter - let unresolvedPermFltr = - fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter - tabPerm = RS.TablePerm unresolvedPermFltr permLimit - annSel = RS.AnnSelectG connSelFlds selectFrom tabPerm tableArgs strfyNum - pure $ RS.ConnectionSelect pkCols split slice annSel - where - args = _fArguments fld - -parseConnectionArgs - :: forall r m. - ( MonadReusability m, MonadError QErr m, MonadReader r m - , Has FieldMap r, Has OrdByCtx r - ) - => PrimaryKeyColumns - -> ArgsMap - -> m ( SelectArgs - , Maybe RS.ConnectionSlice - , Maybe (NE.NonEmpty (RS.ConnectionSplit UnresolvedVal)) - ) -parseConnectionArgs pKeyColumns args = do - whereExpM <- withArgM args "where" parseBoolExp - ordByExpML <- withArgM args "order_by" parseOrderBy - - slice <- case (Map.lookup "first" args, Map.lookup "last" args) of - (Nothing, Nothing) -> pure Nothing - (Just _, Just _) -> throwVE "\"first\" and \"last\" are not allowed at once" - (Just v, Nothing) -> Just . RS.SliceFirst <$> parseNonNegativeInt - "expecting Integer value for \"first\"" v - (Nothing, Just v) -> Just . 
RS.SliceLast <$> parseNonNegativeInt - "expecting Integer value for \"last\"" v - - maybeSplit <- case (Map.lookup "after" args, Map.lookup "before" args) of - (Nothing, Nothing) -> pure Nothing - (Just _, Just _) -> throwVE "\"after\" and \"before\" are not allowed at once" - (Just v, Nothing) -> fmap ((RS.CSKAfter,) . base64Decode) <$> asPGColTextM v - (Nothing, Just v) -> fmap ((RS.CSKBefore,) . base64Decode) <$> asPGColTextM v - - let ordByExpM = NE.nonEmpty =<< appendPrimaryKeyOrderBy <$> ordByExpML - tableArgs = RS.SelectArgs whereExpM ordByExpM Nothing Nothing Nothing - - split <- mapM (uncurry (validateConnectionSplit ordByExpM)) maybeSplit - pure (tableArgs, slice, split) - where - appendPrimaryKeyOrderBy :: [RS.AnnOrderByItemG v] -> [RS.AnnOrderByItemG v] - appendPrimaryKeyOrderBy orderBys = - let orderByColumnNames = - orderBys ^.. traverse . to obiColumn . RS._AOCColumn . to pgiColumn - pkeyOrderBys = flip mapMaybe (toList pKeyColumns) $ \pgColumnInfo -> - if pgiColumn pgColumnInfo `elem` orderByColumnNames then Nothing - else Just $ OrderByItemG Nothing (RS.AOCColumn pgColumnInfo) Nothing - in orderBys <> pkeyOrderBys - - validateConnectionSplit - :: Maybe (NonEmpty (RS.AnnOrderByItemG UnresolvedVal)) - -> RS.ConnectionSplitKind - -> BL.ByteString - -> m (NonEmpty (RS.ConnectionSplit UnresolvedVal)) - validateConnectionSplit maybeOrderBys splitKind cursorSplit = do - cursorValue <- either (const throwInvalidCursor) pure $ - J.eitherDecode cursorSplit - case maybeOrderBys of - Nothing -> forM (NESeq.toNonEmpty pKeyColumns) $ - \pgColumnInfo -> do - let columnJsonPath = [J.Key $ getPGColTxt $ pgiColumn pgColumnInfo] - pgColumnValue <- maybe throwInvalidCursor pure $ iResultToMaybe $ - executeJSONPath columnJsonPath cursorValue - pgValue <- parsePGScalarValue (pgiType pgColumnInfo) pgColumnValue - let unresolvedValue = UVPG $ AnnPGVal Nothing False pgValue - pure $ RS.ConnectionSplit splitKind unresolvedValue $ - OrderByItemG Nothing (RS.AOCColumn pgColumnInfo) Nothing - Just orderBys -> - forM orderBys $ \orderBy -> do - let OrderByItemG orderType annObCol nullsOrder = orderBy - orderByItemValue <- maybe throwInvalidCursor pure $ iResultToMaybe $ - executeJSONPath (getPathFromOrderBy annObCol) cursorValue - pgValue <- parsePGScalarValue (getOrderByColumnType annObCol) orderByItemValue - let unresolvedValue = UVPG $ AnnPGVal Nothing False pgValue - pure $ RS.ConnectionSplit splitKind unresolvedValue $ - OrderByItemG orderType (() <$ annObCol) nullsOrder - where - throwInvalidCursor = throwVE "the \"after\" or \"before\" cursor is invalid" - - iResultToMaybe = \case - J.ISuccess v -> Just v - J.IError{} -> Nothing - - getPathFromOrderBy = \case - RS.AOCColumn pgColInfo -> - let pathElement = J.Key $ getPGColTxt $ pgiColumn pgColInfo - in [pathElement] - RS.AOCObjectRelation relInfo _ obCol -> - let pathElement = J.Key $ relNameToTxt $ riName relInfo - in pathElement : getPathFromOrderBy obCol - RS.AOCArrayAggregation relInfo _ aggOb -> - let fieldName = J.Key $ relNameToTxt (riName relInfo) <> "_aggregate" - in fieldName : case aggOb of - RS.AAOCount -> [J.Key "count"] - RS.AAOOp t col -> [J.Key t, J.Key $ getPGColTxt $ pgiColumn col] - - getOrderByColumnType = \case - RS.AOCColumn pgColInfo -> pgiType pgColInfo - RS.AOCObjectRelation _ _ obCol -> getOrderByColumnType obCol - RS.AOCArrayAggregation _ _ aggOb -> - case aggOb of - RS.AAOCount -> PGColumnScalar PGInteger - RS.AAOOp _ colInfo -> pgiType colInfo - -convertAggSelect - :: ( MonadReusability m, MonadError QErr m, 
MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => SelOpCtx -> Field -> m (RS.AnnAggregateSelectG UnresolvedVal) -convertAggSelect opCtx fld = - withPathK "selectionSet" $ - fromAggField (RS.FromTable qt) colGNameMap permFilter permLimit fld - where - SelOpCtx qt _ colGNameMap permFilter permLimit = opCtx - -convertConnectionSelect - :: ( MonadReusability m, MonadError QErr m, MonadReader r m, Has FieldMap r - , Has OrdByCtx r, Has SQLGenCtx r - ) - => PrimaryKeyColumns -> SelOpCtx -> Field -> m (RS.ConnectionSelect UnresolvedVal) -convertConnectionSelect pkCols opCtx fld = - withPathK "selectionSet" $ - fromConnectionField (RS.FromTable qt) pkCols permFilter permLimit fld - where - SelOpCtx qt _ _ permFilter permLimit = opCtx - -parseFunctionArgs - :: (MonadReusability m, MonadError QErr m) - => Seq.Seq a - -> (a -> InputFunctionArgument) - -> Maybe AnnInpVal - -> m (RS.FunctionArgsExpG UnresolvedVal) -parseFunctionArgs argSeq argFn = withPathK "args" . \case - Nothing -> do - -- The input "args" field is not provided, hence resolve only known - -- input arguments as positional arguments - let positionalArgs = mapMaybe ((^? _IFAKnown._2) . argFn) $ toList argSeq - pure RS.emptyFunctionArgsExp{RS._faePositional = positionalArgs} - - Just val -> flip withObject val $ \_ obj -> do - (positionalArgs, argsLeft) <- spanMaybeM (parsePositionalArg obj) argSeq - namedArgs <- Map.fromList . catMaybes <$> traverse (parseNamedArg obj) argsLeft - pure $ RS.FunctionArgsExp positionalArgs namedArgs - where - parsePositionalArg obj inputArg = case argFn inputArg of - IFAKnown _ resolvedVal -> pure $ Just resolvedVal - IFAUnknown (FunctionArgItem gqlName _ _) -> - maybe (pure Nothing) (fmap Just . parseArg) $ OMap.lookup gqlName obj - - parseArg = fmap (maybe (UVSQL S.SENull) mkParameterizablePGValue) . asPGColumnValueM - - parseNamedArg obj inputArg = case argFn inputArg of - IFAKnown argName resolvedVal -> - pure $ Just (getFuncArgNameTxt argName, resolvedVal) - IFAUnknown (FunctionArgItem gqlName maybeSqlName hasDefault) -> - case OMap.lookup gqlName obj of - Just argInpVal -> case maybeSqlName of - Just sqlName -> Just . 
(getFuncArgNameTxt sqlName,) <$> parseArg argInpVal - Nothing -> throw400 NotSupported - "Only the last set of positional arguments can be omitted" - Nothing -> if not (unHasDefault hasDefault) then - throw400 NotSupported "Non-default arguments cannot be omitted" - else pure Nothing - -makeFunctionSelectFrom - :: (MonadReusability m, MonadError QErr m) - => QualifiedFunction - -> FunctionArgSeq - -> Field - -> m (RS.SelectFromG UnresolvedVal) -makeFunctionSelectFrom qf argSeq fld = withPathK "args" $ do - funcArgs <- parseFunctionArgs argSeq argFn $ Map.lookup "args" $ _fArguments fld - pure $ RS.FromFunction qf (RS.AEInput <$> funcArgs) Nothing - where - argFn (IAUserProvided val) = IFAUnknown val - argFn (IASessionVariables argName) = IFAKnown argName UVSession - -convertFuncQuerySimple - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => FuncQOpCtx -> Field -> m AnnSimpleSelect -convertFuncQuerySimple funcOpCtx fld = - withPathK "selectionSet" $ fieldAsPath fld $ do - selectFrom <- makeFunctionSelectFrom qf argSeq fld - fromField selectFrom colGNameMap permFilter permLimit fld - where - FuncQOpCtx qf argSeq _ colGNameMap permFilter permLimit = funcOpCtx - -convertFuncQueryAgg - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => FuncQOpCtx -> Field -> m AnnAggregateSelect -convertFuncQueryAgg funcOpCtx fld = - withPathK "selectionSet" $ fieldAsPath fld $ do - selectFrom <- makeFunctionSelectFrom qf argSeq fld - fromAggField selectFrom colGNameMap permFilter permLimit fld - where - FuncQOpCtx qf argSeq _ colGNameMap permFilter permLimit = funcOpCtx - -convertConnectionFuncQuery - :: ( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => PrimaryKeyColumns -> FuncQOpCtx -> Field -> m (RS.ConnectionSelect UnresolvedVal) -convertConnectionFuncQuery pkCols funcOpCtx fld = - withPathK "selectionSet" $ fieldAsPath fld $ do - selectFrom <- makeFunctionSelectFrom qf argSeq fld - fromConnectionField selectFrom pkCols permFilter permLimit fld - where - FuncQOpCtx qf argSeq _ _ permFilter permLimit = funcOpCtx - -throwInvalidNodeId :: MonadError QErr m => Text -> m a -throwInvalidNodeId t = throwVE $ "the node id is invalid: " <> t - -resolveNodeId - :: ( MonadError QErr m - , MonadReusability m - ) - => Field -> m NodeId -resolveNodeId field = - withPathK "selectionSet" $ fieldAsPath field $ - withArg (_fArguments field) "id" $ asPGColText >=> - either (throwInvalidNodeId . T.pack) pure . J.eitherDecode . base64Decode - -convertNodeSelect - :: forall m r. 
( MonadReusability m - , MonadError QErr m - , MonadReader r m - , Has FieldMap r - , Has OrdByCtx r - , Has SQLGenCtx r - ) - => SelOpCtx - -> PrimaryKeyColumns - -> NESeq.NESeq J.Value - -> Field - -> m (RS.AnnSimpleSelG UnresolvedVal) -convertNodeSelect selOpCtx pkeyColumns columnValues field = - withPathK "selectionSet" $ fieldAsPath field $ do - -- Parse selection set as interface - ifaceSelectionSet <- asInterfaceSelectionSet $ _fSelSet field - let tableObjectType = mkTableTy table - selSet = getMemberSelectionSet tableObjectType ifaceSelectionSet - unresolvedPermFilter = fmapAnnBoolExp partialSQLExpToUnresolvedVal permFilter - tablePerm = RS.TablePerm unresolvedPermFilter permLimit - -- Resolve the table selection set - annFields <- processTableSelectionSet tableObjectType selSet - -- Resolve the Node id primary key column values - pkeyColumnValues <- alignPkeyColumnValues - unresolvedPkeyValues <- flip Map.traverseWithKey pkeyColumnValues $ - \columnInfo jsonValue -> - let modifyErrFn t = "value of column " <> pgiColumn columnInfo - <<> " in node id: " <> t - in modifyErr modifyErrFn $ - (,columnInfo) . UVPG . AnnPGVal Nothing False <$> - parsePGScalarValue (pgiType columnInfo) jsonValue - - -- Generate the bool expression from the primary key column values - let pkeyBoolExp = BoolAnd $ flip map (Map.elems unresolvedPkeyValues) $ - \(unresolvedValue, columnInfo) -> (BoolFld . AVCol columnInfo) [AEQ True unresolvedValue] - selectArgs = RS.noSelectArgs{RS._saWhere = Just pkeyBoolExp} - strfyNum <- stringifyNum <$> asks getter - pure $ RS.AnnSelectG annFields (RS.FromTable table) tablePerm selectArgs strfyNum - where - SelOpCtx table _ _ permFilter permLimit = selOpCtx - - alignPkeyColumnValues :: m (Map.HashMap PGColumnInfo J.Value) - alignPkeyColumnValues = do - let NESeq.NESeq (firstPkColumn, remainingPkColumns) = pkeyColumns - NESeq.NESeq (firstColumnValue, remainingColumns) = columnValues - (nonAlignedPkColumns, nonAlignedColumnValues, alignedTuples) = - partitionThese $ toList $ align remainingPkColumns remainingColumns - - when (not $ null nonAlignedPkColumns) $ throwInvalidNodeId $ - "primary key columns " <> dquoteList (map pgiColumn nonAlignedPkColumns) <> " are missing" - - when (not $ null nonAlignedColumnValues) $ throwInvalidNodeId $ - "unexpected column values " <> J.encodeToStrictText nonAlignedColumnValues - - pure $ Map.fromList $ (firstPkColumn, firstColumnValue):alignedTuples diff --git a/server/src-lib/Hasura/GraphQL/Resolve/Types.hs b/server/src-lib/Hasura/GraphQL/Resolve/Types.hs deleted file mode 100644 index 63c1b446e9d..00000000000 --- a/server/src-lib/Hasura/GraphQL/Resolve/Types.hs +++ /dev/null @@ -1,337 +0,0 @@ -module Hasura.GraphQL.Resolve.Types - ( module Hasura.GraphQL.Resolve.Types - -- * Re-exports - , MonadReusability(..) 
- ) where - -import Control.Lens.TH -import Hasura.Prelude - -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as Map -import qualified Data.Sequence as Seq -import qualified Data.Sequence.NonEmpty as NESeq -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.DDL.Headers (HeaderConf) -import Hasura.RQL.Types.Action -import Hasura.RQL.Types.BoolExp -import Hasura.RQL.Types.Column -import Hasura.RQL.Types.Common -import Hasura.RQL.Types.ComputedField -import Hasura.RQL.Types.CustomTypes -import Hasura.RQL.Types.Function -import Hasura.RQL.Types.RemoteRelationship -import Hasura.Session -import Hasura.SQL.Types -import Hasura.SQL.Value - -import qualified Hasura.SQL.DML as S - -type NodeSelectMap = Map.HashMap G.NamedType (SelOpCtx, PrimaryKeyColumns) - -data QueryCtx - = QCNodeSelect !NodeSelectMap - | QCSelect !SelOpCtx - | QCSelectConnection !PrimaryKeyColumns !SelOpCtx - | QCSelectPkey !SelPkOpCtx - | QCSelectAgg !SelOpCtx - | QCFuncQuery !FuncQOpCtx - | QCFuncAggQuery !FuncQOpCtx - | QCFuncConnection !PrimaryKeyColumns !FuncQOpCtx - | QCAsyncActionFetch !ActionSelectOpContext - | QCAction !ActionExecutionContext - deriving (Show, Eq) - -data MutationCtx - = MCInsert !InsOpCtx - | MCInsertOne !InsOpCtx - | MCUpdate !UpdOpCtx - | MCUpdateByPk !UpdOpCtx - | MCDelete !DelOpCtx - | MCDeleteByPk !DelOpCtx - | MCAction !ActionMutationExecutionContext - deriving (Show, Eq) - -type OpCtxMap a = Map.HashMap G.Name a -type QueryCtxMap = OpCtxMap QueryCtx -type MutationCtxMap = OpCtxMap MutationCtx - -data InsOpCtx - = InsOpCtx - { _iocTable :: !QualifiedTable - , _iocHeaders :: ![T.Text] - } deriving (Show, Eq) - -data SelOpCtx - = SelOpCtx - { _socTable :: !QualifiedTable - , _socHeaders :: ![T.Text] - , _socAllCols :: !PGColGNameMap - , _socFilter :: !AnnBoolExpPartialSQL - , _socLimit :: !(Maybe Int) - } deriving (Show, Eq) - -type PGColArgMap = Map.HashMap G.Name PGColumnInfo - -data SelPkOpCtx - = SelPkOpCtx - { _spocTable :: !QualifiedTable - , _spocHeaders :: ![T.Text] - , _spocFilter :: !AnnBoolExpPartialSQL - , _spocArgMap :: !PGColArgMap - } deriving (Show, Eq) - -type FunctionArgSeq = Seq.Seq (InputArgument FunctionArgItem) - -data FuncQOpCtx - = FuncQOpCtx - { _fqocFunction :: !QualifiedFunction - , _fqocArgs :: !FunctionArgSeq - , _fqocHeaders :: ![T.Text] - , _fqocAllCols :: !PGColGNameMap - , _fqocFilter :: !AnnBoolExpPartialSQL - , _fqocLimit :: !(Maybe Int) - } deriving (Show, Eq) - -data UpdOpCtx - = UpdOpCtx - { _uocTable :: !QualifiedTable - , _uocHeaders :: ![T.Text] - , _uocAllCols :: !PGColGNameMap - , _uocFilter :: !AnnBoolExpPartialSQL - , _uocCheck :: !(Maybe AnnBoolExpPartialSQL) - , _uocPresetCols :: !PreSetColsPartial - } deriving (Show, Eq) - -data DelOpCtx - = DelOpCtx - { _docTable :: !QualifiedTable - , _docHeaders :: ![T.Text] - , _docAllCols :: !PGColGNameMap - , _docFilter :: !AnnBoolExpPartialSQL - } deriving (Show, Eq) - -data ActionExecutionContext - = ActionExecutionContext - { _saecName :: !ActionName - , _saecOutputType :: !GraphQLType - , _saecOutputFields :: !ActionOutputFields - , _saecDefinitionList :: ![(PGCol, PGScalarType)] - , _saecWebhook :: !ResolvedWebhook - , _saecHeaders :: ![HeaderConf] - , _saecForwardClientHeaders :: !Bool - } deriving (Show, Eq) - -data ActionMutationExecutionContext - = ActionMutationSyncWebhook !ActionExecutionContext - | ActionMutationAsync - deriving (Show, Eq) - -data ActionSelectOpContext - = 
ActionSelectOpContext - { _asocOutputType :: !GraphQLType - , _asocDefinitionList :: ![(PGCol, PGScalarType)] - } deriving (Show, Eq) - --- (custom name | generated name) -> PG column info --- used in resolvers -type PGColGNameMap = Map.HashMap G.Name PGColumnInfo - -data RelationshipFieldKind - = RFKAggregate - | RFKSimple - | RFKConnection !PrimaryKeyColumns - deriving (Show, Eq) - -data RelationshipField - = RelationshipField - { _rfInfo :: !RelInfo - , _rfKind :: !RelationshipFieldKind - , _rfCols :: !PGColGNameMap - , _rfPermFilter :: !AnnBoolExpPartialSQL - , _rfPermLimit :: !(Maybe Int) - } deriving (Show, Eq) - -data ComputedFieldTable - = ComputedFieldTable - { _cftTable :: !QualifiedTable - , _cftCols :: !PGColGNameMap - , _cftPermFilter :: !AnnBoolExpPartialSQL - , _cftPermLimit :: !(Maybe Int) - } deriving (Show, Eq) - -data ComputedFieldType - = CFTScalar !PGScalarType - | CFTTable !ComputedFieldTable - deriving (Show, Eq) - -type ComputedFieldFunctionArgSeq = Seq.Seq FunctionArgItem - -data ComputedField - = ComputedField - { _cfName :: !ComputedFieldName - , _cfFunction :: !ComputedFieldFunction - , _cfArgSeq :: !ComputedFieldFunctionArgSeq - , _cfType :: !ComputedFieldType - } deriving (Show, Eq) - -data ResolveField - = RFPGColumn !PGColumnInfo - | RFRelationship !RelationshipField - | RFComputedField !ComputedField - | RFRemoteRelationship !RemoteFieldInfo - | RFNodeId !QualifiedTable !PrimaryKeyColumns - deriving (Show, Eq) - -type FieldMap = Map.HashMap (G.NamedType, G.Name) ResolveField - --- order by context -data OrdByItem - = OBIPGCol !PGColumnInfo - | OBIRel !RelInfo !AnnBoolExpPartialSQL - | OBIAgg !RelInfo !PGColGNameMap !AnnBoolExpPartialSQL - deriving (Show, Eq) - -type OrdByItemMap = Map.HashMap G.Name OrdByItem - -type OrdByCtx = Map.HashMap G.NamedType OrdByItemMap - -data FunctionArgItem - = FunctionArgItem - { _faiInputArgName :: !G.Name - , _faiSqlArgName :: !(Maybe FunctionArgName) - , _faiHasDefault :: !HasDefault - } deriving (Show, Eq) - --- insert context -type RelationInfoMap = Map.HashMap RelName RelInfo - -data UpdPermForIns - = UpdPermForIns - { upfiCols :: ![PGCol] - , upfiCheck :: !(Maybe AnnBoolExpPartialSQL) - , upfiFilter :: !AnnBoolExpPartialSQL - , upfiSet :: !PreSetColsPartial - } deriving (Show, Eq) - -data InsCtx - = InsCtx - { icAllCols :: !PGColGNameMap - , icCheck :: !AnnBoolExpPartialSQL - , icSet :: !PreSetColsPartial - , icRelations :: !RelationInfoMap - , icUpdPerm :: !(Maybe UpdPermForIns) - } deriving (Show, Eq) - -type InsCtxMap = Map.HashMap QualifiedTable InsCtx - -data AnnPGVal - = AnnPGVal - { _apvVariable :: !(Maybe G.Variable) - , _apvIsNullable :: !Bool - , _apvValue :: !(WithScalarType PGScalarValue) - } deriving (Show, Eq) - -type PrepFn m = AnnPGVal -> m S.SQLExp - --- lifts PartialSQLExp to UnresolvedVal -partialSQLExpToUnresolvedVal :: PartialSQLExp -> UnresolvedVal -partialSQLExpToUnresolvedVal = \case - PSESessVar ty sessVar -> UVSessVar ty sessVar - PSESQLExp s -> UVSQL s - --- | A value that will be converted to an sql expression eventually -data UnresolvedVal - -- | an entire session variables JSON object - = UVSession - | UVSessVar !(PGType PGScalarType) !SessionVariable - -- | a SQL value literal that can be parameterized over - | UVPG !AnnPGVal - -- | an arbitrary SQL expression, which /cannot/ be parameterized over - | UVSQL !S.SQLExp - deriving (Show, Eq) - -type AnnBoolExpUnresolved = AnnBoolExp UnresolvedVal - -data InputFunctionArgument - = IFAKnown !FunctionArgName !UnresolvedVal -- ^ Known value - | 
IFAUnknown !FunctionArgItem -- ^ Unknown value, need to be parsed - deriving (Show, Eq) - -{- Note [Relay Node Id] -~~~~~~~~~~~~~~~~~~~~~~~ - -The 'Node' interface in Relay schema has exactly one field which returns -a non-null 'ID' value. Each table object type in Relay schema should implement -'Node' interface to provide global object identification. -See https://relay.dev/graphql/objectidentification.htm for more details. - -To identify each row in a table, we need to encode the table information -(schema and name) and primary key column values in the 'Node' id. - -Node id data: -------------- -We are using JSON format for encoding and decoding the node id. The JSON -schema looks like the following - -'[<version-integer>, "<schema-name>", "<table-name>", "column-1", "column-2", ... "column-n"]' - -It is represented in the type @'NodeId'. The 'version-integer' represents the JSON -schema version, to enable backward compatibility if the format is changed in upcoming versions. - -The stringified JSON is Base64 encoded and sent to the client. The same -base64-encoded JSON string is also accepted as the 'node' field resolver's 'id' input. --} - -data NodeIdVersion - = NIVersion1 - deriving (Show, Eq) - -nodeIdVersionInt :: NodeIdVersion -> Int -nodeIdVersionInt NIVersion1 = 1 - -currentNodeIdVersion :: NodeIdVersion -currentNodeIdVersion = NIVersion1 - -instance J.FromJSON NodeIdVersion where - parseJSON v = do - versionInt :: Int <- J.parseJSON v - case versionInt of - 1 -> pure NIVersion1 - _ -> fail $ "expecting version 1 for node id, but got " <> show versionInt - -data V1NodeId - = V1NodeId - { _nidTable :: !QualifiedTable - , _nidColumns :: !(NESeq.NESeq J.Value) - } deriving (Show, Eq) - --- | The Relay 'Node' interface's 'id' field value. --- See Note [Relay Node Id]. -data NodeId - = NodeIdV1 !V1NodeId - deriving (Show, Eq) - -instance J.FromJSON NodeId where - parseJSON v = do - valueList <- J.parseJSON v - case valueList of - [] -> fail "unexpected GUID format, found empty list" - J.Number 1:rest -> NodeIdV1 <$> parseNodeIdV1 rest - J.Number n:_ -> fail $ "unsupported GUID version: " <> show n - _ -> fail "unexpected GUID format, needs to start with a version number" - where - parseNodeIdV1 (schemaValue:(nameValue:(firstColumn:remainingColumns))) = - V1NodeId - <$> (QualifiedObject <$> J.parseJSON schemaValue <*> J.parseJSON nameValue) - <*> pure (NESeq.NESeq (firstColumn, Seq.fromList remainingColumns)) - parseNodeIdV1 _ = fail "GUID version 1: expecting schema name, table name and at least one column value" - --- template haskell related -$(makePrisms ''ResolveField) -$(makeLenses ''ComputedField) -$(makePrisms ''ComputedFieldType) -$(makePrisms ''InputFunctionArgument) diff --git a/server/src-lib/Hasura/GraphQL/Schema.hs b/server/src-lib/Hasura/GraphQL/Schema.hs index 819bba7fa06..675219e83a7 100644 --- a/server/src-lib/Hasura/GraphQL/Schema.hs +++ b/server/src-lib/Hasura/GraphQL/Schema.hs @@ -1,961 +1,576 @@ +{-# LANGUAGE Arrows #-} module Hasura.GraphQL.Schema - ( mkGCtxMap - , GCtxMap - , GCtx(..) - , QueryCtx(..) - , MutationCtx(..) - , InsCtx(..)
- , InsCtxMap - , RelationInfoMap - - , checkConflictingNode - , checkSchemaConflicts - - -- * To be consumed by Hasura.GraphQL.RelaySchema module - , mkGCtx - , isAggregateField - , qualObjectToName - , ppGCtx - , getSelPerm - , isValidObjectName - , mkAdminSelFlds - , noFilter - , getGCtx - , getMutationRootFieldsRole - , makeFieldMap - , mkMutationTypesAndFieldsRole - , mkAdminInsCtx - , mkValidConstraints - , getValidCols - , mkInsCtx + ( buildGQLContext ) where -import Control.Lens.Extended hiding (op) -import Data.List.Extended (duplicates) +import Hasura.Prelude +import qualified Data.Aeson as J import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap import qualified Data.HashSet as Set -import qualified Data.Sequence as Seq -import qualified Data.Text as T import qualified Language.GraphQL.Draft.Syntax as G +import Control.Arrow.Extended +import Control.Lens.Extended +import Control.Monad.Unique +import Data.Has +import Data.List.Extended (duplicates) + +import qualified Hasura.GraphQL.Parser as P + import Hasura.GraphQL.Context -import Hasura.GraphQL.Resolve.Context -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.DML.Internal (mkAdminRolePermInfo) +import Hasura.GraphQL.Execute.Types +import Hasura.GraphQL.Parser (Kind (..), Parser, Schema (..), + UnpreparedValue (..)) +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Internal.Parser (FieldParser (..)) +import Hasura.GraphQL.Schema.Action +import Hasura.GraphQL.Schema.Common +import Hasura.GraphQL.Schema.Introspect +import Hasura.GraphQL.Schema.Mutation +import Hasura.GraphQL.Schema.Select +import Hasura.GraphQL.Schema.Table +import Hasura.RQL.DDL.Schema.Cache.Common import Hasura.RQL.Types import Hasura.Session import Hasura.SQL.Types -import Hasura.GraphQL.Schema.Action -import Hasura.GraphQL.Schema.BoolExp -import Hasura.GraphQL.Schema.Builder -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Function -import Hasura.GraphQL.Schema.Merge -import Hasura.GraphQL.Schema.Mutation.Common -import Hasura.GraphQL.Schema.Mutation.Delete -import Hasura.GraphQL.Schema.Mutation.Insert -import Hasura.GraphQL.Schema.Mutation.Update -import Hasura.GraphQL.Schema.OrderBy -import Hasura.GraphQL.Schema.Select +-- | Whether the request is sent with `x-hasura-use-backend-only-permissions` set to `true`. +data Scenario = Backend | Frontend deriving (Enum, Show, Eq) -type TableSchemaCtx = RoleContext (TyAgg, RootFields, InsCtxMap) +buildGQLContext + :: forall arr m + . ( ArrowChoice arr + , ArrowWriter (Seq InconsistentMetadata) arr + , ArrowKleisli m arr + , MonadError QErr m + , MonadIO m + , MonadUnique m + , HasSQLGenCtx m + ) + => ( GraphQLQueryType + , TableCache + , FunctionCache + , HashMap RemoteSchemaName (RemoteSchemaCtx, MetadataObject) + , ActionCache + , NonObjectTypeMap + ) + `arr` + ( HashMap RoleName (RoleContext GQLContext) + , GQLContext + ) +buildGQLContext = + proc (queryType, allTables, allFunctions, allRemoteSchemas, allActions, nonObjectCustomTypes) -> do -getInsPerm :: TableInfo -> RoleName -> Maybe InsPermInfo -getInsPerm tabInfo roleName - | roleName == adminRoleName = _permIns $ mkAdminRolePermInfo (_tiCoreInfo tabInfo) - | otherwise = Map.lookup roleName rolePermInfoMap >>= _permIns - where - rolePermInfoMap = _tiRolePermInfoMap tabInfo + -- Scroll down a few pages for the actual body... 
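
To make Note [Relay Node Id] above concrete, here is a minimal, self-contained sketch (not part of this patch) of how such an id could be produced. The table "public"."author" and the key value 1 are invented for illustration, and the Base64 helper is assumed to come from the base64-bytestring package:

    {-# LANGUAGE OverloadedStrings #-}
    import qualified Data.Aeson                  as J
    import qualified Data.ByteString.Base64.Lazy as B64
    import qualified Data.ByteString.Lazy        as BL

    -- Node id for the row with primary key 1 of a hypothetical table
    -- "public"."author": [version, schema, table, pk-value].
    exampleNodeId :: J.Value
    exampleNodeId = J.toJSON [J.Number 1, J.String "public", J.String "author", J.Number 1]

    -- Stringify the JSON array and Base64-encode it, as the note describes;
    -- this yields "WzEsInB1YmxpYyIsImF1dGhvciIsMV0=", the same string the
    -- 'node' field resolver's 'id' argument accepts back.
    encodedNodeId :: BL.ByteString
    encodedNodeId = B64.encode (J.encode exampleNodeId)
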
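The Scenario flag defined above works together with the per-role RoleContext built later in this file: buildContextForRole stores the Frontend context in _rctxDefault and the Backend context in _rctxBackend. A minimal sketch, under the assumption that a caller selects between the two contexts like this (pickContext is a hypothetical name, not something this patch defines):

    import Data.Maybe (fromMaybe)

    -- Hypothetical helper: choose the GQLContext for an incoming request,
    -- falling back to the default context when no backend context exists.
    -- Scenario is defined just above; RoleContext and GQLContext come from
    -- the surrounding codebase.
    pickContext :: Scenario -> RoleContext GQLContext -> GQLContext
    pickContext Frontend ctx = _rctxDefault ctx
    pickContext Backend  ctx = fromMaybe (_rctxDefault ctx) (_rctxBackend ctx)
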
-getTabInfo - :: MonadError QErr m - => TableCache -> QualifiedTable -> m TableInfo -getTabInfo tc t = - onNothing (Map.lookup t tc) $ - throw500 $ "table not found: " <>> t + let allRoles = Set.insert adminRoleName $ + (allTables ^.. folded.tiRolePermInfoMap.to Map.keys.folded) + <> (allActionInfos ^.. folded.aiPermissions.to Map.keys.folded) -isValidObjectName :: (ToTxt a) => QualifiedObject a -> Bool -isValidObjectName = G.isValidName . qualObjectToName + tableFilter = not . isSystemDefined . _tciSystemDefined + functionFilter = not . isSystemDefined . fiSystemDefined -isValidCol :: PGColumnInfo -> Bool -isValidCol = G.isValidName . pgiName + validTables = Map.filter (tableFilter . _tiCoreInfo) allTables + validFunctions = Map.elems $ Map.filter functionFilter allFunctions -isValidRel :: ToTxt a => RelName -> QualifiedObject a -> Bool -isValidRel rn rt = G.isValidName (mkRelName rn) && isValidObjectName rt + allActionInfos = Map.elems allActions + queryRemotesMap = + fmap (map fDefinition . piQuery . rscParsed . fst) allRemoteSchemas + buildFullestDBSchema + :: m ( Parser 'Output (P.ParseT Identity) (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)) + , Maybe (Parser 'Output (P.ParseT Identity) (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue))) + ) + buildFullestDBSchema = do + SQLGenCtx{ stringifyNum } <- askSQLGenCtx + let gqlContext = + (,) + <$> queryWithIntrospection (Set.fromMap $ validTables $> ()) + validFunctions mempty mempty + allActionInfos nonObjectCustomTypes + <*> mutation (Set.fromMap $ validTables $> ()) mempty + allActionInfos nonObjectCustomTypes + flip runReaderT (adminRoleName, validTables, Frontend, QueryContext stringifyNum queryType queryRemotesMap) $ + P.runSchemaT gqlContext -isValidRemoteRel :: RemoteFieldInfo -> Bool -isValidRemoteRel = - G.isValidName . mkRemoteRelationshipName . _rfiName + -- build the admin context so that we can check against name clashes with remotes + adminHasuraContext <- bindA -< buildFullestDBSchema -isValidField :: FieldInfo -> Bool -isValidField = \case - FIColumn colInfo -> isValidCol colInfo - FIRelationship (RelInfo rn _ _ remTab _) -> isValidRel rn remTab - FIComputedField info -> G.isValidName $ mkComputedFieldName $ _cfiName info - FIRemoteRelationship remoteField -> isValidRemoteRel remoteField + queryFieldNames :: [G.Name] <- bindA -< + case P.discardNullability $ P.parserType $ fst adminHasuraContext of + -- It really ought to be this case; anything else is a programming error. + P.TNamed (P.Definition _ _ _ (P.TIObject (P.ObjectInfo rootFields _interfaces))) -> + pure $ fmap P.dName rootFields + _ -> throw500 "We encountered a root query of unexpected GraphQL type. It should be an object type." + let mutationFieldNames :: [G.Name] + mutationFieldNames = + case P.discardNullability . P.parserType <$> snd adminHasuraContext of + Just (P.TNamed def) -> + case P.dInfo def of + -- It really ought to be this case; anything else is a programming error. + P.TIObject (P.ObjectInfo rootFields _interfaces) -> fmap P.dName rootFields + _ -> [] + _ -> [] -upsertable :: [ConstraintName] -> Bool -> Bool -> Bool -upsertable uniqueOrPrimaryCons isUpsertAllowed isAView = - not (null uniqueOrPrimaryCons) && isUpsertAllowed && not isAView - -getValidCols - :: FieldInfoMap FieldInfo -> [PGColumnInfo] -getValidCols = filter isValidCol . getCols - -getValidRels :: FieldInfoMap FieldInfo -> [RelInfo] -getValidRels = filter isValidRel' . 
getRels - where - isValidRel' (RelInfo rn _ _ remTab _) = isValidRel rn remTab - -mkValidConstraints :: [ConstraintName] -> [ConstraintName] -mkValidConstraints = - filter (G.isValidName . G.Name . getConstraintTxt) - -isRelNullable - :: FieldInfoMap FieldInfo -> RelInfo -> Bool -isRelNullable fim ri = isNullable - where - lCols = Map.keys $ riMapping ri - allCols = getValidCols fim - lColInfos = getColInfos lCols allCols - isNullable = any pgiIsNullable lColInfos - -isAggregateField :: G.Name -> Bool -isAggregateField = flip elem (numAggregateOps <> compAggregateOps) - -mkComputedFieldFunctionArgSeq :: Seq.Seq FunctionArg -> ComputedFieldFunctionArgSeq -mkComputedFieldFunctionArgSeq inputArgs = - Seq.fromList $ procFuncArgs inputArgs faName $ - \fa t -> FunctionArgItem (G.Name t) (faName fa) (faHasDefault fa) - -mkMutationTypesAndFieldsRole - :: QualifiedTable - -> Maybe ([PGColumnInfo], RelationInfoMap) - -- ^ insert permission - -> Maybe [SelField] - -- ^ select permission - -> Maybe [PGColumnInfo] - -- ^ update cols - -> Maybe () - -- ^ delete cols - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] - -- ^ constraints - -> Maybe ViewInfo - -> (TypeMap, FieldMap) -mkMutationTypesAndFieldsRole tn insPermM selFldsM updColsM delPermM pkeyCols constraints viM = - (mkTyInfoMap allTypes, fieldMap) - where - - allTypes = relInsInpObjTys <> onConflictTypes <> jsonOpTys - <> mutationTypes <> referencedEnumTypes - - upsertPerm = isJust updColsM - isUpsertable = upsertable constraints upsertPerm $ isJust viM - updatableCols = maybe [] (map pgiName) updColsM - onConflictTypes = mkOnConflictTypes tn constraints updatableCols isUpsertable - jsonOpTys = fromMaybe [] updJSONOpInpObjTysM - relInsInpObjTys = maybe [] (map TIInpObj) $ - mutHelper viIsInsertable relInsInpObjsM - - mutationTypes = catMaybes - [ TIInpObj <$> mutHelper viIsInsertable insInpObjM - , TIInpObj <$> mutHelper viIsUpdatable updSetInpObjM - , TIInpObj <$> mutHelper viIsUpdatable updIncInpObjM - , TIInpObj <$> mutHelper viIsUpdatable primaryKeysInpObjM - , TIObj <$> mutRespObjM - ] - - mutHelper :: (ViewInfo -> Bool) -> Maybe a -> Maybe a - mutHelper f objM = bool Nothing objM $ isMutable f viM - - fieldMap = Map.unions $ catMaybes [insInpObjFldsM, updSetInpObjFldsM] - - -- helper - mkColFldMap ty cols = Map.fromList $ flip map cols $ - \ci -> ((ty, pgiName ci), RFPGColumn ci) - - -- insert input type - insInpObjM = uncurry (mkInsInp tn) <$> insPermM - -- column fields used in insert input object - insInpObjFldsM = (mkColFldMap (mkInsInpTy tn) . 
fst) <$> insPermM - -- relationship input objects - relInsInpObjsM = mkRelInsInps tn isUpsertable <$ insPermM - -- update set input type - updSetInpObjM = mkUpdSetInp tn <$> updColsM - -- update increment input type - updIncInpObjM = mkUpdIncInp tn updColsM - -- update json operator input type - updJSONOpInpObjsM = mkUpdJSONOpInp tn <$> updColsM - updJSONOpInpObjTysM = map TIInpObj <$> updJSONOpInpObjsM - -- fields used in set input object - updSetInpObjFldsM = mkColFldMap (mkUpdSetTy tn) <$> updColsM - - -- primary key columns input object for update_by_pk - primaryKeysInpObjM = guard (isJust selFldsM) *> (mkPKeyColumnsInpObj tn <$> pkeyCols) - - -- mut resp obj - mutRespObjM = - if isMut - then Just $ mkMutRespObj tn $ isJust selFldsM - else Nothing - - isMut = (isJust insPermM || isJust updColsM || isJust delPermM) - && any (`isMutable` viM) [viIsInsertable, viIsUpdatable, viIsDeletable] - - -- the types for all enums that are /referenced/ by this table (not /defined/ by this table; - -- there isn’t actually any need to generate a GraphQL enum type for an enum table if it’s - -- never referenced anywhere else) - referencedEnumTypes = - let allColumnInfos = - (selFldsM ^.. _Just.traverse._SFPGColumn) - <> (insPermM ^. _Just._1) - <> (updColsM ^. _Just) - <> (pkeyCols ^. _Just.pkColumns.to toList) - allEnumReferences = allColumnInfos ^.. traverse.to pgiType._PGColumnEnumReference - in flip map allEnumReferences $ \enumReference@(EnumReference referencedTableName _) -> - let typeName = mkTableEnumType referencedTableName - in TIEnum $ mkHsraEnumTyInfo Nothing typeName (EnumValuesReference enumReference) - --- see Note [Split schema generation (TODO)] -mkTyAggRole - :: QualifiedTable - -> Maybe PGDescription - -- ^ Postgres description - -> Maybe ([PGColumnInfo], RelationInfoMap) - -- ^ insert permission - -> Maybe (Bool, [SelField]) - -- ^ select permission - -> Maybe [PGColumnInfo] - -- ^ update cols - -> Maybe () - -- ^ delete cols - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] - -- ^ constraints - -> Maybe ViewInfo - -> [FunctionInfo] - -- ^ all functions - -> TyAgg -mkTyAggRole tn descM insPermM selPermM updColsM delPermM pkeyCols constraints viM funcs = - let (mutationTypes, mutationFields) = - mkMutationTypesAndFieldsRole tn insPermM selFldsM updColsM delPermM pkeyCols constraints viM - in TyAgg (mkTyInfoMap allTypes <> mutationTypes) - (fieldMap <> mutationFields) - scalars ordByCtx - where - - ordByCtx = fromMaybe Map.empty ordByCtxM - funcInpArgTys = bool [] (map TIInpObj funcArgInpObjs) $ isJust selFldsM - - allTypes = queryTypes <> aggQueryTypes - <> funcInpArgTys <> computedFieldFuncArgsInps - - queryTypes = map TIObj selectObjects <> - catMaybes - [ TIInpObj <$> boolExpInpObjM - , TIInpObj <$> ordByInpObjM - , TIEnum <$> selColInpTyM - ] - aggQueryTypes = map TIObj aggObjs <> map TIInpObj aggOrdByInps - - fieldMap = Map.unions $ catMaybes [boolExpInpObjFldsM , selObjFldsM] - scalars = selByPkScalarSet <> funcArgScalarSet <> computedFieldFuncArgScalars - - selFldsM = snd <$> selPermM - selColNamesM = (map pgiName . 
getPGColumnFields) <$> selFldsM - selColInpTyM = mkSelColumnTy tn <$> selColNamesM - - -- boolexp input type - boolExpInpObjM = case selFldsM of - Just selFlds -> Just $ mkBoolExpInp tn selFlds - -- no select permission - Nothing -> - -- but update/delete is defined - if isJust updColsM || isJust delPermM - then Just $ mkBoolExpInp tn [] - else Nothing - - -- funcargs input type - funcArgInpObjs = flip mapMaybe funcs $ \func -> - mkFuncArgsInp (fiName func) (getInputArgs func) - -- funcArgCtx = Map.unions funcArgCtxs - funcArgScalarSet = funcs ^.. folded.to getInputArgs.folded.to (_qptName.faType) - - -- helper - mkFldMap ty = Map.fromList . concatMap (mkFld ty) - mkFld ty = \case - SFPGColumn ci -> [((ty, pgiName ci), RFPGColumn ci)] - SFRelationship (RelationshipFieldInfo relInfo allowAgg cols permFilter permLimit _ _) -> - let relationshipName = riName relInfo - relFld = ( (ty, mkRelName relationshipName) - , RFRelationship $ RelationshipField relInfo RFKSimple cols permFilter permLimit - ) - aggRelFld = ( (ty, mkAggRelName relationshipName) - , RFRelationship $ RelationshipField relInfo RFKAggregate cols permFilter permLimit - ) - in case riType relInfo of - ObjRel -> [relFld] - ArrRel -> bool [relFld] [relFld, aggRelFld] allowAgg - SFComputedField cf -> pure - ( (ty, mkComputedFieldName $ _cfName cf) - , RFComputedField cf - ) - SFRemoteRelationship remoteField -> pure - ( (ty, G.Name (remoteRelationshipNameToText (_rfiName remoteField))) - , RFRemoteRelationship remoteField + -- This block of code checks that there are no conflicting root field names between remotes. + remotes :: + [ ( RemoteSchemaName + , ParsedIntrospection ) + ] <- (| foldlA' (\okSchemas (newSchemaName, (newSchemaContext, newMetadataObject)) -> do + checkedDuplicates <- (| withRecordInconsistency (do + let (queryOld, mutationOld) = + unzip $ fmap ((\case ParsedIntrospection q m _ -> (q,m)) . snd) okSchemas + let ParsedIntrospection queryNew mutationNew _subscriptionNew + = rscParsed newSchemaContext + -- Check for conflicts between remotes + bindErrorA -< + for_ (duplicates (fmap (P.getName . fDefinition) (queryNew ++ concat queryOld))) $ + \name -> throw400 Unexpected $ "Duplicate remote field " <> squote name + -- Check for conflicts between this remote and the tables + bindErrorA -< + for_ (duplicates (fmap (P.getName . fDefinition) queryNew ++ queryFieldNames)) $ + \name -> throw400 RemoteSchemaConflicts $ "Field cannot be overwritten by remote field " <> squote name + -- Ditto, but for mutations + case mutationNew of + Nothing -> returnA -< () + Just ms -> do + bindErrorA -< + for_ (duplicates (fmap (P.getName . fDefinition) (ms ++ concat (catMaybes mutationOld)))) $ + \name -> throw400 Unexpected $ "Duplicate remote field " <> squote name + -- Ditto, but for mutations + bindErrorA -< + for_ (duplicates (fmap (P.getName . fDefinition) ms ++ mutationFieldNames)) $ + \name -> throw400 Unexpected $ "Field cannot be overwritten by remote field " <> squote name + -- No need to check subscriptions as these are not supported + returnA -< ()) + |) newMetadataObject + case checkedDuplicates of + Nothing -> returnA -< okSchemas + Just _ -> returnA -< (newSchemaName, rscParsed newSchemaContext):okSchemas + ) |) [] (Map.toList allRemoteSchemas) - -- the fields used in bool exp - boolExpInpObjFldsM = mkFldMap (mkBoolExpTy tn) <$> selFldsM + let unauthenticatedContext :: m GQLContext + unauthenticatedContext = do + let gqlContext = GQLContext . 
finalizeParser <$> + unauthenticatedQueryWithIntrospection queryRemotes mutationRemotes + halfContext <- P.runSchemaT gqlContext + return $ halfContext $ finalizeParser <$> unauthenticatedMutation mutationRemotes - -- table obj - selectObjects = case selPermM of - Just (_, selFlds) -> - [ mkTableObj tn descM selFlds + -- | The 'query' type of the remotes. TODO: also expose mutation + -- remotes. NOT TODO: subscriptions, as we do not yet aim to support + -- these. + queryRemotes = concatMap (piQuery . snd) remotes + mutationRemotes = concatMap (concat . piMutation . snd) remotes + queryHasuraOrRelay = case queryType of + QueryHasura -> queryWithIntrospection (Set.fromMap $ validTables $> ()) + validFunctions queryRemotes mutationRemotes + allActionInfos nonObjectCustomTypes + QueryRelay -> relayWithIntrospection (Set.fromMap $ validTables $> ()) validFunctions + + buildContextForRoleAndScenario :: RoleName -> Scenario -> m GQLContext + buildContextForRoleAndScenario roleName scenario = do + SQLGenCtx{ stringifyNum } <- askSQLGenCtx + let gqlContext = GQLContext + <$> (finalizeParser <$> queryHasuraOrRelay) + <*> (fmap finalizeParser <$> mutation (Set.fromList $ Map.keys validTables) mutationRemotes + allActionInfos nonObjectCustomTypes) + flip runReaderT (roleName, validTables, scenario, QueryContext stringifyNum queryType queryRemotesMap) $ + P.runSchemaT gqlContext + + buildContextForRole :: RoleName -> m (RoleContext GQLContext) + buildContextForRole roleName = do + frontend <- buildContextForRoleAndScenario roleName Frontend + backend <- buildContextForRoleAndScenario roleName Backend + return $ RoleContext frontend $ Just backend + + finalizeParser :: Parser 'Output (P.ParseT Identity) a -> ParserFn a + finalizeParser parser = runIdentity . P.runParseT . P.runParser parser + + -- Here, finally the body starts. + + roleContexts <- bindA -< (Set.toMap allRoles & Map.traverseWithKey \roleName () -> + buildContextForRole roleName) + unauthenticated <- bindA -< unauthenticatedContext + returnA -< (roleContexts, unauthenticated) + +-- | Generate all the field parsers for query-type GraphQL requests. We don't +-- actually collect these into a @Parser@ using @selectionSet@ so that we can +-- insert the introspection before doing so. +query' + :: forall m n r + . ( MonadSchema n m + , MonadTableInfo r m + , MonadRole r m + , Has QueryContext r + ) + => HashSet QualifiedTable + -> [FunctionInfo] + -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)] + -> [ActionInfo] + -> NonObjectTypeMap + -> m [P.FieldParser n (QueryRootField UnpreparedValue)] +query' allTables allFunctions allRemotes allActions nonObjectCustomTypes = do + tableSelectExpParsers <- for (toList allTables) \table -> do + selectPerms <- tableSelectPermissions table + customRootFields <- _tcCustomRootFields . _tciCustomConfig . _tiCoreInfo <$> askTableInfo table + for selectPerms \perms -> do + displayName <- qualifiedObjectToName table + let fieldsDesc = G.Description $ "fetch data from the table: " <>> table + aggName = displayName <> $$(G.litName "_aggregate") + aggDesc = G.Description $ "fetch aggregated fields from the table: " <>> table + pkName = displayName <> $$(G.litName "_by_pk") + pkDesc = G.Description $ "fetch data from the table: " <> table <<> " using primary key columns" + catMaybes <$> sequenceA + [ requiredFieldParser (RFDB . QDBSimple) $ selectTable table (fromMaybe displayName $ _tcrfSelect customRootFields) (Just fieldsDesc) perms + , mapMaybeFieldParser (RFDB . 
QDBPrimaryKey) $ selectTableByPk table (fromMaybe pkName $ _tcrfSelectByPk customRootFields) (Just pkDesc) perms + , mapMaybeFieldParser (RFDB . QDBAggregation) $ selectTableAggregate table (fromMaybe aggName $ _tcrfSelectAggregate customRootFields) (Just aggDesc) perms ] - Nothing -> [] - - -- aggregate objs and order by inputs - (aggObjs, aggOrdByInps) = case selPermM of - Just (True, selFlds) -> - let cols = getPGColumnFields selFlds - numCols = onlyNumCols cols - compCols = onlyComparableCols cols - objs = [ mkTableAggObj tn - , mkTableAggregateFieldsObj tn (numCols, numAggregateOps) (compCols, compAggregateOps) - ] <> mkColAggregateFieldsObjs selFlds - ordByInps = mkTabAggOrdByInpObj tn (numCols, numAggregateOps) (compCols, compAggregateOps) - : mkTabAggregateOpOrdByInpObjs tn (numCols, numAggregateOps) (compCols, compAggregateOps) - in (objs, ordByInps) - _ -> ([], []) - - getNumericCols = onlyNumCols . getPGColumnFields - getComparableCols = onlyComparableCols . getPGColumnFields - onlyFloat = const $ mkScalarTy PGFloat - - mkTypeMaker "sum" = mkColumnType - mkTypeMaker _ = onlyFloat - - mkColAggregateFieldsObjs flds = - let numCols = getNumericCols flds - compCols = getComparableCols flds - mkNumObjFld n = mkTableColAggregateFieldsObj tn n (mkTypeMaker n) numCols - mkCompObjFld n = mkTableColAggregateFieldsObj tn n mkColumnType compCols - numFldsObjs = bool (map mkNumObjFld numAggregateOps) [] $ null numCols - compFldsObjs = bool (map mkCompObjFld compAggregateOps) [] $ null compCols - in numFldsObjs <> compFldsObjs - -- the fields used in table object - selObjFldsM = mkFldMap (mkTableTy tn) <$> selFldsM - -- the scalar set for table_by_pk arguments - selByPkScalarSet = pkeyCols ^.. folded.to _pkColumns.folded.to pgiType._PGColumnScalar - - ordByInpCtxM = mkOrdByInpObj tn <$> selFldsM - (ordByInpObjM, ordByCtxM) = case ordByInpCtxM of - Just (a, b) -> (Just a, Just b) - Nothing -> (Nothing, Nothing) - - -- computed fields' function args input objects and scalar types - mkComputedFieldRequiredTypes computedFieldInfo = - let ComputedFieldFunction qf inputArgs _ _ _ = _cfFunction computedFieldInfo - scalarArgs = map (_qptName . faType) $ toList inputArgs - in (, scalarArgs) <$> mkFuncArgsInp qf inputArgs - - computedFieldReqTypes = catMaybes $ - maybe [] (map mkComputedFieldRequiredTypes . getComputedFields) selFldsM - - computedFieldFuncArgsInps = map (TIInpObj . fst) computedFieldReqTypes - computedFieldFuncArgScalars = Set.fromList $ concatMap snd computedFieldReqTypes - -makeFieldMap :: [(a, ObjFldInfo)] -> Map.HashMap G.Name (a, ObjFldInfo) -makeFieldMap = mapFromL (_fiName . 
snd) - --- see Note [Split schema generation (TODO)] -getMutationRootFieldsRole - :: QualifiedTable - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] - -> FieldInfoMap FieldInfo - -> Maybe ([T.Text], Bool) -- insert perm - -> Maybe (AnnBoolExpPartialSQL, Maybe Int, [T.Text], Bool) -- select filter - -> Maybe ([PGColumnInfo], PreSetColsPartial, AnnBoolExpPartialSQL, Maybe AnnBoolExpPartialSQL, [T.Text]) -- update filter - -> Maybe (AnnBoolExpPartialSQL, [T.Text]) -- delete filter - -> Maybe ViewInfo - -> TableConfig -- custom config - -> MutationRootFieldMap -getMutationRootFieldsRole tn primaryKey constraints fields insM - selM updM delM viM tableConfig = - makeFieldMap $ catMaybes - [ mutHelper viIsInsertable getInsDet insM - , onlyIfSelectPermExist $ mutHelper viIsInsertable getInsOneDet insM - , mutHelper viIsUpdatable getUpdDet updM - , onlyIfSelectPermExist $ mutHelper viIsUpdatable getUpdByPkDet $ (,) <$> updM <*> primaryKey - , mutHelper viIsDeletable getDelDet delM - , onlyIfSelectPermExist $ mutHelper viIsDeletable getDelByPkDet $ (,) <$> delM <*> primaryKey + functionSelectExpParsers <- for allFunctions \function -> do + let targetTable = fiReturnType function + functionName = fiName function + selectPerms <- tableSelectPermissions targetTable + for selectPerms \perms -> do + displayName <- qualifiedObjectToName functionName + let functionDesc = G.Description $ "execute function " <> functionName <<> " which returns " <>> targetTable + aggName = displayName <> $$(G.litName "_aggregate") + aggDesc = G.Description $ "execute function " <> functionName <<> " and query aggregates on result of table type " <>> targetTable + catMaybes <$> sequenceA + [ requiredFieldParser (RFDB . QDBSimple) $ selectFunction function displayName (Just functionDesc) perms + , mapMaybeFieldParser (RFDB . QDBAggregation) $ selectFunctionAggregate function aggName (Just aggDesc) perms ] + actionParsers <- for allActions $ \actionInfo -> + case _adType (_aiDefinition actionInfo) of + ActionMutation ActionSynchronous -> pure Nothing + ActionMutation ActionAsynchronous -> + fmap (fmap (RFAction . AQAsync)) <$> actionAsyncQuery actionInfo + ActionQuery -> + fmap (fmap (RFAction . AQQuery)) <$> actionExecute nonObjectCustomTypes actionInfo + pure $ (concat . catMaybes) (tableSelectExpParsers <> functionSelectExpParsers <> toRemoteFieldParser allRemotes) + <> catMaybes actionParsers where - customRootFields = _tcCustomRootFields tableConfig - colGNameMap = mkPGColGNameMap $ getCols fields + requiredFieldParser :: (a -> b) -> m (P.FieldParser n a) -> m (Maybe (P.FieldParser n b)) + requiredFieldParser f = fmap $ Just . fmap f - mutHelper :: (ViewInfo -> Bool) -> (a -> b) -> Maybe a -> Maybe b - mutHelper f getDet mutM = - bool Nothing (getDet <$> mutM) $ isMutable f viM + mapMaybeFieldParser :: (a -> b) -> m (Maybe (P.FieldParser n a)) -> m (Maybe (P.FieldParser n b)) + mapMaybeFieldParser f = fmap $ fmap $ fmap f - onlyIfSelectPermExist v = guard (isJust selM) *> v + toRemoteFieldParser p = [Just $ fmap (fmap RFRemote) p] - getCustomNameWith f = f customRootFields - - insCustName = getCustomNameWith _tcrfInsert - getInsDet (hdrs, upsertPerm) = - let isUpsertable = upsertable constraints upsertPerm $ isJust viM - in ( MCInsert $ InsOpCtx tn $ hdrs `union` maybe [] (^. 
_5) updM - , mkInsMutFld insCustName tn isUpsertable - ) - - insOneCustName = getCustomNameWith _tcrfInsertOne - getInsOneDet (hdrs, upsertPerm) = - let isUpsertable = upsertable constraints upsertPerm $ isJust viM - in ( MCInsertOne $ InsOpCtx tn $ hdrs `union` maybe [] (^. _5) updM - , mkInsertOneMutationField insOneCustName tn isUpsertable - ) - - updCustName = getCustomNameWith _tcrfUpdate - getUpdDet (updCols, preSetCols, updFltr, updCheck, hdrs) = - ( MCUpdate $ UpdOpCtx tn hdrs colGNameMap updFltr updCheck preSetCols - , mkUpdMutFld updCustName tn updCols - ) - - updByPkCustName = getCustomNameWith _tcrfUpdateByPk - getUpdByPkDet ((updCols, preSetCols, updFltr, updCheck, hdrs), pKey) = - ( MCUpdateByPk $ UpdOpCtx tn hdrs colGNameMap updFltr updCheck preSetCols - , mkUpdateByPkMutationField updByPkCustName tn updCols pKey - ) - - delCustName = getCustomNameWith _tcrfDelete - getDelDet (delFltr, hdrs) = - ( MCDelete $ DelOpCtx tn hdrs colGNameMap delFltr - , mkDelMutFld delCustName tn - ) - delByPkCustName = getCustomNameWith _tcrfDeleteByPk - getDelByPkDet ((delFltr, hdrs), pKey) = - ( MCDeleteByPk $ DelOpCtx tn hdrs colGNameMap delFltr - , mkDeleteByPkMutationField delByPkCustName tn pKey - ) - --- see Note [Split schema generation (TODO)] -getQueryRootFieldsRole - :: QualifiedTable - -> Maybe (PrimaryKey PGColumnInfo) - -> FieldInfoMap FieldInfo +-- | Similar to @query'@ but for Relay. +relayQuery' + :: forall m n r + . ( MonadSchema n m + , MonadTableInfo r m + , MonadRole r m + , Has QueryContext r + ) + => HashSet QualifiedTable -> [FunctionInfo] - -> Maybe (AnnBoolExpPartialSQL, Maybe Int, [T.Text], Bool) -- select filter - -> TableConfig -- custom config - -> QueryRootFieldMap -getQueryRootFieldsRole tn primaryKey fields funcs selM tableConfig = - makeFieldMap $ - funcQueries - <> funcAggQueries - <> catMaybes - [ getSelDet <$> selM - , getSelAggDet selM - , getPKeySelDet <$> selM <*> primaryKey - ] - where - customRootFields = _tcCustomRootFields tableConfig - colGNameMap = mkPGColGNameMap $ getCols fields + -> m [P.FieldParser n (QueryRootField UnpreparedValue)] +relayQuery' allTables allFunctions = do + tableConnectionSelectParsers <- + for (toList allTables) $ \table -> runMaybeT do + pkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns) + <$> askTableInfo table + selectPerms <- MaybeT $ tableSelectPermissions table + displayName <- qualifiedObjectToName table + let fieldName = displayName <> $$(G.litName "_connection") + fieldDesc = Just $ G.Description $ "fetch data from the table: " <>> table + lift $ selectTableConnection table fieldName fieldDesc pkeyColumns selectPerms - funcQueries = maybe [] getFuncQueryFlds selM - funcAggQueries = maybe [] getFuncAggQueryFlds selM + functionConnectionSelectParsers <- + for allFunctions $ \function -> runMaybeT do + let returnTable = fiReturnType function + functionName = fiName function + pkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns) + <$> askTableInfo returnTable + selectPerms <- MaybeT $ tableSelectPermissions returnTable + displayName <- qualifiedObjectToName functionName + let fieldName = displayName <> $$(G.litName "_connection") + fieldDesc = Just $ G.Description $ "execute function " <> functionName + <<> " which returns " <>> returnTable + lift $ selectFunctionConnection function fieldName fieldDesc pkeyColumns selectPerms - getCustomNameWith f = f customRootFields + pure $ map ((RFDB . 
QDBConnection) <$>) $ catMaybes $ + tableConnectionSelectParsers <> functionConnectionSelectParsers - selCustName = getCustomNameWith _tcrfSelect - getSelDet (selFltr, pLimit, hdrs, _) = - selFldHelper QCSelect (mkSelFld selCustName) selFltr pLimit hdrs - - selAggCustName = getCustomNameWith _tcrfSelectAggregate - getSelAggDet (Just (selFltr, pLimit, hdrs, True)) = - Just $ selFldHelper QCSelectAgg (mkAggSelFld selAggCustName) - selFltr pLimit hdrs - getSelAggDet _ = Nothing - - selFldHelper f g pFltr pLimit hdrs = - ( f $ SelOpCtx tn hdrs colGNameMap pFltr pLimit - , g tn - ) - - selByPkCustName = getCustomNameWith _tcrfSelectByPk - getPKeySelDet (selFltr, _, hdrs, _) key = - let keyColumns = toList $ _pkColumns key - in ( QCSelectPkey . SelPkOpCtx tn hdrs selFltr $ mkPGColGNameMap keyColumns - , mkSelFldPKey selByPkCustName tn keyColumns - ) - - getFuncQueryFlds (selFltr, pLimit, hdrs, _) = - funcFldHelper QCFuncQuery mkFuncQueryFld selFltr pLimit hdrs - - getFuncAggQueryFlds (selFltr, pLimit, hdrs, True) = - funcFldHelper QCFuncAggQuery mkFuncAggQueryFld selFltr pLimit hdrs - getFuncAggQueryFlds _ = [] - - funcFldHelper f g pFltr pLimit hdrs = - flip map funcs $ \fi -> - ( f $ FuncQOpCtx (fiName fi) (mkFuncArgItemSeq fi) hdrs colGNameMap pFltr pLimit - , g fi $ fiDescription fi - ) - -getSelPermission :: TableInfo -> RoleName -> Maybe SelPermInfo -getSelPermission tabInfo roleName = - Map.lookup roleName (_tiRolePermInfoMap tabInfo) >>= _permSel - -getSelPerm - :: (MonadError QErr m) - => TableCache - -- all the fields of a table - -> FieldInfoMap FieldInfo - -- role and its permission - -> RoleName -> SelPermInfo - -> m (Bool, [SelField]) -getSelPerm tableCache fields roleName selPermInfo = do - selFlds <- fmap catMaybes $ forM (filter isValidField $ Map.elems fields) $ \case - FIColumn pgColInfo -> - return $ fmap SFPGColumn $ bool Nothing (Just pgColInfo) $ - Set.member (pgiColumn pgColInfo) $ spiCols selPermInfo - FIRelationship relInfo -> do - remTableInfo <- getTabInfo tableCache $ riRTable relInfo - let remTableSelPermM = getSelPermission remTableInfo roleName - remTableCoreInfo = _tiCoreInfo remTableInfo - remTableFlds = _tciFieldInfoMap remTableCoreInfo - remTableColGNameMap = - mkPGColGNameMap $ getValidCols remTableFlds - return $ flip fmap remTableSelPermM $ - \rmSelPermM -> SFRelationship RelationshipFieldInfo - { _rfiInfo = relInfo - , _rfiAllowAgg = spiAllowAgg rmSelPermM - , _rfiColumns = remTableColGNameMap - , _rfiPermFilter = spiFilter rmSelPermM - , _rfiPermLimit = spiLimit rmSelPermM - , _rfiPrimaryKeyColumns = _pkColumns <$> _tciPrimaryKey remTableCoreInfo - , _rfiIsNullable = isRelNullable fields relInfo - } - FIComputedField info -> do - let ComputedFieldInfo name function returnTy _ = info - inputArgSeq = mkComputedFieldFunctionArgSeq $ _cffInputArgs function - fmap (SFComputedField . 
ComputedField name function inputArgSeq) <$> - case returnTy of - CFRScalar scalarTy -> pure $ Just $ CFTScalar scalarTy - CFRSetofTable retTable -> do - retTableInfo <- getTabInfo tableCache retTable - let retTableSelPermM = getSelPermission retTableInfo roleName - retTableFlds = _tciFieldInfoMap $ _tiCoreInfo retTableInfo - retTableColGNameMap = - mkPGColGNameMap $ getValidCols retTableFlds - pure $ flip fmap retTableSelPermM $ - \selPerm -> CFTTable ComputedFieldTable - { _cftTable = retTable - , _cftCols = retTableColGNameMap - , _cftPermFilter = spiFilter selPerm - , _cftPermLimit = spiLimit selPerm - } - -- TODO: Derive permissions for remote relationships - FIRemoteRelationship remoteField -> pure $ Just (SFRemoteRelationship remoteField) - - return (spiAllowAgg selPermInfo, selFlds) - -mkInsCtx - :: MonadError QErr m - => RoleName - -> TableCache - -> FieldInfoMap FieldInfo - -> InsPermInfo - -> Maybe UpdPermInfo - -> m InsCtx -mkInsCtx role tableCache fields insPermInfo updPermM = do - relTupsM <- forM rels $ \relInfo -> do - let remoteTable = riRTable relInfo - relName = riName relInfo - remoteTableInfo <- getTabInfo tableCache remoteTable - let insPermM = getInsPerm remoteTableInfo role - viewInfoM = _tciViewInfo $ _tiCoreInfo remoteTableInfo - return $ bool Nothing (Just (relName, relInfo)) $ - isInsertable insPermM viewInfoM && isValidRel relName remoteTable - - let relInfoMap = Map.fromList $ catMaybes relTupsM - return $ InsCtx gNamePGColMap checkCond setCols relInfoMap updPermForIns - where - gNamePGColMap = mkPGColGNameMap allCols - allCols = getCols fields - rels = getValidRels fields - setCols = ipiSet insPermInfo - checkCond = ipiCheck insPermInfo - updPermForIns = mkUpdPermForIns <$> updPermM - mkUpdPermForIns upi = UpdPermForIns (toList $ upiCols upi) (upiCheck upi) - (upiFilter upi) (upiSet upi) - - isInsertable Nothing _ = False - isInsertable (Just _) viewInfoM = isMutable viIsInsertable viewInfoM - -mkAdminInsCtx - :: MonadError QErr m - => TableCache - -> FieldInfoMap FieldInfo - -> m InsCtx -mkAdminInsCtx tc fields = do - relTupsM <- forM rels $ \relInfo -> do - let remoteTable = riRTable relInfo - relName = riName relInfo - remoteTableInfo <- getTabInfo tc remoteTable - let viewInfoM = _tciViewInfo $ _tiCoreInfo remoteTableInfo - return $ bool Nothing (Just (relName, relInfo)) $ - isMutable viIsInsertable viewInfoM && isValidRel relName remoteTable - - let relInfoMap = Map.fromList $ catMaybes relTupsM - updPerm = UpdPermForIns updCols Nothing noFilter Map.empty - - return $ InsCtx colGNameMap noFilter Map.empty relInfoMap (Just updPerm) - where - allCols = getCols fields - colGNameMap = mkPGColGNameMap allCols - updCols = map pgiColumn allCols - rels = getValidRels fields - -mkAdminSelFlds - :: MonadError QErr m - => FieldInfoMap FieldInfo - -> TableCache - -> m [SelField] -mkAdminSelFlds fields tableCache = - forM (filter isValidField $ Map.elems fields) $ \case - FIColumn info -> pure $ SFPGColumn info - - FIRelationship info -> do - let remoteTable = riRTable info - remoteTableInfo <- _tiCoreInfo <$> getTabInfo tableCache remoteTable - let remoteTableFlds = _tciFieldInfoMap remoteTableInfo - remoteTableColGNameMap = - mkPGColGNameMap $ getValidCols remoteTableFlds - return $ SFRelationship RelationshipFieldInfo - { _rfiInfo = info - , _rfiAllowAgg = True - , _rfiColumns = remoteTableColGNameMap - , _rfiPermFilter = noFilter - , _rfiPermLimit = Nothing - , _rfiPrimaryKeyColumns = _pkColumns <$> _tciPrimaryKey remoteTableInfo - , _rfiIsNullable = 
isRelNullable fields info - } - - FIComputedField info -> do - let ComputedFieldInfo name function returnTy _ = info - inputArgSeq = mkComputedFieldFunctionArgSeq $ _cffInputArgs function - (SFComputedField . ComputedField name function inputArgSeq) <$> - case returnTy of - CFRScalar scalarTy -> pure $ CFTScalar scalarTy - CFRSetofTable retTable -> do - retTableInfo <- _tiCoreInfo <$> getTabInfo tableCache retTable - let retTableFlds = _tciFieldInfoMap retTableInfo - retTableColGNameMap = - mkPGColGNameMap $ getValidCols retTableFlds - pure $ CFTTable ComputedFieldTable - { _cftTable = retTable - , _cftCols = retTableColGNameMap - , _cftPermFilter = noFilter - , _cftPermLimit = Nothing - } - - FIRemoteRelationship info -> pure $ SFRemoteRelationship info - -mkGCtxRole - :: (MonadError QErr m) - => TableCache - -> QualifiedTable - -> Maybe PGDescription - -> FieldInfoMap FieldInfo - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] +-- | Parse query-type GraphQL requests without introspection +query + :: forall m n r + . (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => G.Name + -> HashSet QualifiedTable -> [FunctionInfo] - -> Maybe ViewInfo - -> TableConfig - -> RoleName - -> RolePermInfo - -> m (TyAgg, RootFields, InsCtxMap) -mkGCtxRole tableCache tn descM fields primaryKey constraints funcs viM tabConfigM role permInfo = do - selPermM <- mapM (getSelPerm tableCache fields role) $ _permSel permInfo - tabInsInfoM <- forM (_permIns permInfo) $ \ipi -> do - ctx <- mkInsCtx role tableCache fields ipi $ _permUpd permInfo - let permCols = flip getColInfos allCols $ Set.toList $ ipiCols ipi - return (ctx, (permCols, icRelations ctx)) - let insPermM = snd <$> tabInsInfoM - insCtxM = fst <$> tabInsInfoM - updColsM = filterColumnFields . upiCols <$> _permUpd permInfo - tyAgg = mkTyAggRole tn descM insPermM selPermM updColsM - (void $ _permDel permInfo) primaryKey constraints viM funcs - rootFlds = getRootFldsRole tn primaryKey constraints fields funcs - viM permInfo tabConfigM - insCtxMap = maybe Map.empty (Map.singleton tn) insCtxM - return (tyAgg, rootFlds, insCtxMap) - where - allCols = getCols fields - cols = getValidCols fields - filterColumnFields allowedSet = - filter ((`Set.member` allowedSet) . pgiColumn) cols + -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)] + -> [ActionInfo] + -> NonObjectTypeMap + -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue))) +query name allTables allFunctions allRemotes allActions nonObjectCustomTypes = do + queryFieldsParser <- query' allTables allFunctions allRemotes allActions nonObjectCustomTypes + pure $ P.selectionSet name Nothing queryFieldsParser + <&> fmap (P.handleTypename (RFRaw . J.String . G.unName)) -getRootFldsRole - :: QualifiedTable - -> Maybe (PrimaryKey PGColumnInfo) - -> [ConstraintName] - -> FieldInfoMap FieldInfo +subscription + :: forall m n r + . 
-getRootFldsRole
-  :: QualifiedTable
-  -> Maybe (PrimaryKey PGColumnInfo)
-  -> [ConstraintName]
-  -> FieldInfoMap FieldInfo
+subscription
+  :: forall m n r
+   . (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
+  => HashSet QualifiedTable
   -> [FunctionInfo]
-  -> Maybe ViewInfo
-  -> RolePermInfo
-  -> TableConfig
-  -> RootFields
-getRootFldsRole tn pCols constraints fields funcs viM (RolePermInfo insM selM updM delM) tableConfig =
-  let queryFields = getQueryRootFieldsRole tn pCols fields funcs (mkSel <$> selM) tableConfig
-      mutationFields = getMutationRootFieldsRole tn pCols constraints fields
-                       (mkIns <$> insM) (mkSel <$> selM)
-                       (mkUpd <$> updM) (mkDel <$> delM) viM tableConfig
-  in RootFields queryFields mutationFields
-  where
-    mkIns i = (ipiRequiredHeaders i, isJust updM)
-    mkSel s = ( spiFilter s, spiLimit s
-              , spiRequiredHeaders s, spiAllowAgg s
-              )
-    mkUpd u = ( flip getColInfos allCols $ Set.toList $ upiCols u
-              , upiSet u
-              , upiFilter u
-              , upiCheck u
-              , upiRequiredHeaders u
-              )
-    mkDel d = (dpiFilter d, dpiRequiredHeaders d)
+  -> [ActionInfo]
+  -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)))
+subscription allTables allFunctions asyncActions =
+  query $$(G.litName "subscription_root") allTables allFunctions [] asyncActions mempty
-    allCols = getCols fields
+queryRootFromFields
+  :: forall n
+   . MonadParse n
+  => [P.FieldParser n (QueryRootField UnpreparedValue)]
+  -> Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue))
+queryRootFromFields fps =
+  P.selectionSet $$(G.litName "query_root") Nothing fps
+    <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
-
-mkGCtxMapTable
-  :: (MonadError QErr m)
-  => TableCache
-  -> FunctionCache
-  -> TableInfo
-  -> m (Map.HashMap RoleName TableSchemaCtx)
-mkGCtxMapTable tableCache funcCache tabInfo = do
-  m <- flip Map.traverseWithKey rolePermsMap $ \roleName rolePerm ->
-    for rolePerm $ mkGCtxRole tableCache tn descM fields primaryKey validConstraints
-                   tabFuncs viewInfo customConfig roleName
-  adminInsCtx <- mkAdminInsCtx tableCache fields
-  adminSelFlds <- mkAdminSelFlds fields tableCache
-  let adminCtx = mkTyAggRole tn descM (Just (cols, icRelations adminInsCtx))
-                 (Just (True, adminSelFlds)) (Just cols) (Just ())
-                 primaryKey validConstraints viewInfo tabFuncs
-      adminInsCtxMap = Map.singleton tn adminInsCtx
-      adminTableCtx = RoleContext (adminCtx, adminRootFlds, adminInsCtxMap) Nothing
-  pure $ Map.insert adminRoleName adminTableCtx m
-  where
-    TableInfo coreInfo rolePerms _ = tabInfo
-    TableCoreInfo tn descM _ fields primaryKey _ _ viewInfo _ customConfig = coreInfo
-    validConstraints = mkValidConstraints $ map _cName (tciUniqueOrPrimaryKeyConstraints coreInfo)
-    cols = getValidCols fields
-    tabFuncs = filter (isValidObjectName . fiName) $ getFuncsOfTable tn funcCache
+emptyIntrospection
+  :: forall m n
+   . (MonadSchema n m, MonadError QErr m)
+  => m [P.FieldParser n (QueryRootField UnpreparedValue)]
+emptyIntrospection = do
+  let emptyQueryP = queryRootFromFields @n []
+  introspectionTypes <- collectTypes (P.parserType emptyQueryP)
+  let introspectionSchema = Schema
+        { sDescription = Nothing
+        , sTypes = introspectionTypes
+        , sQueryType = P.parserType emptyQueryP
+        , sMutationType = Nothing
+        , sSubscriptionType = Nothing
+        , sDirectives = mempty
+        }
+  return $ fmap (fmap RFRaw) [schema introspectionSchema, typeIntrospection introspectionSchema]
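+-- For illustration: the two parsers returned here are the '__schema' and
+-- '__type' introspection fields, evaluated against a schema that contains
+-- nothing but an empty query root, so even a role that can see no fields
+-- still gets an answer to
+--
+-- > query {
+-- >   __schema { queryType { name } }
+-- > }
+--
+-- which reports "query_root".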
-    adminRootFlds =
-      let insertPermDetails = Just ([], True)
-          selectPermDetails = Just (noFilter, Nothing, [], True)
-          updatePermDetails = Just (cols, mempty, noFilter, Nothing, [])
-          deletePermDetails = Just (noFilter, [])
+collectTypes
+  :: forall m a
+   . (MonadError QErr m, P.HasTypeDefinitions a)
+  => a
+  -> m (HashMap G.Name (P.Definition P.SomeTypeInfo))
+collectTypes x = case P.collectTypeDefinitions x of
+  Left (P.ConflictingDefinitions type1 _) -> throw500 $
+    "found conflicting definitions for " <> P.getName type1
+    <<> " when collecting types from the schema"
+  Right tps -> pure tps
-
-          queryFields = getQueryRootFieldsRole tn primaryKey fields tabFuncs
-                        selectPermDetails customConfig
-          mutationFields = getMutationRootFieldsRole tn primaryKey
-                           validConstraints fields insertPermDetails
-                           selectPermDetails updatePermDetails
-                           deletePermDetails viewInfo customConfig
-      in RootFields queryFields mutationFields
+queryWithIntrospectionHelper
+  :: (MonadSchema n m, MonadError QErr m)
+  => [P.FieldParser n (QueryRootField UnpreparedValue)]
+  -> Maybe (Parser 'Output n (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue)))
+  -> Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue))
+  -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)))
+queryWithIntrospectionHelper basicQueryFP mutationP subscriptionP = do
+  let
+    basicQueryP = queryRootFromFields basicQueryFP
+  emptyIntro <- emptyIntrospection
+  allBasicTypes <- collectTypes $
+    [ P.parserType basicQueryP
+    , P.parserType subscriptionP
+    ]
+    ++ maybeToList (P.parserType <$> mutationP)
+  allIntrospectionTypes <- collectTypes (P.parserType (queryRootFromFields emptyIntro))
+  let allTypes = Map.unions
+        [ allBasicTypes
+        , Map.filterWithKey (\name _info -> name /= $$(G.litName "query_root")) allIntrospectionTypes
+        ]
+      partialSchema = Schema
+        { sDescription = Nothing
+        , sTypes = allTypes
+        , sQueryType = P.parserType basicQueryP
+        , sMutationType = P.parserType <$> mutationP
+        , sSubscriptionType = Just $ P.parserType subscriptionP
+        , sDirectives = defaultDirectives
+        }
+  let partialQueryFields =
+        basicQueryFP ++ (fmap RFRaw <$> [schema partialSchema, typeIntrospection partialSchema])
+  pure $ P.selectionSet $$(G.litName "query_root") Nothing partialQueryFields
+    <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
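+-- For illustration: the 'Map.filterWithKey' above drops the placeholder
+-- "query_root" that 'emptyIntrospection' built from an empty field list, so
+-- the real root type is the one introspection reports; e.g.
+--
+-- > query {
+-- >   __type(name: "query_root") { fields { name } }
+-- > }
+--
+-- lists the role's actual root fields rather than an empty type.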
-    rolePermsMap :: Map.HashMap RoleName (RoleContext RolePermInfo)
-    rolePermsMap = flip Map.map rolePerms $ \permInfo ->
-      case _permIns permInfo of
-        Nothing -> RoleContext permInfo Nothing
-        Just insPerm ->
-          if ipiBackendOnly insPerm then
-            -- Remove insert permission from 'default' context and keep it in 'backend' context.
-            RoleContext { _rctxDefault = permInfo{_permIns = Nothing}
-                        , _rctxBackend = Just permInfo
-                        }
-          -- Remove insert permission from 'backend' context and keep it in 'default' context.
-          else RoleContext { _rctxDefault = permInfo
-                           , _rctxBackend = Just permInfo{_permIns = Nothing}
-                           }
+-- | Prepare the parser for query-type GraphQL requests, but with introspection
+-- for queries, mutations and subscriptions built in.
+queryWithIntrospection
+  :: forall m n r
+   . ( MonadSchema n m
+     , MonadTableInfo r m
+     , MonadRole r m
+     , Has QueryContext r
+     , Has Scenario r
+     )
+  => HashSet QualifiedTable
+  -> [FunctionInfo]
+  -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> [ActionInfo]
+  -> NonObjectTypeMap
+  -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)))
+queryWithIntrospection allTables allFunctions queryRemotes mutationRemotes allActions nonObjectCustomTypes = do
+  basicQueryFP <- query' allTables allFunctions queryRemotes allActions nonObjectCustomTypes
+  mutationP <- mutation allTables mutationRemotes allActions nonObjectCustomTypes
+  subscriptionP <- subscription allTables allFunctions $
+    filter (has (aiDefinition.adType._ActionMutation._ActionAsynchronous)) allActions
+  queryWithIntrospectionHelper basicQueryFP mutationP subscriptionP
-
-noFilter :: AnnBoolExpPartialSQL
-noFilter = annBoolExpTrue
+
+relayWithIntrospection
+  :: forall m n r
+   . ( MonadSchema n m
+     , MonadTableInfo r m
+     , MonadRole r m
+     , Has QueryContext r
+     , Has Scenario r
+     )
+  => HashSet QualifiedTable
+  -> [FunctionInfo]
+  -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)))
+relayWithIntrospection allTables allFunctions = do
+  nodeFP <- fmap (RFDB . QDBPrimaryKey) <$> nodeField
+  basicQueryFP <- relayQuery' allTables allFunctions
+  mutationP <- mutation allTables [] [] mempty
+  let relayQueryFP = nodeFP:basicQueryFP
+      subscriptionP = P.selectionSet $$(G.litName "subscription_root") Nothing relayQueryFP
+        <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
+      basicQueryP = queryRootFromFields relayQueryFP
+  emptyIntro <- emptyIntrospection
+  allBasicTypes <- collectTypes $
+    [ P.parserType basicQueryP
+    , P.parserType subscriptionP
+    ]
+    ++ maybeToList (P.parserType <$> mutationP)
+  allIntrospectionTypes <- collectTypes (P.parserType (queryRootFromFields emptyIntro))
+  let allTypes = Map.unions
+        [ allBasicTypes
+        , Map.filterWithKey (\name _info -> name /= $$(G.litName "query_root")) allIntrospectionTypes
+        ]
+      partialSchema = Schema
+        { sDescription = Nothing
+        , sTypes = allTypes
+        , sQueryType = P.parserType basicQueryP
+        , sMutationType = P.parserType <$> mutationP
+        , sSubscriptionType = Just $ P.parserType subscriptionP
+        , sDirectives = defaultDirectives
+        }
+  let partialQueryFields =
+        nodeFP : basicQueryFP ++ (fmap RFRaw <$> [schema partialSchema, typeIntrospection partialSchema])
+  pure $ P.selectionSet $$(G.litName "query_root") Nothing partialQueryFields
+    <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
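+-- For illustration (table name assumed): in the Relay schema the extra
+-- 'node' root field resolves any object by an opaque globally unique ID:
+--
+-- > query {
+-- >   node(id: "<opaque-id>") {
+-- >     ... on article { title }
+-- >   }
+-- > }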
-{- Note [Split schema generation (TODO)]
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-As of writing this, the schema is generated per table per role and for queries and mutations
-separately. See functions "mkTyAggRole", "getQueryRootFieldsRole" and "getMutationRootFieldsRole".
-This approach makes hard to differentiate schema generation for each operation
-(select, insert, delete and update) based on respective permission information and safe merging
-those schemas eventually. For backend-only inserts (see https://github.com/hasura/graphql-engine/pull/4224)
-we need to somehow defer the logic of merging schema for inserts with others based on its
-backend-only credibility. This requires significant refactor of this module and
-we can't afford to do it as of now since we're going to rewrite the entire GraphQL schema
-generation (see https://github.com/hasura/graphql-engine/pull/4111). For aforementioned
-backend-only inserts, we're following a hacky implementation of generating schema for
-both default session and one with backend privilege -- the later differs with the former by
-only having the schema related to insert operation.
--}
+-- | Prepare the parser for query-type GraphQL requests, but with introspection
+-- for queries, mutations and subscriptions built in.
+unauthenticatedQueryWithIntrospection
+  :: forall m n
+   . ( MonadSchema n m
+     , MonadError QErr m
+     )
+  => [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> m (Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue)))
+unauthenticatedQueryWithIntrospection queryRemotes mutationRemotes = do
+  let basicQueryFP = fmap (fmap RFRemote) queryRemotes
+      mutationP = unauthenticatedMutation mutationRemotes
+      subscriptionP = unauthenticatedSubscription @n
+  queryWithIntrospectionHelper basicQueryFP mutationP subscriptionP
-
-mkGCtxMap
-  :: forall m. (MonadError QErr m)
-  => TableCache -> FunctionCache -> ActionCache -> m GCtxMap
-mkGCtxMap tableCache functionCache actionCache = do
-  typesMapL <- mapM (mkGCtxMapTable tableCache functionCache) $
-               filter (tableFltr . _tiCoreInfo) $ Map.elems tableCache
-  let actionsSchema = mkActionsSchema actionCache
-  typesMap <- combineTypes actionsSchema typesMapL
-  let gCtxMap = flip Map.map typesMap $
-                fmap (\(ty, flds, insCtxMap) -> mkGCtx ty flds insCtxMap)
-  pure gCtxMap
-  where
-    tableFltr ti = not (isSystemDefined $ _tciSystemDefined ti) && isValidObjectName (_tciName ti)
+
+mutation
+  :: forall m n r
+   . (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r, Has Scenario r)
+  => HashSet QualifiedTable
+  -> [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> [ActionInfo]
+  -> NonObjectTypeMap
+  -> m (Maybe (Parser 'Output n (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue))))
+mutation allTables allRemotes allActions nonObjectCustomTypes = do
+  mutationParsers <- for (toList allTables) \table -> do
+    tableCoreInfo <- _tiCoreInfo <$> askTableInfo table
+    displayName <- qualifiedObjectToName table
+    tablePerms <- tablePermissions table
+    for tablePerms \permissions -> do
+      let customRootFields = _tcCustomRootFields $ _tciCustomConfig tableCoreInfo
+          viewInfo = _tciViewInfo tableCoreInfo
+          selectPerms = _permSel permissions
-
-    combineTypes
-      :: Map.HashMap RoleName (RootFields, TyAgg)
-      -> [Map.HashMap RoleName TableSchemaCtx]
-      -> m (Map.HashMap RoleName TableSchemaCtx)
-    combineTypes actionsSchema tableCtxMaps = do
-      let tableCtxsMap =
-            foldr (Map.unionWith (++) . Map.map pure)
-            ((\(rf, tyAgg) -> pure $ RoleContext (tyAgg, rf, mempty) Nothing) <$> actionsSchema)
-            tableCtxMaps
+
+      -- If we're in a frontend scenario, we should not include backend_only inserts
+      scenario <- asks getter
+      let scenarioInsertPermissionM = do
+            insertPermission <- _permIns permissions
+            if scenario == Frontend && ipiBackendOnly insertPermission
+              then Nothing
+              else return insertPermission
+      inserts <- fmap join $ whenMaybe (isMutable viIsInsertable viewInfo) $ for scenarioInsertPermissionM \insertPerms -> do
+        let insertName = $$(G.litName "insert_") <> displayName
+            insertDesc = G.Description $ "insert data into the table: " <>> table
+            insertOneName = $$(G.litName "insert_") <> displayName <> $$(G.litName "_one")
+            insertOneDesc = G.Description $ "insert a single row into the table: " <>> table
+        insert <- insertIntoTable table (fromMaybe insertName $ _tcrfInsert customRootFields) (Just insertDesc) insertPerms selectPerms (_permUpd permissions)
+        -- select permissions are required for InsertOne: the
+        -- selection set is the same as a select on that table, and it
+        -- therefore can't be populated if the user doesn't have
+        -- select permissions
+        insertOne <- for selectPerms \selPerms ->
+          insertOneIntoTable table (fromMaybe insertOneName $ _tcrfInsertOne customRootFields) (Just insertOneDesc) insertPerms selPerms (_permUpd permissions)
+        pure $ fmap (RFDB . MDBInsert) insert : maybe [] (pure . fmap (RFDB . MDBInsert)) insertOne
-
-      flip Map.traverseWithKey tableCtxsMap $ \_ tableSchemaCtxs -> do
-        let defaultTableSchemaCtxs = map _rctxDefault tableSchemaCtxs
-            backendGCtxTypesMaybe =
-              -- If no table has 'backend' schema context then
-              -- aggregated context should be Nothing
-              if all (isNothing . _rctxBackend) tableSchemaCtxs then Nothing
-              else Just $ flip map tableSchemaCtxs $
-                   -- Consider 'default' if 'backend' doesn't exist for any table
-                   -- see Note [Split schema generation (TODO)]
-                   \(RoleContext def backend) -> fromMaybe def backend
+
+      updates <- fmap join $ whenMaybe (isMutable viIsUpdatable viewInfo) $ for (_permUpd permissions) \updatePerms -> do
+        let updateName = $$(G.litName "update_") <> displayName
+            updateDesc = G.Description $ "update data of the table: " <>> table
+            updateByPkName = $$(G.litName "update_") <> displayName <> $$(G.litName "_by_pk")
+            updateByPkDesc = G.Description $ "update single row of the table: " <>> table
+        update <- updateTable table (fromMaybe updateName $ _tcrfUpdate customRootFields) (Just updateDesc) updatePerms selectPerms
+        -- likewise; furthermore, primary keys can only be tested in
+        -- the `where` clause if the user has select permissions for
+        -- them, which at the very least requires select permissions
+        updateByPk <- join <$> for selectPerms
+          (updateTableByPk table (fromMaybe updateByPkName $ _tcrfUpdateByPk customRootFields) (Just updateByPkDesc) updatePerms)
+        pure $ fmap (RFDB . MDBUpdate) <$> catMaybes [update, updateByPk]
-
-      RoleContext <$> combineTypes' defaultTableSchemaCtxs
-                  <*> mapM combineTypes' backendGCtxTypesMaybe
-      where
-        combineTypes' :: [(TyAgg, RootFields, InsCtxMap)] -> m (TyAgg, RootFields, InsCtxMap)
-        combineTypes' typeList = do
-          let tyAgg = mconcat $ map (^. _1) typeList
-              insCtx = mconcat $ map (^. _3) typeList
-          rootFields <- combineRootFields $ map (^. _2) typeList
-          pure (tyAgg, rootFields, insCtx)
+
+      deletes <- fmap join $ whenMaybe (isMutable viIsDeletable viewInfo) $ for (_permDel permissions) \deletePerms -> do
+        let deleteName = $$(G.litName "delete_") <> displayName
+            deleteDesc = G.Description $ "delete data from the table: " <>> table
+            deleteByPkName = $$(G.litName "delete_") <> displayName <> $$(G.litName "_by_pk")
+            deleteByPkDesc = G.Description $ "delete single row from the table: " <>> table
+        delete <- deleteFromTable table (fromMaybe deleteName $ _tcrfDelete customRootFields) (Just deleteDesc) deletePerms selectPerms
-
-        combineRootFields :: [RootFields] -> m RootFields
-        combineRootFields rootFields = do
-          let duplicateQueryFields = duplicates $
-                concatMap (Map.keys . _rootQueryFields) rootFields
-              duplicateMutationFields = duplicates $
-                concatMap (Map.keys . _rootMutationFields) rootFields
+
+        -- ditto
+        deleteByPk <- join <$> for selectPerms
+          (deleteFromTableByPk table (fromMaybe deleteByPkName $ _tcrfDeleteByPk customRootFields) (Just deleteByPkDesc) deletePerms)
+        pure $ fmap (RFDB . MDBDelete) delete : maybe [] (pure . fmap (RFDB . MDBDelete)) deleteByPk
-
-          -- TODO: The following exception should result in inconsistency
-          when (not $ null duplicateQueryFields) $
-            throw400 Unexpected $ "following query root fields are duplicated: "
-            <> showNames duplicateQueryFields
+
+      pure $ concat $ catMaybes [inserts, updates, deletes]
-
-          when (not $ null duplicateMutationFields) $
-            throw400 Unexpected $ "following mutation root fields are duplicated: "
-            <> showNames duplicateMutationFields
+
+  actionParsers <- for allActions $ \actionInfo ->
+    case _adType (_aiDefinition actionInfo) of
+      ActionMutation ActionSynchronous ->
+        fmap (fmap (RFAction . AMSync)) <$> actionExecute nonObjectCustomTypes actionInfo
+      ActionMutation ActionAsynchronous ->
+        fmap (fmap (RFAction . AMAsync)) <$> actionAsyncMutation nonObjectCustomTypes actionInfo
+      ActionQuery -> pure Nothing
-
-  pure $ mconcat rootFields
+
+  let mutationFieldsParser = concat (catMaybes mutationParsers) <> catMaybes actionParsers <> fmap (fmap RFRemote) allRemotes
+  pure if null mutationFieldsParser
+    then Nothing
+    else Just $ P.selectionSet $$(G.litName "mutation_root") (Just $ G.Description "mutation root") mutationFieldsParser
+      <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
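+-- For illustration, with an assumed table "article" and a role holding
+-- insert, update, delete and select permissions, the mutation root built
+-- above exposes roughly:
+--
+-- > mutation {
+-- >   insert_article(objects: [{ title: "hello" }]) { affected_rows }
+-- >   insert_article_one(object: { title: "hello" }) { id }
+-- >   update_article_by_pk(pk_columns: { id: 1 }, _set: { title: "hi" }) { id }
+-- >   delete_article(where: { id: { _eq: 1 } }) { affected_rows }
+-- > }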
-getGCtx :: BackendOnlyFieldAccess -> SchemaCache -> RoleName -> GCtx
-getGCtx backendOnlyFieldAccess sc roleName =
-  case Map.lookup roleName (scGCtxMap sc) of
-    Nothing -> scDefaultRemoteGCtx sc
-    Just (RoleContext defaultGCtx maybeBackendGCtx) ->
-      case backendOnlyFieldAccess of
-        BOFAAllowed ->
-          -- When backend field access is allowed and if there's no 'backend_only'
-          -- permissions defined, we should allow access to non backend only fields
-          fromMaybe defaultGCtx maybeBackendGCtx
-        BOFADisallowed -> defaultGCtx
+unauthenticatedMutation
+  :: forall n
+   . MonadParse n
+  => [P.FieldParser n (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)]
+  -> Maybe (Parser 'Output n (OMap.InsOrdHashMap G.Name (MutationRootField UnpreparedValue)))
+unauthenticatedMutation allRemotes =
+  let mutationFieldsParser = fmap (fmap RFRemote) allRemotes
+  in if null mutationFieldsParser
+     then Nothing
+     else Just $ P.selectionSet $$(G.litName "mutation_root") Nothing mutationFieldsParser
+       <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
-
--- pretty print GCtx
-ppGCtx :: GCtx -> String
-ppGCtx gCtx =
-  "GCtx ["
-  <> "\n  types = " <> show types
-  <> "\n  query root = " <> show qRoot
-  <> "\n  mutation root = " <> show mRoot
-  <> "\n  subscription root = " <> show sRoot
-  <> "\n]"
-
-  where
-    types = map (G.unName . G.unNamedType) $ Map.keys $ _gTypes gCtx
-    qRoot = (,) (_otiName qRootO) $
-            map G.unName $ Map.keys $ _otiFields qRootO
-    mRoot = (,) (_otiName <$> mRootO) $
-            maybe [] (map G.unName . Map.keys . _otiFields) mRootO
-    sRoot = (,) (_otiName <$> sRootO) $
-            maybe [] (map G.unName . Map.keys . _otiFields) sRootO
-    qRootO = _gQueryRoot gCtx
-    mRootO = _gMutRoot gCtx
-    sRootO = _gSubRoot gCtx
-
-mkGCtx :: TyAgg -> RootFields -> InsCtxMap -> GCtx
-mkGCtx tyAgg (RootFields queryFields mutationFields) insCtxMap =
-  let queryRoot = mkQueryRootTyInfo qFlds
-      scalarTys = map (TIScalar . mkHsraScalarTyInfo) (Set.toList allScalarTypes)
-      compTys = map (TIInpObj . mkCompExpInp) (Set.toList allComparableTypes)
-      ordByEnumTyM = bool (Just ordByEnumTy) Nothing $ null qFlds
-      allTys = Map.union tyInfos $ mkTyInfoMap $
-               catMaybes [ Just $ TIObj queryRoot
-                         , TIObj <$> mutRootM
-                         , TIObj <$> subRootM
-                         , TIEnum <$> ordByEnumTyM
-                         ] <>
-               scalarTys <> compTys <> defaultTypes <> wiredInGeoInputTypes
-               <> wiredInRastInputTypes
-  -- for now subscription root is query root
-  in GCtx allTys fldInfos queryRoot mutRootM subRootM ordByEnums
-     (Map.map fst queryFields) (Map.map fst mutationFields) insCtxMap
-  where
-    TyAgg tyInfos fldInfos scalars ordByEnums = tyAgg
-    colTys = Set.fromList $ map pgiType $ mapMaybe (^? _RFPGColumn) $
-             Map.elems fldInfos
-    mkMutRoot =
-      mkHsraObjTyInfo (Just "mutation root") mutationRootNamedType Set.empty .
-      mapFromL _fiName
-    mutRootM = bool (Just $ mkMutRoot mFlds) Nothing $ null mFlds
-    mkSubRoot =
-      mkHsraObjTyInfo (Just "subscription root")
-      subscriptionRootNamedType Set.empty . mapFromL _fiName
-    subRootM = bool (Just $ mkSubRoot qFlds) Nothing $ null qFlds
-
-    qFlds = rootFieldInfos queryFields
-    mFlds = rootFieldInfos mutationFields
-    rootFieldInfos = map snd . Map.elems
-
-    anyGeoTypes = any (isScalarColumnWhere isGeoType) colTys
-    allComparableTypes =
-      if anyGeoTypes
-        -- due to casting, we need to generate both geometry and geography
-        -- operations even if just one of the two appears in the schema
-        then Set.union (Set.fromList [PGColumnScalar PGGeometry, PGColumnScalar PGGeography]) colTys
-        else colTys
-
-    additionalScalars = Set.fromList $
-      -- raster comparison expression needs geometry input
-      (guard anyRasterTypes *> pure PGGeometry)
-      -- scalar computed field return types
-      <> mapMaybe (^? _RFComputedField.cfType._CFTScalar) (Map.elems fldInfos)
-
-    allScalarTypes = (allComparableTypes ^.. folded._PGColumnScalar)
-                     <> additionalScalars <> scalars
-
-    wiredInGeoInputTypes = guard anyGeoTypes *> map TIInpObj geoInputTypes
-
-    anyRasterTypes = any (isScalarColumnWhere (== PGRaster)) colTys
-    wiredInRastInputTypes = guard anyRasterTypes *>
-                            map TIInpObj rasterIntersectsInputTypes
+unauthenticatedSubscription
+  :: forall n
+   . MonadParse n
+  => Parser 'Output n (OMap.InsOrdHashMap G.Name (QueryRootField UnpreparedValue))
+unauthenticatedSubscription =
+  P.selectionSet $$(G.litName "subscription_root") Nothing []
+    <&> fmap (P.handleTypename (RFRaw . J.String . G.unName))
diff --git a/server/src-lib/Hasura/GraphQL/Schema/Action.hs b/server/src-lib/Hasura/GraphQL/Schema/Action.hs
index 3450dc80aec..7c99ac14892 100644
--- a/server/src-lib/Hasura/GraphQL/Schema/Action.hs
+++ b/server/src-lib/Hasura/GraphQL/Schema/Action.hs
@@ -1,380 +1,324 @@
 module Hasura.GraphQL.Schema.Action
-  ( mkActionsSchema
+  ( actionExecute
+  , actionAsyncMutation
+  , actionAsyncQuery
   ) where
-
-import qualified Data.HashMap.Strict           as Map
-import qualified Language.GraphQL.Draft.Syntax as G
-
-import           Hasura.GraphQL.Schema.Builder
-import           Hasura.GraphQL.Schema.Common  (mkDescriptionWith)
-
-import           Hasura.GraphQL.Resolve.Types
-import           Hasura.GraphQL.Validate.Types
 import           Hasura.Prelude
+
+import           Data.Has
+
+import qualified Data.Aeson                    as J
+import qualified Data.HashMap.Strict           as Map
+import qualified Language.GraphQL.Draft.Syntax as G
+
+import qualified Hasura.GraphQL.Parser                 as P
+import qualified Hasura.GraphQL.Parser.Internal.Parser as P
+import qualified Hasura.RQL.DML.Internal               as RQL
+import qualified Hasura.RQL.DML.Select.Types           as RQL
+
+import           Hasura.GraphQL.Parser         (FieldParser, InputFieldsParser, Kind (..),
+                                                Parser, UnpreparedValue (..))
+import           Hasura.GraphQL.Parser.Class
+import           Hasura.GraphQL.Schema.Common
+import           Hasura.GraphQL.Schema.Select
 import           Hasura.RQL.Types
 import           Hasura.Session
 import           Hasura.SQL.Types
+import           Hasura.SQL.Value
-
-mkAsyncActionSelectionType :: ActionName -> G.NamedType
-mkAsyncActionSelectionType = G.NamedType . unActionName
-
-mkAsyncActionQueryResponseObj
-  :: ActionName
-  -- Name of the action
-  -> GraphQLType
-  -- output type
-  -> ObjTyInfo
-mkAsyncActionQueryResponseObj actionName outputType =
-  mkHsraObjTyInfo
-  (Just description)
-  (mkAsyncActionSelectionType actionName) -- "(action_name)"
-  mempty -- no arguments
-  (mapFromL _fiName fieldDefinitions)
-  where
-    description = G.Description $ "fields of action: " <>> actionName
-
-    mkFieldDefinition (fieldName, fieldDescription, fieldType) =
-      mkHsraObjFldInfo
-      (Just fieldDescription)
-      fieldName
-      mempty
-      fieldType
-
-    fieldDefinitions = map mkFieldDefinition
-      [ ( "id", "the unique id of an action"
-        , G.toGT $ mkScalarTy PGUUID)
-      , ( "created_at", "the time at which this action was created"
-        , G.toGT $ mkScalarTy PGTimeStampTZ)
-      , ( "errors", "errors related to the invocation"
-        , G.toGT $ mkScalarTy PGJSON)
-      , ( "output", "the output fields of this action"
-        , unGraphQLType outputType)
-      ]
-
-mkQueryActionField
-  :: ActionName
+-- | actionExecute is used to execute either a query action or a synchronous
+-- mutation action. A query action or a synchronous mutation action accepts a
+-- field name, input arguments and a selection set. The input argument and
+-- selection set types are defined by the user.
+--
+-- > action_name(action_input_arguments) {
+-- >   col1: col1_type
+-- >   col2: col2_type
+-- > }
+actionExecute
  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
+  => NonObjectTypeMap
   -> ActionInfo
-  -> [(PGCol, PGScalarType)]
-  -> (ActionExecutionContext, ObjFldInfo)
-mkQueryActionField actionName actionInfo definitionList =
-  ( actionExecutionContext
-  , fieldInfo
-  )
+  -> m (Maybe (FieldParser n (AnnActionExecution UnpreparedValue)))
+actionExecute nonObjectTypeMap actionInfo = runMaybeT do
+  roleName <- lift askRoleName
+  guard $ roleName == adminRoleName || roleName `Map.member` permissions
+  let fieldName = unActionName actionName
+      description = G.Description <$> comment
+  inputArguments <- lift $ actionInputArguments nonObjectTypeMap $ _adArguments definition
+  selectionSet <- lift $ actionOutputFields outputObject
+  stringifyNum <- asks $ qcStringifyNum . getter
+  pure $ P.subselection fieldName description inputArguments selectionSet
+    <&> \(argsJson, fields) -> AnnActionExecution
+      { _aaeName = actionName
+      , _aaeFields = fields
+      , _aaePayload = argsJson
+      , _aaeOutputType = _adOutputType definition
+      , _aaeOutputFields = getActionOutputFields outputObject
+      , _aaeDefinitionList = mkDefinitionList outputObject
+      , _aaeWebhook = _adHandler definition
+      , _aaeHeaders = _adHeaders definition
+      , _aaeForwardClientHeaders = _adForwardClientHeaders definition
+      , _aaeStrfyNum = stringifyNum
+      }
   where
-    definition = _aiDefinition actionInfo
-    actionExecutionContext =
-      ActionExecutionContext
-      actionName
-      (_adOutputType definition)
-      (getActionOutputFields $ _aiOutputObject actionInfo)
-      definitionList
-      (_adHandler definition)
-      (_adHeaders definition)
-      (_adForwardClientHeaders definition)
+    ActionInfo actionName outputObject definition permissions comment = actionInfo
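+    -- For illustration (action and argument names assumed): given
+    --
+    -- > create_user(email: "x@y.z") { id }
+    --
+    -- '_aaePayload' carries the raw argument object {"email": "x@y.z"} that
+    -- execution forwards to the webhook in '_aaeWebhook', while '_aaeFields'
+    -- is the parsed selection set the webhook's JSON response is filtered
+    -- through.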
-    description = mkDescriptionWith (PGDescription <$> _aiComment actionInfo) $
-                  "perform the action: " <>> actionName
-
-    fieldInfo =
-      mkHsraObjFldInfo
-      (Just description)
-      (unActionName actionName)
-      (mapFromL _iviName $ map mkActionArgument $ _adArguments definition)
-      actionFieldResponseType
-
-    mkActionArgument argument =
-      InpValInfo (_argDescription argument) (unArgumentName $ _argName argument)
-      Nothing $ unGraphQLType $ _argType argument
-
-    actionFieldResponseType = unGraphQLType $ _adOutputType definition
-
-mkMutationActionField
-  :: ActionName
+-- | actionAsyncMutation is used to execute an asynchronous mutation action. An
+-- asynchronous action expects the field name and the input arguments to the
+-- action. A selection set is *not* expected. An action ID (UUID) will be
+-- returned after performing the action.
+--
+-- > action_name(action_input_arguments)
+actionAsyncMutation
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
+  => NonObjectTypeMap
   -> ActionInfo
-  -> [(PGCol, PGScalarType)]
-  -> ActionMutationKind
-  -> (ActionMutationExecutionContext, ObjFldInfo)
-mkMutationActionField actionName actionInfo definitionList kind =
-  ( actionExecutionContext
-  , fieldInfo
-  )
+  -> m (Maybe (FieldParser n AnnActionMutationAsync))
+actionAsyncMutation nonObjectTypeMap actionInfo = runMaybeT do
+  roleName <- lift askRoleName
+  guard $ roleName == adminRoleName || roleName `Map.member` permissions
+  inputArguments <- lift $ actionInputArguments nonObjectTypeMap $ _adArguments definition
+  actionId <- lift actionIdParser
+  let fieldName = unActionName actionName
+      description = G.Description <$> comment
+  pure $ P.selection fieldName description inputArguments actionId
+    <&> AnnActionMutationAsync actionName
  where
-    definition = _aiDefinition actionInfo
-    actionExecutionContext =
-      case kind of
-        ActionSynchronous ->
-          ActionMutationSyncWebhook $ ActionExecutionContext actionName
-          (_adOutputType definition)
-          (getActionOutputFields $ _aiOutputObject actionInfo)
-          definitionList
-          (_adHandler definition)
-          (_adHeaders definition)
-          (_adForwardClientHeaders definition)
-        ActionAsynchronous -> ActionMutationAsync
+    ActionInfo actionName _ definition permissions comment = actionInfo
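+    -- For illustration (action name assumed): invoking
+    --
+    -- > mutation { send_email(address: "x@y.z") }
+    --
+    -- takes no selection set; the field itself resolves to the UUID under
+    -- which the result can later be fetched via 'actionAsyncQuery'.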
-    description = mkDescriptionWith (PGDescription <$> _aiComment actionInfo) $
-                  "perform the action: " <>> actionName
+-- | actionAsyncQuery is used to query/subscribe to the result of an
+-- asynchronous mutation action. The only input argument to an asynchronous
+-- mutation action is the action ID (UUID), and a selection set is expected;
+-- the selection set contains 4 fields, namely 'id', 'created_at', 'errors'
+-- and 'output'. The result of the action can be queried through the 'output'
+-- field.
+--
+-- > action_name (id: UUID!) {
+-- >   id: UUID!
+-- >   created_at: timestamptz!
+-- >   errors: JSON
+-- >   output: user_defined_type!
+-- > }
+actionAsyncQuery
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
+  => ActionInfo
+  -> m (Maybe (FieldParser n (AnnActionAsyncQuery UnpreparedValue)))
+actionAsyncQuery actionInfo = runMaybeT do
+  roleName <- lift askRoleName
+  guard $ roleName == adminRoleName || roleName `Map.member` permissions
+  actionId <- lift actionIdParser
+  actionOutputParser <- lift $ actionOutputFields outputObject
+  createdAtFieldParser <-
+    lift $ P.column (PGColumnScalar PGTimeStampTZ) (G.Nullability False)
+  errorsFieldParser <-
+    lift $ P.column (PGColumnScalar PGJSON) (G.Nullability True)
-
-    fieldInfo =
-      mkHsraObjFldInfo
-      (Just description)
-      (unActionName actionName)
-      (mapFromL _iviName $ map mkActionArgument $ _adArguments definition)
-      actionFieldResponseType
+
+  let fieldName = unActionName actionName
+      description = G.Description <$> comment
+      actionIdInputField =
+        P.field idFieldName (Just idFieldDescription) actionId
+      allFieldParsers =
+        let idField        = P.selection_ idFieldName (Just idFieldDescription) actionId $> AsyncId
+            createdAtField = P.selection_ $$(G.litName "created_at")
+                             (Just "the time at which this action was created")
+                             createdAtFieldParser $> AsyncCreatedAt
+            errorsField    = P.selection_ $$(G.litName "errors")
+                             (Just "errors related to the invocation")
+                             errorsFieldParser $> AsyncErrors
+            outputField    = P.subselection_ $$(G.litName "output")
+                             (Just "the output fields of this action")
+                             actionOutputParser <&> AsyncOutput
+        in [idField, createdAtField, errorsField, outputField]
+      selectionSet =
+        let outputTypeName = unActionName actionName
+            desc = G.Description $ "fields of action: " <>> actionName
+        in P.selectionSet outputTypeName (Just desc) allFieldParsers
+           <&> parsedSelectionsToFields AsyncTypename
-
-    mkActionArgument argument =
-      InpValInfo (_argDescription argument) (unArgumentName $ _argName argument)
-      Nothing $ unGraphQLType $ _argType argument
-
-    actionFieldResponseType =
-      case kind of
-        ActionSynchronous  -> unGraphQLType $ _adOutputType definition
-        ActionAsynchronous -> G.toGT $ G.toNT $ mkScalarTy PGUUID
-
-mkQueryField
-  :: ActionName
-  -> Maybe Text
-  -> ResolvedActionDefinition
-  -> [(PGCol, PGScalarType)]
-  -> ActionMutationKind
-  -> Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
-mkQueryField actionName comment definition definitionList kind =
-  case kind of
-    ActionAsynchronous ->
-      Just ( ActionSelectOpContext (_adOutputType definition) definitionList
-
-           , mkHsraObjFldInfo (Just description) (unActionName actionName)
-             (mapFromL _iviName [idArgument])
-             (G.toGT $ G.toGT $ mkAsyncActionSelectionType actionName)
-
-           , TIObj $ mkAsyncActionQueryResponseObj actionName $
-             _adOutputType definition
-           )
-    ActionSynchronous -> Nothing
+  stringifyNum <- asks $ qcStringifyNum . getter
+  pure $ P.subselection fieldName description actionIdInputField selectionSet
+    <&> \(idArg, fields) -> AnnActionAsyncQuery
+      { _aaaqName = actionName
+      , _aaaqActionId = idArg
+      , _aaaqOutputType = _adOutputType definition
+      , _aaaqFields = fields
+      , _aaaqDefinitionList = mkDefinitionList outputObject
+      , _aaaqStringifyNum = stringifyNum
+      }
  where
-    description = mkDescriptionWith (PGDescription <$> comment) $
-                  "retrieve the result of action: " <>> actionName
+    ActionInfo actionName outputObject definition permissions comment = actionInfo
+    idFieldName = $$(G.litName "id")
+    idFieldDescription = "the unique id of an action"
-
-    idArgument =
-      InpValInfo (Just idDescription) "id" Nothing $ G.toNT $ mkScalarTy PGUUID
-      where
-        idDescription = G.Description $ "id of the action: " <>> actionName
+
+-- | Async action's unique id
+actionIdParser
+  :: (MonadSchema n m, MonadError QErr m)
+  => m (Parser 'Both n UnpreparedValue)
+actionIdParser =
+  fmap P.mkParameter <$> P.column (PGColumnScalar PGUUID) (G.Nullability False)
-
-mkPGFieldType
-  :: ObjectFieldName
-  -> (G.GType, OutputFieldTypeInfo)
-  -> HashMap ObjectFieldName PGColumnInfo
-  -> PGScalarType
-mkPGFieldType fieldName (fieldType, fieldTypeInfo) fieldReferences =
-  case (G.isListType fieldType, fieldTypeInfo) of
-    -- for scalar lists, we treat them as json columns
-    (True, _) -> PGJSON
-    -- enums the same
-    (False, OutputFieldEnum _) -> PGJSON
-    -- default to PGJSON unless you have to join with a postgres table
-    -- i.e, if this field is specified as part of some relationship's
-    -- mapping, we can cast this column's value as the remote column's type
-    (False, OutputFieldScalar _) ->
-      case Map.lookup fieldName fieldReferences of
-        Just columnInfo -> unsafePGColumnToRepresentation $ pgiType columnInfo
-        Nothing -> PGJSON
-
-
-mkDefinitionList :: AnnotatedObjectType -> HashMap ObjectFieldName PGColumnInfo -> [(PGCol, PGScalarType)]
-mkDefinitionList annotatedOutputType fieldReferences =
-  [ (unsafePGCol $ coerce k, mkPGFieldType k v fieldReferences)
-  | (k, v) <- Map.toList $ _aotAnnotatedFields annotatedOutputType
-  ]
-
-mkFieldMap
-  :: AnnotatedObjectType
-  -> ActionInfo
-  -> HashMap ObjectFieldName PGColumnInfo
-  -> RoleName
-  -> HashMap (G.NamedType,G.Name) ResolveField
-mkFieldMap annotatedOutputType actionInfo fieldReferences roleName =
-  Map.fromList $ fields <> catMaybes relationships
+actionOutputFields
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
+  => AnnotatedObjectType
+  -> m (Parser 'Output n (RQL.AnnFieldsG UnpreparedValue))
+actionOutputFields outputObject = do
+  let scalarOrEnumFields = map scalarOrEnumFieldParser $ toList $ _otdFields outputObject
+  relationshipFields <- forM (_otdRelationships outputObject) $ traverse relationshipFieldParser
+  let allFieldParsers = scalarOrEnumFields <>
+                        maybe [] (catMaybes . toList) relationshipFields
+      outputTypeName = unObjectTypeName $ _otdName outputObject
+      outputTypeDescription = _otdDescription outputObject
+  pure $ P.selectionSet outputTypeName outputTypeDescription allFieldParsers
+    <&> parsedSelectionsToFields RQL.AFExpression
  where
-    fields =
-      flip map (Map.toList $ _aotAnnotatedFields annotatedOutputType) $
-      \(fieldName, (fieldType, fieldTypeInfo)) ->
-        ( (actionOutputBaseType, unObjectFieldName fieldName)
-        , RFPGColumn $ PGColumnInfo
-          (unsafePGCol $ coerce fieldName)
-          (coerce fieldName)
-          0
-          (PGColumnScalar $ mkPGFieldType fieldName (fieldType, fieldTypeInfo) fieldReferences)
-          (G.isNullable fieldType)
-          Nothing
-        )
-    relationships =
-      flip map (Map.toList $ _aotRelationships annotatedOutputType) $
-      \(relationshipName, relationship) ->
-        let remoteTableInfo = _trRemoteTable relationship
-            remoteTable = _tciName $ _tiCoreInfo remoteTableInfo
-            filterAndLimitM = getFilterAndLimit remoteTableInfo
+    scalarOrEnumFieldParser
+      :: ObjectFieldDefinition (G.GType, AnnotatedObjectFieldType)
+      -> FieldParser n (RQL.AnnFieldG UnpreparedValue)
+    scalarOrEnumFieldParser (ObjectFieldDefinition name _ description ty) =
+      let (gType, objectFieldType) = ty
+          fieldName = unObjectFieldName name
+          -- FIXME? (from master)
+          pgColumnInfo = PGColumnInfo (unsafePGCol $ G.unName fieldName)
+                         fieldName 0 (PGColumnScalar PGJSON) (G.isNullable gType) Nothing
+          fieldParser = case objectFieldType of
+            AOFTScalar def -> customScalarParser def
+            AOFTEnum def   -> customEnumParser def
+      in bool P.nonNullableField id (G.isNullable gType) $
+         P.selection_ (unObjectFieldName name) description fieldParser
+         $> RQL.mkAnnColumnField pgColumnInfo Nothing
+
+    relationshipFieldParser
+      :: TypeRelationship TableInfo PGColumnInfo
+      -> m (Maybe (FieldParser n (RQL.AnnFieldG UnpreparedValue)))
+    relationshipFieldParser typeRelationship = runMaybeT do
+      let TypeRelationship relName relType tableInfo fieldMapping = typeRelationship
+          tableName = _tciName $ _tiCoreInfo tableInfo
+          fieldName = unRelationshipName relName
+      roleName <- lift askRoleName
+      tablePerms <- MaybeT $ pure $ RQL.getPermInfoMaybe roleName PASelect tableInfo
+      tableParser <- lift $ selectTable tableName fieldName Nothing tablePerms
+      pure $ tableParser <&> \selectExp ->
+        let tableRelName = RelName $ mkNonEmptyTextUnsafe $ G.unName fieldName
            columnMapping = Map.fromList $
-              [ (unsafePGCol $ coerce k, pgiColumn v)
-              | (k, v) <- Map.toList $ _trFieldMapping relationship
-              ]
-        in case filterAndLimitM of
-             Just (tableFilter, tableLimit) ->
-               Just ( ( actionOutputBaseType
-                      , unRelationshipName relationshipName
-                      )
-                    , RFRelationship $ RelationshipField
-                      (RelInfo
-                       -- RelationshipName, which is newtype wrapper over G.Name is always
-                       -- non-empty text so as to conform GraphQL spec
-                       (RelName $ mkNonEmptyTextUnsafe $ coerce relationshipName)
-                       (_trType relationship)
-                       columnMapping remoteTable True)
-                      RFKSimple mempty
-                      tableFilter
-                      tableLimit
-                    )
-             Nothing -> Nothing
+              [ (unsafePGCol $ G.unName $ unObjectFieldName k, pgiColumn v)
+              | (k, v) <- Map.toList fieldMapping
+              ]
+        in case relType of
+             ObjRel -> RQL.AFObjectRelation $ RQL.AnnRelationSelectG tableRelName columnMapping $
+                       RQL.AnnObjectSelectG (RQL._asnFields selectExp) tableName $
+                       RQL._tpFilter $ RQL._asnPerm selectExp
+             ArrRel -> RQL.AFArrayRelation $ RQL.ASSimple $
+                       RQL.AnnRelationSelectG tableRelName columnMapping selectExp
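+    -- For illustration (type and field names assumed): if the action's
+    -- output object type declares a relationship "user" to a tracked table,
+    -- the selection set accepts nested table fields alongside the scalars:
+    --
+    -- > output_type {
+    -- >   code
+    -- >   user { id name }
+    -- > }
+    --
+    -- subject to the role having select permissions on the remote table, as
+    -- enforced by the 'RQL.getPermInfoMaybe' lookup above.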
-    getFilterAndLimit remoteTableInfo =
-      if roleName == adminRoleName
-      then Just (annBoolExpTrue, Nothing)
-      else do
-        selectPermisisonInfo <-
-          getSelectPermissionInfoM remoteTableInfo roleName
-        return (spiFilter selectPermisisonInfo, spiLimit selectPermisisonInfo)
-
-    actionOutputBaseType =
-      G.getBaseType $ unGraphQLType $ _adOutputType $ _aiDefinition actionInfo
-
-mkFieldReferences :: AnnotatedObjectType -> HashMap ObjectFieldName PGColumnInfo
-mkFieldReferences annotatedOutputType=
-  Map.unions $ map _trFieldMapping $ Map.elems $
-  _aotRelationships annotatedOutputType
-
-mkMutationActionFieldsAndTypes
-  :: ActionInfo
-  -> ActionPermissionInfo
-  -> ActionMutationKind
-  -> ( Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
-       -- context, field, response type info
-     , (ActionMutationExecutionContext, ObjFldInfo) -- mutation field
-     , FieldMap
-     )
-mkMutationActionFieldsAndTypes actionInfo permission kind =
-  ( mkQueryField actionName comment definition definitionList kind
-  , mkMutationActionField actionName actionInfo definitionList kind
-  , fieldMap
-  )
+mkDefinitionList :: AnnotatedObjectType -> [(PGCol, PGScalarType)]
+mkDefinitionList ObjectTypeDefinition{..} =
+  flip map (toList _otdFields) $ \ObjectFieldDefinition{..} ->
+    (unsafePGCol . G.unName . unObjectFieldName $ _ofdName,) $
+    case Map.lookup _ofdName fieldReferences of
+      Nothing -> fieldTypeToScalarType $ snd _ofdType
+      Just columnInfo -> unsafePGColumnToRepresentation $ pgiType columnInfo
  where
-    actionName = _aiName actionInfo
-    annotatedOutputType = _aiOutputObject actionInfo
-    definition = _aiDefinition actionInfo
-    roleName = _apiRole permission
-    comment = _aiComment actionInfo
+    fieldReferences =
+      Map.unions $ map _trFieldMapping $ maybe [] toList _otdRelationships
-
-    -- all the possible field references
-    fieldReferences = mkFieldReferences annotatedOutputType
-    definitionList = mkDefinitionList annotatedOutputType fieldReferences
-
-    fieldMap = mkFieldMap annotatedOutputType actionInfo fieldReferences roleName
-
-mkQueryActionFieldsAndTypes
-  :: ActionInfo
-  -> ActionPermissionInfo
-  -> ((ActionExecutionContext, ObjFldInfo)
-     , FieldMap
-     )
-mkQueryActionFieldsAndTypes actionInfo permission =
-  ( mkQueryActionField actionName actionInfo definitionList
-  , fieldMap
-  )
+actionInputArguments
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
+  => NonObjectTypeMap
+  -> [ArgumentDefinition (G.GType, NonObjectCustomType)]
+  -> m (InputFieldsParser n J.Value)
+actionInputArguments nonObjectTypeMap arguments = do
+  argumentParsers <- for arguments $ \argument -> do
+    let ArgumentDefinition argumentName (gType, nonObjectType) argumentDescription = argument
+        name = unArgumentName argumentName
+    (name,) <$> argumentParser name argumentDescription gType nonObjectType
+  pure $ J.Object <$> inputFieldsToObject argumentParsers
  where
-    actionName = _aiName actionInfo
-    roleName = _apiRole permission
-    annotatedOutputType = _aiOutputObject actionInfo
+    inputFieldsToObject
+      :: [(G.Name, InputFieldsParser n (Maybe J.Value))]
+      -> InputFieldsParser n J.Object
+    inputFieldsToObject inputFields =
+      let mkTuple (name, parser) = fmap (G.unName name,) <$> parser
+      in fmap (Map.fromList . catMaybes) $ traverse mkTuple inputFields
+
+    argumentParser
+      :: G.Name
+      -> Maybe G.Description
+      -> G.GType
+      -> NonObjectCustomType
+      -> m (InputFieldsParser n (Maybe J.Value))
+    argumentParser name description gType = \case
+      NOCTScalar def -> pure $ mkArgumentInputFieldParser name description gType $ customScalarParser def
+      NOCTEnum def -> pure $ mkArgumentInputFieldParser name description gType $ customEnumParser def
+      NOCTInputObject def -> do
+        let InputObjectTypeDefinition typeName objectDescription inputFields = def
+            objectName = unInputObjectTypeName typeName
+        inputFieldsParsers <- forM (toList inputFields) $ \inputField -> do
+          let InputObjectFieldName fieldName = _iofdName inputField
+              GraphQLType fieldType = _iofdType inputField
+          nonObjectFieldType <-
+            onNothing (Map.lookup (G.getBaseType fieldType) nonObjectTypeMap) $
+              throw500 "object type for a field found in custom input object type"
+          (fieldName,) <$> argumentParser fieldName (_iofdDescription inputField) fieldType nonObjectFieldType
-
-    fieldReferences = mkFieldReferences annotatedOutputType
-
-    definitionList = mkDefinitionList annotatedOutputType fieldReferences
-
-    fieldMap = mkFieldMap annotatedOutputType actionInfo fieldReferences roleName
+
+        pure $ mkArgumentInputFieldParser name description gType $
+          P.object objectName objectDescription $
+            J.Object <$> inputFieldsToObject inputFieldsParsers
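+-- For illustration (argument names assumed): the parsers built above collapse
+--
+-- > action_name(name: "hasura", tags: ["a", "b"])
+--
+-- into the single JSON object {"name": "hasura", "tags": ["a", "b"]} that
+-- becomes the action's payload; omitted nullable arguments are dropped from
+-- the object rather than serialized as explicit nulls.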
-
-mkMutationActionSchemaOne
-  :: ActionInfo
-  -> ActionMutationKind
-  -> Map.HashMap RoleName
-       ( Maybe (ActionSelectOpContext, ObjFldInfo, TypeInfo)
-       , (ActionMutationExecutionContext, ObjFldInfo)
-       , FieldMap
-       )
-mkMutationActionSchemaOne actionInfo kind =
-  flip Map.map permissions $ \permission ->
-  mkMutationActionFieldsAndTypes actionInfo permission kind
+
+mkArgumentInputFieldParser
+  :: forall m k. (MonadParse m, 'Input P.<: k)
+  => G.Name
+  -> Maybe G.Description
+  -> G.GType
+  -> Parser k m J.Value
+  -> InputFieldsParser m (Maybe J.Value)
+mkArgumentInputFieldParser name description gType parser =
+  if G.isNullable gType
+  then P.fieldOptional name description modifiedParser
+  else Just <$> P.field name description modifiedParser
  where
+    modifiedParser = parserModifier gType parser
-    adminPermission = ActionPermissionInfo adminRoleName
-    permissions = Map.insert adminRoleName adminPermission $ _aiPermissions actionInfo
-
-mkQueryActionSchemaOne
-  :: ActionInfo
-  -> Map.HashMap RoleName
-       ( (ActionExecutionContext, ObjFldInfo)
-       , FieldMap
-       )
-mkQueryActionSchemaOne actionInfo =
-  flip Map.map permissions $ \permission ->
-  mkQueryActionFieldsAndTypes actionInfo permission
-  where
-    adminPermission = ActionPermissionInfo adminRoleName
-    permissions = Map.insert adminRoleName adminPermission $ _aiPermissions actionInfo
-
-mkActionsSchema
-  :: ActionCache
-  -> Map.HashMap RoleName (RootFields, TyAgg)
-mkActionsSchema =
-  foldl'
-  (\aggregate actionInfo ->
-     case _adType $ _aiDefinition actionInfo of
-       ActionQuery ->
-         Map.foldrWithKey (accumulateQuery (_aiPgScalars actionInfo)) aggregate $
-         mkQueryActionSchemaOne actionInfo
-       ActionMutation kind ->
-         Map.foldrWithKey (accumulateMutation (_aiPgScalars actionInfo)) aggregate $
-         mkMutationActionSchemaOne actionInfo kind
-  )
-  mempty
-  where
-    -- we'll need to add uuid and timestamptz for actions
-    initRoleState =
-      ( mempty
-      , foldr addScalarToTyAgg mempty [PGJSON, PGTimeStampTZ, PGUUID]
-      )
-
-    addScalarsToTyAgg = foldr addScalarToTyAgg
-
-    accumulateQuery pgScalars roleName (actionField, fields) =
-      Map.alter (Just . addToStateSync . fromMaybe initRoleState) roleName
+    parserModifier
+      :: G.GType -> Parser k m J.Value -> Parser k m J.Value
+    parserModifier = \case
+      G.TypeNamed nullable _ -> nullableModifier nullable
+      G.TypeList nullable ty ->
+        nullableModifier nullable . fmap J.toJSON . P.list . parserModifier ty
      where
-        addToStateSync (rootFields, tyAgg) =
-          ( addQueryField (first QCAction actionField) rootFields
-          -- Add reused PG scalars to TyAgg
-          , addFieldsToTyAgg fields $ addScalarsToTyAgg tyAgg pgScalars
-          )
+        nullableModifier =
+          bool (fmap J.toJSON) (fmap J.toJSON . P.nullable) . G.unNullability
-
-    accumulateMutation pgScalars roleName (queryFieldM, actionField, fields) =
-      Map.alter (Just . addToState . fromMaybe initRoleState) roleName
-      where
-        addToState (rootFields, tyAgg) =
-          let rootFieldsWithActionMutation =
-                addMutationField (first MCAction actionField) rootFields
-              -- Add reused PG scalars to TyAgg
-              tyAggWithFieldsAndPgScalars =
-                addFieldsToTyAgg fields $ addScalarsToTyAgg tyAgg pgScalars
-          in case queryFieldM of
-               Just (fldCtx, fldDefinition, responseTypeInfo) ->
-                 -- Add async action's query resolver and response type
-                 ( addQueryField (QCAsyncActionFetch fldCtx, fldDefinition)
-                   rootFieldsWithActionMutation
-                 , addTypeInfoToTyAgg responseTypeInfo tyAggWithFieldsAndPgScalars
-                 )
-               Nothing -> (rootFieldsWithActionMutation, tyAggWithFieldsAndPgScalars)
+customScalarParser
+  :: MonadParse m
+  => AnnotatedScalarType -> Parser 'Both m J.Value
+customScalarParser = \case
+  ASTCustom ScalarTypeDefinition{..} ->
+    if | _stdName == idScalar     -> J.toJSON <$> P.identifier
+       | _stdName == intScalar    -> J.toJSON <$> P.int
+       | _stdName == floatScalar  -> J.toJSON <$> P.float
+       | _stdName == stringScalar -> J.toJSON <$> P.string
+       | _stdName == boolScalar   -> J.toJSON <$> P.boolean
+       | otherwise -> P.namedJSON _stdName _stdDescription
+  ASTReusedPgScalar name pgScalarType ->
+    let schemaType = P.NonNullable $ P.TNamed $ P.mkDefinition name Nothing P.TIScalar
+    in P.Parser
+       { pType = schemaType
+       , pParser = P.valueToJSON (P.toGraphQLType schemaType) >=>
+           either (parseErrorWith ParseFailed . qeError)
+                  (pure . pgScalarValueToJson) . runAesonParser (parsePGValue pgScalarType)
+       }
+
+customEnumParser
+  :: MonadParse m
+  => EnumTypeDefinition -> Parser 'Both m J.Value
+customEnumParser (EnumTypeDefinition typeName description enumValues) =
+  let enumName = unEnumTypeName typeName
+      enumValueDefinitions = enumValues <&> \enumValue ->
+        let valueName = G.unEnumValue $ _evdValue enumValue
+        in (, J.toJSON valueName) $ P.mkDefinition valueName
+           (_evdDescription enumValue) P.EnumValueInfo
+  in P.enum enumName description enumValueDefinitions
diff --git a/server/src-lib/Hasura/GraphQL/Schema/BoolExp.hs b/server/src-lib/Hasura/GraphQL/Schema/BoolExp.hs
index 826ac82a394..79c1169bdcf 100644
--- a/server/src-lib/Hasura/GraphQL/Schema/BoolExp.hs
+++ b/server/src-lib/Hasura/GraphQL/Schema/BoolExp.hs
@@ -1,293 +1,302 @@
 module Hasura.GraphQL.Schema.BoolExp
-  ( geoInputTypes
-  , rasterIntersectsInputTypes
-  , mkCompExpInp
-
-  , mkBoolExpTy
-  , mkBoolExpInp
+  ( boolExp
   ) where
-
-import qualified Data.HashMap.Strict           as Map
+import           Hasura.Prelude
+
+import qualified Data.HashMap.Strict.Extended  as M
 import qualified Language.GraphQL.Draft.Syntax as G
-
-import           Hasura.GraphQL.Schema.Common
-import           Hasura.GraphQL.Validate.Types
-import           Hasura.Prelude
+
+import qualified Hasura.GraphQL.Parser         as P
+
+import           Hasura.GraphQL.Parser         (InputFieldsParser, Kind (..), Parser,
+                                                UnpreparedValue, mkParameter)
+import           Hasura.GraphQL.Parser.Class
+import           Hasura.GraphQL.Schema.Table
 import           Hasura.RQL.Types
+import           Hasura.SQL.DML
 import           Hasura.SQL.Types
+import           Hasura.SQL.Value
-
-typeToDescription :: G.NamedType -> G.Description
-typeToDescription = G.Description . G.unName . G.unNamedType
+type ComparisonExp = OpExpG UnpreparedValue
-
-mkCompExpTy :: PGColumnType -> G.NamedType
-mkCompExpTy = addTypeSuffix "_comparison_exp" . mkColumnType
+-- |
+-- > input type_bool_exp {
+-- >   _or: [type_bool_exp!]
+-- >   _and: [type_bool_exp!]
+-- >   _not: type_bool_exp
+-- >   column: type_comparison_exp
+-- >   ...
+-- > }
+boolExp
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m)
+  => QualifiedTable
+  -> Maybe SelPermInfo
+  -> m (Parser 'Input n (AnnBoolExp UnpreparedValue))
+boolExp table selectPermissions = memoizeOn 'boolExp table $ do
+  name <- qualifiedObjectToName table <&> (<> $$(G.litName "_bool_exp"))
+  let description = G.Description $
+        "Boolean expression to filter rows from the table " <> table <<>
+        ". All fields are combined with a logical 'AND'."
-
-mkCastExpTy :: PGColumnType -> G.NamedType
-mkCastExpTy = addTypeSuffix "_cast_exp" . mkColumnType
+
+  tableFieldParsers <- catMaybes <$> maybe
+    (pure [])
+    (traverse mkField <=< tableSelectFields table)
+    selectPermissions
+  recur <- boolExp table selectPermissions
+  -- Bafflingly, ApplicativeDo doesn’t work if we inline this definition (I
+  -- think the TH splices throw it off), so we have to define it separately.
+  let specialFieldParsers =
+        [ P.fieldOptional $$(G.litName "_or") Nothing (BoolOr <$> P.list recur)
+        , P.fieldOptional $$(G.litName "_and") Nothing (BoolAnd <$> P.list recur)
+        , P.fieldOptional $$(G.litName "_not") Nothing (BoolNot <$> recur)
+        ]
-
--- TODO(shahidhk) this should ideally be st_d_within_geometry
-{-
-input st_d_within_input {
-  distance: Float!
-  from: geometry!
-}
--}
-stDWithinGeometryInpTy :: G.NamedType
-stDWithinGeometryInpTy = G.NamedType "st_d_within_input"
-
-{-
-input st_d_within_geography_input {
-  distance: Float!
-  from: geography!
-  use_spheroid: Bool!
-}
--}
-stDWithinGeographyInpTy :: G.NamedType
-stDWithinGeographyInpTy = G.NamedType "st_d_within_geography_input"
-
-
--- | Makes an input type declaration for the @_cast@ field of a comparison expression.
--- (Currently only used for casting between geometry and geography types.)
-mkCastExpressionInputType :: PGColumnType -> [PGColumnType] -> InpObjTyInfo
-mkCastExpressionInputType sourceType targetTypes =
-  mkHsraInpTyInfo (Just description) (mkCastExpTy sourceType) (fromInpValL targetFields)
+  pure $ BoolAnd <$> P.object name (Just description) do
+    tableFields <- map BoolFld . catMaybes <$> sequenceA tableFieldParsers
+    specialFields <- catMaybes <$> sequenceA specialFieldParsers
+    pure (tableFields ++ specialFields)
  where
-    description = mconcat
-      [ "Expression to compare the result of casting a column of type "
-      , typeToDescription $ mkColumnType sourceType
-      , ". Multiple cast targets are combined with logical 'AND'."
-      ]
-    targetFields = map targetField targetTypes
-    targetField targetType = InpValInfo
-      Nothing
-      (G.unNamedType $ mkColumnType targetType)
-      Nothing
-      (G.toGT $ mkCompExpTy targetType)
-
---- | make compare expression input type
-mkCompExpInp :: PGColumnType -> InpObjTyInfo
-mkCompExpInp colTy =
-  InpObjTyInfo (Just tyDesc) (mkCompExpTy colTy) (fromInpValL $ concat
-  [ map (mk colGqlType) eqOps
-  , guard (isScalarWhere (/= PGRaster)) *> map (mk colGqlType) compOps
-  , map (mk $ G.toLT $ G.toNT colGqlType) listOps
-  , guard (isScalarWhere isStringType) *> map (mk $ mkScalarTy PGText) stringOps
-  , guard (isScalarWhere (== PGJSONB)) *> map opToInpVal jsonbOps
-  , guard (isScalarWhere (== PGGeometry)) *>
-      (stDWithinGeoOpInpVal stDWithinGeometryInpTy : map geoOpToInpVal (geoOps ++ geomOps))
-  , guard (isScalarWhere (== PGGeography)) *>
-      (stDWithinGeoOpInpVal stDWithinGeographyInpTy : map geoOpToInpVal geoOps)
-  , [InpValInfo Nothing "_is_null" Nothing $ G.TypeNamed (G.Nullability True) $ G.NamedType "Boolean"]
-  , castOpInputValues
-  , guard (isScalarWhere (== PGRaster)) *> map opToInpVal rasterOps
-  ]) TLHasuraType
+    mkField
+      :: FieldInfo -> m (Maybe (InputFieldsParser n (Maybe (AnnBoolExpFld UnpreparedValue))))
+    mkField fieldInfo = runMaybeT do
+      fieldName <- MaybeT $ pure $ fieldInfoGraphQLName fieldInfo
+      P.fieldOptional fieldName Nothing <$> case fieldInfo of
+        -- field_name: field_type_comparison_exp
+        FIColumn columnInfo ->
+          lift $ fmap (AVCol columnInfo) <$> comparisonExps (pgiType columnInfo)
+
+        -- field_name: field_type_bool_exp
+        FIRelationship relationshipInfo -> do
+          let remoteTable = riRTable relationshipInfo
+          remotePermissions <- lift $ tableSelectPermissions remoteTable
+          lift $ fmap (AVRel relationshipInfo) <$> boolExp remoteTable remotePermissions
+
+        -- Using computed fields in boolean expressions is not currently supported.
+        FIComputedField _ -> empty
+
+        -- Using remote relationship fields in boolean expressions is not supported.
+        FIRemoteRelationship _ -> empty
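+-- An abridged sketch in the style of 'boolExp' above; the exact operator
+-- list varies with the column's scalar type, as the guards below show:
+--
+-- > input type_comparison_exp {
+-- >   _eq: type
+-- >   _neq: type
+-- >   _in: [type!]
+-- >   _nin: [type!]
+-- >   _is_null: Boolean
+-- >   ...
+-- > }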
(MonadSchema n m, MonadError QErr m) + => PGColumnType -> m (Parser 'Input n [ComparisonExp]) +comparisonExps = P.memoize 'comparisonExps \columnType -> do + geogInputParser <- geographyWithinDistanceInput + geomInputParser <- geometryWithinDistanceInput + ignInputParser <- intersectsGeomNbandInput + ingInputParser <- intersectsNbandGeomInput + -- see Note [Columns in comparison expression are never nullable] + columnParser <- P.column columnType (G.Nullability False) + nullableTextParser <- P.column (PGColumnScalar PGText) (G.Nullability True) + textParser <- P.column (PGColumnScalar PGText) (G.Nullability False) + maybeCastParser <- castExp columnType + let name = P.getName columnParser <> $$(G.litName "_comparison_exp") + desc = G.Description $ "Boolean expression to compare columns of type " + <> P.getName columnParser + <<> ". All fields are combined with logical 'AND'." + textListParser = P.list textParser `P.bind` traverse P.openOpaque + columnListParser = P.list columnParser `P.bind` traverse P.openOpaque + pure $ P.object name (Just desc) $ fmap catMaybes $ sequenceA $ concat + [ flip (maybe []) maybeCastParser $ \castParser -> + [ P.fieldOptional $$(G.litName "_cast") Nothing (ACast <$> castParser) + ] + -- Common ops for all types + , [ P.fieldOptional $$(G.litName "_is_null") Nothing (bool ANISNOTNULL ANISNULL <$> P.boolean) + , P.fieldOptional $$(G.litName "_eq") Nothing (AEQ True . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_neq") Nothing (ANE True . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_in") Nothing (AIN . mkListLiteral columnType <$> columnListParser) + , P.fieldOptional $$(G.litName "_nin") Nothing (ANIN . mkListLiteral columnType <$> columnListParser) + ] + -- Comparison ops for non Raster types + , guard (isScalarColumnWhere (/= PGRaster) columnType) *> + [ P.fieldOptional $$(G.litName "_gt") Nothing (AGT . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_lt") Nothing (ALT . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_gte") Nothing (AGTE . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_lte") Nothing (ALTE . mkParameter <$> columnParser) + ] + -- Ops for Raster types + , guard (isScalarColumnWhere (== PGRaster) columnType) *> + [ P.fieldOptional $$(G.litName "_st_intersects_rast") + Nothing + (ASTIntersectsRast . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_intersects_nband_geom") + Nothing + (ASTIntersectsNbandGeom <$> ingInputParser) + , P.fieldOptional $$(G.litName "_st_intersects_geom_nband") + Nothing + (ASTIntersectsGeomNband <$> ignInputParser) + ] + -- Ops for String like types + , guard (isScalarColumnWhere isStringType columnType) *> + [ P.fieldOptional $$(G.litName "_like") + (Just "does the column match the given pattern") + (ALIKE . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_nlike") + (Just "does the column NOT match the given pattern") + (ANLIKE . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_ilike") + (Just "does the column match the given case-insensitive pattern") + (AILIKE . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_nilike") + (Just "does the column NOT match the given case-insensitive pattern") + (ANILIKE . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_similar") + (Just "does the column match the given SQL regular expression") + (ASIMILAR . 
mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_nsimilar") + (Just "does the column NOT match the given SQL regular expression") + (ANSIMILAR . mkParameter <$> columnParser) + ] + -- Ops for JSONB type + , guard (isScalarColumnWhere (== PGJSONB) columnType) *> + [ P.fieldOptional $$(G.litName "_contains") + (Just "does the column contain the given json value at the top level") + (AContains . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_contained_in") + (Just "is the column contained in the given json value") + (AContainedIn . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_has_key") + (Just "does the string exist as a top-level key in the column") + (AHasKey . mkParameter <$> nullableTextParser) + , P.fieldOptional $$(G.litName "_has_keys_any") + (Just "do any of these strings exist as top-level keys in the column") + (AHasKeysAny . mkListLiteral (PGColumnScalar PGText) <$> textListParser) + , P.fieldOptional $$(G.litName "_has_keys_all") + (Just "do all of these strings exist as top-level keys in the column") + (AHasKeysAll . mkListLiteral (PGColumnScalar PGText) <$> textListParser) + ] + -- Ops for Geography type + , guard (isScalarColumnWhere (== PGGeography) columnType) *> + [ P.fieldOptional $$(G.litName "_st_intersects") + (Just "does the column spatially intersect the given geography value") + (ASTIntersects . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_d_within") + (Just "is the column within a given distance from the given geography value") + (ASTDWithinGeog <$> geogInputParser) + ] + -- Ops for Geometry type + , guard (isScalarColumnWhere (== PGGeometry) columnType) *> + [ P.fieldOptional $$(G.litName "_st_contains") + (Just "does the column contain the given geometry value") + (ASTContains . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_crosses") + (Just "does the column cross the given geometry value") + (ASTCrosses . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_equals") + (Just "is the column equal to the given geometry value (directionality is ignored)") + (ASTEquals . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_overlaps") + (Just "does the column 'spatially overlap' (intersect but not completely contain) the given geometry value") + (ASTOverlaps . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_touches") + (Just "does the column have at least one point in common with the given geometry value") + (ASTTouches . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_within") + (Just "is the column contained in the given geometry value") + (ASTWithin . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_intersects") + (Just "does the column spatially intersect the given geometry value") + (ASTIntersects . mkParameter <$> columnParser) + , P.fieldOptional $$(G.litName "_st_d_within") + (Just "is the column within a given distance from the given geometry value") + (ASTDWithinGeom <$> geomInputParser) + ] + ] where - colGqlType = mkColumnType colTy - colTyDesc = typeToDescription colGqlType - tyDesc = - "expression to compare columns of type " <> colTyDesc - <> ". All fields are combined with logical 'AND'." - isScalarWhere = flip isScalarColumnWhere colTy - mk t n = InpValInfo Nothing n Nothing $ G.toGT t + mkListLiteral :: PGColumnType -> [P.PGColumnValue] -> UnpreparedValue + mkListLiteral columnType columnValues = P.UVLiteral $ SETyAnn + (SEArray $ txtEncoder . pstValue .
P.pcvValue <$> columnValues) + (mkTypeAnn $ PGTypeArray $ unsafePGColumnToRepresentation columnType) - -- colScalarListTy = GA.GTList colGTy - eqOps = - ["_eq", "_neq"] - compOps = - ["_gt", "_lt", "_gte", "_lte"] + castExp :: PGColumnType -> m (Maybe (Parser 'Input n (CastExp UnpreparedValue))) + castExp sourceType = do + let maybeScalars = case sourceType of + PGColumnScalar PGGeography -> Just (PGGeography, PGGeometry) + PGColumnScalar PGGeometry -> Just (PGGeometry, PGGeography) + _ -> Nothing - listOps = - [ "_in", "_nin" ] - -- TODO - -- columnOps = - -- [ "_ceq", "_cneq", "_cgt", "_clt", "_cgte", "_clte"] - stringOps = - [ "_like", "_nlike", "_ilike", "_nilike" - , "_similar", "_nsimilar" - ] + forM maybeScalars $ \(sourceScalar, targetScalar) -> do + sourceName <- P.mkScalarTypeName sourceScalar <&> (<> $$(G.litName "_cast_exp")) + targetName <- P.mkScalarTypeName targetScalar + targetOpExps <- comparisonExps $ PGColumnScalar targetScalar + let field = P.fieldOptional targetName Nothing $ (targetScalar, ) <$> targetOpExps + pure $ P.object sourceName Nothing $ M.fromList . maybeToList <$> field - opToInpVal (opName, ty, desc) = InpValInfo (Just desc) opName Nothing ty +geographyWithinDistanceInput + :: forall m n. (MonadSchema n m, MonadError QErr m) + => m (Parser 'Input n (DWithinGeogOp UnpreparedValue)) +geographyWithinDistanceInput = do + geographyParser <- P.column (PGColumnScalar PGGeography) (G.Nullability False) + -- FIXME + -- It doesn't make sense for this value to be nullable; it is only nullable + -- for backwards compatibility. If an explicit null value is given, it is + -- forwarded to the underlying SQL function, which in turn treats a null value + -- as an error. We can fix this by rejecting explicit null values and marking + -- this field non-nullable in a future release. + booleanParser <- P.column (PGColumnScalar PGBoolean) (G.Nullability True) + floatParser <- P.column (PGColumnScalar PGFloat) (G.Nullability False) + pure $ P.object $$(G.litName "st_d_within_geography_input") Nothing $ + DWithinGeogOp <$> (mkParameter <$> P.field $$(G.litName "distance") Nothing floatParser) + <*> (mkParameter <$> P.field $$(G.litName "from") Nothing geographyParser) + <*> (mkParameter <$> P.fieldWithDefault $$(G.litName "use_spheroid") Nothing (G.VBoolean True) booleanParser) - jsonbOps = - [ ( "_contains" - , G.toGT $ mkScalarTy PGJSONB - , "does the column contain the given json value at the top level" - ) - , ( "_contained_in" - , G.toGT $ mkScalarTy PGJSONB - , "is the column contained in the given json value" - ) - , ( "_has_key" - , G.toGT $ mkScalarTy PGText - , "does the string exist as a top-level key in the column" - ) - , ( "_has_keys_any" - , G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText - , "do any of these strings exist as top-level keys in the column" - ) - , ( "_has_keys_all" - , G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText - , "do all of these strings exist as top-level keys in the column" - ) - ] +geometryWithinDistanceInput + :: forall m n.
(MonadSchema n m, MonadError QErr m) + => m (Parser 'Input n (DWithinGeomOp UnpreparedValue)) +geometryWithinDistanceInput = do + geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False) + floatParser <- P.column (PGColumnScalar PGFloat) (G.Nullability False) + pure $ P.object $$(G.litName "st_d_within_input") Nothing $ + DWithinGeomOp <$> (mkParameter <$> P.field $$(G.litName "distance") Nothing floatParser) + <*> (mkParameter <$> P.field $$(G.litName "from") Nothing geometryParser) - castOpInputValues = - -- currently, only geometry/geography types support casting - guard (isScalarWhere isGeoType) $> - InpValInfo Nothing "_cast" Nothing (G.toGT $ mkCastExpTy colTy) +intersectsNbandGeomInput + :: forall m n. (MonadSchema n m, MonadError QErr m) + => m (Parser 'Input n (STIntersectsNbandGeommin UnpreparedValue)) +intersectsNbandGeomInput = do + geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False) + integerParser <- P.column (PGColumnScalar PGInteger) (G.Nullability False) + pure $ P.object $$(G.litName "st_intersects_nband_geom_input") Nothing $ + STIntersectsNbandGeommin <$> (mkParameter <$> P.field $$(G.litName "nband") Nothing integerParser) + <*> (mkParameter <$> P.field $$(G.litName "geommin") Nothing geometryParser) - stDWithinGeoOpInpVal ty = - InpValInfo (Just stDWithinGeoDesc) "_st_d_within" Nothing $ G.toGT ty - stDWithinGeoDesc = - "is the column within a distance from a " <> colTyDesc <> " value" +intersectsGeomNbandInput + :: forall m n. (MonadSchema n m, MonadError QErr m) + => m (Parser 'Input n (STIntersectsGeomminNband UnpreparedValue)) +intersectsGeomNbandInput = do + geometryParser <- P.column (PGColumnScalar PGGeometry) (G.Nullability False) + integerParser <- P.column (PGColumnScalar PGInteger) (G.Nullability False) + pure $ P.object $$(G.litName "st_intersects_geom_nband_input") Nothing $ STIntersectsGeomminNband + <$> ( mkParameter <$> P.field $$(G.litName "geommin") Nothing geometryParser) + <*> (fmap mkParameter <$> P.fieldOptional $$(G.litName "nband") Nothing integerParser) - geoOpToInpVal (opName, desc) = - InpValInfo (Just desc) opName Nothing $ G.toGT colGqlType +{- Note [Columns in comparison expression are never nullable] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +In comparisonExps, we hardcode `Nullability False` when calling `column`, which +might seem a bit sketchy. Shouldn’t the nullability depend on the nullability of +the underlying Postgres column? - -- operators applicable only to geometry types - geomOps :: [(G.Name, G.Description)] - geomOps = - [ - ( "_st_contains" - , "does the column contain the given geometry value" - ) - , ( "_st_crosses" - , "does the column crosses the given geometry value" - ) - , ( "_st_equals" - , "is the column equal to given geometry value. Directionality is ignored" - ) - , ( "_st_overlaps" - , "does the column 'spatially overlap' (intersect but not completely contain) the given geometry value" - ) - , ( "_st_touches" - , "does the column have atleast one point in common with the given geometry value" - ) - , ( "_st_within" - , "is the column contained in the given geometry value" - ) - ] +No. 
If we did that, then we would allow boolean expressions like this: - - -- operators applicable to geometry and geography types - geoOps = - [ - ( "_st_intersects" - , "does the column spatially intersect the given " <> colTyDesc <> " value" - ) - ] + delete_users(where: {status: {_eq: null}}) - -- raster related operators - rasterOps = - [ - ( "_st_intersects_rast" - , G.toGT $ mkScalarTy PGRaster - , boolFnMsg <> "ST_Intersects(raster , raster )" - ) - , ( "_st_intersects_nband_geom" - , G.toGT stIntersectsNbandGeomInputTy - , boolFnMsg <> "ST_Intersects(raster , integer nband, geometry geommin)" - ) - , ( "_st_intersects_geom_nband" - , G.toGT stIntersectsGeomNbandInputTy - , boolFnMsg <> "ST_Intersects(raster , geometry geommin, integer nband=NULL)" - ) - ] +The user expects this to generate SQL like - boolFnMsg = "evaluates the following boolean Postgres function; " + DELETE FROM users WHERE users.status IS NULL -geoInputTypes :: [InpObjTyInfo] -geoInputTypes = - [ stDWithinGeometryInputType - , stDWithinGeographyInputType - , mkCastExpressionInputType (PGColumnScalar PGGeometry) [PGColumnScalar PGGeography] - , mkCastExpressionInputType (PGColumnScalar PGGeography) [PGColumnScalar PGGeometry] - ] - where - stDWithinGeometryInputType = - mkHsraInpTyInfo Nothing stDWithinGeometryInpTy $ fromInpValL - [ InpValInfo Nothing "from" Nothing $ G.toGT $ G.toNT $ mkScalarTy PGGeometry - , InpValInfo Nothing "distance" Nothing $ G.toNT $ mkScalarTy PGFloat - ] - stDWithinGeographyInputType = - mkHsraInpTyInfo Nothing stDWithinGeographyInpTy $ fromInpValL - [ InpValInfo Nothing "from" Nothing $ G.toGT $ G.toNT $ mkScalarTy PGGeography - , InpValInfo Nothing "distance" Nothing $ G.toNT $ mkScalarTy PGFloat - , InpValInfo - Nothing "use_spheroid" (Just $ G.VCBoolean True) $ G.toGT $ mkScalarTy PGBoolean - ] +but it doesn’t. We treat null to mean “no condition was specified” (since +that’s how GraphQL indicates an optional field was omitted), and we actually +generate SQL like this: -stIntersectsNbandGeomInputTy :: G.NamedType -stIntersectsNbandGeomInputTy = G.NamedType "st_intersects_nband_geom_input" + DELETE FROM users -stIntersectsGeomNbandInputTy :: G.NamedType -stIntersectsGeomNbandInputTy = G.NamedType "st_intersects_geom_nband_input" +Now we’ve gone and deleted every user in the database. Hopefully the user had +backups! -rasterIntersectsInputTypes :: [InpObjTyInfo] -rasterIntersectsInputTypes = - [ stIntersectsNbandGeomInput - , stIntersectsGeomNbandInput - ] - where - stIntersectsNbandGeomInput = - mkHsraInpTyInfo Nothing stIntersectsNbandGeomInputTy $ fromInpValL - [ InpValInfo Nothing "nband" Nothing $ - G.toGT $ G.toNT $ mkScalarTy PGInteger - , InpValInfo Nothing "geommin" Nothing $ - G.toGT $ G.toNT $ mkScalarTy PGGeometry - ] - - stIntersectsGeomNbandInput = - mkHsraInpTyInfo Nothing stIntersectsGeomNbandInputTy $ fromInpValL - [ InpValInfo Nothing "geommin" Nothing $ - G.toGT $ G.toNT $ mkScalarTy PGGeometry - , InpValInfo Nothing "nband" Nothing $ - G.toGT $ mkScalarTy PGInteger - ] - -mkBoolExpName :: QualifiedTable -> G.Name -mkBoolExpName tn = - qualObjectToName tn <> "_bool_exp" - -mkBoolExpTy :: QualifiedTable -> G.NamedType -mkBoolExpTy = - G.NamedType .
mkBoolExpName - --- table_bool_exp -mkBoolExpInp - :: QualifiedTable - -- the fields that are allowed - -> [SelField] - -> InpObjTyInfo -mkBoolExpInp tn fields = - mkHsraInpTyInfo (Just desc) boolExpTy $ Map.fromList - [(_iviName inpVal, inpVal) | inpVal <- inpValues] - where - desc = G.Description $ - "Boolean expression to filter rows from the table " <> tn <<> - ". All fields are combined with a logical 'AND'." - - -- the type of this boolean expression - boolExpTy = mkBoolExpTy tn - - -- all the fields of this input object - inpValues = combinators <> mapMaybe mkFldExpInp fields - - mk n ty = InpValInfo Nothing n Nothing $ G.toGT ty - - boolExpListTy = G.toLT boolExpTy - - combinators = - [ mk "_not" boolExpTy - , mk "_and" boolExpListTy - , mk "_or" boolExpListTy - ] - - mkFldExpInp = \case - SFPGColumn (PGColumnInfo _ name _ colTy _ _) -> - Just $ mk name (mkCompExpTy colTy) - SFRelationship relationshipField -> - let relationshipName = riName $ _rfiInfo relationshipField - remoteTable = riRTable $ _rfiInfo relationshipField - in Just $ mk (mkRelName relationshipName) (mkBoolExpTy remoteTable) - SFComputedField _ -> Nothing -- TODO: support computed fields in bool exps - SFRemoteRelationship{} -> Nothing +We avoid this problem by making the column value non-nullable (which is correct, +since we never treat a null value as a SQL NULL), then creating the field using +fieldOptional. This creates a parser that rejects nulls, but won’t be called at +all if the field is not specified, which is permitted by the GraphQL +specification. See Note [Optional fields and nullability] in +Hasura.GraphQL.Parser.Internal.Parser for more details. -} diff --git a/server/src-lib/Hasura/GraphQL/Schema/Builder.hs b/server/src-lib/Hasura/GraphQL/Schema/Builder.hs deleted file mode 100644 index b4d4230db8f..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Builder.hs +++ /dev/null @@ -1,86 +0,0 @@ -module Hasura.GraphQL.Schema.Builder - ( TyAgg(..) - , FieldMap - , taTypes - , taFields - , taScalars - , taOrdBy - , addFieldsToTyAgg - , addTypeInfoToTyAgg - , addScalarToTyAgg - , QueryRootFieldMap - , MutationRootFieldMap - , RootFields(..) - , addQueryField - , addMutationField - ) where - -import Control.Lens - -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.SQL.Types - --- | A /types aggregate/, which holds role-specific information about visible GraphQL types. --- Importantly, it holds more than just the information needed by GraphQL: it also includes how the --- GraphQL types relate to Postgres types, which is used to validate literals provided for --- Postgres-specific scalars. 
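Returning to Note [Columns in comparison expression are never nullable] above: the interplay between an optional field and a non-nullable value parser can be modelled in a few lines of plain Haskell. The sketch below is editorial illustration only, not part of the patch; every name in it is invented, and the two layers of Maybe stand in for "field omitted" versus "explicit null".

```
-- Toy model: the outer Maybe is field presence (fieldOptional),
-- the inner Maybe is an explicit JSON null. Only a present,
-- non-null value produces a comparison condition.
type FieldInput = Maybe (Maybe Int)

parseEqField :: FieldInput -> Either String (Maybe Int)
parseEqField Nothing         = Right Nothing                  -- omitted: no condition at all
parseEqField (Just Nothing)  = Left "unexpected null for _eq" -- explicit null: rejected
parseEqField (Just (Just v)) = Right (Just v)                 -- value: an equality condition on v

main :: IO ()
main = mapM_ (print . parseEqField)
  [ Nothing        -- field not given
  , Just Nothing   -- {_eq: null}
  , Just (Just 42) -- {_eq: 42}
  ]
```

Under this scheme the dangerous `{_eq: null}` from the Note becomes a parse error rather than an unconditioned DELETE, while an omitted field still means "no condition was specified".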
-data TyAgg - = TyAgg - { _taTypes :: !TypeMap - , _taFields :: !FieldMap - , _taScalars :: !(Set.HashSet PGScalarType) - , _taOrdBy :: !OrdByCtx - } deriving (Show, Eq) -$(makeLenses ''TyAgg) - -addFieldsToTyAgg :: FieldMap -> TyAgg -> TyAgg -addFieldsToTyAgg fields = - over taFields (Map.union fields) - -addTypeInfoToTyAgg :: TypeInfo -> TyAgg -> TyAgg -addTypeInfoToTyAgg typeInfo tyAgg = - tyAgg & taTypes.at (getNamedTy typeInfo) ?~ typeInfo - -addScalarToTyAgg :: PGScalarType -> TyAgg -> TyAgg -addScalarToTyAgg pgScalarType = - over taScalars (Set.insert pgScalarType) - -instance Semigroup TyAgg where - (TyAgg t1 f1 s1 o1) <> (TyAgg t2 f2 s2 o2) = - TyAgg (Map.union t1 t2) (Map.union f1 f2) - (Set.union s1 s2) (Map.union o1 o2) - -instance Monoid TyAgg where - mempty = TyAgg Map.empty Map.empty Set.empty Map.empty - -type QueryRootFieldMap = Map.HashMap G.Name (QueryCtx, ObjFldInfo) -type MutationRootFieldMap = Map.HashMap G.Name (MutationCtx, ObjFldInfo) - --- | A role-specific mapping from root field names to allowed operations. -data RootFields - = RootFields - { _rootQueryFields :: !QueryRootFieldMap - , _rootMutationFields :: !MutationRootFieldMap - } deriving (Show, Eq) -$(makeLenses ''RootFields) - -addQueryField :: (QueryCtx, ObjFldInfo) -> RootFields -> RootFields -addQueryField v rootFields = - rootFields & rootQueryFields.at (_fiName $ snd v) ?~ v - -addMutationField :: (MutationCtx, ObjFldInfo) -> RootFields -> RootFields -addMutationField v rootFields = - rootFields & rootMutationFields.at (_fiName $ snd v) ?~ v - -instance Semigroup RootFields where - RootFields a1 b1 <> RootFields a2 b2 - = RootFields (Map.union a1 a2) (Map.union b1 b2) - -instance Monoid RootFields where - mempty = RootFields Map.empty Map.empty diff --git a/server/src-lib/Hasura/GraphQL/Schema/Common.hs b/server/src-lib/Hasura/GraphQL/Schema/Common.hs index 500a705f0e6..0a75e600ae2 100644 --- a/server/src-lib/Hasura/GraphQL/Schema/Common.hs +++ b/server/src-lib/Hasura/GraphQL/Schema/Common.hs @@ -1,170 +1,96 @@ -module Hasura.GraphQL.Schema.Common - ( qualObjectToName - , addTypeSuffix - , fromInpValL +module Hasura.GraphQL.Schema.Common where - , RelationshipFieldInfo(..) - , SelField(..) 
- , _SFPGColumn - , getPGColumnFields - , getRelationshipFields - , getComputedFields - , getRemoteRelationships - - , mkColumnType - , mkRelName - , mkAggRelName - , mkConnectionRelName - , mkComputedFieldName - - , mkTableTy - , mkTableConnectionTy - , mkTableEdgeTy - , mkTableEnumType - , mkTableAggTy - - , mkColumnEnumVal - , mkColumnInputVal - , mkDescriptionWith - - , mkFuncArgsTy - - , mkPGColGNameMap - - , numAggregateOps - , compAggregateOps - - , nodeType - , nodeIdType - ) where - -import qualified Data.HashMap.Strict as Map -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Control.Lens - -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Validate.Types import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict.InsOrd as OMap + +import Language.GraphQL.Draft.Syntax as G + +import qualified Data.Text as T +import qualified Hasura.GraphQL.Execute.Types as ET (GraphQLQueryType) +import qualified Hasura.GraphQL.Parser as P +import qualified Hasura.RQL.DML.Select.Types as RQL (Fields) + import Hasura.RQL.Types import Hasura.SQL.Types -data RelationshipFieldInfo - = RelationshipFieldInfo - { _rfiInfo :: !RelInfo - , _rfiAllowAgg :: !Bool - , _rfiColumns :: !PGColGNameMap - , _rfiPermFilter :: !AnnBoolExpPartialSQL - , _rfiPermLimit :: !(Maybe Int) - , _rfiPrimaryKeyColumns :: !(Maybe PrimaryKeyColumns) - , _rfiIsNullable :: !Bool - } deriving (Show, Eq) +data QueryContext = + QueryContext + { qcStringifyNum :: !Bool + , qcQueryType :: !ET.GraphQLQueryType + , qcRemoteFields :: !(HashMap RemoteSchemaName [P.Definition P.FieldInfo]) + } -data SelField - = SFPGColumn !PGColumnInfo - | SFRelationship !RelationshipFieldInfo - | SFComputedField !ComputedField - | SFRemoteRelationship !RemoteFieldInfo +textToName :: MonadError QErr m => Text -> m G.Name +textToName textName = G.mkName textName `onNothing` throw400 ValidationFailed + ("cannot include " <> textName <<> " in the GraphQL schema because " + <> " it is not a valid GraphQL identifier") + +partialSQLExpToUnpreparedValue :: PartialSQLExp -> P.UnpreparedValue +partialSQLExpToUnpreparedValue (PSESessVar pftype var) = P.UVSessionVar pftype var +partialSQLExpToUnpreparedValue (PSESQLExp sqlExp) = P.UVLiteral sqlExp + +mapField + :: Functor m + => P.InputFieldsParser m (Maybe a) + -> (a -> b) + -> P.InputFieldsParser m (Maybe b) +mapField fp f = fmap (fmap f) fp + +parsedSelectionsToFields + :: (Text -> a) -- ^ how to handle @__typename@ fields + -> OMap.InsOrdHashMap G.Name (P.ParsedSelection a) + -> RQL.Fields a +parsedSelectionsToFields mkTypename = OMap.toList + >>> map (FieldName . G.unName *** P.handleTypename (mkTypename . G.unName)) + +numericAggOperators :: [G.Name] +numericAggOperators = + [ $$(G.litName "sum") + , $$(G.litName "avg") + , $$(G.litName "stddev") + , $$(G.litName "stddev_samp") + , $$(G.litName "stddev_pop") + , $$(G.litName "variance") + , $$(G.litName "var_samp") + , $$(G.litName "var_pop") + ] + +comparisonAggOperators :: [G.Name] +comparisonAggOperators = [$$(litName "max"), $$(litName "min")] + +data NodeIdVersion + = NIVersion1 deriving (Show, Eq) -$(makePrisms ''SelField) -getPGColumnFields :: [SelField] -> [PGColumnInfo] -getPGColumnFields = mapMaybe (^? _SFPGColumn) +nodeIdVersionInt :: NodeIdVersion -> Int +nodeIdVersionInt NIVersion1 = 1 -getRelationshipFields :: [SelField] -> [RelationshipFieldInfo] -getRelationshipFields = mapMaybe (^? 
_SFRelationship) +currentNodeIdVersion :: NodeIdVersion +currentNodeIdVersion = NIVersion1 -getComputedFields :: [SelField] -> [ComputedField] -getComputedFields = mapMaybe (^? _SFComputedField) - -getRemoteRelationships :: [SelField] -> [RemoteFieldInfo] -getRemoteRelationships = mapMaybe (^? _SFRemoteRelationship) - -qualObjectToName :: (ToTxt a) => QualifiedObject a -> G.Name -qualObjectToName = G.Name . snakeCaseQualObject - -addTypeSuffix :: Text -> G.NamedType -> G.NamedType -addTypeSuffix suffix baseType = - G.NamedType $ G.unNamedType baseType <> G.Name suffix - -fromInpValL :: [InpValInfo] -> Map.HashMap G.Name InpValInfo -fromInpValL = mapFromL _iviName - -mkRelName :: RelName -> G.Name -mkRelName rn = G.Name $ relNameToTxt rn - -mkAggRelName :: RelName -> G.Name -mkAggRelName rn = G.Name $ relNameToTxt rn <> "_aggregate" - -mkConnectionRelName :: RelName -> G.Name -mkConnectionRelName rn = G.Name $ relNameToTxt rn <> "_connection" - -mkComputedFieldName :: ComputedFieldName -> G.Name -mkComputedFieldName = G.Name . computedFieldNameToText - -mkColumnType :: PGColumnType -> G.NamedType -mkColumnType = \case - PGColumnScalar scalarType -> mkScalarTy scalarType - PGColumnEnumReference (EnumReference enumTable _) -> mkTableEnumType enumTable - -mkTableTy :: QualifiedTable -> G.NamedType -mkTableTy = G.NamedType . qualObjectToName - -mkTableConnectionTy :: QualifiedTable -> G.NamedType -mkTableConnectionTy = addTypeSuffix "Connection" . mkTableTy - -mkTableEdgeTy :: QualifiedTable -> G.NamedType -mkTableEdgeTy = addTypeSuffix "Edge" . mkTableTy - -mkTableEnumType :: QualifiedTable -> G.NamedType -mkTableEnumType = addTypeSuffix "_enum" . mkTableTy - -mkTableAggTy :: QualifiedTable -> G.NamedType -mkTableAggTy = addTypeSuffix "_aggregate" . mkTableTy - --- used for 'distinct_on' in select and upsert's 'update columns' -mkColumnEnumVal :: G.Name -> EnumValInfo -mkColumnEnumVal colName = - EnumValInfo (Just "column name") (G.EnumValue colName) False - -mkColumnInputVal :: PGColumnInfo -> InpValInfo -mkColumnInputVal ci = - InpValInfo (mkDescription <$> pgiDescription ci) (pgiName ci) - Nothing $ G.toGT $ G.toNT $ mkColumnType $ pgiType ci +instance J.FromJSON NodeIdVersion where + parseJSON v = do + versionInt :: Int <- J.parseJSON v + case versionInt of + 1 -> pure NIVersion1 + _ -> fail $ "expecting version 1 for node id, but got " <> show versionInt mkDescriptionWith :: Maybe PGDescription -> Text -> G.Description mkDescriptionWith descM defaultTxt = G.Description $ case descM of Nothing -> defaultTxt Just (PGDescription descTxt) -> T.unlines [descTxt, "\n", defaultTxt] -mkDescription :: PGDescription -> G.Description -mkDescription = G.Description . getPGDescription - -mkFuncArgsName :: QualifiedFunction -> G.Name -mkFuncArgsName fn = - qualObjectToName fn <> "_args" - -mkFuncArgsTy :: QualifiedFunction -> G.NamedType -mkFuncArgsTy = - G.NamedType . 
mkFuncArgsName - -mkPGColGNameMap :: [PGColumnInfo] -> PGColGNameMap -mkPGColGNameMap cols = Map.fromList $ - flip map cols $ \ci -> (pgiName ci, ci) - -numAggregateOps :: [G.Name] -numAggregateOps = [ "sum", "avg", "stddev", "stddev_samp", "stddev_pop" - , "variance", "var_samp", "var_pop" - ] - -compAggregateOps :: [G.Name] -compAggregateOps = ["max", "min"] - -nodeType :: G.NamedType -nodeType = - G.NamedType "Node" - -nodeIdType :: G.GType -nodeIdType = - G.toGT $ G.toNT $ G.NamedType "ID" +-- | The default @'skip' and @'include' directives +defaultDirectives :: [P.DirectiveInfo] +defaultDirectives = + [mkDirective $$(G.litName "skip"), mkDirective $$(G.litName "include")] + where + ifInputField = + P.mkDefinition $$(G.litName "if") Nothing $ P.IFRequired $ P.TNamed $ + P.mkDefinition $$(G.litName "Boolean") Nothing P.TIScalar + dirLocs = map G.DLExecutable + [G.EDLFIELD, G.EDLFRAGMENT_SPREAD, G.EDLINLINE_FRAGMENT] + mkDirective name = + P.DirectiveInfo name Nothing [ifInputField] dirLocs diff --git a/server/src-lib/Hasura/GraphQL/Schema/CustomTypes.hs b/server/src-lib/Hasura/GraphQL/Schema/CustomTypes.hs deleted file mode 100644 index e6d227e9205..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/CustomTypes.hs +++ /dev/null @@ -1,176 +0,0 @@ -module Hasura.GraphQL.Schema.CustomTypes - ( buildCustomTypesSchemaPartial - , buildCustomTypesSchema - ) where - -import Control.Lens - -import qualified Data.HashMap.Strict as Map -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Context (defaultTypes) -import Hasura.GraphQL.Schema.Common (mkTableTy) -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.Session -import Hasura.SQL.Types - -import qualified Hasura.GraphQL.Validate.Types as VT - -buildObjectTypeInfo :: RoleName -> AnnotatedObjectType -> VT.ObjTyInfo -buildObjectTypeInfo roleName annotatedObjectType = - let description = _otdDescription objectDefinition - namedType = unObjectTypeName $ _otdName objectDefinition - fieldMap = mapFromL VT._fiName $ fields <> catMaybes relationships - -- 'mkObjTyInfo' function takes care of inserting `__typename` field - in VT.mkObjTyInfo description namedType mempty fieldMap VT.TLCustom - where - objectDefinition = _aotDefinition annotatedObjectType - - relationships = - flip map (toList $ _aotRelationships annotatedObjectType) $ - \(TypeRelationship name ty remoteTableInfo _) -> - if isJust (getSelectPermissionInfoM remoteTableInfo roleName) || - roleName == adminRoleName - then Just (relationshipToFieldInfo name ty $ _tciName $ _tiCoreInfo remoteTableInfo) - else Nothing - where - relationshipToFieldInfo name relTy remoteTableName = - let fieldTy = case relTy of - ObjRel -> G.toGT $ mkTableTy remoteTableName - ArrRel -> G.toGT $ G.toLT $ mkTableTy remoteTableName - in VT.ObjFldInfo - { VT._fiDesc = Nothing -- TODO - , VT._fiName = unRelationshipName name - , VT._fiParams = mempty - , VT._fiTy = fieldTy - , VT._fiLoc = VT.TLCustom - } - - fields = - map convertObjectFieldDefinition $ - toList $ _otdFields objectDefinition - where - convertObjectFieldDefinition fieldDefinition = - VT.ObjFldInfo - { VT._fiDesc = _ofdDescription fieldDefinition - , VT._fiName = unObjectFieldName $ _ofdName fieldDefinition - , VT._fiParams = mempty - , VT._fiTy = unGraphQLType $ _ofdType fieldDefinition - , VT._fiLoc = VT.TLCustom - } - -buildCustomTypesSchema - :: NonObjectTypeMap -> AnnotatedObjects -> RoleName -> VT.TypeMap -buildCustomTypesSchema nonObjectTypeMap annotatedObjectTypes roleName = - unNonObjectTypeMap 
nonObjectTypeMap <> objectTypeInfos - where - objectTypeInfos = - mapFromL VT.getNamedTy $ - map (VT.TIObj . buildObjectTypeInfo roleName) $ - Map.elems annotatedObjectTypes - -annotateObjectType - :: (MonadError QErr m) - => TableCache -> NonObjectTypeMap -> ObjectTypeDefinition -> m AnnotatedObjectType -annotateObjectType tableCache nonObjectTypeMap objectDefinition = do - annotatedFields <- - fmap Map.fromList $ forM (toList $ _otdFields objectDefinition) $ - \objectField -> do - let fieldName = _ofdName objectField - fieldType = unGraphQLType $ _ofdType objectField - fieldBaseType = G.getBaseType fieldType - baseTypeInfo <- getFieldTypeInfo fieldBaseType - return (fieldName, (fieldType, baseTypeInfo)) - annotatedRelationships <- - fmap Map.fromList $ forM relationships $ - \relationship -> do - let relationshipName = _trName relationship - remoteTable = _trRemoteTable relationship - remoteTableInfo <- onNothing (Map.lookup remoteTable tableCache) $ - throw500 $ "missing table info for: " <>> remoteTable - annotatedFieldMapping <- - forM (_trFieldMapping relationship) $ \remoteTableColumn -> do - let fieldName = fromPGCol remoteTableColumn - onNothing (getPGColumnInfoM remoteTableInfo fieldName) $ - throw500 $ "missing column info of " <> fieldName - <<> " in table" <>> remoteTable - return ( relationshipName - , relationship & trRemoteTable .~ remoteTableInfo - & trFieldMapping .~ annotatedFieldMapping) - return $ AnnotatedObjectType objectDefinition - annotatedFields annotatedRelationships - where - relationships = fromMaybe [] $ _otdRelationships objectDefinition - getFieldTypeInfo typeName = do - let inputTypeInfos = unNonObjectTypeMap nonObjectTypeMap - <> mapFromL VT.getNamedTy defaultTypes - typeInfo <- onNothing (Map.lookup typeName inputTypeInfos) $ - throw500 $ "the type: " <> VT.showNamedTy typeName <> - " is not found in non-object cutom types" - case typeInfo of - VT.TIScalar scalarTypeInfo -> return $ OutputFieldScalar scalarTypeInfo - VT.TIEnum enumTypeInfo -> return $ OutputFieldEnum enumTypeInfo - _ -> throw500 $ - "expecting only scalar/enum typeinfo for an object type's field: " <> - VT.showNamedTy typeName - -buildCustomTypesSchemaPartial - :: (QErrM m) - => TableCache - -> CustomTypes - -> HashSet PGScalarType - -- ^ Postgres base types used in the custom type definitions; - -- see Note [Postgres scalars in custom types]. - -> m (NonObjectTypeMap, AnnotatedObjects) -buildCustomTypesSchemaPartial tableCache customTypes pgScalars = do - let typeInfos = - map (VT.TIEnum . convertEnumDefinition) enumDefinitions <> - map (VT.TIInpObj . convertInputObjectDefinition) inputObjectDefinitions <> - map (VT.TIScalar . convertScalarDefinition) scalarDefinitions <> - map (VT.TIScalar . VT.mkHsraScalarTyInfo) (toList pgScalars) - nonObjectTypeMap = NonObjectTypeMap $ mapFromL VT.getNamedTy typeInfos - - annotatedObjectTypes <- mapFromL (_otdName . 
_aotDefinition) <$> - traverse (annotateObjectType tableCache nonObjectTypeMap) objectDefinitions - - return (nonObjectTypeMap, annotatedObjectTypes) - where - inputObjectDefinitions = fromMaybe [] $ _ctInputObjects customTypes - objectDefinitions = fromMaybe [] $ _ctObjects customTypes - scalarDefinitions = fromMaybe [] $ _ctScalars customTypes - enumDefinitions = fromMaybe [] $ _ctEnums customTypes - - convertScalarDefinition scalarDefinition = - flip VT.fromScalarTyDef VT.TLCustom $ G.ScalarTypeDefinition - (_stdDescription scalarDefinition) - (G.unNamedType $ _stdName scalarDefinition) mempty - - convertEnumDefinition enumDefinition = - VT.EnumTyInfo (_etdDescription enumDefinition) - (unEnumTypeName $ _etdName enumDefinition) - (VT.EnumValuesSynthetic $ mapFromL VT._eviVal $ - map convertEnumValueDefinition $ toList $ _etdValues enumDefinition) - VT.TLCustom - where - convertEnumValueDefinition enumValueDefinition = - VT.EnumValInfo (_evdDescription enumValueDefinition) - (_evdValue enumValueDefinition) - (fromMaybe False $ _evdIsDeprecated enumValueDefinition) - - convertInputObjectDefinition inputObjectDefinition = - VT.InpObjTyInfo - { VT._iotiDesc = _iotdDescription inputObjectDefinition - , VT._iotiName = unInputObjectTypeName $ _iotdName inputObjectDefinition - , VT._iotiFields = mapFromL VT._iviName $ map convertInputFieldDefinition $ - toList $ _iotdFields inputObjectDefinition - , VT._iotiLoc = VT.TLCustom - } - where - convertInputFieldDefinition fieldDefinition = - VT.InpValInfo - { VT._iviDesc = _iofdDescription fieldDefinition - , VT._iviName = unInputObjectFieldName $ _iofdName fieldDefinition - , VT._iviDefVal = Nothing - , VT._iviType = unGraphQLType $ _iofdType fieldDefinition - } diff --git a/server/src-lib/Hasura/GraphQL/Schema/Function.hs b/server/src-lib/Hasura/GraphQL/Schema/Function.hs deleted file mode 100644 index da39113a648..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Function.hs +++ /dev/null @@ -1,149 +0,0 @@ -module Hasura.GraphQL.Schema.Function - ( procFuncArgs - , mkFuncArgsInp - , mkFuncQueryFld - , mkFuncQueryConnectionFld - , mkFuncAggQueryFld - , mkFuncArgsTy - , mkFuncArgItemSeq - ) where - -import qualified Data.Sequence as Seq -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Select -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.SQL.Types - -{- -input function_args { - arg1: arg-type1! - . . - . . - argn: arg-typen! 
-} --} - -procFuncArgs :: Seq.Seq a -> (a -> Maybe FunctionArgName) -> (a -> Text -> b) -> [b] -procFuncArgs argSeq nameFn resultFn = - fst $ foldl mkItem ([], 1::Int) argSeq - where - mkItem (items, argNo) fa = - case nameFn fa of - Just argName -> - let argT = getFuncArgNameTxt argName - in (items <> pure (resultFn fa argT), argNo) - Nothing -> - let argT = "arg_" <> T.pack (show argNo) - in (items <> pure (resultFn fa argT), argNo + 1) - -mkFuncArgsInp :: QualifiedFunction -> Seq.Seq FunctionArg -> Maybe InpObjTyInfo -mkFuncArgsInp funcName funcArgs = - bool (Just inpObj) Nothing $ null funcArgs - where - funcArgsTy = mkFuncArgsTy funcName - - inpObj = mkHsraInpTyInfo Nothing funcArgsTy $ - fromInpValL argInps - - argInps = procFuncArgs funcArgs faName mkInpVal - - mkInpVal fa t = - InpValInfo Nothing (G.Name t) Nothing $ - G.toGT $ mkScalarTy $ _qptName $ faType fa - -{- - -function( - args: function_args - where: table_bool_exp - limit: Int - offset: Int -): [table!]! - --} - -mkFuncArgs :: FunctionInfo -> ParamMap -mkFuncArgs funInfo = - fromInpValL $ funcInpArgs <> mkSelArgs retTable - where - funcName = fiName funInfo - funcArgs = getInputArgs funInfo - retTable = fiReturnType funInfo - - funcArgDesc = G.Description $ "input parameters for function " <>> funcName - funcInpArg = InpValInfo (Just funcArgDesc) "args" Nothing $ G.toGT $ G.toNT $ - mkFuncArgsTy funcName - funcInpArgs = bool [funcInpArg] [] $ null funcArgs - -mkFuncQueryFld - :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo -mkFuncQueryFld funInfo descM = - mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty - where - retTable = fiReturnType funInfo - funcName = fiName funInfo - - desc = mkDescriptionWith descM $ "execute function " <> funcName - <<> " which returns " <>> retTable - fldName = qualObjectToName funcName - - ty = G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkTableTy retTable - -mkFuncQueryConnectionFld - :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo -mkFuncQueryConnectionFld funInfo descM = - mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty - where - retTable = fiReturnType funInfo - funcName = fiName funInfo - - desc = mkDescriptionWith descM $ "execute function " <> funcName - <<> " which returns " <>> retTable - fldName = qualObjectToName funcName <> "_connection" - - ty = G.toGT $ G.toNT $ mkTableConnectionTy retTable - -{- - -function_aggregate( - args: function_args - where: table_bool_exp - limit: Int - offset: Int -): table_aggregate! 
- --} - -mkFuncAggQueryFld - :: FunctionInfo -> Maybe PGDescription -> ObjFldInfo -mkFuncAggQueryFld funInfo descM = - mkHsraObjFldInfo (Just desc) fldName (mkFuncArgs funInfo) ty - where - funcName = fiName funInfo - retTable = fiReturnType funInfo - - desc = mkDescriptionWith descM $ "execute function " <> funcName - <<> " and query aggregates on result of table type " - <>> retTable - - fldName = qualObjectToName funcName <> "_aggregate" - - ty = G.toGT $ G.toNT $ mkTableAggTy retTable - - -mkFuncArgItemSeq :: FunctionInfo -> Seq (InputArgument FunctionArgItem) -mkFuncArgItemSeq functionInfo = - let inputArgs = fiInputArgs functionInfo - in Seq.fromList $ procFuncArgs inputArgs nameFn resultFn - where - nameFn = \case - IAUserProvided fa -> faName fa - IASessionVariables name -> Just name - resultFn arg gName = flip fmap arg $ - \fa -> FunctionArgItem (G.Name gName) (faName fa) (faHasDefault fa) diff --git a/server/src-lib/Hasura/GraphQL/Schema/Insert.hs b/server/src-lib/Hasura/GraphQL/Schema/Insert.hs new file mode 100644 index 00000000000..4f26bea2c70 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Schema/Insert.hs @@ -0,0 +1,76 @@ +module Hasura.GraphQL.Schema.Insert where + +import Hasura.Prelude + +import qualified Hasura.RQL.DML.Insert.Types as RQL +import qualified Hasura.RQL.DML.Returning.Types as RQL + +import Hasura.RQL.Types.BoolExp +import Hasura.RQL.Types.Column +import Hasura.RQL.Types.Common +import Hasura.SQL.Types + + +-- At time of writing (August 2020), GraphQL queries and mutations get +-- translated into corresponding RQL queries: RQL is used as the internal +-- intermediary representation, before a query gets translated into +-- SQL. However, the RQL insert representation does not support nested insertions, +-- which means that GraphQL inserts need a separate representation, found here. + +-- FIXME: this code doesn't belong in this folder: arguably, since this is an +-- internal representation of a mutation, it should belong alongside RQL rather +-- than alongside the schema code, especially if we transition RQL to only be an +-- intermediary representation library rather than an actual API (see [1] for +-- more information). +-- [1] https://gist.github.com/abooij/07165b5ac36097178a334bc03805c33b + +-- FIXME: this representation was lifted almost verbatim from pre-PDV code, and +-- hasn't been adapted to reflect the changes that PDV brought. It is therefore +-- quite likely that some of the information stored in those structures is +-- redundant, and that they can be simplified.
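Before the data declarations that follow, a rough illustration of the nested insertions mentioned above may help. This sketch is editorial, not part of the patch, and every name in it is invented; it only shows the shape that a flat single-table insert representation cannot express: rows that carry further rows through object and array relationships.

```
-- Toy nested-insert tree: scalar columns plus child inserts reached
-- through object relationships (one row each) and array relationships
-- (many rows each). AnnInsObj below plays this role for real.
data ToyInsert = ToyInsert
  { tiTable   :: String
  , tiColumns :: [(String, String)]      -- (column name, literal value)
  , tiObjRels :: [(String, ToyInsert)]   -- nested single-row inserts
  , tiArrRels :: [(String, [ToyInsert])] -- nested multi-row inserts
  }

-- How many rows such a mutation would write in total.
totalRows :: ToyInsert -> Int
totalRows ins =
  1 + sum (map (totalRows . snd) (tiObjRels ins))
    + sum (map (sum . map totalRows . snd) (tiArrRels ins))
```

For example, inserting one author together with two articles is a tree of depth two with a `totalRows` of 3, which a flat insert representation would have to receive as three separate, correctly ordered statements.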
+ +data AnnInsert v + = AnnInsert + { _aiFieldName :: !Text + , _aiIsSingle :: Bool + , _aiData :: AnnMultiInsert v + } + +data AnnIns a v + = AnnIns + { _aiInsObj :: !a + , _aiTableName :: !QualifiedTable + , _aiConflictClause :: !(Maybe (RQL.ConflictClauseP1 v)) + , _aiCheckCond :: !(AnnBoolExp v, Maybe (AnnBoolExp v)) + , _aiTableCols :: ![PGColumnInfo] + , _aiDefVals :: !(PreSetColsG v) + } deriving (Show, Eq) + +type SingleObjIns v = AnnIns (AnnInsObj v) v +type MultiObjIns v = AnnIns [AnnInsObj v] v + +data RelIns a + = RelIns + { _riAnnIns :: !a + , _riRelInfo :: !RelInfo + } deriving (Show, Eq) + +type ObjRelIns v = RelIns (SingleObjIns v) +type ArrRelIns v = RelIns (MultiObjIns v) + +data AnnInsObj v + = AnnInsObj + { _aioColumns :: ![(PGCol, v)] + , _aioObjRels :: ![ObjRelIns v] + , _aioArrRels :: ![ArrRelIns v] + } deriving (Show, Eq) + +type AnnSingleInsert v = (SingleObjIns v, RQL.MutationOutputG v) +type AnnMultiInsert v = (MultiObjIns v, RQL.MutationOutputG v) + +instance Semigroup (AnnInsObj v) where + (AnnInsObj col1 obj1 rel1) <> (AnnInsObj col2 obj2 rel2) = + AnnInsObj (col1 <> col2) (obj1 <> obj2) (rel1 <> rel2) + +instance Monoid (AnnInsObj v) where + mempty = AnnInsObj [] [] [] diff --git a/server/src-lib/Hasura/GraphQL/Schema/Introspect.hs b/server/src-lib/Hasura/GraphQL/Schema/Introspect.hs new file mode 100644 index 00000000000..22802209868 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Schema/Introspect.hs @@ -0,0 +1,602 @@ +module Hasura.GraphQL.Schema.Introspect where + +import Hasura.Prelude +-- import qualified Hasura.RQL.Types + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.List.NonEmpty as NE +import qualified Data.Text as T +import qualified Data.Vector as V +import qualified Language.GraphQL.Draft.Printer as GP +import qualified Language.GraphQL.Draft.Syntax as G +import qualified Text.Builder as T + +import qualified Hasura.GraphQL.Parser as P + +import Hasura.GraphQL.Parser (FieldParser, Kind (..), Parser, Schema (..)) +import Hasura.GraphQL.Parser.Class + +{- +Note [Basics of introspection schema generation] +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +We generate the introspection schema from the existing schema for queries, +mutations and subscriptions. In other words, we generate one @Parser@ from some +other @Parser@s. In this way, we avoid having to remember what types we have to +expose through introspection explicitly, as we did in a previous version of +graphql-engine. + +However the schema information is obtained, the @Schema@ type stores it. From a +@Schema@ object we then produce one @FieldParser@ that reads a `__schema` field, +and one that reads a `__type` field. The idea is that these parsers simply +output a JSON value directly, and so indeed the type of @schema@, for instance, +is @FieldParser n J.Value@. + +The idea of "just output the JSON object directly" breaks down when we want to +output a list of things, however, such as in the `types` field of `__schema`. +In the case of `types`, the JSON object to be generated is influenced by the +underlying selection set, so that, for instance, + +``` +query { + __schema { + types { + name + } + } +} +``` + +means that we only output the _name_ of every type in our schema. 
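To make the role of the selection set concrete before continuing, here is a tiny, self-contained model (editorial illustration only; every name is invented and aeson stands in for the real machinery) of how the requested fields shape the generated JSON:

```
{-# LANGUAGE OverloadedStrings #-}
-- Selecting only "name" yields JSON objects containing only names.
import qualified Data.Aeson as J
import qualified Data.Map.Strict as M
import           Data.Text (Text, unpack)

data ToyType = ToyType { tyName :: Text, tyDesc :: Maybe Text }

-- Render one type, keeping exactly the requested fields.
renderType :: [Text] -> ToyType -> Either String J.Value
renderType fields ty = do
  pairs <- traverse fieldValue fields
  pure $ J.toJSON (M.fromList pairs)
  where
    fieldValue :: Text -> Either String (Text, J.Value)
    fieldValue "name"        = Right ("name", J.String (tyName ty))
    fieldValue "description" = Right ("description", maybe J.Null J.String (tyDesc ty))
    fieldValue f             = Left ("no such field: " <> unpack f)
```

The complication the Note tackles next is that the real parser must process the selection set *once*, yet produce this output for *every* type in the schema.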
One naive +approach one might consider here would be to have a parser + +``` +typeField :: P.Type k -> Parser n J.Value +``` + +that takes a type, and is able to produce a JSON value for it, and then to apply +this parser to all the types in our schema. + +However, we only have *one* selection set to parse: so which of the parsers we +obtained should we use to parse it? And what should we do in the theoretical +case that we have a schema without any types? (The latter is actually not +possible since we always have `query_root`, but it illustrates the problem that +there is no canonical choice of type to use to parse the selection set.) +Additionally, this would only allow us to get the JSON output for *one* type, rather +than for our list of types. After all, @Parser n@ is *not* a @Monad@ (it's not +even an @Applicative@), so we don't have a map @(a -> Parser n b) -> [a] -> Parser n +[b]@. + +In order to resolve this conundrum, let's consider what the ordinary Postgres +schema generates for a query such as the following. + +``` +query { + author { + articles { + title + } + } +} +``` + +Now the @Parser@ for an article's title does not directly give the desired +output data: indeed, there would be many different titles, rather than any +single one we can return. Instead, it returns a value that can, after parsing, +be used to build an output, along the lines of: + +``` +articleTitle :: FieldParser n SQLArticleTitle +``` + +(This is a heavily simplified representation of reality.) + +These values can be thought of as an intermediate representation that can then +be used to generate and run SQL that gives the desired JSON output at a later +stage. In other words, in the above example, @SQLArticleTitle@ can be thought +of as a function @Article -> Title@ that, given an article, gives back its +title. + +Such an approach could help us as well: from instructions on how to generate +a JSON return for a given `__Type`, we can later simply apply this +construction to all desired types. + +However, we don't _need_ to build an intermediate AST to do this: we can simply +output the conversion functions directly. So the type of @typeField@ is closer +to: + +``` +typeField :: Parser n (P.Type k -> J.Value) +``` + +This says that we are always able to parse a selection set for a `__Type`, and +once we do, we obtain a map, which we refer to as `printer` in this module, +which can output a JSON object for a given GraphQL type from our schema. + +To use `typeField` as part of another selection set, we build up a corresponding +`FieldParser`, thus obtaining a printer, then apply this printer to all desired +types, and output the final JSON object as a J.Array of the printed results, +like so (again, heavily simplified): + +``` + types :: FieldParser n J.Value + types = do + printer <- P.subselection_ $$(G.litName "types") Nothing typeField + return $ J.Array $ map printer $ allSchemaTypes +``` + +Upon reading this you may be bewildered as to how we are able to use do notation for +@FieldParser@, which does not have a @Monad@ instance, or even an @Applicative@ +instance. It just so happens that, as long as we write our do blocks carefully, +so that we only use the functoriality of @FieldParser@, the simplification rules +of GHC kick in just in time to avoid any application of @(>>=)@ or @return@. +Arguably the above notation is prettier than equivalent code that explicitly +reduces this to applications of @fmap@.
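As a footnote to the last point, the simplification rules in question are presumably GHC's ApplicativeDo desugaring (an assumption; the module itself does not say), under which a do block of the shape `x <- e; return (f x)` needs only a Functor instance. A minimal, runnable demonstration of that claim, with an invented Box type standing in for the far more involved @FieldParser@:

```
{-# LANGUAGE ApplicativeDo #-}
-- With ApplicativeDo, the do block below desugars to fmap (+ 1) (Box 1):
-- no (>>=) and no return are emitted, so a bare Functor suffices.
newtype Box a = Box a deriving Show

instance Functor Box where
  fmap f (Box a) = Box (f a)

demo :: Box Int
demo = do
  x <- Box 1
  return (x + 1)

main :: IO ()
main = print demo -- Box 2
```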
If you, dear reader, feel like the do +notation adds more confusion than value, you should feel free to change this, as +there is no deeper meaning to the application of do notation than ease of +reading. +-} + +-- | Generate a __type introspection parser +typeIntrospection + :: forall n + . MonadParse n + => Schema + -> FieldParser n J.Value +typeIntrospection fakeSchema = do + let nameArg :: P.InputFieldsParser n G.Name + nameArg = G.unsafeMkName <$> P.field $$(G.litName "name") Nothing P.string + name'printer <- P.subselection $$(G.litName "__type") Nothing nameArg typeField + return $ case Map.lookup (fst name'printer) (sTypes fakeSchema) of + Nothing -> J.Null + Just (P.Definition n u d (P.SomeTypeInfo i)) -> + snd name'printer (SomeType (P.Nullable (P.TNamed (P.Definition n u d i)))) + +-- | Generate a __schema introspection parser. +schema + :: forall n + . MonadParse n + => Schema + -> FieldParser n J.Value +schema fakeSchema = + let schemaSetParser = schemaSet fakeSchema + in P.subselection_ $$(G.litName "__schema") Nothing schemaSetParser + +{- +type __Type { + kind: __TypeKind! + name: String + description: String + + # should be non-null for OBJECT and INTERFACE only, must be null for the others + fields(includeDeprecated: Boolean = false): [__Field!] + + # should be non-null for OBJECT and INTERFACE only, must be null for the others + interfaces: [__Type!] + + # should be non-null for INTERFACE and UNION only, always null for the others + possibleTypes: [__Type!] + + # should be non-null for ENUM only, must be null for the others + enumValues(includeDeprecated: Boolean = false): [__EnumValue!] + + # should be non-null for INPUT_OBJECT only, must be null for the others + inputFields: [__InputValue!] + + # should be non-null for NON_NULL and LIST only, must be null for the others + ofType: __Type +} +-} + +data SomeType = forall k . SomeType (P.Type k) + +typeField + :: forall n + . 
MonadParse n + => Parser 'Output n (SomeType -> J.Value) +typeField = + let + includeDeprecated :: P.InputFieldsParser n Bool + includeDeprecated = + P.fieldWithDefault $$(G.litName "includeDeprecated") Nothing (G.VBoolean False) (P.nullable P.boolean) + <&> fromMaybe False + kind :: FieldParser n (SomeType -> J.Value) + kind = P.selection_ $$(G.litName "kind") Nothing typeKind $> + \case SomeType tp -> + case tp of + P.NonNullable _ -> + J.String "NON_NULL" + P.Nullable (P.TList _) -> + J.String "LIST" + P.Nullable (P.TNamed (P.Definition _ _ _ P.TIScalar)) -> + J.String "SCALAR" + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIEnum _))) -> + J.String "ENUM" + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInputObject _))) -> + J.String "INPUT_OBJECT" + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject _))) -> + J.String "OBJECT" + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface _))) -> + J.String "INTERFACE" + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIUnion _))) -> + J.String "UNION" + name :: FieldParser n (SomeType -> J.Value) + name = P.selection_ $$(G.litName "name") Nothing P.string $> + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition name' _ _ _)) -> + nameAsJSON name' + _ -> J.Null + description :: FieldParser n (SomeType -> J.Value) + description = P.selection_ $$(G.litName "description") Nothing P.string $> + \case SomeType tp -> + case P.discardNullability tp of + P.TNamed (P.Definition _ _ (Just desc) _) -> + J.String (G.unDescription desc) + _ -> J.Null + fields :: FieldParser n (SomeType -> J.Value) + fields = do + -- TODO handle the value of includeDeprecated + includeDeprecated'printer <- P.subselection $$(G.litName "fields") Nothing includeDeprecated fieldField + return $ + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject (P.ObjectInfo fields' _interfaces')))) -> + J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName fields' + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface (P.InterfaceInfo fields' _objects')))) -> + J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName fields' + _ -> J.Null + interfaces :: FieldParser n (SomeType -> J.Value) + interfaces = do + printer <- P.subselection_ $$(G.litName "interfaces") Nothing typeField + return $ + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIObject (P.ObjectInfo _fields' interfaces')))) -> + J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . fmap P.TIInterface) $ sortOn P.dName interfaces' + _ -> J.Null + possibleTypes :: FieldParser n (SomeType -> J.Value) + possibleTypes = do + printer <- P.subselection_ $$(G.litName "possibleTypes") Nothing typeField + return $ + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInterface (P.InterfaceInfo _fields' objects')))) -> + J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . fmap P.TIObject) $ sortOn P.dName objects' + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIUnion (P.UnionInfo objects')))) -> + J.Array $ V.fromList $ fmap (printer . SomeType . P.Nullable . P.TNamed . 
fmap P.TIObject) $ sortOn P.dName objects' + _ -> J.Null + enumValues :: FieldParser n (SomeType -> J.Value) + enumValues = do + -- TODO handle the value of includeDeprecated + includeDeprecated'printer <- P.subselection $$(G.litName "enumValues") Nothing includeDeprecated enumValue + return $ + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIEnum vals))) -> + J.Array $ V.fromList $ fmap (snd includeDeprecated'printer) $ sortOn P.dName $ toList vals + _ -> J.Null + inputFields :: FieldParser n (SomeType -> J.Value) + inputFields = do + printer <- P.subselection_ $$(G.litName "inputFields") Nothing inputValue + return $ + \case SomeType tp -> + case tp of + P.Nullable (P.TNamed (P.Definition _ _ _ (P.TIInputObject (P.InputObjectInfo fieldDefs)))) -> + J.Array $ V.fromList $ map printer $ sortOn P.dName fieldDefs + _ -> J.Null + ofType :: FieldParser n (SomeType -> J.Value) + ofType = do + printer <- P.subselection_ $$(G.litName "ofType") Nothing typeField + return $ \case + SomeType (P.NonNullable x) -> + printer $ SomeType $ P.Nullable x + SomeType (P.Nullable (P.TList x)) -> + printer $ SomeType x + _ -> J.Null + in + applyPrinter <$> + P.selectionSet + $$(G.litName "__Type") + Nothing + [ kind + , name + , description + , fields + , interfaces + , possibleTypes + , enumValues + , inputFields + , ofType + ] + +{- +type __InputValue { + name: String! + description: String + type: __Type! + defaultValue: String +} +-} +inputValue + :: forall n + . MonadParse n + => Parser 'Output n (P.Definition P.InputFieldInfo -> J.Value) +inputValue = + let + name :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value) + name = P.selection_ $$(G.litName "name") Nothing P.string $> + nameAsJSON . P.dName + description :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value) + description = P.selection_ $$(G.litName "description") Nothing P.string $> + maybe J.Null (J.String . G.unDescription) . P.dDescription + typeF :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value) + typeF = do + printer <- P.subselection_ $$(G.litName "type") Nothing typeField + return $ \defInfo -> case P.dInfo defInfo of + P.IFRequired tp -> printer $ SomeType $ P.NonNullable tp + P.IFOptional tp _ -> printer $ SomeType tp + defaultValue :: FieldParser n (P.Definition P.InputFieldInfo -> J.Value) + defaultValue = P.selection_ $$(G.litName "defaultValue") Nothing P.string $> + \defInfo -> case P.dInfo defInfo of + P.IFOptional _ (Just val) -> J.String $ T.run $ GP.value val + _ -> J.Null + in + applyPrinter <$> + P.selectionSet + $$(G.litName "__InputValue") + Nothing + [ name + , description + , typeF + , defaultValue + ] + +{- +type __EnumValue { + name: String! + description: String + isDeprecated: Boolean! + deprecationReason: String +} +-} +enumValue + :: forall n + . MonadParse n + => Parser 'Output n (P.Definition P.EnumValueInfo -> J.Value) +enumValue = + let + name :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value) + name = P.selection_ $$(G.litName "name") Nothing P.string $> + nameAsJSON . P.dName + description :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value) + description = P.selection_ $$(G.litName "description") Nothing P.string $> + maybe J.Null (J.String . G.unDescription) . 
P.dDescription + -- TODO We don't seem to support enum value deprecation + isDeprecated :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value) + isDeprecated = P.selection_ $$(G.litName "isDeprecated") Nothing P.string $> + const (J.Bool False) + deprecationReason :: FieldParser n (P.Definition P.EnumValueInfo -> J.Value) + deprecationReason = P.selection_ $$(G.litName "deprecationReason") Nothing P.string $> + const J.Null + in + applyPrinter <$> + P.selectionSet + $$(G.litName "__EnumValue") + Nothing + [ name + , description + , isDeprecated + , deprecationReason + ] + +{- +enum __TypeKind { + ENUM + INPUT_OBJECT + INTERFACE + LIST + NON_NULL + OBJECT + SCALAR + UNION +} +-} +typeKind + :: forall n + . MonadParse n + => Parser 'Both n () +typeKind = P.enum + $$(G.litName "__TypeKind") + Nothing + (NE.fromList + [ mkDefinition $$(G.litName "ENUM") + , mkDefinition $$(G.litName "INPUT_OBJECT") + , mkDefinition $$(G.litName "INTERFACE") + , mkDefinition $$(G.litName "LIST") + , mkDefinition $$(G.litName "NON_NULL") + , mkDefinition $$(G.litName "OBJECT") + , mkDefinition $$(G.litName "SCALAR") + , mkDefinition $$(G.litName "UNION") + ]) + where + mkDefinition name = (P.Definition name Nothing Nothing P.EnumValueInfo, ()) + +{- +type __Field { + name: String! + description: String + args: [__InputValue!]! + type: __Type! + isDeprecated: Boolean! + deprecationReason: String +} +-} +fieldField + :: forall n + . MonadParse n + => Parser 'Output n (P.Definition P.FieldInfo -> J.Value) +fieldField = + let + name :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + name = P.selection_ $$(G.litName "name") Nothing P.string $> + nameAsJSON . P.dName + description :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + description = P.selection_ $$(G.litName "description") Nothing P.string $> \defInfo -> + case P.dDescription defInfo of + Nothing -> J.Null + Just desc -> J.String (G.unDescription desc) + args :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + args = do + printer <- P.subselection_ $$(G.litName "args") Nothing inputValue + return $ J.Array . V.fromList . map printer . sortOn P.dName . P.fArguments . P.dInfo + typeF :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + typeF = do + printer <- P.subselection_ $$(G.litName "type") Nothing typeField + return $ printer . (\case P.FieldInfo _ tp -> SomeType tp) . P.dInfo + -- TODO We don't seem to track deprecation info + isDeprecated :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + isDeprecated = P.selection_ $$(G.litName "isDeprecated") Nothing P.string $> + const (J.Bool False) + deprecationReason :: FieldParser n (P.Definition P.FieldInfo -> J.Value) + deprecationReason = P.selection_ $$(G.litName "deprecationReason") Nothing P.string $> + const J.Null + in + applyPrinter <$> + P.selectionSet $$(G.litName "__Field") Nothing + [ name + , description + , args + , typeF + , isDeprecated + , deprecationReason + ] + +{- +type __Directive { + name: String! + description: String + locations: [__DirectiveLocation!]! + args: [__InputValue!]! + isRepeatable: Boolean! +} +-} + +directiveSet + :: forall n + . MonadParse n + => Parser 'Output n (P.DirectiveInfo -> J.Value) +directiveSet = + let + name :: FieldParser n (P.DirectiveInfo -> J.Value) + name = P.selection_ $$(G.litName "name") Nothing P.string $> + (J.toJSON . P.diName) + description :: FieldParser n (P.DirectiveInfo -> J.Value) + description = P.selection_ $$(G.litName "description") Nothing P.string $> + (J.toJSON . 
P.diDescription) + locations :: FieldParser n (P.DirectiveInfo -> J.Value) + locations = P.selection_ $$(G.litName "locations") Nothing P.string $> + (J.toJSON . map showDirLoc . P.diLocations) + args :: FieldParser n (P.DirectiveInfo -> J.Value) + args = do + printer <- P.subselection_ $$(G.litName "args") Nothing inputValue + pure $ J.toJSON . map printer . P.diArguments + isRepeatable :: FieldParser n (P.DirectiveInfo -> J.Value) + isRepeatable = P.selection_ $$(G.litName "isRepeatable") Nothing P.string $> + const J.Null + in + applyPrinter <$> P.selectionSet $$(G.litName "__Directive") Nothing + [ name + , description + , locations + , args + , isRepeatable + ] + where + showDirLoc :: G.DirectiveLocation -> Text + showDirLoc = \case + G.DLExecutable edl -> T.pack $ drop 3 $ show edl + G.DLTypeSystem tsdl -> T.pack $ drop 4 $ show tsdl + + +{- +type __Schema { + description: String + types: [__Type!]! + queryType: __Type! + mutationType: __Type + subscriptionType: __Type + directives: [__Directive!]! +} +-} +schemaSet + :: forall n + . MonadParse n + => Schema + -> Parser 'Output n J.Value +schemaSet fakeSchema = + let + description :: FieldParser n J.Value + description = P.selection_ $$(G.litName "description") Nothing P.string $> + case sDescription fakeSchema of + Nothing -> J.Null + Just s -> J.String $ G.unDescription s + types :: FieldParser n J.Value + types = do + printer <- P.subselection_ $$(G.litName "types") Nothing typeField + return $ J.Array $ V.fromList $ map (printer . schemaTypeToSomeType) $ + sortOn P.dName $ Map.elems $ sTypes fakeSchema + where + schemaTypeToSomeType + :: P.Definition P.SomeTypeInfo + -> SomeType + schemaTypeToSomeType (P.Definition n u d (P.SomeTypeInfo i)) = + SomeType $ P.Nullable $ P.TNamed (P.Definition n u d i) + queryType :: FieldParser n J.Value + queryType = do + printer <- P.subselection_ $$(G.litName "queryType") Nothing typeField + return $ printer $ SomeType $ sQueryType fakeSchema + mutationType :: FieldParser n J.Value + mutationType = do + printer <- P.subselection_ $$(G.litName "mutationType") Nothing typeField + return $ case sMutationType fakeSchema of + Nothing -> J.Null + Just tp -> printer $ SomeType tp + subscriptionType :: FieldParser n J.Value + subscriptionType = do + printer <- P.subselection_ $$(G.litName "subscriptionType") Nothing typeField + return $ case sSubscriptionType fakeSchema of + Nothing -> J.Null + Just tp -> printer $ SomeType tp + directives :: FieldParser n J.Value + directives = do + printer <- P.subselection_ $$(G.litName "directives") Nothing directiveSet + return $ J.toJSON $ map printer $ sDirectives fakeSchema + in + selectionSetToJSON . fmap (P.handleTypename nameAsJSON) <$> + P.selectionSet + $$(G.litName "__Schema") + Nothing + [ description + , types + , queryType + , mutationType + , subscriptionType + , directives + ] + +selectionSetToJSON + :: OMap.InsOrdHashMap G.Name J.Value + -> J.Value +selectionSetToJSON = J.Object . OMap.toHashMap . OMap.mapKeys G.unName + +applyPrinter + :: OMap.InsOrdHashMap G.Name (P.ParsedSelection (a -> J.Value)) + -> a + -> J.Value +applyPrinter = flip (\x -> selectionSetToJSON . fmap (($ x) . P.handleTypename (const . nameAsJSON))) + +nameAsJSON :: G.Name -> J.Value +nameAsJSON = J.String . 
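+-- End-to-end sketch: running the 'schemaSet' parser on the selection set of
+{-
+query { __schema { queryType { name } } }
+-}
+-- yields a J.Value directly, e.g. { "queryType": { "name": "query_root" } },
+-- assuming the conventional root type name "query_root" (illustrative).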
G.unName diff --git a/server/src-lib/Hasura/GraphQL/Schema/Merge.hs b/server/src-lib/Hasura/GraphQL/Schema/Merge.hs deleted file mode 100644 index a1d90d898cd..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Merge.hs +++ /dev/null @@ -1,152 +0,0 @@ -module Hasura.GraphQL.Schema.Merge - ( mergeGCtx - , checkSchemaConflicts - , checkConflictingNode - ) where - - -import qualified Data.HashMap.Strict as Map -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Context -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types - -mergeGCtx :: (MonadError QErr m) => GCtx -> GCtx -> m GCtx -mergeGCtx lGCtx rGCtx = do - checkSchemaConflicts lGCtx rGCtx - pure GCtx { _gTypes = mergedTypeMap - , _gFields = _gFields lGCtx <> _gFields rGCtx - , _gQueryRoot = mergedQueryRoot - , _gMutRoot = mergedMutationRoot - , _gSubRoot = mergedSubRoot - , _gOrdByCtx = _gOrdByCtx lGCtx <> _gOrdByCtx rGCtx - , _gQueryCtxMap = _gQueryCtxMap lGCtx <> _gQueryCtxMap rGCtx - , _gMutationCtxMap = _gMutationCtxMap lGCtx <> _gMutationCtxMap rGCtx - , _gInsCtxMap = _gInsCtxMap lGCtx <> _gInsCtxMap rGCtx - } - where - mergedQueryRoot = _gQueryRoot lGCtx <> _gQueryRoot rGCtx - mergedMutationRoot = _gMutRoot lGCtx <> _gMutRoot rGCtx - mergedSubRoot = _gSubRoot lGCtx <> _gSubRoot rGCtx - mergedTypeMap = - let mergedTypes = _gTypes lGCtx <> _gTypes rGCtx - modifyQueryRootField = Map.insert queryRootNamedType (TIObj mergedQueryRoot) - modifyMaybeRootField tyname maybeObj m = case maybeObj of - Nothing -> m - Just obj -> Map.insert tyname (TIObj obj) m - in modifyMaybeRootField subscriptionRootNamedType mergedSubRoot $ - modifyMaybeRootField mutationRootNamedType mergedMutationRoot $ - modifyQueryRootField mergedTypes - -checkSchemaConflicts - :: (MonadError QErr m) - => GCtx -> GCtx -> m () -checkSchemaConflicts gCtx remoteCtx = do - let typeMap = _gTypes gCtx -- hasura typemap - -- check type conflicts - let hTypes = Map.elems typeMap - hTyNames = map G.unNamedType $ Map.keys typeMap - -- get the root names from the remote schema - rmQRootName = _otiName $ _gQueryRoot remoteCtx - rmMRootName = maybeToList $ _otiName <$> _gMutRoot remoteCtx - rmSRootName = maybeToList $ _otiName <$> _gSubRoot remoteCtx - rmRootNames = map G.unNamedType (rmQRootName:(rmMRootName ++ rmSRootName)) - let rmTypes = Map.filterWithKey - (\k _ -> G.unNamedType k `notElem` builtinTy ++ rmRootNames) - $ _gTypes remoteCtx - - isTyInfoSame ty = any (`tyinfoEq` ty) hTypes - -- name is same and structure is not same - isSame n ty = G.unNamedType n `elem` hTyNames && - not (isTyInfoSame ty) - conflictedTypes = Map.filterWithKey isSame rmTypes - conflictedTyNames = map G.unNamedType $ Map.keys conflictedTypes - - unless (Map.null conflictedTypes) $ - throw400 RemoteSchemaConflicts $ tyMsg conflictedTyNames - - -- check node conflicts - let rmQRoot = _otiFields $ _gQueryRoot remoteCtx - rmMRoot = _otiFields <$> _gMutRoot remoteCtx - rmRoots = filter (`notElem` builtinNodes ++ rmRootNames) . 
Map.keys <$> - mergeMaybeMaps (Just rmQRoot) rmMRoot - hQR = _otiFields <$> - (getObjTyM =<< Map.lookup hQRName typeMap) - hMR = _otiFields <$> - (getObjTyM =<< Map.lookup hMRName typeMap) - hRoots = Map.keys <$> mergeMaybeMaps hQR hMR - - case (rmRoots, hRoots) of - (Just rmR, Just hR) -> do - let conflictedNodes = filter (`elem` hR) rmR - unless (null conflictedNodes) $ - throw400 RemoteSchemaConflicts $ nodesMsg conflictedNodes - _ -> return () - - where - tyinfoEq a b = case (a, b) of - (TIScalar t1, TIScalar t2) -> typeEq t1 t2 - (TIObj t1, TIObj t2) -> typeEq t1 t2 - (TIEnum t1, TIEnum t2) -> typeEq t1 t2 - (TIInpObj t1, TIInpObj t2) -> typeEq t1 t2 - _ -> False - - hQRName = queryRootNamedType - hMRName = mutationRootNamedType - tyMsg ty = "types: [ " <> namesToTxt ty <> - " ] have mismatch with current graphql schema. HINT: Types must be same." - nodesMsg n = "top-level nodes: [ " <> namesToTxt n <> - " ] already exist in current graphql schema. HINT: Top-level nodes can't be same." - namesToTxt = T.intercalate ", " . map G.unName - builtinNodes = ["__type", "__schema", "__typename"] - builtinTy = [ "__Directive" - , "__DirectiveLocation" - , "__EnumValue" - , "__Field" - , "__InputValue" - , "__Schema" - , "__Type" - , "__TypeKind" - , "Int" - , "Float" - , "String" - , "Boolean" - , "ID" - ] - -checkConflictingNode - :: (MonadError QErr m) - => TypeMap - -- ^ See 'GCtx'. - -> G.Name - -> m () -checkConflictingNode typeMap node = do - let hQR = _otiFields <$> - (getObjTyM =<< Map.lookup hQRName typeMap) - hMR = _otiFields <$> - (getObjTyM =<< Map.lookup hMRName typeMap) - hRoots = Map.keys <$> mergeMaybeMaps hQR hMR - case hRoots of - Just hR -> - when (node `elem` hR) $ - throw400 RemoteSchemaConflicts msg - _ -> return () - where - hQRName = queryRootNamedType - hMRName = mutationRootNamedType - msg = "node " <> G.unName node <> - " already exists in current graphql schema" - -mergeMaybeMaps - :: (Eq k, Hashable k) - => Maybe (Map.HashMap k v) - -> Maybe (Map.HashMap k v) - -> Maybe (Map.HashMap k v) -mergeMaybeMaps m1 m2 = case (m1, m2) of - (Nothing, Nothing) -> Nothing - (Just m1', Nothing) -> Just m1' - (Nothing, Just m2') -> Just m2' - (Just m1', Just m2') -> Just $ Map.union m1' m2' diff --git a/server/src-lib/Hasura/GraphQL/Schema/Mutation.hs b/server/src-lib/Hasura/GraphQL/Schema/Mutation.hs new file mode 100644 index 00000000000..db470a5c0dd --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Schema/Mutation.hs @@ -0,0 +1,531 @@ +{-# LANGUAGE ViewPatterns #-} + +module Hasura.GraphQL.Schema.Mutation + ( insertIntoTable + , insertOneIntoTable + , updateTable + , updateTableByPk + , deleteFromTable + , deleteFromTableByPk + ) where + + +import Data.Has +import Hasura.Prelude + +import qualified Data.HashMap.Strict as Map +import qualified Data.HashMap.Strict.InsOrd.Extended as OMap +import qualified Data.HashSet as Set +import qualified Data.Text as T +import qualified Language.GraphQL.Draft.Syntax as G + +import qualified Hasura.GraphQL.Parser as P +import qualified Hasura.RQL.DML.Delete.Types as RQL +import qualified Hasura.RQL.DML.Insert.Types as RQL +import qualified Hasura.RQL.DML.Returning.Types as RQL +import qualified Hasura.RQL.DML.Update as RQL +import qualified Hasura.RQL.DML.Update.Types as RQL +import qualified Hasura.SQL.DML as S + +import Hasura.GraphQL.Parser (FieldParser, InputFieldsParser, Kind (..), + Parser, UnpreparedValue (..), mkParameter) +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Schema.BoolExp +import Hasura.GraphQL.Schema.Common 
+import Hasura.GraphQL.Schema.Insert +import Hasura.GraphQL.Schema.Select +import Hasura.GraphQL.Schema.Table +import Hasura.RQL.Types +import Hasura.SQL.Types + + + +-- insert + +-- | Construct a root field, normally called insert_tablename, that can be used to add several rows to a DB table +insertIntoTable + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> InsPermInfo -- ^ insert permissions of the table + -> Maybe SelPermInfo -- ^ select permissions of the table (if any) + -> Maybe UpdPermInfo -- ^ update permissions of the table (if any) + -> m (FieldParser n (AnnInsert UnpreparedValue)) +insertIntoTable table fieldName description insertPerms selectPerms updatePerms = do + columns <- tableColumns table + selectionParser <- mutationSelectionSet table selectPerms + objectsParser <- P.list <$> tableFieldsInput table insertPerms + conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms + let objectsName = $$(G.litName "objects") + objectsDesc = "the rows to be inserted" + argsParser = do + conflictClause <- mkConflictClause conflictParser + objects <- P.field objectsName (Just objectsDesc) objectsParser + pure (conflictClause, objects) + pure $ P.subselection fieldName description argsParser selectionParser + <&> \((conflictClause, objects), output) -> AnnInsert (G.unName fieldName) False + ( mkInsertObject objects table columns conflictClause insertPerms updatePerms + , RQL.MOutMultirowFields output + ) + +mkConflictClause :: MonadParse n => Maybe (Parser 'Input n a) -> InputFieldsParser n (Maybe a) +mkConflictClause conflictParser + = maybe + (pure Nothing) -- Empty InputFieldsParser (no arguments allowed) + (P.fieldOptional conflictName (Just conflictDesc)) + conflictParser + where + conflictName = $$(G.litName "on_conflict") + conflictDesc = "on conflict condition" + +-- | Variant of 'insertIntoTable' that inserts a single row +insertOneIntoTable + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> InsPermInfo -- ^ insert permissions of the table + -> SelPermInfo -- ^ select permissions of the table + -> Maybe UpdPermInfo -- ^ update permissions of the table (if any) + -> m (FieldParser n (AnnInsert UnpreparedValue)) +insertOneIntoTable table fieldName description insertPerms selectPerms updatePerms = do + columns <- tableColumns table + selectionParser <- tableSelectionSet table selectPerms + objectParser <- tableFieldsInput table insertPerms + conflictParser <- fmap join $ sequenceA $ conflictObject table (Just selectPerms) <$> updatePerms + let objectName = $$(G.litName "object") + objectDesc = "the row to be inserted" + argsParser = do + conflictClause <- mkConflictClause conflictParser + object <- P.field objectName (Just objectDesc) objectParser + pure (conflictClause, object) + pure $ P.subselection fieldName description argsParser selectionParser + <&> \((conflictClause, object), output) -> AnnInsert (G.unName fieldName) True + ( mkInsertObject [object] table columns conflictClause insertPerms updatePerms + , RQL.MOutSinglerowObject output + ) + +-- | We specify the data of an individual row to insert through this input parser. +tableFieldsInput + :: forall m n r. 
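+-- Shape of the root fields built by 'insertIntoTable' and
+-- 'insertOneIntoTable' (a sketch for a hypothetical table "article";
+-- on_conflict is only offered when update permissions allow upserts):
+{-
+insert_article(
+  objects: [article_insert_input!]!
+  on_conflict: article_on_conflict
+): article_mutation_response
+
+insert_article_one(
+  object: article_insert_input!
+  on_conflict: article_on_conflict
+): article
+-}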
(MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable -- ^ qualified name of the table + -> InsPermInfo -- ^ insert permissions of the table + -> m (Parser 'Input n (AnnInsObj UnpreparedValue)) +tableFieldsInput table insertPerms = memoizeOn 'tableFieldsInput table do + tableName <- qualifiedObjectToName table + allFields <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table + objectFields <- catMaybes <$> for (Map.elems allFields) \case + FIComputedField _ -> pure Nothing + FIRemoteRelationship _ -> pure Nothing + FIColumn columnInfo -> + whenMaybe (Set.member (pgiColumn columnInfo) (ipiCols insertPerms)) do + let columnName = pgiName columnInfo + columnDesc = pgiDescription columnInfo + fieldParser <- P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo) + pure $ P.fieldOptional columnName columnDesc fieldParser `mapField` + \(mkParameter -> value) -> AnnInsObj [(pgiColumn columnInfo, value)] [] [] + FIRelationship relationshipInfo -> runMaybeT $ do + let otherTable = riRTable relationshipInfo + relName = riName relationshipInfo + permissions <- MaybeT $ tablePermissions otherTable + relFieldName <- lift $ textToName $ relNameToTxt relName + insPerms <- MaybeT $ pure $ _permIns permissions + let selPerms = _permSel permissions + updPerms = _permUpd permissions + lift $ case riType relationshipInfo of + ObjRel -> do + parser <- objectRelationshipInput otherTable insPerms selPerms updPerms + pure $ P.fieldOptional relFieldName Nothing parser `mapField` + \objRelIns -> AnnInsObj [] [RelIns objRelIns relationshipInfo] [] + ArrRel -> do + parser <- P.nullable <$> arrayRelationshipInput otherTable insPerms selPerms updPerms + pure $ P.fieldOptional relFieldName Nothing parser <&> \arrRelIns -> do + rel <- join arrRelIns + Just $ AnnInsObj [] [] [RelIns rel relationshipInfo | not $ null $ _aiInsObj rel] + let objectName = tableName <> $$(G.litName "_insert_input") + objectDesc = G.Description $ "input type for inserting data into table " <>> table + pure $ P.object objectName (Just objectDesc) $ catMaybes <$> sequenceA objectFields + <&> mconcat + +-- | Used by 'tableFieldsInput' for object data that is nested through object relationships +objectRelationshipInput + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> InsPermInfo + -> Maybe SelPermInfo + -> Maybe UpdPermInfo + -> m (Parser 'Input n (SingleObjIns UnpreparedValue)) +objectRelationshipInput table insertPerms selectPerms updatePerms = + memoizeOn 'objectRelationshipInput table do + tableName <- qualifiedObjectToName table + columns <- tableColumns table + objectParser <- tableFieldsInput table insertPerms + conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms + let objectName = $$(G.litName "data") + inputName = tableName <> $$(G.litName "_obj_rel_insert_input") + inputDesc = G.Description $ "input type for inserting object relation for remote table " <>> table + inputParser = do + conflictClause <- mkConflictClause conflictParser + object <- P.field objectName Nothing objectParser + pure $ mkInsertObject object table columns conflictClause insertPerms updatePerms + pure $ P.object inputName (Just inputDesc) inputParser + +-- | Used by 'tableFieldsInput' for array data that is nested through array relationships +arrayRelationshipInput + :: forall m n r. 
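+-- 'objectRelationshipInput' above generates an input object of this shape
+-- (a sketch for a hypothetical table "article"):
+{-
+input article_obj_rel_insert_input {
+  data: article_insert_input!
+  on_conflict: article_on_conflict
+}
+-}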
(MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> InsPermInfo + -> Maybe SelPermInfo + -> Maybe UpdPermInfo + -> m (Parser 'Input n (MultiObjIns UnpreparedValue)) +arrayRelationshipInput table insertPerms selectPerms updatePerms = + memoizeOn 'arrayRelationshipInput table do + tableName <- qualifiedObjectToName table + columns <- tableColumns table + objectParser <- tableFieldsInput table insertPerms + conflictParser <- fmap join $ sequenceA $ conflictObject table selectPerms <$> updatePerms + let objectsName = $$(G.litName "data") + inputName = tableName <> $$(G.litName "_arr_rel_insert_input") + inputDesc = G.Description $ "input type for inserting array relation for remote table " <>> table + inputParser = do + conflictClause <- mkConflictClause conflictParser + objects <- P.field objectsName Nothing $ P.list objectParser + pure $ mkInsertObject objects table columns conflictClause insertPerms updatePerms + pure $ P.object inputName (Just inputDesc) inputParser + +mkInsertObject + :: a + -> QualifiedTable + -> [PGColumnInfo] + -> Maybe (RQL.ConflictClauseP1 UnpreparedValue) + -> InsPermInfo + -> Maybe UpdPermInfo + -> AnnIns a UnpreparedValue +mkInsertObject objects table columns conflictClause insertPerms updatePerms = + AnnIns { _aiInsObj = objects + , _aiTableName = table + , _aiConflictClause = conflictClause + , _aiCheckCond = (insertCheck, updateCheck) + , _aiTableCols = columns + , _aiDefVals = defaultValues + } + where insertCheck = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ ipiCheck insertPerms + updateCheck = fmapAnnBoolExp partialSQLExpToUnpreparedValue <$> (upiCheck =<< updatePerms) + defaultValues = Map.union (partialSQLExpToUnpreparedValue <$> ipiSet insertPerms) + $ fmap UVLiteral $ S.mkColDefValMap $ map pgiColumn columns + +-- | Specifies the "ON CONFLICT" SQL clause +conflictObject + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> Maybe SelPermInfo + -> UpdPermInfo + -> m (Maybe (Parser 'Input n (RQL.ConflictClauseP1 UnpreparedValue))) +conflictObject table selectPerms updatePerms = runMaybeT $ do + tableName <- lift $ qualifiedObjectToName table + columnsEnum <- MaybeT $ tableUpdateColumnsEnum table updatePerms + constraints <- MaybeT $ tciUniqueOrPrimaryKeyConstraints . _tiCoreInfo <$> askTableInfo table + constraintParser <- lift $ conflictConstraint constraints table + whereExpParser <- lift $ boolExp table selectPerms + let objectName = tableName <> $$(G.litName "_on_conflict") + objectDesc = G.Description $ "on conflict condition type for table " <>> table + constraintName = $$(G.litName "constraint") + columnsName = $$(G.litName "update_columns") + whereExpName = $$(G.litName "where") + fieldsParser = do + constraint <- RQL.CTConstraint <$> P.field constraintName Nothing constraintParser + columns <- P.field columnsName Nothing $ P.list columnsEnum + whereExp <- P.fieldOptional whereExpName Nothing whereExpParser + pure $ case columns of + [] -> RQL.CP1DoNothing $ Just constraint + _ -> RQL.CP1Update constraint columns preSetColumns $ + BoolAnd $ catMaybes [whereExp, Just $ fmapAnnBoolExp partialSQLExpToUnpreparedValue $ upiFilter updatePerms] + pure $ P.object objectName (Just objectDesc) fieldsParser + where preSetColumns = partialSQLExpToUnpreparedValue <$> upiSet updatePerms + +conflictConstraint + :: forall m n r. 
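+-- 'conflictObject' above generates an input object of this shape (a sketch
+-- for a hypothetical table "article"); per the case above, an empty
+-- update_columns list degrades the clause to ON CONFLICT DO NOTHING:
+{-
+input article_on_conflict {
+  constraint: article_constraint!
+  update_columns: [article_update_column!]!
+  where: article_bool_exp
+}
+-}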
(MonadSchema n m, MonadTableInfo r m) + => NonEmpty Constraint + -> QualifiedTable + -> m (Parser 'Both n ConstraintName) +conflictConstraint constraints table = memoizeOn 'conflictConstraint table $ do + tableName <- qualifiedObjectToName table + constraintEnumValues <- for constraints \constraint -> do + name <- textToName $ getConstraintTxt $ _cName constraint + pure ( P.mkDefinition name (Just "unique or primary key constraint") P.EnumValueInfo + , _cName constraint + ) + let enumName = tableName <> $$(G.litName "_constraint") + enumDesc = G.Description $ "unique or primary key constraints on table " <>> table + pure $ P.enum enumName (Just enumDesc) constraintEnumValues + + + +-- update + +-- | Construct a root field, normally called update_tablename, that can be used +-- to update rows in a DB table specified by filters. Only returns a parser if +-- there are columns the user is allowed to update; otherwise returns Nothing. +updateTable + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> UpdPermInfo -- ^ update permissions of the table + -> Maybe SelPermInfo -- ^ select permissions of the table (if any) + -> m (Maybe (FieldParser n (RQL.AnnUpdG UnpreparedValue))) +updateTable table fieldName description updatePerms selectPerms = runMaybeT $ do + let whereName = $$(G.litName "where") + whereDesc = "filter the rows which have to be updated" + opArgs <- MaybeT $ updateOperators table updatePerms + columns <- lift $ tableColumns table + whereArg <- lift $ P.field whereName (Just whereDesc) <$> boolExp table selectPerms + selection <- lift $ mutationSelectionSet table selectPerms + let argsParser = liftA2 (,) opArgs whereArg + pure $ P.subselection fieldName description argsParser selection + <&> mkUpdateObject table columns updatePerms . fmap RQL.MOutMultirowFields + +-- | Construct a root field, normally called update_tablename_by_pk, that can be +-- used to update a single row in a DB table, specified by primary key. Only returns a +-- parser if there are columns the user is allowed to update and if the user has +-- select permissions on all primary keys; otherwise returns Nothing. +updateTableByPk + :: forall m n r. 
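+-- Shape of the fields built by 'updateTable' and 'updateTableByPk' (a sketch
+-- for a hypothetical table "article"; jsonb operators such as _append appear
+-- only when the table has updatable jsonb columns):
+{-
+update_article(
+  where: article_bool_exp!
+  _set: article_set_input
+  _inc: article_inc_input
+): article_mutation_response
+
+update_article_by_pk(
+  pk_columns: article_pk_columns_input!
+  _set: article_set_input
+  _inc: article_inc_input
+): article
+-}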
(MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> UpdPermInfo -- ^ update permissions of the table + -> SelPermInfo -- ^ select permissions of the table + -> m (Maybe (FieldParser n (RQL.AnnUpdG UnpreparedValue))) +updateTableByPk table fieldName description updatePerms selectPerms = runMaybeT $ do + tableName <- qualifiedObjectToName table + columns <- lift $ tableSelectColumns table selectPerms + pkArgs <- MaybeT $ primaryKeysArguments table selectPerms + opArgs <- MaybeT $ updateOperators table updatePerms + selection <- lift $ tableSelectionSet table selectPerms + let pkFieldName = $$(G.litName "pk_columns") + pkObjectName = tableName <> $$(G.litName "_pk_columns_input") + pkObjectDesc = G.Description $ "primary key columns input for table: " <> G.unName tableName + argsParser = do + operators <- opArgs + primaryKeys <- P.field pkFieldName Nothing $ P.object pkObjectName (Just pkObjectDesc) pkArgs + pure (operators, primaryKeys) + pure $ P.subselection fieldName description argsParser selection + <&> mkUpdateObject table columns updatePerms . fmap RQL.MOutSinglerowObject + +mkUpdateObject + :: QualifiedTable + -> [PGColumnInfo] + -> UpdPermInfo + -> ( ( [(PGCol, RQL.UpdOpExpG UnpreparedValue)] + , AnnBoolExp UnpreparedValue + ) + , RQL.MutationOutputG UnpreparedValue + ) + -> RQL.AnnUpdG UnpreparedValue +mkUpdateObject table columns updatePerms ((opExps, whereExp), mutationOutput) = + RQL.AnnUpd { RQL.uqp1Table = table + , RQL.uqp1OpExps = opExps + , RQL.uqp1Where = (permissionFilter, whereExp) + , RQL.uqp1Check = checkExp + , RQL.uqp1Output = mutationOutput + , RQL.uqp1AllCols = columns + } + where + permissionFilter = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ upiFilter updatePerms + checkExp = maybe annBoolExpTrue (fmapAnnBoolExp partialSQLExpToUnpreparedValue) $ upiCheck updatePerms + +-- | Various update operators +updateOperators + :: forall m n r. (MonadSchema n m, MonadTableInfo r m) + => QualifiedTable -- ^ qualified name of the table + -> UpdPermInfo -- ^ update permissions of the table + -> m (Maybe (InputFieldsParser n [(PGCol, RQL.UpdOpExpG UnpreparedValue)])) +updateOperators table updatePermissions = do + tableName <- qualifiedObjectToName table + columns <- tableUpdateColumns table updatePermissions + let numericCols = onlyNumCols columns + jsonCols = onlyJSONBCols columns + parsers <- catMaybes <$> sequenceA + [ updateOperator tableName $$(G.litName "_set") + columnParser RQL.UpdSet columns + "sets the columns of the filtered rows to the given values" + (G.Description $ "input type for updating data in table " <>> table) + + , updateOperator tableName $$(G.litName "_inc") + columnParser RQL.UpdInc numericCols + "increments the numeric columns of the filtered rows with the given values" + (G.Description $ "input type for incrementing numeric columns in table " <>> table) + + , let desc = "prepend existing jsonb value of filtered columns with new jsonb value" + in updateOperator tableName $$(G.litName "_prepend") + columnParser RQL.UpdPrepend jsonCols desc desc + + , let desc = "append existing jsonb value of filtered columns with new jsonb value" + in updateOperator tableName $$(G.litName "_append") + columnParser RQL.UpdAppend jsonCols desc desc + + , let desc = "delete key/value pair or string element. 
key/value pairs are matched based on their key value" + in updateOperator tableName $$(G.litName "_delete_key") + nullableTextParser RQL.UpdDeleteKey jsonCols desc desc + + , let desc = "delete the array element with specified index (negative integers count from the end). " + <> "throws an error if top level container is not an array" + in updateOperator tableName $$(G.litName "_delete_elem") + nonNullableIntParser RQL.UpdDeleteElem jsonCols desc desc + + , let desc = "delete the field or element with specified path (for JSON arrays, negative integers count from the end)" + in updateOperator tableName $$(G.litName "_delete_at_path") + (fmap P.list . nonNullableTextParser) RQL.UpdDeleteAtPath jsonCols desc desc + ] + whenMaybe (not $ null parsers) do + let allowedOperators = fst <$> parsers + pure $ fmap catMaybes (sequenceA $ snd <$> parsers) + `P.bindFields` \opExps -> do + -- there needs to be at least one operator in the update; columns preset + -- by the update permissions count towards this + let presetColumns = Map.toList $ RQL.UpdSet . partialSQLExpToUnpreparedValue <$> upiSet updatePermissions + when (null opExps && null presetColumns) $ parseError $ + "at least one of " <> T.intercalate ", " allowedOperators <> " is expected" + + -- no column should appear twice + let flattenedExps = concat opExps + erroneousExps = OMap.filter ((>1) . length) $ OMap.groupTuples flattenedExps + unless (OMap.null erroneousExps) $ parseError $ + "column found in multiple operators; " <> + T.intercalate ". " [ dquote column <> " in " <> T.intercalate ", " (toList $ RQL.updateOperatorText <$> ops) + | (column, ops) <- OMap.toList erroneousExps + ] + + pure $ presetColumns <> flattenedExps + where + columnParser columnInfo = fmap P.mkParameter <$> P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo) + nonNullableTextParser _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGText) (G.Nullability False) + nullableTextParser _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGText) (G.Nullability True) + nonNullableIntParser _ = fmap P.mkParameter <$> P.column (PGColumnScalar PGInteger) (G.Nullability False) + + updateOperator + :: G.Name + -> G.Name + -> (PGColumnInfo -> m (Parser 'Both n a)) + -> (a -> RQL.UpdOpExpG UnpreparedValue) + -> [PGColumnInfo] + -> G.Description + -> G.Description + -> m (Maybe (Text, InputFieldsParser n (Maybe [(PGCol, RQL.UpdOpExpG UnpreparedValue)]))) + updateOperator tableName opName mkParser updOpExp columns opDesc objDesc = + whenMaybe (not $ null columns) do + fields <- for columns \columnInfo -> do + let fieldName = pgiName columnInfo + fieldDesc = pgiDescription columnInfo + fieldParser <- mkParser columnInfo + pure $ P.fieldOptional fieldName fieldDesc fieldParser + `mapField` \value -> (pgiColumn columnInfo, updOpExp value) + let objName = tableName <> opName <> $$(G.litName "_input") + pure $ (G.unName opName,) + $ P.fieldOptional opName (Just opDesc) + $ P.object objName (Just objDesc) + $ catMaybes <$> sequenceA fields + + + +-- delete + +-- | Construct a root field, normally called delete_tablename, that can be used +-- to delete several rows from a DB table +deleteFromTable + :: forall m n r. 
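+-- Shape of the fields built by 'deleteFromTable' and 'deleteFromTableByPk'
+-- (a sketch for a hypothetical table "article" with an integer primary key
+-- column "id"):
+{-
+delete_article(
+  where: article_bool_exp!
+): article_mutation_response
+
+delete_article_by_pk(
+  id: Int!
+): article
+-}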
(MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> DelPermInfo -- ^ delete permissions of the table + -> Maybe SelPermInfo -- ^ select permissions of the table (if any) + -> m (FieldParser n (RQL.AnnDelG UnpreparedValue)) +deleteFromTable table fieldName description deletePerms selectPerms = do + let whereName = $$(G.litName "where") + whereDesc = "filter the rows which have to be deleted" + whereArg <- P.field whereName (Just whereDesc) <$> boolExp table selectPerms + selection <- mutationSelectionSet table selectPerms + columns <- tableColumns table + pure $ P.subselection fieldName description whereArg selection + <&> mkDeleteObject table columns deletePerms . fmap RQL.MOutMultirowFields + +-- | Construct a root field, normally called delete_tablename_by_pk, that can be +-- used to delete an individual row from a DB table, specified by primary key +deleteFromTableByPk + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> DelPermInfo -- ^ delete permissions of the table + -> SelPermInfo -- ^ select permissions of the table + -> m (Maybe (FieldParser n (RQL.AnnDelG UnpreparedValue))) +deleteFromTableByPk table fieldName description deletePerms selectPerms = runMaybeT $ do + columns <- lift $ tableSelectColumns table selectPerms + pkArgs <- MaybeT $ primaryKeysArguments table selectPerms + selection <- lift $ tableSelectionSet table selectPerms + pure $ P.subselection fieldName description pkArgs selection + <&> mkDeleteObject table columns deletePerms . fmap RQL.MOutSinglerowObject + +mkDeleteObject + :: QualifiedTable + -> [PGColumnInfo] + -> DelPermInfo + -> (AnnBoolExp UnpreparedValue, RQL.MutationOutputG UnpreparedValue) + -> RQL.AnnDelG UnpreparedValue +mkDeleteObject table columns deletePerms (whereExp, mutationOutput) = + RQL.AnnDel { RQL.dqp1Table = table + , RQL.dqp1Where = (permissionFilter, whereExp) + , RQL.dqp1Output = mutationOutput + , RQL.dqp1AllCols = columns + } + where + permissionFilter = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ dpiFilter deletePerms + + + +-- common + +-- | All mutations allow returning results, such as what the updated database +-- rows look like. This parser allows a query to specify what data to fetch. +mutationSelectionSet + :: forall m n r. 
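+-- 'mutationSelectionSet' generates an output type of this shape (a sketch
+-- for a hypothetical table "article"; "returning" is only offered when the
+-- role has select permissions on the table):
+{-
+type article_mutation_response {
+  affected_rows: Int!
+  returning: [article!]!
+}
+-}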
(MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable + -> Maybe SelPermInfo + -> m (Parser 'Output n (RQL.MutFldsG UnpreparedValue)) +mutationSelectionSet table selectPerms = + memoizeOn 'mutationSelectionSet table do + tableName <- qualifiedObjectToName table + returning <- runMaybeT do + permissions <- MaybeT $ pure selectPerms + tableSet <- lift $ tableSelectionList table permissions + let returningName = $$(G.litName "returning") + returningDesc = "data from the rows affected by the mutation" + pure $ RQL.MRet <$> P.subselection_ returningName (Just returningDesc) tableSet + let affectedRowsName = $$(G.litName "affected_rows") + affectedRowsDesc = "number of rows affected by the mutation" + selectionName = tableName <> $$(G.litName "_mutation_response") + selectionDesc = G.Description $ "response of any mutation on the table " <>> table + + selectionFields = catMaybes + [ Just $ RQL.MCount <$ + P.selection_ affectedRowsName (Just affectedRowsDesc) P.int + , returning + ] + pure $ P.selectionSet selectionName (Just selectionDesc) selectionFields + <&> parsedSelectionsToFields RQL.MExp + +-- | How to specify a database row by primary key. +primaryKeysArguments + :: forall m n r. (MonadSchema n m, MonadTableInfo r m) + => QualifiedTable + -> SelPermInfo + -> m (Maybe (InputFieldsParser n (AnnBoolExp UnpreparedValue))) +primaryKeysArguments table selectPerms = runMaybeT $ do + primaryKeys <- MaybeT $ _tciPrimaryKey . _tiCoreInfo <$> askTableInfo table + let columns = _pkColumns primaryKeys + guard $ all (\c -> pgiColumn c `Set.member` spiCols selectPerms) columns + lift $ fmap (BoolAnd . toList) . sequenceA <$> for columns \columnInfo -> do + field <- P.column (pgiType columnInfo) (G.Nullability False) + pure $ BoolFld . AVCol columnInfo . pure . AEQ True . mkParameter <$> + P.field (pgiName columnInfo) (pgiDescription columnInfo) field diff --git a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Common.hs b/server/src-lib/Hasura/GraphQL/Schema/Mutation/Common.hs deleted file mode 100644 index b1c61c3694c..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Common.hs +++ /dev/null @@ -1,51 +0,0 @@ -module Hasura.GraphQL.Schema.Mutation.Common - ( mkPGColInp - , mkMutRespTy - , mkMutRespObj - ) where - -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.SQL.Types - -mkPGColInp :: PGColumnInfo -> InpValInfo -mkPGColInp ci = - InpValInfo Nothing (pgiName ci) Nothing $ G.toGT $ mkColumnType $ pgiType ci - --- table_mutation_response -mkMutRespTy :: QualifiedTable -> G.NamedType -mkMutRespTy tn = - G.NamedType $ qualObjectToName tn <> "_mutation_response" - -{- -type table_mutation_response { - affected_rows: Int! - returning: [table!]! 
-} --} -mkMutRespObj - :: QualifiedTable - -> Bool -- is sel perm defined - -> ObjTyInfo -mkMutRespObj tn sel = - mkHsraObjTyInfo (Just objDesc) (mkMutRespTy tn) Set.empty $ mapFromL _fiName - $ affectedRowsFld : bool [] [returningFld] sel - where - objDesc = G.Description $ - "response of any mutation on the table " <>> tn - affectedRowsFld = - mkHsraObjFldInfo (Just desc) "affected_rows" Map.empty $ - G.toGT $ G.toNT $ mkScalarTy PGInteger - where - desc = "number of affected rows by the mutation" - returningFld = - mkHsraObjFldInfo (Just desc) "returning" Map.empty $ - G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkTableTy tn - where - desc = "data of the affected rows by the mutation" diff --git a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Delete.hs b/server/src-lib/Hasura/GraphQL/Schema/Mutation/Delete.hs deleted file mode 100644 index 89e1bd72473..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Delete.hs +++ /dev/null @@ -1,57 +0,0 @@ -module Hasura.GraphQL.Schema.Mutation.Delete - ( mkDelMutFld - , mkDeleteByPkMutationField - ) where - -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Schema.BoolExp -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Mutation.Common -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.SQL.Types - -{- - -delete_table( - where : table_bool_exp! -): table_mutation_response - --} - -mkDelMutFld :: Maybe G.Name -> QualifiedTable -> ObjFldInfo -mkDelMutFld mCustomName tn = - mkHsraObjFldInfo (Just desc) fldName (fromInpValL [filterArg]) $ - G.toGT $ mkMutRespTy tn - where - desc = G.Description $ "delete data from the table: " <>> tn - - defFldName = "delete_" <> qualObjectToName tn - fldName = fromMaybe defFldName mCustomName - - filterArgDesc = "filter the rows which have to be deleted" - filterArg = - InpValInfo (Just filterArgDesc) "where" Nothing $ G.toGT $ - G.toNT $ mkBoolExpTy tn - -{- -delete_table_by_pk( - col1: col-ty1! - col2: col-ty2! 
-): table --} - -mkDeleteByPkMutationField - :: Maybe G.Name - -> QualifiedTable - -> PrimaryKey PGColumnInfo - -> ObjFldInfo -mkDeleteByPkMutationField mCustomName qt primaryKey = - mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputArgs) $ - G.toGT $ mkTableTy qt - where - description = G.Description $ "delete single row from the table: " <>> qt - fieldName = flip fromMaybe mCustomName $ "delete_" <> qualObjectToName qt <> "_by_pk" - inputArgs = map mkColumnInputVal $ toList $ _pkColumns primaryKey diff --git a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Insert.hs b/server/src-lib/Hasura/GraphQL/Schema/Mutation/Insert.hs deleted file mode 100644 index 2adcb291792..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Insert.hs +++ /dev/null @@ -1,236 +0,0 @@ -module Hasura.GraphQL.Schema.Mutation.Insert - ( mkInsInp - , mkInsInpTy - , mkRelInsInps - , mkInsMutFld - , mkInsertOneMutationField - , mkOnConflictTypes - ) where - -import qualified Data.HashMap.Strict as Map -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Resolve.Types -import Hasura.GraphQL.Schema.BoolExp -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Mutation.Common -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.SQL.Types - --- table_insert_input -mkInsInpTy :: QualifiedTable -> G.NamedType -mkInsInpTy tn = - G.NamedType $ qualObjectToName tn <> "_insert_input" - --- table_obj_rel_insert_input -mkObjInsInpTy :: QualifiedTable -> G.NamedType -mkObjInsInpTy tn = - G.NamedType $ qualObjectToName tn <> "_obj_rel_insert_input" - --- table_arr_rel_insert_input -mkArrInsInpTy :: QualifiedTable -> G.NamedType -mkArrInsInpTy tn = - G.NamedType $ qualObjectToName tn <> "_arr_rel_insert_input" - - --- table_on_conflict -mkOnConflictInpTy :: QualifiedTable -> G.NamedType -mkOnConflictInpTy tn = - G.NamedType $ qualObjectToName tn <> "_on_conflict" - --- table_constraint -mkConstraintInpTy :: QualifiedTable -> G.NamedType -mkConstraintInpTy tn = - G.NamedType $ qualObjectToName tn <> "_constraint" - --- table_update_column -mkUpdColumnInpTy :: QualifiedTable -> G.NamedType -mkUpdColumnInpTy tn = - G.NamedType $ qualObjectToName tn <> "_update_column" - -{- -input table_obj_rel_insert_input { - data: table_insert_input! - on_conflict: table_on_conflict -} - --} - -{- -input table_arr_rel_insert_input { - data: [table_insert_input!]! - on_conflict: table_on_conflict -} - --} - -mkRelInsInps - :: QualifiedTable -> Bool -> [InpObjTyInfo] -mkRelInsInps tn upsertAllowed = [objRelInsInp, arrRelInsInp] - where - onConflictInpVal = - InpValInfo Nothing "on_conflict" Nothing $ G.toGT $ mkOnConflictInpTy tn - - onConflictInp = bool [] [onConflictInpVal] upsertAllowed - - objRelDesc = G.Description $ - "input type for inserting object relation for remote table " <>> tn - - objRelDataInp = InpValInfo Nothing "data" Nothing $ G.toGT $ - G.toNT $ mkInsInpTy tn - objRelInsInp = mkHsraInpTyInfo (Just objRelDesc) (mkObjInsInpTy tn) - $ fromInpValL $ objRelDataInp : onConflictInp - - arrRelDesc = G.Description $ - "input type for inserting array relation for remote table " <>> tn - - arrRelDataInp = InpValInfo Nothing "data" Nothing $ G.toGT $ - G.toNT $ G.toLT $ G.toNT $ mkInsInpTy tn - arrRelInsInp = mkHsraInpTyInfo (Just arrRelDesc) (mkArrInsInpTy tn) - $ fromInpValL $ arrRelDataInp : onConflictInp - -{- - -input table_insert_input { - col1: colty1 - . - . 
- coln: coltyn -} - --} - -mkInsInp - :: QualifiedTable -> [PGColumnInfo] -> RelationInfoMap -> InpObjTyInfo -mkInsInp tn insCols relInfoMap = - mkHsraInpTyInfo (Just desc) (mkInsInpTy tn) $ fromInpValL $ - map mkPGColInp insCols <> relInps - where - desc = G.Description $ - "input type for inserting data into table " <>> tn - - relInps = flip map (Map.toList relInfoMap) $ - \(relName, relInfo) -> - let remoteQT = riRTable relInfo - tyMaker = case riType relInfo of - ObjRel -> mkObjInsInpTy - ArrRel -> mkArrInsInpTy - in InpValInfo Nothing (mkRelName relName) Nothing $ - G.toGT $ tyMaker remoteQT - - -{- - -input table_on_conflict { - constraint: table_constraint! - update_columns: [table_column!] - where: table_bool_exp -} - --} - -mkOnConflictInp :: QualifiedTable -> InpObjTyInfo -mkOnConflictInp tn = - mkHsraInpTyInfo (Just desc) (mkOnConflictInpTy tn) $ fromInpValL - [constraintInpVal, updateColumnsInpVal, whereInpVal] - where - desc = G.Description $ - "on conflict condition type for table " <>> tn - - constraintInpVal = InpValInfo Nothing (G.Name "constraint") Nothing $ - G.toGT $ G.toNT $ mkConstraintInpTy tn - - updateColumnsInpVal = InpValInfo Nothing (G.Name "update_columns") Nothing $ - G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkUpdColumnInpTy tn - - whereInpVal = InpValInfo Nothing (G.Name "where") Nothing $ - G.toGT $ mkBoolExpTy tn -{- - -insert_table( - objects: [table_insert_input!]! - on_conflict: table_on_conflict - ): table_mutation_response! --} - -mkInsMutFld :: Maybe G.Name -> QualifiedTable -> Bool -> ObjFldInfo -mkInsMutFld mCustomName tn isUpsertable = - mkHsraObjFldInfo (Just desc) fldName (fromInpValL inputVals) $ - G.toGT $ mkMutRespTy tn - where - inputVals = catMaybes [Just objectsArg , mkOnConflictInputVal tn isUpsertable] - desc = G.Description $ - "insert data into the table: " <>> tn - - defFldName = "insert_" <> qualObjectToName tn - fldName = fromMaybe defFldName mCustomName - - objsArgDesc = "the rows to be inserted" - objectsArg = - InpValInfo (Just objsArgDesc) "objects" Nothing $ G.toGT $ - G.toNT $ G.toLT $ G.toNT $ mkInsInpTy tn - -mkConstraintTy :: QualifiedTable -> [ConstraintName] -> EnumTyInfo -mkConstraintTy tn cons = enumTyInfo - where - enumTyInfo = mkHsraEnumTyInfo (Just desc) (mkConstraintInpTy tn) $ - EnumValuesSynthetic . mapFromL _eviVal $ map mkConstraintEnumVal cons - - desc = G.Description $ - "unique or primary key constraints on table " <>> tn - - mkConstraintEnumVal (ConstraintName n) = - EnumValInfo (Just "unique or primary key constraint") - (G.EnumValue $ G.Name n) False - -mkUpdColumnTy :: QualifiedTable -> [G.Name] -> EnumTyInfo -mkUpdColumnTy tn cols = enumTyInfo - where - enumTyInfo = mkHsraEnumTyInfo (Just desc) (mkUpdColumnInpTy tn) $ - EnumValuesSynthetic . mapFromL _eviVal $ map mkColumnEnumVal cols - - desc = G.Description $ - "update columns of table " <>> tn - -mkOnConflictTypes - :: QualifiedTable -> [ConstraintName] -> [G.Name] -> Bool -> [TypeInfo] -mkOnConflictTypes tn uniqueOrPrimaryCons cols = - bool [] tyInfos - where - tyInfos = [ TIEnum $ mkConstraintTy tn uniqueOrPrimaryCons - , TIEnum $ mkUpdColumnTy tn cols - , TIInpObj $ mkOnConflictInp tn - ] - -mkOnConflictInputVal :: QualifiedTable -> Bool -> Maybe InpValInfo -mkOnConflictInputVal qt = - bool Nothing (Just onConflictArg) - where - onConflictDesc = "on conflict condition" - onConflictArg = InpValInfo (Just onConflictDesc) "on_conflict" - Nothing $ G.toGT $ mkOnConflictInpTy qt - - -{- -insert_table_one( - object: table_insert_input! 
- on_conflict: table_on_conflict - ): table --} - -mkInsertOneMutationField :: Maybe G.Name -> QualifiedTable -> Bool -> ObjFldInfo -mkInsertOneMutationField mCustomName qt isUpsertable = - mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputVals) $ - G.toGT $ mkTableTy qt - where - description = G.Description $ "insert a single row into the table: " <>> qt - - fieldName = flip fromMaybe mCustomName $ "insert_" <> qualObjectToName qt <> "_one" - - inputVals = catMaybes [Just objectArg, mkOnConflictInputVal qt isUpsertable] - - objectArgDesc = "the row to be inserted" - objectArg = InpValInfo (Just objectArgDesc) "object" Nothing $ G.toGT $ - G.toNT $ mkInsInpTy qt diff --git a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Update.hs b/server/src-lib/Hasura/GraphQL/Schema/Mutation/Update.hs deleted file mode 100644 index c5be9192ae8..00000000000 --- a/server/src-lib/Hasura/GraphQL/Schema/Mutation/Update.hs +++ /dev/null @@ -1,301 +0,0 @@ -module Hasura.GraphQL.Schema.Mutation.Update - ( mkUpdSetInp - , mkUpdIncInp - , mkUpdJSONOpInp - , mkUpdSetTy - , mkUpdMutFld - , mkPKeyColumnsInpObj - , mkUpdateByPkMutationField - ) where - -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.Schema.BoolExp -import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Schema.Mutation.Common -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types -import Hasura.SQL.Types - --- table_set_input -mkUpdSetTy :: QualifiedTable -> G.NamedType -mkUpdSetTy tn = - G.NamedType $ qualObjectToName tn <> "_set_input" - -{- -input table_set_input { - col1: colty1 - . - . - coln: coltyn -} --} -mkUpdSetInp - :: QualifiedTable -> [PGColumnInfo] -> InpObjTyInfo -mkUpdSetInp tn cols = - mkHsraInpTyInfo (Just desc) (mkUpdSetTy tn) $ - fromInpValL $ map mkPGColInp cols - where - desc = G.Description $ - "input type for updating data in table " <>> tn - --- table_inc_input -mkUpdIncTy :: QualifiedTable -> G.NamedType -mkUpdIncTy tn = - G.NamedType $ qualObjectToName tn <> "_inc_input" - -{- -input table_inc_input { - integer-col1: int - . - . - integer-coln: int -} --} - -mkUpdIncInp - :: QualifiedTable -> Maybe [PGColumnInfo] -> Maybe InpObjTyInfo -mkUpdIncInp tn = maybe Nothing mkType - where - mkType cols = let numCols = onlyNumCols cols - incObjTy = - mkHsraInpTyInfo (Just desc) (mkUpdIncTy tn) $ - fromInpValL $ map mkPGColInp numCols - in bool (Just incObjTy) Nothing $ null numCols - desc = G.Description $ - "input type for incrementing integer column in table " <>> tn - --- table__input -mkJSONOpTy :: QualifiedTable -> G.Name -> G.NamedType -mkJSONOpTy tn op = - G.NamedType $ qualObjectToName tn <> op <> "_input" - --- json ops are _concat, _delete_key, _delete_elem, _delete_at_path -{- -input table_concat_input { - jsonb-col1: json - . - . - jsonb-coln: json -} --} - -{- -input table_delete_key_input { - jsonb-col1: string - . - . - jsonb-coln: string -} --} - -{- -input table_delete_elem_input { - jsonb-col1: int - . - . - jsonb-coln: int -} --} - -{- -input table_delete_at_path_input { - jsonb-col1: [string] - . - . 
- jsonb-coln: [string] -} --} - --- jsonb operators and descriptions -prependOp :: G.Name -prependOp = "_prepend" - -prependDesc :: G.Description -prependDesc = "prepend existing jsonb value of filtered columns with new jsonb value" - -appendOp :: G.Name -appendOp = "_append" - -appendDesc :: G.Description -appendDesc = "append existing jsonb value of filtered columns with new jsonb value" - -deleteKeyOp :: G.Name -deleteKeyOp = "_delete_key" - -deleteKeyDesc :: G.Description -deleteKeyDesc = "delete key/value pair or string element." - <> " key/value pairs are matched based on their key value" - -deleteElemOp :: G.Name -deleteElemOp = "_delete_elem" - -deleteElemDesc :: G.Description -deleteElemDesc = "delete the array element with specified index (negative integers count from the end)." - <> " throws an error if top level container is not an array" - -deleteAtPathOp :: G.Name -deleteAtPathOp = "_delete_at_path" - -deleteAtPathDesc :: G.Description -deleteAtPathDesc = "delete the field or element with specified path" - <> " (for JSON arrays, negative integers count from the end)" - -mkUpdJSONOpInp - :: QualifiedTable -> [PGColumnInfo] -> [InpObjTyInfo] -mkUpdJSONOpInp tn cols = bool inpObjs [] $ null jsonbCols - where - jsonbCols = onlyJSONBCols cols - jsonbColNames = map pgiName jsonbCols - - inpObjs = [ prependInpObj, appendInpObj, deleteKeyInpObj - , deleteElemInpObj, deleteAtPathInpObj - ] - - appendInpObj = - mkHsraInpTyInfo (Just appendDesc) (mkJSONOpTy tn appendOp) $ - fromInpValL $ map mkPGColInp jsonbCols - - prependInpObj = - mkHsraInpTyInfo (Just prependDesc) (mkJSONOpTy tn prependOp) $ - fromInpValL $ map mkPGColInp jsonbCols - - deleteKeyInpObj = - mkHsraInpTyInfo (Just deleteKeyDesc) (mkJSONOpTy tn deleteKeyOp) $ - fromInpValL $ map deleteKeyInpVal jsonbColNames - deleteKeyInpVal n = - InpValInfo Nothing n Nothing $ G.toGT $ G.NamedType "String" - - deleteElemInpObj = - mkHsraInpTyInfo (Just deleteElemDesc) (mkJSONOpTy tn deleteElemOp) $ - fromInpValL $ map deleteElemInpVal jsonbColNames - deleteElemInpVal n = - InpValInfo Nothing n Nothing $ G.toGT $ G.NamedType "Int" - - deleteAtPathInpObj = - mkHsraInpTyInfo (Just deleteAtPathDesc) (mkJSONOpTy tn deleteAtPathOp) $ - fromInpValL $ map deleteAtPathInpVal jsonbColNames - deleteAtPathInpVal n = - InpValInfo Nothing n Nothing $ G.toGT $ G.toLT $ G.NamedType "String" - -{- - -update_table( - where : table_bool_exp! 
- _set : table_set_input - _inc : table_inc_input - _concat: table_concat_input - _delete_key: table_delete_key_input - _delete_elem: table_delete_elem_input - _delete_path_at: table_delete_path_at_input -): table_mutation_response - --} - -mkIncInpVal :: QualifiedTable -> [PGColumnInfo] -> Maybe InpValInfo -mkIncInpVal tn cols = bool (Just incArg) Nothing $ null numCols - where - numCols = onlyNumCols cols - incArgDesc = "increments the integer columns with given value of the filtered values" - incArg = - InpValInfo (Just incArgDesc) "_inc" Nothing $ G.toGT $ mkUpdIncTy tn - -mkJSONOpInpVals :: QualifiedTable -> [PGColumnInfo] -> [InpValInfo] -mkJSONOpInpVals tn cols = bool jsonbOpArgs [] $ null jsonbCols - where - jsonbCols = onlyJSONBCols cols - jsonbOpArgs = [appendArg, prependArg, deleteKeyArg, deleteElemArg, deleteAtPathArg] - - appendArg = - InpValInfo (Just appendDesc) appendOp Nothing $ G.toGT $ mkJSONOpTy tn appendOp - - prependArg = - InpValInfo (Just prependDesc) prependOp Nothing $ G.toGT $ mkJSONOpTy tn prependOp - - deleteKeyArg = - InpValInfo (Just deleteKeyDesc) deleteKeyOp Nothing $ - G.toGT $ mkJSONOpTy tn deleteKeyOp - - deleteElemArg = - InpValInfo (Just deleteElemDesc) deleteElemOp Nothing $ - G.toGT $ mkJSONOpTy tn deleteElemOp - - deleteAtPathArg = - InpValInfo (Just deleteAtPathDesc) deleteAtPathOp Nothing $ - G.toGT $ mkJSONOpTy tn deleteAtPathOp - -mkUpdateOpInputs :: QualifiedTable -> [PGColumnInfo] -> [InpValInfo] -mkUpdateOpInputs qt cols = - catMaybes [Just setInp , mkIncInpVal qt cols] <> mkJSONOpInpVals qt cols - where - setArgDesc = "sets the columns of the filtered rows to the given values" - setInp = - InpValInfo (Just setArgDesc) "_set" Nothing $ G.toGT $ mkUpdSetTy qt - - -mkUpdMutFld :: Maybe G.Name -> QualifiedTable -> [PGColumnInfo] -> ObjFldInfo -mkUpdMutFld mCustomName tn cols = - mkHsraObjFldInfo (Just desc) fldName (fromInpValL inputValues) $ - G.toGT $ mkMutRespTy tn - where - inputValues = [filterArg] <> mkUpdateOpInputs tn cols - desc = G.Description $ "update data of the table: " <>> tn - - defFldName = "update_" <> qualObjectToName tn - fldName = fromMaybe defFldName mCustomName - - filterArgDesc = "filter the rows which have to be updated" - filterArg = - InpValInfo (Just filterArgDesc) "where" Nothing $ G.toGT $ - G.toNT $ mkBoolExpTy tn - -{- - -update_table_by_pk( - columns: table_pk_columns_input! - _set : table_set_input - _inc : table_inc_input - _concat: table_concat_input - _delete_key: table_delete_key_input - _delete_elem: table_delete_elem_input - _delete_path_at: table_delete_path_at_input -) --} - -{- -input table_pk_columns_input { - col1: col-ty1! - col2: col-ty2! 
-} - -where col1, col2 are primary key columns --} - -mkPKeyColumnsInpTy :: QualifiedTable -> G.NamedType -mkPKeyColumnsInpTy qt = - G.NamedType $ qualObjectToName qt <> "_pk_columns_input" - -mkPKeyColumnsInpObj :: QualifiedTable -> PrimaryKey PGColumnInfo -> InpObjTyInfo -mkPKeyColumnsInpObj qt primaryKey = - mkHsraInpTyInfo (Just description) (mkPKeyColumnsInpTy qt) $ - fromInpValL $ map mkColumnInputVal $ toList $ _pkColumns primaryKey - where - description = G.Description $ "primary key columns input for table: " <>> qt - -mkUpdateByPkMutationField - :: Maybe G.Name - -> QualifiedTable - -> [PGColumnInfo] - -> PrimaryKey PGColumnInfo - -> ObjFldInfo -mkUpdateByPkMutationField mCustomName qt cols _ = - mkHsraObjFldInfo (Just description) fieldName (fromInpValL inputArgs) $ - G.toGT $ mkTableTy qt - where - description = G.Description $ "update single row of the table: " <>> qt - fieldName = flip fromMaybe mCustomName $ "update_" <> qualObjectToName qt <> "_by_pk" - - inputArgs = pure primaryKeyColumnsInp <> mkUpdateOpInputs qt cols - primaryKeyColumnsInp = - InpValInfo Nothing "pk_columns" Nothing $ G.toGT $ G.toNT $ mkPKeyColumnsInpTy qt diff --git a/server/src-lib/Hasura/GraphQL/Schema/OrderBy.hs b/server/src-lib/Hasura/GraphQL/Schema/OrderBy.hs index a4cac50aa9a..08f29f057b3 100644 --- a/server/src-lib/Hasura/GraphQL/Schema/OrderBy.hs +++ b/server/src-lib/Hasura/GraphQL/Schema/OrderBy.hs @@ -1,178 +1,171 @@ module Hasura.GraphQL.Schema.OrderBy - ( mkOrdByTy - , ordByEnumTy - , mkOrdByInpObj - , mkTabAggOrdByInpObj - , mkTabAggregateOpOrdByInpObjs + ( orderByExp ) where -import qualified Data.HashMap.Strict as Map +import Hasura.Prelude + +import qualified Data.List.NonEmpty as NE import qualified Language.GraphQL.Draft.Syntax as G -import Hasura.GraphQL.Resolve.Types +import qualified Hasura.GraphQL.Parser as P +import qualified Hasura.RQL.DML.Select as RQL +import Hasura.RQL.Types as RQL +import Hasura.SQL.DML as SQL + +import Hasura.GraphQL.Parser (InputFieldsParser, Kind (..), Parser, + UnpreparedValue) +import Hasura.GraphQL.Parser.Class import Hasura.GraphQL.Schema.Common -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude -import Hasura.RQL.Types +import Hasura.GraphQL.Schema.Table import Hasura.SQL.Types -ordByTy :: G.NamedType -ordByTy = G.NamedType "order_by" -ordByEnumTy :: EnumTyInfo -ordByEnumTy = - mkHsraEnumTyInfo (Just desc) ordByTy $ - EnumValuesSynthetic . mapFromL _eviVal $ map mkEnumVal enumVals +-- | Corresponds to an object type for an order by. +-- +-- > input table_order_by { +-- > col1: order_by +-- > col2: order_by +-- > . . +-- > . . +-- > coln: order_by +-- > obj-rel: _order_by +-- > } +orderByExp + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (Parser 'Input n [RQL.AnnOrderByItemG UnpreparedValue]) +orderByExp table selectPermissions = memoizeOn 'orderByExp table $ do + name <- qualifiedObjectToName table <&> (<> $$(G.litName "_order_by")) + let description = G.Description $ + "Ordering options when selecting data from " <> table <<> "." + tableFields <- tableSelectFields table selectPermissions + fieldParsers <- sequenceA . catMaybes <$> traverse mkField tableFields + pure $ concat . 
catMaybes <$> P.object name (Just description) fieldParsers where - desc = G.Description "column ordering options" - mkEnumVal (n, d) = - EnumValInfo (Just d) (G.EnumValue n) False - enumVals = - [ ( "asc" - , "in the ascending order, nulls last" - ), - ( "asc_nulls_last" - , "in the ascending order, nulls last" - ), - ( "asc_nulls_first" - , "in the ascending order, nulls first" - ), - ( "desc" - , "in the descending order, nulls first" - ), - ( "desc_nulls_first" - , "in the descending order, nulls first" - ), - ( "desc_nulls_last" - , "in the descending order, nulls last" - ) - ] + mkField + :: FieldInfo + -> m (Maybe (InputFieldsParser n (Maybe [RQL.AnnOrderByItemG UnpreparedValue]))) + mkField fieldInfo = runMaybeT $ + case fieldInfo of + FIColumn columnInfo -> do + let fieldName = pgiName columnInfo + pure $ P.fieldOptional fieldName Nothing orderByOperator + <&> fmap (pure . mkOrderByItemG (RQL.AOCColumn columnInfo)) . join + FIRelationship relationshipInfo -> do + let remoteTable = riRTable relationshipInfo + fieldName <- MaybeT $ pure $ G.mkName $ relNameToTxt $ riName relationshipInfo + perms <- MaybeT $ tableSelectPermissions remoteTable + let newPerms = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ spiFilter perms + case riType relationshipInfo of + ObjRel -> do + otherTableParser <- lift $ orderByExp remoteTable perms + pure $ do + otherTableOrderBy <- join <$> P.fieldOptional fieldName Nothing (P.nullable otherTableParser) + pure $ fmap (map $ fmap $ RQL.AOCObjectRelation relationshipInfo newPerms) otherTableOrderBy + ArrRel -> do + let aggregateFieldName = fieldName <> $$(G.litName "_aggregate") + aggregationParser <- lift $ orderByAggregation remoteTable perms + pure $ do + aggregationOrderBy <- join <$> P.fieldOptional aggregateFieldName Nothing (P.nullable aggregationParser) + pure $ fmap (map $ fmap $ RQL.AOCArrayAggregation relationshipInfo newPerms) aggregationOrderBy + FIComputedField _ -> empty + FIRemoteRelationship _ -> empty -mkTabAggregateOpOrdByTy :: QualifiedTable -> G.Name -> G.NamedType -mkTabAggregateOpOrdByTy tn op = - G.NamedType $ qualObjectToName tn <> "_" <> op <> "_order_by" -{- -input table__order_by { - col1: order_by - . . - . . -} --} -mkTabAggregateOpOrdByInpObjs - :: QualifiedTable - -> ([PGColumnInfo], [G.Name]) - -> ([PGColumnInfo], [G.Name]) - -> [InpObjTyInfo] -mkTabAggregateOpOrdByInpObjs tn (numCols, numericAggregateOps) (compCols, compareAggregateOps) = - mapMaybe (mkInpObjTyM numCols) numericAggregateOps - <> mapMaybe (mkInpObjTyM compCols) compareAggregateOps +-- local definitions + +type OrderInfo = (SQL.OrderType, SQL.NullsOrder) + + +orderByAggregation + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (Parser 'Input n [OrderByItemG RQL.AnnAggregateOrderBy]) +orderByAggregation table selectPermissions = do + -- WIP NOTE + -- there is heavy duplication between this and Select.tableAggregationFields + -- it might be worth putting some of it in common, just to avoid issues when + -- we change one but not the other? + tableName <- qualifiedObjectToName table + allColumns <- tableSelectColumns table selectPermissions + let numColumns = onlyNumCols allColumns + compColumns = onlyComparableCols allColumns + numFields = catMaybes <$> traverse mkField numColumns + compFields = catMaybes <$> traverse mkField compColumns + aggFields = fmap (concat . catMaybes . 
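+-- Illustrative usage of the aggregate ordering built here, assuming a
+-- hypothetical table "author" with an array relationship "articles":
+{-
+query {
+  author(order_by: {articles_aggregate: {count: desc}}) {
+    name
+  }
+}
+-}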
concat) $ sequenceA $ catMaybes + [ -- count + Just $ P.fieldOptional $$(G.litName "count") Nothing orderByOperator + <&> pure . fmap (pure . mkOrderByItemG RQL.AAOCount) . join + , -- operators on numeric columns + if null numColumns then Nothing else Just $ + for numericAggOperators \operator -> + parseOperator operator tableName numFields + , -- operators on comparable columns + if null compColumns then Nothing else Just $ + for comparisonAggOperators \operator -> + parseOperator operator tableName compFields + ] + let objectName = tableName <> $$(G.litName "_aggregate_order_by") + description = G.Description $ "order by aggregate values of table " <>> table + pure $ P.object objectName (Just description) aggFields where + mkField :: PGColumnInfo -> InputFieldsParser n (Maybe (PGColumnInfo, OrderInfo)) + mkField columnInfo = + P.fieldOptional (pgiName columnInfo) (pgiDescription columnInfo) orderByOperator + <&> fmap (columnInfo,) . join - mkDesc (G.Name op) = - G.Description $ "order by " <> op <> "() on columns of table " <>> tn + parseOperator + :: G.Name + -> G.Name + -> InputFieldsParser n [(PGColumnInfo, OrderInfo)] + -> InputFieldsParser n (Maybe [OrderByItemG RQL.AnnAggregateOrderBy]) + parseOperator operator tableName columns = + let opText = G.unName operator + objectName = tableName <> $$(G.litName "_") <> operator <> $$(G.litName "_order_by") + objectDesc = Just $ G.Description $ "order by " <> opText <> "() on columns of table " <>> table + in P.fieldOptional operator Nothing (P.object objectName objectDesc columns) + `mapField` map (\(col, info) -> mkOrderByItemG (RQL.AAOOp opText col) info) - mkInpObjTyM cols op = bool (Just $ mkInpObjTy cols op) Nothing $ null cols - mkInpObjTy cols op = - mkHsraInpTyInfo (Just $ mkDesc op) (mkTabAggregateOpOrdByTy tn op) $ - fromInpValL $ map mkColInpVal cols - mkColInpVal ci = InpValInfo Nothing (pgiName ci) Nothing $ G.toGT - ordByTy -mkTabAggOrdByTy :: QualifiedTable -> G.NamedType -mkTabAggOrdByTy tn = - G.NamedType $ qualObjectToName tn <> "_aggregate_order_by" - -{- -input table_aggregate_order_by { -count: order_by - : table__order_by -} --} - -mkTabAggOrdByInpObj - :: QualifiedTable - -> ([PGColumnInfo], [G.Name]) - -> ([PGColumnInfo], [G.Name]) - -> InpObjTyInfo -mkTabAggOrdByInpObj tn (numCols, numericAggregateOps) (compCols, compareAggregateOps) = - mkHsraInpTyInfo (Just desc) (mkTabAggOrdByTy tn) $ fromInpValL $ - numOpOrdBys <> compOpOrdBys <> [countInpVal] +orderByOperator :: MonadParse m => Parser 'Both m (Maybe OrderInfo) +orderByOperator = + P.nullable $ P.enum $$(G.litName "order_by") (Just "column ordering options") $ NE.fromList + [ ( define $$(G.litName "asc") "in ascending order, nulls last" + , (SQL.OTAsc, SQL.NLast) + ) + , ( define $$(G.litName "asc_nulls_first") "in ascending order, nulls first" + , (SQL.OTAsc, SQL.NFirst) + ) + , ( define $$(G.litName "asc_nulls_last") "in ascending order, nulls last" + , (SQL.OTAsc, SQL.NLast) + ) + , ( define $$(G.litName "desc") "in descending order, nulls first" + , (SQL.OTDesc, SQL.NFirst) + ) + , ( define $$(G.litName "desc_nulls_first") "in descending order, nulls first" + , (SQL.OTDesc, SQL.NFirst) + ) + , ( define $$(G.litName "desc_nulls_last") "in descending order, nulls last" + , (SQL.OTDesc, SQL.NLast) + ) + ] where - desc = G.Description $ - "order by aggregate values of table " <>> tn + define name desc = P.mkDefinition name (Just desc) P.EnumValueInfo - numOpOrdBys = bool (map mkInpValInfo numericAggregateOps) [] $ null numCols - compOpOrdBys = bool (map 
mkInpValInfo compareAggregateOps) [] $ null compCols - mkInpValInfo op = InpValInfo Nothing op Nothing $ G.toGT $ - mkTabAggregateOpOrdByTy tn op - countInpVal = InpValInfo Nothing "count" Nothing $ G.toGT ordByTy -mkOrdByTy :: QualifiedTable -> G.NamedType -mkOrdByTy tn = - G.NamedType $ qualObjectToName tn <> "_order_by" +-- local helpers -{- -input table_order_by { - col1: order_by - col2: order_by - . . - . . - coln: order_by - obj-rel: _order_by -} --} +mkOrderByItemG :: a -> OrderInfo -> OrderByItemG a +mkOrderByItemG column (orderType, nullsOrder) = + OrderByItemG { obiType = Just $ RQL.OrderType orderType + , obiColumn = column + , obiNulls = Just $ RQL.NullsOrder nullsOrder + } -mkOrdByInpObj - :: QualifiedTable -> [SelField] -> (InpObjTyInfo, OrdByCtx) -mkOrdByInpObj tn selFlds = (inpObjTy, ordByCtx) - where - inpObjTy = - mkHsraInpTyInfo (Just desc) namedTy $ fromInpValL $ - map mkColOrdBy pgColumnFields <> map mkObjRelOrdBy objRels - <> mapMaybe mkArrayAggregateSelectOrdBy arrRels - - namedTy = mkOrdByTy tn - desc = G.Description $ - "ordering options when selecting data from " <>> tn - - pgColumnFields = getPGColumnFields selFlds - relFltr ty = flip filter (getRelationshipFields selFlds) $ - \rf -> riType (_rfiInfo rf) == ty - objRels = relFltr ObjRel - arrRels = relFltr ArrRel - - mkColOrdBy columnInfo = - InpValInfo Nothing (pgiName columnInfo) Nothing $ G.toGT ordByTy - mkObjRelOrdBy relationshipField = - let ri = _rfiInfo relationshipField - in InpValInfo Nothing (mkRelName $ riName ri) Nothing $ - G.toGT $ mkOrdByTy $ riRTable ri - - mkArrayAggregateSelectOrdBy relationshipField = - let ri = _rfiInfo relationshipField - isAggAllowed = _rfiAllowAgg relationshipField - ivi = InpValInfo Nothing (mkAggRelName $ riName ri) Nothing $ - G.toGT $ mkTabAggOrdByTy $ riRTable ri - in bool Nothing (Just ivi) isAggAllowed - - ordByCtx = Map.singleton namedTy $ Map.fromList $ - colOrdBys <> relOrdBys <> arrRelOrdBys - colOrdBys = map (pgiName &&& OBIPGCol) pgColumnFields - relOrdBys = flip map objRels $ - \relationshipField -> - let ri = _rfiInfo relationshipField - fltr = _rfiPermFilter relationshipField - in ( mkRelName $ riName ri - , OBIRel ri fltr - ) - - arrRelOrdBys = flip mapMaybe arrRels $ - \(RelationshipFieldInfo ri isAggAllowed colGNameMap fltr _ _ _) -> - let obItem = ( mkAggRelName $ riName ri - , OBIAgg ri colGNameMap fltr - ) - in bool Nothing (Just obItem) isAggAllowed +aliasToName :: G.Name -> FieldName +aliasToName = FieldName . G.unName diff --git a/server/src-lib/Hasura/GraphQL/Schema/Remote.hs b/server/src-lib/Hasura/GraphQL/Schema/Remote.hs new file mode 100644 index 00000000000..6dbe227d0dc --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Schema/Remote.hs @@ -0,0 +1,420 @@ +module Hasura.GraphQL.Schema.Remote + ( buildRemoteParser + , remoteFieldFullSchema + , inputValueDefinitionParser + , lookupObject + , lookupType + , lookupScalar + ) where + +import Hasura.Prelude +import Hasura.RQL.Types +import Hasura.SQL.Types + +import Language.GraphQL.Draft.Syntax as G +import qualified Data.List.NonEmpty as NE +import Data.Type.Equality +import Data.Foldable (sequenceA_) + +import Hasura.GraphQL.Parser as P +import qualified Hasura.GraphQL.Parser.Internal.Parser as P + +buildRemoteParser + :: forall m n + . 
(MonadSchema n m, MonadError QErr m)
+  => IntrospectionResult
+  -> RemoteSchemaInfo
+  -> m ( [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
+       , Maybe [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
+       , Maybe [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)])
+buildRemoteParser (IntrospectionResult sdoc query_root mutation_root subscription_root) info = do
+  queryT <- makeParsers query_root
+  mutationT <- traverse makeParsers mutation_root
+  subscriptionT <- traverse makeParsers subscription_root
+  return (queryT, mutationT, subscriptionT)
+  where
+    makeFieldParser :: G.FieldDefinition -> m (P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable))
+    makeFieldParser fieldDef = do
+      fp <- remoteField' sdoc fieldDef
+      return $ do
+        raw <- P.unsafeRawField (P.fDefinition fp)
+        return (info, raw)
+    makeParsers :: G.Name -> m [P.FieldParser n (RemoteSchemaInfo, Field NoFragments Variable)]
+    makeParsers rootName =
+      case lookupType sdoc rootName of
+        Just (G.TypeDefinitionObject o) ->
+          traverse makeFieldParser $ _otdFieldsDefinition o
+        _ -> throw400 Unexpected $ rootName <<> " has to be an object type"
+
+-- | 'remoteFieldFullSchema' takes the 'SchemaIntrospection' and a 'G.Name' and will
+-- return a 'SelectionSet' parser if the 'G.Name' is found and is a 'TypeDefinitionObject',
+-- otherwise, an error will be thrown.
+remoteFieldFullSchema
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.Name
+  -> m (Parser 'Output n (G.SelectionSet NoFragments Variable))
+remoteFieldFullSchema sdoc name =
+  P.memoizeOn 'remoteFieldFullSchema name do
+    fieldObjectType <-
+      case lookupType sdoc name of
+        Just (G.TypeDefinitionObject o) -> pure o
+        _ -> throw400 RemoteSchemaError $ "object with name " <> G.unName name <> " not found"
+    fieldParser <- remoteSchemaObject sdoc fieldObjectType
+    pure $ P.unsafeRawParser (P.pType fieldParser)
+
+remoteField'
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.FieldDefinition
+  -> m (FieldParser n ())
+remoteField' schemaDoc (G.FieldDefinition description name argsDefinition gType _) =
+  let
+    addNullableList :: FieldParser n () -> FieldParser n ()
+    addNullableList (P.FieldParser (Definition name' un desc (FieldInfo args typ)) parser)
+      = P.FieldParser (Definition name' un desc (FieldInfo args (Nullable (TList typ)))) parser
+
+    addNonNullableList :: FieldParser n () -> FieldParser n ()
+    addNonNullableList (P.FieldParser (Definition name' un desc (FieldInfo args typ)) parser)
+      = P.FieldParser (Definition name' un desc (FieldInfo args (NonNullable (TList typ)))) parser
+
+    -- TODO add directives, deprecation
+    convertType :: G.GType -> m (FieldParser n ())
+    convertType gType' = do
+      case gType' of
+        G.TypeNamed (Nullability True) fieldTypeName ->
+          P.nullableField <$> remoteFieldFromName schemaDoc name description fieldTypeName argsDefinition
+        G.TypeList (Nullability True) gType'' ->
+          addNullableList <$> convertType gType''
+        -- NB: this branch currently behaves like the nullable case above:
+        -- named types are exposed as nullable fields regardless of the
+        -- declared nullability
+        G.TypeNamed (Nullability False) fieldTypeName ->
+          P.nullableField <$> remoteFieldFromName schemaDoc name description fieldTypeName argsDefinition
+        G.TypeList (Nullability False) gType'' ->
+          addNonNullableList <$> convertType gType''
+  in convertType gType
+
+-- | 'remoteSchemaObject' returns an output parser for a given 'ObjectTypeDefinition'.
+remoteSchemaObject
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.ObjectTypeDefinition
+  -> m (Parser 'Output n ())
+remoteSchemaObject schemaDoc defn@(G.ObjectTypeDefinition description name interfaces _directives subFields) =
+  P.memoizeOn 'remoteSchemaObject defn do
+    subFieldParsers <- traverse (remoteField' schemaDoc) subFields
+    interfaceDefs <- traverse getInterface interfaces
+    implements <- traverse (remoteSchemaInterface schemaDoc) interfaceDefs
+    -- TODO: also check sub-interfaces, when these are supported in a future graphql spec
+    traverse_ validateImplementsFields interfaceDefs
+    pure $ void $ P.selectionSetObject name description subFieldParsers implements
+  where
+    getInterface :: G.Name -> m (G.InterfaceTypeDefinition [G.Name])
+    getInterface interfaceName =
+      onNothing (lookupInterface schemaDoc interfaceName) $
+        throw400 RemoteSchemaError $ "Could not find interface " <> squote interfaceName
+        <> " implemented by Object type " <> squote name
+    validateImplementsFields :: G.InterfaceTypeDefinition [G.Name] -> m ()
+    validateImplementsFields interface =
+      traverse_ (validateImplementsField (_itdName interface)) (G._itdFieldsDefinition interface)
+    validateImplementsField :: G.Name -> G.FieldDefinition -> m ()
+    validateImplementsField interfaceName interfaceField =
+      case lookup (G._fldName interfaceField) (zip (fmap G._fldName subFields) subFields) of
+        Nothing -> throw400 RemoteSchemaError $
+          "Interface field " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
+          <> " expected, but " <> squote name <> " does not provide it"
+        Just f -> do
+          unless (validateSubType (G._fldType f) (G._fldType interfaceField)) $
+            throw400 RemoteSchemaError $
+            "The type of Object field " <> squote name <> "." <> dquote (G._fldName f)
+            <> " (" <> G.showGT (G._fldType f)
+            <> ") is not the same type as, or a subtype of, Interface field "
+            <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
+            <> " (" <> G.showGT (G._fldType interfaceField) <> ")"
+          traverse_ (validateArgument (G._fldArgumentsDefinition f)) (G._fldArgumentsDefinition interfaceField)
+          traverse_ (validateNoExtraNonNull (G._fldArgumentsDefinition interfaceField)) (G._fldArgumentsDefinition f)
+          where
+            validateArgument :: G.ArgumentsDefinition -> G.InputValueDefinition -> m ()
+            validateArgument objectFieldArgs ifaceArgument =
+              case lookup (G._ivdName ifaceArgument) (zip (fmap G._ivdName objectFieldArgs) objectFieldArgs) of
+                Nothing ->
+                  throw400 RemoteSchemaError $
+                  "Interface field argument " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
+                  <> "(" <> dquote (G._ivdName ifaceArgument) <> ":) required, but Object field " <> squote name <> "." <> dquote (G._fldName f)
+                  <> " does not provide it"
+                Just a ->
+                  unless (G._ivdType a == G._ivdType ifaceArgument) $
+                    throw400 RemoteSchemaError $
+                    "Interface field argument " <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
+                    <> "(" <> dquote (G._ivdName ifaceArgument) <> ":) expects type "
+                    <> G.showGT (G._ivdType ifaceArgument)
+                    <> ", but " <> squote name <> "." <> dquote (G._fldName f) <> "("
+                    <> dquote (G._ivdName ifaceArgument) <> ":) has type "
+                    <> G.showGT (G._ivdType a)
+            validateNoExtraNonNull :: G.ArgumentsDefinition -> G.InputValueDefinition -> m ()
+            validateNoExtraNonNull ifaceArguments objectFieldArg =
+              case lookup (G._ivdName objectFieldArg) (zip (fmap G._ivdName ifaceArguments) ifaceArguments) of
+                Just _ -> pure ()
+                Nothing ->
+                  unless (G.isNullable (G._ivdType objectFieldArg)) $
+                    throw400 RemoteSchemaError $
+                    "Object field argument " <> squote name <> "." <> dquote (G._fldName f) <> "("
+                    <> dquote (G._ivdName objectFieldArg) <> ":) is of required type "
+                    <> G.showGT (G._ivdType objectFieldArg) <> ", but is not provided by Interface field "
+                    <> squote interfaceName <> "." <> dquote (G._fldName interfaceField)
+    validateSubType :: G.GType -> G.GType -> Bool
+    -- TODO this ignores nullability which is probably wrong, even though the GraphQL spec is ambiguous
+    validateSubType (G.TypeList _ x) (G.TypeList _ y) = validateSubType x y
+    -- It is OK to "upgrade" the strictness
+    validateSubType (G.TypeNamed (Nullability False) x) (G.TypeNamed (Nullability True) y) =
+      validateSubType (G.TypeNamed (Nullability True) x) (G.TypeNamed (Nullability True) y)
+    validateSubType (G.TypeNamed nx x) (G.TypeNamed ny y) =
+      case (lookupType schemaDoc x , lookupType schemaDoc y) of
+        (Just x' , Just y') -> nx == ny && validateSubTypeDefinition x' y'
+        _ -> False
+    validateSubType _ _ = False
+    validateSubTypeDefinition x' y' | x' == y' = True
+    validateSubTypeDefinition (TypeDefinitionObject otd) (TypeDefinitionInterface itd)
+      = G._otdName otd `elem` G._itdPossibleTypes itd
+    validateSubTypeDefinition (TypeDefinitionObject _otd) (TypeDefinitionUnion _utd)
+      = True -- TODO write appropriate check (may require saving 'possibleTypes' in Syntax.hs)
+    validateSubTypeDefinition _ _ = False
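+
+-- A small, illustrative example of the validation above (hypothetical SDL,
+-- not taken from any real remote schema): given
+--
+-- > interface Node { id: ID! }
+-- > type Article implements Node { id: ID! title: String }
+--
+-- @Article@ passes 'validateImplementsFields', since it provides @id@ at
+-- exactly the interface's type @ID!@; declaring @id: ID@ instead would be
+-- rejected, because 'validateSubType' only permits tightening nullability,
+-- never loosening it.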
+
+-- | 'remoteSchemaInterface' returns an output parser for a given 'InterfaceTypeDefinition'.
+remoteSchemaInterface
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.InterfaceTypeDefinition [G.Name]
+  -> m (Parser 'Output n ())
+remoteSchemaInterface schemaDoc defn@(G.InterfaceTypeDefinition description name _directives fields possibleTypes) =
+  P.memoizeOn 'remoteSchemaObject defn do
+    subFieldParsers <- traverse (remoteField' schemaDoc) fields
+    objs :: [Parser 'Output n ()] <-
+      traverse (getObject >=> remoteSchemaObject schemaDoc) possibleTypes
+    -- In the Draft GraphQL spec (> June 2018), interfaces can themselves
+    -- implement superinterfaces. In the future, we may need to support this
+    -- here.
+    when (null subFieldParsers) $
+      throw400 RemoteSchemaError $ "List of fields cannot be empty for interface " <> squote name
+    -- TODO: another way to obtain 'possibleTypes' is to look up all the object
+    -- types in the schema document that claim to implement this interface. We
+    -- should have a check that expresses that that collection of objects is equal
+    -- to 'possibleTypes'.
+    pure $ void $ P.selectionSetInterface name description subFieldParsers objs
+  where
+    getObject :: G.Name -> m G.ObjectTypeDefinition
+    getObject objectName =
+      onNothing (lookupObject schemaDoc objectName) $
+        case lookupInterface schemaDoc objectName of
+          Nothing -> throw400 RemoteSchemaError $ "Could not find type " <> squote objectName
+                     <> ", which is defined as a member type of Interface " <> squote name
+          Just _  -> throw400 RemoteSchemaError $ "Interface type " <> squote name <>
+                     " can only include object types. It cannot include " <> squote objectName
+
+-- | 'remoteSchemaUnion' returns an output parser for a given 'UnionTypeDefinition'.
+remoteSchemaUnion
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.UnionTypeDefinition
+  -> m (Parser 'Output n ())
+remoteSchemaUnion schemaDoc defn@(G.UnionTypeDefinition description name _directives objectNames) =
+  P.memoizeOn 'remoteSchemaObject defn do
+    objDefs <- traverse getObject objectNames
+    objs :: [Parser 'Output n ()] <- traverse (remoteSchemaObject schemaDoc) objDefs
+    when (null objs) $
+      throw400 RemoteSchemaError $ "List of member types cannot be empty for union type " <> squote name
+    pure $ void $ P.selectionSetUnion name description objs
+  where
+    getObject :: G.Name -> m G.ObjectTypeDefinition
+    getObject objectName =
+      onNothing (lookupObject schemaDoc objectName) $
+        case lookupInterface schemaDoc objectName of
+          Nothing -> throw400 RemoteSchemaError $ "Could not find type " <> squote objectName
+                     <> ", which is defined as a member type of Union " <> squote name
+          Just _  -> throw400 RemoteSchemaError $ "Union type " <> squote name <>
+                     " can only include object types. It cannot include " <> squote objectName
+
+-- | remoteSchemaInputObject returns an input parser for a given 'G.InputObjectTypeDefinition'
+remoteSchemaInputObject
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.InputObjectTypeDefinition
+  -> m (Parser 'Input n ())
+remoteSchemaInputObject schemaDoc defn@(G.InputObjectTypeDefinition desc name _ valueDefns) =
+  P.memoizeOn 'remoteSchemaInputObject defn do
+    argsParser <- argumentsParser valueDefns schemaDoc
+    pure $ P.object name desc argsParser
+
+lookupType :: SchemaIntrospection -> G.Name -> Maybe (G.TypeDefinition [G.Name])
+lookupType (SchemaIntrospection types) name = find (\tp -> getNamedTyp tp == name) types
+  where
+    getNamedTyp :: G.TypeDefinition possibleTypes -> G.Name
+    getNamedTyp ty = case ty of
+      G.TypeDefinitionScalar t -> G._stdName t
+      G.TypeDefinitionObject t -> G._otdName t
+      G.TypeDefinitionInterface t -> G._itdName t
+      G.TypeDefinitionUnion t -> G._utdName t
+      G.TypeDefinitionEnum t -> G._etdName t
+      G.TypeDefinitionInputObject t -> G._iotdName t
+
+lookupObject :: SchemaIntrospection -> G.Name -> Maybe G.ObjectTypeDefinition
+lookupObject (SchemaIntrospection types) name = go types
+  where
+    go :: [TypeDefinition possibleTypes] -> Maybe G.ObjectTypeDefinition
+    go ((G.TypeDefinitionObject t):tps)
+      | G._otdName t == name = Just t
+      | otherwise = go tps
+    go (_:tps) = go tps
+    go [] = Nothing
+
+lookupInterface :: SchemaIntrospection -> G.Name -> Maybe (G.InterfaceTypeDefinition [G.Name])
+lookupInterface (SchemaIntrospection types) name = go types
+  where
+    go :: [TypeDefinition possibleTypes] -> Maybe (G.InterfaceTypeDefinition possibleTypes)
+    go ((G.TypeDefinitionInterface t):tps)
+      | G._itdName t == name = Just t
+      | otherwise = go tps
+    go (_:tps) = go tps
+    go [] = Nothing
+
+lookupScalar :: SchemaIntrospection -> G.Name -> Maybe G.ScalarTypeDefinition
+lookupScalar (SchemaIntrospection types) name = go types
+  where
+    go :: [TypeDefinition possibleTypes] -> Maybe G.ScalarTypeDefinition
+    go ((G.TypeDefinitionScalar t):tps)
+      | G._stdName t == name = Just t
+      | otherwise = go tps
+    go (_:tps) = go tps
+    go [] = Nothing
+
+-- | 'remoteFieldFromName' accepts a GraphQL name and searches for its definition
+-- in the 'SchemaIntrospection'.
+remoteFieldFromName
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.Name
+  -> Maybe G.Description
+  -> G.Name
+  -> G.ArgumentsDefinition
+  -> m (FieldParser n ())
+remoteFieldFromName sdoc fieldName description fieldTypeName argsDefns =
+  case lookupType sdoc fieldTypeName of
+    Nothing -> throw400 RemoteSchemaError $ "Could not find type with name " <> G.unName fieldTypeName
+    Just typeDef -> remoteField sdoc fieldName description argsDefns typeDef
+
+-- | 'inputValueDefinitionParser' accepts a 'G.InputValueDefinition' and will return an
+-- 'InputFieldsParser' for it. If a non-'Input' GraphQL type is found in the type of
+-- the 'InputValueDefinition', an error will be thrown.
+inputValueDefinitionParser
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => G.SchemaIntrospection
+  -> G.InputValueDefinition
+  -> m (InputFieldsParser n (Maybe (InputValue Variable)))
+inputValueDefinitionParser schemaDoc (G.InputValueDefinition desc name fieldType maybeDefaultVal) =
+  let fieldConstructor :: forall k. 'Input <: k => Parser k n () -> InputFieldsParser n (Maybe (InputValue Variable))
+      fieldConstructor parser =
+        let wrappedParser :: Parser k n (InputValue Variable)
+            wrappedParser =
+              P.Parser
+                { P.pType   = P.pType parser
+                , P.pParser = \value -> P.pParser parser value $> castWith (P.inputParserInput @k) value
+                }
+        in case maybeDefaultVal of
+          Nothing ->
+            if G.isNullable fieldType
+            then fieldOptional name desc wrappedParser
+            else Just <$> field name desc wrappedParser
+          Just defaultVal -> Just <$> fieldWithDefault name desc defaultVal wrappedParser
+      doNullability :: forall k . 'Input <: k => G.Nullability -> Parser k n () -> Parser k n ()
+      doNullability (G.Nullability True)  = void . P.nullable
+      doNullability (G.Nullability False) = id
+      buildField
+        :: G.GType
+        -> (forall k. 'Input <: k => Parser k n () -> InputFieldsParser n (Maybe (InputValue Variable)))
+        -> m (InputFieldsParser n (Maybe (InputValue Variable)))
+      buildField fieldType' fieldConstructor' = case fieldType' of
+        G.TypeNamed nullability typeName ->
+          case lookupType schemaDoc typeName of
+            Nothing -> throw400 RemoteSchemaError $ "Could not find type with name " <> G.unName typeName
+            Just typeDef ->
+              case typeDef of
+                G.TypeDefinitionScalar (G.ScalarTypeDefinition scalarDesc name' _) ->
+                  pure $ fieldConstructor' $ doNullability nullability $ remoteFieldScalarParser name' scalarDesc
+                G.TypeDefinitionEnum defn ->
+                  pure $ fieldConstructor' $ doNullability nullability $ remoteFieldEnumParser defn
+                G.TypeDefinitionObject _ -> throw400 RemoteSchemaError "expected input type, but got output type" -- couldn't find the equivalent error in Validate/Types.hs, so using a new error message
+                G.TypeDefinitionInputObject defn ->
+                  fieldConstructor' . doNullability nullability <$> remoteSchemaInputObject schemaDoc defn
+                G.TypeDefinitionUnion _ -> throw400 RemoteSchemaError "expected input type, but got output type"
+                G.TypeDefinitionInterface _ -> throw400 RemoteSchemaError "expected input type, but got output type"
+        G.TypeList nullability subType -> buildField subType (fieldConstructor' . doNullability nullability . void . P.list)
+  in buildField fieldType fieldConstructor
+
+argumentsParser
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => G.ArgumentsDefinition
+  -> G.SchemaIntrospection
+  -> m (InputFieldsParser n ())
+argumentsParser args schemaDoc =
+  sequenceA_ <$> traverse (inputValueDefinitionParser schemaDoc) args
+
+-- | 'remoteField' accepts a 'G.TypeDefinition' and will return a 'FieldParser' for it.
+-- Note that the 'G.TypeDefinition' should be of the GraphQL 'Output' kind; when a
+-- GraphQL 'Input' kind is provided, an error will be thrown.
+remoteField
+  :: forall n m
+   . (MonadSchema n m, MonadError QErr m)
+  => SchemaIntrospection
+  -> G.Name
+  -> Maybe G.Description
+  -> G.ArgumentsDefinition
+  -> G.TypeDefinition [G.Name]
+  -> m (FieldParser n ()) -- TODO return something useful, maybe?
+remoteField sdoc fieldName description argsDefn typeDefn = do
+  -- TODO add directives
+  argsParser <- argumentsParser argsDefn sdoc
+  case typeDefn of
+    G.TypeDefinitionObject objTypeDefn -> do
+      remoteSchemaObj <- remoteSchemaObject sdoc objTypeDefn
+      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
+    G.TypeDefinitionScalar (G.ScalarTypeDefinition desc name' _) ->
+      pure $ P.selection fieldName description argsParser $ remoteFieldScalarParser name' desc
+    G.TypeDefinitionEnum enumTypeDefn ->
+      pure $ P.selection fieldName description argsParser $ remoteFieldEnumParser enumTypeDefn
+    G.TypeDefinitionInterface ifaceTypeDefn -> do
+      remoteSchemaObj <- remoteSchemaInterface sdoc ifaceTypeDefn
+      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
+    G.TypeDefinitionUnion unionTypeDefn -> do
+      remoteSchemaObj <- remoteSchemaUnion sdoc unionTypeDefn
+      pure $ void $ P.subselection fieldName description argsParser remoteSchemaObj
+    _ -> throw400 RemoteSchemaError "expected output type, but got input type"
+
+remoteFieldScalarParser
+  :: MonadParse n
+  => G.Name
+  -> Maybe G.Description
+  -> Parser 'Both n ()
+remoteFieldScalarParser name description =
+  case G.unName name of
+    "Boolean" -> P.boolean $> ()
+    "Int"     -> P.int $> ()
+    "Float"   -> P.float $> ()
+    "String"  -> P.string $> ()
+    "ID"      -> P.identifier $> ()
+    _         -> P.unsafeRawScalar name description $> ()
+
+remoteFieldEnumParser
+  :: MonadParse n
+  => G.EnumTypeDefinition
+  -> Parser 'Both n ()
+remoteFieldEnumParser (G.EnumTypeDefinition desc name _ valueDefns) =
+  let enumValDefns = valueDefns <&> \(G.EnumValueDefinition enumDesc enumName _) ->
+        (mkDefinition (G.unEnumValue enumName) enumDesc P.EnumValueInfo,())
+  in P.enum name desc $ NE.fromList enumValDefns
diff --git a/server/src-lib/Hasura/GraphQL/Schema/Select.hs b/server/src-lib/Hasura/GraphQL/Schema/Select.hs
index cfc0c40f17a..6727b37a035 100644
--- a/server/src-lib/Hasura/GraphQL/Schema/Select.hs
+++ b/server/src-lib/Hasura/GraphQL/Schema/Select.hs
@@ -1,498 +1,1361 @@
+{-# LANGUAGE RecordWildCards #-}
+{-# LANGUAGE ViewPatterns #-}
+-- | Generate table selection schema both for ordinary Hasura-type and
+-- relay-type queries. All schema with "relay" or "connection" in the name is
+-- used exclusively by relay.
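+--
+-- As a rough, illustrative sketch (the @article@ names below are hypothetical,
+-- not taken from any real metadata), a tracked table @article@ yields ordinary
+-- root fields along the lines of
+--
+-- > article(limit: 10): [article!]!
+-- > article_by_pk(id: 42): article
+-- > article_aggregate(limit: 10): article_aggregate!
+--
+-- while the relay schema exposes a connection-style field instead:
+--
+-- > article_connection(first: 10): articleConnection!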
module Hasura.GraphQL.Schema.Select - ( mkTableObj - , mkRelayTableObj - , mkTableAggObj - , mkSelColumnTy - , mkTableAggregateFieldsObj - , mkTableColAggregateFieldsObj - , mkTableEdgeObj - , pageInfoObj - , mkTableConnectionObj - , mkTableConnectionTy - - , mkSelFld - , mkAggSelFld - , mkSelFldPKey - , mkSelFldConnection - - , mkRemoteRelationshipName - , mkSelArgs - , mkConnectionArgs + ( selectTable + , selectTableByPk + , selectTableAggregate + , selectTableConnection + , selectFunction + , selectFunctionAggregate + , selectFunctionConnection + , tableSelectionSet + , tableSelectionList + , nodeField ) where -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Language.GraphQL.Draft.Syntax as G -import Hasura.GraphQL.Resolve.Types +import Hasura.Prelude + +import Control.Lens hiding (index) +import Data.Has +import Data.Int (Int32) +import Data.Parser.JSONPath +import Data.Traversable (mapAccumL) + +import qualified Data.Aeson as J +import qualified Data.Aeson.Extended as J +import qualified Data.Aeson.Internal as J +import qualified Data.ByteString.Lazy as BL +import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set +import qualified Data.List.NonEmpty as NE +import qualified Data.Sequence as Seq +import qualified Data.Sequence.NonEmpty as NESeq +import qualified Data.Text as T +import qualified Language.GraphQL.Draft.Syntax as G + +import qualified Hasura.GraphQL.Execute.Types as ET +import qualified Hasura.GraphQL.Parser as P +import qualified Hasura.GraphQL.Parser.Internal.Parser as P +import qualified Hasura.RQL.DML.Select as RQL +import qualified Hasura.RQL.Types.BoolExp as RQL +import qualified Hasura.SQL.DML as SQL + +import Hasura.GraphQL.Parser (FieldParser, InputFieldsParser, Kind (..), + Parser, UnpreparedValue (..), mkParameter) +import Hasura.GraphQL.Parser.Class +import Hasura.GraphQL.Parser.Schema (toGraphQLType) import Hasura.GraphQL.Schema.BoolExp import Hasura.GraphQL.Schema.Common import Hasura.GraphQL.Schema.OrderBy -import Hasura.GraphQL.Validate.Types -import Hasura.Prelude +import Hasura.GraphQL.Schema.Remote +import Hasura.GraphQL.Schema.Table import Hasura.RQL.Types +import Hasura.Server.Utils (executeJSONPath) import Hasura.SQL.Types +import Hasura.SQL.Value -mkSelColumnTy :: QualifiedTable -> [G.Name] -> EnumTyInfo -mkSelColumnTy tn cols = enumTyInfo +type SelectExp = RQL.AnnSimpleSelG UnpreparedValue +type AggSelectExp = RQL.AnnAggregateSelectG UnpreparedValue +type ConnectionSelectExp = RQL.ConnectionSelect UnpreparedValue +type SelectArgs = RQL.SelectArgsG UnpreparedValue +type TablePerms = RQL.TablePermG UnpreparedValue +type AnnotatedFields = RQL.AnnFieldsG UnpreparedValue +type AnnotatedField = RQL.AnnFieldG UnpreparedValue + + + +-- 1. top level selection functions +-- write a blurb? + +-- | Simple table selection. +-- +-- The field for the table accepts table selection arguments, and +-- expects a selection of fields +-- +-- > table_name(limit: 10) { +-- > col1: col1_type +-- > col2: col2_type +-- > }: [table!]! +selectTable + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> SelPermInfo -- ^ select permissions of the table + -> m (FieldParser n SelectExp) +selectTable table fieldName description selectPermissions = do + stringifyNum <- asks $ qcStringifyNum . 
getter + tableArgsParser <- tableArgs table selectPermissions + selectionSetParser <- tableSelectionList table selectPermissions + pure $ P.subselection fieldName description tableArgsParser selectionSetParser + <&> \(args, fields) -> RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromTable table + , RQL._asnPerm = tablePermissionsInfo selectPermissions + , RQL._asnArgs = args + , RQL._asnStrfyNum = stringifyNum + } + +-- | Simple table connection selection. +-- +-- The field for the table accepts table connection selection argument, and +-- expects a selection of connection fields +-- +-- > table_name_connection(first: 1) { +-- > pageInfo: { +-- > hasNextPage: Boolean! +-- > endCursor: String! +-- > } +-- > edges: { +-- > cursor: String! +-- > node: { +-- > id: ID! +-- > col1: col1_type +-- > col2: col2_type +-- > } +-- > } +-- > }: table_nameConnection! +selectTableConnection + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> PrimaryKeyColumns -- ^ primary key columns + -> SelPermInfo -- ^ select permissions of the table + -> m (FieldParser n ConnectionSelectExp) +selectTableConnection table fieldName description pkeyColumns selectPermissions = do + stringifyNum <- asks $ qcStringifyNum . getter + selectArgsParser <- tableConnectionArgs pkeyColumns table selectPermissions + selectionSetParser <- P.nonNullableParser <$> tableConnectionSelectionSet table selectPermissions + pure $ P.subselection fieldName description selectArgsParser selectionSetParser + <&> \((args, split, slice), fields) -> RQL.ConnectionSelect + { RQL._csPrimaryKeyColumns = pkeyColumns + , RQL._csSplit = split + , RQL._csSlice = slice + , RQL._csSelect = RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromTable table + , RQL._asnPerm = tablePermissionsInfo selectPermissions + , RQL._asnArgs = args + , RQL._asnStrfyNum = stringifyNum + } + } + + +-- | Table selection by primary key. +-- +-- > table_name(id: 42) { +-- > col1: col1_type +-- > col2: col2_type +-- > }: table +-- +-- Returns Nothing if there's nothing that can be selected with +-- current permissions or if there are primary keys the user +-- doesn't have select permissions for. +selectTableByPk + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable -- ^ qualified name of the table + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> SelPermInfo -- ^ select permissions of the table + -> m (Maybe (FieldParser n SelectExp)) +selectTableByPk table fieldName description selectPermissions = runMaybeT do + stringifyNum <- asks $ qcStringifyNum . getter + primaryKeys <- MaybeT $ fmap _pkColumns . _tciPrimaryKey . _tiCoreInfo <$> askTableInfo table + guard $ all (\c -> pgiColumn c `Set.member` spiCols selectPermissions) primaryKeys + argsParser <- lift $ sequenceA <$> for primaryKeys \columnInfo -> do + field <- P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo) + pure $ BoolFld . AVCol columnInfo . pure . AEQ True . 
mkParameter <$>
+      P.field (pgiName columnInfo) (pgiDescription columnInfo) field
+  selectionSetParser <- lift $ tableSelectionSet table selectPermissions
+  pure $ P.subselection fieldName description argsParser selectionSetParser
+    <&> \(boolExpr, fields) ->
+      let defaultPerms = tablePermissionsInfo selectPermissions
+          -- Do not account for the permission limit, since the result is just a nullable object
+          permissions = defaultPerms { RQL._tpLimit = Nothing }
+          whereExpr = Just $ BoolAnd $ toList boolExpr
+      in RQL.AnnSelectG
+         { RQL._asnFields = fields
+         , RQL._asnFrom = RQL.FromTable table
+         , RQL._asnPerm = permissions
+         , RQL._asnArgs = RQL.noSelectArgs { RQL._saWhere = whereExpr }
+         , RQL._asnStrfyNum = stringifyNum
+         }
+
+-- | Table aggregation selection
+--
+-- Parser for an aggregation selection of a table.
+-- > table_aggregate(limit: 10) {
+-- >   aggregate: table_aggregate_fields
+-- >   nodes: [table!]!
+-- > }: table_aggregate!
+--
+-- Returns Nothing if there's nothing that can be selected with
+-- current permissions.
+selectTableAggregate
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r)
+  => QualifiedTable       -- ^ qualified name of the table
+  -> G.Name               -- ^ field display name
+  -> Maybe G.Description  -- ^ field description, if any
+  -> SelPermInfo          -- ^ select permissions of the table
+  -> m (Maybe (FieldParser n AggSelectExp))
+selectTableAggregate table fieldName description selectPermissions = runMaybeT do
+  guard $ spiAllowAgg selectPermissions
+  stringifyNum <- asks $ qcStringifyNum . getter
+  tableName <- lift $ qualifiedObjectToName table
+  tableArgsParser <- lift $ tableArgs table selectPermissions
+  aggregateParser <- lift $ tableAggregationFields table selectPermissions
+  nodesParser <- lift $ tableSelectionList table selectPermissions
+  let selectionName = tableName <> $$(G.litName "_aggregate")
+      aggregationParser = P.nonNullableParser $
+        parsedSelectionsToFields RQL.TAFExp <$>
+        P.selectionSet selectionName (Just $ G.Description $ "aggregated selection of " <>> table)
+        [ RQL.TAFNodes <$> P.subselection_ $$(G.litName "nodes") Nothing nodesParser
+        , RQL.TAFAgg <$> P.subselection_ $$(G.litName "aggregate") Nothing aggregateParser
+        ]
+  pure $ P.subselection fieldName description tableArgsParser aggregationParser
+    <&> \(args, fields) -> RQL.AnnSelectG
+      { RQL._asnFields = fields
+      , RQL._asnFrom = RQL.FromTable table
+      , RQL._asnPerm = tablePermissionsInfo selectPermissions
+      , RQL._asnArgs = args
+      , RQL._asnStrfyNum = stringifyNum
+      }
+
+{- Note [Selectability of tables]
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The GraphQL specification requires that if the type of a selected field is an
+interface, union, or object, then its subselection set must not be empty
+(Section 5.3.3). Since we model database tables by GraphQL objects, this means
+that a table can be selected as a GraphQL field only if it has fields that we
+can select, such as a column. It is perfectly fine not to allow any selections
+of any columns of the table in the database. In that case, the table would not
+be selectable as a field in GraphQL.
+
+However, this is not the end of the story. In addition to scalar fields, we
+support relationships between tables, so that we may have another table B as a
+selected field of this table A. Then the selectability of A depends on the
+selectability of B: if we permit selection of a column of B, then, as a
+consequence, we permit selection of the relationship from A to B, and hence we
+permit selection of A, as there would now be valid GraphQL syntax that selects
+A. In turn, the selectability of B can depend on the selectability of a further
+table C, through a relationship from B to C.
+
+Now consider the case of a table A, whose columns themselves are not selectable,
+but which has a relationship with itself. Is A selectable? In fact, if A has
+no further relationships with other tables, or any computed fields, A is not
+selectable. But as soon as any leaf field in the transitive closure of tables
+related to A becomes selectable, A itself becomes selectable.
+
+In summary, figuring out the selectability of a table is a mess. In order to
+avoid doing graph theory, for now, we simply pretend that GraphQL did not have
+the restriction of only allowing selections of object-typed fields when the
+subselection is non-empty. In practice, this white lie is somewhat unlikely to
+cause errors on the client side, for the following reasons:
+
+- Introspection of the GraphQL schema is normally provided to aid development of
+  valid GraphQL schemas, and so any errors in the exposed schema can be caught
+  at development time: when a developer is building a GraphQL query using schema
+  introspection, they will eventually find out that the selection they aim to do
+  is not valid GraphQL. Put differently: exposing a given field through
+  introspection is not the same as claiming that there is a valid GraphQL query
+  that selects that field.
+
+- We only support tables that have at least one column (since we require primary
+  keys), so that the admin role can select every table anyway.
+-}
+
+-- | Fields of a table
+--
+-- > type table{
+-- >   # table columns
+-- >   column_1: column1_type
+-- >   .
+-- >   column_n: columnn_type
+-- >
+-- >   # table relationships
+-- >   object_relationship: remote_table
+-- >   array_relationship: [remote_table!]!
+-- >
+-- >   # computed fields
+-- >   computed_field: field_type
+-- >
+-- >   # remote relationships
+-- >   remote_field: field_type
+-- > }
+tableSelectionSet
+  :: ( MonadSchema n m
+     , MonadTableInfo r m
+     , MonadRole r m
+     , Has QueryContext r
+     )
+  => QualifiedTable
+  -> SelPermInfo
+  -> m (Parser 'Output n AnnotatedFields)
+tableSelectionSet table selectPermissions = memoizeOn 'tableSelectionSet table do
+  tableInfo <- _tiCoreInfo <$> askTableInfo table
+  tableName <- qualifiedObjectToName table
+  let tableFields = Map.elems $ _tciFieldInfoMap tableInfo
+      tablePkeyColumns = _pkColumns <$> _tciPrimaryKey tableInfo
+      description = Just $ mkDescriptionWith (_tciDescription tableInfo) $
+        "columns and relationships of " <>> table
+  fieldParsers <- concat <$> for tableFields \fieldInfo ->
+    fieldSelection table tablePkeyColumns fieldInfo selectPermissions
+
+  -- We don't check *here* that the subselection set is non-empty,
+  -- even though the GraphQL specification requires that it is (see
+  -- Note [Selectability of tables]). However, the GraphQL parser
+  -- enforces that a selection set, if present, is non-empty; and our
+  -- parser later verifies that a selection set is present if
+  -- required, meaning that not having this check here does not allow
+  -- for the construction of invalid queries.
+
+  queryType <- asks $ qcQueryType . getter
+  case (queryType, tablePkeyColumns) of
+    -- A relay table
+    (ET.QueryRelay, Just pkeyColumns) -> do
+      let nodeIdFieldParser =
+            P.selection_ $$(G.litName "id") Nothing P.identifier $> RQL.AFNodeId table pkeyColumns
+          allFieldParsers = fieldParsers <> [nodeIdFieldParser]
+      nodeInterface <- node
+      pure $ P.selectionSetObject tableName description allFieldParsers [nodeInterface]
+        <&> parsedSelectionsToFields RQL.AFExpression
+    _ ->
+      pure $ P.selectionSetObject tableName description fieldParsers []
+        <&> parsedSelectionsToFields RQL.AFExpression
+
+-- | List of table fields object.
+-- Just a 'nonNullableObjectList' wrapper over 'tableSelectionSet'.
+-- > table_name: [table!]!
+tableSelectionList
+  :: ( MonadSchema n m
+     , MonadTableInfo r m
+     , MonadRole r m
+     , Has QueryContext r
+     )
+  => QualifiedTable
+  -> SelPermInfo
+  -> m (Parser 'Output n AnnotatedFields)
+tableSelectionList table selectPermissions =
+  nonNullableObjectList <$> tableSelectionSet table selectPermissions
+
+-- | Converts an output type parser from object_type to [object_type!]!
+nonNullableObjectList :: Parser 'Output m a -> Parser 'Output m a
+nonNullableObjectList =
+  P.nonNullableParser . P.multiple . P.nonNullableParser
+
+-- | Connection fields of a table
+--
+-- > type tableConnection{
+-- >   pageInfo: PageInfo!
+-- >   edges: [tableEdge!]!
+-- > }
+--
+-- > type PageInfo{
+-- >   startCursor: String!
+-- >   endCursor: String!
+-- >   hasNextPage: Boolean!
+-- >   hasPreviousPage: Boolean!
+-- > }
+--
+-- > type tableEdge{
+-- >   cursor: String!
+-- >   node: table!
+-- > }
+tableConnectionSelectionSet
+  :: forall m n r. ( MonadSchema n m
+                   , MonadTableInfo r m
+                   , MonadRole r m
+                   , Has QueryContext r
+                   )
+  => QualifiedTable
+  -> SelPermInfo
+  -> m (Parser 'Output n (RQL.ConnectionFields UnpreparedValue))
+tableConnectionSelectionSet table selectPermissions = do
+  tableName <- qualifiedObjectToName table
+  edgesParser <- tableEdgesSelectionSet
+  let connectionTypeName = tableName <> $$(G.litName "Connection")
+      pageInfo = P.subselection_ $$(G.litName "pageInfo") Nothing
+                 pageInfoSelectionSet <&> RQL.ConnectionPageInfo
+      edges = P.subselection_ $$(G.litName "edges") Nothing
+              edgesParser <&> RQL.ConnectionEdges
+      connectionDescription = G.Description $ "A Relay connection object on " <>> table
+  pure $ P.nonNullableParser $
+    P.selectionSet connectionTypeName (Just connectionDescription) [pageInfo, edges]
+    <&> parsedSelectionsToFields RQL.ConnectionTypename
   where
-  enumTyInfo = mkHsraEnumTyInfo (Just desc) (mkSelColumnInpTy tn) $
-    EnumValuesSynthetic . 
mapFromL _eviVal $ map mkColumnEnumVal cols + pageInfoSelectionSet :: Parser 'Output n RQL.PageInfoFields + pageInfoSelectionSet = + let startCursorField = P.selection_ $$(G.litName "startCursor") Nothing + P.string $> RQL.PageInfoStartCursor + endCursorField = P.selection_ $$(G.litName "endCursor") Nothing + P.string $> RQL.PageInfoEndCursor + hasNextPageField = P.selection_ $$(G.litName "hasNextPage") Nothing + P.boolean $> RQL.PageInfoHasNextPage + hasPreviousPageField = P.selection_ $$(G.litName "hasPreviousPage") Nothing + P.boolean $> RQL.PageInfoHasPreviousPage + allFields = + [ startCursorField, endCursorField + , hasNextPageField, hasPreviousPageField + ] + in P.nonNullableParser $ P.selectionSet $$(G.litName "PageInfo") Nothing allFields + <&> parsedSelectionsToFields RQL.PageInfoTypename - desc = G.Description $ - "select columns of table " <>> tn + tableEdgesSelectionSet + :: m (Parser 'Output n (RQL.EdgeFields UnpreparedValue)) + tableEdgesSelectionSet = do + tableName <- qualifiedObjectToName table + edgeNodeParser <- P.nonNullableParser <$> tableSelectionSet table selectPermissions + let edgesType = tableName <> $$(G.litName "Edge") + cursor = P.selection_ $$(G.litName "cursor") Nothing + P.string $> RQL.EdgeCursor + edgeNode = P.subselection_ $$(G.litName "node") Nothing + edgeNodeParser <&> RQL.EdgeNode + pure $ nonNullableObjectList $ P.selectionSet edgesType Nothing [cursor, edgeNode] + <&> parsedSelectionsToFields RQL.EdgeTypename ---table_select_column -mkSelColumnInpTy :: QualifiedTable -> G.NamedType -mkSelColumnInpTy tn = - G.NamedType $ qualObjectToName tn <> "_select_column" +-- | User-defined function (AKA custom function) +selectFunction + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => FunctionInfo -- ^ SQL function info + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> SelPermInfo -- ^ select permissions of the target table + -> m (FieldParser n SelectExp) +selectFunction function fieldName description selectPermissions = do + stringifyNum <- asks $ qcStringifyNum . getter + let table = fiReturnType function + tableArgsParser <- tableArgs table selectPermissions + functionArgsParser <- customSQLFunctionArgs function + selectionSetParser <- tableSelectionList table selectPermissions + let argsParser = liftA2 (,) functionArgsParser tableArgsParser + pure $ P.subselection fieldName description argsParser selectionSetParser + <&> \((funcArgs, tableArgs'), fields) -> RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromFunction (fiName function) funcArgs Nothing + , RQL._asnPerm = tablePermissionsInfo selectPermissions + , RQL._asnArgs = tableArgs' + , RQL._asnStrfyNum = stringifyNum + } -mkTableAggregateFieldsTy :: QualifiedTable -> G.NamedType -mkTableAggregateFieldsTy = addTypeSuffix "_aggregate_fields" . mkTableTy +selectFunctionAggregate + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => FunctionInfo -- ^ SQL function info + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> SelPermInfo -- ^ select permissions of the target table + -> m (Maybe (FieldParser n AggSelectExp)) +selectFunctionAggregate function fieldName description selectPermissions = runMaybeT do + let table = fiReturnType function + stringifyNum <- asks $ qcStringifyNum . 
getter + guard $ spiAllowAgg selectPermissions + tableArgsParser <- lift $ tableArgs table selectPermissions + functionArgsParser <- lift $ customSQLFunctionArgs function + aggregateParser <- lift $ tableAggregationFields table selectPermissions + selectionName <- lift $ qualifiedObjectToName table <&> (<> $$(G.litName "_aggregate")) + nodesParser <- lift $ tableSelectionList table selectPermissions + let argsParser = liftA2 (,) functionArgsParser tableArgsParser + aggregationParser = fmap (parsedSelectionsToFields RQL.TAFExp) $ + P.nonNullableParser $ + P.selectionSet selectionName Nothing + [ RQL.TAFNodes <$> P.subselection_ $$(G.litName "nodes") Nothing nodesParser + , RQL.TAFAgg <$> P.subselection_ $$(G.litName "aggregate") Nothing aggregateParser + ] + pure $ P.subselection fieldName description argsParser aggregationParser + <&> \((funcArgs, tableArgs'), fields) -> RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromFunction (fiName function) funcArgs Nothing + , RQL._asnPerm = tablePermissionsInfo selectPermissions + , RQL._asnArgs = tableArgs' + , RQL._asnStrfyNum = stringifyNum + } -mkTableColAggregateFieldsTy :: G.Name -> QualifiedTable -> G.NamedType -mkTableColAggregateFieldsTy op tn = - G.NamedType $ qualObjectToName tn <> "_" <> op <> "_fields" +selectFunctionConnection + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => FunctionInfo -- ^ SQL function info + -> G.Name -- ^ field display name + -> Maybe G.Description -- ^ field description, if any + -> PrimaryKeyColumns -- ^ primary key columns of the target table + -> SelPermInfo -- ^ select permissions of the target table + -> m (FieldParser n ConnectionSelectExp) +selectFunctionConnection function fieldName description pkeyColumns selectPermissions = do + stringifyNum <- asks $ qcStringifyNum . getter + let table = fiReturnType function + tableConnectionArgsParser <- tableConnectionArgs pkeyColumns table selectPermissions + functionArgsParser <- customSQLFunctionArgs function + selectionSetParser <- tableConnectionSelectionSet table selectPermissions + let argsParser = liftA2 (,) functionArgsParser tableConnectionArgsParser + pure $ P.subselection fieldName description argsParser selectionSetParser + <&> \((funcArgs, (args, split, slice)), fields) -> RQL.ConnectionSelect + { RQL._csPrimaryKeyColumns = pkeyColumns + , RQL._csSplit = split + , RQL._csSlice = slice + , RQL._csSelect = RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromFunction (fiName function) funcArgs Nothing + , RQL._asnPerm = tablePermissionsInfo selectPermissions + , RQL._asnArgs = args + , RQL._asnStrfyNum = stringifyNum + } + } -mkTableByPkName :: QualifiedTable -> G.Name -mkTableByPkName tn = qualObjectToName tn <> "_by_pk" --- Support argument params for PG columns -mkPGColParams :: PGColumnType -> ParamMap -mkPGColParams colType - | isScalarColumnWhere isJSONType colType = - let pathDesc = "JSON select path" - in Map.fromList - [ (G.Name "path", InpValInfo (Just pathDesc) "path" Nothing $ G.toGT $ mkScalarTy PGText) ] - | otherwise = Map.empty -mkPGColFld :: PGColumnInfo -> ObjFldInfo -mkPGColFld colInfo = - mkHsraObjFldInfo desc name (mkPGColParams colTy) ty +-- 2. local parsers +-- Parsers that are used but not exported: sub-components + +-- | Argument to filter rows returned from table selection +-- > where: table_bool_exp +tableWhere + :: forall m n r. 
(MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (InputFieldsParser n (Maybe (RQL.AnnBoolExp UnpreparedValue))) +tableWhere table selectPermissions = do + boolExpParser <- boolExp table (Just selectPermissions) + pure $ fmap join $ + P.fieldOptional whereName whereDesc $ P.nullable boolExpParser where - PGColumnInfo _ name _ colTy isNullable pgDesc = colInfo - desc = (G.Description . getPGDescription) <$> pgDesc - ty = bool notNullTy nullTy isNullable - columnType = mkColumnType colTy - notNullTy = G.toGT $ G.toNT columnType - nullTy = G.toGT columnType + whereName = $$(G.litName "where") + whereDesc = Just $ G.Description "filter the rows returned" -mkComputedFieldFld :: ComputedField -> ObjFldInfo -mkComputedFieldFld field = - uncurry (mkHsraObjFldInfo (Just desc) fieldName) $ case fieldType of - CFTScalar scalarTy -> - let inputParams = mkPGColParams (PGColumnScalar scalarTy) - <> fromInpValL (maybeToList maybeFunctionInputArg) - in (inputParams, G.toGT $ mkScalarTy scalarTy) - CFTTable computedFieldtable -> - let table = _cftTable computedFieldtable - -- TODO: connection stuff - in ( fromInpValL $ maybeToList maybeFunctionInputArg <> mkSelArgs table - , G.toGT $ G.toLT $ G.toNT $ mkTableTy table +-- | Argument to sort rows returned from table selection +-- > order_by: [table_order_by!] +tableOrderBy + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (InputFieldsParser n (Maybe (NonEmpty (RQL.AnnOrderByItemG UnpreparedValue)))) +tableOrderBy table selectPermissions = do + orderByParser <- orderByExp table selectPermissions + pure $ do + maybeOrderByExps <- fmap join $ + P.fieldOptional orderByName orderByDesc $ P.nullable $ P.list orderByParser + pure $ maybeOrderByExps >>= NE.nonEmpty . concat + where + orderByName = $$(G.litName "order_by") + orderByDesc = Just $ G.Description "sort the rows by one or more columns" + +-- | Argument to distinct select on columns returned from table selection +-- > distinct_on: [table_select_column!] +tableDistinctOn + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (InputFieldsParser n (Maybe (NonEmpty PGCol))) +tableDistinctOn table selectPermissions = do + columnsEnum <- tableSelectColumnsEnum table selectPermissions + pure $ do + maybeDistinctOnColumns <- join.join <$> for columnsEnum + (P.fieldOptional distinctOnName distinctOnDesc . P.nullable . P.list) + pure $ maybeDistinctOnColumns >>= NE.nonEmpty + where + distinctOnName = $$(G.litName "distinct_on") + distinctOnDesc = Just $ G.Description "distinct select on columns" + +-- | Arguments for a table selection +-- +-- > distinct_on: [table_select_column!] +-- > limit: Int +-- > offset: Int +-- > order_by: [table_order_by!] +-- > where: table_bool_exp +tableArgs + :: forall m n r. 
(MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (InputFieldsParser n SelectArgs) +tableArgs table selectPermissions = do + whereParser <- tableWhere table selectPermissions + orderByParser <- tableOrderBy table selectPermissions + distinctParser <- tableDistinctOn table selectPermissions + let selectArgs = do + whereF <- whereParser + orderBy <- orderByParser + limit <- fmap join $ P.fieldOptional limitName limitDesc $ P.nullable positiveInt + offset <- fmap join $ P.fieldOptional offsetName offsetDesc $ P.nullable fakeBigInt + distinct <- distinctParser + pure $ RQL.SelectArgs + { RQL._saWhere = whereF + , RQL._saOrderBy = orderBy + , RQL._saLimit = fromIntegral <$> limit + , RQL._saOffset = txtEncoder <$> offset + , RQL._saDistinct = distinct + } + pure $ selectArgs `P.bindFields` + \args -> do + traverse_ (validateDistinctOn $ RQL._saOrderBy args) $ RQL._saDistinct args + pure args + where + -- TODO: THIS IS A TEMPORARY FIX + -- while offset is exposed in the schema as a GraphQL Int, which + -- is a bounded Int32, previous versions of the code used to also + -- silently accept a string as an input for the offset as a way to + -- support int64 values (postgres bigint) + -- a much better way of supporting this would be to expose the + -- offset in the code as a postgres bigint, but for now, to avoid + -- a breaking change, we are defining a custom parser that also + -- accepts a string + fakeBigInt :: Parser 'Both n PGScalarValue + fakeBigInt = P.Parser + { pType = fakeBigIntSchemaType + , pParser = P.peelVariable (Just $ toGraphQLType fakeBigIntSchemaType) >=> \case + P.GraphQLValue (G.VInt i) -> PGValBigInt <$> convertWith scientificToInteger (fromInteger i) + P.JSONValue (J.Number n) -> PGValBigInt <$> convertWith scientificToInteger n + P.GraphQLValue (G.VString s) -> pure $ PGValUnknown s + P.JSONValue (J.String s) -> pure $ PGValUnknown s + v -> P.typeMismatch $$(G.litName "Int") "a 32-bit integer, or a 64-bit integer represented as a string" v + } + fakeBigIntSchemaType = P.NonNullable $ P.TNamed $ P.mkDefinition $$(G.litName "Int") Nothing P.TIScalar + convertWith f = either (parseErrorWith ParseFailed . qeError) pure . runAesonParser f + + -- TH splices mess up ApplicativeDo + -- see (FIXME: link to bug here) + limitName = $$(G.litName "limit") + offsetName = $$(G.litName "offset") + limitDesc = Just $ G.Description "limit the number of rows returned" + offsetDesc = Just $ G.Description "skip the first n rows. Use only with order_by" + + validateDistinctOn Nothing _ = return () + validateDistinctOn (Just orderByCols) distinctOnCols = do + let colsLen = length distinctOnCols + initOrderBys = take colsLen $ NE.toList orderByCols + initOrdByCols = flip mapMaybe initOrderBys $ \ob -> + case obiColumn ob of + RQL.AOCColumn pgCol -> Just $ pgiColumn pgCol + _ -> Nothing + isValid = (colsLen == length initOrdByCols) + && all (`elem` initOrdByCols) (toList distinctOnCols) + unless isValid $ parseError + "\"distinct_on\" columns must match initial \"order_by\" columns" + +-- TODO: +-- this should either be moved to Common, or to Parser itself; even better, +-- we could think of exposing a "PositiveInt" custom scalar type in the schema +positiveInt :: MonadParse n => Parser 'Both n Int32 +positiveInt = P.int `P.bind` \value -> do + when (value < 0) $ parseErrorWith NotSupported "unexpected negative value" + pure value + +-- | Arguments for a table connection selection +-- +-- > distinct_on: [table_select_column!] 
+-- > order_by: [table_order_by!] +-- > where: table_bool_exp +-- > first: Int +-- > last: Int +-- > before: String +-- > after: String +tableConnectionArgs + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => PrimaryKeyColumns + -> QualifiedTable + -> SelPermInfo + -> m ( InputFieldsParser n + ( SelectArgs + , Maybe (NonEmpty (RQL.ConnectionSplit UnpreparedValue)) + , Maybe RQL.ConnectionSlice + ) + ) +tableConnectionArgs pkeyColumns table selectPermissions = do + whereParser <- tableWhere table selectPermissions + orderByParser <- fmap (fmap appendPrimaryKeyOrderBy) <$> tableOrderBy table selectPermissions + distinctParser <- tableDistinctOn table selectPermissions + let maybeFirst = fmap join $ P.fieldOptional $$(G.litName "first") + Nothing $ P.nullable positiveInt + maybeLast = fmap join $ P.fieldOptional $$(G.litName "last") + Nothing $ P.nullable positiveInt + maybeAfter = fmap join $ P.fieldOptional $$(G.litName "after") + Nothing $ P.nullable base64Text + maybeBefore = fmap join $ P.fieldOptional $$(G.litName "before") + Nothing $ P.nullable base64Text + firstAndLast = (,) <$> maybeFirst <*> maybeLast + afterBeforeAndOrderBy = (,,) <$> maybeAfter <*> maybeBefore <*> orderByParser + + pure $ do + whereF <- whereParser + orderBy <- orderByParser + distinct <- distinctParser + split <- afterBeforeAndOrderBy `P.bindFields` \(after, before, orderBy') -> do + rawSplit <- case (after, before) of + (Nothing, Nothing) -> pure Nothing + (Just _, Just _) -> parseError "\"after\" and \"before\" are not allowed at once" + (Just v, Nothing) -> pure $ Just (RQL.CSKAfter, v) + (Nothing, Just v) -> pure $ Just (RQL.CSKBefore, v) + for rawSplit (uncurry (parseConnectionSplit orderBy')) + + slice <- firstAndLast `P.bindFields` \case + (Nothing, Nothing) -> pure Nothing + (Just _, Just _) -> parseError "\"first\" and \"last\" are not allowed at once" + (Just v, Nothing) -> pure $ Just $ RQL.SliceFirst $ fromIntegral v + (Nothing, Just v) -> pure $ Just $ RQL.SliceLast $ fromIntegral v + + pure ( RQL.SelectArgs whereF orderBy Nothing Nothing distinct + , split + , slice ) where - columnDescription = "A computed field, executes function " <>> qf - desc = mkDescriptionWith (_cffDescription function) columnDescription - fieldName = mkComputedFieldName name - ComputedField name function _ fieldType = field - qf = _cffName function + base64Text = base64Decode <$> P.string - maybeFunctionInputArg = - let funcArgDesc = G.Description $ "input parameters for function " <>> qf - inputValue = InpValInfo (Just funcArgDesc) "args" Nothing $ - G.toGT $ G.toNT $ mkFuncArgsTy qf - inputArgs = _cffInputArgs function - in bool (Just inputValue) Nothing $ null inputArgs + appendPrimaryKeyOrderBy :: NonEmpty (RQL.AnnOrderByItemG v) -> NonEmpty (RQL.AnnOrderByItemG v) + appendPrimaryKeyOrderBy orderBys@(h NE.:| t) = + let orderByColumnNames = + orderBys ^.. traverse . to obiColumn . RQL._AOCColumn . 
to pgiColumn + pkeyOrderBys = flip mapMaybe (toList pkeyColumns) $ \pgColumnInfo -> + if pgiColumn pgColumnInfo `elem` orderByColumnNames then Nothing + else Just $ OrderByItemG Nothing (RQL.AOCColumn pgColumnInfo) Nothing + in h NE.:| (t <> pkeyOrderBys) + parseConnectionSplit + :: Maybe (NonEmpty (RQL.AnnOrderByItemG UnpreparedValue)) + -> RQL.ConnectionSplitKind + -> BL.ByteString + -> n (NonEmpty (RQL.ConnectionSplit UnpreparedValue)) + parseConnectionSplit maybeOrderBys splitKind cursorSplit = do + cursorValue <- either (const throwInvalidCursor) pure $ + J.eitherDecode cursorSplit + case maybeOrderBys of + Nothing -> forM (NESeq.toNonEmpty pkeyColumns) $ + \pgColumnInfo -> do + let columnJsonPath = [J.Key $ getPGColTxt $ pgiColumn pgColumnInfo] + columnType = pgiType pgColumnInfo + pgColumnValue <- maybe throwInvalidCursor pure $ iResultToMaybe $ + executeJSONPath columnJsonPath cursorValue + pgValue <- liftQErr $ parsePGScalarValue columnType pgColumnValue + let unresolvedValue = flip UVParameter Nothing $ P.PGColumnValue columnType pgValue + pure $ RQL.ConnectionSplit splitKind unresolvedValue $ + OrderByItemG Nothing (RQL.AOCColumn pgColumnInfo) Nothing + Just orderBys -> + forM orderBys $ \orderBy -> do + let OrderByItemG orderType annObCol nullsOrder = orderBy + columnType = getOrderByColumnType annObCol + orderByItemValue <- maybe throwInvalidCursor pure $ iResultToMaybe $ + executeJSONPath (getPathFromOrderBy annObCol) cursorValue + pgValue <- liftQErr $ parsePGScalarValue columnType orderByItemValue + let unresolvedValue = flip UVParameter Nothing $ P.PGColumnValue columnType pgValue + pure $ RQL.ConnectionSplit splitKind unresolvedValue $ + OrderByItemG orderType (() <$ annObCol) nullsOrder + where + throwInvalidCursor = parseError "the \"after\" or \"before\" cursor is invalid" + liftQErr = either (parseError . qeError) pure . runExcept --- where: table_bool_exp --- limit: Int --- offset: Int --- distinct_on: [table_select_column!] -mkSelArgs :: QualifiedTable -> [InpValInfo] -mkSelArgs tn = - [ InpValInfo (Just whereDesc) "where" Nothing $ G.toGT $ mkBoolExpTy tn - , InpValInfo (Just limitDesc) "limit" Nothing $ G.toGT $ mkScalarTy PGInteger - , InpValInfo (Just offsetDesc) "offset" Nothing $ G.toGT $ mkScalarTy PGInteger - , InpValInfo (Just orderByDesc) "order_by" Nothing $ G.toGT $ G.toLT $ G.toNT $ - mkOrdByTy tn - , InpValInfo (Just distinctDesc) "distinct_on" Nothing $ G.toGT $ G.toLT $ - G.toNT $ mkSelColumnInpTy tn - ] + iResultToMaybe = \case + J.ISuccess v -> Just v + J.IError{} -> Nothing + + getPathFromOrderBy = \case + RQL.AOCColumn pgColInfo -> + let pathElement = J.Key $ getPGColTxt $ pgiColumn pgColInfo + in [pathElement] + RQL.AOCObjectRelation relInfo _ obCol -> + let pathElement = J.Key $ relNameToTxt $ riName relInfo + in pathElement : getPathFromOrderBy obCol + RQL.AOCArrayAggregation relInfo _ aggOb -> + let fieldName = J.Key $ relNameToTxt (riName relInfo) <> "_aggregate" + in fieldName : case aggOb of + RQL.AAOCount -> [J.Key "count"] + RQL.AAOOp t col -> [J.Key t, J.Key $ getPGColTxt $ pgiColumn col] + + getOrderByColumnType = \case + RQL.AOCColumn pgColInfo -> pgiType pgColInfo + RQL.AOCObjectRelation _ _ obCol -> getOrderByColumnType obCol + RQL.AOCArrayAggregation _ _ aggOb -> + case aggOb of + RQL.AAOCount -> PGColumnScalar PGInteger + RQL.AAOOp _ colInfo -> pgiType colInfo + +-- | Aggregation fields +-- +-- > type table_aggregate_fields{ +-- > count(distinct: Boolean, columns: [table_select_column!]): Int! 
+-- > sum: table_sum_fields +-- > avg: table_avg_fields +-- > stddev: table_stddev_fields +-- > stddev_pop: table_stddev_pop_fields +-- > variance: table_variance_fields +-- > var_pop: table_var_pop_fields +-- > max: table_max_fields +-- > min: table_min_fields +-- > } +tableAggregationFields + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m (Parser 'Output n RQL.AggregateFields) +tableAggregationFields table selectPermissions = do + tableName <- qualifiedObjectToName table + allColumns <- tableSelectColumns table selectPermissions + let numericColumns = onlyNumCols allColumns + comparableColumns = onlyComparableCols allColumns + selectName = tableName <> $$(G.litName "_aggregate_fields") + description = G.Description $ "aggregate fields of " <>> table + count <- countField + numericAndComparable <- fmap concat $ sequenceA $ catMaybes + [ -- operators on numeric columns + if null numericColumns then Nothing else Just $ + for numericAggOperators $ \operator -> do + numFields <- mkNumericAggFields operator numericColumns + pure $ parseAggOperator operator tableName numFields + , -- operators on comparable columns + if null comparableColumns then Nothing else Just $ do + comparableFields <- traverse mkColumnAggField comparableColumns + pure $ comparisonAggOperators & map \operator -> + parseAggOperator operator tableName comparableFields + ] + let aggregateFields = count : numericAndComparable + pure $ P.selectionSet selectName (Just description) aggregateFields + <&> parsedSelectionsToFields RQL.AFExp where - whereDesc = "filter the rows returned" - limitDesc = "limit the number of rows returned" - offsetDesc = "skip the first n rows. Use only with order_by" - orderByDesc = "sort the rows by one or more columns" - distinctDesc = "distinct select on columns" + mkNumericAggFields :: G.Name -> [PGColumnInfo] -> m [FieldParser n RQL.PGColFld] + mkNumericAggFields name + | name == $$(G.litName "sum") = traverse mkColumnAggField + | otherwise = traverse \columnInfo -> + pure $ P.selection_ (pgiName columnInfo) (pgiDescription columnInfo) + (P.nullable P.float) $> RQL.PCFCol (pgiColumn columnInfo) --- distinct_on: [table_select_column!] 
--- where: table_bool_exp --- order_by: table_order_by --- first: Int --- after: String --- last: Int --- before: String -mkConnectionArgs :: QualifiedTable -> [InpValInfo] -mkConnectionArgs tn = - [ InpValInfo (Just whereDesc) "where" Nothing $ G.toGT $ mkBoolExpTy tn - , InpValInfo (Just orderByDesc) "order_by" Nothing $ G.toGT $ G.toLT $ G.toNT $ - mkOrdByTy tn - , InpValInfo (Just distinctDesc) "distinct_on" Nothing $ G.toGT $ G.toLT $ - G.toNT $ mkSelColumnInpTy tn - , InpValInfo Nothing "first" Nothing $ G.toGT $ mkScalarTy PGInteger - , InpValInfo Nothing "after" Nothing $ G.toGT $ mkScalarTy PGText - , InpValInfo Nothing "last" Nothing $ G.toGT $ mkScalarTy PGInteger - , InpValInfo Nothing "before" Nothing $ G.toGT $ mkScalarTy PGText - ] + mkColumnAggField :: PGColumnInfo -> m (FieldParser n RQL.PGColFld) + mkColumnAggField columnInfo = do + field <- P.column (pgiType columnInfo) (G.Nullability True) + pure $ P.selection_ (pgiName columnInfo) (pgiDescription columnInfo) field + $> RQL.PCFCol (pgiColumn columnInfo) + + countField :: m (FieldParser n RQL.AggregateField) + countField = do + columnsEnum <- tableSelectColumnsEnum table selectPermissions + let columnsName = $$(G.litName "columns") + distinctName = $$(G.litName "distinct") + args = do + distinct <- P.fieldOptional distinctName Nothing P.boolean + columns <- maybe (pure Nothing) (P.fieldOptional columnsName Nothing . P.list) columnsEnum + pure $ case columns of + Nothing -> SQL.CTStar + Just cols -> if fromMaybe False distinct + then SQL.CTDistinct cols + else SQL.CTSimple cols + pure $ RQL.AFCount <$> P.selection $$(G.litName "count") Nothing args P.int + + parseAggOperator + :: G.Name + -> G.Name + -> [FieldParser n RQL.PGColFld] + -> FieldParser n RQL.AggregateField + parseAggOperator operator tableName columns = + let opText = G.unName operator + setName = tableName <> $$(G.litName "_") <> operator <> $$(G.litName "_fields") + setDesc = Just $ G.Description $ "aggregate " <> opText <> " on columns" + subselectionParser = P.selectionSet setName setDesc columns + <&> parsedSelectionsToFields RQL.PCFExp + in P.subselection_ operator Nothing subselectionParser + <&> (RQL.AFOp . RQL.AggregateOp opText) + +lookupRemoteField' + :: (MonadSchema n m, MonadTableInfo r m) + => [P.Definition P.FieldInfo] + -> FieldCall + -> m P.FieldInfo +lookupRemoteField' fieldInfos (FieldCall fcName _) = + case find ((== fcName) . 
P.dName) fieldInfos of + Nothing -> throw400 RemoteSchemaError $ "field with name " <> fcName <<> " not found" + Just (P.Definition _ _ _ fieldInfo) -> pure fieldInfo + +lookupRemoteField + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => [P.Definition P.FieldInfo] + -> NonEmpty FieldCall + -> m P.FieldInfo +lookupRemoteField fieldInfos (fieldCall :| rest) = + case NE.nonEmpty rest of + Nothing -> lookupRemoteField' fieldInfos fieldCall + Just rest' -> do + (P.FieldInfo _ type') <- lookupRemoteField' fieldInfos fieldCall + (P.Definition _ _ _ (P.ObjectInfo objFieldInfos _)) + <- onNothing (P.getObjectInfo type') $ + throw400 RemoteSchemaError $ "field " <> fcName fieldCall <<> " is expected to be an object" + lookupRemoteField objFieldInfos rest' + +-- | An individual field of a table +-- +-- > field_name(arg_name: arg_type, ...): field_type +fieldSelection + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => QualifiedTable + -> Maybe PrimaryKeyColumns + -> FieldInfo + -> SelPermInfo + -> m [FieldParser n AnnotatedField] +fieldSelection table maybePkeyColumns fieldInfo selectPermissions = + case fieldInfo of + FIColumn columnInfo -> maybeToList <$> runMaybeT do + queryType <- asks $ qcQueryType . getter + let columnName = pgiColumn columnInfo + fieldName = pgiName columnInfo + if | fieldName == $$(G.litName "id") && queryType == ET.QueryRelay -> do + pkeyColumns <- MaybeT $ pure maybePkeyColumns + pure $ P.selection_ fieldName Nothing P.identifier + $> RQL.AFNodeId table pkeyColumns + | otherwise -> do + guard $ Set.member columnName (spiCols selectPermissions) + let pathArg = jsonPathArg $ pgiType columnInfo + field <- lift $ P.column (pgiType columnInfo) (G.Nullability $ pgiIsNullable columnInfo) + pure $ P.selection fieldName (pgiDescription columnInfo) pathArg field + <&> RQL.mkAnnColumnField columnInfo + + FIRelationship relationshipInfo -> + concat . maybeToList <$> relationshipField relationshipInfo + + FIComputedField computedFieldInfo -> + maybeToList <$> computedField computedFieldInfo selectPermissions + + FIRemoteRelationship remoteFieldInfo -> + concat . 
maybeToList <$> remoteRelationshipField remoteFieldInfo + +-- | Field parsers for a table relationship +relationshipField + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => RelInfo -> m (Maybe [FieldParser n AnnotatedField]) +relationshipField relationshipInfo = runMaybeT do + let otherTable = riRTable relationshipInfo + colMapping = riMapping relationshipInfo + relName = riName relationshipInfo + nullable = riIsNullable relationshipInfo + remotePerms <- MaybeT $ tableSelectPermissions otherTable + relFieldName <- lift $ textToName $ relNameToTxt relName + case riType relationshipInfo of + ObjRel -> do + let desc = Just $ G.Description "An object relationship" + selectionSetParser <- lift $ tableSelectionSet otherTable remotePerms + pure $ pure $ (if nullable then id else P.nonNullableField) $ + P.subselection_ relFieldName desc selectionSetParser + <&> \fields -> RQL.AFObjectRelation $ RQL.AnnRelationSelectG relName colMapping $ + RQL.AnnObjectSelectG fields otherTable $ + RQL._tpFilter $ tablePermissionsInfo remotePerms + ArrRel -> do + let arrayRelDesc = Just $ G.Description "An array relationship" + otherTableParser <- lift $ selectTable otherTable relFieldName arrayRelDesc remotePerms + let arrayRelField = otherTableParser <&> \selectExp -> RQL.AFArrayRelation $ + RQL.ASSimple $ RQL.AnnRelationSelectG relName colMapping selectExp + relAggFieldName = relFieldName <> $$(G.litName "_aggregate") + relAggDesc = Just $ G.Description "An aggregate relationship" + remoteAggField <- lift $ selectTableAggregate otherTable relAggFieldName relAggDesc remotePerms + remoteConnectionField <- runMaybeT $ do + -- Parse array connection field only for relay schema + queryType <- asks $ qcQueryType . getter + guard $ queryType == ET.QueryRelay + pkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns) + <$> askTableInfo otherTable + let relConnectionName = relFieldName <> $$(G.litName "_connection") + relConnectionDesc = Just $ G.Description "An array relationship connection" + lift $ lift $ selectTableConnection otherTable relConnectionName + relConnectionDesc pkeyColumns remotePerms + pure $ catMaybes [ Just arrayRelField + , fmap (RQL.AFArrayRelation . RQL.ASAggregate . RQL.AnnRelationSelectG relName colMapping) <$> remoteAggField + , fmap (RQL.AFArrayRelation . RQL.ASConnection . RQL.AnnRelationSelectG relName colMapping) <$> remoteConnectionField + ] + +-- | Computed field parser +computedField + :: forall m n r + . (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => ComputedFieldInfo + -> SelPermInfo + -> m (Maybe (FieldParser n AnnotatedField)) +computedField ComputedFieldInfo{..} selectPermissions = runMaybeT do + stringifyNum <- asks $ qcStringifyNum . 
getter + fieldName <- lift $ textToName $ computedFieldNameToText _cfiName + functionArgsParser <- lift $ computedFieldFunctionArgs _cfiFunction + case _cfiReturnType of + CFRScalar scalarReturnType -> do + guard $ _cfiName `Set.member` spiScalarComputedFields selectPermissions + let fieldArgsParser = do + args <- functionArgsParser + colOp <- jsonPathArg $ PGColumnScalar scalarReturnType + pure $ RQL.AFComputedField $ RQL.CFSScalar $ RQL.ComputedFieldScalarSelect + { RQL._cfssFunction = _cffName _cfiFunction + , RQL._cfssType = scalarReturnType + , RQL._cfssColumnOp = colOp + , RQL._cfssArguments = args + } + dummyParser <- lift $ P.column (PGColumnScalar scalarReturnType) (G.Nullability True) + pure $ P.selection fieldName (Just fieldDescription) fieldArgsParser dummyParser + CFRSetofTable tableName -> do + remotePerms <- MaybeT $ tableSelectPermissions tableName + selectArgsParser <- lift $ tableArgs tableName remotePerms + selectionSetParser <- lift $ P.multiple . P.nonNullableParser <$> tableSelectionSet tableName remotePerms + let fieldArgsParser = liftA2 (,) functionArgsParser selectArgsParser + pure $ P.subselection fieldName (Just fieldDescription) fieldArgsParser selectionSetParser <&> + \((functionArgs', args), fields) -> + RQL.AFComputedField $ RQL.CFSTable RQL.JASMultipleRows $ RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromFunction (_cffName _cfiFunction) functionArgs' Nothing + , RQL._asnPerm = tablePermissionsInfo remotePerms + , RQL._asnArgs = args + , RQL._asnStrfyNum = stringifyNum + } where - whereDesc = "filter the rows returned" - orderByDesc = "sort the rows by one or more columns" - distinctDesc = "distinct select on columns" + fieldDescription = + let defaultDescription = "A computed field, executes function " <>> _cffName _cfiFunction + in mkDescriptionWith (_cffDescription _cfiFunction) defaultDescription -{- + computedFieldFunctionArgs + :: ComputedFieldFunction -> m (InputFieldsParser n (RQL.FunctionArgsExpTableRow UnpreparedValue)) + computedFieldFunctionArgs ComputedFieldFunction{..} = + functionArgs _cffName (IAUserProvided <$> _cffInputArgs) <&> fmap addTableAndSessionArgument + where + tableRowArgument = RQL.AETableRow Nothing -array_relationship( - where: remote_table_bool_exp - limit: Int - offset: Int -): [remote_table!]! -array_relationship_aggregate( - where: remote_table_bool_exp - limit: Int - offset: Int -): remote_table_aggregate! -object_relationship: remote_table + addTableAndSessionArgument args@(RQL.FunctionArgsExp positional named) = + let withTable = case _cffTableArgument of + FTAFirst -> RQL.FunctionArgsExp (tableRowArgument : positional) named + FTANamed argName index -> RQL.insertFunctionArg argName index tableRowArgument args + sessionArgVal = RQL.AESession UVSession + in + case _cffSessionArgument of + Nothing -> withTable + Just (FunctionSessionArgument argName index) -> + RQL.insertFunctionArg argName index sessionArgVal withTable + +-- | Remote relationship field parsers +remoteRelationshipField + :: (MonadSchema n m, MonadTableInfo r m, MonadRole r m, Has QueryContext r) + => RemoteFieldInfo -> m (Maybe [FieldParser n AnnotatedField]) +remoteRelationshipField remoteFieldInfo = runMaybeT do + queryType <- asks $ qcQueryType . getter + -- https://github.com/hasura/graphql-engine/issues/5144 + -- The above issue is easily fixable by removing the following guard and 'MaybeT' monad transformation + guard $ queryType == ET.QueryHasura + remoteSchemasFieldDefns <- asks $ qcRemoteFields . 
getter
+  let remoteSchemaName = _rfiRemoteSchemaName remoteFieldInfo
+  fieldDefns <-
+    case Map.lookup remoteSchemaName remoteSchemasFieldDefns of
+      Nothing ->
+        throw500 $ "unexpected: remote schema "
+        <> remoteSchemaName
+        <<> " not found"
+      Just fieldDefns -> pure fieldDefns
+
+  fieldName <- textToName $ remoteRelationshipNameToText $ _rfiName remoteFieldInfo
+  remoteFieldsArgumentsParser <-
+    sequenceA <$> for (Map.toList $ _rfiParamMap remoteFieldInfo) \(name, inpValDefn) -> do
+      parser <- lift $ inputValueDefinitionParser (_rfiSchemaIntrospect remoteFieldInfo) inpValDefn
+      pure $ parser `mapField` RQL.RemoteFieldArgument name
+
+  -- This selection set parser should be the remote node's selection set parser,
+  -- which comes from the fieldCall.
+  nestedFieldInfo <- lift $ lookupRemoteField fieldDefns $ unRemoteFields $ _rfiRemoteFields remoteFieldInfo
+  let remoteFieldsArgumentsParser' = fmap catMaybes remoteFieldsArgumentsParser
+  case nestedFieldInfo of
+    P.FieldInfo{ P.fType = fieldType } -> do
+      let fieldInfo' = P.FieldInfo
+            { P.fArguments = P.ifDefinitions remoteFieldsArgumentsParser'
+            , P.fType = fieldType }
+      pure $ pure $ P.unsafeRawField (P.mkDefinition fieldName Nothing fieldInfo')
+        `P.bindField` \G.Field{ G._fArguments = args, G._fSelectionSet = selSet } -> do
+          remoteArgs <- P.ifParser remoteFieldsArgumentsParser' $ P.GraphQLValue <$> args
+          pure $ RQL.AFRemote $ RQL.RemoteSelect
+            { _rselArgs          = remoteArgs
+            , _rselSelection     = selSet
+            , _rselHasuraColumns = _rfiHasuraFields remoteFieldInfo
+            , _rselFieldCall     = unRemoteFields $ _rfiRemoteFields remoteFieldInfo
+            , _rselRemoteSchema  = _rfiRemoteSchema remoteFieldInfo
+            }
+
+-- | The custom SQL functions' input "args" field parser
+-- > function_name(args: function_args)
+customSQLFunctionArgs
+  :: (MonadSchema n m, MonadTableInfo r m)
+  => FunctionInfo
+  -> m (InputFieldsParser n (RQL.FunctionArgsExpTableRow UnpreparedValue))
+customSQLFunctionArgs FunctionInfo{..} = functionArgs fiName fiInputArgs
+
+-- | Parses the arguments to the underlying sql function of a computed field or
+-- a custom function. All arguments to the underlying sql function are parsed
+-- as an "args" object. Named arguments are expected in a field with the same
+-- name, while positional arguments are expected in a field named "arg_$n".
+-- Note that collisions are possible, but ignored for now, if a named argument
+-- is also named "arg_$n". (FIXME: link to an issue?)
+--
+-- If the function requires no argument, or if its only argument is not
+-- user-provided (the session argument in the case of custom functions, the
+-- table row argument in the case of computed fields), the args object will
+-- be omitted.
+functionArgs
+  :: forall m n r. (MonadSchema n m, MonadTableInfo r m)
+  => QualifiedFunction
+  -> Seq.Seq FunctionInputArgument
+  -> m (InputFieldsParser n (RQL.FunctionArgsExpTableRow UnpreparedValue))
+functionArgs functionName (toList -> inputArgs) = do
+  -- First, we iterate through the original sql arguments in order, to find the
+  -- corresponding graphql names. At the same time, we create the input field
+  -- parsers, in three groups: session argument, optional arguments, and
+  -- mandatory arguments. Optional arguments have a default value, mandatory
+  -- arguments don't.
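+  --
+  -- As an illustration (a hypothetical function, not part of this patch): a
+  -- SQL function
+  --   search_articles(author_id integer, search text DEFAULT '')
+  -- surfaces in the schema as
+  --   search_articles(args: {author_id: 42, search: "graphql"})
+  -- where "search" may be omitted because it has a SQL default, omitting
+  -- "author_id" is rejected with a parse error, and an unnamed SQL argument
+  -- would instead appear in the args object as "arg_1", "arg_2", ...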
+  let (names, session, optional, mandatory) = mconcat $ snd $ mapAccumL splitArguments 1 inputArgs
+      defaultArguments = RQL.FunctionArgsExp (snd <$> session) Map.empty
+
+  if | length session > 1 ->
+       -- We somehow found more than one session argument; this should never
+       -- happen and is an error on our side.
+       throw500 "there shouldn't be more than one session argument"
+     | null optional && null mandatory ->
+       -- There are no user-provided arguments to the function: there will be
+       -- no args field.
+       pure $ pure defaultArguments
+     | otherwise -> do
+       -- There are user-provided arguments: we need to parse an args object.
+       argumentParsers <- sequenceA $ optional <> mandatory
+       objectName <- qualifiedObjectToName functionName <&> (<> $$(G.litName "_args"))
+       let fieldName = $$(G.litName "args")
+           fieldDesc = G.Description $ "input parameters for function " <>> functionName
+           objectParser = P.object objectName Nothing (sequenceA argumentParsers) `P.bind` \arguments -> do
+             -- After successfully parsing, we create a dictionary of the parsed fields
+             -- and we re-iterate through the original list of sql arguments, now with
+             -- the knowledge of their graphql name.
+             let foundArguments = Map.fromList $ catMaybes arguments <> session
+                 argsWithNames = zip names inputArgs
+
+             -- All elements (in the original sql order) that are found in the result map
+             -- are treated as positional arguments, whether they were originally named or
+             -- not.
+             (positional, left) <- spanMaybeM (\(name, _) -> pure $ Map.lookup name foundArguments) argsWithNames
+
+             -- If there are arguments left, it means we found one that was not passed
+             -- positionally. As a result, any remaining argument will have to be passed
+             -- by name. We fail with a parse error if we encounter a positional sql
+             -- argument (that does not have a name in the sql function), as:
+             --   * only the last positional arguments can be omitted;
+             --   * it has no name we can use.
+             -- We also fail if we find a mandatory argument that was not
+             -- provided by the user.
+             named <- Map.fromList .
catMaybes <$> traverse (namedArgument foundArguments) left + pure $ RQL.FunctionArgsExp positional named + + pure $ P.field fieldName (Just fieldDesc) objectParser --} -mkRelationshipField - :: Bool -> RelationshipFieldInfo -> [ObjFldInfo] -mkRelationshipField isRelay fieldInfo = - if | not isRelay -> mkFields False - | isRelay && isJust maybePkey -> mkFields True - | otherwise -> [] where - mkFields includeConnField = - let boolGuard a = bool Nothing (Just a) - in case relType of - ArrRel -> arrRelFld : catMaybes - [ boolGuard aggArrRelFld allowAgg - , boolGuard arrConnFld includeConnField - ] - ObjRel -> [objRelFld] + sessionPlaceholder :: RQL.ArgumentExp UnpreparedValue + sessionPlaceholder = RQL.AEInput P.UVSession - RelationshipFieldInfo relInfo allowAgg _ _ _ maybePkey isNullable = fieldInfo - RelInfo relName relType _ remoteTable isManual = relInfo + splitArguments + :: Int + -> FunctionInputArgument + -> (Int, ( [Text] -- graphql names, in order + , [(Text, RQL.ArgumentExp UnpreparedValue)] -- session argument + , [m (InputFieldsParser n (Maybe (Text, RQL.ArgumentExp UnpreparedValue)))] -- optional argument + , [m (InputFieldsParser n (Maybe (Text, RQL.ArgumentExp UnpreparedValue)))] -- mandatory argument + ) + ) + splitArguments positionalIndex (IASessionVariables name) = + let argName = getFuncArgNameTxt name + in (positionalIndex, ([argName], [(argName, sessionPlaceholder)], [], [])) + splitArguments positionalIndex (IAUserProvided arg) = + let (argName, newIndex) = case faName arg of + Nothing -> ("arg_" <> T.pack (show positionalIndex), positionalIndex + 1) + Just name -> (getFuncArgNameTxt name, positionalIndex) + in if unHasDefault $ faHasDefault arg + then (newIndex, ([argName], [], [parseArgument arg argName], [])) + else (newIndex, ([argName], [], [], [parseArgument arg argName])) - remTabTy = mkTableTy remoteTable + parseArgument :: FunctionArg -> Text -> m (InputFieldsParser n (Maybe (Text, RQL.ArgumentExp UnpreparedValue))) + parseArgument arg name = do + columnParser <- P.column (PGColumnScalar $ _qptName $ faType arg) (G.Nullability True) + fieldName <- textToName name - objRelFld = - mkHsraObjFldInfo (Just "An object relationship") - (mkRelName relName) Map.empty $ - bool (G.toGT . G.toNT) G.toGT (isManual || isNullable) remTabTy + -- While some arguments are "mandatory" (i.e. they don't have a default + -- value), we don't enforce the distinction at the GraphQL type system + -- level, because all postgres function arguments are nullable, and + -- GraphQL conflates nullability and optionality (see Note [Optional + -- fields and nullability]). Making the field "mandatory" in the GraphQL + -- sense would mean giving a default value of `null`, implicitly passing + -- `null` to the postgres function if the user were to omit the + -- argument. For backwards compatibility reasons, and also to avoid + -- surprises, we prefer to reject the query if a mandatory argument is + -- missing rather than filling the blanks for the user. + let argParser = P.fieldOptional fieldName Nothing columnParser + pure $ argParser `mapField` ((name,) . RQL.AEInput . 
mkParameter) - arrRelFld = - mkHsraObjFldInfo (Just "An array relationship") (mkRelName relName) - (fromInpValL $ mkSelArgs remoteTable) $ - G.toGT $ G.toNT $ G.toLT $ G.toNT remTabTy + namedArgument + :: HashMap Text (RQL.ArgumentExp UnpreparedValue) + -> (Text, InputArgument FunctionArg) + -> n (Maybe (Text, RQL.ArgumentExp UnpreparedValue)) + namedArgument dictionary (name, inputArgument) = case inputArgument of + IASessionVariables _ -> pure $ Just (name, sessionPlaceholder) + IAUserProvided arg -> case Map.lookup name dictionary of + Just parsedValue -> case faName arg of + Just _ -> pure $ Just (name, parsedValue) + Nothing -> parseErrorWith NotSupported "Only last set of positional arguments can be omitted" + Nothing -> whenMaybe (not $ unHasDefault $ faHasDefault arg) $ + parseErrorWith NotSupported "Non default arguments cannot be omitted" - arrConnFld = - mkHsraObjFldInfo (Just "An array relationship connection") (mkConnectionRelName relName) - (fromInpValL $ mkConnectionArgs remoteTable) $ - G.toGT $ G.toNT $ mkTableConnectionTy remoteTable - aggArrRelFld = mkHsraObjFldInfo (Just "An aggregated array relationship") - (mkAggRelName relName) (fromInpValL $ mkSelArgs remoteTable) $ - G.toGT $ G.toNT $ mkTableAggTy remoteTable - -mkTableObjectDescription :: QualifiedTable -> Maybe PGDescription -> G.Description -mkTableObjectDescription tn pgDescription = - mkDescriptionWith pgDescription $ "columns and relationships of " <>> tn - -mkTableObjectFields :: Bool -> [SelField] -> [ObjFldInfo] -mkTableObjectFields isRelay = - concatMap \case - SFPGColumn info -> pure $ mkPGColFld info - SFRelationship info -> mkRelationshipField isRelay info - SFComputedField info -> pure $ mkComputedFieldFld info - SFRemoteRelationship info -> - -- https://github.com/hasura/graphql-engine/issues/5144 - if isRelay then [] else pure $ mkRemoteRelationshipFld info - -{- -type table { - col1: colty1 - . - . - rel1: relty1 -} --} -mkTableObj - :: QualifiedTable - -> Maybe PGDescription - -> [SelField] - -> ObjTyInfo -mkTableObj tn descM allowedFields = - mkObjTyInfo (Just desc) (mkTableTy tn) Set.empty (mapFromL _fiName fields) TLHasuraType +-- | The "path" argument for json column fields +jsonPathArg :: MonadParse n => PGColumnType -> InputFieldsParser n (Maybe RQL.ColumnOp) +jsonPathArg columnType + | isScalarColumnWhere isJSONType columnType = + P.fieldOptional fieldName description P.string `P.bindFields` fmap join . 
traverse toColExp + | otherwise = pure Nothing where - fields = mkTableObjectFields False allowedFields - desc = mkTableObjectDescription tn descM + fieldName = $$(G.litName "path") + description = Just "JSON select path" + toColExp textValue = case parseJSONPath textValue of + Left err -> parseError $ T.pack $ "parse json path error: " ++ err + Right [] -> pure Nothing + Right jPaths -> pure $ Just $ RQL.ColumnOp SQL.jsonbPathOp $ SQL.SEArray $ map elToColExp jPaths + elToColExp (Key k) = SQL.SELit k + elToColExp (Index i) = SQL.SELit $ T.pack (show i) -mkRelayTableObj - :: QualifiedTable - -> Maybe PGDescription - -> [SelField] - -> ObjTyInfo -mkRelayTableObj tn descM allowedFields = - mkObjTyInfo (Just desc) (mkTableTy tn) Set.empty (mapFromL _fiName fields) TLHasuraType - where - fields = - let idColumnFilter = \case - SFPGColumn columnInfo -> (/=) "id" $ pgiName columnInfo - _ -> True - in (:) nodeIdField $ mkTableObjectFields True $ - -- Remove "id" column - filter idColumnFilter allowedFields +tablePermissionsInfo :: SelPermInfo -> TablePerms +tablePermissionsInfo selectPermissions = RQL.TablePerm + { RQL._tpFilter = fmapAnnBoolExp partialSQLExpToUnpreparedValue $ spiFilter selectPermissions + , RQL._tpLimit = spiLimit selectPermissions + } - nodeIdField = mkHsraObjFldInfo Nothing "id" mempty nodeIdType - desc = mkTableObjectDescription tn descM +------------------------ Node interface from Relay --------------------------- -mkRemoteRelationshipName :: RemoteRelationshipName -> G.Name -mkRemoteRelationshipName = - G.Name . remoteRelationshipNameToText +{- Note [Relay Node Id] +~~~~~~~~~~~~~~~~~~~~~~~ -mkRemoteRelationshipFld :: RemoteFieldInfo -> ObjFldInfo -mkRemoteRelationshipFld remoteField = - mkHsraObjFldInfo description fieldName paramMap gType - where - description = Just "Remote relationship field" - fieldName = mkRemoteRelationshipName $ _rfiName remoteField - paramMap = _rfiParamMap remoteField - gType = _rfiGType remoteField +The 'Node' interface in Relay schema has exactly one field which returns +a non-null 'ID' value. Each table object type in Relay schema should implement +'Node' interface to provide global object identification. +See https://relay.dev/graphql/objectidentification.htm for more details. -{- -type table_aggregate { - agg: table_aggregate_fields - nodes: [table!]! -} --} -mkTableAggObj - :: QualifiedTable -> ObjTyInfo -mkTableAggObj tn = - mkHsraObjTyInfo (Just desc) (mkTableAggTy tn) Set.empty $ mapFromL _fiName - [aggFld, nodesFld] - where - desc = G.Description $ - "aggregated selection of " <>> tn +To identify each row in a table, we need to encode the table information +(schema and name) and primary key column values in the 'Node' id. - aggFld = mkHsraObjFldInfo Nothing "aggregate" Map.empty $ G.toGT $ - mkTableAggregateFieldsTy tn - nodesFld = mkHsraObjFldInfo Nothing "nodes" Map.empty $ G.toGT $ - G.toNT $ G.toLT $ G.toNT $ mkTableTy tn +Node id data: +------------- +We are using JSON format for encoding and decoding the node id. 
The JSON
+schema looks like the following

-{-
-type table_aggregate_fields{
-  count: Int
-  sum: table_sum_fields
-  avg: table_avg_fields
-  stddev: table_stddev_fields
-  stddev_pop: table_stddev_pop_fields
-  variance: table_variance_fields
-  var_pop: table_var_pop_fields
-  max: table_max_fields
-  min: table_min_fields
-}
--}
-mkTableAggregateFieldsObj
-  :: QualifiedTable
-  -> ([PGColumnInfo], [G.Name])
-  -> ([PGColumnInfo], [G.Name])
-  -> ObjTyInfo
-mkTableAggregateFieldsObj tn (numCols, numericAggregateOps) (compCols, compareAggregateOps) =
-  mkHsraObjTyInfo (Just desc) (mkTableAggregateFieldsTy tn) Set.empty $ mapFromL _fiName $
-  countFld : (numFlds <> compFlds)
-  where
-    desc = G.Description $
-      "aggregate fields of " <>> tn

+'[<version-integer>, "<table-schema>", "<table-name>", "column-1", "column-2", ... "column-n"]'

-    countFld = mkHsraObjFldInfo Nothing "count" countParams $ G.toGT $
-               mkScalarTy PGInteger
-
-    countParams = fromInpValL [countColInpVal, distinctInpVal]
-
-    countColInpVal = InpValInfo Nothing "columns" Nothing $ G.toGT $
-                     G.toLT $ G.toNT $ mkSelColumnInpTy tn
-    distinctInpVal = InpValInfo Nothing "distinct" Nothing $ G.toGT $
-                     mkScalarTy PGBoolean
-
-    numFlds = bool (map mkColumnOpFld numericAggregateOps) [] $ null numCols
-    compFlds = bool (map mkColumnOpFld compareAggregateOps) [] $ null compCols
-
-    mkColumnOpFld op = mkHsraObjFldInfo Nothing op Map.empty $ G.toGT $
-                       mkTableColAggregateFieldsTy op tn
-
-{-
-type table_<aggregate-op>_fields{
-  num_col: Int
-  .   .
-  .   .
-}
--}
-mkTableColAggregateFieldsObj
-  :: QualifiedTable
-  -> G.Name
-  -> (PGColumnType -> G.NamedType)
-  -> [PGColumnInfo]
-  -> ObjTyInfo
-mkTableColAggregateFieldsObj tn op f cols =
-  mkHsraObjTyInfo (Just desc) (mkTableColAggregateFieldsTy op tn) Set.empty $ mapFromL _fiName $
-  map mkColObjFld cols
-  where
-    desc = G.Description $ "aggregate " <> G.unName op <> " on columns"
-
-    mkColObjFld ci = mkHsraObjFldInfo Nothing (pgiName ci) Map.empty $
-      G.toGT $ f $ pgiType ci
-
-{-
-
-table(
-  where: table_bool_exp
-  limit: Int
-  offset: Int
-): [table!]!
-
--}
-mkSelFld :: Maybe G.Name -> QualifiedTable -> ObjFldInfo
-mkSelFld mCustomName tn =
-  mkHsraObjFldInfo (Just desc) fldName args ty
-  where
-    desc = G.Description $ "fetch data from the table: " <>> tn
-    fldName = fromMaybe (qualObjectToName tn) mCustomName
-    args = fromInpValL $ mkSelArgs tn
-    ty = G.toGT $ G.toNT $ G.toLT $ G.toNT $ mkTableTy tn
-
-{-
-
-table(
-  where: table_bool_exp
-  limit: Int
-  offset: Int
-): tableConnection!

+It is represented in the type @'NodeId'. The 'version-integer' represents the JSON
+schema version to enable any backward compatibility if it is broken in upcoming versions.
+The stringified JSON is Base64 encoded and sent to the client. Also the same
+base64 encoded JSON string is accepted for 'node' field resolver's 'id' input.
-}
-mkSelFldConnection :: Maybe G.Name -> QualifiedTable -> ObjFldInfo
-mkSelFldConnection mCustomName tn =
-  mkHsraObjFldInfo (Just desc) fldName args ty
+data V1NodeId
+  = V1NodeId
+  { _nidTable   :: !QualifiedTable
+  , _nidColumns :: !(NESeq.NESeq J.Value)
+  } deriving (Show, Eq)
+
+-- | The Relay 'Node' interface's 'id' field value.
+-- See Note [Relay Node Id].
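+--
+-- For example (an illustrative value; table and key assumed): a row of table
+-- "public"."author" whose single primary key column "id" equals 1 is
+-- identified by the JSON
+--
+-- > [1,"public","author",1]
+--
+-- which is then base64 encoded, so "WzEsInB1YmxpYyIsImF1dGhvciIsMV0=" would be
+-- served as, and accepted for, the row's 'id' field.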
+data NodeId + = NodeIdV1 !V1NodeId + deriving (Show, Eq) + +instance J.FromJSON NodeId where + parseJSON v = do + valueList <- J.parseJSON v + case valueList of + [] -> fail "unexpected GUID format, found empty list" + J.Number 1:rest -> NodeIdV1 <$> parseNodeIdV1 rest + J.Number n:_ -> fail $ "unsupported GUID version: " <> show n + _ -> fail "unexpected GUID format, needs to start with a version number" + where + parseNodeIdV1 (schemaValue:(nameValue:(firstColumn:remainingColumns))) = + V1NodeId + <$> (QualifiedObject <$> J.parseJSON schemaValue <*> J.parseJSON nameValue) + <*> pure (firstColumn NESeq.:<|| Seq.fromList remainingColumns) + parseNodeIdV1 _ = fail "GUID version 1: expecting schema name, table name and at least one column value" + +throwInvalidNodeId :: MonadParse n => Text -> n a +throwInvalidNodeId t = parseError $ "the node id is invalid: " <> t + +-- | The 'node' root field of a Relay request. +node + :: forall m n r + . ( MonadSchema n m + , MonadTableInfo r m + , MonadRole r m + , Has QueryContext r + ) + => m (P.Parser 'Output n (HashMap QualifiedTable (SelPermInfo, PrimaryKeyColumns, AnnotatedFields))) +node = memoizeOn 'node () do + let idDescription = G.Description "A globally unique identifier" + idField = P.selection_ $$(G.litName "id") (Just idDescription) P.identifier + nodeInterfaceDescription = G.Description "An object with globally unique ID" + allTables :: TableCache <- asks getter + tables :: HashMap QualifiedTable (Parser 'Output n (SelPermInfo, NESeq PGColumnInfo, AnnotatedFields)) <- + Map.mapMaybe id <$> flip Map.traverseWithKey allTables \table _ -> runMaybeT do + tablePkeyColumns <- MaybeT $ (^? tiCoreInfo.tciPrimaryKey._Just.pkColumns) <$> askTableInfo table + selectPermissions <- MaybeT $ tableSelectPermissions table + annotatedFieldsParser <- lift $ tableSelectionSet table selectPermissions + pure $ (selectPermissions, tablePkeyColumns,) <$> annotatedFieldsParser + pure $ P.selectionSetInterface $$(G.litName "Node") + (Just nodeInterfaceDescription) [idField] tables + +nodeField + :: forall m n r + . ( MonadSchema n m + , MonadTableInfo r m + , MonadRole r m + , Has QueryContext r + ) + => m (P.FieldParser n SelectExp) +nodeField = do + let idDescription = G.Description "A globally unique id" + idArgument = P.field $$(G.litName "id") (Just idDescription) P.identifier + stringifyNum <- asks $ qcStringifyNum . getter + nodeObject <- node + return $ P.subselection $$(G.litName "node") Nothing idArgument nodeObject `P.bindField` + \(ident, parseds) -> do + NodeIdV1 (V1NodeId table columnValues) <- parseNodeId ident + (perms, pkeyColumns, fields) <- + onNothing (Map.lookup table parseds) $ + withArgsPath $ throwInvalidNodeId $ "the table " <>> ident + whereExp <- buildNodeIdBoolExp columnValues pkeyColumns + return $ RQL.AnnSelectG + { RQL._asnFields = fields + , RQL._asnFrom = RQL.FromTable table + , RQL._asnPerm = tablePermissionsInfo perms + , RQL._asnArgs = RQL.SelectArgs + { RQL._saWhere = Just whereExp + , RQL._saOrderBy = Nothing + , RQL._saLimit = Nothing + , RQL._saOffset = Nothing + , RQL._saDistinct = Nothing + } + , RQL._asnStrfyNum = stringifyNum + } where - desc = G.Description $ "fetch data from the table: " <>> tn - fldName = fromMaybe (qualObjectToName tn <> "_connection") mCustomName - args = fromInpValL $ mkConnectionArgs tn - ty = G.toGT $ G.toNT $ mkTableConnectionTy tn + parseNodeId :: Text -> n NodeId + parseNodeId = + either (withArgsPath . throwInvalidNodeId . T.pack) pure . J.eitherDecode . 
base64Decode + withArgsPath = withPath (++ [Key "args", Key "id"]) -{- -type tableConnection { - pageInfo: PageInfo! - edges: [tableEdge!]! -} --} -mkTableConnectionObj - :: QualifiedTable -> ObjTyInfo -mkTableConnectionObj tn = - mkHsraObjTyInfo (Just desc) (mkTableConnectionTy tn) Set.empty $ mapFromL _fiName - [pageInfoFld, edgesFld] - where - desc = G.Description $ "A Relay Connection object on " <>> tn - pageInfoFld = mkHsraObjFldInfo Nothing "pageInfo" Map.empty $ - G.toGT $ G.toNT pageInfoTy - edgesFld = mkHsraObjFldInfo Nothing "edges" Map.empty $ G.toGT $ - G.toNT $ G.toLT $ G.toNT $ mkTableEdgeTy tn + buildNodeIdBoolExp + :: NESeq.NESeq J.Value + -> NESeq.NESeq PGColumnInfo + -> n (RQL.AnnBoolExp UnpreparedValue) + buildNodeIdBoolExp columnValues pkeyColumns = do + let firstPkColumn NESeq.:<|| remainingPkColumns = pkeyColumns + firstColumnValue NESeq.:<|| remainingColumns = columnValues + (nonAlignedPkColumns, nonAlignedColumnValues, alignedTuples) = + partitionThese $ toList $ align remainingPkColumns remainingColumns -booleanScalar :: G.NamedType -booleanScalar = G.NamedType "Boolean" + unless (null nonAlignedPkColumns) $ throwInvalidNodeId $ + "primary key columns " <> dquoteList (map pgiColumn nonAlignedPkColumns) <> " are missing" -stringScalar :: G.NamedType -stringScalar = G.NamedType "String" + unless (null nonAlignedColumnValues) $ throwInvalidNodeId $ + "unexpected column values " <> J.encodeToStrictText nonAlignedColumnValues -pageInfoTyName :: G.Name -pageInfoTyName = "PageInfo" + let allTuples = (firstPkColumn, firstColumnValue):alignedTuples -pageInfoTy :: G.NamedType -pageInfoTy = G.NamedType pageInfoTyName -{- -type PageInfo { - hasNextPage: Boolean! - hasPrevousPage: Boolean! - startCursor: String! - endCursor: String! -} --} -pageInfoObj :: ObjTyInfo -pageInfoObj = - mkHsraObjTyInfo Nothing pageInfoTy Set.empty $ mapFromL _fiName - [hasNextPage, hasPreviousPage, startCursor, endCursor] - where - hasNextPage = mkHsraObjFldInfo Nothing "hasNextPage" Map.empty $ - G.toGT $ G.toNT booleanScalar - hasPreviousPage = mkHsraObjFldInfo Nothing "hasPreviousPage" Map.empty $ - G.toGT $ G.toNT booleanScalar - startCursor = mkHsraObjFldInfo Nothing "startCursor" Map.empty $ - G.toGT $ G.toNT stringScalar - endCursor = mkHsraObjFldInfo Nothing "endCursor" Map.empty $ - G.toGT $ G.toNT stringScalar - -{- -type tableConnection { - cursor: String! - node: table! -} --} -mkTableEdgeObj - :: QualifiedTable -> ObjTyInfo -mkTableEdgeObj tn = - mkHsraObjTyInfo Nothing (mkTableEdgeTy tn) Set.empty $ mapFromL _fiName - [cursor, node] - where - cursor = mkHsraObjFldInfo Nothing "cursor" Map.empty $ - G.toGT $ G.toNT stringScalar - node = mkHsraObjFldInfo Nothing "node" Map.empty $ - G.toGT $ G.toNT $ mkTableTy tn - -{- -table_by_pk( - col1: value1!, - . . - . . - coln: valuen! -): table --} -mkSelFldPKey :: Maybe G.Name -> QualifiedTable -> [PGColumnInfo] -> ObjFldInfo -mkSelFldPKey mCustomName tn cols = - mkHsraObjFldInfo (Just desc) fldName args ty - where - desc = G.Description $ "fetch data from the table: " <> tn - <<> " using primary key columns" - fldName = fromMaybe (mkTableByPkName tn) mCustomName - args = fromInpValL $ map mkColumnInputVal cols - ty = G.toGT $ mkTableTy tn - -{- - -table_aggregate( - where: table_bool_exp - limit: Int - offset: Int -): table_aggregate! 
- --} -mkAggSelFld - :: Maybe G.Name -> QualifiedTable -> ObjFldInfo -mkAggSelFld mCustomName tn = - mkHsraObjFldInfo (Just desc) fldName args ty - where - desc = G.Description $ "fetch aggregated fields from the table: " - <>> tn - defFldName = qualObjectToName tn <> "_aggregate" - fldName = fromMaybe defFldName mCustomName - args = fromInpValL $ mkSelArgs tn - ty = G.toGT $ G.toNT $ mkTableAggTy tn + either (parseErrorWith ParseFailed . qeError) pure $ runExcept $ + fmap RQL.BoolAnd $ for allTuples $ \(columnInfo, columnValue) -> do + let modifyErrFn t = "value of column " <> pgiColumn columnInfo + <<> " in node id: " <> t + pgColumnType = pgiType columnInfo + pgValue <- modifyErr modifyErrFn $ parsePGScalarValue pgColumnType columnValue + let unpreparedValue = flip UVParameter Nothing $ P.PGColumnValue pgColumnType pgValue + pure $ RQL.BoolFld $ RQL.AVCol columnInfo [RQL.AEQ True unpreparedValue] diff --git a/server/src-lib/Hasura/GraphQL/Schema/Table.hs b/server/src-lib/Hasura/GraphQL/Schema/Table.hs new file mode 100644 index 00000000000..0fa5ab21c22 --- /dev/null +++ b/server/src-lib/Hasura/GraphQL/Schema/Table.hs @@ -0,0 +1,171 @@ +-- | Helper functions for generating the schema of database tables +module Hasura.GraphQL.Schema.Table + ( tableSelectColumnsEnum + , tableUpdateColumnsEnum + , tablePermissions + , tableSelectPermissions + , tableUpdatePermissions + , tableDeletePermissions + , tableSelectFields + , tableColumns + , tableSelectColumns + , tableUpdateColumns + ) where + +import Hasura.Prelude + +import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set +import qualified Language.GraphQL.Draft.Syntax as G + +import qualified Hasura.GraphQL.Parser as P + +import Hasura.GraphQL.Parser (Kind (..), Parser) +import Hasura.GraphQL.Parser.Class +import Hasura.RQL.DML.Internal (getRolePermInfo) +import Hasura.RQL.Types +import Hasura.SQL.Types + +-- | Table select columns enum +-- +-- Parser for an enum type that matches the columns of the given +-- table. Used as a parameter for "distinct", among others. Maps to +-- the table_select_column object. +-- +-- Return Nothing if there's no column the current user has "select" +-- permissions for. +tableSelectColumnsEnum + :: (MonadSchema n m, MonadRole r m, MonadTableInfo r m) + => QualifiedTable + -> SelPermInfo + -> m (Maybe (Parser 'Both n PGCol)) +tableSelectColumnsEnum table selectPermissions = do + tableName <- qualifiedObjectToName table + columns <- tableSelectColumns table selectPermissions + let enumName = tableName <> $$(G.litName "_select_column") + description = Just $ G.Description $ + "select columns of table " <>> table + pure $ P.enum enumName description <$> nonEmpty + [ ( define $ pgiName column + , pgiColumn column + ) + | column <- columns + ] + where + define name = + P.mkDefinition name (Just $ G.Description "column name") P.EnumValueInfo + +-- | Table update columns enum +-- +-- Parser for an enum type that matches the columns of the given +-- table. Used for conflict resolution in "insert" mutations, among +-- others. Maps to the table_update_column object. +-- +-- Return Nothing if there's no column the current user has "update" +-- permissions for. 
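+--
+-- For example (a sketch; table and column names assumed): a table "author"
+-- whose updatable columns are "name" and "rating" yields
+--
+-- > enum author_update_column {
+-- >   name
+-- >   rating
+-- > }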
+tableUpdateColumnsEnum + :: (MonadSchema n m, MonadRole r m, MonadTableInfo r m) + => QualifiedTable + -> UpdPermInfo + -> m (Maybe (Parser 'Both n PGCol)) +tableUpdateColumnsEnum table updatePermissions = do + tableName <- qualifiedObjectToName table + columns <- tableUpdateColumns table updatePermissions + let enumName = tableName <> $$(G.litName "_update_column") + description = Just $ G.Description $ + "update columns of table " <>> table + pure $ P.enum enumName description <$> nonEmpty + [ ( define $ pgiName column + , pgiColumn column + ) + | column <- columns + ] + where + define name = + P.mkDefinition name (Just $ G.Description "column name") P.EnumValueInfo + +tablePermissions + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> m (Maybe RolePermInfo) +tablePermissions table = do + roleName <- askRoleName + tableInfo <- askTableInfo table + pure $ getRolePermInfo roleName tableInfo + +tableSelectPermissions + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> m (Maybe SelPermInfo) +tableSelectPermissions table = (_permSel =<<) <$> tablePermissions table + +tableUpdatePermissions + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> m (Maybe UpdPermInfo) +tableUpdatePermissions table = (_permUpd =<<) <$> tablePermissions table + +tableDeletePermissions + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> m (Maybe DelPermInfo) +tableDeletePermissions table = (_permDel =<<) <$> tablePermissions table + +tableSelectFields + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m [FieldInfo] +tableSelectFields table permissions = do + tableFields <- _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table + filterM canBeSelected $ Map.elems tableFields + where + canBeSelected (FIColumn columnInfo) = + pure $ Set.member (pgiColumn columnInfo) (spiCols permissions) + canBeSelected (FIRelationship relationshipInfo) = + isJust <$> tableSelectPermissions (riRTable relationshipInfo) + canBeSelected (FIComputedField computedFieldInfo) = + case _cfiReturnType computedFieldInfo of + CFRScalar _ -> + pure $ Set.member (_cfiName computedFieldInfo) $ spiScalarComputedFields permissions + CFRSetofTable tableName -> + isJust <$> tableSelectPermissions tableName + -- TODO (from master): Derive permissions for remote relationships + canBeSelected (FIRemoteRelationship _) = pure True + +tableColumns + :: forall m n r. (MonadSchema n m, MonadTableInfo r m) + => QualifiedTable + -> m [PGColumnInfo] +tableColumns table = + mapMaybe columnInfo . Map.elems . _tciFieldInfoMap . _tiCoreInfo <$> askTableInfo table + where + columnInfo (FIColumn ci) = Just ci + columnInfo _ = Nothing + +tableSelectColumns + :: forall m n r. (MonadSchema n m, MonadTableInfo r m, MonadRole r m) + => QualifiedTable + -> SelPermInfo + -> m [PGColumnInfo] +tableSelectColumns table permissions = + mapMaybe columnInfo <$> tableSelectFields table permissions + where + columnInfo (FIColumn ci) = Just ci + columnInfo _ = Nothing + +tableUpdateColumns + :: forall m n r. (MonadSchema n m, MonadTableInfo r m) + => QualifiedTable + -> UpdPermInfo + -> m [PGColumnInfo] +tableUpdateColumns table permissions = do + tableFields <- _tciFieldInfoMap . 
_tiCoreInfo <$> askTableInfo table + pure $ mapMaybe isUpdatable $ Map.elems tableFields + where + isUpdatable (FIColumn columnInfo) = + if Set.member (pgiColumn columnInfo) (upiCols permissions) + && not (Map.member (pgiColumn columnInfo) (upiSet permissions)) + then Just columnInfo + else Nothing + isUpdatable _ = Nothing diff --git a/server/src-lib/Hasura/GraphQL/Transport/HTTP.hs b/server/src-lib/Hasura/GraphQL/Transport/HTTP.hs index 30928737630..e87576d5d5b 100644 --- a/server/src-lib/Hasura/GraphQL/Transport/HTTP.hs +++ b/server/src-lib/Hasura/GraphQL/Transport/HTTP.hs @@ -16,7 +16,9 @@ module Hasura.GraphQL.Transport.HTTP import Control.Monad.Morph (hoist) import Hasura.EncJSON +import Hasura.GraphQL.Context import Hasura.GraphQL.Logging (MonadQueryLog (..)) +import Hasura.GraphQL.Parser.Column (UnpreparedValue) import Hasura.GraphQL.Transport.HTTP.Protocol import Hasura.HTTP import Hasura.Prelude @@ -27,11 +29,12 @@ import Hasura.Server.Version (HasVersion) import Hasura.Session import Hasura.Tracing (MonadTrace, TraceT, trace) +import qualified Data.Aeson as J import qualified Data.Environment as Env +import qualified Data.HashMap.Strict as Map import qualified Database.PG.Query as Q import qualified Hasura.GraphQL.Execute as E import qualified Hasura.GraphQL.Execute.Query as EQ -import qualified Hasura.GraphQL.Resolve as R import qualified Hasura.Logging as L import qualified Hasura.Server.Telemetry.Counters as Telem import qualified Hasura.Tracing as Tracing @@ -43,7 +46,7 @@ import qualified Network.Wai.Extended as Wai class Monad m => MonadExecuteQuery m where executeQuery :: GQLReqParsed - -> [R.QueryRootFldUnresolved] + -> [QueryRootField UnpreparedValue] -> Maybe EQ.GeneratedSqlMap -> PGExecCtx -> Q.TxAccess @@ -84,30 +87,61 @@ runGQ env logger reqId userInfo ipAddress reqHeaders queryType reqUnparsed = do -- The response and misc telemetry data: let telemTransport = Telem.HTTP (telemTimeTot_DT, (telemCacheHit, telemLocality, (telemTimeIO_DT, telemQueryType, !resp))) <- withElapsedTime $ do - E.ExecutionCtx _ sqlGenCtx pgExecCtx planCache sc scVer httpManager enableAL <- ask + E.ExecutionCtx _ sqlGenCtx pgExecCtx {- planCache -} sc scVer httpManager enableAL <- ask -- run system authorization on the GraphQL API reqParsed <- E.checkGQLExecution userInfo (reqHeaders, ipAddress) enableAL sc reqUnparsed >>= flip onLeft throwError - (telemCacheHit, execPlan) <- E.getResolvedExecPlan env logger pgExecCtx planCache + (telemCacheHit, execPlan) <- E.getResolvedExecPlan env logger pgExecCtx {- planCache -} userInfo sqlGenCtx sc scVer queryType httpManager reqHeaders (reqUnparsed, reqParsed) case execPlan of + E.QueryExecutionPlan queryPlan asts -> + case queryPlan of + E.ExecStepDB txGenSql -> do + (telemTimeIO, telemQueryType, respHdrs, resp) <- + runQueryDB reqId (reqUnparsed,reqParsed) asts userInfo txGenSql + return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, HttpResponse resp respHdrs)) + E.ExecStepRemote (rsi, opDef, _varValsM) -> + runRemoteGQ telemCacheHit rsi opDef + E.ExecStepRaw (name, json) -> do + (telemTimeIO, obj) <- withElapsedTime $ + return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json + return (telemCacheHit, Telem.Local, (telemTimeIO, Telem.Query, HttpResponse obj [])) + E.MutationExecutionPlan mutationPlan -> + case mutationPlan of + E.ExecStepDB (tx, responseHeaders) -> do + (telemTimeIO, telemQueryType, resp) <- runMutationDB reqId reqUnparsed userInfo tx + return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, 
HttpResponse resp responseHeaders)) + E.ExecStepRemote (rsi, opDef, _varValsM) -> + runRemoteGQ telemCacheHit rsi opDef + E.ExecStepRaw (name, json) -> do + (telemTimeIO, obj) <- withElapsedTime $ + return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json + return (telemCacheHit, Telem.Local, (telemTimeIO, Telem.Query, HttpResponse obj [])) + E.SubscriptionExecutionPlan _sub -> + throw400 UnexpectedPayload "subscriptions are not supported over HTTP, use websockets instead" +{- E.GExPHasura resolvedOp -> do (telemTimeIO, telemQueryType, respHdrs, resp) <- runHasuraGQ reqId (reqUnparsed, reqParsed) userInfo resolvedOp return (telemCacheHit, Telem.Local, (telemTimeIO, telemQueryType, HttpResponse resp respHdrs)) E.GExPRemote rsi opDef -> do let telemQueryType | G._todType opDef == G.OperationTypeMutation = Telem.Mutation | otherwise = Telem.Query - (telemTimeIO, resp) <- E.execRemoteGQ env reqId userInfo reqHeaders reqUnparsed rsi $ G._todType opDef - pure (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp)) - + (telemTimeIO, resp) <- E.execRemoteGQ reqId userInfo reqHeaders reqUnparsed rsi $ G._todType opDef + return (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp)) +-} let telemTimeIO = convertDuration telemTimeIO_DT telemTimeTot = convertDuration telemTimeTot_DT - Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..} return resp + where + runRemoteGQ telemCacheHit rsi opDef = do + let telemQueryType | G._todType opDef == G.OperationTypeMutation = Telem.Mutation + | otherwise = Telem.Query + (telemTimeIO, resp) <- E.execRemoteGQ env reqId userInfo reqHeaders reqUnparsed rsi opDef + return (telemCacheHit, Telem.Remote, (telemTimeIO, telemQueryType, resp)) -- | Run (execute) a batched GraphQL query (see 'GQLBatchedReqs') runGQBatched @@ -131,7 +165,7 @@ runGQBatched -> GQLBatchedReqs GQLQueryText -- ^ the batched request with unparsed GraphQL query -> m (HttpResponse EncJSON) -runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs queryType query = do +runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs queryType query = case query of GQLSingleRequest req -> runGQ env logger reqId userInfo ipAddress reqHdrs queryType req @@ -150,6 +184,60 @@ runGQBatched env logger reqId responseErrorsConfig userInfo ipAddress reqHdrs qu try = flip catchError (pure . Left) . fmap Right +runQueryDB + :: ( MonadIO m + , MonadError QErr m + , MonadReader E.ExecutionCtx m + , MonadQueryLog m + , MonadTrace m + , MonadExecuteQuery m + ) + => RequestId + -> (GQLReqUnparsed, GQLReqParsed) + -> [QueryRootField UnpreparedValue] + -> UserInfo + -> (Tracing.TraceT (LazyTx QErr) EncJSON, EQ.GeneratedSqlMap) + -> m (DiffTime, Telem.QueryType, HTTP.ResponseHeaders, EncJSON) + -- ^ Also return 'Mutation' when the operation was a mutation, and the time + -- spent in the PG query; for telemetry. 
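+-- The successful result is wrapped by 'encodeGQResp' into the usual GraphQL
+-- response shape, e.g. (illustrative payload):
+--
+-- > {"data": {"author": [{"id": 1, "name": "..."}]}}
+--
+-- whereas an execution error produces {"data": null, "errors": [...]}.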
+runQueryDB reqId (query, queryParsed) asts _userInfo (tx, genSql) = do + -- log the generated SQL and the graphql query + E.ExecutionCtx logger _ pgExecCtx _ _ _ _ <- ask + logQueryLog logger query (Just genSql) reqId + (telemTimeIO, respE) <- withElapsedTime $ runExceptT $ trace "pg" $ + Tracing.interpTraceT id $ executeQuery queryParsed asts (Just genSql) pgExecCtx Q.ReadOnly tx + (respHdrs,resp) <- liftEither respE + let !json = encodeGQResp $ GQSuccess $ encJToLBS resp + telemQueryType = Telem.Query + return (telemTimeIO, telemQueryType, respHdrs, json) + +runMutationDB + :: ( MonadIO m + , MonadError QErr m + , MonadReader E.ExecutionCtx m + , MonadQueryLog m + , MonadTrace m + ) + => RequestId + -> GQLReqUnparsed + -> UserInfo + -> Tracing.TraceT (LazyTx QErr) EncJSON + -> m (DiffTime, Telem.QueryType, EncJSON) + -- ^ Also return 'Mutation' when the operation was a mutation, and the time + -- spent in the PG query; for telemetry. +runMutationDB reqId query userInfo tx = do + E.ExecutionCtx logger _ pgExecCtx _ _ _ _ <- ask + -- log the graphql query + logQueryLog logger query Nothing reqId + ctx <- Tracing.currentContext + (telemTimeIO, respE) <- withElapsedTime $ runExceptT $ trace "pg" $ + Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadWrite . withTraceContext ctx . withUserInfo userInfo) tx + resp <- liftEither respE + let !json = encodeGQResp $ GQSuccess $ encJToLBS resp + telemQueryType = Telem.Mutation + return (telemTimeIO, telemQueryType, json) + +{- runHasuraGQ :: ( MonadIO m , MonadError QErr m @@ -187,3 +275,4 @@ runHasuraGQ reqId (query, queryParsed) userInfo resolvedOp = do let !json = encodeGQResp $ GQSuccess $ encJToLBS resp telemQueryType = case resolvedOp of E.ExOpMutation{} -> Telem.Mutation ; _ -> Telem.Query return (telemTimeIO, telemQueryType, respHdrs, json) +-} diff --git a/server/src-lib/Hasura/GraphQL/Transport/HTTP/Protocol.hs b/server/src-lib/Hasura/GraphQL/Transport/HTTP/Protocol.hs index 065b811825b..cbb51c5f2f8 100644 --- a/server/src-lib/Hasura/GraphQL/Transport/HTTP/Protocol.hs +++ b/server/src-lib/Hasura/GraphQL/Transport/HTTP/Protocol.hs @@ -1,6 +1,3 @@ -{-# LANGUAGE NamedFieldPuns #-} -{-# LANGUAGE RecordWildCards #-} - module Hasura.GraphQL.Transport.HTTP.Protocol ( GQLReq(..) , GQLBatchedReqs(..) @@ -19,37 +16,25 @@ module Hasura.GraphQL.Transport.HTTP.Protocol , RemoteGqlResp(..) , GraphqlResponse(..) 
, encodeGraphqlResponse - , GQRespValue(..), gqRespData, gqRespErrors - , encodeGQRespValue - , parseGQRespValue - , parseEncJObject - , GQJoinError(..), gqJoinErrorToValue ) where -import Control.Lens import Hasura.EncJSON -import Hasura.GraphQL.Utils import Hasura.Prelude import Hasura.RQL.Types -import Language.GraphQL.Draft.Instances () import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Aeson as J -import qualified Data.Aeson.Casing as J -import qualified Data.Aeson.Ordered as OJ -import qualified Data.Aeson.TH as J -import qualified Data.ByteString.Lazy as BL -import qualified Data.HashMap.Strict as Map -import qualified Data.Vector as V -import qualified Language.GraphQL.Draft.Parser as G -import qualified Language.GraphQL.Draft.Syntax as G -import qualified VectorBuilder.Builder as VB -import qualified VectorBuilder.Vector as VB +import qualified Data.Aeson as J +import qualified Data.Aeson.Casing as J +import qualified Data.Aeson.TH as J +import qualified Data.ByteString.Lazy as BL +import qualified Data.HashMap.Strict as Map +import qualified Language.GraphQL.Draft.Parser as G +import qualified Language.GraphQL.Draft.Syntax as G newtype GQLExecDoc - = GQLExecDoc { unGQLExecDoc :: [G.ExecutableDefinition] } - deriving (Ord, Show, Eq, Hashable, Lift) + = GQLExecDoc { unGQLExecDoc :: [G.ExecutableDefinition G.Name] } + deriving (Ord, Show, Eq, Hashable,Lift) instance J.FromJSON GQLExecDoc where parseJSON v = GQLExecDoc . G.getExecutableDefinitions <$> J.parseJSON v @@ -62,9 +47,9 @@ newtype OperationName deriving (Ord, Show, Eq, Hashable, J.ToJSON, Lift) instance J.FromJSON OperationName where - parseJSON v = OperationName . G.Name <$> J.parseJSON v + parseJSON v = OperationName <$> J.parseJSON v -type VariableValues = Map.HashMap G.Variable J.Value +type VariableValues = Map.HashMap G.Name J.Value data GQLReq a = GQLReq @@ -105,7 +90,7 @@ type GQLReqParsed = GQLReq GQLExecDoc toParsed :: (MonadError QErr m ) => GQLReqUnparsed -> m GQLReqParsed toParsed req = case G.parseExecutableDoc gqlText of - Left _ -> withPathK "query" $ throwVE "not a valid graphql query" + Left _ -> withPathK "query" $ throw400 ValidationFailed "not a valid graphql query" Right a -> return $ req { _grQuery = GQLExecDoc $ G.getExecutableDefinitions a } where gqlText = _unGQLQueryText $ _grQuery req @@ -114,40 +99,11 @@ encodeGQErr :: Bool -> QErr -> J.Value encodeGQErr includeInternal qErr = J.object [ "errors" J..= [encodeGQLErr includeInternal qErr]] --- | https://graphql.github.io/graphql-spec/June2018/#sec-Response-Format --- --- NOTE: this type and parseGQRespValue are a lax representation of the spec, --- since... --- - remote GraphQL servers may not conform strictly, and... --- - we use this type as an accumulator. --- --- Ideally we'd have something correct by construction for hasura results --- someplace. -data GQRespValue = - GQRespValue - { _gqRespData :: OJ.Object - -- ^ 'OJ.empty' (corresponding to the invalid `"data": {}`) indicates an error. - , _gqRespErrors :: VB.Builder OJ.Value - -- ^ An 'OJ.Array', but with efficient cons and concatenation. Null indicates - -- query success. 
- } - -makeLenses ''GQRespValue - -newtype GQJoinError = GQJoinError Text - deriving (Show, Eq, IsString, Monoid, Semigroup) - --- | https://graphql.github.io/graphql-spec/June2018/#sec-Errors "Error result format" -gqJoinErrorToValue :: GQJoinError -> OJ.Value -gqJoinErrorToValue (GQJoinError msg) = - OJ.Object (OJ.fromList [("message", OJ.String msg)]) - data GQResult a = GQSuccess !a | GQPreExecError ![J.Value] | GQExecError ![J.Value] - | GQGeneric !GQRespValue - deriving (Functor, Foldable, Traversable) + deriving (Show, Eq, Functor, Foldable, Traversable) type GQResponse = GQResult BL.ByteString @@ -156,6 +112,13 @@ isExecError = \case GQExecError _ -> True _ -> False +encodeGQResp :: GQResponse -> EncJSON +encodeGQResp gqResp = + encJFromAssocList $ case gqResp of + GQSuccess r -> [("data", encJFromLBS r)] + GQPreExecError e -> [("errors", encJFromJValue e)] + GQExecError e -> [("data", "null"), ("errors", encJFromJValue e)] + -- | Represents GraphQL response from a remote server data RemoteGqlResp = RemoteGqlResp @@ -181,53 +144,3 @@ encodeGraphqlResponse :: GraphqlResponse -> EncJSON encodeGraphqlResponse = \case GRHasura resp -> encodeGQResp resp GRRemote resp -> encodeRemoteGqlResp resp - --- emptyResp :: GQRespValue --- emptyResp = GQRespValue OJ.empty VB.empty - -parseEncJObject :: EncJSON -> Either String OJ.Object -parseEncJObject = OJ.eitherDecode . encJToLBS >=> \case - OJ.Object obj -> pure obj - _ -> Left "expected object for GraphQL response" - -parseGQRespValue :: EncJSON -> Either String GQRespValue -parseGQRespValue = parseEncJObject >=> \obj -> do - _gqRespData <- - case OJ.lookup "data" obj of - -- "an error was encountered before execution began": - Nothing -> pure OJ.empty - -- "an error was encountered during the execution that prevented a valid response": - Just OJ.Null -> pure OJ.empty - Just (OJ.Object dobj) -> pure dobj - Just _ -> Left "expected object or null for GraphQL data response" - _gqRespErrors <- - case OJ.lookup "errors" obj of - Nothing -> pure VB.empty - Just (OJ.Array vec) -> pure $ VB.vector vec - Just _ -> Left "expected array for GraphQL error response" - pure (GQRespValue {_gqRespData, _gqRespErrors}) - -encodeGQRespValue :: GQRespValue -> EncJSON -encodeGQRespValue GQRespValue{..} = OJ.toEncJSON $ OJ.Object $ OJ.fromList $ - -- "If the data entry in the response is not present, the errors entry in the - -- response must not be empty. It must contain at least one error. " - if _gqRespData == OJ.empty && not anyErrors - then - let msg = "Somehow did not accumulate any errors or data from graphql queries" - in [("errors", OJ.Array $ V.singleton $ OJ.Object (OJ.fromList [("message", OJ.String msg)]) )] - else - -- NOTE: "If an error was encountered during the execution that prevented - -- a valid response, the data entry in the response should be null." - -- TODO it's not clear to me how we can enforce that here or if we should try. 
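
{-# LANGUAGE OverloadedStrings #-}
-- The new 'encodeGQResp' in Protocol.hs above produces the GraphQL response
-- envelope: "data" alone on success, "errors" alone when the request failed
-- before execution, and a null "data" alongside "errors" when execution
-- itself failed. A standalone aeson sketch of the same shape (illustrative
-- names, not from this patch):

import qualified Data.Aeson as J

data ResultSketch
  = SketchSuccess J.Value
  | SketchPreExecError [J.Value]
  | SketchExecError [J.Value]

encodeSketch :: ResultSketch -> J.Value
encodeSketch result = case result of
  SketchSuccess r      -> J.object ["data" J..= r]
  SketchPreExecError e -> J.object ["errors" J..= e]
  SketchExecError e    -> J.object ["data" J..= J.Null, "errors" J..= e]
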
- ("data", OJ.Object _gqRespData) : - [("errors", OJ.Array gqRespErrorsV) | anyErrors ] - where - gqRespErrorsV = VB.build _gqRespErrors - anyErrors = not $ V.null gqRespErrorsV - -encodeGQResp :: GQResponse -> EncJSON -encodeGQResp = \case - GQSuccess r -> encJFromAssocList [("data", encJFromLBS r)] - GQPreExecError e -> encJFromAssocList [("errors", encJFromJValue e)] - GQExecError e -> encJFromAssocList [("data", "null"), ("errors", encJFromJValue e)] - GQGeneric v -> encodeGQRespValue v diff --git a/server/src-lib/Hasura/GraphQL/Transport/WebSocket.hs b/server/src-lib/Hasura/GraphQL/Transport/WebSocket.hs index c1d38e7021b..a5628a2f4e4 100644 --- a/server/src-lib/Hasura/GraphQL/Transport/WebSocket.hs +++ b/server/src-lib/Hasura/GraphQL/Transport/WebSocket.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE CPP #-} {-# LANGUAGE RankNTypes #-} {-# LANGUAGE RecordWildCards #-} @@ -22,6 +23,7 @@ import qualified Data.Aeson.Casing as J import qualified Data.Aeson.TH as J import qualified Data.ByteString.Lazy as BL import qualified Data.CaseInsensitive as CI +import qualified Data.Environment as Env import qualified Data.HashMap.Strict as Map import qualified Data.Text as T import qualified Data.Text.Encoding as TE @@ -34,16 +36,17 @@ import qualified Network.HTTP.Types as H import qualified Network.Wai.Extended as Wai import qualified Network.WebSockets as WS import qualified StmContainers.Map as STMMap -import qualified Data.Environment as Env import Control.Concurrent.Extended (sleep) import Control.Exception.Lifted import Data.String +#ifndef PROFILING import GHC.AssertNF +#endif import Hasura.EncJSON -import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery(..)) import Hasura.GraphQL.Logging (MonadQueryLog (..)) +import Hasura.GraphQL.Transport.HTTP (MonadExecuteQuery (..)) import Hasura.GraphQL.Transport.HTTP.Protocol import Hasura.GraphQL.Transport.WebSocket.Protocol import Hasura.HTTP @@ -52,8 +55,7 @@ import Hasura.RQL.Types import Hasura.Server.Auth (AuthMode, UserAuthentication, resolveUserInfo) import Hasura.Server.Cors -import Hasura.Server.Utils (RequestId, - getRequestId) +import Hasura.Server.Utils (RequestId, getRequestId) import Hasura.Server.Version (HasVersion) import Hasura.Session @@ -217,7 +219,7 @@ data WSServerEnv , _wseHManager :: !H.Manager , _wseCorsPolicy :: !CorsPolicy , _wseSQLCtx :: !SQLGenCtx - , _wseQueryCache :: !E.PlanCache + -- , _wseQueryCache :: !E.PlanCache -- See Note [Temporarily disabling query plan caching] , _wseServer :: !WSServer , _wseEnableAllowlist :: !Bool } @@ -341,12 +343,89 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do reqParsedE <- lift $ E.checkGQLExecution userInfo (reqHdrs, ipAddress) enableAL sc q reqParsed <- either (withComplete . preExecErr requestId) return reqParsedE execPlanE <- runExceptT $ E.getResolvedExecPlan env logger pgExecCtx - planCache userInfo sqlGenCtx sc scVer queryType httpMgr reqHdrs (q, reqParsed) + {- planCache -} userInfo sqlGenCtx sc scVer queryType httpMgr reqHdrs (q, reqParsed) (telemCacheHit, execPlan) <- either (withComplete . 
preExecErr requestId) return execPlanE - let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx planCache sc scVer httpMgr enableAL + let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx {- planCache -} sc scVer httpMgr enableAL case execPlan of + E.QueryExecutionPlan queryPlan asts -> + case queryPlan of + E.ExecStepDB (tx, genSql) -> Tracing.trace "Query" $ + execQueryOrMut timerTot Telem.Query telemCacheHit (Just genSql) requestId $ + fmap snd $ Tracing.interpTraceT id $ executeQuery reqParsed asts (Just genSql) pgExecCtx Q.ReadOnly tx + E.ExecStepRemote (rsi, opDef, _varValsM) -> + runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi + E.ExecStepRaw (name, json) -> + execQueryOrMut timerTot Telem.Query telemCacheHit Nothing requestId $ + return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json + E.MutationExecutionPlan mutationPlan -> + case mutationPlan of + E.ExecStepDB (tx, _) -> Tracing.trace "Mutate" do + ctx <- Tracing.currentContext + execQueryOrMut timerTot Telem.Mutation telemCacheHit Nothing requestId $ + Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadWrite . withTraceContext ctx . withUserInfo userInfo) tx + E.ExecStepRemote (rsi, opDef, _varValsM) -> + runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi + E.ExecStepRaw (name, json) -> + execQueryOrMut timerTot Telem.Query telemCacheHit Nothing requestId $ + return $ encJFromJValue $ J.Object $ Map.singleton (G.unName name) json + E.SubscriptionExecutionPlan lqOp -> do + -- log the graphql query + logQueryLog logger q Nothing requestId + let subscriberMetadata = LQ.mkSubscriberMetadata $ J.object + [ "websocket_id" J..= WS.getWSId wsConn + , "operation_id" J..= opId + ] + -- NOTE!: we mask async exceptions higher in the call stack, but it's + -- crucial we don't lose lqId after addLiveQuery returns successfully. + !lqId <- liftIO $ LQ.addLiveQuery logger subscriberMetadata lqMap lqOp liveQOnChange + let !opName = _grOperationName q +#ifndef PROFILING + liftIO $ $assertNFHere $! (lqId, opName) -- so we don't write thunks to mutable vars +#endif + liftIO $ STM.atomically $ + -- NOTE: see crucial `lookup` check above, ensuring this doesn't clobber: + STMMap.insert (lqId, opName) opId opMap + logOpEv ODStarted (Just requestId) + + -- case execPlan of + -- E.GExPHasura resolvedOp -> + -- runHasuraGQ timerTot telemCacheHit requestId q userInfo resolvedOp + -- E.GExPRemote rsi opDef -> + -- runRemoteGQ timerTot telemCacheHit execCtx requestId userInfo reqHdrs opDef rsi + where + telemTransport = Telem.WebSocket + execQueryOrMut + :: ExceptT () m DiffTime + -> Telem.QueryType + -> Telem.CacheHit + -> Maybe EQ.GeneratedSqlMap + -> RequestId + -> ExceptT QErr (ExceptT () m) EncJSON + -> ExceptT () m () + execQueryOrMut timerTot telemQueryType telemCacheHit genSql requestId action = do + let telemLocality = Telem.Local + logOpEv ODStarted (Just requestId) + -- log the generated SQL and the graphql query + logQueryLog logger q genSql requestId + withElapsedTime (runExceptT action) >>= \case + (_, Left err) -> postExecErr requestId err + (telemTimeIO_DT, Right encJson) -> do + -- Telemetry. 
NOTE: don't time network IO:
+          telemTimeTot <- Seconds <$> timerTot
+          sendSuccResp encJson $ LQ.LiveQueryMetadata telemTimeIO_DT
+          let telemTimeIO = convertDuration telemTimeIO_DT
+          Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..}
+
+      sendCompleted (Just requestId)
+{-
+  runHasuraGQ :: ExceptT () m DiffTime
+              -> Telem.CacheHit -> RequestId -> GQLReqUnparsed -> UserInfo -> E.ExecOp
+              -> ExceptT () m ()
+  runHasuraGQ timerTot telemCacheHit reqId query userInfo = \case
+    E.ExOpQuery opTx genSql _asts ->
+      execQueryOrMut Telem.Query genSql $ runQueryTx pgExecCtx opTx
       E.GExPHasura resolvedOp ->
         runHasuraGQ timerTot telemCacheHit requestId q reqParsed userInfo resolvedOp
       E.GExPRemote rsi opDef  ->
@@ -410,10 +489,11 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
       Telem.recordTimingMetric Telem.RequestDimensions{..} Telem.RequestTimings{..}
     sendCompleted (Just reqId)
+-}

   runRemoteGQ :: ExceptT () m DiffTime
               -> Telem.CacheHit -> E.ExecutionCtx -> RequestId -> UserInfo -> [H.Header]
-              -> G.TypedOperationDefinition -> RemoteSchemaInfo
+              -> G.TypedOperationDefinition G.NoFragments G.Name -> RemoteSchemaInfo
               -> ExceptT () m ()
   runRemoteGQ timerTot telemCacheHit execCtx reqId userInfo reqHdrs opDef rsi = do
     let telemLocality = Telem.Remote
@@ -425,8 +505,8 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
       G.OperationTypeQuery -> return Telem.Query

     -- if it's not a subscription, use HTTP to execute the query on the remote
-    (runExceptT $ flip runReaderT execCtx $
-      E.execRemoteGQ env reqId userInfo reqHdrs q rsi (G._todType opDef)) >>= \case
+    runExceptT (flip runReaderT execCtx $ E.execRemoteGQ env reqId userInfo reqHdrs q rsi opDef)
+      >>= \case
       Left  err -> postExecErr reqId err
       Right (telemTimeIO_DT, !val) -> do
         -- Telemetry. NOTE: don't time network IO:
@@ -445,7 +525,7 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
     invalidGqlErr err = err500 Unexpected $
       "Failed parsing GraphQL response from remote: " <> err

-    WSServerEnv logger pgExecCtx lqMap getSchemaCache httpMgr _ sqlGenCtx planCache
+    WSServerEnv logger pgExecCtx lqMap getSchemaCache httpMgr _ sqlGenCtx {- planCache -}
       _ enableAL = serverEnv

     WSConnData userInfoR opMap errRespTy queryType = WS.getData wsConn
@@ -507,7 +587,7 @@ onStart env serverEnv wsConn (StartMsg opId q) = catchAndIgnore $ do
           (SMData $ DataMsg opId $ GRHasura $ GQSuccess $ BL.fromStrict bs)
           (LQ.LiveQueryMetadata dTime)
         resp -> sendMsg wsConn $ SMData $ DataMsg opId $ GRHasura $
-          (BL.fromStrict . LQ._lqrPayload) <$> resp
+          BL.fromStrict . LQ._lqrPayload <$> resp

     catchAndIgnore :: ExceptT () m () -> m ()
     catchAndIgnore m = void $ runExceptT m
@@ -603,7 +683,7 @@ onConnInit :: (HasVersion, MonadIO m, UserAuthentication (Tracing.TraceT m))
   => L.Logger L.Hasura -> H.Manager -> WSConn -> AuthMode -> Maybe ConnParams
   -> Tracing.TraceT m ()
 onConnInit logger manager wsConn authMode connParamsM = do
-  -- TODO: what should be the behaviour of connection_init message when a
+  -- TODO(from master): what should be the behaviour of connection_init message when a
   -- connection is already initialized? Currently, we seem to be doing
   -- something arbitrary which isn't correct.
Ideally, we should stick to -- this: @@ -621,7 +701,9 @@ onConnInit logger manager wsConn authMode connParamsM = do Left e -> do let !initErr = CSInitError $ qeError e liftIO $ do +#ifndef PROFILING $assertNFHere initErr -- so we don't write thunks to mutable vars +#endif STM.atomically $ STM.writeTVar (_wscUser $ WS.getData wsConn) initErr let connErr = ConnErrMsg $ qeError e @@ -631,11 +713,13 @@ onConnInit logger manager wsConn authMode connParamsM = do Right (userInfo, expTimeM) -> do let !csInit = CSInitialised $ WsClientState userInfo expTimeM paramHeaders ipAddress liftIO $ do +#ifndef PROFILING $assertNFHere csInit -- so we don't write thunks to mutable vars +#endif STM.atomically $ STM.writeTVar (_wscUser $ WS.getData wsConn) csInit sendMsg wsConn SMConnAck - -- TODO: send it periodically? Why doesn't apollo's protocol use + -- TODO(from master): send it periodically? Why doesn't apollo's protocol use -- ping/pong frames of websocket spec? sendMsg wsConn SMConnKeepAlive where @@ -685,14 +769,14 @@ createWSServerEnv -> CorsPolicy -> SQLGenCtx -> Bool - -> E.PlanCache + -- -> E.PlanCache -> m WSServerEnv createWSServerEnv logger isPgCtx lqState getSchemaCache httpManager - corsPolicy sqlGenCtx enableAL planCache = do + corsPolicy sqlGenCtx enableAL {- planCache -} = do wsServer <- liftIO $ STM.atomically $ WS.createWSServer logger return $ WSServerEnv logger isPgCtx lqState getSchemaCache httpManager corsPolicy - sqlGenCtx planCache wsServer enableAL + sqlGenCtx {- planCache -} wsServer enableAL createWSServerApp :: ( HasVersion diff --git a/server/src-lib/Hasura/GraphQL/Transport/WebSocket/Server.hs b/server/src-lib/Hasura/GraphQL/Transport/WebSocket/Server.hs index 0371a88552d..f5e90b747bc 100644 --- a/server/src-lib/Hasura/GraphQL/Transport/WebSocket/Server.hs +++ b/server/src-lib/Hasura/GraphQL/Transport/WebSocket/Server.hs @@ -1,5 +1,6 @@ {-# LANGUAGE NondecreasingIndentation #-} {-# LANGUAGE RankNTypes #-} +{-# LANGUAGE CPP #-} module Hasura.GraphQL.Transport.WebSocket.Server ( WSId(..) 
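
-- The #ifndef PROFILING guards above compile out $assertNFHere, a debug-time
-- assertion (GHC.AssertNF, from the ghc-heap-view library) that a value is
-- already in normal form before it is written to a long-lived mutable
-- variable; this kind of heap inspection is presumably unavailable in
-- profiled builds. A standalone way to get the same no-thunks guarantee
-- unconditionally is to deep-force the value first (sketch, not from this
-- patch):

import Control.Concurrent.STM (TVar, atomically, writeTVar)
import Control.DeepSeq (NFData, force)
import Control.Exception (evaluate)

writeTVarNF :: NFData a => TVar a -> a -> IO ()
writeTVarNF var value = do
  forced <- evaluate (force value)  -- fully evaluate outside the transaction
  atomically (writeTVar var forced)
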
@@ -45,7 +46,9 @@ import qualified Data.TByteString as TBS import qualified Data.UUID as UUID import qualified Data.UUID.V4 as UUID import Data.Word (Word16) +#ifndef PROFILING import GHC.AssertNF +#endif import GHC.Int (Int64) import Hasura.Prelude import qualified ListT @@ -116,10 +119,6 @@ $(J.deriveToJSON } ''WSLog) -instance L.ToEngineLog WSLog L.Hasura where - toEngineLog wsLog = - (L.LevelDebug, L.ELTInternal L.ILTWsServer, J.toJSON wsLog) - class Monad m => MonadWSLog m where -- | Takes WS server log data and logs it -- logWSServer @@ -131,6 +130,10 @@ instance MonadWSLog m => MonadWSLog (ExceptT e m) where instance MonadWSLog m => MonadWSLog (ReaderT r m) where logWSLog l ws = lift $ logWSLog l ws +instance L.ToEngineLog WSLog L.Hasura where + toEngineLog wsLog = + (L.LevelDebug, L.ELTInternal L.ILTWsServer, J.toJSON wsLog) + data WSQueueResponse = WSQueueResponse { _wsqrMessage :: !BL.ByteString @@ -172,7 +175,9 @@ closeConnWithCode wsConn code bs = do -- so that sendMsg doesn't block sendMsg :: WSConn a -> WSQueueResponse -> IO () sendMsg wsConn = \ !resp -> do +#ifndef PROFILING $assertNFHere resp -- so we don't write thunks to mutable vars +#endif STM.atomically $ STM.writeTQueue (_wcSendQ wsConn) resp type ConnMap a = STMMap.Map WSId (WSConn a) @@ -362,7 +367,6 @@ createServerApp (WSServer logger@(L.Logger writeLog) serverStatus) wsHandlers !i _hOnClose wsHandlers wsConn logWSLog logger $ WSLog (_wcConnId wsConn) EClosed Nothing - shutdown :: WSServer a -> IO () shutdown (WSServer (L.Logger writeLog) serverStatus) = do writeLog $ L.debugT "Shutting websockets server down" diff --git a/server/src-lib/Hasura/GraphQL/Utils.hs b/server/src-lib/Hasura/GraphQL/Utils.hs index 10367786700..74efb4d02e7 100644 --- a/server/src-lib/Hasura/GraphQL/Utils.hs +++ b/server/src-lib/Hasura/GraphQL/Utils.hs @@ -1,23 +1,14 @@ module Hasura.GraphQL.Utils ( showName - , showNamedTy - , throwVE - , getBaseTy , groupTuples , groupListWith , mkMapWith , showNames - , unwrapTy , simpleGraphQLQuery - , jsonValueToGValue ) where import Hasura.Prelude -import Hasura.RQL.Types.Error -import Data.Scientific (floatingOrInteger) - -import qualified Data.Aeson as A import qualified Data.HashMap.Strict as Map import qualified Data.List.NonEmpty as NE import qualified Data.Text as T @@ -26,26 +17,6 @@ import qualified Language.GraphQL.Draft.Syntax as G showName :: G.Name -> Text showName name = "\"" <> G.unName name <> "\"" -throwVE :: (MonadError QErr m) => Text -> m a -throwVE = throw400 ValidationFailed - -showNamedTy :: G.NamedType -> Text -showNamedTy nt = - "'" <> G.showNT nt <> "'" - -getBaseTy :: G.GType -> G.NamedType -getBaseTy = \case - G.TypeNamed _ n -> n - G.TypeList _ lt -> getBaseTyL lt - where - getBaseTyL = getBaseTy . G.unListType - -unwrapTy :: G.GType -> G.GType -unwrapTy = - \case - G.TypeList _ lt -> G.unListType lt - nt -> nt - groupListWith :: (Eq k, Hashable k, Foldable t, Functor t) => (v -> k) -> t v -> Map.HashMap k (NE.NonEmpty v) @@ -81,15 +52,3 @@ showNames names = -- A simple graphql query to be used in generators simpleGraphQLQuery :: Text simpleGraphQLQuery = "query {author {id name}}" - --- | Convert a JSON value to a GraphQL value. -jsonValueToGValue :: A.Value -> G.Value -jsonValueToGValue = \case - A.String t -> G.VString $ G.StringValue t - -- TODO: Note the danger zone of scientific: - A.Number n -> either (\(_::Float) -> G.VFloat n) G.VInt (floatingOrInteger n) - A.Bool b -> G.VBoolean b - A.Object o -> G.VObject $ G.ObjectValueG $ - map (uncurry G.ObjectFieldG . 
(G.Name *** jsonValueToGValue)) $ Map.toList o - A.Array a -> G.VList $ G.ListValueG $ map jsonValueToGValue $ toList a - A.Null -> G.VNull diff --git a/server/src-lib/Hasura/GraphQL/Validate.hs b/server/src-lib/Hasura/GraphQL/Validate.hs deleted file mode 100644 index a79460d7d03..00000000000 --- a/server/src-lib/Hasura/GraphQL/Validate.hs +++ /dev/null @@ -1,331 +0,0 @@ -module Hasura.GraphQL.Validate - ( validateGQ - , showVars - , RootSelectionSet(..) - , SelectionSet(..) - , Field(..) - , getTypedOp - , QueryParts(..) - , getQueryParts - - , ReusableVariableTypes(..) - , ReusableVariableValues - , validateVariablesForReuse - - , isQueryInAllowlist - - , unValidateArgsMap - , unValidateSelectionSet - , unValidateField - ) where - -import Hasura.Prelude - -import Data.Has -import Data.Time - -import qualified Data.Aeson as A -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd as OMap -import qualified Data.HashSet as HS -import qualified Data.Text as T -import qualified Data.UUID as UUID -import qualified Database.PG.Query as Q -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.NormalForm -import Hasura.GraphQL.Resolve.InputValue (annInpValueToJson) -import Hasura.GraphQL.Schema -import Hasura.GraphQL.Transport.HTTP.Protocol -import Hasura.GraphQL.Utils -import Hasura.GraphQL.Validate.Context -import Hasura.GraphQL.Validate.InputValue -import Hasura.GraphQL.Validate.SelectionSet -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.DML.Select.Types -import Hasura.RQL.Types -import Hasura.SQL.Time -import Hasura.SQL.Value - -data QueryParts - = QueryParts - { qpOpDef :: !G.TypedOperationDefinition - , qpOpRoot :: !ObjTyInfo - , qpFragDefsL :: ![G.FragmentDefinition] - , qpVarValsM :: !(Maybe VariableValues) - } deriving (Show, Eq) - -getTypedOp - :: (MonadError QErr m) - => Maybe OperationName - -> [G.SelectionSet] - -> [G.TypedOperationDefinition] - -> m G.TypedOperationDefinition -getTypedOp opNameM selSets opDefs = - case (opNameM, selSets, opDefs) of - (Just opName, [], _) -> do - let n = _unOperationName opName - opDefM = find (\opDef -> G._todName opDef == Just n) opDefs - onNothing opDefM $ throwVE $ - "no such operation found in the document: " <> showName n - (Just _, _, _) -> - throwVE $ "operationName cannot be used when " <> - "an anonymous operation exists in the document" - (Nothing, [selSet], []) -> - return $ G.TypedOperationDefinition G.OperationTypeQuery Nothing [] [] selSet - (Nothing, [], [opDef]) -> - return opDef - (Nothing, _, _) -> - throwVE $ "exactly one operation has to be present " <> - "in the document when operationName is not specified" - --- | For all the variables defined there will be a value in the final map --- If no default, not in variables and nullable, then null value -validateVariables - :: (MonadReader r m, Has TypeMap r, MonadError QErr m) - => [G.VariableDefinition] -> VariableValues -> m AnnVarVals -validateVariables varDefsL inpVals = withPathK "variableValues" $ do - varDefs <- onLeft (mkMapWith G._vdVariable varDefsL) $ \dups -> - throwVE $ "the following variables are defined more than once: " <> - showVars dups - - let unexpectedVars = filter (not . 
(`Map.member` varDefs)) $ Map.keys inpVals - unless (null unexpectedVars) $ - throwVE $ "unexpected variables in variableValues: " <> - showVars unexpectedVars - - traverse validateVariable varDefs - where - validateVariable (G.VariableDefinition var ty defM) = do - let baseTy = getBaseTy ty - baseTyInfo <- getTyInfoVE baseTy - -- check that the variable is defined on input types - when (isObjTy baseTyInfo) $ throwVE $ - "variables can only be defined on input types" - <> "(enums, scalars, input objects), but " - <> showNamedTy baseTy <> " is an object type" - - let defM' = bool (defM <|> Just G.VCNull) defM $ G.isNotNull ty - annDefM <- withPathK "defaultValue" $ - mapM (validateInputValue constValueParser ty) defM' - let inpValM = Map.lookup var inpVals - annInpValM <- withPathK (G.unName $ G.unVariable var) $ - mapM (validateInputValue jsonParser ty) inpValM - let varValM = annInpValM <|> annDefM - onNothing varValM $ throwVE $ - "expecting a value for non-nullable variable: " <> - showVars [var] <> - " of type: " <> G.showGT ty <> - " in variableValues" - - -showVars :: (Functor f, Foldable f) => f G.Variable -> Text -showVars = showNames . fmap G.unVariable - --- | This is similar in spirit to 'validateVariables' but uses preexisting 'ReusableVariableTypes' --- information to parse Postgres values directly for use with a reusable query plan. (Ideally, it --- would be nice to be able to share more of the logic instead of duplicating it.) -validateVariablesForReuse - :: (MonadError QErr m) - => ReusableVariableTypes -> Maybe VariableValues -> m ReusableVariableValues -validateVariablesForReuse (ReusableVariableTypes varTypes) varValsM = - withPathK "variableValues" $ do - let unexpectedVars = filter (not . (`Map.member` varTypes)) $ Map.keys varVals - unless (null unexpectedVars) $ - throwVE $ "unexpected variables: " <> showVars unexpectedVars - - flip Map.traverseWithKey varTypes $ \varName varType -> - withPathK (G.unName $ G.unVariable varName) $ do - varVal <- onNothing (Map.lookup varName varVals) $ - throwVE "expected a value for non-nullable variable" - -- TODO: we don't have the graphql type - -- <> " of type: " <> T.pack (show varType) - parsePGScalarValue varType varVal - where - varVals = fromMaybe Map.empty varValsM - -validateFrag - :: (MonadError QErr m, MonadReader r m, Has TypeMap r) - => G.FragmentDefinition -> m FragDef -validateFrag (G.FragmentDefinition n onTy dirs selSet) = do - unless (null dirs) $ throwVE - "unexpected directives at fragment definition" - fragmentTypeInfo <- getFragmentTyInfo onTy - return $ FragDef n fragmentTypeInfo selSet - -validateGQ - :: (MonadError QErr m, MonadReader GCtx m, MonadReusability m) - => QueryParts -> m RootSelectionSet -validateGQ (QueryParts opDef opRoot fragDefsL varValsM) = do - ctx <- ask - - -- annotate the variables of this operation - annVarVals <- validateVariables (G._todVariableDefinitions opDef) $ fromMaybe Map.empty varValsM - - -- annotate the fragments - fragDefs <- onLeft (mkMapWith G._fdName fragDefsL) $ \dups -> - throwVE $ "the following fragments are defined more than once: " <> - showNames dups - annFragDefs <- mapM validateFrag fragDefs - - -- build a validation ctx - let valCtx = ValidationCtx (_gTypes ctx) annVarVals annFragDefs - - selSet <- flip runReaderT valCtx $ parseObjectSelectionSet valCtx opRoot $ - G._todSelectionSet opDef - - case G._todType opDef of - G.OperationTypeQuery -> return $ RQuery selSet - G.OperationTypeMutation -> return $ RMutation selSet - G.OperationTypeSubscription -> - case 
OMap.toList $ unAliasedFields $ unObjectSelectionSet selSet of - [] -> throw500 "empty selset for subscription" - (_:rst) -> do - -- As an internal testing feature, we support subscribing to multiple - -- selection sets. First check if the corresponding directive is set. - let multipleAllowed = G.Directive "_multiple_top_level_fields" [] `elem` G._todDirectives opDef - unless (multipleAllowed || null rst) $ - throwVE "subscriptions must select one top level field" - return $ RSubscription selSet - -isQueryInAllowlist :: GQLExecDoc -> HS.HashSet GQLQuery -> Bool -isQueryInAllowlist q = HS.member gqlQuery - where - gqlQuery = GQLQuery $ G.ExecutableDocument $ stripTypenames $ - unGQLExecDoc q - -getQueryParts - :: ( MonadError QErr m, MonadReader GCtx m) - => GQLReqParsed - -> m QueryParts -getQueryParts (GQLReq opNameM q varValsM) = do - -- get the operation that needs to be evaluated - opDef <- getTypedOp opNameM selSets opDefs - ctx <- ask - - -- get the operation root - opRoot <- case G._todType opDef of - G.OperationTypeQuery -> return $ _gQueryRoot ctx - G.OperationTypeMutation -> - onNothing (_gMutRoot ctx) $ throwVE "no mutations exist" - G.OperationTypeSubscription -> - onNothing (_gSubRoot ctx) $ throwVE "no subscriptions exist" - return $ QueryParts opDef opRoot fragDefsL varValsM - where - (selSets, opDefs, fragDefsL) = G.partitionExDefs $ unGQLExecDoc q - --- | Convert the validated arguments to GraphQL parser AST arguments -unValidateArgsMap :: ArgsMap -> [RemoteFieldArgument] -unValidateArgsMap argsMap = - map (\(n, inpVal) -> - let _rfaArgument = G.Argument n $ unValidateInpVal inpVal - _rfaVariable = unValidateInpVariable inpVal - in RemoteFieldArgument {..}) - . Map.toList $ argsMap - --- | Convert the validated field to GraphQL parser AST field -unValidateField :: G.Alias -> Field -> G.Field -unValidateField alias (Field name _ argsMap selSet) = - let args = map (\(n, inpVal) -> G.Argument n $ unValidateInpVal inpVal) $ - Map.toList argsMap - in G.Field (Just alias) name args [] $ unValidateSelectionSet selSet - --- | Convert the validated selection set to GraphQL parser AST selection set -unValidateSelectionSet :: SelectionSet -> G.SelectionSet -unValidateSelectionSet = \case - SelectionSetObject selectionSet -> fromSelectionSet selectionSet - SelectionSetInterface selectionSet -> fromScopedSelectionSet selectionSet - SelectionSetUnion selectionSet -> fromScopedSelectionSet selectionSet - SelectionSetNone -> mempty - where - fromAliasedFields :: (IsField f) => AliasedFields f -> G.SelectionSet - fromAliasedFields = - map (G.SelectionField . uncurry unValidateField) . - OMap.toList . fmap toField . unAliasedFields - fromSelectionSet = - fromAliasedFields . unObjectSelectionSet - toInlineSelection typeName = - G.SelectionInlineFragment . G.InlineFragment (Just typeName) mempty . 
- fromSelectionSet - fromScopedSelectionSet (ScopedSelectionSet base specific) = - map (uncurry toInlineSelection) (Map.toList specific) <> fromAliasedFields base - --- | Get the variable definition and it's value (if exists) -unValidateInpVariable :: AnnInpVal -> Maybe [(G.VariableDefinition,A.Value)] -unValidateInpVariable inputValue = - case (_aivValue inputValue) of - AGScalar _ _ -> mkVariableDefnValueTuple inputValue - AGEnum _ _ -> mkVariableDefnValueTuple inputValue - AGObject _ o -> - (\obj -> - let listObjects = OMap.toList obj - in concat $ - mapMaybe (\(_, inpVal) -> unValidateInpVariable inpVal) listObjects) - <$> o - AGArray _ _ -> mkVariableDefnValueTuple inputValue - where - mkVariableDefnValueTuple val = maybe Nothing (\vars -> Just [vars]) $ - variableDefnValueTuple val - - variableDefnValueTuple :: AnnInpVal -> Maybe (G.VariableDefinition,A.Value) - variableDefnValueTuple inpVal@AnnInpVal {..} = - let varDefn = G.VariableDefinition <$> _aivVariable <*> Just _aivType <*> Just Nothing - in (,) <$> varDefn <*> Just (annInpValueToJson inpVal) - --- | Convert the validated input value to GraphQL value, if the input value --- is a variable then it will be returned without resolving it, otherwise it --- will be resolved -unValidateInpVal :: AnnInpVal -> G.Value -unValidateInpVal (AnnInpVal _ var val) = fromMaybe G.VNull $ - -- if a variable is found, then directly return that, if not found then - -- convert it into a G.Value and return it - case var of - Just var' -> Just $ G.VVariable var' - Nothing -> - case val of - AGScalar _ v -> pgScalarToGValue <$> v - AGEnum _ v -> pgEnumToGEnum v - AGObject _ o -> - (G.VObject . G.ObjectValueG - . map (uncurry G.ObjectFieldG . (second unValidateInpVal)) - . OMap.toList - ) <$> o - AGArray _ vs -> (G.VList . G.ListValueG . map unValidateInpVal) <$> vs - - where - pgEnumToGEnum :: AnnGEnumValue -> Maybe G.Value - pgEnumToGEnum = \case - AGESynthetic v -> G.VEnum <$> v - AGEReference _ v -> (G.VEnum . G.EnumValue . G.Name . getEnumValue) <$> v - - pgScalarToGValue :: PGScalarValue -> G.Value - pgScalarToGValue = \case - PGValInteger i -> G.VInt $ fromIntegral i - PGValSmallInt i -> G.VInt $ fromIntegral i - PGValBigInt i -> G.VInt $ fromIntegral i - PGValFloat f -> G.VFloat $ realToFrac f - PGValDouble d -> G.VFloat $ realToFrac d - -- TODO: Scientific is a danger zone; use its safe conv function. - PGValNumeric sc -> G.VFloat $ realToFrac sc - PGValMoney m -> G.VFloat $ realToFrac m - PGValBoolean b -> G.VBoolean b - PGValChar t -> toStringValue $ T.singleton t - PGValVarchar t -> toStringValue t - PGValText t -> toStringValue t - PGValCitext t -> toStringValue t - PGValDate d -> toStringValue $ T.pack $ showGregorian d - PGValTimeStampTZ u -> toStringValue $ T.pack $ - formatTime defaultTimeLocale "%FT%T%QZ" u - PGValTimeStamp u -> toStringValue $ T.pack $ - formatTime defaultTimeLocale "%FT%T%QZ" u - PGValTimeTZ (ZonedTimeOfDay tod tz) -> - toStringValue $ T.pack (show tod ++ timeZoneOffsetString tz) - PGNull _ -> G.VNull - PGValJSON (Q.JSON v) -> jsonValueToGValue v - PGValJSONB (Q.JSONB v) -> jsonValueToGValue v - PGValGeo v -> jsonValueToGValue $ A.toJSON v - PGValRaster v -> jsonValueToGValue $ A.toJSON v - PGValUUID u -> toStringValue $ UUID.toText u - PGValUnknown t -> toStringValue t - where - toStringValue = G.VString . 
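
-- The "Scientific is a danger zone" TODOs above concern turning Postgres
-- numerics into GraphQL/JSON numbers: going through Float or Double via
-- realToFrac can silently lose precision. A standalone sketch of a safer
-- classification with the scientific package (illustrative names, not from
-- this patch):

import Data.Scientific (Scientific, floatingOrInteger, toBoundedInteger)

data JSONNumber
  = ExactInt Int       -- integral and within Int bounds: exact
  | Approximate Double -- genuinely fractional: Double rounding accepted
  | AsText String      -- too large to represent faithfully: render as text

classifyNumeric :: Scientific -> JSONNumber
classifyNumeric s = case toBoundedInteger s of
  Just i  -> ExactInt i
  Nothing -> case floatingOrInteger s :: Either Double Integer of
    Right i -> AsText (show i)  -- integral but outside Int bounds
    Left  d -> Approximate d
-- (Extreme exponents should additionally be bounds-checked before calling
-- floatingOrInteger; see the scientific package documentation.)
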
G.StringValue diff --git a/server/src-lib/Hasura/GraphQL/Validate/Context.hs b/server/src-lib/Hasura/GraphQL/Validate/Context.hs deleted file mode 100644 index a21d8e84d99..00000000000 --- a/server/src-lib/Hasura/GraphQL/Validate/Context.hs +++ /dev/null @@ -1,78 +0,0 @@ -module Hasura.GraphQL.Validate.Context - ( ValidationCtx(..) - , getFieldInfo - , getInpFieldInfo - , getTyInfo - , getTyInfoVE - , getFragmentTyInfo - , module Hasura.GraphQL.Utils - ) where - -import Hasura.Prelude - -import qualified Data.HashMap.Strict as Map -import qualified Language.GraphQL.Draft.Syntax as G - -import Data.Has -import Hasura.GraphQL.Utils -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.Types - -getFieldInfo - :: ( MonadError QErr m) - => G.NamedType -> ObjFieldMap -> G.Name -> m ObjFldInfo -getFieldInfo typeName fieldMap fldName = - onNothing (Map.lookup fldName fieldMap) $ throwVE $ - "field " <> showName fldName <> - " not found in type: " <> showNamedTy typeName - -getInpFieldInfo - :: ( MonadError QErr m) - => InpObjTyInfo -> G.Name -> m G.GType -getInpFieldInfo tyInfo fldName = - fmap _iviType $ onNothing (Map.lookup fldName $ _iotiFields tyInfo) $ - throwVE $ "field " <> showName fldName <> - " not found in type: " <> showNamedTy (_iotiName tyInfo) - -data ValidationCtx - = ValidationCtx - { _vcTypeMap :: !TypeMap - -- these are in the scope of the operation - , _vcVarVals :: !AnnVarVals - -- all the fragments - , _vcFragDefMap :: !FragDefMap - } deriving (Show, Eq) - -instance Has TypeMap ValidationCtx where - getter = _vcTypeMap - modifier f ctx = ctx { _vcTypeMap = f $ _vcTypeMap ctx } - -getTyInfo - :: ( MonadReader r m , Has TypeMap r - , MonadError QErr m) - => G.NamedType - -> m TypeInfo -getTyInfo namedTy = do - tyMap <- asks getter - onNothing (Map.lookup namedTy tyMap) $ - throw500 $ "type info not found for: " <> showNamedTy namedTy - -getTyInfoVE - :: ( MonadReader r m , Has TypeMap r - , MonadError QErr m) - => G.NamedType - -> m TypeInfo -getTyInfoVE namedTy = do - tyMap <- asks getter - onNothing (Map.lookup namedTy tyMap) $ - throwVE $ "no such type exists in the schema: " <> showNamedTy namedTy - -getFragmentTyInfo - :: (MonadReader r m, Has TypeMap r, MonadError QErr m) - => G.NamedType -> m FragmentTypeInfo -getFragmentTyInfo onType = - getTyInfoVE onType >>= \case - TIObj tyInfo -> pure $ FragmentTyObject tyInfo - TIIFace tyInfo -> pure $ FragmentTyInterface tyInfo - TIUnion tyInfo -> pure $ FragmentTyUnion tyInfo - _ -> throwVE "fragments can only be defined on object/interface/union types" diff --git a/server/src-lib/Hasura/GraphQL/Validate/InputValue.hs b/server/src-lib/Hasura/GraphQL/Validate/InputValue.hs deleted file mode 100644 index 87a09d7b6a3..00000000000 --- a/server/src-lib/Hasura/GraphQL/Validate/InputValue.hs +++ /dev/null @@ -1,347 +0,0 @@ -module Hasura.GraphQL.Validate.InputValue - ( validateInputValue - , jsonParser - , valueParser - , constValueParser - , pPrintValueC - ) where - -import Hasura.Prelude - -import Data.Has -import Data.List.Extended (duplicates) - -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd as OMap -import qualified Data.Text as T -import qualified Data.Vector as V -import qualified Language.GraphQL.Draft.Syntax as G - -import qualified Hasura.RQL.Types as RQL - -import Hasura.GraphQL.Utils -import Hasura.GraphQL.Validate.Context -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.Types -import Hasura.SQL.Value - -newtype P a = P { unP :: Maybe 
(Either (G.Variable, AnnInpVal) a)} - -pNull :: (Monad m) => m (P a) -pNull = return $ P Nothing - -pVal :: (Monad m) => a -> m (P a) -pVal = return . P . Just . Right - -resolveVar - :: ( MonadError QErr m - , MonadReader ValidationCtx m) - => G.Variable -> m AnnInpVal -resolveVar var = do - varVals <- _vcVarVals <$> ask - onNothing (Map.lookup var varVals) $ - throwVE $ "no such variable defined in the operation: " - <> showName (G.unVariable var) - -pVar - :: ( MonadError QErr m - , MonadReader ValidationCtx m) - => G.Variable -> m (P a) -pVar var = do - annInpVal <- resolveVar var - return . P . Just $ Left (var, annInpVal) - -data InputValueParser a m - = InputValueParser - { getScalar :: a -> m (P J.Value) - , getList :: a -> m (P [a]) - , getObject :: a -> m (P [(G.Name, a)]) - , getEnum :: a -> m (P G.EnumValue) - } - -jsonParser :: (MonadError QErr m) => InputValueParser J.Value m -jsonParser = - InputValueParser jScalar jList jObject jEnum - where - jEnum (J.String t) = pVal $ G.EnumValue $ G.Name t - jEnum J.Null = pNull - jEnum _ = throwVE "expecting a JSON string for Enum" - - jList (J.Array l) = pVal $ V.toList l - jList J.Null = pNull - jList v = pVal [v] - - jObject (J.Object m) = pVal [(G.Name t, v) | (t, v) <- Map.toList m] - jObject J.Null = pNull - jObject _ = throwVE "expecting a JSON object" - - jScalar J.Null = pNull - jScalar v = pVal v - -toJValue :: (MonadError QErr m) => G.Value -> m J.Value -toJValue = \case - G.VVariable _ -> - throwVE "variables are not allowed in scalars" - G.VInt i -> return $ J.toJSON i - G.VFloat f -> return $ J.toJSON f - G.VString (G.StringValue t) -> return $ J.toJSON t - G.VBoolean b -> return $ J.toJSON b - G.VNull -> return J.Null - G.VEnum (G.EnumValue n) -> return $ J.toJSON n - G.VList (G.ListValueG vals) -> - J.toJSON <$> mapM toJValue vals - G.VObject (G.ObjectValueG objs) -> - J.toJSON . 
Map.fromList <$> mapM toTup objs - where - toTup (G.ObjectFieldG f v) = (f,) <$> toJValue v - -valueParser - :: ( MonadError QErr m - , MonadReader ValidationCtx m) - => InputValueParser G.Value m -valueParser = - InputValueParser pScalar pList pObject pEnum - where - pEnum (G.VVariable var) = pVar var - pEnum (G.VEnum e) = pVal e - pEnum G.VNull = pNull - pEnum _ = throwVE "expecting an enum" - - pList (G.VVariable var) = pVar var - pList (G.VList lv) = pVal $ G.unListValue lv - pList G.VNull = pNull - pList v = pVal [v] - - pObject (G.VVariable var) = pVar var - pObject (G.VObject ov) = pVal - [(G._ofName oFld, G._ofValue oFld) | oFld <- G.unObjectValue ov] - pObject G.VNull = pNull - pObject _ = throwVE "expecting an object" - - -- scalar json - pScalar (G.VVariable var) = pVar var - pScalar G.VNull = pNull - pScalar (G.VInt v) = pVal $ J.Number $ fromIntegral v - pScalar (G.VFloat v) = pVal $ J.Number v - pScalar (G.VBoolean b) = pVal $ J.Bool b - pScalar (G.VString sv) = pVal $ J.String $ G.unStringValue sv - pScalar (G.VEnum _) = throwVE "unexpected enum for a scalar" - pScalar v = pVal =<< toJValue v - -pPrintValueC :: G.ValueConst -> Text -pPrintValueC = \case - G.VCInt i -> T.pack $ show i - G.VCFloat f -> T.pack $ show f - G.VCString (G.StringValue t) -> T.pack $ show t - G.VCBoolean b -> bool "false" "true" b - G.VCNull -> "null" - G.VCEnum (G.EnumValue n) -> G.unName n - G.VCList (G.ListValueG vals) -> withSquareBraces $ T.intercalate ", " $ map pPrintValueC vals - G.VCObject (G.ObjectValueG objs) -> withCurlyBraces $ T.intercalate ", " $ map ppObjFld objs - where - ppObjFld (G.ObjectFieldG f v) = G.unName f <> ": " <> pPrintValueC v - withSquareBraces t = "[" <> t <> "]" - withCurlyBraces t = "{" <> t <> "}" - - -toJValueC :: G.ValueConst -> J.Value -toJValueC = \case - G.VCInt i -> J.toJSON i - G.VCFloat f -> J.toJSON f - G.VCString (G.StringValue t) -> J.toJSON t - G.VCBoolean b -> J.toJSON b - G.VCNull -> J.Null - G.VCEnum (G.EnumValue n) -> J.toJSON n - G.VCList (G.ListValueG vals) -> - J.toJSON $ map toJValueC vals - G.VCObject (G.ObjectValueG objs) -> - J.toJSON . OMap.fromList $ map toTup objs - where - toTup (G.ObjectFieldG f v) = (f, toJValueC v) - -constValueParser :: (MonadError QErr m) => InputValueParser G.ValueConst m -constValueParser = - InputValueParser pScalar pList pObject pEnum - where - pEnum (G.VCEnum e) = pVal e - pEnum G.VCNull = pNull - pEnum _ = throwVE "expecting an enum" - - pList (G.VCList lv) = pVal $ G.unListValue lv - pList G.VCNull = pNull - pList v = pVal [v] - - pObject (G.VCObject ov) = pVal - [(G._ofName oFld, G._ofValue oFld) | oFld <- G.unObjectValue ov] - pObject G.VCNull = pNull - pObject _ = throwVE "expecting an object" - - -- scalar json - pScalar G.VCNull = pNull - pScalar (G.VCInt v) = pVal $ J.Number $ fromIntegral v - pScalar (G.VCFloat v) = pVal $ J.Number v - pScalar (G.VCBoolean b) = pVal $ J.Bool b - pScalar (G.VCString sv) = pVal $ J.String $ G.unStringValue sv - pScalar (G.VCEnum _) = throwVE "unexpected enum for a scalar" - pScalar v = pVal $ toJValueC v - -validateObject - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m - ) - => InputValueParser a m - -> InpObjTyInfo -> [(G.Name, a)] -> m AnnGObject -validateObject valParser tyInfo flds = do - - -- check duplicates - unless (null dups) $ - throwVE $ "when parsing a value of type: " <> showNamedTy (_iotiName tyInfo) - <> ", the following fields are duplicated: " - <> showNames dups - - -- make default values object - defValObj <- fmap (OMap.fromList . 
catMaybes) $ - forM (Map.toList $ _iotiFields tyInfo) $ - \(fldName, inpValInfo) -> do - let ty = _iviType inpValInfo - isNotNull = G.isNotNull ty - defValM = _iviDefVal inpValInfo - hasDefVal = isJust defValM - fldPresent = fldName `elem` inpFldNames - - when (not fldPresent && isNotNull && not hasDefVal) $ - throwVE $ "field " <> G.unName fldName <> " of type " - <> G.showGT ty <> " is required, but not found" - - convDefValM <- validateInputValue constValueParser ty `mapM` defValM - return $ (fldName,) <$> convDefValM - - -- compute input values object - inpValObj <- fmap OMap.fromList $ forM flds $ \(fldName, fldVal) -> - withPathK (G.unName fldName) $ do - fldTy <- getInpFieldInfo tyInfo fldName - convFldVal <- validateInputValue valParser fldTy fldVal - return (fldName, convFldVal) - - return $ inpValObj `OMap.union` defValObj - - where - inpFldNames = map fst flds - dups = duplicates inpFldNames - -validateNamedTypeVal - :: ( MonadReader r m, Has TypeMap r - , MonadError QErr m) - => InputValueParser a m - -> (G.Nullability, G.NamedType) -> a -> m AnnInpVal -validateNamedTypeVal inpValParser (nullability, nt) val = do - tyInfo <- getTyInfo nt - case tyInfo of - -- this should never happen - TIObj _ -> - throwUnexpTypeErr "object" - TIIFace _ -> - throwUnexpTypeErr "interface" - TIUnion _ -> - throwUnexpTypeErr "union" - TIInpObj ioti -> - withParsed gType (getObject inpValParser) val $ - fmap (AGObject nt) . mapM (validateObject inpValParser ioti) - TIEnum eti -> - withParsed gType (getEnum inpValParser) val $ - fmap (AGEnum nt) . validateEnum eti - TIScalar (ScalarTyInfo _ _ pgColTy _) -> - withParsed gType (getScalar inpValParser) val $ - fmap (AGScalar pgColTy) . mapM (validateScalar pgColTy) - where - throwUnexpTypeErr ty = throw500 $ "unexpected " <> ty <> " type info for: " - <> showNamedTy nt - - validateEnum enumTyInfo maybeEnumValue = case (_etiValues enumTyInfo, maybeEnumValue) of - (EnumValuesSynthetic _, Nothing) -> pure $ AGESynthetic Nothing - (EnumValuesReference reference, Nothing) -> pure $ AGEReference reference Nothing - (EnumValuesSynthetic values, Just enumValue) - | Map.member enumValue values -> pure $ AGESynthetic (Just enumValue) - (EnumValuesReference reference@(EnumReference _ values), Just enumValue) - | rqlEnumValue <- RQL.EnumValue . 
G.unName $ G.unEnumValue enumValue - , Map.member rqlEnumValue values - -> pure $ AGEReference reference (Just rqlEnumValue) - (_, Just enumValue) -> throwVE $ - "unexpected value " <> showName (G.unEnumValue enumValue) <> " for enum: " <> showNamedTy nt - - validateScalar pgColTy = runAesonParser (parsePGValue pgColTy) - gType = G.TypeNamed nullability nt - -validateList - :: (MonadError QErr m, MonadReader r m, Has TypeMap r) - => InputValueParser a m - -> (G.Nullability, G.ListType) - -> a - -> m AnnInpVal -validateList inpValParser (nullability, listTy) val = - withParsed ty (getList inpValParser) val $ \lM -> do - let baseTy = G.unListType listTy - AGArray listTy <$> - mapM (indexedMapM (validateInputValue inpValParser baseTy)) lM - where - ty = G.TypeList nullability listTy - -validateInputValue - :: (MonadError QErr m, MonadReader r m, Has TypeMap r) - => InputValueParser a m - -> G.GType - -> a - -> m AnnInpVal -validateInputValue inpValParser ty val = - case ty of - G.TypeNamed nullability nt -> - validateNamedTypeVal inpValParser (nullability, nt) val - G.TypeList nullability lt -> - validateList inpValParser (nullability, lt) val - -withParsed - :: (Monad m, MonadError QErr m) - => G.GType - -> (val -> m (P specificVal)) - -> val - -> (Maybe specificVal -> m AnnGValue) - -> m AnnInpVal -withParsed expectedTy valParser val fn = do - parsedVal <- valParser val - case unP parsedVal of - Nothing -> - if G.isNullable expectedTy - then AnnInpVal expectedTy Nothing <$> fn Nothing - else throwVE $ "null value found for non-nullable type: " - <> G.showGT expectedTy - Just (Right v) -> AnnInpVal expectedTy Nothing <$> fn (Just v) - Just (Left (var, v)) -> do - let varTxt = G.unName $ G.unVariable var - unless (isTypeAllowed expectedTy $ _aivType v) $ - throwVE $ "variable " <> varTxt - <> " of type " <> G.showGT (_aivType v) - <> " is used in position expecting " <> G.showGT expectedTy - return $ v { _aivVariable = Just var } - where - -- is the type 'ofType' allowed at a position of type 'atType' - -- Examples: - -- . a! is allowed at a - -- . [a!]! is allowed at [a] - -- . but 'a' is not allowed at 'a!' - isTypeAllowed ofType atType = - case (ofType, atType) of - (G.TypeNamed ofTyN ofNt, G.TypeNamed atTyN atNt) -> - checkNullability ofTyN atTyN && (ofNt == atNt) - (G.TypeList ofTyN ofLt, G.TypeList atTyN atLt) -> - checkNullability ofTyN atTyN && - isTypeAllowed (G.unListType ofLt) (G.unListType atLt) - _ -> False - - -- only when 'atType' is non nullable and 'ofType' is nullable, - -- this check fails - checkNullability (G.Nullability ofNullable) (G.Nullability atNullable) = - case (ofNullable, atNullable) of - (True, _) -> True - (False, False) -> True - (False, True) -> False diff --git a/server/src-lib/Hasura/GraphQL/Validate/SelectionSet.hs b/server/src-lib/Hasura/GraphQL/Validate/SelectionSet.hs deleted file mode 100644 index 64b3972cd7c..00000000000 --- a/server/src-lib/Hasura/GraphQL/Validate/SelectionSet.hs +++ /dev/null @@ -1,550 +0,0 @@ -{-# LANGUAGE RecordWildCards #-} -{-# LANGUAGE TypeFamilyDependencies #-} -module Hasura.GraphQL.Validate.SelectionSet - ( ArgsMap - , Field(..) - , AliasedFields(..) - , SelectionSet(..) - , ObjectSelectionSet(..) - , traverseObjectSelectionSet - , InterfaceSelectionSet - , UnionSelectionSet - , RootSelectionSet(..) 
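
-- The deleted 'isTypeAllowed' above encodes the spec's variable-usage rule:
-- a value of a non-nullable type may be used in a nullable position of the
-- same shape ("a!" is allowed at "a", "[a!]!" at "[a]"), but never the
-- reverse. A standalone model of that check (Bool = "is nullable"; sketch,
-- not from this patch):

data GType' = Named' Bool String | List' Bool GType'
  deriving (Eq, Show)

allowedAt :: GType' -> GType' -> Bool
allowedAt (Named' ofNull ofName) (Named' atNull atName) =
  nullabilityOK ofNull atNull && ofName == atName
allowedAt (List' ofNull ofElem) (List' atNull atElem) =
  nullabilityOK ofNull atNull && allowedAt ofElem atElem
allowedAt _ _ = False

-- fails only when a nullable value meets a non-nullable position
nullabilityOK :: Bool -> Bool -> Bool
nullabilityOK ofNullable atNullable = not ofNullable || atNullable
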
- , parseObjectSelectionSet - , asObjectSelectionSet - , asInterfaceSelectionSet - , getMemberSelectionSet - ) where - -import Hasura.Prelude - -import qualified Data.HashMap.Strict as Map -import qualified Data.HashMap.Strict.InsOrd.Extended as OMap -import qualified Data.HashSet as Set -import qualified Data.List as L -import qualified Data.Sequence.NonEmpty as NE -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G - -import Hasura.GraphQL.NormalForm -import Hasura.GraphQL.Validate.Context -import Hasura.GraphQL.Validate.InputValue -import Hasura.GraphQL.Validate.Types -import Hasura.RQL.Types -import Hasura.SQL.Value - -class HasSelectionSet a where - - getTypename :: a -> G.NamedType - getMemberTypes :: a -> Set.HashSet G.NamedType - - fieldToSelectionSet - :: G.Alias -> NormalizedField a -> NormalizedSelectionSet a - - parseField_ - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - , MonadState [G.Name] m - ) - => a - -> G.Field - -> m (Maybe (NormalizedField a)) - - mergeNormalizedSelectionSets - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - ) - => [NormalizedSelectionSet a] - -> m (NormalizedSelectionSet a) - - fromObjectSelectionSet - :: G.NamedType - -- ^ parent typename - -> G.NamedType - -- ^ fragment typename - -> Set.HashSet G.NamedType - -- ^ common types - -> NormalizedSelectionSet ObjTyInfo - -> NormalizedSelectionSet a - - fromInterfaceSelectionSet - :: G.NamedType - -- ^ parent typename - -> G.NamedType - -- ^ fragment typename - -> Set.HashSet G.NamedType - -> NormalizedSelectionSet IFaceTyInfo - -> NormalizedSelectionSet a - - fromUnionSelectionSet - :: G.NamedType - -- ^ parent typename - -> G.NamedType - -- ^ fragment typename - -> Set.HashSet G.NamedType - -- ^ common types - -> NormalizedSelectionSet UnionTyInfo - -> NormalizedSelectionSet a - -parseObjectSelectionSet - :: ( MonadError QErr m - , MonadReusability m - ) - => ValidationCtx - -> ObjTyInfo - -> G.SelectionSet - -> m ObjectSelectionSet -parseObjectSelectionSet validationCtx objectTypeInfo selectionSet = - flip evalStateT [] $ flip runReaderT validationCtx $ - parseSelectionSet objectTypeInfo selectionSet - -selectionToSelectionSet - :: HasSelectionSet a - => NormalizedSelection a -> NormalizedSelectionSet a -selectionToSelectionSet = \case - SelectionField alias fld -> fieldToSelectionSet alias fld - SelectionInlineFragmentSpread selectionSet -> selectionSet - SelectionFragmentSpread _ selectionSet -> selectionSet - -parseSelectionSet - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - , HasSelectionSet a - , MonadState [G.Name] m - ) - => a - -> G.SelectionSet - -> m (NormalizedSelectionSet a) -parseSelectionSet fieldTypeInfo selectionSet = do - visitedFragments <- get - withPathK "selectionSet" $ do - -- The visited fragments state shouldn't accumulate over a selection set. - normalizedSelections <- - catMaybes <$> mapM (parseSelection visitedFragments fieldTypeInfo) selectionSet - mergeNormalizedSelections normalizedSelections - where - mergeNormalizedSelections = mergeNormalizedSelectionSets . 
map selectionToSelectionSet - --- | While interfaces and objects have fields, unions do not, so --- this is a specialized function for every Object type -parseSelection - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - , HasSelectionSet a - ) - => [G.Name] - -> a -- parent type info - -> G.Selection - -> m (Maybe (NormalizedSelection a)) -parseSelection visitedFragments parentTypeInfo = - flip evalStateT visitedFragments . \case - G.SelectionField fld -> withPathK (G.unName $ G._fName fld) $ do - let fieldName = G._fName fld - fieldAlias = fromMaybe (G.Alias fieldName) $ G._fAlias fld - fmap (SelectionField fieldAlias) <$> parseField_ parentTypeInfo fld - G.SelectionFragmentSpread (G.FragmentSpread name directives) -> do - FragDef _ fragmentTyInfo fragmentSelectionSet <- getFragmentInfo name - withPathK (G.unName name) $ - fmap (SelectionFragmentSpread name) <$> - parseFragment parentTypeInfo fragmentTyInfo directives fragmentSelectionSet - G.SelectionInlineFragment G.InlineFragment{..} -> do - let fragmentType = fromMaybe (getTypename parentTypeInfo) _ifTypeCondition - fragmentTyInfo <- getFragmentTyInfo fragmentType - withPathK "inlineFragment" $ fmap SelectionInlineFragmentSpread <$> - parseFragment parentTypeInfo fragmentTyInfo _ifDirectives _ifSelectionSet - -parseFragment - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - , MonadState [G.Name] m - , HasSelectionSet a - ) - => a - -> FragmentTypeInfo - -> [G.Directive] - -> G.SelectionSet - -> m (Maybe (NormalizedSelectionSet a)) -parseFragment parentTyInfo fragmentTyInfo directives fragmentSelectionSet = do - commonTypes <- validateSpread - case fragmentTyInfo of - FragmentTyObject objTyInfo -> - withDirectives directives $ - fmap (fromObjectSelectionSet parentType fragmentType commonTypes) $ - parseSelectionSet objTyInfo fragmentSelectionSet - FragmentTyInterface interfaceTyInfo -> - withDirectives directives $ - fmap (fromInterfaceSelectionSet parentType fragmentType commonTypes) $ - parseSelectionSet interfaceTyInfo fragmentSelectionSet - FragmentTyUnion unionTyInfo -> - withDirectives directives $ - fmap (fromUnionSelectionSet parentType fragmentType commonTypes) $ - parseSelectionSet unionTyInfo fragmentSelectionSet - where - validateSpread = do - let commonTypes = parentTypeMembers `Set.intersection` fragmentTypeMembers - if null commonTypes then - -- TODO: better error location by capturing the fragment source - - -- named or otherwise - -- throwVE $ "cannot spread fragment " <> showName name <> " defined on " <> - throwVE $ "cannot spread fragment defined on " <> showNamedTy fragmentType - <> " when selecting fields of type " <> showNamedTy parentType - else pure commonTypes - - parentType = getTypename parentTyInfo - parentTypeMembers = getMemberTypes parentTyInfo - - fragmentType = case fragmentTyInfo of - FragmentTyObject tyInfo -> getTypename tyInfo - FragmentTyInterface tyInfo -> getTypename tyInfo - FragmentTyUnion tyInfo -> getTypename tyInfo - fragmentTypeMembers = case fragmentTyInfo of - FragmentTyObject tyInfo -> getMemberTypes tyInfo - FragmentTyInterface tyInfo -> getMemberTypes tyInfo - FragmentTyUnion tyInfo -> getMemberTypes tyInfo - -class IsField f => MergeableField f where - - checkFieldMergeability - :: (MonadError QErr m) => G.Alias -> NE.NESeq f -> m f - -instance MergeableField Field where - - checkFieldMergeability alias fields = do - let groupedFlds = toList $ NE.toSeq fields - fldNames = L.nub $ map getFieldName groupedFlds - args = L.nub 
$ map getFieldArguments groupedFlds - when (length fldNames > 1) $ - throwVE $ "cannot merge different fields under the same alias (" - <> showName (G.unAlias alias) <> "): " - <> showNames fldNames - when (length args > 1) $ - throwVE $ "cannot merge fields with different arguments" - <> " under the same alias: " - <> showName (G.unAlias alias) - let fld = NE.head fields - mergedGroupSelectionSet <- mergeSelectionSets $ fmap _fSelSet fields - return $ fld { _fSelSet = mergedGroupSelectionSet } - -instance MergeableField Typename where - - checkFieldMergeability _ fields = pure $ NE.head fields - -parseArguments - :: ( MonadReader ValidationCtx m - , MonadError QErr m - ) - => ParamMap - -> [G.Argument] - -> m ArgsMap -parseArguments fldParams argsL = do - - args <- onLeft (mkMapWith G._aName argsL) $ \dups -> - throwVE $ "the following arguments are defined more than once: " <> - showNames dups - - let requiredParams = Map.filter (G.isNotNull . _iviType) fldParams - - inpArgs <- forM args $ \(G.Argument argName argVal) -> - withPathK (G.unName argName) $ do - argTy <- getArgTy argName - validateInputValue valueParser argTy argVal - - forM_ requiredParams $ \argDef -> do - let param = _iviName argDef - onNothing (Map.lookup param inpArgs) $ throwVE $ mconcat - [ "the required argument ", showName param, " is missing"] - - return inpArgs - - where - getArgTy argName = - onNothing (_iviType <$> Map.lookup argName fldParams) $ throwVE $ - "no such argument " <> showName argName <> " is expected" - -mergeFields - :: ( MonadError QErr m - , MergeableField f - ) - -- => Seq.Seq Field - => [AliasedFields f] - -> m (AliasedFields f) -mergeFields flds = - AliasedFields <$> OMap.traverseWithKey checkFieldMergeability groups - where - groups = foldr (OMap.unionWith (<>)) mempty $ - map (fmap NE.init . unAliasedFields) flds - -appendSelectionSets - :: (MonadError QErr m) => SelectionSet -> SelectionSet -> m SelectionSet -appendSelectionSets = curry \case - (SelectionSetObject s1, SelectionSetObject s2) -> - SelectionSetObject <$> mergeObjectSelectionSets [s1, s2] - (SelectionSetInterface s1, SelectionSetInterface s2) -> - SelectionSetInterface <$> appendScopedSelectionSet s1 s2 - (SelectionSetUnion s1, SelectionSetUnion s2) -> - SelectionSetUnion <$> appendScopedSelectionSet s1 s2 - (SelectionSetNone, SelectionSetNone) -> pure SelectionSetNone - (_, _) -> throw500 $ "mergeSelectionSets: 'same kind' assertion failed" - - --- query q { --- author { --- id --- } --- author { --- name --- } --- } --- --- | When we are merging two selection sets down two different trees they --- should be of the same type, however, as it is not enforced in the type --- system, an internal error is thrown when this assumption is violated -mergeSelectionSets - :: (MonadError QErr m) => NE.NESeq SelectionSet -> m SelectionSet --- mergeSelectionSets = curry $ \case -mergeSelectionSets selectionSets = - foldM appendSelectionSets (NE.head selectionSets) $ NE.tail selectionSets - -mergeObjectSelectionSets - :: (MonadError QErr m) => [ObjectSelectionSet] -> m ObjectSelectionSet -mergeObjectSelectionSets = - fmap ObjectSelectionSet . mergeFields . 
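
-- 'checkFieldMergeability' above enforces the spec's field-merging rule:
-- selections sharing a response alias may only merge when they agree on the
-- underlying field name and on their arguments. A standalone sketch
-- (illustrative types, not from this patch):

import qualified Data.HashMap.Strict as Map
import Data.List (nub)

data Fld = Fld
  { fldAlias :: String
  , fldName  :: String
  , fldArgs  :: [(String, String)]
  } deriving (Eq, Show)

mergeByAlias :: [Fld] -> Either String (Map.HashMap String [Fld])
mergeByAlias fields = traverse check grouped
  where
    grouped = Map.fromListWith (++) [ (fldAlias f, [f]) | f <- fields ]
    check fs
      | length (nub (map fldName fs)) > 1 =
          Left "cannot merge different fields under the same alias"
      | length (nub (map fldArgs fs)) > 1 =
          Left "cannot merge fields with different arguments under the same alias"
      | otherwise = Right fs
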
map unObjectSelectionSet - -mergeObjectSelectionSetMaps - :: (MonadError QErr m) => [ObjectSelectionSetMap] -> m ObjectSelectionSetMap -mergeObjectSelectionSetMaps selectionSetMaps = - traverse mergeObjectSelectionSets $ - foldr (Map.unionWith (<>)) mempty $ map (fmap (:[])) selectionSetMaps - -appendScopedSelectionSet - :: (MonadError QErr m, MergeableField f) - => ScopedSelectionSet f -> ScopedSelectionSet f -> m (ScopedSelectionSet f) -appendScopedSelectionSet s1 s2 = - ScopedSelectionSet - <$> mergeFields [_sssBaseSelectionSet s1, _sssBaseSelectionSet s2] - <*> mergeObjectSelectionSetMaps [s1MembersUnified, s2MembersUnified] - - where - s1Base = fmap toField $ _sssBaseSelectionSet s1 - s2Base = fmap toField $ _sssBaseSelectionSet s2 - - s1MembersUnified = - (_sssMemberSelectionSets s1) - <> fmap (const (ObjectSelectionSet s1Base)) (_sssMemberSelectionSets s2) - - s2MembersUnified = - (_sssMemberSelectionSets s2) - <> fmap (const (ObjectSelectionSet s2Base)) (_sssMemberSelectionSets s1) - -mergeScopedSelectionSets - :: (MonadError QErr m, MergeableField f) - => [ScopedSelectionSet f] -> m (ScopedSelectionSet f) -mergeScopedSelectionSets selectionSets = - foldM appendScopedSelectionSet emptyScopedSelectionSet selectionSets - -withDirectives - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - ) - => [G.Directive] - -> m a - -> m (Maybe a) -withDirectives dirs act = do - procDirs <- withPathK "directives" $ do - dirDefs <- onLeft (mkMapWith G._dName dirs) $ \dups -> - throwVE $ "the following directives are used more than once: " <> - showNames dups - - flip Map.traverseWithKey dirDefs $ \name dir -> - withPathK (G.unName name) $ do - dirInfo <- onNothing (Map.lookup (G._dName dir) defDirectivesMap) $ - throwVE $ "unexpected directive: " <> showName name - procArgs <- withPathK "args" $ parseArguments (_diParams dirInfo) - (G._dArguments dir) - getIfArg procArgs - - let shouldSkip = fromMaybe False $ Map.lookup "skip" procDirs - shouldInclude = fromMaybe True $ Map.lookup "include" procDirs - - if not shouldSkip && shouldInclude - then Just <$> act - else return Nothing - - where - getIfArg m = do - val <- onNothing (Map.lookup "if" m) $ throw500 - "missing if argument in the directive" - when (isJust $ _aivVariable val) markNotReusable - case _aivValue val of - AGScalar _ (Just (PGValBoolean v)) -> return v - _ -> throw500 "did not find boolean scalar for if argument" - -getFragmentInfo - :: (MonadReader ValidationCtx m, MonadError QErr m, MonadState [G.Name] m) - => G.Name - -- ^ fragment name - -> m FragDef -getFragmentInfo name = do - -- check for cycles - visitedFragments <- get - if name `elem` visitedFragments - then throwVE $ "cannot spread fragment " <> showName name - <> " within itself via " - <> T.intercalate "," (map G.unName visitedFragments) - else put $ name:visitedFragments - fragInfo <- Map.lookup name <$> asks _vcFragDefMap - onNothing fragInfo $ throwVE $ "fragment '" <> G.unName name <> "' not found" - -denormalizeField - :: ( MonadReader ValidationCtx m - , MonadError QErr m - , MonadReusability m - , MonadState [G.Name] m - ) - => ObjFldInfo - -> G.Field - -> m (Maybe Field) -denormalizeField fldInfo (G.Field _ name args dirs selSet) = do - - let fldTy = _fiTy fldInfo - fldBaseTy = getBaseTy fldTy - - fldTyInfo <- getTyInfo fldBaseTy - - argMap <- withPathK "args" $ parseArguments (_fiParams fldInfo) args - - fields <- case (fldTyInfo, selSet) of - - (TIObj _, []) -> - throwVE $ "field " <> showName name <> " of type " - <> G.showGT 
fldTy <> " must have a selection of subfields" - - (TIObj objTyInfo, _) -> - SelectionSetObject <$> parseSelectionSet objTyInfo selSet - - (TIIFace _, []) -> - throwVE $ "field " <> showName name <> " of type " - <> G.showGT fldTy <> " must have a selection of subfields" - - (TIIFace interfaceTyInfo, _) -> - SelectionSetInterface <$> parseSelectionSet interfaceTyInfo selSet - - (TIUnion _, []) -> - throwVE $ "field " <> showName name <> " of type " - <> G.showGT fldTy <> " must have a selection of subfields" - - (TIUnion unionTyInfo, _) -> - SelectionSetUnion <$> parseSelectionSet unionTyInfo selSet - - (TIScalar _, []) -> return SelectionSetNone - -- when scalar/enum and no empty set - (TIScalar _, _) -> - throwVE $ "field " <> showName name <> " must not have a " - <> "selection since type " <> G.showGT fldTy <> " has no subfields" - - (TIEnum _, []) -> return SelectionSetNone - (TIEnum _, _) -> - throwVE $ "field " <> showName name <> " must not have a " - <> "selection since type " <> G.showGT fldTy <> " has no subfields" - - (TIInpObj _, _) -> - throwVE $ "internal error: unexpected input type for field: " - <> showName name - - withDirectives dirs $ pure $ Field name fldBaseTy argMap fields - -type instance NormalizedSelectionSet ObjTyInfo = ObjectSelectionSet -type instance NormalizedField ObjTyInfo = Field - -instance HasSelectionSet ObjTyInfo where - - getTypename = _otiName - getMemberTypes = Set.singleton . _otiName - - parseField_ objTyInfo field = do - fieldInfo <- getFieldInfo (_otiName objTyInfo) (_otiFields objTyInfo) $ G._fName field - denormalizeField fieldInfo field - - fieldToSelectionSet alias fld = - ObjectSelectionSet $ AliasedFields $ OMap.singleton alias fld - - mergeNormalizedSelectionSets = mergeObjectSelectionSets - - fromObjectSelectionSet _ _ _ objectSelectionSet = - objectSelectionSet - - fromInterfaceSelectionSet parentType _ _ interfaceSelectionSet = - getMemberSelectionSet parentType interfaceSelectionSet - - fromUnionSelectionSet parentType _ _ unionSelectionSet = - getMemberSelectionSet parentType unionSelectionSet - -type instance NormalizedSelectionSet IFaceTyInfo = InterfaceSelectionSet -type instance NormalizedField IFaceTyInfo = Field - -instance HasSelectionSet IFaceTyInfo where - - getTypename = _ifName - getMemberTypes = _ifMemberTypes - - parseField_ interfaceTyInfo field = do - fieldInfo <- getFieldInfo (_ifName interfaceTyInfo) (_ifFields interfaceTyInfo) - $ G._fName field - denormalizeField fieldInfo field - - fieldToSelectionSet alias field = - ScopedSelectionSet (AliasedFields $ OMap.singleton alias field) mempty - - mergeNormalizedSelectionSets = mergeScopedSelectionSets - - fromObjectSelectionSet _ fragmentType _ objectSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.singleton fragmentType objectSelectionSet - - fromInterfaceSelectionSet _ _ commonTypes interfaceSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.fromList $ flip map (toList commonTypes) $ - \commonType -> (commonType, getMemberSelectionSet commonType interfaceSelectionSet) - - fromUnionSelectionSet _ _ commonTypes unionSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.fromList $ flip map (toList commonTypes) $ - \commonType -> (commonType, getMemberSelectionSet commonType unionSelectionSet) - -type instance NormalizedSelectionSet UnionTyInfo = UnionSelectionSet -type instance NormalizedField UnionTyInfo = Typename - -instance HasSelectionSet UnionTyInfo where - - getTypename = _utiName - getMemberTypes = _utiMemberTypes - 
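-- The case analysis in denormalizeField above enforces GraphQL's
-- leaf/composite selection rule: object, interface, and union fields must
-- carry a subselection, while scalar and enum fields must not. A hedged
-- sketch, assuming a simplified TypeKind in place of TypeInfo:
--
-- > data TypeKind = Object | Interface | Union | Scalar | Enum
-- >
-- > checkSelection :: TypeKind -> Bool -> Either String ()
-- > checkSelection kind hasSubselection = case (kind, hasSubselection) of
-- >   (Scalar,    True)  -> Left "must not have a selection of subfields"
-- >   (Enum,      True)  -> Left "must not have a selection of subfields"
-- >   (Object,    False) -> Left "must have a selection of subfields"
-- >   (Interface, False) -> Left "must have a selection of subfields"
-- >   (Union,     False) -> Left "must have a selection of subfields"
-- >   _                  -> Right ()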
- parseField_ unionTyInfo field = do - let fieldMap = Map.singleton (_fiName typenameFld) typenameFld - fieldInfo <- getFieldInfo (_utiName unionTyInfo) fieldMap $ G._fName field - fmap (const Typename) <$> denormalizeField fieldInfo field - - fieldToSelectionSet alias field = - ScopedSelectionSet (AliasedFields $ OMap.singleton alias field) mempty - - mergeNormalizedSelectionSets = mergeScopedSelectionSets - - fromObjectSelectionSet _ fragmentType _ objectSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.singleton fragmentType objectSelectionSet - - fromInterfaceSelectionSet _ _ commonTypes interfaceSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.fromList $ flip map (toList commonTypes) $ - \commonType -> (commonType, getMemberSelectionSet commonType interfaceSelectionSet) - - fromUnionSelectionSet _ _ commonTypes unionSelectionSet = - ScopedSelectionSet (AliasedFields mempty) $ - Map.fromList $ flip map (toList commonTypes) $ - \commonType -> (commonType, getMemberSelectionSet commonType unionSelectionSet) diff --git a/server/src-lib/Hasura/GraphQL/Validate/Types.hs b/server/src-lib/Hasura/GraphQL/Validate/Types.hs deleted file mode 100644 index ef42eb42ca1..00000000000 --- a/server/src-lib/Hasura/GraphQL/Validate/Types.hs +++ /dev/null @@ -1,819 +0,0 @@ -{-# LANGUAGE GADTs #-} -module Hasura.GraphQL.Validate.Types - ( InpValInfo(..) - , ParamMap - - , typenameFld - , ObjFldInfo(..) - , mkHsraObjFldInfo - , ObjFieldMap - - -- Don't expose 'ObjTyInfo' constructor. Instead use 'mkObjTyInfo' or 'mkHsraObjTyInfo' - -- which will auto-insert the compulsory '__typename' field. - , ObjTyInfo - , _otiDesc - , _otiName - , _otiImplIFaces - , _otiFields - , mkObjTyInfo - , mkHsraObjTyInfo - - -- Don't expose 'IFaceTyInfo' constructor. Instead use 'mkIFaceTyInfo' - -- which will auto-insert the compulsory '__typename' field. - , IFaceTyInfo - , _ifDesc - , _ifName - , _ifFields - , _ifMemberTypes - , mkIFaceTyInfo - - , IFacesSet - , UnionTyInfo(..) - , FragDef(..) - , FragmentTypeInfo(..) - , FragDefMap - , AnnVarVals - , AnnInpVal(..) - - , EnumTyInfo(..) - , mkHsraEnumTyInfo - - , EnumValuesInfo(..) - , normalizeEnumValues - , EnumValInfo(..) - , InpObjFldMap - , InpObjTyInfo(..) - , mkHsraInpTyInfo - - , ScalarTyInfo(..) - , fromScalarTyDef - , mkHsraScalarTyInfo - - , DirectiveInfo(..) - , AsObjType(..) - , defaultDirectives - , defDirectivesMap - , defaultSchema - , TypeInfo(..) - , isObjTy - , isIFaceTy - , getPossibleObjTypes - , getObjTyM - , getUnionTyM - , mkScalarTy - , pgColTyToScalar - , getNamedTy - , mkTyInfoMap - , fromTyDef - , fromSchemaDoc - , fromSchemaDocQ - , TypeMap - , TypeLoc (..) - , typeEq - , AnnGValue(..) - , AnnGEnumValue(..) - , AnnGObject - , hasNullVal - , getAnnInpValKind - , stripTypenames - - , ReusableVariableTypes(..) - , ReusableVariableValues - - , QueryReusability(..) - , _Reusable - , _NotReusable - , MonadReusability(..) 
- , ReusabilityT - , runReusabilityT - , runReusabilityTWith - , evalReusabilityT - - , module Hasura.GraphQL.Utils - ) where - -import Hasura.Prelude - -import qualified Data.Aeson as J -import qualified Data.Aeson.Casing as J -import qualified Data.Aeson.TH as J -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G -import qualified Language.GraphQL.Draft.TH as G -import qualified Language.Haskell.TH.Syntax as TH - -import Control.Lens (makePrisms) - -import qualified Hasura.RQL.Types.Column as RQL -import qualified Hasura.Tracing as Tracing - -import Hasura.GraphQL.NormalForm -import Hasura.GraphQL.Utils -import Hasura.RQL.Instances () -import Hasura.RQL.Types.Common -import Hasura.RQL.Types.RemoteSchema (RemoteSchemaInfo, RemoteSchemaName) -import Hasura.SQL.Types -import Hasura.SQL.Value - -typeEq :: (EquatableGType a, Eq (EqProps a)) => a -> a -> Bool -typeEq a b = getEqProps a == getEqProps b - -data EnumValInfo - = EnumValInfo - { _eviDesc :: !(Maybe G.Description) - , _eviVal :: !G.EnumValue - , _eviIsDeprecated :: !Bool - } deriving (Show, Eq, TH.Lift) - -fromEnumValDef :: G.EnumValueDefinition -> EnumValInfo -fromEnumValDef (G.EnumValueDefinition descM val _) = - EnumValInfo descM val False - -data EnumValuesInfo - = EnumValuesSynthetic !(Map.HashMap G.EnumValue EnumValInfo) - -- ^ Values for an enum that exists only in the GraphQL schema and does not - -- have any external source of truth. - | EnumValuesReference !RQL.EnumReference - -- ^ Values for an enum that is backed by an enum table reference (see - -- "Hasura.RQL.Schema.Enum"). - deriving (Show, Eq, TH.Lift) - -normalizeEnumValues :: EnumValuesInfo -> Map.HashMap G.EnumValue EnumValInfo -normalizeEnumValues = \case - EnumValuesSynthetic values -> values - EnumValuesReference (RQL.EnumReference _ values) -> - mapFromL _eviVal . flip map (Map.toList values) $ - \(RQL.EnumValue name, RQL.EnumValueInfo maybeDescription) -> EnumValInfo - { _eviVal = G.EnumValue $ G.Name name - , _eviDesc = G.Description <$> maybeDescription - , _eviIsDeprecated = False } - -data EnumTyInfo - = EnumTyInfo - { _etiDesc :: !(Maybe G.Description) - , _etiName :: !G.NamedType - , _etiValues :: !EnumValuesInfo - , _etiLoc :: !TypeLoc - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType EnumTyInfo where - type EqProps EnumTyInfo = (G.NamedType, Map.HashMap G.EnumValue EnumValInfo) - getEqProps ety = (,) (_etiName ety) (normalizeEnumValues $ _etiValues ety) - -fromEnumTyDef :: G.EnumTypeDefinition -> TypeLoc -> EnumTyInfo -fromEnumTyDef (G.EnumTypeDefinition descM n _ valDefs) loc = - EnumTyInfo descM (G.NamedType n) (EnumValuesSynthetic enumVals) loc - where - enumVals = Map.fromList - [(G._evdName valDef, fromEnumValDef valDef) | valDef <- valDefs] - -mkHsraEnumTyInfo - :: Maybe G.Description - -> G.NamedType - -> EnumValuesInfo - -> EnumTyInfo -mkHsraEnumTyInfo descM ty enumVals = - EnumTyInfo descM ty enumVals TLHasuraType - -fromInpValDef :: G.InputValueDefinition -> InpValInfo -fromInpValDef (G.InputValueDefinition descM n ty defM) = - InpValInfo descM n defM ty - -type ParamMap = Map.HashMap G.Name InpValInfo - --- | location of the type: a hasura type or a remote type -data TypeLoc - = TLHasuraType - | TLRemoteType !RemoteSchemaName !RemoteSchemaInfo - | TLCustom - deriving (Show, Eq, TH.Lift, Generic) - -$(J.deriveJSON - J.defaultOptions { J.constructorTagModifier = J.snakeCase . 
drop 2 - , J.sumEncoding = J.TaggedObject "type" "detail" - } - ''TypeLoc) - -instance Hashable TypeLoc - -data ObjFldInfo - = ObjFldInfo - { _fiDesc :: !(Maybe G.Description) - , _fiName :: !G.Name - , _fiParams :: !ParamMap - , _fiTy :: !G.GType - , _fiLoc :: !TypeLoc - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType ObjFldInfo where - type EqProps ObjFldInfo = (G.Name, G.GType, ParamMap) - getEqProps o = (,,) (_fiName o) (_fiTy o) (_fiParams o) - -fromFldDef :: G.FieldDefinition -> TypeLoc -> ObjFldInfo -fromFldDef (G.FieldDefinition descM n args ty _) loc = - ObjFldInfo descM n params ty loc - where - params = Map.fromList [(G._ivdName arg, fromInpValDef arg) | arg <- args] - -mkHsraObjFldInfo - :: Maybe G.Description - -> G.Name - -> ParamMap - -> G.GType - -> ObjFldInfo -mkHsraObjFldInfo descM name params ty = - ObjFldInfo descM name params ty TLHasuraType - -type ObjFieldMap = Map.HashMap G.Name ObjFldInfo - -type IFacesSet = Set.HashSet G.NamedType - -data ObjTyInfo - = ObjTyInfo - { _otiDesc :: !(Maybe G.Description) - , _otiName :: !G.NamedType - , _otiImplIFaces :: !IFacesSet - , _otiFields :: !ObjFieldMap - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType ObjTyInfo where - type EqProps ObjTyInfo = - (G.NamedType, Set.HashSet G.NamedType, Map.HashMap G.Name (G.Name, G.GType, ParamMap)) - getEqProps a = (,,) (_otiName a) (_otiImplIFaces a) (Map.map getEqProps (_otiFields a)) - -instance Monoid ObjTyInfo where - mempty = ObjTyInfo Nothing (G.NamedType "") Set.empty Map.empty - -instance Semigroup ObjTyInfo where - objA <> objB = - objA { _otiFields = Map.union (_otiFields objA) (_otiFields objB) - , _otiImplIFaces = _otiImplIFaces objA `Set.union` _otiImplIFaces objB - } - -mkObjTyInfo - :: Maybe G.Description -> G.NamedType - -> IFacesSet -> ObjFieldMap -> TypeLoc -> ObjTyInfo -mkObjTyInfo descM ty iFaces flds _ = - ObjTyInfo descM ty iFaces $ Map.insert (_fiName newFld) newFld flds - where newFld = typenameFld - -mkHsraObjTyInfo - :: Maybe G.Description - -> G.NamedType - -> IFacesSet - -> ObjFieldMap - -> ObjTyInfo -mkHsraObjTyInfo descM ty implIFaces flds = - mkObjTyInfo descM ty implIFaces flds TLHasuraType - -mkIFaceTyInfo - :: Maybe G.Description -> G.NamedType - -> Map.HashMap G.Name ObjFldInfo -> MemberTypes -> IFaceTyInfo -mkIFaceTyInfo descM ty flds = - IFaceTyInfo descM ty $ Map.insert (_fiName newFld) newFld flds - where - newFld = typenameFld - -typenameFld :: ObjFldInfo -typenameFld = - ObjFldInfo (Just desc) "__typename" Map.empty - (G.toGT $ G.toNT $ G.NamedType "String") TLHasuraType - where - desc = "The name of the current Object type at runtime" - -fromObjTyDef :: G.ObjectTypeDefinition -> TypeLoc -> ObjTyInfo -fromObjTyDef (G.ObjectTypeDefinition descM n ifaces _ flds) loc = - mkObjTyInfo descM (G.NamedType n) (Set.fromList ifaces) fldMap loc - where - fldMap = Map.fromList [(G._fldName fld, fromFldDef fld loc) | fld <- flds] - -data IFaceTyInfo - = IFaceTyInfo - { _ifDesc :: !(Maybe G.Description) - , _ifName :: !G.NamedType - , _ifFields :: !ObjFieldMap - , _ifMemberTypes :: !MemberTypes - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType IFaceTyInfo where - type EqProps IFaceTyInfo = - (G.NamedType, Map.HashMap G.Name (G.Name, G.GType, ParamMap)) - getEqProps a = (,) (_ifName a) (Map.map getEqProps (_ifFields a)) - -instance Semigroup IFaceTyInfo where - objA <> objB = - objA { _ifFields = Map.union (_ifFields objA) (_ifFields objB) - } - -fromIFaceDef - :: InterfaceImplementations -> G.InterfaceTypeDefinition -> TypeLoc -> 
IFaceTyInfo -fromIFaceDef interfaceImplementations (G.InterfaceTypeDefinition descM n _ flds) loc = - mkIFaceTyInfo descM (G.NamedType n) fldMap implementations - where - fldMap = Map.fromList [(G._fldName fld, fromFldDef fld loc) | fld <- flds] - implementations = fromMaybe mempty $ Map.lookup (G.NamedType n) interfaceImplementations - -type MemberTypes = Set.HashSet G.NamedType - -data UnionTyInfo - = UnionTyInfo - { _utiDesc :: !(Maybe G.Description) - , _utiName :: !G.NamedType - , _utiMemberTypes :: !MemberTypes - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType UnionTyInfo where - type EqProps UnionTyInfo = - (G.NamedType, Set.HashSet G.NamedType) - getEqProps a = (,) (_utiName a) (_utiMemberTypes a) - -instance Monoid UnionTyInfo where - mempty = UnionTyInfo Nothing (G.NamedType "") Set.empty - -instance Semigroup UnionTyInfo where - objA <> objB = - objA { _utiMemberTypes = Set.union (_utiMemberTypes objA) (_utiMemberTypes objB) - } - -fromUnionTyDef :: G.UnionTypeDefinition -> UnionTyInfo -fromUnionTyDef (G.UnionTypeDefinition descM n _ mt) = UnionTyInfo descM (G.NamedType n) $ Set.fromList mt - -type InpObjFldMap = Map.HashMap G.Name InpValInfo - -data InpObjTyInfo - = InpObjTyInfo - { _iotiDesc :: !(Maybe G.Description) - , _iotiName :: !G.NamedType - , _iotiFields :: !InpObjFldMap - , _iotiLoc :: !TypeLoc - } deriving (Show, Eq, TH.Lift) - -instance EquatableGType InpObjTyInfo where - type EqProps InpObjTyInfo = (G.NamedType, Map.HashMap G.Name (G.Name, G.GType)) - getEqProps a = (,) (_iotiName a) (Map.map getEqProps $ _iotiFields a) - -fromInpObjTyDef :: G.InputObjectTypeDefinition -> TypeLoc -> InpObjTyInfo -fromInpObjTyDef (G.InputObjectTypeDefinition descM n _ inpFlds) loc = - InpObjTyInfo descM (G.NamedType n) fldMap loc - where - fldMap = Map.fromList - [(G._ivdName inpFld, fromInpValDef inpFld) | inpFld <- inpFlds] - -mkHsraInpTyInfo - :: Maybe G.Description - -> G.NamedType - -> InpObjFldMap - -> InpObjTyInfo -mkHsraInpTyInfo descM ty flds = - InpObjTyInfo descM ty flds TLHasuraType - -data ScalarTyInfo - = ScalarTyInfo - { _stiDesc :: !(Maybe G.Description) - , _stiName :: !G.Name - , _stiType :: !PGScalarType - , _stiLoc :: !TypeLoc - } deriving (Show, Eq, TH.Lift) - -mkHsraScalarTyInfo :: PGScalarType -> ScalarTyInfo -mkHsraScalarTyInfo ty = - ScalarTyInfo Nothing (G.Name $ pgColTyToScalar ty) ty TLHasuraType - -instance EquatableGType ScalarTyInfo where - type EqProps ScalarTyInfo = PGScalarType - getEqProps = _stiType - -fromScalarTyDef - :: G.ScalarTypeDefinition - -> TypeLoc - -> ScalarTyInfo -fromScalarTyDef (G.ScalarTypeDefinition descM n _) = - ScalarTyInfo descM n ty - where - ty = case n of - "Int" -> PGInteger - "Float" -> PGFloat - "String" -> PGText - "Boolean" -> PGBoolean - "ID" -> PGText - _ -> textToPGScalarType $ G.unName n - -data TypeInfo - = TIScalar !ScalarTyInfo - | TIObj !ObjTyInfo - | TIEnum !EnumTyInfo - | TIInpObj !InpObjTyInfo - | TIIFace !IFaceTyInfo - | TIUnion !UnionTyInfo - deriving (Show, Eq, TH.Lift) - -instance J.ToJSON TypeInfo where - toJSON _ = J.String "toJSON not implemented for TypeInfo" - -data AsObjType - = AOTIFace IFaceTyInfo - | AOTUnion UnionTyInfo - -getPossibleObjTypes :: TypeMap -> AsObjType -> Map.HashMap G.NamedType ObjTyInfo -getPossibleObjTypes tyMap = \case - (AOTIFace i) -> - toObjMap $ mapMaybe (extrObjTyInfoM tyMap) $ Set.toList $ _ifMemberTypes i - (AOTUnion u) -> - toObjMap $ mapMaybe (extrObjTyInfoM tyMap) $ Set.toList $ _utiMemberTypes u - -- toObjMap $ mapMaybe previewImplTypeM $ Map.elems tyMap - 
-- where - -- previewImplTypeM = \case - -- TIObj objTyInfo -> bool Nothing (Just objTyInfo) $ - -- _ifName i `elem` _otiImplIFaces objTyInfo - -- _ -> Nothing - - -toObjMap :: [ObjTyInfo] -> Map.HashMap G.NamedType ObjTyInfo -toObjMap = foldr (\o -> Map.insert (_otiName o) o) Map.empty - - -isObjTy :: TypeInfo -> Bool -isObjTy = \case - (TIObj _) -> True - _ -> False - -getObjTyM :: TypeInfo -> Maybe ObjTyInfo -getObjTyM = \case - (TIObj t) -> return t - _ -> Nothing - -getUnionTyM :: TypeInfo -> Maybe UnionTyInfo -getUnionTyM = \case - (TIUnion u) -> return u - _ -> Nothing - -isIFaceTy :: TypeInfo -> Bool -isIFaceTy = \case - (TIIFace _) -> True - _ -> False - -data SchemaPath - = SchemaPath - { _spTypeName :: !(Maybe G.NamedType) - , _spFldName :: !(Maybe G.Name) - , _spArgName :: !(Maybe G.Name) - , _spType :: !(Maybe T.Text) - } - -setFldNameSP :: SchemaPath -> G.Name -> SchemaPath -setFldNameSP sp fn = sp { _spFldName = Just fn} - -setArgNameSP :: SchemaPath -> G.Name -> SchemaPath -setArgNameSP sp an = sp { _spArgName = Just an} - -showSP :: SchemaPath -> Text -showSP (SchemaPath t f a _) = maybe "" (\x -> showNamedTy x <> fN) t - where - fN = maybe "" (\x -> "." <> showName x <> aN) f - aN = maybe "" showArg a - showArg x = "(" <> showName x <> ":)" - -showSPTxt' :: SchemaPath -> Text -showSPTxt' (SchemaPath _ f a t) = maybe "" (<> " "<> fld) t - where - fld = maybe "" (const $ "field " <> arg) f - arg = maybe "" (const "argument ") a - -showSPTxt :: SchemaPath -> Text -showSPTxt p = showSPTxt' p <> showSP p - -validateIFace :: MonadError Text f => IFaceTyInfo -> f () -validateIFace (IFaceTyInfo _ n flds _) = - when (isFldListEmpty flds) $ throwError $ "List of fields cannot be empty for interface " <> showNamedTy n - -validateObj :: TypeMap -> ObjTyInfo -> Either Text () -validateObj tyMap objTyInfo@(ObjTyInfo _ n _ flds) = do - when (isFldListEmpty flds) $ throwError $ "List of fields cannot be empty for " <> objTxt - mapM_ (extrIFaceTyInfo' >=> validateIFaceImpl objTyInfo) $ _otiImplIFaces objTyInfo - where - extrIFaceTyInfo' t = withObjTxt $ extrIFaceTyInfo tyMap t - withObjTxt x = x `catchError` \e -> throwError $ e <> " implemented by " <> objTxt - objTxt = "Object type " <> showNamedTy n - validateIFaceImpl = implmntsIFace tyMap - -isFldListEmpty :: ObjFieldMap -> Bool -isFldListEmpty = Map.null . Map.delete "__typename" - -validateUnion :: MonadError Text m => TypeMap -> UnionTyInfo -> m () -validateUnion tyMap (UnionTyInfo _ un mt) = do - when (Set.null mt) $ throwError $ "List of member types cannot be empty for union type " <> showNamedTy un - mapM_ valIsObjTy $ Set.toList mt - where - valIsObjTy mn = case Map.lookup mn tyMap of - Just (TIObj t) -> return t - Nothing -> throwError $ "Could not find type " <> showNamedTy mn <> ", which is defined as a member type of Union " <> showNamedTy un - _ -> throwError $ "Union type " <> showNamedTy un <> " can only include object types. 
It cannot include " <> showNamedTy mn - -implmntsIFace :: TypeMap -> ObjTyInfo -> IFaceTyInfo -> Either Text () -implmntsIFace tyMap objTyInfo iFaceTyInfo = do - let path = - ( SchemaPath (Just $ _otiName objTyInfo) Nothing Nothing (Just "Object") - , SchemaPath (Just $ _ifName iFaceTyInfo) Nothing Nothing (Just "Interface") - ) - mapM_ (includesIFaceFld path) $ _ifFields iFaceTyInfo - where - includesIFaceFld (spO,spIF) ifFld = do - let pathA@(spOA, spIFA) = (spO, setFldNameSP spIF $ _fiName ifFld) - objFld <- sameNameFld pathA ifFld - let pathB = (setFldNameSP spOA $ _fiName objFld, spIFA) - validateIsSubType' pathB (_fiTy objFld) (_fiTy ifFld) - hasAllArgs pathB objFld ifFld - isExtraArgsNullable pathB objFld ifFld - - validateIsSubType' (spO,spIF) oFld iFld = validateIsSubType tyMap oFld iFld `catchError` \_ -> - throwError $ "The type of " <> showSPTxt spO <> " (" <> G.showGT oFld <> - ") is not the same type/sub type of " <> showSPTxt spIF <> " (" <> G.showGT iFld <> ")" - - sameNameFld (spO, spIF) ifFld = do - let spIFN = setFldNameSP spIF $ _fiName ifFld - onNothing (Map.lookup (_fiName ifFld) objFlds) - $ throwError $ showSPTxt spIFN <> " expected, but " <> showSP spO <> " does not provide it" - - hasAllArgs (spO, spIF) objFld ifFld = forM_ (_fiParams ifFld) $ \ifArg -> do - objArg <- sameNameArg ifArg - let (spON, spIFN) = (setArgNameSP spO $ _iviName objArg, setArgNameSP spIF $ _iviName ifArg) - unless (_iviType objArg == _iviType ifArg) $ throwError $ - showSPTxt spIFN <> " expects type " <> G.showGT (_iviType ifArg) <> ", but " <> - showSP spON <> " has type " <> G.showGT (_iviType objArg) - where - sameNameArg ivi = do - let spIFN = setArgNameSP spIF $ _iviName ivi - onNothing (Map.lookup (_iviName ivi) objArgs) $ throwError $ showSPTxt spIFN <> " required, but " <> - showSPTxt spO <> " does not provide it" - objArgs = _fiParams objFld - - isExtraArgsNullable (spO, spIF) objFld ifFld = forM_ extraArgs isInpValNullable - where - extraArgs = Map.difference (_fiParams objFld) (_fiParams ifFld) - isInpValNullable ivi = unless (G.isNullable $ _iviType ivi) $ throwError $ - showSPTxt (setArgNameSP spO $ _iviName ivi) <> " is of required type " - <> G.showGT (_iviType ivi) <> ", but is not provided by " <> showSPTxt spIF - - objFlds = _otiFields objTyInfo - -extrTyInfo :: TypeMap -> G.NamedType -> Either Text TypeInfo -extrTyInfo tyMap tn = maybe - (throwError $ "Could not find type with name " <> showNamedTy tn) - return - $ Map.lookup tn tyMap - -extrIFaceTyInfo :: MonadError Text m => Map.HashMap G.NamedType TypeInfo -> G.NamedType -> m IFaceTyInfo -extrIFaceTyInfo tyMap tn = case Map.lookup tn tyMap of - Just (TIIFace i) -> return i - _ -> throwError $ "Could not find interface " <> showNamedTy tn - -extrObjTyInfoM :: TypeMap -> G.NamedType -> Maybe ObjTyInfo -extrObjTyInfoM tyMap tn = case Map.lookup tn tyMap of - Just (TIObj o) -> return o - _ -> Nothing - -validateIsSubType :: Map.HashMap G.NamedType TypeInfo -> G.GType -> G.GType -> Either Text () -validateIsSubType tyMap subFldTy supFldTy = do - checkNullMismatch subFldTy supFldTy - case (subFldTy,supFldTy) of - (G.TypeNamed _ subTy, G.TypeNamed _ supTy) -> do - subTyInfo <- extrTyInfo tyMap subTy - supTyInfo <- extrTyInfo tyMap supTy - isSubTypeBase subTyInfo supTyInfo - (G.TypeList _ (G.ListType sub), G.TypeList _ (G.ListType sup) ) -> - validateIsSubType tyMap sub sup - _ -> throwError $ showIsListTy subFldTy <> " Type " <> G.showGT subFldTy <> - " cannot be a sub-type of " <> showIsListTy supFldTy <> " Type " <> 
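-- validateIsSubType above implements GraphQL's covariant field-type rule for
-- interface implementations: a nullable type never refines a non-null one,
-- list types recurse on their element types, and named types fall through to
-- isSubTypeBase (equal types, or an object implementing the required
-- interface). A hedged sketch over a toy type syntax standing in for G.GType:
--
-- > data GType = Named Bool String | List Bool GType  -- Bool: is nullable?
-- >
-- > isSubType :: (String -> String -> Bool) -> GType -> GType -> Bool
-- > isSubType implements sub sup
-- >   | nullable sub && not (nullable sup) = False
-- >   | otherwise = case (sub, sup) of
-- >       (Named _ a, Named _ b) -> a == b || a `implements` b
-- >       (List _ a,  List _ b)  -> isSubType implements a b
-- >       _                      -> False
-- >   where
-- >     nullable (Named n _) = n
-- >     nullable (List n _)  = n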
G.showGT supFldTy - where - checkNullMismatch subTy supTy = when (G.isNotNull supTy && G.isNullable subTy ) $ - throwError $ "Nullable Type " <> G.showGT subFldTy <> " cannot be a sub-type of Non-Null Type " <> G.showGT supFldTy - showIsListTy = \case - G.TypeList {} -> "List" - G.TypeNamed {} -> "Named" - --- TODO Should we check the schema location as well? -isSubTypeBase :: (MonadError Text m) => TypeInfo -> TypeInfo -> m () -isSubTypeBase subTyInfo supTyInfo = case (subTyInfo,supTyInfo) of - (TIObj obj, TIIFace iFace) -> unless (_ifName iFace `elem` _otiImplIFaces obj) notSubTyErr - _ -> unless (subTyInfo == supTyInfo) notSubTyErr - where - showTy = showNamedTy . getNamedTy - notSubTyErr = throwError $ "Type " <> showTy subTyInfo <> " is not a sub type of " <> showTy supTyInfo - --- map postgres types to builtin scalars -pgColTyToScalar :: PGScalarType -> Text -pgColTyToScalar = \case - PGInteger -> "Int" - PGBoolean -> "Boolean" - PGFloat -> "Float" - PGText -> "String" - PGVarchar -> "String" - t -> toSQLTxt t - -mkScalarTy :: PGScalarType -> G.NamedType -mkScalarTy = - G.NamedType . G.Name . pgColTyToScalar - -getNamedTy :: TypeInfo -> G.NamedType -getNamedTy = \case - TIScalar t -> G.NamedType $ _stiName t - TIObj t -> _otiName t - TIIFace i -> _ifName i - TIEnum t -> _etiName t - TIInpObj t -> _iotiName t - TIUnion u -> _utiName u - -mkTyInfoMap :: [TypeInfo] -> TypeMap -mkTyInfoMap tyInfos = - Map.fromList [(getNamedTy tyInfo, tyInfo) | tyInfo <- tyInfos] - -fromTyDef :: InterfaceImplementations -> TypeLoc -> G.TypeDefinition -> TypeInfo -fromTyDef interfaceImplementations loc tyDef = case tyDef of - G.TypeDefinitionScalar t -> TIScalar $ fromScalarTyDef t loc - G.TypeDefinitionObject t -> TIObj $ fromObjTyDef t loc - G.TypeDefinitionInterface t -> TIIFace $ fromIFaceDef interfaceImplementations t loc - G.TypeDefinitionUnion t -> TIUnion $ fromUnionTyDef t - G.TypeDefinitionEnum t -> TIEnum $ fromEnumTyDef t loc - G.TypeDefinitionInputObject t -> TIInpObj $ fromInpObjTyDef t loc - -type InterfaceImplementations = Map.HashMap G.NamedType MemberTypes - -fromSchemaDoc :: G.SchemaDocument -> TypeLoc -> Either Text TypeMap -fromSchemaDoc (G.SchemaDocument tyDefs) loc = do - let tyMap = mkTyInfoMap $ map (fromTyDef interfaceImplementations loc) tyDefs - validateTypeMap tyMap - return tyMap - where - interfaceImplementations :: InterfaceImplementations - interfaceImplementations = - foldr (Map.unionWith (<>)) mempty $ flip mapMaybe tyDefs $ \case - G.TypeDefinitionObject objectDefinition -> - Just $ Map.fromList $ zip - (G._otdImplementsInterfaces objectDefinition) - (repeat $ Set.singleton $ G.NamedType $ G._otdName objectDefinition) - _ -> Nothing - -validateTypeMap :: TypeMap -> Either Text () -validateTypeMap tyMap = mapM_ validateTy $ Map.elems tyMap - where - validateTy (TIObj o) = validateObj tyMap o - validateTy (TIUnion u) = validateUnion tyMap u - validateTy (TIIFace i) = validateIFace i - validateTy _ = return () - -fromSchemaDocQ :: G.SchemaDocument -> TypeLoc -> TH.Q TH.Exp -fromSchemaDocQ sd loc = case fromSchemaDoc sd loc of - Left e -> fail $ T.unpack e - Right tyMap -> TH.ListE <$> mapM TH.lift (Map.elems tyMap) - -defaultSchema :: G.SchemaDocument -defaultSchema = $(G.parseSchemaDocQ "src-rsr/schema.graphql") - --- fromBaseSchemaFileQ :: FilePath -> TH.Q TH.Exp --- fromBaseSchemaFileQ fp = --- fromSchemaDocQ $(G.parseSchemaDocQ fp) - -type TypeMap = Map.HashMap G.NamedType TypeInfo - -data DirectiveInfo - = DirectiveInfo - { _diDescription :: !(Maybe G.Description) - , 
_diName :: !G.Name - , _diParams :: !ParamMap - , _diLocations :: ![G.DirectiveLocation] - } deriving (Show, Eq) - --- TODO: generate this from template haskell once we have a parser for directive defs --- directive @skip(if: Boolean!) on FIELD | FRAGMENT_SPREAD | INLINE_FRAGMENT -defaultDirectives :: [DirectiveInfo] -defaultDirectives = - [mkDirective "skip", mkDirective "include"] - where - mkDirective n = DirectiveInfo Nothing n args dirLocs - args = Map.singleton "if" $ InpValInfo Nothing "if" Nothing $ - G.TypeNamed (G.Nullability False) $ mkScalarTy PGBoolean - dirLocs = map G.DLExecutable - [G.EDLFIELD, G.EDLFRAGMENT_SPREAD, G.EDLINLINE_FRAGMENT] - -defDirectivesMap :: Map.HashMap G.Name DirectiveInfo -defDirectivesMap = mapFromL _diName defaultDirectives - -data FragDef - = FragDef - { _fdName :: !G.Name - , _fdTyInfo :: !FragmentTypeInfo - , _fdSelSet :: !G.SelectionSet - } deriving (Show, Eq) - -data FragmentTypeInfo - = FragmentTyObject !ObjTyInfo - | FragmentTyInterface !IFaceTyInfo - | FragmentTyUnion !UnionTyInfo - deriving (Show, Eq) - -type FragDefMap = Map.HashMap G.Name FragDef - -type AnnVarVals = - Map.HashMap G.Variable AnnInpVal - -stripTypenames :: [G.ExecutableDefinition] -> [G.ExecutableDefinition] -stripTypenames = map filterExecDef - where - filterExecDef = \case - G.ExecutableDefinitionOperation opDef -> - G.ExecutableDefinitionOperation $ filterOpDef opDef - G.ExecutableDefinitionFragment fragDef -> - let newSelset = filterSelSet $ G._fdSelectionSet fragDef - in G.ExecutableDefinitionFragment fragDef{G._fdSelectionSet = newSelset} - - filterOpDef = \case - G.OperationDefinitionTyped typeOpDef -> - let newSelset = filterSelSet $ G._todSelectionSet typeOpDef - in G.OperationDefinitionTyped typeOpDef{G._todSelectionSet = newSelset} - G.OperationDefinitionUnTyped selset -> - G.OperationDefinitionUnTyped $ filterSelSet selset - - filterSelSet = mapMaybe filterSel - filterSel s = case s of - G.SelectionField f -> - if G._fName f == "__typename" - then Nothing - else - let newSelset = filterSelSet $ G._fSelectionSet f - in Just $ G.SelectionField f{G._fSelectionSet = newSelset} - _ -> Just s - --- | Used by 'Hasura.GraphQL.Validate.validateVariablesForReuse' to parse new sets of variables for --- reusable query plans; see also 'QueryReusability'. -newtype ReusableVariableTypes - = ReusableVariableTypes { unReusableVarTypes :: Map.HashMap G.Variable RQL.PGColumnType } - deriving (Show, Eq, Semigroup, Monoid, J.ToJSON) -type ReusableVariableValues = Map.HashMap G.Variable (WithScalarType PGScalarValue) - --- | Tracks whether or not a query is /reusable/. Reusable queries are nice, since we can cache --- their resolved ASTs and avoid re-resolving them if we receive an identical query. However, we --- can’t always safely reuse queries if they have variables, since some variable values can affect --- the generated SQL. For example, consider the following query: --- --- > query users_where($condition: users_bool_exp!) { --- > users(where: $condition) { --- > id --- > } --- > } --- --- Different values for @$condition@ will produce completely different queries, so we can’t reuse --- its plan (unless the variable values were also all identical, of course, but we don’t bother --- caching those). --- --- If a query does turn out to be reusable, we build up a 'ReusableVariableTypes' value that maps --- variable names to their types so that we can use a fast path for validating new sets of --- variables (namely 'Hasura.GraphQL.Validate.validateVariablesForReuse'). 
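-- A hedged usage sketch of the machinery defined below: reusability is a
-- monoidal fold in which recorded variable types accumulate and a single
-- 'markNotReusable' poisons the result (per the Semigroup instance, anything
-- combined with NotReusable is NotReusable). Assuming two tracked variables
-- with illustrative names and column types:
--
-- > ((), reusability) <- runReusabilityT $ do
-- >   recordVariableUse limitVar intColumnType    -- state: Reusable {limit}
-- >   recordVariableUse activeVar boolColumnType  -- state: Reusable {limit, active}
-- >   markNotReusable                             -- e.g. a users_bool_exp variable
-- > -- reusability == NotReusable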
-data QueryReusability - = Reusable !ReusableVariableTypes - | NotReusable - deriving (Show, Eq) -$(makePrisms ''QueryReusability) - -instance Semigroup QueryReusability where - Reusable a <> Reusable b = Reusable (a <> b) - _ <> _ = NotReusable -instance Monoid QueryReusability where - mempty = Reusable mempty - -class (Monad m) => MonadReusability m where - recordVariableUse :: G.Variable -> RQL.PGColumnType -> m () - markNotReusable :: m () - -instance (MonadReusability m) => MonadReusability (ReaderT r m) where - recordVariableUse a b = lift $ recordVariableUse a b - markNotReusable = lift markNotReusable - -instance (MonadReusability m) => MonadReusability (StateT s m) where - recordVariableUse a b = lift $ recordVariableUse a b - markNotReusable = lift markNotReusable - -newtype ReusabilityT m a = ReusabilityT { unReusabilityT :: StateT QueryReusability m a } - deriving (Functor, Applicative, Monad, MonadError e, MonadReader r, MonadIO, MonadTrans) - -instance (Monad m) => MonadReusability (ReusabilityT m) where - recordVariableUse varName varType = ReusabilityT $ - modify' (<> Reusable (ReusableVariableTypes $ Map.singleton varName varType)) - markNotReusable = ReusabilityT $ put NotReusable - -instance Tracing.MonadTrace m => Tracing.MonadTrace (ReusabilityT m) where - trace name (ReusabilityT ma) = ReusabilityT (Tracing.trace name ma) - currentContext = lift Tracing.currentContext - currentReporter = lift Tracing.currentReporter - attachMetadata = lift . Tracing.attachMetadata - -runReusabilityT :: ReusabilityT m a -> m (a, QueryReusability) -runReusabilityT = runReusabilityTWith mempty - --- | Like 'runReusabilityT', but starting from an existing 'QueryReusability' state. -runReusabilityTWith :: QueryReusability -> ReusabilityT m a -> m (a, QueryReusability) -runReusabilityTWith initialReusability = flip runStateT initialReusability . unReusabilityT - -evalReusabilityT :: (Monad m) => ReusabilityT m a -> m a -evalReusabilityT = flip evalStateT mempty . 
unReusabilityT diff --git a/server/src-lib/Hasura/Incremental/Internal/Dependency.hs b/server/src-lib/Hasura/Incremental/Internal/Dependency.hs index 50557f93348..392378ae8ea 100644 --- a/server/src-lib/Hasura/Incremental/Internal/Dependency.hs +++ b/server/src-lib/Hasura/Incremental/Internal/Dependency.hs @@ -11,7 +11,6 @@ import qualified Data.URL.Template as UT import qualified Language.GraphQL.Draft.Syntax as G import qualified Network.URI.Extended as N -import Control.Applicative import Data.Aeson (Value) import Data.CaseInsensitive (CI) import Data.Functor.Classes (Eq1 (..), Eq2 (..)) @@ -165,6 +164,7 @@ instance Cacheable Integer where unchanged _ = (==) instance Cacheable Scientific where unchanged _ = (==) instance Cacheable Text where unchanged _ = (==) instance Cacheable N.URIAuth where unchanged _ = (==) +instance Cacheable G.Name where unchanged _ = (==) instance Cacheable DiffTime where unchanged _ = (==) instance Cacheable NominalDiffTime where unchanged _ = (==) instance Cacheable UTCTime where unchanged _ = (==) @@ -202,44 +202,49 @@ instance (Cacheable a, Cacheable b, Cacheable c, Cacheable d) => Cacheable (a, b instance (Cacheable a, Cacheable b, Cacheable c, Cacheable d, Cacheable e) => Cacheable (a, b, c, d, e) instance Cacheable Bool +instance Cacheable Void instance Cacheable Value -instance Cacheable G.Argument -instance Cacheable G.Directive -instance Cacheable G.ExecutableDefinition -instance Cacheable G.Field instance Cacheable G.FragmentDefinition -instance Cacheable G.FragmentSpread instance Cacheable G.GType -instance Cacheable G.InlineFragment instance Cacheable G.Nullability -instance Cacheable G.OperationDefinition instance Cacheable G.OperationType -instance Cacheable G.Selection -instance Cacheable G.TypedOperationDefinition -instance Cacheable G.Value -instance Cacheable G.ValueConst instance Cacheable G.VariableDefinition +instance Cacheable G.InputValueDefinition +instance Cacheable G.EnumValueDefinition +instance Cacheable G.FieldDefinition +instance Cacheable G.ScalarTypeDefinition +instance Cacheable G.UnionTypeDefinition +instance Cacheable possibleTypes => Cacheable (G.InterfaceTypeDefinition possibleTypes) +instance Cacheable G.EnumTypeDefinition +instance Cacheable G.InputObjectTypeDefinition +instance Cacheable G.ObjectTypeDefinition +instance Cacheable possibleTypes => Cacheable (G.TypeDefinition possibleTypes) instance Cacheable N.URI instance Cacheable UT.Variable instance Cacheable UT.TemplateItem instance Cacheable UT.URLTemplate instance (Cacheable a) => Cacheable (Maybe a) instance (Cacheable a, Cacheable b) => Cacheable (Either a b) -instance (Cacheable a) => Cacheable [a] -instance (Cacheable a) => Cacheable (NonEmpty a) -instance (Cacheable a) => Cacheable (G.ObjectFieldG a) +instance (Cacheable a) => Cacheable (NESeq a) +instance Cacheable a => Cacheable [a] +instance Cacheable a => Cacheable (NonEmpty a) +instance Cacheable a => Cacheable (G.Directive a) +instance Cacheable a => Cacheable (G.ExecutableDefinition a) +instance (Cacheable (a b), Cacheable b) => Cacheable (G.Field a b) +instance Cacheable a => Cacheable (G.FragmentSpread a) +instance (Cacheable (a b), Cacheable b) => Cacheable (G.InlineFragment a b) +instance (Cacheable (a b), Cacheable b) => Cacheable (G.OperationDefinition a b) +instance (Cacheable (a b), Cacheable b) => Cacheable (G.Selection a b) +instance (Cacheable (a b), Cacheable b) => Cacheable (G.TypedOperationDefinition a b) + +instance Cacheable a => Cacheable (G.Value a) -deriving instance Cacheable G.Alias 
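-- The hunk above ports the Cacheable instances to graphql-parser's new
-- parameterized AST: formerly monomorphic types such as G.Field are now
-- G.Field a b, so their instances pick up Cacheable constraints on the
-- parameters, while leaf types such as G.Name degenerate to plain equality
-- via the class's default method. A toy sketch of that default-method
-- pattern (a hypothetical minimal class, not Hasura's Accesses-aware one):
--
-- > class Eq a => Cacheable' a where
-- >   unchanged' :: accesses -> a -> a -> Bool
-- >   unchanged' _ = (==)
-- >
-- > instance Cacheable' Int  -- leaf type: inherits unchanged' _ = (==)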
-deriving instance Cacheable G.EnumValue -deriving instance Cacheable G.ExecutableDocument -deriving instance Cacheable G.ListType -deriving instance Cacheable G.Name -deriving instance Cacheable G.NamedType -deriving instance Cacheable G.StringValue -deriving instance Cacheable G.Variable deriving instance Cacheable G.Description -deriving instance (Cacheable a) => Cacheable (G.ListValueG a) -deriving instance (Cacheable a) => Cacheable (G.ObjectValueG a) +deriving instance Cacheable G.EnumValue +deriving instance Cacheable a => Cacheable (G.ExecutableDocument a) + +instance Cacheable G.SchemaDocument +instance Cacheable G.SchemaIntrospection class GCacheable f where gunchanged :: f p -> f p -> Accesses -> Bool diff --git a/server/src-lib/Hasura/Incremental/Internal/Rule.hs b/server/src-lib/Hasura/Incremental/Internal/Rule.hs index 7c612442874..ab227a9c440 100644 --- a/server/src-lib/Hasura/Incremental/Internal/Rule.hs +++ b/server/src-lib/Hasura/Incremental/Internal/Rule.hs @@ -9,7 +9,6 @@ import Hasura.Prelude hiding (id, (.)) import qualified Data.HashMap.Strict as HM -import Control.Applicative hiding (liftA) import Control.Arrow.Extended import Control.Category import Data.Profunctor diff --git a/server/src-lib/Hasura/Incremental/Select.hs b/server/src-lib/Hasura/Incremental/Select.hs index 8fb99affa6f..41b0e4db087 100644 --- a/server/src-lib/Hasura/Incremental/Select.hs +++ b/server/src-lib/Hasura/Incremental/Select.hs @@ -1,6 +1,4 @@ {-# OPTIONS_HADDOCK not-home #-} -{-# LANGUAGE GADTs #-} -{-# LANGUAGE RoleAnnotations #-} module Hasura.Incremental.Select ( Select(..) @@ -27,6 +25,7 @@ import Control.Monad.Unique import Data.GADT.Compare import Data.Kind import Data.Proxy (Proxy (..)) +import Data.Type.Equality import GHC.OverloadedLabels (IsLabel (..)) import GHC.Records (HasField (..)) import GHC.TypeLits (KnownSymbol, sameSymbol, symbolVal) diff --git a/server/src-lib/Hasura/Prelude.hs b/server/src-lib/Hasura/Prelude.hs index bb727773b7e..ed2ffc76105 100644 --- a/server/src-lib/Hasura/Prelude.hs +++ b/server/src-lib/Hasura/Prelude.hs @@ -7,6 +7,7 @@ module Hasura.Prelude , onNothing , onJust , onLeft + , whenMaybe , choice , afold , bsToTxt @@ -24,7 +25,7 @@ module Hasura.Prelude , module Data.Time.Clock.Units ) where -import Control.Applicative as M (Alternative (..)) +import Control.Applicative as M (Alternative (..), liftA2) import Control.Arrow as M (first, second, (&&&), (***), (<<<), (>>>)) import Control.DeepSeq as M (NFData, deepseq, force) import Control.Monad.Base as M @@ -32,34 +33,39 @@ import Control.Monad.Except as M import Control.Monad.Identity as M import Control.Monad.Reader as M import Control.Monad.State.Strict as M +import Control.Monad.Trans.Maybe as M (MaybeT (..)) import Control.Monad.Writer.Strict as M (MonadWriter (..), WriterT (..), execWriterT, runWriterT) import Data.Align as M (Semialign (align, alignWith)) import Data.Bool as M (bool) import Data.Data as M (Data (..)) import Data.Either as M (lefts, partitionEithers, rights) -import Data.Foldable as M (asum, foldrM, for_, toList, traverse_) +import Data.Foldable as M (asum, fold, foldrM, for_, toList, + traverse_) import Data.Function as M (on, (&)) import Data.Functor as M (($>), (<&>)) import Data.Hashable as M (Hashable) import Data.HashMap.Strict as M (HashMap) +import Data.HashMap.Strict.InsOrd as M (InsOrdHashMap) import Data.HashSet as M (HashSet) import Data.List as M (find, findIndex, foldl', group, intercalate, intersect, lookup, sort, sortBy, sortOn, union, unionBy, (\\)) -import 
Data.List.NonEmpty as M (NonEmpty (..)) +import Data.List.NonEmpty as M (NonEmpty (..), nonEmpty) import Data.Maybe as M (catMaybes, fromMaybe, isJust, isNothing, listToMaybe, mapMaybe, maybeToList) import Data.Monoid as M (getAlt) import Data.Ord as M (comparing) import Data.Semigroup as M (Semigroup (..)) import Data.Sequence as M (Seq) +import Data.Sequence.NonEmpty as M (NESeq) import Data.String as M (IsString) import Data.Text as M (Text) import Data.These as M (These (..), fromThese, mergeThese, mergeTheseWith, partitionThese, these) import Data.Time.Clock.Units import Data.Traversable as M (for) +import Data.Void as M (Void, absurd) import Data.Word as M (Word64) import GHC.Generics as M (Generic) import Prelude as M hiding (fail, init, lookup) @@ -67,10 +73,9 @@ import Test.QuickCheck.Arbitrary.Generic as M import Text.Read as M (readEither, readMaybe) import qualified Data.ByteString as B -import qualified Data.ByteString.Lazy as BL - import qualified Data.ByteString.Base64.Lazy as Base64 import Data.Coerce +import qualified Data.ByteString.Lazy as BL import qualified Data.HashMap.Strict as Map import qualified Data.Set as Set import qualified Data.Text as T @@ -98,6 +103,10 @@ onJust m action = maybe (return ()) action m onLeft :: (Monad m) => Either e a -> (e -> m a) -> m a onLeft e f = either f return e +whenMaybe :: Applicative m => Bool -> m a -> m (Maybe a) +whenMaybe True = fmap Just +whenMaybe False = const $ pure Nothing + choice :: (Alternative f) => [f a] -> f a choice = asum @@ -114,7 +123,6 @@ base64Decode :: Text -> BL.ByteString base64Decode = Base64.decodeLenient . BL.fromStrict . txtToBs - -- Like 'span', but monadic and with a function that produces 'Maybe' instead of 'Bool' spanMaybeM :: (Foldable f, Monad m) @@ -142,7 +150,7 @@ findWithIndex p l = do i <- findIndex p l pure (v, i) --- TODO: Move to Data.HashMap.Strict.Extended; rename to fromListWith? +-- TODO (from master): Move to Data.HashMap.Strict.Extended; rename to fromListWith? mapFromL :: (Eq k, Hashable k) => (a -> k) -> [a] -> Map.HashMap k a mapFromL f = Map.fromList . map (\v -> (f v, v)) diff --git a/server/src-lib/Hasura/RQL/DDL/Action.hs b/server/src-lib/Hasura/RQL/DDL/Action.hs index 26c78c01d72..7ac5d3fc857 100644 --- a/server/src-lib/Hasura/RQL/DDL/Action.hs +++ b/server/src-lib/Hasura/RQL/DDL/Action.hs @@ -24,21 +24,18 @@ module Hasura.RQL.DDL.Action ) where import Hasura.EncJSON -import Hasura.GraphQL.Context (defaultTypes) import Hasura.GraphQL.Utils import Hasura.Prelude +import Hasura.RQL.DDL.CustomTypes (lookupPGScalar) import Hasura.RQL.Types import Hasura.Session import Hasura.SQL.Types -import qualified Hasura.GraphQL.Validate.Types as VT - import qualified Data.Aeson as J import qualified Data.Aeson.Casing as J import qualified Data.Aeson.TH as J import qualified Data.Environment as Env import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set import qualified Database.PG.Query as Q import qualified Language.GraphQL.Draft.Syntax as G @@ -99,55 +96,38 @@ referred scalars. resolveAction :: QErrM m => Env.Environment - -> (NonObjectTypeMap, AnnotatedObjects) - -> HashSet PGScalarType - -- ^ List of all Postgres scalar types. + -> AnnotatedCustomTypes -> ActionDefinitionInput + -> HashSet PGScalarType -- See Note [Postgres scalars in custom types] -> m ( ResolvedActionDefinition , AnnotatedObjectType - , HashSet PGScalarType - -- ^ see Note [Postgres scalars in action input arguments]. 
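-- The new Prelude combinator whenMaybe, added in the Hasura.Prelude hunk
-- above, runs an effect only when a condition holds and wraps the result in
-- Maybe. A small usage sketch with illustrative values:
--
-- > whenMaybe :: Applicative m => Bool -> m a -> m (Maybe a)
-- >
-- > -- whenMaybe True  (pure 42)  ==  pure (Just 42)   -- action runs
-- > -- whenMaybe False (pure 42)  ==  pure Nothing     -- action skipped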
) -resolveAction env customTypes allPGScalars actionDefinition = do - let responseType = unGraphQLType $ _adOutputType actionDefinition - responseBaseType = G.getBaseType responseType - - reusedPGScalars <- execWriterT $ - forM (_adArguments actionDefinition) $ \argument -> do - let argumentBaseType = G.getBaseType $ unGraphQLType $ _argType argument - maybeArgTypeInfo = getNonObjectTypeInfo argumentBaseType - maybePGScalar = find ((==) argumentBaseType . VT.mkScalarTy) allPGScalars - - if | Just argTypeInfo <- maybeArgTypeInfo -> - case argTypeInfo of - VT.TIScalar _ -> pure () - VT.TIEnum _ -> pure () - VT.TIInpObj _ -> pure () - _ -> throw400 InvalidParams $ "the argument's base type: " - <> showNamedTy argumentBaseType <> - " should be a scalar/enum/input_object" - -- Collect the referred Postgres scalar. See Note [Postgres scalars in action input arguments]. - | Just pgScalar <- maybePGScalar -> tell $ Set.singleton pgScalar - | Nothing <- maybeArgTypeInfo -> - throw400 NotExists $ "the type: " <> showNamedTy argumentBaseType - <> " is not defined in custom types" - | otherwise -> pure () +resolveAction env AnnotatedCustomTypes{..} ActionDefinition{..} allPGScalars = do + resolvedArguments <- forM _adArguments $ \argumentDefinition -> do + forM argumentDefinition $ \argumentType -> do + let gType = unGraphQLType argumentType + argumentBaseType = G.getBaseType gType + (gType,) <$> + if | Just pgScalar <- lookupPGScalar allPGScalars argumentBaseType -> + pure $ NOCTScalar $ ASTReusedPgScalar argumentBaseType pgScalar + | Just nonObjectType <- Map.lookup argumentBaseType _actNonObjects -> + pure nonObjectType + | otherwise -> + throw400 InvalidParams $ + "the type: " <> showName argumentBaseType + <> " is not defined in custom types or it is not a scalar/enum/input_object" -- Check if the response type is an object - outputObject <- getObjectTypeInfo responseBaseType - resolvedDef <- traverse (resolveWebhook env) actionDefinition - pure (resolvedDef, outputObject, reusedPGScalars) - where - getNonObjectTypeInfo typeName = - let nonObjectTypeMap = unNonObjectTypeMap $ fst $ customTypes - inputTypeInfos = nonObjectTypeMap <> mapFromL VT.getNamedTy defaultTypes - in Map.lookup typeName inputTypeInfos - - getObjectTypeInfo typeName = - onNothing (Map.lookup (ObjectTypeName typeName) (snd customTypes)) $ - throw400 NotExists $ "the type: " - <> showNamedTy typeName <> - " is not an object type defined in custom types" + let outputType = unGraphQLType _adOutputType + outputBaseType = G.getBaseType outputType + outputObject <- onNothing (Map.lookup outputBaseType _actObjects) $ + throw400 NotExists $ "the type: " <> showName outputBaseType + <> " is not an object type defined in custom types" + resolvedWebhook <- resolveWebhook env _adHandler + pure ( ActionDefinition resolvedArguments _adOutputType _adType + _adHeaders _adForwardClientHeaders resolvedWebhook + , outputObject + ) runUpdateAction :: forall m. 
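-- The rewritten resolveAction above resolves each action argument's base
-- type in a fixed order: a Postgres scalar reused by name first, then a
-- custom scalar/enum/input_object from _actNonObjects, otherwise an error.
-- A hedged sketch with illustrative map types in place of the real lookups:
--
-- > import qualified Data.Map.Strict as M
-- >
-- > resolveArgumentType
-- >   :: M.Map String pg  -- Postgres scalars, keyed by GraphQL name
-- >   -> M.Map String ct  -- custom non-object types
-- >   -> String -> Either String (Either pg ct)
-- > resolveArgumentType pgScalars customTypes name
-- >   | Just s <- M.lookup name pgScalars   = Right (Left s)
-- >   | Just t <- M.lookup name customTypes = Right (Right t)
-- >   | otherwise = Left (name ++ " is not defined in custom types")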
( QErrM m , CacheRWM m, MonadTx m) diff --git a/server/src-lib/Hasura/RQL/DDL/ComputedField.hs b/server/src-lib/Hasura/RQL/DDL/ComputedField.hs index 47c9a238a31..5fdc946ecf5 100644 --- a/server/src-lib/Hasura/RQL/DDL/ComputedField.hs +++ b/server/src-lib/Hasura/RQL/DDL/ComputedField.hs @@ -134,13 +134,13 @@ addComputedFieldP2Setup trackedTables table computedField definition rawFunction (rfiReturnTypeName rawFunctionInfo) (rfiReturnTypeType rawFunctionInfo) - computedFieldGraphQLName = G.Name $ computedFieldNameToText computedField + computedFieldGraphQLName = G.mkName $ computedFieldNameToText computedField mkComputedFieldInfo :: (MV.MonadValidate [ComputedFieldValidateError] m) => m ComputedFieldInfo mkComputedFieldInfo = do -- Check if computed field name is a valid GraphQL name - unless (G.isValidName computedFieldGraphQLName) $ + unless (isJust computedFieldGraphQLName) $ MV.dispute $ pure $ CFVENotValidGraphQLName computedField -- Check if function is VOLATILE diff --git a/server/src-lib/Hasura/RQL/DDL/CustomTypes.hs b/server/src-lib/Hasura/RQL/DDL/CustomTypes.hs index fe12adfdef9..cd6347ff292 100644 --- a/server/src-lib/Hasura/RQL/DDL/CustomTypes.hs +++ b/server/src-lib/Hasura/RQL/DDL/CustomTypes.hs @@ -1,27 +1,26 @@ +{-# LANGUAGE RecordWildCards #-} module Hasura.RQL.DDL.CustomTypes ( runSetCustomTypes , persistCustomTypes , clearCustomTypes , resolveCustomTypes + , lookupPGScalar ) where import Control.Monad.Validate -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as Set -import qualified Data.List.Extended as L -import qualified Data.Text as T -import qualified Database.PG.Query as Q -import qualified Language.GraphQL.Draft.Syntax as G +import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set +import qualified Data.List.Extended as L +import qualified Data.Text as T +import qualified Database.PG.Query as Q +import qualified Language.GraphQL.Draft.Syntax as G import Hasura.EncJSON -import Hasura.GraphQL.Validate.Types (mkScalarTy) import Hasura.Prelude import Hasura.RQL.Types import Hasura.SQL.Types -import Hasura.GraphQL.Schema.CustomTypes (buildCustomTypesSchemaPartial) - {- Note [Postgres scalars in custom types] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ It’s very convenient to be able to reference Postgres scalars in custom type @@ -53,13 +52,21 @@ validateCustomTypeDefinitions :: (MonadValidate [CustomTypeValidationError] m) => TableCache -> CustomTypes - -> HashSet PGScalarType -- ^ all Postgres base types - -> m (HashSet PGScalarType) -- ^ see Note [Postgres scalars in custom types] -validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT do + -> HashSet PGScalarType + -- ^ all Postgres base types. See Note [Postgres scalars in custom types] + -> m AnnotatedCustomTypes +validateCustomTypeDefinitions tableCache customTypes allPGScalars = do unless (null duplicateTypes) $ dispute $ pure $ DuplicateTypeNames duplicateTypes traverse_ validateEnum enumDefinitions - traverse_ validateInputObject inputObjectDefinitions - traverse_ validateObject objectDefinitions + reusedPGScalars <- execWriterT $ traverse_ validateInputObject inputObjectDefinitions + annotatedObjects <- mapFromL (unObjectTypeName . 
_otdName) <$> + traverse validateObject objectDefinitions + let scalarTypeMap = Map.map NOCTScalar $ + Map.map ASTCustom scalarTypes <> Map.mapWithKey ASTReusedPgScalar reusedPGScalars + enumTypeMap = Map.map NOCTEnum enumTypes + inputObjectTypeMap = Map.map NOCTInputObject inputObjectTypes + nonObjectTypeMap = scalarTypeMap <> enumTypeMap <> inputObjectTypeMap + pure $ AnnotatedCustomTypes nonObjectTypeMap annotatedObjects where inputObjectDefinitions = fromMaybe [] $ _ctInputObjects customTypes objectDefinitions = fromMaybe [] $ _ctObjects customTypes @@ -74,12 +81,13 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT map (unObjectTypeName . _otdName) objectDefinitions scalarTypes = - Set.fromList $ map _stdName scalarDefinitions <> defaultScalars + mapFromL _stdName $ scalarDefinitions <> defaultScalars enumTypes = - Set.fromList $ map (unEnumTypeName . _etdName) enumDefinitions + mapFromL (unEnumTypeName . _etdName) enumDefinitions - defaultScalars = map G.NamedType ["Int", "Float", "String", "Boolean", "ID"] + inputObjectTypes = + mapFromL (unInputObjectTypeName . _iotdName) inputObjectDefinitions validateEnum :: (MonadValidate [CustomTypeValidationError] m) @@ -94,7 +102,7 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT validateInputObject :: ( MonadValidate [CustomTypeValidationError] m - , MonadWriter (Set.HashSet PGScalarType) m + , MonadWriter (Map.HashMap G.Name PGScalarType) m ) => InputObjectTypeDefinition -> m () validateInputObject inputObjectDefinition = do @@ -108,118 +116,126 @@ validateCustomTypeDefinitions tableCache customTypes allPGScalars = execWriterT dispute $ pure $ InputObjectDuplicateFields inputObjectTypeName duplicateFieldNames - let inputObjectTypes = - Set.fromList $ map (unInputObjectTypeName . _iotdName) - inputObjectDefinitions - - let inputTypes = - scalarTypes `Set.union` enumTypes `Set.union` inputObjectTypes + let mapToSet = Set.fromList . Map.keys + inputTypes = + mapToSet scalarTypes `Set.union` mapToSet enumTypes `Set.union` mapToSet inputObjectTypes -- check that fields reference input types for_ (_iotdFields inputObjectDefinition) $ \inputObjectField -> do let fieldBaseType = G.getBaseType $ unGraphQLType $ _iofdType inputObjectField if | Set.member fieldBaseType inputTypes -> pure () - | Just pgScalar <- lookupPGScalar fieldBaseType -> - tell $ Set.singleton pgScalar + | Just pgScalar <- lookupPGScalar allPGScalars fieldBaseType -> + tell $ Map.singleton fieldBaseType pgScalar | otherwise -> refute $ pure $ InputObjectFieldTypeDoesNotExist (_iotdName inputObjectDefinition) (_iofdName inputObjectField) fieldBaseType validateObject - :: ( MonadValidate [CustomTypeValidationError] m - , MonadWriter (Set.HashSet PGScalarType) m - ) - => ObjectTypeDefinition -> m () + :: (MonadValidate [CustomTypeValidationError] m) + => ObjectType -> m AnnotatedObjectType validateObject objectDefinition = do let objectTypeName = _otdName objectDefinition fieldNames = map (unObjectFieldName . _ofdName) $ toList (_otdFields objectDefinition) - relationships = fromMaybe [] $ _otdRelationships objectDefinition - relNames = map (unRelationshipName . _trName) relationships + maybeRelationships = _otdRelationships objectDefinition + relNames = maybe [] + (map (unRelationshipName . _trName) . 
toList) maybeRelationships duplicateFieldNames = L.duplicates $ fieldNames <> relNames - fields = toList $ _otdFields objectDefinition + fields = _otdFields objectDefinition -- check for duplicate field names unless (null duplicateFieldNames) $ dispute $ pure $ ObjectDuplicateFields objectTypeName duplicateFieldNames - scalarFields <- fmap Map.fromList $ for fields $ \objectField -> do - let fieldType = _ofdType objectField - fieldBaseType = G.getBaseType $ unGraphQLType fieldType - fieldName = _ofdName objectField - + scalarOrEnumFields <- for fields $ \objectField -> do + let fieldName = _ofdName objectField -- check that arguments are not defined when (isJust $ _ofdArguments objectField) $ dispute $ pure $ ObjectFieldArgumentsNotAllowed objectTypeName fieldName - let objectTypes = Set.fromList $ map (unObjectTypeName . _otdName) - objectDefinitions + forM objectField $ \fieldType -> do + let fieldBaseType = G.getBaseType $ unGraphQLType fieldType + objectTypes = Set.fromList $ map (unObjectTypeName . _otdName) + objectDefinitions - -- check that the fields only reference scalars and enums - -- and not other object types - if | Set.member fieldBaseType scalarTypes -> pure () - | Set.member fieldBaseType enumTypes -> pure () - | Set.member fieldBaseType objectTypes -> - dispute $ pure $ ObjectFieldObjectBaseType - objectTypeName fieldName fieldBaseType - | Just pgScalar <- lookupPGScalar fieldBaseType -> - tell $ Set.singleton pgScalar - | otherwise -> - dispute $ pure $ ObjectFieldTypeDoesNotExist - objectTypeName fieldName fieldBaseType + -- check that the fields only reference scalars and enums + -- and not other object types + annotatedObjectFieldType <- + if | Just scalarDef <- Map.lookup fieldBaseType scalarTypes -> + pure $ AOFTScalar $ ASTCustom scalarDef + | Just enumDef <- Map.lookup fieldBaseType enumTypes -> + pure $ AOFTEnum enumDef + | Set.member fieldBaseType objectTypes -> + refute $ pure $ ObjectFieldObjectBaseType + objectTypeName fieldName fieldBaseType + | Just pgScalar <- lookupPGScalar allPGScalars fieldBaseType -> + pure $ AOFTScalar $ ASTReusedPgScalar fieldBaseType pgScalar + | otherwise -> + refute $ pure $ ObjectFieldTypeDoesNotExist + objectTypeName fieldName fieldBaseType + pure (unGraphQLType fieldType, annotatedObjectFieldType) - pure (fieldName, fieldType) - - for_ relationships $ \relationshipField -> do - let relationshipName = _trName relationshipField - remoteTable = _trRemoteTable relationshipField - fieldMapping = _trFieldMapping relationshipField + let scalarOrEnumFieldMap = Map.fromList $ + map (_ofdName &&& (fst . 
_ofdType)) $ toList $ scalarOrEnumFields + annotatedRelationships <- forM maybeRelationships $ \relationships -> + forM relationships $ \TypeRelationship{..} -> do --check that the table exists - remoteTableInfo <- onNothing (Map.lookup remoteTable tableCache) $ + remoteTableInfo <- onNothing (Map.lookup _trRemoteTable tableCache) $ refute $ pure $ ObjectRelationshipTableDoesNotExist - objectTypeName relationshipName remoteTable + objectTypeName _trName _trRemoteTable -- check that the column mapping is sane - forM_ (Map.toList fieldMapping) $ \(fieldName, columnName) -> do + annotatedFieldMapping <- flip Map.traverseWithKey _trFieldMapping $ + \fieldName columnName -> do + case Map.lookup fieldName scalarOrEnumFieldMap of + Nothing -> dispute $ pure $ ObjectRelationshipFieldDoesNotExist + objectTypeName _trName fieldName + Just fieldType -> + -- the field should be a non-list type scalar + when (G.isListType fieldType) $ + dispute $ pure $ ObjectRelationshipFieldListType + objectTypeName _trName fieldName - case Map.lookup fieldName scalarFields of - Nothing -> dispute $ pure $ ObjectRelationshipFieldDoesNotExist - objectTypeName relationshipName fieldName - Just fieldType -> - -- the field should be a non-list type scalar - when (isListType fieldType) $ - dispute $ pure $ ObjectRelationshipFieldListType - objectTypeName relationshipName fieldName + -- the column should be a column of the table + case getPGColumnInfoM remoteTableInfo (fromPGCol columnName) of + Nothing -> + refute $ pure $ ObjectRelationshipColumnDoesNotExist + objectTypeName _trName _trRemoteTable columnName + Just pgColumnInfo -> pure pgColumnInfo - -- the column should be a column of the table - when (getPGColumnInfoM remoteTableInfo (fromPGCol columnName) == Nothing) $ - dispute $ pure $ ObjectRelationshipColumnDoesNotExist - objectTypeName relationshipName remoteTable columnName - return () + pure $ TypeRelationship _trName _trType remoteTableInfo annotatedFieldMapping - lookupPGScalar baseType = -- see Note [Postgres scalars in custom types] - find ((==) baseType . mkScalarTy) allPGScalars + pure $ ObjectTypeDefinition objectTypeName (_otdDescription objectDefinition) + scalarOrEnumFields annotatedRelationships + +-- see Note [Postgres scalars in custom types] +lookupPGScalar :: Set.HashSet PGScalarType -> G.Name -> Maybe PGScalarType +lookupPGScalar allPGScalars baseType = + fmap snd + $ find ((==) baseType . 
fst) + $ flip mapMaybe (toList allPGScalars) + $ \pgScalar -> (,pgScalar) <$> G.mkName (toSQLTxt pgScalar) data CustomTypeValidationError - = DuplicateTypeNames !(Set.HashSet G.NamedType) + = DuplicateTypeNames !(Set.HashSet G.Name) -- ^ type names have to be unique across all types | InputObjectFieldTypeDoesNotExist - !InputObjectTypeName !InputObjectFieldName !G.NamedType + !InputObjectTypeName !InputObjectFieldName !G.Name -- ^ field name and the field's base type | InputObjectDuplicateFields !InputObjectTypeName !(Set.HashSet InputObjectFieldName) -- ^ duplicate field declaration in input objects | ObjectFieldTypeDoesNotExist - !ObjectTypeName !ObjectFieldName !G.NamedType + !ObjectTypeName !ObjectFieldName !G.Name -- ^ field name and the field's base type | ObjectDuplicateFields !ObjectTypeName !(Set.HashSet G.Name) -- ^ duplicate field declaration in objects | ObjectFieldArgumentsNotAllowed !ObjectTypeName !ObjectFieldName -- ^ object fields can't have arguments - | ObjectFieldObjectBaseType !ObjectTypeName !ObjectFieldName !G.NamedType + | ObjectFieldObjectBaseType !ObjectTypeName !ObjectFieldName !G.Name -- ^ object fields can't have object types as base types | ObjectRelationshipTableDoesNotExist !ObjectTypeName !RelationshipName !QualifiedTable @@ -315,11 +331,13 @@ clearCustomTypes = do resolveCustomTypes :: (MonadError QErr m) - => TableCache -> CustomTypes -> HashSet PGScalarType -> m (NonObjectTypeMap, AnnotatedObjects) -resolveCustomTypes tableCache customTypes allPGScalars = do - reusedPGScalars <- either (throw400 ConstraintViolation . showErrors) pure + => TableCache + -> CustomTypes + -> HashSet PGScalarType + -> m AnnotatedCustomTypes +resolveCustomTypes tableCache customTypes allPGScalars = + either (throw400 ConstraintViolation . 
showErrors) pure
     =<< runValidateT (validateCustomTypeDefinitions tableCache customTypes allPGScalars)
-  buildCustomTypesSchemaPartial tableCache customTypes reusedPGScalars
   where
     showErrors :: [CustomTypeValidationError] -> T.Text
     showErrors allErrors =
diff --git a/server/src-lib/Hasura/RQL/DDL/EventTrigger.hs b/server/src-lib/Hasura/RQL/DDL/EventTrigger.hs
index f73074cb3ea..26d70b55949 100644
--- a/server/src-lib/Hasura/RQL/DDL/EventTrigger.hs
+++ b/server/src-lib/Hasura/RQL/DDL/EventTrigger.hs
@@ -7,7 +7,7 @@ module Hasura.RQL.DDL.EventTrigger
   , runRedeliverEvent
   , runInvokeEventTrigger
-  -- TODO: review
+  -- TODO(from master): review
   , delEventTriggerFromCatalog
   , subTableP2
   , subTableP2Setup
diff --git a/server/src-lib/Hasura/RQL/DDL/Metadata.hs b/server/src-lib/Hasura/RQL/DDL/Metadata.hs
index 86c91f87b72..457229c59e9 100644
--- a/server/src-lib/Hasura/RQL/DDL/Metadata.hs
+++ b/server/src-lib/Hasura/RQL/DDL/Metadata.hs
@@ -74,8 +74,10 @@ runClearMetadata _ = do

 applyQP1 :: (QErrM m) => ReplaceMetadata -> m ()
-applyQP1 (ReplaceMetadata _ tables functionsMeta schemas collections
-          allowlist _ actions cronTriggers) = do
+applyQP1 (ReplaceMetadata _ tables functionsMeta schemas
+          collections
+          allowlist _ actions
+          cronTriggers) = do

   withPathK "tables" $ do

     checkMultipleDecls "tables" $ map _tmTable tables
@@ -299,12 +301,11 @@ fetchMetadata = do

   customTypes <- fetchCustomTypes

   -- fetch actions
   actions <- fetchActions

   cronTriggers <- fetchCronTriggers

-
   return $ ReplaceMetadata currentMetadataVersion
     (HMIns.elems postRelMap)
     functions
diff --git a/server/src-lib/Hasura/RQL/DDL/Metadata/Generator.hs b/server/src-lib/Hasura/RQL/DDL/Metadata/Generator.hs
index 24c23368bd5..ca52a6ebb46 100644
--- a/server/src-lib/Hasura/RQL/DDL/Metadata/Generator.hs
+++ b/server/src-lib/Hasura/RQL/DDL/Metadata/Generator.hs
@@ -3,21 +3,7 @@ module Hasura.RQL.DDL.Metadata.Generator
   (genReplaceMetadata)
 where

-import Hasura.GraphQL.Utils (simpleGraphQLQuery)
 import Hasura.Prelude
-import Hasura.RQL.DDL.Headers
-import Hasura.RQL.DDL.Metadata.Types
-import Hasura.RQL.Types
-import Hasura.SQL.Types
-
-import qualified Hasura.RQL.DDL.ComputedField as ComputedField
-import qualified Hasura.RQL.DDL.Permission as Permission
-import qualified Hasura.RQL.DDL.Permission.Internal as Permission
-import qualified Hasura.RQL.DDL.QueryCollection as Collection
-import qualified Hasura.RQL.DDL.Relationship as Relationship
-import qualified Hasura.RQL.DDL.Schema as Schema
-
-import System.Cron.Types

 import qualified Data.Aeson as J
 import qualified Data.Text as T
@@ -31,11 +17,25 @@ import qualified System.Cron.Parser as Cr

 import Data.List.Extended (duplicates)
 import Data.Scientific
+import System.Cron.Types
 import Test.QuickCheck
 import Test.QuickCheck.Instances.Semigroup ()
 import Test.QuickCheck.Instances.Time ()
 import Test.QuickCheck.Instances.UnorderedContainers ()
+
+import qualified Hasura.RQL.DDL.ComputedField as ComputedField
+import qualified Hasura.RQL.DDL.Permission as Permission
+import qualified Hasura.RQL.DDL.Permission.Internal as Permission
+import qualified Hasura.RQL.DDL.QueryCollection as Collection
+import qualified Hasura.RQL.DDL.Relationship as Relationship
+import qualified Hasura.RQL.DDL.Schema as Schema
+
+import Hasura.GraphQL.Utils (simpleGraphQLQuery)
+import Hasura.RQL.DDL.Headers
+import Hasura.RQL.DDL.Metadata.Types
+import Hasura.RQL.Types
+import Hasura.SQL.Types
+
 genReplaceMetadata :: Gen ReplaceMetadata
 genReplaceMetadata = do
   version <- arbitrary
@@ -55,7 +55,7 @@ 
genReplaceMetadata = do MVVersion2 -> FMVersion2 <$> arbitrary instance Arbitrary G.Name where - arbitrary = G.Name . T.pack <$> listOf1 (elements ['a'..'z']) + arbitrary = G.unsafeMkName . T.pack <$> listOf1 (elements ['a'..'z']) instance Arbitrary MetadataVersion where arbitrary = genericArbitrary @@ -205,18 +205,12 @@ instance Arbitrary Collection.CreateCollection where instance Arbitrary Collection.CollectionReq where arbitrary = genericArbitrary -instance Arbitrary G.NamedType where - arbitrary = G.NamedType <$> arbitrary - instance Arbitrary G.Description where arbitrary = G.Description <$> arbitrary instance Arbitrary G.Nullability where arbitrary = genericArbitrary -instance Arbitrary G.ListType where - arbitrary = G.ListType <$> arbitrary - instance Arbitrary G.GType where arbitrary = genericArbitrary @@ -247,16 +241,16 @@ instance Arbitrary RelationshipName where instance Arbitrary ObjectFieldName where arbitrary = genericArbitrary -instance Arbitrary TypeRelationshipDefinition where +instance (Arbitrary a, Arbitrary b) => Arbitrary (TypeRelationship a b) where arbitrary = genericArbitrary instance Arbitrary ObjectTypeName where arbitrary = genericArbitrary -instance Arbitrary ObjectFieldDefinition where +instance (Arbitrary a) => Arbitrary (ObjectFieldDefinition a) where arbitrary = genericArbitrary -instance Arbitrary ObjectTypeDefinition where +instance (Arbitrary a, Arbitrary b, Arbitrary c) => Arbitrary (ObjectTypeDefinition a b c) where arbitrary = genericArbitrary instance Arbitrary ScalarTypeDefinition where @@ -277,7 +271,7 @@ instance Arbitrary CustomTypes where instance Arbitrary ArgumentName where arbitrary = genericArbitrary -instance Arbitrary ArgumentDefinition where +instance (Arbitrary a) => Arbitrary (ArgumentDefinition a) where arbitrary = genericArbitrary instance Arbitrary ActionMutationKind where @@ -286,7 +280,7 @@ instance Arbitrary ActionMutationKind where instance Arbitrary ActionType where arbitrary = genericArbitrary -instance (Arbitrary a) => Arbitrary (ActionDefinition a) where +instance (Arbitrary a, Arbitrary b) => Arbitrary (ActionDefinition a b) where arbitrary = genericArbitrary instance Arbitrary ActionName where @@ -301,19 +295,11 @@ instance Arbitrary ActionPermissionMetadata where instance Arbitrary ActionMetadata where arbitrary = genericArbitrary -deriving instance Arbitrary G.StringValue -deriving instance Arbitrary G.Variable -deriving instance Arbitrary G.ListValue -deriving instance Arbitrary G.ObjectValue - -instance Arbitrary G.Value where - arbitrary = genericArbitrary - -instance (Arbitrary a) => Arbitrary (G.ObjectFieldG a) where - arbitrary = genericArbitrary - deriving instance Arbitrary RemoteArguments +instance Arbitrary a => Arbitrary (G.Value a) where + arbitrary = genericArbitrary + instance Arbitrary FieldCall where arbitrary = genericArbitrary diff --git a/server/src-lib/Hasura/RQL/DDL/Metadata/Types.hs b/server/src-lib/Hasura/RQL/DDL/Metadata/Types.hs index 06a63954900..4d1733e8d3b 100644 --- a/server/src-lib/Hasura/RQL/DDL/Metadata/Types.hs +++ b/server/src-lib/Hasura/RQL/DDL/Metadata/Types.hs @@ -490,16 +490,16 @@ replaceMetadataToOrdJSON ( ReplaceMetadata ] <> catMaybes [maybeDescriptionToMaybeOrdPair fieldDescM] - objectTypeToOrdJSON :: ObjectTypeDefinition -> AO.Value + objectTypeToOrdJSON :: ObjectType -> AO.Value objectTypeToOrdJSON (ObjectTypeDefinition tyName descM fields rels) = AO.object $ [ ("name", AO.toOrdered tyName) , ("fields", AO.array $ map fieldDefinitionToOrdJSON $ toList fields) ] <> catMaybes [ 
maybeDescriptionToMaybeOrdPair descM - , listToMaybeOrdPair "relationships" AO.toOrdered =<< rels + , maybeAnyToMaybeOrdPair "relationships" AO.toOrdered rels ] where - fieldDefinitionToOrdJSON :: ObjectFieldDefinition -> AO.Value + fieldDefinitionToOrdJSON :: ObjectFieldDefinition GraphQLType -> AO.Value fieldDefinitionToOrdJSON (ObjectFieldDefinition fieldName argsValM fieldDescM ty) = AO.object $ [ ("name", AO.toOrdered fieldName) , ("type", AO.toOrdered ty) @@ -530,7 +530,7 @@ replaceMetadataToOrdJSON ( ReplaceMetadata , listToMaybeOrdPair "permissions" permToOrdJSON permissions ] where - argDefinitionToOrdJSON :: ArgumentDefinition -> AO.Value + argDefinitionToOrdJSON :: ArgumentDefinition GraphQLType -> AO.Value argDefinitionToOrdJSON (ArgumentDefinition argName ty descM) = AO.object $ [ ("name", AO.toOrdered argName) , ("type", AO.toOrdered ty) diff --git a/server/src-lib/Hasura/RQL/DDL/Permission.hs b/server/src-lib/Hasura/RQL/DDL/Permission.hs index 6ef201ba57f..cd757dfa822 100644 --- a/server/src-lib/Hasura/RQL/DDL/Permission.hs +++ b/server/src-lib/Hasura/RQL/DDL/Permission.hs @@ -88,9 +88,9 @@ TRUE TRUE (OR NOT-SET) TRUE -- Insert permission data InsPerm = InsPerm - { ipCheck :: !BoolExp - , ipSet :: !(Maybe (ColumnValues Value)) - , ipColumns :: !(Maybe PermColSpec) + { ipCheck :: !BoolExp + , ipSet :: !(Maybe (ColumnValues Value)) + , ipColumns :: !(Maybe PermColSpec) , ipBackendOnly :: !(Maybe Bool) -- see Note [Backend only permissions] } deriving (Show, Eq, Lift, Generic) instance Cacheable InsPerm diff --git a/server/src-lib/Hasura/RQL/DDL/Relationship.hs b/server/src-lib/Hasura/RQL/DDL/Relationship.hs index dd6171f73b1..d268be5e371 100644 --- a/server/src-lib/Hasura/RQL/DDL/Relationship.hs +++ b/server/src-lib/Hasura/RQL/DDL/Relationship.hs @@ -98,7 +98,7 @@ objRelP2Setup qt foreignKeys (RelDef rn ru _) = case ru of mkDependency tableName reason col = SchemaDependency (SOTableObj tableName $ TOCol col) reason dependencies = map (mkDependency qt DRLeftColumn) lCols <> map (mkDependency refqt DRRightColumn) rCols - pure (RelInfo rn ObjRel (rmColumns rm) refqt True, dependencies) + pure (RelInfo rn ObjRel (rmColumns rm) refqt True True, dependencies) RUFKeyOn columnName -> do ForeignKey constraint foreignTable colMap <- getRequiredFkey columnName (HS.toList foreignKeys) let dependencies = @@ -108,7 +108,10 @@ objRelP2Setup qt foreignKeys (RelDef rn ru _) = case ru of -- neither the using_col nor the constraint name will help. , SchemaDependency (SOTable foreignTable) DRRemoteTable ] - pure (RelInfo rn ObjRel colMap foreignTable False, dependencies) + -- TODO(PDV?): this is too optimistic. Some object relationships are nullable, but + -- we are marking some as non-nullable here. This should really be done by + -- checking nullability in the SQL schema. + pure (RelInfo rn ObjRel colMap foreignTable False False, dependencies) arrRelP2Setup :: (QErrM m) @@ -122,7 +125,7 @@ arrRelP2Setup foreignKeys qt (RelDef rn ru _) = case ru of (lCols, rCols) = unzip $ HM.toList $ rmColumns rm deps = map (\c -> SchemaDependency (SOTableObj qt $ TOCol c) DRLeftColumn) lCols <> map (\c -> SchemaDependency (SOTableObj refqt $ TOCol c) DRRightColumn) rCols - pure (RelInfo rn ArrRel (rmColumns rm) refqt True, deps) + pure (RelInfo rn ArrRel (rmColumns rm) refqt True True, deps) RUFKeyOn (ArrRelUsingFKeyOn refqt refCol) -> do foreignTableForeignKeys <- getTableInfo refqt foreignKeys let keysThatReferenceUs = filter ((== qt) . 
_fkForeignTable) (HS.toList foreignTableForeignKeys) @@ -135,7 +138,7 @@ arrRelP2Setup foreignKeys qt (RelDef rn ru _) = case ru of , SchemaDependency (SOTable refqt) DRRemoteTable ] mapping = HM.fromList $ map swap $ HM.toList colMap - pure (RelInfo rn ArrRel mapping refqt False, deps) + pure (RelInfo rn ArrRel mapping refqt False False, deps) purgeRelDep :: (MonadTx m) => SchemaObjId -> m () purgeRelDep (SOTableObj tn (TOPerm rn pt)) = purgePerm tn rn pt diff --git a/server/src-lib/Hasura/RQL/DDL/RemoteRelationship.hs b/server/src-lib/Hasura/RQL/DDL/RemoteRelationship.hs index 3b62a2a4efc..9a5cb46f60e 100644 --- a/server/src-lib/Hasura/RQL/DDL/RemoteRelationship.hs +++ b/server/src-lib/Hasura/RQL/DDL/RemoteRelationship.hs @@ -1,4 +1,5 @@ {-# LANGUAGE ViewPatterns #-} +{-# LANGUAGE RecordWildCards #-} module Hasura.RQL.DDL.RemoteRelationship ( runCreateRemoteRelationship , runDeleteRemoteRelationship @@ -11,20 +12,20 @@ module Hasura.RQL.DDL.RemoteRelationship ) where import Hasura.EncJSON -import Hasura.GraphQL.Validate.Types import Hasura.Prelude -import Hasura.RQL.DDL.RemoteRelationship.Validate import Hasura.RQL.Types +import Hasura.RQL.Types.Column () import Hasura.SQL.Types +import Hasura.RQL.DDL.RemoteRelationship.Validate import Instances.TH.Lift () import qualified Database.PG.Query as Q +import qualified Data.HashSet as HS runCreateRemoteRelationship :: (MonadTx m, CacheRWM m) => RemoteRelationship -> m EncJSON runCreateRemoteRelationship remoteRelationship = do - -- Few checks void $ askTabInfo $ rtrTable remoteRelationship liftTx $ persistRemoteRelationship remoteRelationship buildSchemaCacheFor $ MOTableObj table $ MTORemoteRelationship $ rtrName remoteRelationship @@ -37,29 +38,28 @@ resolveRemoteRelationship => RemoteRelationship -> [PGColumnInfo] -> RemoteSchemaMap - -> m (RemoteFieldInfo, TypeMap, [SchemaDependency]) -resolveRemoteRelationship remoteRelationship pgColumns remoteSchemaMap = do - (remoteField, typesMap) <- either (throw400 RemoteSchemaError . validateErrorToText) - pure - (validateRemoteRelationship remoteRelationship remoteSchemaMap pgColumns) - - let schemaDependencies = - let table = rtrTable remoteRelationship - columns = _rfiHasuraFields remoteField - remoteSchemaName = rtrRemoteSchema remoteRelationship - tableDep = SchemaDependency (SOTable table) DRTable + -> m (RemoteFieldInfo, [SchemaDependency]) +resolveRemoteRelationship remoteRelationship + pgColumns + remoteSchemaMap = do + eitherRemoteField <- runExceptT $ + validateRemoteRelationship remoteRelationship remoteSchemaMap pgColumns + remoteField <- either (throw400 RemoteSchemaError . 
errorToText) pure $ eitherRemoteField + let table = rtrTable remoteRelationship + schemaDependencies = + let tableDep = SchemaDependency (SOTable table) DRTable columnsDep = map (\column -> SchemaDependency (SOTableObj table $ TOCol column) DRRemoteRelationship ) $ - map pgiColumn (toList columns) + map pgiColumn $ HS.toList $ _rfiHasuraFields remoteField remoteSchemaDep = - SchemaDependency (SORemoteSchema remoteSchemaName) DRRemoteSchema + SchemaDependency (SORemoteSchema $ rtrRemoteSchema remoteRelationship) DRRemoteSchema in (tableDep : remoteSchemaDep : columnsDep) - pure (remoteField, typesMap, schemaDependencies) + pure (remoteField, schemaDependencies) runUpdateRemoteRelationship :: (MonadTx m, CacheRWM m) => RemoteRelationship -> m EncJSON runUpdateRemoteRelationship remoteRelationship = do diff --git a/server/src-lib/Hasura/RQL/DDL/RemoteRelationship/Validate.hs b/server/src-lib/Hasura/RQL/DDL/RemoteRelationship/Validate.hs index ae137ce5db5..f9c515af45e 100644 --- a/server/src-lib/Hasura/RQL/DDL/RemoteRelationship/Validate.hs +++ b/server/src-lib/Hasura/RQL/DDL/RemoteRelationship/Validate.hs @@ -4,219 +4,190 @@ module Hasura.RQL.DDL.RemoteRelationship.Validate ( validateRemoteRelationship - , validateErrorToText + , errorToText ) where -import Data.Bifunctor import Data.Foldable -import Data.Validation -import Hasura.GraphQL.Validate.Types +import Hasura.GraphQL.Schema.Remote +import Hasura.GraphQL.Parser.Column import Hasura.Prelude hiding (first) import Hasura.RQL.Types -import Hasura.Server.Utils (makeReasonMessage) import Hasura.SQL.Types import qualified Data.HashMap.Strict as HM import qualified Data.HashSet as HS -import qualified Data.List.NonEmpty as NE import qualified Data.Text as T -import qualified Hasura.GraphQL.Schema as GS import qualified Language.GraphQL.Draft.Syntax as G -- | An error validating the remote relationship. 
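The validation code from here on moves from the old accumulating style built on `Data.Validation` (`Failure (pure err)` over a `NonEmpty ValidationError`) to a fail-fast `MonadError ValidationError` style (`throwError err`), which is why the `NonEmpty`-consuming `validateErrorToText` gives way to the per-error `errorToText`. A minimal sketch of the behavioural difference between the two styles, independent of this codebase (the `Err` type and both helpers are illustrative only):

    import Control.Monad.Except (runExcept, throwError)
    import Data.Validation (Validation (..), toEither)

    newtype Err = NotPositive Int deriving (Show)

    -- Fail-fast: the first offending element aborts the whole traversal.
    failFast :: [Int] -> Either Err [Int]
    failFast = runExcept .
      traverse (\n -> if n > 0 then pure n else throwError (NotPositive n))

    -- Accumulating: every offending element is collected before reporting.
    accumulating :: [Int] -> Either [Err] [Int]
    accumulating = toEither .
      traverse (\n -> if n > 0 then pure n else Failure [NotPositive n])

    -- failFast     [1, -2, -3] == Left (NotPositive (-2))
    -- accumulating [1, -2, -3] == Left [NotPositive (-2), NotPositive (-3)]

Fail-fast reports at most one problem per request, but it composes directly with the `MonadError`-based stack the callers in this patch already run in, as `resolveRemoteRelationship` above shows.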
data ValidationError = RemoteSchemaNotFound !RemoteSchemaName - | CouldntFindRemoteField !G.Name !G.NamedType + | CouldntFindRemoteField !G.Name !G.Name | FieldNotFoundInRemoteSchema !G.Name | NoSuchArgumentForRemote !G.Name | MissingRequiredArgument !G.Name - | TypeNotFound !G.NamedType + | TypeNotFound !G.Name | TableNotFound !QualifiedTable | TableFieldNonexistent !QualifiedTable !FieldName | ExpectedTypeButGot !G.GType !G.GType - | InvalidType !G.GType!T.Text - | InvalidVariable !G.Variable !(HM.HashMap G.Variable PGColumnInfo) + | InvalidType !G.GType !T.Text + | InvalidVariable !G.Name !(HM.HashMap G.Name PGColumnInfo) | NullNotAllowedHere | InvalidGTypeForStripping !G.GType | UnsupportedMultipleElementLists | UnsupportedEnum + | InvalidGraphQLName !T.Text deriving (Show, Eq) -validateErrorToText :: NE.NonEmpty ValidationError -> Text -validateErrorToText (toList -> errs) = - "cannot validate remote relationship " <> makeReasonMessage errs errorToText - where - errorToText :: ValidationError -> Text - errorToText = \case - RemoteSchemaNotFound name -> - "remote schema with name " <> name <<> " not found" - CouldntFindRemoteField name ty -> - "remote field with name " <> name <<> " and type " <> ty <<> " not found" - FieldNotFoundInRemoteSchema name -> - "field with name " <> name <<> " not found in remote schema" - NoSuchArgumentForRemote name -> - "argument with name " <> name <<> " not found in remote schema" - MissingRequiredArgument name -> - "required argument with name " <> name <<> " is missing" - TypeNotFound ty -> - "type with name " <> ty <<> " not found" - TableNotFound name -> - "table with name " <> name <<> " not found" - TableFieldNonexistent table fieldName -> - "field with name " <> fieldName <<> " not found in table " <>> table - ExpectedTypeButGot expTy actualTy -> - "expected type " <> getBaseTy expTy <<> " but got " <>> getBaseTy actualTy - InvalidType ty err -> - "type " <> getBaseTy ty <<> err - InvalidVariable var _ -> - "variable " <> G.unVariable var <<> " is not found" - NullNotAllowedHere -> - "null is not allowed here" - InvalidGTypeForStripping ty -> - "type " <> getBaseTy ty <<> " is invalid for stripping" - UnsupportedMultipleElementLists -> - "multiple elements in list value is not supported" - UnsupportedEnum -> - "enum value is not supported" +errorToText :: ValidationError -> Text +errorToText = \case + RemoteSchemaNotFound name -> + "remote schema with name " <> name <<> " not found" + CouldntFindRemoteField name ty -> + "remote field with name " <> name <<> " and type " <> ty <<> " not found" + FieldNotFoundInRemoteSchema name -> + "field with name " <> name <<> " not found in remote schema" + NoSuchArgumentForRemote name -> + "argument with name " <> name <<> " not found in remote schema" + MissingRequiredArgument name -> + "required argument with name " <> name <<> " is missing" + TypeNotFound ty -> + "type with name " <> ty <<> " not found" + TableNotFound name -> + "table with name " <> name <<> " not found" + TableFieldNonexistent table fieldName -> + "field with name " <> fieldName <<> " not found in table " <>> table + ExpectedTypeButGot expTy actualTy -> + "expected type " <> G.getBaseType expTy <<> " but got " <>> G.getBaseType actualTy + InvalidType ty err -> + "type " <> G.getBaseType ty <<> err + InvalidVariable var _ -> + "variable " <> var <<> " is not found" + NullNotAllowedHere -> + "null is not allowed here" + InvalidGTypeForStripping ty -> + "type " <> G.getBaseType ty <<> " is invalid for stripping" + 
UnsupportedMultipleElementLists -> + "multiple elements in list value is not supported" + UnsupportedEnum -> + "enum value is not supported" + InvalidGraphQLName t -> + t <<> " is not a valid GraphQL identifier" -- | Validate a remote relationship given a context. -validateRemoteRelationship :: - RemoteRelationship +validateRemoteRelationship + :: (MonadError ValidationError m) + => RemoteRelationship -> RemoteSchemaMap -> [PGColumnInfo] - -> Either (NonEmpty ValidationError) (RemoteFieldInfo, TypeMap) + -> m RemoteFieldInfo validateRemoteRelationship remoteRelationship remoteSchemaMap pgColumns = do let remoteSchemaName = rtrRemoteSchema remoteRelationship table = rtrTable remoteRelationship hasuraFields <- forM (toList $ rtrHasuraFields remoteRelationship) $ - \fieldName -> case find ((==) fieldName . fromPGCol . pgiColumn) pgColumns of - Nothing -> Left $ pure $ TableFieldNonexistent table fieldName - Just r -> pure r - case HM.lookup remoteSchemaName remoteSchemaMap of - Nothing -> Left $ pure $ RemoteSchemaNotFound remoteSchemaName - Just (RemoteSchemaCtx _ gctx rsi) -> do - (_leafTyInfo, leafGType, (leafParamMap, leafTypeMap)) <- - foldl - (\eitherObjTyInfoAndTypes fieldCall -> - case eitherObjTyInfoAndTypes of - Left err -> Left err - Right (objTyInfo, _, (_, typeMap)) -> do - objFldInfo <- lookupField (fcName fieldCall) objTyInfo - case _fiLoc objFldInfo of - TLHasuraType -> - Left - (pure (FieldNotFoundInRemoteSchema (fcName fieldCall))) - TLCustom -> - Left - (pure (FieldNotFoundInRemoteSchema (fcName fieldCall))) - TLRemoteType {} -> do - let providedArguments = - remoteArgumentsToMap (fcArguments fieldCall) - toEither - (validateRemoteArguments - (_fiParams objFldInfo) - providedArguments - (HM.fromList - (map - (first pgColumnToVariable) - (HM.toList $ mapFromL (pgiColumn) pgColumns))) - (GS._gTypes gctx)) - (newParamMap, newTypeMap) <- - first - pure - (runStateT - (stripInMap - remoteRelationship - (GS._gTypes gctx) - (_fiParams objFldInfo) - providedArguments) - typeMap) - innerObjTyInfo <- - if isObjType (GS._gTypes gctx) objFldInfo - then getTyInfoFromField (GS._gTypes gctx) objFldInfo - else if isScalarType (GS._gTypes gctx) objFldInfo - then pure objTyInfo - else (Left - (pure - (InvalidType - (_fiTy objFldInfo) - "only objects or scalar types expected"))) - pure - ( innerObjTyInfo - , _fiTy objFldInfo - , (newParamMap, newTypeMap))) - (pure - ( GS._gQueryRoot gctx - , G.toGT (_otiName $ GS._gQueryRoot gctx) - , (mempty, mempty))) - (unRemoteFields $ rtrRemoteField remoteRelationship) - pure - ( RemoteFieldInfo - { _rfiName = rtrName remoteRelationship - , _rfiGType = leafGType - , _rfiParamMap = leafParamMap - , _rfiHasuraFields = HS.fromList hasuraFields - , _rfiRemoteFields = unRemoteFields $ rtrRemoteField remoteRelationship - , _rfiRemoteSchema = rsi - } - , leafTypeMap) + \fieldName -> onNothing (find ((==) fieldName . fromPGCol . 
pgiColumn) pgColumns) $
+      throwError $ TableFieldNonexistent table fieldName
+  pgColumnsVariables <- forM (HM.toList $ mapFromL pgiColumn pgColumns) $ \(k, v) -> do
+    variableName <- pgColumnToVariable k
+    pure (variableName, v)
+  let pgColumnsVariablesMap = HM.fromList pgColumnsVariables
+  (RemoteSchemaCtx rsName introspectionResult rsi _ _) <-
+    onNothing (HM.lookup remoteSchemaName remoteSchemaMap) $
+      throwError $ RemoteSchemaNotFound remoteSchemaName
+  let schemaDoc@(G.SchemaIntrospection originalDefns) = irDoc introspectionResult
+      queryRootName = irQueryRoot introspectionResult
+  queryRoot <- onNothing (lookupObject schemaDoc queryRootName) $
+    throwError $ FieldNotFoundInRemoteSchema queryRootName
+  (_, (leafParamMap, leafTypeMap)) <-
+    foldlM
+      (buildRelationshipTypeInfo pgColumnsVariablesMap schemaDoc)
+      (queryRoot, (mempty, mempty))
+      (unRemoteFields $ rtrRemoteField remoteRelationship)
+  pure $ RemoteFieldInfo
+    { _rfiName = rtrName remoteRelationship
+    , _rfiParamMap = leafParamMap
+    , _rfiHasuraFields = HS.fromList hasuraFields
+    , _rfiRemoteFields = rtrRemoteField remoteRelationship
+    , _rfiRemoteSchema = rsi
+    -- the new types created while stripping the provided arguments are
+    -- appended to the original schema document
+    , _rfiSchemaIntrospect = G.SchemaIntrospection $ originalDefns <> HM.elems leafTypeMap
+    , _rfiRemoteSchemaName = rsName
+    }
   where
-    getTyInfoFromField types field =
-      let baseTy = getBaseTy (_fiTy field)
-          fieldName = _fiName field
-          typeInfo = HM.lookup baseTy types
-      in case typeInfo of
-           Just (TIObj objTyInfo) -> pure objTyInfo
-           _ -> Left (pure (FieldNotFoundInRemoteSchema fieldName))
-    isObjType types field =
-      let baseTy = getBaseTy (_fiTy field)
-          typeInfo = HM.lookup baseTy types
-      in case typeInfo of
-           Just (TIObj _) -> True
-           _ -> False
-
-    isScalarType types field =
-      let baseTy = getBaseTy (_fiTy field)
-          typeInfo = HM.lookup baseTy types
-      in case typeInfo of
-           Just (TIScalar _) -> True
-           _ -> False
-
-    remoteArgumentsToMap =
-      HM.fromList .
-      map (\field -> (G._ofName field, G._ofValue field)) .
-      getRemoteArguments
+    getObjTyInfoFromField schemaDoc field =
+      let baseTy = G.getBaseType (G._fldType field)
+      in lookupObject schemaDoc baseTy
+    isScalarType schemaDoc field =
+      let baseTy = G.getBaseType (G._fldType field)
+      in isJust $ lookupScalar schemaDoc baseTy
+    buildRelationshipTypeInfo pgColumnsVariablesMap schemaDoc (objTyInfo, (_, typeMap)) fieldCall = do
+      objFldDefinition <- lookupField (fcName fieldCall) objTyInfo
+      let providedArguments = getRemoteArguments $ fcArguments fieldCall
+      validateRemoteArguments
+        (mapFromL G._ivdName (G._fldArgumentsDefinition objFldDefinition))
+        providedArguments
+        pgColumnsVariablesMap
+        schemaDoc
+      let eitherParamAndTypeMap =
+            runStateT
+              (stripInMap
+                 remoteRelationship
+                 schemaDoc
+                 (mapFromL G._ivdName (G._fldArgumentsDefinition objFldDefinition))
+                 providedArguments)
+              typeMap
+      (newParamMap, newTypeMap) <- onLeft eitherParamAndTypeMap throwError
+      innerObjTyInfo <- onNothing (getObjTyInfoFromField schemaDoc objFldDefinition) $
+        bool (throwError $ InvalidType (G._fldType objFldDefinition) "only objects or scalar types expected")
+             (pure objTyInfo)
+             (isScalarType schemaDoc objFldDefinition)
+      pure
+        ( innerObjTyInfo
+        , (newParamMap, newTypeMap))
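`buildRelationshipTypeInfo` above advances one field call at a time: it validates the user-provided arguments and then strips them out of the remote field's expected parameters, since an argument the relationship definition already supplies should no longer be accepted from the client. Object- and list-typed parameters are kept for merging, which is the subtlety `stripInMap` below handles; for everything atomic the effect is just a key deletion, roughly (illustrative names, only `unordered-containers` assumed):

    import           Data.Hashable       (Hashable)
    import qualified Data.HashMap.Strict as HM

    -- Parameters the relationship already provides are removed from the
    -- expected-parameter map; whatever remains is still the client's to fill in.
    stripProvided
      :: (Eq k, Hashable k)
      => HM.HashMap k v   -- parameters the remote field expects
      -> HM.HashMap k p   -- arguments the relationship provides
      -> HM.HashMap k v
    stripProvided expected provided = HM.difference expected provided

    -- stripProvided (HM.fromList [("id", "Int!"), ("limit", "Int")])
    --               (HM.fromList [("id", "$user_id")])
    --   == HM.fromList [("limit", "Int")]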
 -- | Return a map with keys deleted whose template argument is
 -- specified as an atomic (variable, constant), keys which are kept
 -- have their values modified by 'stripObject' or 'stripList'.
-stripInMap ::
-     RemoteRelationship -> HM.HashMap G.NamedType TypeInfo
-  -> HM.HashMap G.Name InpValInfo
-  -> HM.HashMap G.Name G.Value
-  -> StateT (HM.HashMap G.NamedType TypeInfo) (Either ValidationError) (HM.HashMap G.Name InpValInfo)
-stripInMap remoteRelationshipName types schemaArguments templateArguments =
+-- This function creates the 'HashMap G.Name G.InputValueDefinition' which modifies
+-- the original input parameters (if any) of the remote node/table being used. Only
+-- list or object types are preserved and other types are stripped off. The object or
+-- list types are preserved because they can be merged, if any arguments are
+-- provided by the user while querying a remote join field.
+stripInMap
+  :: RemoteRelationship
+  -> G.SchemaIntrospection
+  -> HM.HashMap G.Name G.InputValueDefinition
+  -> HM.HashMap G.Name (G.Value G.Name)
+  -> StateT
+       (HashMap G.Name (G.TypeDefinition [G.Name]))
+       (Either ValidationError)
+       (HM.HashMap G.Name G.InputValueDefinition)
+stripInMap remoteRelationship types schemaArguments providedArguments =
   fmap
     (HM.mapMaybe id)
     (HM.traverseWithKey
       (\name inpValInfo ->
-          case HM.lookup name templateArguments of
+          case HM.lookup name providedArguments of
            Nothing -> pure (Just inpValInfo)
            Just value -> do
-             maybeNewGType <- stripValue remoteRelationshipName types (_iviType inpValInfo) value
+             maybeNewGType <- stripValue remoteRelationship types (G._ivdType inpValInfo) value
              pure
                (fmap
-                  (\newGType -> inpValInfo {_iviType = newGType})
+                  (\newGType -> inpValInfo {G._ivdType = newGType})
                  maybeNewGType))
       schemaArguments)

 -- | Strip a value type completely, or modify it, if the given value
 -- is atomic-ish.
-stripValue ::
-     RemoteRelationship -> HM.HashMap G.NamedType TypeInfo
+stripValue
+  :: RemoteRelationship
+  -> G.SchemaIntrospection
   -> G.GType
-  -> G.Value
-  -> StateT (HM.HashMap G.NamedType TypeInfo) (Either ValidationError) (Maybe G.GType)
+  -> G.Value G.Name
+  -> StateT (HashMap G.Name (G.TypeDefinition [G.Name])) (Either ValidationError) (Maybe G.GType)
 stripValue remoteRelationshipName types gtype value = do
   case value of
     G.VVariable {} -> pure Nothing
@@ -226,45 +197,47 @@ stripValue remoteRelationshipName types gtype value = do
     G.VBoolean {} -> pure Nothing
     G.VNull {} -> pure Nothing
     G.VEnum {} -> pure Nothing
-    G.VList (G.ListValueG values) ->
+    G.VList values ->
       case values of
         [] -> pure Nothing
         [gvalue] -> stripList remoteRelationshipName types gtype gvalue
         _ -> lift (Left UnsupportedMultipleElementLists)
-    G.VObject (G.unObjectValue -> keypairs) ->
-      fmap Just (stripObject remoteRelationshipName types gtype keypairs)
+    G.VObject keyPairs ->
+      fmap Just (stripObject remoteRelationshipName types gtype keyPairs)

--- | Produce a new type for the list, or strip it entirely.
-stripList ::
-     RemoteRelationship
-  -> HM.HashMap G.NamedType TypeInfo
+-- | Produce a new type for the list, or strip it entirely.
+stripList
+  :: RemoteRelationship
+  -> G.SchemaIntrospection
   -> G.GType
-  -> G.Value
-  -> StateT (HM.HashMap G.NamedType TypeInfo) (Either ValidationError) (Maybe G.GType)
+  -> G.Value G.Name
+  -> StateT (HashMap G.Name (G.TypeDefinition [G.Name])) (Either ValidationError) (Maybe G.GType)
 stripList remoteRelationshipName types originalOuterGType value =
   case originalOuterGType of
-    G.TypeList nullability (G.ListType innerGType) -> do
+    G.TypeList nullability innerGType -> do
       maybeNewInnerGType <- stripValue remoteRelationshipName types innerGType value
       pure
         (fmap
-           (\newGType -> G.TypeList nullability (G.ListType newGType))
+           (\newGType -> G.TypeList nullability newGType)
           maybeNewInnerGType)
    _ -> lift (Left (InvalidGTypeForStripping originalOuterGType))

--- | Produce a new type for the given InpValInfo, modified by
--- 'stripInMap'. Objects can't be deleted entirely, just keys of an
--- object.
-stripObject ::
-    RemoteRelationship -> HM.HashMap G.NamedType TypeInfo
+-- | Produce a new type for the given InpValInfo, modified by
+-- 'stripInMap'. Objects can't be deleted entirely, just keys of an
+-- object.
+stripObject
+  :: RemoteRelationship
+  -> G.SchemaIntrospection
   -> G.GType
-  -> [G.ObjectFieldG G.Value]
-  -> StateT (HM.HashMap G.NamedType TypeInfo) (Either ValidationError) G.GType
-stripObject remoteRelationshipName types originalGtype keypairs =
+  -> HashMap G.Name (G.Value G.Name)
+  -> StateT (HashMap G.Name (G.TypeDefinition [G.Name])) (Either ValidationError) G.GType
+stripObject remoteRelationshipName schemaDoc originalGtype templateArguments =
   case originalGtype of
     G.TypeNamed nullability originalNamedType ->
-      case HM.lookup (getBaseTy originalGtype) types of
-        Just (TIInpObj originalInpObjTyInfo) -> do
-          let originalSchemaArguments = _iotiFields originalInpObjTyInfo
+      case lookupType schemaDoc (G.getBaseType originalGtype) of
+        Just (G.TypeDefinitionInputObject originalInpObjTyInfo) -> do
+          let originalSchemaArguments =
+                mapFromL G._ivdName $ G._iotdValueDefinitions originalInpObjTyInfo
              newNamedType =
                renameNamedType
                  (renameTypeForRelationship remoteRelationshipName)
@@ -272,25 +245,23 @@ stripObject remoteRelationshipName types originalGtype keypairs =
          newSchemaArguments <-
            stripInMap
              remoteRelationshipName
-              types
+              schemaDoc
              originalSchemaArguments
              templateArguments
          let newInpObjTyInfo =
                originalInpObjTyInfo
-                  {_iotiFields = newSchemaArguments, _iotiName = newNamedType}
+                 { G._iotdValueDefinitions = HM.elems newSchemaArguments
+                 , G._iotdName = newNamedType
+                 }
              newGtype = G.TypeNamed nullability newNamedType
-          modify (HM.insert newNamedType (TIInpObj newInpObjTyInfo))
+          modify (HM.insert newNamedType (G.TypeDefinitionInputObject newInpObjTyInfo))
          pure newGtype
        _ -> lift (Left (InvalidGTypeForStripping originalGtype))
    _ -> lift (Left (InvalidGTypeForStripping originalGtype))
-  where
-    templateArguments :: HM.HashMap G.Name G.Value
-    templateArguments =
-      HM.fromList (map (\(G.ObjectFieldG key val) -> (key, val)) keypairs)

--- | Produce a new name for a type, used when stripping the schema
--- types for a remote relationship.
--- TODO: Consider a separator character to avoid conflicts.
+-- | Produce a new name for a type, used when stripping the schema
+-- types for a remote relationship.
+-- TODO(from master): Consider a separator character to avoid conflicts.
 renameTypeForRelationship :: RemoteRelationship -> Text -> Text
 renameTypeForRelationship rtr text =
   text <> "_remote_rel_" <> name
@@ -298,134 +269,160 @@ renameTypeForRelationship rtr text =
     QualifiedObject (SchemaName schema) (TableName table) = rtrTable rtr

 -- | Rename a type.
-renameNamedType :: (Text -> Text) -> G.NamedType -> G.NamedType
-renameNamedType rename (G.NamedType (G.Name text)) =
-  G.NamedType (G.Name (rename text))
+renameNamedType :: (Text -> Text) -> G.Name -> G.Name
+renameNamedType rename =
+  G.unsafeMkName . rename . G.unName

 -- | Convert a field name to a variable name.
-pgColumnToVariable :: PGCol -> G.Variable
-pgColumnToVariable = G.Variable . G.Name . getPGColTxt
+pgColumnToVariable :: (MonadError ValidationError m) => PGCol -> m G.Name
+pgColumnToVariable pgCol =
+  let pgColText = getPGColTxt pgCol
+  in maybe (throwError $ InvalidGraphQLName pgColText) pure $ G.mkName pgColText

 -- | Lookup the field in the schema.
-lookupField ::
-     G.Name
-  -> ObjTyInfo
-  -> Either (NonEmpty ValidationError) ObjFldInfo
+lookupField
+  :: (MonadError ValidationError m)
+  => G.Name
+  -> G.ObjectTypeDefinition
+  -> m G.FieldDefinition
 lookupField name objFldInfo = viaObject objFldInfo
   where
    viaObject =
-      maybe (Left (pure (CouldntFindRemoteField name $ _otiName objFldInfo))) pure .
-      HM.lookup name .
-      _otiFields
+      maybe (throwError (CouldntFindRemoteField name $ G._otdName objFldInfo)) pure .
+      lookup name .
+      HM.toList .
+      mapFromL G._fldName .
+      G._otdFieldsDefinition

 -- | Validate remote input arguments against the remote schema.
-validateRemoteArguments ::
-     HM.HashMap G.Name InpValInfo
-  -> HM.HashMap G.Name G.Value
-  -> HM.HashMap G.Variable PGColumnInfo
-  -> HM.HashMap G.NamedType TypeInfo
-  -> Validation (NonEmpty ValidationError) ()
-validateRemoteArguments expectedArguments providedArguments permittedVariables types = do
-  traverse validateProvided (HM.toList providedArguments)
+validateRemoteArguments
+  :: (MonadError ValidationError m)
+  => HM.HashMap G.Name G.InputValueDefinition
+  -> HM.HashMap G.Name (G.Value G.Name)
+  -> HM.HashMap G.Name PGColumnInfo
+  -> G.SchemaIntrospection
+  -> m ()
+validateRemoteArguments expectedArguments providedArguments permittedVariables schemaDocument = do
+  traverse_ validateProvided (HM.toList providedArguments)
   -- Not necessary to validate if all required args are provided in the relationship
   -- traverse validateExpected (HM.toList expectedArguments)
-  pure ()
  where
    validateProvided (providedName, providedValue) =
      case HM.lookup providedName expectedArguments of
-        Nothing -> Failure (pure (NoSuchArgumentForRemote providedName))
-        Just (_iviType -> expectedType) ->
-          validateType permittedVariables providedValue expectedType types
-    -- validateExpected (expectedKey, expectedInpValInfo) =
-    --   if G.isNullable (_iviType expectedInpValInfo)
-    --   then pure ()
-    --   else case _iviDefVal expectedInpValInfo of
-    --          Just {} -> pure ()
-    --          Nothing ->
-    --            case HM.lookup expectedKey providedArguments of
-    --              Nothing ->
-    --                Failure (pure (MissingRequiredArgument expectedKey))
-    --              Just {} -> pure ()
+        Nothing -> throwError (NoSuchArgumentForRemote providedName)
+        Just (G._ivdType -> expectedType) ->
+          validateType permittedVariables providedValue expectedType schemaDocument

+unwrapGraphQLType :: G.GType -> G.GType
+unwrapGraphQLType = \case
+  G.TypeList _ lt -> lt
+  nt -> nt

 -- | Validate a value against a type.
-validateType :: - HM.HashMap G.Variable PGColumnInfo - -> G.Value +validateType + :: (MonadError ValidationError m) + => HM.HashMap G.Name PGColumnInfo + -> G.Value G.Name -> G.GType - -> HM.HashMap G.NamedType TypeInfo - -> Validation (NonEmpty ValidationError) () -validateType permittedVariables value expectedGType types = + -> G.SchemaIntrospection + -> m () +validateType permittedVariables value expectedGType schemaDocument = case value of G.VVariable variable -> case HM.lookup variable permittedVariables of - Nothing -> Failure (pure (InvalidVariable variable permittedVariables)) - Just fieldInfo -> - bindValidation - (columnInfoToNamedType fieldInfo) - (\actualNamedType -> assertType (G.toGT actualNamedType) expectedGType) - G.VInt {} -> assertType (G.toGT $ mkScalarTy PGInteger) expectedGType - G.VFloat {} -> assertType (G.toGT $ mkScalarTy PGFloat) expectedGType - G.VBoolean {} -> assertType (G.toGT $ mkScalarTy PGBoolean) expectedGType - G.VNull -> Failure (pure NullNotAllowedHere) - G.VString {} -> assertType (G.toGT $ mkScalarTy PGText) expectedGType - G.VEnum _ -> Failure (pure UnsupportedEnum) - G.VList (G.unListValue -> values) -> do + Nothing -> throwError (InvalidVariable variable permittedVariables) + Just fieldInfo -> do + namedType <- columnInfoToNamedType fieldInfo + assertType (mkGraphQLType namedType) expectedGType + G.VInt {} -> do + intScalarGType <- (mkGraphQLType <$> mkScalarTy PGInteger) + assertType intScalarGType expectedGType + G.VFloat {} -> do + floatScalarGType <- (mkGraphQLType <$> mkScalarTy PGFloat) + assertType floatScalarGType expectedGType + G.VBoolean {} -> do + boolScalarGType <- (mkGraphQLType <$> mkScalarTy PGBoolean) + assertType boolScalarGType expectedGType + G.VNull -> throwError NullNotAllowedHere + G.VString {} -> do + stringScalarGType <- (mkGraphQLType <$> mkScalarTy PGText) + assertType stringScalarGType expectedGType + G.VEnum _ -> throwError UnsupportedEnum + G.VList values -> do case values of [] -> pure () [_] -> pure () - _ -> Failure (pure UnsupportedMultipleElementLists) - (assertListType expectedGType) + _ -> throwError UnsupportedMultipleElementLists + assertListType expectedGType (flip traverse_ values (\val -> - validateType permittedVariables val (unwrapTy expectedGType) types)) - pure () - G.VObject (G.unObjectValue -> values) -> + validateType permittedVariables val (unwrapGraphQLType expectedGType) schemaDocument)) + G.VObject values -> flip traverse_ - values - (\(G.ObjectFieldG name val) -> - let expectedNamedType = getBaseTy expectedGType + (HM.toList values) + (\(name,val) -> + let expectedNamedType = G.getBaseType expectedGType in - case HM.lookup expectedNamedType types of - Nothing -> Failure (pure $ TypeNotFound expectedNamedType) + case lookupType schemaDocument expectedNamedType of + Nothing -> throwError $ (TypeNotFound expectedNamedType) Just typeInfo -> case typeInfo of - TIInpObj inpObjTypeInfo -> - case HM.lookup name (_iotiFields inpObjTypeInfo) of - Nothing -> Failure (pure $ NoSuchArgumentForRemote name) - Just (_iviType -> expectedType) -> - validateType permittedVariables val expectedType types - _ -> - Failure - (pure $ - InvalidType - (G.toGT $ G.NamedType name) - "not an input object type")) + G.TypeDefinitionInputObject inpObjTypeInfo -> + let objectTypeDefnsMap = + mapFromL G._ivdName $ (G._iotdValueDefinitions inpObjTypeInfo) + in + case HM.lookup name objectTypeDefnsMap of + Nothing -> throwError $ NoSuchArgumentForRemote name + Just (G._ivdType -> expectedType) -> + validateType 
permittedVariables val expectedType schemaDocument + _ -> do + throwError $ InvalidType (mkGraphQLType name) "not an input object type") + where + mkGraphQLType = + G.TypeNamed (G.Nullability False) -assertType :: G.GType -> G.GType -> Validation (NonEmpty ValidationError) () + mkScalarTy scalarType = do + eitherScalar <- runExceptT $ mkScalarTypeName scalarType + case eitherScalar of + Left _ -> throwError $ InvalidGraphQLName $ toSQLTxt scalarType + Right s -> pure s + +assertType + :: (MonadError ValidationError m) + => G.GType + -> G.GType + -> m () assertType actualType expectedType = do -- check if both are list types or both are named types (when - (isListType' actualType /= isListType' expectedType) - (Failure (pure $ ExpectedTypeButGot expectedType actualType))) + (G.isListType actualType /= G.isListType expectedType) + (throwError $ ExpectedTypeButGot expectedType actualType)) -- if list type then check over unwrapped type, else check base types - if isListType' actualType - then assertType (unwrapTy actualType) (unwrapTy expectedType) + if G.isListType actualType + then assertType (unwrapGraphQLType actualType) (unwrapGraphQLType expectedType) else (when - (getBaseTy actualType /= getBaseTy expectedType) - (Failure (pure $ ExpectedTypeButGot expectedType actualType))) + (G.getBaseType actualType /= G.getBaseType expectedType) + (throwError $ ExpectedTypeButGot expectedType actualType)) pure () -assertListType :: G.GType -> Validation (NonEmpty ValidationError) () +assertListType :: (MonadError ValidationError m) => G.GType -> m () assertListType actualType = - (when (not $ isListType' actualType) - (Failure (pure $ InvalidType actualType "is not a list type"))) + (when (not $ G.isListType actualType) + (throwError $ InvalidType actualType "is not a list type")) -- | Convert a field info to a named type, if possible. 
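`assertType` above compares the two types structurally: list-ness first, then one list layer unwrapped at a time via `unwrapGraphQLType`, and finally the base names; nullability never participates in the comparison. The recursion pattern, reduced to a standalone sketch (the `Ty` type here is illustrative, not the real `G.GType`):

    -- A stripped-down mirror of the assertType recursion.
    data Ty = Base String | List Ty deriving (Eq, Show)

    sameShape :: Ty -> Ty -> Bool
    sameShape (List a) (List b) = sameShape a b  -- unwrap both sides together
    sameShape (Base a) (Base b) = a == b         -- base names must agree
    sameShape _        _        = False          -- list vs. non-list mismatch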
-columnInfoToNamedType :: PGColumnInfo -> Validation (NonEmpty ValidationError) G.NamedType -columnInfoToNamedType pci = case pgiType pci of - PGColumnScalar scalarType -> pure $ mkScalarTy scalarType - _ -> Failure $ pure UnsupportedEnum +columnInfoToNamedType + :: (MonadError ValidationError m) + => PGColumnInfo + -> m G.Name +columnInfoToNamedType pci = + case pgiType pci of + PGColumnScalar scalarType -> do + eitherScalar <- runExceptT $ mkScalarTypeName scalarType + case eitherScalar of + Left _ -> throwError $ InvalidGraphQLName $ toSQLTxt scalarType + Right s -> pure s + _ -> throwError UnsupportedEnum diff --git a/server/src-lib/Hasura/RQL/DDL/RemoteSchema.hs b/server/src-lib/Hasura/RQL/DDL/RemoteSchema.hs index acfbf232867..8f1e4b0d6fe 100644 --- a/server/src-lib/Hasura/RQL/DDL/RemoteSchema.hs +++ b/server/src-lib/Hasura/RQL/DDL/RemoteSchema.hs @@ -7,32 +7,29 @@ module Hasura.RQL.DDL.RemoteSchema , fetchRemoteSchemas , addRemoteSchemaP1 , addRemoteSchemaP2Setup + , addRemoteSchemaP2 , runIntrospectRemoteSchema , addRemoteSchemaToCatalog ) where -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as Map -import qualified Data.HashSet as S -import qualified Data.Text as T -import qualified Database.PG.Query as Q - +import Control.Monad.Unique import Hasura.EncJSON -import Hasura.GraphQL.NormalForm +-- import Hasura.GraphQL.NormalForm import Hasura.GraphQL.RemoteServer -import Hasura.GraphQL.Schema.Merge +-- import Hasura.GraphQL.Schema.Merge import Hasura.Prelude import Hasura.RQL.DDL.Deps + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as S +import qualified Database.PG.Query as Q + import Hasura.RQL.Types import Hasura.Server.Version (HasVersion) import Hasura.SQL.Types -import qualified Data.Environment as Env -import qualified Hasura.GraphQL.Context as GC -import qualified Hasura.GraphQL.Resolve.Introspect as RI -import qualified Hasura.GraphQL.Schema as GS -import qualified Hasura.GraphQL.Validate as VQ -import qualified Hasura.GraphQL.Validate.Types as VT +import qualified Data.Environment as Env runAddRemoteSchema :: ( HasVersion @@ -40,6 +37,7 @@ runAddRemoteSchema , CacheRWM m , MonadTx m , MonadIO m + , MonadUnique m , HasHttpManager m ) => Env.Environment @@ -63,25 +61,16 @@ addRemoteSchemaP1 name = do <> name <<> " already exists" addRemoteSchemaP2Setup - :: (HasVersion, QErrM m, MonadIO m, HasHttpManager m) + :: (HasVersion, QErrM m, MonadIO m, MonadUnique m, HasHttpManager m) => Env.Environment - -> AddRemoteSchemaQuery - -> m RemoteSchemaCtx + -> AddRemoteSchemaQuery -> m RemoteSchemaCtx addRemoteSchemaP2Setup env (AddRemoteSchemaQuery name def _) = do httpMgr <- askHttpManager - rsi <- validateRemoteSchemaDef env name def - gCtx <- fetchRemoteSchema env httpMgr rsi - pure $ RemoteSchemaCtx name (convRemoteGCtx gCtx) rsi - where - convRemoteGCtx rmGCtx = - GC.emptyGCtx { GS._gTypes = GC._rgTypes rmGCtx - , GS._gQueryRoot = GC._rgQueryRoot rmGCtx - , GS._gMutRoot = GC._rgMutationRoot rmGCtx - , GS._gSubRoot = GC._rgSubscriptionRoot rmGCtx - } + rsi <- validateRemoteSchemaDef env def + fetchRemoteSchema env httpMgr name rsi addRemoteSchemaP2 - :: (HasVersion, MonadTx m, MonadIO m, HasHttpManager m) => Env.Environment -> AddRemoteSchemaQuery -> m () + :: (HasVersion, MonadTx m, MonadIO m, MonadUnique m, HasHttpManager m) => Env.Environment -> AddRemoteSchemaQuery -> m () addRemoteSchemaP2 env q = do void $ addRemoteSchemaP2Setup env q liftTx $ addRemoteSchemaToCatalog q @@ -103,9 +92,6 @@ 
removeRemoteSchemaP1 rsn = do let rmSchemas = scRemoteSchemas sc void $ onNothing (Map.lookup rsn rmSchemas) $ throw400 NotExists "no such remote schema" - case Map.lookup rsn rmSchemas of - Just _ -> return () - Nothing -> throw400 NotExists "no such remote schema" let depObjs = getDependentObjs sc remoteSchemaDepId when (depObjs /= []) $ reportDeps depObjs where @@ -149,35 +135,15 @@ fetchRemoteSchemas = ORDER BY name ASC |] () True where - fromRow (n, Q.AltJ def, comm) = AddRemoteSchemaQuery n def comm + fromRow (name, Q.AltJ def, comment) = + AddRemoteSchemaQuery name def comment runIntrospectRemoteSchema :: (CacheRM m, QErrM m) => RemoteSchemaNameQuery -> m EncJSON runIntrospectRemoteSchema (RemoteSchemaNameQuery rsName) = do sc <- askSchemaCache - rGCtx <- - case Map.lookup rsName (scRemoteSchemas sc) of - Nothing -> - throw400 NotExists $ - "remote schema: " <> remoteSchemaNameToTxt rsName <> " not found" - Just rCtx -> mergeGCtx (rscGCtx rCtx) GC.emptyGCtx - -- merge with emptyGCtx to get default query fields - queryParts <- flip runReaderT rGCtx $ VQ.getQueryParts introspectionQuery - (rootSelSet, _) <- flip runReaderT rGCtx $ VT.runReusabilityT $ VQ.validateGQ queryParts - schemaField <- - case rootSelSet of - VQ.RQuery selSet -> getSchemaField $ toList $ unAliasedFields $ - unObjectSelectionSet selSet - _ -> throw500 "expected query for introspection" - (introRes, _) <- flip runReaderT rGCtx $ VT.runReusabilityT $ RI.schemaR schemaField - pure $ wrapInSpecKeys introRes - where - wrapInSpecKeys introObj = - encJFromAssocList - [ ( T.pack "data" - , encJFromAssocList [(T.pack "__schema", encJFromJValue introObj)]) - ] - getSchemaField = \case - [] -> throw500 "found empty when looking for __schema field" - [f] -> pure f - _ -> throw500 "expected __schema field, found many fields" + (RemoteSchemaCtx _ _ _ introspectionByteString _) <- + onNothing (Map.lookup rsName (scRemoteSchemas sc)) $ + throw400 NotExists $ + "remote schema: " <> rsName <<> " not found" + pure $ encJFromLBS introspectionByteString diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Cache.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Cache.hs index 3eed104a71d..7f6574fc97a 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Cache.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Cache.hs @@ -32,18 +32,11 @@ import Control.Monad.Unique import Data.Aeson import Data.List (nub) -import qualified Hasura.GraphQL.Context as GC -import qualified Hasura.GraphQL.RelaySchema as Relay -import qualified Hasura.GraphQL.Schema as GS -import qualified Hasura.GraphQL.Validate.Types as VT import qualified Hasura.Incremental as Inc -import qualified Language.GraphQL.Draft.Syntax as G import Hasura.Db -import Hasura.GraphQL.RemoteServer -import Hasura.GraphQL.Schema.CustomTypes -import Hasura.GraphQL.Schema.Merge -import Hasura.GraphQL.Utils (showNames) +import Hasura.GraphQL.Execute.Types +import Hasura.GraphQL.Schema (buildGQLContext) import Hasura.RQL.DDL.Action import Hasura.RQL.DDL.ComputedField import Hasura.RQL.DDL.CustomTypes @@ -63,45 +56,8 @@ import Hasura.RQL.DDL.Utils (clearHdbViews) import Hasura.RQL.Types import Hasura.RQL.Types.Catalog import Hasura.Server.Version (HasVersion) -import Hasura.Session import Hasura.SQL.Types -mergeCustomTypes - :: MonadError QErr f - => GS.GCtxMap -> GS.GCtx -> (NonObjectTypeMap, AnnotatedObjects) - -> f (GS.GCtxMap, GS.GCtx) -mergeCustomTypes gCtxMap remoteSchemaCtx customTypesState = do - let adminCustomTypes = uncurry buildCustomTypesSchema customTypesState adminRoleName - let commonTypes = 
M.intersectionWith (,) existingTypes adminCustomTypes - conflictingCustomTypes = - map (G.unNamedType . fst) $ M.toList $ - flip M.filter commonTypes $ \case - -- only scalars can be common - (VT.TIScalar _, VT.TIScalar _) -> False - (_, _) -> True - unless (null conflictingCustomTypes) $ - throw400 InvalidCustomTypes $ - "following custom types conflict with the " <> - "autogenerated hasura types or from remote schemas: " - <> showNames conflictingCustomTypes - - let gCtxMapWithCustomTypes = flip M.mapWithKey gCtxMap $ \roleName schemaCtx -> - flip fmap schemaCtx $ \gCtx -> - let customTypes = uncurry buildCustomTypesSchema customTypesState roleName - in addCustomTypes gCtx customTypes - - -- populate the gctx of each role with the custom types - return ( gCtxMapWithCustomTypes - , addCustomTypes remoteSchemaCtx adminCustomTypes - ) - where - addCustomTypes gCtx customTypes = - gCtx { GS._gTypes = GS._gTypes gCtx <> customTypes} - existingTypes = - case M.lookup adminRoleName gCtxMap of - Just schemaCtx -> GS._gTypes $ GC._rctxDefault schemaCtx - Nothing -> GS._gTypes remoteSchemaCtx - buildRebuildableSchemaCache :: (HasVersion, MonadIO m, MonadUnique m, MonadTx m, HasHttpManager m, HasSQLGenCtx m) => Env.Environment @@ -118,7 +74,7 @@ newtype CacheRWT m a -- Control.Monad.Trans.Writer.CPS) are leaky, and we don’t have that yet. = CacheRWT (StateT (RebuildableSchemaCache m, CacheInvalidations) m a) deriving - ( Functor, Applicative, Monad, MonadIO, MonadReader r, MonadError e, MonadTx + ( Functor, Applicative, Monad, MonadIO, MonadUnique, MonadReader r, MonadError e, MonadTx , UserInfoM, HasHttpManager, HasSQLGenCtx, HasSystemDefined ) runCacheRWT @@ -157,7 +113,8 @@ buildSchemaCacheRule -- Note: by supplying BuildReason via MonadReader, it does not participate in caching, which is -- what we want! :: ( HasVersion, ArrowChoice arr, Inc.ArrowDistribute arr, Inc.ArrowCache m arr - , MonadIO m, MonadTx m, MonadReader BuildReason m, HasHttpManager m, HasSQLGenCtx m ) + , MonadIO m, MonadUnique m, MonadTx m + , MonadReader BuildReason m, HasHttpManager m, HasSQLGenCtx m ) => Env.Environment -> (CatalogMetadata, InvalidationKeys) `arr` SchemaCache buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do @@ -173,43 +130,57 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do resolveDependencies -< (outputs, unresolvedDependencies) -- Step 3: Build the GraphQL schema. 
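Step 3 now invokes the same `buildGQLContext` builder twice, once with `QueryHasura` and once with `QueryRelay`, and the inconsistent metadata objects reported by both runs are unioned into `scInconsistentObjs` at the end of this function. Schematically (the monadic types here are stand-ins, not the engine's real arrow-based ones):

    data QueryType = QueryHasura | QueryRelay

    -- Run one builder per flavour of the schema and keep both results,
    -- merging whatever inconsistencies each run reported.
    buildBothContexts
      :: Monad m
      => (QueryType -> m (ctx, [inconsistency]))
      -> m ((ctx, ctx), [inconsistency])
    buildBothContexts build = do
      (hasuraCtx, hasuraErrs) <- build QueryHasura
      (relayCtx,  relayErrs)  <- build QueryRelay
      pure ((hasuraCtx, relayCtx), hasuraErrs <> relayErrs)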
- ((remoteSchemaMap, gqlSchema, remoteGQLSchema), gqlSchemaInconsistentObjects) - <- runWriterA buildGQLSchema -< ( _boTables resolvedOutputs - , _boFunctions resolvedOutputs - , _boRemoteSchemas resolvedOutputs - , _boCustomTypes resolvedOutputs - , _boActions resolvedOutputs - , _boRemoteRelationshipTypes resolvedOutputs - ) + (gqlContext, gqlSchemaInconsistentObjects) <- runWriterA buildGQLContext -< + ( QueryHasura + , (_boTables resolvedOutputs) + , (_boFunctions resolvedOutputs) + , (_boRemoteSchemas resolvedOutputs) + , (_boActions resolvedOutputs) + , (_actNonObjects $ _boCustomTypes resolvedOutputs) + ) -- Step 4: Build the relay GraphQL schema - relayGQLSchema <- bindA -< Relay.mkRelayGCtxMap (_boTables resolvedOutputs) (_boFunctions resolvedOutputs) + (relayContext, relaySchemaInconsistentObjects) <- runWriterA buildGQLContext -< + ( QueryRelay + , (_boTables resolvedOutputs) + , (_boFunctions resolvedOutputs) + , (_boRemoteSchemas resolvedOutputs) + , (_boActions resolvedOutputs) + , (_actNonObjects $ _boCustomTypes resolvedOutputs) + ) returnA -< SchemaCache { scTables = _boTables resolvedOutputs , scActions = _boActions resolvedOutputs , scFunctions = _boFunctions resolvedOutputs - , scRemoteSchemas = remoteSchemaMap + -- TODO this is not the right value: we should track what part of the schema + -- we can stitch without consistencies, I think. + , scRemoteSchemas = fmap fst (_boRemoteSchemas resolvedOutputs) -- remoteSchemaMap , scAllowlist = _boAllowlist resolvedOutputs - , scCustomTypes = _boCustomTypes resolvedOutputs - , scGCtxMap = gqlSchema - , scDefaultRemoteGCtx = remoteGQLSchema - , scRelayGCtxMap = relayGQLSchema + -- , scCustomTypes = _boCustomTypes resolvedOutputs + , scGQLContext = fst gqlContext + , scUnauthenticatedGQLContext = snd gqlContext + , scRelayContext = fst relayContext + , scUnauthenticatedRelayContext = snd relayContext + -- , scGCtxMap = gqlSchema + -- , scDefaultRemoteGCtx = remoteGQLSchema , scDepMap = resolvedDependencies - , scInconsistentObjs = - inconsistentObjects <> dependencyInconsistentObjects <> toList gqlSchemaInconsistentObjects , scCronTriggers = _boCronTriggers resolvedOutputs + , scInconsistentObjs = + inconsistentObjects + <> dependencyInconsistentObjects + <> toList gqlSchemaInconsistentObjects + <> toList relaySchemaInconsistentObjects } where buildAndCollectInfo :: ( ArrowChoice arr, Inc.ArrowDistribute arr, Inc.ArrowCache m arr - , ArrowWriter (Seq CollectedInfo) arr, MonadIO m, MonadTx m, MonadReader BuildReason m + , ArrowWriter (Seq CollectedInfo) arr, MonadIO m, MonadUnique m, MonadTx m, MonadReader BuildReason m , HasHttpManager m, HasSQLGenCtx m ) => (CatalogMetadata, Inc.Dependency InvalidationKeys) `arr` BuildOutputs buildAndCollectInfo = proc (catalogMetadata, invalidationKeys) -> do let CatalogMetadata tables relationships permissions eventTriggers remoteSchemas functions allowlistDefs - computedFields catalogCustomTypes actions remoteRelationships cronTriggers = catalogMetadata @@ -224,18 +195,15 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do let relationshipsByTable = M.groupOn _crTable relationships computedFieldsByTable = M.groupOn (_afcTable . 
_cccComputedField) computedFields remoteRelationshipsByTable = M.groupOn rtrTable remoteRelationships - rawTableCoreInfos <- (tableRawInfos >- returnA) + tableCoreInfos <- (tableRawInfos >- returnA) >-> (\info -> (info, relationshipsByTable) >- alignExtraTableInfo mkRelationshipMetadataObject) >-> (\info -> (info, computedFieldsByTable) >- alignExtraTableInfo mkComputedFieldMetadataObject) >-> (\info -> (info, remoteRelationshipsByTable) >- alignExtraTableInfo mkRemoteRelationshipMetadataObject) >-> (| Inc.keyed (\_ (((tableRawInfo, tableRelationships), tableComputedFields), tableRemoteRelationships) -> do let columns = _tciFieldInfoMap tableRawInfo - (allFields, typeMap) <- addNonColumnFields -< + allFields <- addNonColumnFields -< (tableRawInfos, columns, M.map fst remoteSchemaMap, tableRelationships, tableComputedFields, tableRemoteRelationships) - returnA -< (tableRawInfo { _tciFieldInfoMap = allFields }, typeMap)) |) - - let tableCoreInfos = M.map fst rawTableCoreInfos - remoteRelationshipTypes = mconcat $ map snd $ M.elems rawTableCoreInfos + returnA -< (tableRawInfo { _tciFieldInfoMap = allFields })) |) -- permissions and event triggers tableCoreInfosDep <- Inc.newDependency -< tableCoreInfos @@ -284,16 +252,18 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do (bindErrorA -< resolveCustomTypes tableCache customTypes pgScalars) |) (MetadataObject MOCustomTypes $ toJSON customTypes) - -- actions - actionCache <- case maybeResolvedCustomTypes of - Just resolvedCustomTypes -> buildActions -< ((resolvedCustomTypes, pgScalars), actions) + -- -- actions + (actionCache, annotatedCustomTypes) <- case maybeResolvedCustomTypes of + Just resolvedCustomTypes -> do + actionCache' <- buildActions -< ((resolvedCustomTypes, pgScalars), actions) + returnA -< (actionCache', resolvedCustomTypes) -- If the custom types themselves are inconsistent, we can’t really do -- anything with actions, so just mark them all inconsistent. Nothing -> do recordInconsistencies -< ( map mkActionMetadataObject actions , "custom types are inconsistent" ) - returnA -< M.empty + returnA -< (M.empty, emptyAnnotatedCustomTypes) cronTriggersMap <- buildCronTriggers -< ((),cronTriggers) @@ -303,10 +273,7 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do , _boFunctions = functionCache , _boRemoteSchemas = remoteSchemaMap , _boAllowlist = allowList - -- If 'maybeResolvedCustomTypes' is 'Nothing', then custom types are inconsinstent. - -- In such case, use empty resolved value of custom types. 
- , _boCustomTypes = fromMaybe (NonObjectTypeMap mempty, mempty) maybeResolvedCustomTypes - , _boRemoteRelationshipTypes = remoteRelationshipTypes + , _boCustomTypes = annotatedCustomTypes , _boCronTriggers = cronTriggersMap } @@ -402,7 +369,7 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do buildActions :: ( ArrowChoice arr, Inc.ArrowDistribute arr, Inc.ArrowCache m arr , ArrowWriter (Seq CollectedInfo) arr) - => ( ((NonObjectTypeMap, AnnotatedObjects), HashSet PGScalarType) + => ( (AnnotatedCustomTypes, HashSet PGScalarType) , [ActionMetadata] ) `arr` HashMap ActionName ActionInfo buildActions = buildInfoMap _amName mkActionMetadataObject buildAction @@ -412,17 +379,17 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do addActionContext e = "in action " <> name <<> "; " <> e (| withRecordInconsistency ( (| modifyErrA (do - (resolvedDef, outObject, reusedPgScalars) <- liftEitherA <<< bindA -< - runExceptT $ resolveAction env resolvedCustomTypes pgScalars def + (resolvedDef, outObject) <- liftEitherA <<< bindA -< + runExceptT $ resolveAction env resolvedCustomTypes def pgScalars let permissionInfos = map (ActionPermissionInfo . _apmRole) actionPermissions permissionMap = mapFromL _apiRole permissionInfos - returnA -< ActionInfo name outObject resolvedDef permissionMap reusedPgScalars comment) + returnA -< ActionInfo name outObject resolvedDef permissionMap comment) |) addActionContext) |) (mkActionMetadataObject action) buildRemoteSchemas :: ( ArrowChoice arr, Inc.ArrowDistribute arr, ArrowWriter (Seq CollectedInfo) arr - , Inc.ArrowCache m arr , MonadIO m, HasHttpManager m ) + , Inc.ArrowCache m arr , MonadIO m, MonadUnique m, HasHttpManager m ) => ( Inc.Dependency (HashMap RemoteSchemaName Inc.InvalidationKey) , [AddRemoteSchemaQuery] ) `arr` HashMap RemoteSchemaName (RemoteSchemaCtx, MetadataObject) @@ -437,42 +404,6 @@ buildSchemaCacheRule env = proc (catalogMetadata, invalidationKeys) -> do runExceptT $ addRemoteSchemaP2Setup env remoteSchema) |) (mkRemoteSchemaMetadataObject remoteSchema) - -- Builds the GraphQL schema and merges in remote schemas. This function is kind of gross, as - -- it’s possible for the remote schema merging to fail, at which point we have to mark them - -- inconsistent. This means we have to accumulate the consistent remote schemas as we go, in - -- addition to the built GraphQL context. 
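
The removed 'buildGQLSchema' below is what the comment above describes: a fold over the remote schemas that merges each one into the accumulated GraphQL context, keeps only the schemas that merged cleanly, and records an inconsistency for the rest. Its general shape, reduced to a minimal sketch with placeholder types (the real code also threads metadata objects and arrow-based writer effects through):

import qualified Data.Map.Strict as M

-- 'merge' may fail, e.g. on root-field conflicts; failures are collected
-- instead of aborting the whole schema build.
mergeRemoteSchemas
  :: (schema -> schema -> Either String schema)
  -> schema                  -- base schema built from Postgres
  -> [(String, schema)]      -- remote schemas by name
  -> (M.Map String schema, schema, [String])
mergeRemoteSchemas merge base = foldl step (M.empty, base, [])
  where
    step (consistent, acc, errs) (name, s) =
      case merge acc s of
        Right acc' -> (M.insert name s consistent, acc', errs)
        Left err   -> (consistent, acc, errs ++ [name <> ": " <> err])
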
- buildGQLSchema - :: ( ArrowChoice arr, ArrowWriter (Seq InconsistentMetadata) arr, ArrowKleisli m arr - , MonadError QErr m ) - => ( TableCache - , FunctionCache - , HashMap RemoteSchemaName (RemoteSchemaCtx, MetadataObject) - , (NonObjectTypeMap, AnnotatedObjects) - , ActionCache - , VT.TypeMap - ) `arr` (RemoteSchemaMap, GS.GCtxMap, GS.GCtx) - buildGQLSchema = proc (tableCache, functionCache, remoteSchemas, customTypes, actionCache, remoteRelationshipTypes) -> do - baseGQLSchema <- bindA -< GS.mkGCtxMap tableCache functionCache actionCache - (| foldlA' (\(remoteSchemaMap, gqlSchemas, remoteGQLSchemas) - (remoteSchemaName, (remoteSchema, metadataObject)) -> - (| withRecordInconsistency (do - let gqlSchema = rscGCtx remoteSchema - mergedGQLSchemas <- bindErrorA -< mergeRemoteSchema gqlSchemas gqlSchema - mergedRemoteGQLSchemas <- bindErrorA -< mergeGCtx remoteGQLSchemas gqlSchema - let mergedRemoteSchemaMap = M.insert remoteSchemaName remoteSchema remoteSchemaMap - returnA -< (mergedRemoteSchemaMap, mergedGQLSchemas, mergedRemoteGQLSchemas)) - |) metadataObject - >-> (| onNothingA ((remoteSchemaMap, gqlSchemas, remoteGQLSchemas) >- returnA) |)) - |) (M.empty, baseGQLSchema, GC.emptyGCtx) (M.toList remoteSchemas) - -- merge the custom types into schema - >-> (\(remoteSchemaMap, gqlSchema', defGqlCtx') -> do - (gqlSchema, defGqlCtx) <- bindA -< mergeCustomTypes gqlSchema' defGqlCtx' customTypes - returnA -< ( remoteSchemaMap - , M.map (mergeRemoteTypesWithGCtx remoteRelationshipTypes <$>) gqlSchema - , mergeRemoteTypesWithGCtx remoteRelationshipTypes defGqlCtx - ) - ) - -- | @'withMetadataCheck' cascade action@ runs @action@ and checks if the schema changed as a -- result. If it did, it checks to ensure the changes do not violate any integrity constraints, and -- if not, incorporates them into the schema cache. diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Common.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Common.hs index 8c97055eb74..abdc2775c6d 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Common.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Common.hs @@ -7,15 +7,14 @@ module Hasura.RQL.DDL.Schema.Cache.Common where import Hasura.Prelude -import qualified Data.HashMap.Strict.Extended as M -import qualified Data.HashSet as HS -import qualified Data.Sequence as Seq +import qualified Data.HashMap.Strict.Extended as M +import qualified Data.HashSet as HS +import qualified Data.Sequence as Seq import Control.Arrow.Extended import Control.Lens -import qualified Hasura.Incremental as Inc -import qualified Hasura.GraphQL.Validate.Types as VT +import qualified Hasura.Incremental as Inc import Hasura.RQL.Types import Hasura.RQL.Types.Catalog @@ -53,18 +52,17 @@ data BuildInputs -- 'MonadWriter' side channel. data BuildOutputs = BuildOutputs - { _boTables :: !TableCache - , _boActions :: !ActionCache - , _boFunctions :: !FunctionCache - , _boRemoteSchemas :: !(HashMap RemoteSchemaName (RemoteSchemaCtx, MetadataObject)) + { _boTables :: !TableCache + , _boActions :: !ActionCache + , _boFunctions :: !FunctionCache + , _boRemoteSchemas :: !(HashMap RemoteSchemaName (RemoteSchemaCtx, MetadataObject)) -- ^ We preserve the 'MetadataObject' from the original catalog metadata in the output so we can -- reuse it later if we need to mark the remote schema inconsistent during GraphQL schema -- generation (because of field conflicts). 
- , _boAllowlist :: !(HS.HashSet GQLQuery) - , _boCustomTypes :: !(NonObjectTypeMap, AnnotatedObjects) - , _boRemoteRelationshipTypes :: !VT.TypeMap - , _boCronTriggers :: !(M.HashMap TriggerName CronTriggerInfo) - } deriving (Show, Eq) + , _boAllowlist :: !(HS.HashSet GQLQuery) + , _boCustomTypes :: !AnnotatedCustomTypes + , _boCronTriggers :: !(M.HashMap TriggerName CronTriggerInfo) + } $(makeLenses ''BuildOutputs) data RebuildableSchemaCache m diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Dependencies.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Dependencies.hs index d391e785d77..40beabb7afd 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Dependencies.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Dependencies.hs @@ -132,12 +132,12 @@ deleteMetadataObject objectId = case objectId of MORemoteSchema name -> boRemoteSchemas %~ M.delete name MOCronTrigger name -> boCronTriggers %~ M.delete name MOTableObj tableName tableObjectId -> boTables.ix tableName %~ case tableObjectId of - MTORel name _ -> tiCoreInfo.tciFieldInfoMap %~ M.delete (fromRel name) - MTOComputedField name -> tiCoreInfo.tciFieldInfoMap %~ M.delete (fromComputedField name) + MTORel name _ -> tiCoreInfo.tciFieldInfoMap %~ M.delete (fromRel name) + MTOComputedField name -> tiCoreInfo.tciFieldInfoMap %~ M.delete (fromComputedField name) MTORemoteRelationship name -> tiCoreInfo.tciFieldInfoMap %~ M.delete (fromRemoteRelationship name) MTOPerm roleName permType -> withPermType permType \accessor -> tiRolePermInfoMap.ix roleName.permAccToLens accessor .~ Nothing MTOTrigger name -> tiEventTriggerInfoMap %~ M.delete name - MOCustomTypes -> boCustomTypes %~ const (NonObjectTypeMap mempty, mempty) + MOCustomTypes -> boCustomTypes %~ const emptyAnnotatedCustomTypes MOAction name -> boActions %~ M.delete name MOActionPermission name role -> boActions.ix name.aiPermissions %~ M.delete role diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Fields.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Fields.hs index bd6cad0ba00..7ec09e15b5a 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Fields.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Cache/Fields.hs @@ -12,7 +12,6 @@ import qualified Data.HashMap.Strict.Extended as M import qualified Data.HashSet as HS import qualified Data.Sequence as Seq import qualified Language.GraphQL.Draft.Syntax as G -import qualified Hasura.GraphQL.Validate.Types as VT import Control.Arrow.Extended import Data.Aeson @@ -37,7 +36,7 @@ addNonColumnFields , [CatalogRelation] , [CatalogComputedField] , [RemoteRelationship] - ) `arr` (FieldInfoMap FieldInfo, VT.TypeMap) + ) `arr` FieldInfoMap FieldInfo addNonColumnFields = proc (rawTableInfo, columns, remoteSchemaMap, relationships, computedFields, remoteRelationships) -> do relationshipInfos <- buildInfoMapPreservingMetadata _crRelName mkRelationshipMetadataObject buildRelationship @@ -55,13 +54,11 @@ addNonColumnFields = proc (rawTableInfo, columns, remoteSchemaMap, relationships let mapKey f = M.fromList . map (first f) . M.toList relationshipFields = mapKey fromRel relationshipInfos computedFieldFields = mapKey fromComputedField computedFieldInfos - remoteRelationshipFields = mapKey fromRemoteRelationship $ - M.map (\((rf, _), mo) -> (rf, mo)) rawRemoteRelationshipInfos - typeMap = mconcat $ map (snd . 
fst) $ M.elems rawRemoteRelationshipInfos + remoteRelationshipFields = mapKey fromRemoteRelationship rawRemoteRelationshipInfos -- First, check for conflicts between non-column fields, since we can raise a better error -- message in terms of the two metadata objects that define them. - fieldInfoMap <- (align relationshipFields computedFieldFields >- returnA) + (align relationshipFields computedFieldFields >- returnA) >-> (| Inc.keyed (\fieldName fields -> (fieldName, fields) >- noFieldConflicts FIRelationship FIComputedField) |) -- Second, align with remote relationship fields >-> (\fields -> align (M.catMaybes fields) remoteRelationshipFields >- returnA) @@ -73,8 +70,6 @@ addNonColumnFields = proc (rawTableInfo, columns, remoteSchemaMap, relationships -- Finally, check for conflicts with the columns themselves. >-> (\fields -> align columns (M.catMaybes fields) >- returnA) >-> (| Inc.keyed (\_ fields -> fields >- noColumnConflicts) |) - - returnA -< (fieldInfoMap, typeMap) where noFieldConflicts this that = proc (fieldName, fields) -> case fields of This (thisField, metadata) -> returnA -< Just (this thisField, metadata) @@ -168,7 +163,7 @@ mkRemoteRelationshipMetadataObject rr = buildRemoteRelationship :: ( ArrowChoice arr, ArrowWriter (Seq CollectedInfo) arr , ArrowKleisli m arr, MonadError QErr m ) - => (([PGColumnInfo], RemoteSchemaMap), RemoteRelationship) `arr` Maybe (RemoteFieldInfo, VT.TypeMap) + => (([PGColumnInfo], RemoteSchemaMap), RemoteRelationship) `arr` Maybe RemoteFieldInfo buildRemoteRelationship = proc ((pgColumns, remoteSchemaMap), remoteRelationship) -> do let relationshipName = rtrName remoteRelationship tableName = rtrTable remoteRelationship @@ -177,8 +172,8 @@ buildRemoteRelationship = proc ((pgColumns, remoteSchemaMap), remoteRelationship addRemoteRelationshipContext e = "in remote relationship" <> relationshipName <<> ": " <> e (| withRecordInconsistency ( (| modifyErrA (do - (remoteField, typeMap, dependencies) <- bindErrorA -< resolveRemoteRelationship remoteRelationship pgColumns remoteSchemaMap + (remoteField, dependencies) <- bindErrorA -< resolveRemoteRelationship remoteRelationship pgColumns remoteSchemaMap recordDependencies -< (metadataObject, schemaObj, dependencies) - returnA -< (remoteField, typeMap)) + returnA -< remoteField) |)(addTableContext tableName . 
addRemoteRelationshipContext)) |) metadataObject diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Enum.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Enum.hs index fd29ff27ed3..d7012d092d8 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Enum.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Enum.hs @@ -53,7 +53,7 @@ resolveEnumReferences enumTables = resolveEnumReference foreignKey = do [(localColumn, foreignColumn)] <- pure $ M.toList (_fkColumnMapping foreignKey) (primaryKey, enumValues) <- M.lookup (_fkForeignTable foreignKey) enumTables - guard (_pkColumns primaryKey == NESeq.NESeq (foreignColumn, Seq.Empty)) + guard (_pkColumns primaryKey == foreignColumn NESeq.:<|| Seq.Empty) pure (localColumn, EnumReference (_fkForeignTable foreignKey) enumValues) data EnumTableIntegrityError @@ -79,14 +79,12 @@ fetchAndValidateEnumValues tableName maybePrimaryKey columnInfos = fetchAndValidate = do primaryKeyColumn <- tolerate validatePrimaryKey maybeCommentColumn <- validateColumns primaryKeyColumn - enumValues <- maybe (refute mempty) (fetchEnumValues maybeCommentColumn) primaryKeyColumn - validateEnumValues enumValues - pure enumValues + maybe (refute mempty) (fetchEnumValues maybeCommentColumn) primaryKeyColumn where validatePrimaryKey = case maybePrimaryKey of Nothing -> refute [EnumTableMissingPrimaryKey] Just primaryKey -> case _pkColumns primaryKey of - NESeq.NESeq (column, Seq.Empty) -> case prciType column of + column NESeq.:<|| Seq.Empty -> case prciType column of PGText -> pure column _ -> refute [EnumTableNonTextualPrimaryKey column] columns -> refute [EnumTableMultiColumnPrimaryKey $ map prciName (toList columns)] @@ -106,22 +104,23 @@ fetchAndValidateEnumValues tableName maybePrimaryKey columnInfos = query = Q.fromBuilder $ toSQL S.mkSelect { S.selFrom = Just $ S.mkSimpleFromExp tableName , S.selExtr = [S.mkExtr (prciName primaryKeyColumn), commentExtr] } - fmap mkEnumValues . liftTx $ Q.withQE defaultTxErrorHandler query () True - - mkEnumValues rows = M.fromList . flip map rows $ \(key, comment) -> - (EnumValue key, EnumValueInfo comment) - - validateEnumValues enumValues = do - let enumValueNames = map (G.Name . getEnumValue) (M.keys enumValues) - when (null enumValueNames) $ - refute [EnumTableNoEnumValues] - let badNames = map G.unName $ filter (not . 
isValidEnumName) enumValueNames - for_ (NE.nonEmpty badNames) $ \someBadNames -> - refute [EnumTableInvalidEnumValueNames someBadNames] + rawEnumValues <- liftTx $ Q.withQE defaultTxErrorHandler query () True + when (null rawEnumValues) $ dispute [EnumTableNoEnumValues] + let enumValues = flip map rawEnumValues $ + \(enumValueText, comment) -> + case mkValidEnumValueName enumValueText of + Nothing -> Left enumValueText + Just enumValue -> Right (EnumValue enumValue, EnumValueInfo comment) + badNames = lefts enumValues + validEnums = rights enumValues + case NE.nonEmpty badNames of + Just someBadNames -> refute [EnumTableInvalidEnumValueNames someBadNames] + Nothing -> pure $ M.fromList validEnums -- https://graphql.github.io/graphql-spec/June2018/#EnumValue - isValidEnumName name = - G.isValidName name && name `notElem` ["true", "false", "null"] + mkValidEnumValueName name = + if name `elem` ["true", "false", "null"] then Nothing + else G.mkName name showErrors :: [EnumTableIntegrityError] -> T.Text showErrors allErrors = diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Function.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Function.hs index 9c5660f2dcb..1339a932065 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Function.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Function.hs @@ -5,19 +5,18 @@ Description: Create/delete SQL functions to/from Hasura metadata. module Hasura.RQL.DDL.Schema.Function where import Hasura.EncJSON -import Hasura.GraphQL.Utils (showNames) import Hasura.Incremental (Cacheable) import Hasura.Prelude import Hasura.RQL.Types -import Hasura.Server.Utils (makeReasonMessage) +import Hasura.Server.Utils (englishList, makeReasonMessage) import Hasura.SQL.Types +import Control.Lens import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH import Language.Haskell.TH.Syntax (Lift) -import qualified Hasura.GraphQL.Schema as GS import qualified Language.GraphQL.Draft.Syntax as G import qualified Control.Monad.Validate as MV @@ -62,12 +61,13 @@ mkFunctionArgs defArgsNo tys argNames = validateFuncArgs :: MonadError QErr m => [FunctionArg] -> m () validateFuncArgs args = - unless (null invalidArgs) $ throw400 NotSupported $ - "arguments: " <> showNames invalidArgs - <> " are not in compliance with GraphQL spec" + for_ (nonEmpty invalidArgs) \someInvalidArgs -> + throw400 NotSupported $ + "arguments: " <> englishList "and" someInvalidArgs + <> " are not in compliance with GraphQL spec" where funcArgsText = mapMaybe (fmap getFuncArgNameTxt . faName) args - invalidArgs = filter (not . G.isValidName) $ map G.Name funcArgsText + invalidArgs = filter (not . isJust . G.mkName) funcArgsText data FunctionIntegrityError = FunctionNameNotGQLCompliant @@ -101,7 +101,8 @@ mkFunctionInfo qf systemDefined config rawFuncInfo = throwValidateError = MV.dispute . pure validateFunction = do - unless (G.isValidName $ GS.qualObjectToName qf) $ throwValidateError FunctionNameNotGQLCompliant + unless (has _Right $ qualifiedObjectToName qf) $ + throwValidateError FunctionNameNotGQLCompliant when hasVariadic $ throwValidateError FunctionVariadic when (retTyTyp /= PGKindComposite) $ throwValidateError FunctionReturnNotCompositeType unless retSet $ throwValidateError FunctionReturnNotSetof @@ -121,7 +122,7 @@ mkFunctionInfo qf systemDefined config rawFuncInfo = validateFunctionArgNames = do let argNames = mapMaybe faName functionArgs - invalidArgs = filter (not . G.isValidName . G.Name . getFuncArgNameTxt) argNames + invalidArgs = filter (not . isJust . G.mkName . 
getFuncArgNameTxt) argNames when (not $ null invalidArgs) $ throwValidateError $ FunctionInvalidArgumentNames invalidArgs diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Rename.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Rename.hs index 0c510a34ed6..5574ad1d6a8 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Rename.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Rename.hs @@ -371,34 +371,27 @@ updateColInRemoteRelationship remoteRelationshipName renameCol = do newColFieldName = FieldName $ newColPGTxt modifiedHasuraFlds = Set.insert newColFieldName $ Set.delete oldColFieldName hasuraFlds fieldCalls = unRemoteFields remoteFields - oldColName = G.Name $ oldColPGTxt - newColName = G.Name $ newColPGTxt - modifiedFieldCalls = NE.map (\(FieldCall name args) -> + oldColName <- parseGraphQLName oldColPGTxt + newColName <- parseGraphQLName newColPGTxt + let modifiedFieldCalls = NE.map (\(FieldCall name args) -> let remoteArgs = getRemoteArguments args in FieldCall name $ RemoteArguments $ - map (\(G.ObjectFieldG key val) -> - G.ObjectFieldG key $ replaceVariableName oldColName newColName val - ) $ remoteArgs + fmap (replaceVariableName oldColName newColName) remoteArgs ) $ fieldCalls liftTx $ RR.updateRemoteRelInCatalog (RemoteRelationship remoteRelationshipName qt modifiedHasuraFlds remoteSchemaName (RemoteFields modifiedFieldCalls)) where - replaceVariableName :: G.Name -> G.Name -> G.Value -> G.Value + parseGraphQLName txt = maybe (throw400 ParseFailed $ errMsg) pure $ G.mkName txt + where + errMsg = txt <> " is not a valid GraphQL name" + + replaceVariableName :: G.Name -> G.Name -> G.Value G.Name -> G.Value G.Name replaceVariableName oldColName newColName = \case - G.VVariable (G.Variable oldColName') -> - G.VVariable $ - if oldColName == oldColName' - then (G.Variable newColName) - else (G.Variable oldColName') - G.VList (G.unListValue -> values) -> G.VList $ G.ListValueG $ map (replaceVariableName oldColName newColName) values - G.VObject (G.unObjectValue -> values) -> - G.VObject $ G.ObjectValueG $ - map (\(G.ObjectFieldG key val) -> G.ObjectFieldG key $ replaceVariableName oldColName newColName val) values - G.VInt i -> G.VInt i - G.VFloat f -> G.VFloat f - G.VBoolean b -> G.VBoolean b - G.VNull -> G.VNull - G.VString s -> G.VString s - G.VEnum e -> G.VEnum e + G.VVariable oldColName' -> + G.VVariable $ bool oldColName newColName $ oldColName == oldColName' + G.VList values -> G.VList $ map (replaceVariableName oldColName newColName) values + G.VObject values -> + G.VObject $ fmap (replaceVariableName oldColName newColName) values + v -> v -- rename columns in relationship definitions updateColInEventTriggerDef diff --git a/server/src-lib/Hasura/RQL/DDL/Schema/Table.hs b/server/src-lib/Hasura/RQL/DDL/Schema/Table.hs index 437e1ea62f6..1e747ae8a1b 100644 --- a/server/src-lib/Hasura/RQL/DDL/Schema/Table.hs +++ b/server/src-lib/Hasura/RQL/DDL/Schema/Table.hs @@ -22,23 +22,13 @@ module Hasura.RQL.DDL.Schema.Table , processTableChanges ) where -import Hasura.EncJSON import Hasura.Prelude -import Hasura.RQL.DDL.Deps -import Hasura.RQL.DDL.Schema.Cache.Common -import Hasura.RQL.DDL.Schema.Catalog -import Hasura.RQL.DDL.Schema.Diff -import Hasura.RQL.DDL.Schema.Enum -import Hasura.RQL.DDL.Schema.Rename -import Hasura.RQL.Types -import Hasura.RQL.Types.Catalog -import Hasura.Server.Utils -import Hasura.SQL.Types +import qualified Data.HashMap.Strict.Extended as Map +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.HashSet as S +import qualified Data.Text as T import 
qualified Database.PG.Query as Q -import qualified Hasura.GraphQL.Schema as GS -import qualified Hasura.GraphQL.Context as GC -import qualified Hasura.Incremental as Inc import qualified Language.GraphQL.Draft.Syntax as G @@ -50,9 +40,22 @@ import Instances.TH.Lift () import Language.Haskell.TH.Syntax (Lift) import Network.URI.Extended () -import qualified Data.HashMap.Strict.Extended as M -import qualified Data.HashSet as S -import qualified Data.Text as T +import qualified Hasura.Incremental as Inc + +import Hasura.EncJSON +import Hasura.GraphQL.Schema.Common (textToName) +import Hasura.GraphQL.Context +import Hasura.RQL.DDL.Deps +import Hasura.RQL.DDL.Schema.Cache.Common +import Hasura.RQL.DDL.Schema.Catalog +import Hasura.RQL.DDL.Schema.Diff +import Hasura.RQL.DDL.Schema.Enum +import Hasura.RQL.DDL.Schema.Rename +import Hasura.RQL.Types +import Hasura.RQL.Types.Catalog +import Hasura.Server.Utils +import Hasura.SQL.Types + data TrackTable = TrackTable @@ -93,18 +96,76 @@ $(deriveJSON (aesonDrop 2 snakeCase){omitNothingFields=True} ''UntrackTable) trackExistingTableOrViewP1 :: (QErrM m, CacheRWM m) => QualifiedTable -> m () trackExistingTableOrViewP1 qt = do rawSchemaCache <- askSchemaCache - when (M.member qt $ scTables rawSchemaCache) $ + when (Map.member qt $ scTables rawSchemaCache) $ throw400 AlreadyTracked $ "view/table already tracked : " <>> qt let qf = fmap (FunctionName . getTableTxt) qt - when (M.member qf $ scFunctions rawSchemaCache) $ + when (Map.member qf $ scFunctions rawSchemaCache) $ throw400 NotSupported $ "function with name " <> qt <<> " already exists" +-- | Check whether a given name would conflict with the current schema by doing +-- an internal introspection +checkConflictingNode + :: MonadError QErr m + => SchemaCache + -> T.Text + -> m () +checkConflictingNode sc tnGQL = do + let queryParser = gqlQueryParser $ scUnauthenticatedGQLContext sc + -- { + -- __schema { + -- queryType { + -- fields { + -- name + -- } + -- } + -- } + -- } + introspectionQuery = + [ G.SelectionField $ G.Field Nothing $$(G.litName "__schema") mempty [] + [ G.SelectionField $ G.Field Nothing $$(G.litName "queryType") mempty [] + [ G.SelectionField $ G.Field Nothing $$(G.litName "fields") mempty [] + [ G.SelectionField $ G.Field Nothing $$(G.litName "name") mempty [] + [] + ] + ] + ] + ] + case queryParser introspectionQuery of + Left _ -> pure () + Right (results, _reusability) -> do + case OMap.lookup $$(G.litName "__schema") results of + Just (RFRaw (Object schema)) -> do + let names = do + Object queryType <- Map.lookup "queryType" schema + Array fields <- Map.lookup "fields" queryType + traverse (\case Object field -> do + String name <- Map.lookup "name" field + pure name + _ -> Nothing) fields + case names of + Nothing -> pure () + Just ns -> + if tnGQL `elem` ns + then throw400 RemoteSchemaConflicts $ + "node " <> tnGQL <> + " already exists in current graphql schema" + else pure () + _ -> pure () + trackExistingTableOrViewP2 :: (MonadTx m, CacheRWM m, HasSystemDefined m) => QualifiedTable -> Bool -> TableConfig -> m EncJSON trackExistingTableOrViewP2 tableName isEnum config = do - typeMap <- GC._gTypes . scDefaultRemoteGCtx <$> askSchemaCache - GS.checkConflictingNode typeMap $ GS.qualObjectToName tableName + sc <- askSchemaCache + {- + The next line does more than what it says on the tin. Removing the following + call to 'checkConflictingNode' causes memory usage to spike when newly + tracking a large number (~100) of tables. 
The memory usage can be triggered + by first creating a large number of tables through SQL, without tracking the + tables, and then clicking "track all" in the console. Curiously, this high + memory usage happens even when no substantial GraphQL schema is generated. + -} + checkConflictingNode sc $ snakeCaseQualObject tableName saveTableToCatalog tableName isEnum config buildSchemaCacheFor (MOTable tableName) return successMsg @@ -148,7 +209,7 @@ instance FromJSON SetTableCustomFields where SetTableCustomFields <$> o .: "table" <*> o .:? "custom_root_fields" .!= emptyCustomRootFields - <*> o .:? "custom_column_names" .!= M.empty + <*> o .:? "custom_column_names" .!= Map.empty runSetTableCustomFieldsQV2 :: (MonadTx m, CacheRWM m) => SetTableCustomFields -> m EncJSON @@ -162,7 +223,7 @@ unTrackExistingTableOrViewP1 :: (CacheRM m, QErrM m) => UntrackTable -> m () unTrackExistingTableOrViewP1 (UntrackTable vn _) = do rawSchemaCache <- askSchemaCache - case M.lookup vn (scTables rawSchemaCache) of + case Map.lookup vn (scTables rawSchemaCache) of Just ti -> -- Check if table/view is system defined when (isSystemDefined $ _tciSystemDefined $ _tiCoreInfo ti) $ throw400 NotSupported $ @@ -210,10 +271,9 @@ processTableChanges ti tableDiff = do procAlteredCols sc tn withNewTabName newTN = do - let tnGQL = GS.qualObjectToName newTN - typeMap = GC._gTypes $ scDefaultRemoteGCtx sc + let tnGQL = snakeCaseQualObject newTN -- check for GraphQL schema conflicts on new name - GS.checkConflictingNode typeMap tnGQL + checkConflictingNode sc tnGQL procAlteredCols sc tn -- update new table in catalog renameTableInCatalog newTN tn @@ -228,7 +288,7 @@ processTableChanges ti tableDiff = do possiblyDropCustomColumnNames tn = do let TableConfig customFields customColumnNames = _tciCustomConfig ti - modifiedCustomColumnNames = foldl' (flip M.delete) customColumnNames droppedCols + modifiedCustomColumnNames = foldl' (flip Map.delete) customColumnNames droppedCols when (modifiedCustomColumnNames /= customColumnNames) $ liftTx $ updateTableConfig tn $ TableConfig customFields modifiedCustomColumnNames @@ -296,20 +356,20 @@ buildTableCache , Inc.ArrowCache m arr, MonadTx m ) => ( [CatalogTable] , Inc.Dependency Inc.InvalidationKey - ) `arr` M.HashMap QualifiedTable TableRawInfo + ) `arr` Map.HashMap QualifiedTable TableRawInfo buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) -> do rawTableInfos <- (| Inc.keyed (| withTable (\tables -> (tables, reloadMetadataInvalidationKey) >- first noDuplicateTables >>> buildRawTableInfo) |) - |) (M.groupOnNE _ctName catalogTables) - let rawTableCache = M.catMaybes rawTableInfos - enumTables = flip M.mapMaybe rawTableCache \rawTableInfo -> + |) (Map.groupOnNE _ctName catalogTables) + let rawTableCache = Map.catMaybes rawTableInfos + enumTables = flip Map.mapMaybe rawTableCache \rawTableInfo -> (,) <$> _tciPrimaryKey rawTableInfo <*> _tciEnumValues rawTableInfo tableInfos <- (| Inc.keyed (| withTable (\table -> processTableInfo -< (enumTables, table)) |) |) rawTableCache - returnA -< M.catMaybes tableInfos + returnA -< Map.catMaybes tableInfos where withTable :: ErrorA QErr arr (e, s) a -> arr (e, (QualifiedTable, s)) (Maybe a) withTable f = withRecordInconsistency f <<< @@ -361,7 +421,7 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) -- types. 
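
An aside on 'checkConflictingNode' earlier in this file's diff: the nested pattern binds in the Maybe monad walk the introspection result down to queryType.fields[].name. The same extraction, written against a plain aeson Value so it can be read and tested in isolation; this assumes aeson < 2, where Object wraps a HashMap Text Value, matching the code above:

{-# LANGUAGE OverloadedStrings #-}
import Data.Aeson (Value (..))
import qualified Data.HashMap.Strict as Map
import qualified Data.Vector as V
import Data.Text (Text)

-- Walk { queryType: { fields: [ { name: ... } ] } }, returning Nothing
-- wherever the shape does not match, just like the Maybe pipeline above.
queryRootFieldNames :: Value -> Maybe [Text]
queryRootFieldNames (Object schema) = do
  Object queryType <- Map.lookup "queryType" schema
  Array fields     <- Map.lookup "fields" queryType
  traverse fieldName (V.toList fields)
  where
    fieldName (Object field) = do
      String name <- Map.lookup "name" field
      pure name
    fieldName _ = Nothing
queryRootFieldNames _ = Nothing

The conflict check itself is then just an 'elem' of the candidate root-field name over the returned list.
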
processTableInfo :: ErrorA QErr arr - ( M.HashMap QualifiedTable (PrimaryKey PGCol, EnumValues) + ( Map.HashMap QualifiedTable (PrimaryKey PGCol, EnumValues) , TableCoreInfoG PGRawColumnInfo PGCol ) TableRawInfo processTableInfo = proc (enumTables, rawInfo) -> liftEitherA -< do @@ -370,7 +430,7 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) columnInfoMap <- alignCustomColumnNames columns (_tcCustomColumnNames $ _tciCustomConfig rawInfo) >>= traverse (processColumnInfo enumReferences (_tciName rawInfo)) - assertNoDuplicateFieldNames (M.elems columnInfoMap) + assertNoDuplicateFieldNames (Map.elems columnInfoMap) primaryKey <- traverse (resolvePrimaryKeyColumns columnInfoMap) (_tciPrimaryKey rawInfo) pure rawInfo @@ -381,7 +441,7 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) resolvePrimaryKeyColumns :: (QErrM n) => HashMap FieldName a -> PrimaryKey PGCol -> n (PrimaryKey a) resolvePrimaryKeyColumns columnMap = traverseOf (pkColumns.traverse) \columnName -> - M.lookup (fromPGCol columnName) columnMap + Map.lookup (fromPGCol columnName) columnMap `onNothing` throw500 "column in primary key not in table!" alignCustomColumnNames @@ -390,9 +450,9 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) -> CustomColumnNames -> n (FieldInfoMap (PGRawColumnInfo, G.Name)) alignCustomColumnNames columns customNames = do - let customNamesByFieldName = M.fromList $ map (first fromPGCol) $ M.toList customNames - flip M.traverseWithKey (align columns customNamesByFieldName) \columnName -> \case - This column -> pure (column, G.Name $ getFieldNameTxt columnName) + let customNamesByFieldName = Map.fromList $ map (first fromPGCol) $ Map.toList customNames + flip Map.traverseWithKey (align columns customNamesByFieldName) \columnName -> \case + This column -> (column,) <$> textToName (getFieldNameTxt columnName) These column customName -> pure (column, customName) That customName -> throw400 NotExists $ "the custom field name " <> customName <<> " was given for the column " <> columnName <<> ", but no such column exists" @@ -401,7 +461,7 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) -- known enum tables. processColumnInfo :: (QErrM n) - => M.HashMap PGCol (NonEmpty EnumReference) + => Map.HashMap PGCol (NonEmpty EnumReference) -> QualifiedTable -- ^ the table this column belongs to -> (PGRawColumnInfo, G.Name) -> n PGColumnInfo @@ -418,7 +478,7 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) where pgCol = prciName rawInfo resolveColumnType = - case M.lookup pgCol tableEnumReferences of + case Map.lookup pgCol tableEnumReferences of -- no references? not an enum Nothing -> pure $ PGColumnScalar (prciType rawInfo) -- one reference? is an enum @@ -430,9 +490,9 @@ buildTableCache = Inc.cache proc (catalogTables, reloadMetadataInvalidationKey) <> T.intercalate ", " (map (dquote . erTable) $ toList enumReferences) <> ")" assertNoDuplicateFieldNames columns = - flip M.traverseWithKey (M.groupOn pgiName columns) \name columnsWithName -> + flip Map.traverseWithKey (Map.groupOn pgiName columns) \name columnsWithName -> case columnsWithName of one:two:more -> throw400 AlreadyExists $ "the definitions of columns " - <> englishList (dquoteTxt . pgiColumn <$> (one:|two:more)) + <> englishList "and" (dquoteTxt . 
pgiColumn <$> (one:|two:more)) <> " are in conflict: they are mapped to the same field name, " <>> name _ -> pure () diff --git a/server/src-lib/Hasura/RQL/DML/Delete.hs b/server/src-lib/Hasura/RQL/DML/Delete.hs index 1292cd03725..cb751acd88f 100644 --- a/server/src-lib/Hasura/RQL/DML/Delete.hs +++ b/server/src-lib/Hasura/RQL/DML/Delete.hs @@ -9,7 +9,7 @@ module Hasura.RQL.DML.Delete ) where import Data.Aeson -import Instances.TH.Lift () +import Instances.TH.Lift () import qualified Data.Sequence as DS import qualified Data.Environment as Env @@ -17,24 +17,17 @@ import qualified Hasura.Tracing as Tracing import Hasura.EncJSON import Hasura.Prelude +import Hasura.RQL.DML.Delete.Types import Hasura.RQL.DML.Internal import Hasura.RQL.DML.Mutation import Hasura.RQL.DML.Returning import Hasura.RQL.GBoolExp +import Hasura.Server.Version (HasVersion) import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) -import Hasura.SQL.Types -import qualified Database.PG.Query as Q -import qualified Hasura.SQL.DML as S +import qualified Database.PG.Query as Q +import qualified Hasura.SQL.DML as S -data AnnDelG v - = AnnDel - { dqp1Table :: !QualifiedTable - , dqp1Where :: !(AnnBoolExp v, AnnBoolExp v) - , dqp1Output :: !(MutationOutputG v) - , dqp1AllCols :: ![PGColumnInfo] - } deriving (Show, Eq) traverseAnnDel :: (Applicative f) @@ -49,8 +42,6 @@ traverseAnnDel f annUpd = where AnnDel tn (whr, fltr) mutOutput allCols = annUpd -type AnnDel = AnnDelG S.SQLExp - mkDeleteCTE :: AnnDel -> S.CTE mkDeleteCTE (AnnDel tn (fltr, wc) _ _) = diff --git a/server/src-lib/Hasura/RQL/DML/Delete/Types.hs b/server/src-lib/Hasura/RQL/DML/Delete/Types.hs new file mode 100644 index 00000000000..00fafc8897b --- /dev/null +++ b/server/src-lib/Hasura/RQL/DML/Delete/Types.hs @@ -0,0 +1,21 @@ +module Hasura.RQL.DML.Delete.Types where + + +import Hasura.Prelude + +import qualified Hasura.SQL.DML as S + +import Hasura.RQL.DML.Returning.Types +import Hasura.RQL.Types.BoolExp +import Hasura.RQL.Types.Column +import Hasura.SQL.Types + +data AnnDelG v + = AnnDel + { dqp1Table :: !QualifiedTable + , dqp1Where :: !(AnnBoolExp v, AnnBoolExp v) + , dqp1Output :: !(MutationOutputG v) + , dqp1AllCols :: ![PGColumnInfo] + } deriving (Show, Eq) + +type AnnDel = AnnDelG S.SQLExp diff --git a/server/src-lib/Hasura/RQL/DML/Insert.hs b/server/src-lib/Hasura/RQL/DML/Insert.hs index 5f8b3ea1661..56d7c4b5f04 100644 --- a/server/src-lib/Hasura/RQL/DML/Insert.hs +++ b/server/src-lib/Hasura/RQL/DML/Insert.hs @@ -1,74 +1,63 @@ -module Hasura.RQL.DML.Insert where +module Hasura.RQL.DML.Insert + ( insertCheckExpr + , insertOrUpdateCheckExpr + , mkInsertCTE + , runInsert + , execInsertQuery + , toSQLConflict + ) where + +import Hasura.Prelude import Data.Aeson.Types -import Instances.TH.Lift () +import Instances.TH.Lift () -import qualified Data.HashMap.Strict as HM -import qualified Data.HashSet as HS -import qualified Data.Sequence as DS +import qualified Data.HashMap.Strict as HM +import qualified Data.HashSet as HS +import qualified Data.Sequence as DS +import qualified Database.PG.Query as Q + +import qualified Hasura.SQL.DML as S import Hasura.EncJSON -import Hasura.Prelude +import Hasura.RQL.DML.Insert.Types import Hasura.RQL.DML.Internal import Hasura.RQL.DML.Mutation import Hasura.RQL.DML.Returning import Hasura.RQL.GBoolExp -import Hasura.RQL.Instances () import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) +import Hasura.Server.Version (HasVersion) import Hasura.Session import Hasura.SQL.Types -import qualified 
Data.Environment as Env -import qualified Database.PG.Query as Q -import qualified Hasura.SQL.DML as S -import qualified Hasura.Tracing as Tracing - -data ConflictTarget - = CTColumn ![PGCol] - | CTConstraint !ConstraintName - deriving (Show, Eq) - -data ConflictClauseP1 - = CP1DoNothing !(Maybe ConflictTarget) - | CP1Update !ConflictTarget ![PGCol] !PreSetCols !S.BoolExp - deriving (Show, Eq) - -data InsertQueryP1 - = InsertQueryP1 - { iqp1Table :: !QualifiedTable - , iqp1Cols :: ![PGCol] - , iqp1Tuples :: ![[S.SQLExp]] - , iqp1Conflict :: !(Maybe ConflictClauseP1) - , iqp1CheckCond :: !(AnnBoolExpSQL, Maybe AnnBoolExpSQL) - , iqp1Output :: !MutationOutput - , iqp1AllCols :: ![PGColumnInfo] - } deriving (Show, Eq) +import qualified Data.Environment as Env +import qualified Hasura.Tracing as Tracing mkInsertCTE :: InsertQueryP1 -> S.CTE -mkInsertCTE (InsertQueryP1 tn cols vals c (insCheck, updCheck) _ _) = +mkInsertCTE (InsertQueryP1 tn cols vals conflict (insCheck, updCheck) _ _) = S.CTEInsert insert where tupVals = S.ValuesExp $ map S.TupleExp vals insert = - S.SQLInsert tn cols tupVals (toSQLConflict <$> c) + S.SQLInsert tn cols tupVals (toSQLConflict tn <$> conflict) . Just . S.RetExp $ [ S.selectStar , S.Extractor - (insertOrUpdateCheckExpr tn c - (toSQLBoolExp (S.QualTable tn) insCheck) - (fmap (toSQLBoolExp (S.QualTable tn)) updCheck)) + (insertOrUpdateCheckExpr tn conflict + (toSQLBool insCheck) + (fmap toSQLBool updCheck)) Nothing ] + toSQLBool = toSQLBoolExp $ S.QualTable tn -toSQLConflict :: ConflictClauseP1 -> S.SQLConflict -toSQLConflict conflict = case conflict of - CP1DoNothing Nothing -> S.DoNothing Nothing - CP1DoNothing (Just ct) -> S.DoNothing $ Just $ toSQLCT ct - CP1Update ct inpCols preSet filtr -> S.Update (toSQLCT ct) - (S.buildUpsertSetExp inpCols preSet) $ Just $ S.WhereFrag filtr +toSQLConflict :: QualifiedTable -> ConflictClauseP1 S.SQLExp -> S.SQLConflict +toSQLConflict tableName = \case + CP1DoNothing ct -> S.DoNothing $ toSQLCT <$> ct + CP1Update ct inpCols preSet filtr -> S.Update + (toSQLCT ct) (S.buildUpsertSetExp inpCols preSet) $ + Just $ S.WhereFrag $ toSQLBoolExp (S.QualTable tableName) filtr where toSQLCT ct = case ct of CTColumn pgCols -> S.SQLColumn pgCols @@ -114,7 +103,7 @@ buildConflictClause -> TableInfo -> [PGCol] -> OnConflict - -> m ConflictClauseP1 + -> m (ConflictClauseP1 S.SQLExp) buildConflictClause sessVarBldr tableInfo inpCols (OnConflict mTCol mTCons act) = case (mTCol, mTCons, act) of (Nothing, Nothing, CAIgnore) -> return $ CP1DoNothing Nothing @@ -131,21 +120,19 @@ buildConflictClause sessVarBldr tableInfo inpCols (OnConflict mTCol mTCons act) (updFltr, preSet) <- getUpdPerm resolvedUpdFltr <- convAnnBoolExpPartialSQL sessVarBldr updFltr resolvedPreSet <- mapM (convPartialSQLExp sessVarBldr) preSet - return $ CP1Update (CTColumn $ getPGCols col) inpCols resolvedPreSet $ - toSQLBool resolvedUpdFltr + return $ CP1Update (CTColumn $ getPGCols col) inpCols resolvedPreSet resolvedUpdFltr (Nothing, Just cons, CAUpdate) -> do validateConstraint cons (updFltr, preSet) <- getUpdPerm resolvedUpdFltr <- convAnnBoolExpPartialSQL sessVarBldr updFltr resolvedPreSet <- mapM (convPartialSQLExp sessVarBldr) preSet - return $ CP1Update (CTConstraint cons) inpCols resolvedPreSet $ - toSQLBool resolvedUpdFltr + return $ CP1Update (CTConstraint cons) inpCols resolvedPreSet resolvedUpdFltr (Just _, Just _, _) -> throw400 UnexpectedPayload "'constraint' and 'constraint_on' cannot be set at a time" where coreInfo = _tiCoreInfo tableInfo fieldInfoMap = 
_tciFieldInfoMap coreInfo - toSQLBool = toSQLBoolExp (S.mkQual $ _tciName coreInfo) + -- toSQLBool = toSQLBoolExp (S.mkQual $ _tciName coreInfo) validateCols c = do let targetcols = getPGCols c @@ -153,7 +140,8 @@ buildConflictClause sessVarBldr tableInfo inpCols (OnConflict mTCol mTCons act) \pgCol -> askPGType fieldInfoMap pgCol "" validateConstraint c = do - let tableConsNames = _cName <$> tciUniqueOrPrimaryKeyConstraints coreInfo + let tableConsNames = maybe [] toList $ + fmap _cName <$> tciUniqueOrPrimaryKeyConstraints coreInfo withPathK "constraint" $ unless (c `elem` tableConsNames) $ throw400 Unexpected $ "constraint " <> getConstraintTxt c @@ -262,9 +250,11 @@ execInsertQuery => Env.Environment -> Bool -> Maybe MutationRemoteJoinCtx - -> (InsertQueryP1, DS.Seq Q.PrepArg) -> m EncJSON + -> (InsertQueryP1, DS.Seq Q.PrepArg) + -> m EncJSON execInsertQuery env strfyNum remoteJoinCtx (u, p) = - runMutation env $ mkMutation remoteJoinCtx (iqp1Table u) (insertCTE, p) + runMutation env + $ mkMutation remoteJoinCtx (iqp1Table u) (insertCTE, p) (iqp1Output u) (iqp1AllCols u) strfyNum where insertCTE = mkInsertCTE u @@ -319,7 +309,7 @@ insertCheckExpr errorMessage condExpr = -- the @xmax@ system column. insertOrUpdateCheckExpr :: QualifiedTable - -> Maybe ConflictClauseP1 + -> Maybe (ConflictClauseP1 S.SQLExp) -> S.BoolExp -> Maybe S.BoolExp -> S.SQLExp diff --git a/server/src-lib/Hasura/RQL/DML/Insert/Types.hs b/server/src-lib/Hasura/RQL/DML/Insert/Types.hs new file mode 100644 index 00000000000..12189b857cc --- /dev/null +++ b/server/src-lib/Hasura/RQL/DML/Insert/Types.hs @@ -0,0 +1,34 @@ +module Hasura.RQL.DML.Insert.Types where + + +import Hasura.Prelude + +import Hasura.RQL.DML.Returning.Types +import Hasura.RQL.Types.BoolExp +import Hasura.RQL.Types.Column +import Hasura.SQL.Types + +import qualified Hasura.SQL.DML as S + + +data ConflictTarget + = CTColumn ![PGCol] + | CTConstraint !ConstraintName + deriving (Show, Eq) + +data ConflictClauseP1 v + = CP1DoNothing !(Maybe ConflictTarget) + | CP1Update !ConflictTarget ![PGCol] !(PreSetColsG v) (AnnBoolExp v) + deriving (Show, Eq, Functor, Foldable, Traversable) + + +data InsertQueryP1 + = InsertQueryP1 + { iqp1Table :: !QualifiedTable + , iqp1Cols :: ![PGCol] + , iqp1Tuples :: ![[S.SQLExp]] + , iqp1Conflict :: !(Maybe (ConflictClauseP1 S.SQLExp)) + , iqp1CheckCond :: !(AnnBoolExpSQL, Maybe AnnBoolExpSQL) + , iqp1Output :: !MutationOutput + , iqp1AllCols :: ![PGColumnInfo] + } deriving (Show, Eq) diff --git a/server/src-lib/Hasura/RQL/DML/Internal.hs b/server/src-lib/Hasura/RQL/DML/Internal.hs index 2cbb8f58005..f3dc814c3c6 100644 --- a/server/src-lib/Hasura/RQL/DML/Internal.hs +++ b/server/src-lib/Hasura/RQL/DML/Internal.hs @@ -52,13 +52,18 @@ askPermInfo' -> m (Maybe c) askPermInfo' pa tableInfo = do roleName <- askCurRole - let mrpi = getRolePermInfo roleName - return $ mrpi >>= (^. permAccToLens pa) - where - rpim = _tiRolePermInfoMap tableInfo - getRolePermInfo roleName - | roleName == adminRoleName = Just $ mkAdminRolePermInfo (_tiCoreInfo tableInfo) - | otherwise = M.lookup roleName rpim + return $ getPermInfoMaybe roleName pa tableInfo + +getPermInfoMaybe :: RoleName -> PermAccessor c -> TableInfo -> Maybe c +getPermInfoMaybe roleName pa tableInfo = + getRolePermInfo roleName tableInfo >>= (^. 
permAccToLens pa) + +getRolePermInfo :: RoleName -> TableInfo -> Maybe RolePermInfo +getRolePermInfo roleName tableInfo + | roleName == adminRoleName = + Just $ mkAdminRolePermInfo (_tiCoreInfo tableInfo) + | otherwise = + M.lookup roleName (_tiRolePermInfoMap tableInfo) askPermInfo :: (UserInfoM m, QErrM m) @@ -79,9 +84,9 @@ askPermInfo pa tableInfo = do pt = permTypeToCode $ permAccToType pa isTabUpdatable :: RoleName -> TableInfo -> Bool -isTabUpdatable roleName ti - | roleName == adminRoleName = True - | otherwise = isJust $ M.lookup roleName rpim >>= _permUpd +isTabUpdatable role ti + | role == adminRoleName = True + | otherwise = isJust $ M.lookup role rpim >>= _permUpd where rpim = _tiRolePermInfoMap ti diff --git a/server/src-lib/Hasura/RQL/DML/Mutation.hs b/server/src-lib/Hasura/RQL/DML/Mutation.hs index 3fbd77f4536..83871c14bd6 100644 --- a/server/src-lib/Hasura/RQL/DML/Mutation.hs +++ b/server/src-lib/Hasura/RQL/DML/Mutation.hs @@ -1,5 +1,5 @@ module Hasura.RQL.DML.Mutation - ( Mutation + ( Mutation(..) , mkMutation , MutationRemoteJoinCtx , runMutation @@ -25,13 +25,14 @@ import Hasura.EncJSON import Hasura.RQL.DML.Internal import Hasura.RQL.DML.RemoteJoin import Hasura.RQL.DML.Returning +import Hasura.RQL.DML.Returning.Types import Hasura.RQL.DML.Select -import Hasura.RQL.Instances () +import Hasura.RQL.Instances () import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) -import Hasura.Session +import Hasura.Server.Version (HasVersion) import Hasura.SQL.Types import Hasura.SQL.Value +import Hasura.Session type MutationRemoteJoinCtx = (HTTP.Manager, [N.Header], UserInfo) @@ -140,7 +141,6 @@ executeMutationOutputQuery env query prepArgs = \case Just (remoteJoins, (httpManager, reqHeaders, userInfo)) -> executeQueryWithRemoteJoins env httpManager reqHeaders userInfo query prepArgs remoteJoins - mutateAndFetchCols :: QualifiedTable -> [PGColumnInfo] diff --git a/server/src-lib/Hasura/RQL/DML/RemoteJoin.hs b/server/src-lib/Hasura/RQL/DML/RemoteJoin.hs index b94384050e1..fabb1ac0666 100644 --- a/server/src-lib/Hasura/RQL/DML/RemoteJoin.hs +++ b/server/src-lib/Hasura/RQL/DML/RemoteJoin.hs @@ -11,15 +11,14 @@ module Hasura.RQL.DML.RemoteJoin import Hasura.Prelude import Control.Lens -import Data.List (nub) import Data.Validation import Hasura.EncJSON +import Hasura.GraphQL.Parser hiding (field) import Hasura.GraphQL.RemoteServer (execRemoteGQ') import Hasura.GraphQL.Transport.HTTP.Protocol -import Hasura.GraphQL.Utils import Hasura.RQL.DML.Internal -import Hasura.RQL.DML.Returning +import Hasura.RQL.DML.Returning.Types import Hasura.RQL.DML.Select.Types import Hasura.RQL.Types import Hasura.Server.Version (HasVersion) @@ -39,8 +38,8 @@ import qualified Data.List.NonEmpty as NE import qualified Data.Text as T import qualified Database.PG.Query as Q import qualified Hasura.Tracing as Tracing -import qualified Language.GraphQL.Draft.Printer.Text as G import qualified Language.GraphQL.Draft.Syntax as G +import qualified Language.GraphQL.Draft.Printer as G import qualified Network.HTTP.Client as HTTP import qualified Network.HTTP.Types as N @@ -77,6 +76,8 @@ executeQueryWithRemoteJoins env manager reqHdrs userInfo q prepArgs rjs = do newtype FieldPath = FieldPath {unFieldPath :: [FieldName]} deriving (Show, Eq, Semigroup, Monoid, Hashable) +type Alias = G.Name + appendPath :: FieldName -> FieldPath -> FieldPath appendPath fieldName = FieldPath . (<> [fieldName]) . unFieldPath @@ -94,18 +95,23 @@ getCounter = do modify incCounter pure c --- | Generate the alias for remote field. 
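
For intuition about the rewritten 'pathToAlias' below: the alias for a batched remote field is the field path joined with underscores plus a per-response counter, so sibling joins in one batched query cannot collide. A minimal sketch of just the text-building step (the new version additionally validates the result with 'G.mkName', failing in 'MonadError QErr' when it is not a legal GraphQL name):

import Data.Text (Text)
import qualified Data.Text as T

-- e.g. aliasText ["comments", "author"] 3 == "comments_author__3"
aliasText :: [Text] -> Int -> Text
aliasText path counter =
  T.intercalate "_" path <> "__" <> T.pack (show counter)
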
-pathToAlias :: FieldPath -> Counter -> G.Alias -pathToAlias path counter = - G.Alias $ G.Name $ T.intercalate "_" (map getFieldNameTxt $ unFieldPath path) - <> "__" <> (T.pack . show . unCounter) counter +parseGraphQLName :: (MonadError QErr m) => Text -> m G.Name +parseGraphQLName txt = maybe (throw400 RemoteSchemaError $ errMsg) pure $ G.mkName txt + where + errMsg = txt <> " is not a valid GraphQL name" --- | A 'RemoteJoin' represents the context of remote relationship to be extracted from 'AnnFldG's. +-- | Generate the alias for a remote field. +pathToAlias :: (MonadError QErr m) => FieldPath -> Counter -> m Alias +pathToAlias path counter = do + parseGraphQLName $ T.intercalate "_" (map getFieldNameTxt $ unFieldPath path) + <> "__" <> (T.pack . show . unCounter) counter + +-- | A 'RemoteJoin' represents the context of a remote relationship to be extracted from 'AnnFieldG's. data RemoteJoin = RemoteJoin { _rjName :: !FieldName -- ^ The remote join field name. , _rjArgs :: ![RemoteFieldArgument] -- ^ User-provided arguments with variables. - , _rjSelSet :: !G.SelectionSet -- ^ User-provided selection set of remote field. + , _rjSelSet :: !(G.SelectionSet G.NoFragments Variable) -- ^ User-provided selection set of remote field. , _rjHasuraFields :: !(HashSet FieldName) -- ^ Table fields. , _rjFieldCall :: !(NonEmpty FieldCall) -- ^ Remote server fields. , _rjRemoteSchema :: !RemoteSchemaInfo -- ^ The remote schema server info. 
- , _rjfAlias :: !G.Alias -- ^ Top level alias of the field - , _rjfField :: !G.Field -- ^ The field AST + , _rjfAlias :: !Alias -- ^ Top level alias of the field + , _rjfField :: !(G.Field G.NoFragments Variable) -- ^ The field AST , _rjfFieldCall :: ![G.Name] -- ^ Path to remote join value - , _rjfVariables :: ![(G.VariableDefinition,A.Value)] -- ^ Variables used in the AST } deriving (Show, Eq) -- | Generate composite JSON ('CompositeValue') parameterised over 'RemoteJoinField' --- from remote join map and query response JSON from Postgres. +-- from remote join map and query response JSON from Postgres. traverseQueryResponseJSON :: (MonadError QErr m) => RemoteJoinMap -> AO.Value -> m (CompositeValue RemoteJoinField) @@ -315,19 +319,24 @@ traverseQueryResponseJSON rjm = mkRemoteSchemaField siblingFields remoteJoin = do counter <- getCounter let RemoteJoin fieldName inputArgs selSet hasuraFields fieldCall rsi _ = remoteJoin - hasuraFieldVariables = map (G.Variable . G.Name . getFieldNameTxt) $ toList hasuraFields - siblingFieldArgs = Map.fromList $ - map ((G.Variable . G.Name) *** ordJsonvalueToGValue) siblingFields + hasuraFieldVariables <- mapM (parseGraphQLName . getFieldNameTxt) $ toList hasuraFields + siblingFieldArgsVars <- mapM (\(k,val) -> do + (,) <$> parseGraphQLName k <*> ordJSONValueToGValue val) + $ siblingFields + let siblingFieldArgs = Map.fromList $ siblingFieldArgsVars hasuraFieldArgs = flip Map.filterWithKey siblingFieldArgs $ \k _ -> k `elem` hasuraFieldVariables - fieldAlias = pathToAlias (appendPath fieldName path) counter - queryField <- fieldCallsToField (map _rfaArgument inputArgs) hasuraFieldArgs selSet fieldAlias fieldCall + fieldAlias <- pathToAlias (appendPath fieldName path) counter + queryField <- fieldCallsToField (inputArgsToMap inputArgs) hasuraFieldArgs selSet fieldAlias fieldCall pure $ RemoteJoinField rsi fieldAlias queryField (map fcName $ toList $ NE.tail fieldCall) - (concat $ mapMaybe _rfaVariable inputArgs) where - ordJsonvalueToGValue = jsonValueToGValue . AO.fromOrdered + ordJSONValueToGValue :: (MonadError QErr m) => AO.Value -> m (G.Value Void) + ordJSONValueToGValue = + either (throw400 ValidationFailed) pure . jsonToGraphQL . AO.fromOrdered + + inputArgsToMap = Map.fromList . 
map (_rfaArgument &&& _rfaValue) traverseObject obj = do let fields = AO.toList obj @@ -348,6 +357,46 @@ traverseQueryResponseJSON rjm = Nothing -> Just <$> traverseValue fieldPath value pure $ CVObject $ OMap.fromList processedFields +convertFieldWithVariablesToName :: G.Field G.NoFragments Variable -> G.Field G.NoFragments G.Name +convertFieldWithVariablesToName = fmap getName + +inputValueToJSON :: InputValue Void -> A.Value +inputValueToJSON = \case + JSONValue j -> j + GraphQLValue g -> graphQLValueToJSON g + where + graphQLValueToJSON :: G.Value Void -> A.Value + graphQLValueToJSON = \case + G.VNull -> A.Null + G.VInt i -> A.toJSON i + G.VFloat f -> A.toJSON f + G.VString t -> A.toJSON t + G.VBoolean b -> A.toJSON b + G.VEnum (G.EnumValue n) -> A.toJSON n + G.VList values -> A.toJSON $ graphQLValueToJSON <$> values + G.VObject objects -> A.toJSON $ graphQLValueToJSON <$> objects + +defaultValue :: InputValue Void -> Maybe (G.Value Void) +defaultValue = \case + JSONValue _ -> Nothing + GraphQLValue g -> Just g + +collectVariables :: G.Value Variable -> HashMap G.VariableDefinition A.Value +collectVariables = \case + G.VNull -> mempty + G.VInt _ -> mempty + G.VFloat _ -> mempty + G.VString _ -> mempty + G.VBoolean _ -> mempty + G.VEnum _ -> mempty + G.VList values -> foldl Map.union mempty $ map collectVariables values + G.VObject values -> foldl Map.union mempty $ map collectVariables $ Map.elems values + G.VVariable var@(Variable _ gType val) -> + let name = getName var + jsonVal = inputValueToJSON val + defaultVal = defaultValue val + in Map.singleton (G.VariableDefinition name gType defaultVal) jsonVal + -- | Fetch remote join field value from remote servers by batching respective 'RemoteJoinField's fetchRemoteJoinFields :: ( HasVersion @@ -366,7 +415,6 @@ fetchRemoteJoinFields env manager reqHdrs userInfo remoteJoins = do let batchList = toList batch gqlReq = fieldsToRequest G.OperationTypeQuery (map _rjfField batchList) - (concatMap _rjfVariables batchList) gqlReqUnparsed = (GQLQueryText . G.renderExecutableDoc . G.ExecutableDocument . unGQLExecDoc) <$> gqlReq -- NOTE: discard remote headers (for now): (_, _, respBody) <- execRemoteGQ' env manager userInfo reqHdrs gqlReqUnparsed rsi G.OperationTypeQuery @@ -375,7 +423,6 @@ fetchRemoteJoinFields env manager reqHdrs userInfo remoteJoins = do Right r -> do respObj <- either throw500 pure $ AO.asObject r let errors = AO.lookup "errors" respObj - if | isNothing errors || errors == Just AO.Null -> case AO.lookup "data" respObj of Nothing -> throw400 Unexpected "\"data\" field not found in remote response" @@ -389,10 +436,13 @@ fetchRemoteJoinFields env manager reqHdrs userInfo remoteJoins = do where remoteSchemaBatch = Map.groupOnNE _rjfRemoteSchema remoteJoins - fieldsToRequest :: G.OperationType -> [G.Field] -> [(G.VariableDefinition,A.Value)] -> GQLReqParsed - fieldsToRequest opType gfields vars = - case vars of - [] -> + fieldsToRequest :: G.OperationType -> [G.Field G.NoFragments Variable] -> GQLReqParsed + fieldsToRequest opType gFields = + let variableInfos = Just <$> foldl Map.union mempty $ Map.elems $ fmap collectVariables $ G._fArguments $ head gFields + gFields' = map (G.fmapFieldFragment G.inline . 
convertFieldWithVariablesToName) gFields + in + case variableInfos of + Nothing -> GQLReq { _grOperationName = Nothing , _grQuery = @@ -400,31 +450,31 @@ fetchRemoteJoinFields env manager reqHdrs userInfo remoteJoins = do [ G.ExecutableDefinitionOperation (G.OperationDefinitionTyped ( emptyOperationDefinition - { G._todSelectionSet = map G.SelectionField gfields + { G._todSelectionSet = map G.SelectionField gFields' } ) ) ] , _grVariables = Nothing } - vars' -> + + Just vars' -> GQLReq { _grOperationName = Nothing , _grQuery = - GQLExecDoc - [ G.ExecutableDefinitionOperation - (G.OperationDefinitionTyped + GQLExecDoc + [ G.ExecutableDefinitionOperation + (G.OperationDefinitionTyped ( emptyOperationDefinition - { G._todSelectionSet = map G.SelectionField gfields - , G._todVariableDefinitions = nub (map fst vars') - } - ) - ) + { G._todSelectionSet = map G.SelectionField gFields' + , G._todVariableDefinitions = map fst $ Map.toList vars' + } + ) + ) ] - , _grVariables = Just $ Map.fromList - (map (\(varDef, val) -> (G._vdVariable varDef, val)) vars') - } - + , _grVariables = Just $ Map.fromList + (map (\(varDef, val) -> (G._vdName varDef, val)) $ Map.toList vars') + } where emptyOperationDefinition = G.TypedOperationDefinition { @@ -434,8 +484,6 @@ fetchRemoteJoinFields env manager reqHdrs userInfo remoteJoins = do , G._todDirectives = [] , G._todSelectionSet = [] } - - -- | Replace 'RemoteJoinField' in composite JSON with it's json value from remote server response. replaceRemoteFields :: MonadError QErr m @@ -446,7 +494,7 @@ replaceRemoteFields compositeJson remoteServerResponse = compositeValueToJSON <$> traverse replaceValue compositeJson where replaceValue rj = do - let alias = G.unAlias $ _rjfAlias rj + let alias = _rjfAlias rj fieldCall = _rjfFieldCall rj extractAtPath (alias:fieldCall) $ AO.Object remoteServerResponse @@ -467,76 +515,96 @@ replaceRemoteFields compositeJson remoteServerResponse = -- selection set at the leaf of the tree we construct. fieldCallsToField :: forall m. MonadError QErr m - => [G.Argument] - -> Map.HashMap G.Variable G.Value - -> G.SelectionSet + => Map.HashMap G.Name (InputValue Variable) + -- ^ user input arguments to the remote join field + -> Map.HashMap G.Name (G.Value Void) + -- ^ Contains the values of the variables that have been defined in the remote join definition + -> G.SelectionSet G.NoFragments Variable -- ^ Inserted at leaf of nested FieldCalls - -> G.Alias + -> Alias -- ^ Top-level name to set for this Field -> NonEmpty FieldCall - -> m G.Field + -> m (G.Field G.NoFragments Variable) fieldCallsToField rrArguments variables finalSelSet topAlias = fmap (\f -> f{G._fAlias = Just topAlias}) . nest where -- almost: `foldr nest finalSelSet` - nest :: NonEmpty FieldCall -> m G.Field + nest :: NonEmpty FieldCall -> m (G.Field G.NoFragments Variable) nest ((FieldCall name remoteArgs) :| rest) = do - templatedArguments <- createArguments variables remoteArgs + templatedArguments <- convert <$> createArguments variables remoteArgs + graphQLarguments <- traverse peel rrArguments (args, selSet) <- case NE.nonEmpty rest of Just f -> do s <- nest f pure (templatedArguments, [G.SelectionField s]) Nothing -> - let argsToMap = Map.fromList . 
map (G._aName &&& G._aValue) - arguments = map (uncurry G.Argument) $ Map.toList $ - Map.unionWith mergeValue - (argsToMap rrArguments) - (argsToMap templatedArguments) + let arguments = Map.unionWith mergeValue + graphQLarguments + -- converting (G.Value Void) -> (G.Value Variable) to merge the + -- 'rrArguments' with the 'variables' + templatedArguments in pure (arguments, finalSelSet) pure $ G.Field Nothing name args [] selSet + convert :: Map.HashMap G.Name (G.Value Void) -> Map.HashMap G.Name (G.Value Variable) + convert = fmap G.literal + + peel :: InputValue Variable -> m (G.Value Variable) + peel = \case + GraphQLValue v -> pure v + JSONValue _ -> + -- At this point, it is theoretically impossible that we have + -- unpacked a variable into a JSONValue, as there's no "outer + -- scope" at which this value could have been peeled. + -- FIXME: check that this is correct! + throw500 "internal error: encountered an already expanded variable when folding remote field arguments" + -- FIXME: better error message + -- This is a kind of "deep merge". -- For e.g. suppose the input argument of the remote field is something like: -- `where: { id : 1}` -- And during execution, client also gives the input arg: `where: {name: "tiru"}` -- We need to merge the input argument to where: {id : 1, name: "tiru"} -mergeValue :: G.Value -> G.Value -> G.Value +mergeValue :: G.Value Variable -> G.Value Variable -> G.Value Variable mergeValue lVal rVal = case (lVal, rVal) of - (G.VList (G.ListValueG l), G.VList (G.ListValueG r)) -> - G.VList $ G.ListValueG $ l <> r - (G.VObject (G.ObjectValueG l), G.VObject (G.ObjectValueG r)) -> - let fieldsToMap = Map.fromList . map (G._ofName &&& G._ofValue) - in G.VObject $ G.ObjectValueG $ map (uncurry G.ObjectFieldG) $ Map.toList $ - Map.unionWith mergeValue (fieldsToMap l) (fieldsToMap r) + (G.VList l, G.VList r) -> + G.VList $ l <> r + (G.VObject l, G.VObject r) -> + G.VObject $ Map.unionWith mergeValue l r (_, _) -> error $ "can only merge a list with another list or an " <> "object with another object" -- | Create an argument map using the inputs taken from the hasura database. createArguments :: (MonadError QErr m) - => Map.HashMap G.Variable G.Value + => Map.HashMap G.Name (G.Value Void) -> RemoteArguments - -> m [G.Argument] + -> m (HashMap G.Name (G.Value Void)) createArguments variables (RemoteArguments arguments) = either (throw400 Unexpected . \errors -> "Found errors: " <> T.intercalate ", " errors) - (pure . map (\(G.ObjectFieldG key val) -> G.Argument key val)) + pure (toEither (substituteVariables variables arguments)) -- | Substitute values in the argument list. substituteVariables - :: HashMap G.Variable G.Value -- ^ Values to use. - -> [G.ObjectFieldG G.Value] -- ^ A template. - -> Validation [Text] [G.ObjectFieldG G.Value] -substituteVariables values = traverse (traverse go) + :: HashMap G.Name (G.Value Void) -- ^ Values of the variables to substitute. + -> HashMap G.Name (G.Value G.Name) -- ^ Template which contains the variables. + -> Validation [Text] (HashMap G.Name (G.Value Void)) +substituteVariables values = traverse go where - go v = case v of - G.VVariable variable -> - case Map.lookup variable values of - Nothing -> Failure ["Value for variable " <> G.unVariable variable <<> " not provided"] + go = \case + G.VVariable variableName -> + case Map.lookup variableName values of + Nothing -> Failure ["Value for variable " <> variableName <<> " not provided"] Just value -> pure value - G.VList (G.ListValueG listValue) -> - fmap (G.VList . 
G.ListValueG) (traverse go listValue) - G.VObject (G.ObjectValueG objectValue) -> - fmap (G.VObject . G.ObjectValueG) (traverse (traverse go) objectValue) - _ -> pure v + G.VList listValue -> + fmap G.VList (traverse go listValue) + G.VObject objectValue -> + fmap G.VObject (traverse go objectValue) + G.VInt i -> pure $ G.VInt i + G.VFloat d -> pure $ G.VFloat d + G.VString txt -> pure $ G.VString txt + G.VEnum e -> pure $ G.VEnum e + G.VBoolean b -> pure $ G.VBoolean b + G.VNull -> pure $ G.VNull diff --git a/server/src-lib/Hasura/RQL/DML/Returning.hs b/server/src-lib/Hasura/RQL/DML/Returning.hs index cb4f112ff27..bfc92cdc5f4 100644 --- a/server/src-lib/Hasura/RQL/DML/Returning.hs +++ b/server/src-lib/Hasura/RQL/DML/Returning.hs @@ -2,18 +2,13 @@ module Hasura.RQL.DML.Returning where import Hasura.Prelude import Hasura.RQL.DML.Internal +import Hasura.RQL.DML.Returning.Types import Hasura.RQL.DML.Select import Hasura.RQL.Types import Hasura.SQL.Types -import qualified Data.Text as T -import qualified Hasura.SQL.DML as S - -data MutFldG v - = MCount - | MExp !T.Text - | MRet !(AnnFieldsG v) - deriving (Show, Eq) +import qualified Data.Text as T +import qualified Hasura.SQL.DML as S traverseMutFld :: (Applicative f) @@ -25,15 +20,6 @@ traverseMutFld f = \case MExp t -> pure $ MExp t MRet flds -> MRet <$> traverse (traverse (traverseAnnField f)) flds -type MutFld = MutFldG S.SQLExp - -type MutFldsG v = Fields (MutFldG v) - -data MutationOutputG v - = MOutMultirowFields !(MutFldsG v) - | MOutSinglerowObject !(AnnFieldsG v) - deriving (Show, Eq) - traverseMutationOutput :: (Applicative f) => (a -> f b) @@ -44,8 +30,6 @@ traverseMutationOutput f = \case MOutSinglerowObject annFields -> MOutSinglerowObject <$> traverseAnnFields f annFields -type MutationOutput = MutationOutputG S.SQLExp - traverseMutFlds :: (Applicative f) => (a -> f b) @@ -54,8 +38,6 @@ traverseMutFlds traverseMutFlds f = traverse (traverse (traverseMutFld f)) -type MutFlds = MutFldsG S.SQLExp - hasNestedFld :: MutationOutputG a -> Bool hasNestedFld = \case MOutMultirowFields flds -> any isNestedMutFld flds @@ -109,6 +91,7 @@ mkMutFldExp cteAlias preCalAffRows strfyNum = \case in S.SESelect $ mkSQLSelect JASMultipleRows $ AnnSelectG selFlds tabFrom tabPerm noSelectArgs strfyNum + {- Note [Mutation output expression] ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ An example output expression for INSERT mutation: @@ -151,7 +134,6 @@ mkMutationOutputExp qt allCols preCalAffRows cte mutOutput strfyNum = where mutationResultAlias = Iden $ snakeCaseQualObject qt <> "__mutation_result_alias" allColumnsAlias = Iden $ snakeCaseQualObject qt <> "__all_columns_alias" - allColumnsSelect = S.CTESelect $ S.mkSelect { S.selExtr = map S.mkExtr $ map pgiColumn $ sortCols allCols , S.selFrom = Just $ S.mkIdenFromExp mutationResultAlias diff --git a/server/src-lib/Hasura/RQL/DML/Returning/Types.hs b/server/src-lib/Hasura/RQL/DML/Returning/Types.hs new file mode 100644 index 00000000000..73889153dbe --- /dev/null +++ b/server/src-lib/Hasura/RQL/DML/Returning/Types.hs @@ -0,0 +1,42 @@ +module Hasura.RQL.DML.Returning.Types where + + +import Hasura.Prelude + +import qualified Data.Aeson as J +import qualified Data.HashMap.Strict.InsOrd as OMap +import qualified Data.Text as T +import qualified Hasura.SQL.DML as S + +import Hasura.EncJSON +import Hasura.RQL.DML.Select.Types + + +data MutFldG v + = MCount + | MExp !T.Text + | MRet !(AnnFieldsG v) + deriving (Show, Eq) + +type MutFld = MutFldG S.SQLExp + +type MutFldsG v = Fields (MutFldG v) + +data 
MutationOutputG v + = MOutMultirowFields !(MutFldsG v) + | MOutSinglerowObject !(AnnFieldsG v) + deriving (Show, Eq) + +type MutationOutput = MutationOutputG S.SQLExp + +type MutFlds = MutFldsG S.SQLExp + +buildEmptyMutResp :: MutationOutput -> EncJSON +buildEmptyMutResp = \case + MOutMultirowFields mutFlds -> encJFromJValue $ OMap.fromList $ map (second convMutFld) mutFlds + MOutSinglerowObject _ -> encJFromJValue $ J.Object mempty + where + convMutFld = \case + MCount -> J.toJSON (0 :: Int) + MExp e -> J.toJSON e + MRet _ -> J.toJSON ([] :: [J.Value]) diff --git a/server/src-lib/Hasura/RQL/DML/Select.hs b/server/src-lib/Hasura/RQL/DML/Select.hs index 9fae910f80b..27e828111cb 100644 --- a/server/src-lib/Hasura/RQL/DML/Select.hs +++ b/server/src-lib/Hasura/RQL/DML/Select.hs @@ -2,8 +2,11 @@ module Hasura.RQL.DML.Select ( selectP2 , convSelectQuery , asSingleRowJsonResp - , module Hasura.RQL.DML.Select.Internal , runSelect + , selectQuerySQL + , selectAggregateQuerySQL + , connectionSelectQuerySQL + , module Hasura.RQL.DML.Select.Internal ) where @@ -36,7 +39,7 @@ convSelCol fieldInfoMap _ (SCExtRel rn malias selQ) = do let pgWhenRelErr = "only relationships can be expanded" relInfo <- withPathK "name" $ askRelType fieldInfoMap rn pgWhenRelErr - let (RelInfo _ _ _ relTab _) = relInfo + let (RelInfo _ _ _ relTab _ _) = relInfo (rfim, rspi) <- fetchRelDet rn relTab resolvedSelQ <- resolveStar rfim rspi selQ return [ECRel rn malias resolvedSelQ] @@ -125,7 +128,7 @@ convOrderByElem sessVarBldr (flds, spi) = \case [ fldName <<> " is a" , " computed field and can't be used in 'order_by'" ] - -- TODO Rakesh + -- TODO Rakesh (from master) FIRemoteRelationship {} -> throw400 UnexpectedPayload (mconcat [ fldName <<> " is a remote field" ]) OCRel fldName rest -> do @@ -231,7 +234,7 @@ convExtRel fieldInfoMap relName mAlias selQ sessVarBldr prepValBldr = do -- Point to the name key relInfo <- withPathK "name" $ askRelType fieldInfoMap relName pgWhenRelErr - let (RelInfo _ relTy colMapping relTab _) = relInfo + let (RelInfo _ relTy colMapping relTab _ _) = relInfo (relCIM, relSPI) <- fetchRelDet relName relTab annSel <- convSelectQ relTab relCIM relSPI selQ sessVarBldr prepValBldr case relTy of @@ -277,6 +280,18 @@ selectP2 jsonAggSelect (sel, p) = where selectSQL = toSQL $ mkSQLSelect jsonAggSelect sel +selectQuerySQL :: JsonAggSelect -> AnnSimpleSel -> Q.Query +selectQuerySQL jsonAggSelect sel = + Q.fromBuilder $ toSQL $ mkSQLSelect jsonAggSelect sel + +selectAggregateQuerySQL :: AnnAggregateSelect -> Q.Query +selectAggregateQuerySQL = + Q.fromBuilder . toSQL . mkAggregateSelect + +connectionSelectQuerySQL :: ConnectionSelect S.SQLExp -> Q.Query +connectionSelectQuerySQL = + Q.fromBuilder . toSQL . mkConnectionSelect + asSingleRowJsonResp :: Q.Query -> [Q.PrepArg] -> Q.TxE QErr EncJSON asSingleRowJsonResp query args = encJFromBS . runIdentity . 
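-- For reference, a sketch of what 'buildEmptyMutResp' (added above in
-- Hasura.RQL.DML.Returning.Types) yields for a typical mutation output
-- (field names are illustrative):
--
--   buildEmptyMutResp $ MOutMultirowFields
--     [ (FieldName "affected_rows", MCount)
--     , (FieldName "returning",     MRet [])
--     ]
--   -- ==> {"affected_rows": 0, "returning": []}
--
--   buildEmptyMutResp (MOutSinglerowObject []) -- ==> {}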
Q.getRow diff --git a/server/src-lib/Hasura/RQL/DML/Select/Internal.hs b/server/src-lib/Hasura/RQL/DML/Select/Internal.hs index a6b1c8ca006..073273130da 100644 --- a/server/src-lib/Hasura/RQL/DML/Select/Internal.hs +++ b/server/src-lib/Hasura/RQL/DML/Select/Internal.hs @@ -14,7 +14,7 @@ import qualified Data.HashMap.Strict as HM import qualified Data.List.NonEmpty as NE import qualified Data.Text as T -import Hasura.GraphQL.Resolve.Types +import Hasura.GraphQL.Schema.Common import Hasura.Prelude import Hasura.RQL.DML.Internal import Hasura.RQL.DML.Select.Types @@ -635,7 +635,7 @@ processOrderByItems sourcePrefix' fieldAlias' similarArrayFields orderByItems = S.mkQIdenExp (mkBaseTableAlias sourcePrefix) $ toIden $ pgiColumn pgColInfo AOCObjectRelation relInfo relFilter rest -> withWriteObjectRelation $ do - let RelInfo relName _ colMapping relTable _ = relInfo + let RelInfo relName _ colMapping relTable _ _ = relInfo relSourcePrefix = mkObjectRelationTableAlias sourcePrefix relName fieldName = mkOrderByFieldName relName (relOrderByAlias, relOrdByExp) <- @@ -650,7 +650,7 @@ processOrderByItems sourcePrefix' fieldAlias' similarArrayFields orderByItems = ) AOCArrayAggregation relInfo relFilter aggOrderBy -> withWriteArrayRelation $ do - let RelInfo relName _ colMapping relTable _ = relInfo + let RelInfo relName _ colMapping relTable _ _ = relInfo fieldName = mkOrderByFieldName relName relSourcePrefix = mkArrayRelationSourcePrefix sourcePrefix fieldAlias similarArrayFields fieldName diff --git a/server/src-lib/Hasura/RQL/DML/Select/Types.hs b/server/src-lib/Hasura/RQL/DML/Select/Types.hs index 40c08f6121c..d8664f57f1a 100644 --- a/server/src-lib/Hasura/RQL/DML/Select/Types.hs +++ b/server/src-lib/Hasura/RQL/DML/Select/Types.hs @@ -3,21 +3,26 @@ module Hasura.RQL.DML.Select.Types where -import Control.Lens hiding ((.=)) +import Control.Lens.TH (makeLenses, makePrisms) import Data.Aeson.Types -import Data.Hashable -import Language.Haskell.TH.Syntax (Lift) +import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Aeson as J -import qualified Data.HashMap.Strict as HM -import qualified Data.List.NonEmpty as NE -import qualified Data.Sequence as Seq -import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G +import qualified Data.HashMap.Strict as HM +import qualified Data.List.NonEmpty as NE +import qualified Data.Sequence as Seq +import qualified Data.Text as T +import qualified Language.GraphQL.Draft.Syntax as G +import Hasura.GraphQL.Parser.Schema import Hasura.Prelude -import Hasura.RQL.Types -import qualified Hasura.SQL.DML as S +import Hasura.RQL.Types.BoolExp +import Hasura.RQL.Types.Column +import Hasura.RQL.Types.Common +import Hasura.RQL.Types.DML +import Hasura.RQL.Types.Function +import Hasura.RQL.Types.RemoteRelationship +import Hasura.RQL.Types.RemoteSchema +import qualified Hasura.SQL.DML as S import Hasura.SQL.Types type SelectQExt = SelectG ExtCol BoolExp Int @@ -195,14 +200,14 @@ data AnnColumnField data RemoteFieldArgument = RemoteFieldArgument - { _rfaArgument :: !G.Argument - , _rfaVariable :: !(Maybe [(G.VariableDefinition,J.Value)]) + { _rfaArgument :: !G.Name + , _rfaValue :: !(InputValue Variable) } deriving (Eq,Show) data RemoteSelect = RemoteSelect { _rselArgs :: ![RemoteFieldArgument] - , _rselSelection :: !G.SelectionSet + , _rselSelection :: !(G.SelectionSet G.NoFragments Variable) , _rselHasuraColumns :: !(HashSet PGColumnInfo) , _rselFieldCall :: !(NonEmpty FieldCall) , _rselRemoteSchema :: !RemoteSchemaInfo diff --git 
a/server/src-lib/Hasura/RQL/DML/Update.hs b/server/src-lib/Hasura/RQL/DML/Update.hs index 0a97cbf0450..e09af4c62b2 100644 --- a/server/src-lib/Hasura/RQL/DML/Update.hs +++ b/server/src-lib/Hasura/RQL/DML/Update.hs @@ -1,49 +1,50 @@ module Hasura.RQL.DML.Update - ( validateUpdateQueryWith - , validateUpdateQuery - , AnnUpdG(..) + ( AnnUpdG(..) , traverseAnnUpd - , AnnUpd , execUpdateQuery + , updateOperatorText , runUpdate ) where import Data.Aeson.Types -import Instances.TH.Lift () +import Instances.TH.Lift () -import qualified Data.HashMap.Strict as M -import qualified Data.Sequence as DS +import qualified Data.HashMap.Strict as M +import qualified Data.Sequence as DS import Hasura.EncJSON import Hasura.Prelude -import Hasura.RQL.DML.Insert (insertCheckExpr) +import Hasura.RQL.DML.Insert (insertCheckExpr) import Hasura.RQL.DML.Internal import Hasura.RQL.DML.Mutation import Hasura.RQL.DML.Returning +import Hasura.RQL.DML.Update.Types import Hasura.RQL.GBoolExp -import Hasura.RQL.Instances () +import Hasura.RQL.Instances () import Hasura.RQL.Types -import Hasura.Server.Version (HasVersion) +import Hasura.Server.Version (HasVersion) import Hasura.Session import Hasura.SQL.Types -import qualified Database.PG.Query as Q -import qualified Hasura.SQL.DML as S +import qualified Database.PG.Query as Q +import qualified Hasura.SQL.DML as S import qualified Data.Environment as Env import qualified Hasura.Tracing as Tracing -data AnnUpdG v - = AnnUpd - { uqp1Table :: !QualifiedTable - , uqp1SetExps :: ![(PGCol, v)] - , uqp1Where :: !(AnnBoolExp v, AnnBoolExp v) - , upq1Check :: !(AnnBoolExp v) - -- we don't prepare the arguments for returning - -- however the session variable can still be - -- converted as desired - , uqp1Output :: !(MutationOutputG v) - , uqp1AllCols :: ![PGColumnInfo] - } deriving (Show, Eq) + +-- NOTE: This function can be improved, because we use +-- the literal values defined below in the 'updateOperators' +-- function in 'Hasura.GraphQL.Schema.Mutation'. 
It would +-- be nice if we could avoid duplicating the string literal +-- values +updateOperatorText :: UpdOpExpG a -> Text +updateOperatorText (UpdSet _) = "_set" +updateOperatorText (UpdInc _) = "_inc" +updateOperatorText (UpdAppend _) = "_append" +updateOperatorText (UpdPrepend _) = "_prepend" +updateOperatorText (UpdDeleteKey _) = "_delete_key" +updateOperatorText (UpdDeleteElem _) = "_delete_elem" +updateOperatorText (UpdDeleteAtPath _) = "_delete_at_path" traverseAnnUpd :: (Applicative f) @@ -52,19 +53,17 @@ traverseAnnUpd -> f (AnnUpdG b) traverseAnnUpd f annUpd = AnnUpd tn - <$> traverse (traverse f) setExps + <$> traverse (traverse $ traverse f) opExps <*> ((,) <$> traverseAnnBoolExp f whr <*> traverseAnnBoolExp f fltr) <*> traverseAnnBoolExp f chk <*> traverseMutationOutput f mutOutput <*> pure allCols where - AnnUpd tn setExps (whr, fltr) chk mutOutput allCols = annUpd - -type AnnUpd = AnnUpdG S.SQLExp + AnnUpd tn opExps (whr, fltr) chk mutOutput allCols = annUpd mkUpdateCTE :: AnnUpd -> S.CTE -mkUpdateCTE (AnnUpd tn setExps (permFltr, wc) chk _ _) = +mkUpdateCTE (AnnUpd tn opExps (permFltr, wc) chk _ columnsInfo) = S.CTEUpdate update where update = @@ -74,11 +73,31 @@ mkUpdateCTE (AnnUpd tn setExps (permFltr, wc) chk _ _) = $ [ S.selectStar , S.Extractor (insertCheckExpr "update check constraint failed" checkExpr) Nothing ] - setExp = S.SetExp $ map S.SetExpItem setExps + setExp = S.SetExp $ map (expandOperator columnsInfo) opExps tableFltr = Just $ S.WhereFrag tableFltrExpr tableFltrExpr = toSQLBoolExp (S.QualTable tn) $ andAnnBoolExps permFltr wc checkExpr = toSQLBoolExp (S.QualTable tn) chk +expandOperator :: [PGColumnInfo] -> (PGCol, UpdOpExpG S.SQLExp) -> S.SetExpItem +expandOperator infos (column, op) = S.SetExpItem $ (column,) $ case op of + UpdSet e -> e + UpdInc e -> S.mkSQLOpExp S.incOp identifier (asNum e) + UpdAppend e -> S.mkSQLOpExp S.jsonbConcatOp identifier (asJSON e) + UpdPrepend e -> S.mkSQLOpExp S.jsonbConcatOp (asJSON e) identifier + UpdDeleteKey e -> S.mkSQLOpExp S.jsonbDeleteOp identifier (asText e) + UpdDeleteElem e -> S.mkSQLOpExp S.jsonbDeleteOp identifier (asInt e) + UpdDeleteAtPath a -> S.mkSQLOpExp S.jsonbDeleteAtPathOp identifier (asArray a) + where + identifier = S.SEIden $ toIden column + asInt e = S.SETyAnn e S.intTypeAnn + asText e = S.SETyAnn e S.textTypeAnn + asJSON e = S.SETyAnn e S.jsonbTypeAnn + asArray a = S.SETyAnn (S.SEArray a) S.textArrTypeAnn + asNum e = S.SETyAnn e $ + case find (\info -> pgiColumn info == column) infos <&> pgiType of + Just (PGColumnScalar s) -> S.mkTypeAnn $ PGTypeScalar s + _ -> S.numericTypeAnn + convInc :: (QErrM m) => (PGColumnType -> Value -> m S.SQLExp) @@ -181,7 +200,7 @@ validateUpdateQueryWith sessVarBldr prepValBldr uq = do convOp fieldInfoMap preSetCols updPerm (M.toList $ uqMul uq) $ convMul prepValBldr defItems <- withPathK "$default" $ - convOp fieldInfoMap preSetCols updPerm (zip (uqDefault uq) [()..]) convDefault + convOp fieldInfoMap preSetCols updPerm ((,()) <$> uqDefault uq) convDefault -- convert the returning cols into sql returing exp mAnnRetCols <- forM mRetCols $ \retCols -> @@ -190,8 +209,11 @@ validateUpdateQueryWith sessVarBldr prepValBldr uq = do resolvedPreSetItems <- M.toList <$> mapM (convPartialSQLExp sessVarBldr) preSetObj - let setExpItems = resolvedPreSetItems ++ setItems ++ incItems ++ - mulItems ++ defItems + let setExpItems = resolvedPreSetItems ++ + setItems ++ + incItems ++ + mulItems ++ + defItems when (null setExpItems) $ throw400 UnexpectedPayload "atleast one of $set, 
$inc, $mul has to be present" @@ -208,7 +230,7 @@ validateUpdateQueryWith sessVarBldr prepValBldr uq = do return $ AnnUpd tableName - setExpItems + (fmap UpdSet <$> setExpItems) (resolvedUpdFltr, annSQLBoolExp) resolvedUpdCheck (mkDefaultMutFlds mAnnRetCols) diff --git a/server/src-lib/Hasura/RQL/DML/Update/Types.hs b/server/src-lib/Hasura/RQL/DML/Update/Types.hs new file mode 100644 index 00000000000..48b81b7a25b --- /dev/null +++ b/server/src-lib/Hasura/RQL/DML/Update/Types.hs @@ -0,0 +1,36 @@ +module Hasura.RQL.DML.Update.Types where + + +import Hasura.Prelude + +import qualified Hasura.SQL.DML as S + +import Hasura.RQL.DML.Returning.Types +import Hasura.RQL.Types.BoolExp +import Hasura.RQL.Types.Column +import Hasura.SQL.Types + + +data AnnUpdG v + = AnnUpd + { uqp1Table :: !QualifiedTable + , uqp1OpExps :: ![(PGCol, UpdOpExpG v)] + , uqp1Where :: !(AnnBoolExp v, AnnBoolExp v) + , uqp1Check :: !(AnnBoolExp v) + -- we don't prepare the arguments for returning + -- however the session variable can still be + -- converted as desired + , uqp1Output :: !(MutationOutputG v) + , uqp1AllCols :: ![PGColumnInfo] + } deriving (Show, Eq) + +type AnnUpd = AnnUpdG S.SQLExp + +data UpdOpExpG v = UpdSet !v + | UpdInc !v + | UpdAppend !v + | UpdPrepend !v + | UpdDeleteKey !v + | UpdDeleteElem !v + | UpdDeleteAtPath ![v] + deriving (Eq, Show, Functor, Foldable, Traversable, Generic, Data) diff --git a/server/src-lib/Hasura/RQL/GBoolExp.hs b/server/src-lib/Hasura/RQL/GBoolExp.hs index 42b18bea87b..a5b61cf8898 100644 --- a/server/src-lib/Hasura/RQL/GBoolExp.hs +++ b/server/src-lib/Hasura/RQL/GBoolExp.hs @@ -313,7 +313,7 @@ annColExp rhsParser colInfoMap (ColExp fieldName colVal) = do return $ AVRel relInfo annRelBoolExp FIComputedField _ -> throw400 UnexpectedPayload "Computed columns can not be part of the where clause" - -- TODO Rakesh + -- TODO Rakesh (from master) FIRemoteRelationship{} -> throw400 UnexpectedPayload "remote field unsupported" @@ -335,7 +335,7 @@ convColRhs tableQual = \case bExps = map (mkFieldCompExp tableQual colFld) opExps return $ foldr (S.BEBin S.AndOp) (S.BELit True) bExps - AVRel (RelInfo _ _ colMapping relTN _) nesAnn -> do + AVRel (RelInfo _ _ colMapping relTN _ _) nesAnn -> do -- Convert the where clause on the relationship curVarNum <- get put $ curVarNum + 1 diff --git a/server/src-lib/Hasura/RQL/Instances.hs b/server/src-lib/Hasura/RQL/Instances.hs index 155cbed8438..2125d78fe37 100644 --- a/server/src-lib/Hasura/RQL/Instances.hs +++ b/server/src-lib/Hasura/RQL/Instances.hs @@ -9,51 +9,46 @@ import qualified Data.Aeson as J import qualified Data.HashMap.Strict as M import qualified Data.HashSet as S import qualified Data.URL.Template as UT +import qualified Database.PG.Query as Q import qualified Language.GraphQL.Draft.Syntax as G import qualified Language.Haskell.TH.Syntax as TH import qualified Text.Regex.TDFA as TDFA import qualified Text.Regex.TDFA.Pattern as TDFA -import qualified Database.PG.Query as Q +import Control.DeepSeq (NFData (..)) import Data.Functor.Product import Data.GADT.Compare +import Data.Text import Instances.TH.Lift () import System.Cron.Parser import System.Cron.Types -import Data.Text -instance NFData G.Argument -instance NFData G.Directive -instance NFData G.ExecutableDefinition -instance NFData G.Field instance NFData G.FragmentDefinition -instance NFData G.FragmentSpread instance NFData G.GType -instance NFData G.InlineFragment -instance NFData G.OperationDefinition instance NFData G.OperationType -instance NFData G.Selection -instance NFData 
G.TypedOperationDefinition -instance NFData G.Value -instance NFData G.ValueConst instance NFData G.VariableDefinition -instance (NFData a) => NFData (G.ObjectFieldG a) instance NFData UT.Variable instance NFData UT.TemplateItem instance NFData UT.URLTemplate -deriving instance NFData G.Alias -deriving instance NFData G.EnumValue -deriving instance NFData G.ExecutableDocument -deriving instance NFData G.ListType -deriving instance NFData G.Name -deriving instance NFData G.NamedType -deriving instance NFData G.Nullability -deriving instance NFData G.StringValue -deriving instance NFData G.Variable +instance NFData G.Name where + rnf = rnf . G.unName + +instance NFData a => NFData (G.Directive a) +instance NFData a => NFData (G.ExecutableDefinition a) +instance (NFData (a b), NFData b) => NFData (G.Field a b) +instance NFData a => NFData (G.FragmentSpread a) +instance (NFData (a b), NFData b) => NFData (G.InlineFragment a b) +instance (NFData (a b), NFData b) => NFData (G.OperationDefinition a b) +instance (NFData (a b), NFData b) => NFData (G.Selection a b) +instance (NFData (a b), NFData b) => NFData (G.TypedOperationDefinition a b) +instance NFData a => NFData (G.Value a) + deriving instance NFData G.Description -deriving instance (NFData a) => NFData (G.ListValueG a) -deriving instance (NFData a) => NFData (G.ObjectValueG a) +deriving instance NFData G.EnumValue +deriving instance NFData G.Nullability + +deriving instance NFData a => NFData (G.ExecutableDocument a) -- instances for CronSchedule from package `cron` instance NFData StepField @@ -89,13 +84,6 @@ deriving instance TH.Lift TDFA.PatternSetCharacterClass deriving instance TH.Lift TDFA.PatternSetCollatingElement deriving instance TH.Lift TDFA.PatternSetEquivalenceClass -instance (GEq f, GEq g) => GEq (Product f g) where - Pair a1 a2 `geq` Pair b1 b2 - | Just Refl <- a1 `geq` b1 - , Just Refl <- a2 `geq` b2 - = Just Refl - | otherwise = Nothing - instance (GCompare f, GCompare g) => GCompare (Product f g) where Pair a1 a2 `gcompare` Pair b1 b2 = case gcompare a1 b1 of GLT -> GLT @@ -121,5 +109,5 @@ instance Q.FromCol CronSchedule where Left err -> Left err Right dbCron -> case parseCronSchedule dbCron of - Left err' -> Left $ "invalid cron schedule " <> pack err' + Left err' -> Left $ "invalid cron schedule " <> pack err' Right cron -> Right cron diff --git a/server/src-lib/Hasura/RQL/Types.hs b/server/src-lib/Hasura/RQL/Types.hs index 17f37ed2082..973a4ba157d 100644 --- a/server/src-lib/Hasura/RQL/Types.hs +++ b/server/src-lib/Hasura/RQL/Types.hs @@ -4,7 +4,7 @@ module Hasura.RQL.Types , UserInfoM(..) , HasHttpManager (..) - , HasGCtxMap (..) + -- , HasGCtxMap (..) , SQLGenCtx(..) , HasSQLGenCtx(..) 
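-- A small usage sketch for the 'Q.FromCol CronSchedule' instance above (in
-- Hasura.RQL.Instances); 'parseCronSchedule' comes from the cron package and
-- the schedule strings are illustrative:
--
--   parseCronSchedule "*/5 * * * *"  -- Right <every five minutes>
--   parseCronSchedule "not a cron"   -- Left <parse error>, surfaced by the
--                                    -- instance as "invalid cron schedule ..."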
@@ -38,6 +38,8 @@ module Hasura.RQL.Types , module R ) where +import Control.Monad.Unique + import Hasura.Prelude import Hasura.Session import Hasura.SQL.Types @@ -64,8 +66,6 @@ import Hasura.RQL.Types.SchemaCache as R import Hasura.RQL.Types.SchemaCache.Build as R import Hasura.RQL.Types.Table as R -import qualified Hasura.GraphQL.Context as GC - import qualified Data.HashMap.Strict as M import qualified Data.Text as T import qualified Network.HTTP.Client as HTTP @@ -75,7 +75,7 @@ data QCtx { qcUserInfo :: !UserInfo , qcSchemaCache :: !SchemaCache , qcSQLCtx :: !SQLGenCtx - } deriving (Show, Eq) + } class HasQCtx a where getQCtx :: a -> QCtx @@ -143,13 +143,13 @@ instance (Monoid w, HasHttpManager m) => HasHttpManager (WriterT w m) where instance (HasHttpManager m) => HasHttpManager (TraceT m) where askHttpManager = lift askHttpManager -class (Monad m) => HasGCtxMap m where - askGCtxMap :: m GC.GCtxMap +-- class (Monad m) => HasGCtxMap m where +-- askGCtxMap :: m GC.GCtxMap -instance (HasGCtxMap m) => HasGCtxMap (ReaderT r m) where - askGCtxMap = lift askGCtxMap -instance (Monoid w, HasGCtxMap m) => HasGCtxMap (WriterT w m) where - askGCtxMap = lift askGCtxMap +-- instance (HasGCtxMap m) => HasGCtxMap (ReaderT r m) where +-- askGCtxMap = lift askGCtxMap +-- instance (Monoid w, HasGCtxMap m) => HasGCtxMap (WriterT w m) where +-- askGCtxMap = lift askGCtxMap newtype SQLGenCtx = SQLGenCtx @@ -184,7 +184,7 @@ instance (HasSystemDefined m) => HasSystemDefined (TraceT m) where newtype HasSystemDefinedT m a = HasSystemDefinedT { unHasSystemDefinedT :: ReaderT SystemDefined m a } - deriving ( Functor, Applicative, Monad, MonadTrans, MonadIO, MonadError e, MonadTx + deriving ( Functor, Applicative, Monad, MonadTrans, MonadIO, MonadUnique, MonadError e, MonadTx , HasHttpManager, HasSQLGenCtx, TableCoreInfoRM, CacheRM, CacheRWM, UserInfoM ) runHasSystemDefinedT :: SystemDefined -> HasSystemDefinedT m a -> m a diff --git a/server/src-lib/Hasura/RQL/Types/Action.hs b/server/src-lib/Hasura/RQL/Types/Action.hs index 70c466ad137..92b85b14c4e 100644 --- a/server/src-lib/Hasura/RQL/Types/Action.hs +++ b/server/src-lib/Hasura/RQL/Types/Action.hs @@ -5,8 +5,16 @@ module Hasura.RQL.Types.Action , ActionName(..) , ActionMutationKind(..) + , _ActionAsynchronous , ActionDefinition(..) + , adArguments + , adOutputType + , adType + , adHeaders + , adForwardClientHeaders + , adHandler , ActionType(..) + , _ActionMutation , CreateAction(..) , UpdateAction(..) , ActionDefinitionInput @@ -21,7 +29,6 @@ module Hasura.RQL.Types.Action , aiName , aiOutputObject , aiDefinition - , aiPgScalars , aiPermissions , aiComment , ActionPermissionInfo(..) @@ -31,15 +38,22 @@ module Hasura.RQL.Types.Action , ActionMetadata(..) , ActionPermissionMetadata(..) + + , AnnActionExecution(..) + , AnnActionMutationAsync(..) + , ActionExecContext(..) + , AsyncActionQueryFieldG(..) + , AnnActionAsyncQuery(..) 
) where -import Control.Lens (makeLenses) +import Control.Lens (makeLenses, makePrisms) import Hasura.Incremental (Cacheable) import Hasura.Prelude import Hasura.RQL.DDL.Headers -import Hasura.RQL.Types.CustomTypes +import Hasura.RQL.DML.Select.Types import Hasura.RQL.Types.Common +import Hasura.RQL.Types.CustomTypes import Hasura.Session import Hasura.SQL.Types import Language.Haskell.TH.Syntax (Lift) @@ -50,6 +64,8 @@ import qualified Data.Aeson.TH as J import qualified Data.HashMap.Strict as Map import qualified Database.PG.Query as Q import qualified Language.GraphQL.Draft.Syntax as G +import qualified Network.HTTP.Client as HTTP +import qualified Network.HTTP.Types as HTTP newtype ActionName = ActionName { unActionName :: G.Name } @@ -57,7 +73,10 @@ newtype ActionName , Hashable, DQuote, Lift, Generic, NFData, Cacheable) instance Q.FromCol ActionName where - fromCol bs = ActionName . G.Name <$> Q.fromCol bs + fromCol bs = do + text <- Q.fromCol bs + name <- G.mkName text `onNothing` Left (text <> " is not valid GraphQL name") + pure $ ActionName name instance Q.ToPrepArg ActionName where toPrepVal = Q.toPrepVal . G.unName . unActionName @@ -71,20 +90,21 @@ instance Cacheable ActionMutationKind $(J.deriveJSON J.defaultOptions { J.constructorTagModifier = J.snakeCase . drop 6} ''ActionMutationKind) +$(makePrisms ''ActionMutationKind) newtype ArgumentName = ArgumentName { unArgumentName :: G.Name } deriving ( Show, Eq, J.FromJSON, J.ToJSON, J.FromJSONKey, J.ToJSONKey , Hashable, DQuote, Lift, Generic, NFData, Cacheable) -data ArgumentDefinition +data ArgumentDefinition a = ArgumentDefinition { _argName :: !ArgumentName - , _argType :: !GraphQLType + , _argType :: !a , _argDescription :: !(Maybe G.Description) - } deriving (Show, Eq, Lift, Generic) -instance NFData ArgumentDefinition -instance Cacheable ArgumentDefinition + } deriving (Show, Eq, Functor, Foldable, Traversable, Lift, Generic) +instance (NFData a) => NFData (ArgumentDefinition a) +instance (Cacheable a) => Cacheable (ArgumentDefinition a) $(J.deriveJSON (J.aesonDrop 4 J.snakeCase) ''ArgumentDefinition) data ActionType @@ -93,20 +113,22 @@ data ActionType deriving (Show, Eq, Lift, Generic) instance NFData ActionType instance Cacheable ActionType +$(makePrisms ''ActionType) -data ActionDefinition a +data ActionDefinition a b = ActionDefinition - { _adArguments :: ![ArgumentDefinition] + { _adArguments :: ![a] , _adOutputType :: !GraphQLType , _adType :: !ActionType , _adHeaders :: ![HeaderConf] , _adForwardClientHeaders :: !Bool - , _adHandler :: !a + , _adHandler :: !b } deriving (Show, Eq, Lift, Functor, Foldable, Traversable, Generic) -instance (NFData a) => NFData (ActionDefinition a) -instance (Cacheable a) => Cacheable (ActionDefinition a) +instance (NFData a, NFData b) => NFData (ActionDefinition a b) +instance (Cacheable a, Cacheable b) => Cacheable (ActionDefinition a b) +$(makeLenses ''ActionDefinition) -instance (J.FromJSON a) => J.FromJSON (ActionDefinition a) where +instance (J.FromJSON a, J.FromJSON b) => J.FromJSON (ActionDefinition a b) where parseJSON = J.withObject "ActionDefinition" $ \o -> do _adArguments <- o J..:? 
"arguments" J..!= [] _adOutputType <- o J..: "output_type" @@ -120,7 +142,7 @@ instance (J.FromJSON a) => J.FromJSON (ActionDefinition a) where t -> fail $ "expected mutation or query, but found " <> t return ActionDefinition {..} -instance (J.ToJSON a) => J.ToJSON (ActionDefinition a) where +instance (J.ToJSON a, J.ToJSON b) => J.ToJSON (ActionDefinition a b) where toJSON (ActionDefinition args outputType actionType headers forwardClientHeaders handler) = let typeAndKind = case actionType of ActionQuery -> [ "type" J..= ("query" :: String)] @@ -134,7 +156,8 @@ instance (J.ToJSON a) => J.ToJSON (ActionDefinition a) where , "handler" J..= handler ] <> typeAndKind -type ResolvedActionDefinition = ActionDefinition ResolvedWebhook +type ResolvedActionDefinition = + ActionDefinition (ArgumentDefinition (G.GType, NonObjectCustomType)) ResolvedWebhook data ActionPermissionInfo = ActionPermissionInfo @@ -148,7 +171,7 @@ type ActionOutputFields = Map.HashMap G.Name G.GType getActionOutputFields :: AnnotatedObjectType -> ActionOutputFields getActionOutputFields = - Map.fromList . map (unObjectFieldName *** fst) . Map.toList . _aotAnnotatedFields + Map.fromList . map ( (unObjectFieldName . _ofdName) &&& (fst . _ofdType)) . toList . _otdFields data ActionInfo = ActionInfo @@ -156,13 +179,13 @@ data ActionInfo , _aiOutputObject :: !AnnotatedObjectType , _aiDefinition :: !ResolvedActionDefinition , _aiPermissions :: !ActionPermissionMap - , _aiPgScalars :: !(HashSet PGScalarType) , _aiComment :: !(Maybe Text) } deriving (Show, Eq) $(J.deriveToJSON (J.aesonDrop 3 J.snakeCase) ''ActionInfo) $(makeLenses ''ActionInfo) -type ActionDefinitionInput = ActionDefinition InputWebhook +type ActionDefinitionInput = + ActionDefinition (ArgumentDefinition GraphQLType) InputWebhook data CreateAction = CreateAction @@ -223,3 +246,53 @@ instance J.FromJSON ActionMetadata where <*> o J..:? "comment" <*> o J..: "definition" <*> o J..:? 
"permissions" J..!= [] + +----------------- Resolve Types ---------------- + +data AnnActionExecution v + = AnnActionExecution + { _aaeName :: !ActionName + , _aaeOutputType :: !GraphQLType -- ^ output type + , _aaeFields :: !(AnnFieldsG v) -- ^ output selection + , _aaePayload :: !J.Value -- ^ jsonified input arguments + , _aaeOutputFields :: !ActionOutputFields + -- ^ to validate the response fields from webhook + , _aaeDefinitionList :: ![(PGCol, PGScalarType)] + , _aaeWebhook :: !ResolvedWebhook + , _aaeHeaders :: ![HeaderConf] + , _aaeForwardClientHeaders :: !Bool + , _aaeStrfyNum :: !Bool + } deriving (Show, Eq) + +data AnnActionMutationAsync + = AnnActionMutationAsync + { _aamaName :: !ActionName + , _aamaPayload :: !J.Value -- ^ jsonified input arguments + } deriving (Show, Eq) + +data AsyncActionQueryFieldG v + = AsyncTypename !Text + | AsyncOutput !(AnnFieldsG v) + | AsyncId + | AsyncCreatedAt + | AsyncErrors + deriving (Show, Eq) + +type AsyncActionQueryFieldsG v = Fields (AsyncActionQueryFieldG v) + +data AnnActionAsyncQuery v + = AnnActionAsyncQuery + { _aaaqName :: !ActionName + , _aaaqActionId :: !v + , _aaaqOutputType :: !GraphQLType + , _aaaqFields :: !(AsyncActionQueryFieldsG v) + , _aaaqDefinitionList :: ![(PGCol, PGScalarType)] + , _aaaqStringifyNum :: !Bool + } deriving (Show, Eq) + +data ActionExecContext + = ActionExecContext + { _aecManager :: !HTTP.Manager + , _aecHeaders :: !HTTP.RequestHeaders + , _aecSessionVariables :: !SessionVariables + } diff --git a/server/src-lib/Hasura/RQL/Types/BoolExp.hs b/server/src-lib/Hasura/RQL/Types/BoolExp.hs index 50effe39b8d..a341864cc46 100644 --- a/server/src-lib/Hasura/RQL/Types/BoolExp.hs +++ b/server/src-lib/Hasura/RQL/Types/BoolExp.hs @@ -33,15 +33,14 @@ module Hasura.RQL.Types.BoolExp , AnnBoolExpPartialSQL , PreSetCols + , PreSetColsG , PreSetColsPartial ) where -import Hasura.Incremental (Cacheable) import Hasura.Prelude -import Hasura.RQL.Types.Column -import Hasura.RQL.Types.Common -import Hasura.Session -import Hasura.SQL.Types + +import qualified Data.Aeson.Types as J +import qualified Data.HashMap.Strict as M import qualified Hasura.SQL.DML as S @@ -54,8 +53,13 @@ import Data.Aeson.TH import Instances.TH.Lift () import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Aeson.Types as J -import qualified Data.HashMap.Strict as M +import Hasura.Incremental (Cacheable) +import Hasura.RQL.Types.Column +import Hasura.RQL.Types.Common +import Hasura.Session +import Hasura.SQL.Types + + data GExists a = GExists @@ -346,6 +350,7 @@ type AnnBoolExpSQL = AnnBoolExp S.SQLExp type AnnBoolExpFldPartialSQL = AnnBoolExpFld PartialSQLExp type AnnBoolExpPartialSQL = AnnBoolExp PartialSQLExp +type PreSetColsG v = M.HashMap PGCol v type PreSetColsPartial = M.HashMap PGCol PartialSQLExp type PreSetCols = M.HashMap PGCol S.SQLExp diff --git a/server/src-lib/Hasura/RQL/Types/Column.hs b/server/src-lib/Hasura/RQL/Types/Column.hs index dedffc8d21f..b642176bd91 100644 --- a/server/src-lib/Hasura/RQL/Types/Column.hs +++ b/server/src-lib/Hasura/RQL/Types/Column.hs @@ -35,23 +35,21 @@ import Control.Lens.TH import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH -import Data.Sequence.NonEmpty -import Language.Haskell.TH.Syntax (Lift) - import Hasura.Incremental (Cacheable) import Hasura.RQL.Instances () import Hasura.RQL.Types.Error import Hasura.SQL.Types import Hasura.SQL.Value +import Language.Haskell.TH.Syntax (Lift) newtype EnumValue - = EnumValue { getEnumValue :: T.Text } - deriving (Show, Eq, Lift, NFData, Hashable, 
ToJSON, ToJSONKey, FromJSON, FromJSONKey, Cacheable) + = EnumValue { getEnumValue :: G.Name } + deriving (Show, Eq, Ord, Lift, NFData, Hashable, ToJSON, ToJSONKey, FromJSON, FromJSONKey, Cacheable) newtype EnumValueInfo = EnumValueInfo { evComment :: Maybe T.Text - } deriving (Show, Eq, Lift, NFData, Hashable, Cacheable) + } deriving (Show, Eq, Ord, Lift, NFData, Hashable, Cacheable) $(deriveJSON (aesonDrop 2 snakeCase) ''EnumValueInfo) type EnumValues = M.HashMap EnumValue EnumValueInfo @@ -62,7 +60,7 @@ data EnumReference = EnumReference { erTable :: !QualifiedTable , erValues :: !EnumValues - } deriving (Show, Eq, Generic, Lift) + } deriving (Show, Eq, Ord, Generic, Lift) instance NFData EnumReference instance Hashable EnumReference instance Cacheable EnumReference @@ -79,7 +77,7 @@ data PGColumnType -- always have type @text@), but we really want to distinguish this case, since we treat it -- /completely/ differently in the GraphQL schema. | PGColumnEnumReference !EnumReference - deriving (Show, Eq, Generic) + deriving (Show, Eq, Ord, Generic) instance NFData PGColumnType instance Hashable PGColumnType instance Cacheable PGColumnType @@ -113,13 +111,13 @@ parsePGScalarValue columnType value = case columnType of PGColumnEnumReference (EnumReference tableName enumValues) -> WithScalarType PGText <$> (maybe (pure $ PGNull PGText) parseEnumValue =<< decodeValue value) where - parseEnumValue :: Text -> m PGScalarValue - parseEnumValue textValue = do - let enumTextValues = map getEnumValue $ M.keys enumValues - unless (textValue `elem` enumTextValues) $ throw400 UnexpectedPayload - $ "expected one of the values " <> T.intercalate ", " (map dquote enumTextValues) - <> " for type " <> snakeCaseQualObject tableName <<> ", given " <>> textValue - pure $ PGValText textValue + parseEnumValue :: G.Name -> m PGScalarValue + parseEnumValue enumValueName = do + let enums = map getEnumValue $ M.keys enumValues + unless (enumValueName `elem` enums) $ throw400 UnexpectedPayload + $ "expected one of the values " <> T.intercalate ", " (map dquote enums) + <> " for type " <> snakeCaseQualObject tableName <<> ", given " <>> enumValueName + pure $ PGValText $ G.unName enumValueName parsePGScalarValues :: (MonadError QErr m) @@ -149,7 +147,7 @@ data PGRawColumnInfo -- consistently identified by its position. 
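-- (A sketch of the enum branch of 'parsePGScalarValue' above, using a
-- hypothetical enum table: for an 'EnumReference' to a table "colors" with
-- values "red" and "blue", the JSON string "red" parses to
-- 'PGValText "red"', while "green" fails with the "expected one of the
-- values ..." error assembled from the enum's keys.)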
, prciType :: !PGScalarType , prciIsNullable :: !Bool - , prciDescription :: !(Maybe PGDescription) + , prciDescription :: !(Maybe G.Description) } deriving (Show, Eq, Generic) instance NFData PGRawColumnInfo instance Cacheable PGRawColumnInfo @@ -165,7 +163,7 @@ data PGColumnInfo , pgiPosition :: !Int , pgiType :: !PGColumnType , pgiIsNullable :: !Bool - , pgiDescription :: !(Maybe PGDescription) + , pgiDescription :: !(Maybe G.Description) } deriving (Show, Eq, Generic) instance NFData PGColumnInfo instance Cacheable PGColumnInfo diff --git a/server/src-lib/Hasura/RQL/Types/Common.hs b/server/src-lib/Hasura/RQL/Types/Common.hs index 6a10c1c63a8..abbbe4d3f55 100644 --- a/server/src-lib/Hasura/RQL/Types/Common.hs +++ b/server/src-lib/Hasura/RQL/Types/Common.hs @@ -49,13 +49,14 @@ import Hasura.Prelude import Hasura.RQL.DDL.Headers () import Hasura.RQL.Types.Error import Hasura.SQL.Types +import Hasura.RQL.DDL.Headers () + import Control.Lens (makeLenses) import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH -import Data.Sequence.NonEmpty import Data.URL.Template import Instances.TH.Lift () import Language.Haskell.TH.Syntax (Lift, Q, TExp) @@ -150,11 +151,12 @@ instance Q.FromCol RelType where data RelInfo = RelInfo - { riName :: !RelName - , riType :: !RelType - , riMapping :: !(HashMap PGCol PGCol) - , riRTable :: !QualifiedTable - , riIsManual :: !Bool + { riName :: !RelName + , riType :: !RelType + , riMapping :: !(HashMap PGCol PGCol) + , riRTable :: !QualifiedTable + , riIsManual :: !Bool + , riIsNullable :: !Bool } deriving (Show, Eq, Generic) instance NFData RelInfo instance Cacheable RelInfo @@ -250,7 +252,7 @@ data InpValInfo = InpValInfo { _iviDesc :: !(Maybe G.Description) , _iviName :: !G.Name - , _iviDefVal :: !(Maybe G.ValueConst) + , _iviDefVal :: !(Maybe (G.Value Void)) , _iviType :: !G.GType } deriving (Show, Eq, TH.Lift, Generic) instance Cacheable InpValInfo diff --git a/server/src-lib/Hasura/RQL/Types/ComputedField.hs b/server/src-lib/Hasura/RQL/Types/ComputedField.hs index 1deea120613..37c5e74bf26 100644 --- a/server/src-lib/Hasura/RQL/Types/ComputedField.hs +++ b/server/src-lib/Hasura/RQL/Types/ComputedField.hs @@ -4,21 +4,21 @@ Description: Schema cache types related to computed field module Hasura.RQL.Types.ComputedField where -import Hasura.Incremental (Cacheable) +import Hasura.Incremental (Cacheable) import Hasura.Prelude import Hasura.RQL.Types.Common import Hasura.RQL.Types.Function import Hasura.SQL.Types -import Control.Lens hiding ((.=)) +import Control.Lens hiding ((.=)) import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH -import Instances.TH.Lift () -import Language.Haskell.TH.Syntax (Lift) +import Instances.TH.Lift () +import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Sequence as Seq -import qualified Database.PG.Query as Q +import qualified Data.Sequence as Seq +import qualified Database.PG.Query as Q newtype ComputedFieldName = ComputedFieldName { unComputedFieldName :: NonEmptyText} diff --git a/server/src-lib/Hasura/RQL/Types/CustomTypes.hs b/server/src-lib/Hasura/RQL/Types/CustomTypes.hs index 8226fa01c9f..01f28339016 100644 --- a/server/src-lib/Hasura/RQL/Types/CustomTypes.hs +++ b/server/src-lib/Hasura/RQL/Types/CustomTypes.hs @@ -3,11 +3,12 @@ module Hasura.RQL.Types.CustomTypes , emptyCustomTypes , GraphQLType(..) , isListType - , isListType' , EnumTypeName(..) , EnumValueDefinition(..) , EnumTypeDefinition(..) , ScalarTypeDefinition(..) 
+ , intScalar, floatScalar, stringScalar, boolScalar, idScalar + , defaultScalars , InputObjectFieldName(..) , InputObjectFieldDefinition(..) , InputObjectTypeName(..) @@ -17,42 +18,39 @@ module Hasura.RQL.Types.CustomTypes , RelationshipName(..) , TypeRelationship(..) , trName, trType, trRemoteTable, trFieldMapping - , TypeRelationshipDefinition , ObjectTypeName(..) , ObjectTypeDefinition(..) - , CustomTypeName - , CustomTypeDefinition(..) - , CustomTypeDefinitionMap - , OutputFieldTypeInfo(..) - , AnnotatedObjectType(..) + , ObjectType + , AnnotatedScalarType(..) + , NonObjectCustomType(..) + , NonObjectTypeMap + , AnnotatedObjectFieldType(..) + , fieldTypeToScalarType + , AnnotatedObjectType , AnnotatedObjects - , AnnotatedRelationship - , NonObjectTypeMap(..) + , AnnotatedCustomTypes(..) + , emptyAnnotatedCustomTypes ) where -import Control.Lens.TH (makeLenses) -import Language.Haskell.TH.Syntax (Lift) +import Control.Lens.TH (makeLenses) +import Instances.TH.Lift () +import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Aeson as J -import qualified Data.Aeson.Casing as J -import qualified Data.Aeson.TH as J -import qualified Data.Text as T +import qualified Data.Aeson as J +import qualified Data.Aeson.Casing as J +import qualified Data.Aeson.TH as J +import qualified Data.HashMap.Strict as Map +import qualified Data.List.NonEmpty as NEList +import qualified Data.Text as T +import qualified Language.GraphQL.Draft.Parser as GParse +import qualified Language.GraphQL.Draft.Printer as GPrint +import qualified Language.GraphQL.Draft.Syntax as G +import qualified Text.Builder as T -import qualified Data.HashMap.Strict as Map -import qualified Data.List.NonEmpty as NEList -import Instances.TH.Lift () -import qualified Language.GraphQL.Draft.Parser as GParse -import qualified Language.GraphQL.Draft.Printer as GPrint -import qualified Language.GraphQL.Draft.Printer.Text as GPrintText -import qualified Language.GraphQL.Draft.Syntax as G - -import qualified Hasura.GraphQL.Validate.Types as VT - -import Hasura.Incremental (Cacheable) +import Hasura.Incremental (Cacheable) import Hasura.Prelude -import Hasura.RQL.Instances () import Hasura.RQL.Types.Column -import Hasura.RQL.Types.Common (RelType) +import Hasura.RQL.Types.Common (RelType) import Hasura.RQL.Types.Table import Hasura.SQL.Types @@ -61,7 +59,7 @@ newtype GraphQLType deriving (Show, Eq, Lift, Generic, NFData, Cacheable) instance J.ToJSON GraphQLType where - toJSON = J.toJSON . GPrintText.render GPrint.graphQLType . unGraphQLType + toJSON = J.toJSON . T.run . GPrint.graphQLType . 
unGraphQLType instance J.FromJSON GraphQLType where parseJSON = @@ -71,12 +69,7 @@ instance J.FromJSON GraphQLType where Right a -> return $ GraphQLType a isListType :: GraphQLType -> Bool -isListType (GraphQLType ty) = isListType' ty - -isListType' :: G.GType -> Bool -isListType' = \case - G.TypeList _ _ -> True - G.TypeNamed _ _ -> False +isListType (GraphQLType ty) = G.isListType ty newtype InputObjectFieldName = InputObjectFieldName { unInputObjectFieldName :: G.Name } @@ -87,14 +80,14 @@ data InputObjectFieldDefinition { _iofdName :: !InputObjectFieldName , _iofdDescription :: !(Maybe G.Description) , _iofdType :: !GraphQLType - -- TODO: default + -- TODO (from master): default } deriving (Show, Eq, Lift, Generic) instance NFData InputObjectFieldDefinition instance Cacheable InputObjectFieldDefinition $(J.deriveJSON (J.aesonDrop 5 J.snakeCase) ''InputObjectFieldDefinition) newtype InputObjectTypeName - = InputObjectTypeName { unInputObjectTypeName :: G.NamedType } + = InputObjectTypeName { unInputObjectTypeName :: G.Name } deriving (Show, Eq, Ord, Hashable, J.FromJSON, J.ToJSON, DQuote, Lift, Generic, NFData, Cacheable) data InputObjectTypeDefinition @@ -112,7 +105,7 @@ newtype ObjectFieldName deriving ( Show, Eq, Ord, Hashable, J.FromJSON, J.ToJSON, DQuote , J.FromJSONKey, J.ToJSONKey, Lift, Generic, NFData, Cacheable) -data ObjectFieldDefinition +data ObjectFieldDefinition a = ObjectFieldDefinition { _ofdName :: !ObjectFieldName -- we don't care about field arguments/directives @@ -121,10 +114,10 @@ data ObjectFieldDefinition -- context will be hard to pass to the webhook , _ofdArguments :: !(Maybe J.Value) , _ofdDescription :: !(Maybe G.Description) - , _ofdType :: !GraphQLType - } deriving (Show, Eq, Lift, Generic) -instance NFData ObjectFieldDefinition -instance Cacheable ObjectFieldDefinition + , _ofdType :: !a + } deriving (Show, Eq, Lift, Functor, Foldable, Traversable, Generic) +instance (NFData a) => NFData (ObjectFieldDefinition a) +instance (Cacheable a) => Cacheable (ObjectFieldDefinition a) $(J.deriveJSON (J.aesonDrop 4 J.snakeCase) ''ObjectFieldDefinition) newtype RelationshipName @@ -141,39 +134,49 @@ data TypeRelationship t f instance (NFData t, NFData f) => NFData (TypeRelationship t f) instance (Cacheable t, Cacheable f) => Cacheable (TypeRelationship t f) $(makeLenses ''TypeRelationship) - -type TypeRelationshipDefinition = - TypeRelationship QualifiedTable PGCol - $(J.deriveJSON (J.aesonDrop 3 J.snakeCase) ''TypeRelationship) newtype ObjectTypeName - = ObjectTypeName { unObjectTypeName :: G.NamedType } + = ObjectTypeName { unObjectTypeName :: G.Name } deriving ( Show, Eq, Ord, Hashable, J.FromJSON, J.FromJSONKey, DQuote , J.ToJSONKey, J.ToJSON, Lift, Generic, NFData, Cacheable) -data ObjectTypeDefinition +data ObjectTypeDefinition a b c = ObjectTypeDefinition { _otdName :: !ObjectTypeName , _otdDescription :: !(Maybe G.Description) - , _otdFields :: !(NEList.NonEmpty ObjectFieldDefinition) - , _otdRelationships :: !(Maybe [TypeRelationshipDefinition]) + , _otdFields :: !(NonEmpty (ObjectFieldDefinition a)) + , _otdRelationships :: !(Maybe (NonEmpty (TypeRelationship b c))) } deriving (Show, Eq, Lift, Generic) -instance NFData ObjectTypeDefinition -instance Cacheable ObjectTypeDefinition +instance (NFData a, NFData b, NFData c) => NFData (ObjectTypeDefinition a b c) +instance (Cacheable a, Cacheable b, Cacheable c) => Cacheable (ObjectTypeDefinition a b c) $(J.deriveJSON (J.aesonDrop 4 J.snakeCase) ''ObjectTypeDefinition) data ScalarTypeDefinition = 
ScalarTypeDefinition - { _stdName :: !G.NamedType + { _stdName :: !G.Name , _stdDescription :: !(Maybe G.Description) } deriving (Show, Eq, Lift, Generic) instance NFData ScalarTypeDefinition instance Cacheable ScalarTypeDefinition +instance Hashable ScalarTypeDefinition $(J.deriveJSON (J.aesonDrop 4 J.snakeCase) ''ScalarTypeDefinition) +-- default scalar names +intScalar, floatScalar, stringScalar, boolScalar, idScalar :: G.Name +intScalar = $$(G.litName "Int") +floatScalar = $$(G.litName "Float") +stringScalar = $$(G.litName "String") +boolScalar = $$(G.litName "Boolean") +idScalar = $$(G.litName "ID") + +defaultScalars :: [ScalarTypeDefinition] +defaultScalars = + map (flip ScalarTypeDefinition Nothing) + [intScalar, floatScalar, stringScalar, boolScalar, idScalar] + newtype EnumTypeName - = EnumTypeName { unEnumTypeName :: G.NamedType } + = EnumTypeName { unEnumTypeName :: G.Name } deriving (Show, Eq, Ord, Hashable, J.FromJSON, J.ToJSON, DQuote, Lift, Generic, NFData, Cacheable) data EnumValueDefinition @@ -196,23 +199,13 @@ instance NFData EnumTypeDefinition instance Cacheable EnumTypeDefinition $(J.deriveJSON (J.aesonDrop 4 J.snakeCase) ''EnumTypeDefinition) -data CustomTypeDefinition - = CustomTypeScalar !ScalarTypeDefinition - | CustomTypeEnum !EnumTypeDefinition - | CustomTypeInputObject !InputObjectTypeDefinition - | CustomTypeObject !ObjectTypeDefinition - deriving (Show, Eq, Lift) -$(J.deriveJSON J.defaultOptions ''CustomTypeDefinition) - -type CustomTypeDefinitionMap = Map.HashMap G.NamedType CustomTypeDefinition -newtype CustomTypeName - = CustomTypeName { unCustomTypeName :: G.NamedType } - deriving (Show, Eq, Hashable, J.ToJSONKey, J.FromJSONKey) +type ObjectType = + ObjectTypeDefinition GraphQLType QualifiedTable PGCol data CustomTypes = CustomTypes { _ctInputObjects :: !(Maybe [InputObjectTypeDefinition]) - , _ctObjects :: !(Maybe [ObjectTypeDefinition]) + , _ctObjects :: !(Maybe [ObjectType]) , _ctScalars :: !(Maybe [ScalarTypeDefinition]) , _ctEnums :: !(Maybe [EnumTypeDefinition]) } deriving (Show, Eq, Lift, Generic) @@ -223,29 +216,53 @@ $(J.deriveJSON (J.aesonDrop 3 J.snakeCase) ''CustomTypes) emptyCustomTypes :: CustomTypes emptyCustomTypes = CustomTypes Nothing Nothing Nothing Nothing -type AnnotatedRelationship = - TypeRelationship TableInfo PGColumnInfo +data AnnotatedScalarType + = ASTCustom !ScalarTypeDefinition + | ASTReusedPgScalar !G.Name !PGScalarType + deriving (Show, Eq, Lift) +$(J.deriveJSON J.defaultOptions ''AnnotatedScalarType) -data OutputFieldTypeInfo - = OutputFieldScalar !VT.ScalarTyInfo - | OutputFieldEnum !VT.EnumTyInfo +data NonObjectCustomType + = NOCTScalar !AnnotatedScalarType + | NOCTEnum !EnumTypeDefinition + | NOCTInputObject !InputObjectTypeDefinition + deriving (Show, Eq, Lift) +$(J.deriveJSON J.defaultOptions ''NonObjectCustomType) + +type NonObjectTypeMap = Map.HashMap G.Name NonObjectCustomType + +data AnnotatedObjectFieldType + = AOFTScalar !AnnotatedScalarType + | AOFTEnum !EnumTypeDefinition deriving (Show, Eq) +$(J.deriveToJSON J.defaultOptions ''AnnotatedObjectFieldType) -data AnnotatedObjectType - = AnnotatedObjectType - { _aotDefinition :: !ObjectTypeDefinition - , _aotAnnotatedFields :: !(Map.HashMap ObjectFieldName (G.GType, OutputFieldTypeInfo)) - , _aotRelationships :: !(Map.HashMap RelationshipName AnnotatedRelationship) - } deriving (Show, Eq) +fieldTypeToScalarType :: AnnotatedObjectFieldType -> PGScalarType +fieldTypeToScalarType = \case + AOFTEnum _ -> PGText + AOFTScalar annotatedScalar -> annotatedScalarToPgScalar 
annotatedScalar + where + annotatedScalarToPgScalar = \case + ASTReusedPgScalar _ scalarType -> scalarType + ASTCustom ScalarTypeDefinition{..} -> + if | _stdName == idScalar -> PGText + | _stdName == intScalar -> PGInteger + | _stdName == floatScalar -> PGFloat + | _stdName == stringScalar -> PGText + | _stdName == boolScalar -> PGBoolean + | otherwise -> PGJSON -instance J.ToJSON AnnotatedObjectType where - toJSON = J.toJSON . show +type AnnotatedObjectType = + ObjectTypeDefinition (G.GType, AnnotatedObjectFieldType) TableInfo PGColumnInfo -type AnnotatedObjects = Map.HashMap ObjectTypeName AnnotatedObjectType +type AnnotatedObjects = Map.HashMap G.Name AnnotatedObjectType -newtype NonObjectTypeMap - = NonObjectTypeMap { unNonObjectTypeMap :: VT.TypeMap } - deriving (Show, Eq, Semigroup, Monoid) +data AnnotatedCustomTypes + = AnnotatedCustomTypes + { _actNonObjects :: !NonObjectTypeMap + , _actObjects :: !AnnotatedObjects + } deriving (Show, Eq) -instance J.ToJSON NonObjectTypeMap where - toJSON = J.toJSON . show +emptyAnnotatedCustomTypes :: AnnotatedCustomTypes +emptyAnnotatedCustomTypes = + AnnotatedCustomTypes mempty mempty diff --git a/server/src-lib/Hasura/RQL/Types/Error.hs b/server/src-lib/Hasura/RQL/Types/Error.hs index 237d5b6d636..e4810625407 100644 --- a/server/src-lib/Hasura/RQL/Types/Error.hs +++ b/server/src-lib/Hasura/RQL/Types/Error.hs @@ -344,7 +344,7 @@ formatMsg str = case T.splitOn "the key " txt of where txt = T.pack str -runAesonParser :: (QErrM m) => (Value -> Parser a) -> Value -> m a +runAesonParser :: (QErrM m) => (v -> Parser a) -> v -> m a runAesonParser p = liftIResult . iparse p diff --git a/server/src-lib/Hasura/RQL/Types/QueryCollection.hs b/server/src-lib/Hasura/RQL/Types/QueryCollection.hs index 580c47c1a08..bce0756a1f7 100644 --- a/server/src-lib/Hasura/RQL/Types/QueryCollection.hs +++ b/server/src-lib/Hasura/RQL/Types/QueryCollection.hs @@ -1,20 +1,34 @@ -module Hasura.RQL.Types.QueryCollection where +module Hasura.RQL.Types.QueryCollection + ( CollectionName + , CollectionDef(..) + , CreateCollection(..) + , AddQueryToCollection(..) + , DropQueryFromCollection(..) + , DropCollection(..) + , CollectionReq(..) + , GQLQuery(..) + , GQLQueryWithText(..) + , QueryName(..) + , ListedQuery(..) 
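-- (For reference, a summary sketch of 'fieldTypeToScalarType' as defined
-- above in Hasura.RQL.Types.CustomTypes: enums and the built-in ID/String
-- scalars map to PGText, Int to PGInteger, Float to PGFloat, Boolean to
-- PGBoolean, a reused Postgres scalar keeps its original type, and any
-- other custom scalar falls back to PGJSON.)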
+ , getGQLQuery + , queryWithoutTypeNames + , stripTypenames + ) where -import Hasura.GraphQL.Validate.Types (stripTypenames) -import Hasura.Incremental (Cacheable) +import Hasura.Incremental (Cacheable) import Hasura.Prelude -import Hasura.RQL.Types.Common (NonEmptyText) +import Hasura.RQL.Instances () +import Hasura.RQL.Types.Common (NonEmptyText) import Hasura.SQL.Types import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH -import Language.GraphQL.Draft.Instances () -import Language.Haskell.TH.Syntax (Lift) +import Language.Haskell.TH.Syntax (Lift) -import qualified Data.Text as T -import qualified Database.PG.Query as Q -import qualified Language.GraphQL.Draft.Syntax as G +import qualified Data.Text as T +import qualified Database.PG.Query as Q +import qualified Language.GraphQL.Draft.Syntax as G newtype CollectionName = CollectionName {unCollectionName :: NonEmptyText} @@ -28,7 +42,7 @@ newtype QueryName deriving (Show, Eq, Ord, NFData, Hashable, Lift, ToJSON, ToJSONKey, FromJSON, DQuote, Generic, Arbitrary, Cacheable) newtype GQLQuery - = GQLQuery {unGQLQuery :: G.ExecutableDocument} + = GQLQuery { unGQLQuery :: G.ExecutableDocument G.Name } deriving (Show, Eq, NFData, Hashable, Lift, ToJSON, FromJSON, Cacheable) newtype GQLQueryWithText @@ -50,6 +64,39 @@ queryWithoutTypeNames = GQLQuery . G.ExecutableDocument . stripTypenames . G.getExecutableDefinitions . unGQLQuery +-- WIP NOTE +-- this was lifted from Validate. Should this be here? +stripTypenames :: forall var. [G.ExecutableDefinition var] -> [G.ExecutableDefinition var] +stripTypenames = map filterExecDef + where + filterExecDef :: G.ExecutableDefinition var -> G.ExecutableDefinition var + filterExecDef = \case + G.ExecutableDefinitionOperation opDef -> + G.ExecutableDefinitionOperation $ filterOpDef opDef + G.ExecutableDefinitionFragment fragDef -> + let newSelset = filterSelSet $ G._fdSelectionSet fragDef + in G.ExecutableDefinitionFragment fragDef{G._fdSelectionSet = newSelset} + + filterOpDef = \case + G.OperationDefinitionTyped typeOpDef -> + let newSelset = filterSelSet $ G._todSelectionSet typeOpDef + in G.OperationDefinitionTyped typeOpDef{G._todSelectionSet = newSelset} + G.OperationDefinitionUnTyped selset -> + G.OperationDefinitionUnTyped $ filterSelSet selset + + filterSelSet :: [G.Selection frag var'] -> [G.Selection frag var'] + filterSelSet = mapMaybe filterSel + filterSel :: G.Selection frag var' -> Maybe (G.Selection frag var') + filterSel s = case s of + G.SelectionField f -> + if G._fName f == $$(G.litName "__typename") + then Nothing + else + let newSelset = filterSelSet $ G._fSelectionSet f + in Just $ G.SelectionField f{G._fSelectionSet = newSelset} + _ -> Just s + + data ListedQuery = ListedQuery { _lqName :: !QueryName diff --git a/server/src-lib/Hasura/RQL/Types/RemoteRelationship.hs b/server/src-lib/Hasura/RQL/Types/RemoteRelationship.hs index 0e25d4b8ebe..01193eae2cc 100644 --- a/server/src-lib/Hasura/RQL/Types/RemoteRelationship.hs +++ b/server/src-lib/Hasura/RQL/Types/RemoteRelationship.hs @@ -22,7 +22,6 @@ import Hasura.RQL.Types.Common import Hasura.RQL.Types.RemoteSchema import Hasura.SQL.Types - import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH @@ -51,70 +50,68 @@ fromRemoteRelationship = FieldName . 
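-- A sketch of 'stripTypenames' (defined above in
-- Hasura.RQL.Types.QueryCollection) on a hypothetical document:
--
--   query { author { name __typename } __typename }
--
-- is rewritten to
--
--   query { author { name } }
--
-- i.e. '__typename' selections are dropped at every depth; this is the
-- normalisation applied by 'queryWithoutTypeNames'.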
remoteRelationshipNameToText -- | Resolved remote relationship data RemoteFieldInfo = RemoteFieldInfo - { _rfiName :: !RemoteRelationshipName + { _rfiName :: !RemoteRelationshipName -- ^ Field name to which we'll map the remote in hasura; this becomes part -- of the hasura schema. - , _rfiGType :: G.GType - , _rfiParamMap :: !(HashMap G.Name InpValInfo) - -- ^ Fully resolved arguments (no variable references, since this uses - -- 'G.ValueConst' not 'G.Value'). - , _rfiHasuraFields :: !(HashSet PGColumnInfo) - , _rfiRemoteFields :: !(NonEmpty FieldCall) - , _rfiRemoteSchema :: !RemoteSchemaInfo + , _rfiParamMap :: !(HashMap G.Name G.InputValueDefinition) + -- ^ Input arguments to the remote field info; The '_rfiParamMap' will only + -- include the arguments to the remote field that is being joined. The + -- names of the arguments here are modified, it will be in the format of + -- _remote_rel__ + , _rfiHasuraFields :: !(HashSet PGColumnInfo) + -- ^ Hasura fields used to join the remote schema node + , _rfiRemoteFields :: !RemoteFields + , _rfiRemoteSchema :: !RemoteSchemaInfo + , _rfiSchemaIntrospect :: G.SchemaIntrospection + -- ^ The introspection data is used to make parsers for the arguments and the selection set + , _rfiRemoteSchemaName :: !RemoteSchemaName + -- ^ Name of the remote schema, that's used for joining } deriving (Show, Eq, Generic) instance Cacheable RemoteFieldInfo instance ToJSON RemoteFieldInfo where toJSON RemoteFieldInfo{..} = object [ "name" .= _rfiName - , "g_type" .= toJsonGType _rfiGType , "param_map" .= fmap toJsonInpValInfo _rfiParamMap , "hasura_fields" .= _rfiHasuraFields - , "remote_fields" .= RemoteFields _rfiRemoteFields + , "remote_fields" .= _rfiRemoteFields , "remote_schema" .= _rfiRemoteSchema ] where - -- | Convert to JSON, using Either as an auxilliary type. - toJsonGType gtype = - toJSON - (case gtype of - G.TypeNamed (G.Nullability nullability) namedType -> - Left (nullability, namedType) - G.TypeList (G.Nullability nullability) (G.ListType listType) -> - Right (nullability, listType)) - - toJsonInpValInfo InpValInfo {..} = + toJsonInpValInfo (G.InputValueDefinition desc name type' defVal) = object - [ "desc" .= _iviDesc - , "name" .= _iviName - , "def_val" .= fmap gValueConstToValue _iviDefVal - , "type" .= _iviType + [ "desc" .= desc + , "name" .= name + , "def_val" .= fmap gValueToJSONValue defVal + , "type" .= type' ] - gValueConstToValue = + gValueToJSONValue :: G.Value Void -> Value + gValueToJSONValue = \case - (G.VCInt i) -> toJSON i - (G.VCFloat f) -> toJSON f - (G.VCString (G.StringValue s)) -> toJSON s - (G.VCBoolean b) -> toJSON b - G.VCNull -> Null - (G.VCEnum s) -> toJSON s - (G.VCList (G.ListValueG list)) -> toJSON (map gValueConstToValue list) - (G.VCObject (G.ObjectValueG xs)) -> constFieldsToObject xs + G.VNull -> Null + G.VInt i -> toJSON i + G.VFloat f -> toJSON f + G.VString s -> toJSON s + G.VBoolean b -> toJSON b + G.VEnum s -> toJSON s + G.VList list -> toJSON (map gValueToJSONValue list) + G.VObject obj -> fieldsToObject obj - constFieldsToObject = + fieldsToObject = Object . HM.fromList . map - (\G.ObjectFieldG {_ofName = G.Name name, _ofValue} -> - (name, gValueConstToValue _ofValue)) + (\(name, val) -> + (G.unName name, gValueToJSONValue val)) . + HM.toList -- | For some 'FieldCall', for instance, associates a field argument name with -- either a list of either scalar values or some 'G.Variable' we are closed -- over (brought into scope, e.g. in 'rtrHasuraFields'. 
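-- An illustrative example of the format accepted by the 'FromJSON' instance
-- below (argument and variable names are hypothetical): the metadata
-- fragment
--
--   {"where": {"user_id": "$id"}, "limit": 10}
--
-- parses to a 'RemoteArguments' with two entries: "where", a 'G.VObject'
-- binding "user_id" to the variable 'id' (a leading '$' marks a variable),
-- and "limit", parsed as 'G.VInt 10'.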
 -- | For some 'FieldCall', for instance, associates a field argument name with
 -- either a list of scalar values or some 'G.Variable' we are closed
 -- over (brought into scope, e.g. in 'rtrHasuraFields').
 newtype RemoteArguments =
   RemoteArguments
-    { getRemoteArguments :: [G.ObjectFieldG G.Value]
+    { getRemoteArguments :: (HashMap G.Name (G.Value G.Name))
     } deriving (Show, Eq, Lift, Cacheable, NFData)

 instance ToJSON RemoteArguments where
@@ -123,19 +120,22 @@ instance ToJSON RemoteArguments where
       fieldsToObject =
         Object .
         HM.fromList .
-        map (\G.ObjectFieldG {_ofName=G.Name name, _ofValue} -> (name, gValueToValue _ofValue))
+        map
+          (\(name, val) ->
+             (G.unName name, gValueToValue val)) .
+        HM.toList

       gValueToValue =
         \case
-          (G.VVariable (G.Variable v)) -> toJSON ("$" <> v)
-          (G.VInt i) -> toJSON i
-          (G.VFloat f) -> toJSON f
-          (G.VString (G.StringValue s)) -> toJSON s
-          (G.VBoolean b) -> toJSON b
-          G.VNull -> Null
-          (G.VEnum s) -> toJSON s
-          (G.VList (G.ListValueG list)) -> toJSON (map gValueToValue list)
-          (G.VObject (G.ObjectValueG xs)) -> fieldsToObject xs
+          G.VVariable v -> toJSON ("$" <> G.unName v)
+          G.VInt i -> toJSON i
+          G.VFloat f -> toJSON f
+          G.VString s -> toJSON s
+          G.VBoolean b -> toJSON b
+          G.VNull -> Null
+          G.VEnum s -> toJSON s
+          G.VList list -> toJSON (map gValueToValue list)
+          G.VObject obj -> fieldsToObject obj

 instance FromJSON RemoteArguments where
   parseJSON = \case
@@ -143,26 +143,33 @@ instance FromJSON RemoteArguments where
     _ -> fail "Remote arguments should be an object of keys."
     where
       -- Parsing GraphQL input arguments from JSON
-      parseObjectFieldsToGValue hashMap =
-        traverse
+      parseObjectFieldsToGValue hashMap = do
+        fields <-
+          traverse
            (\(key, value) -> do
-               name <- parseJSON (String key)
-               parsedValue <- parseValueAsGValue value
-               pure G.ObjectFieldG {_ofName = name, _ofValue = parsedValue})
-          (HM.toList hashMap)
+               name <- case G.mkName key of
+                 Nothing -> fail $ T.unpack key <> " is an invalid key name"
+                 Just name' -> pure name'
+               parsedValue <- parseValueAsGValue value
+               pure (name, parsedValue))
+            (HM.toList hashMap)
+        pure $ HM.fromList fields

       parseValueAsGValue =
         \case
           Object obj ->
-            fmap (G.VObject . G.ObjectValueG) (parseObjectFieldsToGValue obj)
+            fmap G.VObject (parseObjectFieldsToGValue obj)
           Array array ->
-            fmap (G.VList . G.ListValueG . toList) (traverse parseValueAsGValue array)
+            fmap (G.VList . toList) (traverse parseValueAsGValue array)
           String text ->
             case T.uncons text of
               Just ('$', rest)
                 | T.null rest -> fail "Invalid variable name."
-                | otherwise -> pure (G.VVariable (G.Variable (G.Name rest)))
+                | otherwise ->
+                    case G.mkName rest of
+                      Nothing -> fail "Invalid variable name."
+                      Just name' -> pure $ G.VVariable name'
-              _ -> pure (G.VString (G.StringValue text))
+              _ -> pure (G.VString text)
           Number !scientificNum ->
             pure (either (\(_::Float) -> G.VFloat scientificNum) G.VInt (floatingOrInteger scientificNum))
           Bool !boolean -> pure (G.VBoolean boolean)
@@ -172,7 +179,7 @@ instance FromJSON RemoteArguments where
 --
 -- https://graphql.github.io/graphql-spec/June2018/#sec-Language.Arguments
 --
--- TODO we don't seem to support empty RemoteArguments (like 'hello'), but this seems arbitrary:
+-- TODO (from master) we don't seem to support empty RemoteArguments (like 'hello'), but this seems arbitrary:
 data FieldCall =
   FieldCall
     { fcName :: !G.Name
@@ -237,7 +244,7 @@ data RemoteRelationship =
   -- ^ Field name to which we'll map the remote in hasura; this becomes part
   -- of the hasura schema.
   , rtrTable        :: !QualifiedTable
-  , rtrHasuraFields :: !(Set FieldName) -- TODO? change to PGCol
+  , rtrHasuraFields :: !(Set FieldName) -- TODO (from master)?
change to PGCol -- ^ The hasura fields from 'rtrTable' that will be in scope when resolving -- the remote objects in 'rtrRemoteField'. , rtrRemoteSchema :: !RemoteSchemaName diff --git a/server/src-lib/Hasura/RQL/Types/RemoteSchema.hs b/server/src-lib/Hasura/RQL/Types/RemoteSchema.hs index f64582a7592..e149e8a4539 100644 --- a/server/src-lib/Hasura/RQL/Types/RemoteSchema.hs +++ b/server/src-lib/Hasura/RQL/Types/RemoteSchema.hs @@ -27,13 +27,9 @@ newtype RemoteSchemaName , Generic, Cacheable, Arbitrary ) -remoteSchemaNameToTxt :: RemoteSchemaName -> Text -remoteSchemaNameToTxt = unNonEmptyText . unRemoteSchemaName - data RemoteSchemaInfo = RemoteSchemaInfo - { rsName :: !RemoteSchemaName - , rsUrl :: !N.URI + { rsUrl :: !N.URI , rsHeaders :: ![HeaderConf] , rsFwdClientHeaders :: !Bool , rsTimeoutSeconds :: !Int @@ -94,16 +90,15 @@ getUrlFromEnv env urlFromEnv = do validateRemoteSchemaDef :: (MonadError QErr m, MonadIO m) => Env.Environment - -> RemoteSchemaName -> RemoteSchemaDef -> m RemoteSchemaInfo -validateRemoteSchemaDef env rsName (RemoteSchemaDef mUrl mUrlEnv hdrC fwdHdrs mTimeout) = +validateRemoteSchemaDef env (RemoteSchemaDef mUrl mUrlEnv hdrC fwdHdrs mTimeout) = case (mUrl, mUrlEnv) of (Just url, Nothing) -> - return $ RemoteSchemaInfo rsName url hdrs fwdHdrs timeout + return $ RemoteSchemaInfo url hdrs fwdHdrs timeout (Nothing, Just urlEnv) -> do - url <- getUrlFromEnv env urlEnv - return $ RemoteSchemaInfo rsName url hdrs fwdHdrs timeout + url <- getUrlFromEnv env urlEnv + return $ RemoteSchemaInfo url hdrs fwdHdrs timeout (Nothing, Nothing) -> throw400 InvalidParams "both `url` and `url_from_env` can't be empty" (Just _, Just _) -> diff --git a/server/src-lib/Hasura/RQL/Types/SchemaCache.hs b/server/src-lib/Hasura/RQL/Types/SchemaCache.hs index a99c26ecc9e..2996e8145cc 100644 --- a/server/src-lib/Hasura/RQL/Types/SchemaCache.hs +++ b/server/src-lib/Hasura/RQL/Types/SchemaCache.hs @@ -17,12 +17,8 @@ module Hasura.RQL.Types.SchemaCache , TableCache , ActionCache - , OutputFieldTypeInfo(..) - , AnnotatedObjectType(..) - , AnnotatedObjects , TypeRelationship(..) , trName, trType, trRemoteTable, trFieldMapping - , NonObjectTypeMap(..) , TableCoreInfoG(..) , TableRawInfo , TableCoreInfo @@ -47,6 +43,8 @@ module Hasura.RQL.Types.SchemaCache , isMutable , mutableView + , IntrospectionResult(..) + , ParsedIntrospection(..) , RemoteSchemaCtx(..) , RemoteSchemaMap @@ -115,10 +113,11 @@ module Hasura.RQL.Types.SchemaCache , getFuncsOfTable , askFunctionInfo , CronTriggerInfo(..) 
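-- 'validateRemoteSchemaDef' above now resolves a remote schema endpoint
-- from exactly one of a literal URL or an environment variable. A sketch
-- of that validation shape under simplified types (plain Strings and a
-- pure env lookup; 'resolveUrl' is an illustrative name, not the real
-- function):

resolveUrl :: Maybe String -> Maybe String -> [(String, String)] -> Either String String
resolveUrl mUrl mUrlEnv env = case (mUrl, mUrlEnv) of
  (Just url, Nothing) -> Right url
  (Nothing, Just var) -> maybe (Left ("env var not found: " ++ var)) Right (lookup var env)
  (Nothing, Nothing)  -> Left "both `url` and `url_from_env` can't be empty"
  (Just _, Just _)    -> Left "both `url` and `url_from_env` can't be present"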
- , mergeRemoteTypesWithGCtx ) where import Hasura.Db +import Hasura.GraphQL.Context (GQLContext, RoleContext) +import qualified Hasura.GraphQL.Parser as P import Hasura.Incremental (Dependency, MonadDepend (..), selectKeyD) import Hasura.Prelude import Hasura.RQL.Types.Action @@ -130,11 +129,14 @@ import Hasura.RQL.Types.Error import Hasura.RQL.Types.EventTrigger import Hasura.RQL.Types.Function import Hasura.RQL.Types.Metadata +--import Hasura.RQL.Types.Permission import Hasura.RQL.Types.QueryCollection import Hasura.RQL.Types.RemoteSchema + import Hasura.RQL.Types.ScheduledTrigger import Hasura.RQL.Types.SchemaCacheTypes import Hasura.RQL.Types.Table +import Hasura.Session import Hasura.SQL.Types import Hasura.Tracing (TraceT) @@ -143,11 +145,11 @@ import Data.Aeson.Casing import Data.Aeson.TH import System.Cron.Types +import qualified Data.ByteString.Lazy as BL import qualified Data.HashMap.Strict as M import qualified Data.HashSet as HS import qualified Data.Text as T -import qualified Hasura.GraphQL.Context as GC -import qualified Hasura.GraphQL.Validate.Types as VT +import qualified Language.GraphQL.Draft.Syntax as G reportSchemaObjs :: [SchemaObjId] -> T.Text reportSchemaObjs = T.intercalate ", " . sort . map reportSchemaObj @@ -166,12 +168,29 @@ mkComputedFieldDep reason tn computedField = type WithDeps a = (a, [SchemaDependency]) +data IntrospectionResult + = IntrospectionResult + { irDoc :: G.SchemaIntrospection + , irQueryRoot :: G.Name + , irMutationRoot :: Maybe G.Name + , irSubscriptionRoot :: Maybe G.Name + } + +data ParsedIntrospection + = ParsedIntrospection + { piQuery :: [P.FieldParser (P.ParseT Identity) (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)] + , piMutation :: Maybe [P.FieldParser (P.ParseT Identity) (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)] + , piSubscription :: Maybe [P.FieldParser (P.ParseT Identity) (RemoteSchemaInfo, G.Field G.NoFragments P.Variable)] + } + data RemoteSchemaCtx = RemoteSchemaCtx - { rscName :: !RemoteSchemaName -- TODO: Name should already be in RemoteSchemaInfo - , rscGCtx :: !GC.GCtx - , rscInfo :: !RemoteSchemaInfo - } deriving (Show, Eq) + { rscName :: !RemoteSchemaName + , rscIntro :: !IntrospectionResult + , rscInfo :: !RemoteSchemaInfo + , rscRawIntrospectionResult :: !BL.ByteString + , rscParsed :: ParsedIntrospection + } instance ToJSON RemoteSchemaCtx where toJSON = toJSON . 
rscInfo @@ -209,19 +228,20 @@ type ActionCache = M.HashMap ActionName ActionInfo -- info of all actions data SchemaCache = SchemaCache - { scTables :: !TableCache - , scActions :: !ActionCache - , scFunctions :: !FunctionCache - , scRemoteSchemas :: !RemoteSchemaMap - , scAllowlist :: !(HS.HashSet GQLQuery) - , scCustomTypes :: !(NonObjectTypeMap, AnnotatedObjects) - , scGCtxMap :: !GC.GCtxMap - , scDefaultRemoteGCtx :: !GC.GCtx - , scRelayGCtxMap :: !GC.RelayGCtxMap - , scDepMap :: !DepMap - , scInconsistentObjs :: ![InconsistentMetadata] - , scCronTriggers :: !(M.HashMap TriggerName CronTriggerInfo) - } deriving (Show, Eq) + { scTables :: !TableCache + , scActions :: !ActionCache + , scFunctions :: !FunctionCache + , scRemoteSchemas :: !RemoteSchemaMap + , scAllowlist :: !(HS.HashSet GQLQuery) + , scGQLContext :: !(HashMap RoleName (RoleContext GQLContext)) + , scUnauthenticatedGQLContext :: !GQLContext + , scRelayContext :: !(HashMap RoleName (RoleContext GQLContext)) + , scUnauthenticatedRelayContext :: !GQLContext + -- , scCustomTypes :: !(NonObjectTypeMap, AnnotatedObjects) + , scDepMap :: !DepMap + , scInconsistentObjs :: ![InconsistentMetadata] + , scCronTriggers :: !(M.HashMap TriggerName CronTriggerInfo) + } $(deriveToJSON (aesonDrop 2 snakeCase) ''SchemaCache) getFuncsOfTable :: QualifiedTable -> FunctionCache -> [FunctionInfo] @@ -309,7 +329,3 @@ getDependentObjsWith f sc objId = induces (SOTable tn1) (SOTableObj tn2 _) = tn1 == tn2 induces objId1 objId2 = objId1 == objId2 -- allDeps = toList $ fromMaybe HS.empty $ M.lookup objId $ scDepMap sc - -mergeRemoteTypesWithGCtx :: VT.TypeMap -> GC.GCtx -> GC.GCtx -mergeRemoteTypesWithGCtx remoteTypeMap gctx = - gctx {GC._gTypes = remoteTypeMap <> GC._gTypes gctx } diff --git a/server/src-lib/Hasura/RQL/Types/SchemaCacheTypes.hs b/server/src-lib/Hasura/RQL/Types/SchemaCacheTypes.hs index ced437613a9..1fd328fe9a7 100644 --- a/server/src-lib/Hasura/RQL/Types/SchemaCacheTypes.hs +++ b/server/src-lib/Hasura/RQL/Types/SchemaCacheTypes.hs @@ -14,8 +14,8 @@ import Hasura.RQL.Types.EventTrigger import Hasura.RQL.Types.Permission import Hasura.RQL.Types.RemoteRelationship import Hasura.RQL.Types.RemoteSchema -import Hasura.Session import Hasura.SQL.Types +import Hasura.Session data TableObjId = TOCol !PGCol @@ -33,7 +33,7 @@ data SchemaObjId | SOTableObj !QualifiedTable !TableObjId | SOFunction !QualifiedFunction | SORemoteSchema !RemoteSchemaName - deriving (Eq, Generic) + deriving (Eq, Generic) instance Hashable SchemaObjId @@ -89,21 +89,21 @@ instance Hashable DependencyReason reasonToTxt :: DependencyReason -> Text reasonToTxt = \case - DRTable -> "table" - DRColumn -> "column" - DRRemoteTable -> "remote_table" - DRLeftColumn -> "left_column" - DRRightColumn -> "right_column" - DRUsingColumn -> "using_column" - DRFkey -> "fkey" - DRRemoteFkey -> "remote_fkey" - DRUntyped -> "untyped" - DROnType -> "on_type" - DRSessionVariable -> "session_variable" - DRPayload -> "payload" - DRParent -> "parent" - DRRemoteSchema -> "remote_schema" - DRRemoteRelationship -> "remote_relationship" + DRTable -> "table" + DRColumn -> "column" + DRRemoteTable -> "remote_table" + DRLeftColumn -> "left_column" + DRRightColumn -> "right_column" + DRUsingColumn -> "using_column" + DRFkey -> "fkey" + DRRemoteFkey -> "remote_fkey" + DRUntyped -> "untyped" + DROnType -> "on_type" + DRSessionVariable -> "session_variable" + DRPayload -> "payload" + DRParent -> "parent" + DRRemoteSchema -> "remote_schema" + DRRemoteRelationship -> "remote_relationship" instance ToJSON 
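-- The reworked 'SchemaCache' above replaces the old 'GCtxMap' with one
-- prebuilt 'GQLContext' per role, plus dedicated unauthenticated contexts.
-- A toy sketch of the per-request lookup this enables ('GQLCtx' and
-- 'contextForRole' are illustrative stand-ins, not the engine's API):

import Data.HashMap.Strict (HashMap)
import qualified Data.HashMap.Strict as HM

newtype GQLCtx = GQLCtx String

-- Serve each request from the context prebuilt for its role, falling back
-- to the unauthenticated context when the role has no entry.
contextForRole :: HashMap String GQLCtx -> GQLCtx -> String -> GQLCtx
contextForRole byRole unauthenticated role =
  HM.lookupDefault unauthenticated role byRole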
DependencyReason where toJSON = String . reasonToTxt diff --git a/server/src-lib/Hasura/RQL/Types/Table.hs b/server/src-lib/Hasura/RQL/Types/Table.hs index 8cf12e4b7c2..24a86e39f19 100644 --- a/server/src-lib/Hasura/RQL/Types/Table.hs +++ b/server/src-lib/Hasura/RQL/Types/Table.hs @@ -41,6 +41,7 @@ module Hasura.RQL.Types.Table , _FIComputedField , _FIRemoteRelationship , fieldInfoName + , fieldInfoGraphQLName , fieldInfoGraphQLNames , getFieldInfoM , getPGColumnInfoM @@ -48,7 +49,6 @@ module Hasura.RQL.Types.Table , sortCols , getRels , getComputedFieldInfos - , getRemoteRels , isPGColInfo , RelInfo(..) @@ -79,7 +79,8 @@ module Hasura.RQL.Types.Table ) where -import Hasura.GraphQL.Utils (showNames) +-- import qualified Hasura.GraphQL.Context as GC + import Hasura.Incremental (Cacheable) import Hasura.Prelude import Hasura.RQL.Types.BoolExp @@ -90,6 +91,7 @@ import Hasura.RQL.Types.Error import Hasura.RQL.Types.EventTrigger import Hasura.RQL.Types.Permission import Hasura.RQL.Types.RemoteRelationship +import Hasura.Server.Utils (duplicates, englishList) import Hasura.Session import Hasura.SQL.Types @@ -97,11 +99,11 @@ import Control.Lens import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH -import Data.List.Extended (duplicates) import Language.Haskell.TH.Syntax (Lift) import qualified Data.HashMap.Strict as M import qualified Data.HashSet as HS +import qualified Data.List.NonEmpty as NE import qualified Data.Text as T import qualified Language.GraphQL.Draft.Syntax as G @@ -139,9 +141,9 @@ instance FromJSON TableCustomRootFields where , update, updateByPk , delete, deleteByPk ] - when (not $ null duplicateRootFields) $ fail $ T.unpack $ + for_ (nonEmpty duplicateRootFields) \duplicatedFields -> fail $ T.unpack $ "the following custom root field names are duplicated: " - <> showNames duplicateRootFields + <> englishList "and" (dquoteTxt <$> duplicatedFields) pure $ TableCustomRootFields select selectByPk selectAggregate insert insertOne update updateByPk delete deleteByPk @@ -182,19 +184,26 @@ fieldInfoName = \case FIComputedField info -> fromComputedField $ _cfiName info FIRemoteRelationship info -> fromRemoteRelationship $ _rfiName info +fieldInfoGraphQLName :: FieldInfo -> Maybe G.Name +fieldInfoGraphQLName = \case + FIColumn info -> Just $ pgiName info + FIRelationship info -> G.mkName $ relNameToTxt $ riName info + FIComputedField info -> G.mkName $ computedFieldNameToText $ _cfiName info + FIRemoteRelationship info -> G.mkName $ remoteRelationshipNameToText $ _rfiName info + -- | Returns all the field names created for the given field. Columns, object relationships, and -- computed fields only ever produce a single field, but array relationships also contain an -- @_aggregate@ field. fieldInfoGraphQLNames :: FieldInfo -> [G.Name] -fieldInfoGraphQLNames = \case - FIColumn info -> [pgiName info] - FIRelationship info -> - let name = G.Name . relNameToTxt $ riName info - in case riType info of +fieldInfoGraphQLNames info = case info of + FIColumn _ -> maybeToList $ fieldInfoGraphQLName info + FIRelationship relationshipInfo -> fold do + name <- fieldInfoGraphQLName info + pure $ case riType relationshipInfo of ObjRel -> [name] - ArrRel -> [name, name <> "_aggregate"] - FIComputedField info -> [G.Name . 
computedFieldNameToText $ _cfiName info] - FIRemoteRelationship info -> pure $ G.Name $ remoteRelationshipNameToText $ _rfiName info + ArrRel -> [name, name <> $$(G.litName "_aggregate")] + FIComputedField _ -> maybeToList $ fieldInfoGraphQLName info + FIRemoteRelationship _ -> maybeToList $ fieldInfoGraphQLName info getCols :: FieldInfoMap FieldInfo -> [PGColumnInfo] getCols = mapMaybe (^? _FIColumn) . M.elems @@ -209,9 +218,6 @@ getRels = mapMaybe (^? _FIRelationship) . M.elems getComputedFieldInfos :: FieldInfoMap FieldInfo -> [ComputedFieldInfo] getComputedFieldInfos = mapMaybe (^? _FIComputedField) . M.elems -getRemoteRels :: FieldInfoMap FieldInfo -> [RemoteFieldInfo] -getRemoteRels = mapMaybe (^? _FIRemoteRelationship) . M.elems - isPGColInfo :: FieldInfo -> Bool isPGColInfo (FIColumn _) = True isPGColInfo _ = False @@ -413,8 +419,8 @@ type TableRawInfo = TableCoreInfoG PGColumnInfo PGColumnInfo -- | Fully-processed table info that includes non-column fields. type TableCoreInfo = TableCoreInfoG FieldInfo PGColumnInfo -tciUniqueOrPrimaryKeyConstraints :: TableCoreInfoG a b -> [Constraint] -tciUniqueOrPrimaryKeyConstraints info = +tciUniqueOrPrimaryKeyConstraints :: TableCoreInfoG a b -> Maybe (NonEmpty Constraint) +tciUniqueOrPrimaryKeyConstraints info = NE.nonEmpty $ maybeToList (_pkConstraint <$> _tciPrimaryKey info) <> toList (_tciUniqueConstraints info) data TableInfo diff --git a/server/src-lib/Hasura/SQL/DML.hs b/server/src-lib/Hasura/SQL/DML.hs index c16a4f8b57c..a29f004a8d8 100644 --- a/server/src-lib/Hasura/SQL/DML.hs +++ b/server/src-lib/Hasura/SQL/DML.hs @@ -547,6 +547,9 @@ instance Hashable FromItem mkSelFromItem :: Select -> Alias -> FromItem mkSelFromItem = FISelect (Lateral False) +mkSelectWithFromItem :: SelectWithG Select -> Alias -> FromItem +mkSelectWithFromItem = FISelectWith (Lateral False) + mkLateralFromItem :: Select -> Alias -> FromItem mkLateralFromItem = FISelect (Lateral True) diff --git a/server/src-lib/Hasura/SQL/Types.hs b/server/src-lib/Hasura/SQL/Types.hs index 406dd9f57e4..04bceb43f24 100644 --- a/server/src-lib/Hasura/SQL/Types.hs +++ b/server/src-lib/Hasura/SQL/Types.hs @@ -32,6 +32,7 @@ module Hasura.SQL.Types , DQuote(..) , dquote + , squote , dquoteList , IsIden(..) @@ -50,6 +51,7 @@ module Hasura.SQL.Types , QualifiedObject(..) , qualObjectToText , snakeCaseQualObject + , qualifiedObjectToName , PGScalarType(..) , WithScalarType(..) @@ -68,6 +70,7 @@ import qualified Database.PG.Query as Q import qualified Database.PG.Query.PTI as PTI import Hasura.Prelude +import Hasura.RQL.Types.Error import Data.Aeson import Data.Aeson.Casing @@ -122,14 +125,19 @@ instance DQuote T.Text where dquoteTxt = id {-# INLINE dquoteTxt #-} -deriving instance DQuote G.NamedType -deriving instance DQuote G.Name +instance DQuote G.Name where + dquoteTxt = dquoteTxt . G.unName + deriving instance DQuote G.EnumValue dquote :: (DQuote a) => a -> T.Text dquote = T.dquote . dquoteTxt {-# INLINE dquote #-} +squote :: (DQuote a) => a -> T.Text +squote = T.squote . dquoteTxt +{-# INLINE squote #-} + dquoteList :: (DQuote a, Foldable t) => t a -> T.Text dquoteList = T.intercalate ", " . map dquote . 
toList {-# INLINE dquoteList #-} @@ -301,6 +309,13 @@ snakeCaseQualObject (QualifiedObject sn o) | sn == publicSchema = toTxt o | otherwise = getSchemaTxt sn <> "_" <> toTxt o +qualifiedObjectToName :: (ToTxt a, MonadError QErr m) => QualifiedObject a -> m G.Name +qualifiedObjectToName objectName = do + let textName = snakeCaseQualObject objectName + onNothing (G.mkName textName) $ throw400 ValidationFailed $ + "cannot include " <> objectName <<> " in the GraphQL schema because " <> textName + <<> " is not a valid GraphQL identifier" + type QualifiedTable = QualifiedObject TableName type QualifiedFunction = QualifiedObject FunctionName @@ -356,7 +371,7 @@ data PGScalarType | PGRaster | PGUUID | PGUnknown !T.Text - deriving (Show, Eq, Lift, Generic, Data) + deriving (Show, Eq, Ord, Lift, Generic, Data) instance NFData PGScalarType instance Hashable PGScalarType instance Cacheable PGScalarType @@ -399,62 +414,67 @@ instance DQuote PGScalarType where dquoteTxt = toSQLTxt textToPGScalarType :: Text -> PGScalarType -textToPGScalarType t = case t of - "serial" -> PGSerial - "bigserial" -> PGBigSerial +textToPGScalarType t = fromMaybe (PGUnknown t) (lookup t pgScalarTranslations) - "smallint" -> PGSmallInt - "int2" -> PGSmallInt +-- Inlining this results in pretty terrible Core being generated by GHC. - "integer" -> PGInteger - "int4" -> PGInteger +{-# NOINLINE pgScalarTranslations #-} +pgScalarTranslations :: [(Text, PGScalarType)] +pgScalarTranslations = + [ ("serial" , PGSerial) + , ("bigserial" , PGBigSerial) - "bigint" -> PGBigInt - "int8" -> PGBigInt + , ("smallint" , PGSmallInt) + , ("int2" , PGSmallInt) - "real" -> PGFloat - "float4" -> PGFloat + , ("integer" , PGInteger) + , ("int4" , PGInteger) - "double precision" -> PGDouble - "float8" -> PGDouble + , ("bigint" , PGBigInt) + , ("int8" , PGBigInt) - "numeric" -> PGNumeric - "decimal" -> PGNumeric + , ("real" , PGFloat) + , ("float4" , PGFloat) - "money" -> PGMoney + , ("double precision" , PGDouble) + , ("float8" , PGDouble) - "boolean" -> PGBoolean - "bool" -> PGBoolean + , ("numeric" , PGNumeric) + , ("decimal" , PGNumeric) - "character" -> PGChar + , ("money" , PGMoney) - "varchar" -> PGVarchar - "character varying" -> PGVarchar + , ("boolean" , PGBoolean) + , ("bool" , PGBoolean) - "text" -> PGText - "citext" -> PGCitext + , ("character" , PGChar) - "date" -> PGDate + , ("varchar" , PGVarchar) + , ("character varying" , PGVarchar) - "timestamp" -> PGTimeStamp - "timestamp without time zone" -> PGTimeStamp + , ("text" , PGText) + , ("citext" , PGCitext) - "timestamptz" -> PGTimeStampTZ - "timestamp with time zone" -> PGTimeStampTZ + , ("date" , PGDate) - "timetz" -> PGTimeTZ - "time with time zone" -> PGTimeTZ + , ("timestamp" , PGTimeStamp) + , ("timestamp without time zone" , PGTimeStamp) - "json" -> PGJSON - "jsonb" -> PGJSONB + , ("timestamptz" , PGTimeStampTZ) + , ("timestamp with time zone" , PGTimeStampTZ) - "geometry" -> PGGeometry - "geography" -> PGGeography + , ("timetz" , PGTimeTZ) + , ("time with time zone" , PGTimeTZ) - "raster" -> PGRaster - "uuid" -> PGUUID - _ -> PGUnknown t + , ("json" , PGJSON) + , ("jsonb" , PGJSONB) + , ("geometry" , PGGeometry) + , ("geography" , PGGeography) + + , ("raster" , PGRaster) + , ("uuid" , PGUUID) + ] instance FromJSON PGScalarType where parseJSON (String t) = return $ textToPGScalarType t @@ -548,7 +568,7 @@ data WithScalarType a -- 'Hasura.RQL.Types.PGColumnType' to handle arrays, not just scalars, then the parameterization can -- go away. 
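-- 'qualifiedObjectToName' above snake-cases a schema-qualified Postgres
-- name and rejects anything that is not a valid GraphQL identifier. A
-- self-contained sketch of the same check over plain Strings (the GraphQL
-- Name grammar is /[_A-Za-z][_0-9A-Za-z]*/; 'toGraphQLName' is an
-- illustrative name, not the real helper):

import Data.Char (isAsciiLower, isAsciiUpper, isDigit)

toGraphQLName :: String -> String -> Either String String
toGraphQLName schema object =
  let textName = if schema == "public" then object else schema <> "_" <> object
  in if isValidName textName
       then Right textName
       else Left ("cannot include " <> object <> " in the GraphQL schema because "
                   <> textName <> " is not a valid GraphQL identifier")
  where
    isValidName []       = False
    isValidName (c : cs) = startOk c && all restOk cs
    startOk c = c == '_' || isAsciiLower c || isAsciiUpper c
    restOk c  = startOk c || isDigit c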
-- --- TODO: This is incorrect modeling, as 'PGScalarType' will capture anything (under 'PGUnknown'). +-- TODO (from master): This is incorrect modeling, as 'PGScalarType' will capture anything (under 'PGUnknown'). -- This should be fixed when support for all types is merged. data PGType a = PGTypeScalar !a diff --git a/server/src-lib/Hasura/SQL/Value.hs b/server/src-lib/Hasura/SQL/Value.hs index d638c6c6ef3..84963df4119 100644 --- a/server/src-lib/Hasura/SQL/Value.hs +++ b/server/src-lib/Hasura/SQL/Value.hs @@ -4,6 +4,8 @@ module Hasura.SQL.Value , pgScalarValueToJson , withConstructorFn , parsePGValue + , scientificToInteger + , scientificToFloat , TxtEncodedPGVal(..) , txtEncodedPGVal @@ -124,6 +126,21 @@ withConstructorFn ty v | ty == PGRaster = S.SEFnApp "ST_RastFromHexWKB" [v] Nothing | otherwise = v + +scientificToInteger :: (Integral i, Bounded i) => Scientific -> AT.Parser i +scientificToInteger num = case toBoundedInteger num of + Just parsed -> pure parsed + Nothing -> fail $ "The value " ++ show num ++ " lies outside the " + ++ "bounds or is not an integer. Maybe it is a " + ++ "float, or is there integer overflow?" + +scientificToFloat :: (RealFloat f) => Scientific -> AT.Parser f +scientificToFloat num = case toBoundedRealFloat num of + Right parsed -> pure parsed + Left _ -> fail $ "The value " ++ show num ++ " lies outside the " + ++ "bounds. Is it overflowing the float bounds?" + + parsePGValue :: PGScalarType -> Value -> AT.Parser PGScalarValue parsePGValue ty val = case (ty, val) of (_ , Null) -> pure $ PGNull ty @@ -133,28 +150,11 @@ parsePGValue ty val = case (ty, val) of (_ , _) -> parseTyped where parseBoundedInt :: forall i. (Integral i, Bounded i) => Value -> AT.Parser i - parseBoundedInt val' = - withScientific - ("Integer expected for input type: " ++ show ty) - go - val' - where - go num = case toBoundedInteger num of - Just parsed -> return parsed - Nothing -> fail $ "The value " ++ show num ++ " lies outside the " - ++ "bounds or is not an integer. Maybe it is a " - ++ "float, or is there integer overflow?" + parseBoundedInt = withScientific ("Integer expected for input type: " ++ show ty) scientificToInteger + parseBoundedFloat :: forall a. (RealFloat a) => Value -> AT.Parser a - parseBoundedFloat val' = - withScientific - ("Float expected for input type: " ++ show ty) - go - val' - where - go num = case toBoundedRealFloat num of - Left _ -> fail $ "The value " ++ show num ++ " lies outside the " - ++ "bounds. Is it overflowing the float bounds?" 
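-- The 'scientificToInteger'/'scientificToFloat' helpers factored out above
-- wrap Data.Scientific's safe conversions into aeson parsers. A small
-- usage check, assuming only the aeson and scientific packages ('demo' and
-- 'toParser' are illustrative names):

import qualified Data.Aeson.Types as AT
import Data.Int (Int32)
import Data.Scientific (Scientific, toBoundedInteger)

-- toBoundedInteger refuses non-integral and out-of-range inputs, which is
-- exactly the failure behaviour the Parser wrappers report:
--   toBoundedInteger (1e10 :: Scientific) :: Maybe Int32  ~>  Nothing
--   toBoundedInteger (42   :: Scientific) :: Maybe Int32  ~>  Just 42
demo :: Maybe Int32
demo = AT.parseMaybe toParser 42
  where
    toParser :: Scientific -> AT.Parser Int32
    toParser num =
      maybe (fail ("out of bounds or not an integer: " ++ show num)) pure
            (toBoundedInteger num)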
- Right parsed -> return parsed + parseBoundedFloat = withScientific ("Float expected for input type: " ++ show ty) scientificToFloat + parseTyped = case ty of PGSmallInt -> PGValSmallInt <$> parseBoundedInt val PGInteger -> PGValInteger <$> parseBoundedInt val diff --git a/server/src-lib/Hasura/Server/API/PGDump.hs b/server/src-lib/Hasura/Server/API/PGDump.hs index 2e4d0c2a805..ccfc097ddbe 100644 --- a/server/src-lib/Hasura/Server/API/PGDump.hs +++ b/server/src-lib/Hasura/Server/API/PGDump.hs @@ -7,12 +7,12 @@ module Hasura.Server.API.PGDump import Control.Exception (IOException, try) import Data.Aeson.Casing import Data.Aeson.TH -import qualified Data.ByteString.Lazy as BL -import Data.Char (isSpace) -import qualified Data.List as L -import qualified Data.Text as T +import qualified Data.ByteString.Lazy as BL +import Data.Char (isSpace) +import qualified Data.List as L +import qualified Data.Text as T import Data.Text.Conversions -import qualified Database.PG.Query as Q +import qualified Database.PG.Query as Q import Hasura.Prelude import qualified Hasura.RQL.Types.Error as RTE import System.Exit diff --git a/server/src-lib/Hasura/Server/API/Query.hs b/server/src-lib/Hasura/Server/API/Query.hs index fa4dc8f03d1..66022de2917 100644 --- a/server/src-lib/Hasura/Server/API/Query.hs +++ b/server/src-lib/Hasura/Server/API/Query.hs @@ -1,9 +1,9 @@ -- | The RQL query ('/v1/query') {-# LANGUAGE NamedFieldPuns #-} - module Hasura.Server.API.Query where import Control.Lens +import Control.Monad.Unique import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH @@ -205,7 +205,7 @@ runQuery env pgExecCtx instanceId userInfo sc hMgr sqlGenCtx systemDefined query & peelRun runCtx pgExecCtx accessMode (Just traceCtx) & runExceptT & liftIO - pure (either + pure (either ((, mempty) . 
Left) (\((js, meta), rsc, ci) -> (Right (js, rsc, ci), meta)) a) either throwError withReload resE @@ -349,7 +349,7 @@ reconcileAccessModes (Just mode1) (Just mode2) runQueryM :: ( HasVersion, QErrM m, CacheRWM m, UserInfoM m, MonadTx m - , MonadIO m, HasHttpManager m, HasSQLGenCtx m + , MonadIO m, MonadUnique m, HasHttpManager m, HasSQLGenCtx m , HasSystemDefined m , Tracing.MonadTrace m ) diff --git a/server/src-lib/Hasura/Server/App.hs b/server/src-lib/Hasura/Server/App.hs index 2501917ae94..2bd8119e549 100644 --- a/server/src-lib/Hasura/Server/App.hs +++ b/server/src-lib/Hasura/Server/App.hs @@ -1,5 +1,4 @@ -{-# LANGUAGE CPP #-} -{-# LANGUAGE DataKinds #-} +{-# LANGUAGE CPP #-} module Hasura.Server.App where @@ -8,6 +7,7 @@ import Control.Exception (IOException, try) import Control.Monad.Morph (hoist) import Control.Monad.Trans.Control (MonadBaseControl) import Data.String (fromString) +import Hasura.Prelude hiding (get, put) import Control.Monad.Stateless import Data.Aeson hiding (json) @@ -41,9 +41,7 @@ import qualified Web.Spock.Core as Spock import Hasura.Db import Hasura.EncJSON import Hasura.GraphQL.Logging (MonadQueryLog (..)) -import Hasura.GraphQL.Resolve.Action import Hasura.HTTP -import Hasura.Prelude hiding (get, put) import Hasura.RQL.DDL.Schema import Hasura.RQL.Types import Hasura.RQL.Types.Run @@ -63,6 +61,7 @@ import Hasura.SQL.Types import qualified Hasura.GraphQL.Execute as E import qualified Hasura.GraphQL.Execute.LiveQuery as EL import qualified Hasura.GraphQL.Execute.LiveQuery.Poll as EL +import qualified Hasura.GraphQL.Execute.Plan as E import qualified Hasura.GraphQL.Explain as GE import qualified Hasura.GraphQL.Transport.HTTP as GH import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH @@ -73,7 +72,6 @@ import qualified Hasura.Server.API.PGDump as PGD import qualified Hasura.Tracing as Tracing import qualified Network.Wai.Handler.WebSockets.Custom as WSC - data SchemaCacheRef = SchemaCacheRef { _scrLock :: MVar () @@ -107,7 +105,7 @@ data ServerCtx , scSQLGenCtx :: !SQLGenCtx , scEnabledAPIs :: !(S.HashSet API) , scInstanceId :: !InstanceId - , scPlanCache :: !E.PlanCache + -- , scPlanCache :: !E.PlanCache -- See Note [Temporarily disabling query plan caching] , scLQState :: !EL.LiveQueriesState , scEnableAllowlist :: !Bool , scEkgStore :: !EKG.Store @@ -152,7 +150,7 @@ logInconsObjs logger objs = withSCUpdate :: (MonadIO m, MonadBaseControl IO m) => SchemaCacheRef -> L.Logger L.Hasura -> m (a, RebuildableSchemaCache Run) -> m a -withSCUpdate scr logger action = do +withSCUpdate scr logger action = withMVarMasked lk $ \() -> do (!res, !newSC) <- action liftIO $ do @@ -275,8 +273,8 @@ mkSpockAction serverCtx qErrEncoder qErrModifier apiHandler = do tracingCtx (fromString (B8.unpack pathInfo)) - requestId <- getRequestId headers - + requestId <- getRequestId headers + mapActionT runTraceT $ do -- Add the request ID to the tracing metadata so that we -- can correlate requests and traces @@ -390,13 +388,13 @@ v1Alpha1GQHandler queryType query = do (sc, scVer) <- liftIO $ readIORef $ _scrCache scRef pgExecCtx <- asks (scPGExecCtx . hcServerCtx) sqlGenCtx <- asks (scSQLGenCtx . hcServerCtx) - planCache <- asks (scPlanCache . hcServerCtx) + -- planCache <- asks (scPlanCache . hcServerCtx) enableAL <- asks (scEnableAllowlist . hcServerCtx) logger <- asks (scLogger . hcServerCtx) responseErrorsConfig <- asks (scResponseInternalErrorsConfig . hcServerCtx) env <- asks (scEnvironment . 
hcServerCtx) - let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx planCache + let execCtx = E.ExecutionCtx logger sqlGenCtx pgExecCtx {- planCache -} (lastBuiltSchemaCache sc) scVer manager enableAL flip runReaderT execCtx $ @@ -427,10 +425,7 @@ v1GQRelayHandler v1GQRelayHandler = v1Alpha1GQHandler E.QueryRelay gqlExplainHandler - :: forall m . - ( HasVersion - , MonadIO m - ) + :: forall m. (MonadIO m) => GE.GQLExplain -> Handler (Tracing.TraceT m) (HttpResponse EncJSON) gqlExplainHandler query = do @@ -438,17 +433,17 @@ gqlExplainHandler query = do scRef <- asks (scCacheRef . hcServerCtx) sc <- getSCFromRef scRef pgExecCtx <- asks (scPGExecCtx . hcServerCtx) - sqlGenCtx <- asks (scSQLGenCtx . hcServerCtx) - env <- asks (scEnvironment . hcServerCtx) - logger <- asks (scLogger . hcServerCtx) +-- sqlGenCtx <- asks (scSQLGenCtx . hcServerCtx) +-- env <- asks (scEnvironment . hcServerCtx) +-- logger <- asks (scLogger . hcServerCtx) + -- let runTx :: ReaderT HandlerCtx (Tracing.TraceT (Tracing.NoReporter (LazyTx QErr))) a -- -> ExceptT QErr (ReaderT HandlerCtx (Tracing.TraceT m)) a - let runTx rttx = ExceptT . ReaderT $ \ctx -> do - runExceptT (Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadOnly) (runReaderT rttx ctx)) + -- let runTx rttx = ExceptT . ReaderT $ \ctx -> do + -- runExceptT (Tracing.interpTraceT (runLazyTx pgExecCtx Q.ReadOnly) (runReaderT rttx ctx)) - res <- GE.explainGQLQuery env logger pgExecCtx runTx sc sqlGenCtx - (restrictActionExecuter "query actions cannot be explained") query + res <- GE.explainGQLQuery pgExecCtx sc query return $ HttpResponse res [] v1Alpha1PGDumpHandler :: (MonadIO m) => PGD.PGDumpReqBody -> Handler m APIResp @@ -556,11 +551,11 @@ mkWaiApp :: forall m. ( HasVersion , MonadIO m +-- , MonadUnique m , MonadStateless IO m , LA.Forall (LA.Pure m) , ConsoleRenderer m , HttpLog m - -- , UserAuthentication m , UserAuthentication (Tracing.TraceT m) , MetadataApiAuthorization m , E.MonadGQLExecutionCheck m @@ -607,9 +602,11 @@ mkWaiApp -> EKG.Store -> m HasuraApp mkWaiApp env isoLevel logger sqlGenCtx enableAL pool pgExecCtxCustom ci httpManager mode corsCfg enableConsole consoleAssetsDir - enableTelemetry instanceId apis lqOpts planCacheOptions responseErrorsConfig liveQueryHook (schemaCache, cacheBuiltTime) ekgStore = do + enableTelemetry instanceId apis lqOpts _ {- planCacheOptions -} responseErrorsConfig liveQueryHook (schemaCache, cacheBuiltTime) ekgStore = do - (planCache, schemaCacheRef) <- initialiseCache + -- See Note [Temporarily disabling query plan caching] + -- (planCache, schemaCacheRef) <- initialiseCache + schemaCacheRef <- initialiseCache let getSchemaCache = first lastBuiltSchemaCache <$> readIORef (_scrCache schemaCacheRef) let corsPolicy = mkDefaultCorsPolicy corsCfg @@ -618,7 +615,7 @@ mkWaiApp env isoLevel logger sqlGenCtx enableAL pool pgExecCtxCustom ci httpMana lqState <- liftIO $ EL.initLiveQueriesState lqOpts pgExecCtx postPollHook wsServerEnv <- WS.createWSServerEnv logger pgExecCtx lqState getSchemaCache httpManager - corsPolicy sqlGenCtx enableAL planCache + corsPolicy sqlGenCtx enableAL {- planCache -} let serverCtx = ServerCtx { scPGExecCtx = pgExecCtx @@ -630,7 +627,7 @@ mkWaiApp env isoLevel logger sqlGenCtx enableAL pool pgExecCtxCustom ci httpMana , scSQLGenCtx = sqlGenCtx , scEnabledAPIs = apis , scInstanceId = instanceId - , scPlanCache = planCache + -- , scPlanCache = planCache , scLQState = lqState , scEnableAllowlist = enableAL , scEkgStore = ekgStore @@ -657,18 +654,21 @@ mkWaiApp env isoLevel logger sqlGenCtx 
enableAL pool pgExecCtxCustom ci httpMana getTimeMs :: IO Int64 getTimeMs = (round . (* 1000)) `fmap` getPOSIXTime - initialiseCache :: m (E.PlanCache, SchemaCacheRef) + -- initialiseCache :: m (E.PlanCache, SchemaCacheRef) + initialiseCache :: m SchemaCacheRef initialiseCache = do cacheLock <- liftIO $ newMVar () cacheCell <- liftIO $ newIORef (schemaCache, initSchemaCacheVer) - planCache <- liftIO $ E.initPlanCache planCacheOptions - let cacheRef = SchemaCacheRef cacheLock cacheCell (E.clearPlanCache planCache) - pure (planCache, cacheRef) + -- planCache <- liftIO $ E.initPlanCache planCacheOptions + let cacheRef = SchemaCacheRef cacheLock cacheCell (E.clearPlanCache {- planCache -}) + -- pure (planCache, cacheRef) + pure cacheRef httpApp :: ( HasVersion , MonadIO m +-- , MonadUnique m , MonadBaseControl IO m , ConsoleRenderer m , HttpLog m @@ -748,7 +748,7 @@ httpApp corsCfg serverCtx enableConsole consoleAssetsDir enableTelemetry = do Spock.get "dev/plan_cache" $ spockAction encodeQErr id $ mkGetHandler $ do onlyAdmin - respJ <- liftIO $ E.dumpPlanCache $ scPlanCache serverCtx + respJ <- liftIO $ E.dumpPlanCache {- $ scPlanCache serverCtx -} return $ JSONResp $ HttpResponse (encJFromJValue respJ) [] Spock.get "dev/subscriptions" $ spockAction encodeQErr id $ mkGetHandler $ do @@ -776,7 +776,6 @@ httpApp corsCfg serverCtx enableConsole consoleAssetsDir enableTelemetry = do -> (QErr -> QErr) -> APIHandler (Tracing.TraceT m) a -> Spock.ActionT m () spockAction = mkSpockAction serverCtx - -- all graphql errors should be of type 200 allMod200 qe = qe { qeStatus = HTTP.status200 } gqlExplainAction = spockAction encodeQErr id $ mkPostHandler $ mkAPIRespHandler gqlExplainHandler diff --git a/server/src-lib/Hasura/Server/Auth.hs b/server/src-lib/Hasura/Server/Auth.hs index 4e17d612e32..bcb386d2c06 100644 --- a/server/src-lib/Hasura/Server/Auth.hs +++ b/server/src-lib/Hasura/Server/Auth.hs @@ -40,6 +40,7 @@ import qualified Network.HTTP.Types as N import Hasura.Logging import Hasura.Prelude import Hasura.RQL.Types + import Hasura.Server.Auth.JWT hiding (processJwt_) import Hasura.Server.Auth.WebHook import Hasura.Server.Utils diff --git a/server/src-lib/Hasura/Server/Auth/JWT.hs b/server/src-lib/Hasura/Server/Auth/JWT.hs index a489606ec54..e6e2a8a00c9 100644 --- a/server/src-lib/Hasura/Server/Auth/JWT.hs +++ b/server/src-lib/Hasura/Server/Auth/JWT.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE CPP #-} module Hasura.Server.Auth.JWT ( processJwt , RawJWT @@ -22,9 +23,12 @@ import Control.Lens import Control.Monad.Trans.Control (MonadBaseControl) import Control.Monad.Trans.Maybe import Data.IORef (IORef, readIORef, writeIORef) + import Data.Time.Clock (NominalDiffTime, UTCTime, diffUTCTime, getCurrentTime) +#ifndef PROFILING import GHC.AssertNF +#endif import Network.URI (URI) import Data.Aeson.Internal (JSONPath) @@ -114,7 +118,7 @@ data HasuraClaims $(J.deriveJSON (J.aesonDrop 3 J.snakeCase) ''HasuraClaims) --- NOTE: these must stay lowercase; TODO consider using "Data.CaseInsensitive" +-- NOTE: these must stay lowercase; TODO(from master) consider using "Data.CaseInsensitive" allowedRolesClaim :: T.Text allowedRolesClaim = "x-hasura-allowed-roles" @@ -183,7 +187,9 @@ updateJwkRef (Logger logger) manager url jwkRef = do let parseErr e = JFEJwkParseError (T.pack e) $ "Error parsing JWK from url: " <> urlT !jwkset <- either (logAndThrow . 
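-- The CPP guards added above skip the normal-form assertions in profiling
-- builds, where GHC.AssertNF is unavailable. The pattern in isolation
-- ('writeChecked' is an illustrative example, not code from this patch):

{-# LANGUAGE CPP #-}
{-# LANGUAGE TemplateHaskell #-}
import Data.IORef (IORef, writeIORef)
#ifndef PROFILING
import GHC.AssertNF (assertNFHere)
#endif

-- Assert (outside profiling builds) that the payload is in normal form
-- before storing it, so no thunks leak into the mutable cell.
writeChecked :: IORef [Int] -> [Int] -> IO ()
writeChecked ref payload = do
#ifndef PROFILING
  $assertNFHere payload
#endif
  writeIORef ref payload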
parseErr) return $ J.eitherDecode' respBody liftIO $ do +#ifndef PROFILING $assertNFHere jwkset -- so we don't write thunks to mutable vars +#endif writeIORef jwkRef jwkset -- first check for Cache-Control header to get max-age, if not found, look for Expires header @@ -234,7 +240,7 @@ updateJwkRef (Logger logger) manager url jwkRef = do -- When no 'x-hasura-user-role' is specified in the request, the mandatory -- 'x-hasura-default-role' [2] from the JWT claims will be used. --- [1]: https://hasura.io/docs/1.0/graphql/manual/auth/authentication/unauthenticated-access.html +-- [1]: https://hasura.io/docs/1.0/graphql/manual/auth/authentication/unauthenticated-access.html -- [2]: https://hasura.io/docs/1.0/graphql/manual/auth/authentication/jwt.html#the-spec processJwt :: ( MonadIO m @@ -376,8 +382,8 @@ processAuthZHeader jwtCtx@JWTCtx{jcxClaimNs, jcxClaimsFormat} authzHeader = do -- parse x-hasura-allowed-roles, x-hasura-default-role from JWT claims parseHasuraClaims :: forall m. (MonadError QErr m) => J.Object -> m HasuraClaims parseHasuraClaims claimsMap = do - HasuraClaims <$> - parseClaim allowedRolesClaim "should be a list of roles" <*> + HasuraClaims <$> + parseClaim allowedRolesClaim "should be a list of roles" <*> parseClaim defaultRoleClaim "should be a single role name" where @@ -473,7 +479,7 @@ instance J.FromJSON JWTConfig where "RS256" -> runEither $ parseRsaKey rawKey "RS384" -> runEither $ parseRsaKey rawKey "RS512" -> runEither $ parseRsaKey rawKey - -- TODO: support ES256, ES384, ES512, PS256, PS384 + -- TODO(from master): support ES256, ES384, ES512, PS256, PS384 _ -> invalidJwk ("Key type: " <> T.unpack keyType <> " is not supported") runEither = either (invalidJwk . T.unpack) return @@ -482,11 +488,9 @@ instance J.FromJSON JWTConfig where failJSONPathParsing err = fail $ "invalid JSON path claims_namespace_path error: " ++ err - -- Utility: parseJwtClaim :: (J.FromJSON a, MonadError QErr m) => J.Value -> Text -> m a parseJwtClaim v errMsg = case J.fromJSON v of J.Success val -> return val J.Error e -> throw400 JWTInvalidClaims $ errMsg <> ": " <> T.pack e - diff --git a/server/src-lib/Hasura/Server/Auth/WebHook.hs b/server/src-lib/Hasura/Server/Auth/WebHook.hs index 9f99fa75923..5bf08f810d5 100644 --- a/server/src-lib/Hasura/Server/Auth/WebHook.hs +++ b/server/src-lib/Hasura/Server/Auth/WebHook.hs @@ -32,7 +32,6 @@ import Hasura.Server.Utils import Hasura.Session import qualified Hasura.Tracing as Tracing - data AuthHookType = AHTGet | AHTPost @@ -88,9 +87,9 @@ userInfoFromAuthHook logger manager hook reqHeaders = do let contentType = ("Content-Type", "application/json") headersPayload = J.toJSON $ Map.fromList $ hdrsToText reqHeaders H.httpLbs (req' { H.method = "POST" - , H.requestHeaders = addDefaultHeaders [contentType] - , H.requestBody = H.RequestBodyLBS . J.encode $ object ["headers" J..= headersPayload] - }) manager + , H.requestHeaders = addDefaultHeaders [contentType] + , H.requestBody = H.RequestBodyLBS . 
J.encode $ object ["headers" J..= headersPayload] + }) manager logAndThrow :: H.HttpException -> m a logAndThrow err = do diff --git a/server/src-lib/Hasura/Server/Cors.hs b/server/src-lib/Hasura/Server/Cors.hs index 22ec8dde79f..f85c2508c89 100644 --- a/server/src-lib/Hasura/Server/Cors.hs +++ b/server/src-lib/Hasura/Server/Cors.hs @@ -60,8 +60,8 @@ instance J.ToJSON CorsConfig where J.object [ "disabled" J..= dis , "ws_read_cookie" J..= mWsRC , "allowed_origins" J..= origs - ] - + ] + instance J.FromJSON CorsConfig where parseJSON = J.withObject "cors config" \o -> do let parseAllowAll "*" = pure CCAllowAll diff --git a/server/src-lib/Hasura/Server/Init.hs b/server/src-lib/Hasura/Server/Init.hs index 7937471b55c..2a078666211 100644 --- a/server/src-lib/Hasura/Server/Init.hs +++ b/server/src-lib/Hasura/Server/Init.hs @@ -23,8 +23,8 @@ import Network.Wai.Handler.Warp (HostPreference) import Options.Applicative import qualified Hasura.Cache.Bounded as Cache -import qualified Hasura.GraphQL.Execute as E import qualified Hasura.GraphQL.Execute.LiveQuery as LQ +import qualified Hasura.GraphQL.Execute.Plan as E import qualified Hasura.Logging as L import Hasura.Db @@ -199,7 +199,7 @@ mkServeOptions rso = do return (flip AuthHookG ty <$> mUrlEnv) -- Also support HASURA_GRAPHQL_AUTH_HOOK_TYPE - -- TODO:- drop this in next major update + -- TODO (from master):- drop this in next major update authHookTyEnv mType = fromMaybe AHTGet <$> withEnv mType "HASURA_GRAPHQL_AUTH_HOOK_TYPE" @@ -320,7 +320,7 @@ serveCmdFooter = eventEnvs = [ eventsHttpPoolSizeEnv, eventsFetchIntervalEnv ] eventsHttpPoolSizeEnv :: (String, String) -eventsHttpPoolSizeEnv = +eventsHttpPoolSizeEnv = ( "HASURA_GRAPHQL_EVENTS_HTTP_POOL_SIZE" , "Max event threads" ) @@ -379,7 +379,7 @@ pgTimeoutEnv = pgConnLifetimeEnv :: (String, String) pgConnLifetimeEnv = ( "HASURA_GRAPHQL_PG_CONN_LIFETIME" - , "Time from connection creation after which the connection should be destroyed and a new one " + , "Time from connection creation after which the connection should be destroyed and a new one " <> "created. (default: none)" ) @@ -454,7 +454,7 @@ enableConsoleEnv = enableTelemetryEnv :: (String, String) enableTelemetryEnv = ( "HASURA_GRAPHQL_ENABLE_TELEMETRY" - -- TODO: better description + -- TODO (from master): better description , "Enable anonymous telemetry (default: true)" ) @@ -848,7 +848,7 @@ enableAllowlistEnv = -- being 70kb. 128mb per-HEC seems like a reasonable default upper bound -- (note there is a distinct stripe per-HEC, for now; so this would give 1GB -- for an 8-core machine), which gives us a range of 2,000 to 18,000 here. --- Analysis of telemetry is hazy here; see +-- Analysis of telemetry is hazy here; see -- https://github.com/hasura/graphql-engine/issues/5363 for some discussion. 
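-- Several of these options follow the same resolution order: explicit
-- flag first, then a HASURA_GRAPHQL_* environment variable, then a
-- built-in default (e.g. 'authHookTyEnv' falling back to 'AHTGet' above).
-- A sketch of that pattern ('resolveWithEnv' is an illustrative name):

import Data.Maybe (fromMaybe)
import System.Environment (lookupEnv)
import Text.Read (readMaybe)

resolveWithEnv :: Read a => a -> Maybe a -> String -> IO a
resolveWithEnv def explicit var = case explicit of
  Just v  -> pure v
  Nothing -> fromMaybe def . (readMaybe =<<) <$> lookupEnv var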
planCacheSizeEnv :: (String, String) planCacheSizeEnv = diff --git a/server/src-lib/Hasura/Server/Init/Config.hs b/server/src-lib/Hasura/Server/Init/Config.hs index 8040bb46a97..99dc868dbdd 100644 --- a/server/src-lib/Hasura/Server/Init/Config.hs +++ b/server/src-lib/Hasura/Server/Init/Config.hs @@ -14,8 +14,8 @@ import Data.Time import Network.Wai.Handler.Warp (HostPreference) import qualified Hasura.Cache.Bounded as Cache -import qualified Hasura.GraphQL.Execute as E import qualified Hasura.GraphQL.Execute.LiveQuery as LQ +import qualified Hasura.GraphQL.Execute.Plan as E import qualified Hasura.Logging as L import Hasura.Prelude @@ -142,7 +142,7 @@ data API | DEVELOPER | CONFIG deriving (Show, Eq, Read, Generic) - + $(J.deriveJSON (J.defaultOptions { J.constructorTagModifier = map toLower }) ''API) diff --git a/server/src-lib/Hasura/Server/Logging.hs b/server/src-lib/Hasura/Server/Logging.hs index 9c443fac4c3..e0d71a3df7d 100644 --- a/server/src-lib/Hasura/Server/Logging.hs +++ b/server/src-lib/Hasura/Server/Logging.hs @@ -145,7 +145,7 @@ class (Monad m) => HttpLog m where -- ^ the response bytes -> BL.ByteString -- ^ the compressed response bytes - -- ^ TODO: make the above two type represented + -- ^ TODO (from master): make the above two type represented -> Maybe (DiffTime, DiffTime) -- ^ IO/network wait time and service time (respectively) for this request, if available. -> Maybe CompressionType diff --git a/server/src-lib/Hasura/Server/SchemaUpdate.hs b/server/src-lib/Hasura/Server/SchemaUpdate.hs index 7c2d200e352..2a0917a8939 100644 --- a/server/src-lib/Hasura/Server/SchemaUpdate.hs +++ b/server/src-lib/Hasura/Server/SchemaUpdate.hs @@ -1,3 +1,4 @@ +{-# LANGUAGE CPP #-} module Hasura.Server.SchemaUpdate (startSchemaSyncThreads) where @@ -18,7 +19,9 @@ import Data.Aeson import Data.Aeson.Casing import Data.Aeson.TH import Data.IORef +#ifndef PROFILING import GHC.AssertNF +#endif import qualified Control.Concurrent.Extended as C import qualified Control.Concurrent.STM as STM @@ -161,7 +164,9 @@ listener sqlGenCtx pool logger httpMgr updateEventRef Left e -> logError logger threadType $ TEJsonParse $ T.pack e Right payload -> do logInfo logger threadType $ object ["received_event" .= payload] +#ifndef PROFILING $assertNFHere payload -- so we don't write thunks to mutable vars +#endif -- Push a notify event to Queue STM.atomically $ STM.writeTVar updateEventRef $ Just payload diff --git a/server/src-lib/Hasura/Server/Telemetry.hs b/server/src-lib/Hasura/Server/Telemetry.hs index e7f29780b9a..f8b279e9e7a 100644 --- a/server/src-lib/Hasura/Server/Telemetry.hs +++ b/server/src-lib/Hasura/Server/Telemetry.hs @@ -32,7 +32,6 @@ import qualified Data.ByteString.Lazy as BL import qualified Data.HashMap.Strict as Map import qualified Data.List as L import qualified Data.Text as T -import qualified Language.GraphQL.Draft.Syntax as G import qualified Network.HTTP.Client as HTTP import qualified Network.HTTP.Types as HTTP import qualified Network.Wreq as Wreq @@ -167,7 +166,7 @@ computeMetrics sc _mtServiceTimings _mtPgVersion = $ Map.map _tiEventTriggerInfoMap userTables _mtRemoteSchemas = Map.size $ scRemoteSchemas sc _mtFunctions = Map.size $ Map.filter (not . isSystemDefined . fiSystemDefined) $ scFunctions sc - _mtActions = computeActionsMetrics (scActions sc) (snd . 
scCustomTypes $ sc) + _mtActions = computeActionsMetrics $ scActions sc in Metrics{..} @@ -181,26 +180,22 @@ computeMetrics sc _mtServiceTimings _mtPgVersion = permsOfTbl :: TableInfo -> [(RoleName, RolePermInfo)] permsOfTbl = Map.toList . _tiRolePermInfoMap -computeActionsMetrics :: ActionCache -> AnnotatedObjects -> ActionMetric -computeActionsMetrics ac ao = +computeActionsMetrics :: ActionCache -> ActionMetric +computeActionsMetrics actionCache = ActionMetric syncActionsLen asyncActionsLen queryActionsLen typeRelationships customTypesLen - where actions = Map.elems ac - syncActionsLen = length . filter ((==(ActionMutation ActionSynchronous)) . _adType . _aiDefinition) $ actions - asyncActionsLen = length . filter ((==(ActionMutation ActionAsynchronous)) . _adType . _aiDefinition) $ actions - queryActionsLen = length . filter ((==ActionQuery) . _adType . _aiDefinition) $ actions + where actions = Map.elems actionCache + syncActionsLen = length . filter ((== ActionMutation ActionSynchronous) . _adType . _aiDefinition) $ actions + asyncActionsLen = length . filter ((== ActionMutation ActionAsynchronous) . _adType . _aiDefinition) $ actions + queryActionsLen = length . filter ((== ActionQuery) . _adType . _aiDefinition) $ actions - outputTypesLen = length . L.nub . (map (_adOutputType . _aiDefinition)) $ actions - inputTypesLen = length . L.nub . concat . (map ((map _argType) . _adArguments . _aiDefinition)) $ actions + outputTypesLen = length . L.nub . map (_adOutputType . _aiDefinition) $ actions + inputTypesLen = length . L.nub . concatMap (map _argType . _adArguments . _aiDefinition) $ actions customTypesLen = inputTypesLen + outputTypesLen - typeRelationships = length . L.nub . concat . map ((getActionTypeRelationshipNames ao) . _aiDefinition) $ actions - - -- gives the count of relationships associated with an action - getActionTypeRelationshipNames :: AnnotatedObjects -> ResolvedActionDefinition -> [RelationshipName] - getActionTypeRelationshipNames annotatedObjs actionDefn = - let typeName = G.getBaseType $ unGraphQLType $ _adOutputType actionDefn - annotatedObj = Map.lookup (ObjectTypeName typeName) annotatedObjs - in maybe [] (Map.keys . _aotRelationships) annotatedObj + typeRelationships = + length . L.nub . concatMap + (map _trName . maybe [] toList . _otdRelationships . 
_aiOutputObject) $ + actions -- | Logging related diff --git a/server/src-lib/Hasura/Server/Utils.hs b/server/src-lib/Hasura/Server/Utils.hs index 84d37c4dedd..7f6d2e6c431 100644 --- a/server/src-lib/Hasura/Server/Utils.hs +++ b/server/src-lib/Hasura/Server/Utils.hs @@ -229,13 +229,13 @@ instance FromJSON APIVersion where 2 -> return VIVersion2 i -> fail $ "expected 1 or 2, encountered " ++ show i -englishList :: NonEmpty Text -> Text -englishList = \case +englishList :: Text -> NonEmpty Text -> Text +englishList joiner = \case one :| [] -> one - one :| [two] -> one <> " and " <> two + one :| [two] -> one <> " " <> joiner <> " " <> two several -> let final :| initials = NE.reverse several - in T.intercalate ", " (reverse initials) <> ", and " <> final + in T.intercalate ", " (reverse initials) <> ", " <> joiner <> " " <> final makeReasonMessage :: [a] -> (a -> Text) -> Text makeReasonMessage errors showError = diff --git a/server/src-lib/Hasura/Session.hs b/server/src-lib/Hasura/Session.hs index 73a71ecd314..d2f6628ca9e 100644 --- a/server/src-lib/Hasura/Session.hs +++ b/server/src-lib/Hasura/Session.hs @@ -13,6 +13,7 @@ module Hasura.Session , mkSessionVariables , sessionVariablesToHeaders , getSessionVariableValue + , getSessionVariablesSet , getSessionVariables , UserAdminSecret(..) , UserRoleBuild(..) @@ -23,6 +24,7 @@ module Hasura.Session , mkUserInfo , adminUserInfo , BackendOnlyFieldAccess(..) + , userInfoToList ) where import Hasura.Incremental (Cacheable) @@ -39,6 +41,7 @@ import Language.Haskell.TH.Syntax (Lift) import qualified Data.CaseInsensitive as CI import qualified Data.HashMap.Strict as Map +import qualified Data.HashSet as Set import qualified Data.Text as T import qualified Database.PG.Query as Q import qualified Network.HTTP.Types as HTTP @@ -109,6 +112,9 @@ sessionVariablesToHeaders = getSessionVariables :: SessionVariables -> [Text] getSessionVariables = map sessionVariableToText . Map.keys . unSessionVariables +getSessionVariablesSet :: SessionVariables -> Set.HashSet SessionVariable +getSessionVariablesSet = Map.keysSet . unSessionVariables + getSessionVariableValue :: SessionVariable -> SessionVariables -> Maybe SessionVariableValue getSessionVariableValue k = Map.lookup k . unSessionVariables @@ -196,3 +202,9 @@ maybeRoleFromSessionVariables sessionVariables = adminUserInfo :: UserInfo adminUserInfo = UserInfo adminRoleName mempty BOFADisallowed + +userInfoToList :: UserInfo -> [(Text, Text)] +userInfoToList userInfo = + let vars = map (first sessionVariableToText) $ Map.toList $ unSessionVariables . _uiSession $ userInfo + rn = roleNameToTxt . _uiRole $ userInfo + in (sessionVariableToText userRoleHeader, rn) : vars diff --git a/server/src-lib/Hasura/Tracing.hs b/server/src-lib/Hasura/Tracing.hs index c50fbb008b0..3bf7bb0fdfa 100644 --- a/server/src-lib/Hasura/Tracing.hs +++ b/server/src-lib/Hasura/Tracing.hs @@ -36,16 +36,17 @@ import qualified Network.HTTP.Client as HTTP import qualified Network.HTTP.Types.Header as HTTP import qualified System.Random as Rand import qualified Web.HttpApiData as HTTP + import qualified Data.Binary as Bin import qualified Data.ByteString.Base16 as Hex - + -- | Any additional human-readable key-value pairs relevant -- to the execution of a block of code. type TracingMetadata = [(Text, Text)] -newtype Reporter = Reporter - { runReporter +newtype Reporter = Reporter + { runReporter :: forall io a . 
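-- 'englishList' above now takes its joiner ("and"/"or") as a parameter so
-- callers can phrase both conjunctions and disjunctions. Restated as a
-- self-contained example with its behaviour spelled out:

{-# LANGUAGE LambdaCase        #-}
{-# LANGUAGE OverloadedStrings #-}
import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NE
import Data.Text (Text)
import qualified Data.Text as T

englishList :: Text -> NonEmpty Text -> Text
englishList joiner = \case
  one :| []    -> one
  one :| [two] -> one <> " " <> joiner <> " " <> two
  several      ->
    let final :| initials = NE.reverse several
    in T.intercalate ", " (reverse initials) <> ", " <> joiner <> " " <> final

-- englishList "and" ("a" :| ["b", "c"])  ~>  "a, b, and c"
-- englishList "or"  ("red" :| ["blue"])  ~>  "red or blue"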
MonadIO io => TraceContext @@ -93,7 +94,7 @@ newtype TraceT m a = TraceT { unTraceT :: ReaderT (TraceContext, Reporter) (Writ instance MonadTrans TraceT where lift = TraceT . lift . lift -instance MFunctor TraceT where +instance MFunctor TraceT where hoist f (TraceT rwma) = TraceT (hoist (hoist f) rwma) deriving instance MonadBase b m => MonadBase b (TraceT m) @@ -117,17 +118,17 @@ runTraceT name tma = do runTraceTWith :: MonadIO m => TraceContext -> Reporter -> Text -> TraceT m a -> m a runTraceTWith ctx rep name tma = - runReporter rep ctx name - $ runWriterT + runReporter rep ctx name + $ runWriterT $ runReaderT (unTraceT tma) (ctx, rep) - + -- | Run an action in the 'TraceT' monad transformer in an -- existing context. runTraceTInContext :: (MonadIO m, HasReporter m) => TraceContext -> Text -> TraceT m a -> m a runTraceTInContext ctx name tma = do rep <- askReporter runTraceTWith ctx rep name tma - + -- | Run an action in the 'TraceT' monad transformer in an -- existing context. runTraceTWithReporter :: MonadIO m => Reporter -> Text -> TraceT m a -> m a @@ -155,7 +156,7 @@ class Monad m => MonadTrace m where -- | Reinterpret a 'TraceT' action in another 'MonadTrace'. -- This can be useful when you need to reorganize a monad transformer stack. -interpTraceT +interpTraceT :: MonadTrace n => (m (a, TracingMetadata) -> n (b, TracingMetadata)) -> TraceT m a @@ -178,7 +179,7 @@ instance MonadIO m => MonadTrace (TraceT m) where lift . runReporter rep subCtx name . runWriterT $ runReaderT (unTraceT ma) (subCtx, rep) currentContext = TraceT (asks fst) - + currentReporter = TraceT (asks snd) attachMetadata = TraceT . tell @@ -206,7 +207,7 @@ word64ToHex :: Word64 -> Text word64ToHex randNum = bsToTxt $ Hex.encode numInBytes where numInBytes = BL.toStrict (Bin.encode randNum) --- | Decode 16 character hex string to Word64 +-- | Decode 16 character hex string to Word64 -- | Hex.Decode returns two tuples: (properly decoded data, string starts at the first invalid base16 sequence) hexToWord64 :: Text -> Maybe Word64 hexToWord64 randText = do @@ -214,17 +215,17 @@ hexToWord64 randText = do decodedWord64 = Bin.decode $ BL.fromStrict decoded guard (BS.null leftovers) pure decodedWord64 - + -- | Inject the trace context as a set of HTTP headers. injectHttpContext :: TraceContext -> [HTTP.Header] -injectHttpContext TraceContext{..} = +injectHttpContext TraceContext{..} = ("X-B3-TraceId", txtToBs $ word64ToHex tcCurrentTrace) : ("X-B3-SpanId", txtToBs $ word64ToHex tcCurrentSpan) : [ ("X-B3-ParentSpanId", txtToBs $ word64ToHex parentID) | parentID <- maybeToList tcCurrentParent ] - + -- | Extract the trace and parent span headers from a HTTP request -- and create a new 'TraceContext'. The new context will contain -- a fresh span ID, and the provided span ID will be assigned as @@ -238,15 +239,15 @@ extractHttpContext hdrs = do <*> pure (hexToWord64 =<< HTTP.parseHeaderMaybe =<< lookup "X-B3-SpanId" hdrs) --- | Inject the trace context as a JSON value, appropriate for +-- | Inject the trace context as a JSON value, appropriate for -- storing in (e.g.) an event trigger payload. injectEventContext :: TraceContext -> J.Value injectEventContext TraceContext{..} = J.object - [ "trace_id" J..= tcCurrentTrace - , "span_id" J..= tcCurrentSpan + [ "trace_id" J..= tcCurrentTrace + , "span_id" J..= tcCurrentSpan ] - + -- | Extract a trace context from an event trigger payload. 
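-- 'injectHttpContext' above propagates the current trace as Zipkin B3
-- headers, rendering each 64-bit ID as 16 hex characters. A runnable
-- sketch of that encoding ('injectB3' is an illustrative name; the real
-- code carries a full TraceContext):

{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Char8 as B8
import Data.Maybe (maybeToList)
import Data.Word (Word64)
import Text.Printf (printf)

word64ToHex :: Word64 -> B8.ByteString
word64ToHex = B8.pack . printf "%016x"

-- The parent span header is only emitted when a parent exists.
injectB3 :: Word64 -> Word64 -> Maybe Word64 -> [(B8.ByteString, B8.ByteString)]
injectB3 traceId spanId mParent =
  ("X-B3-TraceId", word64ToHex traceId)
    : ("X-B3-SpanId", word64ToHex spanId)
    : [ ("X-B3-ParentSpanId", word64ToHex p) | p <- maybeToList mParent ]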
extractEventContext :: J.Value -> IO (Maybe TraceContext) extractEventContext e = do @@ -257,8 +258,8 @@ extractEventContext e = do <*> pure (e ^? JL.key "trace_context" . JL.key "span_id" . JL._Integral) -- | Perform HTTP request which supports Trace headers -tracedHttpRequest - :: MonadTrace m +tracedHttpRequest + :: MonadTrace m => HTTP.Request -- ^ http request that needs to be made -> (HTTP.Request -> m a) diff --git a/server/tests-py/graphql_server.py b/server/tests-py/graphql_server.py index adc44f480a4..40098f89248 100644 --- a/server/tests-py/graphql_server.py +++ b/server/tests-py/graphql_server.py @@ -581,10 +581,14 @@ class Echo(graphene.ObjectType): class EchoQuery(graphene.ObjectType): echo = graphene.Field( Echo, - int_input=graphene.Int( default_value=1234), + int_input=graphene.Int(default_value=1234), list_input=graphene.Argument(graphene.List(graphene.String), default_value=["hi","there"]), obj_input=graphene.Argument(SizeInput, default_value=SizeInput.default()), enum_input=graphene.Argument(GQColorEnum, default_value=GQColorEnum.RED.name), + r_int_input=graphene.Int(required=True, default_value=1234), + r_list_input=graphene.Argument(graphene.List(graphene.String, required=True), default_value=["general","Kenobi"]), + r_obj_input=graphene.Argument(SizeInput, required=True, default_value=SizeInput.default()), + r_enum_input=graphene.Argument(GQColorEnum, required=True, default_value=GQColorEnum.RED.name), ) def resolve_echo(self, info, int_input, list_input, obj_input, enum_input): diff --git a/server/tests-py/queries/graphql_mutation/enums/insert_enum_field_bad_value.yaml b/server/tests-py/queries/graphql_mutation/enums/insert_enum_field_bad_value.yaml index ca4f5d6b647..892bd4731b0 100644 --- a/server/tests-py/queries/graphql_mutation/enums/insert_enum_field_bad_value.yaml +++ b/server/tests-py/queries/graphql_mutation/enums/insert_enum_field_bad_value.yaml @@ -3,7 +3,7 @@ url: /v1/graphql status: 200 response: errors: - - message: 'unexpected value "not_a_real_color" for enum: ''colors_enum''' + - message: 'expected one of the values red, purple, yellow, orange, green, or blue for type "colors_enum", but found "not_a_real_color"' extensions: code: validation-failed path: $.selectionSet.insert_users.args.objects[0].favorite_color diff --git a/server/tests-py/queries/graphql_mutation/insert/basic/can_insert_in_insertable_view.yaml b/server/tests-py/queries/graphql_mutation/insert/basic/can_insert_in_insertable_view.yaml new file mode 100644 index 00000000000..56135323d8e --- /dev/null +++ b/server/tests-py/queries/graphql_mutation/insert/basic/can_insert_in_insertable_view.yaml @@ -0,0 +1,27 @@ +- description: Inserts in insertable view + url: /v1/graphql + status: 200 + response: + data: + insert_person_mut_view: + returning: + - details: + name: + last: murphy + first: json + + query: + query: | + mutation insert_person_mut_view{ + insert_person_mut_view( + objects: [ + { + details: {name: {first: json last: murphy}} + } + ] + ) { + returning { + details + } + } + } diff --git a/server/tests-py/queries/graphql_mutation/insert/basic/cannot_insert_in_non_insertable_view.yaml b/server/tests-py/queries/graphql_mutation/insert/basic/cannot_insert_in_non_insertable_view.yaml new file mode 100644 index 00000000000..5c8aee86bff --- /dev/null +++ b/server/tests-py/queries/graphql_mutation/insert/basic/cannot_insert_in_non_insertable_view.yaml @@ -0,0 +1,25 @@ +- description: Attempt to insert into a non-insertable view + url: /v1/graphql + status: 200 + response: + errors: + - extensions: +
path: $.selectionSet.insert_person_const_view + code: validation-failed + message: "field \"insert_person_const_view\" not found in type: 'mutation_root'" + + query: + query: | + mutation insert_person_const_view{ + insert_person_const_view( + objects: [ + { + details: {name: {first: json last: murphy}} + } + ] + ) { + returning { + details + } + } + } diff --git a/server/tests-py/queries/graphql_mutation/insert/basic/person_valid_variable_but_invalid_graphql_value.yaml b/server/tests-py/queries/graphql_mutation/insert/basic/person_valid_variable_but_invalid_graphql_value.yaml new file mode 100644 index 00000000000..6808865f8d8 --- /dev/null +++ b/server/tests-py/queries/graphql_mutation/insert/basic/person_valid_variable_but_invalid_graphql_value.yaml @@ -0,0 +1,33 @@ +- description: Insert Person via a GraphQL mutation; the variable used is a valid JSON value but an invalid GraphQL value + url: /v1/graphql + status: 200 + query: + variables: + value: + 1: + name: sherlock holmes + address: 221b Baker St, Marylebone + query: | + mutation insert_person($value: jsonb) { + insert_person( + objects: [ + { + details: $value + } + ] + ) { + returning { + id + details + } + } + } + response: + data: + insert_person: + returning: + - id: 1 + details: + 1: + name: sherlock holmes + address: 221b Baker St, Marylebone diff --git a/server/tests-py/queries/graphql_mutation/insert/basic/schema_setup.yaml b/server/tests-py/queries/graphql_mutation/insert/basic/schema_setup.yaml index f75d472624b..c93036a59fd 100644 --- a/server/tests-py/queries/graphql_mutation/insert/basic/schema_setup.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/basic/schema_setup.yaml @@ -1,6 +1,12 @@ type: bulk args: +#Set timezone +- type: run_sql + args: + sql: | + SET TIME ZONE 'UTC'; + #Author table - type: run_sql args: @@ -44,6 +50,23 @@ args: schema: public name: person +#Person views +- type: run_sql + args: + sql: | + CREATE VIEW person_const_view AS select * from person LIMIT 600; + CREATE VIEW person_mut_view AS select * from person; + +- type: track_table + args: + schema: public + name: person_const_view + +- type: track_table + args: + schema: public + name: person_mut_view + #Order table - type: run_sql args: @@ -148,9 +171,3 @@ args: args: schema: public name: test_types - -#Set timezone -- type: run_sql - args: - sql: | - SET TIME ZONE 'UTC'; diff --git a/server/tests-py/queries/graphql_mutation/insert/basic/schema_teardown.yaml b/server/tests-py/queries/graphql_mutation/insert/basic/schema_teardown.yaml index 70a818ba170..0fd401bf097 100644 --- a/server/tests-py/queries/graphql_mutation/insert/basic/schema_teardown.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/basic/schema_teardown.yaml @@ -11,6 +11,8 @@ args: - type: run_sql args: sql: | + drop view person_const_view; + drop view person_mut_view; drop table person - type: run_sql diff --git a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_area_less_than_4_points_err.yaml b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_area_less_than_4_points_err.yaml index f30c0ad1947..517278693a4 100644 --- a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_area_less_than_4_points_err.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_area_less_than_4_points_err.yaml @@ -5,7 +5,7 @@ response: errors: - extensions: code: parse-failed - path: $.variableValues.areas[0].area.coordinates[0] + path: $.selectionSet.insert_area.args.objects[0].area message: A LinearRing needs at least 4
Positions query: variables: diff --git a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_geometry_unexpected_type_err.yaml b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_geometry_unexpected_type_err.yaml index 0b18219b513..2c6ec78bb09 100644 --- a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_geometry_unexpected_type_err.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_geometry_unexpected_type_err.yaml @@ -5,7 +5,7 @@ response: errors: - extensions: code: parse-failed - path: $.variableValues.landmarks[0].location + path: $.selectionSet.insert_landmark.args.objects[0].location message: 'unexpected geometry type: Random' query: variables: diff --git a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_landmark_single_position_err.yaml b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_landmark_single_position_err.yaml index 14d41cfa2d6..2d8e03ccb7f 100644 --- a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_landmark_single_position_err.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_landmark_single_position_err.yaml @@ -4,7 +4,7 @@ status: 200 response: errors: - extensions: - path: '$.variableValues.landmarks[0].location.coordinates' + path: '$.selectionSet.insert_landmark.args.objects[0].location' code: parse-failed message: A Position needs at least 2 elements query: diff --git a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_linear_ring_last_point_not_equal_to_first_err.yaml b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_linear_ring_last_point_not_equal_to_first_err.yaml index 97c31cdfea9..27ba237341c 100644 --- a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_linear_ring_last_point_not_equal_to_first_err.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_linear_ring_last_point_not_equal_to_first_err.yaml @@ -5,7 +5,7 @@ response: errors: - extensions: code: parse-failed - path: $.variableValues.areas[0].area.coordinates[0] + path: $.selectionSet.insert_area.args.objects[0].area message: the first and last locations have to be equal for a LinearRing query: variables: diff --git a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_road_single_point_err.yaml b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_road_single_point_err.yaml index e9fc66f3a8d..499461e22c0 100644 --- a/server/tests-py/queries/graphql_mutation/insert/geojson/insert_road_single_point_err.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/geojson/insert_road_single_point_err.yaml @@ -4,7 +4,7 @@ status: 200 response: errors: - extensions: - path: $.variableValues.roads[0].path.coordinates + path: $.selectionSet.insert_road.args.objects[0].path code: parse-failed message: A LineString needs at least 2 Positions query: @@ -17,7 +17,7 @@ query: type: LineString query: | mutation insertRoad($roads: [road_insert_input!]!) 
{ - insert_straight_road(objects: $roads) { + insert_road(objects: $roads) { returning{ id name diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_insert_fail.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_insert_fail.yaml index 6adc3efd7e5..46e2d77fe2f 100644 --- a/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_insert_fail.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_insert_fail.yaml @@ -2,7 +2,7 @@ description: As backend user without header url: /v1/graphql status: 200 headers: - X-Hasura-Role: backend_user + X-Hasura-Role: backend_user_2 response: errors: - extensions: diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_no_admin_secret_fail.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_no_admin_secret_fail.yaml index f7b6abc11df..a86d9473339 100644 --- a/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_no_admin_secret_fail.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/backend_user_no_admin_secret_fail.yaml @@ -2,7 +2,7 @@ description: As backend user with header. This test is run only if any authoriza url: /v1/graphql status: 200 headers: - X-Hasura-Role: backend_user + X-Hasura-Role: backend_user_2 X-Hasura-Use-Backend-Only-Permissions: 'true' response: errors: diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/leads_upsert_check_with_headers.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/leads_upsert_check_with_headers.yaml new file mode 100644 index 00000000000..4752b20a137 --- /dev/null +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/leads_upsert_check_with_headers.yaml @@ -0,0 +1,49 @@ +- description: Trying to upsert with required headers set should succeed + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: sales + X-Hasura-User-Id: sales 1 + response: + data: + insert_leads: + affected_rows: 1 + query: + query: | + mutation { + insert_leads(objects: + [{id: 1, name: "lead 2", added_by: "sales 1"}] + , on_conflict: {constraint: leads_pkey, update_columns: [name]} + ) { + affected_rows + } + } + +- description: Trying to upsert without the required headers set should fail + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: sales + response: + errors: + - extensions: + path: "$" + code: not-found + message: 'missing session variables: "x-hasura-user-id"' + query: + query: | + mutation { + insert_leads( + objects: [{ + id: 1 + name: "lead 2" + added_by: "sales 1" + }] + on_conflict: { + constraint: leads_pkey + update_columns: [name] + } + ) { + affected_rows + } + } diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/schema_setup.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/schema_setup.yaml index b4bd18a5236..7e213954195 100644 --- a/server/tests-py/queries/graphql_mutation/insert/permissions/schema_setup.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/schema_setup.yaml @@ -464,6 +464,15 @@ args: id: X-Hasura-User-Id is_admin: true +- type: create_update_permission + args: + table: user + role: backend_user + permission: + check: {} + filter: {} + columns: '*' + - type: create_insert_permission args: table: user @@ -475,6 +484,17 @@ args: set: is_admin: true +- type: create_insert_permission + args: + table: user + role: backend_user_2 + permission: + check: {} + columns: '*' + backend_only: 
true + set: + is_admin: true + - type: create_select_permission + args: + table: user + role: backend_user_2 + permission: + columns: '*' + filter: {} + - type: create_insert_permission args: table: user @@ -493,3 +521,41 @@ args: backend_only: false set: is_admin: false + +- type: run_sql + args: + sql: | + create table leads ( + id serial primary key, + name text not null, + added_by text not null + ); + +- type: track_table + args: + schema: public + name: leads + + +# a sales role can add a new lead without any check - type: create_insert_permission + args: + table: leads + role: sales + permission: + columns: [id, name, added_by] + check: {} + set: {} + +# a sales role can only update the leads added by them - type: create_update_permission + args: + table: leads + role: sales + permission: + columns: [name] + filter: + added_by: "X-Hasura-User-Id" + check: + name: + _ne: "" diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/schema_teardown.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/schema_teardown.yaml index 2062062439e..82b59d8f253 100644 --- a/server/tests-py/queries/graphql_mutation/insert/permissions/schema_teardown.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/schema_teardown.yaml @@ -21,4 +21,5 @@ args: drop table computer; drop table "user"; drop table account; + drop table leads; cascade: true diff --git a/server/tests-py/queries/graphql_mutation/insert/permissions/user_with_no_backend_privilege.yaml b/server/tests-py/queries/graphql_mutation/insert/permissions/user_with_no_backend_privilege.yaml index 690c52ee9ea..23e71ab8073 100644 --- a/server/tests-py/queries/graphql_mutation/insert/permissions/user_with_no_backend_privilege.yaml +++ b/server/tests-py/queries/graphql_mutation/insert/permissions/user_with_no_backend_privilege.yaml @@ -1,9 +1,9 @@ -description: As user with no backend privilege +description: As a backend user without backend-only permissions url: /v1/graphql status: 200 headers: - X-Hasura-Role: user - X-Hasura-Use-Backend-Only-Permissions: 'true' + X-Hasura-Role: backend_user + X-Hasura-Use-Backend-Only-Permissions: 'false' response: errors: - extensions: diff --git a/server/tests-py/queries/graphql_mutation/update/basic/article_column_multiple_operators.yaml b/server/tests-py/queries/graphql_mutation/update/basic/article_column_multiple_operators.yaml index 46c0d252103..47bb3c754e9 100644 --- a/server/tests-py/queries/graphql_mutation/update/basic/article_column_multiple_operators.yaml +++ b/server/tests-py/queries/graphql_mutation/update/basic/article_column_multiple_operators.yaml @@ -4,9 +4,10 @@ status: 200 response: errors: - extensions: - path: "$" + path: "$.selectionSet.update_article.args" code: validation-failed - message: column found in multiple operators; "id" in _set, _inc. "author_id" in _set, _inc + message: column found in multiple operators; "author_id" in _set, _inc.
"id" in _set, _inc + query: query: | mutation { diff --git a/server/tests-py/queries/graphql_mutation/update/basic/schema_setup.yaml b/server/tests-py/queries/graphql_mutation/update/basic/schema_setup.yaml index 8be1025591d..44cbbc1556d 100644 --- a/server/tests-py/queries/graphql_mutation/update/basic/schema_setup.yaml +++ b/server/tests-py/queries/graphql_mutation/update/basic/schema_setup.yaml @@ -80,6 +80,7 @@ args: - type: run_sql args: sql: | + SET lc_monetary TO "en_US.utf-8"; CREATE TABLE numerics ( id SERIAL PRIMARY KEY, num_smallint SMALLINT, diff --git a/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_with_select_access.yaml b/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_with_select_access.yaml new file mode 100644 index 00000000000..dd8d0d7173e --- /dev/null +++ b/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_with_select_access.yaml @@ -0,0 +1,42 @@ +- description: The 'columns' argument to 'count' should be exposed, as the role has select access to the cols + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: role_with_access_to_cols + response: + data: + article_aggregate: + aggregate: + count: 3 + query: + query: | + query { + article_aggregate { + aggregate { + count(columns:[title,content]) + } + } + } + +- description: The aggregate functions that use column data should be exposed, as the role has select access to them + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: role_with_access_to_cols + response: + data: + article_aggregate: + aggregate: + max: + id: 3 + query: + query: | + query { + article_aggregate { + aggregate { + max { + id + } + } + } + } diff --git a/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_without_select_access.yaml b/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_without_select_access.yaml new file mode 100644 index 00000000000..864fbfcd187 --- /dev/null +++ b/server/tests-py/queries/graphql_query/agg_perm/article_agg_with_role_without_select_access.yaml @@ -0,0 +1,63 @@ +- description: User can query for the count of the rows without having select access + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: role_without_access_to_cols + response: + data: + article_aggregate: + aggregate: + count: 3 + query: + query: | + query { + article_aggregate { + aggregate { + count + } + } + } + +- description: The 'columns' argument to 'count' should not be exposed, because the role doesn't have access + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: role_without_access_to_cols + response: + errors: + - extensions: + path: $.selectionSet.article_aggregate.selectionSet.aggregate.selectionSet.count + code: validation-failed + message: "\"count\" has no argument named \"columns\"" + query: + query: | + query { + article_aggregate { + aggregate { + count(columns:[title,content]) + } + } + } + +- description: The aggregate functions that use column data should not be exposed + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: role_without_access_to_cols + response: + errors: + - extensions: + path: $.selectionSet.article_aggregate.selectionSet.aggregate.selectionSet.max + code: validation-failed + message: "field \"max\" not found in type: 'article_aggregate_fields'" + query: + query: | + query { + article_aggregate { + aggregate { + max { + published_on + } + } + } + } diff --git a/server/tests-py/queries/graphql_query/agg_perm/setup.yaml b/server/tests-py/queries/graphql_query/agg_perm/setup.yaml index 
a0b7c2170d6..ec911b2c5f5 100644 --- a/server/tests-py/queries/graphql_query/agg_perm/setup.yaml +++ b/server/tests-py/queries/graphql_query/agg_perm/setup.yaml @@ -157,3 +157,23 @@ args: filter: {} allow_aggregations: true limit: 1 + +- type: create_select_permission + args: + table: article + role: role_without_access_to_cols + permission: + columns: [] + filter: {} + allow_aggregations: true + limit: 1 + +- type: create_select_permission + args: + table: article + role: role_with_access_to_cols + permission: + columns: "*" + filter: {} + allow_aggregations: true + limit: 1 diff --git a/server/tests-py/queries/graphql_query/basic/select_query_fragment_cycles.yaml b/server/tests-py/queries/graphql_query/basic/select_query_fragment_cycles.yaml index e66c9290831..f9d643e6ed4 100644 --- a/server/tests-py/queries/graphql_query/basic/select_query_fragment_cycles.yaml +++ b/server/tests-py/queries/graphql_query/basic/select_query_fragment_cycles.yaml @@ -27,4 +27,4 @@ response: - extensions: path: $.selectionSet.author.selectionSet.authorFragment.selectionSet.articles.selectionSet.articleFragment.selectionSet.author.selectionSet code: validation-failed - message: cannot spread fragment "authorFragment" within itself via articleFragment,authorFragment + message: the fragment definition(s) authorFragment and articleFragment form a cycle diff --git a/server/tests-py/queries/graphql_query/basic/select_query_test_types.yaml b/server/tests-py/queries/graphql_query/basic/select_query_test_types.yaml index 5c325a86df3..3ce2dc28c03 100644 --- a/server/tests-py/queries/graphql_query/basic/select_query_test_types.yaml +++ b/server/tests-py/queries/graphql_query/basic/select_query_test_types.yaml @@ -40,23 +40,23 @@ response: obj: c1: c2 arr: [1,2,3] - _underscore: 0 - '!@#$%^': special - translations: - hello world!: hi - objs: - - 你好: Hello! + _underscore: 0 + '!@#$%^': special + translations: + hello world!: hi + objs: + - 你好: Hello! c32_json_dollar: a: b obj: c1: c2 arr: [1,2,3] - _underscore: 0 - '!@#$%^': special - translations: - hello world!: hi - objs: - - 你好: Hello! + _underscore: 0 + '!@#$%^': special + translations: + hello world!: hi + objs: + - 你好: Hello! c32_json_child_prop: c2 c32_json_child_prop_no_dot: b c32_json_array_item: 1 @@ -68,26 +68,26 @@ response: c32_json_nested_special_array_double_quote_dot: Hello! c33_jsonb: c: d - arr: [4,5,6] + arr: [4,5,6] obj: - e1: e2 - objs: - - 你好: Hello! - '!@#$%^': special - _underscore: 0 - translations: - hello world!: hi + e1: e2 + objs: + - 你好: Hello! + '!@#$%^': special + _underscore: 0 + translations: + hello world!: hi c33_jsonb_dollar: c: d - arr: [4,5,6] + arr: [4,5,6] obj: - e1: e2 - objs: - - 你好: Hello! - '!@#$%^': special - _underscore: 0 - translations: - hello world!: hi + e1: e2 + objs: + - 你好: Hello! 
+ '!@#$%^': special + _underscore: 0 + translations: + hello world!: hi c33_jsonb_child_prop: e2 c33_jsonb_child_prop_no_dot: d c33_jsonb_array_item: 6 diff --git a/server/tests-py/queries/graphql_query/basic/setup.yaml b/server/tests-py/queries/graphql_query/basic/setup.yaml index 5ea2a2b4402..9f18bb9b120 100644 --- a/server/tests-py/queries/graphql_query/basic/setup.yaml +++ b/server/tests-py/queries/graphql_query/basic/setup.yaml @@ -18,6 +18,18 @@ args: price numeric ); +#Set timezone +- type: run_sql + args: + sql: | + SET TIME ZONE 'UTC'; + +#Set money locale +- type: run_sql + args: + sql: | + SET lc_monetary TO "en_US.utf-8"; + #Test table with different types - type: run_sql args: @@ -293,9 +305,3 @@ args: - name: "John\\" - name: "Clarke" - name: "clarke" - -#Set timezone -- type: run_sql - args: - sql: | - SET TIME ZONE 'UTC'; diff --git a/server/tests-py/queries/graphql_query/boolexp/raster/query_st_intersects_rast_fail.yaml b/server/tests-py/queries/graphql_query/boolexp/raster/query_st_intersects_rast_fail.yaml index 719df183e5f..d74dd1bbba7 100644 --- a/server/tests-py/queries/graphql_query/boolexp/raster/query_st_intersects_rast_fail.yaml +++ b/server/tests-py/queries/graphql_query/boolexp/raster/query_st_intersects_rast_fail.yaml @@ -4,7 +4,7 @@ status: 200 response: errors: - extensions: - path: "$.variableValues.rast" + path: "$.selectionSet.dummy_rast.args.where.rast._st_intersects_rast" code: parse-failed message: invalid hexadecimal representation of raster well known binary format diff --git a/server/tests-py/queries/graphql_query/enums/introspect_user_role.yaml b/server/tests-py/queries/graphql_query/enums/introspect_user_role.yaml index 78323d63bf3..e39a7bba81d 100644 --- a/server/tests-py/queries/graphql_query/enums/introspect_user_role.yaml +++ b/server/tests-py/queries/graphql_query/enums/introspect_user_role.yaml @@ -1,4 +1,12 @@ # https://github.com/hasura/graphql-engine/issues/5200 + +# NOTE: +# The GraphQL schema generation refactor (https://github.com/hasura/graphql-engine/pull/4111) automatically fixes the +# aforementioned issue, but in a different way: it does not generate *_by_pk root fields +# unless all primary key columns are marked for selection in the permission. The actual fix +# (https://github.com/hasura/graphql-engine/pull/5522) is to generate the typeinfos for all primary key columns +# irrespective of select permissions. So, the test case is modified accordingly to check +# the absence of the zones_by_pk query root field.
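+# +# For example, under this refactor a role whose select permission omits one of +# a table's primary key columns simply does not get the corresponding *_by_pk +# root field; selecting zones_by_pk as the user role now fails validation with +# a message along the lines of: +#   field "zones_by_pk" not found in type: 'query_root'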
description: Test introspecting enum types as user role url: /v1/graphql status: 200 @@ -6,14 +14,9 @@ headers: X-Hasura-Role: user response: data: - country: - kind: ENUM - name: country_enum - enumValues: - - name: India - description: Republic of India - - name: USA - description: United States of America + query_root_fields: + fields: + - name: zones zones: fields: - name: code @@ -27,12 +30,9 @@ response: query: query: | { - country: __type(name: "country_enum") { - name - kind - enumValues { + query_root_fields: __type(name: "query_root") { + fields { name - description } } zones: __type(name: "zones") { diff --git a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_bad_value.yaml b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_bad_value.yaml index 7a359e73eec..4dc6daf5ee0 100644 --- a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_bad_value.yaml +++ b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_bad_value.yaml @@ -3,7 +3,7 @@ url: /v1/graphql status: 200 response: errors: - - message: 'unexpected value "not_a_real_color" for enum: ''colors_enum''' + - message: 'expected one of the values red, purple, yellow, orange, green, or blue for type "colors_enum", but found "not_a_real_color"' extensions: code: validation-failed path: $.selectionSet.users.args.where.favorite_color._eq diff --git a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_string.yaml b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_string.yaml index 2f8d378208a..e1775e9641e 100644 --- a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_string.yaml +++ b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_string.yaml @@ -3,7 +3,7 @@ url: /v1/graphql status: 200 response: errors: - - message: expecting an enum + - message: expected an enum value for type "colors_enum", but found a string extensions: code: validation-failed path: $.selectionSet.users.args.where.favorite_color._eq diff --git a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_variable_bad_value.yaml b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_variable_bad_value.yaml index 36a663ddf01..60e281590c4 100644 --- a/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_variable_bad_value.yaml +++ b/server/tests-py/queries/graphql_query/enums/select_where_enum_eq_variable_bad_value.yaml @@ -3,10 +3,10 @@ url: /v1/graphql status: 200 response: errors: - - message: 'unexpected value "not_a_real_color" for enum: ''colors_enum''' + - message: 'expected one of the values red, purple, yellow, orange, green, or blue for type "colors_enum", but found "not_a_real_color"' extensions: code: validation-failed - path: $.variableValues.color + path: $.selectionSet.users.args.where.favorite_color._eq query: query: | query ($color: colors_enum) { diff --git a/server/tests-py/queries/graphql_query/limits/select_query_article_string_limit_error.yaml b/server/tests-py/queries/graphql_query/limits/select_query_article_string_limit_error.yaml index 0ce2dddb570..9aa8bf7e433 100644 --- a/server/tests-py/queries/graphql_query/limits/select_query_article_string_limit_error.yaml +++ b/server/tests-py/queries/graphql_query/limits/select_query_article_string_limit_error.yaml @@ -6,7 +6,7 @@ response: - extensions: code: validation-failed path: $.selectionSet.article.args.limit - message: expecting Integer value for "limit" + message: expected a 32-bit integer for type "Int", but found a string query: query: | 
query { diff --git a/server/tests-py/queries/graphql_query/permissions/artist_select_query_Track_fail.yaml b/server/tests-py/queries/graphql_query/permissions/artist_select_query_Track_fail.yaml index 88dc44fcd1a..1e441863108 100644 --- a/server/tests-py/queries/graphql_query/permissions/artist_select_query_Track_fail.yaml +++ b/server/tests-py/queries/graphql_query/permissions/artist_select_query_Track_fail.yaml @@ -1,4 +1,4 @@ -description: Artist can only select his/her tracks. Without sending header (Error) +description: Artist can only select their tracks. Without sending header (Error) url: /v1/graphql status: 200 headers: @@ -8,7 +8,7 @@ response: - extensions: code: not-found path: "$" - message: '"x-hasura-artist-id" header is expected but not found' + message: 'missing session variables: "x-hasura-artist-id"' query: query: | query { diff --git a/server/tests-py/queries/graphql_query/permissions/select_articles_without_required_headers.yaml b/server/tests-py/queries/graphql_query/permissions/select_articles_without_required_headers.yaml new file mode 100644 index 00000000000..addf43ee0f8 --- /dev/null +++ b/server/tests-py/queries/graphql_query/permissions/select_articles_without_required_headers.yaml @@ -0,0 +1,55 @@ +- description: Select related articles while querying authors, but without setting the headers required for selecting articles + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: critic + response: + errors: + - extensions: + path: "$" + code: not-found + message: 'missing session variables: "x-hasura-critic-id"' + query: + query: | + query { + author { + name + articles { + title + content + is_published + } + } + } + +- description: Select related articles while querying authors with a role which doesn't require any headers + to be set to query articles + status: 200 + url: /v1/graphql + headers: + X-Hasura-Role: anonymous + response: + data: + author: + - name: Author 1 + articles: + - title: Article 2 + content: Sample article content 2 + is_published: true + - name: Author 2 + articles: + - title: Article 3 + content: Sample article content 3 + is_published: true + query: + query: | + query { + author { + name + articles { + title + content + is_published + } + } + } diff --git a/server/tests-py/queries/graphql_query/permissions/setup.yaml b/server/tests-py/queries/graphql_query/permissions/setup.yaml index 5e1960ed198..10e34a6df15 100644 --- a/server/tests-py/queries/graphql_query/permissions/setup.yaml +++ b/server/tests-py/queries/graphql_query/permissions/setup.yaml @@ -283,6 +283,42 @@ args: name: search_tracks schema: public +#Create Books table +- type: run_sql + args: + sql: | + CREATE TABLE books ( + id int, + author_name text, + book_name text, + published_on timestamptz, + PRIMARY KEY (id,book_name) + ); + +# Track table Books +- type: track_table + args: + schema: public + name: books + +- type: insert + args: + table: books + objects: + - id: 1 + author_name: J.K. 
Rowling + book_name: Harry Potter + published_on: "1997-06-26" + +#Create select permission on books, granting access to only one of the primary key columns +- type: create_select_permission + args: + table: books + role: user + permission: + columns: ["author_name","book_name","published_on"] + filter: {} + #Permission based on PostGIS operators - type: run_sql args: @@ -523,3 +559,24 @@ args: columns: - id - bid_price + +- type: create_select_permission + args: + table: article + role: critic + permission: + columns: + - title + - content + - is_published + filter: + id: + _eq: X-Hasura-Critic-Id + +- type: create_select_permission + args: + table: author + role: critic + permission: + columns: ["name"] + filter: {} diff --git a/server/tests-py/queries/graphql_query/permissions/teardown.yaml b/server/tests-py/queries/graphql_query/permissions/teardown.yaml index 3ae9e98b922..77c1f4459a7 100644 --- a/server/tests-py/queries/graphql_query/permissions/teardown.yaml +++ b/server/tests-py/queries/graphql_query/permissions/teardown.yaml @@ -7,6 +7,7 @@ args: DROP TABLE author; DROP TABLE "Track" cascade; DROP TABLE "Artist"; + DROP TABLE books; DROP TABLE geom_table; DROP TABLE jsonb_table; DROP TABLE gpa cascade; diff --git a/server/tests-py/queries/graphql_query/permissions/user_should_not_be_able_to_access_books_by_pk.yaml b/server/tests-py/queries/graphql_query/permissions/user_should_not_be_able_to_access_books_by_pk.yaml new file mode 100644 index 00000000000..9138233d051 --- /dev/null +++ b/server/tests-py/queries/graphql_query/permissions/user_should_not_be_able_to_access_books_by_pk.yaml @@ -0,0 +1,20 @@ +- description: User cannot access books_by_pk + url: /v1/graphql + status: 200 + headers: + X-Hasura-Role: user + response: + errors: + - extensions: + path: $.selectionSet.books_by_pk + code: validation-failed + message: "field \"books_by_pk\" not found in type: 'query_root'" + query: + query: | + query { + books_by_pk(id:1,book_name:"Harry Potter") { + author_name + book_name + published_on + } + } diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_1.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_1.yaml new file mode 100644 index 00000000000..aecc5005fda --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_1.yaml @@ -0,0 +1,49 @@ +description: Get last page of articles with 3 items +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + last: 3 + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiA0fQ== + endCursor: eyJpZCIgOiA2fQ== + hasPreviousPage: true + hasNextPage: false + edges: + - cursor: eyJpZCIgOiA0fQ== + node: + title: Article 4 + content: Sample article content 4 + author_id: 2 + - cursor: eyJpZCIgOiA1fQ== + node: + title: Article 5 + content: Sample article content 5 + author_id: 2 + - cursor: eyJpZCIgOiA2fQ== + node: + title: Article 6 + content: Sample article content 6 + author_id: 3 diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_2.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_2.yaml new file mode 100644 index 00000000000..d0210d68e50 --- /dev/null +++
b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_2.yaml @@ -0,0 +1,45 @@ +description: Get last page of articles with 2 items before 'Article 4' +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + last: 2 + before: "eyJpZCIgOiA0fQ==" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiAyfQ== + endCursor: eyJpZCIgOiAzfQ== + hasPreviousPage: true + hasNextPage: true + edges: + - cursor: eyJpZCIgOiAyfQ== + node: + title: Article 2 + content: Sample article content 2 + author_id: 1 + - cursor: eyJpZCIgOiAzfQ== + node: + title: Article 3 + content: Sample article content 3 + author_id: 1 diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_3.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_3.yaml new file mode 100644 index 00000000000..1e255c73568 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/backward/page_3.yaml @@ -0,0 +1,40 @@ +description: Get last page of articles before 'Article 2' +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + last: 2 + before: "eyJpZCIgOiAyfQ==" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiAxfQ== + endCursor: eyJpZCIgOiAxfQ== + hasPreviousPage: false + hasNextPage: true + edges: + - cursor: eyJpZCIgOiAxfQ== + node: + title: Article 1 + content: Sample article content 1 + author_id: 1 diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_1.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_1.yaml new file mode 100644 index 00000000000..388d8f4bcef --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_1.yaml @@ -0,0 +1,49 @@ +description: Get 1st page of articles with 3 items +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + first: 3 + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiAxfQ== + endCursor: eyJpZCIgOiAzfQ== + hasPreviousPage: false + hasNextPage: true + edges: + - cursor: eyJpZCIgOiAxfQ== + node: + title: Article 1 + content: Sample article content 1 + author_id: 1 + - cursor: eyJpZCIgOiAyfQ== + node: + title: Article 2 + content: Sample article content 2 + author_id: 1 + - cursor: eyJpZCIgOiAzfQ== + node: + title: Article 3 + content: Sample article content 3 + author_id: 1 diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_2.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_2.yaml new file mode 100644 index 00000000000..f8afdcc7ae7 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_2.yaml @@ -0,0 +1,45 @@ +description: Get 2nd page of articles with 2 items +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + first: 2 + 
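+ # Cursors in these tests are base64-encoded JSON of the node's key columns + # (an implementation detail, decoded here only to make the fixtures readable); + # "eyJpZCIgOiAzfQ==" below decodes to {"id" : 3}, i.e. resume after Article 3.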
after: "eyJpZCIgOiAzfQ==" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiA0fQ== + endCursor: eyJpZCIgOiA1fQ== + hasPreviousPage: true + hasNextPage: true + edges: + - cursor: eyJpZCIgOiA0fQ== + node: + title: Article 4 + content: Sample article content 4 + author_id: 2 + - cursor: eyJpZCIgOiA1fQ== + node: + title: Article 5 + content: Sample article content 5 + author_id: 2 diff --git a/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_3.yaml b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_3.yaml new file mode 100644 index 00000000000..a41fcb04ab0 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/article_pagination_no_orderby/forward/page_3.yaml @@ -0,0 +1,40 @@ +description: Get 3rd page of articles +url: /v1beta1/relay +status: 200 +query: + query: | + query { + article_connection( + first: 3 + after: "eyJpZCIgOiA1fQ==" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + title + content + author_id + } + } + } + } +response: + data: + article_connection: + pageInfo: + startCursor: eyJpZCIgOiA2fQ== + endCursor: eyJpZCIgOiA2fQ== + hasPreviousPage: true + hasNextPage: false + edges: + - cursor: eyJpZCIgOiA2fQ== + node: + title: Article 6 + content: Sample article content 6 + author_id: 3 diff --git a/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_1.yaml b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_1.yaml new file mode 100644 index 00000000000..635a543850c --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_1.yaml @@ -0,0 +1,44 @@ +description: Fetch 1st page from the last of authors ordered by their article count +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + last: 1 + order_by: {articles_aggregate: {count: asc}} + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + name + articles_aggregate{ + aggregate{ + count + } + } + } + } + } + } +response: + data: + author_connection: + pageInfo: + startCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9 + endCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9 + hasPreviousPage: true + hasNextPage: false + edges: + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9 + node: + name: Author 1 + articles_aggregate: + aggregate: + count: 3 diff --git a/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_2.yaml b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_2.yaml new file mode 100644 index 00000000000..eeaa0801e7f --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_2.yaml @@ -0,0 +1,51 @@ +description: Fetch 2nd page from the last of authors ordered by their article count +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + last: 2 + order_by: {articles_aggregate: {count: asc}} + before: "eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9" + ){ + pageInfo{ + startCursor +
endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + name + articles_aggregate{ + aggregate{ + count + } + } + } + } + } + } +response: + data: + author_connection: + pageInfo: + startCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9 + endCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAyfSwgImlkIiA6IDJ9 + hasPreviousPage: true + hasNextPage: true + edges: + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9 + node: + name: Author 3 + articles_aggregate: + aggregate: + count: 1 + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAyfSwgImlkIiA6IDJ9 + node: + name: Author 2 + articles_aggregate: + aggregate: + count: 2 diff --git a/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_3.yaml b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_3.yaml new file mode 100644 index 00000000000..03d47c3c03d --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/backward/page_3.yaml @@ -0,0 +1,45 @@ +description: Fetch 3rd page from the last of authors ordered by their article count +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + last: 1 + order_by: {articles_aggregate: {count: asc}} + before: "eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + name + articles_aggregate{ + aggregate{ + count + } + } + } + } + } + } +response: + data: + author_connection: + pageInfo: + startCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAwfSwgImlkIiA6IDR9 + endCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAwfSwgImlkIiA6IDR9 + hasPreviousPage: false + hasNextPage: true + edges: + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAwfSwgImlkIiA6IDR9 + node: + name: Author 4 + articles_aggregate: + aggregate: + count: 0 diff --git a/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_1.yaml b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_1.yaml new file mode 100644 index 00000000000..f87cf796b2e --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_1.yaml @@ -0,0 +1,50 @@ +description: Fetch 1st page of authors ordered by their article count +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + first: 2 + order_by: {articles_aggregate: {count: asc}} + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + name + articles_aggregate{ + aggregate{ + count + } + } + } + } + } + } +response: + data: + author_connection: + pageInfo: + startCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAwfSwgImlkIiA6IDR9 + endCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9 + hasPreviousPage: false + hasNextPage: true + edges: + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAwfSwgImlkIiA6IDR9 + node: + name: Author 4 + articles_aggregate: + aggregate: + count: 0 + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9 + node: + name: Author 3 + articles_aggregate: + aggregate: + count: 1 diff --git a/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_2.yaml
b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_2.yaml new file mode 100644 index 00000000000..195402efc2a --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/author_pagination_articles_aggregate_orderby/forward/page_2.yaml @@ -0,0 +1,51 @@ +description: Fetch 2nd page of authors ordered by their article count +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + first: 2 + order_by: {articles_aggregate: {count: asc}} + after: "eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAxfSwgImlkIiA6IDN9" + ){ + pageInfo{ + startCursor + endCursor + hasPreviousPage + hasNextPage + } + edges{ + cursor + node{ + name + articles_aggregate{ + aggregate{ + count + } + } + } + } + } + } +response: + data: + author_connection: + pageInfo: + startCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAyfSwgImlkIiA6IDJ9 + endCursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9 + hasPreviousPage: true + hasNextPage: false + edges: + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAyfSwgImlkIiA6IDJ9 + node: + name: Author 2 + articles_aggregate: + aggregate: + count: 2 + - cursor: eyJhcnRpY2xlc19hZ2dyZWdhdGUiIDogeyJjb3VudCIgOiAzfSwgImlkIiA6IDF9 + node: + name: Author 1 + articles_aggregate: + aggregate: + count: 3 diff --git a/server/tests-py/queries/graphql_query/relay/basic/invalid_node_id.yaml b/server/tests-py/queries/graphql_query/relay/basic/invalid_node_id.yaml new file mode 100644 index 00000000000..9546a3b377d --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/basic/invalid_node_id.yaml @@ -0,0 +1,19 @@ +description: Query node interface with invalid node id +url: /v1beta1/relay +status: 200 +query: + query: | + query { + node(id: "eyJpZCIgOiA0fQ=="){ + __typename + ...
on author{ + name + } + } + } +response: + errors: + - extensions: + path: "$.selectionSet.node" + code: validation-failed + message: the node id is invalid diff --git a/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/after_and_before.yaml b/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/after_and_before.yaml index 7dc25837432..04e241c40af 100644 --- a/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/after_and_before.yaml +++ b/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/after_and_before.yaml @@ -16,6 +16,6 @@ query: response: errors: - extensions: - path: "$.selectionSet.author_connection" + path: $.selectionSet.author_connection.args code: validation-failed message: '"after" and "before" are not allowed at once' diff --git a/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/first_and_last.yaml b/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/first_and_last.yaml index dfa895d6e66..c6fa9dc9714 100644 --- a/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/first_and_last.yaml +++ b/server/tests-py/queries/graphql_query/relay/basic/pagination_errors/first_and_last.yaml @@ -16,6 +16,6 @@ query: response: errors: - extensions: - path: "$.selectionSet.author_connection" + path: "$.selectionSet.author_connection.args" code: validation-failed message: '"first" and "last" are not allowed at once' diff --git a/server/tests-py/queries/graphql_query/relay/pagination_errors/after_and_before.yaml b/server/tests-py/queries/graphql_query/relay/pagination_errors/after_and_before.yaml new file mode 100644 index 00000000000..7dc25837432 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/pagination_errors/after_and_before.yaml @@ -0,0 +1,21 @@ +description: Use after and before arguments in the same query +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + after: "eyJpZCIgOiAyfQ==" + before: "eyJpZCIgOiA0fQ==" + ){ + edges{ + cursor + } + } + } +response: + errors: + - extensions: + path: "$.selectionSet.author_connection" + code: validation-failed + message: '"after" and "before" are not allowed at once' diff --git a/server/tests-py/queries/graphql_query/relay/pagination_errors/first_and_last.yaml b/server/tests-py/queries/graphql_query/relay/pagination_errors/first_and_last.yaml new file mode 100644 index 00000000000..dfa895d6e66 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/pagination_errors/first_and_last.yaml @@ -0,0 +1,21 @@ +description: Use first and last arguments in the same query +url: /v1beta1/relay +status: 200 +query: + query: | + query { + author_connection( + first: 1 + last: 2 + ){ + edges{ + cursor + } + } + } +response: + errors: + - extensions: + path: "$.selectionSet.author_connection" + code: validation-failed + message: '"first" and "last" are not allowed at once' diff --git a/server/tests-py/queries/graphql_query/relay/setup.yaml b/server/tests-py/queries/graphql_query/relay/setup.yaml new file mode 100644 index 00000000000..44f330d7c43 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/setup.yaml @@ -0,0 +1,79 @@ +type: bulk +args: +- type: run_sql + args: + sql: | + CREATE TABLE author( + id SERIAL PRIMARY KEY, + name TEXT UNIQUE NOT NULL + ); + + INSERT INTO author (name) + VALUES ('Author 1'), ('Author 2'), ('Author 3'), ('Author 4'); + + CREATE TABLE article ( + id SERIAL PRIMARY KEY, + title TEXT, + content TEXT, + author_id INTEGER REFERENCES author(id) + ); + + 
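+ -- Seed data note: after the inserts below the per-author article counts are + -- 3 (Author 1), 2 (Author 2), 1 (Author 3) and 0 (Author 4); the aggregate + -- order_by relay tests above rely on exactly these counts in their cursors.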
INSERT INTO article (title, content, author_id) + VALUES + ( + 'Article 1', + 'Sample article content 1', + 1 + ), + ( + 'Article 2', + 'Sample article content 2', + 1 + ), + ( + 'Article 3', + 'Sample article content 3', + 1 + ), + ( + 'Article 4', + 'Sample article content 4', + 2 + ), + ( + 'Article 5', + 'Sample article content 5', + 2 + ), + ( + 'Article 6', + 'Sample article content 6', + 3 + ); + +# Track tables and define relationships +- type: track_table + args: + name: author + schema: public + +- type: track_table + args: + name: article + schema: public + +- type: create_object_relationship + args: + table: article + name: author + using: + foreign_key_constraint_on: author_id + +- type: create_array_relationship + args: + table: author + name: articles + using: + foreign_key_constraint_on: + table: article + column: author_id diff --git a/server/tests-py/queries/graphql_query/relay/teardown.yaml b/server/tests-py/queries/graphql_query/relay/teardown.yaml new file mode 100644 index 00000000000..65471ac1d13 --- /dev/null +++ b/server/tests-py/queries/graphql_query/relay/teardown.yaml @@ -0,0 +1,8 @@ +type: bulk +args: +- type: run_sql + args: + cascade: true + sql: | + DROP TABLE article; + DROP TABLE author; diff --git a/server/tests-py/queries/graphql_validation/json_column_value.yaml b/server/tests-py/queries/graphql_validation/json_column_value.yaml new file mode 100644 index 00000000000..c67cdc433f8 --- /dev/null +++ b/server/tests-py/queries/graphql_validation/json_column_value.yaml @@ -0,0 +1,38 @@ +- description: JSON variables should not be interpreted as graphql input values + url: /v1/graphql + status: 200 + response: + data: + insert_article_one: + body: + 1: 2 + 2: 3 + query: + query: | + mutation insert_article($body: jsonb) { + insert_article_one(object: {body: $body}) { + body + } + } + variables: + body: + 1: 2 + 2: 3 + +- description: variables within JSON values should be properly interpolated + url: /v1/graphql + status: 200 + response: + data: + insert_article_one: + body: + - header: "X-HEADER-THINGY" + query: + query: | + mutation insert_article($header: jsonb) { + insert_article_one(object: {body: [{header: $header}]}) { + body + } + } + variables: + header: "X-HEADER-THINGY" diff --git a/server/tests-py/queries/graphql_validation/null_value_err.yaml b/server/tests-py/queries/graphql_validation/null_value_err.yaml index c76b9bc8066..4881a56f69d 100644 --- a/server/tests-py/queries/graphql_validation/null_value_err.yaml +++ b/server/tests-py/queries/graphql_validation/null_value_err.yaml @@ -6,7 +6,7 @@ response: - extensions: path: "$.selectionSet.update_author.args.where" code: "validation-failed" - message: "null value found for non-nullable type: author_bool_exp!" + message: expected an object for type "author_bool_exp", but found null query: query: | mutation update_author { diff --git a/server/tests-py/queries/graphql_validation/null_variable_value_err.yaml b/server/tests-py/queries/graphql_validation/null_variable_value_err.yaml index 50142dbb280..1f6b04872fc 100644 --- a/server/tests-py/queries/graphql_validation/null_variable_value_err.yaml +++ b/server/tests-py/queries/graphql_validation/null_variable_value_err.yaml @@ -4,9 +4,10 @@ status: 200 response: errors: - extensions: - path: "$.variableValues.author_id" - code: "validation-failed" - message: "null value found for non-nullable type: Int!" 
+ path: $.selectionSet.update_author.args.where.id._eq + code: validation-failed + message: expected a 32-bit integer for type "Int", but found null + query: variables: author_id: null diff --git a/server/tests-py/queries/graphql_validation/setup.yaml b/server/tests-py/queries/graphql_validation/setup.yaml index dd084a96828..a3bd4709c65 100644 --- a/server/tests-py/queries/graphql_validation/setup.yaml +++ b/server/tests-py/queries/graphql_validation/setup.yaml @@ -1,15 +1,54 @@ type: bulk args: -#Author table +- type: run_sql + args: + sql: | + CREATE EXTENSION IF NOT EXISTS postgis; + + +# Author table - type: run_sql args: sql: | create table author( - id serial primary key, - name text unique + id SERIAL PRIMARY KEY, + name TEXT UNIQUE, + location GEOGRAPHY(Point) ); - type: track_table args: schema: public name: author + + +# Article table +- type: run_sql + args: + sql: | + create table article( + id SERIAL PRIMARY KEY, + body JSONB + ); +- type: track_table + args: + schema: public + name: article + + +# Some other table +- type: run_sql + args: + sql: | + create table misgivings( + i INTEGER, + f REAL + ); +- type: run_sql + args: + sql: | + insert into misgivings values (43, 102); +- type: track_table + args: + schema: public + name: misgivings diff --git a/server/tests-py/queries/graphql_validation/teardown.yaml b/server/tests-py/queries/graphql_validation/teardown.yaml index b112569f5e2..80ba3164f95 100644 --- a/server/tests-py/queries/graphql_validation/teardown.yaml +++ b/server/tests-py/queries/graphql_validation/teardown.yaml @@ -4,3 +4,11 @@ args: args: sql: | drop table author +- type: run_sql + args: + sql: | + drop table article +- type: run_sql + args: + sql: | + drop table misgivings diff --git a/server/tests-py/queries/graphql_validation/variable_type_mismatch.yaml b/server/tests-py/queries/graphql_validation/variable_type_mismatch.yaml new file mode 100644 index 00000000000..6d7b877b5b4 --- /dev/null +++ b/server/tests-py/queries/graphql_validation/variable_type_mismatch.yaml @@ -0,0 +1,223 @@ +- description: Variable type mismatch in column parser + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.update_author.args._set.name" + code: "validation-failed" + message: variable "name" is declared as Int!, but used where String is expected + query: + query: | + mutation update_author($name: Int!) 
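+ # $name is declared Int! but the name column is text (String), so this + # request must fail GraphQL validation; variable values are not coerced + # across scalar types.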
{ + update_author(where: {id: {_eq: 0}}, _set: {name: $name}) { + returning { + id + name + } + } + } + variables: + name: "foo" + +- description: Variable type mismatch in scalar parser + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.author.args.limit" + code: "validation-failed" + message: variable "limit" is declared as String, but used where Int is expected + query: + query: | + query get_author($limit: String) { + author(limit: $limit) { + id + name + } + } + variables: + limit: 42 + +- description: Input type coercion is not variable coercion + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.insert_misgivings_one.args.object.f" + code: "validation-failed" + message: variable "i" is declared as Int, but used where Float is expected + query: + query: | + mutation have_misgivings($i: Int) { + insert_misgivings_one(object: {f: $i}) { + i + } + } + variables: + i: 42 + +- description: Variable type mismatch with custom scalar + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.insert_author.args.objects[0].location" + code: "validation-failed" + message: variable "location" is declared as geometry, but used where geography is expected + query: + query: | + mutation insert_author($location: geometry) { + insert_author(objects: {name: "bar" location: $location}) { + affected_rows + } + } + variables: + location: + - 42 + - 101 + +- description: "Variable type mismatch: nullable variable at non-nullable location" + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.insert_author_one.args.object" + code: "validation-failed" + message: variable "author" is declared as author_insert_input, but used where author_insert_input! is expected + query: + query: | + mutation insert_author($author: author_insert_input) { + insert_author_one(object: $author) { + id + } + } + variables: + author: + name: "baz" + location: null + +- description: "Variable type match: nullable variable with non-null default at non-nullable location" + url: /v1/graphql + status: 200 + response: + data: + insert_author_one: + id: 1 + query: + query: | + mutation insert_author($author: author_insert_input = {name: "default"}) { + insert_author_one(object: $author) { + id + } + } + variables: + author: + name: "baz" + location: null + +- description: "Variable type mismatch: nullable variable with null default at non-nullable location" + url: /v1/graphql + status: 200 + response: + errors: + - extensions: + path: "$.selectionSet.insert_author_one.args.object" + code: "validation-failed" + message: variable "author" is declared as author_insert_input, but used where author_insert_input! 
is expected + query: + query: | + mutation insert_author($author: author_insert_input = null) { + insert_author_one(object: $author) { + id + } + } + variables: + author: + name: "baz" + location: null + +- description: "Variable type match: nullable variable at location with default" + url: /v1/graphql + status: 200 + response: + data: + __type: + fields: + - name: id + - name: location + - name: name + + query: + query: | + query author_type($includeDeprecated: Boolean) { + __type(name: "author") { + fields(includeDeprecated: $includeDeprecated) { + name + } + } + } + variables: + includeDeprecated: False + + +- description: Variable type match nullability + url: /v1/graphql + status: 200 + response: + data: + insert_author_one: + id: 2 + query: + query: | + mutation insert_author($name: String!) { + insert_author_one(object: {name: $name}) { + id + } + } + variables: + name: "ct" + +- description: Variable type match optional + url: /v1/graphql + status: 200 + response: + data: + insert_author_one: + id: 3 + query: + query: | + mutation insert_author($name: String) { + insert_author_one(object: {name: $name}) { + id + } + } + variables: + name: "asdfdsfllhjh" + +- description: Variable type match default + url: /v1/graphql + status: 200 + response: + data: + __type: + fields: + - name: id + - name: location + - name: name + + query: + query: | + query author_type($includeDeprecated: Boolean) { + __type(name: "author") { + fields(includeDeprecated: $includeDeprecated) { + name + } + } + } + variables: + includeDeprecated: False diff --git a/server/tests-py/queries/remote_schemas/add_remote_schema_err_missing_arg.yaml b/server/tests-py/queries/remote_schemas/add_remote_schema_err_missing_arg.yaml index f2d453a45a0..7d05e8b03ac 100644 --- a/server/tests-py/queries/remote_schemas/add_remote_schema_err_missing_arg.yaml +++ b/server/tests-py/queries/remote_schemas/add_remote_schema_err_missing_arg.yaml @@ -4,7 +4,7 @@ status: 400 response: path: $.args error: |- - Interface field argument 'Character'."id"("ifaceArg":) required, but Object field 'Droid'."id" does not provide it + Interface field argument 'Character'."id"("ifaceArg":) required, but Object field 'Human'."id" does not provide it code: remote-schema-error query: type: add_remote_schema diff --git a/server/tests-py/queries/remote_schemas/add_remote_schema_with_union_err_wrapped_type.yaml b/server/tests-py/queries/remote_schemas/add_remote_schema_with_union_err_wrapped_type.yaml index 2a698512d83..ffdbd9f1bec 100644 --- a/server/tests-py/queries/remote_schemas/add_remote_schema_with_union_err_wrapped_type.yaml +++ b/server/tests-py/queries/remote_schemas/add_remote_schema_with_union_err_wrapped_type.yaml @@ -3,7 +3,7 @@ url: /v1/query status: 400 response: path: $.args - error: 'Error in $.types[1].possibleTypes[0].name: parsing Text failed, expected String, but encountered Null' + error: 'Error in $.types[1].possibleTypes[0].name: parsing Name failed, expected String, but encountered Null' code: remote-schema-error query: type: add_remote_schema diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/mutation_output_with_remote_join_fields.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/mutation_output_with_remote_join_fields.yaml new file mode 100644 index 00000000000..470e59f18ba --- /dev/null +++ b/server/tests-py/queries/remote_schemas/remote_relationships/mutation_output_with_remote_join_fields.yaml @@ -0,0 +1,60 @@ +- description: Creating a mutation with remote-join fields in the result + 
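+ # messageBasic below is the remote relationship defined in + # setup_remote_rel_basic_with_authors.yaml; its fields are resolved against + # the remote schema, not Postgres.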
+  url: /v1/graphql
+  status: 200
+  response:
+    data:
+      insert_authors_one:
+        name: alice
+        messageBasic:
+          id: 1
+          name: alice
+          msg: You win!
+  query:
+    query: |
+      mutation {
+        insert_authors_one (
+          object: {
+            name: "alice"
+          }
+        ) {
+          name
+          messageBasic {
+            id
+            name
+            msg
+          }
+        }
+      }
+
+- description: Creating a mutation with remote-join fields in the result (bulk insert)
+  url: /v1/graphql
+  status: 200
+  response:
+    data:
+      insert_authors:
+        affected_rows: 1
+        returning:
+        - name: bob
+          messageBasic:
+            id: 2
+            name: bob
+            msg: You lose!
+  query:
+    query: |
+      mutation {
+        insert_authors (
+          objects: [{
+            name: "bob"
+          }]
+        ) {
+          affected_rows
+          returning {
+            name
+            messageBasic {
+              id
+              name
+              msg
+            }
+          }
+        }
+      }
diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_arr.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_arr.yaml
index 907be03f60d..38b8d14317b 100644
--- a/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_arr.yaml
+++ b/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_arr.yaml
@@ -32,7 +32,7 @@ response:
       message: intentional-error
       locations:
       - line: 1
-        column: 74
+        column: 71
     - path:
       - messagesNestedArgs__1
       - 0
@@ -40,7 +40,7 @@ response:
       message: intentional-error
       locations:
       - line: 1
-        column: 160
+        column: 144
     - path:
       - messagesNestedArgs__2
       - 0
@@ -48,7 +48,7 @@ response:
       message: intentional-error
       locations:
       - line: 1
-        column: 246
+        column: 217
   path: $
   code: unexpected
   message: Errors from remote server
diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_obj.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_obj.yaml
index 77b77dac82b..3074e01b3d4 100644
--- a/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_obj.yaml
+++ b/server/tests-py/queries/remote_schemas/remote_relationships/query_with_errors_obj.yaml
@@ -28,21 +28,21 @@ response:
       message: intentional-error
       locations:
       - line: 1
-        column: 54
+        column: 49
     - path:
       - messageBasic__1
      - errorMsg
       message: intentional-error
       locations:
       - line: 1
-        column: 120
+        column: 100
     - path:
       - messageBasic__2
       - errorMsg
       message: intentional-error
       locations:
       - line: 1
-        column: 186
+        column: 151
   path: $
   code: unexpected
   message: Errors from remote server
diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/setup.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/setup.yaml
index f75815f0e3c..c5241b78282 100644
--- a/server/tests-py/queries/remote_schemas/remote_relationships/setup.yaml
+++ b/server/tests-py/queries/remote_schemas/remote_relationships/setup.yaml
@@ -40,3 +40,15 @@ args:
     url: http://localhost:4000
     forward_client_headers: false
 
+- type: run_sql
+  args:
+    sql: |
+      create table authors (
+        id serial primary key,
+        name text
+      );
+
+- type: track_table
+  args:
+    schema: public
+    name: authors
diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/setup_remote_rel_basic_with_authors.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/setup_remote_rel_basic_with_authors.yaml
new file mode 100644
index 00000000000..438aab63264
--- /dev/null
+++ b/server/tests-py/queries/remote_schemas/remote_relationships/setup_remote_rel_basic_with_authors.yaml
@@ -0,0 +1,12 @@
+type: create_remote_relationship
+args:
+  name: messageBasic
+  table: authors
+  hasura_fields:
+  - id
+  remote_schema: my-remote-schema
+  remote_field:
+    message:
+      arguments:
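+        # "$id" is substituted, per row, with the value of the id column listed in hasura_fields above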
id: "$id" diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/subscription_with_remote_join_fields.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/subscription_with_remote_join_fields.yaml new file mode 100644 index 00000000000..fc7c79e0824 --- /dev/null +++ b/server/tests-py/queries/remote_schemas/remote_relationships/subscription_with_remote_join_fields.yaml @@ -0,0 +1,21 @@ +description: Creating a subscription with remote-join fields should throw error +url: /v1/graphql +status: 200 +response: + errors: + - extensions: + path: $ + code: not-supported + message: Remote relationships are not allowed in subscriptions +query: + query: | + subscription { + profiles { + name + messageBasic { + id + name + msg + } + } + } diff --git a/server/tests-py/queries/remote_schemas/remote_relationships/teardown.yaml b/server/tests-py/queries/remote_schemas/remote_relationships/teardown.yaml index 5f8b73db588..29e6b346750 100644 --- a/server/tests-py/queries/remote_schemas/remote_relationships/teardown.yaml +++ b/server/tests-py/queries/remote_schemas/remote_relationships/teardown.yaml @@ -10,6 +10,11 @@ args: sql: | drop table if exists user_profiles +- type: run_sql + args: + sql: | + drop table if exists authors + # also drops remote relationship as direct dep - type: remove_remote_schema args: diff --git a/server/tests-py/remote_schemas/nodejs/.gitignore b/server/tests-py/remote_schemas/nodejs/.gitignore index 3b776d7f9e6..4ceeeb4955b 100644 --- a/server/tests-py/remote_schemas/nodejs/.gitignore +++ b/server/tests-py/remote_schemas/nodejs/.gitignore @@ -1,3 +1,2 @@ node_modules/ *.zip -package-lock.json diff --git a/server/tests-py/remote_schemas/nodejs/package-lock.json b/server/tests-py/remote_schemas/nodejs/package-lock.json new file mode 100644 index 00000000000..086bd003e70 --- /dev/null +++ b/server/tests-py/remote_schemas/nodejs/package-lock.json @@ -0,0 +1,1437 @@ +{ + "name": "aws-lambda-nodejs", + "version": "1.0.0", + "lockfileVersion": 1, + "requires": true, + "dependencies": { + "@apollo/protobufjs": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/@apollo/protobufjs/-/protobufjs-1.0.4.tgz", + "integrity": "sha512-EE3zx+/D/wur/JiLp6VCiw1iYdyy1lCJMf8CGPkLeDt5QJrN4N8tKFx33Ah4V30AUQzMk7Uz4IXKZ1LOj124gA==", + "requires": { + "@protobufjs/aspromise": "^1.1.2", + "@protobufjs/base64": "^1.1.2", + "@protobufjs/codegen": "^2.0.4", + "@protobufjs/eventemitter": "^1.1.0", + "@protobufjs/fetch": "^1.1.0", + "@protobufjs/float": "^1.0.2", + "@protobufjs/inquire": "^1.1.0", + "@protobufjs/path": "^1.1.2", + "@protobufjs/pool": "^1.1.0", + "@protobufjs/utf8": "^1.1.0", + "@types/long": "^4.0.0", + "@types/node": "^10.1.0", + "long": "^4.0.0" + }, + "dependencies": { + "@types/node": { + "version": "10.17.28", + "resolved": "https://registry.npmjs.org/@types/node/-/node-10.17.28.tgz", + "integrity": "sha512-dzjES1Egb4c1a89C7lKwQh8pwjYmlOAG9dW1pBgxEk57tMrLnssOfEthz8kdkNaBd7lIqQx7APm5+mZ619IiCQ==" + } + } + }, + "@apollographql/apollo-tools": { + "version": "0.4.8", + "resolved": "https://registry.npmjs.org/@apollographql/apollo-tools/-/apollo-tools-0.4.8.tgz", + "integrity": "sha512-W2+HB8Y7ifowcf3YyPHgDI05izyRtOeZ4MqIr7LbTArtmJ0ZHULWpn84SGMW7NAvTV1tFExpHlveHhnXuJfuGA==", + "requires": { + "apollo-env": "^0.6.5" + } + }, + "@apollographql/graphql-playground-html": { + "version": "1.6.26", + "resolved": "https://registry.npmjs.org/@apollographql/graphql-playground-html/-/graphql-playground-html-1.6.26.tgz", + "integrity": 
"sha512-XAwXOIab51QyhBxnxySdK3nuMEUohhDsHQ5Rbco/V1vjlP75zZ0ZLHD9dTpXTN8uxKxopb2lUvJTq+M4g2Q0HQ==", + "requires": { + "xss": "^1.0.6" + } + }, + "@protobufjs/aspromise": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/@protobufjs/aspromise/-/aspromise-1.1.2.tgz", + "integrity": "sha1-m4sMxmPWaafY9vXQiToU00jzD78=" + }, + "@protobufjs/base64": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/@protobufjs/base64/-/base64-1.1.2.tgz", + "integrity": "sha512-AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==" + }, + "@protobufjs/codegen": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/@protobufjs/codegen/-/codegen-2.0.4.tgz", + "integrity": "sha512-YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==" + }, + "@protobufjs/eventemitter": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@protobufjs/eventemitter/-/eventemitter-1.1.0.tgz", + "integrity": "sha1-NVy8mLr61ZePntCV85diHx0Ga3A=" + }, + "@protobufjs/fetch": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@protobufjs/fetch/-/fetch-1.1.0.tgz", + "integrity": "sha1-upn7WYYUr2VwDBYZ/wbUVLDYTEU=", + "requires": { + "@protobufjs/aspromise": "^1.1.1", + "@protobufjs/inquire": "^1.1.0" + } + }, + "@protobufjs/float": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/@protobufjs/float/-/float-1.0.2.tgz", + "integrity": "sha1-Xp4avctz/Ap8uLKR33jIy9l7h9E=" + }, + "@protobufjs/inquire": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@protobufjs/inquire/-/inquire-1.1.0.tgz", + "integrity": "sha1-/yAOPnzyQp4tyvwRQIKOjMY48Ik=" + }, + "@protobufjs/path": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/@protobufjs/path/-/path-1.1.2.tgz", + "integrity": "sha1-bMKyDFya1q0NzP0hynZz2Nf79o0=" + }, + "@protobufjs/pool": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@protobufjs/pool/-/pool-1.1.0.tgz", + "integrity": "sha1-Cf0V8tbTq/qbZbw2ZQbWrXhG/1Q=" + }, + "@protobufjs/utf8": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz", + "integrity": "sha1-p3c2C1s5oaLlEG+OhY8v0tBgxXA=" + }, + "@types/accepts": { + "version": "1.3.5", + "resolved": "https://registry.npmjs.org/@types/accepts/-/accepts-1.3.5.tgz", + "integrity": "sha512-jOdnI/3qTpHABjM5cx1Hc0sKsPoYCp+DP/GJRGtDlPd7fiV9oXGGIcjW/ZOxLIvjGz8MA+uMZI9metHlgqbgwQ==", + "requires": { + "@types/node": "*" + } + }, + "@types/body-parser": { + "version": "1.19.0", + "resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.0.tgz", + "integrity": "sha512-W98JrE0j2K78swW4ukqMleo8R7h/pFETjM2DQ90MF6XK2i4LO4W3gQ71Lt4w3bfm2EvVSyWHplECvB5sK22yFQ==", + "requires": { + "@types/connect": "*", + "@types/node": "*" + } + }, + "@types/connect": { + "version": "3.4.33", + "resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.33.tgz", + "integrity": "sha512-2+FrkXY4zllzTNfJth7jOqEHC+enpLeGslEhpnTAkg21GkRrWV4SsAtqchtT4YS9/nODBU2/ZfsBY2X4J/dX7A==", + "requires": { + "@types/node": "*" + } + }, + "@types/content-disposition": { + "version": "0.5.3", + "resolved": "https://registry.npmjs.org/@types/content-disposition/-/content-disposition-0.5.3.tgz", + "integrity": "sha512-P1bffQfhD3O4LW0ioENXUhZ9OIa0Zn+P7M+pWgkCKaT53wVLSq0mrKksCID/FGHpFhRSxRGhgrQmfhRuzwtKdg==" + }, + "@types/cookies": { + "version": "0.7.4", + "resolved": "https://registry.npmjs.org/@types/cookies/-/cookies-0.7.4.tgz", + "integrity": 
"sha512-oTGtMzZZAVuEjTwCjIh8T8FrC8n/uwy+PG0yTvQcdZ7etoel7C7/3MSd7qrukENTgQtotG7gvBlBojuVs7X5rw==", + "requires": { + "@types/connect": "*", + "@types/express": "*", + "@types/keygrip": "*", + "@types/node": "*" + } + }, + "@types/cors": { + "version": "2.8.7", + "resolved": "https://registry.npmjs.org/@types/cors/-/cors-2.8.7.tgz", + "integrity": "sha512-sOdDRU3oRS7LBNTIqwDkPJyq0lpHYcbMTt0TrjzsXbk/e37hcLTH6eZX7CdbDeN0yJJvzw9hFBZkbtCSbk/jAQ==", + "requires": { + "@types/express": "*" + } + }, + "@types/express": { + "version": "4.17.7", + "resolved": "https://registry.npmjs.org/@types/express/-/express-4.17.7.tgz", + "integrity": "sha512-dCOT5lcmV/uC2J9k0rPafATeeyz+99xTt54ReX11/LObZgfzJqZNcW27zGhYyX+9iSEGXGt5qLPwRSvBZcLvtQ==", + "requires": { + "@types/body-parser": "*", + "@types/express-serve-static-core": "*", + "@types/qs": "*", + "@types/serve-static": "*" + } + }, + "@types/express-serve-static-core": { + "version": "4.17.9", + "resolved": "https://registry.npmjs.org/@types/express-serve-static-core/-/express-serve-static-core-4.17.9.tgz", + "integrity": "sha512-DG0BYg6yO+ePW+XoDENYz8zhNGC3jDDEpComMYn7WJc4mY1Us8Rw9ax2YhJXxpyk2SF47PQAoQ0YyVT1a0bEkA==", + "requires": { + "@types/node": "*", + "@types/qs": "*", + "@types/range-parser": "*" + } + }, + "@types/fs-capacitor": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/@types/fs-capacitor/-/fs-capacitor-2.0.0.tgz", + "integrity": "sha512-FKVPOCFbhCvZxpVAMhdBdTfVfXUpsh15wFHgqOKxh9N9vzWZVuWCSijZ5T4U34XYNnuj2oduh6xcs1i+LPI+BQ==", + "requires": { + "@types/node": "*" + } + }, + "@types/graphql-upload": { + "version": "8.0.3", + "resolved": "https://registry.npmjs.org/@types/graphql-upload/-/graphql-upload-8.0.3.tgz", + "integrity": "sha512-hmLg9pCU/GmxBscg8GCr1vmSoEmbItNNxdD5YH2TJkXm//8atjwuprB+xJBK714JG1dkxbbhp5RHX+Pz1KsCMA==", + "requires": { + "@types/express": "*", + "@types/fs-capacitor": "*", + "@types/koa": "*", + "graphql": "^14.5.3" + }, + "dependencies": { + "graphql": { + "version": "14.7.0", + "resolved": "https://registry.npmjs.org/graphql/-/graphql-14.7.0.tgz", + "integrity": "sha512-l0xWZpoPKpppFzMfvVyFmp9vLN7w/ZZJPefUicMCepfJeQ8sMcztloGYY9DfjVPo6tIUDzU5Hw3MUbIjj9AVVA==", + "requires": { + "iterall": "^1.2.2" + } + } + } + }, + "@types/http-assert": { + "version": "1.5.1", + "resolved": "https://registry.npmjs.org/@types/http-assert/-/http-assert-1.5.1.tgz", + "integrity": "sha512-PGAK759pxyfXE78NbKxyfRcWYA/KwW17X290cNev/qAsn9eQIxkH4shoNBafH37wewhDG/0p1cHPbK6+SzZjWQ==" + }, + "@types/keygrip": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/@types/keygrip/-/keygrip-1.0.2.tgz", + "integrity": "sha512-GJhpTepz2udxGexqos8wgaBx4I/zWIDPh/KOGEwAqtuGDkOUJu5eFvwmdBX4AmB8Odsr+9pHCQqiAqDL/yKMKw==" + }, + "@types/koa": { + "version": "2.11.3", + "resolved": "https://registry.npmjs.org/@types/koa/-/koa-2.11.3.tgz", + "integrity": "sha512-ABxVkrNWa4O/Jp24EYI/hRNqEVRlhB9g09p48neQp4m3xL1TJtdWk2NyNQSMCU45ejeELMQZBYyfstyVvO2H3Q==", + "requires": { + "@types/accepts": "*", + "@types/content-disposition": "*", + "@types/cookies": "*", + "@types/http-assert": "*", + "@types/keygrip": "*", + "@types/koa-compose": "*", + "@types/node": "*" + } + }, + "@types/koa-compose": { + "version": "3.2.5", + "resolved": "https://registry.npmjs.org/@types/koa-compose/-/koa-compose-3.2.5.tgz", + "integrity": "sha512-B8nG/OoE1ORZqCkBVsup/AKcvjdgoHnfi4pZMn5UwAPCbhk/96xyv284eBYW8JlQbQ7zDmnpFr68I/40mFoIBQ==", + "requires": { + "@types/koa": "*" + } + }, + "@types/long": { + "version": "4.0.1", + "resolved": 
"https://registry.npmjs.org/@types/long/-/long-4.0.1.tgz", + "integrity": "sha512-5tXH6Bx/kNGd3MgffdmP4dy2Z+G4eaXw0SE81Tq3BNadtnMR5/ySMzX4SLEzHJzSmPNn4HIdpQsBvXMUykr58w==" + }, + "@types/mime": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/@types/mime/-/mime-2.0.3.tgz", + "integrity": "sha512-Jus9s4CDbqwocc5pOAnh8ShfrnMcPHuJYzVcSUU7lrh8Ni5HuIqX3oilL86p3dlTrk0LzHRCgA/GQ7uNCw6l2Q==" + }, + "@types/node": { + "version": "14.0.27", + "resolved": "https://registry.npmjs.org/@types/node/-/node-14.0.27.tgz", + "integrity": "sha512-kVrqXhbclHNHGu9ztnAwSncIgJv/FaxmzXJvGXNdcCpV1b8u1/Mi6z6m0vwy0LzKeXFTPLH0NzwmoJ3fNCIq0g==" + }, + "@types/node-fetch": { + "version": "2.5.7", + "resolved": "https://registry.npmjs.org/@types/node-fetch/-/node-fetch-2.5.7.tgz", + "integrity": "sha512-o2WVNf5UhWRkxlf6eq+jMZDu7kjgpgJfl4xVNlvryc95O/6F2ld8ztKX+qu+Rjyet93WAWm5LjeX9H5FGkODvw==", + "requires": { + "@types/node": "*", + "form-data": "^3.0.0" + } + }, + "@types/qs": { + "version": "6.9.4", + "resolved": "https://registry.npmjs.org/@types/qs/-/qs-6.9.4.tgz", + "integrity": "sha512-+wYo+L6ZF6BMoEjtf8zB2esQsqdV6WsjRK/GP9WOgLPrq87PbNWgIxS76dS5uvl/QXtHGakZmwTznIfcPXcKlQ==" + }, + "@types/range-parser": { + "version": "1.2.3", + "resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.3.tgz", + "integrity": "sha512-ewFXqrQHlFsgc09MK5jP5iR7vumV/BYayNC6PgJO2LPe8vrnNFyjQjSppfEngITi0qvfKtzFvgKymGheFM9UOA==" + }, + "@types/serve-static": { + "version": "1.13.5", + "resolved": "https://registry.npmjs.org/@types/serve-static/-/serve-static-1.13.5.tgz", + "integrity": "sha512-6M64P58N+OXjU432WoLLBQxbA0LRGBCRm7aAGQJ+SMC1IMl0dgRVi9EFfoDcS2a7Xogygk/eGN94CfwU9UF7UQ==", + "requires": { + "@types/express-serve-static-core": "*", + "@types/mime": "*" + } + }, + "@types/ws": { + "version": "7.2.6", + "resolved": "https://registry.npmjs.org/@types/ws/-/ws-7.2.6.tgz", + "integrity": "sha512-Q07IrQUSNpr+cXU4E4LtkSIBPie5GLZyyMC1QtQYRLWz701+XcoVygGUZgvLqElq1nU4ICldMYPnexlBsg3dqQ==", + "requires": { + "@types/node": "*" + } + }, + "@wry/equality": { + "version": "0.1.11", + "resolved": "https://registry.npmjs.org/@wry/equality/-/equality-0.1.11.tgz", + "integrity": "sha512-mwEVBDUVODlsQQ5dfuLUS5/Tf7jqUKyhKYHmVi4fPB6bDMOfWvUPJmKgS1Z7Za/sOI3vzWt4+O7yCiL/70MogA==", + "requires": { + "tslib": "^1.9.3" + } + }, + "accepts": { + "version": "1.3.7", + "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.7.tgz", + "integrity": "sha512-Il80Qs2WjYlJIBNzNkK6KYqlVMTbZLXgHx2oT0pU/fjRHyEp+PEfEPY0R3WCwAGVOtauxh1hOxNgIf5bv7dQpA==", + "requires": { + "mime-types": "~2.1.24", + "negotiator": "0.6.2" + } + }, + "apollo-cache-control": { + "version": "0.11.1", + "resolved": "https://registry.npmjs.org/apollo-cache-control/-/apollo-cache-control-0.11.1.tgz", + "integrity": "sha512-6iHa8TkcKt4rx5SKRzDNjUIpCQX+7/FlZwD7vRh9JDnM4VH8SWhpj8fUR3CiEY8Kuc4ChXnOY8bCcMju5KPnIQ==", + "requires": { + "apollo-server-env": "^2.4.5", + "apollo-server-plugin-base": "^0.9.1" + } + }, + "apollo-datasource": { + "version": "0.7.2", + "resolved": "https://registry.npmjs.org/apollo-datasource/-/apollo-datasource-0.7.2.tgz", + "integrity": "sha512-ibnW+s4BMp4K2AgzLEtvzkjg7dJgCaw9M5b5N0YKNmeRZRnl/I/qBTQae648FsRKgMwTbRQIvBhQ0URUFAqFOw==", + "requires": { + "apollo-server-caching": "^0.5.2", + "apollo-server-env": "^2.4.5" + } + }, + "apollo-engine-reporting": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/apollo-engine-reporting/-/apollo-engine-reporting-2.3.0.tgz", + "integrity": 
"sha512-SbcPLFuUZcRqDEZ6mSs8uHM9Ftr8yyt2IEu0JA8c3LNBmYXSLM7MHqFe80SVcosYSTBgtMz8mLJO8orhYoSYZw==", + "requires": { + "apollo-engine-reporting-protobuf": "^0.5.2", + "apollo-graphql": "^0.5.0", + "apollo-server-caching": "^0.5.2", + "apollo-server-env": "^2.4.5", + "apollo-server-errors": "^2.4.2", + "apollo-server-plugin-base": "^0.9.1", + "apollo-server-types": "^0.5.1", + "async-retry": "^1.2.1", + "uuid": "^8.0.0" + } + }, + "apollo-engine-reporting-protobuf": { + "version": "0.5.2", + "resolved": "https://registry.npmjs.org/apollo-engine-reporting-protobuf/-/apollo-engine-reporting-protobuf-0.5.2.tgz", + "integrity": "sha512-4wm9FR3B7UvJxcK/69rOiS5CAJPEYKufeRWb257ZLfX7NGFTMqvbc1hu4q8Ch7swB26rTpkzfsftLED9DqH9qg==", + "requires": { + "@apollo/protobufjs": "^1.0.3" + } + }, + "apollo-env": { + "version": "0.6.5", + "resolved": "https://registry.npmjs.org/apollo-env/-/apollo-env-0.6.5.tgz", + "integrity": "sha512-jeBUVsGymeTHYWp3me0R2CZRZrFeuSZeICZHCeRflHTfnQtlmbSXdy5E0pOyRM9CU4JfQkKDC98S1YglQj7Bzg==", + "requires": { + "@types/node-fetch": "2.5.7", + "core-js": "^3.0.1", + "node-fetch": "^2.2.0", + "sha.js": "^2.4.11" + } + }, + "apollo-graphql": { + "version": "0.5.0", + "resolved": "https://registry.npmjs.org/apollo-graphql/-/apollo-graphql-0.5.0.tgz", + "integrity": "sha512-YSdF/BKPbsnQpxWpmCE53pBJX44aaoif31Y22I/qKpB6ZSGzYijV5YBoCL5Q15H2oA/v/02Oazh9lbp4ek3eig==", + "requires": { + "apollo-env": "^0.6.5", + "lodash.sortby": "^4.7.0" + } + }, + "apollo-link": { + "version": "1.2.14", + "resolved": "https://registry.npmjs.org/apollo-link/-/apollo-link-1.2.14.tgz", + "integrity": "sha512-p67CMEFP7kOG1JZ0ZkYZwRDa369w5PIjtMjvrQd/HnIV8FRsHRqLqK+oAZQnFa1DDdZtOtHTi+aMIW6EatC2jg==", + "requires": { + "apollo-utilities": "^1.3.0", + "ts-invariant": "^0.4.0", + "tslib": "^1.9.3", + "zen-observable-ts": "^0.8.21" + } + }, + "apollo-server": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/apollo-server/-/apollo-server-2.1.0.tgz", + "integrity": "sha512-Uo5RFHGtUPq3OvycLXCll5QgXf2wNVBFYUhapByADBP4E1KRgbyl9Fbf82OgcbbLYwEZTlQMbyBpd6hX8XJKAw==", + "requires": { + "apollo-server-core": "^2.1.0", + "apollo-server-express": "^2.1.0", + "express": "^4.0.0", + "graphql-subscriptions": "^0.5.8", + "graphql-tools": "^3.0.4" + } + }, + "apollo-server-caching": { + "version": "0.5.2", + "resolved": "https://registry.npmjs.org/apollo-server-caching/-/apollo-server-caching-0.5.2.tgz", + "integrity": "sha512-HUcP3TlgRsuGgeTOn8QMbkdx0hLPXyEJehZIPrcof0ATz7j7aTPA4at7gaiFHCo8gk07DaWYGB3PFgjboXRcWQ==", + "requires": { + "lru-cache": "^5.0.0" + } + }, + "apollo-server-core": { + "version": "2.16.1", + "resolved": "https://registry.npmjs.org/apollo-server-core/-/apollo-server-core-2.16.1.tgz", + "integrity": "sha512-nuwn5ZBbmzPwDetb3FgiFFJlNK7ZBFg8kis/raymrjd3eBGdNcOyMTJDl6J9673X9Xqp+dXQmFYDW/G3G8S1YA==", + "requires": { + "@apollographql/apollo-tools": "^0.4.3", + "@apollographql/graphql-playground-html": "1.6.26", + "@types/graphql-upload": "^8.0.0", + "@types/ws": "^7.0.0", + "apollo-cache-control": "^0.11.1", + "apollo-datasource": "^0.7.2", + "apollo-engine-reporting": "^2.3.0", + "apollo-server-caching": "^0.5.2", + "apollo-server-env": "^2.4.5", + "apollo-server-errors": "^2.4.2", + "apollo-server-plugin-base": "^0.9.1", + "apollo-server-types": "^0.5.1", + "apollo-tracing": "^0.11.1", + "fast-json-stable-stringify": "^2.0.0", + "graphql-extensions": "^0.12.4", + "graphql-tag": "^2.9.2", + "graphql-tools": "^4.0.0", + "graphql-upload": "^8.0.2", + "loglevel": "^1.6.7", + "sha.js": "^2.4.11", + 
"subscriptions-transport-ws": "^0.9.11", + "ws": "^6.0.0" + }, + "dependencies": { + "graphql-tools": { + "version": "4.0.8", + "resolved": "https://registry.npmjs.org/graphql-tools/-/graphql-tools-4.0.8.tgz", + "integrity": "sha512-MW+ioleBrwhRjalKjYaLQbr+920pHBgy9vM/n47sswtns8+96sRn5M/G+J1eu7IMeKWiN/9p6tmwCHU7552VJg==", + "requires": { + "apollo-link": "^1.2.14", + "apollo-utilities": "^1.0.1", + "deprecated-decorator": "^0.1.6", + "iterall": "^1.1.3", + "uuid": "^3.1.0" + } + }, + "uuid": { + "version": "3.4.0", + "resolved": "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz", + "integrity": "sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==" + } + } + }, + "apollo-server-env": { + "version": "2.4.5", + "resolved": "https://registry.npmjs.org/apollo-server-env/-/apollo-server-env-2.4.5.tgz", + "integrity": "sha512-nfNhmGPzbq3xCEWT8eRpoHXIPNcNy3QcEoBlzVMjeglrBGryLG2LXwBSPnVmTRRrzUYugX0ULBtgE3rBFNoUgA==", + "requires": { + "node-fetch": "^2.1.2", + "util.promisify": "^1.0.0" + } + }, + "apollo-server-errors": { + "version": "2.4.2", + "resolved": "https://registry.npmjs.org/apollo-server-errors/-/apollo-server-errors-2.4.2.tgz", + "integrity": "sha512-FeGxW3Batn6sUtX3OVVUm7o56EgjxDlmgpTLNyWcLb0j6P8mw9oLNyAm3B+deHA4KNdNHO5BmHS2g1SJYjqPCQ==" + }, + "apollo-server-express": { + "version": "2.16.1", + "resolved": "https://registry.npmjs.org/apollo-server-express/-/apollo-server-express-2.16.1.tgz", + "integrity": "sha512-Oq5YNcaMYnRk6jDmA9LWf8oSd2KHDVe7jQ4wtooAvG9FVUD+FaFBgSkytXHMvtifQh2wdF07Ri8uDLMz6IQjTw==", + "requires": { + "@apollographql/graphql-playground-html": "1.6.26", + "@types/accepts": "^1.3.5", + "@types/body-parser": "1.19.0", + "@types/cors": "^2.8.4", + "@types/express": "4.17.7", + "accepts": "^1.3.5", + "apollo-server-core": "^2.16.1", + "apollo-server-types": "^0.5.1", + "body-parser": "^1.18.3", + "cors": "^2.8.4", + "express": "^4.17.1", + "graphql-subscriptions": "^1.0.0", + "graphql-tools": "^4.0.0", + "parseurl": "^1.3.2", + "subscriptions-transport-ws": "^0.9.16", + "type-is": "^1.6.16" + }, + "dependencies": { + "graphql-subscriptions": { + "version": "1.1.0", + "resolved": "https://registry.npmjs.org/graphql-subscriptions/-/graphql-subscriptions-1.1.0.tgz", + "integrity": "sha512-6WzlBFC0lWmXJbIVE8OgFgXIP4RJi3OQgTPa0DVMsDXdpRDjTsM1K9wfl5HSYX7R87QAGlvcv2Y4BIZa/ItonA==", + "requires": { + "iterall": "^1.2.1" + } + }, + "graphql-tools": { + "version": "4.0.8", + "resolved": "https://registry.npmjs.org/graphql-tools/-/graphql-tools-4.0.8.tgz", + "integrity": "sha512-MW+ioleBrwhRjalKjYaLQbr+920pHBgy9vM/n47sswtns8+96sRn5M/G+J1eu7IMeKWiN/9p6tmwCHU7552VJg==", + "requires": { + "apollo-link": "^1.2.14", + "apollo-utilities": "^1.0.1", + "deprecated-decorator": "^0.1.6", + "iterall": "^1.1.3", + "uuid": "^3.1.0" + } + }, + "uuid": { + "version": "3.4.0", + "resolved": "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz", + "integrity": "sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==" + } + } + }, + "apollo-server-plugin-base": { + "version": "0.9.1", + "resolved": "https://registry.npmjs.org/apollo-server-plugin-base/-/apollo-server-plugin-base-0.9.1.tgz", + "integrity": "sha512-kvrX4Z3FdpjrZdHkyl5iY2A1Wvp4b6KQp00DeZqss7GyyKNUBKr80/7RQgBLEw7EWM7WB19j459xM/TjvW0FKQ==", + "requires": { + "apollo-server-types": "^0.5.1" + } + }, + "apollo-server-types": { + "version": "0.5.1", + "resolved": "https://registry.npmjs.org/apollo-server-types/-/apollo-server-types-0.5.1.tgz", + "integrity": 
"sha512-my2cPw+DAb2qVnIuBcsRKGyS28uIc2vjFxa1NpRoJZe9gK0BWUBk7wzXnIzWy3HZ5Er11e/40MPTUesNfMYNVA==", + "requires": { + "apollo-engine-reporting-protobuf": "^0.5.2", + "apollo-server-caching": "^0.5.2", + "apollo-server-env": "^2.4.5" + } + }, + "apollo-tracing": { + "version": "0.11.1", + "resolved": "https://registry.npmjs.org/apollo-tracing/-/apollo-tracing-0.11.1.tgz", + "integrity": "sha512-l7g+uILw7v32GA46IRXIx5XXbZhFI96BhSqrGK9yyvfq+NMcvVZrj3kIhRImPGhAjMdV+5biA/jztabElAbDjg==", + "requires": { + "apollo-server-env": "^2.4.5", + "apollo-server-plugin-base": "^0.9.1" + } + }, + "apollo-utilities": { + "version": "1.3.4", + "resolved": "https://registry.npmjs.org/apollo-utilities/-/apollo-utilities-1.3.4.tgz", + "integrity": "sha512-pk2hiWrCXMAy2fRPwEyhvka+mqwzeP60Jr1tRYi5xru+3ko94HI9o6lK0CT33/w4RDlxWchmdhDCrvdr+pHCig==", + "requires": { + "@wry/equality": "^0.1.2", + "fast-json-stable-stringify": "^2.0.0", + "ts-invariant": "^0.4.0", + "tslib": "^1.10.0" + } + }, + "array-flatten": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz", + "integrity": "sha1-ml9pkFGx5wczKPKgCJaLZOopVdI=" + }, + "async-limiter": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/async-limiter/-/async-limiter-1.0.1.tgz", + "integrity": "sha512-csOlWGAcRFJaI6m+F2WKdnMKr4HhdhFVBk0H/QbJFMCr+uO2kwohwXQPxw/9OCxp05r5ghVBFSyioixx3gfkNQ==" + }, + "async-retry": { + "version": "1.3.1", + "resolved": "https://registry.npmjs.org/async-retry/-/async-retry-1.3.1.tgz", + "integrity": "sha512-aiieFW/7h3hY0Bq5d+ktDBejxuwR78vRu9hDUdR8rNhSaQ29VzPL4AoIRG7D/c7tdenwOcKvgPM6tIxB3cB6HA==", + "requires": { + "retry": "0.12.0" + } + }, + "asynckit": { + "version": "0.4.0", + "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz", + "integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k=" + }, + "backo2": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/backo2/-/backo2-1.0.2.tgz", + "integrity": "sha1-MasayLEpNjRj41s+u2n038+6eUc=" + }, + "body-parser": { + "version": "1.19.0", + "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.0.tgz", + "integrity": "sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==", + "requires": { + "bytes": "3.1.0", + "content-type": "~1.0.4", + "debug": "2.6.9", + "depd": "~1.1.2", + "http-errors": "1.7.2", + "iconv-lite": "0.4.24", + "on-finished": "~2.3.0", + "qs": "6.7.0", + "raw-body": "2.4.0", + "type-is": "~1.6.17" + }, + "dependencies": { + "http-errors": { + "version": "1.7.2", + "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.2.tgz", + "integrity": "sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==", + "requires": { + "depd": "~1.1.2", + "inherits": "2.0.3", + "setprototypeof": "1.1.1", + "statuses": ">= 1.5.0 < 2", + "toidentifier": "1.0.0" + } + }, + "inherits": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz", + "integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=" + }, + "setprototypeof": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", + "integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" + } + } + }, + "busboy": { + "version": "0.3.1", + "resolved": "https://registry.npmjs.org/busboy/-/busboy-0.3.1.tgz", + "integrity": 
"sha512-y7tTxhGKXcyBxRKAni+awqx8uqaJKrSFSNFSeRG5CsWNdmy2BIK+6VGWEW7TZnIO/533mtMEA4rOevQV815YJw==", + "requires": { + "dicer": "0.3.0" + } + }, + "bytes": { + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.0.tgz", + "integrity": "sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==" + }, + "combined-stream": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz", + "integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==", + "requires": { + "delayed-stream": "~1.0.0" + } + }, + "commander": { + "version": "2.20.3", + "resolved": "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz", + "integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==" + }, + "content-disposition": { + "version": "0.5.3", + "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.3.tgz", + "integrity": "sha512-ExO0774ikEObIAEV9kDo50o+79VCUdEB6n6lzKgGwupcVeRlhrj3qGAfwq8G6uBJjkqLrhT0qEYFcWng8z1z0g==", + "requires": { + "safe-buffer": "5.1.2" + }, + "dependencies": { + "safe-buffer": { + "version": "5.1.2", + "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", + "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==" + } + } + }, + "content-type": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.4.tgz", + "integrity": "sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA==" + }, + "cookie": { + "version": "0.4.0", + "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.0.tgz", + "integrity": "sha512-+Hp8fLp57wnUSt0tY0tHEXh4voZRDnoIrZPqlo3DPiI4y9lwg/jqx+1Om94/W6ZaPDOUbnjOt/99w66zk+l1Xg==" + }, + "cookie-signature": { + "version": "1.0.6", + "resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.6.tgz", + "integrity": "sha1-4wOogrNCzD7oylE6eZmXNNqzriw=" + }, + "core-js": { + "version": "3.6.5", + "resolved": "https://registry.npmjs.org/core-js/-/core-js-3.6.5.tgz", + "integrity": "sha512-vZVEEwZoIsI+vPEuoF9Iqf5H7/M3eeQqWlQnYa8FSKKePuYTf5MWnxb5SDAzCa60b3JBRS5g9b+Dq7b1y/RCrA==" + }, + "cors": { + "version": "2.8.5", + "resolved": "https://registry.npmjs.org/cors/-/cors-2.8.5.tgz", + "integrity": "sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==", + "requires": { + "object-assign": "^4", + "vary": "^1" + } + }, + "cssfilter": { + "version": "0.0.10", + "resolved": "https://registry.npmjs.org/cssfilter/-/cssfilter-0.0.10.tgz", + "integrity": "sha1-xtJnJjKi5cg+AT5oZKQs6N79IK4=" + }, + "debug": { + "version": "2.6.9", + "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz", + "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==", + "requires": { + "ms": "2.0.0" + } + }, + "define-properties": { + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/define-properties/-/define-properties-1.1.3.tgz", + "integrity": "sha512-3MqfYKj2lLzdMSf8ZIZE/V+Zuy+BgD6f164e8K2w7dgnpKArBDerGYpM46IYYcjnkdPNMjPk9A6VFB8+3SKlXQ==", + "requires": { + "object-keys": "^1.0.12" + } + }, + "delayed-stream": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz", + "integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk=" + 
}, + "depd": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/depd/-/depd-1.1.2.tgz", + "integrity": "sha1-m81S4UwJd2PnSbJ0xDRu0uVgtak=" + }, + "deprecated-decorator": { + "version": "0.1.6", + "resolved": "https://registry.npmjs.org/deprecated-decorator/-/deprecated-decorator-0.1.6.tgz", + "integrity": "sha1-AJZjF7ehL+kvPMgx91g68ym4bDc=" + }, + "destroy": { + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/destroy/-/destroy-1.0.4.tgz", + "integrity": "sha1-l4hXRCxEdJ5CBmE+N5RiBYJqvYA=" + }, + "dicer": { + "version": "0.3.0", + "resolved": "https://registry.npmjs.org/dicer/-/dicer-0.3.0.tgz", + "integrity": "sha512-MdceRRWqltEG2dZqO769g27N/3PXfcKl04VhYnBlo2YhH7zPi88VebsjTKclaOyiuMaGU72hTfw3VkUitGcVCA==", + "requires": { + "streamsearch": "0.1.2" + } + }, + "ee-first": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/ee-first/-/ee-first-1.1.1.tgz", + "integrity": "sha1-WQxhFWsK4vTwJVcyoViyZrxWsh0=" + }, + "encodeurl": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-1.0.2.tgz", + "integrity": "sha1-rT/0yG7C0CkyL1oCw6mmBslbP1k=" + }, + "es-abstract": { + "version": "1.17.6", + "resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.17.6.tgz", + "integrity": "sha512-Fr89bON3WFyUi5EvAeI48QTWX0AyekGgLA8H+c+7fbfCkJwRWRMLd8CQedNEyJuoYYhmtEqY92pgte1FAhBlhw==", + "requires": { + "es-to-primitive": "^1.2.1", + "function-bind": "^1.1.1", + "has": "^1.0.3", + "has-symbols": "^1.0.1", + "is-callable": "^1.2.0", + "is-regex": "^1.1.0", + "object-inspect": "^1.7.0", + "object-keys": "^1.1.1", + "object.assign": "^4.1.0", + "string.prototype.trimend": "^1.0.1", + "string.prototype.trimstart": "^1.0.1" + } + }, + "es-to-primitive": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/es-to-primitive/-/es-to-primitive-1.2.1.tgz", + "integrity": "sha512-QCOllgZJtaUo9miYBcLChTUaHNjJF3PYs1VidD7AwiEj1kYxKeQTctLAezAOH5ZKRH0g2IgPn6KwB4IT8iRpvA==", + "requires": { + "is-callable": "^1.1.4", + "is-date-object": "^1.0.1", + "is-symbol": "^1.0.2" + } + }, + "escape-html": { + "version": "1.0.3", + "resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz", + "integrity": "sha1-Aljq5NPQwJdN4cFpGI7wBR0dGYg=" + }, + "etag": { + "version": "1.8.1", + "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz", + "integrity": "sha1-Qa4u62XvpiJorr/qg6x9eSmbCIc=" + }, + "eventemitter3": { + "version": "3.1.2", + "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-3.1.2.tgz", + "integrity": "sha512-tvtQIeLVHjDkJYnzf2dgVMxfuSGJeM/7UCG17TT4EumTfNtF+0nebF/4zWOIkCreAbtNqhGEboB6BWrwqNaw4Q==" + }, + "express": { + "version": "4.17.1", + "resolved": "https://registry.npmjs.org/express/-/express-4.17.1.tgz", + "integrity": "sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==", + "requires": { + "accepts": "~1.3.7", + "array-flatten": "1.1.1", + "body-parser": "1.19.0", + "content-disposition": "0.5.3", + "content-type": "~1.0.4", + "cookie": "0.4.0", + "cookie-signature": "1.0.6", + "debug": "2.6.9", + "depd": "~1.1.2", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "etag": "~1.8.1", + "finalhandler": "~1.1.2", + "fresh": "0.5.2", + "merge-descriptors": "1.0.1", + "methods": "~1.1.2", + "on-finished": "~2.3.0", + "parseurl": "~1.3.3", + "path-to-regexp": "0.1.7", + "proxy-addr": "~2.0.5", + "qs": "6.7.0", + "range-parser": "~1.2.1", + "safe-buffer": "5.1.2", + "send": "0.17.1", + "serve-static": "1.14.1", + "setprototypeof": "1.1.1", 
+ "statuses": "~1.5.0", + "type-is": "~1.6.18", + "utils-merge": "1.0.1", + "vary": "~1.1.2" + }, + "dependencies": { + "safe-buffer": { + "version": "5.1.2", + "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", + "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==" + }, + "setprototypeof": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", + "integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" + } + } + }, + "fast-json-stable-stringify": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz", + "integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==" + }, + "finalhandler": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.1.2.tgz", + "integrity": "sha512-aAWcW57uxVNrQZqFXjITpW3sIUQmHGG3qSb9mUah9MgMC4NeWhNOlNjXEYq3HjRAvL6arUviZGGJsBg6z0zsWA==", + "requires": { + "debug": "2.6.9", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "on-finished": "~2.3.0", + "parseurl": "~1.3.3", + "statuses": "~1.5.0", + "unpipe": "~1.0.0" + } + }, + "form-data": { + "version": "3.0.0", + "resolved": "https://registry.npmjs.org/form-data/-/form-data-3.0.0.tgz", + "integrity": "sha512-CKMFDglpbMi6PyN+brwB9Q/GOw0eAnsrEZDgcsH5Krhz5Od/haKHAX0NmQfha2zPPz0JpWzA7GJHGSnvCRLWsg==", + "requires": { + "asynckit": "^0.4.0", + "combined-stream": "^1.0.8", + "mime-types": "^2.1.12" + } + }, + "forwarded": { + "version": "0.1.2", + "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.1.2.tgz", + "integrity": "sha1-mMI9qxF1ZXuMBXPozszZGw/xjIQ=" + }, + "fresh": { + "version": "0.5.2", + "resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz", + "integrity": "sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=" + }, + "fs-capacitor": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/fs-capacitor/-/fs-capacitor-2.0.4.tgz", + "integrity": "sha512-8S4f4WsCryNw2mJJchi46YgB6CR5Ze+4L1h8ewl9tEpL4SJ3ZO+c/bS4BWhB8bK+O3TMqhuZarTitd0S0eh2pA==" + }, + "function-bind": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.1.tgz", + "integrity": "sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A==" + }, + "graphql": { + "version": "14.2.1", + "resolved": "https://registry.npmjs.org/graphql/-/graphql-14.2.1.tgz", + "integrity": "sha512-2PL1UbvKeSjy/lUeJqHk+eR9CvuErXoCNwJI4jm3oNFEeY+9ELqHNKO1ZuSxAkasPkpWbmT/iMRMFxd3cEL3tQ==", + "requires": { + "iterall": "^1.2.2" + } + }, + "graphql-extensions": { + "version": "0.12.4", + "resolved": "https://registry.npmjs.org/graphql-extensions/-/graphql-extensions-0.12.4.tgz", + "integrity": "sha512-GnR4LiWk3s2bGOqIh6V1JgnSXw2RCH4NOgbCFEWvB6JqWHXTlXnLZ8bRSkCiD4pltv7RHUPWqN/sGh8R6Ae/ag==", + "requires": { + "@apollographql/apollo-tools": "^0.4.3", + "apollo-server-env": "^2.4.5", + "apollo-server-types": "^0.5.1" + } + }, + "graphql-subscriptions": { + "version": "0.5.8", + "resolved": "https://registry.npmjs.org/graphql-subscriptions/-/graphql-subscriptions-0.5.8.tgz", + "integrity": "sha512-0CaZnXKBw2pwnIbvmVckby5Ge5e2ecmjofhYCdyeACbCly2j3WXDP/pl+s+Dqd2GQFC7y99NB+53jrt55CKxYQ==", + "requires": { + "iterall": "^1.2.1" + } + }, + "graphql-tag": { + "version": "2.10.1", + "resolved": 
"https://registry.npmjs.org/graphql-tag/-/graphql-tag-2.10.1.tgz", + "integrity": "sha512-jApXqWBzNXQ8jYa/HLkZJaVw9jgwNqZkywa2zfFn16Iv1Zb7ELNHkJaXHR7Quvd5SIGsy6Ny7SUKATgnu05uEg==" + }, + "graphql-tools": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/graphql-tools/-/graphql-tools-3.1.1.tgz", + "integrity": "sha512-yHvPkweUB0+Q/GWH5wIG60bpt8CTwBklCSzQdEHmRUgAdEQKxw+9B7zB3dG7wB3Ym7M7lfrS4Ej+jtDZfA2UXg==", + "requires": { + "apollo-link": "^1.2.2", + "apollo-utilities": "^1.0.1", + "deprecated-decorator": "^0.1.6", + "iterall": "^1.1.3", + "uuid": "^3.1.0" + }, + "dependencies": { + "uuid": { + "version": "3.4.0", + "resolved": "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz", + "integrity": "sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==" + } + } + }, + "graphql-upload": { + "version": "8.1.0", + "resolved": "https://registry.npmjs.org/graphql-upload/-/graphql-upload-8.1.0.tgz", + "integrity": "sha512-U2OiDI5VxYmzRKw0Z2dmfk0zkqMRaecH9Smh1U277gVgVe9Qn+18xqf4skwr4YJszGIh7iQDZ57+5ygOK9sM/Q==", + "requires": { + "busboy": "^0.3.1", + "fs-capacitor": "^2.0.4", + "http-errors": "^1.7.3", + "object-path": "^0.11.4" + } + }, + "has": { + "version": "1.0.3", + "resolved": "https://registry.npmjs.org/has/-/has-1.0.3.tgz", + "integrity": "sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==", + "requires": { + "function-bind": "^1.1.1" + } + }, + "has-symbols": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.1.tgz", + "integrity": "sha512-PLcsoqu++dmEIZB+6totNFKq/7Do+Z0u4oT0zKOJNl3lYK6vGwwu2hjHs+68OEZbTjiUE9bgOABXbP/GvrS0Kg==" + }, + "http-errors": { + "version": "1.8.0", + "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.8.0.tgz", + "integrity": "sha512-4I8r0C5JDhT5VkvI47QktDW75rNlGVsUf/8hzjCC/wkWI/jdTRmBb9aI7erSG82r1bjKY3F6k28WnsVxB1C73A==", + "requires": { + "depd": "~1.1.2", + "inherits": "2.0.4", + "setprototypeof": "1.2.0", + "statuses": ">= 1.5.0 < 2", + "toidentifier": "1.0.0" + } + }, + "iconv-lite": { + "version": "0.4.24", + "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz", + "integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==", + "requires": { + "safer-buffer": ">= 2.1.2 < 3" + } + }, + "inherits": { + "version": "2.0.4", + "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz", + "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==" + }, + "ipaddr.js": { + "version": "1.9.1", + "resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz", + "integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==" + }, + "is-callable": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/is-callable/-/is-callable-1.2.0.tgz", + "integrity": "sha512-pyVD9AaGLxtg6srb2Ng6ynWJqkHU9bEM087AKck0w8QwDarTfNcpIYoU8x8Hv2Icm8u6kFJM18Dag8lyqGkviw==" + }, + "is-date-object": { + "version": "1.0.2", + "resolved": "https://registry.npmjs.org/is-date-object/-/is-date-object-1.0.2.tgz", + "integrity": "sha512-USlDT524woQ08aoZFzh3/Z6ch9Y/EWXEHQ/AaRN0SkKq4t2Jw2R2339tSXmwuVoY7LLlBCbOIlx2myP/L5zk0g==" + }, + "is-regex": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/is-regex/-/is-regex-1.1.1.tgz", + "integrity": 
"sha512-1+QkEcxiLlB7VEyFtyBg94e08OAsvq7FUBgApTq/w2ymCLyKJgDPsybBENVtA7XCQEgEXxKPonG+mvYRxh/LIg==", + "requires": { + "has-symbols": "^1.0.1" + } + }, + "is-symbol": { + "version": "1.0.3", + "resolved": "https://registry.npmjs.org/is-symbol/-/is-symbol-1.0.3.tgz", + "integrity": "sha512-OwijhaRSgqvhm/0ZdAcXNZt9lYdKFpcRDT5ULUuYXPoT794UNOdU+gpT6Rzo7b4V2HUl/op6GqY894AZwv9faQ==", + "requires": { + "has-symbols": "^1.0.1" + } + }, + "iterall": { + "version": "1.3.0", + "resolved": "https://registry.npmjs.org/iterall/-/iterall-1.3.0.tgz", + "integrity": "sha512-QZ9qOMdF+QLHxy1QIpUHUU1D5pS2CG2P69LF6L6CPjPYA/XMOmKV3PZpawHoAjHNyB0swdVTRxdYT4tbBbxqwg==" + }, + "lodash.sortby": { + "version": "4.7.0", + "resolved": "https://registry.npmjs.org/lodash.sortby/-/lodash.sortby-4.7.0.tgz", + "integrity": "sha1-7dFMgk4sycHgsKG0K7UhBRakJDg=" + }, + "loglevel": { + "version": "1.6.8", + "resolved": "https://registry.npmjs.org/loglevel/-/loglevel-1.6.8.tgz", + "integrity": "sha512-bsU7+gc9AJ2SqpzxwU3+1fedl8zAntbtC5XYlt3s2j1hJcn2PsXSmgN8TaLG/J1/2mod4+cE/3vNL70/c1RNCA==" + }, + "long": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/long/-/long-4.0.0.tgz", + "integrity": "sha512-XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA==" + }, + "lru-cache": { + "version": "5.1.1", + "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz", + "integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==", + "requires": { + "yallist": "^3.0.2" + } + }, + "media-typer": { + "version": "0.3.0", + "resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz", + "integrity": "sha1-hxDXrwqmJvj/+hzgAWhUUmMlV0g=" + }, + "merge-descriptors": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-1.0.1.tgz", + "integrity": "sha1-sAqqVW3YtEVoFQ7J0blT8/kMu2E=" + }, + "methods": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz", + "integrity": "sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=" + }, + "mime": { + "version": "1.6.0", + "resolved": "https://registry.npmjs.org/mime/-/mime-1.6.0.tgz", + "integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==" + }, + "mime-db": { + "version": "1.44.0", + "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.44.0.tgz", + "integrity": "sha512-/NOTfLrsPBVeH7YtFPgsVWveuL+4SjjYxaQ1xtM1KMFj7HdxlBlxeyNLzhyJVx7r4rZGJAZ/6lkKCitSc/Nmpg==" + }, + "mime-types": { + "version": "2.1.27", + "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.27.tgz", + "integrity": "sha512-JIhqnCasI9yD+SsmkquHBxTSEuZdQX5BuQnS2Vc7puQQQ+8yiP5AY5uWhpdv4YL4VM5c6iliiYWPgJ/nJQLp7w==", + "requires": { + "mime-db": "1.44.0" + } + }, + "ms": { + "version": "2.0.0", + "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz", + "integrity": "sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=" + }, + "negotiator": { + "version": "0.6.2", + "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.2.tgz", + "integrity": "sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==" + }, + "node-fetch": { + "version": "2.6.0", + "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.0.tgz", + "integrity": "sha512-8dG4H5ujfvFiqDmVu9fQ5bOHUC15JMjMY/Zumv26oOvvVJjM67KF8koCWIabKQ1GJIa9r2mMZscBq/TbdOcmNA==" + }, + "object-assign": { + "version": "4.1.1", + "resolved": 
"https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz", + "integrity": "sha1-IQmtx5ZYh8/AXLvUQsrIv7s2CGM=" + }, + "object-inspect": { + "version": "1.8.0", + "resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.8.0.tgz", + "integrity": "sha512-jLdtEOB112fORuypAyl/50VRVIBIdVQOSUUGQHzJ4xBSbit81zRarz7GThkEFZy1RceYrWYcPcBFPQwHyAc1gA==" + }, + "object-keys": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz", + "integrity": "sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==" + }, + "object-path": { + "version": "0.11.4", + "resolved": "https://registry.npmjs.org/object-path/-/object-path-0.11.4.tgz", + "integrity": "sha1-NwrnUvvzfePqcKhhwju6iRVpGUk=" + }, + "object.assign": { + "version": "4.1.0", + "resolved": "https://registry.npmjs.org/object.assign/-/object.assign-4.1.0.tgz", + "integrity": "sha512-exHJeq6kBKj58mqGyTQ9DFvrZC/eR6OwxzoM9YRoGBqrXYonaFyGiFMuc9VZrXf7DarreEwMpurG3dd+CNyW5w==", + "requires": { + "define-properties": "^1.1.2", + "function-bind": "^1.1.1", + "has-symbols": "^1.0.0", + "object-keys": "^1.0.11" + } + }, + "object.getownpropertydescriptors": { + "version": "2.1.0", + "resolved": "https://registry.npmjs.org/object.getownpropertydescriptors/-/object.getownpropertydescriptors-2.1.0.tgz", + "integrity": "sha512-Z53Oah9A3TdLoblT7VKJaTDdXdT+lQO+cNpKVnya5JDe9uLvzu1YyY1yFDFrcxrlRgWrEFH0jJtD/IbuwjcEVg==", + "requires": { + "define-properties": "^1.1.3", + "es-abstract": "^1.17.0-next.1" + } + }, + "on-finished": { + "version": "2.3.0", + "resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.3.0.tgz", + "integrity": "sha1-IPEzZIGwg811M3mSoWlxqi2QaUc=", + "requires": { + "ee-first": "1.1.1" + } + }, + "parseurl": { + "version": "1.3.3", + "resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz", + "integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==" + }, + "path-to-regexp": { + "version": "0.1.7", + "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.7.tgz", + "integrity": "sha1-32BBeABfUi8V60SQ5yR6G/qmf4w=" + }, + "proxy-addr": { + "version": "2.0.6", + "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.6.tgz", + "integrity": "sha512-dh/frvCBVmSsDYzw6n926jv974gddhkFPfiN8hPOi30Wax25QZyZEGveluCgliBnqmuM+UJmBErbAUFIoDbjOw==", + "requires": { + "forwarded": "~0.1.2", + "ipaddr.js": "1.9.1" + } + }, + "qs": { + "version": "6.7.0", + "resolved": "https://registry.npmjs.org/qs/-/qs-6.7.0.tgz", + "integrity": "sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==" + }, + "range-parser": { + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz", + "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==" + }, + "raw-body": { + "version": "2.4.0", + "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.0.tgz", + "integrity": "sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==", + "requires": { + "bytes": "3.1.0", + "http-errors": "1.7.2", + "iconv-lite": "0.4.24", + "unpipe": "1.0.0" + }, + "dependencies": { + "http-errors": { + "version": "1.7.2", + "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.2.tgz", + "integrity": "sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==", + 
"requires": { + "depd": "~1.1.2", + "inherits": "2.0.3", + "setprototypeof": "1.1.1", + "statuses": ">= 1.5.0 < 2", + "toidentifier": "1.0.0" + } + }, + "inherits": { + "version": "2.0.3", + "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz", + "integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=" + }, + "setprototypeof": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", + "integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" + } + } + }, + "retry": { + "version": "0.12.0", + "resolved": "https://registry.npmjs.org/retry/-/retry-0.12.0.tgz", + "integrity": "sha1-G0KmJmoh8HQh0bC1S33BZ7AcATs=" + }, + "safe-buffer": { + "version": "5.2.1", + "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz", + "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==" + }, + "safer-buffer": { + "version": "2.1.2", + "resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz", + "integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==" + }, + "send": { + "version": "0.17.1", + "resolved": "https://registry.npmjs.org/send/-/send-0.17.1.tgz", + "integrity": "sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==", + "requires": { + "debug": "2.6.9", + "depd": "~1.1.2", + "destroy": "~1.0.4", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "etag": "~1.8.1", + "fresh": "0.5.2", + "http-errors": "~1.7.2", + "mime": "1.6.0", + "ms": "2.1.1", + "on-finished": "~2.3.0", + "range-parser": "~1.2.1", + "statuses": "~1.5.0" + }, + "dependencies": { + "http-errors": { + "version": "1.7.3", + "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.3.tgz", + "integrity": "sha512-ZTTX0MWrsQ2ZAhA1cejAwDLycFsd7I7nVtnkT3Ol0aqodaKW+0CTZDQ1uBv5whptCnc8e8HeRRJxRs0kmm/Qfw==", + "requires": { + "depd": "~1.1.2", + "inherits": "2.0.4", + "setprototypeof": "1.1.1", + "statuses": ">= 1.5.0 < 2", + "toidentifier": "1.0.0" + } + }, + "ms": { + "version": "2.1.1", + "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.1.tgz", + "integrity": "sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==" + }, + "setprototypeof": { + "version": "1.1.1", + "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", + "integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" + } + } + }, + "serve-static": { + "version": "1.14.1", + "resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.1.tgz", + "integrity": "sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==", + "requires": { + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "parseurl": "~1.3.3", + "send": "0.17.1" + } + }, + "setprototypeof": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz", + "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==" + }, + "sha.js": { + "version": "2.4.11", + "resolved": "https://registry.npmjs.org/sha.js/-/sha.js-2.4.11.tgz", + "integrity": "sha512-QMEp5B7cftE7APOjk5Y6xgrbWu+WkLVQwk8JNjZ8nKRciZaByEW6MubieAiToS7+dwvrjGhH8jRXz3MVd0AYqQ==", + "requires": { + "inherits": "^2.0.1", + "safe-buffer": "^5.0.1" + } + }, + "statuses": { + 
"version": "1.5.0", + "resolved": "https://registry.npmjs.org/statuses/-/statuses-1.5.0.tgz", + "integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow=" + }, + "streamsearch": { + "version": "0.1.2", + "resolved": "https://registry.npmjs.org/streamsearch/-/streamsearch-0.1.2.tgz", + "integrity": "sha1-gIudDlb8Jz2Am6VzOOkpkZoanxo=" + }, + "string.prototype.trimend": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/string.prototype.trimend/-/string.prototype.trimend-1.0.1.tgz", + "integrity": "sha512-LRPxFUaTtpqYsTeNKaFOw3R4bxIzWOnbQ837QfBylo8jIxtcbK/A/sMV7Q+OAV/vWo+7s25pOE10KYSjaSO06g==", + "requires": { + "define-properties": "^1.1.3", + "es-abstract": "^1.17.5" + } + }, + "string.prototype.trimstart": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/string.prototype.trimstart/-/string.prototype.trimstart-1.0.1.tgz", + "integrity": "sha512-XxZn+QpvrBI1FOcg6dIpxUPgWCPuNXvMD72aaRaUQv1eD4e/Qy8i/hFTe0BUmD60p/QA6bh1avmuPTfNjqVWRw==", + "requires": { + "define-properties": "^1.1.3", + "es-abstract": "^1.17.5" + } + }, + "subscriptions-transport-ws": { + "version": "0.9.17", + "resolved": "https://registry.npmjs.org/subscriptions-transport-ws/-/subscriptions-transport-ws-0.9.17.tgz", + "integrity": "sha512-hNHi2N80PBz4T0V0QhnnsMGvG3XDFDS9mS6BhZ3R12T6EBywC8d/uJscsga0cVO4DKtXCkCRrWm2sOYrbOdhEA==", + "requires": { + "backo2": "^1.0.2", + "eventemitter3": "^3.1.0", + "iterall": "^1.2.1", + "symbol-observable": "^1.0.4", + "ws": "^5.2.0" + }, + "dependencies": { + "ws": { + "version": "5.2.2", + "resolved": "https://registry.npmjs.org/ws/-/ws-5.2.2.tgz", + "integrity": "sha512-jaHFD6PFv6UgoIVda6qZllptQsMlDEJkTQcybzzXDYM1XO9Y8em691FGMPmM46WGyLU4z9KMgQN+qrux/nhlHA==", + "requires": { + "async-limiter": "~1.0.0" + } + } + } + }, + "symbol-observable": { + "version": "1.2.0", + "resolved": "https://registry.npmjs.org/symbol-observable/-/symbol-observable-1.2.0.tgz", + "integrity": "sha512-e900nM8RRtGhlV36KGEU9k65K3mPb1WV70OdjfxlG2EAuM1noi/E/BaW/uMhL7bPEssK8QV57vN3esixjUvcXQ==" + }, + "toidentifier": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.0.tgz", + "integrity": "sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==" + }, + "ts-invariant": { + "version": "0.4.4", + "resolved": "https://registry.npmjs.org/ts-invariant/-/ts-invariant-0.4.4.tgz", + "integrity": "sha512-uEtWkFM/sdZvRNNDL3Ehu4WVpwaulhwQszV8mrtcdeE8nN00BV9mAmQ88RkrBhFgl9gMgvjJLAQcZbnPXI9mlA==", + "requires": { + "tslib": "^1.9.3" + } + }, + "tslib": { + "version": "1.13.0", + "resolved": "https://registry.npmjs.org/tslib/-/tslib-1.13.0.tgz", + "integrity": "sha512-i/6DQjL8Xf3be4K/E6Wgpekn5Qasl1usyw++dAA35Ue5orEn65VIxOA+YvNNl9HV3qv70T7CNwjODHZrLwvd1Q==" + }, + "type-is": { + "version": "1.6.18", + "resolved": "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz", + "integrity": "sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==", + "requires": { + "media-typer": "0.3.0", + "mime-types": "~2.1.24" + } + }, + "unpipe": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz", + "integrity": "sha1-sr9O6FFKrmFltIF4KdIbLvSZBOw=" + }, + "util.promisify": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/util.promisify/-/util.promisify-1.0.1.tgz", + "integrity": "sha512-g9JpC/3He3bm38zsLupWryXHoEcS22YHthuPQSJdMy6KNrzIRzWqcsHzD/WUnqe45whVou4VIsPew37DoXWNrA==", + "requires": { + "define-properties": "^1.1.3", + "es-abstract": 
"^1.17.2", + "has-symbols": "^1.0.1", + "object.getownpropertydescriptors": "^2.1.0" + } + }, + "utils-merge": { + "version": "1.0.1", + "resolved": "https://registry.npmjs.org/utils-merge/-/utils-merge-1.0.1.tgz", + "integrity": "sha1-n5VxD1CiZ5R7LMwSR0HBAoQn5xM=" + }, + "uuid": { + "version": "8.3.0", + "resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.0.tgz", + "integrity": "sha512-fX6Z5o4m6XsXBdli9g7DtWgAx+osMsRRZFKma1mIUsLCz6vRvv+pz5VNbyu9UEDzpMWulZfvpgb/cmDXVulYFQ==" + }, + "vary": { + "version": "1.1.2", + "resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz", + "integrity": "sha1-IpnwLG3tMNSllhsLn3RSShj2NPw=" + }, + "ws": { + "version": "6.2.1", + "resolved": "https://registry.npmjs.org/ws/-/ws-6.2.1.tgz", + "integrity": "sha512-GIyAXC2cB7LjvpgMt9EKS2ldqr0MTrORaleiOno6TweZ6r3TKtoFQWay/2PceJ3RuBasOHzXNn5Lrw1X0bEjqA==", + "requires": { + "async-limiter": "~1.0.0" + } + }, + "xss": { + "version": "1.0.8", + "resolved": "https://registry.npmjs.org/xss/-/xss-1.0.8.tgz", + "integrity": "sha512-3MgPdaXV8rfQ/pNn16Eio6VXYPTkqwa0vc7GkiymmY/DqR1SE/7VPAAVZz1GJsJFrllMYO3RHfEaiUGjab6TNw==", + "requires": { + "commander": "^2.20.3", + "cssfilter": "0.0.10" + } + }, + "yallist": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz", + "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==" + }, + "zen-observable": { + "version": "0.8.15", + "resolved": "https://registry.npmjs.org/zen-observable/-/zen-observable-0.8.15.tgz", + "integrity": "sha512-PQ2PC7R9rslx84ndNBZB/Dkv8V8fZEpk83RLgXtYd0fwUgEjseMn1Dgajh2x6S8QbZAFa9p2qVCEuYZNgve0dQ==" + }, + "zen-observable-ts": { + "version": "0.8.21", + "resolved": "https://registry.npmjs.org/zen-observable-ts/-/zen-observable-ts-0.8.21.tgz", + "integrity": "sha512-Yj3yXweRc8LdRMrCC8nIc4kkjWecPAUVh0TI0OUrWXx6aX790vLcDlWca6I4vsyCGH3LpWxq0dJRcMOFoVqmeg==", + "requires": { + "tslib": "^1.9.3", + "zen-observable": "^0.8.0" + } + } + } +} diff --git a/server/tests-py/remote_schemas/nodejs/package.json b/server/tests-py/remote_schemas/nodejs/package.json index 7c2e95c55ad..db4a8b062d7 100644 --- a/server/tests-py/remote_schemas/nodejs/package.json +++ b/server/tests-py/remote_schemas/nodejs/package.json @@ -10,8 +10,8 @@ "author": "", "license": "ISC", "dependencies": { - "apollo-server": "^2.1.0", - "graphql": "^0.13.1", - "graphql-tag": "^2.10.1" + "apollo-server": "2.1.0", + "graphql": "14.2.1", + "graphql-tag": "2.10.1" } } diff --git a/server/tests-py/test_graphql_mutations.py b/server/tests-py/test_graphql_mutations.py index 23d0819968a..32637904b99 100644 --- a/server/tests-py/test_graphql_mutations.py +++ b/server/tests-py/test_graphql_mutations.py @@ -53,6 +53,15 @@ class TestGraphQLInsert: def test_insert_null_col_value(self, hge_ctx): check_query_f(hge_ctx, self.dir() + "/order_col_shipped_null.yaml") + def test_insert_valid_variable_but_invalid_graphql_value(self, hge_ctx): + check_query_f(hge_ctx, self.dir() + "/person_valid_variable_but_invalid_graphql_value.yaml") + + def test_can_insert_in_insertable_view(self, hge_ctx): + check_query_f(hge_ctx, self.dir() + "/can_insert_in_insertable_view.yaml") + + def test_cannot_insert_in_non_insertable_view(self, hge_ctx): + check_query_f(hge_ctx, self.dir() + "/cannot_insert_in_non_insertable_view.yaml") + @classmethod def dir(cls): return "queries/graphql_mutation/insert/basic" @@ -192,6 +201,9 @@ class TestGraphqlInsertPermission: else: pytest.skip("authorization not configured, skipping the test") + 
+    def test_check_set_headers_while_doing_upsert(self, hge_ctx):
+        check_query_f(hge_ctx, self.dir() + "/leads_upsert_check_with_headers.yaml")
+
     @classmethod
     def dir(cls):
         return "queries/graphql_mutation/insert/permissions"
diff --git a/server/tests-py/test_graphql_queries.py b/server/tests-py/test_graphql_queries.py
index 5f434318b39..647806ce875 100644
--- a/server/tests-py/test_graphql_queries.py
+++ b/server/tests-py/test_graphql_queries.py
@@ -143,6 +143,12 @@ class TestGraphQLQueryAggPerm:
     def test_author_post_agg_order_by(self, hge_ctx, transport):
         check_query_f(hge_ctx, self.dir() + '/author_post_agg_order_by.yaml', transport)
 
+    def test_article_agg_without_select_access_to_any_col(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + '/article_agg_with_role_without_select_access.yaml', transport)
+
+    def test_article_agg_with_select_access(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + '/article_agg_with_role_with_select_access.yaml', transport)
+
     @classmethod
     def dir(cls):
         return 'queries/graphql_query/agg_perm'
@@ -331,6 +337,12 @@ class TestGraphqlQueryPermissions:
     def test_in_and_nin(self, hge_ctx, transport):
         check_query_f(hge_ctx, self.dir() + '/in_and_nin.yaml', transport)
 
+    def test_user_accessing_books_by_pk_should_fail(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + '/user_should_not_be_able_to_access_books_by_pk.yaml')
+
+    def test_author_articles_without_required_headers_set(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + '/select_articles_without_required_headers.yaml', transport)
+
     @classmethod
     def dir(cls):
         return 'queries/graphql_query/permissions'
diff --git a/server/tests-py/test_remote_relationships.py b/server/tests-py/test_remote_relationships.py
index 42b761f0780..109d44ca533 100644
--- a/server/tests-py/test_remote_relationships.py
+++ b/server/tests-py/test_remote_relationships.py
@@ -182,11 +182,11 @@ class TestExecution:
         assert st_code == 200, resp
         check_query_f(hge_ctx, self.dir() + 'query_with_arguments.yaml')
 
-    # def test_with_variables(self, hge_ctx):
-    #     check_query_f(hge_ctx, self.dir() + 'mixed_variables.yaml')
-    #     st_code, resp = hge_ctx.v1q_f(self.dir() + 'setup_remote_rel_nested_args.yaml')
-    #     assert st_code == 200, resp
-    #     check_query_f(hge_ctx, self.dir() + 'remote_rel_variables.yaml')
+    def test_with_variables(self, hge_ctx):
+        # check_query_f(hge_ctx, self.dir() + 'mixed_variables.yaml') -- uses heterogeneous execution, which makes this assertion fail
+        st_code, resp = hge_ctx.v1q_f(self.dir() + 'setup_remote_rel_nested_args.yaml')
+        assert st_code == 200, resp
+        check_query_f(hge_ctx, self.dir() + 'remote_rel_variables.yaml')
 
     # def test_with_fragments(self, hge_ctx):
     #     check_query_f(hge_ctx, self.dir() + 'mixed_fragments.yaml')
@@ -222,6 +222,15 @@ class TestExecution:
         assert st_code == 200, resp
         check_query_f(hge_ctx, self.dir() + 'rename_table_with_remote_rel_dependency.yaml')
 
+    def test_remote_joins_with_subscription_should_throw_error(self, hge_ctx):
+        st_code, resp = hge_ctx.v1q_f(self.dir() + 'setup_remote_rel_basic.yaml')
+        assert st_code == 200, resp
+        check_query_f(hge_ctx, self.dir() + 'subscription_with_remote_join_fields.yaml')
+
+    def test_remote_joins_in_mutation_response(self, hge_ctx):
+        st_code, resp = hge_ctx.v1q_f(self.dir() + 'setup_remote_rel_basic_with_authors.yaml')
+        assert st_code == 200, resp
+        check_query_f(hge_ctx, self.dir() + 'mutation_output_with_remote_join_fields.yaml')
 
 
 class TestDeepExecution:
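The remote-relationship tests above all follow the same two-step shape: post a
metadata setup file through hge_ctx.v1q_f, then replay a YAML fixture through
check_query_f. A minimal sketch of that shape, for a method living in one of
the test classes above; setup_example.yaml and example_query.yaml are
hypothetical fixture names, not files added by this patch:

    # Sketch only: mirrors the setup-then-assert pattern used by the tests above.
    def test_example_remote_join(self, hge_ctx):
        # v1q_f posts the YAML file's contents to the metadata API and
        # returns (status_code, response_body).
        st_code, resp = hge_ctx.v1q_f(self.dir() + 'setup_example.yaml')
        assert st_code == 200, resp
        # check_query_f sends the GraphQL request described by the fixture
        # and asserts that the response matches the expected one.
        check_query_f(hge_ctx, self.dir() + 'example_query.yaml')
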
diff --git a/server/tests-py/test_schema_stitching.py b/server/tests-py/test_schema_stitching.py
index 642d3148e99..eaad62dddbd 100644
--- a/server/tests-py/test_schema_stitching.py
+++ b/server/tests-py/test_schema_stitching.py
@@ -4,6 +4,7 @@ import string
 import random
 import ruamel.yaml as yaml
 import json
+import graphql
 import queue
 import requests
 import time
@@ -77,7 +78,7 @@ class TestRemoteSchemaBasic:
         with open('queries/graphql_introspection/introspection.yaml') as f:
             query = yaml.safe_load(f)
         resp, _ = check_query(hge_ctx, query)
-        assert check_introspection_result(resp, ['Hello'], ['hello'])
+        assert check_introspection_result(resp, ['String'], ['hello'])
 
     @pytest.mark.allow_server_upgrade_test
     def test_introspection_as_user(self, hge_ctx):
@@ -522,17 +523,32 @@ def get_fld_by_name(ty, fldName):
 def get_arg_by_name(fld, argName):
     return _filter(lambda a: a['name'] == argName, fld['args'])
 
-def compare_args(argH, argR):
+def compare_args(arg_path, argH, argR):
     assert argR['type'] == argH['type'], yaml.dump({
         'error' : 'Types do not match for arg ' + arg_path,
         'remote_type' : argR['type'],
         'hasura_type' : argH['type']
     })
-    assert argR['defaultValue'] == argH['defaultValue'], yaml.dump({
-        'error' : 'Default values do not match for arg ' + arg_path,
-        'remote_default_value' : argR['defaultValue'],
-        'hasura_default_value' : argH['defaultValue']
-    })
+    compare_default_value(argH['defaultValue'], argR['defaultValue'])
+
+# There doesn't seem to be a Python library that can correctly compare
+# GraphQL 'Value's for equality, so we approximate it here.
+def compare_default_value(valH, valR):
+    # Introspection reports a missing default as null; there is nothing
+    # to parse in that case, so the two must agree exactly.
+    if valH is None or valR is None:
+        assert valH == valR
+        return True
+    a = graphql.parse_value(valH)
+    b = graphql.parse_value(valR)
+    if a == b:
+        return True
+    # Unequal object values may still agree up to field order, so check
+    # that each field of one appears among the fields of the other.
+    for field in a.fields:
+        assert field in b.fields
+    for field in b.fields:
+        assert field in a.fields
 
 def compare_flds(fldH, fldR):
     assert fldH['type'] == fldR['type'], yaml.dump({
@@ -546,7 +555,7 @@ def compare_flds(fldH, fldR):
         has_arg[arg_path] = False
         for argH in get_arg_by_name(fldH, argR['name']):
             has_arg[arg_path] = True
-            compare_args(argH, argR)
+            compare_args(arg_path, argH, argR)
         assert has_arg[arg_path], 'Argument ' + arg_path + ' in the remote schema root query type not found in Hasura schema'
 
 reload_metadata_q = {
diff --git a/server/tests-py/test_validation.py b/server/tests-py/test_validation.py
index c825025ccfd..0ac9d1bd649 100644
--- a/server/tests-py/test_validation.py
+++ b/server/tests-py/test_validation.py
@@ -14,6 +14,12 @@ class TestGraphQLValidation:
     def test_null_variable_value(self, hge_ctx, transport):
         check_query_f(hge_ctx, self.dir() + "/null_variable_value_err.yaml", transport)
 
+    def test_variable_type_mismatch(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + "/variable_type_mismatch.yaml", transport)
+
+    def test_json_column_value(self, hge_ctx, transport):
+        check_query_f(hge_ctx, self.dir() + "/json_column_value.yaml", transport)
+
     @classmethod
     def dir(cls):
         return "queries/graphql_validation"
diff --git a/server/tests-py/validate.py b/server/tests-py/validate.py
index 3eecd52a4db..a18b311364a 100644
--- a/server/tests-py/validate.py
+++ b/server/tests-py/validate.py
@@ -318,7 +318,7 @@ def assert_graphql_resp_expected(resp_orig, exp_response_orig, query, resp_hdrs=
     # If it is a batch GraphQL query, compare each individual response separately
     for (exp, out) in zip(as_list(exp_response), as_list(resp)):
         matched_ = equal_CommentedMap(exp, out)
-        if is_err_msg(exp):
+        if is_err_msg(exp) and is_err_msg(out):
             if not matched_:
                 warnings.warn("Response does not have the expected error message\n" + dump_str.getvalue())
                 return resp, matched
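
The field-by-field loop in compare_default_value only applies to object
values. A normalization-based variant, sketched below under the assumption of
graphql-core 3's top-level parse_value, print_ast, and ObjectValueNode API
(the helper name default_values_equal is hypothetical, not part of this
patch), would also cover scalar and list defaults:

    import graphql

    def default_values_equal(val_a, val_b):
        # Parse both GraphQL literals and compare their printed forms,
        # which normalizes lexical differences such as whitespace.
        a = graphql.parse_value(val_a)
        b = graphql.parse_value(val_b)
        if graphql.print_ast(a) == graphql.print_ast(b):
            return True
        # Object values may still agree up to field order, so compare
        # them as name -> printed-value maps.
        if isinstance(a, graphql.ObjectValueNode) and isinstance(b, graphql.ObjectValueNode):
            def as_map(node):
                return {f.name.value: graphql.print_ast(f.value) for f in node.fields}
            return as_map(a) == as_map(b)
        return False

It would be called as default_values_equal(argH['defaultValue'],
argR['defaultValue']), after the same None check that compare_default_value
performs.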